
A

abort() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
De-registers the handler for all future requests for state for the registered process bundle instruction id.
abort(Executor) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
absolute(String, String...) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Construct a path from an absolute component path hierarchy.
AbstractBeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode that replaces Project and Filter nodes.
AbstractBeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
 
AbstractGetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
 
AbstractReadFileRangesFn(SerializableFunction<String, ? extends FileBasedSource<InT>>, ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn
 
AbstractResult() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
accept(BeamFnApi.Elements) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
accept(ByteString) - Method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
 
accept(T) - Method in interface org.apache.beam.sdk.fn.data.FnDataReceiver
 
accept(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiConsumer
 
accept(T) - Method in interface org.apache.beam.sdk.function.ThrowingConsumer
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
 
accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
 
accept(SchemaZipFold.Context, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
Accepts two components; context.parent() is always ROW, MAP, ARRAY, or absent.
accept(SchemaZipFold.Context, Optional<Schema.Field>, Optional<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
Accepts two fields; context.parent() is always ROW.
accessPattern() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
accessType() - Method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
 
accumulate(T, T) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
Accumulate two results together.
accumulateWeight(long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
AccumulatingCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
accumulatingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
Returns a new Window PTransform that uses the registered WindowFn and Triggering behavior, and that accumulates elements in a pane after they are triggered.
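A minimal sketch of where accumulatingFiredPanes() fits in a windowing configuration, assuming an existing PCollection<String> named input; the window size, early-firing delay, and allowed lateness are illustrative:

    import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Fixed 1-minute windows with early firings every 10 seconds of processing
    // time; panes accumulate elements instead of discarding them after firing.
    PCollection<String> windowed =
        input.apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
                .triggering(
                    AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(
                            AfterProcessingTime.pastFirstElementInPane()
                                .plusDelayOf(Duration.standardSeconds(10))))
                .withAllowedLateness(Duration.standardMinutes(5))
                .accumulatingFiredPanes());
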
ACCUMULATOR_NAME - Static variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
 
AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
 
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return the ack deadline, in seconds, for subscription.
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
ackId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Id to pass back to Pubsub to acknowledge receipt of this message.
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Acknowledge messages from the subscription with the given ackIds.
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
acquireTaskAttemptIdLock(Configuration, int) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
Creates a unique TaskAttemptID for the given taskId.
acquireTaskAttemptIdLock(Configuration, int) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
 
acquireTaskIdLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
Creates a TaskID with an id that is unique within the given job.
acquireTaskIdLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
 
ActionFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
Factory class for creating instances that will handle different functions of DoFns.
ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
 
ActionFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
Factory class for creating instances that will handle each type of record within a change stream query.
ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
 
activate(MetricsContainer) - Method in class org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsContainerHolder
 
activate(MetricsContainer) - Method in interface org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsEnvironmentState
 
ACTIVE_PARTITION_READ_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the active partition reads during the execution of the Connector.
actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in interface org.apache.beam.sdk.schemas.ProjectionProducer
Actuate a projection pushdown.
add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
add(String) - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
 
add(String...) - Method in class org.apache.beam.runners.jet.metrics.StringSetImpl
 
add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
 
add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
add(int, GlobalWatermarkHolder.SparkWatermarks) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
add(T) - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
 
add(long, Instant, boolean) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
add(T, long, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
 
add(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
 
add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
add(Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
 
add(Type, String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
 
add(T, long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
add(List<ValueInSingleWindow<T>>, TableDataInsertAllResponse.InsertErrors, TableReference, FailsafeValueInSingleWindow<TableRow, TableRow>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
add(String, String, Iterable<String>) - Method in class org.apache.beam.sdk.metrics.Lineage
Add an FQN (fully-qualified name) to Lineage.
add(String, Iterable<String>) - Method in class org.apache.beam.sdk.metrics.Lineage
Add an FQN (fully-qualified name) to Lineage.
add(String) - Method in class org.apache.beam.sdk.metrics.Lineage
Adds the given details as Lineage.
add(String) - Method in interface org.apache.beam.sdk.metrics.StringSet
Add a value to this set.
add(String...) - Method in interface org.apache.beam.sdk.metrics.StringSet
Add values to this set.
add(InputT) - Method in interface org.apache.beam.sdk.state.GroupingState
Add a value to the buffer.
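BagState is one GroupingState implementation; a minimal sketch of a stateful DoFn that calls add(), assuming string-keyed input (the state id "buffer" and the class name are illustrative):

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.state.BagState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class BufferValuesFn extends DoFn<KV<String, String>, String> {
      // Per-key buffer; BagState is a GroupingState<String, Iterable<String>>.
      @StateId("buffer")
      private final StateSpec<BagState<String>> bufferSpec = StateSpecs.bag(StringUtf8Coder.of());

      @ProcessElement
      public void process(ProcessContext c, @StateId("buffer") BagState<String> buffer) {
        // GroupingState.add: appends this element's value to the per-key buffer.
        buffer.add(c.element().getValue());
      }
    }
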
add(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
For internal use only: no backwards compatibility guarantees.
add(long) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
Deprecated.
Adds a value to the heap, returning whether the value is (large enough to be) in the heap.
add(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register the given display item.
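A sketch of populateDisplayData using DisplayData.Builder.add (and addIfNotNull, listed further below); the transform, its settings, and the item keys are hypothetical:

    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.transforms.display.DisplayData;
    import org.apache.beam.sdk.values.PCollection;

    class ReadLogs extends PTransform<PCollection<String>, PCollection<String>> {
      private final String filePattern = "gs://bucket/logs-*"; // hypothetical setting
      private final String compression = null;                 // optional, may be null

      @Override
      public PCollection<String> expand(PCollection<String> input) {
        return input; // no-op body, for illustration only
      }

      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        super.populateDisplayData(builder);
        builder.add(DisplayData.item("filePattern", filePattern).withLabel("Input file pattern"));
        builder.addIfNotNull(DisplayData.item("compression", compression));
      }
    }
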
addAccum(AccumT) - Method in interface org.apache.beam.sdk.state.CombiningState
Add an accumulator to this state cell.
addAll(Map<Integer, Queue<GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
addAll(WeightedList<T>) - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
addAll(List<T>, long) - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
addAnnotation(String, byte[]) - Method in class org.apache.beam.sdk.transforms.PTransform
 
addArray(Collection<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
 
addArray(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
 
addArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addAttempted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
 
addBatchWriteRequest(long, boolean) - Method in interface org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler.Stats
 
addBoolean(Map<String, Object>, String, boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addCoderAndEncodedRecord(Coder<T>, T) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
 
addCollectionToSingletonOutput(PCollection<?>, String, PCollectionView<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds an output to this CollectionToSingleton Dataflow step, consuming the specified input PValue and producing the specified output PValue.
addCommitted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
 
addDataSet(String, DataSet<T>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
addDataStream(String, DataStream<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
addDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addDouble(Map<String, Object>, String, Double) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addElements(T, T...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Adds the specified elements to the source with timestamp equal to the current watermark.
addElements(TimestampedValue<T>, TimestampedValue<T>...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Adds the specified elements to the source with the provided timestamps.
addEncodingInput(Coder<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Sets the encoding for this Dataflow step.
addErrorCollection(PCollection<ErrorT>) - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
 
addErrorCollection(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
 
addErrorCollection(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
 
addErrorForCode(int, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
Adds a matcher to log the provided string if the error matches a particular status code.
addErrorForCodeAndUrlContains(int, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
Adds a matcher to log the provided string if the error matches a particular status code and the url contains a certain string.
addExceptionStackTrace(Exception) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
 
addExperiment(ExperimentalOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
Adds experiment to options if not already present.
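A small sketch of addExperiment; the experiment name is illustrative:

    import org.apache.beam.sdk.options.ExperimentalOptions;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    PipelineOptions options = PipelineOptionsFactory.create();
    // No-op if the experiment is already present in the options.
    ExperimentalOptions.addExperiment(options.as(ExperimentalOptions.class), "use_runner_v2");
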
addFailure(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
 
addField(Schema.Field) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addFields(List<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addFields(Schema.Field...) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
AddFields - Class in org.apache.beam.sdk.schemas.transforms
A transform to add new nullable fields to a PCollection's schema.
AddFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.AddFields
 
AddFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
Inner PTransform for AddFields.
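A sketch of AddFields, assuming events is an existing schema-aware PCollection; the field names are illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.transforms.AddFields;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Appends two new nullable fields to every element's schema; existing rows
    // get null values for the added fields.
    PCollection<Row> withExtras =
        events.apply(
            AddFields.<Row>create()
                .field("ipAddress", Schema.FieldType.STRING)
                .field("retryCount", Schema.FieldType.INT32));
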
addFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
AddHarnessIdInterceptor - Class in org.apache.beam.sdk.fn.channel
A ClientInterceptor that attaches a provided SDK Harness ID to outgoing messages.
addHumanReadableJson(Object) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
 
addIfAbsent(T) - Method in interface org.apache.beam.sdk.state.SetState
Ensures a value is a member of the set, returning true if it was added and false otherwise.
addIfNotDefault(DisplayData.ItemSpec<T>, T) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register the given display item if the value is different than the specified default.
addIfNotNull(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register the given display item if the value is not null.
addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
Add NewPartition if it hasn't been updated for 15 minutes.
addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
Capture NewPartition row that cannot merge on its own.
addInput(String, Boolean) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds an input with the given name and value to this Dataflow step.
addInput(String, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds an input with the given name and value to this Dataflow step.
addInput(String, Long) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds an input with the given name and value to this Dataflow step.
addInput(String, PInput) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds an input with the given name to this Dataflow step, coming from the specified input PValue.
addInput(String, Map<String, Object>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds an input that is a dictionary of strings to objects.
addInput(String, List<? extends Map<String, Object>>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds an input that is a list of objects.
addInput(SequenceRangeAccumulator, TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
 
addInput(HyperLogLogPlus, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
addInput(SketchFrequencies.Sketch<InputT>, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
addInput(MergingDigest, Double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
addInput(long[], Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
 
addInput(CovarianceAccumulator, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
addInput(VarianceAccumulator, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
addInput(BeamBuiltinAggregations.BitXOr.Accum, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
 
addInput(AccumT, InputT, Long, Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
 
addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
 
addInput(List<T>, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
 
addInput(String, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
 
addInput(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
 
addInput(Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
 
addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
addInput(AccumT, InputT) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
Adds the given input value to the given accumulator, returning the new accumulator value.
addInput(List<String>, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
 
addInput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique, T) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
Deprecated.
 
addInput(InputT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
Adds the given input value to this accumulator, modifying this accumulator.
addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
addInput(double[], Double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
addInput(Combine.Holder<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
addInput(int[], Integer) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
addInput(long[], Long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Adds the given input value to the given accumulator, returning the new accumulator value.
addInput(List<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
addInput(Object[], DataT) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
addInput(Object[], DataT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
addInput(AccumT, InputT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Adds the given input value to the given accumulator, returning the new accumulator value.
addInput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>, T) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
addInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addIterable(Iterable<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
 
addIterableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addKnownCoderUrn(String) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
Registers a coder as being of known type and as such not meriting length prefixing.
addLabel(String, String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
Add a metric label KV pair to the metric name.
addLengthPrefixedCoder(String, RunnerApi.Components.Builder, boolean) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
Recursively traverses the coder tree and wraps the first unknown coder in every branch with a LengthPrefixCoder unless an ancestor coder is itself a LengthPrefixCoder.
addList(Map<String, Object>, String, List<? extends Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addList(Map<String, Object>, String, T[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addLogicalTypeConversions(GenericData) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
 
addLogicalTypeField(String, Schema.LogicalType<InputT, BaseT>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addLong(Map<String, Object>, String, long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addLongs(Map<String, Object>, String, long...) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addMessageListener(Consumer<JobApi.JobMessage>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Listen for job messages with a Consumer.
addMethodParameters(Method) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
 
addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
Add all the missingPartitions.
addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
Capture partitions that are not currently being streamed.
addNameFilter(MetricNameFilter) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
addNull(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addNullableArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableIterableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableLogicalTypeField(String, Schema.LogicalType<InputT, BaseT>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addNullableStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
 
addOutput(String, PCollection<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
Adds a primitive output to this Dataflow step with the given name as the local output name, producing the specified output PValue, including its Coder if a TypedPValue.
addOutput(Output) - Method in class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
Overrides the output configuration of this Batch job to the specified Output.
addOutputColumnList(List<ResolvedNodes.ResolvedOutputColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
 
addOverrideForClass(Class<?>, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Deprecated.
Overrides the default log level for the passed in class.
addOverrideForClass(Class<?>, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
Overrides the default log level for the passed in class.
addOverrideForName(String, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Deprecated.
Overrides the default log level for the passed in name.
addOverrideForName(String, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
Overrides the default log level for the passed in name.
addOverrideForPackage(Package, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Deprecated.
Overrides the default log level for the passed in package.
addOverrideForPackage(Package, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
Overrides the default log level for the passed in package.
addProperties(MetadataEntity, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
addResolvedTable(TableResolution.SimpleTableWithPath) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
Store a table together with its full path for repeated resolutions.
addRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
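A sketch tying the Schema.Builder addXField methods to Row.Builder (addValues and friends appear further down this index); the field names and values are illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema addressSchema =
        Schema.builder().addStringField("city").addStringField("zip").build();

    Schema userSchema =
        Schema.builder()
            .addStringField("name")
            .addInt32Field("age")
            .addNullableStringField("email")
            .addRowField("address", addressSchema)
            .build();

    // Values are supplied in schema field order.
    Row user =
        Row.withSchema(userSchema)
            .addValues(
                "Alice",
                30,
                null,
                Row.withSchema(addressSchema).addValues("Springfield", "12345").build())
            .build();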
 
addRows(Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
Add rows to the builder.
addRows(String, Row...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
addRows(Duration, Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
Add rows to the builder.
addRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
Creates a runner-side wire coder for a port read/write for the given PCollection.
addSchema(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Add a top-level schema backed by the table provider.
addSdkWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
Creates an SDK-side wire coder for a port read/write for the given PCollection.
AddShardKeyDoFn - Class in org.apache.beam.sdk.io.solace.write
This class adds a pseudo-key with a given cardinality.
AddShardKeyDoFn(int) - Constructor for class org.apache.beam.sdk.io.solace.write.AddShardKeyDoFn
 
addStateListener(Consumer<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Listen for job state changes with a Consumer.
addStep(PTransform<?, ?>, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Adds a step to the Dataflow workflow for the given transform, with the given Dataflow step type.
addStep(String) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
Add a step filter.
addString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
addStringList(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addTags(MetadataEntity, String...) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
addTags(MetadataEntity, Iterable<String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
addToCurrentBundle(Solace.Record) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
addTraceFor(AbstractGoogleClient, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
Creates a GoogleApiDebugOptions.GoogleApiTracer that sets the trace destination on all calls that match the given client type.
addTraceFor(AbstractGoogleClientRequest<?>, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
Creates a GoogleApiDebugOptions.GoogleApiTracer that sets the trace destination on all calls that match the given request type.
addUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Register a UDAF function which can be used in a GROUP BY expression.
addUdf(String, Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Register a UDF function which can be used in a SQL expression.
addUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Register a UDF function which can be used in a SQL expression.
addUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Register a UDF function which can be used in a SQL expression.
addUuids() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Adds UUIDs to to-be-published messages, ensuring that uniqueness is maintained.
AddUuidsTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A transform to add UUIDs to each message to be written to Pub/Sub Lite.
AddUuidsTransform() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
 
addValue(Object) - Method in class org.apache.beam.sdk.values.Row.Builder
 
addValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
 
addValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
 
advance() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
advance() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
advance() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
For subscription mode only: Track progression of time according to the Clock passed.
advance() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
advance() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
advance() - Method in class org.apache.beam.sdk.io.Source.Reader
Advances the reader to the next valid record.
advance() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Advances the reader to the next valid record.
advanceBy(Duration) - Static method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
For internal use only: no backwards compatibility guarantees.
advanceImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
advanceImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Advances to the next record and returns true, or returns false if there is no next record.
advanceNextBatchWatermarkToInfinity() - Method in class org.apache.beam.runners.spark.io.CreateStream
Advances the watermark in the next batch to the end-of-time.
advanceProcessingTime(Duration) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Advance the processing time by the specified amount.
advanceTo(Instant) - Static method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
For internal use only: no backwards compatibility guarantees.
advanceWatermark() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
Advances the watermark.
advanceWatermarkForNextBatch(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
Advances the watermark in the next batch.
advanceWatermarkTo(Instant) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Advance the watermark of this source to the specified instant.
advanceWatermarkToInfinity() - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Advance the watermark to infinity, completing this TestStream.
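A sketch combining the TestStream.Builder methods listed here; the element values and timestamps are illustrative:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    TestStream<String> events =
        TestStream.create(StringUtf8Coder.of())
            .addElements(TimestampedValue.of("open", new Instant(0)))
            .advanceWatermarkTo(new Instant(1000))
            .advanceProcessingTime(Duration.standardMinutes(1))
            .addElements("click") // stamped at the current watermark
            .advanceWatermarkToInfinity();
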
AdvancingPhaser - Class in org.apache.beam.sdk.fn.stream
A Phaser which never terminates.
AdvancingPhaser(int) - Constructor for class org.apache.beam.sdk.fn.stream.AdvancingPhaser
 
AfterAll - Class in org.apache.beam.sdk.transforms.windowing
A composite Trigger that fires when all of its sub-triggers are ready.
afterBundleCommit(Instant, DoFn.BundleFinalizer.Callback) - Method in interface org.apache.beam.sdk.transforms.DoFn.BundleFinalizer
The provided function will be called after the runner successfully commits the output of a successful bundle.
AfterEach - Class in org.apache.beam.sdk.transforms.windowing
A composite Trigger that executes its sub-triggers in order.
AfterFirst - Class in org.apache.beam.sdk.transforms.windowing
A composite Trigger that fires once after at least one of its sub-triggers has fired.
afterIterations(int) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Returns a Watch.Growth.TerminationCondition that holds after the given number of polling iterations have occurred per-input.
AfterPane - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that fires at some point after a specified number of input elements have arrived.
AfterProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that fires at a specified point in processing time, relative to when input first arrives.
AfterSynchronizedProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
FOR INTERNAL USE ONLY.
afterTimeSinceNewOutput(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Returns a Watch.Growth.TerminationCondition that holds after the given time has elapsed after the last time the Watch.Growth.PollResult for the current input contained a previously unseen output.
afterTimeSinceNewOutput(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Like Watch.Growth.afterTimeSinceNewOutput(ReadableDuration), but the duration is input-dependent.
afterTotalOf(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Returns a Watch.Growth.TerminationCondition that holds after the given time has elapsed after the current input was seen.
afterTotalOf(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Like Watch.Growth.afterTotalOf(ReadableDuration), but the duration is input-dependent.
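These termination conditions are commonly supplied to FileIO.match().continuously; a sketch assuming a Pipeline object p, with an illustrative file pattern and polling interval:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.transforms.Watch;
    import org.joda.time.Duration;

    // Poll the pattern every 30 seconds; stop watching once no new file has
    // appeared for an hour.
    p.apply(
        FileIO.match()
            .filepattern("gs://bucket/logs/*.json")
            .continuously(
                Duration.standardSeconds(30),
                Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));
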
AfterWatermark - Class in org.apache.beam.sdk.transforms.windowing
AfterWatermark triggers fire based on progress of the system watermark.
AfterWatermark.AfterWatermarkEarlyAndLate - Class in org.apache.beam.sdk.transforms.windowing
 
AfterWatermark.FromEndOfWindow - Class in org.apache.beam.sdk.transforms.windowing
A watermark trigger targeted relative to the end of the window.
aggregate(Combine.CombineFn<InputT, ?, OutputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
Aggregate the grouped data using the specified Combine.CombineFn.
AggregateCombiner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
 
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
Build up an aggregation function over the input elements.
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
Build up an aggregation function over the input elements.
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
Build up an aggregation function over the input elements.
aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
Build up an aggregation function over the input elements.
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
Build up an aggregation function over the input elements.
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Build up an aggregation function over the input elements.
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Build up an aggregation function over the input elements.
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
Build up an aggregation function over the input elements.
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
Build up an aggregation function over the input elements.
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
Build up an aggregation function over the input elements.
aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
Build up an aggregation function over the input elements.
aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
Build up an aggregation function over the input elements.
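A sketch of the Group.ByFields aggregation methods, assuming purchases is a schema-aware PCollection with a "userId" field and a DOUBLE "cost" field; the field and result names are illustrative:

    import org.apache.beam.sdk.schemas.transforms.Group;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Group by userId and sum the cost field into a "totalCost" result field.
    PCollection<Row> totals =
        purchases.apply(
            Group.<Row>byFieldNames("userId")
                .aggregateField("cost", Sum.ofDoubles(), "totalCost"));
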
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
Build up an aggregation function over the input elements by field id.
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
AggregateFn<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.extensions.sql.udf
An aggregate function that can be executed as part of a SQL query.
AggregationCombineFnAdapter<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
Wraps Combine.CombineFns for aggregation function calls.
AggregationCombineFnAdapter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
 
AggregationQuery - Class in org.apache.beam.sdk.io.mongodb
Builds a MongoDB AggregateIterable object.
AggregationQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.AggregationQuery
 
algorithm(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
 
align(Duration) - Method in interface org.apache.beam.sdk.state.Timer
Aligns the target timestamp used by Timer.setRelative() to the next boundary of period.
alignedTo(Duration, Instant) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Aligns timestamps to the smallest multiple of period since the offset greater than the timestamp.
alignedTo(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Aligns the time to be the smallest multiple of period greater than the epoch boundary (aka new Instant(0)).
alignTo(Duration, Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
For internal use only; no backwards-compatibility guarantees.
alignTo(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
For internal use only; no backwards-compatibility guarantees.
AlignTo() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
ALL_CONTEXTS - Static variable in class org.apache.beam.sdk.testing.CoderProperties
All the contexts, for use in test cases.
ALL_KEYS - Static variable in class org.apache.beam.sdk.io.range.ByteKeyRange
The range of all keys, with empty start and end keys.
allLeavesDescriptor(Schema, SerializableFunction<List<String>, String>) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
 
allMatches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.AllMatches PTransform that checks if the entire line matches the Regex.
allMatches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.AllMatches PTransform that checks if the entire line matches the Regex.
AllMatches(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.AllMatches
 
allMetrics() - Method in class org.apache.beam.sdk.metrics.MetricResults
 
allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
 
allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
Creates an instance of this server using an ephemeral address.
allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
 
allocatePortAndCreateFor(ServiceT, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Create a GrpcFnServer for the provided FnService running on an arbitrary port.
allocatePortAndCreateFor(List<? extends FnService>, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Create GrpcFnServers for the provided FnServices running on an arbitrary port.
allOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.allOf(Iterable).
allOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.allOf(Matcher[]).
allOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Returns a Watch.Growth.TerminationCondition that holds when both of the given two conditions hold.
ALLOW_DUPLICATES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ALLOWS_SHARDABLE_STATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Whether this reader should allow dynamic splitting of the offset ranges.
AlwaysPassMatcher() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
AlwaysPassMatcherFactory() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
 
alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Always retry all failures.
alwaysUseRead() - Method in class org.apache.beam.sdk.transforms.Create.Values
 
AmqpIO - Class in org.apache.beam.sdk.io.amqp
AmqpIO supports AMQP 1.0 protocol using the Apache QPid Proton-J library.
AmqpIO.Read - Class in org.apache.beam.sdk.io.amqp
A PTransform to read/receive messages using AMQP 1.0 protocol.
AmqpIO.Write - Class in org.apache.beam.sdk.io.amqp
A PTransform to send messages using AMQP 1.0 protocol.
AmqpMessageCoder - Class in org.apache.beam.sdk.io.amqp
A coder for AMQP messages.
AmqpMessageCoder() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
 
AmqpMessageCoderProviderRegistrar - Class in org.apache.beam.sdk.io.amqp
A CoderProviderRegistrar for standard types used with AmqpIO.
AmqpMessageCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
 
and(TupleTag<V>, List<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns a new CoGbkResult based on this, with the given tag and given data added to it.
and(TupleTag<V>, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns a new KeyedPCollectionTuple<K> that is the same as this, appended with the given PCollection.
and(String, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
A version of KeyedPCollectionTuple.and(String, PCollection) that takes in a string instead of a TupleTag.
and(PCollection.IsBounded) - Method in enum org.apache.beam.sdk.values.PCollection.IsBounded
Returns the composed IsBounded property.
and(PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionList
Returns a new PCollectionList that has all the PCollections of this PCollectionList plus the given PCollection appended to the end.
and(Iterable<PCollection<T>>) - Method in class org.apache.beam.sdk.values.PCollectionList
Returns a new PCollectionList that has all the PCollections of this PCollectionList plus the given PCollections appended to the end, in order.
and(String, PCollection<Row>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
Returns a new PCollectionRowTuple that has each PCollection and tag of this PCollectionRowTuple plus the given PCollection associated with the given tag.
and(TupleTag<T>, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns a new PCollectionTuple that has each PCollection and TupleTag of this PCollectionTuple plus the given PCollection associated with the given TupleTag.
and(String, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
A version of PCollectionTuple.and(TupleTag, PCollection) that takes in a String instead of a TupleTag.
and(TupleTag<?>) - Method in class org.apache.beam.sdk.values.TupleTagList
Returns a new TupleTagList that has all the TupleTags of this TupleTagList plus the given TupleTag appended to the end.
and(List<TupleTag<?>>) - Method in class org.apache.beam.sdk.values.TupleTagList
Returns a new TupleTagList that has all the TupleTags of this TupleTagList plus the given TupleTags appended to the end, in order.
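As a minimal sketch of how the and(...) builders above are typically chained (emails and phones are assumed, illustrative PCollection<KV<String, String>> inputs), a CoGroupByKey join can be built like this:

 TupleTag<String> emailTag = new TupleTag<>();
 TupleTag<String> phoneTag = new TupleTag<>();
 // Group both keyed collections by key; the result holds each input under its tag.
 PCollection<KV<String, CoGbkResult>> joined =
     KeyedPCollectionTuple.of(emailTag, emails)
         .and(phoneTag, phones)
         .apply(CoGroupByKey.create());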
annotateFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
Annotates videos from ByteStrings of their contents.
annotateFromBytesWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
Annotates videos from key-value pairs of ByteStrings and VideoContext.
annotateFromURI(List<Feature>, PCollectionView<Map<String, VideoContext>>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
Annotates videos from GCS URIs.
annotateFromUriWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
Annotates videos from key-value pairs of GCS URI and VideoContext.
annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from their contents encoded in ByteStrings.
annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from their contents encoded in ByteStrings.
AnnotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
 
annotateImagesFromBytesWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from KVs of their contents encoded in ByteStrings and ImageContext for each image.
annotateImagesFromBytesWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from KVs of their contents encoded in ByteStrings and ImageContext for each image.
AnnotateImagesFromBytesWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
 
annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from their GCS addresses.
annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from their GCS addresses.
AnnotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
 
annotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from KVs of their GCS addresses in Strings and ImageContext for each image.
annotateImagesFromGcsUriWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
Creates a PTransform that annotates images from KVs of their GCS addresses in Strings and ImageContext for each image.
AnnotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
 
AnnotateText - Class in org.apache.beam.sdk.extensions.ml
A PTransform using the Cloud AI Natural language processing capability.
AnnotateText() - Constructor for class org.apache.beam.sdk.extensions.ml.AnnotateText
 
AnnotateText.Builder - Class in org.apache.beam.sdk.extensions.ml
 
AnnotateVideoFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytes
 
AnnotateVideoFromBytesWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytesWithContext
 
AnnotateVideoFromUri(PCollectionView<Map<String, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromUri
 
AnnotateVideoFromURIWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromURIWithContext
 
annotations - Variable in class org.apache.beam.sdk.transforms.PTransform
 
any(long) - Static method in class org.apache.beam.sdk.transforms.Sample
Sample#any(long) takes a PCollection<T> and a limit, and produces a new PCollection<T> containing up to limit elements of the input PCollection.
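For illustration only (lines is an assumed, pre-existing PCollection<String>), keeping at most 10 arbitrary elements looks like:

 PCollection<String> lines = ...;
 PCollection<String> sampled = lines.apply(Sample.any(10));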
anyCombineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a Combine.CombineFn that computes a fixed-sized potentially non-uniform sample of its inputs.
anyOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.anyOf(Iterable).
anyOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.anyOf(Matcher[]).
anything() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.anything().
anyValueCombineFn() - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a Combine.CombineFn that computes a single and potentially non-uniform sample value of its inputs.
API_METRIC_LABEL - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
ApiIOError - Class in org.apache.beam.io.requestresponse
ApiIOError is a data class for storing details about an error.
ApiIOError() - Constructor for class org.apache.beam.io.requestresponse.ApiIOError
 
append(K, W, Iterator<V>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
Appends the values to the bag user state for the given key and window.
appendRows(long, ProtoRows) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
Append rows to a Storage API write stream at the given offset.
appendRowsRowStatusCounter(BigQuerySinkMetrics.RowStatus, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
applicableTo(PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
 
ApplicationNameOptions - Interface in org.apache.beam.sdk.options
Options that allow setting the application name.
apply(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
 
apply(KV<String, Long>) - Method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
 
apply(InputT) - Method in interface org.apache.beam.sdk.coders.DelegateCoder.CodingFunction
 
apply(Pipeline, String, RunnerApi.FunctionSpec, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
 
apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
 
apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
 
apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
 
apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
 
apply(Schema, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
 
apply(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
 
apply(Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroSink.DatumWriterFactory
 
apply(Schema, Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroSource.DatumReaderFactory
 
apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
 
apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
apply(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiFunction
 
apply(T1) - Method in interface org.apache.beam.sdk.function.ThrowingFunction
 
apply(String, T) - Method in interface org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.EntryMapperFn.Builder
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
 
apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableRowToBeamRow
 
apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
 
apply(HealthcareIOError<T>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
apply(ValueInSingleWindow<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue
 
apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
 
apply(PubsubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
 
apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
 
apply(SQLException) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
 
apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
 
apply(SQLException) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RetryStrategy
 
apply(String, Session) - Method in class org.apache.beam.sdk.io.jms.TextMessageMapper
 
apply(TopicPartition) - Method in class org.apache.beam.sdk.io.kafka.CheckStopReadingFnWrapper
 
apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
 
apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
 
apply(Void) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
 
apply(FileIO.ReadableFile, OffsetRange, Exception) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler
 
apply(Void) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
 
apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
Like Pipeline.apply(String, PTransform) but the transform node in the Pipeline graph will be named according to PTransform.getName().
apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
Adds a root PTransform, such as Read or Create, to this Pipeline.
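A rough sketch of applying a named root transform (the options object and the input path are placeholders):

 Pipeline p = Pipeline.create(options);
 PCollection<String> lines = p.apply("ReadLines", TextIO.read().from("/path/to/input.txt"));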
apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
 
apply(Schema, Schema) - Method in interface org.apache.beam.sdk.schemas.transforms.Cast.Validator
 
apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
 
apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
 
apply(T) - Method in class org.apache.beam.sdk.testing.PAssert.MatcherCheckerFn
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.testing.TestPipeline
 
apply(double, double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
Applies the binary operation to the two operands, returning the result.
apply(V, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
Applies the binary operation to the two operands, returning the result.
apply(int, int) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
Applies the binary operation to the two operands, returning the result.
apply(long, long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
Applies the binary operation to the two operands, returning the result.
apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Applies this CombineFn to a collection of input values to produce a combined output value.
apply(Iterable<? extends InputT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Applies this CombineFnWithContext to a collection of input values to produce a combined output value.
apply(InputT, Contextful.Fn.Context) - Method in interface org.apache.beam.sdk.transforms.Contextful.Fn
Invokes the function on the given input with the given context.
apply(InputT) - Method in class org.apache.beam.sdk.transforms.InferableFunction
 
apply(PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Like KeyedPCollectionTuple.apply(String, PTransform) but defaulting to the name provided by the PTransform.
apply(String, PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Applies the given PTransform to this input KeyedPCollectionTuple and returns its OutputT.
apply(InputT) - Method in interface org.apache.beam.sdk.transforms.ProcessFunction
Returns the result of invoking this function on the given input.
apply(InputT) - Method in interface org.apache.beam.sdk.transforms.SerializableFunction
Returns the result of invoking this function on the given input.
apply(InputT) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
 
apply(PrimitiveViewT) - Method in class org.apache.beam.sdk.transforms.ViewFn
A function to adapt a primitive view type to a desired view type.
apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionAsMapHandler
 
apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ThrowableHandler
 
apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
Like PBegin.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
Applies the given PTransform to this PBegin, using name to identify this specific application of the transform.
apply(PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
Like PCollection.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
Applies the given PTransform to this input PCollection, using name to identify this specific application of the transform.
apply(PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
Like PCollectionList.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
Applies the given PTransform to this input PCollectionList, using name to identify this specific application of the transform.
apply(PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
Like PCollectionRowTuple.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
Applies the given PTransform to this input PCollectionRowTuple, using name to identify this specific application of the transform.
apply(PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Like PCollectionTuple.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Applies the given PTransform to this input PCollectionTuple, using name to identify this specific application of the transform.
apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
 
apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
 
apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
 
apply(Materializations.IterableView<KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
 
apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
 
apply(Materializations.IterableView<KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
 
apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
 
apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
Deprecated.
 
apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
 
apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
apply(Materializations.MultimapView<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
 
apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
Deprecated.
 
apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
 
apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
Deprecated.
 
apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
 
apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
 
apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
 
applyBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PCollection<OutputT>>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyMultiOutputBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyMultiOutputBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyMultiOutputBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyMultiOutputBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyMultiOutputBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyMultiOutputBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PCollectionTuple>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyNoOutputBeamPTransform(DataSet<InputT>, PTransform<CollectionT, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyNoOutputBeamPTransform(Map<String, ? extends DataSet<?>>, PTransform<PCollectionTuple, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyNoOutputBeamPTransform(ExecutionEnvironment, PTransform<PBegin, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
applyNoOutputBeamPTransform(DataStream<InputT>, PTransform<CollectionT, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyNoOutputBeamPTransform(Map<String, ? extends DataStream<?>>, PTransform<PCollectionTuple, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyNoOutputBeamPTransform(StreamExecutionEnvironment, PTransform<PBegin, PDone>) - Method in class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
applyRowMutations() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Write RowMutation messages to BigQuery.
applySdkEnvironmentOverrides(RunnerApi.Pipeline, DataflowPipelineOptions) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
applyTransform(InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
applyTransform(String, InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
applyWindowing() - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
 
ApproximateCountDistinct - Class in org.apache.beam.sdk.extensions.zetasketch
PTransforms for estimating the number of distinct elements in a PCollection, or the number of distinct values associated with each key in a PCollection of KVs.
ApproximateCountDistinct() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
 
ApproximateCountDistinct.Globally<T> - Class in org.apache.beam.sdk.extensions.zetasketch
PTransform for estimating the number of distinct elements in a PCollection.
ApproximateCountDistinct.Globally.Builder<T> - Class in org.apache.beam.sdk.extensions.zetasketch
 
ApproximateCountDistinct.PerKey<K,V> - Class in org.apache.beam.sdk.extensions.zetasketch
 
ApproximateCountDistinct.PerKey.Builder<K,V> - Class in org.apache.beam.sdk.extensions.zetasketch
 
ApproximateDistinct - Class in org.apache.beam.sdk.extensions.sketching
PTransforms for computing the approximate number of distinct elements in a stream.
ApproximateDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
 
ApproximateDistinct.ApproximateDistinctFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
Implements the Combine.CombineFn of ApproximateDistinct transforms.
ApproximateDistinct.GloballyDistinct<InputT> - Class in org.apache.beam.sdk.extensions.sketching
ApproximateDistinct.HyperLogLogPlusCoder - Class in org.apache.beam.sdk.extensions.sketching
Coder for HyperLogLogPlus class.
ApproximateDistinct.PerKeyDistinct<K,V> - Class in org.apache.beam.sdk.extensions.sketching
Implementation of ApproximateDistinct.perKey().
ApproximateQuantiles - Class in org.apache.beam.sdk.transforms
PTransforms for getting an idea of a PCollection's data distribution using approximate N-tiles (e.g. quartiles or percentiles), either globally or per-key.
ApproximateQuantiles.ApproximateQuantilesCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
The ApproximateQuantilesCombineFn combiner gives an idea of the distribution of a collection of values using approximate N-tiles.
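As a brief sketch (values is an assumed, pre-existing PCollection<Integer>), computing approximate quartiles globally looks like:

 PCollection<Integer> values = ...;
 // 5 quantiles: the minimum, the three quartiles, and the maximum.
 PCollection<List<Integer>> quartiles = values.apply(ApproximateQuantiles.globally(5));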
ApproximateUnique - Class in org.apache.beam.sdk.transforms
Deprecated.

Consider using ApproximateCountDistinct in the zetasketch extension module, which makes use of the HllCount implementation.

If ApproximateCountDistinct does not meet your needs, you can use HllCount directly. Direct use also lets you save the intermediate aggregation result as a sketch for later processing.

For example, to estimate the number of distinct elements in a PCollection<String>:


 PCollection<String> input = ...;
 PCollection<Long> countDistinct =
     input.apply(HllCount.Init.forStrings().globally()).apply(HllCount.Extract.globally());
 
For more details about using HllCount and the zetasketch extension module, see https://s.apache.org/hll-in-beam#bookmark=id.v6chsij1ixo7.
ApproximateUnique() - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique
Deprecated.
 
ApproximateUnique.ApproximateUniqueCombineFn<T> - Class in org.apache.beam.sdk.transforms
Deprecated.
CombineFn that computes an estimate of the number of distinct values that were combined.
ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique - Class in org.apache.beam.sdk.transforms
Deprecated.
A heap utility class to efficiently track the largest added elements.
ApproximateUnique.Globally<T> - Class in org.apache.beam.sdk.transforms
Deprecated.
PTransform for estimating the number of distinct elements in a PCollection.
ApproximateUnique.PerKey<K,V> - Class in org.apache.beam.sdk.transforms
Deprecated.
PTransform for estimating the number of distinct values associated with each key in a PCollection of KVs.
ApproximateUniqueCombineFn(long, Coder<T>) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
Deprecated.
 
arbitrarily() - Static method in class org.apache.beam.sdk.transforms.Redistribute
 
array() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns the backing array.
array(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
array(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
Create an array type for the given field type.
array(Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
Set the nullability on the elementType instead
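A small sketch of declaring an array field on a row schema (the field names are illustrative):

 Schema schema =
     Schema.builder()
         .addStringField("name")
         .addField("scores", Schema.FieldType.array(Schema.FieldType.INT64))
         .build();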
ARRAY_AGG_FN - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
ArrayAgg - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
 
ArrayAgg() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg
 
ArrayAgg.ArrayAggArray<T> - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
 
ArrayAggArray() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
 
arrayContaining(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
arrayContaining(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
arrayContaining(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
arrayContaining(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
arrayContainingInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
arrayContainingInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
arrayContainingInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
arrayContainingInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
ArrayCopyState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
 
arrayElementType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
ArrayNewState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
 
ArrayOfNestedStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle
 
ArrayOfStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle
 
arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
 
ArrayQualifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
 
ArrayQualifierListContext(FieldSpecifierNotationParser.QualifierListContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
 
arrayWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.arrayWithSize(int).
arrayWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
ArrowConversion - Class in org.apache.beam.sdk.extensions.arrow
Utilities to create Iterables of Beam Row instances backed by Arrow record batches.
ArrowConversion.ArrowSchemaTranslator - Class in org.apache.beam.sdk.extensions.arrow
Converts Arrow schema to Beam row schema.
ArrowConversion.RecordBatchRowIterator - Class in org.apache.beam.sdk.extensions.arrow
 
arrowSchemaFromInput(InputStream) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
 
ArrowSchemaTranslator() - Constructor for class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
 
ArtifactDestination() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
 
ArtifactRetrievalService - Class in org.apache.beam.runners.fnexecution.artifact
An ArtifactRetrievalService that uses FileSystems as its backing storage.
ArtifactRetrievalService() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
ArtifactRetrievalService(ArtifactResolver) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
ArtifactRetrievalService(int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
ArtifactRetrievalService(ArtifactResolver, int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
ArtifactStagingService - Class in org.apache.beam.runners.fnexecution.artifact
 
ArtifactStagingService(ArtifactStagingService.ArtifactDestinationProvider) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
 
ArtifactStagingService.ArtifactDestination - Class in org.apache.beam.runners.fnexecution.artifact
A pairing of a newly created artifact type and an output stream that will be readable at that type.
ArtifactStagingService.ArtifactDestinationProvider - Interface in org.apache.beam.runners.fnexecution.artifact
Provides a concrete location to which artifacts can be staged on retrieval.
as(Class<T>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
Transforms this object into an object of type <T> saving each property that has been manipulated.
as(Class<T>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Creates and returns an object that implements <T>.
as(Class<T>) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
Creates and returns an object that implements <T> using the values configured on this builder during construction.
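A typical sketch of these as(Class) calls (MyOptions stands for a hypothetical user-defined interface extending PipelineOptions):

 MyOptions options =
     PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
 Pipeline p = Pipeline.create(options);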
asCloudObject(Coder<?>, SdkComponents) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
Convert the provided Coder into a CloudObject.
asInputStream(int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns an InputStream wrapper which supplies the portion of this backing byte buffer starting at offset and up to length bytes.
asIterable() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsIterable transform that takes a PCollection as input and produces a PCollectionView mapping each window to an Iterable of the values in that window.
AsJsons<InputT> - Class in org.apache.beam.sdk.extensions.jackson
PTransform for serializing objects to JSON Strings.
AsJsons.AsJsonsWithFailures<FailureT> - Class in org.apache.beam.sdk.extensions.jackson
A PTransform that adds exception handling to AsJsons.
asList() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsList transform that takes a PCollection and returns a PCollectionView mapping each window to a List containing all of the elements in the window.
asMap() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
asMap() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsMap transform that takes a PCollection<KV<K, V>> as input and produces a PCollectionView mapping each window to a Map<K, V>.
asMultimap() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsMultimap transform that takes a PCollection<KV<K, V>> as input and produces a PCollectionView mapping each window to its contents as a Map<K, Iterable<V>> for use as a side input.
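As an informal sketch of the View factory methods above (counts is an assumed, pre-existing PCollection<KV<String, Long>>), a map-shaped side input is created like this:

 PCollection<KV<String, Long>> counts = ...;
 PCollectionView<Map<String, Long>> countsView = counts.apply(View.asMap());
 // countsView can then be registered via ParDo.of(...).withSideInputs(countsView)
 // and read inside the DoFn with c.sideInput(countsView).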
asOutputReference(PValue, AppliedPTransform<?, ?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Encode a PValue reference as an output reference.
asOutputStream() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns an output stream which writes to the backing buffer from the current position.
asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub API.
asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
asQueryable(QueryProvider, SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
 
asResponseObserver() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
assertionError() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
assertSourcesEqualReferenceSource(BoundedSource<T>, List<? extends BoundedSource<T>>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Given a reference Source and a list of Sources, assert that the union of the records read from the list of sources is equal to the records read from the reference source.
assertSplitAtFractionBehavior(BoundedSource<T>, int, double, SourceTestUtils.ExpectedSplitOutcome, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Asserts that the source's reader either fails to splitAtFraction(fraction) after reading numItemsToReadBeforeSplit items, or succeeds in a way that is consistent according to SourceTestUtils.assertSplitAtFractionSucceedsAndConsistent(org.apache.beam.sdk.io.BoundedSource<T>, int, double, org.apache.beam.sdk.options.PipelineOptions).
assertSplitAtFractionExhaustive(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Asserts that for each possible start position, BoundedSource.BoundedReader#splitAtFraction at every interesting fraction (halfway between two fractions that differ by at least one item) can be called successfully and the results are consistent if a split succeeds.
assertSplitAtFractionFails(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Asserts that the source's reader fails to splitAtFraction(fraction) after reading numItemsToReadBeforeSplit items.
assertSplitAtFractionSucceedsAndConsistent(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Verifies some consistency properties of BoundedSource.BoundedReader#splitAtFraction on the given source.
assertSubscriptionEventuallyCreated(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Block until a subscription is created for this test topic in the specified project.
assertThatAllRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
assertThatTopicEventuallyReceives(Matcher<PubsubMessage>...) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Repeatedly pull messages from TestPubsub.subscriptionPath() until receiving one for each matcher (or timeout is reached), then assert that the received messages match the expectations.
assertUnstartedReaderReadsSameAsItsSource(BoundedSource.BoundedReader<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Assert that a Reader returns a Source that, when read from, produces the same records as the reader.
assign(BoundedWindow, Instant) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
assignableTo(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
Returns true if this Schema can be assigned to another Schema.
assignableToIgnoreNullable(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
Returns true if this Schema can be assigned to another Schema, ignoring nullable.
AssignContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
 
assignedWindows(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
assignedWindowsWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
AssignShardFn(Integer) - Constructor for class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
 
assignShardKey(DestinationT, UserT, int) - Method in interface org.apache.beam.sdk.io.ShardingFunction
 
assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns true if this WindowFn always assigns an element to exactly one window.
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
Returns the single window to which elements with this timestamp belong.
AssignWindowP<T> - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for Beam's Windowing primitive.
assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
 
assignWindows(WindowFn<Object, GlobalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
assignWindows(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Given a timestamp and element, returns the set of windows into which it should be placed.
AssignWindowsFunction<T> - Class in org.apache.beam.runners.twister2.translators.functions
Assign Windows function.
AssignWindowsFunction(WindowFn<T, BoundedWindow>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
 
AssignWindowTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
Assign Window translator.
AssignWindowTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.AssignWindowTranslatorBatch
 
asSingleton() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsSingleton transform that takes a PCollection with a single value per window as input and produces a PCollectionView that returns the value in the main input window when read as a side input.
asSingletonView() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform that produces a PCollectionView whose elements are the result of combining elements per-window in the input PCollection.
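A brief sketch of asSingletonView (values is an assumed, pre-existing PCollection<Integer>):

 PCollection<Integer> values = ...;
 PCollectionView<Integer> sumView =
     values.apply(Combine.globally(Sum.ofIntegers()).asSingletonView());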
assumeSingleMessageSchema() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
 
ASTERISK - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
ASTERISK_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Deprecated.
the v1beta1 API for Cloud Pub/Sub is deprecated.
asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Deprecated.
the v1beta1 API for Cloud Pub/Sub is deprecated.
asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Deprecated.
the v1beta2 API for Cloud Pub/Sub is deprecated.
asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Deprecated.
the v1beta2 API for Cloud Pub/Sub is deprecated.
AsyncBatchWriteHandler<RecT,ResT> - Class in org.apache.beam.sdk.io.aws2.common
Async handler that automatically retries unprocessed records in case of a partial success.
AsyncBatchWriteHandler(int, FluentBackoff, AsyncBatchWriteHandler.Stats, Function<ResT, String>, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>) - Constructor for class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
 
AsyncBatchWriteHandler.Stats - Interface in org.apache.beam.sdk.io.aws2.common
Statistics on the batch request.
atMinimumTimestamp(V) - Static method in class org.apache.beam.sdk.values.TimestampedValue
Returns a new TimestampedValue with the minimum timestamp.
AtomicCoder<T> - Class in org.apache.beam.sdk.coders
A Coder that has no component Coders or other configuration.
AtomicCoder() - Constructor for class org.apache.beam.sdk.coders.AtomicCoder
 
AtomicLongFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
 
attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
attachValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
 
attachValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
 
attempted(MetricKey, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
 
ATTRIBUTE_ARRAY_ENTRY_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
ATTRIBUTE_ARRAY_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
ATTRIBUTE_MAP_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
AttributeValueCoder - Class in org.apache.beam.sdk.io.aws.dynamodb
A Coder that serializes and deserializes the AttributeValue objects.
AttributeValueCoder - Class in org.apache.beam.sdk.io.aws2.dynamodb
A Coder that serializes and deserializes the AttributeValue objects.
AttributeValueCoderProviderRegistrar - Class in org.apache.beam.sdk.io.aws.dynamodb
A CoderProviderRegistrar for standard types used with DynamoDBIO.
AttributeValueCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoderProviderRegistrar
 
AUTH_VALIDATION_GROUP - Static variable in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
AuthenticatedRetryInitializer(GoogleCredentials) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
 
AUTO - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
autoCastField(Schema.Field, Object) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
Attempt to cast an object to a specified Schema.Field.Type.
autoLoadUserDefinedFunctions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Load UDF/UDAFs from UdfUdafProvider.
AutoScaler - Interface in org.apache.beam.sdk.io.jms
Enables users to specify their own JMS backlog reporters, allowing JmsIO to report UnboundedSource.UnboundedReader#getTotalBacklogBytes().
AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
AutoValueSchema - Class in org.apache.beam.sdk.schemas
A SchemaProvider for AutoValue classes.
AutoValueSchema() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema
 
AutoValueSchema.AbstractGetterTypeSupplier - Class in org.apache.beam.sdk.schemas
FieldValueTypeSupplier that's based on AutoValue getters.
AutoValueUtils - Class in org.apache.beam.sdk.schemas.utils
Utilities for managing AutoValue schemas.
AutoValueUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.AutoValueUtils
 
AvailableParallelismFactory() - Constructor for class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
 
AVRO_CODER_URN - Static variable in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
 
AvroCoder<T> - Class in org.apache.beam.sdk.extensions.avro.coders
A Coder using Avro binary format.
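For illustration (MyRecord stands for a hypothetical Avro-generated or reflectable class), an AvroCoder is normally obtained through the of factory rather than a constructor:

 PCollection<MyRecord> records = ...;
 records.setCoder(AvroCoder.of(MyRecord.class));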
AvroCoder(Class<T>, Schema) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
AvroCoder(Class<T>, Schema, boolean) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
AvroCoder(AvroDatumFactory<T>, Schema) - Constructor for class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
AvroConvertType(boolean) - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertType
 
AvroDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
Create DatumReader and DatumWriter for given schemas.
AvroDatumFactory(Class<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
 
AvroDatumFactory.GenericDatumFactory - Class in org.apache.beam.sdk.extensions.avro.io
AvroDatumFactory.ReflectDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
Specialized AvroDatumFactory for java classes transforming to avro through reflection.
AvroDatumFactory.SpecificDatumFactory<T> - Class in org.apache.beam.sdk.extensions.avro.io
AvroGenericCoder - Class in org.apache.beam.sdk.extensions.avro.coders
AvroCoder specialisation for GenericRecord, needed for cross-language transforms.
AvroGenericCoderRegistrar - Class in org.apache.beam.sdk.extensions.avro
Coder registrar for AvroGenericCoder.
AvroGenericCoderRegistrar() - Constructor for class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
 
AvroGenericCoderTranslator - Class in org.apache.beam.sdk.extensions.avro
Coder translator for AvroGenericCoder.
AvroGenericCoderTranslator() - Constructor for class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
 
AvroGenericRecordToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for converting Avro GenericRecord objects to dynamic protocol message, for use with the Storage write API.
AvroGenericRecordToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
 
AvroIO - Class in org.apache.beam.sdk.extensions.avro.io
PTransforms for reading and writing Avro files.
AvroIO.Parse<T> - Class in org.apache.beam.sdk.extensions.avro.io
AvroIO.ParseAll<T> - Class in org.apache.beam.sdk.extensions.avro.io
AvroIO.ParseFiles<T> - Class in org.apache.beam.sdk.extensions.avro.io
AvroIO.Read<T> - Class in org.apache.beam.sdk.extensions.avro.io
AvroIO.ReadAll<T> - Class in org.apache.beam.sdk.extensions.avro.io
Deprecated.
See AvroIO.readAll(Class) for details.
AvroIO.ReadFiles<T> - Class in org.apache.beam.sdk.extensions.avro.io
AvroIO.RecordFormatter<ElementT> - Interface in org.apache.beam.sdk.extensions.avro.io
Deprecated.
Users can achieve the same by providing this transform in a ParDo before using write in AvroIO AvroIO.write(Class).
AvroIO.Sink<ElementT> - Class in org.apache.beam.sdk.extensions.avro.io
AvroIO.TypedWrite<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.extensions.avro.io
AvroIO.Write<T> - Class in org.apache.beam.sdk.extensions.avro.io
This class is used as the default return value of AvroIO.write(java.lang.Class<T>).
AvroJavaTimeConversions - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
Avro 1.8 ships with joda time conversions only.
AvroJavaTimeConversions() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions
 
AvroJavaTimeConversions.DateConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.LocalTimestampMicros - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.LocalTimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.LocalTimestampMillis - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.LocalTimestampMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.TimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.TimeMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.TimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJavaTimeConversions.TimestampMillisConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJodaTimeConversions - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
Avro 1.8 & 1.9 ship joda time conversions.
AvroJodaTimeConversions() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions
 
AvroJodaTimeConversions.DateConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJodaTimeConversions.LossyTimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJodaTimeConversions.LossyTimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJodaTimeConversions.TimeConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJodaTimeConversions.TimeMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJodaTimeConversions.TimestampConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroJodaTimeConversions.TimestampMicrosConversion - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroPayloadSerializerProvider - Class in org.apache.beam.sdk.extensions.avro.schemas.io.payloads
 
AvroPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
 
AvroReader(AvroSource<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
Reads Avro records of type T from the specified source.
AvroReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
 
AvroReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
 
AvroRecordSchema - Class in org.apache.beam.sdk.extensions.avro.schemas
A SchemaProvider for AVRO generated SpecificRecords and POJOs.
AvroRecordSchema() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
 
AvroSchemaInformationProvider - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroSchemaInformationProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroSchemaInformationProvider
 
AvroSchemaIOProvider - Class in org.apache.beam.sdk.extensions.avro.io
An implementation of SchemaIOProvider for reading and writing Avro files with AvroIO.
AvroSchemaIOProvider() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
 
AvroSink<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.extensions.avro.io
A FileBasedSink for Avro files.
AvroSink.DatumWriterFactory<T> - Interface in org.apache.beam.sdk.extensions.avro.io
 
AvroSource<T> - Class in org.apache.beam.sdk.extensions.avro.io
Do not use in pipelines directly: most users should use AvroIO.Read.
AvroSource.AvroReader<T> - Class in org.apache.beam.sdk.extensions.avro.io
A BlockBasedReader for reading blocks from Avro files.
AvroSource.DatumReaderFactory<T> - Interface in org.apache.beam.sdk.extensions.avro.io
 
AvroTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.avro
TableProvider for AvroIO for consumption by Beam SQL.
AvroTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
 
AvroUtils - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
Utilities to convert Avro records to Beam rows.
AvroUtils.AvroConvertType - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroUtils.AvroConvertValueForGetter - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroUtils.AvroConvertValueForSetter - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroUtils.FixedBytesField - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
Wrapper for fixed byte fields.
AvroUtils.TypeWithNullability - Class in org.apache.beam.sdk.extensions.avro.schemas.utils
 
AvroWriteRequest<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
AvroWriteRequest(T, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
 
AvroWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
AvroWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
 
awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
Uses the caller's thread to process all received elements until the end of stream has been received from the upstream producer for all specified endpoints.
awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
 
AwsBuilderFactory<PojoT extends SdkPojo,BuilderT extends SdkBuilder<BuilderT,PojoT> & SdkPojo> - Class in org.apache.beam.sdk.io.aws2.schemas
Builder factory for AWS SdkPojo to avoid using reflection to instantiate a builder.
AwsBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
 
AwsClientsProvider - Interface in org.apache.beam.sdk.io.aws.dynamodb
Provides instances of AWS clients.
AwsClientsProvider - Interface in org.apache.beam.sdk.io.aws.sns
Provides instances of AWS clients.
AWSClientsProvider - Interface in org.apache.beam.sdk.io.kinesis
Provides instances of AWS clients.
AwsCoders - Class in org.apache.beam.sdk.io.aws.coders
Coders for common AWS SDK objects.
AwsModule - Class in org.apache.beam.sdk.io.aws.options
A Jackson Module that registers a JsonSerializer and JsonDeserializer for AWSCredentialsProvider and some subclasses.
AwsModule() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsModule
 
AwsModule - Class in org.apache.beam.sdk.io.aws2.options
A Jackson Module that registers a JsonSerializer and JsonDeserializer for AwsCredentialsProvider and some subclasses.
AwsModule() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsModule
 
AwsOptions - Interface in org.apache.beam.sdk.io.aws.options
Options used to configure Amazon Web Services specific options such as credentials and region.
AwsOptions - Interface in org.apache.beam.sdk.io.aws2.options
Options used to configure Amazon Web Services specific options such as credentials and region.
AwsOptions.AwsRegionFactory - Class in org.apache.beam.sdk.io.aws.options
Attempts to load the default region.
AwsOptions.AwsRegionFactory - Class in org.apache.beam.sdk.io.aws2.options
Attempts to load the default region.
AwsOptions.AwsUserCredentialsFactory - Class in org.apache.beam.sdk.io.aws.options
Attempts to load AWS credentials.
AwsOptions.AwsUserCredentialsFactory - Class in org.apache.beam.sdk.io.aws2.options
Returns DefaultCredentialsProvider as the default provider.
AwsOptions.ClientConfigurationFactory - Class in org.apache.beam.sdk.io.aws.options
Default AWS client configuration.
AwsPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.aws.options
A registrar containing the default AWS options.
AwsPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsPipelineOptionsRegistrar
 
AwsPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.aws2.options
A registrar containing the default AWS options.
AwsPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
 
AwsRegionFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsRegionFactory
 
AwsRegionFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
 
AwsSchemaProvider - Class in org.apache.beam.sdk.io.aws2.schemas
Schema provider for AWS SdkPojo models that uses the provided field metadata (see SdkPojo.sdkFields()) rather than reflection.
AwsSchemaProvider() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
 
AwsSchemaRegistrar - Class in org.apache.beam.sdk.io.aws2.schemas
 
AwsSchemaRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaRegistrar
 
AwsSerializableUtils - Class in org.apache.beam.sdk.io.aws2.options
Utilities for working with AWS Serializables.
AwsSerializableUtils() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
 
AwsSerializableUtils - Class in org.apache.beam.sdk.io.kinesis.serde
Utilities for working with AWS Serializables.
AwsSerializableUtils() - Constructor for class org.apache.beam.sdk.io.kinesis.serde.AwsSerializableUtils
 
AwsTypes - Class in org.apache.beam.sdk.io.aws2.schemas
 
AwsTypes() - Constructor for class org.apache.beam.sdk.io.aws2.schemas.AwsTypes
 
AwsUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsUserCredentialsFactory
 
AwsUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
 
AzureBlobStoreFileSystemRegistrar - Class in org.apache.beam.sdk.io.azure.blobstore
AutoService registrar for the AzureBlobStoreFileSystem.
AzureBlobStoreFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
 
AzureModule - Class in org.apache.beam.sdk.io.azure.options
A Jackson Module that registers a JsonSerializer and JsonDeserializer for Azure credential providers.
AzureModule() - Constructor for class org.apache.beam.sdk.io.azure.options.AzureModule
 
AzureOptions - Interface in org.apache.beam.sdk.io.azure.options
 
AzureOptions.AzureUserCredentialsFactory - Class in org.apache.beam.sdk.io.azure.options
Attempts to load Azure credentials.
AzurePipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.azure.options
A registrar containing the default Azure options.
AzurePipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
 
AzureUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.azure.options.AzureOptions.AzureUserCredentialsFactory
 

B

BACKLOG_UNKNOWN - Static variable in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Constant representing an unknown amount of backlog.
backlogBytes() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source backlog in bytes.
backlogBytesOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source split backlog in bytes.
backlogElements() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source backlog in elements.
backlogElementsOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source split backlog in elements.
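A hedged sketch of reporting backlog through these gauges from inside a DoFn; the DoFn itself and the pending counts are placeholders.

    import org.apache.beam.sdk.metrics.SourceMetrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Hypothetical DoFn that polls an external system and reports its backlog.
    class PollingFn extends DoFn<String, String> {
      @ProcessElement
      public void process(@Element String shard, OutputReceiver<String> out) {
        long pendingBytes = 1_048_576L; // placeholder: bytes still to be read
        long pendingElements = 42L;     // placeholder: elements still to be read
        SourceMetrics.backlogBytes().set(pendingBytes);
        SourceMetrics.backlogElements().set(pendingElements);
        out.output(shard);
      }
    }
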
BackOffAdapter - Class in org.apache.beam.sdk.extensions.gcp.util
An adapter for converting between Apache Beam and Google API client representations of backoffs.
BackOffAdapter() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.BackOffAdapter
 
BAD_RECORD_TAG - Static variable in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
 
BadRecord - Class in org.apache.beam.sdk.transforms.errorhandling
 
BadRecord() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord
 
BadRecord.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
 
BadRecord.Failure - Class in org.apache.beam.sdk.transforms.errorhandling
 
BadRecord.Failure.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
 
BadRecord.Record - Class in org.apache.beam.sdk.transforms.errorhandling
 
BadRecord.Record.Builder - Class in org.apache.beam.sdk.transforms.errorhandling
 
BadRecordErrorHandler(PTransform<PCollection<BadRecord>, OutputT>, Pipeline) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.BadRecordErrorHandler
Constructs a new ErrorHandler for handling BadRecords.
BadRecordRouter - Interface in org.apache.beam.sdk.transforms.errorhandling
 
BadRecordRouter.RecordingBadRecordRouter - Class in org.apache.beam.sdk.transforms.errorhandling
 
BadRecordRouter.ThrowingBadRecordRouter - Class in org.apache.beam.sdk.transforms.errorhandling
 
bag() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a BagState, optimized for adding values frequently and occasionally retrieving all the values that have been added.
bag(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.bag(), but with an element coder explicitly supplied.
BagState<T> - Interface in org.apache.beam.sdk.state
A ReadableState cell containing a bag of values.
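A brief sketch of declaring and reading a bag state cell in a stateful DoFn, assuming a keyed KV<String, String> input.

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.state.BagState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class BufferFn extends DoFn<KV<String, String>, String> {
      @StateId("buffer")
      private final StateSpec<BagState<String>> bufferSpec = StateSpecs.bag(StringUtf8Coder.of());

      @ProcessElement
      public void process(
          @Element KV<String, String> element,
          @StateId("buffer") BagState<String> buffer,
          OutputReceiver<String> out) {
        buffer.add(element.getValue());      // frequent, cheap append
        for (String value : buffer.read()) { // occasional full read
          out.output(value);
        }
      }
    }
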
BagUserStateSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
BASE_IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
Identifier of the unspecified precision numeric type.
baseBackoff() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
 
baseBackoff(Duration) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
 
BaseBeamTable - Class in org.apache.beam.sdk.extensions.sql.meta
Basic implementation of BeamSqlTable.
BaseBeamTable() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
 
baseNameBuilder(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
 
baseUrl() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
 
BASIC_CONNECTION_INFO_VALIDATION_GROUP - Static variable in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
BasicAuthJcsmpSessionService - Class in org.apache.beam.sdk.io.solace.broker
A class that manages a connection to a Solace broker using basic authentication.
BasicAuthJcsmpSessionService() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
BasicAuthJcsmpSessionService.Builder - Class in org.apache.beam.sdk.io.solace.broker
 
BasicAuthJcsmpSessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
A factory for creating BasicAuthJcsmpSessionService instances.
BasicAuthJcsmpSessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
 
BasicAuthJcsmpSessionServiceFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
 
BasicAuthSempClient - Class in org.apache.beam.sdk.io.solace.broker
A class that manages REST calls to the Solace Element Management Protocol (SEMP) using basic authentication.
BasicAuthSempClient(String, String, String, String, SerializableSupplier<HttpRequestFactory>) - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
 
BasicAuthSempClientFactory - Class in org.apache.beam.sdk.io.solace.broker
A factory for creating BasicAuthSempClient instances.
BasicAuthSempClientFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
 
BasicAuthSempClientFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
 
BasicDynamoDBProvider - Class in org.apache.beam.sdk.io.aws.dynamodb
Basic implementation of AwsClientsProvider used by default in DynamoDBIO.
BatchContextImpl - Class in org.apache.beam.sdk.io.cdap.context
Base context class used by the CDAP Batch, Sink and Stream wrapper classes to provide common details.
BatchContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
batchGetDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type-safe builder for BatchGetDocumentsRequest operations.
BatchingParams() - Constructor for class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
BatchSideInputHandlerFactory - Class in org.apache.beam.runners.fnexecution.translation
BatchSideInputHandlerFactory.SideInputGetter - Interface in org.apache.beam.runners.fnexecution.translation
Returns the value for the side input with the given PCollection id from the runner.
BatchSinkContextImpl - Class in org.apache.beam.sdk.io.cdap.context
Class for creating context objects for CDAP classes of the batch sink type.
BatchSinkContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
 
batchSize() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
batchSize() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
BatchSourceContextImpl - Class in org.apache.beam.sdk.io.cdap.context
Class for creating context objects for CDAP classes of the batch source type.
BatchSourceContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
 
BatchStatefulParDoOverrides - Class in org.apache.beam.runners.dataflow
PTransformOverrideFactories that expand to correctly implement stateful ParDo using the window-unaware BatchViewOverrides.GroupByKeyAndSortValuesOnly to linearize processing per key.
BatchStatefulParDoOverrides() - Constructor for class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
 
BatchStatefulParDoOverrides.BatchStatefulDoFn<K,V,OutputT> - Class in org.apache.beam.runners.dataflow
A key-preserving DoFn that explodes an iterable that has been grouped by key and window.
BatchTransformTranslator<TransformT extends PTransform> - Interface in org.apache.beam.runners.twister2.translators
Batch TransformTranslator interface.
batchWrite(String, List<RecT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
Asynchronously trigger a batch write request (unless already in error state).
batchWrite(String, List<RecT>, boolean) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
Asynchronously trigger a batch write request (unless already in error state).
batchWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Write
Factory method to create a new type-safe builder for Write operations.
BEAM_INSTANCE_PROPERTY - Static variable in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
BeamAggregateProjectMergeRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
This rule is essentially a wrapper around Calcite's AggregateProjectMergeRule.
BeamAggregateProjectMergeRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
 
BeamAggregationRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace an Aggregate node.
BeamAggregationRel(RelOptCluster, RelTraitSet, RelNode, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>, WindowFn<Row, IntervalWindow>, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
BeamAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
Rule to detect the window/trigger settings.
BeamAggregationRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
 
BeamBasicAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
Aggregation rule that doesn't include projection.
BeamBasicAggregationRule(Class<? extends Aggregate>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
 
BeamBatchTSetEnvironment - Class in org.apache.beam.runners.twister2
This is a shell TSet environment which is used as a central driver model to fit what Beam expects.
BeamBatchTSetEnvironment() - Constructor for class org.apache.beam.runners.twister2.BeamBatchTSetEnvironment
 
BeamBatchWorker - Class in org.apache.beam.runners.twister2
The Twister2 worker that will execute the job logic once the job is submitted from the run method.
BeamBatchWorker() - Constructor for class org.apache.beam.runners.twister2.BeamBatchWorker
 
BeamBigQuerySqlDialect - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
 
BeamBigQuerySqlDialect(SqlDialect.Context) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
BeamBuiltinAggregations - Class in org.apache.beam.sdk.extensions.sql.impl.transform
Built-in aggregations functions for COUNT/MAX/MIN/SUM/AVG/VAR_POP/VAR_SAMP.
BeamBuiltinAggregations() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
 
BeamBuiltinAggregations.BitXOr<T extends java.lang.Number> - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
BeamBuiltinAnalyticFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.transform
Built-in Analytic Functions for the aggregation analytics functionality.
BeamBuiltinAnalyticFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
BeamBuiltinAnalyticFunctions.PositionAwareCombineFn<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
BeamBuiltinFunctionProvider - Class in org.apache.beam.sdk.extensions.sql.impl.udf
BeamBuiltinFunctionClass interface.
BeamBuiltinFunctionProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BeamBuiltinFunctionProvider
 
BeamBuiltinMethods - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
BeamBuiltinMethods.
BeamBuiltinMethods() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
BeamCalciteSchema - Class in org.apache.beam.sdk.extensions.sql.impl
Adapter from TableProvider to Schema.
BeamCalciteTable - Class in org.apache.beam.sdk.extensions.sql.impl
Adapter from BeamSqlTable to a calcite Table.
BeamCalcMergeRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
Planner rule to merge a BeamCalcRel with a BeamCalcRel.
BeamCalcMergeRule() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
 
BeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace Project and Filter node.
BeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
BeamCalcRel.WrappedList<T> - Class in org.apache.beam.sdk.extensions.sql.impl.rel
WrappedList translates List on access.
BeamCalcRel.WrappedMap<V> - Class in org.apache.beam.sdk.extensions.sql.impl.rel
WrappedMap translates Map on access.
BeamCalcRel.WrappedRow - Class in org.apache.beam.sdk.extensions.sql.impl.rel
WrappedRow translates Row on access.
BeamCalcRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A ConverterRule to replace Calc with BeamCalcRel.
BeamCalcSplittingRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A RelOptRule that converts a LogicalCalc into a chain of AbstractBeamCalcRel nodes via CalcRelSplitter.
BeamCalcSplittingRule(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
 
BeamCodegenUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
BeamCodegenUtils.
BeamCodegenUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamCodegenUtils
 
BeamCoGBKJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
A BeamJoinRel which performs a CoGBK join.
BeamCoGBKJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
 
BeamCoGBKJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
Rule to convert LogicalJoin node to BeamCoGBKJoinRel node.
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
This method is called by org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl.
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
A dummy cost computation based on a fixed multiplier.
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
 
beamComputeSelfCost(RelOptPlanner, BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
 
BeamCostModel - Class in org.apache.beam.sdk.extensions.sql.impl.planner
BeamCostModel represents the cost of a plan node.
BeamCostModel.Factory - Class in org.apache.beam.sdk.extensions.sql.impl.planner
Implementation of RelOptCostFactory that creates BeamCostModels.
BeamEnumerableConverter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace an Enumerable node.
BeamEnumerableConverter(RelOptCluster, RelTraitSet, RelNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
BeamEnumerableConverterRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A ConverterRule to convert a BeamRelNode to EnumerableConvention.
beamFilesystemArtifactDestinationProvider(String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
An ArtifactDestinationProvider that places new artifacts as files in a Beam filesystem.
BeamFlinkDataSetAdapter - Class in org.apache.beam.runners.flink.adapter
An adapter class that allows one to apply Apache Beam PTransforms directly to Flink DataSets.
BeamFlinkDataSetAdapter() - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
BeamFlinkDataSetAdapter(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataSetAdapter
 
BeamFlinkDataStreamAdapter - Class in org.apache.beam.runners.flink.adapter
An adapter class that allows one to apply Apache Beam PTransforms directly to Flink DataStreams.
BeamFlinkDataStreamAdapter() - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
BeamFlinkDataStreamAdapter(PipelineOptions) - Constructor for class org.apache.beam.runners.flink.adapter.BeamFlinkDataStreamAdapter
 
BeamFnDataGrpcMultiplexer - Class in org.apache.beam.sdk.fn.data
A gRPC multiplexer for a specific Endpoints.ApiServiceDescriptor.
BeamFnDataGrpcMultiplexer(Endpoints.ApiServiceDescriptor, OutboundObserverFactory, OutboundObserverFactory.BasicFactory<BeamFnApi.Elements, BeamFnApi.Elements>) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
BeamFnDataInboundObserver - Class in org.apache.beam.sdk.fn.data
Decodes BeamFnApi.Elements, partitioning them using the provided DataEndpoints and TimerEndpoints.
BeamFnDataInboundObserver.CloseException - Exception in org.apache.beam.sdk.fn.data
 
BeamFnDataOutboundAggregator - Class in org.apache.beam.sdk.fn.data
An outbound data buffering aggregator with size-based buffer and time-based buffer if corresponding options are set.
BeamFnDataOutboundAggregator(PipelineOptions, Supplier<String>, StreamObserver<BeamFnApi.Elements>, boolean) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
 
BeamIntersectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace an Intersect node.
BeamIntersectRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
BeamIntersectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
ConverterRule to replace Intersect with BeamIntersectRel.
BeamIOPushDownRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
BeamIOPushDownRule(RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
 
BeamIOSinkRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a TableModify node.
BeamIOSinkRel(RelOptCluster, RelOptTable, Prepare.CatalogReader, RelNode, TableModify.Operation, List<String>, List<RexNode>, boolean, BeamSqlTable, Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
BeamIOSinkRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A ConverterRule to replace TableModify with BeamIOSinkRel.
BeamIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a TableScan node.
BeamIOSourceRel(RelOptCluster, RelTraitSet, RelOptTable, BeamSqlTable, Map<String, String>, BeamCalciteTable) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
BeamJavaTypeFactory - Class in org.apache.beam.sdk.extensions.sql.impl.planner
Customized data type in Beam.
BeamJavaUdfCalcRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
A BeamCalcSplittingRule to replace Calc with BeamCalcRel.
BeamJoinAssociateRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
This is very similar to JoinAssociateRule.
BeamJoinPushThroughJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
This is essentially the same as JoinPushThroughJoinRule.
BeamJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
An abstract BeamRelNode to implement Join Rels.
BeamJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
BeamJoinTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
Collections of PTransform and DoFn used to perform the JOIN operation.
BeamJoinTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
 
BeamJoinTransforms.JoinAsLookup - Class in org.apache.beam.sdk.extensions.sql.impl.transform
Transform to execute Join as Lookup.
BeamKafkaCSVTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
A Kafka topic that saves records in CSV format.
BeamKafkaCSVTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
BeamKafkaCSVTable(Schema, String, List<String>, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
BeamKafkaTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
BeamKafkaTable represents a Kafka topic, as a source or a target.
BeamKafkaTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
BeamKafkaTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
BeamKafkaTable(Schema, List<TopicPartition>, String) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
BeamLogicalConvention - Enum in org.apache.beam.sdk.extensions.sql.impl.rel
Convention for Beam SQL.
BeamMatchRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a Match node.
BeamMatchRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, RexNode, boolean, boolean, Map<String, RexNode>, Map<String, RexNode>, RexNode, Map<String, ? extends SortedSet<String>>, boolean, ImmutableBitSet, RelCollation, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
 
BeamMatchRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
ConverterRule to replace Match with BeamMatchRel.
BeamMinusRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a Minus node.
BeamMinusRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
BeamMinusRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
ConverterRule to replace Minus with BeamMinusRel.
BeamPCollectionTable<InputT> - Class in org.apache.beam.sdk.extensions.sql.impl.schema
BeamPCollectionTable exposes a PCollection<Row> as a virtual table so that a downstream query can read from it directly.
BeamPCollectionTable(PCollection<InputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
 
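In user code this virtual-table behaviour is usually reached through SqlTransform; a hedged sketch, assuming rows is an existing PCollection<Row> with name and score fields:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> highScores(PCollection<Row> rows) {
      // The input PCollection is exposed to the query as the PCOLLECTION table.
      return rows.apply(
          SqlTransform.query("SELECT name, score FROM PCOLLECTION WHERE score > 10"));
    }
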
BeamPushDownIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
BeamPushDownIOSourceRel(RelOptCluster, RelTraitSet, RelOptTable, BeamSqlTable, List<String>, BeamSqlTableFilter, Map<String, String>, BeamCalciteTable) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
 
BeamRelDataTypeSystem - Class in org.apache.beam.sdk.extensions.sql.impl.planner
Customized data type in Beam.
BeamRelMetadataQuery - Class in org.apache.beam.sdk.extensions.sql.impl.planner
 
BeamRelNode - Interface in org.apache.beam.sdk.extensions.sql.impl.rel
A RelNode that can also give a PTransform that implements the expression.
beamRow2CsvLine(Row, CSVFormat) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
 
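A short sketch of converting a Row to a CSV line with this helper; the schema and values are made up.

    import org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;
    import org.apache.commons.csv.CSVFormat;

    static String toCsv() {
      Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();
      Row row = Row.withSchema(schema).addValues("Ada", 36).build();
      return BeamTableUtils.beamRow2CsvLine(row, CSVFormat.DEFAULT); // CSV-encoded line
    }
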
beamRowFromSourceRecordFn(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
 
BeamRowToBigtableMutation - Class in org.apache.beam.sdk.io.gcp.bigtable
Bigtable reference.
BeamRowToBigtableMutation(Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
 
BeamRowToBigtableMutation.ToBigtableRowFn - Class in org.apache.beam.sdk.io.gcp.bigtable
 
beamRowToIcebergRecord(Schema, Row) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
Converts a Beam Row to an Iceberg Record.
BeamRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for converting Beam Row objects to dynamic protocol messages, for use with the Storage Write API.
BeamRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
 
BeamRuleSets - Class in org.apache.beam.sdk.extensions.sql.impl.planner
RuleSet used in BeamQueryPlanner.
BeamRuleSets() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
 
beamSchemaFromJsonSchema(String) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
 
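A hedged sketch of deriving a Beam Schema from a JSON schema string with this utility; the JSON schema here is a toy example.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.utils.JsonUtils;

    static Schema fromJson() {
      String jsonSchema =
          "{\"type\": \"object\", \"properties\": {"
              + "\"name\": {\"type\": \"string\"},"
              + "\"age\": {\"type\": \"integer\"}}}";
      return JsonUtils.beamSchemaFromJsonSchema(jsonSchema);
    }
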
beamSchemaFromKafkaConnectSchema(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
 
beamSchemaToIcebergSchema(Schema) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
Converts a Beam Schema to an Iceberg Schema.
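A short sketch of the schema and row conversions listed above; the field names are illustrative, and the argument order of beamRowToIcebergRecord (Iceberg schema first) is an assumption.

    import org.apache.beam.sdk.io.iceberg.IcebergUtils;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    static void convert() {
      Schema beamSchema = Schema.builder().addStringField("id").addDoubleField("value").build();
      org.apache.iceberg.Schema icebergSchema =
          IcebergUtils.beamSchemaToIcebergSchema(beamSchema);
      Row beamRow = Row.withSchema(beamSchema).addValues("a-1", 3.14).build();
      org.apache.iceberg.data.Record icebergRecord =
          IcebergUtils.beamRowToIcebergRecord(icebergSchema, beamRow); // assumed argument order
    }
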
beamSchemaTypeFromKafkaType(Schema) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
 
BeamSetOperatorRelBase - Class in org.apache.beam.sdk.extensions.sql.impl.rel
Delegate for Set operators: BeamUnionRel, BeamIntersectRel and BeamMinusRel.
BeamSetOperatorRelBase(BeamRelNode, BeamSetOperatorRelBase.OpType, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
 
BeamSetOperatorRelBase.OpType - Enum in org.apache.beam.sdk.extensions.sql.impl.rel
Set operator type.
BeamSetOperatorsTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
Collections of PTransform and DoFn used to perform Set operations.
BeamSetOperatorsTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms
 
BeamSetOperatorsTransforms.BeamSqlRow2KvFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
Transform a BeamSqlRow to a KV<BeamSqlRow, BeamSqlRow>.
BeamSetOperatorsTransforms.SetOperatorFilteringDoFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
Filter function used for Set operators.
BeamSideInputJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
A BeamJoinRel which performs a side-input join.
BeamSideInputJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
 
BeamSideInputJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
Rule to convert LogicalJoin node to BeamSideInputJoinRel node.
BeamSideInputLookupJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
A BeamJoinRel which performs a lookup join.
BeamSideInputLookupJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
 
BeamSideInputLookupJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
Rule to convert LogicalJoin node to BeamSideInputLookupJoinRel node.
BeamSideInputLookupJoinRule() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
 
BeamSortRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a Sort node.
BeamSortRel(RelOptCluster, RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
BeamSortRel.BeamSqlRowComparator - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
BeamSortRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
ConverterRule to replace Sort with BeamSortRel.
BeamSqlCli - Class in org.apache.beam.sdk.extensions.sql
BeamSqlCli provides methods to execute Beam SQL with an interactive client.
BeamSqlCli() - Constructor for class org.apache.beam.sdk.extensions.sql.BeamSqlCli
 
BeamSqlDataCatalogExample - Class in org.apache.beam.sdk.extensions.sql.example
Example pipeline that uses Google Cloud Data Catalog to retrieve the table metadata.
BeamSqlDataCatalogExample() - Constructor for class org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample
 
BeamSqlDataCatalogExample.DCExamplePipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.example
Pipeline options to specify the query and the output for the example.
BeamSqlEnv - Class in org.apache.beam.sdk.extensions.sql.impl
Contains the metadata of tables/UDF functions, and exposes APIs to query/validate/optimize/translate SQL statements.
BeamSqlEnv.BeamSqlEnvBuilder - Class in org.apache.beam.sdk.extensions.sql.impl
BeamSqlEnv's Builder.
BeamSqlOutputToConsoleFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
A test PTransform to display output in the console.
BeamSqlOutputToConsoleFn(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
 
BeamSqlParser - Class in org.apache.beam.sdk.extensions.sql.impl.parser
 
BeamSqlPipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.impl
Options used to configure BeamSQL.
BeamSqlPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.sql.impl
AutoService registrar for BeamSqlPipelineOptions.
BeamSqlPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptionsRegistrar
 
BeamSqlRelUtils - Class in org.apache.beam.sdk.extensions.sql.impl.rel
Utilities for BeamRelNode.
BeamSqlRelUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
BeamSqlRow2KvFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
 
BeamSqlRowComparator(List<Integer>, List<Boolean>, List<Boolean>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel.BeamSqlRowComparator
 
BeamSqlSeekableTable - Interface in org.apache.beam.sdk.extensions.sql
A seekable table converts a JOIN operator to an inline lookup.
BeamSqlTable - Interface in org.apache.beam.sdk.extensions.sql.meta
This interface defines a Beam Sql Table.
BeamSqlTableFilter - Interface in org.apache.beam.sdk.extensions.sql.meta
This interface defines Beam SQL Table Filter.
BeamSqlUdf - Interface in org.apache.beam.sdk.extensions.sql
Interface to create a UDF in Beam SQL.
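A minimal sketch of a scalar UDF implementing this interface and its registration through SqlTransform; the function name and body are made up.

    import org.apache.beam.sdk.extensions.sql.BeamSqlUdf;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // The UDF contract: implement BeamSqlUdf and provide a public static eval method.
    public class CubicUdf implements BeamSqlUdf {
      public static Double eval(Double x) {
        return x * x * x;
      }
    }

    // rows is an existing PCollection<Row> with a DOUBLE column named val.
    PCollection<Row> result =
        rows.apply(
            SqlTransform.query("SELECT CUBIC(val) AS val_cubed FROM PCOLLECTION")
                .registerUdf("CUBIC", CubicUdf.class));
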
BeamSqlUnparseContext - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
 
BeamSqlUnparseContext(IntFunction<SqlNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
 
BeamTableFunctionScanRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace TableFunctionScan.
BeamTableFunctionScanRel(RelOptCluster, RelTraitSet, List<RelNode>, RexNode, Type, RelDataType, Set<RelColumnMapping>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
 
BeamTableFunctionScanRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
This is the converter rule that converts a Calcite TableFunctionScan to a Beam TableFunctionScanRel.
BeamTableStatistics - Class in org.apache.beam.sdk.extensions.sql.impl
This class stores row count statistics.
BeamTableUtils - Class in org.apache.beam.sdk.extensions.sql.impl.schema
Utility methods for working with BeamTable.
BeamTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
 
BeamUncollectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to implement an uncorrelated Uncollect, aka UNNEST.
BeamUncollectRel(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
BeamUncollectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A ConverterRule to replace Uncollect with BeamUncollectRel.
BeamUnionRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a Union.
BeamUnionRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
BeamUnionRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A ConverterRule to replace Union with BeamUnionRel.
BeamUnnestRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to implement UNNEST, supporting specifically only Correlate with Uncollect.
BeamUnnestRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, List<Integer>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
BeamUnnestRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A ConverterRule to replace a Correlate with Uncollect with BeamUnnestRel.
BeamValuesRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a Values node.
BeamValuesRel(RelOptCluster, RelDataType, ImmutableList<ImmutableList<RexLiteral>>, RelTraitSet) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
BeamValuesRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
ConverterRule to replace Values with BeamValuesRel.
BeamWindowRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
BeamRelNode to replace a Window node.
BeamWindowRel(RelOptCluster, RelTraitSet, RelNode, List<RexLiteral>, RelDataType, List<Window.Group>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
 
BeamWindowRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A ConverterRule to replace Window with BeamWindowRel.
BeamWorkerStatusGrpcService - Class in org.apache.beam.runners.fnexecution.status
A Fn Status service which can collect run-time status information from SDK harnesses for debugging purposes.
BeamZetaSqlCalcMergeRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
Planner rule to merge a BeamZetaSqlCalcRel with a BeamZetaSqlCalcRel.
BeamZetaSqlCalcMergeRule() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
 
BeamZetaSqlCalcRel - Class in org.apache.beam.sdk.extensions.sql.zetasql
BeamRelNode to replace Project and Filter node based on the ZetaSQL expression evaluator.
BeamZetaSqlCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
 
BeamZetaSqlCalcRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
BeamZetaSqlCalcSplittingRule - Class in org.apache.beam.sdk.extensions.sql.zetasql
A BeamCalcSplittingRule that converts a LogicalCalc to a chain of BeamZetaSqlCalcRel and/or BeamCalcRel via CalcRelSplitter.
BeamZetaSqlCatalog - Class in org.apache.beam.sdk.extensions.sql.zetasql
Catalog for registering tables and functions.
BeamZetaSqlUncollectRel - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
BeamRelNode to implement an uncorrelated ZetaSqlUnnest, aka UNNEST.
BeamZetaSqlUncollectRel(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
 
BeamZetaSqlUncollectRule - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
A ConverterRule to replace ZetaSqlUnnest with BeamZetaSqlUncollectRel.
BeamZetaSqlUnnestRel - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
BeamRelNode to implement UNNEST, supporting specifically only Correlate with ZetaSqlUnnest.
BeamZetaSqlUnnestRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, List<Integer>) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
 
BeamZetaSqlUnnestRule - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
A ConverterRule to replace Correlate ZetaSqlUnnest with BeamZetaSqlUnnestRel.
beforeProcessing(PipelineOptions) - Method in interface org.apache.beam.sdk.harness.JvmInitializer
Implement beforeProcessing to run some custom initialization after basic services such as logging, but before data processing begins.
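A hedged sketch of a JvmInitializer registered via AutoService; the class name and initialization body are placeholders.

    import com.google.auto.service.AutoService;
    import org.apache.beam.sdk.harness.JvmInitializer;
    import org.apache.beam.sdk.options.PipelineOptions;

    @AutoService(JvmInitializer.class)
    public class MyJvmInitializer implements JvmInitializer {
      @Override
      public void beforeProcessing(PipelineOptions options) {
        // Placeholder: runs after basic services (e.g. logging) are available
        // but before any data processing starts on the worker.
        System.setProperty("example.initialized", "true");
      }
    }
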
beforeStart(ClientCallStreamObserver<RespT>) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
begin() - Method in class org.apache.beam.sdk.Pipeline
Returns a PBegin owned by this Pipeline.
beginningOnDay(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
beginningOnDay(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
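A small sketch of monthly calendar windows anchored on a day of the month via beginningOnDay; the day and input collection are illustrative.

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // events is an existing timestamped PCollection<String>.
    PCollection<String> windowed =
        events.apply(Window.into(CalendarWindows.months(1).beginningOnDay(10)));
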
benchmarkHadoopLineReader(TextSourceBenchmark.Data) - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
 
benchmarkTextSource(TextSourceBenchmark.Data) - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
 
BIG_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
BIG_QUERY_INSERT_ERROR_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
BigDecimalCoder - Class in org.apache.beam.sdk.coders
A BigDecimalCoder encodes a BigDecimal as an integer scale encoded with VarIntCoder and a BigInteger encoded using BigIntegerCoder.
BigDecimalConverter - Class in org.apache.beam.sdk.extensions.sql.impl.utils
Provides converters from BigDecimal to other numeric types based on the input Schema.TypeName.
BigDecimalConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
 
bigdecimals() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for BigDecimal.
BigEndianIntegerCoder - Class in org.apache.beam.sdk.coders
A BigEndianIntegerCoder encodes Integers in 4 bytes, big-endian.
BigEndianLongCoder - Class in org.apache.beam.sdk.coders
A BigEndianLongCoder encodes Longs in 8 bytes, big-endian.
BigEndianShortCoder - Class in org.apache.beam.sdk.coders
A BigEndianShortCoder encodes Shorts in 2 bytes, big-endian.
BigIntegerCoder - Class in org.apache.beam.sdk.coders
A BigIntegerCoder encodes a BigInteger as a byte array containing the big endian two's-complement representation, encoded via ByteArrayCoder.
bigintegers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for BigInteger.
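A quick sketch of using these type descriptors with MapElements; the parsing lambda and input collection are illustrative.

    import java.math.BigDecimal;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // amounts is an existing PCollection<String> of decimal literals.
    PCollection<BigDecimal> parsed =
        amounts.apply(
            MapElements.into(TypeDescriptors.bigdecimals())
                .via((String s) -> new BigDecimal(s)));
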
BIGQUERY - Static variable in class org.apache.beam.sdk.managed.Managed
 
BIGQUERY_EARLY_ROLLOUT_REGION - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
BIGQUERY_JOB_TEMPLATE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Template for BigQuery jobs created by BigQueryIO.
BigqueryClient - Class in org.apache.beam.sdk.io.gcp.testing
A wrapper class for making BigQuery API calls.
BigqueryClient(String) - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
A CoderProviderRegistrar for standard types used with BigQueryIO.
BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
BigQueryDirectReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
BigQueryDirectReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
BigQueryDirectReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
Configuration for reading from BigQuery with Storage Read API.
BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryDlqProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
 
BigQueryExportReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery
Configuration for reading from BigQuery.
BigQueryExportReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
 
BigQueryExportReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryExportReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of TypedSchemaTransformProvider for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration.
BigQueryExportReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
 
BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of SchemaTransform for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration.
BigQueryFileLoadsSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
An implementation of TypedSchemaTransformProvider for BigQuery write jobs configured using BigQueryWriteConfiguration.
BigQueryFileLoadsSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
 
BigQueryFileLoadsSchemaTransformProvider.BigQueryFileLoadsSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryFilter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
 
BigQueryFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
 
BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
A set of helper functions and classes used by BigQueryIO.
BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
BigQueryInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
Model definition for BigQueryInsertError.
BigQueryInsertError(TableRow, TableDataInsertAllResponse.InsertErrors, TableReference) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
BigQueryInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder that encodes BigQuery BigQueryInsertError objects.
BigQueryInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
PTransforms for reading and writing BigQuery tables.
BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
Implementation of BigQueryIO.read().
BigQueryIO.TypedRead<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
BigQueryIO.TypedRead.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
Determines the method used to read data from BigQuery.
BigQueryIO.TypedRead.QueryPriority - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the priority of a query.
BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
Implementation of BigQueryIO.write().
BigQueryIO.Write.CreateDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery create disposition strings.
BigQueryIO.Write.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
Determines the method used to insert data in BigQuery.
BigQueryIO.Write.SchemaUpdateOption - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery schema update options strings.
BigQueryIO.Write.WriteDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery write disposition strings.
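A hedged end-to-end sketch of the read and write entry points above; the project, dataset, table names and schema are placeholders.

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();

    // Read: placeholder source table.
    PCollection<TableRow> rows =
        p.apply(BigQueryIO.readTableRows().from("my-project:my_dataset.source_table"));

    // Write: placeholder destination, schema, and dispositions.
    TableSchema schema =
        new TableSchema()
            .setFields(Arrays.asList(new TableFieldSchema().setName("name").setType("STRING")));
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.target_table")
            .withSchema(schema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    p.run().waitUntilFinish();
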
BigQueryIOTranslation - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryIOTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation
 
BigQueryIOTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryIOTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigqueryMatcher - Class in org.apache.beam.sdk.io.gcp.testing
A matcher to verify data in BigQuery by running a given query and comparing the content's checksum.
BigqueryMatcher.TableAndQuery - Class in org.apache.beam.sdk.io.gcp.testing
 
BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
Properties needed when using Google BigQuery with the Apache Beam SDK.
BigQuerySchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of SchemaIOProvider for reading and writing to BigQuery with BigQueryIO.
BigQuerySchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
 
BigQuerySchemaRetrievalException - Exception in org.apache.beam.sdk.io.gcp.bigquery
Exception to signal that BigQuery schema retrieval failed.
BigQuerySchemaTransformTranslation - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQuerySchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation
 
BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQuerySchemaTransformTranslation.ReadWriteRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryServices - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface for real, mock, or fake implementations of Cloud BigQuery services.
BigQueryServices.BigQueryServerStream<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
Container for reading data from streaming endpoints.
BigQueryServices.DatasetService - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface to get, create and delete Cloud BigQuery datasets and tables.
BigQueryServices.DatasetService.TableMetadataView - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryServices.JobService - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface for the Cloud BigQuery load service.
BigQueryServices.StorageClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface representing a client object for making calls to the BigQuery Storage API.
BigQueryServices.StreamAppendClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface for appending records to a Storage API write stream.
BigQueryServices.WriteStreamService - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface to get, create and flush Cloud BigQuery STORAGE API write streams.
BigQueryServicesImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of BigQueryServices that actually communicates with the cloud BigQuery service.
BigQueryServicesImpl() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
BigQueryServicesImpl.DatasetServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryServicesImpl.WriteStreamServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQuerySinkMetrics - Class in org.apache.beam.sdk.io.gcp.bigquery
Helper class to create per-worker metrics for BigQuery sink stages.
BigQuerySinkMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
BigQuerySinkMetrics.RpcMethod - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryStorageApiInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryStorageApiInsertError(TableRow) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
BigQueryStorageApiInsertError(TableRow, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
BigQueryStorageApiInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryStorageApiInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
BigQueryStorageReadSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator
 
BigQueryStorageTableSource<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
A Source representing reading from a table.
BigQueryStorageWriteApiSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
An implementation of TypedSchemaTransformProvider for BigQuery Storage Write API jobs configured via BigQueryWriteConfiguration.
BigQueryStorageWriteApiSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
A SchemaTransform for BigQuery Storage Write API, configured with BigQueryWriteConfiguration and instantiated by BigQueryStorageWriteApiSchemaTransformProvider.
BigQueryTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
BigQuery table provider.
BigQueryTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
 
BigQueryUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for BigQuery related operations.
BigQueryUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
BigQueryUtils.ConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
Options for how to convert BigQuery data to Beam data.
BigQueryUtils.ConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
BigQueryUtils.ConversionOptions.TruncateTimestamps - Enum in org.apache.beam.sdk.io.gcp.bigquery
Controls whether timestamps are lossily truncated to millisecond precision, or whether the conversion fails when truncation would be required.
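A minimal sketch of opting into truncation; the builder setter and enum constant names below are assumptions inferred from this index rather than verified signatures.

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions;

    // Truncate lossily instead of failing the conversion
    // (setTruncateTimestamps and TruncateTimestamps.TRUNCATE are assumed names).
    ConversionOptions options =
        ConversionOptions.builder()
            .setTruncateTimestamps(ConversionOptions.TruncateTimestamps.TRUNCATE)
            .build();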
BigQueryUtils.SchemaConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
Options for how to convert BigQuery schemas to Beam schemas.
BigQueryUtils.SchemaConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
BigQueryWriteConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
Configuration for writing to BigQuery with SchemaTransforms.
BigQueryWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
BigQueryWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
BigQueryWriteConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryWriteConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
A BigQuery Write SchemaTransformProvider that routes to either BigQueryFileLoadsSchemaTransformProvider or BigQueryStorageWriteApiSchemaTransformProvider.
BigQueryWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
 
BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryWriteSchemaTransformTranslator() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator
 
BigtableChangeStreamAccessor - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
This is likely a temporary solution pending the larger migration from cloud-bigtable-client-core to java-bigtable.
BigtableChangeStreamTestOptions - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams
 
BigtableClientOverride - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Override the configuration of Cloud Bigtable data and admin client.
BigtableConfig - Class in org.apache.beam.sdk.io.gcp.bigtable
Configuration for a Cloud Bigtable client.
BigtableConfig() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
Transforms for reading from and writing to Google Cloud Bigtable.
BigtableIO.ExistingPipelineOptions - Enum in org.apache.beam.sdk.io.gcp.bigtable
Overwrite options that determine what to do when a change stream name is reused and metadata for that change stream name already exists.
BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that reads from Google Cloud Bigtable.
BigtableIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.bigtable
 
BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that writes to Google Cloud Bigtable.
BigtableIO.WriteWithResults - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that writes to Google Cloud Bigtable and emits a BigtableWriteResult for each batch written.
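A minimal sketch of reading with BigtableIO; the project, instance, and table ids are placeholders, and an existing Pipeline p is assumed.

    import com.google.bigtable.v2.Row;
    import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read every row of a Bigtable table (ids are hypothetical).
    PCollection<Row> rows =
        p.apply(BigtableIO.read()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table"));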
BigtableReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
BigtableReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
BigtableReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
Configuration for reading from Bigtable.
BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
BigtableReadSchemaTransformProvider.BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
 
BigtableRowToBeamRow() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableRowToBeamRow
 
BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
Bigtable reference.
BigtableRowToBeamRow(Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
 
BigtableRowToBeamRowFlat - Class in org.apache.beam.sdk.io.gcp.bigtable
Bigtable reference.
BigtableRowToBeamRowFlat(Schema, Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
 
BigtableTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
 
BigtableTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
BigtableTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
 
BigtableUtils - Class in org.apache.beam.sdk.io.gcp.testing
 
BigtableUtils() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
BigtableWriteResult - Class in org.apache.beam.sdk.io.gcp.bigtable
The result of writing a batch of rows to Bigtable.
BigtableWriteResult() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
 
BigtableWriteResultCoder - Class in org.apache.beam.sdk.io.gcp.bigtable
A coder for BigtableWriteResult.
BigtableWriteResultCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
BigtableWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
BigtableWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
BigtableWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
Configuration for writing to Bigtable.
BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
 
BinaryCombineDoubleFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
BinaryCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
BinaryCombineIntegerFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
BinaryCombineLongFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
bind(String, StateBinder) - Method in interface org.apache.beam.sdk.state.StateSpec
For internal use only; no backwards-compatibility guarantees.
bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindMultimap(String, StateSpec<MultimapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindOrderedList(String, StateSpec<OrderedListState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in interface org.apache.beam.sdk.state.StateBinder
Bind to a watermark StateSpec.
BIT_XOR - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
BitSetCoder - Class in org.apache.beam.sdk.coders
Coder for BitSet.
BitXOr() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
 
BlackholeOutput() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.BlackholeOutput
 
BlobstoreClientBuilderFactory - Interface in org.apache.beam.sdk.io.azure.options
Construct BlobServiceClientBuilder from Azure pipeline options.
BlobstoreOptions - Interface in org.apache.beam.sdk.io.azure.options
Options used to configure Microsoft Azure Blob Storage.
Block() - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.Block
 
BlockBasedReader(BlockBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
BlockBasedSource<T> - Class in org.apache.beam.sdk.io
A BlockBasedSource is a FileBasedSource where a file consists of blocks of records.
BlockBasedSource(String, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedSource based on a file name or pattern.
BlockBasedSource(String, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
BlockBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
BlockBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
BlockBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedSource for a single file.
BlockBasedSource.Block<T> - Class in org.apache.beam.sdk.io
A Block represents a block of records that can be read.
BlockBasedSource.BlockBasedReader<T> - Class in org.apache.beam.sdk.io
A Reader that reads records from a BlockBasedSource.
BlockingCommitterImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
BlockTracker(OffsetRange, long, long) - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
 
BOOL - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
BOOLEAN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
BOOLEAN - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of boolean fields.
BooleanCoder - Class in org.apache.beam.sdk.coders
A Coder for Boolean.
BooleanCoder() - Constructor for class org.apache.beam.sdk.coders.BooleanCoder
 
booleans() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Boolean.
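A minimal sketch combining TypeDescriptors.booleans() with BooleanCoder; an existing Pipeline p is assumed.

    import org.apache.beam.sdk.coders.BooleanCoder;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Map integers to booleans and set an explicit coder for the result.
    PCollection<Boolean> isEven =
        p.apply(Create.of(1, 2, 3))
         .apply(MapElements.into(TypeDescriptors.booleans())
             .via((Integer x) -> x % 2 == 0))
         .setCoder(BooleanCoder.of());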
booleanToByteArray(boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
Bounded(SparkContext, BoundedSource<T>, SerializablePipelineOptions, String) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
BOUNDED_UNKNOWN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
BoundedDatasetFactory - Class in org.apache.beam.runners.spark.structuredstreaming.io
 
BoundedReader() - Constructor for class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
BoundedReadFromUnboundedSource<T> - Class in org.apache.beam.sdk.io
PTransform that reads a bounded amount of data from an UnboundedSource, limited by a maximum number of elements, a maximum read time, or both.
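A minimal sketch of obtaining one via Read.from(...); an existing UnboundedSource<String, ?> named source and a Pipeline p are assumed.

    import org.apache.beam.sdk.io.Read;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Cap the unbounded read at 1,000 elements or 10 minutes, whichever comes first.
    PCollection<String> sample =
        p.apply(Read.from(source)
            .withMaxNumRecords(1000)
            .withMaxReadTime(Duration.standardMinutes(10)));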
BoundedSource<T> - Class in org.apache.beam.sdk.io
A Source that reads a finite amount of input and, because of that, supports some additional operations.
BoundedSource() - Constructor for class org.apache.beam.sdk.io.BoundedSource
 
BoundedSource.BoundedReader<T> - Class in org.apache.beam.sdk.io
A Reader that reads a bounded amount of input and supports some additional operations, such as progress estimation and dynamic work rebalancing.
BoundedSourceP<T> - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for reading from a bounded Beam source.
BoundedWindow - Class in org.apache.beam.sdk.transforms.windowing
A BoundedWindow represents window information assigned to data elements.
BoundedWindow() - Constructor for class org.apache.beam.sdk.transforms.windowing.BoundedWindow
 
boxIfPrimitive(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
broadcast(T) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
broadcast(JavaSparkContext) - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
BrokerResponse - Class in org.apache.beam.sdk.io.solace.broker
 
BrokerResponse(int, String, InputStream) - Constructor for class org.apache.beam.sdk.io.solace.broker.BrokerResponse
 
bucketAccessible(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Returns whether the GCS bucket exists and is accessible.
bucketOwner(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Returns the project number of the project which owns this bucket.
BufferedExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
Sorter that uses in-memory sorting until the values no longer fit into memory, and then falls back to external sorting.
BufferedExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
Contains configuration for the sorter.
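A minimal sketch of the usual pairing with SortValues from the same extension; the key and value types are placeholders, and a Pipeline p is assumed.

    import org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter;
    import org.apache.beam.sdk.extensions.sorter.SortValues;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.values.KV;

    // Group by primary key, then sort each group by secondary key,
    // spilling to external storage when values no longer fit in memory.
    p.apply(Create.of(
            KV.of("user", KV.of("2024-01-02", 1)),
            KV.of("user", KV.of("2024-01-01", 2))))
     .apply(GroupByKey.<String, KV<String, Integer>>create())
     .apply(SortValues.<String, String, Integer>create(BufferedExternalSorter.options()));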
BufferingStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
A thread-safe StreamObserver which uses a bounded queue to pass elements to a processing thread responsible for interacting with the underlying CallStreamObserver.
BufferingStreamObserver(Phaser, CallStreamObserver<T>, ExecutorService, int) - Constructor for class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
build() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
build() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Builds an instance of BeamSqlEnv from the preset fields.
build() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
 
build() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
 
build() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
 
build() - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
 
build() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
 
build() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
 
build() - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
 
build() - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
build() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
Builds a BigQueryWriteConfiguration instance.
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Create a new instance of RpcQosOptions from the current builder state.
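A minimal sketch of building customized QoS options; newBuilder() and withMaxAttempts(int) are assumed names based on this entry rather than verified signatures.

    import org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions;

    // Build Firestore RPC QoS options with a custom retry budget
    // (newBuilder() and withMaxAttempts(int) are assumptions).
    RpcQosOptions qos = RpcQosOptions.newBuilder()
        .withMaxAttempts(3)
        .build();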
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Builds a PartitionMetadata from the given fields.
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
 
build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
 
build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
 
build() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
build() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService.Builder
 
build() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
 
build() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
 
build() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
 
build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
 
build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
 
build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
 
build() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
build() - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
 
build() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
build() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
 
build(String) - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.MetricNameBuilder
 
build() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
build() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.io.Failure.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
 
build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
 
build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
 
build() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
 
build() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
 
build() - Method in class org.apache.beam.sdk.values.Row.Builder
 
build() - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
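A minimal sketch of building a Row with its fluent builder against an inline Schema.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    // Values passed to addValues must match the schema's field order and types.
    Schema schema = Schema.builder()
        .addStringField("name")
        .addInt32Field("age")
        .build();
    Row row = Row.withSchema(schema)
        .addValues("Ada", 36)
        .build();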
 
buildBeamSqlNullableSchema(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
buildBeamSqlSchema(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
Create a RowsBuilder with the specified row type info.
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
 
buildBeamSqlTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Build a BeamSqlTable using the given table metadata.
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
 
buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
buildClient(AwsOptions, BuilderT, ClientConfiguration) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
Utility to directly build a client of type ClientT using a builder of type BuilderT.
buildDatasource() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
buildDatasource() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Builds SnowflakeBasicDataSource based on the current configuration.
builder() - Static method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.io.requestresponse.Monitoring
 
Builder() - Constructor for class org.apache.beam.io.requestresponse.Monitoring.Builder
 
builder() - Static method in class org.apache.beam.runners.jobsubmission.JobPreparation
 
builder() - Static method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
 
builder() - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
 
builder() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
builder() - Static method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
 
builder(TableProvider) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
Creates a builder with the default schema backed by the table provider.
builder() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
builder() - Static method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
Creates a ParameterListBuilder.
builder(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode
 
Builder(RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode.Builder
 
builder() - Static method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Creates a new uninitialized S3FileSystemConfiguration.Builder.
Builder() - Constructor for class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation
 
Builder() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
 
builder() - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Creates a new uninitialized S3FileSystemConfiguration.Builder.
Builder() - Constructor for class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
 
builder() - Static method in class org.apache.beam.sdk.io.cdap.Plugin
Creates a plugin builder instance.
Builder() - Constructor for class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
builder() - Static method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
Builder() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
Instantiates a BigQueryWriteConfiguration.Builder instance.
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
 
Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, boolean, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
builder(Dialect) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
 
Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
 
Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
builder() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
 
Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
Builder() - Constructor for class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.kinesis.WatermarkParameters
 
builder() - Static method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
 
builder() - Static method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
 
builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Record
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.solace.RetryCallableManager
 
Builder() - Constructor for class org.apache.beam.sdk.io.solace.RetryCallableManager.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
Builder() - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
builder() - Static method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
 
Builder() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
 
builder() - Static method in class org.apache.beam.sdk.metrics.MetricsFilter
 
Builder() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter.Builder
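A minimal sketch of querying pipeline metrics with a filter; the namespace and counter name are placeholders, and an existing PipelineResult named result is assumed.

    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    // Restrict the query to a single named counter.
    MetricsFilter filter = MetricsFilter.builder()
        .addNameFilter(MetricNameFilter.named("my-namespace", "my-counter"))
        .build();
    MetricQueryResults metrics = result.metrics().queryMetrics(filter);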
 
builder() - Static method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
 
Builder() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.io.Failure.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.Schema
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.Schema.Options
 
builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
 
builder() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
 
Builder() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.transforms.DoFnSchemaInformation.Builder
 
builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
 
Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
 
builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
 
Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
 
builder() - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
 
Builder() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.ParseResult.Builder
 
builderForType(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
 
builderFrom(S3Options) - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Creates a new S3FileSystemConfiguration.Builder with values initialized by the properties of s3Options.
buildExternal(DebeziumTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder
 
buildExternal(ExternalRead.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
 
buildExternal(ExternalWrite.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
 
buildExternal(SpannerTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
 
buildExternal(KinesisTransformRegistrar.ReadDataBuilder.Configuration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder
 
buildExternal(KinesisTransformRegistrar.WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.WriteBuilder
 
buildExternal(ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder
 
buildExternal(WriteBuilder.Configuration) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder
 
buildExternal(ConfigT) - Method in interface org.apache.beam.sdk.transforms.ExternalTransformBuilder
Builds the transform after it has been configured.
buildFrom(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
buildFrom(DescriptorProtos.FileDescriptorSet) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
buildFrom(Descriptors.FileDescriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
buildFrom(InputStream) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
 
buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
 
buildIOReader(PBegin) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Create a PCollection<Row> from the source.
buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Creates a PCollection<Row> from the source with the predicate and/or projection pushed down.
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
 
buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
 
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
 
buildIOReader(PBegin, BeamSqlTableFilter, List<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
 
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
 
buildIOWriter(PCollection<Row>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Creates an IO.write() instance to write to the target.
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
 
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
 
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
 
buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
 
buildPTransform() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
 
buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
 
buildPTransform(PTransform<PCollection<Row>, ? extends POutput>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
 
buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
 
buildReader() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
buildReader() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
Returns a schema-aware reader.
buildRows(Schema, List<?>) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
Convenient way to build BeamSqlRows.
buildSchemaWithAttributes(Schema, List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
Builds a new Schema by adding additional optional attributes and a map field to the provided schema.
buildTemporaryFilename(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Constructs a temporary file resource given the temporary directory and a filename.
buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
 
buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using AvroIO.Write, and another for errored-out rows.
buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
 
buildTransform(FileReadSchemaTransformConfiguration) - Method in interface org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformFormatProvider
 
buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in interface org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProvider
Builds a PTransform that writes a Row PCollection and outputs the resulting PCollectionTuple with two tags, one for the file names and another for errored-out rows.
buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.JsonReadSchemaTransformFormatProvider
 
buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.JsonWriteSchemaTransformFormatProvider
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using TextIO.Write, and another for errored-out rows.
buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.LineReadSchemaTransformFormatProvider
 
buildTransform(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetReadSchemaTransformFormatProvider
 
buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetWriteSchemaTransformFormatProvider
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using ParquetIO.Sink and FileIO.Write, and another for errored-out rows.
buildTransform(FileWriteSchemaTransformConfiguration, Schema) - Method in class org.apache.beam.sdk.io.fileschematransform.XmlWriteSchemaTransformFormatProvider
Builds a PTransform that transforms a Row PCollection into a result PCollectionTuple with two tags, one for file names written using XmlIO.Sink and FileIO.Write, and another for errored-out rows.
buildWriter() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
buildWriter() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
Returns a schema-aware writer.
BUILTIN_AGGREGATOR_FACTORIES - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
 
BUILTIN_ANALYTIC_FACTORIES - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
BuiltinHashFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
Hash Functions.
BuiltinHashFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
 
BuiltinStringFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
BuiltinStringFunctions.
BuiltinStringFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
BuiltinTrigonometricFunctions - Class in org.apache.beam.sdk.extensions.sql.impl.udf
TrigonometricFunctions.
BuiltinTrigonometricFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
 
bulkIO() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
BulkIO() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
 
Bundle<T,CollectionT> - Interface in org.apache.beam.runners.local
An immutable collection of elements which are part of a PCollection.
BundleCheckpointHandler - Interface in org.apache.beam.runners.fnexecution.control
A handler which is invoked when the SDK returns BeamFnApi.DelayedBundleApplications as part of the bundle completion.
BundleCheckpointHandlers - Class in org.apache.beam.runners.fnexecution.control
Utility methods for creating BundleCheckpointHandlers.
BundleCheckpointHandlers() - Constructor for class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers
 
BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler<T> - Class in org.apache.beam.runners.fnexecution.control
A BundleCheckpointHandler which uses TimerInternals.TimerData and ValueState to reschedule BeamFnApi.DelayedBundleApplication.
BundleFinalizationHandler - Interface in org.apache.beam.runners.fnexecution.control
A handler for the runner when a finalization request has been received.
BundleFinalizationHandlers - Class in org.apache.beam.runners.fnexecution.control
Utility methods for creating BundleFinalizationHandlers.
BundleFinalizationHandlers() - Constructor for class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers
 
BundleFinalizationHandlers.InMemoryFinalizer - Class in org.apache.beam.runners.fnexecution.control
BundleProgressHandler - Interface in org.apache.beam.runners.fnexecution.control
A handler for bundle progress messages, both during bundle execution and on its completion.
BundleSplitHandler - Interface in org.apache.beam.runners.fnexecution.control
A handler which is invoked whenever an active bundle is split.
by(SerializableFunction<UserT, DestinationT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies how to partition elements into groups ("destinations").
by(Contextful<Contextful.Fn<UserT, DestinationT>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
By() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup.By
 
by(PredicateT) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that satisfy the given predicate.
by(PredicateT) - Static method in class org.apache.beam.sdk.transforms.Filter
Binary compatibility adapter for Filter.by(ProcessFunction).
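For reference, a minimal sketch of Filter.by, assuming an existing PCollection<String> named words (the collection name is illustrative); only elements for which the predicate returns true are kept.
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;

    // Keep only the non-empty strings; "words" is an assumed existing PCollection<String>.
    PCollection<String> nonEmpty =
        words.apply(Filter.by((String word) -> !word.isEmpty()));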
byFieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
Returns a transform that groups all elements in the input PCollection keyed by the fields specified.
byFieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
Returns a transform that groups all elements in the input PCollection keyed by the list of fields specified.
byFieldIds(Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
byFieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
Returns a transform that groups all elements in the input PCollection keyed by the list of fields specified.
byFieldNames(Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
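A minimal sketch of Group.byFieldNames, assuming an existing schema-aware PCollection named purchases with a userId field (both names are illustrative); the output is a PCollection of Rows grouped by that field.
    import org.apache.beam.sdk.schemas.transforms.Group;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Group all elements that share the same value of the "userId" field;
    // "purchases" and the field name are illustrative assumptions.
    PCollection<Row> grouped = purchases.apply(Group.byFieldNames("userId"));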
ByFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
byId(int, int, RetryConfiguration, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ErrT>>>, Function<ErrT, String>, Function<RecT, String>, Function<ErrT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
AsyncBatchWriteHandler that correlates records and results by id; all returned results represent errors.
byId(int, FluentBackoff, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ErrT>>>, Function<ErrT, String>, Function<RecT, String>, Function<ErrT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
AsyncBatchWriteHandler that correlates records and results by id; all returned results represent errors.
byKey() - Static method in class org.apache.beam.sdk.transforms.Redistribute
 
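A minimal sketch of Redistribute.byKey, assuming an existing PCollection<KV<String, Long>> named counts (the name is illustrative); the transform spreads downstream work across workers by key without changing the data.
    import org.apache.beam.sdk.transforms.Redistribute;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Redistribute keyed elements; "counts" is an assumed existing PCollection<KV<String, Long>>.
    PCollection<KV<String, Long>> redistributed = counts.apply(Redistribute.byKey());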
byPosition(int, int, RetryConfiguration, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>, Function<ResT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
AsyncBatchWriteHandler that correlates records and results by position in the respective list.
byPosition(int, FluentBackoff, AsyncBatchWriteHandler.Stats, BiFunction<String, List<RecT>, CompletableFuture<List<ResT>>>, Function<ResT, String>) - Static method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
AsyncBatchWriteHandler that correlates records and results by position in the respective list.
BYTE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of byte fields.
ByteArray - Class in org.apache.beam.runners.spark.util
Serializable byte array.
ByteArray(byte[]) - Constructor for class org.apache.beam.runners.spark.util.ByteArray
 
ByteArrayCoder - Class in org.apache.beam.sdk.coders
A Coder for byte[].
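A minimal sketch of attaching ByteArrayCoder to a created PCollection of byte arrays; the pipeline setup and elements are illustrative.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.ByteArrayCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    // Explicitly attach the byte[] coder to the created collection.
    PCollection<byte[]> payloads =
        p.apply(Create.of(new byte[] {1, 2, 3}).withCoder(ByteArrayCoder.of()));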
ByteArrayKey(byte[]) - Constructor for class org.apache.beam.runners.jet.Utils.ByteArrayKey
 
ByteBuddyUtils - Class in org.apache.beam.sdk.schemas.utils
 
ByteBuddyUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
 
ByteBuddyUtils.ConvertType - Class in org.apache.beam.sdk.schemas.utils
Given a Java type, returns the Java type expected for use with Row.
ByteBuddyUtils.ConvertValueForGetter - Class in org.apache.beam.sdk.schemas.utils
Takes a StackManipulation that returns a value.
ByteBuddyUtils.ConvertValueForSetter - Class in org.apache.beam.sdk.schemas.utils
Row is going to call the setter with its internal Java type, however the user object being set might have a different type internally.
ByteBuddyUtils.DefaultTypeConversionsFactory - Class in org.apache.beam.sdk.schemas.utils
 
ByteBuddyUtils.InjectPackageStrategy - Class in org.apache.beam.sdk.schemas.utils
A naming strategy for ByteBuddy classes.
ByteBuddyUtils.TransformingMap<K1,V1,K2,V2> - Class in org.apache.beam.sdk.schemas.utils
 
ByteBuddyUtils.TypeConversion<T> - Class in org.apache.beam.sdk.schemas.utils
 
ByteBuddyUtils.TypeConversionsFactory - Interface in org.apache.beam.sdk.schemas.utils
 
ByteBufferBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle
 
ByteCoder - Class in org.apache.beam.sdk.coders
A ByteCoder encodes Byte values in 1 byte using Java serialization.
ByteKey - Class in org.apache.beam.sdk.io.range
A class representing a key consisting of an array of bytes.
ByteKeyRange - Class in org.apache.beam.sdk.io.range
A class representing a range of ByteKeys.
ByteKeyRangeTracker - Class in org.apache.beam.sdk.io.range
ByteKeyRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
A RestrictionTracker for claiming ByteKeys in a ByteKeyRange in a monotonically increasing fashion.
Bytes() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Bytes
 
BYTES - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of bytes fields.
bytes() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Byte.
BytesBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle
 
bytesRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of bytes read by a source.
bytesReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of bytes read by a source split.
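A minimal sketch of the pre-defined source counters named above, used from inside a hypothetical DoFn; the element type and the per-element increment are illustrative assumptions.
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.SourceMetrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Hypothetical DoFn that increments the shared "bytes read" counter for each element.
    class TrackBytesReadFn extends DoFn<byte[], byte[]> {
      private final Counter bytesRead = SourceMetrics.bytesRead();

      @ProcessElement
      public void processElement(@Element byte[] record, OutputReceiver<byte[]> out) {
        bytesRead.inc(record.length); // count the size of every record seen
        out.output(record);
      }
    }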
BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
An estimator that provides an estimate of the byte throughput of the output elements.
BytesThroughputEstimator(SizeEstimator<T>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
 
BytesThroughputEstimator(SizeEstimator<T>, int, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
 
BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
An estimator that provides an estimate of the throughput of the output elements.
BytesThroughputEstimator(int, SizeEstimator<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
 
bytesToRowFn(SchemaProvider, TypeDescriptor<T>, ProcessFunction<byte[], ? extends T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
 
bytesToRowFn(SchemaProvider, TypeDescriptor<T>, Coder<? extends T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
 
byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
ByteStringCoder - Class in org.apache.beam.runners.fnexecution.wire
A duplicate of ByteStringCoder that uses the Apache Beam vendored protobuf.
ByteStringCoder - Class in org.apache.beam.sdk.extensions.protobuf
A Coder for ByteString objects based on their encoded Protocol Buffer form.
ByteStringOutput() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.ByteStringOutput
 
ByteStringOutputStreamBenchmark - Class in org.apache.beam.sdk.jmh.util
Benchmarks for ByteStringOutputStream.
ByteStringOutputStreamBenchmark() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
ByteStringOutputStreamBenchmark.NewVsCopy - Class in org.apache.beam.sdk.jmh.util
The benchmarks below detail the cost of creating a new buffer versus copying a subset of the existing one and reusing the larger one.
ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState - Class in org.apache.beam.sdk.jmh.util
 
ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState - Class in org.apache.beam.sdk.jmh.util
 
ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream - Class in org.apache.beam.sdk.jmh.util
 
ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream - Class in org.apache.beam.sdk.jmh.util
 
ByteStringRangeHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Helper functions to evaluate the completeness of a collection of ByteStringRanges.
ByteStringRangeHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
 
byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
bytesWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
Counter of bytes written to a sink.
ByteToElemFunction<V> - Class in org.apache.beam.runners.twister2.translators.functions
ByteToElem function.
ByteToElemFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
 
ByteToElemFunction(WindowedValue.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
 
ByteToWindowFunction<K,V> - Class in org.apache.beam.runners.twister2.translators.functions
ByteToWindow function.
ByteToWindowFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
 
ByteToWindowFunction(Coder<K>, WindowedValue.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
 
ByteToWindowFunctionPrimitive<K,V> - Class in org.apache.beam.runners.twister2.translators.functions
ByteToWindow function.
ByteToWindowFunctionPrimitive() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
 
ByteToWindowFunctionPrimitive(Coder<K>, WindowedValue.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
 

C

Cache - Class in org.apache.beam.io.requestresponse
Transforms for reading and writing request/response associations to a cache.
Cache() - Constructor for class org.apache.beam.io.requestresponse.Cache
 
Cache.Pair<RequestT,ResponseT> - Class in org.apache.beam.io.requestresponse
A simple POJO that holds both cache read and write PTransforms.
CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
CachedSideInputReader - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
SideInputReader that caches results for costly Materializations.
CachedSideInputReader - Class in org.apache.beam.runners.spark.util
SideInputReader that caches materialized views.
CachingFactory<CreatedT> - Class in org.apache.beam.sdk.schemas
A wrapper around a Factory that assumes the schema parameter never changes.
CachingFactory(Factory<CreatedT>) - Constructor for class org.apache.beam.sdk.schemas.CachingFactory
 
CalciteConnectionWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
Abstract wrapper for CalciteConnection to simplify extension.
CalciteConnectionWrapper(CalciteConnection) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
CalciteFactoryWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
Wrapper for CalciteFactory.
CalciteQueryPlanner - Class in org.apache.beam.sdk.extensions.sql.impl
The core component for handling a SQL statement, from explaining the execution plan to generating a Beam pipeline.
CalciteQueryPlanner(JdbcConnection, Collection<RuleSet>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
Called by BeamSqlEnv.instantiatePlanner() reflectively.
CalciteQueryPlanner.NonCumulativeCostImpl - Class in org.apache.beam.sdk.extensions.sql.impl
 
CalciteUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
Utility methods for Calcite related operations.
CalciteUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
CalciteUtils.TimeWithLocalTzType - Class in org.apache.beam.sdk.extensions.sql.impl.utils
A LogicalType corresponding to TIME_WITH_LOCAL_TIME_ZONE.
CalcRelSplitter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
CalcRelSplitter operates on a Calc with multiple RexCall sub-expressions that cannot all be implemented by a single concrete RelNode.
CalcRelSplitter(Calc, RelBuilder, CalcRelSplitter.RelType[]) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
Constructs a CalcRelSplitter.
CalcRelSplitter.RelType - Class in org.apache.beam.sdk.extensions.sql.impl.rel
Type of relational expression.
calculateRanges(PartitionT, PartitionT, Long) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcReadWithPartitionsHelper
Calculates the range of each partition from the lower and upper bounds and the number of partitions.
CalendarWindows - Class in org.apache.beam.sdk.transforms.windowing
A collection of WindowFns that windows values into calendar-based windows such as spans of days, months, or years.
CalendarWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.CalendarWindows
 
CalendarWindows.DaysWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows elements into periods measured by days.
CalendarWindows.MonthsWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows elements into periods measured by months.
CalendarWindows.YearsWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows elements into periods measured by years.
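A minimal sketch of calendar-based windowing, assuming an existing timestamped PCollection<String> named events (the name is illustrative); days(1) can be swapped for months or years.
    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // Window elements into one-day calendar windows; "events" is an assumed
    // existing timestamped PCollection<String>.
    PCollection<String> daily = events.apply(Window.into(CalendarWindows.days(1)));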
call(RequestT) - Method in interface org.apache.beam.io.requestresponse.Caller
Calls a Web API with the RequestT and returns a ResponseT.
call(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
 
call(K, Iterator<WindowedValue<KV<K, InputT>>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
 
Caller<RequestT,ResponseT> - Interface in org.apache.beam.io.requestresponse
Caller is the interface implemented by user custom code that makes API calls.
CallShouldBackoff<ResponseT> - Interface in org.apache.beam.io.requestresponse
Informs whether a call to an API should backoff.
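A minimal sketch of implementing the Caller interface listed above, with hypothetical request/response POJOs standing in for a real web API client.
    import java.io.Serializable;
    import org.apache.beam.io.requestresponse.Caller;

    // Hypothetical request/response types; a real Caller would wrap an HTTP or RPC client.
    class EchoRequest implements Serializable { String payload; }
    class EchoResponse implements Serializable { String payload; }

    class EchoCaller implements Caller<EchoRequest, EchoResponse> {
      @Override
      public EchoResponse call(EchoRequest request) {
        EchoResponse response = new EchoResponse();
        response.payload = request.payload; // stand-in for the remote call
        return response;
      }
    }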
cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
cancel() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
cancel() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
cancel() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
cancel() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
cancel() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
 
cancel() - Method in class org.apache.beam.runners.jet.JetPipelineResult
 
cancel(JobApi.CancelJobRequest, StreamObserver<JobApi.CancelJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
cancel() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Cancel the job.
cancel() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
cancel() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
 
cancel() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
 
cancel(Exception) - Method in class org.apache.beam.sdk.fn.CancellableQueue
Causes any pending and future CancellableQueue.put(T) and CancellableQueue.take() invocations to throw an exception.
cancel() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.BigQueryServerStream
Cancels the stream, releasing any client- and server-side resources.
cancel() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
 
cancel() - Method in interface org.apache.beam.sdk.PipelineResult
Cancels the pipeline execution.
CancellableQueue<T> - Class in org.apache.beam.sdk.fn
A simplified ThreadSafe blocking queue that can be cancelled freeing any blocked Threads and preventing future Threads from blocking.
CancellableQueue(int) - Constructor for class org.apache.beam.sdk.fn.CancellableQueue
Creates a ThreadSafe blocking queue with a maximum capacity.
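A minimal sketch of the CancellableQueue API named above; put and take may throw, so the surrounding method simply declares Exception.
    import org.apache.beam.sdk.fn.CancellableQueue;

    public class CancellableQueueSketch {
      public static void main(String[] args) throws Exception {
        CancellableQueue<String> queue = new CancellableQueue<>(10); // bounded capacity
        queue.put("element");        // blocks if the queue is already full
        String value = queue.take(); // blocks until an element is available
        // Fail any pending and future put/take calls, e.g. during shutdown.
        queue.cancel(new RuntimeException("shutting down"));
      }
    }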
cancelled() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
Report that the pipeline has been cancelled.
canConvertConvention(Convention) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
canImplement(LogicalCalc, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
Returns whether a relational expression can be implemented solely in a given CalcRelSplitter.RelType.
canImplement(RexFieldAccess) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
canImplement(RexDynamicParam) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
canImplement(RexLiteral) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
canImplement(RexCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
canImplement(RexNode, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
Returns whether this RelType can implement a given expression.
canImplement(RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
Returns whether this tester's RelType can implement a given program.
CannotProvideCoderException - Exception in org.apache.beam.sdk.coders
The exception thrown when a CoderRegistry or CoderProvider cannot provide a Coder that has been requested.
CannotProvideCoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(String, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(String, Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException.ReasonCode - Enum in org.apache.beam.sdk.coders
Indicates the reason that Coder inference failed.
canStopPolling(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
Called by the Watch transform to determine whether the given termination state signals that Watch should stop calling Watch.Growth.PollFn for the current input, regardless of whether the last Watch.Growth.PollResult was complete or incomplete.
canTranslate(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
Checks if a composite / primitive transform can be translated.
CassandraIO - Class in org.apache.beam.sdk.io.cassandra
An IO to read from and write to Apache Cassandra.
CassandraIO.MutationType - Enum in org.apache.beam.sdk.io.cassandra
Specify the mutation type: either write or delete.
CassandraIO.Read<T> - Class in org.apache.beam.sdk.io.cassandra
A PTransform to read data from Apache Cassandra.
CassandraIO.ReadAll<T> - Class in org.apache.beam.sdk.io.cassandra
A PTransform to read data from Apache Cassandra.
CassandraIO.Write<T> - Class in org.apache.beam.sdk.io.cassandra
A PTransform to mutate into Apache Cassandra.
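A minimal sketch of a CassandraIO read, assuming a hypothetical Person entity class mapped for the DataStax object mapper and a locally reachable cluster; the hosts, keyspace, and table names are illustrative.
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.SerializableCoder;
    import org.apache.beam.sdk.io.cassandra.CassandraIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    // "Person" is a hypothetical entity class; connection settings are illustrative.
    PCollection<Person> people =
        p.apply(
            CassandraIO.<Person>read()
                .withHosts(Arrays.asList("localhost"))
                .withPort(9042)
                .withKeyspace("beam_ks")
                .withTable("person")
                .withEntity(Person.class)
                .withCoder(SerializableCoder.of(Person.class)));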
Cast<T> - Class in org.apache.beam.sdk.schemas.transforms
Set of utilities for casting rows between schemas.
Cast() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast
 
Cast.CompatibilityError - Class in org.apache.beam.sdk.schemas.transforms
Describes compatibility errors during casting.
Cast.Narrowing - Class in org.apache.beam.sdk.schemas.transforms
Narrowing changes the type without a guarantee of preserving data.
Cast.Validator - Interface in org.apache.beam.sdk.schemas.transforms
Interface for statically validating casts.
Cast.Widening - Class in org.apache.beam.sdk.schemas.transforms
Widening changes to a type that can represent any possible value of the original type.
CAST_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
CastFunctionImpl - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
ZetaSQLCastFunctionImpl.
CastFunctionImpl() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
 
castNumber(Number, Schema.TypeName, Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
 
castRow(Row, Schema, Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
 
castValue(Object, Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
 
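A minimal sketch of the static castRow helper listed above, widening a single-field row; the schemas and value are illustrative.
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.transforms.Cast;
    import org.apache.beam.sdk.values.Row;

    Schema narrow = Schema.builder().addInt32Field("id").build();
    Schema wide = Schema.builder().addInt64Field("id").build();
    Row narrowRow = Row.withSchema(narrow).addValue(42).build();
    // Widening an INT32 field to INT64 preserves the value.
    Row wideRow = Cast.castRow(narrowRow, narrow, wide);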
catalog() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
 
catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
 
catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
catchUpToNow(boolean) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
For internal use only; no backwards-compatibility guarantees.
catchUpToNow - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
CdapIO - Class in org.apache.beam.sdk.io.cdap
A CdapIO is a transform for reading data from a source or writing data to a sink of a CDAP plugin.
CdapIO() - Constructor for class org.apache.beam.sdk.io.cdap.CdapIO
 
CdapIO.Read<K,V> - Class in org.apache.beam.sdk.io.cdap
A PTransform to read from CDAP source.
CdapIO.Write<K,V> - Class in org.apache.beam.sdk.io.cdap
A PTransform to write to CDAP sink.
cdapPluginObj - Variable in class org.apache.beam.sdk.io.cdap.Plugin
 
CELL_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
CEPCall - Class in org.apache.beam.sdk.extensions.sql.impl.cep
A CEPCall instance represents an operation (node) that contains an operator and a list of operands.
CEPFieldRef - Class in org.apache.beam.sdk.extensions.sql.impl.cep
A CEPFieldRef instance represents a node that points to a specified field in a Row.
CEPKind - Enum in org.apache.beam.sdk.extensions.sql.impl.cep
CEPKind corresponds to Calcite's SqlKind.
CEPLiteral - Class in org.apache.beam.sdk.extensions.sql.impl.cep
CEPLiteral represents a literal node.
CEPMeasure - Class in org.apache.beam.sdk.extensions.sql.impl.cep
The CEPMeasure class represents the Measures clause and contains information about output columns.
CEPMeasure(Schema, String, CEPOperation) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
 
CEPOperation - Class in org.apache.beam.sdk.extensions.sql.impl.cep
CEPOperation is the base class for the evaluation operations defined in the DEFINE syntax of MATCH_RECOGNIZE.
CEPOperation() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperation
 
CEPOperator - Class in org.apache.beam.sdk.extensions.sql.impl.cep
The CEPOperator records the operators (i.e.
CEPPattern - Class in org.apache.beam.sdk.extensions.sql.impl.cep
Core pattern class that stores the definition of a single pattern.
CEPUtils - Class in org.apache.beam.sdk.extensions.sql.impl.cep
Some utility methods for transforming Calcite's constructs into our own Beam constructs (for serialization purpose).
CEPUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
 
CF_CONTINUATION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_INITIAL_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_LOCK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_MISSING_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_PARENT_LOW_WATERMARKS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_PARENT_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_SHOULD_DELETE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CHANGE_SQN_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
CHANGE_STREAM_MUTATION_GC_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of ChangeStreamMutations that are initiated by garbage collection (not user initiated) identified during the execution of the Connector.
CHANGE_STREAM_MUTATION_USER_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of ChangeStreamMutations that are initiated by users (not garbage collection) identified during the execution of the Connector.
CHANGE_TYPE_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
changeStreamAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing individual ChangeStreamMutation in ReadChangeStreamPartitionDoFn.
ChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
This class is responsible for processing individual ChangeStreamRecord.
ChangeStreamAction(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ChangeStreamAction
Constructs ChangeStreamAction to process individual ChangeStreamRecord.
ChangeStreamContinuationTokenHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
 
ChangeStreamContinuationTokenHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
 
ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object to list and read stream partitions of a table.
ChangeStreamDao(BigtableDataClient, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
 
ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Responsible for making change stream queries for a given partition.
ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Class to aggregate metrics related functionality.
ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
 
ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
Class to aggregate metrics related functionality.
ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Constructs a ChangeStreamMetrics instance with the following metrics enabled by default.
ChangeStreamMetrics(Set<MetricName>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Constructs a ChangeStreamMetrics instance with the given metrics enabled.
changeStreamQuery(String, Timestamp, Timestamp, long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamDao
Performs a change stream query.
ChangeStreamRecord - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a Spanner Change Stream Record.
ChangeStreamRecordMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
This class is responsible for transforming a Struct to a List of ChangeStreamRecord models.
changeStreamRecordMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
Creates and returns a singleton instance of a mapper class capable of transforming a Struct into a List of ChangeStreamRecord subclasses.
ChangeStreamRecordMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Holds internal execution metrics / metadata for the processed ChangeStreamRecord.
ChangeStreamRecordMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
 
ChangeStreamResultSet - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Decorator class over a ResultSet that provides telemetry for the streamed records.
ChangeStreamResultSetMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Represents telemetry metadata gathered during the consumption of a change stream query.
ChangeStreamsConstants - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
Single place for defining the constants used in the Spanner.readChangeStreams() connector.
ChangeStreamsConstants() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
 
channelNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
CHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
CHAR_LENGTH - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
CHAR_LENGTH_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
characters() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Character.
charLength(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
check(RelNode) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall.JoinChecker
 
checkClientTrusted(X509Certificate[], String) - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
 
checkConfiguration(ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
Check if all necessary configuration is available to create clients.
checkConfiguration(ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
 
checkDone() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
This is to signal to the runner that this restriction has completed.
checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Checks if the restriction has been processed successfully.
checkDone() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Checks whether the restriction has been fully processed.
checkForAsyncFailure() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
Checks whether any failure happened asynchronously.
checkIfAnySubscriptionExists(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
CheckpointMarkImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
checkServerTrusted(X509Certificate[], String) - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
 
CheckStopReadingFn - Interface in org.apache.beam.sdk.io.kafka
 
CheckStopReadingFnWrapper - Class in org.apache.beam.sdk.io.kafka
 
checksum() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
An optional checksum to identify the contents of a file.
ChildPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
A child partition represents a new partition that should be queried.
ChildPartition(String, HashSet<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
Constructs a child partition, which will have its own token and the parents that it originated from.
ChildPartition(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
Constructs a child partition, which will have its own token and the parent that it originated from.
ChildPartitionsRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a ChildPartitionsRecord.
ChildPartitionsRecord(Timestamp, String, List<ChildPartition>, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
Constructs a child partitions record containing one or more child partitions.
childPartitionsRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class capable of processing ChildPartitionsRecords.
ChildPartitionsRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
CivilTimeEncoder - Class in org.apache.beam.sdk.io.gcp.bigquery
Encoder for TIME and DATETIME values, according to civil_time encoding.
classesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
Gets a map from Coder to a CloudObjectTranslator that can translate that Coder.
classesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
ClassLoaderFileSystem - Class in org.apache.beam.sdk.io
A read-only FileSystem implementation looking up resources using a ClassLoader.
ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar - Class in org.apache.beam.sdk.io
AutoService registrar for the ClassLoaderFileSystem.
ClassLoaderFileSystem.ClassLoaderResourceId - Class in org.apache.beam.sdk.io
 
ClassLoaderFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
 
classNamesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
Gets a map from the name returned by CloudObject.getClassName() to a translator that can convert into the equivalent Coder.
classNamesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
ClassWithSchema() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
 
CleanTmpFilesFromGcsFn(ValueProvider<String>, String) - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.CleanTmpFilesFromGcsFn
Creates an object that will remove temp files from the stage.
cleanup() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
cleanUpPrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Deletes all the metadata rows starting with the change stream name prefix, except for the detect-new-partition row, because it signals the existence of a pipeline with the change stream name.
CleanUpReadChangeStreamDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
 
CleanUpReadChangeStreamDoFn(DaoFactory) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
 
clear(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
Clears the bag user state for the given key and window.
clear() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
clear() - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
clear() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
clear() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
clear() - Method in interface org.apache.beam.sdk.state.State
Clear out the state location.
clear() - Method in interface org.apache.beam.sdk.state.Timer
Clears a timer.
clearCache() - Static method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
clearOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
clearOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
clearRange(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
Clear a timestamp-limited subrange of the list.
clearState(ReduceFn<K, T, Iterable<T>, W>.Context) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
clearWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
ClickHouseIO - Class in org.apache.beam.sdk.io.clickhouse
An IO to write to ClickHouse.
ClickHouseIO() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
 
ClickHouseIO.Write<T> - Class in org.apache.beam.sdk.io.clickhouse
A PTransform to write to ClickHouse.
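A minimal sketch of writing a schema-aware PCollection<Row> to ClickHouse; the input collection, JDBC URL, and table name are illustrative assumptions.
    import org.apache.beam.sdk.io.clickhouse.ClickHouseIO;
    import org.apache.beam.sdk.values.Row;

    // "rows" is an assumed existing schema-aware PCollection<Row>.
    rows.apply(
        ClickHouseIO.<Row>write("jdbc:clickhouse://localhost:8123/default", "my_table"));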
ClickHouseWriter - Class in org.apache.beam.sdk.io.clickhouse
Writes Rows and field values using ClickHousePipedOutputStream.
ClickHouseWriter() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseWriter
 
CLIENT_EXECUTION_TIMEOUT - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
clientBuffered(ExecutorService) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
Create a buffering OutboundObserverFactory for client-side RPCs with the specified ExecutorService and the default buffer size.
clientBuffered(ExecutorService, int) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
Create a buffering OutboundObserverFactory for client-side RPCs with the specified ExecutorService and buffer size.
ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws2.common
Factory to build and configure any AwsClientBuilder using a specific ClientConfiguration or the globally provided settings in AwsOptions as fallback.
ClientBuilderFactory.DefaultClientBuilder - Class in org.apache.beam.sdk.io.aws2.common
Default implementation of ClientBuilderFactory.
ClientConfiguration - Class in org.apache.beam.sdk.io.aws2.common
AWS client configuration.
ClientConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
 
ClientConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
 
ClientConfigurationFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.ClientConfigurationFactory
 
clientDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
Create the default OutboundObserverFactory for client-side RPCs, which uses basic unbuffered flow control.
Clock - Interface in org.apache.beam.runners.direct
Access to the current time.
clone() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
 
clone() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
clonesOf(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
 
close() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
 
close() - Method in class org.apache.beam.runners.flink.metrics.FileReporter
 
close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
 
close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
 
close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
 
close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.WrappedSdkHarnessClient
 
close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
 
close() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
Closes this bundle.
close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
Blocks until bundle processing is finished.
close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
 
close() - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
Deprecated.
 
close() - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
close() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
close() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
.
close() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
 
close() - Method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
 
close() - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
 
close() - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
 
close() - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
 
close() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
 
close() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
 
close() - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
close() - Method in class org.apache.beam.runners.portability.CloseableResource
Closes the underlying resource.
close(T) - Method in interface org.apache.beam.runners.portability.CloseableResource.Closer
 
close() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
close() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
 
close() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
 
close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
 
close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
 
close() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
 
close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
 
close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
close() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
 
close() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
close() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
close() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
.
close() - Method in interface org.apache.beam.sdk.fn.server.FnService
.
close() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
 
close() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
close() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Closes the channel and returns the bundle result.
close() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
Closes any ReadableByteChannel created for the current reader.
close() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
Close the client object.
close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Gracefully close the underlying netty channel.
close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Closes the current change stream ResultSet.
close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
close() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
close() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
Closes the message producer.
close() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
Closes the message receiver.
close() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
Gracefully closes the connection to the service.
close() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
 
close() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
 
close() - Method in class org.apache.beam.sdk.io.Source.Reader
Closes the reader.
close() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ThriftWriter
 
close() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
close() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
 
close() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
 
CloseableFnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
A receiver of streamed data that can be closed.
CloseableResource<T> - Class in org.apache.beam.runners.portability
An AutoCloseable that wraps a resource that needs to be cleaned up but does not implement AutoCloseable itself.
CloseableResource.CloseException - Exception in org.apache.beam.runners.portability
An exception that wraps errors thrown while a resource is being closed.
CloseableResource.Closer<T> - Interface in org.apache.beam.runners.portability
A function that knows how to clean up after a resource.
CloseableThrowingConsumer<ExceptionT extends java.lang.Exception,T> - Interface in org.apache.beam.sdk.function
A ThrowingConsumer that can be closed.
CLOSESTREAM_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of CloseStream messages identified during the execution of the Connector.
closeTo(double, double) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
CloudObject - Class in org.apache.beam.runners.dataflow.util
A representation of an arbitrary Java object to be instantiated by Dataflow workers.
cloudObjectClassName() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Gets the class name that will represent the CloudObject created by this CloudObjectTranslator.
cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
 
cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
 
CloudObjects - Class in org.apache.beam.runners.dataflow.util
Utilities for converting an object to a CloudObject.
CloudObjectTranslator<T> - Interface in org.apache.beam.runners.dataflow.util
A translator that takes an object and creates a CloudObject which can be converted back to the original object.
CloudPubsubTransforms - Class in org.apache.beam.sdk.io.gcp.pubsublite
A class providing transforms between Cloud Pub/Sub and Pub/Sub Lite message types.
CloudResourceManagerOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
Properties needed when using Google CloudResourceManager with the Apache Beam SDK.
CloudVision - Class in org.apache.beam.sdk.extensions.ml
Factory class for implementations of AnnotateImages.
CloudVision() - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision
 
CloudVision.AnnotateImagesFromBytes - Class in org.apache.beam.sdk.extensions.ml
Accepts ByteString (encoded image contents) with optional DoFn.SideInput with a Map of ImageContext to the image.
CloudVision.AnnotateImagesFromBytesWithContext - Class in org.apache.beam.sdk.extensions.ml
Accepts KVs of ByteString (encoded image contents) and ImageContext.
CloudVision.AnnotateImagesFromGcsUri - Class in org.apache.beam.sdk.extensions.ml
Accepts String (image URI on GCS) with optional DoFn.SideInput with a Map of ImageContext to the image.
CloudVision.AnnotateImagesFromGcsUriWithContext - Class in org.apache.beam.sdk.extensions.ml
Accepts KVs of String (GCS URI to the image) and ImageContext.
CO_GBK_RESULT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
CodahaleCsvSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
A Sink for Spark's metric system reporting metrics (including Beam step metrics) to a CSV file.
CodahaleCsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
Constructor for Spark 3.1.x and earlier.
CodahaleCsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
Constructor for Spark 3.2.x and later.
CodahaleGraphiteSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
A Sink for Spark's metric system reporting metrics (including Beam step metrics) to Graphite.
CodahaleGraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
Constructor for Spark 3.1.x and earlier.
CodahaleGraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
Constructor for Spark 3.2.x and later.
coder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
Coder<T> - Class in org.apache.beam.sdk.coders
A Coder<T> defines how to encode and decode values of type T into byte streams.
Coder() - Constructor for class org.apache.beam.sdk.coders.Coder
 
CODER - Static variable in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
Coder() - Constructor for class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
coder - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
Coder.Context - Class in org.apache.beam.sdk.coders
Deprecated.
To implement a coder, do not use any Coder.Context. Just implement only those abstract methods which do not accept a Coder.Context and leave the default implementations for methods accepting a Coder.Context.
Coder.NonDeterministicException - Exception in org.apache.beam.sdk.coders
Exception thrown by Coder.verifyDeterministic() if the encoding is not deterministic, including details of why the encoding is not deterministic.
CoderCloudObjectTranslatorRegistrar - Interface in org.apache.beam.runners.dataflow.util
Coder authors have the ability to automatically have their Coder registered with the Dataflow Runner by creating a ServiceLoader entry and a concrete implementation of this interface.
coderConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T> and values of type T, the values are equal if and only if the encoded bytes are equal.
coderConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the values are equal if and only if the encoded bytes are equal, in any Coder.Context.
coderDecodeEncodeContentsEqual(Coder<CollectionT>, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
coderDecodeEncodeContentsEqualInContext(Coder<CollectionT>, Coder.Context, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context.
coderDecodeEncodeContentsInSameOrder(Coder<IterableT>, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
coderDecodeEncodeContentsInSameOrderInContext(Coder<IterableT>, Coder.Context, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Iterable<T>>, and value of type Iterable<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context.
coderDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, and value of type T, encoding followed by decoding yields an equal value of type T, in any Coder.Context.
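The CoderProperties checks above are plain static assertions intended to be called from unit tests. A minimal sketch of typical usage, assuming JUnit 4 and using StringUtf8Coder with the literal "hello" purely as illustrative inputs:
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.CoderProperties;
    import org.junit.Test;

    public class StringUtf8CoderPropertiesTest {
      @Test
      public void testRoundTripAndDeterminism() throws Exception {
        // Encoding followed by decoding must yield an equal value.
        CoderProperties.coderDecodeEncodeEqual(StringUtf8Coder.of(), "hello");
        // Equal values must produce equal encoded bytes.
        CoderProperties.coderDeterministic(StringUtf8Coder.of(), "hello", "hello");
      }
    }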
coderDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields an equal value of type T.
coderDecodeEncodeInContext(Coder<T>, Coder.Context, T, Matcher<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields a value of type T and tests that the matcher succeeds on the values.
coderDecodesBase64(Coder<T>, String, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDecodesBase64(Coder<T>, List<String>, List<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDecodesBase64ContentsEqual(Coder<IterableT>, String, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDecodesBase64ContentsEqual(Coder<IterableT>, List<String>, List<IterableT>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDeterministic(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, and values of type T, if the values are equal then the encoded bytes are equal, in any Coder.Context.
coderDeterministicInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and values of type T, if the values are equal then the encoded bytes are equal.
coderEncodesBase64(Coder<T>, T, String) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderEncodesBase64(Coder<T>, List<T>, List<String>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
CoderException - Exception in org.apache.beam.sdk.coders
An Exception thrown if there is a problem encoding or decoding a value.
CoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
CoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
CoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.CoderProvider
Returns a Coder<T> to use for values of a particular type, given the Coders for each of the type's generic parameter types.
coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider
Returns the Coder returned according to the CoderProvider from any DefaultCoder annotation on the given class.
coderForFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
 
coderFromCloudObject(CloudObject) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
 
CoderHelpers - Class in org.apache.beam.runners.spark.coders
Serialization utility class.
CoderHelpers - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
Serialization utility class.
CoderHelpers.FromByteFunction<K,V> - Class in org.apache.beam.runners.spark.coders
A function for converting a byte array pair to a key-value pair.
CoderProperties - Class in org.apache.beam.sdk.testing
Properties for use in Coder tests.
CoderProperties() - Constructor for class org.apache.beam.sdk.testing.CoderProperties
 
CoderProperties.TestElementByteSizeObserver - Class in org.apache.beam.sdk.testing
An ElementByteSizeObserver that records the observed element sizes for testing purposes.
CoderProvider - Class in org.apache.beam.sdk.coders
A CoderProvider provides Coders.
CoderProvider() - Constructor for class org.apache.beam.sdk.coders.CoderProvider
 
CoderProviderRegistrar - Interface in org.apache.beam.sdk.coders
Coder creators have the ability to automatically have their coders registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface.
CoderProviders - Class in org.apache.beam.sdk.coders
Static utility methods for creating and working with CoderProviders.
CoderRegistry - Class in org.apache.beam.sdk.coders
A CoderRegistry allows creating a Coder for a given Java class or type descriptor.
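A hedged sketch of how a CoderRegistry is usually consulted; MyEvent is a hypothetical Serializable type introduced only for illustration:
    import java.io.Serializable;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.CoderRegistry;
    import org.apache.beam.sdk.coders.SerializableCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class CoderRegistryExample {
      // Hypothetical user type; any Serializable class would do.
      static class MyEvent implements Serializable {}

      public static void main(String[] args) throws Exception {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        CoderRegistry registry = p.getCoderRegistry();
        // Register a coder so it can later be inferred for MyEvent.
        registry.registerCoderForClass(MyEvent.class, SerializableCoder.of(MyEvent.class));
        // Look the coder back up by class.
        System.out.println(registry.getCoder(MyEvent.class));
      }
    }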
coderSerializable(Coder<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that the given Coder<T> can be correctly serialized and deserialized.
CoderSizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
This class is used to estimate the size in bytes of a given element.
CoderSizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.CoderSizeEstimator
 
CoGbkResult - Class in org.apache.beam.sdk.transforms.join
A row result of a CoGroupByKey.
CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
A row in the PCollection resulting from a CoGroupByKey transform.
CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>, int, int) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
 
CoGbkResult.CoGbkResultCoder - Class in org.apache.beam.sdk.transforms.join
CoGbkResultSchema - Class in org.apache.beam.sdk.transforms.join
A schema for the results of a CoGroupByKey.
CoGbkResultSchema(TupleTagList) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Builds a schema from a tuple of TupleTag<?>s.
CoGroup - Class in org.apache.beam.sdk.schemas.transforms
A transform that performs equijoins across multiple schema PCollections.
CoGroup() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup
 
CoGroup.By - Class in org.apache.beam.sdk.schemas.transforms
Defines the set of fields to extract for the join key, as well as other per-input join options.
CoGroup.ExpandCrossProduct - Class in org.apache.beam.sdk.schemas.transforms
A PTransform that calculates the cross-product join.
CoGroup.Impl - Class in org.apache.beam.sdk.schemas.transforms
The implementing PTransform.
CoGroup.Result - Class in org.apache.beam.sdk.schemas.transforms
 
CoGroupByKey<K> - Class in org.apache.beam.sdk.transforms.join
A PTransform that performs a CoGroupByKey on a tuple of tables.
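A minimal sketch of the usual CoGroupByKey pattern, assuming two keyed inputs (emails and phones) and externally created TupleTags; all names are illustrative:
    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    public class CoGroupByKeyExample {
      static PCollection<KV<String, CoGbkResult>> join(
          PCollection<KV<String, String>> emails,
          PCollection<KV<String, String>> phones,
          TupleTag<String> emailsTag,
          TupleTag<String> phonesTag) {
        // Each key in the result carries the values from both inputs,
        // retrievable from the CoGbkResult via the corresponding TupleTag.
        return KeyedPCollectionTuple.of(emailsTag, emails)
            .and(phonesTag, phones)
            .apply(CoGroupByKey.create());
      }
    }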
collect(String, Dataset<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
The purpose of this utility is to mark the evaluation of Spark actions, both during Pipeline translation, when evaluation is required, and when finally evaluating the pipeline.
COLLECTION_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
CollectionCoder<T> - Class in org.apache.beam.sdk.coders
A CollectionCoder encodes Collections in the format of IterableLikeCoder.
CollectionCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.CollectionCoder
 
collectionEncoder(Encoder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a Spark Encoder of ArrayType for Java Collections with nullable elements.
collectionEncoder(Encoder<T>, boolean) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a Spark Encoder of ArrayType for Java Collections.
column(SqlParserPos, SqlIdentifier, SqlDataTypeSpec, SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
Creates a column declaration.
Column() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
Column() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
 
COLUMN_CREATED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition row was first created.
COLUMN_END_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp to end the change stream query of the partition.
COLUMN_FAMILIES - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
COLUMN_FINISHED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition was marked as finished by the ReadChangeStreamPartitionDoFn SDF.
COLUMN_HEARTBEAT_MILLIS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the change stream query heartbeat interval in millis.
COLUMN_PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for parent partition tokens.
COLUMN_PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the partition token.
COLUMN_RUNNING_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition was marked as running by the ReadChangeStreamPartitionDoFn SDF.
COLUMN_SCHEDULED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition was scheduled by the DetectNewPartitionsDoFn SDF.
COLUMN_START_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp to start the change stream query of the partition.
COLUMN_STATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the state that the partition is currently in.
COLUMN_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the current watermark of the partition.
columns() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema
 
COLUMNS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
COLUMNS_MAPPING - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
columnType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
ColumnType() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
ColumnType - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Defines a column type from a Cloud Spanner table with the following information: column name, column type, flag indicating if column is primary key and column position in the table.
ColumnType(String, TypeCode, boolean, long) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
Combine - Class in org.apache.beam.sdk.transforms
PTransforms for combining PCollection elements globally and per-key.
combine(Iterable<? extends Instant>) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Combines the given times, which must be from the same window and must have been passed through TimestampCombiner.merge(org.apache.beam.sdk.transforms.windowing.BoundedWindow, java.lang.Iterable<? extends org.joda.time.Instant>).
combine(Instant...) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Combine.AccumulatingCombineFn<InputT,AccumT extends Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT>,OutputT> - Class in org.apache.beam.sdk.transforms
A CombineFn that uses a subclass of Combine.AccumulatingCombineFn.Accumulator as its accumulator type.
Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
The type of mutable accumulator values used by this AccumulatingCombineFn.
Combine.BinaryCombineDoubleFn - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on doubles.
Combine.BinaryCombineFn<V> - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily expressed as binary operations.
Combine.BinaryCombineIntegerFn - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on ints.
Combine.BinaryCombineLongFn - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on longs.
Combine.CombineFn<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
A CombineFn<InputT, AccumT, OutputT> specifies how to combine a collection of input values of type InputT into a single output value of type OutputT.
Combine.Globally<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
Combine.Globally<InputT, OutputT> takes a PCollection<InputT> and returns a PCollection<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
Combine.GloballyAsSingletonView<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
Combine.GloballyAsSingletonView<InputT, OutputT> takes a PCollection<InputT> and returns a PCollectionView<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
Combine.GroupedValues<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
GroupedValues<K, InputT, OutputT> takes a PCollection<KV<K, Iterable<InputT>>>, such as the result of GroupByKey, applies a specified CombineFn<InputT, AccumT, OutputT> to each of the input KV<K, Iterable<InputT>> elements to produce a combined output KV<K, OutputT> element, and returns a PCollection<KV<K, OutputT>> containing all the combined output elements.
Combine.Holder<V> - Class in org.apache.beam.sdk.transforms
Holds a single value of type V which may or may not be present.
Combine.IterableCombineFn<V> - Class in org.apache.beam.sdk.transforms
Converts a SerializableFunction from Iterable<V>s to Vs into a simple Combine.CombineFn over Vs.
Combine.PerKey<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
PerKey<K, InputT, OutputT> takes a PCollection<KV<K, InputT>>, groups it by key, applies a combining function to the InputT values associated with each key to produce a combined OutputT value, and returns a PCollection<KV<K, OutputT>> representing a map from each distinct key of the input PCollection to the corresponding combined value.
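A short sketch of the per-key combining described above, summing Integer values per String key; the explicit type arguments are only for readability:
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class PerKeySumExample {
      static PCollection<KV<String, Integer>> sumPerKey(PCollection<KV<String, Integer>> input) {
        // Groups by key and combines the Integer values of each key into their sum.
        return input.apply(Combine.<String, Integer, Integer>perKey(Sum.ofIntegers()));
      }
    }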
Combine.PerKeyWithHotKeyFanout<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
Like Combine.PerKey, but sharding the combining of hot keys.
Combine.SimpleCombineFn<V> - Class in org.apache.beam.sdk.transforms
Deprecated.
CombineFieldsByFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
combineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf
 
CombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.CombineFn
 
combineFn() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a Combine.CombineFn that counts the number of its inputs.
combineFn() - Static method in class org.apache.beam.sdk.transforms.Latest
Returns a Combine.CombineFn that selects the latest element among its inputs.
combineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a Combine.CombineFn that computes a fixed-sized uniform sample of its inputs.
CombineFnBase - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
CombineFnBase() - Constructor for class org.apache.beam.sdk.transforms.CombineFnBase
 
CombineFnBase.GlobalCombineFn<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
CombineFns - Class in org.apache.beam.sdk.transforms
Static utility methods that create combine function instances.
CombineFns() - Constructor for class org.apache.beam.sdk.transforms.CombineFns
 
CombineFns.CoCombineResult - Class in org.apache.beam.sdk.transforms
A tuple of outputs produced by a composed combine function.
CombineFns.ComposeCombineFnBuilder - Class in org.apache.beam.sdk.transforms
A builder class to construct a composed CombineFnBase.GlobalCombineFn.
CombineFns.ComposedCombineFn<DataT> - Class in org.apache.beam.sdk.transforms
A composed Combine.CombineFn that applies multiple CombineFns.
CombineFns.ComposedCombineFnWithContext<DataT> - Class in org.apache.beam.sdk.transforms
CombineFnTester - Class in org.apache.beam.sdk.testing
Utilities for testing CombineFns.
CombineFnTester() - Constructor for class org.apache.beam.sdk.testing.CombineFnTester
 
CombineFnWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
CombineWithContext - Class in org.apache.beam.sdk.transforms
This class contains combine functions that have access to PipelineOptions and side inputs through CombineWithContext.Context.
CombineWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext
 
CombineWithContext.CombineFnWithContext<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
A combine function that has access to PipelineOptions and side inputs through CombineWithContext.Context.
CombineWithContext.Context - Class in org.apache.beam.sdk.transforms
Information accessible to all methods in CombineFnWithContext and KeyedCombineFnWithContext.
CombineWithContext.RequiresContextInternal - Interface in org.apache.beam.sdk.transforms
An internal interface for signaling that a GloballyCombineFn or a PerKeyCombineFn needs to access CombineWithContext.Context.
combining(Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a CombiningState which uses a Combine.CombineFn to automatically merge multiple values of type InputT into a single resulting OutputT.
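A hedged sketch of declaring such a CombiningState inside a stateful DoFn, keeping a running per-key sum; the accumulator type int[] follows Sum.ofIntegers(), and the state id "sum" is illustrative:
    import org.apache.beam.sdk.state.CombiningState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;

    public class RunningSumFn extends DoFn<KV<String, Integer>, KV<String, Integer>> {
      // The CombineFn merges each added Integer into the accumulator held in state.
      @StateId("sum")
      private final StateSpec<CombiningState<Integer, int[], Integer>> sumSpec =
          StateSpecs.combining(Sum.ofIntegers());

      @ProcessElement
      public void process(
          ProcessContext c, @StateId("sum") CombiningState<Integer, int[], Integer> sum) {
        sum.add(c.element().getValue());
        c.output(KV.of(c.element().getKey(), sum.read()));
      }
    }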
combining(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards compatibility guarantees
combining(Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to combining(CombineFn), but with an accumulator coder explicitly supplied.
combining(Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards compatibility guarantees
combiningFromInputInternal(Coder<InputT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
CombiningState<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.state
A ReadableState cell defined by a Combine.CombineFn, accepting multiple input values, combining them as specified into accumulators, and producing a single output value.
comment(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
commit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
commitOffset(Offset) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
 
commitOffsets() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Enables committing record offsets.
commitOffsetsInFinalize() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Finalized offsets are committed to Kafka.
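A hedged sketch of a KafkaIO read with offset commits enabled; the bootstrap servers, topic name, and deserializers are illustrative:
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.kafka.KafkaRecord;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class KafkaCommitOffsetsExample {
      static PCollection<KafkaRecord<Long, String>> read(Pipeline p) {
        return p.apply(
            KafkaIO.<Long, String>read()
                .withBootstrapServers("broker:9092") // illustrative address
                .withTopic("events")                 // illustrative topic
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                // Commit consumed offsets back to Kafka when checkpoints finalize.
                .commitOffsetsInFinalize());
      }
    }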
commitWriteStreams(String, Iterable<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Commit write streams of type PENDING.
commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
commonPrefixLength(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
Compute the length of the common prefix of the two provided sets of bytes.
compact(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
compact(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
compact(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
compact(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
compact(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
compact(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
compare(JobMessage, JobMessage) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
 
compare(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
 
compare(RandomAccessData, RandomAccessData, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
Compare the two sets of bytes starting at the given offset.
compare(Row, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel.BeamSqlRowComparator
 
compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Largest
Deprecated.
 
compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Natural
 
compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Reversed
 
compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Smallest
Deprecated.
 
compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByKey
 
compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByValue
 
compareSchemaField(Schema.Field, Schema.Field) - Static method in class org.apache.beam.sdk.io.jdbc.SchemaUtil
Compares two fields.
compareTo(ByteArray) - Method in class org.apache.beam.runners.spark.util.ByteArray
 
compareTo(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
compareTo(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
compareTo(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKey
ByteKey implements Comparable<ByteKey> by comparing the arrays in lexicographic order.
compareTo(RedisCursor) - Method in class org.apache.beam.sdk.io.redis.RedisCursor
RedisCursor implements Comparable<RedisCursor> by transforming the cursors to an index of the Redis table.
compareTo(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
comparing(SerializableFunction<? super T, ? extends V>) - Static method in interface org.apache.beam.sdk.transforms.SerializableComparator
Analogous to Comparator.comparing(Function), except that it takes in a SerializableFunction as the key extractor and returns a SerializableComparator.
comparingNullFirst(Function<? super T, ? extends K>) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
CompatibilityError() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
 
compile(List<CEPPattern>, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.nfa.NFA
 
CompileException(DiagnosticCollector<?>) - Constructor for exception org.apache.beam.sdk.schemas.transforms.providers.StringCompiler.CompileException
 
complete() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
 
complete() - Method in class org.apache.beam.runners.jet.processors.ImpulseP
 
complete() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
 
complete() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
 
complete() - Method in class org.apache.beam.runners.jet.processors.ViewP
 
complete() - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
 
complete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
Constructs a Watch.Growth.PollResult with the given outputs and declares that there will be no new outputs for the current input.
complete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
Like Watch.Growth.PollResult.complete(List), but assigns the same timestamp to all new outputs.
completed() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
Report that the pipeline has successfully completed.
complexityFactor - Variable in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
 
COMPONENT_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
compose() - Static method in class org.apache.beam.sdk.transforms.CombineFns
compose(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
For a SerializableFunction<InputT, OutputT> fn, returns a PTransform given by applying fn.apply(v) to the input PCollection<InputT>.
compose(String, SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
Like PTransform.compose(SerializableFunction), but with a custom name.
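A small sketch of wrapping a lambda over PCollections as a named composite transform with PTransform.compose; the transform name and the length computation are illustrative:
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class ComposeExample {
      static PCollection<Integer> lengths(PCollection<String> words) {
        // The lambda receives the input PCollection and returns the output PCollection.
        return words.apply(
            PTransform.compose(
                "ComputeLengths",
                (PCollection<String> in) ->
                    in.apply(
                        MapElements.into(TypeDescriptors.integers())
                            .via((String s) -> s.length()))));
      }
    }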
ComposeCombineFnBuilder() - Constructor for class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
 
COMPOSITE_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
CompressedReader(CompressedSource<T>, FileBasedSource.FileBasedReader<T>) - Constructor for class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Create a CompressedReader from a CompressedSource and delegate reader.
CompressedSource<T> - Class in org.apache.beam.sdk.io
A Source that reads from compressed files.
CompressedSource.CompressedReader<T> - Class in org.apache.beam.sdk.io
Reader for a CompressedSource.
CompressedSource.CompressionMode - Enum in org.apache.beam.sdk.io
Deprecated.
Use Compression instead
CompressedSource.DecompressingChannelFactory - Interface in org.apache.beam.sdk.io
Factory interface for creating channels that decompress the content of an underlying channel.
Compression - Enum in org.apache.beam.sdk.io
Various compression types for reading/writing files.
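A brief sketch of selecting a Compression when reading files, here with TextIO; the file pattern is illustrative:
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.Compression;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    public class CompressedReadExample {
      static PCollection<String> readGzippedLines(Pipeline p) {
        // GZIP tells TextIO to decompress each matched file while reading.
        return p.apply(
            TextIO.read()
                .from("gs://my-bucket/logs/*.gz")
                .withCompression(Compression.GZIP));
      }
    }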
compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
compute(Iterator<WindowedValue<T>>, RecordCollector<WindowedValue<T>>) - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
 
compute(Iterator<WindowedValue<InputT>>, RecordCollector<RawUnionValue>) - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
 
compute(Iterator<RawUnionValue>, RecordCollector<WindowedValue<OutputT>>) - Method in class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
 
computeIfAbsent(K, Function<? super K, ? extends V>) - Method in interface org.apache.beam.sdk.state.MapState
A deferred read-followed-by-write.
computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
concat(List<T>, List<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
concat(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
concat(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
concat(String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
concat(String, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
concat(String, String, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
CONCAT - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
concat(Iterable<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
Concatenates the Iterables.
concat(Iterator<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
Concatenates the Iterators.
CONCAT_FIELD_NAMES - Static variable in class org.apache.beam.sdk.schemas.utils.SelectHelpers
This policy keeps all levels of a name.
CONCAT_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
Concatenate() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
 
concatFieldNames() - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
For nested fields, concatenate all the names separated by a _ character in the flattened schema.
concatIterators(Iterator<Iterator<T>>) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
 
CONCRETE_CLASS - Static variable in class org.apache.beam.sdk.io.WriteFiles
For internal use by runners.
config() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
config() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
Config() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
 
Configuration() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
 
configuration - Variable in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
Configuration() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
 
configurationClass() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Returns the expected class of the configuration.
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
Returns the expected class of the configuration.
configurationClass() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
Returns the expected class of the configuration.
configurationClass() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
 
ConfigurationLocator() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
 
configurationSchema() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
 
configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
 
configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
configure() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
Returns a new builder for a Window transform for setting windowing parameters other than the windowing function.
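A hedged sketch of using Window.configure() to adjust triggering without replacing the WindowFn; it assumes a windowing function was already applied upstream, and the trigger and lateness values are illustrative:
    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    public class WindowConfigureExample {
      static PCollection<String> withTriggering(PCollection<String> input) {
        // Only trigger, allowed lateness, and accumulation mode are changed here.
        return input.apply(
            Window.<String>configure()
                .triggering(AfterWatermark.pastEndOfWindow())
                .withAllowedLateness(Duration.standardMinutes(1))
                .discardingFiredPanes());
      }
    }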
ConfigWrapper<T extends PluginConfig> - Class in org.apache.beam.sdk.io.cdap
Class for building a PluginConfig object of a specific class.
ConfigWrapper(Class<T>) - Constructor for class org.apache.beam.sdk.io.cdap.ConfigWrapper
 
ConfluentSchemaRegistryDeserializerProvider<T> - Class in org.apache.beam.sdk.io.kafka
A DeserializerProvider that uses the Confluent Schema Registry to resolve a Deserializer and Coder given a subject.
connect(String, Properties) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
Configures Beam-specific options and opens a JDBC connection to Calcite.
connect(TableProvider, PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
Connects to the driver using standard JdbcDriver.connect(String, Properties) call, but overrides the initial schema factory.
connect() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Connect to the Redis instance.
connect() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
connect() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
Establishes a connection to the service.
CONNECT_STRING_PREFIX - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
connection() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
CONNECTION_MAX_IDLE_TIME - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
CONNECTION_TIME_TO_LIVE - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
CONNECTION_TIMEOUT - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
connectionAcquisitionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
Milliseconds to wait when acquiring a connection from the pool before giving up and timing out.
connectionAcquisitionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
Milliseconds to wait when acquiring a connection from the pool before giving up and timing out.
ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ConnectionConfiguration
 
ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
 
ConnectionManager - Class in org.apache.beam.sdk.io.cassandra
 
ConnectionManager() - Constructor for class org.apache.beam.sdk.io.cassandra.ConnectionManager
 
connectionMaxIdleTime(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
Maximum milliseconds a connection should be allowed to remain open while idle.
connectionMaxIdleTime() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
Maximum milliseconds a connection should be allowed to remain open while idle.
connectionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
Milliseconds to wait when initially establishing a connection before giving up and timing out.
connectionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
Milliseconds to wait when initially establishing a connection before giving up and timing out.
connectionTimeToLive(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
Maximum milliseconds a connection should be allowed to remain open, regardless of usage frequency.
connectionTimeToLive() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
Maximum milliseconds a connection should be allowed to remain open, regardless of usage frequency.
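A hedged sketch of assembling the HttpClientConfiguration whose builder methods are listed above, assuming the usual AutoValue-style builder() factory; the millisecond values are illustrative:
    import org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration;

    public class HttpClientConfigExample {
      static HttpClientConfiguration httpConfig() {
        // All timeouts below are in milliseconds.
        return HttpClientConfiguration.builder()
            .connectionAcquisitionTimeout(10_000)
            .connectionTimeout(30_000)
            .connectionMaxIdleTime(60_000)
            .connectionTimeToLive(300_000)
            .build();
      }
    }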
ConnectorConfiguration() - Constructor for class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
 
Connectors - Enum in org.apache.beam.io.debezium
Enumeration of debezium connectors.
consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
consistentWithEquals() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ByteCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.Coder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DequeCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DoubleCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DurationCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.FloatCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.InstantCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.KvCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
LengthPrefixCoder is consistent with equals if the nested Coder is.
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ListCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.MapCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.NullableCoder
NullableCoder is consistent with equals if the nested Coder is.
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarIntCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarLongCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ZstdCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
 
consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
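For orientation, a minimal usage sketch of consistentWithEquals() on a couple of the coders listed above; the class name is illustrative, and the values noted in the comments reflect the typical behavior of these coders and should be confirmed against the Beam release in use.

    import org.apache.beam.sdk.coders.KvCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarIntCoder;

    public class CoderConsistencyCheck {
      public static void main(String[] args) {
        // StringUtf8Coder encodes equal Strings to equal byte sequences, so this is expected to be true.
        boolean stringConsistent = StringUtf8Coder.of().consistentWithEquals();
        // Composite coders such as KvCoder typically delegate to their component coders.
        boolean kvConsistent =
            KvCoder.of(StringUtf8Coder.of(), VarIntCoder.of()).consistentWithEquals();
        System.out.println(stringConsistent + " " + kvConsistent);
      }
    }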
ConsoleIO - Class in org.apache.beam.runners.spark.io
Print to console.
ConsoleIO.Write - Class in org.apache.beam.runners.spark.io
Write to console.
ConsoleIO.Write.Unbound<T> - Class in org.apache.beam.runners.spark.io
PTransform writing PCollection to the console.
constant(FileBasedSink.FilenamePolicy, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
Returns a FileBasedSink.DynamicDestinations that always returns the same FileBasedSink.FilenamePolicy.
constant(FileBasedSink.FilenamePolicy) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
A specialization of constant(FilenamePolicy, SerializableFunction) for the case where UserT and OutputT are the same type and the format function is the identity.
constant(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
 
CONSTANT_WINDOW_SIZE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Returns a DynamicAvroDestinations that always returns the same FileBasedSink.FilenamePolicy, schema, metadata, and codec.
constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>, AvroSink.DatumWriterFactory<OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Returns a DynamicAvroDestinations that always returns the same FileBasedSink.FilenamePolicy, schema, metadata, and codec.
constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
 
constructFilter(List<RexNode>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Generate an IO implementation of BeamSqlTableFilter for predicate push-down.
constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
 
constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
 
constructName(ResourceId, String, String, int, int, String, String) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
Constructs a fully qualified name from components.
consumesProjection() - Method in interface org.apache.beam.sdk.schemas.ProjectionConsumer
Returns a map from input TupleTag id to a FieldAccessDescriptor describing which Schema fields this consumer must access from the corresponding input PCollection to complete successfully.
contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
 
contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
 
contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
 
contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
contains(PCollectionView<T>) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
 
contains(Descriptors.Descriptor) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
contains(T) - Method in interface org.apache.beam.sdk.state.SetState
Returns a ReadableState whose ReadableState.read() method will return true if this set contains the specified element at the point when that ReadableState.read() call returns.
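A minimal sketch of SetState.contains(T) used for per-key deduplication inside a stateful DoFn; the DoFn name, state id, and element types are illustrative, and the input is assumed to be a keyed PCollection<KV<String, String>>.

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.state.SetState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class DedupPerKeyFn extends DoFn<KV<String, String>, String> {
      @StateId("seen")
      private final StateSpec<SetState<String>> seenSpec = StateSpecs.set(StringUtf8Coder.of());

      @ProcessElement
      public void processElement(
          @Element KV<String, String> element,
          @StateId("seen") SetState<String> seen,
          OutputReceiver<String> out) {
        String value = element.getValue();
        // contains() returns a ReadableState<Boolean>; read() resolves it for the current key.
        if (!seen.contains(value).read()) {
          seen.add(value);
          out.output(value);
        }
      }
    }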
contains(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.contains(Object[]).
contains(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.contains(Object[]).
contains(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
contains(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.contains(List).
contains(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns whether this window contains the given window.
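A small statement-level sketch of IntervalWindow.contains(IntervalWindow); the millisecond boundaries are arbitrary example values.

    import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
    import org.joda.time.Instant;

    IntervalWindow hour = new IntervalWindow(new Instant(0L), new Instant(3_600_000L));
    IntervalWindow firstMinute = new IntervalWindow(new Instant(0L), new Instant(60_000L));
    // true: the one-minute window lies entirely within the one-hour window.
    boolean covered = hour.contains(firstMinute);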
containsInAnyOrder(T...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Asserts that the iterable in question contains the provided elements.
containsInAnyOrder(SerializableMatcher<? super T>...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Asserts that the iterable in question matches the provided elements.
containsInAnyOrder() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Deprecated.
Prefer PAssert.IterableAssert.empty() to this method.
containsInAnyOrder(Iterable<T>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Asserts that the iterable in question contains the provided elements.
containsInAnyOrder(T...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Checks that the Iterable contains the expected elements, in any order.
containsInAnyOrder(Iterable<T>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Checks that the Iterable contains the expected elements, in any order.
containsInAnyOrder(SerializableMatcher<? super T>...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Checks that the Iterable contains elements that match the provided matchers, in any order.
containsInAnyOrder() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
containsInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
containsInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
containsInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
containsInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
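A minimal JUnit-style sketch of PAssert.IterableAssert.containsInAnyOrder; the test class and element values are illustrative.

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class ContainsInAnyOrderTest {
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void elementsArePresentRegardlessOfOrder() {
        PCollection<String> words = p.apply(Create.of("apple", "banana", "cherry"));
        // The assertion ignores element order within the PCollection.
        PAssert.that(words).containsInAnyOrder("cherry", "apple", "banana");
        p.run().waitUntilFinish();
      }
    }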
containsKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns true if the specified ByteKey is contained within this range.
containsKey(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
containsKey(K) - Method in interface org.apache.beam.sdk.state.MultimapState
Returns a ReadableState whose ReadableState.read() method will return true if this multimap contains the specified key at the point when that ReadableState.read() call returns.
containsSeekableInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
Returns whether any of the children of the relNode are Seekable.
containsString(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
containsValue(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
Context(boolean) - Constructor for class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
 
Context() - Constructor for class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
 
Context() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.Context
 
Context() - Constructor for class org.apache.beam.sdk.transforms.Contextful.Fn.Context
 
Contextful<ClosureT> - Class in org.apache.beam.sdk.transforms
Pair of a bit of user code (a "closure") and the Requirements needed to run it.
Contextful.Fn<InputT,OutputT> - Interface in org.apache.beam.sdk.transforms
A function from an input to an output that may additionally access Contextful.Fn.Context when computing the result.
Contextful.Fn.Context - Class in org.apache.beam.sdk.transforms
An accessor for additional capabilities available in Contextful.Fn.apply(InputT, org.apache.beam.sdk.transforms.Contextful.Fn.Context).
ContextualTextIO - Class in org.apache.beam.sdk.io.contextualtextio
PTransforms that read text files and collect contextual information of the elements in the input.
ContextualTextIO.Read - Class in org.apache.beam.sdk.io.contextualtextio
Implementation of ContextualTextIO.read().
ContextualTextIO.ReadFiles - Class in org.apache.beam.sdk.io.contextualtextio
Implementation of ContextualTextIO.readFiles().
ContiguousSequenceRange - Class in org.apache.beam.sdk.extensions.ordered
A range of contiguous event sequences and the latest timestamp of the events in the range.
ContiguousSequenceRange() - Constructor for class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
 
continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.Match
continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Match
continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
Like Match#continuously(Duration, TerminationCondition, boolean).
continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
Like Match#continuously(Duration, TerminationCondition).
continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
Continuously watches for new files at the given interval until the given termination condition is reached, where the input to the condition is the filepattern.
continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
Continuously watches for new files at the given interval until the given termination condition is reached, where the input to the condition is the filepattern.
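A fragment-level sketch of continuous matching via FileIO.match().continuously(...); the bucket path, polling interval, and termination condition are example values, and p is assumed to be an existing Pipeline.

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.fs.MatchResult;
    import org.apache.beam.sdk.transforms.Watch;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Poll the filepattern every 30 seconds; stop after an hour with no new files.
    PCollection<MatchResult.Metadata> matches =
        p.apply(
            FileIO.match()
                .filepattern("gs://example-bucket/logs/*.json")
                .continuously(
                    Duration.standardSeconds(30),
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));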
control(StreamObserver<BeamFnApi.InstructionRequest>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
Called by gRPC for each incoming connection from an SDK harness, and enqueues an available SDK harness client.
ControlClientPool - Interface in org.apache.beam.runners.fnexecution.control
A pool of control clients that brokers incoming SDK harness connections (in the form of InstructionRequestHandlers).
ControlClientPool.Sink - Interface in org.apache.beam.runners.fnexecution.control
A sink for InstructionRequestHandlers keyed by worker id.
ControlClientPool.Source - Interface in org.apache.beam.runners.fnexecution.control
ConversionContext - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
Conversion context; some rules need this data to convert the nodes.
ConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMatchRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamTableFunctionScanRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamWindowRule
 
convert() - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.RowToDocument
 
convert(ResolvedNodes.ResolvedQueryStmt, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
 
convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRule
 
Convert - Class in org.apache.beam.sdk.schemas.transforms
A set of utilities for converting between different objects supporting schemas.
Convert() - Constructor for class org.apache.beam.sdk.schemas.transforms.Convert
 
convert(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertAvroFieldStrict(Object, Schema, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Strict conversion from AVRO to Beam, strict because it doesn't do widening or narrowing during conversion.
convertAvroFormat(Schema.FieldType, Object, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Tries to convert an Avro-decoded value to a Beam field value based on the target type of the Beam field.
convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertType
 
convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForGetter
 
convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForSetter
 
convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
ConvertedSchemaInformation(SchemaCoder<T>, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
 
convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertGenericRecordToTableRow(GenericRecord, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
convertGenericRecordToTableRow(GenericRecord) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Converts a generic record to a BigQuery TableRow.
ConvertHelpers - Class in org.apache.beam.sdk.schemas.utils
Helper functions for converting between equivalent schema types.
ConvertHelpers() - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers
 
ConvertHelpers.ConvertedSchemaInformation<T> - Class in org.apache.beam.sdk.schemas.utils
Return value after converting a schema.
convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertNewPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert new partition row key to partition to process metadata read from Bigtable.
convertNode2Map(JsonNode) - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
 
convertNumbers(TableRow) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
convertPartitionToNewPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert partition to a New Partition row key to query for partitions ready to be streamed as the result of splits and merges.
convertPartitionToStreamPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert partition to a Stream Partition row key to query for metadata of partitions that are currently being streamed.
convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
convertRelNodeToRexRangeRef(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
 
convertRelOptCost(RelOptCost) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
convertResolvedLiteral(ResolvedNodes.ResolvedLiteral) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
Convert a resolved literal to a RexNode.
convertRexNodeFromResolvedExpr(ResolvedNodes.ResolvedExpr, List<ResolvedColumn>, List<RelDataTypeField>, Map<String, RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
Create a RexNode for a corresponding resolved expression node.
convertRexNodeFromResolvedExpr(ResolvedNodes.ResolvedExpr) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
Create a RexNode for a corresponding resolved expression.
convertRootQuery(ConversionContext, ResolvedNodes.ResolvedQueryStmt) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
 
convertStreamPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert stream partition row key to partition to process metadata read from Bigtable.
convertTableValuedFunction(RelNode, TableValuedFunction, List<ResolvedNodes.ResolvedFunctionArgument>, List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
Convert a TableValuedFunction in ZetaSQL to a RexCall in Calcite.
convertToBagSpecInternal(StateSpec<CombiningState<InputT, AccumT, OutputT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
Parses and validates the input query, then converts it into a BeamRelNode tree.
convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner
Parses and validates the input query, then converts it into a BeamRelNode tree.
convertToBeamRel(String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
convertToBeamRel(String, Map<String, Value>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
convertToBeamRel(String, List<Value>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
convertToFileResourceIfPossible(String) - Static method in class org.apache.beam.sdk.io.FileBasedSink
A helper function for converting a user-provided output filename prefix into a ResourceId for writing output files.
convertToJcsmpDestination(Solace.Destination) - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
Convert to a JCSMP destination from a schema-enabled Solace.Destination.
convertToMapSpecInternal(StateSpec<SetState<KeyT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
convertToMultimapSpecInternal(StateSpec<MapState<KeyT, ValueT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
ConvertType(boolean) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
 
ConvertValueForGetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
ConvertValueForSetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
copy() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns a copy of this RandomAccessData.
copy() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
 
copy() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
copy(Iterable<String>, Iterable<String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
 
copy(RelTraitSet, RelNode, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
 
copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
copy(RelNode, RelDataType, RexNode, boolean, boolean, Map<String, RexNode>, Map<String, RexNode>, RexNode, Map<String, ? extends SortedSet<String>>, boolean, ImmutableBitSet, RelCollation, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
 
copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
 
copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
 
copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
 
copy(RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
copy(RelTraitSet, List<RelNode>, RexNode, Type, RelDataType, Set<RelColumnMapping>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
 
copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
 
copy(RelTraitSet, RelNode, List<RexLiteral>, RelDataType, List<Window.Group>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
 
copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
 
copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
 
copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
 
copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
 
copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
 
copy(List<ClassLoaderFileSystem.ClassLoaderResourceId>, List<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
copy(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
Copies a List of file-like resources from one location to another.
copy(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
Copies a List of file-like resources from one location to another.
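A sketch of FileSystems.copy; the GCS paths are placeholders and the enclosing method is assumed to declare IOException.

    import java.io.IOException;
    import java.util.Collections;
    import org.apache.beam.sdk.io.FileSystems;
    import org.apache.beam.sdk.io.fs.MoveOptions;
    import org.apache.beam.sdk.io.fs.ResourceId;

    void copyOne() throws IOException {
      ResourceId src = FileSystems.matchNewResource("gs://example-bucket/in/part-0.txt", false);
      ResourceId dst = FileSystems.matchNewResource("gs://example-bucket/out/part-0.txt", false);
      // Sources and destinations are parallel lists; MoveOptions are optional varargs.
      FileSystems.copy(
          Collections.singletonList(src),
          Collections.singletonList(dst),
          MoveOptions.StandardMoveOptions.IGNORE_MISSING_FILES);
    }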
copyFrom(ByteBuffer) - Static method in class org.apache.beam.sdk.io.range.ByteKey
Creates a new ByteKey backed by a copy of the data remaining in the specified ByteBuffer.
copyFrom(byte[]) - Static method in class org.apache.beam.sdk.io.range.ByteKey
Creates a new ByteKey backed by a copy of the specified byte[].
copyFrom(FieldSpecifierNotationParser.DotExpressionComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
 
copyFrom(FieldSpecifierNotationParser.QualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
 
copyResourcesFromJar(JarFile) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
Copy resources from inputJar to PortablePipelineJarCreator.outputStream.
copyToList(ArrayData, DataType) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
coreName() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
 
coreUrl() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
 
CorrelationKey() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
 
cosh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
COSH(X)
CosmosClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.azure.cosmos.CosmosOptions.CosmosClientBuilderFactory
 
CosmosIO - Class in org.apache.beam.sdk.io.azure.cosmos
 
CosmosIO.BoundedCosmosBDSource<T> - Class in org.apache.beam.sdk.io.azure.cosmos
A BoundedSource reading from Cosmos.
CosmosIO.Read<T> - Class in org.apache.beam.sdk.io.azure.cosmos
 
CosmosOptions - Interface in org.apache.beam.sdk.io.azure.cosmos
 
CosmosOptions.CosmosClientBuilderFactory - Class in org.apache.beam.sdk.io.azure.cosmos
Create a Cosmos client from the pipeline options.
Count - Class in org.apache.beam.sdk.transforms
PTransforms to count the elements in a PCollection.
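A short fragment showing the two most common Count flavors; p is assumed to be an existing Pipeline and the input values are arbitrary.

    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> words = p.apply(Create.of("a", "b", "a"));
    // Single global count of all elements.
    PCollection<Long> total = words.apply(Count.globally());
    // Occurrence count per distinct element.
    PCollection<KV<String, Long>> perWord = words.apply(Count.perElement());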
countAsserts(Pipeline) - Static method in class org.apache.beam.sdk.testing.PAssert
 
COUNTER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
 
Counter - Interface in org.apache.beam.sdk.metrics
A metric that reports a single long value and can be incremented or decremented.
counter(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
counter(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
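A sketch of Metrics.counter inside a DoFn; the DoFn and metric names are illustrative.

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class ParseLineFn extends DoFn<String, String> {
      private final Counter malformed = Metrics.counter(ParseLineFn.class, "malformed-records");

      @ProcessElement
      public void processElement(@Element String line, OutputReceiver<String> out) {
        if (line.isEmpty()) {
          malformed.inc();  // reported as a sum aggregated across workers
          return;
        }
        out.output(line);
      }
    }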
CounterImpl - Class in org.apache.beam.runners.jet.metrics
Implementation of Counter.
CounterMark(long, Instant) - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMark
Creates a checkpoint mark reflecting the last emitted value.
CounterMarkCoder() - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
 
CountErrors(Counter) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors
 
CountIf - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
Returns the count of TRUE values for expression.
COUNTIF - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
CountIf.CountIfFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
 
CountIfFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
 
CountingReadableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
 
CountingReadableByteChannel(ReadableByteChannel, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
 
CountingSeekableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
 
CountingSeekableByteChannel(SeekableByteChannel, Consumer<Integer>, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
CountingSource - Class in org.apache.beam.sdk.io
Most users should use GenerateSequence instead.
CountingSource.CounterMark - Class in org.apache.beam.sdk.io
The checkpoint for an unbounded CountingSource is simply the last value produced.
CountingSource.CounterMarkCoder - Class in org.apache.beam.sdk.io
A custom coder for CounterMark.
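Since the CountingSource entry above points most users at GenerateSequence, a minimal fragment showing both modes; p is assumed to be an existing Pipeline and the bounds and rate are example values.

    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Bounded: the numbers 0..99.
    PCollection<Long> bounded = p.apply(GenerateSequence.from(0).to(100));
    // Unbounded: roughly one element per second, starting from 0.
    PCollection<Long> unbounded =
        p.apply(GenerateSequence.from(0).withRate(1, Duration.standardSeconds(1)));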
CountingWritableByteChannel - Class in org.apache.beam.sdk.extensions.gcp.util.channels
 
CountingWritableByteChannel(WritableByteChannel, Consumer<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
 
countPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Counts all partitions with a PartitionMetadataAdminDao.COLUMN_CREATED_AT less than the given timestamp.
CountWords() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.CountWords
 
CovarianceFn<T extends java.lang.Number> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
Combine.CombineFn for Covariance on Number types.
coverSameKeySpace(List<Range.ByteStringRange>, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Returns true if parentPartitions form a proper superset of childPartition.
CrashingRunner - Class in org.apache.beam.sdk.testing
A PipelineRunner that applies no overrides and throws an exception on calls to Pipeline.run().
CrashingRunner() - Constructor for class org.apache.beam.sdk.testing.CrashingRunner
 
create() - Static method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Creates a ConnectorConfiguration.
create(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowClient
 
create(PCollectionView<?>, Coder<T>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.LocalWindmillHostportFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
 
create() - Static method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
 
create(PipelineOptions) - Method in class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
 
create(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobInvoker
 
create(PipelineOptions) - Method in class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleSizeFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleTimeFactory
 
create(boolean) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
 
create(String, ByteString, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
 
create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
 
create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
 
create(JobInfo, Map<String, EnvironmentFactory.Provider>) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
 
create() - Static method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
create(String) - Method in interface org.apache.beam.runners.fnexecution.control.OutputReceiverFactory
Get a new FnDataReceiver for an output PCollection.
create(ReferenceCountingExecutableStageContextFactory.Creator, SerializableFunction<Object, Boolean>) - Static method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
 
create(EnvironmentFactory, GrpcFnServer<GrpcDataService>, GrpcFnServer<GrpcStateService>, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
Deprecated.
 
create(String, String) - Method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
 
create(PipelineOptions, ExecutorService, OutboundObserverFactory) - Static method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
create(PipelineOptions, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<FnApiControlClientPoolService>, ControlClientPool.Source) - Static method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
 
create(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
 
create(ProcessManager, RunnerApi.Environment, String, InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
create(ProcessManager, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator, PipelineOptions) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
 
create() - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
 
create(String, String, String, Struct) - Static method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
create(ProvisionApi.ProvisionInfo, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
 
create() - Static method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
Create a new GrpcStateService.
create(Endpoints.ApiServiceDescriptor, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
Create new instance of BeamWorkerStatusGrpcService.
create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
Creates an InMemoryJobService.
create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker, int) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
Creates an InMemoryJobService.
create() - Method in interface org.apache.beam.runners.jobsubmission.JobServerDriver.JobInvokerFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkCommonPipelineOptions.StorageLevelFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkCommonPipelineOptions.TmpCheckpointDirFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
 
create(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobInvoker
 
create() - Static method in class org.apache.beam.runners.spark.SparkRunner
Creates and returns a new SparkRunner with default options.
create(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
Creates and returns a new SparkRunner with specified options.
create() - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
Creates and returns a new SparkStructuredStreamingRunner with default options.
create(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
Creates and returns a new SparkStructuredStreamingRunner with specified options.
create(Map<String, Broadcast<SideInputValues<?>>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
Creates a SideInputReader for Spark from a map of PCollectionView tag ids and the corresponding broadcasted SideInputValues.
create(PipelineOptions) - Method in class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
 
create(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
create(ExpansionService, String, int) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServer
Create an ExpansionServer for the provided ExpansionService running on an arbitrary port.
create(List<String>, Map<String, List<Dependency>>) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.ExpansionServiceConfigFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.JavaClassLookupAllowListFactory
 
create(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.EnableStreamingEngineFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpOAuthScopesFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
 
create(GcsPath, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Deprecated.
create(GcsPath, String, Integer) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Deprecated.
create(GcsPath, GcsUtil.CreateOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Creates an object in GCS and prepares for uploading its contents.
create(String, String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
Returns an instance of GcsUtil based on the PipelineOptions.
create(PipelineOptions, Storage, HttpRequestInitializer, ExecutorService, Credentials, Integer, GcsUtil.GcsCountersOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
Returns an instance of GcsUtil based on the given parameters.
create(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
 
create(IOException) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
 
create(OrderedProcessingHandler<EventTypeT, EventKeyTypeT, StateTypeT, ResultTypeT>) - Static method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
Create the transform.
create(Long, long, Long, Long, long, long, long, boolean) - Static method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
create(EventT, UnprocessedEvent.Reason) - Static method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
Create new unprocessed event.
create(EventT, Exception) - Static method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
Create new unprocessed event which failed due to an exception thrown.
create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
Returns an ApproximateDistinct.ApproximateDistinctFn combiner with the given input coder.
create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
Returns a SketchFrequencies.CountMinSketchFn combiner with the given input coder.
create(double) - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
Returns TDigestQuantiles.TDigestQuantilesFn combiner with the given compression factor.
create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
create(ExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
Returns a Sorter configured with the given ExternalSorter.Options.
create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.SortValues
Returns a SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> PTransform.
create(double, double, double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
 
create(double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
Creates an instance with rate=0 and window=rowCount for bounded sources.
create(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
Creates Function from given method.
create(Method, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
Creates org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function from given method.
create(List<String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
Full table name with path.
create(List<String>, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
Table name plus the path up to but not including table name.
create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
 
create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
create(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
create(Method, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
 
create(Class<?>, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
Creates Function from given class.
create(Method, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
Creates Function from given method.
create(RelTraitSet, RelNode, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
Creates an Uncollect.
create(String) - Static method in class org.apache.beam.sdk.fn.channel.AddHarnessIdInterceptor
 
create(String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DataEndpoint
 
create(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
 
create(String, String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.TimerEndpoint
 
create(List<? extends FnService>, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Create GrpcFnServers for the provided FnServices running on a specified port.
create(ServiceT, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Create a GrpcFnServer for the provided FnService which will run at the endpoint specified in the Endpoints.ApiServiceDescriptor.
create(ServiceT, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Deprecated.
This create function is used for Dataflow migration purposes only.
create() - Static method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
 
create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
 
create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
Creates an instance of this server at the address specified by the given service descriptor and bound to multiple services.
create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
 
create(StreamObserver<ReqT>, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
create(StreamObserver<ReqT>, Runnable, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
create(int, Duration) - Static method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.RetryConfiguration
Deprecated.
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsRegionFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsUserCredentialsFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.ClientConfigurationFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.S3Options.S3UploadBufferSizeBytesFactory
 
create(int, Duration) - Static method in class org.apache.beam.sdk.io.aws.sns.SnsIO.RetryConfiguration
Deprecated.
 
create(BuilderT, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
Configure a client builder BuilderT using the global defaults in AwsOptions.
create(BuilderT, ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
Configure a client builder BuilderT using the provided ClientConfiguration and fall back to the global defaults in AwsOptions where necessary.
create(BuilderT, ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
 
create(AwsCredentialsProvider, Region, URI) - Static method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.MapFactory
 
create() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.SSECustomerKeyFactory
 
create(String, String, String, long, long) - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosOptions.CosmosClientBuilderFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.options.AzureOptions.AzureUserCredentialsFactory
 
create(ClassLoaderFileSystem.ClassLoaderResourceId, CreateOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
create(String[], String, String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
Creates a new Elasticsearch connection configuration.
create(String[], String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
Creates a new Elasticsearch connection configuration with no default type.
create(String[]) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
Creates a new Elasticsearch connection configuration with no default index nor type.
create() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
 
create(int, Duration) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
Creates a RetryConfiguration for ElasticsearchIO with the provided maxAttempts and maxDuration, using exponential-backoff-based retries.
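A fragment combining ConnectionConfiguration.create and RetryConfiguration.create; the host, index, type, and retry bounds are example values.

    import org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO;
    import org.joda.time.Duration;

    ElasticsearchIO.ConnectionConfiguration connection =
        ElasticsearchIO.ConnectionConfiguration.create(
            new String[] {"http://localhost:9200"}, "my-index", "_doc");
    // Up to 3 attempts, bounded to 30 seconds overall, with exponential backoff between attempts.
    ElasticsearchIO.RetryConfiguration retry =
        ElasticsearchIO.RetryConfiguration.create(3, Duration.standardSeconds(30));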
create(WritableByteChannel) - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Deprecated.
 
create(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
 
create(EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
create(ResourceIdT, CreateOptions) - Method in class org.apache.beam.sdk.io.FileSystem
Returns a write channel for the given ResourceIdT.
create(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a write channel for the given ResourceId.
create(ResourceId, CreateOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a write channel for the given ResourceId with CreateOptions.
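For example, a minimal sketch of obtaining a write channel (the path and MIME type are illustrative; the surrounding method is assumed to handle IOException, and standard java.nio and Beam imports are assumed):
  // Resolve a new (non-directory) resource and open a channel to write plain text to it.
  ResourceId resource = FileSystems.matchNewResource("/tmp/output.txt", false);
  try (WritableByteChannel channel = FileSystems.create(resource, "text/plain")) {
    channel.write(ByteBuffer.wrap("hello".getBytes(StandardCharsets.UTF_8)));
  }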
create(MatchResult.Status, List<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
create(MatchResult.Status, IOException) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
Returns a MatchResult given the MatchResult.Status and IOException.
create(ValueProvider<TableReference>, DataFormat, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
create(ValueProvider<TableReference>, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
create() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
 
create(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
Creates an instance of this rule.
create(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
 
create(Schema, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
Create a PTransform instance.
create(String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
Create a PTransform instance.
create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Creates an instance of this rule using options provided by TestPipeline.testingPipelineOptions().
create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Creates an instance of this rule.
create(SubscriptionPartition) - Method in interface org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactory
 
create(SubscriptionPartition) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
 
create(SpannerConfig, String, String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.MetadataSpannerConfigFactory
Generates a SpannerConfig that can be used to access the change stream metadata database by copying only the necessary fields from the given primary database SpannerConfig and setting the instance ID and database ID to the supplied metadata values.
create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
Creates a new group.
create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
create(BatchTransactionId) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
create(String, String, String, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsOptions.GoogleAdsCredentialsFactory
 
create(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
Create the schema adapter.
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
 
create(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
 
create(DataSource) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
create(String, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
create(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
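For example, a minimal sketch using the driver-class/URL overload (the driver, URL, and credentials are illustrative):
  // Configure a JDBC data source for use with JdbcIO.read()/write().
  JdbcIO.DataSourceConfiguration dataSource =
      JdbcIO.DataSourceConfiguration.create(
              "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb")
          .withUsername("user")
          .withPassword("password");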
 
create(int, Duration, Duration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.RetryConfiguration
 
create() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcWriteResult
 
create() - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
 
create(int) - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
 
create(int, Duration, Duration) - Static method in class org.apache.beam.sdk.io.jms.RetryConfiguration
 
create() - Static method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
 
create() - Static method in class org.apache.beam.sdk.io.kinesis.WatermarkParameters
 
create() - Static method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
 
create() - Static method in class org.apache.beam.sdk.io.mongodb.FindQuery
 
create() - Static method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
 
create(String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
Describe a connection configuration to the MQTT broker.
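For example, a minimal sketch (the broker URI and topic name are illustrative):
  // Connection configuration for reading from or writing to an MQTT topic.
  MqttIO.ConnectionConfiguration mqttConnection =
      MqttIO.ConnectionConfiguration.create("tcp://localhost:1883", "my_topic");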
create(String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
create() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
create(String, String, String) - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
create() - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
create(String, int) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
create(ValueProvider<String>, ValueProvider<Integer>) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
create(String) - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
 
create() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
create(DataSource) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Creates SnowflakeIO.DataSourceConfiguration from existing instance of DataSource.
create() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
 
create() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory
 
create() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
create() - Method in interface org.apache.beam.sdk.io.solace.broker.SempClientFactory
This method is the core of the factory interface.
create() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
This is the core method that subclasses must implement.
create() - Static method in class org.apache.beam.sdk.io.solace.RetryCallableManager
Creates a new RetryCallableManager with default retry settings.
create(String) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
Creates a new Solr connection configuration.
create(int, Duration) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
 
create() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Creates a SplunkEvent object.
create() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
Builds a SplunkWriteError object.
create(long, long, long, long) - Static method in class org.apache.beam.sdk.metrics.DistributionResult
 
create(long, Instant) - Static method in class org.apache.beam.sdk.metrics.GaugeResult
 
create(String, Map<String, String>) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
 
create(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
 
create(String, MetricName) - Static method in class org.apache.beam.sdk.metrics.MetricKey
 
create(Iterable<MetricResult<Long>>, Iterable<MetricResult<DistributionResult>>, Iterable<MetricResult<GaugeResult>>, Iterable<MetricResult<StringSetResult>>) - Static method in class org.apache.beam.sdk.metrics.MetricQueryResults
 
create(MetricKey, Boolean, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
 
create(MetricKey, T, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.metrics.MetricsOptions.NoOpMetricsSink
 
create(Set<String>) - Static method in class org.apache.beam.sdk.metrics.StringSetResult
Creates a StringSetResult from the given Set by making an immutable copy.
create(PipelineOptions) - Method in interface org.apache.beam.sdk.options.DefaultValueFactory
Creates a default value for a getter marked with Default.InstanceFactory.
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.ExecutorOptions.ScheduledExecutorServiceFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
 
create() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
Creates and returns an object that implements PipelineOptions using the values configured on this builder during construction.
create() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Creates and returns an object that implements PipelineOptions.
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
 
create() - Static method in class org.apache.beam.sdk.Pipeline
Constructs a pipeline from default PipelineOptions.
create(PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
Constructs a pipeline from the provided PipelineOptions.
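For example, a minimal sketch combining PipelineOptionsFactory and Pipeline.create:
  // Build default options and construct a pipeline from them.
  PipelineOptions options = PipelineOptionsFactory.create();
  Pipeline pipeline = Pipeline.create(options);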
create() - Static method in class org.apache.beam.sdk.PipelineRunner
Creates a runner from the default app PipelineOptions.
create(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.CachingFactory
 
create(TypeDescriptor<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.Factory
 
create() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return an empty FieldAccessDescriptor.
create(Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
Create an enumeration type over a set of String->Integer values.
create(List<String>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
Create an enumeration type from a fixed set of String values; integer values will be automatically chosen.
create(String...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
Create an enumeration type from a fixed set of String values; integer values will be automatically chosen.
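For example, a minimal sketch (the enumeration values are illustrative):
  // Integer values are assigned automatically in declaration order.
  EnumerationType colorType = EnumerationType.create("RED", "GREEN", "BLUE");
  EnumerationType.Value green = colorType.valueOf("GREEN");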
create(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Create a OneOfType logical type.
create(List<Schema.Field>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Create a OneOfType logical type.
create(List<Schema.Field>, Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Create a OneOfType logical type.
create(Object...) - Method in interface org.apache.beam.sdk.schemas.SchemaUserTypeCreator
 
create() - Static method in class org.apache.beam.sdk.schemas.transforms.AddFields
 
create(List<String>, String) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
 
create() - Static method in class org.apache.beam.sdk.schemas.transforms.Filter
 
create() - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
Returns a transform that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields.
create() - Static method in class org.apache.beam.sdk.schemas.transforms.RenameFields
Create an instance of this transform.
create() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
 
create(Class<?>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
 
create(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
 
create(List<String>, Optional<Schema.TypeName>) - Static method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
 
create() - Static method in class org.apache.beam.sdk.testing.TestPipeline
Creates and returns a new test pipeline.
create(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
 
create(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream
Create a new TestStream.Builder with no elements and watermark equal to BoundedWindow.TIMESTAMP_MIN_VALUE.
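For example, a minimal sketch of building a test stream of strings (the elements are illustrative):
  // Emit two elements, then advance the watermark to the end of time.
  TestStream<String> events =
      TestStream.create(StringUtf8Coder.of())
          .addElements("a", "b")
          .advanceWatermarkToInfinity();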
create(Schema) - Static method in class org.apache.beam.sdk.testing.TestStream
 
create(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.testing.TestStream
 
create(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Returns an approximate quantiles combiner with the given compareFn and desired number of quantiles.
create(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Like ApproximateQuantiles.ApproximateQuantilesCombineFn.create(int, Comparator), but sorts values using their natural ordering.
create(int, ComparatorT, long, double) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Creates an approximate quantiles combiner with the given compareFn and desired number of quantiles.
Create<T> - Class in org.apache.beam.sdk.transforms
Create<T> takes a collection of elements of type T known when the pipeline is constructed and returns a PCollection<T> containing the elements.
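For example, a minimal sketch (assuming an existing Pipeline named pipeline; the elements are illustrative):
  // Materialize three in-memory strings as a PCollection.
  PCollection<String> letters = pipeline.apply(Create.of("a", "b", "c"));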
Create() - Constructor for class org.apache.beam.sdk.transforms.Create
 
create() - Static method in class org.apache.beam.sdk.transforms.Distinct
Returns a Distinct<T> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
Create an instance.
create() - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns a GroupByKey<K, V> PTransform.
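For example, a minimal sketch (assuming wordCounts is an existing PCollection<KV<String, Integer>>):
  // Group all values sharing the same key into an Iterable.
  PCollection<KV<String, Iterable<Integer>>> grouped =
      wordCounts.apply(GroupByKey.<String, Integer>create());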
create(long, long, SerializableFunction<InputT, Long>, Duration) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
create() - Static method in class org.apache.beam.sdk.transforms.Impulse
Create a new Impulse PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
Returns a CoGroupByKey<K> PTransform.
create(JsonToRow.JsonToRowWithErrFn) - Static method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
 
create() - Static method in class org.apache.beam.sdk.transforms.Keys
Returns a Keys<K> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.KvSwap
Returns a KvSwap<K, V> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.PeriodicImpulse
 
create() - Static method in class org.apache.beam.sdk.transforms.PeriodicSequence
 
create() - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
Creates a ResourceHints instance with no hints.
create(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.EmptyListDefault
 
create() - Static method in class org.apache.beam.sdk.transforms.Values
Returns a Values<V> PTransform.
create(T) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
 
create(Coder<T>, Coder<MetaT>) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
 
Create.OfValueProvider<T> - Class in org.apache.beam.sdk.transforms
Create.TimestampedValues<T> - Class in org.apache.beam.sdk.transforms
A PTransform that creates a PCollection whose elements have associated timestamps.
Create.Values<T> - Class in org.apache.beam.sdk.transforms
A PTransform that creates a PCollection from a set of in-memory objects.
Create.WindowedValues<T> - Class in org.apache.beam.sdk.transforms
A PTransform that creates a PCollection whose elements have associated windowing metadata.
createAccumulator() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
 
createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
createAccumulator() - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
createAccumulator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
Deprecated.
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
createAll(Class<?>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
Creates Function for each method in a given class.
createArrayOf(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createArtifactServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
createBatch(Class<?>, Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
Creates a batch plugin instance.
createBatchExecutionEnvironment(FlinkPipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
If the submitted job is a batch processing job, this method creates the adequate Flink ExecutionEnvironment depending on the user-specified options.
createBigQueryClientCustomErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
createBitXOr(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
 
createBlob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createBlockGenerator(BlockGeneratorListener) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
createBoundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
createBucket(String, Bucket) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Creates a Bucket under the specified project in Cloud Storage or propagates an exception.
createBuilder(S3Options) - Method in interface org.apache.beam.sdk.io.aws.options.S3ClientBuilderFactory
 
createBuilder(S3Options) - Method in class org.apache.beam.sdk.io.aws.s3.DefaultS3ClientBuilderFactory
 
createBuilder(S3Options) - Method in interface org.apache.beam.sdk.io.aws2.options.S3ClientBuilderFactory
 
createBuilder(S3Options) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
 
createBuilder(BlobstoreOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
 
createBuilder(BlobstoreOptions) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreClientBuilderFactory
 
createBytesXMLMessage(Solace.Record, boolean, DeliveryMode) - Static method in class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
Create a BytesXMLMessage to be published in Solace.
createCatalogItems() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
 
createClassLoader(List<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
 
createClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createCombineFn(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
Creates either a UDAF or a built-in Combine.CombineFn.
createCombineFnAnalyticsFunctions(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
Creates either a UDAF or a built-in Combine.CombineFn for Analytic Functions.
createConstantCombineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
 
createConstructorCreator(Class<T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
createConstructorCreator(Class<? super T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
createContextual(DeserializationContext, BeanProperty) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
 
createDataCatalogClient(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
CreateDataflowView<ElemT,ViewT> - Class in org.apache.beam.runners.dataflow
A DataflowRunner marker class for creating a PCollectionView.
createDataset(List<WindowedValue<T>>, Encoder<WindowedValue<T>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
createDataset(String, String, DatasetProperties) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
createDataset(String, String, String, String, Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
createDatasetFromRDD(SparkSession, BoundedSource<T>, Supplier<PipelineOptions>, Encoder<WindowedValue<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.io.BoundedDatasetFactory
Create a Dataset for a BoundedSource via a Spark RDD.
createDatasetFromRows(SparkSession, BoundedSource<T>, Supplier<PipelineOptions>, Encoder<WindowedValue<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.io.BoundedDatasetFactory
Create a Dataset for a BoundedSource via a Spark Table.
createDecompressingChannel(ReadableByteChannel) - Method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Deprecated.
 
createDecompressingChannel(ReadableByteChannel) - Method in interface org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory
Given a channel, create a channel that decompresses the content read from the channel.
createDefault() - Static method in class org.apache.beam.sdk.coders.CoderRegistry
Creates a CoderRegistry containing registrations for all standard coders part of the core Java Apache Beam SDK and also any registrations provided by coder registrars.
createDefault() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
Creates a ManagedChannel relying on the ManagedChannelBuilder to choose the channel type.
createDefault() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
createDefault() - Static method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
 
createDefault() - Static method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
createDefault() - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
createDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create a DicomStore.
createDicomStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create a DicomStore with a PubSub listener.
createDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createDicomStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
CreateDisposition - Enum in org.apache.beam.sdk.io.snowflake.enums
Enum containing all supported dispositions for tables.
createDynamoDB() - Method in interface org.apache.beam.sdk.io.aws.dynamodb.AwsClientsProvider
 
createDynamoDB() - Method in class org.apache.beam.sdk.io.aws.dynamodb.BasicDynamoDBProvider
 
createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory
Creates a new, active RemoteEnvironment backed by a local Docker container.
createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
 
createEnvironment(RunnerApi.Environment, String) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory
Creates an active RunnerApi.Environment and returns a handle to it.
createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
Creates a new, active RemoteEnvironment backed by an unmanaged worker.
createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
Creates a new, active RemoteEnvironment backed by a forked process.
createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
 
createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
 
createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
 
createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
Creates EnvironmentFactory for the provided GrpcServices.
createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
 
createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
 
createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory.Provider
 
createEpoll() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
Creates a ManagedChannelFactory backed by an EpollDomainSocketChannel if the address is a DomainSocketAddress.
createEpollDomainSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
Create an EpollDomainSocket.
createEpollSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
Create an EpollSocket.
createFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
createFactoryForCreateSubscription() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createFactoryForGetSchema(PubsubClient.TopicPath, PubsubClient.SchemaPath, Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createFactoryForPublish(PubsubClient.TopicPath, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Return a factory for testing publishers.
createFactoryForPull(Clock, PubsubClient.SubscriptionPath, int, Iterable<PubsubClient.IncomingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Return a factory for testing subscribers.
createFactoryForPullAndPublish(PubsubClient.SubscriptionPath, PubsubClient.TopicPath, Clock, int, Iterable<PubsubClient.IncomingMessage>, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Returns a factory for a test that is expected to both publish and pull messages over the course of the test.
createFhirStore(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create FHIR Store with a PubSub topic listener.
createFhirStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create FHIR Store.
createFhirStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createFhirStore(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createFile() - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
Generates a random file with NUM_LINES lines of between 60 and 120 characters each.
createForSubrangeOfFile(String, long, long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Deprecated.
Used by Dataflow worker
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
 
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedSource for the specified range in a single file.
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.CompressedSource
Creates a CompressedSource for a subrange of a file.
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
Creates and returns a new FileBasedSource of the same type as the current FileBasedSource backed by a given file and an offset range.
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.TextSource
 
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
createFrom(String) - Static method in class org.apache.beam.sdk.fn.channel.SocketAddressFactory
Parse a SocketAddress from the given string.
createGetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
createGetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
 
createGetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
 
createHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Creates an HL7v2 message.
createHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createHL7v2Store(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Creates an HL7v2 store.
createHL7v2Store(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createImplementor(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
 
createInProcess() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
Creates a ManagedChannel using an in-process channel.
createInput(Pipeline, Map<String, PCollection<?>>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
 
createInput(Pipeline, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
 
createInputFormatInstance() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
Creates an instance of the InputFormat class.
createInternal(WindowingStrategy) - Static method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
createIterator() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
 
createJCSMPSendMultipleEntry(List<Solace.Record>, boolean, SerializableFunction<Solace.Record, Destination>, DeliveryMode) - Static method in class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
Create a JCSMPSendMultipleEntry array to be published in Solace.
createJob(Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Creates the Dataflow Job.
createJobInvocation(String, String, ListeningExecutorService, RunnerApi.Pipeline, FlinkPipelineOptions, PortablePipelineRunner) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
 
createJobServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
createJobService() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
createKafkaRead() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
createKinesisProducer(KinesisProducerConfiguration) - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
 
createMessagesArray(Iterable<Solace.Record>, boolean) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
createMetadata(MetaT) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
 
createMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Create the metadata table if it does not exist yet.
createNClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createNewDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Creates a new dataset.
createNewDataset(String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Creates a new dataset with defaultTableExpirationMs.
createNewDataset(String, String, Long, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Creates a new dataset with defaultTableExpirationMs and in a specified location (GCP region).
createNewTable(String, String, Table) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
CreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
 
CreateOptions - Class in org.apache.beam.sdk.io.fs
An abstract class that contains common configuration options for creating resources.
CreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions
 
CreateOptions.Builder<BuilderT extends CreateOptions.Builder<BuilderT>> - Class in org.apache.beam.sdk.io.fs
An abstract builder for CreateOptions.
CreateOptions.StandardCreateOptions - Class in org.apache.beam.sdk.io.fs
A standard configuration options with builder.
CreateOptions.StandardCreateOptions.Builder - Class in org.apache.beam.sdk.io.fs
createOrUpdateReadChangeStreamMetadataTable(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Utility method to create or update Read Change Stream metadata table.
createOutboundAggregator(Supplier<String>, boolean) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
Creates a BeamFnDataOutboundAggregator for buffering and sending outbound data and timers over the data plane.
createOutboundAggregator(Supplier<String>, boolean) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
createOutputMap(Iterable<String>) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
Creates a mapping from PCollection id to output tag integer.
createPane(boolean, boolean, PaneInfo.Timing) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
createPane(boolean, boolean, PaneInfo.Timing, long, long) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Factory method to create a PaneInfo with the specified parameters.
createPartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Creates the metadata table in the given instance, database configuration, with the constructor specified table name.
createPipeline(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
 
createPipelineOptions(Map<String, String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
createPlanner(JdbcConnection, Collection<RuleSet>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.Factory
 
createPrepareContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollection
For internal use only; no backwards-compatibility guarantees.
createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>, TupleTag<?>) - Static method in class org.apache.beam.sdk.values.PCollection
For internal use only; no backwards-compatibility guarantees.
createProcessContext(ValueInSingleWindow<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
createProperties() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
createPushDownRel(RelDataType, List<String>, BeamSqlTableFilter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
createQuery(Expression, Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createQuery(Expression, Type) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createQuery(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
createQueryUsingStandardSql(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
createQueueForTopic(String, String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
 
createQueueForTopic(String, String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
This is only called when a user requests to read data from a topic.
createRandomSubscription(PubsubClient.ProjectPath, PubsubClient.TopicPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create a random subscription for topic.
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
 
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
Returns a new BoundedSource.BoundedReader that reads from this source.
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
createReader(PipelineOptions, CheckpointMarkImpl) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
createReader(PipelineOptions, SolaceCheckpointMark) - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
createReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.sdk.io.UnboundedSource
Create a new UnboundedSource.UnboundedReader to read from this source, resuming from the given checkpoint if present.
createReadSession(CreateReadSessionRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
Create a new read session against an existing table.
createRPCLatencyHistogram(KafkaSinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
Creates a Histogram metric to record RPC latency.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create Schema from Schema definition content.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Create Schema from Schema definition content.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Create Schema from Schema definition content.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createSessionToken(String) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
createSetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
createSetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
 
createSetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
 
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
 
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedReader.
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.CompressedSource
Creates a FileBasedReader to read a single file.
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
Creates and returns an instance of a FileBasedReader implementation for the current source assuming the source represents a single file.
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.TextSource
 
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
createSingleMessage(Solace.Record, boolean) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
createSnsPublisher() - Method in interface org.apache.beam.sdk.io.aws.sns.AwsClientsProvider
 
createSource - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns an OffsetBasedSource for a subrange of the current source.
createSQLXML() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createStateBackend(FlinkPipelineOptions) - Method in interface org.apache.beam.runners.flink.FlinkStateBackendFactory
 
createStatement() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createStatement(int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createStatement(int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createStateOnInitialEvent(EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.EventExaminer
If the event was the first event for a given key, create the state to hold the required data needed for processing.
createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
CreateStream<T> - Class in org.apache.beam.runners.spark.io
Create an input stream from a Queue.
createStreamExecutionEnvironment(FlinkPipelineOptions, List<String>, String) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
If the submitted job is a stream processing job, this method creates the adequate Flink StreamExecutionEnvironment depending on the user-specified options.
createStreaming(Class<?>, SerializableFunction<V, Long>, Class<? extends Receiver<V>>, SerializableFunction<PluginConfig, Object[]>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
Creates a streaming plugin instance.
createStreaming(Class<?>, SerializableFunction<V, Long>, Class<? extends Receiver<V>>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
Creates a streaming plugin instance with default function for getting args for Receiver.
createStringAggOperator(ResolvedNodes.ResolvedFunctionCallBase) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
createStruct(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
createStructuralValues(Coder<T>, List<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Testing utilities below depend on standard assertions and matchers to compare elements read by sources.
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create subscription to topic.
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
createTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Creates a table.
createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
createTable(Table) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Creates the specified table if it does not exist.
createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Creates the specified table if it does not exist.
createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
CreateTableHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
 
CreateTableHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers
 
CreateTables<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Creates any tables needed before performing streaming writes to the tables.
CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
The list of tables created so far, so we don't try the creation each time.
createTest(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
 
createTimestampPolicy(TopicPartition, Optional<Instant>) - Method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
Creates a TimestampPolicy for a partition.
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create topic.
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create a topic at the given PubsubClient.TopicPath with the given PubsubClient.SchemaPath.
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Returns a transform that creates a batch transaction.
CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
Creates a batch translation context.
createTranslationContext(JobInfo, FlinkPipelineOptions, ExecutionEnvironment) - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
 
createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
 
createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
Creates a streaming translation context.
createTranslationContext(JobInfo, FlinkPipelineOptions, StreamExecutionEnvironment) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
Creates a streaming translation context.
createTranslator() - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
Creates a batch translator.
createTranslator(Map<String, FlinkBatchPortablePipelineTranslator.PTransformTranslator>) - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
Creates a batch translator.
createTypeConversion(boolean) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
 
createTypeConversion(boolean) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
 
createUnboundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
createUrl(String, int) - Method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
 
createValue(String, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Create a OneOfType.Value specifying which field to set and the value to set.
createValue(int, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Create a OneOfType.Value specifying which field to set and the value to set.
createValue(EnumerationType.Value, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Create a OneOfType.Value specifying which field to set and the value to set.
createWatermarkPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
 
createWatermarkPolicy() - Method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory
 
createWithBytesReadConsumer(SeekableByteChannel, Consumer<Integer>) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
createWithBytesWrittenConsumer(SeekableByteChannel, Consumer<Integer>) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
createWithNoOpConsumer(ReadableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
 
createWithNoOpConsumer(SeekableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
createWithNoOpConsumer(WritableByteChannel) - Static method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
 
createWithPortSupplier(Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
Create a ServerFactory.InetSocketAddressServerFactory that uses ports from a supplier.
createWithUrlFactory(ServerFactory.UrlFactory) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
Create a ServerFactory.InetSocketAddressServerFactory that uses the given url factory.
createWithUrlFactoryAndPortSupplier(ServerFactory.UrlFactory, Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
Create a ServerFactory.InetSocketAddressServerFactory that uses the given url factory and ports from a supplier.
createWriteOperation() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSink
 
createWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink
Return a subclass of FileBasedSink.WriteOperation that will manage the write to the sink.
createWriter() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Clients must implement to return a subclass of FileBasedSink.Writer.
createWriteStream(String, WriteStream.Type) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Create a Write Stream for use with the Storage Write API.
createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
createZetaSqlFunction(String, SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
Create a dummy SqlFunction of type OTHER_FUNCTION from given function name and return type.
CredentialFactory - Interface in org.apache.beam.sdk.extensions.gcp.auth
Construct an OAuth credential to be used by the SDK and the SDK workers.
credentialsProvider(AwsCredentialsProvider) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
credentialsProvider() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
Parameters abstract class to expose the transforms to an external SDK.
CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
crossProductJoin() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
Expand the join into individual rows, similar to SQL joins.
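For context, a minimal sketch of a cross-product join over two schema'd PCollections (the tag names and the join field are illustrative):

    import org.apache.beam.sdk.schemas.transforms.CoGroup;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;

    PCollection<Row> joined =
        PCollectionTuple.of("orders", orders).and("users", users)
            .apply(CoGroup.join(CoGroup.By.fieldNames("userId")).crossProductJoin());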
CsvConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration
 
csvConfigurationBuilder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
CsvIO - Class in org.apache.beam.sdk.io.csv
PTransforms for reading and writing CSV files.
CsvIO() - Constructor for class org.apache.beam.sdk.io.csv.CsvIO
 
CsvIO.Write<T> - Class in org.apache.beam.sdk.io.csv
PTransform for writing CSV files.
CsvIOParse<T> - Class in org.apache.beam.sdk.io.csv
PTransform for Parsing CSV Record Strings into Schema-mapped target types.
CsvIOParse() - Constructor for class org.apache.beam.sdk.io.csv.CsvIOParse
 
CsvIOParseError - Class in org.apache.beam.sdk.io.csv
CsvIOParseError is a data class to store errors from CSV record processing.
CsvIOParseError() - Constructor for class org.apache.beam.sdk.io.csv.CsvIOParseError
 
CsvIOParseResult<T> - Class in org.apache.beam.sdk.io.csv
The T and CsvIOParseError PCollection results of parsing CSV records.
csvLines2BeamRows(CSVFormat, String, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
Decodes zero or more CSV records from the given string, according to the specified CSVFormat, and converts them to Rows with the specified Schema.
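A minimal sketch of the call, assuming the method returns an Iterable of Row (the input line and schema are illustrative):

    import org.apache.commons.csv.CSVFormat;
    import org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils;
    import org.apache.beam.sdk.values.Row;

    for (Row row : BeamTableUtils.csvLines2BeamRows(CSVFormat.DEFAULT, "1,Alice", schema)) {
      // process each decoded Row
    }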
CsvSink - Class in org.apache.beam.runners.spark.metrics.sink
A Sink for Spark's metric system reporting metrics (including Beam step metrics) to a CSV file.
CsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
Constructor for Spark 3.1.x and earlier.
CsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
Constructor for Spark 3.2.x and later.
CsvToRow(Schema, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
 
CsvWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
 
CsvWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
CsvWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
 
CsvWriteTransformProvider - Class in org.apache.beam.sdk.io.csv.providers
CsvWriteTransformProvider() - Constructor for class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
 
CsvWriteTransformProvider.CsvWriteConfiguration - Class in org.apache.beam.sdk.io.csv.providers
Configuration for writing CSV files with CsvWriteTransformProvider.
CsvWriteTransformProvider.CsvWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.csv.providers
CsvWriteTransformProvider.CsvWriteTransform - Class in org.apache.beam.sdk.io.csv.providers
ctxt - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
CURRENT_METADATA_TABLE_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
currentEventTime() - Method in interface org.apache.beam.sdk.state.Timers
Returns the current event time.
currentInputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentOutputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
Returns the current processing time.
currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Returns the streamProgress that was successfully claimed.
currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
currentRestriction() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Returns a restriction accurately describing the full range of work the current DoFn.ProcessElement call will do, including already completed work.
currentSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentSynchronizedProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
Returns the current synchronized processing time or null if unknown.
currentWatermark - Variable in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
 
currentWatermark() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
Return estimated output watermark.
currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
 
currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
 
currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
 
custom() - Static method in class org.apache.beam.sdk.io.thrift.ThriftSchema
Builds a schema provider that maps any thrift type to a Beam schema, allowing for custom thrift typedef entries (which cannot be resolved using the available metadata) to be manually registered with their corresponding beam types.
CUSTOM_SOURCE_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
CustomCoder<T> - Class in org.apache.beam.sdk.coders
An abstract base class that implements all methods of Coder except Coder.encode(T, java.io.OutputStream) and Coder.decode(java.io.InputStream).
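A minimal sketch of a CustomCoder subclass (the Point class and its two int fields are hypothetical):

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.beam.sdk.coders.CustomCoder;
    import org.apache.beam.sdk.coders.VarIntCoder;

    public class PointCoder extends CustomCoder<Point> {
      private static final VarIntCoder INT_CODER = VarIntCoder.of();

      @Override
      public void encode(Point value, OutputStream outStream) throws IOException {
        INT_CODER.encode(value.x, outStream);
        INT_CODER.encode(value.y, outStream);
      }

      @Override
      public Point decode(InputStream inStream) throws IOException {
        return new Point(INT_CODER.decode(inStream), INT_CODER.decode(inStream));
      }
    }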
CustomCoder() - Constructor for class org.apache.beam.sdk.coders.CustomCoder
 
Customer - Class in org.apache.beam.sdk.extensions.sql.example.model
Describes a customer.
Customer(int, String, String) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
Customer() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
CustomHttpErrors - Class in org.apache.beam.sdk.extensions.gcp.util
An optional component to use with the RetryHttpRequestInitializer in order to provide custom errors for failing http calls.
CustomHttpErrors.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
A Builder which allows building immutable CustomHttpErrors object.
CustomHttpErrors.MatcherAndError - Class in org.apache.beam.sdk.extensions.gcp.util
A simple Tuple class for creating a list of HttpResponseMatcher and HttpResponseCustomError to print for the responses.
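A minimal sketch of registering a custom error via the builder (the status code, URL fragment, and message are illustrative):

    CustomHttpErrors.Builder builder = new CustomHttpErrors.Builder();
    builder.addErrorForCodeAndUrlContains(
        403, "/tables?", "Check the permissions configured for your project.");
    CustomHttpErrors customErrors = builder.build();
    // The resulting object can then be installed on a RetryHttpRequestInitializer.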
CustomTableResolver - Interface in org.apache.beam.sdk.extensions.sql.meta
Interface that table providers can implement if they require custom table name resolution.
CustomTimestampPolicyWithLimitedDelay<K,V> - Class in org.apache.beam.sdk.io.kafka
A policy for custom record timestamps where timestamps within a partition are expected to be roughly monotonically increasing, with a cap on out-of-order event delays (say, 1 minute).
CustomTimestampPolicyWithLimitedDelay(SerializableFunction<KafkaRecord<K, V>, Instant>, Duration, Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
A policy for custom record timestamps where timestamps are expected to be roughly monotonically increasing, with out-of-order event delays of less than maxDelay.
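A minimal sketch of plugging this policy into a KafkaIO read (broker, topic, and the 1-minute delay are illustrative):

    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    KafkaIO.<String, String>read()
        .withBootstrapServers("broker:9092")
        .withTopic("events")
        .withKeyDeserializer(StringDeserializer.class)
        .withValueDeserializer(StringDeserializer.class)
        .withTimestampPolicyFactory(
            (tp, previousWatermark) ->
                new CustomTimestampPolicyWithLimitedDelay<>(
                    record -> new Instant(record.getTimestamp()),
                    Duration.standardMinutes(1),
                    previousWatermark));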
CustomX509TrustManager - Class in org.apache.beam.sdk.io.splunk
A custom X509TrustManager that trusts a user-provided CA and the default CAs.
CustomX509TrustManager(X509Certificate) - Constructor for class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
 

D

DAGBuilder - Class in org.apache.beam.runners.jet
Utility class for wiring up Jet DAGs based on Beam pipelines.
DAGBuilder.WiringListener - Interface in org.apache.beam.runners.jet
Listener that can be registered with a DAGBuilder in order to be notified when edges are being registered.
DaoFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
 
DaoFactory(BigtableConfig, BigtableConfig, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
DaoFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Factory class to create data access objects to perform change stream queries and access the metadata tables.
DaoFactory(SpannerConfig, String, SpannerConfig, PartitionMetadataTableNames, Options.RpcPriority, String, Dialect, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Constructs a DaoFactory with the configuration to be used for the underlying instances.
data(StreamObserver<BeamFnApi.Elements>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
data(String, String) - Static method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
data() - Method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
 
Data() - Constructor for class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
 
DATA_BUFFER_SIZE_LIMIT - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
 
DATA_BUFFER_TIME_LIMIT_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
 
DATA_RECORD_COMMITTED_TO_EMITTED_0MS_TO_1000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for record latencies [0, 1000) ms during the execution of the Connector.
DATA_RECORD_COMMITTED_TO_EMITTED_1000MS_TO_3000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for record latencies [1000, 3000) ms during the execution of the Connector.
DATA_RECORD_COMMITTED_TO_EMITTED_3000MS_TO_INF_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for record latencies equal or above 3000ms during the execution of the Connector.
DATA_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of data records identified during the execution of the Connector.
database() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
DataCatalogPipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
Pipeline options for Data Catalog table provider.
DataCatalogPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
 
DataCatalogPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
 
dataCatalogSegments(TableReference, BigQueryOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
dataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
DataCatalogTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
Uses DataCatalog to get the source type and schema for a table.
DataChangeRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
A data change record encodes modifications to Cloud Spanner rows.
DataChangeRecord(String, Timestamp, String, boolean, String, String, List<ColumnType>, List<Mod>, ModType, ValueCaptureType, long, long, String, boolean, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Constructs a data change record for a given partition, at a given timestamp, for a given transaction.
dataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class capable of processing DataChangeRecords.
DataChangeRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
DataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
 
DataEndpoint<T> - Class in org.apache.beam.sdk.fn.data
 
DataEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.DataEndpoint
 
DataflowClient - Class in org.apache.beam.runners.dataflow
Wrapper around the generated Dataflow client to provide common functionality.
DataflowClientFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
 
DataflowJobAlreadyExistsException - Exception in org.apache.beam.runners.dataflow
An exception that is thrown if the unique job name constraint of the Dataflow service is broken because an existing job with the same job name is currently active.
DataflowJobAlreadyExistsException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException
Create a new DataflowJobAlreadyExistsException with the specified DataflowPipelineJob and message.
DataflowJobAlreadyUpdatedException - Exception in org.apache.beam.runners.dataflow
An exception that is thrown if the existing job has already been updated within the Dataflow service and is no longer able to be updated.
DataflowJobAlreadyUpdatedException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException
Create a new DataflowJobAlreadyUpdatedException with the specified DataflowPipelineJob and message.
DataflowJobException - Exception in org.apache.beam.runners.dataflow
A RuntimeException that contains information about a DataflowPipelineJob.
DataflowPipelineDebugOptions - Interface in org.apache.beam.runners.dataflow.options
Internal.
DataflowPipelineDebugOptions.DataflowClientFactory - Class in org.apache.beam.runners.dataflow.options
Returns the default Dataflow client built from the passed in PipelineOptions.
DataflowPipelineDebugOptions.StagerFactory - Class in org.apache.beam.runners.dataflow.options
Creates a Stager object using the class specified in DataflowPipelineDebugOptions.getStagerClass().
DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory - Class in org.apache.beam.runners.dataflow.options
Sets Integer value based on old, deprecated field (DataflowPipelineDebugOptions.getUnboundedReaderMaxReadTimeSec()).
DataflowPipelineJob - Class in org.apache.beam.runners.dataflow
A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>, RunnerApi.Pipeline) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
Constructs the job.
DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
Constructs the job.
DataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
Options that can be used to configure the DataflowRunner.
DataflowPipelineOptions.FlexResourceSchedulingGoal - Enum in org.apache.beam.runners.dataflow.options
Set of available Flexible Resource Scheduling goals.
DataflowPipelineOptions.StagingLocationFactory - Class in org.apache.beam.runners.dataflow.options
Returns a default staging location under GcpOptions.getGcpTempLocation().
DataflowPipelineRegistrar - Class in org.apache.beam.runners.dataflow
Contains the PipelineOptionsRegistrar and PipelineRunnerRegistrar for the DataflowRunner.
DataflowPipelineRegistrar.Options - Class in org.apache.beam.runners.dataflow
DataflowPipelineRegistrar.Runner - Class in org.apache.beam.runners.dataflow
Register the DataflowRunner.
DataflowPipelineTranslator - Class in org.apache.beam.runners.dataflow
DataflowPipelineTranslator knows how to translate Pipeline objects into Cloud Dataflow Service API Jobs.
DataflowPipelineTranslator.JobSpecification - Class in org.apache.beam.runners.dataflow
The result of a job translation.
DataflowPipelineWorkerPoolOptions - Interface in org.apache.beam.runners.dataflow.options
Options that are used to configure the Dataflow pipeline worker pool.
DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType - Enum in org.apache.beam.runners.dataflow.options
Type of autoscaling algorithm to use.
DataflowProfilingAgentConfiguration() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
 
DataflowProfilingOptions - Interface in org.apache.beam.runners.dataflow.options
Options for controlling profiling of pipeline execution.
DataflowProfilingOptions.DataflowProfilingAgentConfiguration - Class in org.apache.beam.runners.dataflow.options
Configuration for the profiling agent.
DataflowRunner - Class in org.apache.beam.runners.dataflow
A PipelineRunner that executes the operations in the pipeline by first translating them to the Dataflow representation using the DataflowPipelineTranslator and then submitting them to a Dataflow service for execution.
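A minimal sketch of selecting this runner programmatically (the project and region values are placeholders):

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
    options.setRunner(DataflowRunner.class);
    options.setProject("my-project");      // placeholder
    options.setRegion("us-central1");      // placeholder
    Pipeline pipeline = Pipeline.create(options);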
DataflowRunner(DataflowPipelineOptions) - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner
 
DataflowRunner.DataflowTransformTranslator - Class in org.apache.beam.runners.dataflow
 
DataflowRunner.StreamingPCollectionViewWriterFn<T> - Class in org.apache.beam.runners.dataflow
A marker DoFn for writing the contents of a PCollection to a streaming PCollectionView backend implementation.
DataflowRunnerHooks - Class in org.apache.beam.runners.dataflow
An instance of this class can be passed to the DataflowRunner to add user defined hooks to be invoked at various times during pipeline execution.
DataflowRunnerHooks() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunnerHooks
 
DataflowRunnerInfo - Class in org.apache.beam.runners.dataflow
Populates versioning and other information for DataflowRunner.
DataflowServiceException - Exception in org.apache.beam.runners.dataflow
Signals there was an error retrieving information about a job from the Cloud Dataflow Service.
DataflowStreamingPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
[Internal] Options for configuring StreamingDataflowWorker.
DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory - Class in org.apache.beam.runners.dataflow.options
Read global get config request period from system property 'windmill.global_config_refresh_period'.
DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory - Class in org.apache.beam.runners.dataflow.options
Read counter reporting period from system property 'windmill.harness_update_reporting_period'.
DataflowStreamingPipelineOptions.LocalWindmillHostportFactory - Class in org.apache.beam.runners.dataflow.options
Factory for creating local Windmill address.
DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory - Class in org.apache.beam.runners.dataflow.options
Read 'MaxStackTraceToReport' from system property 'windmill.max_stack_trace_to_report' or Integer.MAX_VALUE if unspecified.
DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory - Class in org.apache.beam.runners.dataflow.options
Read 'PeriodicStatusPageOutputDirector' from system property 'windmill.periodic_status_page_directory' or null if unspecified.
DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory - Class in org.apache.beam.runners.dataflow.options
Factory for setting value of WindmillServiceStreamingRpcBatchLimit based on environment.
DataflowTemplateJob - Class in org.apache.beam.runners.dataflow.util
A DataflowPipelineJob that is returned when --templateRunner is set.
DataflowTemplateJob() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
DataflowTransformTranslator() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner.DataflowTransformTranslator
 
DataflowTransport - Class in org.apache.beam.runners.dataflow.util
Helpers for cloud communication.
DataflowTransport() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTransport
 
DataflowWorkerHarnessOptions - Interface in org.apache.beam.runners.dataflow.options
Options that are used exclusively within the Dataflow worker harness.
DataflowWorkerLoggingOptions - Interface in org.apache.beam.runners.dataflow.options
Deprecated.
This interface will no longer be the source of truth for worker logging configuration once jobs are executed using a dedicated SDK harness instead of user code being co-located alongside Dataflow worker code. Consider setting the corresponding options within SdkHarnessOptions to ensure forward compatibility.
DataflowWorkerLoggingOptions.Level - Enum in org.apache.beam.runners.dataflow.options
Deprecated.
The set of log levels that can be used on the Dataflow worker.
DataflowWorkerLoggingOptions.WorkerLogLevelOverrides - Class in org.apache.beam.runners.dataflow.options
Deprecated.
Defines a log level override for a specific class, package, or name.
DataframeTransform - Class in org.apache.beam.sdk.extensions.python.transforms
Wrapper for invoking external Python DataframeTransform.
dataSchema - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
dataset - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
datasetExists(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
dataSets - Variable in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
DatasetServiceImpl(BigQueryOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
 
DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
 
DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
DatastoreIO provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
DatastoreV1 provides an API to Read, Write and Delete PCollections of Google Cloud Datastore version v1 Entity objects.
DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that deletes Entities from Cloud Datastore.
DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that deletes Entities associated with the given Keys from Cloud Datastore.
DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that reads the result rows of a Cloud Datastore query as Entity objects.
DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that writes Entity objects to Cloud Datastore.
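A minimal sketch of reading and writing Datastore entities with DatastoreIO (the project id and kind are placeholders):

    import com.google.datastore.v1.Entity;
    import com.google.datastore.v1.Query;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;

    Query.Builder queryBuilder = Query.newBuilder();
    queryBuilder.addKindBuilder().setName("MyKind");   // placeholder kind
    Query query = queryBuilder.build();

    PCollection<Entity> entities =
        p.apply(DatastoreIO.v1().read()
            .withProjectId("my-project")                // placeholder project
            .withQuery(query));
    entities.apply(DatastoreIO.v1().write().withProjectId("my-project"));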
DataStoreV1SchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.datastore
An implementation of SchemaIOProvider for reading and writing payloads with DatastoreIO.
DataStoreV1SchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO - Class in org.apache.beam.sdk.io.gcp.datastore
An abstraction to create schema aware IOs.
DataStoreV1TableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datastore
TableProvider for DatastoreIO for consumption by Beam SQL.
DataStoreV1TableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
 
DataStreamDecoder(Coder<T>, PrefetchableIterator<ByteString>) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
DataStreams - Class in org.apache.beam.sdk.fn.stream
DataStreams.DataStreamDecoder treats multiple ByteStrings as a single input stream, decoding values with the supplied iterator.
DataStreams() - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams
 
DataStreams.DataStreamDecoder<T> - Class in org.apache.beam.sdk.fn.stream
An adapter which converts an InputStream to a PrefetchableIterator of T values using the specified Coder.
DataStreams.ElementDelimitedOutputStream - Class in org.apache.beam.sdk.fn.stream
An adapter which wraps an DataStreams.OutputChunkConsumer as an OutputStream.
DataStreams.OutputChunkConsumer<T> - Interface in org.apache.beam.sdk.fn.stream
DATE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
date(Integer, Integer, Integer) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
 
date(DateTime) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
 
date(DateTime, String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
 
DATE - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
Date - Class in org.apache.beam.sdk.schemas.logicaltypes
A date without a time-zone.
Date() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.Date
 
DATE - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
Beam LogicalType corresponding to ZetaSQL/CalciteSQL DATE type.
DATE_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
DATE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
DATE_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
DATE_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
DateConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
 
DateConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
 
DateFunctions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
DateFunctions.
DateFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
 
DateIncrementAllFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.DateIncrementAllFn
 
DATETIME - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
DateTime - Class in org.apache.beam.sdk.schemas.logicaltypes
A datetime without a time-zone.
DateTime() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
DATETIME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
Beam LogicalType corresponding to ZetaSQL DATETIME type.
DATETIME - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of datetime fields.
DATETIME_SCHEMA - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
DateTimeBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle
 
DateTimeUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql
DateTimeUtils.
DateTimeUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
days(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by days.
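A minimal sketch of windowing a collection into one-day calendar windows (MyEvent is a hypothetical element type):

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;

    PCollection<MyEvent> daily =
        events.apply(Window.into(CalendarWindows.days(1)));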
DDL_EXECUTOR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
Ddl Executor.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
The tag for the deadletter output of FHIR resources.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
The tag for the deadletter output of FHIR Resources.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
The tag for the deadletter output of FHIR Resources from a GetPatientEverything request.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
The tag for the deadletter output of HL7v2 read responses.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
The tag for the deadletter output of HL7v2 Messages.
DeadLetteredTransform<InputT,OutputT> - Class in org.apache.beam.sdk.schemas.io
 
DeadLetteredTransform(SimpleFunction<InputT, OutputT>, String) - Constructor for class org.apache.beam.sdk.schemas.io.DeadLetteredTransform
 
DebeziumIO - Class in org.apache.beam.io.debezium
Utility class which exposes an implementation of DebeziumIO.read() and a Debezium configuration.
DebeziumIO.ConnectorConfiguration - Class in org.apache.beam.io.debezium
A POJO describing a Debezium configuration.
DebeziumIO.Read<T> - Class in org.apache.beam.io.debezium
Implementation of DebeziumIO.read().
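A minimal sketch of a Debezium read, assuming a MySQL connector; the connection values are placeholders, and the format function shown may differ by Beam version:

    import io.debezium.connector.mysql.MySqlConnector;
    import org.apache.beam.io.debezium.DebeziumIO;
    import org.apache.beam.io.debezium.SourceRecordJson;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    PCollection<String> changes = p.apply(
        DebeziumIO.<String>read()
            .withConnectorConfiguration(
                DebeziumIO.ConnectorConfiguration.create()
                    .withConnectorClass(MySqlConnector.class)
                    .withHostName("localhost")     // placeholder host
                    .withPort("3306")              // placeholder port
                    .withUsername("debezium")      // placeholder credentials
                    .withPassword("dbz"))
            .withFormatFunction(new SourceRecordJson.SourceRecordJsonMapper())
            .withCoder(StringUtf8Coder.of()));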
DebeziumReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
DebeziumReadSchemaTransformProvider - Class in org.apache.beam.io.debezium
A schema-aware transform provider for DebeziumIO.
DebeziumReadSchemaTransformProvider() - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
 
DebeziumReadSchemaTransformProvider(Boolean, Integer, Long) - Constructor for class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
 
DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration - Class in org.apache.beam.io.debezium
 
DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.io.debezium
 
debeziumRecordInstant(SourceRecord) - Static method in class org.apache.beam.io.debezium.KafkaConnectUtils
 
DebeziumSDFDatabaseHistory() - Constructor for class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
 
DebeziumTransformRegistrar - Class in org.apache.beam.io.debezium
Exposes DebeziumIO.Read as an external transform for cross-language usage.
DebeziumTransformRegistrar() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar
 
DebeziumTransformRegistrar.ReadBuilder - Class in org.apache.beam.io.debezium
 
DebeziumTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.io.debezium
 
dec() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
 
dec(long) - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
 
dec() - Method in interface org.apache.beam.sdk.metrics.Counter
 
dec(long) - Method in interface org.apache.beam.sdk.metrics.Counter
 
dec() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
 
dec(long) - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
 
dec() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
 
dec(long) - Method in class org.apache.beam.sdk.metrics.NoOpCounter
 
decActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Decrements the ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT by 1 if the metric is enabled.
DECIMAL - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
DECIMAL - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of decimal fields.
decode(InputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
decode(InputStream) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.Coder
Decodes a value of type T from the given input stream in the given context.
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
Deprecated.
only implement and call Coder.decode(InputStream)
decode(InputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.ZstdCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.thrift.ThriftCoder
Decodes a value of type T from the given input stream using the provided ThriftCoder.protocolFactory.
decode(InputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
decodeFromChunkBoundaryToChunkBoundary() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
Skips any remaining bytes in the current ByteString, moves to the next ByteString in the underlying ByteString iterator, and decodes elements until the next boundary.
decodePacked32TimeSeconds(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
decodePacked32TimeSecondsAsJavaTime(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
decodePacked64DatetimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
decodePacked64DatetimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
decodePacked64DatetimeSeconds(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
decodePacked64DatetimeSecondsAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
decodePacked64TimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
decodePacked64TimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
decodePacked64TimeNanos(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
decodePacked64TimeNanosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
decodePane(byte) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
decodeQueryResult(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
decodeTimerDataTimerId(String) - Static method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
Decodes a string into the transform and timer family ids.
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.CollectionCoder
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.DequeCoder
 
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
 
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
decodeToIterable(List<T>, long, InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements with the InputStream at the position where this coder detected the end of the stream.
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
 
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.SetCoder
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
decodeWindowedValue(byte[], Coder) - Static method in class org.apache.beam.runners.jet.Utils
 
DecodingFnDataReceiver<T> - Class in org.apache.beam.sdk.fn.data
A receiver of encoded data, decoding it and passing it onto a downstream consumer.
DecodingFnDataReceiver(Coder<T>, FnDataReceiver<T>) - Constructor for class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
 
decPartitionStreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
decrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
Returns an IdGenerators that will provide successive decrementing longs.
deduplicate(UuidDeduplicationOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Remove duplicates from the output PCollection of a read.
deduplicate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
Deduplicate - Class in org.apache.beam.sdk.transforms
A set of PTransforms which deduplicate input records over a time domain and threshold.
Deduplicate.KeyedValues<K,V> - Class in org.apache.beam.sdk.transforms
Deduplicates keyed values using the key over a specified time domain and threshold.
Deduplicate.Values<T> - Class in org.apache.beam.sdk.transforms
Deduplicates values over a specified time domain and threshold.
Deduplicate.WithRepresentativeValues<T,IdT> - Class in org.apache.beam.sdk.transforms
A PTransform that uses a SerializableFunction to obtain a representative value for each input element used for deduplication.
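A minimal sketch of value deduplication over 30 minutes of processing time (the duration is illustrative):

    import org.apache.beam.sdk.transforms.Deduplicate;
    import org.joda.time.Duration;

    PCollection<String> distinct =
        events.apply(
            Deduplicate.<String>values()
                .withDuration(Duration.standardMinutes(30)));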
deepEquals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
 
deepEquals(Object, Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
 
deepHashCode(Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
 
DEF - Static variable in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
 
DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
 
DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
Default() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
 
Default - Annotation Type in org.apache.beam.sdk.options
Default represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the default value to be returned if no value is specified.
Default.Boolean - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified boolean primitive value.
Default.Byte - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified byte primitive value.
Default.Character - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified char primitive value.
Default.Class - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified Class value.
Default.Double - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified double primitive value.
Default.Enum - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified enum.
Default.Float - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified float primitive value.
Default.InstanceFactory - Annotation Type in org.apache.beam.sdk.options
Value must be of type DefaultValueFactory and have a default constructor.
Default.Integer - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified int primitive value.
Default.Long - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified long primitive value.
Default.Short - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified short primitive value.
Default.String - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified String value.
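A minimal sketch of these annotations on a PipelineOptions interface (the option names and default values are illustrative):

    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;

    public interface MyOptions extends PipelineOptions {
      @Description("Number of parallel requests")
      @Default.Integer(4)
      Integer getNumRequests();
      void setNumRequests(Integer value);

      @Description("Output path")
      @Default.String("gs://my-bucket/output")
      String getOutputPath();
      void setOutputPath(String value);
    }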
DEFAULT_ADVANCE_TIMEOUT_IN_MILLIS - Static variable in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
 
DEFAULT_ATTRIBUTE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
DEFAULT_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
 
DEFAULT_BUFFER_LIMIT_TIME_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
 
DEFAULT_BUFFER_SIZE - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
DEFAULT_BYTE_ARRAY_CODER - Static variable in class org.apache.beam.sdk.io.TFRecordIO
The default coder, which returns each record of the input file as a byte array.
DEFAULT_CALC - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
DEFAULT_CHANGE_STREAM_NAME - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default change stream name for a change stream query is the empty String.
DEFAULT_CONTEXT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
DEFAULT_DEDUPLICATE_DURATION - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
DEFAULT_DURATION - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
The default duration is 10 minutes.
DEFAULT_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default end timestamp for a change stream query is ChangeStreamsConstants.MAX_INCLUSIVE_END_AT.
DEFAULT_INCLUSIVE_START_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default start timestamp for a change stream query is Timestamp.MIN_VALUE.
DEFAULT_INITIAL_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
 
DEFAULT_MASTER_URL - Static variable in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
DEFAULT_MAX_CUMULATIVE_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
 
DEFAULT_MAX_ELEMENTS_TO_OUTPUT - Static variable in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
 
DEFAULT_MAX_INSERT_BLOCK_SIZE - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
 
DEFAULT_MAX_INVOCATION_HISTORY - Static variable in class org.apache.beam.runners.jobsubmission.InMemoryJobService
The default maximum number of completed invocations to keep.
DEFAULT_MAX_NUM_ELEMENTS - Static variable in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
The cost (in time and space) to compute quantiles to a given accuracy is a function of the total number of elements in the data set.
DEFAULT_MAX_RETRIES - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
 
DEFAULT_METADATA_TABLE_NAME - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
DEFAULT_OUTBOUND_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.stream.DataStreams
 
DEFAULT_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
The default precision value used in HllCount.Init.Builder.withPrecision(int) is 15.
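
A short sketch, based on the zetasketch extension's HllCount.Init / HllCount.Extract builders, showing where HllCount.Init.Builder.withPrecision(int) overrides the default of 15; the method name approxDistinctUsers and the input collection are hypothetical.

import org.apache.beam.sdk.extensions.zetasketch.HllCount;
import org.apache.beam.sdk.values.PCollection;

// Approximate distinct count of ids: precision 20 trades more sketch memory
// for lower error than the default precision of 15.
static PCollection<Long> approxDistinctUsers(PCollection<String> userIds) {
  PCollection<byte[]> sketch =
      userIds.apply(HllCount.Init.forStrings().withPrecision(20).globally());
  return sketch.apply(HllCount.Extract.globally());
}
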
DEFAULT_RPC_PRIORITY - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default priority for a change stream query is Options.RpcPriority.HIGH.
DEFAULT_SCHEMA_FIELD_NAME - Static variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
DEFAULT_SCHEMA_RECORD_NAME - Static variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
DEFAULT_SCHEME - Static variable in class org.apache.beam.sdk.io.FileSystems
 
DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
The default is the processing time domain.
DEFAULT_TIMEOUT - Static variable in class org.apache.beam.io.requestresponse.RequestResponseIO
The default Duration to wait until completion of user code.
DEFAULT_UNWINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
The default sharding name template.
DEFAULT_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
 
DEFAULT_USES_RESHUFFLE - Static variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
DEFAULT_UUID_EXTRACTOR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
DEFAULT_VPN_NAME - Static variable in class org.apache.beam.sdk.io.solace.broker.SessionService
 
DEFAULT_WINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
The default windowed sharding name template used when writing windowed files.
DEFAULT_WRITER_CLIENTS_PER_WORKER - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
 
DEFAULT_WRITER_DELIVERY_MODE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
 
DEFAULT_WRITER_NUM_SHARDS - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
 
DEFAULT_WRITER_PUBLISH_LATENCY_METRICS - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
 
DEFAULT_WRITER_SUBMISSION_MODE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
 
DEFAULT_WRITER_TYPE - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
 
DefaultAutoscaler - Class in org.apache.beam.sdk.io.jms
Default implementation of AutoScaler.
DefaultAutoscaler() - Constructor for class org.apache.beam.sdk.io.jms.DefaultAutoscaler
 
DefaultBlobstoreClientBuilderFactory - Class in org.apache.beam.sdk.io.azure.blobstore
Constructs a BlobServiceClientBuilder with the given Azure client property values.
DefaultBlobstoreClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
 
DefaultCoder - Annotation Type in org.apache.beam.sdk.coders
The DefaultCoder annotation specifies a Coder class to handle encoding and decoding instances of the annotated class.
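
A minimal sketch of the @DefaultCoder annotation on a user type; MyRecord and its fields are hypothetical, and the AvroCoder import path assumes the Avro extension module (org.apache.beam.sdk.extensions.avro).

import org.apache.beam.sdk.coders.DefaultCoder;
import org.apache.beam.sdk.extensions.avro.coders.AvroCoder;

// PCollections of MyRecord default to AvroCoder unless a coder is set explicitly.
@DefaultCoder(AvroCoder.class)
public class MyRecord {
  public String id;
  public long timestamp;
}
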
DefaultCoder.DefaultCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
A CoderProviderRegistrar that registers a CoderProvider which can use the @DefaultCoder annotation to provide coder providers that create Coders.
DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider - Class in org.apache.beam.sdk.coders
A CoderProvider that uses the @DefaultCoder annotation to provide coder providers that create Coders.
DefaultCoderCloudObjectTranslatorRegistrar - Class in org.apache.beam.runners.dataflow.util
The CoderCloudObjectTranslatorRegistrar containing the default collection of Coder Cloud Object Translators.
DefaultCoderCloudObjectTranslatorRegistrar() - Constructor for class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
DefaultCoderProvider() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar.DefaultCoderProvider
 
DefaultCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
 
DefaultConcludeTransform() - Constructor for class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
 
defaultConfig(JdbcConnection, Collection<RuleSet>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
 
DefaultErrorHandler() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
 
DefaultExecutableStageContext - Class in org.apache.beam.runners.fnexecution.control
Implementation of an ExecutableStageContext.
defaultFactory() - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
The default ClientBuilderFactory instance.
DefaultFilenamePolicy - Class in org.apache.beam.sdk.io
A default FileBasedSink.FilenamePolicy for windowed and unwindowed files.
DefaultFilenamePolicy.Params - Class in org.apache.beam.sdk.io
Encapsulates constructor parameters to DefaultFilenamePolicy.
DefaultFilenamePolicy.ParamsCoder - Class in org.apache.beam.sdk.io
DefaultGcpRegionFactory - Class in org.apache.beam.runners.dataflow.options
Factory for a default value for Google Cloud region according to https://cloud.google.com/compute/docs/gcloud-compute/#default-properties.
DefaultGcpRegionFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
 
DefaultGoogleAdsClientFactory - Class in org.apache.beam.sdk.io.googleads
The default way to construct a GoogleAdsClient.
DefaultGoogleAdsClientFactory() - Constructor for class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
 
DefaultJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
A JobBundleFactory for which the implementation can specify a custom EnvironmentFactory for environment management.
DefaultJobBundleFactory.ServerInfo - Class in org.apache.beam.runners.fnexecution.control
A container for EnvironmentFactory and its corresponding Grpc servers.
DefaultJobBundleFactory.WrappedSdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
Holder for an SdkHarnessClient along with its associated state and data servers.
DefaultJobServerConfigFactory() - Constructor for class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
 
DefaultMaxCacheMemoryUsageMb() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
 
DefaultMaxCacheMemoryUsageMbFactory() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
 
defaultNaming(String, String) - Static method in class org.apache.beam.sdk.io.FileIO.Write
 
defaultNaming(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.FileIO.Write
Defines a default FileIO.Write.FileNaming which will use the prefix and suffix supplied to create a name based on the window, pane, number of shards, shard index, and compression.
defaultOptions() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Factory method to return a new instance of RpcQosOptions with all default values.
DefaultPipelineOptionsRegistrar - Class in org.apache.beam.sdk.options
A PipelineOptionsRegistrar containing the PipelineOptions subclasses available by default.
DefaultPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
 
DefaultProjectFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
 
defaultPublishResult() - Static method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoders
Returns a new PublishResult coder which by default serializes only the messageId.
DefaultRateLimiter(BackOff, BackOff) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
DefaultRateLimiter(Duration, Duration, Duration) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
DefaultRateLimiter(BackOff, BackOff) - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
DefaultRateLimiter(Duration, Duration, Duration) - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
DefaultRetryStrategy() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
 
defaults() - Static method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
Defaults() - Constructor for class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
DefaultS3ClientBuilderFactory - Class in org.apache.beam.sdk.io.aws.s3
Constructs an AmazonS3ClientBuilder with default values for S3 client properties such as path-style access and accelerated mode.
DefaultS3ClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws.s3.DefaultS3ClientBuilderFactory
 
DefaultS3ClientBuilderFactory - Class in org.apache.beam.sdk.io.aws2.s3
Constructs an S3ClientBuilder with default values for S3 client properties such as path-style access and accelerated mode.
DefaultS3ClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
 
DefaultS3FileSystemSchemeRegistrar - Class in org.apache.beam.sdk.io.aws.s3
Registers the "s3" uri schema to be handled by S3FileSystem.
DefaultS3FileSystemSchemeRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.s3.DefaultS3FileSystemSchemeRegistrar
 
DefaultS3FileSystemSchemeRegistrar - Class in org.apache.beam.sdk.io.aws2.s3
Registers the "s3" uri schema to be handled by S3FileSystem.
DefaultS3FileSystemSchemeRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
 
DefaultSchema - Annotation Type in org.apache.beam.sdk.schemas.annotations
The DefaultSchema annotation specifies a SchemaProvider class to handle obtaining a schema and row for the specified class.
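
A minimal sketch of the @DefaultSchema annotation, assuming the built-in JavaBeanSchema provider; the Purchase bean is hypothetical.

import org.apache.beam.sdk.schemas.JavaBeanSchema;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

// A schema is inferred from the getters/setters, so PCollections of Purchase
// can be used directly with schema-aware transforms.
@DefaultSchema(JavaBeanSchema.class)
public class Purchase {
  private String item;
  private double amount;

  public String getItem() { return item; }
  public void setItem(String item) { this.item = item; }
  public double getAmount() { return amount; }
  public void setAmount(double amount) { this.amount = amount; }
}
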
DefaultSchema.DefaultSchemaProvider - Class in org.apache.beam.sdk.schemas.annotations
SchemaProvider for default schemas.
DefaultSchema.DefaultSchemaProviderRegistrar - Class in org.apache.beam.sdk.schemas.annotations
Registrar for default schemas.
DefaultSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
 
DefaultSchemaProviderRegistrar() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
 
DefaultSequenceCombiner<EventKeyT,EventT,StateT extends MutableState<EventT,?>> - Class in org.apache.beam.sdk.extensions.ordered.combiner
Default global sequence combiner.
DefaultSequenceCombiner(EventExaminer<EventT, StateT>) - Constructor for class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
 
DefaultStopPipelineWatermarkFactory() - Constructor for class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
 
DefaultTableFilter - Class in org.apache.beam.sdk.extensions.sql.meta
The default implementation of the BeamSqlTableFilter interface.
DefaultTableFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
 
DefaultTrigger - Class in org.apache.beam.sdk.transforms.windowing
A trigger that is equivalent to Repeatedly.forever(AfterWatermark.pastEndOfWindow()).
defaultType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
DefaultTypeConversionsFactory() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
 
defaultValue() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
defaultValue() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
defaultValue() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns the default value when there are no values added to the accumulator.
defaultValue() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the default value when there are no values added to the accumulator.
defaultValue() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
defaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
Returns the default value of this transform, or null if there isn't one.
DefaultValueFactory<T> - Interface in org.apache.beam.sdk.options
An interface used with the Default.InstanceFactory annotation to specify the class that will be an instance factory to produce default values for a given getter on PipelineOptions.
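
A minimal sketch of a DefaultValueFactory wired up through Default.InstanceFactory; MyOptions, workDirectory, and TempLocationFactory are hypothetical names.

import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.DefaultValueFactory;
import org.apache.beam.sdk.options.PipelineOptions;

public interface MyOptions extends PipelineOptions {
  // The factory below computes the default lazily from other options.
  @Default.InstanceFactory(TempLocationFactory.class)
  String getWorkDirectory();
  void setWorkDirectory(String value);

  // Derives a default from the pipeline's temp location.
  class TempLocationFactory implements DefaultValueFactory<String> {
    @Override
    public String create(PipelineOptions options) {
      return options.getTempLocation() + "/work";
    }
  }
}
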
deidentify(String, String, DeidentifyConfig) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Deidentify FHIR resources.
deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Deidentify FHIR resources.
Deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
 
deidentify(DoFn<String, String>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
 
deidentifyFhirStore(String, String, DeidentifyConfig) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Deidentify a GCP FHIR Store and write the result into a new FHIR Store.
deidentifyFhirStore(String, String, DeidentifyConfig) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
DeidentifyFn(ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
 
delay(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
For internal use only; no backwards-compatibility guarantees.
Delay() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
 
DelayIntervalRateLimiter() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
 
DelayIntervalRateLimiter(Supplier<Duration>) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
 
DelayIntervalRateLimiter() - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
 
DelayIntervalRateLimiter(Supplier<Duration>) - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
 
delegate() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
delegate(HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register display data from the specified component on behalf of the current component.
delegateBasedUponType(EnumMap<BeamFnApi.StateKey.TypeCase, StateRequestHandler>) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
Returns a StateRequestHandler which delegates to the supplied handler depending on the BeamFnApi.StateRequests type.
DelegateCoder<T,IntermediateT> - Class in org.apache.beam.sdk.coders
A DelegateCoder<T, IntermediateT> wraps a Coder for IntermediateT and encodes/decodes values of type T by converting to/from IntermediateT and then encoding/decoding using the underlying Coder<IntermediateT>.
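
A minimal sketch of a DelegateCoder that encodes java.net.URI values by round-tripping through String with StringUtf8Coder; the wrapping class is hypothetical.

import java.net.URI;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.DelegateCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;

public class UriCoderExample {
  // Converts URI -> String for encoding and String -> URI for decoding,
  // reusing StringUtf8Coder for the actual bytes.
  static final Coder<URI> URI_CODER =
      DelegateCoder.of(StringUtf8Coder.of(), URI::toString, URI::create);
}
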
DelegateCoder(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.DelegateCoder
 
DelegateCoder.CodingFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.coders
A CodingFunction<InputT, OutputT> is a serializable function from InputT to OutputT that may throw any Exception.
DelegatingCounter - Class in org.apache.beam.sdk.metrics
Implementation of Counter that delegates to the instance for the current context.
DelegatingCounter(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
Create a DelegatingCounter with perWorkerCounter and processWideContainer set to false.
DelegatingCounter(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
Create a DelegatingCounter with perWorkerCounter set to false.
DelegatingCounter(MetricName, boolean, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
 
DelegatingDistribution - Class in org.apache.beam.sdk.metrics
Implementation of Distribution that delegates to the instance for the current context.
DelegatingDistribution(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
 
DelegatingDistribution(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
 
DelegatingHistogram - Class in org.apache.beam.sdk.metrics
Implementation of Histogram that delegates to the instance for the current context.
DelegatingHistogram(MetricName, HistogramData.BucketType, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingHistogram
Create a DelegatingHistogram with perWorkerHistogram set to false.
DelegatingHistogram(MetricName, HistogramData.BucketType, boolean, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingHistogram
 
delete() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
Provides a CassandraIO.Write PTransform to delete data from a Cassandra database.
delete(Collection<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
delete(Collection<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
Deletes a collection of resources.
delete(Collection<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
Deletes a collection of resources.
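
A minimal sketch of deleting matched resources through FileSystems; the glob /tmp/staging/*.tmp and the class name are hypothetical, and pipelines using non-local schemes may need to call FileSystems.setDefaultPipelineOptions(...) first.

import java.io.IOException;
import java.util.List;
import java.util.stream.Collectors;
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.io.fs.MatchResult;
import org.apache.beam.sdk.io.fs.MoveOptions;
import org.apache.beam.sdk.io.fs.ResourceId;

public class DeleteTempFiles {
  public static void main(String[] args) throws IOException {
    // Resolve the glob to concrete resources, then delete them in one call.
    MatchResult match = FileSystems.match("/tmp/staging/*.tmp");
    List<ResourceId> toDelete =
        match.metadata().stream()
            .map(MatchResult.Metadata::resourceId)
            .collect(Collectors.toList());
    FileSystems.delete(toDelete, MoveOptions.StandardMoveOptions.IGNORE_MISSING_FILES);
  }
}
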
DELETE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
deleteAsync(T) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
This method is called for each delete event.
DeleteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
 
deleteDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Deletes the dataset specified by the datasetId value.
deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Deletes the dataset specified by the datasetId value.
deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
deleteDicomStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Delete a Dicom Store.
deleteDicomStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.DeleteEntity builder.
deleteFhirStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Delete Fhir store.
deleteFhirStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteFile() - Method in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
 
deleteHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Deletes an HL7v2 message.
deleteHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Deletes an HL7v2 store.
deleteHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.DeleteKey builder.
deleteNewPartition(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the second step of the two-phase delete.
deletePartitionMetadataTable(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Drops the metadata table.
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Delete SchemaPath.
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Delete SchemaPath.
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Delete SchemaPath.
deleteStreamPartitionRow(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the second step of the two-phase delete of a StreamPartition.
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Delete subscription.
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
deleteTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Deletes the table specified by tableId from the dataset.
deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Deletes the table specified by tableId from the dataset.
deleteTable(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
deleteTimer(StateNamespace, String, String, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
deleteTimer(StateNamespace, String, String) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
deleteTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
Removes the timer set in this context for the timestamp and timeDomain.
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
delimitElement() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
dependencies(Row, PipelineOptions) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
List the dependencies needed for this transform.
dependencies(Row, PipelineOptions) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
 
Dependency - Class in org.apache.beam.sdk.expansion.service
 
Dependency() - Constructor for class org.apache.beam.sdk.expansion.service.Dependency
 
dependsOnlyOnEarliestTimestamp() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns true if the result of combining many output timestamps depends only on the earliest.
dependsOnlyOnWindow() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns true if the result does not depend on which outputs were combined but only on the window they are in.
DequeCoder<T> - Class in org.apache.beam.sdk.coders
A Coder for Deque, using the format of IterableLikeCoder.
DequeCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.DequeCoder
 
deregister() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
De-registers the handler for all future requests for state for the registered process bundle instruction id.
deriveIterableValueCoder(WindowedValue.FullWindowedValueCoder) - Static method in class org.apache.beam.runners.jet.Utils
 
deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
 
deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
 
deriveUncollectRowType(RelNode, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
Returns the row type returned by applying the 'UNNEST' operation to a relational expression.
describe(Set<Class<? extends PipelineOptions>>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Outputs the set of options available to be set for the passed in PipelineOptions interfaces.
describeMismatchSafely(BigqueryMatcher.TableAndQuery, Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
describeMismatchSafely(ShardedFile, Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
describeMismatchSafely(T, Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
 
describePipelineOptions(JobApi.DescribePipelineOptionsRequest, StreamObserver<JobApi.DescribePipelineOptionsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
describeTo(Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.testing.RegexMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
description() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
 
description() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
description() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
 
description() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
description() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
 
Description - Annotation Type in org.apache.beam.sdk.options
Descriptions are used to generate human readable output when the --help command is specified.
description() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
description() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
Returns a description regarding the SchemaTransform represented by the SchemaTransformProvider.
deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
deserialize(String) - Static method in class org.apache.beam.sdk.io.kinesis.serde.AwsSerializableUtils
 
deserialize(JsonParser, DeserializationContext) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
 
deserialize(byte[]) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
 
deserializeAwsCredentialsProvider(String) - Static method in class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
 
DeserializeBytesIntoPubsubMessagePayloadOnly() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
 
deserializeOneOf(Expression, List<Encoder<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
 
DeserializerProvider<T> - Interface in org.apache.beam.sdk.io.kafka
Provides a configured Deserializer instance and its associated Coder.
deserializeTimers(Collection<byte[]>, TimerInternals.TimerDataCoderV2) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
desiredBundleSizeBytes - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
DESTINATION - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
Destination() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Destination
 
detect(String) - Static method in enum org.apache.beam.sdk.io.Compression
 
DETECT_NEW_PARTITION_SUFFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
detectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing DetectNewPartitionsDoFn.
DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
This class processes DetectNewPartitionsDoFn.
DetectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.DetectNewPartitionsAction
 
detectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a single instance of an action class capable of detecting and scheduling new partitions to be queried.
DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is responsible for scheduling partitions.
DetectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
Constructs an action class for detecting / scheduling new partitions.
DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
 
DetectNewPartitionsDoFn(Instant, ActionFactory, DaoFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A SplittableDoFn (SDF) that is responsible for scheduling partitions to be queried.
DetectNewPartitionsDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
This class needs a DaoFactory to build DAOs to access the partition metadata tables.
DetectNewPartitionsRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
This restriction tracker delegates most of its behavior to an internal TimestampRangeTracker.
DetectNewPartitionsRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
 
DetectNewPartitionsState - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
Metadata of the progress of DetectNewPartitionsDoFn from the metadata table.
DetectNewPartitionsState(Instant, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
DetectNewPartitionsTracker - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
 
DetectNewPartitionsTracker(long) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.DetectNewPartitionsTracker
 
detectStreamingMode(Pipeline, StreamingOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
Analyse the pipeline to determine if we have to switch to streaming mode for the pipeline translation and update StreamingOptions accordingly.
DicomIO - Class in org.apache.beam.sdk.io.gcp.healthcare
The DicomIO connector allows Beam pipelines to make calls to the Dicom API of the Google Cloud Healthcare API (https://cloud.google.com/healthcare/docs/how-tos#dicom-guide).
DicomIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
 
DicomIO.ReadStudyMetadata - Class in org.apache.beam.sdk.io.gcp.healthcare
This class makes a call to the retrieve metadata endpoint (https://cloud.google.com/healthcare/docs/how-tos/dicomweb#retrieving_metadata).
DicomIO.ReadStudyMetadata.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
 
dicomStorePath - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
DicomWebPath() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
DirectOptions - Interface in org.apache.beam.runners.direct
Options that can be used to configure the DirectRunner.
DirectOptions.AvailableParallelismFactory - Class in org.apache.beam.runners.direct
A DefaultValueFactory that returns the result of Runtime.availableProcessors() from the DirectOptions.AvailableParallelismFactory.create(PipelineOptions) method.
DIRECTORY_CONTAINER - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
Shard is a file within a directory.
DirectRegistrar - Class in org.apache.beam.runners.direct
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the DirectRunner.
DirectRegistrar.Options - Class in org.apache.beam.runners.direct
Registers the DirectOptions.
DirectRegistrar.Runner - Class in org.apache.beam.runners.direct
Registers the DirectRunner.
DirectRunner - Class in org.apache.beam.runners.direct
A PipelineRunner that executes a Pipeline within the process that constructed the Pipeline.
DirectRunner() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
 
DirectRunner.DirectPipelineResult - Class in org.apache.beam.runners.direct
The result of running a Pipeline with the DirectRunner.
DirectStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
A StreamObserver which uses synchronization on the underlying CallStreamObserver to provide thread safety.
DirectStreamObserver(Phaser, CallStreamObserver<T>) - Constructor for class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
DirectTestOptions - Interface in org.apache.beam.runners.direct
Internal-only options for tweaking the behavior of the DirectRunner in ways that users should never do.
DISALLOW_COMBINER_LIFTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
DISALLOWED_CONSUMER_PROPERTIES - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIOUtils
 
discard() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
 
discardDataset(Dataset) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
discardingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
Returns a new Window PTransform that uses the registered WindowFn and Triggering behavior, and that discards elements in a pane after they are triggered.
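
A short sketch of a windowing configuration that ends with discardingFiredPanes(), so elements emitted in an early pane are not re-emitted in later panes; the method name, input collection, and trigger/lateness values are hypothetical and illustrative only.

import org.joda.time.Duration;
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;

static PCollection<String> windowEvents(PCollection<String> events) {
  return events.apply(
      Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
          .triggering(
              AfterWatermark.pastEndOfWindow()
                  .withEarlyFirings(
                      AfterProcessingTime.pastFirstElementInPane()
                          .plusDelayOf(Duration.standardSeconds(30))))
          .withAllowedLateness(Duration.standardMinutes(5))
          // Each pane contains only elements that arrived since the previous firing.
          .discardingFiredPanes());
}
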
discoverSchemaTransform(ExpansionApi.DiscoverSchemaTransformRequest, StreamObserver<ExpansionApi.DiscoverSchemaTransformResponse>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
 
dispatchBag(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
dispatchBag(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
dispatchDefault() - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
dispatchMap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
dispatchMap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
dispatchMultimap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
dispatchMultimap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
dispatchOrderedList(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
dispatchSet(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
dispatchSet(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
dispatchValue(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
dispatchValue(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
DISPLAY_DATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
DisplayData - Class in org.apache.beam.sdk.transforms.display
Static display data associated with a pipeline component.
displayData - Variable in class org.apache.beam.sdk.transforms.PTransform
 
DisplayData.Builder - Interface in org.apache.beam.sdk.transforms.display
Utility to build up display data from a component and its included subcomponents.
DisplayData.Identifier - Class in org.apache.beam.sdk.transforms.display
Unique identifier for a display data item within a component.
DisplayData.Item - Class in org.apache.beam.sdk.transforms.display
Items are the unit of display data.
DisplayData.ItemSpec<T> - Class in org.apache.beam.sdk.transforms.display
Specifies an DisplayData.Item to register as display data.
DisplayData.Path - Class in org.apache.beam.sdk.transforms.display
Structured path of registered display data within a component hierarchy.
DisplayData.Type - Enum in org.apache.beam.sdk.transforms.display
Display data type.
Distinct<T> - Class in org.apache.beam.sdk.transforms
Distinct<T> takes a PCollection<T> and returns a PCollection<T> that has all distinct elements of the input.
Distinct() - Constructor for class org.apache.beam.sdk.transforms.Distinct
 
Distinct.WithRepresentativeValues<T,IdT> - Class in org.apache.beam.sdk.transforms
A Distinct PTransform that uses a SerializableFunction to obtain a representative value for each input element.
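
Two short sketches of Distinct: the first drops exact duplicates, the second deduplicates case-insensitively via withRepresentativeValueFn, under the assumption that a lower-cased string is an acceptable representative value; method names are hypothetical.

import org.apache.beam.sdk.transforms.Distinct;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

// Keeps one occurrence of each distinct element.
static PCollection<String> distinctWords(PCollection<String> words) {
  return words.apply(Distinct.create());
}

// Keeps one element per representative value (here, the lower-cased string).
static PCollection<String> distinctIgnoringCase(PCollection<String> words) {
  return words.apply(
      Distinct.withRepresentativeValueFn((String s) -> s.toLowerCase())
          .withRepresentativeType(TypeDescriptors.strings()));
}
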
Distribution - Interface in org.apache.beam.sdk.metrics
A metric that reports information about the distribution of reported values.
distribution(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that records various statistics about the distribution of reported values.
distribution(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that records various statistics about the distribution of reported values.
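
A minimal sketch of a user-defined Distribution metric updated inside a DoFn; RecordLineLengths and the metric name "lineLengths" are hypothetical.

import org.apache.beam.sdk.metrics.Distribution;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.transforms.DoFn;

// Records the length of every processed line; the runner aggregates
// min/max/mean/count for the "lineLengths" distribution.
class RecordLineLengths extends DoFn<String, String> {
  private final Distribution lineLengths =
      Metrics.distribution(RecordLineLengths.class, "lineLengths");

  @ProcessElement
  public void processElement(@Element String line, OutputReceiver<String> out) {
    lineLengths.update(line.length());
    out.output(line);
  }
}
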
DistributionImpl - Class in org.apache.beam.runners.jet.metrics
Implementation of Distribution.
DistributionImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.DistributionImpl
 
DistributionResult - Class in org.apache.beam.sdk.metrics
The result of a Distribution metric.
DistributionResult() - Constructor for class org.apache.beam.sdk.metrics.DistributionResult
 
divideBy(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
DLPDeidentifyText - Class in org.apache.beam.sdk.extensions.ml
A PTransform connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and deidentifying text according to provided settings.
DLPDeidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
DLPDeidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
 
DLPInspectText - Class in org.apache.beam.sdk.extensions.ml
A PTransform connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and inspecting text for identifying data according to provided settings.
DLPInspectText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
DLPInspectText.Builder - Class in org.apache.beam.sdk.extensions.ml
 
DLPReidentifyText - Class in org.apache.beam.sdk.extensions.ml
A PTransform connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and re-identifying text according to provided settings.
DLPReidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
DLPReidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
 
DlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
DlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
 
doChecks(PAssert.PAssertionSite, ActualT, SerializableFunction<ActualT, Void>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
DockerEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
An EnvironmentFactory that creates docker containers by shelling out to docker.
DockerEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
Provider for DockerEnvironmentFactory.
docToBulk() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
DocToBulk() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
 
Document() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
 
doesMetadataTableExist() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
DoFn<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
The argument to ParDo providing the code to use to process elements of the input PCollection.
DoFn() - Constructor for class org.apache.beam.sdk.transforms.DoFn
 
DoFn.AlwaysFetched - Annotation Type in org.apache.beam.sdk.transforms
Annotation for declaring that a state parameter is always fetched.
DoFn.BoundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
Annotation on a splittable DoFn specifying that the DoFn performs a bounded amount of work per input element, so applying it to a bounded PCollection will also produce a bounded PCollection.
DoFn.BundleFinalizer - Interface in org.apache.beam.sdk.transforms
A parameter that is accessible during @StartBundle, @ProcessElement and @FinishBundle that allows the caller to register a callback that will be invoked after the bundle has been successfully completed and the runner has committed the output.
DoFn.BundleFinalizer.Callback - Interface in org.apache.beam.sdk.transforms
An instance of a function that will be invoked after bundle finalization.
DoFn.Element - Annotation Type in org.apache.beam.sdk.transforms
DoFn.FieldAccess - Annotation Type in org.apache.beam.sdk.transforms
Annotation for specifying specific fields that are accessed in a Schema PCollection.
DoFn.FinishBundle - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to finish processing a batch of elements.
DoFn.FinishBundleContext - Class in org.apache.beam.sdk.transforms
Information accessible while within the DoFn.FinishBundle method.
DoFn.GetInitialRestriction - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that maps an element to an initial restriction for a splittable DoFn.
DoFn.GetInitialWatermarkEstimatorState - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that maps an element and restriction to initial watermark estimator state for a splittable DoFn.
DoFn.GetRestrictionCoder - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that returns the coder to use for the restriction of a splittable DoFn.
DoFn.GetSize - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that returns the corresponding size for an element and restriction pair.
DoFn.GetWatermarkEstimatorStateCoder - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that returns the coder to use for the watermark estimator state of a splittable DoFn.
DoFn.Key - Annotation Type in org.apache.beam.sdk.transforms
Parameter annotation for dereferencing input element key in KV pair.
DoFn.MultiOutputReceiver - Interface in org.apache.beam.sdk.transforms
Receives tagged output for a multi-output function.
DoFn.NewTracker - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that creates a new RestrictionTracker for the restriction of a splittable DoFn.
DoFn.NewWatermarkEstimator - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that creates a new WatermarkEstimator for the watermark state of a splittable DoFn.
DoFn.OnTimer - Annotation Type in org.apache.beam.sdk.transforms
Annotation for registering a callback for a timer.
DoFn.OnTimerContext - Class in org.apache.beam.sdk.transforms
Information accessible when running a DoFn.OnTimer method.
DoFn.OnTimerFamily - Annotation Type in org.apache.beam.sdk.transforms
Annotation for registering a callback for a timerFamily.
DoFn.OnWindowExpiration - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use for performing actions on window expiration.
DoFn.OnWindowExpirationContext - Class in org.apache.beam.sdk.transforms
 
DoFn.OutputReceiver<T> - Interface in org.apache.beam.sdk.transforms
Receives values of the given type.
DoFn.ProcessContext - Class in org.apache.beam.sdk.transforms
Information accessible when running a DoFn.ProcessElement method.
DoFn.ProcessContinuation - Class in org.apache.beam.sdk.transforms
When used as a return value of DoFn.ProcessElement, indicates whether there is more work to be done for the current element.
DoFn.ProcessElement - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use for processing elements.
DoFn.RequiresStableInput - Annotation Type in org.apache.beam.sdk.transforms
Annotation that may be added to a DoFn.ProcessElement, DoFn.OnTimer, or DoFn.OnWindowExpiration method to indicate that the runner must ensure that the observable contents of the input PCollection or mutable state are stable upon retries.
DoFn.RequiresTimeSortedInput - Annotation Type in org.apache.beam.sdk.transforms
Annotation that may be added to a DoFn.ProcessElement method to indicate that the runner must ensure that the observable contents of the input PCollection are sorted by time, in ascending order.
DoFn.Restriction - Annotation Type in org.apache.beam.sdk.transforms
DoFn.Setup - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to prepare an instance for processing bundles of elements.
DoFn.SideInput - Annotation Type in org.apache.beam.sdk.transforms
Parameter annotation for the SideInput for a DoFn.ProcessElement method.
DoFn.SplitRestriction - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that splits restriction of a splittable DoFn into multiple parts to be processed in parallel.
DoFn.StartBundle - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to prepare an instance for processing a batch of elements.
DoFn.StartBundleContext - Class in org.apache.beam.sdk.transforms
Information accessible while within the DoFn.StartBundle method.
DoFn.StateId - Annotation Type in org.apache.beam.sdk.transforms
Annotation for declaring and dereferencing state cells.
DoFn.Teardown - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to clean up this instance before it is discarded.
DoFn.TimerFamily - Annotation Type in org.apache.beam.sdk.transforms
Parameter annotation for the TimerMap for a DoFn.ProcessElement method.
DoFn.TimerId - Annotation Type in org.apache.beam.sdk.transforms
Annotation for declaring and dereferencing timers.
DoFn.Timestamp - Annotation Type in org.apache.beam.sdk.transforms
DoFn.TruncateRestriction - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that truncates the restriction of a splittable DoFn into a bounded one.
DoFn.UnboundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
Annotation on a splittable DoFn specifying that the DoFn performs an unbounded amount of work per input element, so applying it to a bounded PCollection will produce an unbounded PCollection.
DoFn.WatermarkEstimatorState - Annotation Type in org.apache.beam.sdk.transforms
Parameter annotation for the watermark estimator state for the DoFn.NewWatermarkEstimator method.
DoFn.WindowedContext - Class in org.apache.beam.sdk.transforms
Information accessible to all methods in this DoFn where the context is in some window.
DoFnFunction<OutputT,InputT> - Class in org.apache.beam.runners.twister2.translators.functions
DoFn function.
DoFnFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
 
DoFnFunction(Twister2TranslationContext, DoFn<InputT, OutputT>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, List<TupleTag<?>>, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, TupleTag<OutputT>, DoFnSchemaInformation, Map<TupleTag<?>, Integer>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
 
DoFnOutputReceivers - Class in org.apache.beam.sdk.transforms
DoFnOutputReceivers() - Constructor for class org.apache.beam.sdk.transforms.DoFnOutputReceivers
 
DoFnRunnerWithMetricsUpdate<InputT,OutputT> - Class in org.apache.beam.runners.flink.metrics
DoFnRunner decorator which registers MetricsContainerImpl.
DoFnRunnerWithMetricsUpdate(String, DoFnRunner<InputT, OutputT>, FlinkMetricContainer) - Constructor for class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
DoFnSchemaInformation - Class in org.apache.beam.sdk.transforms
Represents information about how a DoFn extracts schemas.
DoFnSchemaInformation() - Constructor for class org.apache.beam.sdk.transforms.DoFnSchemaInformation
 
DoFnSchemaInformation.Builder - Class in org.apache.beam.sdk.transforms
The builder object.
DoFnTester<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
Deprecated.
Use TestPipeline with the DirectRunner.
DoFnTester.CloningBehavior - Enum in org.apache.beam.sdk.transforms
Deprecated.
Use TestPipeline with the DirectRunner.
doHoldLock(Range.ByteStringRange, String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Returns true if the uuid holds the lock of the partition.
doPartitionsOverlap(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Returns true if the two ByteStringRanges overlap, otherwise false.
dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
 
dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
 
dotExpressionComponent(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
 
DotExpressionComponentContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
 
DotExpressionComponentContext() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
 
DotExpressionContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
 
DOUBLE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
DOUBLE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of double fields.
DOUBLE_NAN_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
DOUBLE_NEGATIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
DOUBLE_POSITIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
DoubleCoder - Class in org.apache.beam.sdk.coders
A DoubleCoder encodes Double values in 8 bytes using Java serialization.
doubles() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Double.
doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the maximum of the input PCollection's elements, or Double.NEGATIVE_INFINITY if there are no elements.
doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the minimum of the input PCollection's elements, or Double.POSITIVE_INFINITY if there are no elements.
doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the sum of the input PCollection's elements, or 0 if there are no elements.
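
Two short sketches of the global double combiners above; the method names and the measurements collection are hypothetical.

import org.apache.beam.sdk.transforms.Max;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.values.PCollection;

// Sums all elements; yields 0 if the input is empty.
static PCollection<Double> total(PCollection<Double> measurements) {
  return measurements.apply(Sum.doublesGlobally());
}

// Takes the maximum; yields Double.NEGATIVE_INFINITY if the input is empty.
static PCollection<Double> largest(PCollection<Double> measurements) {
  return measurements.apply(Max.doublesGlobally());
}
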
doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
doubleToByteArray(double) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
drive() - Method in interface org.apache.beam.runners.local.ExecutionDriver
 
DriverConfiguration() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
dropExpiredTimers(SparkTimerInternals, WindowingStrategy<?, W>) - Static method in class org.apache.beam.runners.spark.util.TimerUtils
 
DropFields - Class in org.apache.beam.sdk.schemas.transforms
A transform to drop fields from a schema.
DropFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.DropFields
 
DropFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
Implementation class for DropFields.
dropTable(SqlParserPos, boolean, SqlIdentifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
Creates a DROP TABLE.
dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
dropTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Drops a table.
dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
dryRunQuery(String, JobConfigurationQuery, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Dry runs the query in the given project.
dryRunQuery(String, JobConfigurationQuery, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
DurationCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes a joda Duration as a Long using the format of VarLongCoder.
DurationConvert() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
 
durationMilliSec - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
DynamicAvroDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.extensions.avro.io
A specialization of FileBasedSink.DynamicDestinations for AvroIO.
DynamicAvroDestinations() - Constructor for class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
 
DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
DynamicDestinations<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
This class provides the most general way of specifying dynamic BigQuery table destinations.
DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
DynamicDestinations - Interface in org.apache.beam.sdk.io.iceberg
 
DynamicFileDestinations - Class in org.apache.beam.sdk.io
Some helper classes that derive from FileBasedSink.DynamicDestinations.
DynamicFileDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicFileDestinations
 
DynamicProtoCoder - Class in org.apache.beam.sdk.extensions.protobuf
A Coder using Google Protocol Buffers binary format.
dynamicWrite() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
DynamoDBIO - Class in org.apache.beam.sdk.io.aws.dynamodb
Deprecated.
Module beam-sdks-java-io-amazon-web-services is deprecated and will eventually be removed. Please migrate to DynamoDBIO in module beam-sdks-java-io-amazon-web-services2.
DynamoDBIO() - Constructor for class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO
Deprecated.
 
DynamoDBIO - Class in org.apache.beam.sdk.io.aws2.dynamodb
IO to read from and write to DynamoDB tables.
DynamoDBIO() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
 
DynamoDBIO.Read<T> - Class in org.apache.beam.sdk.io.aws.dynamodb
Deprecated.
Read data from DynamoDB and return ScanResult.
DynamoDBIO.Read<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
Read data from DynamoDB using DynamoDBIO.Read.getScanRequestFn() and emit an element of type T for each ScanResponse using the mapping function DynamoDBIO.Read.getScanResponseMapperFn().
DynamoDBIO.RetryConfiguration - Class in org.apache.beam.sdk.io.aws.dynamodb
Deprecated.
A POJO encapsulating a configuration for retry behavior when issuing requests to DynamoDB.
DynamoDBIO.Write<T> - Class in org.apache.beam.sdk.io.aws.dynamodb
Deprecated.
Write a PCollection of data into DynamoDB.
DynamoDBIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
Write a PCollection of data into DynamoDB.

E

eitherOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Returns a Watch.Growth.TerminationCondition that holds when at least one of the given two conditions holds.
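
A minimal sketch combining two termination conditions with eitherOf, as used when configuring a Watch-based transform such as FileIO.match().continuously(...); the pipeline p, filepattern, polling interval, and durations are illustrative assumptions.

    // Stop watching the filepattern when either condition holds:
    // no new files for 10 minutes, or 1 hour of total watching.
    PCollection<MatchResult.Metadata> matches =
        p.apply(FileIO.match()
            .filepattern("gs://bucket/path/*.json")
            .continuously(
                Duration.standardSeconds(30),
                Watch.Growth.eitherOf(
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardMinutes(10)),
                    Watch.Growth.afterTotalOf(Duration.standardHours(1)))));
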
ElasticsearchIO - Class in org.apache.beam.sdk.io.elasticsearch
Transforms for reading and writing data from/to Elasticsearch.
ElasticsearchIO.BoundedElasticsearchSource - Class in org.apache.beam.sdk.io.elasticsearch
A BoundedSource reading from Elasticsearch.
ElasticsearchIO.BulkIO - Class in org.apache.beam.sdk.io.elasticsearch
A PTransform writing Bulk API entities created by ElasticsearchIO.DocToBulk to an Elasticsearch cluster.
ElasticsearchIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
A POJO describing a connection configuration to Elasticsearch.
ElasticsearchIO.DocToBulk - Class in org.apache.beam.sdk.io.elasticsearch
A PTransform converting docs to their Bulk API counterparts.
ElasticsearchIO.Document - Class in org.apache.beam.sdk.io.elasticsearch
 
ElasticsearchIO.DocumentCoder - Class in org.apache.beam.sdk.io.elasticsearch
 
ElasticsearchIO.Read - Class in org.apache.beam.sdk.io.elasticsearch
A PTransform reading data from Elasticsearch.
ElasticsearchIO.RetryConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
A POJO encapsulating a configuration for retry behavior when issuing requests to Elasticsearch.
ElasticsearchIO.Write - Class in org.apache.beam.sdk.io.elasticsearch
A PTransform writing data to Elasticsearch.
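
A minimal read/write sketch for the ElasticsearchIO entries above; the host address and index name are assumptions, and the exact ConnectionConfiguration.create overloads vary between Beam releases.

    // Assumes: Pipeline p; PCollection<String> jsonDocs.
    ElasticsearchIO.ConnectionConfiguration conn =
        ElasticsearchIO.ConnectionConfiguration.create(
            new String[] {"http://localhost:9200"}, "my-index");

    PCollection<String> docs =
        p.apply(ElasticsearchIO.read().withConnectionConfiguration(conn));

    jsonDocs.apply(ElasticsearchIO.write().withConnectionConfiguration(conn));
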
ElasticsearchIO.Write.BooleanFieldValueExtractFn - Interface in org.apache.beam.sdk.io.elasticsearch
 
ElasticsearchIO.Write.FieldValueExtractFn - Interface in org.apache.beam.sdk.io.elasticsearch
 
ElasticsearchIOITCommon - Class in org.apache.beam.sdk.io.elasticsearch
Manipulates test data used by the ElasticsearchIO integration tests.
ElasticsearchIOITCommon() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon
 
ElasticsearchIOITCommon.ElasticsearchPipelineOptions - Interface in org.apache.beam.sdk.io.elasticsearch
Pipeline options for elasticsearch tests.
element() - Method in class org.apache.beam.runners.twister2.utils.Twister2AssignContext
 
element() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns the input element to be processed.
element() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
Returns the current element.
element() - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
 
elementCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
elementCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.IterableSideInputHandler
Returns the Coder to use for the elements of the resulting values iterable.
elementCountAtLeast(int) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterPane
Creates a trigger that fires when the pane contains at least countElems elements.
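
A minimal sketch of a windowing configuration using elementCountAtLeast as the trigger; the window size, element count, and input PCollection<String> named events are illustrative assumptions.

    PCollection<KV<String, Long>> counts =
        events
            .apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
                .triggering(AfterPane.elementCountAtLeast(100))
                .withAllowedLateness(Duration.ZERO)
                .discardingFiredPanes())
            .apply(Count.perElement());
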
ElementDelimitedOutputStream(DataStreams.OutputChunkConsumer<ByteString>, int) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
ElementEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ElementEvent
 
elements() - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each element of the input PCollection to a String using the Object.toString() method.
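
For example, a hedged one-liner assuming a PCollection<Integer> named numbers:

    // Each element is converted with Object.toString(), e.g. 42 -> "42".
    PCollection<String> asText = numbers.apply(ToString.elements());
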
elementsRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of elements read by a source.
elementsReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of elements read by a source split.
elementsWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
Counter of elements written to a sink.
ElemToBytesFunction<V> - Class in org.apache.beam.runners.twister2.translators.functions
Map to tuple function.
ElemToBytesFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
 
ElemToBytesFunction(WindowedValue.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
 
EMBEDDED_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
EmbeddedEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
An EnvironmentFactory that communicates with a FnHarness which is executing in the same process.
EmbeddedEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
Provider of EmbeddedEnvironmentFactory.
empty() - Static method in class org.apache.beam.runners.local.StructuralKey
Get the empty StructuralKey.
empty() - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
 
empty() - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
 
EMPTY - Static variable in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
 
EMPTY - Static variable in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
 
EMPTY - Static variable in class org.apache.beam.sdk.io.range.ByteKey
An empty key.
empty() - Static method in class org.apache.beam.sdk.metrics.GaugeResult
 
empty() - Static method in class org.apache.beam.sdk.metrics.StringSetResult
 
EMPTY - Static variable in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
 
empty() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Asserts that the iterable in question is empty.
empty() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
empty() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.empty().
empty(Schema) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces an empty PCollection of rows.
empty(Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces an empty PCollection.
empty(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces an empty PCollection.
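
A minimal sketch of the three empty(...) overloads of Create; the pipeline p and the row schema are assumed for illustration.

    PCollection<String> noStrings = p.apply(Create.empty(StringUtf8Coder.of()));
    PCollection<Long> noLongs = p.apply(Create.empty(TypeDescriptors.longs()));
    PCollection<Row> noRows =
        p.apply(Create.empty(Schema.builder().addStringField("name").build()));
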
empty() - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns an empty CoGbkResult.
empty(Pipeline) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns an empty KeyedPCollectionTuple<K> on the given pipeline.
empty() - Static method in class org.apache.beam.sdk.transforms.Requirements
Describes an empty set of requirements.
empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionList
Returns an empty PCollectionList that is part of the given Pipeline.
empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
Returns an empty PCollectionRowTuple that is part of the given Pipeline.
empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
Returns an empty PCollectionTuple that is part of the given Pipeline.
empty() - Static method in class org.apache.beam.sdk.values.TupleTagList
Returns an empty TupleTagList.
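
A minimal sketch showing how the empty(...) factory methods above serve as starting points for incrementally built containers; p, pc1, and pc2 are assumed (a Pipeline and two PCollection<String>s).

    PCollectionList<String> list =
        PCollectionList.<String>empty(p).and(pc1).and(pc2);   // order-preserving list

    TupleTagList tags = TupleTagList.empty().and(new TupleTag<String>("main"));
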
EMPTY_ROW - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
 
EMPTY_SCHEMA - Static variable in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
 
emptyArray() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.emptyArray().
emptyBatch() - Method in class org.apache.beam.runners.spark.io.CreateStream
Adds an empty batch.
EmptyCheckpointMark - Class in org.apache.beam.runners.spark.io
Passing null values to Spark's Java API may cause problems because of Guava preconditions.
emptyIterable() - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
Returns an empty PrefetchableIterable.
emptyIterable() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.emptyIterable().
emptyIterator() - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
Returns an empty PrefetchableIterator.
emptyList() - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
EmptyListDefault() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.EmptyListDefault
 
EmptyListenersList() - Constructor for class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
 
EmptyMatchTreatment - Enum in org.apache.beam.sdk.io.fs
Options for allowing or disallowing filepatterns that match no resources in FileSystems.match(java.util.List<java.lang.String>).
emptyProperties() - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
 
ENABLE_CUSTOM_PUBSUB_SINK - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
ENABLE_CUSTOM_PUBSUB_SOURCE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
enableAbandonedNodeEnforcement(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
Enables the abandoned node detection.
enableAutoRunIfMissing(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
If enabled, a pipeline.run() statement will be added automatically if it is missing from the test.
enableSSL() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Enable SSL connection to Redis server.
EnableStreamingEngineFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.EnableStreamingEngineFactory
 
EncodableThrowable - Class in org.apache.beam.sdk.values
A wrapper around a Throwable for use with coders.
encode(RandomAccessData, OutputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
encode(RandomAccessData, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
encode(ByteString, OutputStream) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
encode(T, Coder<T>) - Static method in class org.apache.beam.runners.jet.Utils
 
encode(BigDecimal, OutputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
encode(BigDecimal, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
encode(Short, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
encode(BigInteger, OutputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
encode(BigInteger, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
encode(BitSet, OutputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
encode(BitSet, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
encode(Boolean, OutputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
encode(byte[], OutputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
encode(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
encode(Byte, OutputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.Coder
Encodes the given value of type T onto the given output stream.
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
Deprecated.
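
A minimal sketch of calling Coder.encode directly outside a pipeline, assuming the value fits in memory; encode and decode may throw IOException (including CoderException).

    ByteArrayOutputStream out = new ByteArrayOutputStream();
    StringUtf8Coder.of().encode("hello", out);
    byte[] bytes = out.toByteArray();

    String roundTripped = StringUtf8Coder.of().decode(new ByteArrayInputStream(bytes));
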
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
encode(Double, OutputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
encode(ReadableDuration, OutputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
encode(Float, OutputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
 
encode(Instant, OutputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
encode(IterableT, OutputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
encode(KV<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
 
encode(KV<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
encode(Map<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
 
encode(Map<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
encode(ShardedKey<KeyT>, OutputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
 
encode(SortedMap<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
encode(SortedMap<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
encode(String, OutputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
encode(String, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
encode(Integer, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
encode(Void, OutputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.ZstdCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
encode(SequenceRangeAccumulator, OutputStream) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
 
encode(ByteString, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
encode(HyperLogLogPlus, OutputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
encode(EncodedBoundedWindow, OutputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
encode(Message, OutputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
 
encode(AttributeValue, OutputStream) - Method in class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoder
 
encode(AttributeValue, OutputStream) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
 
encode(CountingSource.CounterMark, OutputStream) - Method in class org.apache.beam.sdk.io.CountingSource.CounterMarkCoder
 
encode(DefaultFilenamePolicy.Params, OutputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
 
encode(ElasticsearchIO.Document, OutputStream) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
 
encode(FileBasedSink.FileResult<DestinationT>, OutputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
encode(MatchResult.Metadata, OutputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
 
encode(MatchResult.Metadata, OutputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
 
encode(ResourceId, OutputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
encode(BigQueryInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
encode(BigQueryStorageApiInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
encode(RowMutation, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
encode(TableRow, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
encode(TableRow, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
encode(BigtableWriteResult, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
encode(FhirSearchParameter<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
 
encode(HealthcareIOError<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
 
encode(HL7v2Message, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
encode(HL7v2ReadResponse, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
encode(JsonArray, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
encode(OffsetByteRange, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
encode(SubscriptionPartition, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
encode(Uuid, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
encode(KafkaRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
encode(ProducerRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
encode(TopicPartition, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
 
encode(PulsarMessage, OutputStream) - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
 
encode(OffsetRange, OutputStream) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
encode(FileIO.ReadableFile, OutputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
 
encode(SplunkEvent, OutputStream) - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.thrift.ThriftCoder
Encodes the given value of type T onto the given output stream using provided ThriftCoder.protocolFactory.
encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
encode(TestStream<T>, OutputStream) - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
 
encode(CoGbkResult, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
encode(RawUnionValue, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
encode(RawUnionValue, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
encode(GlobalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
encode(IntervalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
encode(PaneInfo, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
encode(FailsafeValueInSingleWindow<T, ErrorT>, OutputStream) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
encode(FailsafeValueInSingleWindow<T, ErrorT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
encode(PCollectionViews.ValueOrMetadata<T, MetaT>, OutputStream) - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
 
encode(TimestampedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
encode(ValueInSingleWindow<T>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
encode(ValueInSingleWindow<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
encode(ValueWithRecordId<ValueT>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
encode(ValueWithRecordId<ValueT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
encodeAndOwn(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
Encodes the provided value with the identical encoding to ByteArrayCoder.encode(byte[], java.io.OutputStream), but with optimizations that take ownership of the value.
EncodedBoundedWindow - Class in org.apache.beam.sdk.fn.windowing
An encoded BoundedWindow used within Runners to track window information without needing to decode the window.
EncodedBoundedWindow() - Constructor for class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
 
EncodedBoundedWindow.Coder - Class in org.apache.beam.sdk.fn.windowing
encodeDoLoopBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
encodeDoLoopByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
encodeDoLoopTwiddleBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
encodeDoLoopTwiddleByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
encodeLoopBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
encodeLoopByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 4-byte integer with seconds precision.
encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 4-byte integer with seconds precision.
encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as an 8-byte integer with microseconds precision.
encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as an 8-byte integer with microseconds precision.
encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as an 8-byte integer with seconds precision.
encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as an 8-byte integer with seconds precision.
encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as an 8-byte integer with microseconds precision.
encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as an 8-byte integer with microseconds precision.
encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as an 8-byte integer with nanoseconds precision.
encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as an 8-byte integer with nanoseconds precision.
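
A minimal sketch of the CivilTimeEncoder packing methods above; whether the java.time or Joda overloads are available depends on the Beam release, so treat the argument types here as an assumption.

    // Pack a civil time / datetime into BigQuery-compatible integers.
    int packedTime = CivilTimeEncoder.encodePacked32TimeSeconds(LocalTime.of(13, 45, 30));
    long packedDateTime =
        CivilTimeEncoder.encodePacked64DatetimeMicros(LocalDateTime.of(2024, 1, 15, 13, 45, 30));
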
encodeQueryResult(Table) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
encodeQueryResult(Table, List<TableRow>) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
EncoderFactory - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
 
EncoderFactory() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderFactory
 
encoderFactory() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
 
encoderFor(Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a Spark Encoder for T of BinaryType delegating to a Beam Coder underneath.
EncoderHelpers - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
Encoders utility class.
EncoderHelpers() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
 
EncoderHelpers.Utils - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
Encoder / expression utils that are called from generated code.
encoderOf(Class<? super T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Gets or creates a default Encoder for T.
encoderOf(Coder<T>, EncoderProvider.Factory<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
 
encoderOf(Coder<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
 
encoderOf(Coder<T>, EncoderProvider.Factory<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
EncoderProvider - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
 
EncoderProvider.Factory<T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
 
encodeToTimerDataTimerId(String, String) - Static method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
Encodes transform and timer family ids into a single string which retains the human readable format len(transformId):transformId:timerId.
encodeUnrolledBlackhole(VarIntBenchmark.Longs, VarIntBenchmark.BlackholeOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
encodeUnrolledByteString(VarIntBenchmark.Longs, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
ENCODING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
EncodingException - Exception in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
Represents an error during encoding (serializing) a class.
EncodingException(Throwable) - Constructor for exception org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.EncodingException
 
EncodingException - Exception in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
Represents an error during encoding (serializing) a class.
EncodingException(Throwable) - Constructor for exception org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.EncodingException
 
end() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the end of this window, exclusive.
END_CURSOR - Static variable in class org.apache.beam.sdk.io.redis.RedisCursor
 
endpoint(URI) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
Optional service endpoint to use AWS compatible services instead, e.g.
endpoint() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
Optional service endpoint to use AWS compatible services instead, e.g.
ENDS_WITH - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
ENDS_WITH_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
endsWith(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
endsWith(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
endsWith(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
endsWith(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
endsWith(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
ensureUsableAsCloudPubsub() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
Ensure that all messages that pass through can be converted to Cloud Pub/Sub messages using the standard transformation methods in the client library.
enterArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
enterArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
enterArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
enterArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkNativePipelineVisitor
 
enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
enterCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each composite transform after all topological predecessors have been visited but before any of its component transforms.
enterDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by FieldSpecifierNotationParser.dotExpression().
enterDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by FieldSpecifierNotationParser.dotExpression().
enterEveryRule(ParserRuleContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
enterFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
enterFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
enterMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
enterMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
enterMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
enterMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
enterPipeline(Pipeline) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
enterPipeline(Pipeline) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called before visiting any values or transforms, as many uses of a visitor require access to the Pipeline object itself.
enterQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
enterQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
enterQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
enterQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
 
enterRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
 
enterSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
enterSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
enterWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Enter a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
enterWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Enter a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
EntityToRow - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform to perform a conversion of Entity to Row.
entries() - Method in interface org.apache.beam.sdk.state.MapState
Returns an Iterable over the key-value pairs contained in this map.
entries() - Method in interface org.apache.beam.sdk.state.MultimapState
Returns an Iterable over all key-value pairs contained in this multimap.
entrySet() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
 
entrySet() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
enum16(Map<String, Integer>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
enum8(Map<String, Integer>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
EnumerationType - Class in org.apache.beam.sdk.schemas.logicaltypes
This Schema.LogicalType represents an enumeration over a fixed set of values.
EnumerationType.Value - Class in org.apache.beam.sdk.schemas.logicaltypes
This class represents a single enum value.
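
A minimal sketch of declaring and using an EnumerationType logical type; the enum labels and field name are illustrative.

    EnumerationType colors = EnumerationType.create("RED", "GREEN", "BLUE");

    Schema schema = Schema.builder().addLogicalTypeField("color", colors).build();

    EnumerationType.Value green = colors.valueOf("GREEN");   // also: colors.valueOf(1)
    Row row = Row.withSchema(schema).addValue(green).build();
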
enumValues() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
ENVIRONMENT_VERSION_JOB_TYPE_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ENVIRONMENT_VERSION_MAJOR_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
EnvironmentFactory - Interface in org.apache.beam.runners.fnexecution.environment
Creates environments which communicate with an SdkHarnessClient.
EnvironmentFactory.Provider - Interface in org.apache.beam.runners.fnexecution.environment
Provider for a EnvironmentFactory and ServerFactory for the environment.
equal(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are equal to a given value.
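
For example, a hedged snippet assuming a PCollection<String> named words:

    // Keeps only the elements equal (by Object.equals) to "beam".
    PCollection<String> onlyBeam = words.apply(Filter.equal("beam"));
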
equals(Object) - Method in class org.apache.beam.runners.dataflow.util.CloudObject
 
equals(Object) - Method in class org.apache.beam.runners.dataflow.util.OutputReference
 
equals(Object) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
equals(Object) - Method in class org.apache.beam.runners.jet.Utils.ByteArrayKey
 
equals(Object) - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
equals(Object) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
equals(Object) - Method in class org.apache.beam.runners.spark.util.ByteArray
 
equals(Object) - Method in class org.apache.beam.sdk.coders.AtomicCoder
.
equals(Object) - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
equals(Object) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.RowCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
equals(Object) - Method in class org.apache.beam.sdk.coders.StructuredCoder
equals(Object) - Method in class org.apache.beam.sdk.coders.ZstdCoder
equals(Object) - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
equals(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
equals(Object) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.cassandra.RingRange
 
equals(Object) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
equals(Object) - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
equals(Object) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
equals(Object) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
 
equals(Object) - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKey
 
equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
equals(Object) - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
equals(Object) - Method in class org.apache.beam.sdk.io.redis.RedisCursor
 
equals(Object) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
You need to override this method to be able to compare these objects by value.
equals(Object) - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
You need to override this method to be able to compare these objects by value.
equals(Object) - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
 
equals(Object) - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
equals(Object) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
equals(Object) - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.CachingFactory
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema
Returns true if two Schemas have the same fields in the same order.
equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.Schema.Options
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
equals(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Deprecated.
Object.equals(Object) is not supported on PAssert objects. If you meant to test object equality, use a variant of PAssert.PCollectionContentsAssert.containsInAnyOrder(T...) instead.
equals(Object) - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
equals(Object) - Method in class org.apache.beam.sdk.testing.TestStream
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
Deprecated.
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
equals(Object) - Method in class org.apache.beam.sdk.values.EncodableThrowable
 
equals(Object) - Method in class org.apache.beam.sdk.values.KV
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionList
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
equals(Object) - Method in class org.apache.beam.sdk.values.Row
 
Equals() - Constructor for class org.apache.beam.sdk.values.Row.Equals
 
equals(Object) - Method in class org.apache.beam.sdk.values.RowWithGetters
 
equals(Object) - Method in class org.apache.beam.sdk.values.ShardedKey
 
equals(Object) - Method in class org.apache.beam.sdk.values.TimestampedValue
 
equals(Object) - Method in class org.apache.beam.sdk.values.TupleTag
 
equals(Object) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Two type descriptors are equal if and only if they represent the same type.
equals(Object) - Method in class org.apache.beam.sdk.values.TypeParameter
 
equals(Object) - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
equals(Object) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
equalTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.equalTo(Object).
equalTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.equalTo(Object).
equivalent(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
Returns true if two Schemas have the same fields, but possibly in different orders.
equivalent(Schema.FieldType, Schema.EquivalenceNullablePolicy) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Check whether two types are equivalent.
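As a quick illustration of Schema.equals versus Schema.equivalent above, a fragment assuming org.apache.beam.sdk.schemas.Schema is imported:
    Schema s1 = Schema.builder().addStringField("name").addInt32Field("age").build();
    Schema s2 = Schema.builder().addInt32Field("age").addStringField("name").build();
    boolean sameOrder  = s1.equals(s2);     // false: same fields, different order
    boolean sameFields = s1.equivalent(s2); // true: field order is ignored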
ERROR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
ERROR_MESSAGE - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
TupleTag for any error response.
ERROR_ROW_SCHEMA - Static variable in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
 
ERROR_ROW_WITH_ERR_MSG_SCHEMA - Static variable in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
 
ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
 
ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
errorCodeFn - Variable in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
 
ErrorContainer<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
ErrorContainer interface.
ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean, List<String>, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
ErrorFn(String, SerializableFunction<byte[], Row>, Schema, List<String>, String, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
 
ErrorHandler<ErrorT,OutputT extends POutput> - Interface in org.apache.beam.sdk.transforms.errorhandling
An Error Handler is a utility object used for plumbing error PCollections to a configured sink. Error Handlers must be closed before a pipeline is run to properly pipe error collections to the sink, and the pipeline will be rejected if any handlers aren't closed.
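A minimal sketch of the close-before-run contract described above, assuming Pipeline.registerBadRecordErrorHandler is available and using a hypothetical sink transform LogBadRecords (not part of the SDK):
    try (ErrorHandler.BadRecordErrorHandler<PCollection<BadRecord>> handler =
        pipeline.registerBadRecordErrorHandler(new LogBadRecords())) {
      // Build the transforms whose failures should be routed to the handler here,
      // e.g. through an IO's withBadRecordErrorHandler(handler) option where supported.
    }
    // The try-with-resources block closes the handler, so the pipeline may run.
    pipeline.run();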
ErrorHandler.BadRecordErrorHandler<OutputT extends POutput> - Class in org.apache.beam.sdk.transforms.errorhandling
 
ErrorHandler.DefaultErrorHandler<ErrorT,OutputT extends POutput> - Class in org.apache.beam.sdk.transforms.errorhandling
A default, placeholder error handler that exists to allow usage of .addErrorCollection() without effects.
ErrorHandler.PTransformErrorHandler<ErrorT,OutputT extends POutput> - Class in org.apache.beam.sdk.transforms.errorhandling
 
ErrorHandler.PTransformErrorHandler.WriteErrorMetrics<ErrorT> - Class in org.apache.beam.sdk.transforms.errorhandling
 
ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors<ErrorT> - Class in org.apache.beam.sdk.transforms.errorhandling
 
ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
 
ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
 
ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
 
ErrorHandling - Class in org.apache.beam.sdk.schemas.transforms.providers
 
ErrorHandling() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
ErrorHandling.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
 
errorRecord(Schema, Row, Throwable) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
errorRecord(Schema, byte[], Throwable) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
errorSchema(Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
errorSchemaBytes() - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
estimate() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker.RangeEndEstimator
 
estimateCount(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
Retrieves the estimated frequency of an element from a CountMinSketch.
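For context, a hedged sketch of how estimateCount is typically reached (assuming SketchFrequencies.globally() produces the Sketch and StringUtf8Coder encodes the elements):
    PCollection<SketchFrequencies.Sketch<String>> sketch =
        words.apply(SketchFrequencies.<String>globally());
    // In downstream code that holds a Sketch<String> value, e.g. obtained via a side input:
    long estimatedHits = sketchValue.estimateCount("foo", StringUtf8Coder.of());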
estimateFractionForKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns the fraction of this range [startKey, endKey) that is in the interval [startKey, key).
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
This method is called by org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats.
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
 
estimateNodeStats(BeamRelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
 
estimateRowCount(RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
estimateRowCount(PipelineOptions) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
Estimates the number of non-empty rows.
eval(BatchTSetEnvironment, SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
 
eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
 
eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2StreamTranslationContext
 
eval(SinkTSet<?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
eval(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.PatternCondition
 
evaluate() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
Trigger evaluation of all leaf datasets.
evaluate(String, Dataset<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
The purpose of this utility is to mark the evaluation of Spark actions, both during Pipeline translation, when evaluation is required, and when finally evaluating the pipeline.
EvaluationContext - Class in org.apache.beam.runners.spark.structuredstreaming.translation
The EvaluationContext is the result of a pipeline translation and can be used to evaluate / run the pipeline.
Evaluator(SparkPipelineTranslator, EvaluationContext) - Constructor for class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
event() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
 
EventExaminer<EventT,StateT extends MutableState<EventT,?>> - Interface in org.apache.beam.sdk.extensions.ordered
Classes implementing this interface will be called by OrderedEventProcessor to examine every incoming event.
eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
eventStore() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
eventually(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
 
ever() - Static method in class org.apache.beam.sdk.transforms.windowing.Never
Returns a trigger which never fires.
every(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Returns a new SlidingWindows with the original size, that assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch.
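For example, ten-minute windows that start every minute, matching the intervals described above (a fragment assuming org.joda.time.Duration and the Window/SlidingWindows imports):
    PCollection<String> windowed = input.apply(
        Window.<String>into(
            SlidingWindows.of(Duration.standardMinutes(10))
                .every(Duration.standardMinutes(1))));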
EXACTLY_ONCE - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
exceptAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform that follows SET ALL semantics to compute the difference all (exceptAll) with the provided PCollection<T>.
exceptAll() - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform that follows SET ALL semantics, taking a PCollectionList<PCollection<T>> and returning a PCollection<T> containing the difference all (exceptAll) of the collections, applied in order across all collections in the PCollectionList<T>.
exceptDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform that follows SET DISTINCT semantics to compute the difference (except) with the provided PCollection<T>.
exceptDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a PTransform that takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the difference (except) of the collections, applied in order across all collections in the PCollectionList<T>.
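A small sketch of the distinct difference variant (assumes a Pipeline p and Create for the inputs):
    PCollection<String> left  = p.apply("Left",  Create.of("a", "b", "c"));
    PCollection<String> right = p.apply("Right", Create.of("b", "d"));
    // Keeps the elements of left that never appear in right: "a" and "c".
    PCollection<String> difference = left.apply(Sets.exceptDistinct(right));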
exception() - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
 
ExceptionAsMapHandler() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ExceptionAsMapHandler
 
ExceptionElement() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
 
exceptionHandler - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using AsJsons.AsJsonsWithFailures.exceptionsVia(ProcessFunction).
exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using ParseJsons.ParseJsonsWithFailures.exceptionsVia(ProcessFunction).
exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
Returns a new FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using FlatMapElements.FlatMapWithFailures.exceptionsVia(ProcessFunction).
exceptionsInto(TypeDescriptor<NewFailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements
Returns a new MapElements.MapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using MapElements.MapWithFailures.exceptionsVia(ProcessFunction).
exceptionsInto(TypeDescriptor<FailureT>) - Method in class org.apache.beam.sdk.transforms.MapKeys
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using SimpleMapWithFailures.exceptionsVia(ProcessFunction).
exceptionsInto(TypeDescriptor<FailureT>) - Method in class org.apache.beam.sdk.transforms.MapValues
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, with the given type descriptor used for the failure collection but the exception handler yet to be specified using SimpleMapWithFailures.exceptionsVia(ProcessFunction).
exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons.AsJsonsWithFailures
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia() - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
Returns a new AsJsons.AsJsonsWithFailures transform that catches exceptions raised while writing JSON elements, passing the raised exception instance and the input element being processed through the default exception handler DefaultExceptionAsMapHandler and emitting the result to a failure collection.
exceptionsVia(InferableFunction<WithFailures.ExceptionElement<String>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia() - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the default exception handler DefaultExceptionAsMapHandler and emitting the result to a failure collection.
exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<String>, FailureT>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons.ParseJsonsWithFailures
Returns a new ParseJsons.ParseJsonsWithFailures transform that catches exceptions raised while parsing elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
Returns a new FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
Returns a new FlatMapElements.FlatMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia(InferableFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements
Returns a new MapElements.MapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia(ProcessFunction<WithFailures.ExceptionElement<InputT>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
Returns a PTransform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia(InferableFunction<WithFailures.ExceptionElement<KV<K1, V>>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapKeys
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
exceptionsVia(InferableFunction<WithFailures.ExceptionElement<KV<K, V1>>, FailureT>) - Method in class org.apache.beam.sdk.transforms.MapValues
Returns a new SimpleMapWithFailures transform that catches exceptions raised while mapping elements, passing the raised exception instance and the input element being processed through the given exceptionHandler and emitting the result to a failure collection.
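Tying exceptionsInto and exceptionsVia together, a sketch of the MapElements variant (assumes a PCollection<String> named words; the failure branch records the offending element and the exception class):
    WithFailures.Result<PCollection<Integer>, KV<String, String>> result = words.apply(
        MapElements.into(TypeDescriptors.integers())
            .via((String word) -> 100 / word.length()) // throws ArithmeticException on ""
            .exceptionsInto(
                TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.strings()))
            .exceptionsVia(ee ->
                KV.of(ee.element(), ee.exception().getClass().getCanonicalName())));
    PCollection<Integer> output = result.output();
    PCollection<KV<String, String>> failures = result.failures();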
ExecutableGraph<ExecutableT,CollectionT> - Interface in org.apache.beam.runners.direct
The interface that enables querying of a graph of independently executable stages and the inputs and outputs of those stages.
ExecutableProcessBundleDescriptor() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
ExecutableStageContext - Interface in org.apache.beam.runners.fnexecution.control
The context required in order to execute stages.
ExecutableStageContext.Factory - Interface in org.apache.beam.runners.fnexecution.control
Creates ExecutableStageContext instances.
execute(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
execute(String) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.Executor
 
execute(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
execute(BatchTSetEnvironment) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
 
execute(String) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
Executes the given SQL.
execute(Expression, Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
execute(Expression, Type) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
 
execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
 
execute(CalcitePrepare.Context) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlSetOptionBeam
 
execute() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
 
ExecuteBundles(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
Instantiates a new ExecuteBundles transform.
ExecuteBundles(String) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
 
executeBundles(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
Executes a batch of bundle requests, either in batch mode or as a single transaction.
executeBundles(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
Executes a batch of bundle requests, either in batch mode or as a single transaction.
executeDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
executeFhirBundle(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Executes a FHIR bundle and returns the HTTP body.
executeFhirBundle(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
executePipeline(BatchTSetEnvironment) - Method in class org.apache.beam.runners.twister2.BeamBatchWorker
 
executeQuery(Queryable<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
ExecutionDriver - Interface in org.apache.beam.runners.local
Drives the execution of a Pipeline by scheduling work.
ExecutionDriver.DriverState - Enum in org.apache.beam.runners.local
The state of the driver.
ExecutorOptions - Interface in org.apache.beam.sdk.options
Options for configuring the ScheduledExecutorService used throughout the Java runtime.
ExecutorOptions.ScheduledExecutorServiceFactory - Class in org.apache.beam.sdk.options
Returns the default ScheduledExecutorService to use within the Apache Beam SDK.
ExecutorServiceFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
 
exists() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
 
exitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
exitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
exitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
exitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
exitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by FieldSpecifierNotationParser.dotExpression().
exitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by FieldSpecifierNotationParser.dotExpression().
exitEveryRule(ParserRuleContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
exitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
exitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
exitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
exitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
exitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
exitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
exitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
exitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
exitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
exitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
 
exitRule(ParseTreeListener) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
 
exitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
exitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
exitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
Exit a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
exitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationListener
Exit a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
expand(PBegin) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
 
expand(PCollection<RequestT>) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
 
expand() - Method in class org.apache.beam.io.requestresponse.Result
 
expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
expand(PCollection<T>) - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
 
expand(PBegin) - Method in class org.apache.beam.runners.spark.io.CreateStream
 
expand(PCollection<String>) - Method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.CountWords
 
expand(ExpansionApi.ExpansionRequest, StreamObserver<ExpansionApi.ExpansionResponse>) - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
 
expand(PBegin) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
 
expand(PBegin) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
 
expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
 
expand(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Expands a pattern into matched paths.
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons.AsJsonsWithFailures
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons.ParseJsonsWithFailures
 
expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.FullOuterJoin
 
expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.InnerJoin
 
expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.LeftOuterJoin
 
expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.extensions.joinlibrary.Join.RightOuterJoin
 
expand(PCollection<Document>) - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
 
expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
The transform converts the contents of input PCollection into Table.Rows and then calls Cloud DLP service to perform the deidentification according to provided settings.
expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
The transform converts the contents of input PCollection into Table.Rows and then calls Cloud DLP service to perform the data inspection according to provided settings.
expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
The transform converts the contents of input PCollection into Table.Rows and then calls Cloud DLP service to perform the reidentification according to provided settings.
expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
The transform converts the contents of input PCollection into CatalogItems and then calls the Recommendation AI service to create the catalog item.
expand(PCollection<KV<String, GenericJson>>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
The transform converts the contents of input PCollection into CatalogItems and then calls the Recommendation AI service to create the catalog item.
expand(PCollection<KV<String, GenericJson>>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
The transform converts the contents of input PCollection into UserEvents and then calls the Recommendation AI service to create the user event.
expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
expand(PCollection<GenericJson>) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
The transform converts the contents of input PCollection into UserEvents and then calls the Recommendation AI service to create the user event.
expand(PCollection<ByteString>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytes
 
expand(PCollection<KV<ByteString, VideoContext>>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytesWithContext
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromUri
 
expand(PCollection<KV<String, VideoContext>>) - Method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromURIWithContext
 
expand(PCollection<KV<EventKeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
 
expand() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
 
expand(InputT) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
 
expand(PCollection<?>) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
 
expand(PCollection<Double>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
 
expand(PCollection<KV<K, Double>>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
 
expand(PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>>) - Method in class org.apache.beam.sdk.extensions.sorter.SortValues
 
expand(PCollectionList<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.JoinAsLookup
 
expand(PCollection<Document>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.DocumentToRow
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.RowToDocument
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesReadConverter
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesWriteConverter
 
expand(PInput) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
expand(PCollection<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
expand(PCollection<Message>) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
 
expand(PCollection<PublishRequest>) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
Deprecated.
 
expand(PCollection<SendMessageRequest>) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Write
Deprecated.
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
 
expand() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
 
expand(PCollection<SendMessageRequest>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Write
Deprecated.
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
 
expand() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
expand(PCollection<CassandraIO.Read<T>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.ReadAll
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParse
 
expand() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteTransform
 
expand(PCollection<ElasticsearchIO.Document>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.FileIO.Match
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
 
expand(PCollection<MatchResult.Metadata>) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
 
expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider.BigQueryFileLoadsSchemaTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
 
expand(PCollection<KV<ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
 
expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
 
expand(PCollection<KV<TableDestination, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
expand() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
 
expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
 
expand(PCollection<BatchGetDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
 
expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
 
expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
 
expand(PCollection<ListCollectionIdsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
 
expand(PCollection<ListDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
 
expand(PCollection<PartitionQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
 
expand(PCollection<RunQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
 
expand(PCollection<FhirBundleParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
 
expand(PCollection<FhirSearchParameter<T>>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
 
expand(PCollection<FhirIOPatientEverything.PatientEverythingParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
 
expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
 
expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
 
expand(PCollection<HL7v2Message>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
expand(PCollection<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
 
expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoFromBytes
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
 
expand(PCollection<SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
 
expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage
 
expand(PInput) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
expand(PCollection<ReadOperation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
 
expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
expand(PCollection<MutationGroup>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
 
expand() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
 
expand(PCollection<SearchGoogleAdsStreamRequest>) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.ReadAll
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
expand(PCollection<KV<KeyT, ValueT>>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
expand(PCollection<HBaseIO.Read>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.ReadAll
 
expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
expand(PCollection<KV<byte[], RowMutations>>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
expand(PCollection<HCatRecord>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
 
expand() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
expand(PCollection<EventT>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
expand() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteTransform
 
expand(PCollection<KV<KafkaSourceDescriptor, KafkaRecord<K, V>>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaCommitOffset
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
expand(PCollection<KafkaSourceDescriptor>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
expand(PCollection<ProducerRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
 
expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
expand(PCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
 
expand(PCollection<RabbitMqMessage>) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Bounded
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
 
expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
expand(PCollection<KV<String, Map<String, String>>>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
 
expand() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
 
expand(PCollection<SolrIO.Read>) - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReadAll
 
expand(PCollection<SolrInputDocument>) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
 
expand(PCollection<SplunkEvent>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
 
expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.TFRecordIO.ReadFiles
 
expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.WriteFiles
 
expand() - Method in class org.apache.beam.sdk.io.WriteFilesResult
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
expand(PInput) - Method in class org.apache.beam.sdk.managed.Managed.ManagedTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceSchemaTransform
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.schemas.io.DeadLetteredTransform
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Cast
 
expand(PCollectionTuple) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.ExpandCrossProduct
 
expand(PCollectionTuple) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.DropFields.Inner
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineGlobally
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
expand(PCollection) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.ExplodeTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.JavaFilterTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.JavaMapToFieldsTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.LoggingTransform
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.RenameFields.Inner
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Fields
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.schemas.transforms.WithKeys
 
expand(PCollection<SuccessOrFailure>) - Method in class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssert
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssertForSingleton
 
expand(PBegin) - Method in class org.apache.beam.sdk.testing.PAssert.OneSideInputAssert
 
expand(PBegin) - Method in class org.apache.beam.sdk.testing.TestStream
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
Deprecated.
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
Deprecated.
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
expand(PCollection<? extends KV<K, ? extends Iterable<InputT>>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.OfValueProvider
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.Values
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.KeyedValues
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.Values
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
 
expand(PCollection<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Filter
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
 
expand(PCollection<? extends Iterable<T>>) - Method in class org.apache.beam.sdk.transforms.Flatten.Iterables
 
expand(PCollectionList<T>) - Method in class org.apache.beam.sdk.transforms.Flatten.PCollections
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.WithShardedKey
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Impulse
 
expand(KeyedPCollectionTuple<K>) - Method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
 
expand() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Expands the component PCollections, stripping off any tag-specific information.
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
 
expand() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
 
expand(PCollection<? extends KV<K, ?>>) - Method in class org.apache.beam.sdk.transforms.Keys
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.KvSwap
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
 
expand(PCollection<KV<K1, V>>) - Method in class org.apache.beam.sdk.transforms.MapKeys
 
expand(PCollection<KV<K, V1>>) - Method in class org.apache.beam.sdk.transforms.MapValues
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Partition
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
 
expand(PCollection<PeriodicSequence.SequenceDefinition>) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence
 
expand(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
Override this method to specify how this PTransform should be expanded on the given InputT.
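
For example, a minimal sketch of a composite transform that overrides expand; the transform name and its logic are hypothetical, not part of the SDK:

  import org.apache.beam.sdk.transforms.Count;
  import org.apache.beam.sdk.transforms.Filter;
  import org.apache.beam.sdk.transforms.PTransform;
  import org.apache.beam.sdk.values.PCollection;

  // Hypothetical composite transform: counts the words longer than six characters.
  public class CountLongWords extends PTransform<PCollection<String>, PCollection<Long>> {
    @Override
    public PCollection<Long> expand(PCollection<String> words) {
      return words
          .apply(Filter.by((String w) -> w.length() > 6))
          .apply(Count.globally());
    }
  }
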
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.AllMatches
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Find
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindAll
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindName
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindNameKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Matches
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesName
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceAll
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Split
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Reshuffle
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Reshuffle.ViaRandomKey
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Tee
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.ToJson
 
expand(PCollection<? extends KV<?, V>>) - Method in class org.apache.beam.sdk.transforms.Values
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsIterable
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsList
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMap
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
 
expand(PCollection<ElemT>) - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Wait.OnSignal
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
expand() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
expand(PCollection<V>) - Method in class org.apache.beam.sdk.transforms.WithKeys
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
 
expand() - Method in class org.apache.beam.sdk.values.PBegin
 
expand() - Method in class org.apache.beam.sdk.values.PCollection
 
expand() - Method in class org.apache.beam.sdk.values.PCollectionList
 
expand() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
 
expand() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
expand() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
expand() - Method in class org.apache.beam.sdk.values.PDone
A PDone contains no PValues.
expand() - Method in interface org.apache.beam.sdk.values.PInput
Expands this PInput into a list of its component output PValues.
expand() - Method in interface org.apache.beam.sdk.values.POutput
Expands this POutput into a list of its component output PValues.
expand() - Method in interface org.apache.beam.sdk.values.PValue
Deprecated.
A PValue always expands into itself. Calling PValue.expand() on a PValue is almost never appropriate.
expandInconsistent(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expandInput(PInput) - Static method in class org.apache.beam.sdk.values.PValues
 
expandOutput(POutput) - Static method in class org.apache.beam.sdk.values.PValues
 
expandTriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>, Coder<StorageApiWritePayload>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expandUntriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expandValue(PValue) - Static method in class org.apache.beam.sdk.values.PValues
 
ExpansionServer - Class in org.apache.beam.sdk.expansion.service
A gRPC Server for an ExpansionService.
ExpansionService - Class in org.apache.beam.sdk.expansion.service
A service that allows pipelines to expand transforms from a remote SDK.
ExpansionService() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
 
ExpansionService(String[]) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
 
ExpansionService(PipelineOptions) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
 
ExpansionService(PipelineOptions, String) - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService
 
ExpansionService.ExpansionServiceRegistrar - Interface in org.apache.beam.sdk.expansion.service
A registrar that creates TransformProvider instances from RunnerApi.FunctionSpecs.
ExpansionService.ExternalTransformRegistrarLoader - Class in org.apache.beam.sdk.expansion.service
Exposes Java transforms via ExternalTransformRegistrar.
ExpansionServiceConfig - Class in org.apache.beam.sdk.expansion.service
 
ExpansionServiceConfig() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
 
ExpansionServiceConfigFactory() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.ExpansionServiceConfigFactory
 
ExpansionServiceOptions - Interface in org.apache.beam.sdk.expansion.service
Options used to configure the ExpansionService.
ExpansionServiceOptions.ExpansionServiceConfigFactory - Class in org.apache.beam.sdk.expansion.service
Loads the ExpansionService config.
ExpansionServiceOptions.JavaClassLookupAllowListFactory - Class in org.apache.beam.sdk.expansion.service
Loads the allow list from ExpansionServiceOptions.getJavaClassLookupAllowlistFile(), defaulting to an empty JavaClassLookupTransformProvider.AllowList.
ExpansionServiceSchemaTransformProvider - Class in org.apache.beam.sdk.expansion.service
 
expectDryRunQuery(String, String, JobStatistics) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
EXPECTED_SQN_PATTERN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
Expected valid pattern for a StorageApiCDC.CHANGE_SQN_COLUMN value for use with BigQuery's _CHANGE_SEQUENCE_NUMBER format.
expectFileToNotExist() - Method in class org.apache.beam.sdk.io.fs.CreateOptions
True if the file is expected to not exist.
ExperimentalOptions - Interface in org.apache.beam.sdk.options
Apache Beam provides a number of experimental features that can be enabled with this flag.
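
For example, a minimal sketch of enabling and checking an experiment programmatically; the experiment name "use_runner_v2" is purely illustrative:

  import org.apache.beam.sdk.options.ExperimentalOptions;
  import org.apache.beam.sdk.options.PipelineOptions;
  import org.apache.beam.sdk.options.PipelineOptionsFactory;

  // The experiment name below is illustrative only.
  static boolean enableAndCheckExperiment() {
    PipelineOptions options = PipelineOptionsFactory.create();
    ExperimentalOptions.addExperiment(options.as(ExperimentalOptions.class), "use_runner_v2");
    return ExperimentalOptions.hasExperiment(options, "use_runner_v2");
  }
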
explain(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
explainLazily(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
A lazy explain via Object.toString() for logging purposes.
explainQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
Returns a human-readable representation of the query execution plan.
explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamPushDownIOSourceRel
 
explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
 
explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
 
explicitRandomPartitioner(int) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
Explicit hash key partitioner that randomly returns one of the given number of precalculated hash keys.
Export(ValueProvider<String>, ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
 
exportFhirResourceToBigQuery(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Export a FHIR Resource to BigQuery.
exportFhirResourceToBigQuery(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
exportFhirResourceToGcs(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Export a FHIR Resource to GCS.
exportFhirResourceToGcs(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
exportResources(DoFn<String, String>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
 
exportResources(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Export resources to GCS.
exportResources(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
 
ExportResourcesFn(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
 
ExpressionConverter - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
Extracts expressions (function calls, field accesses) from the resolved query nodes and converts them to RexNodes.
ExpressionConverter(RelOptCluster, QueryPlanner.QueryParameters, UserFunctionDefinitions) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
 
expressionsInFilter(List<RexNode>) - Static method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
Counts the number of RexNodes involved in all supported filters.
extend(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Extend the path by appending a sub-component path.
External() - Constructor for class org.apache.beam.sdk.io.GenerateSequence.External
 
External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
 
External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
 
ExternalConfiguration() - Constructor for class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
 
ExternalEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
An EnvironmentFactory which requests workers via the given URL in the Environment.
ExternalEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
Provider of ExternalEnvironmentFactory.
ExternalRead - Class in org.apache.beam.sdk.io.gcp.pubsub
Exposes PubsubIO.Read as an external transform for cross-language usage.
ExternalRead() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
 
ExternalRead.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
Parameters class to expose the transform to an external SDK.
ExternalRead.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
ExternalSchemaIOTransformRegistrar - Class in org.apache.beam.sdk.extensions.schemaio.expansion
 
ExternalSchemaIOTransformRegistrar() - Constructor for class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar
 
ExternalSchemaIOTransformRegistrar.Configuration - Class in org.apache.beam.sdk.extensions.schemaio.expansion
 
ExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
Does an external sort of the provided values.
ExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
ExternalSorter.Options contains configuration of the sorter.
ExternalSorter.Options.SorterType - Enum in org.apache.beam.sdk.extensions.sorter
Sorter type.
ExternalSqlTransformRegistrar - Class in org.apache.beam.sdk.extensions.sql.expansion
 
ExternalSqlTransformRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar
 
ExternalSqlTransformRegistrar.Configuration - Class in org.apache.beam.sdk.extensions.sql.expansion
 
ExternalSynchronization - Interface in org.apache.beam.sdk.io.hadoop.format
Provides a mechanism for acquiring locks related to the job.
ExternalTransformBuilder<ConfigT,InputT extends PInput,OutputT extends POutput> - Interface in org.apache.beam.sdk.transforms
An interface for building a transform from an externally provided configuration.
ExternalTransformRegistrar - Interface in org.apache.beam.sdk.expansion
A registrar which contains a mapping from URNs to available ExternalTransformBuilders.
ExternalTransformRegistrarImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
ExternalTransformRegistrarImpl() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
ExternalTransformRegistrarLoader() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionService.ExternalTransformRegistrarLoader
 
externalWithMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
ExternalWrite - Class in org.apache.beam.sdk.io.gcp.pubsub
Exposes PubsubIO.Write as an external transform for cross-language usage.
ExternalWrite() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
 
ExternalWrite.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
Parameters class to expose the transform to an external SDK.
ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue - Class in org.apache.beam.sdk.io.gcp.pubsub
 
ExternalWrite.WriteBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
extractFromTypeParameters(T, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Extracts a type from the actual type parameters of a parameterized class, subject to Java type erasure.
extractFromTypeParameters(TypeDescriptor<T>, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Like TypeDescriptors.extractFromTypeParameters(Object, Class, TypeVariableExtractor), but takes a TypeDescriptor of the instance being analyzed rather than the instance itself.
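
For example, a minimal sketch of recovering a type variable from a stored function's generic type; the class Foo, the field fn, and the method name are hypothetical:

  import org.apache.beam.sdk.transforms.SerializableFunction;
  import org.apache.beam.sdk.values.TypeDescriptor;
  import org.apache.beam.sdk.values.TypeDescriptors;

  // Hypothetical holder class: infers a TypeDescriptor for BarT from the stored function.
  class Foo<BarT> {
    private SerializableFunction<BarT, String> fn;

    TypeDescriptor<BarT> inferBarTypeDescriptor() {
      return TypeDescriptors.extractFromTypeParameters(
          fn,
          SerializableFunction.class,
          new TypeDescriptors.TypeVariableExtractor<SerializableFunction<BarT, String>, BarT>() {});
    }
  }
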
extractOutput(SequenceRangeAccumulator) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
 
extractOutput(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
Output the whole structure so it can be queried, reused or stored easily.
extractOutput(SketchFrequencies.Sketch<InputT>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
Output the whole structure so it can be queried, reused or stored easily.
extractOutput(MergingDigest) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
Output the whole structure so it can be queried, reused or stored easily.
extractOutput(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
extractOutput(long[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
 
extractOutput(CovarianceAccumulator) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
extractOutput(VarianceAccumulator) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
extractOutput(BeamBuiltinAggregations.BitXOr.Accum) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
 
extractOutput(List<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
 
extractOutput(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
 
extractOutput(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
 
extractOutput(Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
 
extractOutput(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
extractOutput(AccumT) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
Returns the output value that is the result of combining all the input values represented by the given accumulator.
extractOutput(List<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
 
extractOutput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
Deprecated.
 
extractOutput() - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
Returns the output value that is the result of combining all the input values represented by this accumulator.
extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
extractOutput(double[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
extractOutput(Combine.Holder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
extractOutput(int[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
extractOutput(long[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns the output value that is the result of combining all the input values represented by the given accumulator.
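
For example, a minimal sketch of a CombineFn whose extractOutput converts the accumulator into the final result; the averaging combiner and its names are hypothetical:

  import java.io.Serializable;
  import org.apache.beam.sdk.transforms.Combine;

  // Hypothetical averaging combiner over Integer inputs.
  public class AverageFn extends Combine.CombineFn<Integer, AverageFn.Accum, Double> {
    public static class Accum implements Serializable {
      long sum = 0;
      long count = 0;
    }

    @Override
    public Accum createAccumulator() {
      return new Accum();
    }

    @Override
    public Accum addInput(Accum acc, Integer input) {
      acc.sum += input;
      acc.count++;
      return acc;
    }

    @Override
    public Accum mergeAccumulators(Iterable<Accum> accums) {
      Accum merged = new Accum();
      for (Accum a : accums) {
        merged.sum += a.sum;
        merged.count += a.count;
      }
      return merged;
    }

    @Override
    public Double extractOutput(Accum acc) {
      // Turn the accumulated sum and count into the final average.
      return acc.count == 0 ? 0.0 : ((double) acc.sum) / acc.count;
    }
  }

Applying Combine.globally(new AverageFn()) to a PCollection<Integer> would then yield a PCollection<Double>.
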
extractOutput(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
extractOutput(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
extractOutput(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
extractOutput(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns the output value that is the result of combining all the input values represented by the given accumulator.
extractOutput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
extractOutputs(PCollectionRowTuple) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
 
extractOutputs(OutputT) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
 
extractTableNamesFromNode(SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.TableNameExtractionUtils
 
extractTimestampAttribute(String, Map<String, String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return the timestamp (in ms since the Unix epoch) to use for a Pubsub message with the given timestampAttribute and attributes.
extractTimestampsFromValues() - Static method in class org.apache.beam.sdk.transforms.Reify
Extracts the timestamps from each value in a KV.

F

factory - Variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
 
FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
 
FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
Parser factory.
FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
Factory() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
 
FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Factory for creating Pubsub clients using gRPC transport.
FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Factory for creating Pubsub clients using Json transport.
Factory<T> - Interface in org.apache.beam.sdk.schemas
A Factory interface for schema-related objects for a specific Java type.
failed(Exception) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
Report that a failure has occurred.
failed(Error) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
Report that a failure has occurred.
FAILED - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
The tag for the failed writes to the HL7v2 store.
FAILED_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for the failed writes to FHIR store.
FAILED_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
The TupleTag used for bundles that failed to be executed for any reason.
FAILED_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for the files that failed to be imported to the FHIR store.
FAILED_PUBLISH_TAG - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO.Write
 
FAILED_WRITES - Static variable in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
failedRecords(List<RecT>, List<ResT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
 
FailedRunningPipelineResults - Class in org.apache.beam.runners.jet
Alternative implementation of PipelineResult used to avoid throwing Exceptions in certain situations.
FailedRunningPipelineResults(RuntimeException) - Constructor for class org.apache.beam.runners.jet.FailedRunningPipelineResults
 
FailedWritesException(List<FirestoreV1.WriteFailure>) - Constructor for exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
 
failOnInsert(Map<TableRow, List<TableDataInsertAllResponse.InsertErrors>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
Cause a given TableRow object to fail when it's inserted.
FailsafeValueInSingleWindow<T,ErrorT> - Class in org.apache.beam.sdk.values
An immutable tuple of value, timestamp, window, and pane.
FailsafeValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
 
FailsafeValueInSingleWindow.Coder<T,ErrorT> - Class in org.apache.beam.sdk.values
failure(String, String, Metadata, Throwable) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
 
Failure - Class in org.apache.beam.sdk.schemas.io
A generic failure of an SQL transform.
Failure() - Constructor for class org.apache.beam.sdk.schemas.io.Failure
 
failure(PAssert.PAssertionSite, Throwable) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
Failure() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
 
Failure.Builder - Class in org.apache.beam.sdk.schemas.io
 
FAILURE_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
 
FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
 
FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
FailureCollectorWrapper - Class in org.apache.beam.sdk.io.cdap.context
A class for collecting ValidationFailures.
FailureCollectorWrapper() - Constructor for class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
 
failures() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
failuresTo(List<PCollection<FailureElementT>>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
Adds the failure collection to the passed list and returns just the output collection.
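
For example, a minimal sketch of routing failures into a caller-supplied list while continuing with the successful output; the input PCollection and the parsing logic are hypothetical:

  import java.util.List;
  import org.apache.beam.sdk.transforms.MapElements;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.beam.sdk.values.TypeDescriptors;

  // "lines" is an assumed PCollection<String>; elements that fail to parse are added to failureCollections.
  static PCollection<Integer> parseKeepingFailures(
      PCollection<String> lines, List<PCollection<String>> failureCollections) {
    return lines
        .apply(
            MapElements.into(TypeDescriptors.integers())
                .via((String s) -> Integer.parseInt(s))
                .exceptionsInto(TypeDescriptors.strings())
                .exceptionsVia(ee -> ee.element()))
        .failuresTo(failureCollections);
  }
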
FakeBigQueryServerStream(List<T>) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
 
FakeBigQueryServices - Class in org.apache.beam.sdk.io.gcp.testing
A fake implementation of BigQuery's query service.
FakeBigQueryServices() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
FakeBigQueryServices.FakeBigQueryServerStream<T> - Class in org.apache.beam.sdk.io.gcp.testing
An implementation of BigQueryServerStream which takes a List as the Iterable to simulate a server stream.
FakeDatasetService - Class in org.apache.beam.sdk.io.gcp.testing
A fake dataset service that can be serialized, for use in testReadFromTable.
FakeDatasetService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
FakeJobService - Class in org.apache.beam.sdk.io.gcp.testing
A fake implementation of BigQuery's job service.
FakeJobService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
FakeJobService(int) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
Fanout() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
 
features() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
 
fetchDataflowJobId() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
 
fetchDataflowJobName() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
 
fetchDataflowWorkerId() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
 
FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
Instantiates a new Fetch HL7v2 message DoFn.
FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
Instantiates a new Fetch HL7v2 message DoFn.
fewKeys(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.PerKey, and sets fewKeys in GroupByKey.
fewKeys() - Method in class org.apache.beam.sdk.transforms.GroupByKey
Returns whether it groups just a few keys.
FhirBundleParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
FhirBundleParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
 
FhirBundleResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
 
FhirBundleResponse() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
 
FhirIO - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirIO provides an API for reading resources from and writing resources to the Google Cloud Healthcare FHIR API.
FhirIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
 
FhirIO.Deidentify - Class in org.apache.beam.sdk.io.gcp.healthcare
Deidentify FHIR resources from a FHIR store to a destination FHIR store.
FhirIO.Deidentify.DeidentifyFn - Class in org.apache.beam.sdk.io.gcp.healthcare
A function that schedules a deidentify operation and monitors the status.
FhirIO.ExecuteBundles - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Execute bundles.
FhirIO.ExecuteBundlesResult - Class in org.apache.beam.sdk.io.gcp.healthcare
ExecuteBundlesResult contains both successfully executed bundles and information to help debug failed executions (e.g., metadata and error messages).
FhirIO.Export - Class in org.apache.beam.sdk.io.gcp.healthcare
Export FHIR resources from a FHIR store to new line delimited json files on GCS or BigQuery.
FhirIO.Export.ExportResourcesFn - Class in org.apache.beam.sdk.io.gcp.healthcare
A function that schedules an export operation and monitors the status.
FhirIO.Import - Class in org.apache.beam.sdk.io.gcp.healthcare
Writes each bundle of elements to a new-line delimited JSON file on GCS and issues a fhirStores.import Request for that file.
FhirIO.Import.ContentStructure - Enum in org.apache.beam.sdk.io.gcp.healthcare
The enum Content structure.
FhirIO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Read.
FhirIO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result.
FhirIO.Search<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Search.
FhirIO.Search.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
 
FhirIO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Write.
FhirIO.Write.AbstractResult - Class in org.apache.beam.sdk.io.gcp.healthcare
 
FhirIO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result.
FhirIO.Write.WriteMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
The enum Write method.
FhirIOPatientEverything - Class in org.apache.beam.sdk.io.gcp.healthcare
The type FhirIOPatientEverything for querying a FHIR Patient resource's compartment.
FhirIOPatientEverything() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
 
FhirIOPatientEverything.PatientEverythingParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
PatientEverythingParameter defines required attributes for a FHIR GetPatientEverything request in FhirIOPatientEverything.
FhirIOPatientEverything.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The Result for a FhirIOPatientEverything request.
FhirResourcePagesIterator(HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod, HealthcareApiClient, String, String, String, Map<String, Object>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
 
FhirSearchParameter<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirSearchParameter represents the query parameters for a FHIR search request, used as a parameter for FhirIO.Search.
FhirSearchParameterCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirSearchParameterCoder is the coder for FhirSearchParameter, which takes a coder for type T.
fhirStoresImport(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
Import method for batch writing resources.
fhirStoresImport(String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
 
fhirStoresImport(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
 
field(Row, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
 
field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
 
Field() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field
 
field(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
Add a new field of the specified type.
field(String, Schema.FieldType, Object) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
Add a new field of the specified type, filled with the given default value.
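
For example, a minimal sketch of adding fields to a schema'd PCollection; the input rows and the added field names are hypothetical:

  import org.apache.beam.sdk.schemas.Schema;
  import org.apache.beam.sdk.schemas.transforms.AddFields;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.beam.sdk.values.Row;

  // "rows" is an assumed schema'd PCollection; the field names below are illustrative.
  static PCollection<Row> addFields(PCollection<Row> rows) {
    return rows.apply(
        AddFields.<Row>create()
            .field("region", Schema.FieldType.STRING)       // no default: filled with null
            .field("retries", Schema.FieldType.INT32, 0));  // filled with the default value 0
  }
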
fieldAccess(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
Select a set of fields described in a FieldAccessDescriptor.
FieldAccessDescriptor - Class in org.apache.beam.sdk.schemas
Used inside of a DoFn to describe which fields in a schema type need to be accessed for processing.
FieldAccessDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
fieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
Join by the following field access descriptor.
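
For example, a minimal sketch of joining two schema'd PCollections on a shared key described by a FieldAccessDescriptor; the inputs orders and users and the field name "userId" are hypothetical:

  import org.apache.beam.sdk.schemas.FieldAccessDescriptor;
  import org.apache.beam.sdk.schemas.transforms.CoGroup;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.beam.sdk.values.PCollectionTuple;
  import org.apache.beam.sdk.values.Row;

  // "orders" and "users" are assumed schema'd PCollections that both contain a "userId" field.
  static PCollection<Row> joinOnUserId(PCollection<Row> orders, PCollection<Row> users) {
    return PCollectionTuple.of("orders", orders)
        .and("users", users)
        .apply(
            CoGroup.join(
                CoGroup.By.fieldAccessDescriptor(
                    FieldAccessDescriptor.withFieldNames("userId"))));
  }
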
FieldAccessDescriptor.FieldDescriptor - Class in org.apache.beam.sdk.schemas
Description of a single field.
FieldAccessDescriptor.FieldDescriptor.Builder - Class in org.apache.beam.sdk.schemas
Builder class.
FieldAccessDescriptor.FieldDescriptor.ListQualifier - Enum in org.apache.beam.sdk.schemas
Qualifier for a list selector.
FieldAccessDescriptor.FieldDescriptor.MapQualifier - Enum in org.apache.beam.sdk.schemas
Qualifier for a map selector.
FieldAccessDescriptor.FieldDescriptor.Qualifier - Class in org.apache.beam.sdk.schemas
OneOf union for a collection selector.
FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind - Enum in org.apache.beam.sdk.schemas
The kind of qualifier.
FieldAccessDescriptorParser - Class in org.apache.beam.sdk.schemas.parser
Parser for textual field-access selector.
FieldAccessDescriptorParser() - Constructor for class org.apache.beam.sdk.schemas.parser.FieldAccessDescriptorParser
 
FieldDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
 
fieldFromType(TypeDescriptor, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
Map a Java field type to a Beam Schema FieldType.
fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
Join by the following field ids.
fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
Select a set of top-level field ids from the row.
fieldIdsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return the field ids accessed.
fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
Join by the following field names.
fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
Select a set of top-level field names from the row.
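
For example, a minimal sketch of projecting two top-level fields; the input users and the field names are hypothetical:

  import org.apache.beam.sdk.schemas.transforms.Select;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.beam.sdk.values.Row;

  // "users" is an assumed schema'd PCollection containing "userId" and "name" fields.
  static PCollection<Row> projectUserIdAndName(PCollection<Row> users) {
    return users.apply(Select.fieldNames("userId", "name"));
  }
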
fieldNamesAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return the field names accessed.
fields() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
 
fields(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
 
fields(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
 
fields(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
 
Fields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Fields
 
fields() - Static method in class org.apache.beam.sdk.state.StateKeySpec
 
FieldsEqual() - Constructor for class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
 
fieldSpecifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
FieldSpecifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
 
FieldSpecifierNotationBaseListener - Class in org.apache.beam.sdk.schemas.parser.generated
This class provides an empty implementation of FieldSpecifierNotationListener, which can be extended to create a listener which only needs to handle a subset of the available methods.
FieldSpecifierNotationBaseListener() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
 
FieldSpecifierNotationBaseVisitor<T> - Class in org.apache.beam.sdk.schemas.parser.generated
This class provides an empty implementation of FieldSpecifierNotationVisitor, which can be extended to create a visitor which only needs to handle a subset of the available methods.
FieldSpecifierNotationBaseVisitor() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
 
FieldSpecifierNotationLexer - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationLexer(CharStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
FieldSpecifierNotationListener - Interface in org.apache.beam.sdk.schemas.parser.generated
This interface defines a complete listener for a parse tree produced by FieldSpecifierNotationParser.
FieldSpecifierNotationParser - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser(TokenStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
FieldSpecifierNotationParser.ArrayQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.ArrayQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.DotExpressionComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.DotExpressionContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.FieldSpecifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.MapQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.MapQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.QualifiedComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.QualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.QualifyComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.SimpleIdentifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationParser.WildcardContext - Class in org.apache.beam.sdk.schemas.parser.generated
 
FieldSpecifierNotationVisitor<T> - Interface in org.apache.beam.sdk.schemas.parser.generated
This interface defines a complete generic visitor for a parse tree produced by FieldSpecifierNotationParser.
FieldType() - Constructor for class org.apache.beam.sdk.schemas.Schema.FieldType
 
FieldTypeDescriptors - Class in org.apache.beam.sdk.schemas
Utilities for converting between Schema field types and TypeDescriptors that define Java objects which can represent these field types.
FieldTypeDescriptors() - Constructor for class org.apache.beam.sdk.schemas.FieldTypeDescriptors
 
fieldTypeForJavaType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.FieldTypeDescriptors
 
fieldUpdate(String, String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
 
FieldValueGetter<ObjectT,ValueT> - Interface in org.apache.beam.sdk.schemas
For internal use only; no backwards-compatibility guarantees.
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
 
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
 
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
 
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
 
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
 
fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
New implementations should override GetterBasedSchemaProvider.fieldValueGetters(TypeDescriptor, Schema) and make this method throw an UnsupportedOperationException.
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
Delegates to GetterBasedSchemaProvider.fieldValueGetters(Class, Schema) for backwards compatibility; override it if you want to use the richer type signature contained in the TypeDescriptor, which is not subject to type erasure.
fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
 
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
 
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
fieldValueGetters(TypeDescriptor<T>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
 
FieldValueSetter<ObjectT,ValueT> - Interface in org.apache.beam.sdk.schemas
For internal use only; no backwards-compatibility guarantees.
FieldValueTypeInformation - Class in org.apache.beam.sdk.schemas
Represents type information for a Java type that will be used to infer a Schema type.
FieldValueTypeInformation() - Constructor for class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
FieldValueTypeInformation.Builder - Class in org.apache.beam.sdk.schemas
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
 
fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
New implementations should override GetterBasedSchemaProvider.fieldValueTypeInformations(TypeDescriptor, Schema) and make this method throw an UnsupportedOperationException.
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
Delegates to GetterBasedSchemaProvider.fieldValueTypeInformations(Class, Schema) for backwards compatibility; override it if you want to use the richer type signature contained in the TypeDescriptor, which is not subject to type erasure.
fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
fieldValueTypeInformations(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
 
FieldValueTypeSupplier - Interface in org.apache.beam.sdk.schemas.utils
A naming policy for schema fields.
FILE_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
FILE_NAME_FIELD - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
 
FILE_TRIGGERING_BYTE_COUNT - Static variable in class org.apache.beam.sdk.io.WriteFiles
 
FILE_TRIGGERING_RECORD_BUFFERING_DURATION - Static variable in class org.apache.beam.sdk.io.WriteFiles
 
FILE_TRIGGERING_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.WriteFiles
 
FileBasedReader(FileBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
Subclasses should not perform IO operations in the constructor.
FileBasedSink<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
Abstract class for file-based output.
FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
Construct a FileBasedSink with the given temp directory, producing uncompressed files.
FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, FileBasedSink.WritableByteChannelFactory) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
Construct a FileBasedSink with the given temp directory and output channel type.
FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, Compression) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
Construct a FileBasedSink with the given temp directory and output channel type.
FileBasedSink.CompressionType - Enum in org.apache.beam.sdk.io
Deprecated.
FileBasedSink.DynamicDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
A class that allows value-dependent writes in FileBasedSink.
FileBasedSink.FilenamePolicy - Class in org.apache.beam.sdk.io
A naming policy for output files.
FileBasedSink.FileResult<DestinationT> - Class in org.apache.beam.sdk.io
Result of a single bundle write.
FileBasedSink.FileResultCoder<DestinationT> - Class in org.apache.beam.sdk.io
A coder for FileBasedSink.FileResult objects.
FileBasedSink.OutputFileHints - Interface in org.apache.beam.sdk.io
Provides hints about how to generate output files, such as a suggested filename suffix (e.g.
FileBasedSink.WritableByteChannelFactory - Interface in org.apache.beam.sdk.io
Implementations create instances of WritableByteChannel used by FileBasedSink and related classes to allow decorating, or otherwise transforming, the raw data that would normally be written directly to the WritableByteChannel passed into FileBasedSink.WritableByteChannelFactory.create(WritableByteChannel).
FileBasedSink.WriteOperation<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
Abstract operation that manages the process of writing to FileBasedSink.
FileBasedSink.Writer<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
Abstract writer that writes a bundle to a FileBasedSink.
FileBasedSource<T> - Class in org.apache.beam.sdk.io
A common base class for all file-based Sources.
FileBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
Create a FileBasedSource based on a file or a file pattern specification, with the given strategy for treating filepatterns that do not match any files.
FileBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
FileBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
Create a FileBasedSource based on a single file.
FileBasedSource.FileBasedReader<T> - Class in org.apache.beam.sdk.io
A reader that implements code common to readers of FileBasedSources.
FileBasedSource.Mode - Enum in org.apache.beam.sdk.io
A given FileBasedSource represents a file resource of one of these types.
FileChecksumMatcher - Class in org.apache.beam.sdk.testing
Matcher to verify the checksum of the contents of a ShardedFile in E2E tests.
fileContentsHaveChecksum(String) - Static method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
FileIO - Class in org.apache.beam.sdk.io
General-purpose transforms for working with files: listing files (matching), reading and writing.
FileIO() - Constructor for class org.apache.beam.sdk.io.FileIO
 
FileIO.Match - Class in org.apache.beam.sdk.io
Implementation of FileIO.match().
FileIO.MatchAll - Class in org.apache.beam.sdk.io
Implementation of FileIO.matchAll().
FileIO.MatchConfiguration - Class in org.apache.beam.sdk.io
Describes configuration for matching filepatterns, such as EmptyMatchTreatment and continuous watching for matching files.
FileIO.ReadableFile - Class in org.apache.beam.sdk.io
A utility class for accessing a potentially compressed file.
FileIO.ReadMatches - Class in org.apache.beam.sdk.io
Implementation of FileIO.readMatches().
FileIO.ReadMatches.DirectoryTreatment - Enum in org.apache.beam.sdk.io
Enum to control how directories are handled.
FileIO.Sink<ElementT> - Interface in org.apache.beam.sdk.io
Specifies how to write elements to individual files in FileIO.write() and FileIO.writeDynamic().
FileIO.Write<DestinationT,UserT> - Class in org.apache.beam.sdk.io
Implementation of FileIO.write() and FileIO.writeDynamic().
FileIO.Write.FileNaming - Interface in org.apache.beam.sdk.io
A policy for generating names for shard files.
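As a rough sketch of how the FileIO entries above compose (assuming an existing Pipeline p; the filepattern is illustrative only):
    PCollection<FileIO.ReadableFile> files =
        p.apply(FileIO.match().filepattern("/tmp/input/*.csv"))  // list files matching the pattern
         .apply(FileIO.readMatches());                           // open each match as a ReadableFile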
FilenamePolicy() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
 
filepattern(String) - Method in class org.apache.beam.sdk.io.FileIO.Match
Matches the given filepattern.
filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Match
filepattern(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
Matches the given filepattern.
filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
FileReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
 
FileReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
 
FileReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
 
FileReadSchemaTransformFormatProvider - Interface in org.apache.beam.sdk.io.fileschematransform
Interface that provides a PTransform that reads in a PCollection of FileIO.ReadableFiles and outputs the data represented as a PCollection of Rows.
FileReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.fileschematransform
 
FileReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
 
FileReporter - Class in org.apache.beam.runners.flink.metrics
Flink metrics reporter for writing metrics to a file specified via the "metrics.reporter.file.path" config key (assuming an alias of "file" for this reporter in the "metrics.reporters" setting).
FileReporter() - Constructor for class org.apache.beam.runners.flink.metrics.FileReporter
 
FileResult(ResourceId, int, BoundedWindow, PaneInfo, DestinationT) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
FileResultCoder(Coder<BoundedWindow>, Coder<DestinationT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
fileSize(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Returns the file size from GCS or throws FileNotFoundException if the resource does not exist.
FileStagingOptions - Interface in org.apache.beam.sdk.options
File staging related options.
FileSystem<ResourceIdT extends ResourceId> - Class in org.apache.beam.sdk.io
File system interface in Beam.
FileSystem() - Constructor for class org.apache.beam.sdk.io.FileSystem
 
FileSystem.LineageLevel - Enum in org.apache.beam.sdk.io
 
FileSystemRegistrar - Interface in org.apache.beam.sdk.io
A registrar that creates FileSystem instances from PipelineOptions.
FileSystems - Class in org.apache.beam.sdk.io
Client-facing FileSystem utility.
FileSystems() - Constructor for class org.apache.beam.sdk.io.FileSystems
 
FileSystemUtils - Class in org.apache.beam.sdk.io
 
FileSystemUtils() - Constructor for class org.apache.beam.sdk.io.FileSystemUtils
 
FileWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
The configuration for building file writing transforms using SchemaTransform and SchemaTransformProvider.
FileWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
FileWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
 
FileWriteSchemaTransformConfiguration.CsvConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
Configures extra details related to writing CSV formatted files.
FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
 
FileWriteSchemaTransformConfiguration.ParquetConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
Configures extra details related to writing Parquet formatted files.
FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
 
FileWriteSchemaTransformConfiguration.XmlConfiguration - Class in org.apache.beam.sdk.io.fileschematransform
Configures extra details related to writing XML formatted files.
FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder - Class in org.apache.beam.sdk.io.fileschematransform
 
FileWriteSchemaTransformFormatProvider - Interface in org.apache.beam.sdk.io.fileschematransform
Provides a PTransform that writes a PCollection of Rows and outputs a PCollection of the file names according to a registered AutoService FileWriteSchemaTransformFormatProvider implementation.
FileWriteSchemaTransformFormatProviders - Class in org.apache.beam.sdk.io.fileschematransform
FileWriteSchemaTransformFormatProviders() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProviders
 
FileWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.fileschematransform
A TypedSchemaTransformProvider implementation for writing a Row PCollection to file systems, driven by a FileWriteSchemaTransformConfiguration.
FileWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
 
FillGaps<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
Fill gaps in timeseries.
FillGaps() - Constructor for class org.apache.beam.sdk.extensions.timeseries.FillGaps
 
FillGaps.FillGapsDoFn<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
 
FillGaps.InterpolateData<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
Argument to withInterpolateFunction function.
Filter - Class in org.apache.beam.sdk.schemas.transforms
A PTransform for filtering a collection of schema types.
Filter() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter
 
Filter<T> - Class in org.apache.beam.sdk.transforms
PTransforms for filtering from a PCollection the elements satisfying a predicate, or satisfying an inequality with a given value based on the elements' natural ordering.
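A minimal sketch of the predicate-based form, assuming an existing PCollection<Integer> named numbers:
    PCollection<Integer> positives =
        numbers.apply(Filter.by((Integer x) -> x > 0));  // keep only elements satisfying the predicate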
Filter.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
Implementation of the filter.
filterCharacters(String) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
 
FilterForMutationDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
 
FilterForMutationDoFn() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.FilterForMutationDoFn
 
finalizeAllOutstandingBundles() - Method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers.InMemoryFinalizer
All finalization requests will be sent without waiting for the responses.
finalizeCheckpoint() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
 
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.CheckpointMarkImpl
 
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
 
finalizeCheckpoint() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
Called by the system to signal that this checkpoint mark has been committed along with all the records which have been read from the UnboundedSource.UnboundedReader since the previous checkpoint was taken.
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.UnboundedSource.CheckpointMark.NoopCheckpointMark
 
finalizeDestination(DestinationT, BoundedWindow, Integer, Collection<FileBasedSink.FileResult<DestinationT>>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
finalizeWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Finalize a write stream.
finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
find(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex.
find(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex.
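An illustrative use of one of these overloads, assuming an existing PCollection<String> named lines (the pattern is arbitrary):
    PCollection<String> numbers =
        lines.apply(Regex.find("[0-9]+"));  // emits the matched portion of each line that contains a match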
Find(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Find
 
findAll(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex.
findAll(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex.
FindAll(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindAll
 
findAllTableIndexes() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Finds all indexes for the metadata table.
findAvailablePort() - Static method in class org.apache.beam.sdk.extensions.python.PythonService
 
findDateTimePattern(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
findDateTimePattern(String, ImmutableMap<DateTimeUtils.TimestampPatterns, DateTimeFormatter>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
findKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex.
findKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex.
findKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex.
findKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex.
FindKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindKV
 
FindName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindName
 
FindNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindNameKV
 
FindQuery - Class in org.apache.beam.sdk.io.mongodb
Builds a MongoDB FindQuery object.
FindQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.FindQuery
 
finish(DoFn<SequencedMessage, Row>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
finish() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
finish(DoFn<DataChangeRecord, Row>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
 
finish(DoFn<byte[], Row>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
 
finishBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
finishBundle(DoFn<Row, Row>.FinishBundleContext, PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
 
finishBundle(DoFn<Iterable<KV<DestinationT, WriteTables.Result>>, Iterable<KV<TableDestination, WriteTables.Result>>>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
finishBundle(DoFn<KV<Integer, Solace.Record>, Solace.PublishResult>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
 
finishBundle(DoFn<KV<Integer, Solace.Record>, Solace.PublishResult>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedStreamingSolaceWriter
 
finishBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
finishBundle(DoFn<T, KV<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>.FinishBundleContext) - Method in class org.apache.beam.sdk.transforms.View.ToListViewDoFn
 
FinishBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
 
finishSpecifying() - Method in interface org.apache.beam.sdk.state.StateSpec
For internal use only; no backwards-compatibility guarantees.
finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
After building, finalizes this PValue to make it ready for running.
finishSpecifying(PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.PValue
After building, finalizes this PValue to make it ready for being used as an input to a PTransform.
finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.io.requestresponse.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.WriteFilesResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionList
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PDone
Does nothing; there is nothing to finish specifying.
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.POutput
As part of applying the producing PTransform, finalizes this output to make it ready for being used as an input and for running.
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
 
finishWrite() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
fireEligibleTimers(InMemoryTimerInternals, Map<KV<String, String>, FnDataReceiver<Timer>>, Object) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
Fires all timers which are ready to be fired.
FirestoreIO - Class in org.apache.beam.sdk.io.gcp.firestore
FirestoreIO provides an API for reading from and writing to Google Cloud Firestore.
FirestoreOptions - Interface in org.apache.beam.sdk.io.gcp.firestore
 
FirestoreV1 - Class in org.apache.beam.sdk.io.gcp.firestore
FirestoreV1 provides lifecycle-managed PTransforms for the Cloud Firestore v1 API.
FirestoreV1.BatchGetDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<BatchGetDocumentsRequest>, PCollection<BatchGetDocumentsResponse>> which will read from Firestore.
FirestoreV1.BatchGetDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.BatchGetDocuments allowing configuration and instantiation.
FirestoreV1.BatchWriteWithDeadLetterQueue - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<Write>, PCollection<FirestoreV1.WriteFailure>> which will write to Firestore.
FirestoreV1.BatchWriteWithDeadLetterQueue.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.BatchWriteWithDeadLetterQueue allowing configuration and instantiation.
FirestoreV1.BatchWriteWithSummary - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<Write>, PDone> which will write to Firestore.
FirestoreV1.BatchWriteWithSummary.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.BatchWriteWithSummary allowing configuration and instantiation.
FirestoreV1.FailedWritesException - Exception in org.apache.beam.sdk.io.gcp.firestore
Exception that is thrown if one or more Writes is unsuccessful with a non-retryable status code.
FirestoreV1.ListCollectionIds - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<ListCollectionIdsRequest>, PCollection<ListCollectionIdsResponse>> which will read from Firestore.
FirestoreV1.ListCollectionIds.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.ListCollectionIds allowing configuration and instantiation.
FirestoreV1.ListDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<ListDocumentsRequest>, PCollection<ListDocumentsResponse>> which will read from Firestore.
FirestoreV1.ListDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.ListDocuments allowing configuration and instantiation.
FirestoreV1.PartitionQuery - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<PartitionQueryRequest>, PCollection<RunQueryRequest>> which will read from Firestore.
FirestoreV1.PartitionQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.PartitionQuery allowing configuration and instantiation.
FirestoreV1.Read - Class in org.apache.beam.sdk.io.gcp.firestore
Type safe builder factory for read operations.
FirestoreV1.RunQuery - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<RunQueryRequest>, PCollection<RunQueryResponse>> which will read from Firestore.
FirestoreV1.RunQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.RunQuery allowing configuration and instantiation.
FirestoreV1.Write - Class in org.apache.beam.sdk.io.gcp.firestore
Type safe builder factory for write operations.
FirestoreV1.WriteFailure - Class in org.apache.beam.sdk.io.gcp.firestore
Failure details for an attempted Write.
FirestoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.firestore
Summary object produced when a number of writes are successfully written to Firestore in a single BatchWrite.
fireTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
Returns the firing timestamp of the current timer.
first - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
fixDefaults() - Method in class org.apache.beam.sdk.values.WindowingStrategy
Fixes all the defaults so that equals can be used to check that two strategies are the same, regardless of the state of "defaulted-ness".
FIXED_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
 
FixedBytes - Class in org.apache.beam.sdk.schemas.logicaltypes
A LogicalType representing a fixed-length byte array.
FixedPrecisionNumeric - Class in org.apache.beam.sdk.schemas.logicaltypes
Fixed precision numeric types used to represent jdbc NUMERIC and DECIMAL types.
fixedSizeGlobally(int) - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a PTransform that takes a PCollection<T>, selects sampleSize elements, uniformly at random, and returns a PCollection<Iterable<T>> containing the selected elements.
fixedSizePerKey(int) - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, Iterable<V>>> that contains an output element mapping each distinct key in the input PCollection to a sample of sampleSize values associated with that key in the input PCollection, taken uniformly at random.
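An illustrative sketch of both sampling transforms, assuming existing collections words (PCollection<String>) and wordsPerUser (PCollection<KV<String, String>>):
    PCollection<Iterable<String>> sample =
        words.apply(Sample.fixedSizeGlobally(10));      // at most 10 elements, chosen uniformly at random
    PCollection<KV<String, Iterable<String>>> perKey =
        wordsPerUser.apply(Sample.fixedSizePerKey(5));  // at most 5 values per distinct key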
fixedString(int) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
FixedString - Class in org.apache.beam.sdk.schemas.logicaltypes
A LogicalType representing a fixed-length string.
fixedStringSize() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
FixedWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows values into fixed-size timestamp-based windows.
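A sketch of assigning elements to fixed windows, assuming an existing PCollection<String> named events:
    PCollection<String> windowed =
        events.apply(Window.into(FixedWindows.of(Duration.standardMinutes(5))));  // 5-minute fixed windows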
flatMap(KV<K, Iterable<WindowedValue<V>>>, RecordCollector<WindowedValue<KV<K, Iterable<V>>>>) - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
 
FlatMapElements<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
PTransforms for mapping a simple function that returns iterables over the elements of a PCollection and merging the results.
FlatMapElements.FlatMapWithFailures<InputT,OutputT,FailureT> - Class in org.apache.beam.sdk.transforms
A PTransform that adds exception handling to FlatMapElements.
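A minimal FlatMapElements sketch, assuming an existing PCollection<String> named lines:
    PCollection<String> words =
        lines.apply(
            FlatMapElements.into(TypeDescriptors.strings())
                .via((String line) -> Arrays.asList(line.split(" "))));  // emits one element per word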
Flatten - Class in org.apache.beam.sdk.transforms
Flatten<T> takes multiple PCollection<T>s bundled into a PCollectionList<T> and returns a single PCollection<T> containing all the elements in all the input PCollections.
Flatten() - Constructor for class org.apache.beam.sdk.transforms.Flatten
 
Flatten.Iterables<T> - Class in org.apache.beam.sdk.transforms
FlattenIterables<T> takes a PCollection<Iterable<T>> and returns a PCollection<T> that contains all the elements from each iterable.
Flatten.PCollections<T> - Class in org.apache.beam.sdk.transforms
A PTransform that flattens a PCollectionList into a PCollection containing all the elements of all the PCollections in its input.
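A sketch of merging collections with Flatten, assuming two existing PCollection<String>s named first and second:
    PCollection<String> merged =
        PCollectionList.of(first).and(second)
            .apply(Flatten.pCollections());  // single collection containing the elements of both inputs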
Flattened() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Flattened
 
flattenedSchema() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
Selects every leaf-level field.
FlattenP - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for Beam's Flatten primitive.
FlattenP.Supplier - Class in org.apache.beam.runners.jet.processors
Jet Processor supplier that will provide instances of FlattenP.
flattenRel(RelStructuredTypeFlattener) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
FlattenTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
An implementation of TypedSchemaTransformProvider for Flatten.
FlattenTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
 
FlattenTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
 
FlattenTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
 
FlattenTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
Flatten translator.
FlattenTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.FlattenTranslatorBatch
 
FlattenWithHeterogeneousCoders - Interface in org.apache.beam.sdk.testing
Category tag for tests that use a Flatten where the input PCollectionList contains PCollections with heterogeneous coders.
FlinkBatchPortablePipelineTranslator - Class in org.apache.beam.runners.flink
A translator that translates bounded portable pipelines into executable Flink pipelines.
FlinkBatchPortablePipelineTranslator(Map<String, FlinkBatchPortablePipelineTranslator.PTransformTranslator>) - Constructor for class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
 
FlinkBatchPortablePipelineTranslator.BatchTranslationContext - Class in org.apache.beam.runners.flink
Batch translation context.
FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
Predicate to determine whether a URN is a Flink native transform.
FlinkBatchPortablePipelineTranslator.PTransformTranslator - Interface in org.apache.beam.runners.flink
Transform translation interface.
FlinkDetachedRunnerResult - Class in org.apache.beam.runners.flink
Result of a detached execution of a Pipeline with Flink.
FlinkExecutionEnvironments - Class in org.apache.beam.runners.flink
Utilities for Flink execution environments.
FlinkExecutionEnvironments() - Constructor for class org.apache.beam.runners.flink.FlinkExecutionEnvironments
 
FlinkJobInvoker - Class in org.apache.beam.runners.flink
Job Invoker for the FlinkRunner.
FlinkJobInvoker(FlinkJobServerDriver.FlinkServerConfiguration) - Constructor for class org.apache.beam.runners.flink.FlinkJobInvoker
 
FlinkJobServerDriver - Class in org.apache.beam.runners.flink
Driver program that starts a job server for the Flink runner.
FlinkJobServerDriver.FlinkServerConfiguration - Class in org.apache.beam.runners.flink
Flink runner-specific Configuration for the jobServer.
FlinkMetricContainer - Class in org.apache.beam.runners.flink.metrics
Helper class for holding a MetricsContainerImpl and forwarding Beam metrics to Flink accumulators and metrics.
FlinkMetricContainer(RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
 
FlinkMetricContainerWithoutAccumulator - Class in org.apache.beam.runners.flink.metrics
The base helper class for holding a MetricsContainerImpl and forwarding Beam metrics to Flink accumulators and metrics.
FlinkMetricContainerWithoutAccumulator(MetricGroup) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainerWithoutAccumulator
 
FlinkMiniClusterEntryPoint - Class in org.apache.beam.runners.flink
Entry point for starting an embedded Flink cluster.
FlinkMiniClusterEntryPoint() - Constructor for class org.apache.beam.runners.flink.FlinkMiniClusterEntryPoint
 
FlinkPipelineOptions - Interface in org.apache.beam.runners.flink
Options which can be used to configure the Flink Runner.
FlinkPipelineOptions.MaxBundleSizeFactory - Class in org.apache.beam.runners.flink
Maximum bundle size factory.
FlinkPipelineOptions.MaxBundleTimeFactory - Class in org.apache.beam.runners.flink
Maximum bundle time factory.
FlinkPipelineRunner - Class in org.apache.beam.runners.flink
Runs a Pipeline on Flink via FlinkRunner.
FlinkPipelineRunner(FlinkPipelineOptions, String, List<String>) - Constructor for class org.apache.beam.runners.flink.FlinkPipelineRunner
Setup a flink pipeline runner.
FlinkPortableClientEntryPoint - Class in org.apache.beam.runners.flink
Flink job entry point to launch a Beam pipeline by executing an external SDK driver program.
FlinkPortableClientEntryPoint(String) - Constructor for class org.apache.beam.runners.flink.FlinkPortableClientEntryPoint
 
FlinkPortablePipelineTranslator<T extends FlinkPortablePipelineTranslator.TranslationContext> - Interface in org.apache.beam.runners.flink
Interface for portable Flink translators.
FlinkPortablePipelineTranslator.Executor - Interface in org.apache.beam.runners.flink
A handle used to execute a translated pipeline.
FlinkPortablePipelineTranslator.TranslationContext - Interface in org.apache.beam.runners.flink
The context used for pipeline translation.
FlinkPortableRunnerResult - Class in org.apache.beam.runners.flink
Result of executing a portable Pipeline with Flink.
FlinkRunner - Class in org.apache.beam.runners.flink
A PipelineRunner that executes the operations in the pipeline by first translating them to a Flink Plan and then executing them either locally or on a Flink cluster, depending on the configuration.
FlinkRunner(FlinkPipelineOptions) - Constructor for class org.apache.beam.runners.flink.FlinkRunner
 
FlinkRunnerRegistrar - Class in org.apache.beam.runners.flink
AutoService registrar - will register FlinkRunner and FlinkOptions as possible pipeline runner services.
FlinkRunnerRegistrar.Options - Class in org.apache.beam.runners.flink
Pipeline options registrar.
FlinkRunnerRegistrar.Runner - Class in org.apache.beam.runners.flink
Pipeline runner registrar.
FlinkRunnerResult - Class in org.apache.beam.runners.flink
Result of executing a Pipeline with Flink.
FlinkServerConfiguration() - Constructor for class org.apache.beam.runners.flink.FlinkJobServerDriver.FlinkServerConfiguration
 
FlinkStateBackendFactory - Interface in org.apache.beam.runners.flink
Constructs a StateBackend to use from flink pipeline options.
FlinkStreamingPortablePipelineTranslator - Class in org.apache.beam.runners.flink
Translate an unbounded portable pipeline representation into a Flink pipeline representation.
FlinkStreamingPortablePipelineTranslator() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
 
FlinkStreamingPortablePipelineTranslator(Map<String, FlinkStreamingPortablePipelineTranslator.PTransformTranslator<FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext>>) - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
 
FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
Predicate to determine whether a URN is a Flink native transform.
FlinkStreamingPortablePipelineTranslator.PTransformTranslator<T> - Interface in org.apache.beam.runners.flink
 
FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext - Class in org.apache.beam.runners.flink
Streaming translation context.
FLOAT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
FLOAT - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of float fields.
FLOAT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
FLOAT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
FloatCoder - Class in org.apache.beam.sdk.coders
A FloatCoder encodes Float values in 4 bytes using Java serialization.
floats() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Float.
floatToByteArray(float) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
flush(boolean) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
 
flush() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
 
flush() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
flush() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
Deprecated.
to be removed once splitting/checkpointing are available in SDKs and rewinding in readers.
flush() - Method in interface org.apache.beam.sdk.io.FileIO.Sink
Flushes the buffered state (if any) before the channel is closed.
flush(String, long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Flush a given stream up to the given offset.
flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
flush() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
flush() - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
flush() - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
 
flush() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
 
flush() - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
flushBundle(DoFn<KV<Integer, Solace.Record>, Solace.PublishResult>.OnTimerContext) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
 
fn(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
fn(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
Binary compatibility adapter for Contextful.fn(ProcessFunction).
fn(Contextful.Fn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
Same as Contextful.of(ClosureT, org.apache.beam.sdk.transforms.Requirements), but with better type inference behavior for the case of Contextful.Fn.
FnApiControlClient - Class in org.apache.beam.runners.fnexecution.control
A client for the control plane of an SDK harness, which can issue requests to it over the Fn API.
FnApiControlClientPoolService - Class in org.apache.beam.runners.fnexecution.control
A Fn API control service which adds incoming SDK harness connections to a sink.
FnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
A receiver of streamed data.
FnDataService - Interface in org.apache.beam.runners.fnexecution.data
The FnDataService is able to forward inbound elements to a consumer and is also a consumer of outbound elements.
FnService - Interface in org.apache.beam.sdk.fn.server
An interface sharing common behavior with services used during execution of user Fns.
forBagUserStateHandlerFactory(ProcessBundleDescriptors.ExecutableProcessBundleDescriptor, StateRequestHandlers.BagUserStateHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
forBatch(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
forBoolean(Boolean) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forBytes() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
Returns a HllCount.Init.Builder for a HllCount.Init combining PTransform that computes bytes-type HLL++ sketches.
forClass(Class<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject to be used for serializing an instance of the supplied class for transport via the Dataflow API.
forClassName(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject to be used for serializing data to be deserialized using the supplied class name, for transport via the Dataflow API.
forCoder(TypeDescriptor<?>, Coder<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
Creates a CoderProvider that always returns the given coder for the specified type.
forConsumers(List<DataEndpoint<?>>, List<TimerEndpoint<?>>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
Creates a receiver that is able to consume elements multiplexing on to the provided set of endpoints.
forDescriptor(ProtoDomain, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
Create a new ProtoDynamicMessageSchema from a ProtoDomain and for a message.
forDescriptor(ProtoDomain, Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
Create a new ProtoDynamicMessageSchema from a ProtoDomain and for a descriptor.
forDescriptor(Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
 
forEncoding(ByteString) - Static method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
 
forever(Trigger) - Static method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
Create a composite trigger that repeatedly executes the given trigger, firing each time it fires and ignoring any indications to finish.
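An illustrative windowing configuration built around Repeatedly.forever (the window size and firing condition are arbitrary):
    Window<String> windowing =
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
            .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(100)))  // keeps firing every 100 elements
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes();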
forField(TypeDescriptor<?>, Field, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forFloat(Float) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forFloat(Double) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forGetter(Method, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forGetter(TypeDescriptor<?>, Method, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forHandler(RunnerApi.Environment, InstructionRequestHandler) - Static method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
Create a new RemoteEnvironment for the provided RunnerApi.Environment and AutoCloseable InstructionRequestHandler.
forInteger(Long) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forInteger(Integer) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forIntegers() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
Returns a HllCount.Init.Builder for a HllCount.Init combining PTransform that computes integer-type HLL++ sketches.
forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
Returns a StateRequestHandlers.IterableSideInputHandler for the given pTransformId, sideInputId.
forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
 
forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
 
forKnownType(Object) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value of a well-known cloud object type.
forLongs() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
Returns a HllCount.Init.Builder for a HllCount.Init combining PTransform that computes long-type HLL++ sketches.
FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
FormatAsTextFn() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
 
formatByteStringRange(Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Returns a formatted string of a partition for debugging.
formatRecord(ElementT, Schema) - Method in interface org.apache.beam.sdk.extensions.avro.io.AvroIO.RecordFormatter
Deprecated.
 
formatRecord(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Convert an input record type into the output type.
formatTimestamp(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
Formats an Instant timestamp with additional Beam-specific metadata, such as indicating whether the timestamp is the end of the global window or one of the distinguished values BoundedWindow.TIMESTAMP_MIN_VALUE or BoundedWindow.TIMESTAMP_MAX_VALUE.
formatTimestampWithTimeZone(DateTime) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
Returns a StateRequestHandlers.MultimapSideInputHandler for the given pTransformId, sideInputId.
forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
 
forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
 
forNewInput(Instant, InputT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
Called by the Watch transform to create a new independent termination state for a newly arrived InputT.
forOneOf(String, boolean, Map<String, FieldValueTypeInformation>) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forOrdinal(int) - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
forProject(String, int, String) - Static method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
Initializes a client for managing transform service instances.
forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
 
forRequestObserver(String, StreamObserver<BeamFnApi.InstructionRequest>, ConcurrentMap<String, BeamFnApi.ProcessBundleDescriptor>) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
Returns a FnApiControlClient which will submit its requests to the provided observer.
forService(InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
 
forSetter(Method) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forSetter(Method, String) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forSetter(TypeDescriptor<?>, Method) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forSetter(TypeDescriptor<?>, Method, String) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
forSideInputHandlerFactory(Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, StateRequestHandlers.SideInputHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
Returns an adapter which converts a StateRequestHandlers.SideInputHandlerFactory to a StateRequestHandler.
forSqlType(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
 
forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
 
forStage(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.JobBundleFactory
 
forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
Deprecated.
 
forStage(ExecutableStage, BatchSideInputHandlerFactory.SideInputGetter) - Static method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
Creates a new state handler for the given stage.
forStage(ExecutableStage, Map<RunnerApi.ExecutableStagePayload.SideInputId, PCollectionView<?>>, SideInputHandler) - Static method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
Creates a new state handler for the given stage.
forStreamFromSources(List<Integer>, Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
Build the TimerInternals according to the feeding streams.
forStreaming(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
forString(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forStrings() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
Returns a HllCount.Init.Builder for a HllCount.Init combining PTransform that computes string-type HLL++ sketches.
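A sketch of building HLL++ sketches with one of these builders, assuming an existing PCollection<Long> named userIds:
    PCollection<byte[]> sketch =
        userIds.apply(HllCount.Init.forLongs().globally());  // one combined HLL++ sketch for the whole collection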
forThrowable(Throwable) - Static method in class org.apache.beam.sdk.values.EncodableThrowable
Wraps throwable and returns the result.
forTransformHierarchy(TransformHierarchy, PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
 
forTypeName(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
 
forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandlerFactory
 
ForwardingClientResponseObserver<ReqT,RespT> - Class in org.apache.beam.sdk.fn.stream
A ClientResponseObserver which delegates all StreamObserver calls.
forWriter(LogWriter) - Static method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
 
freeze() - Method in class org.apache.beam.runners.jet.metrics.JetMetricResults
 
from(DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
 
from(Map<String, String>) - Static method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Deprecated.
Expects a map keyed by logger Names with values representing Levels.
from(WindowIntoTransformProvider.Configuration) - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
 
from(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
Reads from the given filename or filepattern.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
Reads from the given filename or filepattern.
from(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
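An illustrative Avro read into a user type, assuming an existing Pipeline p and a hypothetical Avro-compatible class MyRecord:
    PCollection<MyRecord> records =
        p.apply(AvroIO.read(MyRecord.class).from("/tmp/data/*.avro"));  // MyRecord and the path are placeholders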
from(String, Row, Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(ValueProvider<String>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Reads from the given file name or pattern ("glob").
from(MatchResult.Metadata) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
 
from(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
from(String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Instantiates a cross-language wrapper for a Python transform with a given transform name.
from(String, String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Instantiates a cross-language wrapper for a Python transform with a given transform name.
from(Row) - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
 
from(ExecutorService) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
 
from(Supplier<ExecutorService>) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
 
from(FileBasedSource<T>) - Static method in class org.apache.beam.sdk.io.CompressedSource
Creates a CompressedSource from an underlying FileBasedSource.
from(String) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
Reads text from the file(s) with the given filename or filename pattern.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
Same as from(filepattern), but accepting a ValueProvider.
from(CsvWriteTransformProvider.CsvWriteConfiguration) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
 
from(FileReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
 
from(FileWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
from(BigQueryExportReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Returns the expected SchemaTransform of the configuration.
from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Reads a BigQuery table specified as "[project_id]:[dataset_id].[table_id]" or "[dataset_id].[table_id]" for tables within the current project.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Same as from(String), but with a ValueProvider.
from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Read from table specified by a TableReference.
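An illustrative table read via BigQueryIO.readTableRows(), assuming an existing Pipeline p (the table name is a placeholder):
    PCollection<TableRow> rows =
        p.apply(BigQueryIO.readTableRows().from("my-project:my_dataset.my_table"));  // reads the whole table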
from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
 
from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
from(BigQueryWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
 
from(BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
from(BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(PubsubReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(PubsubWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
from(PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
from(PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
from(Struct) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.PartitionMetadataMapper
Transforms a Struct representing a partition metadata row into a PartitionMetadata model.
from(SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
from(SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
 
from(SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
from(long) - Static method in class org.apache.beam.sdk.io.GenerateSequence
Specifies the minimum number to generate (inclusive).
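As a hedged example, generating a bounded sequence of longs (the range bounds and pipeline variable p are placeholders):

    // Produces the longs 0..999 (the upper bound is exclusive).
    PCollection<Long> numbers = p.apply(GenerateSequence.from(0).to(1000));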
from(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
 
from(SchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
 
from(IcebergWriteSchemaTransformProvider.Configuration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
 
from(JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
 
from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
 
from(JsonWriteTransformProvider.JsonWriteConfiguration) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
 
from(KafkaReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
from(KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
 
from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
 
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
Reads from the given filename or filepattern.
from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
from(BoundedSource<T>) - Method in class org.apache.beam.sdk.io.Read.Builder
Returns a new Read.Bounded PTransform reading from the given BoundedSource.
from(UnboundedSource<T, ?>) - Method in class org.apache.beam.sdk.io.Read.Builder
Returns a new Read.Unbounded PTransform reading from the given UnboundedSource.
from(BoundedSource<T>) - Static method in class org.apache.beam.sdk.io.Read
Returns a new Read.Bounded PTransform reading from the given BoundedSource.
from(UnboundedSource<T, ?>) - Static method in class org.apache.beam.sdk.io.Read
Returns a new Read.Unbounded PTransform reading from the given UnboundedSource.
from(SingleStoreSchemaTransformReadConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
Returns the expected SchemaTransform of the configuration.
from(SingleStoreSchemaTransformWriteConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
Returns the expected SchemaTransform of the configuration.
from(Solace.Queue) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
Set the queue name to read from.
from(Solace.Topic) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
Set the topic name to read from.
from(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
Provide name of collection while reading from Solr.
from(String) - Method in class org.apache.beam.sdk.io.TextIO.Read
Reads text from the file(s) with the given filename or filename pattern.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Read
Same as from(filepattern), but accepting a ValueProvider.
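For illustration, a minimal sketch of reading text files by file pattern (the bucket path and pipeline variable p are placeholders):

    PCollection<String> lines =
        p.apply("ReadLines", TextIO.read().from("gs://my-bucket/input/*.txt"));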
from(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Returns a transform for reading TFRecord files that reads from the file(s) with the given filename or filename pattern.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Same as from(filepattern), but accepting a ValueProvider.
from(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
from(ManagedSchemaTransformProvider.ManagedConfig) - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
 
from(TestSchemaTransformProvider.Config) - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
 
from(Map<String, String>) - Static method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
Expects a map keyed by logger names, with values representing LogLevels.
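A brief illustrative snippet of supplying such a map (the logger name and level are placeholders; assumes Java 9+ Map.of):

    SdkHarnessOptions options = PipelineOptionsFactory.as(SdkHarnessOptions.class);
    options.setSdkHarnessLogLevelOverrides(
        SdkHarnessOptions.SdkHarnessLogLevelOverrides.from(
            Map.of("org.apache.kafka", "WARN")));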
from(GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
from(String, Row, Schema) - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(FlattenTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
 
from(JavaExplodeTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
from(JavaFilterTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
from(JavaMapToFieldsTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
from(LoggingTransformProvider.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
from(Row) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
Produce a SchemaTransform from some transform-specific configuration object.
from(ConfigT) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
Produce a SchemaTransform from ConfigT.
from(Row) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
Produces a SchemaTransform from a Row configuration.
from(HasDisplayData) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Collect the DisplayData from a component.
from(double, double) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
A representation for the amount of known completed and remaining work.
fromArgs(String...) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
Sets the command line arguments to parse when constructing the PipelineOptions.
fromArgs(String...) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Sets the command line arguments to parse when constructing the PipelineOptions.
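For illustration, a minimal sketch of building PipelineOptions from command-line arguments (args is the main-method argument array):

    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline p = Pipeline.create(options);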
fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
Returns a PrefetchableIterable over the specified values.
fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
Returns a PrefetchableIterator over the specified values.
fromAvroType(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
Creates an AvroUtils.FixedBytesField from an Avro type.
fromBeamFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for deserializing a byte array using the specified coder.
fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.CoderHelpers
Utility method for deserializing a byte array using the specified coder.
fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
Utility method for deserializing a byte array using the specified coder.
fromByteArray(byte[], WindowedValue.WindowedValueCoder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
Utility method for deserializing a byte array using the specified coder.
fromByteArrays(Collection<byte[]>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for deserializing a Iterable of byte arrays using the specified coder.
fromByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a byte array to an object.
FromByteFunction(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
 
fromByteFunctionIterable(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a byte array pair to a key-value pair, where values are Iterable.
fromCanonical(Compression) - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Deprecated.
 
fromCloudDuration(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a Dataflow API duration string into a Duration.
fromCloudObject(CloudObject) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Converts back into the original object from a provided CloudObject.
fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
Convert from a cloud object.
fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
Convert from a cloud object.
fromCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
Transforms messages publishable using PubsubIO into their equivalent Pub/Sub Lite publishable messages.
fromCloudTime(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a time value received via the Dataflow API into the corresponding Instant.
fromComponents(List<Coder<?>>, byte[], CoderTranslation.TranslationContext) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
 
fromComponents(String, String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Creates a GcsPath from bucket and object components.
fromConfig(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
fromConfig(FlinkJobServerDriver.FlinkServerConfiguration, JobServerDriver.JobInvokerFactory) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
fromConfig(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
 
fromConfigRow(Row, PipelineOptions) - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
 
fromExceptionInformation(RecordT, Coder<RecordT>, Exception, String) - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
 
fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
Note that the BeamFnApi.ProcessBundleDescriptor is constructed by adding gRPC read and write nodes and wiring them to the specified data endpoint.
fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
 
fromExistingTable(String, String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
Encapsulates a selected table name.
fromFeedRange(FeedRange) - Static method in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
 
fromFile(String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
 
fromFile(String, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
 
fromFile(File, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromHex(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
fromHttpResponse(HttpResponse) - Static method in class org.apache.beam.sdk.io.solace.broker.BrokerResponse
 
fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
 
fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
 
fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
 
fromInt(Integer, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
 
fromIr(Ir, SbeSchema.IrOptions) - Static method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
Creates a new SbeSchema from the given intermediate representation.
fromIr(Ir) - Static method in class org.apache.beam.sdk.extensions.sbe.SerializableIr
Creates a new instance from ir.
fromJsonFile(File) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
Creates a ConfigWrapper from a JSON file.
fromJsonString(String) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
Creates a ConfigWrapper from a JSON string.
fromJsonString(String, Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
 
fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
 
fromMap(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
Returns a new configuration instance using provided flags.
fromModel(Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Converts a model Message to an HL7v2 message.
fromName(String) - Static method in enum org.apache.beam.io.debezium.Connectors
Returns a connector class corresponding to the given connector name.
fromName(String) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Queue
 
fromName(String) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.Topic
 
fromObject(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Creates a GcsPath from a StorageObject.
fromOptions(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Constructs a translator from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
Construct a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.TestDataflowRunner
Constructs a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.GcsStager
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.direct.DirectRunner
Construct a DirectRunner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkRunner
Construct a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.jet.JetRunner
 
fromOptions(PipelineOptions, Function<ClientConfig, JetInstance>) - Static method in class org.apache.beam.runners.jet.JetRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.PortableRunner
Constructs a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestPortableRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestUniversalRunner
Constructs a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.prism.PrismRunner
Invoked from Pipeline.run(); instantiates a PrismRunner using the PrismPipelineOptions configuration details.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.prism.TestPrismRunner
Invoked from Pipeline.run(); instantiates a TestPrismRunner using the TestPrismPipelineOptions configuration details.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
Creates and returns a new SparkRunner with specified options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunnerDebugger
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
Creates and returns a new SparkStructuredStreamingRunner with specified options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.TestSparkRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2Runner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2TestRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.s3.DefaultS3FileSystemSchemeRegistrar
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemRegistrar
 
fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.aws.s3.S3FileSystemSchemeRegistrar
Create zero or more S3FileSystemConfiguration instances from the given PipelineOptions.
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemRegistrar
 
fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.aws2.s3.S3FileSystemSchemeRegistrar
Create zero or more S3FileSystemConfiguration instances from the given PipelineOptions.
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
 
fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.FileSystemRegistrar
Create zero or more filesystems from the given PipelineOptions.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Creates an instance of this rule using provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.googleads.GoogleAdsUserCredentialFactory
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.LocalFileSystemRegistrar
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.PipelineRunner
Constructs a runner from the provided PipelineOptions.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.CrashingRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
Creates a ResourceHints instance with hints supplied in options.
fromParams(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
fromParams(String[]) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
 
fromParams(DefaultFilenamePolicy.Params) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Creates a class representing a Pub/Sub subscription from the specified subscription path.
fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
fromPath(Path, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromProcessFunctionWithOutputType(ProcessFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.InferableFunction
 
fromProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
 
fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Reads results received after executing the given query.
fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Same as fromQuery(String), but with a ValueProvider.
fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
fromQuery(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
A query to be executed in Snowflake.
fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
fromRawEvents(Coder<T>, List<TestStream.Event<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream
For internal use only.
fromResourceName(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Creates a GcsPath from a OnePlatform resource name in string form.
fromRow(Row) - Static method in class org.apache.beam.sdk.values.Row
Creates a row builder based on the specified row.
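As a hedged example, copying a row while overriding a single field (the variable names, field name, and value are placeholders):

    Row updated = Row.fromRow(original)
        .withFieldValue("status", "DONE")
        .build();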
fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
 
fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
Given a type, returns a function that converts from a Row object to that type.
fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
 
fromRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
Given a type, returns a function that converts from a Row object to that type.
fromRows(Class<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
Convert a PCollection<Row> into a PCollection<OutputT>.
fromRows(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
Convert a PCollection<Row> into a PCollection<OutputT>.
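For illustration, a minimal sketch using the Class overload above (MyPojo is a placeholder class assumed to have a registered schema):

    PCollection<MyPojo> pojos = rows.apply(Convert.fromRows(MyPojo.class));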
fromS3Options(S3Options) - Static method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Creates a new S3FileSystemConfiguration.Builder with values initialized by the properties of s3Options.
fromS3Options(S3Options) - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
 
fromSerializableFunctionWithOutputType(SerializableFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.SimpleFunction
 
fromSnapshot(Snapshot) - Static method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
fromSpec(Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject by copying the supplied serialized object spec, which must represent an SDK object serialized for transport via the Dataflow API.
fromSpec(HCatalogIO.Read) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatToRow
Wraps the HCatalogIO.read() to convert HCatRecords to Rows.
fromStandardParameters(ValueProvider<ResourceId>, String, String, boolean) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
fromStaticMethods(Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
Creates a CoderProvider from a class's static <T> Coder<T> of(TypeDescriptor<T>, List<Coder<?>>) method.
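A brief illustrative snippet of registering a coder provider built this way (MyType and MyTypeCoder are placeholders; MyTypeCoder is assumed to declare the static of(...) method described above):

    CoderProvider provider =
        CoderProviders.fromStaticMethods(MyType.class, MyTypeCoder.class);
    pipeline.getCoderRegistry().registerCoderProvider(provider);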
fromString(String, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromString(ValueProvider<String>, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Reads from the given subscription.
fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Like fromSubscription(String), but with a ValueProvider.
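For illustration, a minimal sketch of reading messages from a subscription (the subscription path and pipeline variable p are placeholders):

    PCollection<PubsubMessage> messages =
        p.apply(PubsubIO.readMessages()
            .fromSubscription("projects/my-project/subscriptions/my-subscription"));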
fromSupplier(SerializableSupplier<Matcher<T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
Constructs a SerializableMatcher from a non-serializable Matcher via indirection through SerializableSupplier.
fromTable(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
A table name to be read in Snowflake.
fromTable(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
fromTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a BigQuery TableSchema to a Beam Schema.
fromTableSchema(TableSchema, BigQueryUtils.SchemaConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a BigQuery TableSchema to a Beam Schema.
fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
fromUri(URI) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Creates a GcsPath from a URI.
fromUri(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Creates a GcsPath from a URI in string form.
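As a hedged example, parsing a GCS URI into its components (the bucket and object names are placeholders):

    GcsPath path = GcsPath.fromUri("gs://my-bucket/path/to/object.txt");
    String bucket = path.getBucket();  // "my-bucket"
    String object = path.getObject();  // "path/to/object.txt"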
FULL_RANGE - Static variable in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
 
FullNameTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
Base class for table providers that look up table metadata using full table names, instead of querying it by parts of the name separately.
FullNameTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
 
fullOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Full Outer Join of two collections of KV elements.
fullOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Full Outer Join of two collections of KV elements.
fullOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
Perform a full outer join.
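For illustration, a sketch of a schema-based full outer join (left, right, and the field name userId are placeholders; both inputs are assumed to have schemas containing that field):

    PCollection<Row> joined =
        left.apply(Join.<Row, Row>fullOuterJoin(right).using("userId"));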
fullPublishResult() - Static method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoders
Returns a new PublishResult coder which serializes the sdkResponseMetadata and sdkHttpMetadata, including the HTTP response headers.
fullPublishResultWithoutHeaders() - Static method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoders
Returns a new PublishResult coder which serializes the sdkResponseMetadata and sdkHttpMetadata, but does not include the HTTP response headers.
fullUpdate(String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
Sets a full update of the given field using the given update operator.
fullyExpand(Map<TupleTag<?>, PValue>) - Static method in class org.apache.beam.sdk.values.PValues
Returns all the tagged PCollections represented in the given PValue.
fun1(ScalaInterop.Fun1<T, V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
fun2(ScalaInterop.Fun2<T1, T2, V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
functionGroup - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
ZetaSQL function group identifier.
fuse(PipelineTranslator.UnresolvedTranslation<T, T2>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
 

G

Gauge - Interface in org.apache.beam.sdk.metrics
A metric that reports the latest value out of reported values.
gauge(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
gauge(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
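A brief illustrative fragment of a hypothetical DoFn reporting a gauge (MyDoFn and the metric name are placeholders):

    private final Gauge queueDepth = Metrics.gauge(MyDoFn.class, "queueDepth");

    @ProcessElement
    public void processElement(@Element Long depth, OutputReceiver<Long> out) {
      queueDepth.set(depth);  // aggregated by keeping the last reported value
      out.output(depth);
    }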
GaugeImpl - Class in org.apache.beam.runners.jet.metrics
Implementation of Gauge.
GaugeImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.GaugeImpl
 
GaugeResult - Class in org.apache.beam.sdk.metrics
The result of a Gauge metric.
GaugeResult() - Constructor for class org.apache.beam.sdk.metrics.GaugeResult
 
GaugeResult.EmptyGaugeResult - Class in org.apache.beam.sdk.metrics
Empty GaugeResult, representing no values reported.
GceMetadataUtil - Class in org.apache.beam.sdk.extensions.gcp.util
 
GceMetadataUtil() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
 
GcpCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
Constructs an OAuth credential to be used by the SDK and the SDK workers.
GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
A registrar containing the default GCP options.
GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
GcpOAuthScopesFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpOAuthScopesFactory
 
GcpOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
Options used to configure Google Cloud Platform specific options such as the project and credentials.
GcpOptions.DefaultProjectFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Attempts to infer the default project based upon the environment this application is executing within.
GcpOptions.EnableStreamingEngineFactory - Class in org.apache.beam.sdk.extensions.gcp.options
EnableStreamingEngine defaults to false unless one of the two experiments is set.
GcpOptions.GcpOAuthScopesFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Returns the default set of OAuth scopes.
GcpOptions.GcpTempLocationFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Returns PipelineOptions.getTempLocation() as the default GCP temp location.
GcpOptions.GcpUserCredentialsFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Attempts to load the GCP credentials.
GcpPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.gcp.options
A registrar containing the default GCP options.
GcpPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
 
GCPSecretSessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
This class implements a SessionServiceFactory that retrieves the basic authentication credentials from a Google Cloud Secret Manager secret.
GCPSecretSessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
GCPSecretSessionServiceFactory.Builder - Class in org.apache.beam.sdk.io.solace.broker
 
GcpTempLocationFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
 
GcpUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
 
GCS_URI - Static variable in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Pattern that is used to parse a GCS URL.
GcsCountersOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
 
GcsCreateOptions - Class in org.apache.beam.sdk.extensions.gcp.storage
An abstract class that contains common configuration options for creating resources.
GcsCreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
 
GcsCreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.storage
A builder for GcsCreateOptions.
GcsFileSystemRegistrar - Class in org.apache.beam.sdk.extensions.gcp.storage
AutoService registrar for the GcsFileSystem.
GcsFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
 
GcsOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
Options used to configure Google Cloud Storage.
GcsOptions.ExecutorServiceFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Returns the default ExecutorService to use within the Apache Beam SDK.
GcsOptions.PathValidatorFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Creates a PathValidator object using the class specified in GcsOptions.getPathValidatorClass().
GcsPath - Class in org.apache.beam.sdk.extensions.gcp.util.gcsfs
Implements the Java NIO Path API for Google Cloud Storage paths.
GcsPath(FileSystem, String, String) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Constructs a GcsPath.
GcsPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
GCP implementation of PathValidator.
GcsResourceId - Class in org.apache.beam.sdk.extensions.gcp.storage
ResourceId implementation for Google Cloud Storage.
GcsStager - Class in org.apache.beam.runners.dataflow.util
Utility class for staging files to GCS.
gcsUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
The buffer size (in bytes) to use when uploading files to GCS.
GcsUtil - Class in org.apache.beam.sdk.extensions.gcp.util
Provides operations on GCS.
GcsUtil.CreateOptions - Class in org.apache.beam.sdk.extensions.gcp.util
 
GcsUtil.CreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
 
GcsUtil.GcsCountersOptions - Class in org.apache.beam.sdk.extensions.gcp.util
 
GcsUtil.GcsUtilFactory - Class in org.apache.beam.sdk.extensions.gcp.util
This is a DefaultValueFactory able to create a GcsUtil using any transport flags specified on the PipelineOptions.
GcsUtil.StorageObjectOrIOException - Class in org.apache.beam.sdk.extensions.gcp.util
A class that holds either a StorageObject or an IOException.
GcsUtilFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
 
generate(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
 
generateInitialChangeStreamPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
Returns the result from the GenerateInitialChangeStreamPartitions API.
generateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing DetectNewPartitionsDoFn
GenerateInitialPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
Class to generate first set of outputs for DetectNewPartitionsDoFn.
GenerateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.GenerateInitialPartitionsAction
 
generateRandom(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
Generates a unique name for the partition metadata table and its indexes.
generateRowKeyPrefix() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
Returns a random base64-encoded 8-byte string.
GenerateSequence - Class in org.apache.beam.sdk.io
A PTransform that produces longs starting from the given value, either up to the given limit or to Long.MAX_VALUE, or until the given time elapses.
GenerateSequence() - Constructor for class org.apache.beam.sdk.io.GenerateSequence
 
GenerateSequence.External - Class in org.apache.beam.sdk.io
Exposes GenerateSequence as an external transform for cross-language usage.
GenerateSequence.External.ExternalConfiguration - Class in org.apache.beam.sdk.io
Parameters class to expose the transform to an external SDK.
GenerateSequenceConfiguration() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
 
GenerateSequenceSchemaTransformProvider - Class in org.apache.beam.sdk.providers
 
GenerateSequenceSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration - Class in org.apache.beam.sdk.providers
 
GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder - Class in org.apache.beam.sdk.providers
 
GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate - Class in org.apache.beam.sdk.providers
 
GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder - Class in org.apache.beam.sdk.providers
 
GenerateSequenceSchemaTransformProvider.GenerateSequenceSchemaTransform - Class in org.apache.beam.sdk.providers
 
GenerateSequenceTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.seqgen
Sequence generator table provider.
GenerateSequenceTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
 
generic(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the Avro schema.
generic() - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
Returns an AvroDatumFactory instance for GenericRecord.
GenericDatumFactory() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
 
GenericDlq - Class in org.apache.beam.sdk.schemas.io
Helper to generate a DLQ transform to write a PCollection to an external system.
GenericDlqProvider - Interface in org.apache.beam.sdk.schemas.io
A Provider for generic DLQ transforms that handle deserialization failures.
get(JobInfo) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext.Factory
Get or create ExecutableStageContext for given JobInfo.
get(JobInfo) - Method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
 
get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
Returns an Iterable of values representing the bag user state for the given key and window.
get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.IterableSideInputHandler
Returns an Iterable of values representing the side input for the given window.
get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
Returns an Iterable of keys representing the side input for the given window.
get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
Returns an Iterable of values representing the side input for the given key and window.
get() - Method in class org.apache.beam.runners.portability.CloseableResource
Gets the underlying resource.
get() - Static method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
 
get(BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.ByWindow
 
get(BoundedWindow) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues
 
get(BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.Global
 
get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
 
get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
 
get(Long) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
Returns the Broadcast containing the GlobalWatermarkHolder.SparkWatermarks mapped to their sources.
get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
get() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
 
get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
 
get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
 
get(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
 
get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
 
get() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
 
get() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
Returns the estimated throughput bytes for this run.
get() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
Returns the estimated throughput bytes for now.
get() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
Returns the estimated throughput for now.
get() - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
get() - Method in interface org.apache.beam.sdk.options.ValueProvider
Returns the runtime value wrapped by this ValueProvider if ValueProvider.isAccessible() is true; otherwise fails.
get() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
get() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
get() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
 
get(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
 
get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
 
get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
 
get(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
 
get(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
get(TypeDescriptor<?>) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
Return all the FieldValueTypeInformations.
get(TypeDescriptor<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
Return all the FieldValueTypeInformations.
get(K) - Method in interface org.apache.beam.sdk.state.MapState
A deferred lookup, using null values if the item is not found.
get(K) - Method in interface org.apache.beam.sdk.state.MultimapState
A deferred lookup, returns an empty iterable if the item is not found.
get(String) - Method in interface org.apache.beam.sdk.state.TimerMap
 
get(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
Returns the value represented by the given TupleTag.
get(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
Returns an DoFn.OutputReceiver for the given tag.
get() - Method in interface org.apache.beam.sdk.transforms.Materializations.IterableView
Returns an iterable for all values.
get() - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
Returns an iterable of all keys.
get(K) - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
Returns an iterable of all the values for the specified key.
get(int) - Method in class org.apache.beam.sdk.values.PCollectionList
Returns the PCollection at the given index (origin zero).
get(String) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
Returns the PCollection associated with the given String in this PCollectionRowTuple.
get(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns the PCollection associated with the given TupleTag in this PCollectionTuple.
get(String) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns the PCollection associated with the given tag in this PCollectionTuple.
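For illustration, a minimal sketch of retrieving the outputs of a multi-output ParDo (MyParsingDoFn, the input collection, and the tag names are placeholders):

    TupleTag<String> mainTag = new TupleTag<String>() {};
    TupleTag<String> errorTag = new TupleTag<String>() {};
    PCollectionTuple results = input.apply(
        ParDo.of(new MyParsingDoFn())
            .withOutputTags(mainTag, TupleTagList.of(errorTag)));
    PCollection<String> parsed = results.get(mainTag);
    PCollection<String> errors = results.get(errorTag);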
get() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
 
get(int) - Method in class org.apache.beam.sdk.values.TupleTagList
Returns the TupleTag at the given index (origin zero).
getAcceptedIssuers() - Method in class org.apache.beam.sdk.io.splunk.CustomX509TrustManager
 
getAccessKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getAccountName() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getAccum() - Method in interface org.apache.beam.sdk.state.CombiningState
Read the merged accumulator for this state cell.
getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getAccumulatorCoder(CoderRegistry, Coder<TimestampedValue<KV<EventKeyT, KV<Long, EventT>>>>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
 
getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<Boolean>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
 
getAccumulatorCoder(CoderRegistry, Coder<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
Deprecated.
 
getAccumulatorCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
getAccumulatorCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the Coder to use for accumulator AccumT values, or null if it is not able to be inferred.
getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
getActiveWorkRefreshPeriodMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getAdditionalInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
getAdditionalInputs() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns the side inputs of this Combine, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns the side inputs of this Combine, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
Returns the side inputs of this ParDo, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns the side inputs of this ParDo, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns all PValues that are consumed as inputs to this PTransform that are independent of the expansion of the InputT within PTransform.expand(PInput).
getAdditionalOutputTags() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getAddresses() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getAlgorithm() - Method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
Returns the string representation of this type.
getAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
 
getAll() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Retrieve all HL7v2 Messages from a PCollection of message IDs (such as from PubSub notification subscription).
getAll(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns the values from the table represented by the given TupleTag<V> as an Iterable<V> (which may be empty if there are no results).
getAll(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Like CoGbkResult.getAll(TupleTag) but using a String instead of a TupleTag.
getAll() - Method in class org.apache.beam.sdk.values.PCollectionList
Returns an immutable List of all the PCollections in this PCollectionList.
getAll() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
Returns an immutable Map from tag to corresponding PCollection, for all the members of this PCollectionRowTuple.
getAll() - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns an immutable Map from TupleTag to corresponding PCollection, for all the members of this PCollectionTuple.
getAll() - Method in class org.apache.beam.sdk.values.TupleTagList
Returns an immutable List of all the TupleTags in this TupleTagList.
getAllFields() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
If true, all fields are being accessed.
getAllIds(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getAllJobs() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
getAllMetadata() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
use schema options instead.
getAllowDuplicates() - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
 
getAllowDuplicates() - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
 
getAllowedLateness() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.DoFn
Deprecated.
This method permits a DoFn to emit elements behind the watermark. These elements are considered late, and if behind the allowed lateness of a downstream PCollection may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement.
getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.WithTimestamps
Deprecated.
This method permits elements to be emitted behind the watermark. These elements are considered late, and if behind the allowed lateness of a downstream PCollection may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement.
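A sketch of the deprecated override both entries above refer to, shown only to clarify what the getter controls; the skew value and element handling are illustrative, and Duration here is org.joda.time.Duration.
  class ShiftTimestampsFn extends DoFn<String, String> {
    @Override
    public Duration getAllowedTimestampSkew() {
      // Tolerate outputs up to five minutes behind the incoming element's timestamp.
      return Duration.standardMinutes(5);
    }

    @ProcessElement
    public void process(ProcessContext c) {
      c.outputWithTimestamp(c.element(), c.timestamp().minus(Duration.standardMinutes(2)));
    }
  }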
getAllowlist() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
 
getAllowNonRestoredState() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getAllPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Fetches all partitions with a PartitionMetadataAdminDao.COLUMN_CREATED_AT less than the given timestamp.
getAllRows(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getAllWorkerStatuses(long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
Get all the statuses from all connected SDK harnesses within the specified timeout.
getAlpha() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
 
getAlsoStartLoopbackWorker() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
getAnnotatedConstructor(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
getAnnotatedCreateMethod(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
getAnnotations() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns annotations map to provide additional hints to the runner.
getApiKey() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getApiPrefix() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
Generates the API endpoint prefix based on the set values.
getApiRootUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The root URL for the Dataflow API.
getApiServiceDescriptor() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Get an Endpoints.ApiServiceDescriptor describing the endpoint this GrpcFnServer is bound to.
getAppend() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
 
getAppId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getApplicationName() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getAppliedFn(CoderRegistry, Coder<? extends KV<K, ? extends Iterable<InputT>>>, WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
Returns the Combine.CombineFn bound to its coders.
getApplyMethod(ScalarFn) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFnReflector
Gets the method annotated with ScalarFn.ApplyMethod from scalarFn.
getAppName() - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
Name of application, for display purposes.
getAppProfileId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the app profile being read from.
getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
 
getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
 
getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
 
getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
getArgument() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
An optional argument to configure the type.
getArguments() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
 
getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
 
getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
 
getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
 
getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
getArgumentType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
A schema type representing how to interpret the argument.
getArgumentTypes(Method) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a list of argument types for the given method, which must be a part of the class.
getArray(String) - Method in class org.apache.beam.sdk.values.Row
Get an array value by field name, IllegalStateException is thrown if schema doesn't match.
getArray(int) - Method in class org.apache.beam.sdk.values.Row
Get an array value by field index, IllegalStateException is thrown if schema doesn't match.
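A small sketch of reading array values back out of a Row; the schema and values are illustrative.
  Schema schema =
      Schema.builder()
          .addArrayField("tags", Schema.FieldType.STRING)
          .addBooleanField("active")
          .build();
  Row row =
      Row.withSchema(schema)
          .addArray("beam", "sdk")  // field 0: "tags"
          .addValue(true)           // field 1: "active"
          .build();
  Collection<String> byName = row.getArray("tags");
  Collection<String> byIndex = row.getArray(0);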
getArtifact(ArtifactApi.GetArtifactRequest, StreamObserver<ArtifactApi.GetArtifactResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
getArtifact(RunnerApi.ArtifactInformation) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
getArtifactPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
getArtifactStagingPath() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
getAttachedMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getAttachmentBytes() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the attachment data of the message as a byte array, if any.
getAttempted() - Method in class org.apache.beam.sdk.metrics.MetricResult
Return the value of this metric across all attempts of executing all parts of the pipeline.
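A minimal sketch of querying attempted (and committed) counter values after running a pipeline; the metric namespace and name are assumptions.
  PipelineResult result = pipeline.run();
  result.waitUntilFinish();
  MetricQueryResults metrics =
      result.metrics().queryMetrics(
          MetricsFilter.builder()
              .addNameFilter(MetricNameFilter.named("my-namespace", "records-processed"))
              .build());
  for (MetricResult<Long> counter : metrics.getCounters()) {
    Long attempted = counter.getAttempted();  // includes values from retried bundles
    Long committed = counter.getCommitted();  // only supported on some runners
  }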
getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the given attribute value.
getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the full map of attributes.
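A short hedged example of reading attributes from a PubsubMessage; the payload and attribute keys are made up.
  Map<String, String> attributes = new HashMap<>();
  attributes.put("eventType", "click");
  attributes.put("source", "web");
  PubsubMessage message =
      new PubsubMessage("payload".getBytes(StandardCharsets.UTF_8), attributes);
  String eventType = message.getAttribute("eventType");          // "click", or null if absent
  Map<String, String> allAttributes = message.getAttributeMap(); // the full attribute map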
getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getAuthenticator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getAuthenticator() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getAuthToken(String, String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
Certain embedded scenarios and so on actually allow for having no authentication at all.
getAutoCommit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getAutoOffsetResetConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getAutoscalingAlgorithm() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
The autoscaling algorithm to use for the workerpool.
getAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getAutosharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getAutoWatermarkInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getAvroBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a function mapping encoded AVRO GenericRecords to Beam Rows.
getAwsCredentialsProvider() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
The credential instance that should be used to authenticate against AWS services.
getAwsCredentialsProvider() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
AwsCredentialsProvider used to configure AWS service clients.
getAwsRegion() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
AWS region used by the AWS client.
getAwsRegion() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
Region used to configure AWS service clients.
getAwsServiceEndpoint() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
The AWS service endpoint used by the AWS client.
getAzureConnectionString() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getAzureCredentialsProvider() - Method in interface org.apache.beam.sdk.io.azure.options.AzureOptions
The credential instance that should be used to authenticate against Azure services.
getBacking() - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
getBacklogBytes(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
 
getBacklogBytes(String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
Retrieves the size of the backlog (in bytes) for the specified queue.
getBacklogCheckTime() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
The time at which the latest offset for the partition was fetched in order to calculate backlog.
getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getBadRecordRouter() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getBadRecordRouter() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getBagUserStateSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
Get a mapping from PTransform id to user state input id to bag user states that are used during execution.
getBaseAutoValueClass(TypeDescriptor<?>) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
 
getBaseName() - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
 
getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
 
getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
 
getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
 
getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
 
getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
getBaseType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
The base Schema.FieldType used to store values of this type.
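A hedged sketch that inspects these Schema.LogicalType accessors on a built-in logical type (EnumerationType); the enumeration values and field name are illustrative.
  EnumerationType color = EnumerationType.create("RED", "GREEN", "BLUE");
  Schema.FieldType baseType = color.getBaseType();          // how values are physically stored
  Schema.FieldType argumentType = color.getArgumentType();  // schema of the configuration argument
  Object argument = color.getArgument();                    // the argument used to configure the type
  Schema schema = Schema.builder().addLogicalTypeField("color", color).build();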
getBaseValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
Returns the base type for this field.
getBaseValue(String) - Method in class org.apache.beam.sdk.values.Row
Returns the base type for this field.
getBaseValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
Returns the base type for this field.
getBaseValue(int) - Method in class org.apache.beam.sdk.values.Row
Returns the base type for this field.
getBaseValues() - Method in class org.apache.beam.sdk.values.Row
Return a list of data values.
getBatchClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getBatchDuration() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
getBatches() - Method in class org.apache.beam.runners.spark.io.CreateStream
Get the underlying queue representing the mock stream of micro-batches.
getBatching() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
Returns user supplied parameters for batching.
getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.WithShardedKey
Returns user supplied parameters for batching.
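A minimal sketch of configuring the batching parameters that these getters expose; the key/value types and limits are illustrative, and Duration is org.joda.time.Duration.
  Pipeline pipeline = Pipeline.create();
  PCollection<KV<String, String>> keyed =
      pipeline.apply(Create.of(KV.of("user-1", "a"), KV.of("user-1", "b"), KV.of("user-2", "c")));
  PCollection<KV<String, Iterable<String>>> batched =
      keyed.apply(
          GroupIntoBatches.<String, String>ofSize(100)
              .withMaxBufferingDuration(Duration.standardSeconds(10)));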
getBatchInitialCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
getBatchIntervalMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getBatchMaxBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The maximum number of bytes to include in a batch.
getBatchMaxCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The maximum number of writes to include in a batch.
getBatchService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
 
getBatchService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
 
getBatchSize() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
 
getBatchSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getBatchSizeBytes() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
getBatchTargetLatency() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Target latency for batch requests.
getBeamRelInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
getBeamSchemaFromProto(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
Retrieves a Beam Schema from a Protocol Buffer message.
getBeamSchemaFromProtoSchema(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
Parses the given Protocol Buffers schema string, retrieves the Descriptor for the specified message name, and constructs a Beam Schema from it.
getBeamSqlTable() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
getBeamSqlUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
For UDFs implement BeamSqlUdf.
getBearerToken() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getBigQueryEndpoint() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
BQ endpoint to use.
getBigQueryLocation() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
getBigQueryProject() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getBigtableChangeStreamInstanceId() - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.BigtableChangeStreamTestOptions
 
getBigtableClientOverride() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the Bigtable client override.
getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
read options are configured directly on BigtableIO.read(). Use BigtableIO.Read.populateDisplayData(DisplayData.Builder) to view the current configurations.
getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
write options are configured directly on BigtableIO.write(). Use BigtableIO.Write.populateDisplayData(DisplayData.Builder) to view the current configurations.
getBlobServiceEndpoint() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
The Azure Blobstore service endpoint used by the Blob service client.
getBlobstoreClientFactoryClass() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getBody() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
Message body.
getBody() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getBoolean(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBoolean(Map<String, Object>, String, Boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBoolean() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getBoolean(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.BOOLEAN value by field name, IllegalStateException is thrown if schema doesn't match.
getBoolean(int) - Method in class org.apache.beam.sdk.values.Row
Get a Boolean value by field index, ClassCastException is thrown if schema doesn't match.
getBootstrapServers() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
Sets the bootstrap servers for the Kafka consumer.
getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getBoundednessOfRelNode(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
This method returns the Boundedness of a RelNode.
getBqStreamingApiLoggingFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getBranch() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getBroadcastSizeEstimate() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
getBucket() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Returns the bucket name associated with this GCS path, or an empty string if this is a relative path component.
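A tiny hedged example; the URI is illustrative.
  GcsPath path = GcsPath.fromUri("gs://my-bucket/logs/2024/01/events.json");
  String bucket = path.getBucket();  // "my-bucket"
  String object = path.getObject();  // "logs/2024/01/events.json"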
getBucket(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Get the Bucket from Cloud Storage path or propagates an exception.
getBucketKeyEnabled() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getBucketKeyEnabled() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Whether to use an S3 Bucket Key for object encryption with server-side encryption using AWS KMS (SSE-KMS) or not.
getBucketKeyEnabled() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getBucketKeyEnabled() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Whether to use an S3 Bucket Key for object encryption with server-side encryption using AWS KMS (SSE-KMS) or not.
getBufferSize() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
getBuilder(S3Options) - Static method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Creates a new AmazonS3ClientBuilder as specified by s3Options.
getBuilderCreator(Class<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
Try to find an accessible builder class for creating an AutoValue class.
getBuiltinMethods() - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BeamBuiltinFunctionProvider
 
getBulkDirective() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
 
getBulkEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getBulkIO() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
Get a new bundle for processing the data in an executable stage.
getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
 
getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
 
getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
Get a new bundle for processing the data in an executable stage.
getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
 
getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
 
getBundle() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
FHIR R4 bundle resource object as a string.
getBundleSize() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getByte() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getByte(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.BYTE value by field name, IllegalStateException is thrown if schema doesn't match.
getByte(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.BYTE value by field index, ClassCastException is thrown if schema doesn't match.
getBytes(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBytes(Map<String, Object>, String, byte[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBytes() - Method in class org.apache.beam.sdk.io.range.ByteKey
Returns a newly-allocated byte[] representing this ByteKey.
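A tiny hedged example of the copy semantics described above.
  ByteKey key = ByteKey.copyFrom(new byte[] {0x01, 0x02, 0x03});
  byte[] copy = key.getBytes();  // newly allocated; mutating it does not change the ByteKey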
getBytes(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.BYTES value by field name, IllegalStateException is thrown if schema doesn't match.
getBytes(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.BYTES value by field index, ClassCastException is thrown if schema doesn't match.
getBytesPerOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns approximately how many bytes of data correspond to a single offset in this source.
getCacheTokens() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
Retrieves a list of valid cache tokens.
getCallable() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
 
getCaseEnumType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Returns the EnumerationType that is used to represent the case type.
getCaseSensitive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getCaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
Returns the enumeration that specified which OneOf field is set.
getCatalog() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getCatalogConfig() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getCatalogName() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
 
getCatalogName() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getCatalogName() - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration
 
getCatalogProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
 
getCatalogProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getCatalogProperties() - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration
 
getCause() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
 
getCellsMutatedPerColumn(String, String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
Return the total number of cells affected when the specified column is mutated.
getCellsMutatedPerRow(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
Return the total number of cells affected when the given row is deleted.
getCEPFieldRefFromParKeys(ImmutableBitSet) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
Transform the partition columns into serializable CEPFieldRef.
getCepKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
 
getCEPPatternFromPattern(Schema, RexNode, Map<String, RexNode>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
Construct a list of CEPPatterns from a RexNode.
getChangeSequenceNumber() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
The value supplied to the BigQuery _CHANGE_SEQUENCE_NUMBER pseudo-column.
getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Creates and returns a singleton DAO instance for querying a partition change stream.
getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Return the prefix used to identify the rows belonging to this job.
getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
getChannelFactory() - Method in class org.apache.beam.sdk.io.CompressedSource
 
getChannelNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
getChannelzShowOnlyWindmillServiceChannels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getCharset() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
 
getCheckpointDir() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
getCheckpointDurationMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getCheckpointingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getCheckpointingInterval() - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
 
getCheckpointingMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getCheckpointMark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCheckpointMark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCheckpointMark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns a UnboundedSource.CheckpointMark representing the progress of this UnboundedReader.
getCheckpointMarkCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.UnboundedSource
Returns a Coder for encoding and decoding the checkpoints for this source.
getCheckpointTimeoutMillis() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getCheckStopReadingFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getChildPartitions() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
List of child partitions yielded within this record.
getChildRels(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
getClass(String, String) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
 
getClasses() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a set of TypeDescriptors, one for each superclass (including this class).
getClassName() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
Gets the name of the Java class that this CloudObject represents.
getClazz() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
 
getClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
getClientBuilderFactory() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
 
getClientConfiguration() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
The client configuration instance that should be used to configure AWS service clients.
getClientFactory() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getClientInfo(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getClientInfo() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getClock() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getCloningBehavior() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
getCloseStream() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getClosingBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getClosure() - Method in class org.apache.beam.sdk.transforms.Contextful
Returns the closure.
getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.aws.dynamodb.AwsClientsProvider
Deprecated.
DynamoDBIO doesn't require a CloudWatch client
getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.aws.sns.AwsClientsProvider
Deprecated.
SnsIO doesn't require a CloudWatch client
getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
 
getClusterId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getClusterName() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
getClusterType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
Returns the type code of the column.
getCodec(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
Return an AVRO codec for a given destination.
getCodeJarPathname() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
getCoder() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
getCoder() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
getCoder(Class<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Returns the Coder to use for values of the given class.
getCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Returns the Coder to use for values of the given type.
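A minimal sketch of looking up coders through the registry, assuming a Pipeline named pipeline already exists; getCoder throws the checked CannotProvideCoderException when no coder can be inferred.
  try {
    CoderRegistry registry = pipeline.getCoderRegistry();
    Coder<String> stringCoder = registry.getCoder(String.class);
    Coder<KV<String, Long>> kvCoder =
        registry.getCoder(new TypeDescriptor<KV<String, Long>>() {});
  } catch (CannotProvideCoderException e) {
    // No coder is registered or inferable for the requested type.
  }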
getCoder(TypeDescriptor<OutputT>, TypeDescriptor<InputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Deprecated.
This method is to change in an unknown backwards incompatible way once support for this functionality is refined.
getCoder(Class<? extends T>, Class<T>, Map<Type, ? extends Coder<?>>, TypeVariable<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Deprecated.
This method is to change in an unknown backwards incompatible way once support for this functionality is refined.
getCoder() - Method in class org.apache.beam.sdk.coders.DelegateCoder
Returns the coder used to encode/decode the intermediate values produced/consumed by the coding functions of this DelegateCoder.
getCoder() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
 
getCoder() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
 
getCoder(CoderRegistry) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
getCoder(CoderRegistry) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
 
getCoder() - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.RowMapperWithCoder
 
getCoder(Pipeline) - Static method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
 
getCoder() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
The coder for the record, or null if there is no coder.
getCoder() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns a Coder suitable for IntervalWindow.
getCoder() - Method in class org.apache.beam.sdk.values.PCollection
Returns the Coder used by this PCollection to encode and decode the values stored in it.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.AtomicCoder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.Coder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.CustomCoder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.MapCoder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.SnappyCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.ZstdCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
getCoderInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The PCollection underlying a side input, including its Coder, is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
getCoderInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getCoderProvider() - Static method in class org.apache.beam.sdk.coders.SerializableCoder
Returns a CoderProvider which uses the SerializableCoder if possible for all types.
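A one-line hedged example of wiring this provider into a pipeline's coder registry.
  Pipeline pipeline = Pipeline.create();
  pipeline.getCoderRegistry().registerCoderProvider(SerializableCoder.getCoderProvider());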
getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns a CoderProvider which uses the AvroCoder if possible for all types.
getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
Returns a CoderProvider which uses the DynamicProtoCoder for proto messages.
getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a CoderProvider which uses the ProtoCoder for proto messages.
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
getCoderProvider() - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
Returns a CoderProvider which uses the WritableCoder for Hadoop writable types.
getCoderProvider() - Static method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
 
getCoderProviders() - Method in interface org.apache.beam.sdk.coders.CoderProviderRegistrar
Returns a list of coder providers which will be registered by default within each coder registry instance.
getCoderProviders() - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.aws.sns.SnsCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.hbase.HBaseCoderProviderRegistrar
 
getCoderRegistry() - Method in class org.apache.beam.sdk.Pipeline
Returns the CoderRegistry that this Pipeline uses.
getCoderTranslators() - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
 
getCoderURNs() - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderRegistrar
 
getCoGbkResultSchema() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns the CoGbkResultSchema associated with this KeyedPCollectionTuple.
getCohorts() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
Returns a list of sets of expressions that should be on the same level.
getCollations() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
getCollection() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
Returns the underlying PCollection of this TaggedKeyedPCollection.
getCollectionElementType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getColumns(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
getColumns() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
 
getCombineFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
getCombineFn() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
getComment() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
getCommitDeadline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getCommitRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getCommitted() - Method in class org.apache.beam.sdk.metrics.MetricResult
Return the value of this metric across all successfully completed parts of the pipeline.
getCommittedOrNull() - Method in class org.apache.beam.sdk.metrics.MetricResult
Return the value of this metric across all successfully completed parts of the pipeline, or null if committed metrics are not supported.
getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
Returns the commit timestamp of the read / write transaction.
getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The timestamp at which the modifications within were committed in Cloud Spanner.
getComponents() - Method in class org.apache.beam.sdk.coders.AtomicCoder
Returns the list of Coders that are components of this Coder.
getComponents() - Method in class org.apache.beam.sdk.coders.StructuredCoder
Returns the list of Coders that are components of this Coder.
getComponents(AvroGenericCoder) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
 
getComponents() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
getComponents() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Hierarchy list of component paths making up the full path, starting with the top-level child component path.
getComponents() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
getComponents() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
getComponents() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getComponents() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
getComponentType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the component type if this type is an array type, otherwise returns null.
getCompression() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
Returns the method with which this file will be decompressed in FileIO.ReadableFile.open().
getCompression() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
See Compression for expected values.
getCompression() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getCompressionCodecName() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
 
getComputeNumShards() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getConfig() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
 
getConfigProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
 
getConfigProperties() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getConfigProperties() - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
 
getConfigurationMap() - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Configuration Map Getter.
getConfigurationRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
 
getConfigurationRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
 
getConfiguredLoggerFromOptions(SdkHarnessOptions) - Static method in interface org.apache.beam.sdk.options.SdkHarnessOptions
Configure log manager's default log level and log level overrides from the sdk harness options, and return the list of configured loggers.
getConfluentSchemaRegistrySubject() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getConfluentSchemaRegistryUrl() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getConnection(InfluxDbIO.DataSourceConfiguration, boolean) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
 
getConnectionInitSql() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getConnectionInitSql() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getConnectionProperties() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getConnectionProperties() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getConnector() - Method in enum org.apache.beam.io.debezium.Connectors
Class connector to debezium.
getConnectStringPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
getConnectTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getConstructorCreator(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
Try to find an accessible constructor for creating an AutoValue class.
getConstructorCreator(TypeDescriptor<?>, Constructor, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
getConstructorCreator(TypeDescriptor<T>, Constructor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
getConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getConsumerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getConsumerPollingTimeout() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getContainerImageBaseRepository() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the version/tag for constructing the container image path.
getContent() - Method in class org.apache.beam.sdk.io.tika.ParseResult
Returns the extracted text.
getContentEncoding() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getContentType() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
The content type for the created file, e.g. "text/plain".
getContentType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
getContext() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets the context of a plugin.
getContiguousSequenceRangeReevaluationFrequency() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
How frequently the combiner should reevaluate the maximum range. This parameter only affects the behaviour of streaming pipelines.
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
Return a trigger to use after a GroupByKey to preserve the intention of this trigger.
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
Subclasses should override this to return the Trigger.getContinuationTrigger() of this Trigger.
getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
 
getConversionOptions(ObjectNode) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
 
getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroSchemaInformationProvider
 
getConvertedSchemaInformation(Schema, TypeDescriptor<T>, SchemaRegistry) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
Get the coder used for converting from an inputSchema to a given type.
getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.utils.RowSchemaInformationProvider
 
getConvertedSchemaInformation(Schema, TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.utils.SchemaInformationProvider
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
 
getConvertedType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
 
getConvertPrimitive(Schema.FieldType, TypeDescriptor<?>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
Returns a function to convert a Row into a primitive type.
getCorrelationId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getCosmosClientBuilder() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
 
getCosmosKey() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
The Azure Cosmos key used to perform authentication for accessing resources.
getCosmosServiceEndpoint() - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
The Azure Cosmos service endpoint used by the Cosmos client.
getCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
getCount() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
getCount() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getCountBackoffs() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count invocations of BackOff.nextBackOffMillis().
getCountCacheReadFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count Cache read failures.
getCountCacheReadNonNulls() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count associated non-null values resulting from Cache reads.
getCountCacheReadNulls() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count associated null values resulting from Cache reads.
getCountCacheReadRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count number of attempts to read from the Cache.
getCountCacheWriteFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count Cache write failures.
getCountCacheWriteRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count number of attempts to write to the Cache.
getCountCacheWriteSuccesses() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count Cache write successes.
getCountCalls() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count invocations of Caller.call(RequestT).
getCountEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getCounter(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
 
getCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Counter that should be used for implementing the given metricName in this container.
getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
 
getCounters() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
Return the metric results for the counters that matched the filter.
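
A short sketch of querying the matched counters from a finished pipeline; the namespace and counter name ("my_namespace", "my_counter") are placeholder values.

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    PipelineResult result = pipeline.run();
    result.waitUntilFinish();

    MetricQueryResults query =
        result.metrics().queryMetrics(
            MetricsFilter.builder()
                .addNameFilter(MetricNameFilter.named("my_namespace", "my_counter"))
                .build());

    // getCounters() returns only the counters that matched the filter.
    for (MetricResult<Long> counter : query.getCounters()) {
      System.out.println(counter.getName() + " = " + counter.getAttempted());
    }
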
getCountFailures() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count failures resulting from Call's successful Caller invocation.
getCountRequests() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count incoming request elements processed by Call's DoFn.
getCountResponses() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count outgoing responses resulting from Call's successful Caller invocation.
getCountryOfResidence() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
getCountSetup() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count invocations of SetupTeardown.setup().
getCountShouldBackoff() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count the times CallShouldBackoff.isTrue() evaluates to true.
getCountSleeps() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count invocations of Sleeper.sleep(long).
getCountTeardown() - Method in class org.apache.beam.io.requestresponse.Monitoring
Count invocations of SetupTeardown.teardown().
getCpu() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
getCpuRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
getCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which this partition was first detected and created in the metadata table.
getCreatedAtIndexName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
 
getCreateDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
 
getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
 
getCreateFromSnapshot() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
If set, the snapshot from which the job should be created.
getCreateTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets create time.
getCreator(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Get an object creator for an AVRO-generated SpecificRecord.
getCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.auth.CredentialFactory
 
getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
Returns default GCP Credentials, or null if they cannot be obtained.
getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
getCredential() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsUserCredentialFactory
Returns Credentials as configured by GoogleAdsOptions.
getCredentialFactoryClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
The class of the credential factory that should be created and used to create credentials.
getCredentials() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getCrossProduct() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
 
getCsvConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
getCsvFormat() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
 
getCsvRecord() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
The CSV record associated with the caught Exception.
getCurrent() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCurrent() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
getCurrent() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Gets the current record from the delegate reader.
getCurrent() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCurrent() - Method in class org.apache.beam.sdk.io.Source.Reader
Returns the value of the data item that was read by the last Source.Reader.start() or Source.Reader.advance() call.
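
A minimal sketch of the reader contract around getCurrent(), assuming a hypothetical BoundedSource<String> named source and existing PipelineOptions options; getCurrent() is only valid after a successful start() or advance().

    import org.apache.beam.sdk.io.BoundedSource;
    import org.joda.time.Instant;

    try (BoundedSource.BoundedReader<String> reader = source.createReader(options)) {
      for (boolean more = reader.start(); more; more = reader.advance()) {
        String element = reader.getCurrent();              // item read by the last start()/advance()
        Instant timestamp = reader.getCurrentTimestamp();  // timestamp associated with that item
        // process the element...
      }
    }
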
getCurrentBlock() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
 
getCurrentBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns the current block (the block that was read by the last successful call to BlockBasedSource.BlockBasedReader.readNextBlock()).
getCurrentBlockOffset() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
 
getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns the largest offset such that starting to read from that offset includes the current block.
getCurrentBlockSize() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
 
getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns the size of the current block in bytes as it is represented in the underlying file, if possible.
getCurrentBundle() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getCurrentBundleTimestamp() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getCurrentContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Return the MetricsContainer for the current thread.
getCurrentDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
getCurrentDirectory() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
 
getCurrentDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns the ResourceId that represents the current directory of this ResourceId.
getCurrentOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
getCurrentOffset() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
getCurrentOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Returns the starting offset of the current record, which has been read by the last successful Source.Reader.start() or Source.Reader.advance() call.
getCurrentParent() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Gets the parent composite transform to the current transform, if one exists.
getCurrentRateLimit() - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
getCurrentRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
Returns the current record.
getCurrentRecordId() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns a unique identifier for the current record.
getCurrentRelativeTime() - Method in interface org.apache.beam.sdk.state.Timer
Returns the current relative time used by Timer.setRelative() and Timer.offset(org.joda.time.Duration).
getCurrentRowAsStruct() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Returns the record at the current pointer as a Struct.
getCurrentSchemaPlus() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
Calcite-created SchemaPlus wrapper for the current schema.
getCurrentSource() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCurrentSource() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns a Source describing the same input that this Reader currently reads (including items already read).
getCurrentSource() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.Source.Reader
Returns a Source describing the same input that this Reader currently reads (including items already read).
getCurrentSource() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns the UnboundedSource that created this reader.
getCurrentTimestamp() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
By default, returns the minimum possible timestamp.
getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.Source.Reader
Returns the timestamp associated with the current data item.
getCurrentToken() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getCurrentTransform() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
getCurrentTransform() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getCurrentTransform() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getCursor() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
 
getCustomBeamRequirement() - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
 
getCustomerId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
getCustomerProvidedKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getCustomError(HttpRequestWrapper, HttpResponseWrapper) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors
 
getCustomError() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
 
getDanglingDataSets() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
getData() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets data.
getData(Row) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
 
getData() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getDataAsBytes() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getDataAsBytes() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getDatabase() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the Snowflake database.
getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getDatabase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getDatabaseAdminClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getDatabaseClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getDatabaseRole() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getDataBoostEnabled() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getDataCatalogEndpoint() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
DataCatalog endpoint.
getDataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
getDataCatalogSegments() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
Returns the data catalog segments.
getDataClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
getDataCoder() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
getDataflowClient() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
An instance of the Dataflow client.
getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Dataflow endpoint to use.
getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Dataflow endpoint to use.
getDataflowJobFile() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The path to write the translated Dataflow job specification out to at job submission time.
getDataflowKmsKey() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
GCP Cloud KMS key for Dataflow pipelines and buckets created by GcpTempLocationFactory.
getDataflowOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
getDataflowRunnerInfo() - Static method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Returns an instance of DataflowRunnerInfo.
getDataflowServiceOptions() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Service options are set by the user and configure the service.
getDataflowWorkerJar() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
getDataResource() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getDataSchema() - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
 
getDataset(PCollection<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
getDataset(PCollection<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getDataset(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getDataset(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getDataset(String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getDataset(String, String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Gets the specified Dataset resource by dataset ID.
getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Gets the specified Dataset resource by dataset ID.
getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getDataSetOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
getDatasetService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.DatasetService.
getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
getDataSource() - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
 
getDataSource() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
 
getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
 
getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getDataSourceProviderFn() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the DataSource provider function for connection credentials.
getDataStreamOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
getDataType() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
getDateTime() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getDateTime(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.DATETIME value by field name; an IllegalStateException is thrown if the schema doesn't match.
getDateTime(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.DATETIME value by field index; an IllegalStateException is thrown if the schema doesn't match.
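
A short sketch of the Row accessors by field name and by field index, using a hypothetical two-field schema.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;
    import org.joda.time.DateTime;
    import org.joda.time.ReadableDateTime;

    Schema schema =
        Schema.builder()
            .addDateTimeField("created_at")
            .addDoubleField("amount")
            .build();

    Row row =
        Row.withSchema(schema)
            .addValues(DateTime.now(), 12.5)
            .build();

    ReadableDateTime createdAt = row.getDateTime("created_at");  // lookup by field name
    double amount = row.getDouble(1);                            // lookup by field index
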
getDatumFactory() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns the datum factory used for encoding/decoding.
getDatumWriterFactory(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
Return an AvroSink.DatumWriterFactory for a given destination.
getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getDbSize() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
 
getDebeziumConnectionProperties() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
getDecimal() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getDecimal(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.DECIMAL value by field name; an IllegalStateException is thrown if the schema doesn't match.
getDecimal(int) - Method in class org.apache.beam.sdk.values.Row
Get a BigDecimal value by field index; a ClassCastException is thrown if the schema doesn't match.
getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
 
getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
 
getDefault() - Static method in class org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter
 
getDefaultCoder(TypeDescriptor<?>, CoderRegistry) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Returns the default coder for a given type descriptor.
getDefaultDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Returns the default destination.
getDefaultEnvironmentConfig() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getDefaultEnvironmentType() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getDefaultHeaders() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getDefaultJobName() - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
 
getDefaultOutputCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Source
Deprecated.
Override Source.getOutputCoder() instead.
getDefaultOutputCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
getDefaultOutputCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
getDefaultOutputCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the Coder to use by default for output OutputT values, or null if it is not able to be inferred.
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.transforms.PTransform
Deprecated.
Instead, the PTransform should explicitly call PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>) on the returned PCollection.
getDefaultOutputCoder(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
Deprecated.
Instead, the PTransform should explicitly call PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>) on the returned PCollection.
getDefaultOutputCoder(InputT, PCollection<T>) - Method in class org.apache.beam.sdk.transforms.PTransform
Deprecated.
Instead, the PTransform should explicitly call PCollection.setCoder(org.apache.beam.sdk.coders.Coder<T>) on the returned PCollection.
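
Since the getDefaultOutputCoder overloads above are deprecated, a sketch (using a hypothetical FormatAsText transform) of setting the output coder explicitly inside expand() instead.

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    class FormatAsText extends PTransform<PCollection<Integer>, PCollection<String>> {
      @Override
      public PCollection<String> expand(PCollection<Integer> input) {
        return input
            .apply(MapElements.into(TypeDescriptors.strings()).via(i -> "value=" + i))
            // Set the coder explicitly rather than overriding getDefaultOutputCoder.
            .setCoder(StringUtf8Coder.of());
      }
    }
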
getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
getDefaultOverrides(boolean) - Static method in class org.apache.beam.runners.spark.SparkTransformOverrides
 
getDefaultPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
getDefaultSdkHarnessLogLevel() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
This option controls the default log level of all loggers without a log level override.
getDefaultTimezone() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
getDefaultValue() - Method in interface org.apache.beam.sdk.values.PCollectionViews.HasDefaultValue
 
getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
Returns the default value that was specified.
getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
Returns the default value that was specified.
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Return a WindowMappingFn that returns the earliest window that contains the end of the main-input window.
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns the default WindowMappingFn to use to map main input windows to side input windows.
getDefaultWorkerLogLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
This option controls the default log level of all loggers without a log level override.
getDeidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getDeidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getDelay() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
 
getDelimiter() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
 
getDelimiters() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getDeliveryMode() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getDeliveryMode() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getDependencies() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceConfig
 
getDependencies(RunnerApi.FunctionSpec, PipelineOptions) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
 
getDependencies(ConfigT, PipelineOptions) - Method in interface org.apache.beam.sdk.transforms.ExternalTransformBuilder
List the dependencies needed for this transform.
getDescription() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
If the field has a description, returns the description for the field.
getDescription() - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns the field's description.
getDescription() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
The description of what was being attempted when the failure occurred.
getDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
getDescriptorFromTableSchema(TableSchema, boolean, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
getDescriptorFromTableSchema(TableSchema, boolean, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
Given a BigQuery TableSchema, returns a protocol-buffer Descriptor that can be used to write data using the BigQuery Storage API.
getDeserializer(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
getDeserializer(Map<String, ?>, boolean) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
 
getDesiredNumUnboundedSourceSplits() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The desired number of initial splits for UnboundedSources.
getDestination() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
Staged target for this file.
getDestination(String, String) - Method in interface org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestinationProvider
 
getDestination(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Returns an object that represents at a high level the destination being written to.
getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Return the user destination object for this writer.
getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns an object that represents at a high level which table is being written to.
getDestination(ValueInSingleWindow<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
getDestination() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the destination (topic or queue) to which the message was sent.
getDestinationCoder() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Returns the coder for DestinationT.
getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the coder for DestinationT.
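
A sketch of a BigQueryIO DynamicDestinations implementation; MyEvent, its getCountry() accessor, and the project/dataset names are hypothetical. getDestinationCoder() can additionally be overridden when the coder for the destination type cannot be inferred.

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations;
    import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
    import org.apache.beam.sdk.values.ValueInSingleWindow;

    class PerCountryDestinations extends DynamicDestinations<MyEvent, String> {
      @Override
      public String getDestination(ValueInSingleWindow<MyEvent> element) {
        // The destination key; elements with the same key are written to the same table.
        return element.getValue().getCountry();
      }

      @Override
      public TableDestination getTable(String country) {
        return new TableDestination(
            "my-project:my_dataset.events_" + country, "events for " + country);
      }

      @Override
      public TableSchema getSchema(String country) {
        return new TableSchema()
            .setFields(Arrays.asList(
                new TableFieldSchema().setName("country").setType("STRING"),
                new TableFieldSchema().setName("count").setType("INT64")));
      }
    }
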
getDestinationFile(boolean, FileBasedSink.DynamicDestinations<?, DestinationT, ?>, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getDestinationFn() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getDiagnostics() - Method in exception org.apache.beam.sdk.schemas.transforms.providers.StringCompiler.CompileException
 
getDictionary(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getDir() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
 
getDirectoryTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getDisableAutoCommit() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getDisableMetrics() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getDiskSizeGb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Remote worker disk size, in gigabytes, or 0 to use the default size.
getDistribution(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
 
getDistribution() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
getDistribution(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Distribution that should be used for implementing the given metricName in this container.
getDistributions() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
Return the metric results for the distributions that matched the filter.
getDlqTransform(String) - Static method in class org.apache.beam.sdk.schemas.io.GenericDlq
 
getDocToBulk() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
getDocumentCount() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
getDoFnRunner(PipelineOptions, DoFn<InputT, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
 
getDoFnRunner(PipelineOptions, DoFn<KV<?, ?>, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<KV<?, ?>>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
 
getDoFnSchemaInformation(DoFn<?, ?>, PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.ParDo
Extract information on how the DoFn uses schemas.
getDouble() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getDouble(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.DOUBLE value by field name; an IllegalStateException is thrown if the schema doesn't match.
getDouble(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.DOUBLE value by field index; a ClassCastException is thrown if the schema doesn't match.
getDriverClassName() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getDriverClassName() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getDriverJars() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getDriverJars() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getDrop() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getDrop() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getDrop() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
 
getDumpHeapOnOOM() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
If true, save a heap dump before killing a thread or process which is GC thrashing or out of memory.
getDuplicateCount() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getDynamicDestinations() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSink
 
getDynamicDestinations() - Method in class org.apache.beam.sdk.io.FileBasedSink
 
getEarliestBufferedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getEarliestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets the earliest HL7v2 send time.
getEarliestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getEarlyTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getElasticsearchHttpPort() - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
 
getElasticsearchServer() - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
 
getElemCoder() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
getElement() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
 
getElementByteSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
getElementCoders() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
getElementConverters() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
The schema of the @Element parameter.
getElementCount() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
The number of elements after which this trigger may fire.
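
A two-line sketch of the accessor; the threshold of 100 is an arbitrary example value.

    import org.apache.beam.sdk.transforms.windowing.AfterPane;

    AfterPane trigger = AfterPane.elementCountAtLeast(100);
    int threshold = trigger.getElementCount();  // 100
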
getElements() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
 
getElements() - Method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
 
getElements() - Method in class org.apache.beam.sdk.transforms.Create.Values
 
getElementType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
If the field is a container type, returns the element type.
getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getEmulatorHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
A host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
getEmulatorHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getEnableBucketReadMetricCounter() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
If true, reports the number of bytes read from each GCS bucket.
getEnableBucketWriteMetricCounter() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
If true, reports the number of bytes written to each GCS bucket.
getEnableHeapDumps() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
If true and the PipelineOption tempLocation is set, save a heap dump before shutting down the JVM due to GC thrashing or running out of memory.
getEnableSparkMetricSinks() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
getEnableStableInputDrain() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getEnableStorageReadApiV2() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getEnableWebUI() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
getEncodedElementByteSize(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
getEncodedElementByteSize(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
getEncodedElementByteSize(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
getEncodedElementByteSize(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.Coder
Returns the size in bytes of the encoded value using this coder.
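
A minimal sketch of a custom coder that overrides the protected getEncodedElementByteSize hook so byte-size observation does not require actually encoding the element; FixedWidthLongCoder is a hypothetical name.

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.beam.sdk.coders.AtomicCoder;

    public class FixedWidthLongCoder extends AtomicCoder<Long> {
      @Override
      public void encode(Long value, OutputStream outStream) throws IOException {
        new DataOutputStream(outStream).writeLong(value);
      }

      @Override
      public Long decode(InputStream inStream) throws IOException {
        return new DataInputStream(inStream).readLong();
      }

      @Override
      protected long getEncodedElementByteSize(Long value) {
        return 8;  // every value encodes to exactly eight bytes
      }
    }
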
getEncodedElementByteSize(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
getEncodedElementByteSize(String) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
getEncodedElementByteSize(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
getEncodedElementByteSize(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
getEncodedElementByteSize(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
getEncodedElementByteSize(BigQueryInsertError) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
getEncodedElementByteSize(OffsetRange) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
getEncodedRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
Nullable to account for failing to encode, or if there is no coder for the record at the time of failure.
getEncodedTypeDescriptor() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.Coder
Returns the TypeDescriptor for the type encoded.
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.CollectionCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DequeCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DurationCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.FloatCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.InstantCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.IterableCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ListCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.MapCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SetCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VoidCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getEncodedWindow() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
 
getEncodingPositions() - Method in class org.apache.beam.sdk.schemas.Schema
Gets the encoding positions for this schema.
getEnd() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
 
getEnd() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
 
getEnd() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
 
getEnd() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
 
getEndAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getEndKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns the ByteKey representing the upper bound of this ByteKeyRange.
getEndOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the specified ending offset of the source.
getEndOffset() - Method in interface org.apache.beam.sdk.io.sparkreceiver.HasOffset
 
getEndpoint() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
Endpoint used to configure AWS service clients.
getEndTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The end time for querying this given partition.
getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
getEnvironment() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
Return the environment that the remote handles.
getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
 
getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
 
getEnvironment() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getEnvironmentCacheMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getEnvironmentExpirationMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getEnvironmentId() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getEnvironmentOption(PortablePipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.PortablePipelineOptions
Return the value for the specified environment option or empty string if not present.
getEnvironmentOptions() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getEquivalentFieldType(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
Returns the Beam equivalent of a ClickHouse column type.
getEquivalentSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
Returns the Beam equivalent of a ClickHouse schema.
getError() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
getError() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
The error details if the message could not be published.
getError() - Method in class org.apache.beam.sdk.io.tika.ParseResult
Returns the parse error, if the file was parsed unsuccessfully.
getError() - Method in class org.apache.beam.sdk.schemas.io.Failure
Information about the cause of the failure.
getErrorAsString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
Same as ParseResult.getError(), but returns the complete stack trace of the error as a String.
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
 
getErrorHandling() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
 
getErrorInfo(IOException) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getErrorRowSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
getErrors() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
The PCollection of CsvIOParseError elements resulting from errors encountered while parsing CSV records.
getEstimatedLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
An estimate of the total size (in bytes) of the data that would be read from this source.
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
getEvent() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
 
getEventCoder(Pipeline, Coder<KV<KeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Provide the event coder.
getEventExaminer() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
 
getEvents() - Method in class org.apache.beam.sdk.testing.TestStream
Returns the sequence of Events in this TestStream.
getEx() - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
 
getException() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
The exception itself.
getExceptionStacktrace() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure
The full stacktrace.
getExecutables() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
getExecutableStageIntermediateId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
 
getExecuteStreamingSqlRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
getExecutionModeForBatch() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getExecutionRetryDelay() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getExecutorService() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
getExpansionPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
getExpansionServiceConfig() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
getExpansionServiceConfigFile() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
getExpectedAssertions() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
getExpectFileToNotExist() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
If true, the created file is expected to not exist.
getExperiments() - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
 
getExperimentValue(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
Return the value for the specified experiment or null if not present.
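    A hedged sketch of looking up an experiment value from pipeline options; the experiment name state_cache_size is only illustrative.

        import org.apache.beam.sdk.options.ExperimentalOptions;
        import org.apache.beam.sdk.options.PipelineOptions;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class ExperimentValueExample {
          public static void main(String[] args) {
            PipelineOptions options =
                PipelineOptionsFactory.fromArgs("--experiments=state_cache_size=100").create();

            // Returns the value after "=" for the named experiment, or null if it is not set.
            String value = ExperimentalOptions.getExperimentValue(options, "state_cache_size");
            System.out.println(value);
          }
        }
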
getExpiration() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getExpiration() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the message expiration time in milliseconds since the Unix epoch.
getExplanation() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
 
getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
Required hash value (128-bit integer) to determine explicitly the shard a record is assigned to based on the hash key range of each shard.
getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
Optional hash value (128-bit integer) to determine explicitly the shard a record is assigned to based on the hash key range of each shard.
getExplicitHashKey(byte[]) - Method in interface org.apache.beam.sdk.io.kinesis.KinesisPartitioner
 
getExpression(SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getExpression() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
 
getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getExtensionHosts() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
getExtensionRegistry() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns the ExtensionRegistry listing all known Protocol Buffers extension messages to T registered with this ProtoCoder.
getExternalSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Returns the external sorter type.
getExtraInteger() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
 
getExtraString() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config
 
getFactory() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForGetter
 
getFactory() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.AvroConvertValueForSetter
 
getFactory(AwsOptions) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
Gets failed bodies with err.
getFailedBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
Gets failed FhirBundleResponse wrapped inside HealthcareIOError.
getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
Gets failed file imports with err.
getFailedInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the TableRows that didn't make it to BQ.
getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the BigQueryInsertErrors with detailed error information.
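    A hedged sketch of capturing failed streaming inserts; the table name and the input PCollection of TableRows are assumptions for illustration.

        import com.google.api.services.bigquery.model.TableRow;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError;
        import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;
        import org.apache.beam.sdk.io.gcp.bigquery.WriteResult;
        import org.apache.beam.sdk.values.PCollection;

        public class FailedInsertsExample {
          // Returns the rows, with error details, that could not be streamed into BigQuery.
          static PCollection<BigQueryInsertError> failedRows(PCollection<TableRow> tableRows) {
            WriteResult result =
                tableRows.apply(
                    BigQueryIO.writeTableRows()
                        .to("my-project:my_dataset.my_table") // hypothetical table
                        .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
                        .withExtendedErrorInfo() // needed before calling getFailedInsertsWithErr()
                        .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));
            return result.getFailedInsertsWithErr();
          }
        }
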
getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
getFailedLatencyMetric() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getFailedMessages() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
 
getFailedMutations() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
Gets failed reads.
getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
Gets failed reads.
getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
Gets failed reads.
getFailedRowsTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
getFailedRowsTupleTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
getFailedSearches() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
Gets failed searches.
getFailedStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Return any rows that persistently fail to insert when using a storage-api method.
getFailedToParseLines() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
Returns a PCollection containing the Rows that didn't parse.
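    A hedged sketch of JSON parsing with error output; the schema and the input PCollection of JSON strings are illustrative.

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.JsonToRow;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class JsonParseWithErrors {
          static final Schema SCHEMA =
              Schema.builder().addStringField("name").addInt64Field("count").build();

          // Returns the lines that failed to parse; getResults() would return the good rows.
          static PCollection<Row> failedLines(PCollection<String> jsonStrings) {
            JsonToRow.ParseResult parsed =
                jsonStrings.apply(JsonToRow.withExceptionReporting(SCHEMA));
            return parsed.getFailedToParseLines();
          }
        }
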
getFailOnCheckpointingErrors() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getFailsafeTableRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getFailsafeTableRowPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getFailsafeValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
Returns the failsafe value of this FailsafeValueInSingleWindow.
getFailure() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
Information about why the record failed.
getFailureCollector() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getFailures() - Method in class org.apache.beam.io.requestresponse.Result
 
getFanout() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
getFasterCopy() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getFetchSize() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getFhirBundleParameter() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
getFhirStore() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
 
getField() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
 
getField() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.KeyPart
 
getField() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
getField(int) - Method in class org.apache.beam.sdk.schemas.Schema
Return a field by index.
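    A small sketch of looking up a field by index, together with the name-based overload listed below; the field names are arbitrary.

        import org.apache.beam.sdk.schemas.Schema;

        public class SchemaFieldLookup {
          public static void main(String[] args) {
            Schema schema =
                Schema.builder().addStringField("name").addInt32Field("age").build();

            Schema.Field byIndex = schema.getField(0);   // the "name" field
            Schema.Field byName = schema.getField("age");
            System.out.println(byIndex.getName() + " / " + byName.getType());
          }
        }
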
getField(String) - Method in class org.apache.beam.sdk.schemas.Schema
 
getFieldAccessDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
Effective FieldAccessDescriptor applied by DoFn.
getFieldCount() - Method in class org.apache.beam.sdk.schemas.Schema
Return the count of fields.
getFieldCount() - Method in class org.apache.beam.sdk.values.Row
Return the size of data fields.
getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithStorage
 
getFieldDescription(T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
getFieldId() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
 
getFieldName() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
 
getFieldNames() - Method in class org.apache.beam.sdk.schemas.Schema
Return the list of all field names.
getFieldOptionById(int) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
getFieldRef(CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
A function that finds a pattern reference recursively.
getFieldRename() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
 
getFields() - Method in class org.apache.beam.sdk.schemas.Schema
 
getFields() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration
 
getFields() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
 
getFields(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
getFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
getFieldType(Schema, CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
 
getFieldType(OneOfType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
getFieldTypes(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Get field types for an AVRO-generated SpecificRecord or a POJO.
getFieldTypes(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
getFieldTypes(TypeDescriptor<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
getFileDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getFileFormat() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
File format for created files.
getFileInputSplitMaxSizeMB() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getFileLocation() - Method in class org.apache.beam.sdk.io.tika.ParseResult
Returns the absolute path to the input file.
getFilename() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
getFileName() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
getFilename() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
 
getFilename() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
The filename associated with the caught Exception.
getFilename(BoundedWindow, PaneInfo, int, int, Compression) - Method in interface org.apache.beam.sdk.io.FileIO.Write.FileNaming
Generates the filename.
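    A hedged sketch of a FileNaming implementation to pass to FileIO.write().withNaming(...); the name pattern is arbitrary.

        import org.apache.beam.sdk.io.FileIO;

        public class CustomFileNaming {
          // Builds names like "out-00003-of-00010.gz" from the shard info and compression.
          static final FileIO.Write.FileNaming NAMING =
              (window, pane, numShards, shardIndex, compression) ->
                  String.format(
                      "out-%05d-of-%05d%s", shardIndex, numShards, compression.getSuggestedSuffix());
        }
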
getFilename() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns the name of the file or directory denoted by this ResourceId.
getFilenamePolicy(DestinationT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Converts a destination into a FileBasedSink.FilenamePolicy.
getFilenamePrefix() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
getFilenameSuffix() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
getFileOrPatternSpec() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getFileOrPatternSpecProvider() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getFilePattern() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
getFilepattern() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
The filepattern used to match and read files.
getFilePattern() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the list of names of staged files.
getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
Getter for the list of staged files which will be loaded to Snowflake.
getFilesToStage() - Method in interface org.apache.beam.sdk.options.FileStagingOptions
List of local files to make available to workers.
getFileSystem() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
getFilter() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getFilterFormatFunction(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
getFinishBundleBeforeCheckpointing() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getFinishedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which the connector finished processing this partition.
getFirestoreDb() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
The Firestore database ID to connect to.
getFirestoreHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
A host port pair to allow connecting to a Cloud Firestore instead of the default live service.
getFirestoreProject() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
The Firestore project ID to connect to.
getFlatJsonRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
Loads rows from BigQuery into Rows with the given Schema.
getFlexRSGoal() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
This option controls Flexible Resource Scheduling mode.
getFlinkConfDir() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getFlinkMaster() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
The url of the Flink JobManager on which to execute pipelines.
getFloat() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getFloat(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.FLOAT value by field name, IllegalStateException is thrown if schema doesn't match.
getFloat(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.FLOAT value by field index, ClassCastException is thrown if schema doesn't match.
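    A small sketch of both lookups on a single-field row; the field name is arbitrary.

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.Row;

        public class RowFloatExample {
          public static void main(String[] args) {
            Schema schema = Schema.builder().addFloatField("price").build();
            Row row = Row.withSchema(schema).addValues(1.5f).build();

            // Both calls return the same value; the field must really be a FLOAT.
            float byName = row.getFloat("price");
            float byIndex = row.getFloat(0);
            System.out.println(byName + " " + byIndex);
          }
        }
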
getFn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
getFn() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
getFn() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
getFn() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
getFn() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
getFnApiDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the version/tag for dev SDK FnAPI container image.
getFnApiEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the FnAPI environment's major version number.
getForceUnalignedCheckpointEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
The format of the file(s) to read.
getFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getFormatClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets InputFormat or OutputFormat class for a plugin.
getFormatClass() - Method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Format
 
getFormatClass() - Method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
 
getFormatName() - Method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Format
 
getFormatProviderClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets InputFormatProvider or OutputFormatProvider class for a plugin.
getFormatProviderClass() - Method in enum org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
 
getFormatProviderName() - Method in enum org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
 
getFqName(String, String, Iterable<String>) - Static method in class org.apache.beam.sdk.metrics.Lineage
Assemble fully qualified name (FQN).
getFqName(String, Iterable<String>) - Static method in class org.apache.beam.sdk.metrics.Lineage
Assemble the FQN from the given system and segments.
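    A hedged sketch, assuming the assembled FQN is returned as a String; the system and segment values are hypothetical.

        import java.util.Arrays;
        import org.apache.beam.sdk.metrics.Lineage;

        public class LineageFqnExample {
          public static void main(String[] args) {
            // System plus path segments, e.g. a BigQuery table reference.
            String fqn =
                Lineage.getFqName("bigquery", Arrays.asList("my-project", "my_dataset", "my_table"));
            System.out.println(fqn);
          }
        }
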
getFractionConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
getFractionConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns a value in [0, 1] representing approximately what fraction of the current source this reader has read so far, or null if such an estimate is not available.
getFractionConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
getFractionConsumed() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Returns the approximate fraction of positions in the source that have been consumed by successful RangeTracker.tryReturnRecordAt(boolean, PositionT) calls, or 0.0 if no such calls have happened.
getFractionOfBlockConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
Returns the fraction of the block already consumed, if possible, as a value in [0, 1].
getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
Returns the estimated throughput bytes for a specified time.
getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
Always returns 0.
getFrom(Timestamp) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
Returns the estimated throughput for a specified time.
getFrom() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
Returns the range start timestamp (inclusive).
getFrom() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
getFromRowFunction(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
 
getFromRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
 
getFromRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
Returns the fromRow conversion function.
getFromRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve the function that converts a Row object to the specified type.
getFromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve the function that converts a Row object to the specified type.
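    A hedged sketch; the Purchase POJO is hypothetical and relies on schema inference from its public fields.

        import org.apache.beam.sdk.schemas.JavaFieldSchema;
        import org.apache.beam.sdk.schemas.NoSuchSchemaException;
        import org.apache.beam.sdk.schemas.SchemaRegistry;
        import org.apache.beam.sdk.schemas.annotations.DefaultSchema;
        import org.apache.beam.sdk.transforms.SerializableFunction;
        import org.apache.beam.sdk.values.Row;

        public class FromRowExample {
          // Hypothetical POJO whose schema Beam infers from its public fields.
          @DefaultSchema(JavaFieldSchema.class)
          public static class Purchase {
            public String item;
            public long amount;
          }

          static Purchase fromRow(Row row) throws NoSuchSchemaException {
            SchemaRegistry registry = SchemaRegistry.createDefault();
            SerializableFunction<Row, Purchase> fn = registry.getFromRowFunction(Purchase.class);
            return fn.apply(row);
          }
        }
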
getFromRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
Returns the attached schema's fromRowFunction.
getFromSnapshotExclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getFromSnapshotInclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getFromSnapshotRefExclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getFromSnapshotRefInclusive() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getFullName(PTransform<?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Returns the full name of the currently being translated transform.
getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
getFunction() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
 
getFunction() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
 
getFunctionNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getFunctions(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getGapDuration() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
getGauge(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
 
getGauge(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Gauge that should be used for implementing the given metricName in this container.
getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
 
getGauges() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
Return the metric results for the gauges that matched the filter.
getGcloudCancelCommand(DataflowPipelineOptions, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
getGcpCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
The credential instance that should be used to authenticate against GCP services.
getGcpOauthScopes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
Controls the OAuth scopes that will be requested when creating Credentials with the GcpCredentialFactory (which is the default CredentialFactory).
getGcpTempLocation() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
A GCS path for storing temporary files in GCP.
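    A small sketch of setting and reading the option programmatically; the bucket path is hypothetical.

        import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class GcpTempLocationExample {
          public static void main(String[] args) {
            GcpOptions options = PipelineOptionsFactory.create().as(GcpOptions.class);
            options.setGcpTempLocation("gs://my-bucket/temp"); // hypothetical bucket
            System.out.println(options.getGcpTempLocation());
          }
        }
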
getGcsEndpoint() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
GCS endpoint to use.
getGcsHttpRequestReadTimeout() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
getGcsHttpRequestWriteTimeout() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
getGcsPerformanceMetrics() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
If true, reports metrics of certain operations, such as batch copies.
getGcsReadCounterPrefix() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
getGcsRewriteDataOpBatchLimit() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
getGcsUploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The buffer size (in bytes) to use when uploading files to GCS.
getGcsUtil() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The GcsUtil instance that should be used to communicate with Google Cloud Storage.
getGcsWriteCounterPrefix() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
getGCThrashingPercentagePerPeriod() - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
The GC thrashing threshold percentage.
getGenericRecordToRowFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
getGetOffsetFn() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets a SerializableFunction that defines how to get record offset for CDAP Plugin class.
getGetReceiverArgsFromConfigFn() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets a SerializableFunction that defines how to get constructor arguments for Receiver using PluginConfig.
getGetters(TypeDescriptor<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Get generated getters for an AVRO-generated SpecificRecord or a POJO.
getGetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
Return the list of FieldValueGetters for a Java Bean class.
getGetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
getGetters() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
getGetterTarget() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
getGlobalConfigRefreshPeriod() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getGlobalSequenceCombiner() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
Provide the global sequence combiner.
getGoogleAdsClientId() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
OAuth 2.0 Client ID identifying the application.
getGoogleAdsClientSecret() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
OAuth 2.0 Client Secret for the specified Client ID.
getGoogleAdsCredential() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
The credential instance that should be used to authenticate against the Google Ads API.
getGoogleAdsCredentialFactoryClass() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
The class of the credential factory to create credentials if none have been explicitly set.
getGoogleAdsDeveloperToken() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
Google Ads developer token for the user connecting to the Google Ads API.
getGoogleAdsEndpoint() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
Host endpoint to use for connections to the Google Ads API.
getGoogleAdsRefreshToken() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
OAuth 2.0 Refresh Token for the user connecting to the Google Ads API.
getGoogleApiTrace() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
This option enables tracing of API calls to Google services used within the Apache Beam SDK.
getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
getGroupingTableMaxSizeMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
Size (in MB) of each grouping table used to pre-combine elements.
getGson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
 
getHadoopConfiguration() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets a plugin Hadoop configuration.
getHasError() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
 
getHashCode() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
getHdfsConfiguration() - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
 
getHeaderAccessor() - Static method in class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
 
getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getHeaders() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getHeaders() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getHeartbeatMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The number of milliseconds of stream idleness after which a heartbeat record will be emitted in the change stream query.
getHighWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
getHintMaxNumWorkers() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
A hint to the QoS system for the intended max number of workers for a pipeline.
getHistogram(MetricName, HistogramData.BucketType) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Histogram that should be used for implementing the given metricName in this container.
getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
 
getHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets a Hl7v2 message by its name from a Hl7v2 store.
getHL7v2Message() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
Gets hl7v2Message.
getHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Gets HL7v2 message.
getHl7v2MessageId() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
HL7v2MessageId string.
getHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets an HL7v2 store.
getHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Gets HL7v2 store.
getHoldability() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getHost() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
getHost() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
getHost() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
Get the host that this ExpansionServer is bound to.
getHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getHttpClient() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getHttpClientConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
HttpClientConfiguration used to configure AWS service clients.
getHttpPipeline() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getHTTPReadTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getHTTPWriteTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getHumanReadableJsonRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
The failing record, encoded as JSON.
getIcebergCatalog() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getIcebergCatalog() - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration
 
getId() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
Get an id used to represent this bundle.
getId() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
Returns an id used to represent this bundle.
getId() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
 
getId() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
getId() - Method in interface org.apache.beam.sdk.fn.IdGenerator
 
getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
 
getId() - Method in class org.apache.beam.sdk.values.TupleTag
Returns the id of this TupleTag.
getId() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the id attribute.
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the id attribute.
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
 
getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
 
getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
 
getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
 
getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
getIdentifier() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
The unique identifier for this type.
getIdleShutdownTimeout() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
getImpersonateServiceAccount() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
All API requests will be made as the given service account or target service account in an impersonation delegation chain instead of the currently selected account.
getImplementor() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
 
getImplementor(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
getImplementor() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
 
getInboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getIncompatibleGlobalWindowErrorMessage() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the error message for not supported default values in Combine.globally().
getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
 
getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
 
getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexInputRef
 
getIndex() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getIndex(TupleTag<?>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the index for the given tuple tag if the tag is present in this schema, or -1 if it isn't.
getIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
The zero-based index of this trigger firing that produced this pane.
getIndexes() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexFieldAccess
 
getInferMaps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
Controls whether to use the map or row FieldType for a TableSchema field that appears to represent a map (it is an array of structs containing only key and value fields).
getInflightWaitSeconds() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
If the previous call to appendRows blocked due to flow control, returns how long the call blocked for.
getIngestManager() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
Getter for the ingest manager, which serves the API to load data in streaming mode and retrieve a report about loaded data.
getInitialBackoff() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The initial backoff duration to be used before retrying a request for the first time.
getInitializedProducer(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
getInitializedProducer(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
Returns a MessageProducer object for publishing messages to Solace.
getInitialRestriction(Map<String, String>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
getInitialRestriction(PulsarSourceDescriptor) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
getInitialRestriction(InputT) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
 
getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
getInitialWatermarkEstimatorState(InitialPipelineState) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
getInitialWatermarkEstimatorState(PartitionRecord) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
 
getInput(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
getInput() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
 
getInput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
 
getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getInputDoc() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
 
getInputFile() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
 
getinputFormatClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getinputFormatKeyClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getInputFormatProvider() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getinputFormatValueClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getInputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
 
getInputReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
getInputReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
getInputs(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Returns the input of the currently being translated transform.
getInputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getInputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getInputSchema() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getInputSchemas() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getInputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
getInputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getInputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns a TypeDescriptor capturing what is known statically about the input type of this CombineFn instance's most-derived class.
getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
Returns a TypeDescriptor capturing what is known statically about the input type of this DoFn instance's most-derived class.
getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
Returns a TypeDescriptor capturing what is known statically about the input type of this InferableFunction instance's most-derived class.
getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns the Coder of the values of the input to this transform.
getInsertBundleParallelism() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getInsertCount() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
getInsertErrors() - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
 
getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getInstance() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
getInstance(SparkSession) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
Get the MetricsAccumulator on this driver.
getInstance() - Static method in class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
 
getInstance() - Static method in class org.apache.beam.sdk.metrics.NoOpCounter
 
getInstance() - Static method in class org.apache.beam.sdk.metrics.NoOpHistogram
 
getInstance(String, String) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
 
getInstanceAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
getInstanceConfigId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the instance id being written to.
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getInstructionId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
 
getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
 
getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
Return an InstructionRequestHandler which can communicate with the environment.
getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
 
getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
 
getInt(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getInt(Map<String, Object>, String, Integer) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getInt16() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getInt16(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.INT16 value by field name, IllegalStateException is thrown if schema doesn't match.
getInt16(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.INT16 value by field index, ClassCastException is thrown if schema doesn't match.
getInt32() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getInt32(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.INT32 value by field name, IllegalStateException is thrown if schema doesn't match.
getInt32(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.INT32 value by field index, ClassCastException is thrown if schema doesn't match.
getInt64() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getInt64(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.INT64 value by field name, IllegalStateException is thrown if schema doesn't match.
getInt64(int) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.INT64 value by field index, ClassCastException is thrown if schema doesn't match.
getInterface() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
getInterfaces() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a set of TypeDescriptors, one for each interface implemented by this class.
getIntersectingPartition(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Return the overlapping parts of 2 partitions.
getIo() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
getIr() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
 
getIrOptions() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
 
getIsLocalChannelProvider() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getIsWindmillServiceDirectPathEnabled() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getIterable(String) - Method in class org.apache.beam.sdk.values.Row
Get an iterable value by field name, IllegalStateException is thrown if schema doesn't match.
getIterable(int) - Method in class org.apache.beam.sdk.values.Row
Get an iterable value by field index, IllegalStateException is thrown if schema doesn't match.
getIterableComponentType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
For an array T[] or a subclass of Iterable, return a TypeDescriptor describing T.
getJarPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
Optional Beam filesystem path to the jar containing the bytecode for this function.
getJavaClass(RelDataType) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamJavaTypeFactory
 
getJavaClassLookupAllowlist() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
getJavaClassLookupAllowlistFile() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
getJAXBClass() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
getJdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getJdbcType() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getJdbcUrl() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getJdbcUrl() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getJdkAddOpenModules() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Open modules needed for reflection that accesses JDK internals with Java 9+.
getJdkAddOpenModules() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
Open modules needed for reflection that accesses JDK internals with Java 9+.
getJetDefaultParallelism() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
getJetLocalMode() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
getJetProcessorsCooperative() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
getJetServers() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
getJfrRecordingDurationSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
getJmsCorrelationID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsDeliveryMode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsDestination() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsExpiration() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsMessageID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsPriority() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsRedelivered() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsReplyTo() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsType() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJob(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Gets the Dataflow Job with the given jobId.
getJob() - Method in exception org.apache.beam.runners.dataflow.DataflowJobException
Returns the failed job.
getJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
getJob(JobReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Gets the specified Job by the given JobReference.
getJob(JobReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
getJobCheckIntervalInSecs() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getJobEndpoint() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getJobFileZip() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getJobId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Get the id of this job.
getJobId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
The identity of the Dataflow job.
getJobId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
getJobInfo() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
 
getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
getJobLabelsMap() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getJobMessages(String, long) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
Return job messages sorted in ascending order by timestamp.
getJobMetrics(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Gets the JobMetrics with the given jobId.
getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
 
getJobMonitoringPageURL(String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
Deprecated.
This method defaults the region to "us-central1". Prefer using the overload with an explicit regionId parameter.
getJobMonitoringPageURL(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
getJobName() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
getJobs(JobApi.GetJobsRequest, StreamObserver<JobApi.GetJobsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
getJobServerConfig() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
 
getJobServerDriver() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
 
getJobServerTimeout() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getJobServerUrl() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
getJobService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.JobService.
getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
getJobType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getJoinColumns(boolean, List<Pair<RexNode, RexNode>>, int, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
 
getJsonBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
Returns a SimpleFunction mapping JSON byte[] arrays to Beam Rows.
getJsonClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getJsonFactory() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
 
getJsonFactory() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getJsonStringToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
Returns a SimpleFunction mapping JSON Strings to Beam Rows.
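For illustration, a minimal sketch of using this function to turn JSON strings into Beam Rows; the schema fields and payload below are hypothetical:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.utils.JsonUtils;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.values.Row;

    // Schema describing the expected JSON objects (hypothetical fields).
    Schema schema = Schema.builder().addStringField("name").addInt64Field("count").build();
    // Obtain the mapping function and apply it to a single JSON string.
    SimpleFunction<String, Row> jsonToRow = JsonUtils.getJsonStringToRowFunction(schema);
    Row row = jsonToRow.apply("{\"name\": \"widget\", \"count\": 3}");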
getJsonTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getJsonToRowWithErrFn() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
 
getKeep() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getKeep() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getKeep() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
 
getKey() - Method in interface org.apache.beam.runners.local.Bundle
Returns the key that was output in the most recent GroupByKey in the execution of this bundle.
getKey() - Method in class org.apache.beam.runners.local.StructuralKey
Returns the key that this StructuralKey was created from.
getKey() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
 
getKey() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
getKey() - Method in class org.apache.beam.sdk.metrics.MetricResult
 
getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
The key for the display item.
getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The key for the display item.
getKey() - Method in class org.apache.beam.sdk.values.KV
Returns the key of this KV.
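As a quick illustration (key and value below are hypothetical), getKey() and getValue() read the two halves of a KV pair:

    import org.apache.beam.sdk.values.KV;

    KV<String, Integer> kv = KV.of("user-1", 42); // hypothetical key and value
    String key = kv.getKey();      // "user-1"
    Integer value = kv.getValue(); // 42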
getKey() - Method in class org.apache.beam.sdk.values.ShardedKey
 
getKeyClass() - Method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
 
getKeyCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getKeyCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
 
getKeyCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
getKeyCoder(Pipeline, Coder<KV<KeyT, KV<Long, EventT>>>) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Provide the key coder.
getKeyCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getKeyCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getKeyCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns the Coder of the keys of the input to this transform, which is also used as the Coder of the keys of the output of this transform.
getKeyCoder() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns the key Coder for all PCollections in this KeyedPCollectionTuple.
getKeyDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getKeyedCollections() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns a list of TaggedKeyedPCollections for the PCollections contained in this KeyedPCollectionTuple.
getKeyedResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
Gets resources with input SearchParameter key.
getKeyParts(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
getKeyRange() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns the range of keys that will be read from the table.
getKeys() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
getKeySerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getKeySet() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getKeysJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
The primary keys of this specific modification.
getKeystorePassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getKeystorePath() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getKeyTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getKeyTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
 
getKind() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
getKind() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
Return the display name for this factory.
getKind() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
 
getKind() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
 
getKindString() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
getKindString() - Method in class org.apache.beam.sdk.io.Read.Bounded
 
getKindString() - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
getKindString() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns the name to use by default for this PTransform (not including the names of any enclosing PTransforms).
getKindString() - Method in class org.apache.beam.sdk.transforms.Tee
 
getKindString() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
getKindString() - Method in class org.apache.beam.sdk.values.PValueBase
Returns a String capturing the kind of this PValueBase.
getKinesisClient() - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
 
getKinesisIOConsumerArns() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions
Used to enable / disable EFO.
getKmsKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getKmsKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getKV() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the optional label for an item.
getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The optional label for an item.
getLabels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Labels that will be applied to the billing records for this job.
getLabels() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets labels.
getLanguage() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration
 
getLanguage() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration
 
getLanguageOptions() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
getLastEmitted() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
Returns the last value emitted by the reader.
getLastFieldId() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
getLastProcessedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getLastRunTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getLastWatermarkedBatchTime() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
getLatencyNanos() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
The publishing latency in nanoseconds.
getLatencyTrackingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getLatestBufferedSequence() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getLatestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets the latest HL7v2 send time.
getLatestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getLateTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
 
getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getLegacyDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the version/tag for legacy SDK FnAPI container image.
getLegacyEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the legacy environment's major version number.
getLength() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
getLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
 
getLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
 
getLevel() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
 
getLimitCountOfSortRel() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
 
getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the optional link URL for an item.
getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The optional link URL for an item.
getList() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
 
getListeners() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
getListOfMaps(Map<String, Object>, String, List<Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getLoadBalanceBundles() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getLocalJobServicePortFile() - Method in interface org.apache.beam.runners.portability.testing.TestUniversalRunner.Options
A file containing the job service port, since Gradle needs to know this filename statically to provide it in Beam testing options.
getLocalValue() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
getLocalWindmillHostport() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getLocation() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getLocation() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getLocation() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getLogicalStartTime() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getLogicalType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
getLogicalType(Class<LogicalTypeT>) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Helper function for retrieving the concrete logical type subclass.
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
 
getLogicalTypeName() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
 
getLogicalTypeValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
Returns the value of the named field as its logical type's input type.
getLogicalTypeValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
Returns the value of the field at the given index as its logical type's input type.
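A minimal sketch of reading a logical-type field back as its input type, assuming an EnumerationType field named "color" (all names below are hypothetical):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.logicaltypes.EnumerationType;
    import org.apache.beam.sdk.values.Row;

    EnumerationType colors = EnumerationType.create("RED", "GREEN", "BLUE");
    Schema schema = Schema.builder().addLogicalTypeField("color", colors).build();
    Row row = Row.withSchema(schema).addValue(colors.valueOf("GREEN")).build();
    // Read the stored value back as the logical type's input type.
    EnumerationType.Value color = row.getLogicalTypeValue("color", EnumerationType.Value.class);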
getLoginTimeout() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getLoginTimeout() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getLogLevel() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
 
getLogMdc() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
Whether to include SLF4J MDC in log entries.
getLong(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getLong(Map<String, Object>, String, Long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getLowWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
getLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getMainOutputTag() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getMainTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
The main trigger, which will continue firing until the "until" trigger fires.
getManifestListLocation() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getMap() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
 
getMap(String) - Method in class org.apache.beam.sdk.values.Row
Gets a MAP value by field name; throws IllegalStateException if the schema doesn't match.
getMap(int) - Method in class org.apache.beam.sdk.values.Row
Gets a MAP value by field index; throws IllegalStateException if the schema doesn't match.
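For illustration, a minimal sketch of reading a MAP field from a Row; the field name and contents below are hypothetical:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder()
        .addMapField("counts", FieldType.STRING, FieldType.INT64)
        .build();
    Map<String, Long> counts = new HashMap<>();
    counts.put("widgets", 3L);
    Row row = Row.withSchema(schema).addValue(counts).build();
    // Look up the map either by field name or by field index (index 0 here).
    Map<String, Long> byName = row.getMap("counts");
    Map<String, Long> byIndex = row.getMap(0);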
getMapKeyType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
If the field is a map type, returns the key type.
getMapKeyType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
getMapType(TypeDescriptor, int) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
getMapValueType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
If the field is a map type, returns the value type.
getMapValueType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
getMatcher() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
 
getMatchUpdatedFiles() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
getMaterialization() - Method in class org.apache.beam.sdk.transforms.ViewFn
Gets the materialization of this ViewFn.
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
 
getMax() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
getMaxAttempts() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The maximum number of times a request will be attempted for a complete successful result.
getMaxBufferingDuration() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
getMaxBufferingDurationMilliSec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMaxBundlesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
Maximum number of bundles outstanding from windmill before the worker stops requesting.
getMaxBundleSize() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getMaxBundleTimeMills() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getMaxBytesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
Maximum number of bytes outstanding from windmill before the worker stops requesting.
getMaxCacheMemoryUsage(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
 
getMaxCacheMemoryUsage(PipelineOptions) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions.MaxCacheMemoryUsageMb
 
getMaxCacheMemoryUsageMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
Size (in MB) for the process wide cache within the SDK harness.
getMaxCacheMemoryUsageMbClass() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
An instance of this class will be used to specify the maximum amount of memory to allocate to a cache within an SDK harness instance.
getMaxCacheMemoryUsagePercent() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
Size (in % [0 - 100]) for the process wide cache within the SDK harness.
getMaxCommitDelay() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getMaxConnectionPoolConnections() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMaxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getMaxElementCountToTriggerContinuousSequenceRangeReevaluation() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
Number of new elements to trigger the re-evaluation.
getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the actual ending offset of the current source.
getMaxInvocationHistory() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
getMaxLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
 
getMaxLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
 
getMaxNumericPrecision() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
getMaxNumericScale() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
getMaxNumRecords() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getMaxNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
The maximum number of workers to use for the workerpool.
getMaxOutputElementsPerBundle() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Returns the maximum number of elements which will be output per each bundle.
getMaxParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getMaxPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
getMaxPreviewRecords() - Method in class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
 
getMaxReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getMaxReadTimeSeconds() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getMaxRecordsPerBatch() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getMaxStackTraceDepthToReport() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getMaxStreamingBatchSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMaxStreamingRowsToBatch() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMD5() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
 
getMean() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
getMean() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Returns the configured size of the memory buffer.
getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
Returns the configured size of the memory buffer.
getMessage() - Method in class org.apache.beam.io.requestresponse.ApiIOError
The Exception message.
getMessage() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
getMessage() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
The caught Throwable.getMessage().
getMessage() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
Underlying Message.
getMessage() - Method in exception org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
 
getMessageBacklog() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
Current backlog in messages (latest offset of the partition - last processed record offset).
getMessageConverter(DestinationT, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
 
getMessageId() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
SQS message id.
getMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the messageId of the message populated by Cloud Pub/Sub.
getMessageId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
 
getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
The message id of the message that was published.
getMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the unique identifier of the message: an application-specific message identifier string.
getMessageName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getMessageName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getMessageRecord() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
 
getMessageStream(JobApi.JobMessagesRequest, StreamObserver<JobApi.JobMessagesResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
getMessageType() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns the Protocol Buffers Message type this ProtoCoder supports.
getMessageType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets message type.
getMetadata(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
Return AVRO file metadata for a given destination.
getMetaData() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getMetadata(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getMetadata(MetadataScope, MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getMetadata() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
Returns the MatchResult.Metadata of the file.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
String representing the metadata of the Bundle to be written.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
String representing the metadata of the messageId to be read.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
Gets metadata.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Returns the gathered metadata for the change stream query so far.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The connector execution metadata for this record.
getMetadata() - Method in class org.apache.beam.sdk.io.tika.ParseResult
Returns the extracted metadata.
getMetadata(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
use schema options instead.
getMetadata() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
 
getMetadataCoder() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
 
getMetadataQuery() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
getMetadataString(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
use schema options instead.
getMetadataTable() - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
Returns the name of the metadata table.
getMetadataTableAdminDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getMetadataTableDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getMetadataTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getMetaStore() - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
 
getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
 
getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
getMethod() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
getMethods(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
Returns the list of non private/protected, non-static methods in the class, caching the results.
getMethodsMap(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
getMetricLabels() - Method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
 
getMetrics() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
 
getMetrics() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getMetricsEnvironmentStateForCurrentThread() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Returns the container holder for the current thread.
getMetricsGraphiteHost() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
getMetricsGraphitePort() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
getMetricsHttpSinkUrl() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
getMetricsMapName(long) - Static method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
 
getMetricsPushPeriod() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
getMetricsSink() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
getMimeType() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Deprecated.
 
getMimeType() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
Returns the MIME type that should be used for the files that will hold the output data.
getMin() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
getMinBundleSize() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the minimum bundle size that should be used when splitting the source into sub-sources.
getMinConnectionPoolConnections() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMinCpuPlatform() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Specifies a Minimum CPU platform for VM instances.
getMinimumTimestamp() - Method in interface org.apache.beam.runners.local.Bundle
Return the minimum timestamp among elements in this bundle.
getMinPauseBetweenCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getMinReadTimeMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getMissingPartitionsFrom(List<Range.ByteStringRange>, ByteString, ByteString) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Returns the missing partitions, i.e. the gaps not covered by the given partitions, within start and end.
getMissingPartitionsFromEntireKeySpace(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Return missing partitions from the entire keyspace.
getMode() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getMode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getModeNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
getModifiableCollection() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
 
getMods() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The modifications within this record.
getModType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The type of operation that caused the modifications within this record.
getMonitoringInfos() - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the cumulative values for any metrics in this container as MonitoringInfos.
getMonthOfYear() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getMutableOutput(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
getMutationInformation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
GetMutationsFromBeamRow() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
 
getMutationType() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
 
getName() - Method in enum org.apache.beam.io.debezium.Connectors
The name of this connector class.
getName(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
getName() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
getName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
 
getName() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
getName() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
getName() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets name.
getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
The name of the column.
getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
 
getName() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
Gets the name of the destination.
getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Queue
 
getName() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Topic
 
getName() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
 
getName() - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
 
getName() - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
 
getName() - Method in interface org.apache.beam.sdk.metrics.Metric
The MetricName given to this metric.
getName() - Method in class org.apache.beam.sdk.metrics.MetricName
The name of this metric.
getName() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
If set, the metric must have this name to match this MetricNameFilter.
getName() - Method in class org.apache.beam.sdk.metrics.MetricResult
Return the name of the metric.
getName() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
 
getName() - Method in class org.apache.beam.sdk.metrics.NoOpHistogram
 
getName() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
Returns the field name.
getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
 
getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
 
getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
 
getName() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
 
getName() - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns the field name.
getName() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
 
getName() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns the transform name.
getName() - Method in class org.apache.beam.sdk.values.PCollection
Returns the name of this PCollection.
getName() - Method in interface org.apache.beam.sdk.values.PValue
Returns the name of this PValue.
getName() - Method in class org.apache.beam.sdk.values.PValueBase
Returns the name of this PValueBase.
getNameCount() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
getNameOverride(String, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
getNameOverride() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
getNameOverride() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
getNamespace() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricName
The namespace associated with this metric.
getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
The inNamespace that a metric must be in to match this MetricNameFilter.
getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
The namespace for the display item.
getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The namespace for the display item.
getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
getNeedsMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
getNeedsOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
getNestedFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
getNetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
GCE network for launching workers.
getNetworkTimeout() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getNewBigqueryClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
getNewValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
The new column values after the modification was applied.
getNextId() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
Returns a random base64-encoded 8-byte string.
getNextOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
getNextWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
 
getNodeStats(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelMetadataQuery
 
getNodeStats() - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
 
getNodeStats(RelNode, RelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata.Handler
 
getNodeStats(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
 
getNodeStats(RelNode, BeamRelMetadataQuery) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
getNonCumulativeCost(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
 
getNonNullPrefix() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
 
getNonSpeculativeIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
The zero-based index of this trigger firing among non-speculative panes.
getNonWildcardPrefix(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Returns the prefix portion of the glob that doesn't contain wildcards.
getNotSupported() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
Identifies the parts of a predicate that are not supported by the IO's push-down capabilities and therefore must be preserved in a Calc following BeamIOSourceRel.
getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
Since predicate push-down is assumed to be unsupported by default, returns the unchanged list of filters to be preserved.
getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
 
getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
 
getNullable() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
getNullableValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Workaround for autovalue code generation, which does not allow type variables to be instantiated with nullable actual parameters.
getNullFirst() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
 
getNullParams() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
 
getNum() - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
 
getNumber() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
Optionally returns the field index.
getNumber() - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
 
getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getNumberOfBufferedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getNumberOfExecutionRetries() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getNumberOfPartitionsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The total number of partitions for the given transaction.
getNumberOfReceivedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getNumberOfRecordsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The total number of data change records for the given transaction.
getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the total number of records read from the change stream so far.
getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The number of records read in the partition change stream query before reading this record.
getNumberOfWorkerHarnessThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Number of threads to use on the Dataflow worker harness.
getNumberOverride(int, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
getNumConcurrentCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getNumEntities(PipelineOptions, String, String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns the number of entities available for reading.
getNumExtractJobCalls() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getNumRows(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Returns the number of rows for a given table.
getNumSampledBytesPerFile() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getNumShards() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
getNumShards() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getNumShardsProvider() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getNumStorageWriteApiStreamAppendClients() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getNumStorageWriteApiStreams() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getNumStreamingKeys() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getNumStreams() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Number of workers to use when executing the Dataflow job.
getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
getOAuthToken() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getOauthToken() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getOauthToken() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getObject(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getObject() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Returns the object name associated with this GCS path, or an empty string if no object is specified.
getObject(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Returns the StorageObject for the given GcsPath.
getObjectMapper() - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
 
getObjectReuse() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getObjects(List<GcsPath>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Returns StorageObjectOrIOExceptions for the given GcsPaths.
getObservedTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getObservedTimestamp() - Method in class org.apache.beam.io.requestresponse.ApiIOError
The observed timestamp of the error.
getObservedTimestamp() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
The date and time when the Exception occurred.
getOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
getOffsetConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getOldValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
The old column values before the modification was applied.
getOnCreateMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
getOneOfSchema() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
Returns the schema of the underlying Row that is used to represent the union.
getOneOfTypes() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
getOneRecord(Map<String, String>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
getOnly() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getOnly() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getOnly(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
If there is a singleton value for the given tag, returns it.
getOnly(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Like CoGbkResult.getOnly(TupleTag) but using a String instead of a TupleTag.
getOnly(TupleTag<V>, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
If there is a singleton value for the given tag, returns it; otherwise returns the supplied default value.
getOnly(String, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Like CoGbkResult.getOnly(TupleTag, Object) but using a String instead of a TupleTag.
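A minimal sketch of how these accessors are typically used after a CoGroupByKey join; the tags and the input collections emails and orders below are hypothetical:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    TupleTag<String> emailsTag = new TupleTag<>();
    TupleTag<Integer> ordersTag = new TupleTag<>();
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(emailsTag, emails)   // emails: PCollection<KV<String, String>>
            .and(ordersTag, orders)                   // orders: PCollection<KV<String, Integer>>
            .apply(CoGroupByKey.create());
    // Inside a DoFn over 'joined', read a singleton value per tag,
    // supplying a default for keys missing on one side:
    //   String email = result.getOnly(emailsTag, "unknown");
    //   Integer order = result.getOnly(ordersTag, 0);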
getOnSuccessMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
getOnTimeBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getOperand0() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
 
getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
 
getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
 
getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
 
getOperands() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
 
getOperation() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
 
getOperation() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getOperationMode() - Method in class org.apache.beam.runners.twister2.BeamBatchTSetEnvironment
 
getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
 
getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
 
getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
 
getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
 
getOperatorChaining() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getOptionNames() - Method in class org.apache.beam.sdk.schemas.Schema.Options
 
getOptions() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
getOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getOptions() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getOptions() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getOptions() - Method in class org.apache.beam.sdk.Pipeline
 
getOptions() - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns the fields Schema.Options.
getOptions() - Method in class org.apache.beam.sdk.schemas.Schema
 
getOptions() - Method in class org.apache.beam.sdk.testing.TestPipeline
 
getOptionsId() - Method in interface org.apache.beam.sdk.options.PipelineOptions
Provides a process wide unique ID for this PipelineOptions object, assigned at graph construction time.
getOptionsSupplier() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
getOptionsSupplier() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getOrCreate(BigtableConfig) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
Creates a BigtableChangeStreamAccessor if one doesn't exist and stores it in the cache for faster access.
getOrCreate(SpannerConfig) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getOrCreateReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getOrCreateSession(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
Gets active SparkSession or creates one using SparkStructuredStreamingPipelineOptions.
getOrDefault(K, V) - Method in interface org.apache.beam.sdk.state.MapState
A deferred lookup.
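A minimal sketch of this deferred lookup inside a stateful DoFn; the state id, key, and default value below are hypothetical:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.state.MapState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn.Element;
    import org.apache.beam.sdk.transforms.DoFn.ProcessElement;
    import org.apache.beam.sdk.transforms.DoFn.StateId;
    import org.apache.beam.sdk.values.KV;

    // Inside a DoFn<KV<String, String>, Void> subclass:
    @StateId("counts")
    private final StateSpec<MapState<String, Long>> countsSpec =
        StateSpecs.map(StringUtf8Coder.of(), VarLongCoder.of());

    @ProcessElement
    public void process(
        @Element KV<String, String> element,
        @StateId("counts") MapState<String, Long> counts) {
      // getOrDefault is a deferred lookup; read() resolves it to the stored value or the default.
      long current = counts.getOrDefault(element.getValue(), 0L).read();
      counts.put(element.getValue(), current + 1);
    }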
getOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the ordering key of the message.
getOrdinalPosition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
The position of the column in the table.
getOrphanedNewPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
Returns a list of NewPartition that have been around for a while and do not overlap with any missing partition.
getOrThrowException() - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
 
getOutboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
getOutName(int) - Method in class org.apache.beam.sdk.values.TupleTag
If this TupleTag is tagging output outputIndex of a PTransform, returns the name that should be used by default for the output.
getOutput(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
getOutput() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
 
getOutput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getOutput(TupleTag<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getOutput() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
The T PCollection as a result of successfully parsing CSV records.
getOutput() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling
 
getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
 
getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
 
getOutput() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
getOutput() - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
getOutput() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
 
getOutput() - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
 
getOutput() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
 
getOutputCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getOutputCoder(SerializableFunction<InputT, OutputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Deprecated.
This method may change in an unknown, backwards-incompatible way once support for this functionality is refined.
getOutputCoder() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.CompressedSource
Returns the delegate source's output coder.
getOutputCoder() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.Source
Returns the Coder to use for the data read from this source.
getOutputCoder() - Method in class org.apache.beam.sdk.io.TextSource
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
getOutputCoders() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getOutputExecutablePath() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getOutputFile() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
getOutputFilePrefix() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
Output file prefix.
getOutputFormatProvider() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getOutputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
 
getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns the Coder of the output of this transform.
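A minimal sketch of deriving the grouped output coder from an input KV coder; the concrete key and value coders here (StringUtf8Coder, VarLongCoder) are illustrative choices.

    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.KvCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.values.KV;

    Coder<KV<String, Long>> inputCoder = KvCoder.of(StringUtf8Coder.of(), VarLongCoder.of());
    // Coder for the KV<K, Iterable<V>> elements a GroupByKey over that input would emit.
    Coder<KV<String, Iterable<Long>>> outputCoder = GroupByKey.getOutputKvCoder(inputCoder);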
getOutputOrNull(ErrorHandling) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
getOutputParallelization() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getOutputParallelization() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
 
getOutputPortSchemas() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getOutputs(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Returns the output of the currently being translated transform.
getOutputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getOutputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getOutputSchema() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getOutputSchema(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
Get the output schema resulting from selecting the given FieldAccessDescriptor from the given schema.
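A minimal sketch of computing a selection's output schema; the field names are illustrative, and the descriptor is resolved against the input schema before use.

    import org.apache.beam.sdk.schemas.FieldAccessDescriptor;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.utils.SelectHelpers;

    Schema input = Schema.builder()
        .addStringField("name")
        .addInt32Field("age")
        .addStringField("city")
        .build();
    FieldAccessDescriptor fields = FieldAccessDescriptor.withFieldNames("name", "age").resolve(input);
    // Resulting schema contains only the selected "name" and "age" fields.
    Schema output = SelectHelpers.getOutputSchema(input, fields);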
getOutputStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Get the output strategy of this Window PTransform.
getOutputStream() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
 
getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
getOutputType() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
 
getOutputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns a TypeDescriptor capturing what is known statically about the output type of this CombineFn instance's most-derived class.
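A minimal sketch using a built-in CombineFn; Sum.ofLongs() is simply one concrete CombineFn whose output type can be captured statically.

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.TypeDescriptor;

    Combine.CombineFn<Long, ?, Long> sumFn = Sum.ofLongs();
    // TypeDescriptor for Long, inferred from the most-derived CombineFn class.
    TypeDescriptor<Long> outputType = sumFn.getOutputType();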
getOutputTypeDescriptor() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
Returns a TypeDescriptor capturing what is known statically about the output type of this DoFn instance's most-derived class.
getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
Returns a TypeDescriptor capturing what is known statically about the output type of this InferableFunction instance's most-derived class.
getOverlappingPartitions(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Return a list of overlapping partitions.
getOverloadRatio() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The target ratio between requests sent and successful requests.
getOverrideWindmillBinary() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
Custom windmill_main binary to use with the streaming runner.
getPane() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
Returns the pane of this FailsafeValueInSingleWindow in its window.
getPane() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the pane of this ValueInSingleWindow in its window.
getPaneInfo() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getParallelism() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
Returns the parameters of this function.
getParameters() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
 
getParent() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
Returns the parent path, or null if this path does not have a parent.
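A minimal sketch with an illustrative bucket and object name.

    import org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath;

    GcsPath path = GcsPath.fromUri("gs://my-bucket/logs/2024/output.txt");
    // Parent of the object path; null would be returned for a path with no parent.
    GcsPath parent = path.getParent();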
getParentId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getParentLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getParents() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
The unique partition identifiers of the parent partitions from which this child partition originated.
getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The unique partition identifiers of the parent partitions from which this child partition originated.
getParquetConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
getParseFn() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
Get the memoized Parser, possibly initializing it lazily.
getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Get the memoized Parser, possibly initializing it lazily.
getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Fetches the partition metadata row data for the given partition token.
getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Fetches the partition metadata row data for the given partition token.
getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getPartitionCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which this partition was first detected and created in the metadata table.
getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The end time for the partition change stream query, which produced this record.
getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
 
getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
Determines which shard in the stream the record is assigned to.
getPartitionKey() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getPartitionKey(byte[]) - Method in interface org.apache.beam.sdk.io.kinesis.KinesisPartitioner
 
getPartitionKey() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getPartitionMetadataAdminDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Creates and returns a singleton DAO instance for admin operations over the partition metadata table.
getPartitionMetadataDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Creates and returns a singleton DAO instance for accessing the partition metadata table.
getPartitionQueryTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getPartitionReadTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getPartitionRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the connector started processing this partition.
getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
getPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
getPartitionScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which this partition was scheduled to be queried.
getPartitionSpec() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
Partition spec of the destination, in the event that it must be dynamically created.
getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The start time for the partition change stream query, which produced this record.
getPartitionsToReconcile(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
For missing partitions, tries to organize the mismatched parent tokens so that they fill the missing partitions.
getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The partition token that produced this change stream record.
getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The unique identifier of the partition that generated this record.
getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
Unique partition identifier, which can be used to perform a change stream query.
getPassword() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
getPassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getPassword() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getPassword() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getPassword() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getPassword() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getPassword() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
Table path up to the leaf table name.
getPath() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
 
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
 
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
getPath() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
 
getPath() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
 
getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
The path for the display item within a component hierarchy.
getPathPrefix() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getPathValidator() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The path validator instance that should be used to validate paths.
getPathValidatorClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The class of the validator that should be created and used to validate paths.
getPatientCompartments() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
Gets the patient compartment responses for GetPatientEverything requests.
getPatientEverything() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Get the patient compartment for a FHIR Patient using the GetPatientEverything/$everything API.
getPatientEverything(String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets the FHIR GetPatientEverything ($everything) response as an HTTP body.
getPatientEverything(String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getPatternCondition() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
 
getPatternVar() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
 
getPayload(AvroGenericCoder) - Method in class org.apache.beam.sdk.extensions.avro.AvroGenericCoderTranslator
 
getPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the main payload of the PubSub message as a byte array.
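A minimal sketch constructing a message and reading its payload back; the attribute map is illustrative.

    import java.nio.charset.StandardCharsets;
    import java.util.Collections;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;

    PubsubMessage message =
        new PubsubMessage("hello".getBytes(StandardCharsets.UTF_8),
            Collections.singletonMap("origin", "example"));
    // Raw message bytes as published to (or read from) Pub/Sub.
    byte[] payload = message.getPayload();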
getPayload() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getPayload() - Method in class org.apache.beam.sdk.io.mqtt.MqttRecord
 
getPayload() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the payload of the message as a byte array.
getPayload() - Method in class org.apache.beam.sdk.schemas.io.Failure
Bytes containing the payload which has failed.
getPayload() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UnknownLogicalType
 
getPCollection() - Method in interface org.apache.beam.runners.local.Bundle
Returns the PCollection that the elements of this bundle belong to.
getPCollection() - Method in interface org.apache.beam.sdk.values.PCollectionView
For internal use only.
getPCollection() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getPCollectionInputs() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
getPCollectionInputs() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
 
getPerDestinationOutputFilenames() - Method in class org.apache.beam.sdk.io.WriteFilesResult
Returns a PCollection of all output filenames generated by this WriteFiles organized by user destination type.
getPerElementConsumers(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
getPerElementInputs(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
getPeriod() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
Amount of time between generated windows.
getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
getPeriodicStatusPageOutputDirectory() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getPerWorkerCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Counter that should be used for implementing the given per-worker metricName in this container.
getPerWorkerHistogram(MetricName, HistogramData.BucketType) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Histogram that should be used for implementing the given per-worker metricName in this container.
getPerWorkerMetricsUpdateReportingPeriodMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getPgJsonb(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Returns the record at the current pointer as JsonB.
getPipeline() - Method in class org.apache.beam.io.requestresponse.Result
 
getPipeline(JobApi.GetJobPipelineRequest, StreamObserver<JobApi.GetJobPipelineResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
getPipeline() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Retrieve the job's pipeline.
getPipeline() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
 
getPipeline() - Method in class org.apache.beam.sdk.io.WriteFilesResult
 
getPipeline() - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
getPipeline() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
getPipeline() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
 
getPipeline() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
getPipeline() - Method in class org.apache.beam.sdk.values.PBegin
 
getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionList
 
getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
 
getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
getPipeline() - Method in class org.apache.beam.sdk.values.PDone
 
getPipeline() - Method in interface org.apache.beam.sdk.values.PInput
Returns the owning Pipeline of this PInput.
getPipeline() - Method in interface org.apache.beam.sdk.values.POutput
Returns the owning Pipeline of this POutput.
getPipeline() - Method in class org.apache.beam.sdk.values.PValueBase
 
getPipelineFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
 
getPipelineName() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getPipelineOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
 
getPipelineOptions() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Returns the configured pipeline options.
getPipelineOptions() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
getPipelineOptions() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
 
getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunner
For testing.
getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
getPipelineOptions() - Method in class org.apache.beam.runners.flink.TestFlinkRunner
 
getPipelineOptions() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.OptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.runners.prism.PrismRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransformOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
getPipelineOptions() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
Perform a DFS (depth-first search) to find the PipelineOptions config.
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws.options.AwsPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.KinesisIOOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
 
getPipelineOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.options.RemoteEnvironmentOptions.Options
 
getPipelineOptions() - Method in interface org.apache.beam.sdk.state.StateContext
Returns the PipelineOptions specified with the PipelineRunner.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
Returns the PipelineOptions specified with the PipelineRunner invoking this KeyedCombineFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.Options
 
getPipelineOptionsFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
 
getPipelinePolicy() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Get the Runner API pipeline proto if available.
getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
getPipelineRunners() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.portability.PortableRunnerRegistrar
 
getPipelineRunners() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.RunnerRegistrar
 
getPipelineRunners() - Method in class org.apache.beam.runners.prism.PrismRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Runner
 
getPipelineUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
The URL of the staged portable pipeline.
getPlanner() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
getPlannerName() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
 
getPluginClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets the main class of a plugin.
getPluginConfig() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets a plugin config.
getPluginProperties() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getPluginProperties(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getPluginType() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets a plugin type.
getPollIntervalMillis() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
The time, in milliseconds, to wait before polling for new files.
getPort() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
getPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
getPort() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
Get the port that this ExpansionServer is bound to.
getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
getPortNumber() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getPortNumber() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getPositionForFractionConsumed(double) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Returns a position P such that the range [start, P) represents approximately the given fraction of the range [start, end).
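A minimal sketch over an illustrative offset range [0, 100).

    import org.apache.beam.sdk.io.range.OffsetRangeTracker;

    OffsetRangeTracker tracker = new OffsetRangeTracker(0L, 100L);
    // A position P such that [0, P) is approximately 25% of [0, 100); here P == 25.
    long position = tracker.getPositionForFractionConsumed(0.25);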
getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
getPrecision() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
getPredefinedCsvFormat() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration
See CSVFormat.Predefined#values() for a list of allowed values.
getPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
First element in the path.
getPrefix() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration
 
getPrefixedEndpoint(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getPreviousWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
 
getPrimary() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
Returns the primary restriction.
getPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getPriority() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getPriority() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the priority level of the message (0-255, higher is more important).
getPrismLocation() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
getPrismLogLevel() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
getPrismVersionOverride() - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
getPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getPrivateKeyPassphrase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getPrivateKeyPath() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getPrivateKeyPath() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getProcessBundleDescriptor(String) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
getProcessBundleDescriptor(BeamFnApi.GetProcessBundleDescriptorRequest, StreamObserver<BeamFnApi.ProcessBundleDescriptor>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
 
getProcessBundleDescriptor() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
getProcessBundleDescriptor() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
 
getProcessingTimeAdvance() - Method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
 
getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
Provides a SdkHarnessClient.BundleProcessor that is capable of processing bundles not containing timers or state accesses such as side inputs, user state, or remote references.
getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
Provides SdkHarnessClient.BundleProcessor that is capable of processing bundles not containing timers.
getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
Provides a SdkHarnessClient.BundleProcessor that is capable of processing bundles containing timers and state accesses such as side inputs, user state, and remote references.
getProcessWideContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Return the MetricsContainer for the current process.
getProduced(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
getProducer(PValue) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
Get the AppliedPTransform that produced the provided PValue.
getProducer(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
getProducerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getProducerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getProducerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getProducersMapCardinality() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getProfilingAgentConfiguration() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Returns the progress made within the restriction so far.
getProgress() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
 
getProgress() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
 
getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
getProgress() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.HasProgress
A representation for the amount of known completed and known remaining work.
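A minimal sketch using the splittable-DoFn OffsetRangeTracker (which implements HasProgress, per the entries above); the claimed offset is illustrative.

    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;
    import org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker;

    OffsetRangeTracker tracker = new OffsetRangeTracker(new OffsetRange(0, 100));
    tracker.tryClaim(24L);
    RestrictionTracker.Progress progress = tracker.getProgress();
    double completed = progress.getWorkCompleted();  // known completed work so far
    double remaining = progress.getWorkRemaining();  // known remaining work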
getProject() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
getProject() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
Project id to use when launching jobs.
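A minimal sketch reading the project id from pipeline options; the project name is illustrative.

    import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    GcpOptions options =
        PipelineOptionsFactory.fromArgs("--project=my-gcp-project").as(GcpOptions.class);
    // The project id that jobs will be launched under.
    String project = options.getProject();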
getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the project path.
getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Get the project this job exists in.
getProjectId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the project id being written to.
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getProperties() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
 
getProperties() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getProperties() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
getProperties() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getProtoBytesToRowFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
 
getProtoBytesToRowFromSchemaFunction(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
 
getProtoBytesToRowFunction(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
 
getProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
getProviderRuntimeValues() - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
 
getProvisionInfo(ProvisionApi.GetProvisionInfoRequest, StreamObserver<ProvisionApi.GetProvisionInfoResponse>) - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
 
getProxyConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
ProxyConfiguration used to configure AWS service clients.
getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
 
getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
 
getPTransformId() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
getPublished() - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
Whether the message was published or not.
getPublishedResultsQueue() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
getPublishedResultsQueue() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
Returns the Queue instance associated with this session, containing the callbacks received asynchronously from Solace for message publications.
getPublishLatencyMetric() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getPublishMonotonicNanos() - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey
 
getPublishTimestamp() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
 
getPublishTimestampFunction() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
Root URL for use with the Google Cloud Pub/Sub API.
getQualifiers() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
 
getQuantifier() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
 
getQueries() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Configures the BigQuery read job with the SQL query.
getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getQuery() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
 
getQuery() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getQuery() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the query which can be a source for reading.
getQuery() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getQueryLocation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
BigQuery geographic location where the query job will be executed.
getQueryName() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which the change stream query for a ChangeStreamResultSet first started.
getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the change stream query that produced this record started.
getQueryString() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
SQL Query.
getQueue() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getQuotationMark() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the character that will surround strings in staged CSV files.
getRamMegaBytes() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getRange() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
Returns the current range.
getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
 
getRate() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
 
getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
 
getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsV17.RateLimitPolicyFactory
 
getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
 
getRaw(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
Returns the raw value of the getter before any further transformations.
getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getRawPrivateKey() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getRawType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
Returns the raw class type.
getRawType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the Class underlying the Type represented by this TypeDescriptor.
getRead() - Method in class org.apache.beam.io.requestresponse.Cache.Pair
 
getReadCounterPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
 
getReaderCacheTimeoutSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The amount of time before UnboundedReaders are considered idle and closed during streaming execution.
getReadOperation() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
getReadQuery() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getReadResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
Gets resources.
getReadTime() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getReadTime() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getReadTime() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getReadTimePercentage() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getReason() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
getReason() - Method in class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
 
getReasons() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
getReceiptHandle() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
SQS receipt handle.
getReceiver() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
getReceiver() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
 
getReceiver() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
 
getReceiver() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
getReceiver() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
Returns a MessageReceiver object for receiving messages from Solace.
getReceiverBuilder() - Method in class org.apache.beam.sdk.io.cdap.Plugin
getReceiverClass() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets Spark Receiver class for a CDAP plugin.
getReceiveTimestamp() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the timestamp (in milliseconds since the Unix epoch) when the message was received by the Solace broker.
getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
 
getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
 
getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
 
getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
 
getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
 
getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
 
getRecommendedSchema() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
 
getRecord() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
getRecord() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
 
getRecord() - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord
Information about the record that failed.
getRecordJfrOnGcThrashing() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
If true, save a JFR profile when GC thrashing is first detected.
getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which a record was read from the ChangeStreamResultSet.
getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the record was fully read.
getRecordSchema() - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
 
getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
Indicates the order in which the record was put into the stream.
getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Indicates the order in which this record was put into the change stream in the scope of a partition, commit timestamp and transaction tuple.
getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which a record finished to be streamed.
getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the record finished streaming.
getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which a record first started to be streamed.
getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the record started to be streamed.
getRecordTimestamp() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecord
The timestamp associated with the record.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The Cloud Spanner timestamp time when this record occurred.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
Returns the timestamp at which this partition started being valid in Cloud Spanner.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The timestamp at which the modifications within were committed in Cloud Spanner.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
Indicates the timestamp for which the change stream query has returned all changes.
getRecordType() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
getRedelivered() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Indicates whether the message has been redelivered due to a prior delivery failure.
getRedistributeNumKeys() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getReferentialConstraints() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
getRegexFromPattern(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
Recursively construct a regular expression from a RexNode.
getRegion() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Get the region this job exists in.
getRegion() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
The Google Compute Engine region for creating Dataflow jobs.
getRegionFromEnvironment() - Static method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
 
getRegisteredOptions() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
getReidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getReidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
getReIterableGroupByKeyResult() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getRelList() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
 
getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamJavaUdfCalcRule
 
getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRule
 
getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcSplittingRule
 
getRemoteInputDestinations() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
Get the RemoteInputDestinations over which input data are sent to the BeamFnApi.ProcessBundleDescriptor.
getRemoteOutputCoders() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
Get all of the transforms materialized by this ProcessBundleDescriptors.ExecutableProcessBundleDescriptor, and the Java Coder for the wire format of each transform.
getRepeatedTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Returns a new DataflowPipelineJob for the job that replaced this one, if applicable.
getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollection<OutputT>, ParDo.SingleOutput<InputT, OutputT>>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
 
getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollectionTuple, PTransform<PCollection<? extends InputT>, PCollectionTuple>>) - Method in class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
 
getReplicationGroupMessageId() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the ID for the message within its replication group (if applicable).
getReplyTo() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getReplyTo() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the destination to which replies to this message should be sent.
getReportCheckpointDuration() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getRequestAsString() - Method in class org.apache.beam.io.requestresponse.ApiIOError
The string representation of the request associated with the error.
getRequestTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
The timestamp at which the message was received (in epoch millis).
getRequirements() - Method in class org.apache.beam.sdk.transforms.Contextful
Returns the requirements needed to run the closure.
getResidual() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
Returns the residual restriction.
getResourceHints() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns resource hints set on the transform.
getResourceHints() - Method in interface org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions
 
getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
Gets resources.
getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
Gets resources.
getResourceType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
getResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
HTTP response from the FHIR store after attempting to write the Bundle.
getResponseItemJson() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
 
getResponses() - Method in class org.apache.beam.io.requestresponse.Result
 
getRestrictionCoder() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
getRestrictionCoder() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
getRestrictionCoder() - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
 
getResult() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
Returns the result of the transaction execution.
getResultCoder(Pipeline) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Provide the result coder.
getResultCount() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getResults() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
Returns a PCollection containing the Rows that have been parsed.
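A short sketch of obtaining the parsed Rows alongside the failed lines; the jsonLines collection and schema are assumed to exist elsewhere in the pipeline:
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.JsonToRow;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class JsonParseSketch {
      static PCollection<Row> parse(PCollection<String> jsonLines, Schema schema) {
        // withExceptionReporting() returns a ParseResult instead of failing the bundle.
        JsonToRow.ParseResult parsed = jsonLines.apply(JsonToRow.withExceptionReporting(schema));
        PCollection<Row> failures = parsed.getFailedToParseLines(); // lines that could not be parsed
        return parsed.getResults(); // the Rows that parsed successfully
      }
    }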
getRetainDockerContainers() - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
 
getRetainExternalizedCheckpointsOnCancellation() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getRetryableCodes() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
 
getReturnType(RelDataTypeFactory, SqlOperatorBinding) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
 
getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
getRole() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getRole() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getRole() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getRoot() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
getRootCause() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
Returns the inner-most CannotProvideCoderException when they are deeply nested.
getRootElement() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
 
getRootSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getRootTransforms() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
getRoutingKey() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
getRow(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.ROW value by field name; an IllegalStateException is thrown if the schema doesn't match.
getRow(int) - Method in class org.apache.beam.sdk.values.Row
Get a Row value by field index; an IllegalStateException is thrown if the schema doesn't match.
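A brief sketch of the typed Row accessors; the schemas and field names are illustrative:
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    class RowAccessSketch {
      public static void main(String[] args) {
        Schema address = Schema.builder().addStringField("city").build();
        Schema person = Schema.builder()
            .addStringField("name")
            .addRowField("address", address)
            .build();
        Row row = Row.withSchema(person)
            .addValues("Ada", Row.withSchema(address).addValues("London").build())
            .build();
        String name = row.getString("name");  // typed accessor by field name
        Row home = row.getRow("address");     // nested Schema.TypeName.ROW value
        String city = home.getString(0);      // accessor by field index
        System.out.println(name + " lives in " + city);
      }
    }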
getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
 
getRowGroupSize() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
 
getRowReceiver(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
Returns a DoFn.OutputReceiver for publishing Row objects to the given tag.
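A hedged sketch of a DoFn that publishes Rows through a row receiver; the tag, schema, and field name are illustrative, and the tagged output is assumed to have a schema set:
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    class RowReceiverSketch extends DoFn<String, Row> {
      static final Schema OUTPUT_SCHEMA = Schema.builder().addStringField("line").build();
      static final TupleTag<Row> ROWS = new TupleTag<Row>() {};

      @ProcessElement
      public void process(@Element String line, MultiOutputReceiver out) {
        // getRowReceiver() outputs a Row to the tagged, schema-aware output.
        out.getRowReceiver(ROWS).output(
            Row.withSchema(OUTPUT_SCHEMA).addValue(line).build());
      }
    }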
getRowRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getRows() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
getRows() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.TableWithRows
 
getRowSchema() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
getRowSelector(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
 
getRowSelectorOptimized(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
 
getRowsWritten() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
The number of rows written in this batch.
getRowToAvroBytesFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a function mapping Beam Rows to encoded AVRO GenericRecords.
getRowToGenericRecordFunction(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a function mapping Beam Rows to AVRO GenericRecords.
getRowToJsonBytesFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
Returns a SimpleFunction mapping Beam Rows to JSON byte[] arrays.
getRowToJsonStringsFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
Returns a SimpleFunction mapping Beam Rows to JSON Strings.
getRowToProtoBytes(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
 
getRowToProtoBytesFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
 
getRowToProtoBytesFromSchema(String, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
 
getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
getRowType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
 
getRowType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The type of the primary keys and modified columns within this record.
getRpcPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getRule() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
 
getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
 
getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
 
getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
 
getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
 
getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
 
getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
 
getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
getRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
 
getRunner() - Method in interface org.apache.beam.sdk.options.PipelineOptions
The pipeline runner that will be used to execute the pipeline.
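For example, a minimal sketch of reading the configured runner back from PipelineOptions (the command-line flag shown is only an example):
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class RunnerOptionSketch {
      public static void main(String[] args) {
        // e.g. passing --runner=DirectRunner on the command line selects the runner class.
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
        System.out.println("runner: " + options.getRunner().getSimpleName());
      }
    }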
getRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which the connector started processing this partition.
getS3ClientBuilder() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Builder used to create the AmazonS3Client.
getS3ClientBuilder() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Builder used to create the S3Client.
getS3ClientFactoryClass() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getS3ClientFactoryClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getS3StorageClass() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getS3StorageClass() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
The AWS S3 storage class used for creating S3 objects.
getS3StorageClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getS3StorageClass() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
The AWS S3 storage class used for creating S3 objects.
getS3ThreadPoolSize() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getS3ThreadPoolSize() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Thread pool size, limiting the max concurrent S3 operations.
getS3ThreadPoolSize() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getS3ThreadPoolSize() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Thread pool size, limiting the max concurrent S3 operations.
getS3UploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getS3UploadBufferSizeBytes() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Size of S3 upload chunks.
getS3UploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getS3UploadBufferSizeBytes() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Size of S3 upload chunks.
getSafeFilepattern() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
 
getSafeSchema() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
 
getSamplePeriod() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The length of time sampled request data will be retained.
getSamplePeriodBucketSize() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The size of buckets within the specified samplePeriod.
getSamplingStrategy() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getSasToken() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
getSaveHeapDumpsToGcsPath() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
CAUTION: This option implies dumpHeapOnOOM, and has similar caveats.
getSavepointPath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getSaveProfilesToGcs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
getSbeFields() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
 
getScale() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
getScan() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
getScanType() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which this partition was scheduled to be queried.
getScheduledExecutorService() - Method in interface org.apache.beam.sdk.options.ExecutorOptions
The ScheduledExecutorService instance to use to create threads; it can be overridden to specify a ScheduledExecutorService that is compatible with the user's environment.
getSchema() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns the schema used by this coder.
getSchema(DestinationT) - Method in class org.apache.beam.sdk.extensions.avro.io.DynamicAvroDestinations
Return an AVRO schema for a given destination.
getSchema(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
 
getSchema() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
 
getSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getSchema() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Get the schema info of the table.
getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
 
getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
getSchema() - Static method in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
getSchema() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
The schema used by sources to deserialize data and create Beam Rows.
getSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
 
getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the table schema for the destination.
getSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
getSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
 
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getSchema() - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
Schema for the destination, in the event that it must be dynamically created.
getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the schema of a Snowflake table.
getSchema() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getSchema() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getSchema() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
Returns the schema associated with this type.
getSchema(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve a Schema for a given Class type.
getSchema(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve a Schema for a given TypeDescriptor type.
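A minimal sketch of schema lookup through a registry, assuming a POJO annotated with @DefaultSchema(JavaFieldSchema.class); the Purchase class is illustrative:
    import org.apache.beam.sdk.schemas.JavaFieldSchema;
    import org.apache.beam.sdk.schemas.NoSuchSchemaException;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.SchemaCoder;
    import org.apache.beam.sdk.schemas.SchemaRegistry;
    import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

    class SchemaRegistrySketch {
      @DefaultSchema(JavaFieldSchema.class)
      public static class Purchase {   // illustrative POJO with public fields
        public String item;
        public long cents;
      }

      public static void main(String[] args) throws NoSuchSchemaException {
        SchemaRegistry registry = SchemaRegistry.createDefault();
        Schema schema = registry.getSchema(Purchase.class);           // inferred field-based schema
        SchemaCoder<Purchase> coder = registry.getSchemaCoder(Purchase.class);
        System.out.println(schema.getFieldNames() + " / " + coder);
      }
    }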
getSchema() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
 
getSchema() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
 
getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns the schema used by this CoGbkResult.
getSchema() - Method in class org.apache.beam.sdk.values.PCollection
Returns the attached schema.
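A short sketch, assuming a schema-aware PCollection<Row> built elsewhere:
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class AttachedSchemaSketch {
      static void inspect(PCollection<Row> rows) {
        if (rows.hasSchema()) {
          // getSchema() throws IllegalStateException when no schema is attached.
          Schema schema = rows.getSchema();
          System.out.println(schema.getFieldNames());
        }
      }
    }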
getSchema() - Method in class org.apache.beam.sdk.values.Row.Builder
Return the schema for the row being built.
getSchema() - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
 
getSchema() - Method in class org.apache.beam.sdk.values.Row
Return Schema which describes the fields.
getSchemaCoder(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve a SchemaCoder for a given Class type.
getSchemaCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve a SchemaCoder for a given TypeDescriptor type.
getSchemaId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
 
getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
 
getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
 
getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
 
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return the SchemaPath from the TopicPath, if it exists.
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Return the SchemaPath from the TopicPath, if it exists.
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Return the SchemaPath from the TopicPath, if it exists.
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
getSchemaProvider(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve a registered SchemaProvider for a given TypeDescriptor.
getSchemaProvider(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve a registered SchemaProvider for a given Class.
getSchemaProviders() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaRegistrar
 
getSchemaProviders() - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
 
getSchemaProviders() - Method in interface org.apache.beam.sdk.schemas.SchemaProviderRegistrar
Returns a list of schema providers which will be registered by default within each schema registry instance.
getSchemaRegistry() - Method in class org.apache.beam.sdk.Pipeline
 
getSchematizedData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets schematized data.
getSchemaWithoutAttributes(Schema, List<String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
getScheme() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
getScheme() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
The uri scheme used by resources on this filesystem.
getScheme() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
The uri scheme used by resources on this filesystem.
getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
 
getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
getScheme() - Method in class org.apache.beam.sdk.io.FileSystem
Get the URI scheme which defines the namespace of the FileSystem.
getScheme() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Get the scheme which defines the namespace of the ResourceId.
getSdkComponents() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
getSdkContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Container image used to configure SDK execution environment on worker.
getSdkHarnessContainerImageOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Overrides for SDK harness container images.
getSdkHarnessLogLevelOverrides() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
This option controls the log levels for specifically named loggers.
getSdkWorkerId() - Method in interface org.apache.beam.sdk.fn.server.HeaderAccessor
This method should be called from the request method.
getSdkWorkerParallelism() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
getSearchEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getSeconds() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
 
getSelectedFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getSemiPersistDir() - Method in interface org.apache.beam.sdk.options.RemoteEnvironmentOptions
 
getSempClientFactory() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getSenderTimestamp() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the timestamp (in milliseconds since the Unix epoch) when the message was sent by the sender.
getSendFacility() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets send facility.
getSendTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets send time.
getSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getSequenceNumber() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
Deprecated.
getSequenceNumber() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getSequenceNumber() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
Gets the sequence number of the message (if applicable).
getSerializableFunctionUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
For UDFs that implement SerializableFunction.
getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
getSerializedWindowingStrategy() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration
 
getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
 
getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
 
getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
 
getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
 
getSerializer(Schema, Map<String, Object>) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializerProvider
Get a PayloadSerializer.
getSerializer(String, Schema, Map<String, Object>) - Static method in class org.apache.beam.sdk.schemas.io.payloads.PayloadSerializers
 
getServer() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Get the underlying Server contained by this GrpcFnServer.
getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
 
getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
 
getServerFactory() - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
Create the ServerFactory applicable to this environment.
getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
 
getServerName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getServerName() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getServerName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getServerTransactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The unique transaction id in which the modifications occurred.
getService() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
Get the service exposed by this GrpcFnServer.
getServiceAccount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Run the job as a specific service account, instead of the default GCE robot.
getServiceURL(String, String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getServiceURL(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getSessionServiceFactory() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getSetFieldCreator(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
getSetters(TypeDescriptor<?>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
Return the list of FieldValueSetters for a Java Bean class.
getSetters(TypeDescriptor<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
getSha256() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
The SHA-256 hash of the source file.
getShard() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getShardId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getShardId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getShardingFunction() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getShardNameTemplate() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
See ShardNameTemplate for the expected values.
getShardNumber() - Method in class org.apache.beam.sdk.values.ShardedKey
 
getShortTableUrn() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
Return shortened tablespec in datasets/[dataset]/tables/[table] format.
getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Return the optional short value for an item, or null if none is provided.
getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The optional short value for an item, or null if none is provided.
getShutdownSourcesAfterIdleMs() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getSideInput(String) - Method in interface org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory.SideInputGetter
 
getSideInputBroadcast(PCollection<T>, SideInputValues.Loader<T>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
getSideInputBroadcast(PCollection<T>, SideInputValues.Loader<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getSideInputDataSets() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
getSideInputKeys() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
Get the tag IDs of all the keys.
getSideInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
getSideInputs(ExecutableStage) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
 
getSideInputs() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
 
getSideInputs() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Override to specify that this object needs access to one or more side inputs.
getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Specifies that this object needs access to one or more side inputs.
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns the side inputs used by this Combine operation.
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns the side inputs used by this Combine operation.
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
Returns the side inputs used by this Combine operation.
getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Requirements
The side inputs that this Contextful needs access to.
getSideInputSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
Get a mapping from PTransform id to side input id to side inputs that are used during execution.
getSideInputWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
Returns the window of the side input corresponding to the given window of the main input.
getSingleFileMetadata() - Method in class org.apache.beam.sdk.io.FileBasedSource
Returns the information about the single file that this source is reading from.
getSinglePCollection() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
Like PCollectionRowTuple.get(String), but is a convenience method to get a single PCollection without providing a tag for that output.
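For illustration, a sketch assuming a single-entry PCollectionRowTuple; the tag name is arbitrary:
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionRowTuple;
    import org.apache.beam.sdk.values.Row;

    class RowTupleSketch {
      // 'rows' stands in for a schema-aware PCollection<Row> built elsewhere.
      static PCollection<Row> roundTrip(PCollection<Row> rows) {
        PCollectionRowTuple tuple = PCollectionRowTuple.of("input", rows);
        // With exactly one entry, the tag can be omitted when reading it back.
        return tuple.getSinglePCollection();
      }
    }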
getSingleTokenNewPartition(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
Return a new NewPartition that only contains one token that matches the parentPartition.
getSingleWorkerStatus(String, long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
Get the latest SDK worker status from the client's corresponding SDK harness.
getSink() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
Sink for control clients.
getSink() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
 
getSink() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Returns the FileBasedSink for this write operation.
getSink() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getSinkGroupId() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getSinks() - Static method in class org.apache.beam.sdk.metrics.Lineage
Lineage representing sinks.
getSize() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
Get the size.
getSize() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
Size of the generated windows.
getSize() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
getSize(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getSize(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
getSize(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getSize(PulsarSourceDescriptor, OffsetRange) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
getSize() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
getSize() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
getSize() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
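For example, a small sketch of the size accessors on the built-in window functions:
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
    import org.joda.time.Duration;

    class WindowSizeSketch {
      public static void main(String[] args) {
        FixedWindows fixed = FixedWindows.of(Duration.standardMinutes(5));
        SlidingWindows sliding = SlidingWindows.of(Duration.standardMinutes(10))
            .every(Duration.standardMinutes(1));
        // getSize() reports the width of the windows each WindowFn produces.
        System.out.println(fixed.getSize());    // PT300S
        System.out.println(sliding.getSize());  // PT600S
      }
    }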
 
getSketchFromByteBuffer(ByteBuffer) - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount
Converts the passed-in sketch from ByteBuffer to byte[], mapping null ByteBuffers (representing empty sketches) to empty byte[]s.
getSkipHeaderLines() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
 
getSkipKeyClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getSkipValueClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getSnapshot() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getSnapshotId() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getSnapshots() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteResult
 
getSnowPipe() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getSocketTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
Returns the sorter type.
getSource() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
The file to stage.
getSource() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
Source of control clients.
getSource() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
 
getSource() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
 
getSource() - Method in class org.apache.beam.sdk.io.Read.Bounded
Returns the BoundedSource used to create this Read PTransform.
getSource() - Method in class org.apache.beam.sdk.io.Read.Unbounded
Returns the UnboundedSource used to create this Read PTransform.
getSource() - Method in class org.apache.beam.sdk.io.TextIO.Read
 
getSource() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
getSources() - Static method in class org.apache.beam.sdk.metrics.Lineage
Lineage representing sources and optionally side inputs.
getSparkMaster() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
getSparkReceiverClass() - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
 
getSparkSession() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext
 
getSparkSession() - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
getSparkSession() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
getSplit() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
 
getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns the size of the backlog of unread data in the underlying data source represented by this split of this source.
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns the total amount of parallelism in the consumed (returned and processed) range of this reader's current BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getSplitPointsProcessed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Returns the total number of split points that have been processed.
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
 
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns the total amount of parallelism in the unprocessed part of this reader's current BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getSSEAlgorithm() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getSSEAlgorithm() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Algorithm for SSE-S3 encryption, e.g.
getSSEAlgorithm() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getSSEAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Algorithm for SSE-S3 encryption, e.g.
getSSEAwsKeyManagementParams() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getSSEAwsKeyManagementParams() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
KMS key id for SSE-KMS encryption, e.g.
getSSECustomerKey() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
getSSECustomerKey() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
SSE key for SSE-C encryption, e.g.
getSSECustomerKey() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getSSECustomerKey() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
SSE key for SSE-C encryption, e.g.
getSSEKMSKeyId() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
getSSEKMSKeyId() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
KMS key id for SSE-KMS encryption, e.g.
getSsl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getStableUniqueNames() - Method in interface org.apache.beam.sdk.options.PipelineOptions
Whether to check for stable unique names on each transform.
getStackTrace() - Method in class org.apache.beam.io.requestresponse.ApiIOError
The Exception stack trace.
getStackTrace() - Method in class org.apache.beam.sdk.io.csv.CsvIOParseError
The caught Throwable.getStackTrace().
getStackTrace() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getStageBundleFactory(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
 
getStageBundleFactory(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext
 
getStagedArtifacts(String) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
Returns the rewritten artifacts associated with this job, keyed by environment.
getStageName() - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
getStager() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The resource stager instance that should be used to stage resources.
getStagerClass() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The class responsible for staging resources to be accessible by workers during job execution.
getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the directory where files are staged.
getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
Gets the bucket name and directory where files were staged and are waiting to be loaded.
getStagingBucketName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getStagingBucketName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getStagingLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
GCS path for staging local files, e.g.
getStart() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
 
getStart() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
 
getStart() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
 
getStart() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
 
getStartAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getStartKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns the ByteKey representing the lower bound of this ByteKeyRange.
getStartOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the starting offset of the source.
getStartPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getStartPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
getStartPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Returns the starting position of the current range, inclusive.
getStartReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getStartTime() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
Returns the time the reader was started.
getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
It is the partition_start_time of the child partition token.
getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
It is the start time at which the partition started existing in Cloud Spanner.
getState() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
getState() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getState() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
getState() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
getState() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
getState() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
 
getState() - Method in class org.apache.beam.runners.jet.JetPipelineResult
 
getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
getState() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Retrieve the job's current state.
getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
 
getState() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
getState() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
 
getState() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
 
getState() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The state in which the current partition is in.
getState() - Method in interface org.apache.beam.sdk.PipelineResult
Retrieves the current state of the pipeline execution.
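A minimal sketch of inspecting pipeline state after submission; the trivial pipeline is illustrative:
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    class PipelineStateSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        pipeline.apply(Create.of(1, 2, 3));
        PipelineResult result = pipeline.run();
        result.waitUntilFinish();
        // getState() reflects the runner's view: RUNNING, DONE, FAILED, CANCELLED, ...
        System.out.println("final state: " + result.getState());
      }
    }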
getState() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
Get current state of the WatermarkEstimator instance, which can be used to recreate the WatermarkEstimator when processing the restriction.
getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
 
getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
 
getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
 
getStateBackend() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getStateBackendFactory() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
Deprecated.
Please use setStateBackend below.
getStateBackendStoragePath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getStateCoder(Pipeline) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Provide the state coder.
getStateCoder() - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
Used to encode the state of this Watch.Growth.TerminationCondition.
getStateEvent() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Retrieve the job's current state.
getStateStream(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
getStaticCreator(TypeDescriptor<?>, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
getStaticCreator(TypeDescriptor<?>, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
getStatistic() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
 
getStatus() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
getStatusCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getStatusDate() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
getStatusUpdateFrequency() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Determines the frequency of emission of the OrderedProcessingStatus elements.
getStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
Returns the mapping of AppliedPTransforms to the internal step name for that AppliedPTransform.
getStopPipelineWatermark() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
getStopPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getStopPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
getStopPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Returns the ending position of the current range, exclusive.
getStopReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getStorageApiAppendThresholdBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageApiAppendThresholdRecordCount() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageClient(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.StorageClient.
getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the Snowflake integration used in the COPY statement.
getStorageIntegrationName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getStorageLevel() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
getStorageWriteApiMaxRequestSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageWriteApiMaxRetries() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageWriteApiTriggeringFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageWriteMaxInflightBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageWriteMaxInflightRequests() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Create an append client for a given Storage API write stream.
getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getStreamingService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
 
getStreamingService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
 
getStreamingSideInputCacheExpirationMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getStreamingSideInputCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getStreamingTimeoutMs() - Method in interface org.apache.beam.runners.spark.SparkPortableStreamingPipelineOptions
 
getStreamName() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getStreamName() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getStreamTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getString(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getString(String) - Method in class org.apache.beam.sdk.values.Row
Get a Schema.TypeName.STRING value by field name; an IllegalStateException is thrown if the schema doesn't match.
getString(int) - Method in class org.apache.beam.sdk.values.Row
Get a String value by field index; a ClassCastException is thrown if the schema doesn't match.
getStrings(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getStringSet(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
 
getStringSet(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the StringSet that should be used for implementing the given metricName in this container.
getStringSet() - Method in class org.apache.beam.sdk.metrics.StringSetResult.EmptyStringSetResult
Returns an empty immutable set.
getStringSet() - Method in class org.apache.beam.sdk.metrics.StringSetResult
 
getStringSets() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
Return the metric results for the sets that matched the filter.
getStuckCommitDurationMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getSubmissionMode() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
 
getSubmissionMode() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
getSubnetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
GCE subnetwork for launching workers.
getSubProvider(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
 
getSubProvider(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Returns a sub-provider, e.g.
getSubProviders() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Returns all sub-providers, e.g.
getSubSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getSubSchemaNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the subscription being read from.
getSubscriptionName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the ValueProvider for the subscription being read from.
getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
Gets successful bodies from Write.
getSuccessfulBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
Gets successful FhirBundleResponse from execute bundles operation.
getSuccessfulInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the TableRows that were written to BQ via the streaming insert API.
getSuccessfulPublish() - Method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
 
getSuccessfulStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Return all rows successfully inserted using one of the storage-api insert methods.
getSuccessfulTableLoads() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the TableDestinations that were successfully loaded using the batch load API.
getSuggestedFilenameSuffix() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Deprecated.
 
getSuggestedFilenameSuffix() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
 
getSuggestedSuffix() - Method in enum org.apache.beam.sdk.io.Compression
 
getSum() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
getSum() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getSumAndReset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getSummary() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getSupertype(Class<? super T>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the generic form of a supertype.
getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
 
getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
 
getSupportedClass() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Gets the class this CloudObjectTranslator is capable of converting.
getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
 
getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
 
getSynchronizedProcessingOutputWatermark() - Method in interface org.apache.beam.runners.local.Bundle
Returns the processing time output watermark at the time the producing Executable committed this bundle.
getSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
getTable() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
getTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Get a specific table from this provider if it is present, or null if it is not present.
getTable(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Returns the table to read, or null if reading from a query instead.
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Returns the table reference, or null.
getTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Gets the specified Table resource by table ID.
getTable(TableReference, List<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Gets the specified Table resource by table ID.
getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns a TableDestination object for the destination.
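As a hedged sketch, a DynamicDestinations subclass usually pairs getTable with getDestination and getSchema; the class name, table names, and field layout below are hypothetical:

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Collections;
    import org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations;
    import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
    import org.apache.beam.sdk.values.ValueInSingleWindow;

    // Routes each TableRow to a per-country table.
    class PerCountryDestinations extends DynamicDestinations<TableRow, String> {
      @Override
      public String getDestination(ValueInSingleWindow<TableRow> element) {
        return (String) element.getValue().get("country");
      }

      @Override
      public TableDestination getTable(String country) {
        // Invoked once per destination value produced by getDestination.
        return new TableDestination(
            "my-project:analytics.events_" + country, "events for " + country);
      }

      @Override
      public TableSchema getSchema(String country) {
        return new TableSchema()
            .setFields(Collections.singletonList(
                new TableFieldSchema().setName("country").setType("STRING")));
      }
    }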
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getTable(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTable() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getTable() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getTable() - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration
 
getTable() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
 
getTable() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
 
getTable() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
 
getTable() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getTable() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the table used as a source for reading or as a destination for writing.
getTable() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getTableAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
 
getTableConstraints(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns TableConstraints (including primary and foreign key) to be used when creating the table.
getTableConstraints(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
getTableCreateConfig() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
Metadata and constraints for creating a new table, if it must be done dynamically.
getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns the table being read from.
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Return the metadata table name.
getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
 
getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination
The Iceberg table identifier to write data to.
getTableIdentifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getTableImpl(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTableName() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
Table name, the last element of the fully-specified table name with path.
getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
 
getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The name of the table in which the modifications within this record occurred.
getTableNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getTablePath(Table) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
Returns a full table path (excluding top-level schema) for a given ZetaSQL Table.
getTableProvider() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Returns the table to read, or null if reading from a query instead.
getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTableResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
getTableRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
getTables() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Get all tables from this provider.
getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
getTables() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
getTableSchema(String, String) - Static method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
Returns TableSchema for a given table.
getTableSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
getTableSchema(String, String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
Gets the table schema, or an absent optional if the table doesn't exist in the database.
getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
 
getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
 
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Specifies a table for a BigQuery read job.
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
Return the tablespec in [project:].dataset.tableid format.
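A small illustrative sketch; the project, dataset, and table names are placeholders:

    import com.google.api.services.bigquery.model.TableReference;
    import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;

    TableDestination destination =
        new TableDestination("my-project:my_dataset.events", "daily events table");
    String spec = destination.getTableSpec();                    // "my-project:my_dataset.events"
    TableReference reference = destination.getTableReference();  // same destination as a TableReference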
getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
 
getTableStatistics(PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Estimates the number of rows or the rate for unbounded Tables.
getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
 
getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
 
getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
 
getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
getTableStringIdentifier(ValueInSingleWindow<Row>) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
 
getTableType() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
Gets the table type this provider handles.
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
 
getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
getTableUrn(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
Return the tablespec in projects/[project]/datasets/[dataset]/tables/[table] format.
getTag() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getTag(int) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the tuple tag at the given index.
getTag() - Method in class org.apache.beam.sdk.values.TaggedPValue
Returns the local tag associated with the PValue.
getTagInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The PCollection underlying a side input is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
getTagInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
Returns a unique TupleTag identifying this PCollectionView.
getTargetDataset() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
getTargetParallelism() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
getTargetTable(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
getTargetTableId(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getTempDirectory() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
getTempDirectoryProvider() - Method in class org.apache.beam.sdk.io.FileBasedSink
Returns the directory inside which temporary files will be written according to the configured FileBasedSink.FilenamePolicy.
getTempFilename() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getTemplateLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Where the runner should generate a template file.
getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Returns the configured temporary location.
getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
Returns the configured temporary location.
getTempLocation() - Method in interface org.apache.beam.sdk.options.PipelineOptions
A pipeline level default location for storing temporary files.
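A minimal sketch of setting and reading this option programmatically; the bucket name is a placeholder:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    PipelineOptions options = PipelineOptionsFactory.create();
    options.setTempLocation("gs://my-bucket/temp");
    // Runners and file-based sinks fall back to this location for temporary files.
    String tempLocation = options.getTempLocation();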
getTempRoot() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
GETTER_WITH_NULL_METHOD_ERROR - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
GetterBasedSchemaProvider - Class in org.apache.beam.sdk.schemas
Deprecated.
new implementations should extend the GetterBasedSchemaProviderV2 class, whose methods receive TypeDescriptors instead of ordinary Classes as arguments and therefore support generic type signatures during schema inference
GetterBasedSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
 
GetterBasedSchemaProviderBenchmark - Class in org.apache.beam.sdk.jmh.schemas
Benchmarks for GetterBasedSchemaProvider on reading / writing fields based on toRowFunction / fromRowFunction.
GetterBasedSchemaProviderBenchmark() - Constructor for class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
GetterBasedSchemaProviderV2 - Class in org.apache.beam.sdk.schemas
A newer version of GetterBasedSchemaProvider, which works with TypeDescriptors and by default delegates the old Class-based methods to the new ones.
GetterBasedSchemaProviderV2() - Constructor for class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
 
getTerminateAfterSecondsSinceNewOutput() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
If no new files are found after this many seconds, this transform will cease to watch for new files.
GetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
 
getTestMode() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
Set to true to run the job in test mode.
getTestTimeoutSeconds() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
getThrottleDuration() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The amount of time an attempt will be throttled if deemed necessary based on previous success rate.
getThroughputEstimate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getTimeDomain() - Method in interface org.apache.beam.sdk.state.TimerSpec
 
getTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
 
getTimerReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
Get a map of (transform id, timer id) to receivers which consume timers, forwarding them to the remote environment.
getTimerReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
 
getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getTimers() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
 
getTimerSpec() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
getTimerSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
Get a mapping from PTransform id to timer id to timer specs that are used during execution.
getTimes() - Method in class org.apache.beam.runners.spark.io.CreateStream
Get times so they can be pushed into the GlobalWatermarkHolder.
getTimestamp() - Method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
 
getTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
Timestamp the message was sent at (in epoch millis).
getTimestamp() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
 
getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
Indicates the timestamp for which the change stream query has returned all changes.
getTimestamp() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
Returns timestamp for element being published to Kafka.
getTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getTimestamp() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
 
getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult
 
getTimestamp() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
Returns the timestamp of this FailsafeValueInSingleWindow.
getTimestamp() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
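A short sketch of pairing a value with an explicit event timestamp and reading it back; the value and instant are arbitrary:

    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Instant;

    TimestampedValue<String> element =
        TimestampedValue.of("click", Instant.parse("2024-01-01T00:00:00Z"));
    Instant timestamp = element.getTimestamp(); // 2024-01-01T00:00:00.000Z
    String value = element.getValue();          // "click"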
getTimestamp() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the timestamp of this ValueInSingleWindow.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the timestamp attribute.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the timestamp attribute.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getTimestampCombiner() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
Return the TimestampCombiner which will be used to determine a watermark hold time given an element timestamp, and to combine watermarks from windows which are about to be merged.
getTimestampCombiner() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getTimestampFn() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
 
getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
Returns record timestamp (aka event time).
getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
 
getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
 
getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
 
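A hedged sketch of how a timestamp policy is usually supplied to KafkaIO so that getTimestampForRecord drives event time; the broker, topic, and timestamp function are assumptions for illustration:

    import org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    KafkaIO.Read<String, String> read =
        KafkaIO.<String, String>read()
            .withBootstrapServers("broker:9092")
            .withTopic("events")
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withTimestampPolicyFactory(
                (topicPartition, previousWatermark) ->
                    new CustomTimestampPolicyWithLimitedDelay<>(
                        record -> new Instant(record.getTimestamp()),
                        Duration.standardMinutes(1),
                        previousWatermark));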
getTimestampMillis() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
getTimestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
Timestamp for element (ms since epoch).
getTimestampPolicyFactory() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getTimestampTransforms() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
The transforms applied to the arrival time of an element to determine when this trigger allows output.
getTimestampType() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getTimeToLive() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record
The number of milliseconds before the message is discarded or moved to the Dead Message Queue.
getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getTiming() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return the timing of this pane.
getTo() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
Returns the range end timestamp (exclusive).
getTo() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
getToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
Unique partition identifier, which can be used to perform a change stream query.
getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
Deprecated.
getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
Deprecated.
getTokenWithCorrectPartition(Range.ByteStringRange, ChangeStreamContinuationToken) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
Return the continuation token with the correct partition.
getToKV() - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the topic being written to.
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the topic being read from.
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
The topic from which to read.
getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
getTopic() - Method in class org.apache.beam.sdk.io.mqtt.MqttRecord
 
getTopic() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
 
getTopicName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getTopicPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
 
getTopicPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getTopicPattern() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the ValueProvider for the topic being written to.
getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the ValueProvider for the topic being read from.
getTopics() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
getTopics() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getToRowFunction(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
 
getToRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
 
getToRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
Returns the toRow conversion function.
getToRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve the function that converts an object of the specified type to a Row object.
getToRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Retrieve the function that converts an object of the specified type to a Row object.
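For illustration, a sketch assuming a hypothetical schema-inferable POJO named Purchase; the call throws NoSuchSchemaException when no schema is registered for the type:

    import org.apache.beam.sdk.schemas.SchemaRegistry;
    import org.apache.beam.sdk.transforms.SerializableFunction;
    import org.apache.beam.sdk.values.Row;

    SchemaRegistry registry = SchemaRegistry.createDefault();
    SerializableFunction<Purchase, Row> toRow = registry.getToRowFunction(Purchase.class);
    Row row = toRow.apply(new Purchase("user-1", 42)); // Purchase is hypothetical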
getToRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
Returns the attached schema's toRowFunction.
getToSnapshot() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getToSnapshotRef() - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
getTotalBacklogBytes() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
Returns the size of the backlog of unread data in the underlying data source represented by all splits of this source.
getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
 
getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns the size of the backlog of unread data in the underlying data source represented by all splits of this source.
getTotalStreamDuration() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the total stream duration of change stream records so far.
getTotalStreamTimeMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The total streaming time (in millis) for this record.
getTraitDef() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
getTransactionIsolation() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getTransactionTag() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The transaction tag associated with the given transaction.
getTransform(RunnerApi.FunctionSpec, PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
 
getTransform(RunnerApi.FunctionSpec, PipelineOptions) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
 
getTransformId() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
 
getTransformId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
getTransformId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
 
getTransformingMap(Map<K1, V1>, Function<K1, K2>, Function<V1, V2>) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
 
getTransformNameMapping() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}.
getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.DataflowTransformTranslator
 
getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.ReadRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.WriteRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.ReadWriteRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.ReadRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.WriteRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.ReadRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.WriteRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.ReadRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.WriteRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformTranslation.ManagedTransformRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.transforms.Redistribute.Registrar
 
getTransformStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
getTransformTranslator(Class<TransformT>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Returns the TransformTranslator to use for instances of the specified PTransform class, or null if none registered.
getTransformTranslator(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.PipelineTranslatorBatch
Returns a TransformTranslator for the given PTransform if known.
getTransformTranslator(TransformT) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
Returns a TransformTranslator for the given PTransform if known.
getTransformUniqueID(RunnerApi.FunctionSpec) - Method in interface org.apache.beam.sdk.expansion.service.TransformProvider
 
getTranslator() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
Returns the DataflowPipelineTranslator associated with this object.
getTransport() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
 
getTrigger() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getTriggeringFrequencySeconds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getTriggeringFrequencySeconds() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration
 
getTruncatedRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
 
getTruncateTimestamps() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
Whether to truncate timestamps in tables described by Data Catalog.
getTruncateTimestamps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
 
getTSetEnvironment() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getTSetGraph() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
 
getTupleTag() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
Returns the TupleTag of this TaggedKeyedPCollection.
getTupleTagId(PValue) - Static method in class org.apache.beam.runners.jet.Utils
 
getTupleTagList() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the TupleTagList tuple associated with this schema.
getTwister2Home() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getType() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns the type this coder encodes/decodes.
getType() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
Returns the type for the datum factory.
getType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
 
getType(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getType() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
 
getType() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
type of the table.
getType() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
The type of the column.
getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
 
getType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getType() - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination
Gets the type of the destination (TOPIC, QUEUE or UNKNOWN).
getType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
Returns the field type.
getType() - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns the field's Schema.FieldType.
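A short sketch of reading a field's type from a schema built with the fluent builder:

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema = Schema.builder().addStringField("name").addInt64Field("count").build();
    Schema.FieldType type = schema.getField("count").getType();
    // type.getTypeName() is Schema.TypeName.INT64 for this field.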
getType(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
Get the type of an option.
getType() - Method in interface org.apache.beam.sdk.testing.TestStream.Event
 
getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the DisplayData.Type of display data.
getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The DisplayData.Type of display data.
getType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the Type represented by this TypeDescriptor.
getTypeDescriptor() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.ViewFn
Return the TypeDescriptor describing the output of this fn.
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollection
Returns a TypeDescriptor<T> with some reflective information about T, if possible.
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
Deprecated.
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
Deprecated.
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
Deprecated.
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
 
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.TupleTag
Returns a TypeDescriptor capturing what is known statically about the type of this TupleTag instance's most-derived class.
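A brief sketch of why TupleTags are usually instantiated as anonymous subclasses; the element type here is arbitrary:

    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TupleTag;
    import org.apache.beam.sdk.values.TypeDescriptor;

    // The trailing braces create an anonymous subclass, so the element type survives erasure.
    TupleTag<KV<String, Long>> countsTag = new TupleTag<KV<String, Long>>() {};
    TypeDescriptor<?> descriptor = countsTag.getTypeDescriptor(); // describes KV<String, Long>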
getTypeFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getTypeMap() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getTypeName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
getTypeName() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
getTypeNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
getTypeParameter(String) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeVariable for the named type parameter.
getTypePayload() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
 
getTypes() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a set of TypeDescriptor, one for each superclass as well as each interface implemented by this class.
getTypeUrn() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
 
getUdaf(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
 
getUdafImpl() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
getUdafs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
 
getUnalignedCheckpointEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getUnboundedReaderMaxElements() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The max elements read from an UnboundedReader before checkpointing.
getUnboundedReaderMaxReadTimeMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The max amount of time an UnboundedReader is consumed before checkpointing.
getUnboundedReaderMaxReadTimeSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
getUnboundedReaderMaxWaitForElementsMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The max amount of time waiting for elements when reading from UnboundedReader.
getUnderlyingDoFn() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
getUnderlyingSchemaProvider(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
Retrieves the underlying SchemaProvider for the given TypeDescriptor.
getUnderlyingSchemaProvider(Class<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
Retrieves the underlying SchemaProvider for the given Class.
getUnfinishedEndpoints() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
Get all unfinished data and timers endpoints represented as [transform_id]:data and [transform_id]:timers:[timer_family_id].
getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Fetches the earliest partition watermark from the partition metadata table that is not in a PartitionMetadata.State.FINISHED state.
getUnionCoder() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
getUnionTag() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
getUniqueId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
getUniqueId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getUnknownFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getUnknownFieldsPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getUntilTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
The trigger that signals termination of this trigger.
getUpdateCompatibilityVersion() - Method in interface org.apache.beam.sdk.options.StreamingOptions
 
getUpdatedSchema() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
If the table schema has been updated, returns the new schema.
getUpdatedSchema(TableSchema, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaUpdateUtils
 
getUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
If non-null, the upload buffer size to be used.
getUrl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getUrl() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getUrn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
 
getUrn() - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
 
getUrn() - Method in interface org.apache.beam.sdk.transforms.Materialization
Gets the URN describing this Materialization.
getUseActiveSparkSession() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
 
getUseAltsServer() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
getUseAtLeastOnceSemantics() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getUseCdcWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getUseDataStreamForBatch() - Method in interface org.apache.beam.runners.flink.VersionDependentFlinkPipelineOptions
 
getUsePublicIps() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Specifies whether worker pools should be started with public IP addresses.
getUserAgent() - Method in interface org.apache.beam.sdk.options.PipelineOptions
A user agent string as per RFC2616, describing the pipeline to external services.
getUserId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
getUsername() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration
 
getUsername() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
getUsername() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
getUsername() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getUsername() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getUsername() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getUsername() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getUseSeparateWindmillHeartbeatStreams() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getUsesProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getUseStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Enables BigQuery's Standard SQL dialect when reading from a query.
getUseStorageApiConnectionPool() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getUseStorageWriteApi() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getUseStorageWriteApiAtLeastOnce() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getUseTransformService() - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
 
getUseWindmillIsolatedChannels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getUsingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getUuid() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getUUID() - Method in class org.apache.beam.sdk.schemas.Schema
Get this schema's UUID.
getUuidFromMessage(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
getValidationFailures() - Method in class org.apache.beam.sdk.io.cdap.context.FailureCollectorWrapper
 
getValue() - Method in class org.apache.beam.runners.spark.util.ByteArray
 
getValue() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
getValue() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
getValue() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
 
getValue() - Method in class org.apache.beam.sdk.io.range.ByteKey
Returns a read-only ByteBuffer representing this ByteKey.
getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
 
getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult
 
getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
Return the integer enum value.
getValue(Class<T>) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
Returns the current value of the OneOf as the destination type.
getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
Returns the current value of the OneOf.
getValue(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
Get the value of an option.
getValue(String, Class<T>) - Method in class org.apache.beam.sdk.schemas.Schema.Options
Get the value of an option.
getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the value of the display item.
getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The value of the display item.
getValue() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
getValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
Returns the value of this FailsafeValueInSingleWindow.
getValue() - Method in class org.apache.beam.sdk.values.KV
Returns the value of this KV.
getValue(int) - Method in class org.apache.beam.sdk.values.Row
Get value by field index; a ClassCastException is thrown if the schema doesn't match.
getValue(String) - Method in class org.apache.beam.sdk.values.Row
Get value by field name; a ClassCastException is thrown if the type doesn't match.
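A minimal sketch of both lookups against a hypothetical two-field schema:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();
    Row row = Row.withSchema(schema).addValues("Ada", 36).build();
    String name = row.getValue("name"); // lookup by field name
    Integer age = row.getValue(1);      // lookup by field index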
getValue(int) - Method in class org.apache.beam.sdk.values.RowWithGetters
 
getValue(int) - Method in class org.apache.beam.sdk.values.RowWithStorage
 
getValue() - Method in class org.apache.beam.sdk.values.TaggedPValue
Returns the PCollection.
getValue() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
getValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the value of this ValueInSingleWindow.
getValue() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
getValueCaptureType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The capture type of the change stream that generated this record.
getValueClass() - Method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
 
getValueCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getValueCoder() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
Gets the value coder that will be prefixed by the length.
getValueCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
 
getValueCoder() - Method in class org.apache.beam.sdk.coders.NullableCoder
Returns the inner Coder wrapped by this NullableCoder instance.
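For illustration, a short round trip through the wrapper:

    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.NullableCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    NullableCoder<String> coder = NullableCoder.of(StringUtf8Coder.of());
    Coder<String> inner = coder.getValueCoder(); // the wrapped StringUtf8Coder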
getValueCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
getValueCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getValueCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getValueCoder() - Method in class org.apache.beam.sdk.testing.TestStream
 
getValueCoder() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getValueCoder() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
getValueDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getValueOrDefault(String, T) - Method in class org.apache.beam.sdk.schemas.Schema.Options
Get the value of an option, or the supplied default if the option is not set.
getValues() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
getValues() - Method in class org.apache.beam.sdk.values.Row
Return the list of raw unmodified data values to enable 0-copy code.
getValues() - Method in class org.apache.beam.sdk.values.RowWithGetters
Return the list of raw unmodified data values to enable 0-copy code.
getValues() - Method in class org.apache.beam.sdk.values.RowWithStorage
 
getValueSerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
getValuesMap() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
getValueTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getValueTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
getVerifyRowValues() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
 
getView() - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
getView() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
getView() - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
Deprecated.
This should not be used to obtain the output of any given application of this PTransform. That should be obtained by inspecting the TransformHierarchy.Node that contains this View.CreatePCollectionView, as this view may have been replaced within pipeline surgery.
getViewFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The ViewFn for a side input is an attribute of the side input's specification with a ParDo transform, which will obtain this specification via a package-private channel.
getViewFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
getWarehouse() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
getWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
getWatchInterval() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
getWatchTopicPartitionDuration() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getWatermark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getWatermark() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time up to which all records with a smaller timestamp have been processed.
getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
 
getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
Returns watermark for the partition.
getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
 
getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
 
getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
 
getWatermark() - Method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
 
getWatermark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns a timestamp before or at the timestamps of all future elements read by this reader.
getWatermark() - Method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
 
getWatermarkAndState() - Method in interface org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators.WatermarkAndStateObserver
 
getWatermarkFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
getWatermarkIdleDurationThreshold() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
getWatermarkIndexName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
 
getWatermarkLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
getWatermarkMillis() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
For internal use only; no backwards-compatibility guarantees.
getWeigher(Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
 
getWeight() - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
getWindmillGetDataStreamCount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillHarnessUpdateReportingPeriod() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillMessagesBetweenIsReadyChecks() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillServiceCommitThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillServiceEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
Custom windmill service endpoint.
getWindmillServicePort() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillServiceRpcChannelAliveTimeoutSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillServiceStreamingLogEveryNStreamFailures() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillServiceStreamingRpcBatchLimit() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillServiceStreamingRpcHealthCheckPeriodMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindmillServiceStreamMaxBackoffMillis() - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
getWindow() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
This method returns the number of tuples in each window.
getWindow() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getWindow() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
Returns the window of this FailsafeValueInSingleWindow.
getWindow() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the window of this ValueInSingleWindow.
getWindowCoder() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
getWindowedValueCoder(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
 
getWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
getWindowFn() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getWindowingStrategy(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
 
getWindowingStrategy() - Method in class org.apache.beam.sdk.values.PCollection
Returns the WindowingStrategy of this PCollection.
getWindowingStrategyInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The PCollection underlying a side input, including its WindowingStrategy, is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
getWindowingStrategyInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
Returns the WindowingStrategy of this PCollectionView, which should be that of the underlying PCollection.
getWindowMappingFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
For internal use only.
getWindowMappingFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns a TypeDescriptor capturing what is known statically about the window type of this WindowFn instance's most-derived class.
getWithAutoSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getWithPartitions() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
 
getWorkCompleted() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
The known amount of completed work.
getWorkerCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The size of the worker's in-memory cache, in megabytes.
getWorkerCPUs() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
getWorkerDiskType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Specifies what type of persistent disk is used.
getWorkerHarnessContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
getWorkerId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
The identity of the worker running this pipeline.
getWorkerId() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
getWorkerLogLevelOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
This option controls the log levels for specifically named loggers.
getWorkerMachineType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Machine type to create Dataflow worker VMs as.
getWorkerPool() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
The identity of the worker pool of this worker.
getWorkerRegion() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g.
getWorkerSystemErrMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
Controls the log level given to messages printed to System.err.
getWorkerSystemOutMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
Controls the log level given to messages printed to System.out.
getWorkerZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g.
getWorkRemaining() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
The known amount of work remaining.
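The getWorkCompleted and getWorkRemaining entries above are the two halves of a splittable DoFn progress report. A minimal, hypothetical fragment (not a complete program; the 40/60 split is illustrative, and it assumes the RestrictionTracker.Progress.from(double, double) factory used by the built-in trackers):

    import org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker;

    RestrictionTracker.Progress progress = RestrictionTracker.Progress.from(40.0, 60.0);
    double completed = progress.getWorkCompleted();  // 40.0
    double remaining = progress.getWorkRemaining();  // 60.0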
getWritableByteChannelFactory() - Method in class org.apache.beam.sdk.io.FileBasedSink
getWrite() - Method in class org.apache.beam.io.requestresponse.Cache.Pair
 
getWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
getWriteCounterPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
 
getWriteDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
 
getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Gets the write disposition, which determines how data is written to the table; see WriteDisposition.
getWriteFailures() - Method in exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
The list of FirestoreV1.WriteFailures detailing which writes failed and for what reason.
getWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Return the WriteOperation that this Writer belongs to.
getWriteRecordsTransform() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
getWriteResult() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
getWriteStatement() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
getWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
 
getWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
getWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getWriteStreamService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.WriteStreamService.
getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
getXmlConfiguration() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
getZetaSqlDefaultTimezone() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
 
getZetaSqlRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
getZetaSqlRuleSets(Collection<RelOptRule>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
getZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
Deprecated.
global(Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
Build a global TimerInternals for all feeding streams.
Global() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.Global
 
GLOBAL_SEQUENCE_TRACKER - Static variable in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
 
GlobalConfigRefreshPeriodFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.GlobalConfigRefreshPeriodFactory
 
globalDefault() - Static method in class org.apache.beam.sdk.values.WindowingStrategy
Return a fully specified, default windowing strategy.
GlobalDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
 
globally() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
Computes the approximate number of distinct elements in the input PCollection<InputT> and returns a PCollection<Long>.
globally() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
Create the PTransform that will build a Count-min sketch for keeping track of the frequency of the elements in the whole stream.
globally() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
Compute the stream in order to build a T-Digest structure (MergingDigest) for keeping track of the stream distribution and returns a PCollection<MergingDigest>.
globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
 
Globally() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Extract
Returns a PTransform that takes an input PCollection<byte[]> of HLL++ sketches and returns a PCollection<Long> of the estimated count of distinct elements extracted from each sketch.
globally() - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
Returns a Combine.Globally PTransform that takes an input PCollection<InputT> and returns a PCollection<byte[]> which consists of the HLL++ sketch computed from the elements in the input PCollection.
globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.MergePartial
Returns a Combine.Globally PTransform that takes an input PCollection<byte[]> of HLL++ sketches and returns a PCollection<byte[]> of a new sketch merged from the input sketches.
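The HllCount.Init, HllCount.MergePartial, and HllCount.Extract entries above form a three-stage workflow: build sketches, optionally merge them, then extract the distinct-count estimate. A minimal sketch as a complete program; the class name and input values are illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.zetasketch.HllCount;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    public class HllCountExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<Integer> ids = p.apply(Create.of(1, 2, 2, 3));
        // Build one HLL++ sketch per window from the input integers.
        PCollection<byte[]> sketches = ids.apply(HllCount.Init.forIntegers().globally());
        // Extract the estimated number of distinct elements from each sketch.
        PCollection<Long> distinct = sketches.apply(HllCount.Extract.globally());
        p.run().waitUntilFinish();
      }
    }

Later fragments in this section reuse the ids collection from this sketch rather than repeating the pipeline boilerplate.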
globally() - Static method in class org.apache.beam.sdk.schemas.transforms.Group
Returns a transform that groups all elements in the input PCollection.
globally(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Returns a PTransform that takes a PCollection<T> and returns a PCollection<List<T>> whose single value is a List of the approximate N-tiles of the elements of the input PCollection.
globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Like ApproximateQuantiles.globally(int, Comparator), but sorts using the elements' natural ordering.
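The ApproximateQuantiles.globally entries above compute approximate N-tiles of a PCollection. A minimal fragment, reusing the ids collection from the HllCount sketch above; the choice of 5 quantiles is illustrative.

    import java.util.List;
    import org.apache.beam.sdk.transforms.ApproximateQuantiles;

    // Five N-tile boundaries (min, quartiles, max) of the elements, as a single List<Integer>.
    PCollection<List<Integer>> quartiles =
        ids.apply(ApproximateQuantiles.<Integer>globally(5));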
globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Deprecated.
Returns a PTransform that takes a PCollection<T> and returns a PCollection<Long> containing a single value that is an estimate of the number of distinct elements in the input PCollection.
globally(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Deprecated.
Like ApproximateUnique.globally(int), but specifies the desired maximum estimation error instead of the sample size.
Globally(int) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
Deprecated.
 
Globally(double) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
Deprecated.
 
globally(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.Globally PTransform that uses the given SerializableFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
globally(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.Globally PTransform that uses the given SerializableBiFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
globally(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.Globally PTransform that uses the given GloballyCombineFn to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
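The three Combine.globally entries above accept, respectively, a SerializableFunction over an Iterable, a SerializableBiFunction over pairs of values, and a full GlobalCombineFn. A minimal fragment using the SerializableBiFunction form, reusing ids from the HllCount sketch above:

    import org.apache.beam.sdk.transforms.Combine;

    // Combine all elements in each window into a single value, here their sum.
    PCollection<Integer> sum = ids.apply(Combine.globally((Integer a, Integer b) -> a + b));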
globally() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a PTransform that counts the number of elements in its input PCollection.
globally() - Static method in class org.apache.beam.sdk.transforms.Latest
Returns a PTransform that takes as input a PCollection<T> and returns a PCollection<T> whose contents is the latest element according to its event time, or null if there are no elements.
globally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum according to the natural ordering of T of the input PCollection's elements, or null if there are no elements.
globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum of the input PCollection's elements, or null if there are no elements.
globally() - Static method in class org.apache.beam.sdk.transforms.Mean
Returns a PTransform that takes an input PCollection<NumT> and returns a PCollection<Double> whose contents is the mean of the input PCollection's elements, or 0 if there are no elements.
globally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum according to the natural ordering of T of the input PCollection's elements, or null if there are no elements.
globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum of the input PCollection's elements, or null if there are no elements.
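The Max.globally, Mean.globally, and Min.globally entries above are convenience aggregations over a PCollection's natural ordering (or a supplied Comparator). A minimal fragment, again reusing ids from the HllCount sketch:

    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Mean;
    import org.apache.beam.sdk.transforms.Min;

    PCollection<Integer> max = ids.apply(Max.<Integer>globally());
    PCollection<Integer> min = ids.apply(Min.<Integer>globally());
    PCollection<Double> mean = ids.apply(Mean.<Integer>globally());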
GloballyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
 
GlobalSketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
 
GlobalWatermarkHolder - Class in org.apache.beam.runners.spark.util
A store to hold the global watermarks for a micro-batch.
GlobalWatermarkHolder() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
GlobalWatermarkHolder.SparkWatermarks - Class in org.apache.beam.runners.spark.util
A GlobalWatermarkHolder.SparkWatermarks holds the watermarks and batch time relevant to a micro-batch input from a specific source.
GlobalWatermarkHolder.WatermarkAdvancingStreamingListener - Class in org.apache.beam.runners.spark.util
Advances the watermarks on the onBatchCompleted event.
GlobalWindow - Class in org.apache.beam.sdk.transforms.windowing
The default window into which all data is placed (via GlobalWindows).
GlobalWindow.Coder - Class in org.apache.beam.sdk.transforms.windowing
GlobalWindow.Coder for encoding and decoding GlobalWindows.
GlobalWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that assigns all data to the same window.
GlobalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
GoogleAdsClientFactory - Interface in org.apache.beam.sdk.io.googleads
Defines how to construct a GoogleAdsClient.
GoogleAdsCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsOptions.GoogleAdsCredentialsFactory
 
GoogleAdsIO - Class in org.apache.beam.sdk.io.googleads
GoogleAdsIO provides an API for reading from the Google Ads API over supported versions of the Google Ads client libraries.
GoogleAdsOptions - Interface in org.apache.beam.sdk.io.googleads
Options used to configure Google Ads API specific options.
GoogleAdsOptions.GoogleAdsCredentialsFactory - Class in org.apache.beam.sdk.io.googleads
Attempts to load the Google Ads credentials.
GoogleAdsUserCredentialFactory - Class in org.apache.beam.sdk.io.googleads
Constructs and returns Credentials to be used by Google Ads API calls.
GoogleAdsV17 - Class in org.apache.beam.sdk.io.googleads
GoogleAdsV17 provides an API to read Google Ads API v17 reports.
GoogleAdsV17.RateLimitPolicy - Interface in org.apache.beam.sdk.io.googleads
This interface can be used to implement custom client-side rate limiting policies.
GoogleAdsV17.RateLimitPolicyFactory - Interface in org.apache.beam.sdk.io.googleads
Implement this interface to create a GoogleAdsV17.RateLimitPolicy.
GoogleAdsV17.Read - Class in org.apache.beam.sdk.io.googleads
A PTransform that reads the results of a Google Ads query as GoogleAdsRow objects.
GoogleAdsV17.ReadAll - Class in org.apache.beam.sdk.io.googleads
A PTransform that reads the results of many SearchGoogleAdsStreamRequest objects as GoogleAdsRow objects.
GoogleAdsV17.SimpleRateLimitPolicy - Class in org.apache.beam.sdk.io.googleads
This rate limit policy wraps a RateLimiter and can be used in low volume and development use cases as a client-side rate limiting policy.
GoogleApiDebugOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
These options configure debug settings for Google API clients created within the Apache Beam SDK.
GoogleApiDebugOptions.GoogleApiTracer - Class in org.apache.beam.sdk.extensions.gcp.options
A GoogleClientRequestInitializer that adds the trace destination to Google API calls.
GoogleApiTracer() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
GraphiteSink - Class in org.apache.beam.runners.spark.metrics.sink
A Sink for Spark's metric system reporting metrics (including Beam step metrics) to Graphite.
GraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
Constructor for Spark 3.1.x and earlier.
GraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
Constructor for Spark 3.2.x and later.
greaterThan(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
greaterThan(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
greaterThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than a given value, based on the elements' natural ordering.
greaterThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than or equal to a given value, based on the elements' natural ordering.
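The Filter.greaterThan and Filter.greaterThanEq entries above filter by the elements' natural ordering. A minimal fragment, reusing ids from the HllCount sketch; the threshold of 2 is illustrative.

    import org.apache.beam.sdk.transforms.Filter;

    // Keep only elements strictly greater than 2.
    PCollection<Integer> over2 = ids.apply(Filter.greaterThan(2));
    // Keep elements greater than or equal to 2.
    PCollection<Integer> atLeast2 = ids.apply(Filter.greaterThanEq(2));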
greaterThanOrEqualTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
greaterThanOrEqualTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
Group - Class in org.apache.beam.sdk.schemas.transforms
A generic grouping transform for schema PCollections.
Group() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group
 
Group.AggregateCombiner<InputT> - Class in org.apache.beam.sdk.schemas.transforms
A PTransform that does a combine using an aggregation built up by calls to aggregateField and aggregateFields.
Group.ByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
A PTransform that groups schema elements based on the given fields.
Group.CombineFieldsByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
A PTransform that does a per-key combine using an aggregation built up by calls to aggregateField and aggregateFields.
Group.CombineFieldsByFields.Fanout - Class in org.apache.beam.sdk.schemas.transforms
 
Group.CombineFieldsByFields.Fanout.Kind - Enum in org.apache.beam.sdk.schemas.transforms
 
Group.CombineFieldsGlobally<InputT> - Class in org.apache.beam.sdk.schemas.transforms
A PTransform that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields.
Group.CombineGlobally<InputT,OutputT> - Class in org.apache.beam.sdk.schemas.transforms
A PTransform that does a global combine using a provided Combine.CombineFn.
Group.Global<InputT> - Class in org.apache.beam.sdk.schemas.transforms
A PTransform for doing global aggregations on schema PCollections.
GroupAlsoByWindowViaOutputBufferFn<K,InputT,W extends BoundedWindow> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
A FlatMap function that groups by windows in batch mode using ReduceFnRunner.
GroupAlsoByWindowViaOutputBufferFn(WindowingStrategy<?, W>, StateInternalsFactory<K>, SystemReduceFn<K, InputT, Iterable<InputT>, Iterable<InputT>, W>, Supplier<PipelineOptions>) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
 
GroupByKey<K,V> - Class in org.apache.beam.sdk.transforms
GroupByKey<K, V> takes a PCollection<KV<K, V>>, groups the values by key and windows, and returns a PCollection<KV<K, Iterable<V>>> representing a map from each distinct key and window of the input PCollection to an Iterable over all the values associated with that key in the input per window.
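The GroupByKey entry above describes the core shuffle primitive: values are collected per key and window. A minimal sketch as a complete program; the class name and elements are illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class GroupByKeyExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<KV<String, Integer>> pairs =
            p.apply(Create.of(KV.of("a", 1), KV.of("b", 2), KV.of("a", 3)));
        // Group values by key and window: ("a", [1, 3]) and ("b", [2]).
        PCollection<KV<String, Iterable<Integer>>> grouped =
            pairs.apply(GroupByKey.<String, Integer>create());
        p.run().waitUntilFinish();
      }
    }

The fragments after the Combine.groupedValues and GroupIntoBatches entries below reuse pairs and grouped from this sketch.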
groupByKeyAndWindow(JavaDStream<WindowedValue<KV<K, InputT>>>, Coder<K>, Coder<WindowedValue<InputT>>, WindowingStrategy<?, W>, SerializablePipelineOptions, List<Integer>, String) - Static method in class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
 
GroupByKeyTranslatorBatch<K,V> - Class in org.apache.beam.runners.twister2.translators.batch
GroupByKey translator.
GroupByKeyTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.GroupByKeyTranslatorBatch
 
GroupByWindowFunction<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.twister2.translators.functions
GroupBy window function.
GroupByWindowFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
 
GroupByWindowFunction(WindowingStrategy<?, W>, SystemReduceFn<K, V, Iterable<V>, Iterable<V>, W>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
 
grouped() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Same transform but can be applied to PCollection of MutationGroup.
groupedValues(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given SerializableFunction to combine all the values associated with a key, ignoring the key.
groupedValues(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given SerializableBiFunction to combine all the values associated with a key, ignoring the key.
groupedValues(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given CombineFn to combine all the values associated with a key, ignoring the key.
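The Combine.groupedValues entries above consume the output of a GroupByKey and collapse each key's Iterable of values into a single value, ignoring the key. A minimal fragment using the GlobalCombineFn form with Sum.ofIntegers, reusing grouped from the GroupByKey sketch above:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;

    // Collapse each key's values into their sum: ("a", 4) and ("b", 2).
    PCollection<KV<String, Integer>> sums =
        grouped.apply(Combine.<String, Integer, Integer>groupedValues(Sum.ofIntegers()));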
GroupingState<InputT,OutputT> - Interface in org.apache.beam.sdk.state
A ReadableState cell that combines multiple input values and outputs a single value of a different type.
GroupIntoBatches<K,InputT> - Class in org.apache.beam.sdk.transforms
A PTransform that batches inputs to a desired batch size.
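The GroupIntoBatches entry above batches the values of a keyed PCollection up to a desired batch size. A minimal fragment, reusing pairs from the GroupByKey sketch above; the batch size of 2 is illustrative.

    import org.apache.beam.sdk.transforms.GroupIntoBatches;
    import org.apache.beam.sdk.values.KV;

    // Emit per-key batches of at most 2 values each.
    PCollection<KV<String, Iterable<Integer>>> batches =
        pairs.apply(GroupIntoBatches.<String, Integer>ofSize(2));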
GroupIntoBatches.BatchingParams<InputT> - Class in org.apache.beam.sdk.transforms
Wrapper class for batching parameters supplied by users.
GroupIntoBatches.WithShardedKey - Class in org.apache.beam.sdk.transforms
 
GroupIntoBatchesOverride - Class in org.apache.beam.runners.dataflow
 
GroupIntoBatchesOverride() - Constructor for class org.apache.beam.runners.dataflow.GroupIntoBatchesOverride
 
GrowableOffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
An OffsetRangeTracker for tracking a growable offset range.
GrowableOffsetRangeTracker(long, GrowableOffsetRangeTracker.RangeEndEstimator) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
 
GrowableOffsetRangeTracker.RangeEndEstimator - Interface in org.apache.beam.sdk.transforms.splittabledofn
Provides the estimated end offset of the range.
Growth() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth
 
growthOf(Watch.Growth.PollFn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Watch
Watches the growth of the given poll function.
growthOf(Watch.Growth.PollFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch
Watches the growth of the given poll function.
growthOf(Contextful<Watch.Growth.PollFn<InputT, OutputT>>, SerializableFunction<OutputT, KeyT>) - Static method in class org.apache.beam.sdk.transforms.Watch
Watches the growth of the given poll function, using the given "key function" to deduplicate outputs.
GrpcContextHeaderAccessorProvider - Class in org.apache.beam.sdk.fn.server
A HeaderAccessorProvider which intercepts the header in a gRPC request and exposes the relevant fields.
GrpcContextHeaderAccessorProvider() - Constructor for class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
 
GrpcDataService - Class in org.apache.beam.runners.fnexecution.data
A FnDataService implemented via gRPC.
GrpcDataService() - Constructor for class org.apache.beam.runners.fnexecution.data.GrpcDataService
Deprecated.
This constructor is intended only for Dataflow migration purposes.
GrpcFnServer<ServiceT extends FnService> - Class in org.apache.beam.sdk.fn.server
A gRPC Server which manages a single FnService.
GrpcLoggingService - Class in org.apache.beam.runners.fnexecution.logging
An implementation of the Beam Fn Logging Service over gRPC.
GrpcStateService - Class in org.apache.beam.runners.fnexecution.state
An implementation of the Beam Fn State service.
guessExpressionType(String, Map<String, Type>) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
 

H

hadoopConfiguration - Variable in class org.apache.beam.sdk.io.cdap.Plugin
 
HadoopFileSystemModule - Class in org.apache.beam.sdk.io.hdfs
A Jackson Module that registers a JsonSerializer and JsonDeserializer for a Hadoop Configuration.
HadoopFileSystemModule() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemModule
 
HadoopFileSystemOptions - Interface in org.apache.beam.sdk.io.hdfs
PipelineOptions which encapsulate Hadoop Configuration for the HadoopFileSystem.
HadoopFileSystemOptions.ConfigurationLocator - Class in org.apache.beam.sdk.io.hdfs
A DefaultValueFactory which locates a Hadoop Configuration.
HadoopFileSystemOptionsRegistrar - Class in org.apache.beam.sdk.io.hdfs
AutoService registrar for HadoopFileSystemOptions.
HadoopFileSystemOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
 
HadoopFileSystemRegistrar - Class in org.apache.beam.sdk.io.hdfs
AutoService registrar for the HadoopFileSystem.
HadoopFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
 
HadoopFormatIO - Class in org.apache.beam.sdk.io.hadoop.format
A HadoopFormatIO is a Transform for reading data from any source or writing data to any sink which implements Hadoop InputFormat or OutputFormat.
HadoopFormatIO() - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
 
HadoopFormatIO.HadoopInputFormatBoundedSource<K,V> - Class in org.apache.beam.sdk.io.hadoop.format
Bounded source implementation for HadoopFormatIO.
HadoopFormatIO.Read<K,V> - Class in org.apache.beam.sdk.io.hadoop.format
A PTransform that reads from any data source which implements Hadoop InputFormat.
HadoopFormatIO.SerializableSplit - Class in org.apache.beam.sdk.io.hadoop.format
A wrapper to allow Hadoop InputSplit to be serialized using Java's standard serialization mechanisms.
HadoopFormatIO.Write<KeyT,ValueT> - Class in org.apache.beam.sdk.io.hadoop.format
A PTransform that writes to any data sink which implements Hadoop OutputFormat.
HadoopFormatIO.Write.ExternalSynchronizationBuilder<KeyT,ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format
Builder for defining the external synchronization.
HadoopFormatIO.Write.PartitionedWriterBuilder<KeyT,ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format
Builder for determining the partitioning.
HadoopFormatIO.Write.WriteBuilder<KeyT,ValueT> - Interface in org.apache.beam.sdk.io.hadoop.format
Main builder of Write transformation.
HadoopInputFormatBoundedSource(SerializableConfiguration, Coder<K>, Coder<V>, SimpleFunction<?, K>, SimpleFunction<?, V>, HadoopFormatIO.SerializableSplit, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
handle(BeamFnApi.InstructionRequest) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
handle(BeamFnApi.InstructionRequest) - Method in interface org.apache.beam.runners.fnexecution.control.InstructionRequestHandler
 
handle(BeamFnApi.StateRequest) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
Handle a BeamFnApi.StateRequest asynchronously.
handle(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
Opportunity to further refine the relational expression created for a given level.
handleErrorEx(Object, JCSMPException, long) - Method in class org.apache.beam.sdk.io.solace.broker.PublishResultHandler
 
HarnessUpdateReportingPeriodFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.HarnessUpdateReportingPeriodFactory
 
has(String) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
Returns whether this PCollectionRowTuple contains a PCollection with the given tag.
has(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns whether this PCollectionTuple contains a PCollection with the given tag.
has(String) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns whether this PCollectionTuple contains a PCollection with the given tag.
hasAnyPrefix() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsCountersOptions
 
hasCommitted() - Method in class org.apache.beam.sdk.metrics.MetricResult
 
hasDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
Checks if metastore client has the specified database.
hasDefault() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
Returns if a default value was specified.
hasDefault() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
Returns if a default value was specified.
HasDefaultTracker<RestrictionT extends HasDefaultTracker<RestrictionT,TrackerT>,TrackerT extends RestrictionTracker<RestrictionT,?>> - Interface in org.apache.beam.sdk.transforms.splittabledofn
Interface for restrictions for which a default implementation of DoFn.NewTracker is available, depending only on the restriction itself.
hasDefaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
Returns whether this transform has a default value.
HasDefaultWatermarkEstimator<WatermarkEstimatorStateT,WatermarkEstimatorT extends WatermarkEstimator<WatermarkEstimatorStateT>> - Interface in org.apache.beam.sdk.transforms.splittabledofn
Interface for watermark estimator state for which a default implementation of DoFn.NewWatermarkEstimator is available, depending only on the watermark estimator state itself.
HasDisplayData - Interface in org.apache.beam.sdk.transforms.display
Marker interface for PTransforms and components to specify display data used within UIs and diagnostic tools.
hasErrored() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
If this handler has errored since it was last reset.
hasExperiment(DataflowPipelineDebugOptions, String) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
Returns true if the specified experiment is enabled, handling null experiments.
hasExperiment(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
Returns true iff the provided pipeline options has the specified experiment enabled.
hasFailedRecords(List<ResT>) - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
 
hasField(String) - Method in class org.apache.beam.sdk.schemas.Schema
Returns true if fieldName exists in the schema, false otherwise.
hasGlobWildcard(String) - Static method in class org.apache.beam.sdk.io.FileSystems
Checks whether the given spec contains a glob wildcard character.
hashCode() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
 
hashCode() - Method in class org.apache.beam.runners.dataflow.util.OutputReference
 
hashCode() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
hashCode() - Method in class org.apache.beam.runners.jet.Utils.ByteArrayKey
 
hashCode() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
hashCode() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
hashCode() - Method in class org.apache.beam.runners.spark.util.ByteArray
 
hashCode() - Method in class org.apache.beam.sdk.coders.AtomicCoder
.
hashCode() - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
hashCode() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.RowCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
hashCode() - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.ZstdCoder
 
hashCode() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
hashCode() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
 
hashCode() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
hashCode() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
hashCode() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
hashCode() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
 
hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
hashCode() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
hashCode() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
hashCode() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
hashCode() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
hashCode() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
 
hashCode() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
hashCode() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
hashCode() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
hashCode() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
hashCode() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
hashCode() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
 
hashCode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
hashCode() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
hashCode() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
hashCode() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
hashCode() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
 
hashCode() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
You need to override this method to be able to compare these objects by value.
hashCode() - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
You need to override this method to be able to compare these objects by value.
hashCode() - Method in class org.apache.beam.sdk.io.solace.read.SolaceCheckpointMark
 
hashCode() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
hashCode() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
hashCode() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
hashCode() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
hashCode() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
hashCode() - Method in class org.apache.beam.sdk.schemas.CachingFactory
 
hashCode() - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
 
hashCode() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
 
hashCode() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
hashCode() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
 
hashCode() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
 
hashCode() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
 
hashCode() - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
hashCode() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
hashCode() - Method in class org.apache.beam.sdk.schemas.Schema
 
hashCode() - Method in class org.apache.beam.sdk.schemas.Schema.Options
 
hashCode() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
hashCode() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
hashCode() - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
hashCode() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Deprecated.
Object.hashCode() is not supported on PAssert objects.
hashCode() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
hashCode() - Method in class org.apache.beam.sdk.testing.TestStream
 
hashCode() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
Deprecated.
 
hashCode() - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
 
hashCode() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
hashCode() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
hashCode() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
hashCode() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
hashCode() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
hashCode() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
hashCode() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
 
hashCode() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
 
hashCode() - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
hashCode() - Method in class org.apache.beam.sdk.values.EncodableThrowable
 
hashCode() - Method in class org.apache.beam.sdk.values.KV
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionList
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
hashCode() - Method in class org.apache.beam.sdk.values.Row
 
hashCode() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
hashCode() - Method in class org.apache.beam.sdk.values.ShardedKey
 
hashCode() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
hashCode() - Method in class org.apache.beam.sdk.values.TupleTag
 
hashCode() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
hashCode() - Method in class org.apache.beam.sdk.values.TypeParameter
 
hashCode() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
hashCode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
hasItem(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.hasItem(Object).
hasItem(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.hasItem(Object).
hasItem(SerializableMatcher<? super T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.hasItem(Matcher).
hasNext() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
 
hasNext() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2EmptySource
 
hasNext() - Method in class org.apache.beam.runners.twister2.translators.functions.ImpulseSource
 
hasNext() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
 
hasNext() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
 
hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
 
HasOffset - Interface in org.apache.beam.sdk.io.sparkreceiver
Interface for any Spark Receiver that supports reading from and to some offset.
hasOption(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
 
hasOptions() - Method in class org.apache.beam.sdk.schemas.Schema.Options
 
hasOutput(ErrorHandling) - Static method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling
 
hasReplacementJob() - Method in enum org.apache.beam.sdk.PipelineResult.State
 
hasSchema() - Method in class org.apache.beam.sdk.values.PCollection
Returns whether this PCollection has an attached schema.
hasSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.hasSize(int).
hasSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.hasSize(Matcher).
hasUnboundedPCollections(RunnerApi.Pipeline) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
Indicates whether the given pipeline has any unbounded PCollections.
hasUnresolvedParameters() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns whether this TypeDescriptor has any unresolved type parameters, as opposed to being a concrete type.
HBaseCoderProviderRegistrar - Class in org.apache.beam.sdk.io.hbase
A CoderProviderRegistrar for standard types used with HBaseIO.
HBaseCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.hbase.HBaseCoderProviderRegistrar
 
HBaseIO - Class in org.apache.beam.sdk.io.hbase
A bounded source and sink for HBase.
HBaseIO.Read - Class in org.apache.beam.sdk.io.hbase
A PTransform that reads from HBase.
HBaseIO.ReadAll - Class in org.apache.beam.sdk.io.hbase
Implementation of HBaseIO.readAll().
HBaseIO.Write - Class in org.apache.beam.sdk.io.hbase
A PTransform that writes to HBase.
HBaseIO.WriteRowMutations - Class in org.apache.beam.sdk.io.hbase
Transformation that writes RowMutation objects to an HBase table.
HCatalogBeamSchema - Class in org.apache.beam.sdk.io.hcatalog
Adapter from HCatalog table schema to Beam Schema.
HCatalogIO - Class in org.apache.beam.sdk.io.hcatalog
IO to read and write data using HCatalog.
HCatalogIO.Read - Class in org.apache.beam.sdk.io.hcatalog
A PTransform to read data using HCatalog.
HCatalogIO.Write - Class in org.apache.beam.sdk.io.hcatalog
A PTransform to write to a HCatalog managed source.
HCatalogTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog
Beam SQL table that wraps HCatalogIO.
HCatalogTable() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
HCatalogUtils - Class in org.apache.beam.sdk.io.hcatalog
Utility classes to enable meta store conf/client creation.
HCatalogUtils() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogUtils
 
HCatToRow - Class in org.apache.beam.sdk.io.hcatalog
Utilities to convert HCatRecords to Rows.
HCatToRow() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatToRow
 
HDFSSynchronization - Class in org.apache.beam.sdk.io.hadoop.format
Implementation of ExternalSynchronization which registers locks in the HDFS.
HDFSSynchronization(String) - Constructor for class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
Creates instance of HDFSSynchronization.
HeaderAccessor - Interface in org.apache.beam.sdk.fn.server
Interface to access headers in the client request.
HealthcareApiClient - Interface in org.apache.beam.sdk.io.gcp.healthcare
Defines a client to communicate with the GCP HCLS API (version v1).
HealthcareIOError<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
Class for capturing errors on IO operations on Google Cloud Healthcare APIs resources.
HealthcareIOErrorCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HealthcareIOErrorToTableRow<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
Convenience transform to write dead-letter HealthcareIOErrors to BigQuery TableRows.
HealthcareIOErrorToTableRow() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
HEARTBEAT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of heartbeats identified during the execution of the Connector.
HEARTBEAT_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of heartbeat records identified during the execution of the Connector.
HeartbeatRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
A heartbeat record serves as a notification that the change stream query has returned all changes for the partition less or equal to the record timestamp.
HeartbeatRecord(Timestamp, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
Constructs the heartbeat record with the given timestamp and metadata.
heartbeatRecordAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class capable of processing HeartbeatRecords.
HeartbeatRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
helloWorld() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.HelloWorldFn
 
HelloWorldFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.HelloWorldFn
 
Hidden - Annotation Type in org.apache.beam.sdk.options
Methods and/or interfaces annotated with @Hidden will be suppressed from being output when --help is specified on the command-line.
hints() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
 
Histogram - Interface in org.apache.beam.sdk.metrics
A metric that reports information about the histogram of reported values.
HISTOGRAM_BUCKET_TYPE - Static variable in class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
 
HL7v2IO - Class in org.apache.beam.sdk.io.gcp.healthcare
HL7v2IO provides an API for reading from and writing to the Google Cloud Healthcare HL7v2 API.
HL7v2IO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
 
HL7v2IO.HL7v2Read - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Read that reads HL7v2 message contents given a PCollection of HL7v2ReadParameter.
HL7v2IO.HL7v2Read.FetchHL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
PTransform to fetch a message from a Google Cloud Healthcare HL7v2 store based on msgID.
HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn - Class in org.apache.beam.sdk.io.gcp.healthcare
DoFn for fetching messages from the HL7v2 store with error handling.
HL7v2IO.HL7v2Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result includes PCollection of HL7v2ReadResponse objects for successfully read results and PCollection of HealthcareIOError objects for failed reads.
HL7v2IO.ListHL7v2Messages - Class in org.apache.beam.sdk.io.gcp.healthcare
List HL7v2 messages in HL7v2 Stores with optional filter.
HL7v2IO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Read that reads HL7v2 message contents given a PCollection of message IDs strings.
HL7v2IO.Read.FetchHL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
PTransform to fetch a message from a Google Cloud Healthcare HL7v2 store based on msgID.
HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn - Class in org.apache.beam.sdk.io.gcp.healthcare
DoFn for fetching messages from the HL7v2 store with error handling.
HL7v2IO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result includes PCollection of HL7v2Message objects for successfully read results and PCollection of HealthcareIOError objects for failed reads.
HL7v2IO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Write that writes the given PCollection of HL7v2 messages.
HL7v2IO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HL7v2IO.Write.WriteMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
The enum Write method.
HL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
The type HL7v2 message to wrap the Message model.
HL7v2Message(String, String, String, String, String, String, String, Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
HL7v2MessageCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HL7v2MessageGetFn() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
 
HL7v2Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
 
HL7v2ReadParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
HL7v2ReadParameter represents the read parameters for an HL7v2 read request, used as the input type for HL7v2IO.HL7v2Read.
HL7v2ReadParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
HL7v2ReadResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
HL7v2ReadResponse represents the response format for an HL7v2 read request, used as the output type of HL7v2IO.HL7v2Read.
HL7v2ReadResponse(String, HL7v2Message) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
HL7v2ReadResponseCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
HllCount - Class in org.apache.beam.sdk.extensions.zetasketch
PTransforms to compute HyperLogLogPlusPlus (HLL++) sketches on data streams based on the ZetaSketch implementation.
HllCount.Extract - Class in org.apache.beam.sdk.extensions.zetasketch
Provides PTransforms to extract the estimated count of distinct elements (as Longs) from each HLL++ sketch.
HllCount.Init - Class in org.apache.beam.sdk.extensions.zetasketch
Provides PTransforms to aggregate inputs into HLL++ sketches.
HllCount.Init.Builder<InputT> - Class in org.apache.beam.sdk.extensions.zetasketch
Builder for the HllCount.Init combining PTransform.
HllCount.MergePartial - Class in org.apache.beam.sdk.extensions.zetasketch
Provides PTransforms to merge HLL++ sketches into a new sketch.
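For illustration, a minimal sketch of estimating the number of distinct Long values with HllCount.Init and HllCount.Extract:
    Pipeline p = Pipeline.create();
    PCollection<Long> ids = p.apply(Create.of(1L, 2L, 2L, 3L));
    // Aggregate the input into HLL++ sketches (serialized as byte[]).
    PCollection<byte[]> sketches = ids.apply(HllCount.Init.forLongs().globally());
    // Extract the estimated distinct count from the merged sketch.
    PCollection<Long> distinct = sketches.apply(HllCount.Extract.globally());
    p.run();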
host(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService.Builder
 
host() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
The host name or IP address of the Solace broker.
host(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
Set Solace host, format: Host[:Port] e.g.
host() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
The host name or IP address of the Solace broker.
host(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
Set Solace SEMP host, format: [Protocol://]Host[:Port].
host(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
The location of the broker, including port details if it is not listening on the default port.
host() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
host() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
 
HttpClientConfiguration - Class in org.apache.beam.sdk.io.aws2.common
HTTP client configuration for both sync and async AWS clients.
HttpClientConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
 
HttpClientConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
 
HttpHealthcareApiClient - Class in org.apache.beam.sdk.io.gcp.healthcare
A client that talks to the Cloud Healthcare API through HTTP requests.
HttpHealthcareApiClient() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Instantiates a new Http healthcare api client.
HttpHealthcareApiClient(CloudHealthcare) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Instantiates a new Http healthcare api client.
HttpHealthcareApiClient.AuthenticatedRetryInitializer - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HttpHealthcareApiClient.FhirResourcePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
The type FhirResourcePagesIterator for methods which return paged output.
HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
 
HttpHealthcareApiClient.HealthcareHttpException - Exception in org.apache.beam.sdk.io.gcp.healthcare
Wraps HttpResponse in an exception with a statusCode field for use with HealthcareIOError.
HttpHealthcareApiClient.HL7v2MessagePages - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Hl7v2 message id pages iterator.
HyperLogLogPlusCoder() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 

I

ICEBERG - Static variable in class org.apache.beam.sdk.managed.Managed
 
IcebergCatalogConfig - Class in org.apache.beam.sdk.io.iceberg
 
IcebergCatalogConfig() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig
 
IcebergCatalogConfig.Builder - Class in org.apache.beam.sdk.io.iceberg
 
IcebergDestination - Class in org.apache.beam.sdk.io.iceberg
 
IcebergDestination() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergDestination
 
IcebergDestination.Builder - Class in org.apache.beam.sdk.io.iceberg
 
IcebergIO - Class in org.apache.beam.sdk.io.iceberg
A connector that reads and writes to Apache Iceberg tables.
IcebergIO() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergIO
 
IcebergIO.ReadRows - Class in org.apache.beam.sdk.io.iceberg
 
IcebergIO.WriteRows - Class in org.apache.beam.sdk.io.iceberg
 
IcebergReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.iceberg
IcebergReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
 
icebergRecordToBeamRow(Schema, Record) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
Converts an Iceberg Record to a Beam Row.
IcebergScanConfig - Class in org.apache.beam.sdk.io.iceberg
 
IcebergScanConfig() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergScanConfig
 
IcebergScanConfig.Builder - Class in org.apache.beam.sdk.io.iceberg
 
IcebergScanConfig.ScanType - Enum in org.apache.beam.sdk.io.iceberg
 
icebergSchemaToBeamSchema(Schema) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergUtils
Converts an Iceberg Schema to a Beam Schema.
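For illustration, a hedged sketch of the IcebergUtils converters above; icebergTable and icebergRecord are assumed to exist, and the Schema argument of icebergRecordToBeamRow is assumed here to be the Beam schema:
    org.apache.beam.sdk.schemas.Schema beamSchema =
        IcebergUtils.icebergSchemaToBeamSchema(icebergTable.schema());
    org.apache.beam.sdk.values.Row beamRow =
        IcebergUtils.icebergRecordToBeamRow(beamSchema, icebergRecord);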
IcebergSchemaTransformTranslation - Class in org.apache.beam.sdk.io.iceberg
 
IcebergSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation
 
IcebergSchemaTransformTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.iceberg
 
IcebergSchemaTransformTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.iceberg
 
IcebergTableCreateConfig - Class in org.apache.beam.sdk.io.iceberg
 
IcebergTableCreateConfig() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig
 
IcebergTableCreateConfig.Builder - Class in org.apache.beam.sdk.io.iceberg
 
IcebergUtils - Class in org.apache.beam.sdk.io.iceberg
Utilities for converting between Beam and Iceberg types, made public for users' convenience.
IcebergWriteResult - Class in org.apache.beam.sdk.io.iceberg
 
IcebergWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.iceberg
IcebergWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
 
IcebergWriteSchemaTransformProvider.Configuration - Class in org.apache.beam.sdk.io.iceberg
 
IcebergWriteSchemaTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.io.iceberg
 
id() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
 
id - Variable in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
identifier() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.extensions.avro.schemas.io.payloads.AvroPayloadSerializerProvider
 
identifier() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.Fixed32
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.Fixed64
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SFixed32
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SFixed64
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SInt32
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SInt64
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.UInt32
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.UInt64
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint16
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint32
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint64
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint8
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
 
identifier() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.TimeWithLocalTzType
 
identifier() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.AvroReadSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.AvroWriteSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.CsvWriteSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
Returns the SchemaTransformProvider.identifier() required for registration.
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.JsonReadSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.JsonWriteSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.LineReadSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetReadSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.ParquetWriteSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.fileschematransform.XmlWriteSchemaTransformFormatProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider identifier method.
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
Implementation of the TypedSchemaTransformProvider identifier method.
identifier() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
Implementation of the TypedSchemaTransformProvider identifier method.
identifier() - Method in class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
 
identifier() - Method in class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
 
identifier() - Method in interface org.apache.beam.sdk.schemas.io.Providers.Identifyable
Returns an id that uniquely represents this among others implementing its derived interface.
identifier() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
Returns an id that uniquely represents this IO.
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.Date
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.Time
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
IDENTIFIER() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
 
IDENTIFIER() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
 
identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
identifier() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
Returns an id that uniquely represents this transform.
Identifier() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
Returns the identity element of this operation, i.e.
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
Returns the value that should be used for the combine of the empty set.
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
Returns the identity element of this operation, i.e.
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
Returns the identity element of this operation, i.e.
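For illustration, a minimal sketch of identity() in a long-summing combine, assuming an input PCollection<Long> named longs:
    Combine.BinaryCombineLongFn sum =
        new Combine.BinaryCombineLongFn() {
          @Override
          public long apply(long left, long right) {
            return left + right;
          }

          @Override
          public long identity() {
            return 0L; // value used when combining an empty set
          }
        };
    PCollection<Long> total = longs.apply(Combine.globally(sum));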
identity() - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
 
IDENTITY_ELEMENT - Static variable in class org.apache.beam.sdk.metrics.DistributionResult
The IDENTITY_ELEMENT is used to start accumulating distributions.
IdGenerator - Interface in org.apache.beam.sdk.fn
A generator of unique IDs.
IdGenerators - Class in org.apache.beam.sdk.fn
Common IdGenerator implementations.
IdGenerators() - Constructor for class org.apache.beam.sdk.fn.IdGenerators
 
ignored() - Static method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
Returns a handler that ignores metrics.
ignoreInput(Watch.Growth.TerminationCondition<?, StateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Wraps a given input-independent Watch.Growth.TerminationCondition as an equivalent condition with a given input type, passing null to the original condition as input.
ignoreInsertIds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Setting this option to true disables insertId based data deduplication offered by BigQuery.
ignoreUnknownValues() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Accept rows that contain values that do not match the schema.
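For illustration, a hedged sketch of a streaming-insert write that tolerates unknown fields and skips insertId-based deduplication; rows is an assumed PCollection<TableRow> and the table spec is a placeholder:
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder table spec
            .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
            .ignoreUnknownValues()
            .ignoreInsertIds());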
immediate(T) - Static method in class org.apache.beam.sdk.state.ReadableStates
A ReadableState constructed from a constant value, hence immediately available.
immutableNames() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
immutableNamesBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
immutableSteps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
immutableStepsBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
implement(EnumerableRelImplementor, EnumerableRel.Prefer) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
implementor() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
 
importCatalogItems() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
 
importFhirResource(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Imports a FHIR resource from GCS.
importFhirResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
importResources(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Import resources.
importResources(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Import resources.
importUserEvents() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
 
Impulse - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
IMPULSE_ELEMENT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ImpulseP - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for Beam's Impulse primitive.
ImpulseSource - Class in org.apache.beam.runners.twister2.translators.functions
A SourceFunc which executes the impulse transform contract.
ImpulseSource() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ImpulseSource
 
ImpulseTranslatorBatch - Class in org.apache.beam.runners.twister2.translators.batch
Impulse translator.
ImpulseTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.ImpulseTranslatorBatch
 
in(Pipeline, PCollection<FhirBundleResponse>, PCollection<HealthcareIOError<FhirBundleParameter>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
Entry point for the ExecuteBundlesResult, storing the successful and failed bundles and their metadata.
in(Pipeline, PCollection<Solace.PublishResult>) - Static method in class org.apache.beam.sdk.io.solace.write.SolaceOutput
 
in(Pipeline) - Static method in class org.apache.beam.sdk.values.PBegin
Returns a PBegin in the given Pipeline.
in(Pipeline) - Static method in class org.apache.beam.sdk.values.PDone
Creates a PDone in the given Pipeline.
IN_ARRAY_OPERATOR - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
inc() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
 
inc(long) - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
 
inc() - Method in interface org.apache.beam.sdk.metrics.Counter
Increment the counter.
inc(long) - Method in interface org.apache.beam.sdk.metrics.Counter
Increment the counter by the given amount.
inc() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
Increment the counter.
inc(long) - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
Increment the counter by the given amount.
inc() - Method in class org.apache.beam.sdk.metrics.NoOpCounter
 
inc(long) - Method in class org.apache.beam.sdk.metrics.NoOpCounter
 
incActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT by 1 if the metric is enabled.
incChangeStreamMutationGcCounter() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.CHANGE_STREAM_MUTATION_GC_COUNT by 1 if the metric is enabled.
incChangeStreamMutationUserCounter() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.CHANGE_STREAM_MUTATION_USER_COUNT by 1 if the metric is enabled.
incClosestreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.CLOSESTREAM_COUNT by 1 if the metric is enabled.
incDataRecordCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.DATA_RECORD_COUNT by 1 if the metric is enabled.
incHeartbeatCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.HEARTBEAT_COUNT by 1 if the metric is enabled.
incHeartbeatRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.HEARTBEAT_RECORD_COUNT by 1 if the metric is enabled.
incListPartitionsCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.LIST_PARTITIONS_COUNT by 1 if the metric is enabled.
include(String, HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register display data from the specified subcomponent at the given path.
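For illustration, a minimal sketch of registering a subcomponent's display data from within populateDisplayData; parseFn is an assumed field that implements HasDisplayData:
    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      // Nest the subcomponent's display data under the "parseFn" path.
      builder.include("parseFn", parseFn);
    }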
inCombinedNonLatePanes(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window across all panes that were not produced by the arrival of late data.
inCombinedNonLatePanes(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
IncomingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
 
IncompatibleWindowException - Exception in org.apache.beam.sdk.transforms.windowing
Exception thrown by WindowFn.verifyCompatibility(WindowFn) if two compared WindowFns are not compatible, including the explanation of incompatibility.
IncompatibleWindowException(WindowFn<?, ?>, String) - Constructor for exception org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
 
incomplete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
Constructs a Watch.Growth.PollResult with the given outputs and declares that new outputs might appear for the current input.
incomplete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
Like Watch.Growth.PollResult.incomplete(List), but assigns the same timestamp to all new outputs.
incOrphanedNewPartitionCleanedCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incPartitionMergeCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_MERGE_COUNT by 1 if the metric is enabled.
incPartitionReconciledWithoutTokenCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incPartitionReconciledWithTokenCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incPartitionRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_RECORD_COUNT by 1 if the metric is enabled.
incPartitionRecordMergeCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_RECORD_MERGE_COUNT by 1 if the metric is enabled.
incPartitionRecordSplitCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_RECORD_SPLIT_COUNT by 1 if the metric is enabled.
incPartitionSplitCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_SPLIT_COUNT by 1 if the metric is enabled.
incPartitionStreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incQueryCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.QUERY_COUNT by 1 if the metric is enabled.
increment() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns a RandomAccessData that is the smallest value of the same length which is strictly greater than this.
increment(Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IncrementFn
 
incrementAll(Date) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.DateIncrementAllFn
 
IncrementFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IncrementFn
 
incrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
Returns an IdGenerator which provides successive incrementing longs.
index() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
 
INDEX_OF_MAX - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
Shard name containing the index and max.
indexOf(String) - Method in class org.apache.beam.sdk.schemas.Schema
Find the index of a given field.
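For illustration, a minimal sketch of looking up a field's position in a Beam Schema:
    Schema schema =
        Schema.builder().addStringField("name").addInt64Field("count").build();
    int countIndex = schema.indexOf("count"); // 1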
indexOfProjectionColumnRef(long, List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
Return an index of the projection column reference.
inEarlyGlobalWindowPanes() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on panes in the GlobalWindow that were emitted before the GlobalWindow closed.
inEarlyGlobalWindowPanes() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window across all panes that were produced by the arrival of early data.
inEarlyPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on early panes for each key.
InferableFunction<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A ProcessFunction which is not a functional interface.
InferableFunction() - Constructor for class org.apache.beam.sdk.transforms.InferableFunction
 
InferableFunction(ProcessFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.transforms.InferableFunction
 
inferBeamSchema(DataSource, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
inferType(Object) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Infer the DisplayData.Type for the given object.
inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key.
inFinalPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key.
InfluxDbIO - Class in org.apache.beam.sdk.io.influxdb
IO to read from and write to InfluxDB.
InfluxDbIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.influxdb
A POJO describing a DataSourceConfiguration such as URL, userName and password.
InfluxDbIO.Read - Class in org.apache.beam.sdk.io.influxdb
A PTransform to read metrics or data related to a query from InfluxDB.
InfluxDbIO.Write - Class in org.apache.beam.sdk.io.influxdb
A PTransform to write to an InfluxDB datasource.
ingestHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Ingest an HL7v2 message.
ingestHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
ingestMessages(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Write with the Messages.Ingest method.
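For illustration, a hedged sketch of writing HL7v2 messages with the Ingest method; messages is an assumed PCollection<HL7v2Message> and the store path is a placeholder:
    messages.apply(
        HL7v2IO.ingestMessages(
            "projects/my-project/locations/us-central1/datasets/my-dataset/hl7V2Stores/my-store"));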
INHERIT_IO_FILE - Static variable in class org.apache.beam.runners.fnexecution.environment.ProcessManager
A symbolic file to indicate that we want to inherit the I/O of the parent process.
init(Processor.Context) - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
 
init(Processor.Context) - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
 
init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
Initializes the metrics accumulator if it has not already been initialized.
init(ResultSetMetaData) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.RowMapperWithInit
 
initAccumulators(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
Init Metrics/Aggregators accumulators.
initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
 
initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
 
initContext(Class<?>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
Initializes BatchContextImpl for CDAP plugin.
initialBackoff() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
initialize(AbstractGoogleClientRequest<?>) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
 
initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
initialize(HttpRequest) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
 
InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
A DoFn responsible for initializing the metadata table and preparing it to manage the state of the pipeline.
InitializeDoFn(DaoFactory, Instant, BigtableIO.ExistingPipelineOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.InitializeDoFn
 
InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A DoFn responsible for initializing the change stream Connector.
InitializeDoFn(DaoFactory, MapperFactory, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
 
initializeSessionProperties(JCSMPProperties) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
initializeSessionProperties(JCSMPProperties) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
Override this method and provide your specific properties, including all those related to authentication, and possibly others too.
initializeWriteSessionProperties(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
This method will be called by the write connector when a new session is started.
InitialPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Utility class with constants and methods used to determine the initial partition.
InitialPartition() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
 
InitialPipelineState - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
State used to initialize a pipeline, as output by InitializeDoFn.
InitialPipelineState(Instant, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
initialRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
initialRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Uses an TimestampRange with a max range.
initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
The restriction for a partition is defined by the start and end timestamps over which the partition will be queried.
initialSystemTimeAt(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
Set the initial synchronized processing time.
initPluginType(Class<?>) - Static method in class org.apache.beam.sdk.io.cdap.Plugin
Gets value of a plugin type.
initPulsarClients() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
InjectPackageStrategy(Class<?>) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.InjectPackageStrategy
 
inLatePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert with the assertion restricted to only run on the provided window across all panes that were produced by the arrival of late data.
inLatePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inLatePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert with the assertion restricted to only run on the provided window, running the checker only on late panes for each key.
inMemory(TableProvider...) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
This method creates BeamSqlEnv using empty Pipeline Options.
inMemory() - Method in class org.apache.beam.sdk.transforms.View.AsList
Returns a PCollection view like this one, but whose resulting list will be entirely cached in memory.
inMemory(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsList
Returns a PCollection view like this one, but whose resulting list will be entirely cached in memory according to the input parameter.
inMemory() - Method in class org.apache.beam.sdk.transforms.View.AsMap
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory.
inMemory(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsMap
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory according to the input parameter.
inMemory() - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory.
inMemory(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
Returns a PCollection view like this one, but whose resulting map will be entirely cached in memory according to the input parameter.
InMemoryBagUserStateFactory<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.state
Holds user state in memory.
InMemoryBagUserStateFactory() - Constructor for class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
 
inMemoryFinalizer(InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers
A bundle finalizer that stores all bundle finalization requests in memory.
InMemoryJobService - Class in org.apache.beam.runners.jobsubmission
An InMemoryJobService that prepares and runs jobs on behalf of a client using a JobInvoker.
InMemoryListFromMultimapViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryListFromMultimapViewFn
 
inMemoryListView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
InMemoryListViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryListViewFn
 
inMemoryListViewUsingVoidKey(PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
InMemoryMapFromVoidKeyViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapFromVoidKeyViewFn
 
inMemoryMapView(PCollection<KV<K, V>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, V>> capable of processing elements windowed using the provided WindowingStrategy.
InMemoryMapViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMapViewFn
 
inMemoryMapViewUsingVoidKey(PCollection<KV<Void, KV<K, V>>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, V>> capable of processing elements windowed using the provided WindowingStrategy.
InMemoryMetaStore - Class in org.apache.beam.sdk.extensions.sql.meta.store
A MetaStore which stores the meta info in memory.
InMemoryMetaStore() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
InMemoryMetaTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
An InMemoryMetaTableProvider is an abstract TableProvider for in-memory types.
InMemoryMetaTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
InMemoryMultimapFromVoidKeyViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapFromVoidKeyViewFn
 
inMemoryMultimapView(PCollection<KV<K, V>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, Iterable<V>>> capable of processing elements windowed using the provided WindowingStrategy.
InMemoryMultimapViewFn(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.InMemoryMultimapViewFn
 
inMemoryMultimapViewUsingVoidKey(PCollection<KV<Void, KV<K, V>>>, Coder<K>, Coder<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, Iterable<V>>> capable of processing elements windowed using the provided WindowingStrategy.
inNamespace(String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
inNamespace(Class<?>) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
Inner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter.Inner
 
innerBroadcastJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
Perform an inner join, broadcasting the right side.
innerJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Inner join of two collections of KV elements.
innerJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Inner join of two collections of KV elements.
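For illustration, a minimal sketch of the join-library inner join; left and right are assumed keyed PCollections:
    // left:  PCollection<KV<String, Integer>>  (assumed)
    // right: PCollection<KV<String, String>>   (assumed)
    PCollection<KV<String, KV<Integer, String>>> joined = Join.innerJoin(left, right);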
innerJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
Perform an inner join.
inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window.
inOnlyPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window.
inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window.
inOnTimePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the on-time pane for each key.
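For illustration, a hedged sketch of asserting only on the on-time pane of one window in a test; output is an assumed PCollection<String> from a TestPipeline:
    IntervalWindow window =
        new IntervalWindow(new Instant(0), new Instant(0).plus(Duration.standardMinutes(1)));
    PAssert.that(output)
        .inOnTimePane(window)
        .containsInAnyOrder("a", "b");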
inOrder(Trigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
Returns an AfterEach Trigger with the given subtriggers.
inOrder(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
Returns an AfterEach Trigger with the given subtriggers.
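For illustration, a minimal sketch of a composite trigger that first fires after 100 elements and then once the watermark passes the end of the window; input is an assumed PCollection<String>:
    input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
            .triggering(
                AfterEach.inOrder(
                    AfterPane.elementCountAtLeast(100),
                    AfterWatermark.pastEndOfWindow()))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());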
InProcessServerFactory - Class in org.apache.beam.sdk.fn.server
A ServerFactory which creates servers with the InProcessServerBuilder.
INPUT - Static variable in class org.apache.beam.sdk.managed.ManagedTransformConstants
 
INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
INPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
INPUT_TAG - Static variable in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
 
inputCollectionNames() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
The expected PCollectionRowTuple input tags.
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider inputCollectionNames method.
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
Implementation of the TypedSchemaTransformProvider inputCollectionNames method.
inputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
Implementation of the TypedSchemaTransformProvider inputCollectionNames method.
inputCollectionNames() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
inputCollectionNames() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
Returns the input collection names of this transform.
inputFormatProvider - Variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
This should be set after the SubmitterLifecycle.prepareRun(Object) call, passing this context object as a parameter.
inputOf(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Returns a type descriptor for the input of the given ProcessFunction, subject to Java type erasure: may contain unresolved type variables if the type was erased.
inputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Binary compatibility adapter for TypeDescriptors.inputOf(ProcessFunction).
inputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Like TypeDescriptors.inputOf(ProcessFunction) but for Contextful.Fn.
INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Inserts the partition metadata.
insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Inserts the partition metadata.
INSERT_OR_UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
INSERT_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Inserts TableRows with the specified insertIds if not null.
insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
insertAll(TableReference, List<TableRow>, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
InsertBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertBuilder
 
insertDataToTable(String, String, String, List<Map<String, Object>>) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Inserts rows into a table using a BigQuery streaming write.
insertDeduplicate() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
insertDistributedSync() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
InsertOrUpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertOrUpdateBuilder
 
insertQuorum() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
InsertRetryPolicy - Class in org.apache.beam.sdk.io.gcp.bigquery
A retry policy for streaming BigQuery inserts.
InsertRetryPolicy() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
 
InsertRetryPolicy.Context - Class in org.apache.beam.sdk.io.gcp.bigquery
Contains information about a failed insert.
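A minimal sketch of attaching a retry policy to BigQuery streaming inserts, assuming a hypothetical project, dataset, and table; CREATE_NEVER is used so no schema needs to be supplied:

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class RetryPolicySketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    p.apply(Create.of(new TableRow().set("name", "a")).withCoder(TableRowJsonCoder.of()))
        .apply(
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.my_table") // hypothetical destination
                .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
                // Retry only failures that are considered transient (e.g. timeouts).
                .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors())
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));
    p.run().waitUntilFinish();
  }
}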
insertRows(Schema, Row...) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
INSTANCE - Static variable in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider.Factory
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.GenericDatumFactory
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamJavaTypeFactory
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
instance() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelMetadataQuery
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinAssociateRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMatchRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamTableFunctionScanRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamWindowRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamJavaUdfCalcRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcSplittingRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRule
 
INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRule
 
INSTANCE - Static variable in exception org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.CloseException
 
INSTANCE - Static variable in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
INSTANCE - Static variable in class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
 
INSTANCE - Static variable in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
 
INSTANCE - Static variable in class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
 
INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
Singleton instance of GlobalWindow.
INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
instanceId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
InstantCoder - Class in org.apache.beam.sdk.coders
A Coder for Joda Instant that encodes it as a big-endian Long, shifted such that lexicographic ordering of the bytes corresponds to chronological order.
InstantDeserializer - Class in org.apache.beam.sdk.io.kafka.serialization
Kafka Deserializer for Instant.
InstantDeserializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
instantiateCoder(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
Creates a coder for a given PCollection id from the Proto definition.
instantiateDestination(String) - Method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
 
instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
Instantiate healthcare client (version v1).
instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
Instantiate healthcare client (version v1).
instantiateRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
Instantiates a runner-side wire coder for the given PCollection.
instantiateRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
Instantiates a runner-side wire coder for the given PCollection.
InstantSerializer - Class in org.apache.beam.sdk.io.kafka.serialization
Kafka Serializer for Instant.
InstantSerializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
InstructionRequestHandler - Interface in org.apache.beam.runners.fnexecution.control
Interface for any function that can handle a Fn API BeamFnApi.InstructionRequest.
INT16 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
INT16 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of int16 fields.
INT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
INT32 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of int32 fields.
INT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
INT64 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of int64 fields.
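A minimal sketch of declaring fields with these integer FieldTypes; the field names are hypothetical:

import org.apache.beam.sdk.schemas.Schema;

public class IntFieldTypesSketch {
  public static void main(String[] args) {
    Schema schema =
        Schema.builder()
            .addField("shelf", Schema.FieldType.INT16)  // 16-bit integer field
            .addField("count", Schema.FieldType.INT32)  // 32-bit integer field
            .addField("total", Schema.FieldType.INT64)  // 64-bit integer field
            .build();
    System.out.println(schema);
  }
}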
INT8 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
IntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle
 
INTEGER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
integers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Integer.
integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<Integer> and returns a PCollection<Integer> whose contents is the maximum of the input PCollection's elements, or Integer.MIN_VALUE if there are no elements.
integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<Integer> and returns a PCollection<Integer> whose contents is a single value that is the minimum of the input PCollection's elements, or Integer.MAX_VALUE if there are no elements.
integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<Integer> and returns a PCollection<Integer> whose contents is the sum of the input PCollection's elements, or 0 if there are no elements.
integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and returns a PCollection<KV<K, Integer>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and returns a PCollection<KV<K, Integer>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and returns a PCollection<KV<K, Integer>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
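A minimal sketch of the global and per-key combine variants above; the step names and values are hypothetical:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.Max;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class GlobalAndPerKeyCombines {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Global maximum over all elements of a PCollection<Integer>.
    PCollection<Integer> max =
        p.apply("Ints", Create.of(3, 1, 4, 1, 5)).apply(Max.integersGlobally());

    // Per-key sums over a PCollection<KV<String, Integer>>.
    PCollection<KV<String, Integer>> sums =
        p.apply("KVs", Create.of(KV.of("a", 1), KV.of("a", 2), KV.of("b", 5)))
            .apply(Sum.integersPerKey());

    p.run().waitUntilFinish();
  }
}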
interceptor() - Static method in class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
 
interceptResponse(HttpResponse) - Method in class org.apache.beam.sdk.extensions.gcp.util.UploadIdResponseInterceptor
 
Internal - Annotation Type in org.apache.beam.sdk.annotations
Signifies that a publicly accessible API (public class, method or field) is intended for internal use only and not for public consumption.
InterpolateData() - Constructor for class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
 
interpolateKey(double) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns a ByteKey key such that [startKey, key) represents approximately the specified fraction of the range [startKey, endKey).
intersectAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform that follows SET ALL semantics to compute the intersection with the provided PCollection<T>.
intersectAll() - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform that follows SET ALL semantics: it takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the intersection of all the collections, computed in order over the collections in the PCollectionList<T>.
intersectDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform that follows SET DISTINCT semantics to compute the intersection with the provided PCollection<T>.
intersectDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a PTransform that takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the intersection of all the collections, computed in order over the collections in the PCollectionList<T>.
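A minimal sketch of the binary intersectDistinct variant; the element values are hypothetical:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.Sets;
import org.apache.beam.sdk.values.PCollection;

public class IntersectSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    PCollection<String> left = p.apply("Left", Create.of("a", "b", "b", "c"));
    PCollection<String> right = p.apply("Right", Create.of("b", "c", "c", "d"));

    // SET DISTINCT semantics: each common element appears once in the output.
    PCollection<String> common = left.apply(Sets.intersectDistinct(right));

    p.run().waitUntilFinish();
  }
}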
intersects(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns whether this window intersects the given window.
IntervalWindow - Class in org.apache.beam.sdk.transforms.windowing
An implementation of BoundedWindow that represents an interval from IntervalWindow.start (inclusive) to IntervalWindow.end (exclusive).
IntervalWindow(Instant, Instant) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Creates a new IntervalWindow that represents the half-open time interval [start, end).
IntervalWindow(Instant, ReadableDuration) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
IntervalWindow.IntervalWindowCoder - Class in org.apache.beam.sdk.transforms.windowing
Encodes an IntervalWindow as a pair of its upper bound and duration.
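A minimal sketch of building IntervalWindows with both constructors and checking intersection; the timestamps are hypothetical:

import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
import org.joda.time.Duration;
import org.joda.time.Instant;

public class IntervalWindowSketch {
  public static void main(String[] args) {
    Instant start = Instant.parse("2024-01-01T00:00:00Z");

    // Half-open interval [start, start + 1h), built with each of the two constructors.
    IntervalWindow byEnd = new IntervalWindow(start, start.plus(Duration.standardHours(1)));
    IntervalWindow byDuration = new IntervalWindow(start, Duration.standardHours(1));

    System.out.println(byEnd.intersects(byDuration)); // true: identical intervals overlap
  }
}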
IntervalWindowCoder() - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
Returns a new FlatMapElements transform with the given type descriptor for the output type, but the mapping function yet to be specified using FlatMapElements.via(ProcessFunction).
into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
Returns a new MapElements transform with the given type descriptor for the output type, but the mapping function yet to be specified using MapElements.via(ProcessFunction).
into(TypeDescriptor<K2>) - Static method in class org.apache.beam.sdk.transforms.MapKeys
Returns a new MapKeys transform with the given type descriptor for the output type, but the mapping function yet to be specified using MapKeys.via(SerializableFunction).
into(TypeDescriptor<V2>) - Static method in class org.apache.beam.sdk.transforms.MapValues
Returns a new MapValues transform with the given type descriptor for the output type, but the mapping function yet to be specified using MapValues.via(SerializableFunction).
into(WindowFn<? super T, ?>) - Static method in class org.apache.beam.sdk.transforms.windowing.Window
Creates a Window PTransform that uses the given WindowFn to window the data.
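A minimal sketch combining MapElements.into(...).via(...) with Window.into(WindowFn); the input values and window size are hypothetical:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;
import org.joda.time.Duration;

public class IntoSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // into(TypeDescriptor) fixes the output type; via(...) supplies the mapping function.
    PCollection<Integer> lengths =
        p.apply(Create.of("a", "bb", "ccc"))
            .apply(MapElements.into(TypeDescriptors.integers()).via((String s) -> s.length()));

    // Window.into(WindowFn) assigns each element to a fixed one-minute window.
    PCollection<Integer> windowed =
        lengths.apply(Window.<Integer>into(FixedWindows.of(Duration.standardMinutes(1))));

    p.run().waitUntilFinish();
  }
}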
InTransactionContext(String, TransactionContext, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Constructs a context to execute a user defined function transactionally.
InvalidConfigurationException - Exception in org.apache.beam.sdk.schemas.io
Exception thrown when the configuration for a SchemaIO is invalid.
InvalidConfigurationException(String) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidConfigurationException
 
InvalidConfigurationException(String, Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidConfigurationException
 
InvalidConfigurationException(Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidConfigurationException
 
InvalidLocationException - Exception in org.apache.beam.sdk.schemas.io
Exception thrown when the configuration for a SchemaIO is invalid.
InvalidLocationException(String) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidLocationException
 
InvalidLocationException(String, Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidLocationException
 
InvalidLocationException(Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidLocationException
 
InvalidSchemaException - Exception in org.apache.beam.sdk.schemas.io
Exception thrown when the schema for a SchemaIO is invalid.
InvalidSchemaException(String) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidSchemaException
 
InvalidSchemaException(String, Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidSchemaException
 
InvalidSchemaException(Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidSchemaException
 
InvalidTableException - Exception in org.apache.beam.sdk.extensions.sql.meta.provider
Exception thrown when the request for a table is invalid, such as invalid metadata.
InvalidTableException(String) - Constructor for exception org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
 
InvalidTableException(String, Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
 
InvalidTableException(Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
 
invokeAdvance(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
invokeStart(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
 
invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.jobsubmission.JobInvoker
Start running a job, abstracting its state as a JobInvocation instance.
invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.spark.SparkJobInvoker
 
inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window.
inWindow(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window.
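A minimal sketch restricting a PAssert to a single fixed window; the timestamps and element values are hypothetical:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TimestampedValue;
import org.joda.time.Duration;
import org.joda.time.Instant;

public class InWindowAssertSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    Instant base = Instant.parse("2024-01-01T00:00:00Z");

    PCollection<String> windowed =
        p.apply(Create.timestamped(
                TimestampedValue.of("a", base),
                TimestampedValue.of("b", base.plus(Duration.standardMinutes(2)))))
            .apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));

    // Restrict the assertion to the first one-minute window only.
    PAssert.that(windowed)
        .inWindow(new IntervalWindow(base, Duration.standardMinutes(1)))
        .containsInAnyOrder("a");

    p.run().waitUntilFinish();
  }
}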
ioException() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
Returns the IOException.
ir() - Method in class org.apache.beam.sdk.extensions.sbe.SerializableIr
Returns the underlying Ir.
IrOptions() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
 
IS_MERGING_WINDOW_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
IS_PAIR_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
IS_STREAM_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
IS_WRAPPER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
isAbsolute() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
isAccessible() - Method in interface org.apache.beam.sdk.options.ValueProvider
Whether the contents of this ValueProvider is currently available via ValueProvider.get().
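A minimal sketch of guarding a get() call with isAccessible(); the option value is hypothetical (a StaticValueProvider is always accessible, while a RuntimeValueProvider only becomes accessible at run time):

import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

public class IsAccessibleSketch {
  public static void main(String[] args) {
    ValueProvider<String> table = StaticValueProvider.of("my_table"); // hypothetical value
    if (table.isAccessible()) {
      System.out.println(table.get());
    }
  }
}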
isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
isAliveOrThrow() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager.RunningProcess
Checks if the underlying process is still running.
isAllowDuplicates() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
isAllowedLatenessSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isAlreadyMerged() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isAppProfileSingleClusterAndTransactional(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Verifies that the app profile uses single-cluster routing and has single-row transactions allowed.
isArray() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns true if this type is known to be an array type.
isAtSplitPoint() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns true if the reader is at a split point.
isAtSplitPoint() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Returns true only for the first record; compressed sources cannot be split.
isAtSplitPoint() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Returns whether the current record is at a split point (i.e., whether the current record would be the first record to be read by a source with a specified start offset of OffsetBasedSource.OffsetBasedReader.getCurrentOffset()).
isAutoBalanceWriteFilesShardingEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
isBlockOnRun() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
isBlockOnRun() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
isBounded() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
 
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
isBounded() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
Whether the collection of rows represented by this relational expression is bounded (known to be finite) or unbounded (may or may not be finite).
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
 
isBounded() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Whether this table is bounded (known to be finite) or unbounded (may or may not be finite).
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
 
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
 
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Indicates whether the PCollections produced by this transform will contain a bounded or unbounded number of elements.
isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
This restriction tracker is for unbounded streams.
isBounded() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
isBounded() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
isBounded() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
isBounded() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
 
isBounded() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
 
isBounded() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
 
isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Return the boundedness of the current restriction.
isBounded() - Method in class org.apache.beam.sdk.values.PCollection
 
isBoundedCollection(Collection<PCollection<?>>) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
isCacheDisabled() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
isCleanArtifactsPerJob() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
isClosed() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
isClosed() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
 
isClosed() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
Returns true if the message producer is closed, false otherwise.
isClosed() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
Returns true if the message receiver is closed, false otherwise.
isClosed() - Method in class org.apache.beam.sdk.io.solace.broker.SessionService
Checks whether the connection to the service is currently closed.
isClosed() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
 
isClosed() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
 
isClosed() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.DefaultErrorHandler
 
isClosed() - Method in interface org.apache.beam.sdk.transforms.errorhandling.ErrorHandler
 
isClosed() - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
 
isCollectionType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isCommitOffsetsInFinalizeEnabled() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
For internal use only; no backwards-compatibility guarantees.
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Deprecated.
Please override verifyCompatibility to throw a useful error message; isCompatible will be removed in version 3.0.0.
isCompositeType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isCompound() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
Whether it's a compound table name (with multiple path components).
isCompressed(String) - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Deprecated.
Returns whether the file's extension matches one of the known compression formats.
isCompressed(String) - Method in enum org.apache.beam.sdk.io.Compression
 
isCompressionEnabled() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
isConsumingReceivedData() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
isCooperative() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
 
isCooperative() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
 
isDateTimeType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
Returns true if the type is any of the various date time types.
isDateType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
isDecimal(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
Checks if type is decimal.
isDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
isDirectory() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
 
isDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns true if this ResourceId represents a directory, false otherwise.
isDisjoint(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns whether this window is disjoint from the given window.
isDone() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
isDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
isDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
isDynamicRead() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
isEmpty() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
 
isEmpty() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
 
isEmpty() - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
 
isEmpty() - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
isEmpty(StateAccessor<K>) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
isEmpty() - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
 
isEmpty() - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
isEmpty() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
isEmpty() - Method in class org.apache.beam.sdk.io.range.ByteKey
Returns true if the byte[] backing this ByteKey is of length 0.
isEmpty() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
isEmpty() - Method in interface org.apache.beam.sdk.state.GroupingState
Returns a ReadableState whose ReadableState.read() method will return true if this state is empty at the point when that ReadableState.read() call returns.
isEmpty() - Method in interface org.apache.beam.sdk.state.MapState
Returns a ReadableState whose ReadableState.read() method will return true if this state is empty at the point when that ReadableState.read() call returns.
isEmpty() - Method in interface org.apache.beam.sdk.state.MultimapState
Returns a ReadableState whose ReadableState.read() method will return true if this state is empty at the point when that ReadableState.read() call returns.
isEmpty() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
isEmpty() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
isEmpty() - Method in class org.apache.beam.sdk.transforms.Requirements
Whether this is an empty set of requirements.
isEnableStreamingEngine() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
isEncodingPositionsOverridden() - Method in class org.apache.beam.sdk.schemas.Schema
Returns whether encoding positions have been explicitly overridden.
isEnforceEncodability() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
isEnforceImmutability() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
isEOF() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
Test clients may return true to signal that all expected messages have been pulled and the test may complete.
isEOS() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
isEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Asserts that the value in question is equal to the provided value, according to Object.equals(java.lang.Object).
isEqWithEpsilon(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
isExternalizedCheckpointsEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
Enables or disables externalized checkpoints.
isFailToLock() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
isFirst() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return true if this is the first pane produced for the associated window.
IsFlinkNativeTransform() - Constructor for class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform
 
IsFlinkNativeTransform() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform
 
isForceStreaming() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
isForceWatermarkSync() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
isGetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
isHeartbeat() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
isHotKeyLoggingEnabled() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
If enabled, the literal key will be logged to Cloud Logging when a hot key is detected.
isIn(Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.isIn(Collection).
isIn(Coder<T>, Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.isIn(Collection).
isIn(T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.isIn(Object[]).
isIn(Coder<T>, T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.isIn(Object[]).
isInboundEdgeOfVertex(Edge, String, String, String) - Method in interface org.apache.beam.runners.jet.DAGBuilder.WiringListener
 
isInboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
 
IsInf - Class in org.apache.beam.sdk.extensions.sql.impl.udf
IS_INF(X)
IsInf() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
 
isInf(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
 
isInf(Float) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
 
isInfinite() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
isInitialEvent(long, EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.EventExaminer
Is this event the first expected event for the given key and window when a per-key sequence is used? In the case of a global sequence, it determines the first event of the global sequence.
isInitialPartition(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
Verifies if the given partition token is the initial partition.
isInputSortRelAndLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
 
isInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns whether or not this transformation applies a default value.
isIntegral(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
Checks if type is integral.
isJoinLegal(Join) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
This method checks if a join is legal and can be converted into Beam SQL.
isKey(ImmutableBitSet) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
isLast() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return true if this is the last pane that will be produced in the associated window.
isLastEvent(long, EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.EventExaminer
Is this event the last expected event for a given key and window?
isLastEventReceived() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
isLastRecordInTransactionInPartition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Indicates whether this record is the last emitted for the given transaction in the given partition.
isLe(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
isLeaf(PCollection<?>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
isLeaf(PCollection<?>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
isLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
isLogicalType(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
isLogicalType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isLt(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
isMapType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isMetadata() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
 
isMetricsSupported() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Indicates whether metrics reporting is supported.
isModeSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isMutable() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
IsNan - Class in org.apache.beam.sdk.extensions.sql.impl.udf
IS_NAN(X)
IsNan() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
 
isNan(Float) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
 
isNan(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
 
isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
 
isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns true if this WindowFn never needs to merge any windows.
isNull(String) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IsNullFn
 
isNullable() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
 
isNullable() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
isNullable() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
Returns whether the field is nullable.
IsNullFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IsNullFn
 
isNumericType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isOneOf(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.isOneOf(T...).
isOneOf(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.isOneOf(T...).
isOpen() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
 
isOpen() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
isOpen() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
 
isOutboundEdgeOfVertex(Edge, String, String, String) - Method in interface org.apache.beam.runners.jet.DAGBuilder.WiringListener
 
isOutboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
 
isPreviewEnabled() - Method in class org.apache.beam.sdk.io.cdap.context.BatchSinkContextImpl
 
isPreviewEnabled() - Method in class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
 
isPreviewEnabled() - Method in class org.apache.beam.sdk.io.cdap.context.StreamingSourceContextImpl
 
isPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
True if the column is part of the primary key, false otherwise.
isPrimitiveType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isProduceStatusUpdateOnEveryEvent() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Indicates if the status update needs to be sent after each event's processing.
isQueueNonExclusive(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClient
 
isQueueNonExclusive(String) - Method in interface org.apache.beam.sdk.io.solace.broker.SempClient
Determines if the specified queue is non-exclusive.
isReadOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
isReadSeekEfficient() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
isReady() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
isReady() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterator
Returns true if and only if Iterator.hasNext() and Iterator.next() will not require an expensive operation.
isReceiverStopped() - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
isRedistributed() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
isRegisterByteSizeObserverCheap(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
isRegisterByteSizeObserverCheap(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.Coder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(ReadableDuration) - Method in class org.apache.beam.sdk.coders.DurationCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(IterableT) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
Returns whether both keyCoder and valueCoder are considered not expensive.
isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
LengthPrefixCoder is cheap if valueCoder is cheap.
isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
NullableCoder is cheap if valueCoder is cheap.
isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, whether this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
isRegisterByteSizeObserverCheap(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
isRegisterByteSizeObserverCheap(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
isRegisterByteSizeObserverCheap(ProducerRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
isRegisterByteSizeObserverCheap(OffsetRange) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
isRegisterByteSizeObserverCheap(RawUnionValue) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
Since this coder uses elementCoders.get(index) and coders that are known to run in constant time, we defer the return value to that coder.
isRegisterByteSizeObserverCheap(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
isResume() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
isRowLocked(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Returns true if row is locked.
isRunnerDeterminedSharding() - Method in interface org.apache.beam.runners.direct.DirectTestOptions
 
isSdfTimer(String) - Static method in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
A helper function to check whether the given timer is the one set for rescheduling a BeamFnApi.DelayedBundleApplication.
isSetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
isShouldReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Whether additional diagnostic metrics should be reported for a Transform.
isSideInputLookupJoin() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
isSimple() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
Whether it's a simple name, with a single name component.
isSplittable() - Method in class org.apache.beam.sdk.io.CompressedSource
Determines whether a single file represented by this source is splittable.
isSplittable() - Method in class org.apache.beam.sdk.io.FileBasedSource
Determines whether a file represented by this source can be split into bundles.
isStart() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
 
isStarted() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Returns true if there has been a call to OffsetBasedSource.OffsetBasedReader.start().
isStarted() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
isStreaming() - Method in interface org.apache.beam.sdk.options.StreamingOptions
Set to true if running a streaming pipeline.
isStreamingEngine() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
isStringType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
isStringType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isSubtypeOf(Schema.TypeName) - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
isSubtypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Return true if this type is a subtype of the given type.
isSuccess() - Method in class org.apache.beam.sdk.io.tika.ParseResult
Returns whether this file was parsed successfully.
isSuccess() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
isSupertypeOf(Schema.TypeName) - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
Whether this is a supertype of another type.
isSupertypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns true if this type is assignable from the given type.
isSupported() - Method in enum org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
 
isSystemTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Whether the given transaction is a Spanner system transaction.
isTableEmpty(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Returns true if the table is empty.
isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
isTableResolved(Table) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
True if the table was resolved using the Calcite schema.
isTerminal() - Method in enum org.apache.beam.runners.local.ExecutionDriver.DriverState
 
isTerminal() - Method in enum org.apache.beam.sdk.PipelineResult.State
 
isTerminated(JobApi.JobState.Enum) - Static method in class org.apache.beam.runners.jobsubmission.JobInvocation
 
isTimer() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
isTimestampCombinerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Returns true if the topic exists.
isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
isTopicExists(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
isTriggerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isTrue() - Method in interface org.apache.beam.io.requestresponse.CallShouldBackoff
Reports whether to back off.
isTrustSelfSignedCerts() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
isUnbounded() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Gets value of a plugin type.
isUnknown() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
isUnknown() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
Returns true if any of the values for rowCount, rate, or window is infinite.
isUnknown() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return true if there is no timing information for the current PaneInfo.
isUpdate() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Whether to update the currently running pipeline with the same name as this one.
isValid(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
isValidPartition(Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Checks if the partition's start key is before its end key.
isWholeStream - Variable in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
Whether the encoded or decoded value fills the remainder of the output or input (resp.) record/stream contents.
isWildcard(GcsPath) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Returns true if the given spec contains a wildcard.
isWrapperFor(Class<?>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
isWrapping() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
 
isZero() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
 
isZero() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
item(String, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and string value.
item(String, ValueProvider<?>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and ValueProvider.
item(String, Integer) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and integer value.
item(String, Long) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and long value.
item(String, Float) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and floating point value.
item(String, Double) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and floating point value.
item(String, Boolean) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and boolean value.
item(String, Instant) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and timestamp value.
item(String, Duration) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and duration value.
item(String, Class<T>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and class value.
item(String, DisplayData.Type, T) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key, type, and value.
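For illustration only (not part of the generated index): a minimal sketch of registering display items from a transform's populateDisplayData method; the enclosing transform and its fields (tableName, batchSize) are hypothetical.
    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      // Each item() overload infers the display type from the Java value type.
      builder
          .add(DisplayData.item("tableName", tableName).withLabel("Output table"))
          .add(DisplayData.item("batchSize", batchSize))
          .add(DisplayData.item("flushInterval", Duration.standardSeconds(30)));
    }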
Item() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
items() - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
items() - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
 
items() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
ItemSpec() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
iterable(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
iterable() - Static method in class org.apache.beam.sdk.transforms.Materializations
For internal use only; no backwards-compatibility guarantees.
ITERABLE_MATERIALIZATION_URN - Static variable in class org.apache.beam.sdk.transforms.Materializations
The URN for a Materialization where the primitive view type is an iterable of fully specified windowed values.
IterableBackedListViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableBackedListViewFn
 
IterableCoder<T> - Class in org.apache.beam.sdk.coders
An IterableCoder encodes any Iterable in the format of IterableLikeCoder.
IterableCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.IterableCoder
 
IterableLikeCoder<T,IterableT extends java.lang.Iterable<T>> - Class in org.apache.beam.sdk.coders
An abstract base class with functionality for assembling a Coder for a class that implements Iterable.
IterableLikeCoder(Coder<T>, String) - Constructor for class org.apache.beam.sdk.coders.IterableLikeCoder
 
iterables() - Static method in class org.apache.beam.sdk.transforms.Flatten
Returns a PTransform that takes a PCollection<Iterable<T>> and returns a PCollection<T> containing all the elements from all the Iterables.
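A minimal sketch of Flatten.iterables(), illustrative only; the input collection name is hypothetical.
    PCollection<Iterable<String>> grouped = ...;  // e.g. grouped values per key
    PCollection<String> flattened = grouped.apply(Flatten.<String>iterables());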
iterables() - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each item in each iterable of the input PCollection to a String using the Object.toString() method, joining items with a "," delimiter (no delimiter after the last element).
iterables(String) - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each item in each iterable of the input PCollection to a String using the Object.toString() method, joining items with the specified delimiter (no delimiter after the last element).
iterables(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Iterable.
iterableView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Iterable<T>> capable of processing elements windowed using the provided WindowingStrategy.
IterableViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
Deprecated.
 
IterableViewFn2(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
 
iterableViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, T>>, PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
iterableWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
iterableWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
iterator() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
iterator() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterable
 
iterator() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
 
iterator() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
 
iterator() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
iterator() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
 

J

jarPath() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
The Beam filesystem path to the jar where the method was defined.
javaAggregateFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
 
JavaBeanSchema - Class in org.apache.beam.sdk.schemas
A SchemaProvider for Java Bean objects.
JavaBeanSchema() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema
 
JavaBeanSchema.GetterTypeSupplier - Class in org.apache.beam.sdk.schemas
FieldValueTypeSupplier that's based on getter methods.
JavaBeanSchema.SetterTypeSupplier - Class in org.apache.beam.sdk.schemas
FieldValueTypeSupplier that's based on setter methods.
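A minimal sketch, illustrative only, of a Java Bean registered with JavaBeanSchema via the DefaultSchema annotation; the Purchase class and its fields are hypothetical.
    @DefaultSchema(JavaBeanSchema.class)
    public class Purchase {
      private String userId;
      private double amount;

      public Purchase() {}

      // Getter/setter pairs define the schema fields (see GetterTypeSupplier / SetterTypeSupplier).
      public String getUserId() { return userId; }
      public void setUserId(String userId) { this.userId = userId; }
      public double getAmount() { return amount; }
      public void setAmount(double amount) { this.amount = amount; }
    }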
JavaBeanUtils - Class in org.apache.beam.sdk.schemas.utils
A set of utilities to generate getter and setter classes for JavaBean objects.
JavaBeanUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
JavaClassLookupAllowListFactory() - Constructor for class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.JavaClassLookupAllowListFactory
 
JavaExplodeTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
An implementation of TypedSchemaTransformProvider for Explode.
JavaExplodeTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
JavaExplodeTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaExplodeTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaExplodeTransformProvider.ExplodeTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
A SchemaTransform for Explode.
JavaFieldSchema - Class in org.apache.beam.sdk.schemas
A SchemaProvider for Java POJO objects.
JavaFieldSchema() - Constructor for class org.apache.beam.sdk.schemas.JavaFieldSchema
 
JavaFieldSchema.JavaFieldTypeSupplier - Class in org.apache.beam.sdk.schemas
FieldValueTypeSupplier that's based on public fields.
JavaFieldTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
 
JavaFilterTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
An implementation of TypedSchemaTransformProvider for Filter for the Java language.
JavaFilterTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
JavaFilterTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaFilterTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaFilterTransformProvider.JavaFilterTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
A SchemaTransform for Filter-java.
javaIterator(Iterator<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
Wraps a Scala Iterator as a Java Iterator.
JavaMapToFieldsTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
An implementation of TypedSchemaTransformProvider for MapToFields for the Java language.
JavaMapToFieldsTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
JavaMapToFieldsTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaMapToFieldsTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaMapToFieldsTransformProvider.JavaMapToFieldsTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
A SchemaTransform for MapToFields-java.
JavaRowUdf - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaRowUdf(JavaRowUdf.Configuration, Schema) - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf
 
JavaRowUdf.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaRowUdf.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
 
JavaScalarFunction() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
 
javaScalarFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
 
javaTypeForFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.FieldTypeDescriptors
JavaUdfLoader - Class in org.apache.beam.sdk.extensions.sql.impl
Loads UdfProvider implementations from user-provided jars.
JavaUdfLoader() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
 
JAXBCoder<T> - Class in org.apache.beam.sdk.io.xml
A coder for JAXB annotated objects.
JdbcConnection - Class in org.apache.beam.sdk.extensions.sql.impl
Beam JDBC Connection.
JdbcDriver - Class in org.apache.beam.sdk.extensions.sql.impl
Calcite JDBC driver with Beam defaults.
JdbcDriver() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
JdbcIO - Class in org.apache.beam.sdk.io.jdbc
IO to read and write data on JDBC.
JdbcIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.jdbc
A POJO describing a DataSource, either by providing a DataSource directly or by providing all the properties needed to create one.
JdbcIO.DataSourceProviderFromDataSourceConfiguration - Class in org.apache.beam.sdk.io.jdbc
Wraps a JdbcIO.DataSourceConfiguration to provide a DataSource.
JdbcIO.DefaultRetryStrategy - Class in org.apache.beam.sdk.io.jdbc
The default Predicate used to detect deadlocks.
JdbcIO.PoolableDataSourceProvider - Class in org.apache.beam.sdk.io.jdbc
JdbcIO.PreparedStatementSetter<T> - Interface in org.apache.beam.sdk.io.jdbc
An interface used by JdbcIO.Write to set the parameters of the PreparedStatement used to write elements into the database.
JdbcIO.Read<T> - Class in org.apache.beam.sdk.io.jdbc
Implementation of JdbcIO.read().
JdbcIO.ReadAll<ParameterT,OutputT> - Class in org.apache.beam.sdk.io.jdbc
Implementation of JdbcIO.readAll().
JdbcIO.ReadRows - Class in org.apache.beam.sdk.io.jdbc
Implementation of JdbcIO.readRows().
JdbcIO.ReadWithPartitions<T,PartitionColumnT> - Class in org.apache.beam.sdk.io.jdbc
JdbcIO.RetryConfiguration - Class in org.apache.beam.sdk.io.jdbc
Builder used to help with retry configuration for JdbcIO.
JdbcIO.RetryStrategy - Interface in org.apache.beam.sdk.io.jdbc
An interface used to control if we retry the statements when a SQLException occurs.
JdbcIO.RowMapper<T> - Interface in org.apache.beam.sdk.io.jdbc
An interface used by JdbcIO.Read for converting each row of the ResultSet into an element of the resulting PCollection.
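A minimal sketch, illustrative only, of JdbcIO.read() combining a DataSourceConfiguration with a RowMapper; the JDBC driver, URL, credentials, and query shown here are hypothetical.
    PCollection<KV<Integer, String>> rows =
        pipeline.apply(
            JdbcIO.<KV<Integer, String>>read()
                .withDataSourceConfiguration(
                    JdbcIO.DataSourceConfiguration.create(
                            "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb")
                        .withUsername("user")
                        .withPassword("password"))
                .withQuery("SELECT id, name FROM users")
                // The RowMapper converts each ResultSet row into a pipeline element.
                .withRowMapper(rs -> KV.of(rs.getInt("id"), rs.getString("name")))
                .withCoder(KvCoder.of(VarIntCoder.of(), StringUtf8Coder.of())));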
JdbcIO.StatementPreparator - Interface in org.apache.beam.sdk.io.jdbc
An interface used by JdbcIO to set the parameters of a PreparedStatement before it is executed against the database.
JdbcIO.Write<T> - Class in org.apache.beam.sdk.io.jdbc
This class is used as the default return value of JdbcIO.write().
JdbcIO.WriteVoid<T> - Class in org.apache.beam.sdk.io.jdbc
A PTransform to write to a JDBC datasource.
JdbcIO.WriteWithResults<T,V extends JdbcWriteResult> - Class in org.apache.beam.sdk.io.jdbc
A PTransform to write to a JDBC datasource.
JdbcReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
JdbcReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc
An implementation of SchemaTransformProvider for reading from JDBC connections using JdbcIO.
JdbcReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
 
JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.jdbc
 
JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.jdbc
 
JdbcReadWithPartitionsHelper<PartitionT> - Interface in org.apache.beam.sdk.io.jdbc
A helper for JdbcIO.ReadWithPartitions that handles range calculations.
JdbcSchemaIOProvider - Class in org.apache.beam.sdk.io.jdbc
An implementation of SchemaIOProvider for reading and writing with JdbcIO.
JdbcSchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
 
jdbcUrl() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
JdbcWriteResult - Class in org.apache.beam.sdk.io.jdbc
The result of writing a row to JDBC datasource.
JdbcWriteResult() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteResult
 
JdbcWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
JdbcWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.jdbc
An implementation of SchemaTransformProvider for writing to JDBC connections using JdbcIO.
JdbcWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
 
JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.jdbc
 
JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.jdbc
 
JetMetricResults - Class in org.apache.beam.runners.jet.metrics
Jet specific MetricResults.
JetMetricResults(IMap<String, MetricUpdates>) - Constructor for class org.apache.beam.runners.jet.metrics.JetMetricResults
 
JetMetricsContainer - Class in org.apache.beam.runners.jet.metrics
Jet specific implementation of MetricsContainer.
JetMetricsContainer(String, String, Processor.Context) - Constructor for class org.apache.beam.runners.jet.metrics.JetMetricsContainer
 
JetPipelineOptions - Interface in org.apache.beam.runners.jet
Pipeline options specific to the Jet runner.
JetPipelineResult - Class in org.apache.beam.runners.jet
Jet specific implementation of PipelineResult.
JetRunner - Class in org.apache.beam.runners.jet
Jet specific implementation of Beam's PipelineRunner.
JetRunnerRegistrar - Class in org.apache.beam.runners.jet
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the JetRunner.
JetRunnerRegistrar.Options - Class in org.apache.beam.runners.jet
Registers the JetPipelineOptions.
JetRunnerRegistrar.Runner - Class in org.apache.beam.runners.jet
Registers the JetRunner.
JmsIO - Class in org.apache.beam.sdk.io.jms
An unbounded source for JMS destinations (queues or topics).
JmsIO.ConnectionFactoryContainer<T extends JmsIO.ConnectionFactoryContainer<T>> - Interface in org.apache.beam.sdk.io.jms
 
JmsIO.MessageMapper<T> - Interface in org.apache.beam.sdk.io.jms
An interface used by JmsIO.Read for converting each JMS Message into an element of the resulting PCollection.
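A minimal sketch, illustrative only, of JmsIO.readMessage() with a MessageMapper that extracts the text of each JMS message; the connection factory and queue name are hypothetical, and exact option names may vary by Beam version.
    PCollection<String> payloads =
        pipeline.apply(
            JmsIO.<String>readMessage()
                .withConnectionFactory(connectionFactory)
                .withQueue("my-queue")
                // The MessageMapper turns each javax.jms.Message into a pipeline element.
                .withMessageMapper(message -> ((TextMessage) message).getText())
                .withCoder(StringUtf8Coder.of()));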
JmsIO.Read<T> - Class in org.apache.beam.sdk.io.jms
A PTransform to read from a JMS destination.
JmsIO.Write<EventT> - Class in org.apache.beam.sdk.io.jms
A PTransform to write to a JMS queue.
JmsIOException - Exception in org.apache.beam.sdk.io.jms
 
JmsIOException(String) - Constructor for exception org.apache.beam.sdk.io.jms.JmsIOException
 
JmsIOException(String, Throwable) - Constructor for exception org.apache.beam.sdk.io.jms.JmsIOException
 
JmsRecord - Class in org.apache.beam.sdk.io.jms
JmsRecord contains the message payload of the record as well as metadata (JMS headers and properties).
JmsRecord(String, long, String, Destination, Destination, int, boolean, String, long, int, Map<String, Object>, String) - Constructor for class org.apache.beam.sdk.io.jms.JmsRecord
 
JOB_ID - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
JOB_PORT_FLAG_NAME - Static variable in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
JobBundleFactory - Interface in org.apache.beam.runners.fnexecution.control
A factory that has all job-scoped information, and can be combined with stage-scoped information to create a StageBundleFactory.
jobId() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
JobInfo - Class in org.apache.beam.runners.fnexecution.provisioning
A subset of ProvisionApi.ProvisionInfo that specifies a unique job, while omitting fields that are not known to the runner operator.
JobInfo() - Constructor for class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
JobInvocation - Class in org.apache.beam.runners.jobsubmission
Internal representation of a Job which has been invoked (prepared and run) by a client.
JobInvocation(JobInfo, ListeningExecutorService, RunnerApi.Pipeline, PortablePipelineRunner) - Constructor for class org.apache.beam.runners.jobsubmission.JobInvocation
 
JobInvoker - Class in org.apache.beam.runners.jobsubmission
Factory to create JobInvocation instances.
JobInvoker(String) - Constructor for class org.apache.beam.runners.jobsubmission.JobInvoker
 
jobName() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
JobNameFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
 
JobPreparation - Class in org.apache.beam.runners.jobsubmission
A job that has been prepared, but not invoked.
JobPreparation() - Constructor for class org.apache.beam.runners.jobsubmission.JobPreparation
 
JobServerDriver - Class in org.apache.beam.runners.jobsubmission
Shared code for starting and serving an InMemoryJobService.
JobServerDriver(JobServerDriver.ServerConfiguration, ServerFactory, ServerFactory, JobServerDriver.JobInvokerFactory) - Constructor for class org.apache.beam.runners.jobsubmission.JobServerDriver
 
JobServerDriver.JobInvokerFactory - Interface in org.apache.beam.runners.jobsubmission
 
JobServerDriver.ServerConfiguration - Class in org.apache.beam.runners.jobsubmission
Configuration for the job server.
JobSpecification(Job, RunnerApi.Pipeline, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
jobToString(Job) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Renders a Job as a string.
Join - Class in org.apache.beam.sdk.extensions.joinlibrary
Utility class with different versions of joins.
Join() - Constructor for class org.apache.beam.sdk.extensions.joinlibrary.Join
 
join(String, CoGroup.By) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.ExpandCrossProduct
Select the following fields for the specified PCollection with the specified join args.
join(String, CoGroup.By) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
Select the following fields for the specified PCollection with the specified join args.
join(CoGroup.By) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup
Join all input PCollections using the same args.
join(String, CoGroup.By) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup
Specify the following join arguments (including fields to join by) for the specified PCollection.
Join - Class in org.apache.beam.sdk.schemas.transforms
A transform that performs equijoins across two schema PCollections.
Join() - Constructor for class org.apache.beam.sdk.schemas.transforms.Join
 
Join.FieldsEqual - Class in org.apache.beam.sdk.schemas.transforms
Predicate object to specify fields to compare when doing an equi-join.
Join.FieldsEqual.Impl - Class in org.apache.beam.sdk.schemas.transforms
Implementation class for FieldsEqual.
Join.FullOuterJoin<K,V1,V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
PTransform representing a full outer join of two collections of KV elements.
Join.Impl<LhsT,RhsT> - Class in org.apache.beam.sdk.schemas.transforms
Implementation class for Join.
Join.InnerJoin<K,V1,V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
PTransform representing an inner join of two collections of KV elements.
Join.LeftOuterJoin<K,V1,V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
PTransform representing a left outer join of two collections of KV elements.
Join.RightOuterJoin<K,V1,V2> - Class in org.apache.beam.sdk.extensions.joinlibrary
PTransform representing a right outer join of two collections of KV elements.
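A minimal sketch, illustrative only, of the schema Join transform performing an equijoin on a shared field; the input collections and the field name "userId" are hypothetical.
    PCollection<Row> users = ...;
    PCollection<Row> purchases = ...;
    // Inner equijoin on the "userId" field of both schema PCollections.
    PCollection<Row> joined =
        users.apply(Join.<Row, Row>innerJoin(purchases).using("userId"));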
JoinAsLookup(RexNode, BeamSqlSeekableTable, Schema, Schema, int, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.JoinAsLookup
 
JoinRelOptRuleCall - Class in org.apache.beam.sdk.extensions.sql.impl.rule
A class that intercepts the constructed join and checks that it is a legal join before passing it to the actual RelOptRuleCall.
JoinRelOptRuleCall.JoinChecker - Interface in org.apache.beam.sdk.extensions.sql.impl.rule
A function that takes the output relation and checks whether it is a legal relational node.
JsonArrayCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
 
JsonArrayCoder() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
jsonBytesLike(String) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
 
jsonBytesLike(Map<String, Object>) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
 
JsonIO - Class in org.apache.beam.sdk.io.json
PTransforms for reading and writing JSON files.
JsonIO() - Constructor for class org.apache.beam.sdk.io.json.JsonIO
 
JsonIO.Write<T> - Class in org.apache.beam.sdk.io.json
PTransform for writing JSON files.
JsonMatcher<T> - Class in org.apache.beam.sdk.testing
Matcher to compare a string or byte[] representing a JSON Object, independent of field order.
JsonMatcher(Map<String, Object>) - Constructor for class org.apache.beam.sdk.testing.JsonMatcher
 
JsonPayloadSerializerProvider - Class in org.apache.beam.sdk.schemas.io.payloads
 
JsonPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
 
JsonReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
A FileReadSchemaTransformFormatProvider that reads newline-delimited JSONs.
JsonReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.JsonReadSchemaTransformFormatProvider
 
jsonSchemaFromBeamSchema(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
 
jsonSchemaStringFromBeamSchema(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JsonUtils
 
jsonStringLike(String) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
 
jsonStringLike(Map<String, Object>) - Static method in class org.apache.beam.sdk.testing.JsonMatcher
 
JsonToRow - Class in org.apache.beam.sdk.transforms
Creates a PTransform to convert input JSON objects to Rows with a given Schema.
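A minimal sketch, illustrative only, of JsonToRow with an explicit Schema; the field names are hypothetical.
    Schema schema =
        Schema.builder().addStringField("name").addInt32Field("age").build();
    PCollection<String> json = ...;
    PCollection<Row> rows = json.apply(JsonToRow.withSchema(schema));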
JsonToRow() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow
 
JsonToRow.JsonToRowWithErrFn - Class in org.apache.beam.sdk.transforms
 
JsonToRow.JsonToRowWithErrFn.Builder - Class in org.apache.beam.sdk.transforms
 
JsonToRow.JsonToRowWithErrFn.ParseWithError - Class in org.apache.beam.sdk.transforms
 
JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder - Class in org.apache.beam.sdk.transforms
 
JsonToRow.ParseResult - Class in org.apache.beam.sdk.transforms
The result of a JsonToRow.withExceptionReporting(Schema) transform.
JsonToRow.ParseResult.Builder - Class in org.apache.beam.sdk.transforms
 
JsonToRowWithErrFn() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
 
JsonUtils - Class in org.apache.beam.sdk.schemas.utils
Utilities to convert JSON records to Beam Rows.
JsonUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.JsonUtils
 
jsonValueFromMessageValue(Descriptors.FieldDescriptor, Object, boolean, Predicate<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
JsonWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
 
JsonWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
JsonWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.JsonWriteSchemaTransformFormatProvider
 
JsonWriteTransformProvider - Class in org.apache.beam.sdk.io.json.providers
JsonWriteTransformProvider() - Constructor for class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
 
JsonWriteTransformProvider.JsonWriteConfiguration - Class in org.apache.beam.sdk.io.json.providers
Configuration for writing JSON files.
JsonWriteTransformProvider.JsonWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.json.providers
JsonWriteTransformProvider.JsonWriteTransform - Class in org.apache.beam.sdk.io.json.providers
JvmInitializer - Interface in org.apache.beam.sdk.harness
A service interface for defining one-time initialization of the JVM during pipeline execution.
JvmInitializers - Class in org.apache.beam.sdk.fn
Helpers for executing JvmInitializer implementations.
JvmInitializers() - Constructor for class org.apache.beam.sdk.fn.JvmInitializers
 

K

KAFKA - Static variable in class org.apache.beam.sdk.managed.Managed
 
KAFKA_READ_OVERRIDE - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A PTransformOverride for runners to swap ReadFromKafkaViaSDF to the legacy Kafka read when the runner does not have good support for executing unbounded Splittable DoFns.
KAFKA_READ_WITH_METADATA_TRANSFORM_URN_V2 - Static variable in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation
 
KAFKA_WRITE_TRANSFORM_URN_V2 - Static variable in class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation
 
KafkaCheckpointMark - Class in org.apache.beam.sdk.io.kafka
Checkpoint for a KafkaUnboundedReader.
KafkaCheckpointMark(List<KafkaCheckpointMark.PartitionMark>, Optional<KafkaUnboundedReader<?, ?>>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
KafkaCheckpointMark.PartitionMark - Class in org.apache.beam.sdk.io.kafka
A tuple to hold topic, partition, and offset that comprise the checkpoint for a single partition.
KafkaCommitOffset<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform that commits offsets of KafkaRecord.
KafkaConnectUtils - Class in org.apache.beam.io.debezium
 
KafkaConnectUtils() - Constructor for class org.apache.beam.io.debezium.KafkaConnectUtils
 
KafkaIO - Class in org.apache.beam.sdk.io.kafka
An unbounded source and a sink for Kafka topics.
KafkaIO.Read<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform to read from Kafka topics.
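A minimal sketch, illustrative only, of KafkaIO.read(); the bootstrap servers and topic name are hypothetical.
    PCollection<KV<Long, String>> records =
        pipeline.apply(
            KafkaIO.<Long, String>read()
                .withBootstrapServers("broker-1:9092")
                .withTopic("my-topic")
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                // Drop the Kafka metadata and keep only the key/value pairs.
                .withoutMetadata());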
KafkaIO.Read.External - Class in org.apache.beam.sdk.io.kafka
Exposes KafkaIO.TypedWithoutMetadata as an external transform for cross-language usage.
KafkaIO.Read.External.Configuration - Class in org.apache.beam.sdk.io.kafka
Parameters class to expose the Read transform to an external SDK.
KafkaIO.Read.FakeFlinkPipelineOptions - Interface in org.apache.beam.sdk.io.kafka
 
KafkaIO.ReadSourceDescriptors<K,V> - Class in org.apache.beam.sdk.io.kafka
KafkaIO.TypedWithoutMetadata<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform to read from Kafka topics.
KafkaIO.Write<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform to write to a Kafka topic with KVs.
KafkaIO.Write.External - Class in org.apache.beam.sdk.io.kafka
Exposes KafkaIO.Write as an external transform for cross-language usage.
KafkaIO.Write.External.Configuration - Class in org.apache.beam.sdk.io.kafka
Parameters class to expose the Write transform to an external SDK.
KafkaIO.WriteRecords<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform to write to a Kafka topic with ProducerRecords.
KafkaIOTranslation - Class in org.apache.beam.sdk.io.kafka.upgrade
Utility methods for translating KafkaIO transforms to and from RunnerApi representations.
KafkaIOTranslation() - Constructor for class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation
 
KafkaIOTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.kafka.upgrade
 
KafkaIOTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.kafka.upgrade
 
KafkaIOUtils - Class in org.apache.beam.sdk.io.kafka
Common utility functions and default configurations for KafkaIO.Read and KafkaIO.ReadSourceDescriptors.
KafkaIOUtils() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIOUtils
 
KafkaMetrics - Interface in org.apache.beam.sdk.io.kafka
Stores and exports metrics for a batch of Kafka Client RPCs.
KafkaMetrics.KafkaMetricsImpl - Class in org.apache.beam.sdk.io.kafka
Metrics of a batch of RPCs.
KafkaMetrics.NoOpKafkaMetrics - Class in org.apache.beam.sdk.io.kafka
No-op implementation of KafkaResults.
KafkaMetricsImpl() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
 
KafkaPublishTimestampFunction<T> - Interface in org.apache.beam.sdk.io.kafka
An interface for providing custom timestamp for elements written to Kafka.
KafkaReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.kafka
Configuration for reading from a Kafka topic.
KafkaReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
KafkaReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.kafka
KafkaReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.kafka
 
KafkaReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
KafkaReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.kafka
 
KafkaRecord<K,V> - Class in org.apache.beam.sdk.io.kafka
KafkaRecord contains key and value of the record as well as metadata for the record (topic name, partition id, and offset).
KafkaRecord(String, int, long, long, KafkaTimestampType, Headers, K, V) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
 
KafkaRecord(String, int, long, long, KafkaTimestampType, Headers, KV<K, V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
 
KafkaRecordCoder<K,V> - Class in org.apache.beam.sdk.io.kafka
KafkaRecordCoder(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
KafkaSchemaTransformTranslation - Class in org.apache.beam.sdk.io.kafka
 
KafkaSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation
 
KafkaSchemaTransformTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.kafka
 
KafkaSchemaTransformTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.kafka
 
KafkaSinkMetrics - Class in org.apache.beam.sdk.io.kafka
Helper class to create per-worker metrics for Kafka Sink stages.
KafkaSinkMetrics() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
 
KafkaSourceConsumerFn<T> - Class in org.apache.beam.io.debezium
Quick Overview
KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory - Class in org.apache.beam.io.debezium
 
KafkaSourceDescriptor - Class in org.apache.beam.sdk.io.kafka
Represents a Kafka source description.
KafkaSourceDescriptor() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
 
KafkaTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
Kafka table provider.
KafkaTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
 
KafkaTimestampType - Enum in org.apache.beam.sdk.io.kafka
This is a copy of Kafka's TimestampType.
KafkaWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
 
KafkaWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.kafka
 
KafkaWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.kafka
 
KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.kafka
 
KEEP_NESTED_NAME - Static variable in class org.apache.beam.sdk.schemas.utils.SelectHelpers
This policy keeps the raw nested field name.
keepEarliest() - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
keepLatest() - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
keepMostNestedFieldName() - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
For nested fields, keep just the most-nested field name.
key(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
 
KEY - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
KEY_FIELD_PROPERTY - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
keyCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
keyCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
Returns the Coder to use for the elements of the resulting keys iterable.
KeyedPCollectionTuple<K> - Class in org.apache.beam.sdk.transforms.join
An immutable tuple of keyed PCollections with key type K.
KeyedPCollectionTuple.TaggedKeyedPCollection<K,V> - Class in org.apache.beam.sdk.transforms.join
A utility class to help ensure coherence of tag and input PCollection types.
keyedValues() - Static method in class org.apache.beam.sdk.transforms.Deduplicate
Returns a deduplication transform that deduplicates keyed values using the key for up to 10 mins within the processing time domain.
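A minimal sketch, illustrative only, of Deduplicate.keyedValues(); the Event type is hypothetical.
    PCollection<KV<String, Event>> events = ...;
    // Subsequent values with the same key are dropped for up to 10 minutes of processing time.
    PCollection<KV<String, Event>> deduplicated =
        events.apply(Deduplicate.<String, Event>keyedValues());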
keyEncoderOf(KvCoder<K, V>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
 
keyField - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
KeyPairUtils - Class in org.apache.beam.sdk.io.snowflake
 
KeyPairUtils() - Constructor for class org.apache.beam.sdk.io.snowflake.KeyPairUtils
 
KeyPart() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.KeyPart
 
keys() - Method in interface org.apache.beam.sdk.state.MapState
Returns an Iterable over the keys contained in this map.
keys() - Method in interface org.apache.beam.sdk.state.MultimapState
Returns an Iterable over the keys contained in this multimap.
Keys<K> - Class in org.apache.beam.sdk.transforms
Keys<K> takes a PCollection of KV<K, V>s and returns a PCollection<K> of the keys.
keySet() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
kind - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
KinesisClientThrottledException - Exception in org.apache.beam.sdk.io.aws2.kinesis
Thrown when the Kinesis client was throttled due to rate limits.
KinesisClientThrottledException(String, KinesisException) - Constructor for exception org.apache.beam.sdk.io.aws2.kinesis.KinesisClientThrottledException
 
KinesisClientThrottledException - Exception in org.apache.beam.sdk.io.kinesis
Thrown when the Kinesis client was throttled due to rate limits.
KinesisClientThrottledException(String, AmazonClientException) - Constructor for exception org.apache.beam.sdk.io.kinesis.KinesisClientThrottledException
 
KinesisIO - Class in org.apache.beam.sdk.io.aws2.kinesis
IO to read from Kinesis streams.
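A minimal sketch, illustrative only, of KinesisIO.read() from the amazon-web-services2 module; the stream name is hypothetical and exact option names may vary by Beam version.
    PCollection<KinesisRecord> records =
        pipeline.apply(
            KinesisIO.read()
                .withStreamName("my-stream")
                .withInitialPositionInStream(InitialPositionInStream.LATEST));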
KinesisIO() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
 
KinesisIO - Class in org.apache.beam.sdk.io.kinesis
Deprecated.
Module beam-sdks-java-io-kinesis is deprecated and will eventually be removed. Please migrate to KinesisIO in module beam-sdks-java-io-amazon-web-services2.
KinesisIO() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisIO
Deprecated.
 
KinesisIO.Read - Class in org.apache.beam.sdk.io.aws2.kinesis
Implementation of KinesisIO.read().
KinesisIO.Read<T> - Class in org.apache.beam.sdk.io.kinesis
Deprecated.
Implementation of KinesisIO.read().
KinesisIO.RecordAggregation - Class in org.apache.beam.sdk.io.aws2.kinesis
Configuration of Kinesis record aggregation.
KinesisIO.RecordAggregation.Builder - Class in org.apache.beam.sdk.io.aws2.kinesis
 
KinesisIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.kinesis
Implementation of KinesisIO.write().
KinesisIO.Write - Class in org.apache.beam.sdk.io.kinesis
Deprecated.
Implementation of KinesisIO.write().
KinesisIO.Write.Result - Class in org.apache.beam.sdk.io.aws2.kinesis
KinesisIOOptions - Interface in org.apache.beam.sdk.io.aws2.kinesis
PipelineOptions for KinesisIO.
KinesisIOOptions.KinesisIOOptionsRegistrar - Class in org.apache.beam.sdk.io.aws2.kinesis
A registrar containing the default KinesisIOOptions.
KinesisIOOptions.MapFactory - Class in org.apache.beam.sdk.io.aws2.kinesis
 
KinesisIOOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.KinesisIOOptionsRegistrar
 
KinesisPartitioner<T> - Interface in org.apache.beam.sdk.io.aws2.kinesis
Kinesis interface for custom partitioner.
KinesisPartitioner - Interface in org.apache.beam.sdk.io.kinesis
Kinesis interface for custom partitioner.
KinesisPartitioner.ExplicitPartitioner<T> - Interface in org.apache.beam.sdk.io.aws2.kinesis
An explicit partitioner that always returns a non-null explicit hash key.
KinesisRecord - Class in org.apache.beam.sdk.io.aws2.kinesis
KinesisClientRecord enhanced with utility methods.
KinesisRecord(KinesisClientRecord, String, String) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
KinesisRecord(ByteBuffer, String, long, String, Instant, Instant, String, String) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
KinesisRecord - Class in org.apache.beam.sdk.io.kinesis
UserRecord enhanced with utility methods.
KinesisRecord(UserRecord, String, String) - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
KinesisRecord(ByteBuffer, String, long, String, Instant, Instant, String, String) - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
KinesisTransformRegistrar - Class in org.apache.beam.sdk.io.kinesis
Exposes KinesisIO.Write and KinesisIO.Read as an external transform for cross-language usage.
KinesisTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar
 
KinesisTransformRegistrar.ReadDataBuilder - Class in org.apache.beam.sdk.io.kinesis
 
KinesisTransformRegistrar.ReadDataBuilder.Configuration - Class in org.apache.beam.sdk.io.kinesis
 
KinesisTransformRegistrar.WriteBuilder - Class in org.apache.beam.sdk.io.kinesis
 
KinesisTransformRegistrar.WriteBuilder.Configuration - Class in org.apache.beam.sdk.io.kinesis
 
knownBuilderInstances() - Method in interface org.apache.beam.sdk.expansion.ExternalTransformRegistrar
A mapping from URN to an ExternalTransformBuilder instance.
knownBuilderInstances() - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar
 
knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
knownBuilderInstances() - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar
 
knownBuilderInstances() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
 
knownBuilders() - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar
 
knownBuilders() - Method in interface org.apache.beam.sdk.expansion.ExternalTransformRegistrar
Deprecated.
Prefer implementing 'knownBuilderInstances'. This method will be removed in a future version of Beam.
knownBuilders() - Method in class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar
 
knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
 
knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
 
knownBuilders() - Method in class org.apache.beam.sdk.io.GenerateSequence.External
 
knownBuilders() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
 
knownBuilders() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
 
knownTransforms() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionService.ExpansionServiceRegistrar
 
knownTransforms() - Method in class org.apache.beam.sdk.expansion.service.ExpansionService.ExternalTransformRegistrarLoader
 
knownUrns() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
 
knownUrns() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
 
knownUrns() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
 
KuduIO - Class in org.apache.beam.sdk.io.kudu
A bounded source and sink for Kudu.
KuduIO.FormatFunction<T> - Interface in org.apache.beam.sdk.io.kudu
An interface used by the KuduIO Write to convert an input record into an Operation to apply as a mutation in Kudu.
KuduIO.Read<T> - Class in org.apache.beam.sdk.io.kudu
Implementation of KuduIO.read().
KuduIO.Write<T> - Class in org.apache.beam.sdk.io.kudu
A PTransform that writes to Kudu.
kv(SerializableMatcher<? super K>, SerializableMatcher<? super V>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher that matches any KV with matching key and value.
KV<K,V> - Class in org.apache.beam.sdk.values
An immutable key/value pair.
KV.OrderByKey<K extends java.lang.Comparable<? super K>,V> - Class in org.apache.beam.sdk.values
A Comparator that orders KVs by the natural ordering of their keys.
KV.OrderByValue<K,V extends java.lang.Comparable<? super V>> - Class in org.apache.beam.sdk.values
A Comparator that orders KVs by the natural ordering of their values.
KvCoder<K,V> - Class in org.apache.beam.sdk.coders
A KvCoder encodes KVs.
kvEncoder(Encoder<K>, Encoder<V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a Spark Encoder for KV of StructType with fields key and value.
kvEncoderOf(KvCoder<K, V>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
 
kvs() - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each element of the input PCollection to a String by calling Object.toString() on the key, followed by a ",", followed by Object.toString() on the value.
kvs(String) - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each element of the input PCollection to a String by calling Object.toString() on the key, followed by the specified delimiter, followed by Object.toString() on the value.
kvs(TypeDescriptor<K>, TypeDescriptor<V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
KvSwap<K,V> - Class in org.apache.beam.sdk.transforms
KvSwap<K, V> takes a PCollection<KV<K, V>> and returns a PCollection<KV<V, K>>, where all the keys and values have been swapped.
kvWithKey(K) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher that matches any KV with the specified key.
kvWithKey(Coder<K>, K) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher that matches any KV with the specified key.
kvWithKey(SerializableMatcher<? super K>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher that matches any KV with matching key.
kvWithValue(V) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher that matches any KV with the specified value.
kvWithValue(Coder<V>, V) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher that matches any KV with the specified value.
kvWithValue(SerializableMatcher<? super V>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher that matches any KV with matching value.

L

LabeledMetricNameUtils - Class in org.apache.beam.sdk.metrics
Utility class for building and parsing labeled MetricNames.
LabeledMetricNameUtils() - Constructor for class org.apache.beam.sdk.metrics.LabeledMetricNameUtils
 
LabeledMetricNameUtils.MetricNameBuilder - Class in org.apache.beam.sdk.metrics
Builder class for a labeled MetricName.
LabeledMetricNameUtils.ParsedMetricName - Class in org.apache.beam.sdk.metrics
 
LABELS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
languageHint() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
 
LargeKeys - Interface in org.apache.beam.sdk.testing
Category tags for tests which validate that a Beam runner can handle keys up to a given size.
LargeKeys.Above100KB - Interface in org.apache.beam.sdk.testing
Tests if a runner supports 100KB keys.
LargeKeys.Above100MB - Interface in org.apache.beam.sdk.testing
Tests if a runner supports 100MB keys.
LargeKeys.Above10KB - Interface in org.apache.beam.sdk.testing
Tests if a runner supports 10KB keys.
LargeKeys.Above10MB - Interface in org.apache.beam.sdk.testing
Tests if a runner supports 10MB keys.
LargeKeys.Above1MB - Interface in org.apache.beam.sdk.testing
Tests if a runner supports 1MB keys.
largest(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the largest count elements of the input PCollection<T>, in decreasing order, sorted according to their natural order.
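A minimal sketch, illustrative only, of Top.largest() and Top.largestPerKey(); the input collections are hypothetical.
    PCollection<Integer> scores = ...;
    // Single output element: the three largest scores in decreasing order.
    PCollection<List<Integer>> top3 = scores.apply(Top.<Integer>largest(3));

    PCollection<KV<String, Integer>> scoresPerUser = ...;
    // One output element per key: that key's three largest values.
    PCollection<KV<String, List<Integer>>> top3PerUser =
        scoresPerUser.apply(Top.<String, Integer>largestPerKey(3));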
Largest() - Constructor for class org.apache.beam.sdk.transforms.Top.Largest
Deprecated.
 
largestContinuousRange() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
largestDoublesFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the largest count double values.
largestFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the largest count values.
largestIntsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the largest count int values.
largestLongsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the largest count long values.
largestPerKey(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the largest count values associated with that key in the input PCollection<KV<K, V>>, in decreasing order, sorted according to their natural order.
LargestUnique(long) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
Deprecated.
Creates a heap to track the largest sampleSize elements.
last - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
lastAttemptedOffset - Variable in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
lastAttemptedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
lastClaimedOffset - Variable in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
lastClaimedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
lastModifiedMillis() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
Last modification timestamp in milliseconds since Unix epoch.
LatencyRecordingHttpRequestInitializer - Class in org.apache.beam.sdk.extensions.gcp.util
HttpRequestInitializer for recording request to response latency of Http-based API calls.
LatencyRecordingHttpRequestInitializer(Histogram) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
 
LatencyRecordingHttpRequestInitializer(Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
 
Latest - Class in org.apache.beam.sdk.transforms
PTransform and Combine.CombineFn for computing the latest element in a PCollection.
latestContiguousRange() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
 
LazyAggregateCombineFn<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.extensions.sql.impl
LazyAggregateCombineFn(List<String>, String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
leaseWorkItem(String, LeaseWorkItemRequest) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Leases the work item for jobId.
LEAST_SIGNIFICANT_BITS_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
leaveCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
leaveCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each composite transform after all of its component transforms and their outputs have been visited.
leavePipeline(Pipeline) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
leavePipeline(Pipeline) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called when all values and transforms in a Pipeline have been visited.
LEFT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinPushThroughJoinRule
Instance of the rule that works on logical joins only, and pushes to the left.
left(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
 
left(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
 
left(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
 
left(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
 
left(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
 
left(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
 
leftOuterBroadcastJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
Perform a left outer join, broadcasting the right side.
leftOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Left Outer Join of two collections of KV elements.
leftOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
 
leftOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
Perform a left outer join.
length - Variable in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
 
lengthBytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
LengthPrefixCoder<T> - Class in org.apache.beam.sdk.coders
A Coder which is able to take any existing coder and wrap it such that it is only invoked in the outer context.
LengthPrefixUnknownCoders - Class in org.apache.beam.runners.fnexecution.wire
Utilities for replacing or wrapping unknown coders with LengthPrefixCoder.
LengthPrefixUnknownCoders() - Constructor for class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
 
lengthString(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
lessThan(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
lessThan(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
lessThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection and returns a PCollection with elements that are less than a given value, based on the elements' natural ordering.
lessThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are less than or equal to a given value, based on the elements' natural ordering.
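A minimal sketch, illustrative only, of Filter.lessThan() and Filter.lessThanEq(); the input collection is hypothetical.
    PCollection<Integer> numbers = ...;
    PCollection<Integer> underTen = numbers.apply(Filter.lessThan(10));
    PCollection<Integer> atMostTen = numbers.apply(Filter.lessThanEq(10));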
lessThanOrEqualTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
lessThanOrEqualTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
LEVEL_CONFIGURATION - Static variable in enum org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
Map from LogLevel enums to Java logging levels.
LHS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.Join
 
like(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
LIKE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
LIKE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
limit(Iterable<T>, int) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
Limits the PrefetchableIterable to the specified number of elements.
LimitNumberOfFiles(int) - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfFiles
 
LimitNumberOfTotalBytes(long) - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfTotalBytes
 
Lineage - Class in org.apache.beam.sdk.metrics
Standard collection of metrics used to record source and sink information for lineage tracking.
Lineage.Type - Enum in org.apache.beam.sdk.metrics
Lineage metrics resource types.
LINEAGE_NAMESPACE - Static variable in class org.apache.beam.sdk.metrics.Lineage
 
LineReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
A FileReadSchemaTransformFormatProvider that reads lines as Strings.
LineReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.LineReadSchemaTransformFormatProvider
 
LinesReadConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesReadConverter
 
LinesWriteConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.LinesWriteConverter
 
LIST_PARTITIONS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partitions identified during the execution of the Connector.
listAllFhirStores(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
List all FHIR stores in a dataset.
listAllFhirStores(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
ListCoder<T> - Class in org.apache.beam.sdk.coders
A Coder for List, using the format of IterableLikeCoder.
ListCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.ListCoder
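A short sketch of building a ListCoder from an element coder (the String element type is an arbitrary example):

    import java.util.List;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.ListCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    // Coder for List<String>, delegating element encoding to StringUtf8Coder.
    Coder<List<String>> coder = ListCoder.of(StringUtf8Coder.of());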
 
listCollectionIds() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for ListCollectionIdsRequest operations.
listDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for ListDocumentsRequest operations.
listJobMessages(String, String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Lists job messages with the given jobId.
listJobs(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Lists Dataflow Jobs in the project associated with the DataflowPipelineOptions.
listObjects(String, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
 
listObjects(String, String, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Lists objects given the bucket, prefix, and pageToken.
listOf(T) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
lists(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for List.
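For instance, a minimal sketch of composing type descriptors (the String element type is an example):

    import java.util.List;
    import org.apache.beam.sdk.values.TypeDescriptor;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // TypeDescriptor for List<String>, handy with MapElements.into(...).
    TypeDescriptor<List<String>> listOfStrings =
        TypeDescriptors.lists(TypeDescriptors.strings());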
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return a list of subscriptions for topic in project.
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return a list of topics for project.
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
listView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
listView(PCollection<T>, TupleTag<Materializations.IterableView<T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
ListViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
ListViewFn2(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
 
listViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, T>>, PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
listViewUsingVoidKey(PCollection<KV<Void, T>>, TupleTag<Materializations.MultimapView<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
listViewWithRandomAccess(PCollection<KV<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
listViewWithRandomAccess(PCollection<KV<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>, TupleTag<Materializations.MultimapView<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<List<T>> capable of processing elements windowed using the provided WindowingStrategy.
loadAggregateFunction(List<String>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
Load a user-defined aggregate function from the specified jar.
loader(PCollection<T>) - Static method in interface org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues
Factory to load SideInputValues from a Dataset based on the window strategy.
loadPluginClass(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
loadProviders() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformFormatProviders
Load all FileWriteSchemaTransformFormatProvider implementations.
loadProviders(Class<T>) - Static method in class org.apache.beam.sdk.schemas.io.Providers
 
loadScalarFunction(List<String>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
Load a user-defined scalar function from the specified jar.
LocalFileSystemRegistrar - Class in org.apache.beam.sdk.io
AutoService registrar for the LocalFileSystem.
LocalFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.LocalFileSystemRegistrar
 
LocalMktDate() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
 
LocalResources - Class in org.apache.beam.sdk.io
Helper functions for producing a ResourceId that references a local file or directory.
LocalTimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
 
LocalTimestampMillisConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
 
LocalWindmillHostportFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.LocalWindmillHostportFactory
 
location(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
location - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
location - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
lockAndRecordPartition(PartitionRecord) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Lock the partition in the metadata table for the DoFn streaming it.
log(BeamFnApi.LogEntry) - Method in interface org.apache.beam.runners.fnexecution.logging.LogWriter
Write the contents of the Log Entry to some logging backend.
log(BeamFnApi.LogEntry) - Method in class org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter
 
LogAppendTimePolicy(Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
 
logging(StreamObserver<BeamFnApi.LogControl>) - Method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
 
LoggingHandler() - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
 
LoggingTransformProvider - Class in org.apache.beam.sdk.schemas.transforms.providers
An implementation of TypedSchemaTransformProvider for Logging.
LoggingTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
LoggingTransformProvider.Configuration - Class in org.apache.beam.sdk.schemas.transforms.providers
 
LoggingTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.schemas.transforms.providers
 
LoggingTransformProvider.LoggingTransform - Class in org.apache.beam.sdk.schemas.transforms.providers
A SchemaTransform for logging.
LogicalEndpoint - Class in org.apache.beam.sdk.fn.data
A logical endpoint is a pair of an instruction ID corresponding to the BeamFnApi.ProcessBundleRequest and the transform within the processing graph.
LogicalEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
logicalType(Schema.LogicalType<InputT, BaseT>) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
Creates a logical type based on a primitive field type.
logInfo(Function0<String>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
LogWriter - Interface in org.apache.beam.runners.fnexecution.logging
A consumer of Beam Log Entries.
Longs() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Longs
 
longs() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Long.
longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<Long> and returns a PCollection<Long> whose contents are the maximum of the input PCollection's elements, or Long.MIN_VALUE if there are no elements.
longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<Long> and returns a PCollection<Long> whose contents are the minimum of the input PCollection's elements, or Long.MAX_VALUE if there are no elements.
longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<Long> and returns a PCollection<Long> whose contents are the sum of the input PCollection's elements, or 0 if there are no elements.
longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, Long>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, Long>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<KV<K, Long>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
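As a rough sketch, the global and per-key variants compose like this (longs and keyedLongs are assumed input PCollections):

    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Single-element PCollection holding the maximum of all input longs.
    PCollection<Long> max = longs.apply(Max.longsGlobally());
    // One output element per distinct key, holding the sum of that key's values.
    PCollection<KV<String, Long>> sums = keyedLongs.apply(Sum.longsPerKey());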
longToByteArray(long) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
LossyTimeMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimeMicrosConversion
 
LossyTimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimestampMicrosConversion
 
lpad(String, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
lpad(String, Long, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
lpad(byte[], Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
lpad(byte[], Long, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
ltrim(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
ltrim(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
LTRIM - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
LTRIM_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 

M

main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkMiniClusterEntryPoint
 
main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkPipelineRunner
Main method to be called only as the entry point to an executable jar with structure as defined in PortablePipelineJarUtils.
main(String[]) - Static method in class org.apache.beam.runners.flink.FlinkPortableClientEntryPoint
Main method to be called standalone or by Flink (CLI or REST API).
main(String[]) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
 
main(String[]) - Static method in class org.apache.beam.runners.spark.SparkPipelineRunner
Main method to be called only as the entry point to an executable jar with structure as defined in PortablePipelineJarUtils.
main(String[]) - Static method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount
 
main(String[]) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionService
 
main(String[]) - Static method in class org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample
 
main(String[]) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon
Use this to create the index for reading before IT read tests.
main(String[]) - Static method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
 
makeCost(double, double, double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
 
makeCost(double, double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
 
makeHL7v2ListRequest(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Makes an HL7v2 list request and returns the list-messages response.
makeHL7v2ListRequest(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
makeHugeCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
 
makeInfiniteCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
 
makeListRequest(HealthcareApiClient, String, Instant, Instant, String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
Makes a list request and returns the list-messages response.
makeOrderKeysFromCollation(RelCollation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
Transforms a list of Calcite ORDER BY keys (a RelCollation) into OrderKeys.
makeOutput(FileIO.ReadableFile, OffsetRange, FileBasedSource<InT>, BoundedSource.BoundedReader<InT>) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn
 
makeProgress() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
 
makeRel(RelOptCluster, RelTraitSet, RelBuilder, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Makes a send-time-bounded HL7v2 list request.
makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
makeTinyCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
 
makeZeroCost() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
 
Managed - Class in org.apache.beam.sdk.managed
Top-level PTransforms that build and instantiate turnkey transforms.
Managed() - Constructor for class org.apache.beam.sdk.managed.Managed
 
Managed.ManagedTransform - Class in org.apache.beam.sdk.managed
 
ManagedChannelFactory - Class in org.apache.beam.sdk.fn.channel
A Factory which creates ManagedChannel instances.
ManagedFactory<T extends java.lang.AutoCloseable> - Interface in org.apache.beam.sdk.io.gcp.pubsublite.internal
A ManagedFactory produces instances and tears down any produced instances when it is itself closed.
ManagedFactoryImpl<T extends java.lang.AutoCloseable> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
ManagedSchemaTransformProvider - Class in org.apache.beam.sdk.managed
 
ManagedSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.managed.ManagedSchemaTransformProvider
 
ManagedSchemaTransformTranslation - Class in org.apache.beam.sdk.managed
 
ManagedSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.managed.ManagedSchemaTransformTranslation
 
ManagedSchemaTransformTranslation.ManagedTransformRegistrar - Class in org.apache.beam.sdk.managed
 
ManagedTransform() - Constructor for class org.apache.beam.sdk.managed.Managed.ManagedTransform
 
ManagedTransformConstants - Class in org.apache.beam.sdk.managed
This class contains constants for supported managed transforms, including identifiers of supported transforms and configuration parameter renamings.
ManagedTransformConstants() - Constructor for class org.apache.beam.sdk.managed.ManagedTransformConstants
 
ManagedTransformRegistrar() - Constructor for class org.apache.beam.sdk.managed.ManagedSchemaTransformTranslation.ManagedTransformRegistrar
 
Manual(Instant) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
 
ManualDockerEnvironmentOptions - Interface in org.apache.beam.sdk.options
Pipeline options to tune DockerEnvironment.
ManualDockerEnvironmentOptions.Options - Class in org.apache.beam.sdk.options
ManualWatermarkEstimator<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
A WatermarkEstimator which is controlled manually from within a DoFn.
map(byte[]) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
 
map(Tuple<byte[], Iterator<byte[]>>) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
 
map(Tuple<byte[], byte[]>) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
 
map(WindowedValue<V>) - Method in class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
 
map(WindowedValue<KV<K, V>>) - Method in class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
 
map(ResultSet) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
This method is called when reading data from Cassandra.
map(BytesXMLMessage) - Static method in class org.apache.beam.sdk.io.solace.data.Solace.SolaceRecordMapper
Maps a BytesXMLMessage (if not null) to a Solace.Record.
map(Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
Create a map type for the given key and value types.
map(Schema.FieldType, Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
Set the nullability on the valueType instead
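A small sketch of declaring a map-typed schema field with Schema.FieldType.map (the field name is an example):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;

    // Schema with a single map field from string keys to long values.
    Schema schema =
        Schema.builder()
            .addField("counts", FieldType.map(FieldType.STRING, FieldType.INT64))
            .build();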
map() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a MapState, optimized for key lookups and writes.
map(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.map(), but with key and value coders explicitly supplied.
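For context, a minimal stateful DoFn sketch declaring a MapState cell with explicit coders (the class name, state id, and counting logic are placeholders):

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.state.MapState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class CountValues extends DoFn<KV<String, String>, Void> {
      @StateId("counts")
      private final StateSpec<MapState<String, Long>> countsSpec =
          StateSpecs.map(StringUtf8Coder.of(), VarLongCoder.of());

      @ProcessElement
      public void process(
          @Element KV<String, String> element,
          @StateId("counts") MapState<String, Long> counts) {
        // Increment the per-key count for the element's value.
        Long current = counts.get(element.getValue()).read();
        counts.put(element.getValue(), current == null ? 1L : current + 1L);
      }
    }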
MAP_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
MapCoder<K,V> - Class in org.apache.beam.sdk.coders
A Coder for Maps that encodes them according to provided coders for keys and values.
MapControlClientPool - Class in org.apache.beam.runners.fnexecution.control
A ControlClientPool backed by a client map.
MapCsvToStringArrayFn(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.MapCsvToStringArrayFn
 
MapElements<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
PTransforms for mapping a simple function over the elements of a PCollection.
MapElements.MapWithFailures<InputT,OutputT,FailureT> - Class in org.apache.beam.sdk.transforms
A PTransform that adds exception handling to MapElements.
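A minimal MapElements sketch (words is an assumed PCollection<String>):

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Map each word to its length; the output type is declared via into(...).
    PCollection<Integer> lengths =
        words.apply(
            MapElements.into(TypeDescriptors.integers())
                .via((String word) -> word.length()));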
mapEncoder(Encoder<K>, Encoder<V>, Class<MapT>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a Spark Encoder of MapType that deserializes to MapT.
MapFactory() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions.MapFactory
 
MapKeys<K1,K2,V> - Class in org.apache.beam.sdk.transforms
MapKeys maps a SerializableFunction<K1,K2> over keys of a PCollection<KV<K1,V>> and returns a PCollection<KV<K2, V>>.
mapMessage(Message) - Method in interface org.apache.beam.sdk.io.jms.JmsIO.MessageMapper
 
MapOfIntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle
 
MapOfNestedIntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle
 
mapOutputs(Map<TupleTag<?>, PCollection<?>>, PCollectionTuple) - Method in class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
 
Mapper<T> - Interface in org.apache.beam.sdk.io.cassandra
This interface allows you to implement a custom mapper to read and persist elements from/to Cassandra.
MapperFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
Factory class for creating instances that will map a struct to a connector model.
MapperFactory(Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
 
MAPPINGS - Static variable in class org.apache.beam.sdk.managed.ManagedTransformConstants
 
MappingUtils - Class in org.apache.beam.sdk.io.cdap
Util class for mapping plugins.
MappingUtils() - Constructor for class org.apache.beam.sdk.io.cdap.MappingUtils
 
mapQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
mapQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
 
MapQualifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
 
MapQualifierListContext(FieldSpecifierNotationParser.QualifierListContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
 
mapRow(ResultSet) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RowMapper
 
mapRow(ResultSet) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcReadWithPartitionsHelper
 
mapRow(ResultSet) - Method in class org.apache.beam.sdk.io.jdbc.SchemaUtil.BeamRowMapper
 
mapRow(Record) - Method in interface org.apache.beam.sdk.io.neo4j.Neo4jIO.RowMapper
 
mapRow(ResultSet) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.RowMapper
 
mapRow(T) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.UserDataMapper
 
mapRow(String[]) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakeIO.CsvMapper
 
mapRow(T) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakeIO.UserDataMapper
 
maps(TypeDescriptor<K>, TypeDescriptor<V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Map.
mapSourceFunction(SerializablePipelineOptions, String) - Static method in class org.apache.beam.runners.spark.stateful.StateSpecFunctions
A StateSpec function to support reading from an UnboundedSource.
mapSourceRecord(SourceRecord) - Method in class org.apache.beam.io.debezium.SourceRecordJson.SourceRecordJsonMapper
 
mapSourceRecord(SourceRecord) - Method in interface org.apache.beam.io.debezium.SourceRecordMapper
 
MapState<K,V> - Interface in org.apache.beam.sdk.state
A ReadableState cell mapping keys to values.
mapToRequest(ByteString, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
Maps the ByteString with encoded image data and the optional ImageContext into an AnnotateImageRequest.
mapToRequest(KV<ByteString, ImageContext>, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
Maps KV of ByteString (encoded image contents) and ImageContext to AnnotateImageRequest.
mapToRequest(String, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
Maps the String containing the GCS URI of the image and the optional ImageContext into an AnnotateImageRequest.
mapToRequest(KV<String, ImageContext>, ImageContext) - Method in class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
Maps KV of String (GCS URI to the image) and ImageContext to a valid AnnotateImageRequest.
MapToTupleFunction<K,V> - Class in org.apache.beam.runners.twister2.translators.functions
Map to tuple function.
MapToTupleFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
 
MapToTupleFunction(Coder<K>, WindowedValue.WindowedValueCoder<V>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
 
MapValues<K,V1,V2> - Class in org.apache.beam.sdk.transforms
MapValues maps a SerializableFunction<V1,V2> over values of a PCollection<KV<K,V1>> and returns a PCollection<KV<K, V2>>.
mapView(PCollection<KV<K, V>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, V>> capable of processing elements windowed using the provided WindowingStrategy.
MapViewFn(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
Deprecated.
 
MapViewFn2(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
 
mapViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, KV<K, V>>>, PCollection<KV<Void, KV<K, V>>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
markDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
Marks this range tracker as being done.
markDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Marks this range tracker as being done.
markNewPartitionForDeletion(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the first step of the two-phase delete.
match(Class<V>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
match(List<String>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
match() - Static method in class org.apache.beam.sdk.io.FileIO
Matches a filepattern using FileSystems.match(java.util.List<java.lang.String>) and produces a collection of matched resources (both files and directories) as MatchResult.Metadata.
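For example, a minimal sketch of matching a filepattern with FileIO.match (the pipeline and bucket path are placeholders):

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.fs.MatchResult;
    import org.apache.beam.sdk.values.PCollection;

    // Produces one Metadata element per matched file or directory.
    PCollection<MatchResult.Metadata> matches =
        pipeline.apply(FileIO.match().filepattern("gs://my-bucket/logs/*.txt"));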
Match() - Constructor for class org.apache.beam.sdk.io.FileIO.Match
 
match(List<String>) - Method in class org.apache.beam.sdk.io.FileSystem
This is the entry point to convert user-provided specs to ResourceIds.
match(List<String>) - Static method in class org.apache.beam.sdk.io.FileSystems
This is the entry point to convert user-provided specs to ResourceIds.
match(List<String>, EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileSystems
Like FileSystems.match(List), but with a configurable EmptyMatchTreatment.
match(String) - Static method in class org.apache.beam.sdk.io.FileSystems
Like FileSystems.match(List), but for a single resource specification.
match(String, EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileSystems
Like FileSystems.match(String), but with a configurable EmptyMatchTreatment.
match(StateSpec.Cases<ResultT>) - Method in interface org.apache.beam.sdk.state.StateSpec
For internal use only; no backwards-compatibility guarantees.
matchAll() - Static method in class org.apache.beam.sdk.io.FileIO
Like FileIO.match(), but matches each filepattern in a collection of filepatterns.
MatchAll() - Constructor for class org.apache.beam.sdk.io.FileIO.MatchAll
 
MatchConfiguration() - Constructor for class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
MatcherAndError() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
 
MatcherCheckerFn(SerializableMatcher<T>) - Constructor for class org.apache.beam.sdk.testing.PAssert.MatcherCheckerFn
 
matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
 
matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
 
matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
 
matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
 
matches(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
 
matches(String, String) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.MatchFn
 
matches(String) - Method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Deprecated.
Returns true if the given file name implies that the contents are compressed according to the compression embodied by this factory.
matches(String) - Method in enum org.apache.beam.sdk.io.Compression
 
matches(String) - Method in enum org.apache.beam.sdk.io.TextIO.CompressionType
Deprecated.
 
matches(String) - Method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
Deprecated.
 
matches(String) - Method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
Deprecated.
 
matches(MetricsFilter, MetricKey) - Static method in class org.apache.beam.sdk.metrics.MetricFiltering
Matching logic is implemented here rather than in MetricsFilter because we would like MetricsFilter to act as a "dumb" value-object, with the possibility of replacing it with a Proto/JSON/etc.
matches(Object) - Method in class org.apache.beam.sdk.testing.RegexMatcher
 
matches(String) - Static method in class org.apache.beam.sdk.testing.RegexMatcher
 
matches(Object) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
matches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
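As an example, a small sketch that keeps only fully matching lines (lines is an assumed PCollection<String>; the pattern is arbitrary):

    import org.apache.beam.sdk.transforms.Regex;
    import org.apache.beam.sdk.values.PCollection;

    // Emits a line only when the whole line matches the pattern.
    PCollection<String> lowercaseWords = lines.apply(Regex.matches("[a-z]+( [a-z]+)*"));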
matches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
matches(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
matches(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
matches(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesName PTransform that checks if the entire line matches the Regex.
matches(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesName PTransform that checks if the entire line matches the Regex.
Matches(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Matches
 
matchesKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesKV PTransform that checks if the entire line matches the Regex.
matchesKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesKV PTransform that checks if the entire line matches the Regex.
matchesKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesNameKV PTransform that checks if the entire line matches the Regex.
matchesKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesNameKV PTransform that checks if the entire line matches the Regex.
MatchesKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesKV
 
MatchesName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesName
 
MatchesNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
 
matchesSafely(BigqueryMatcher.TableAndQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
matchesSafely(ShardedFile) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
matchesSafely(T) - Method in class org.apache.beam.sdk.testing.JsonMatcher
 
matchesScope(String, Set<String>) - Static method in class org.apache.beam.sdk.metrics.MetricFiltering
matchesScope(actualScope, scopes) returns true if the scope of a metric is matched by any of the filters in scopes.
MatchFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.MatchFn
 
matchNewDirectory(String, String...) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a new ResourceId that represents the named directory resource.
matchNewResource(String, boolean) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
matchNewResource(String, boolean) - Method in class org.apache.beam.sdk.io.FileSystem
Returns a new ResourceId for this filesystem that represents the named resource.
matchNewResource(String, boolean) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a new ResourceId that represents the named resource of a type corresponding to the resource type.
matchResources(List<ResourceId>) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns MatchResults for the given resourceIds.
MatchResult - Class in org.apache.beam.sdk.io.fs
MatchResult.Metadata - Class in org.apache.beam.sdk.io.fs
MatchResult.Metadata of a matched file.
MatchResult.Metadata.Builder - Class in org.apache.beam.sdk.io.fs
Builder class for MatchResult.Metadata.
MatchResult.Status - Enum in org.apache.beam.sdk.io.fs
Status of a MatchResult.
matchSingleFileSpec(String) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns the MatchResult.Metadata for a single file resource.
Materialization<T> - Interface in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
Materializations - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
Materializations() - Constructor for class org.apache.beam.sdk.transforms.Materializations
 
Materializations.IterableView<V> - Interface in org.apache.beam.sdk.transforms
Represents the PrimitiveViewT supplied to the ViewFn when it declares to use the iterable materialization.
Materializations.MultimapView<K,V> - Interface in org.apache.beam.sdk.transforms
Represents the PrimitiveViewT supplied to the ViewFn when it declares to use the multimap materialization.
materializedOrAlias() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
Max - Class in org.apache.beam.sdk.transforms
PTransforms for computing the maximum of the elements in a PCollection, or the maximum of the values associated with each key in a PCollection of KVs.
MAX_CONNECTIONS - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
MAX_HASH_KEY - Static variable in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
 
MAX_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
Represents the maximum end-at timestamp that can be specified for a change stream.
MAX_LENGTH - Static variable in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
MAX_SIZE - Static variable in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
MAX_UNIX_MILLIS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
maxBackoff(Duration) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
 
maxBackoff() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
 
maxBufferedTime(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
Buffer timeout for user records.
maxBufferingDuration() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
maxBufferingDuration() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
MaxBundleSizeFactory() - Constructor for class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleSizeFactory
 
MaxBundleTimeFactory() - Constructor for class org.apache.beam.runners.flink.FlinkPipelineOptions.MaxBundleTimeFactory
 
maxBytes(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
Max bytes per aggregated record.
maxConnections(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
The maximum number of connections allowed in the connection pool.
maxConnections() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
The maximum number of connections allowed in the connection pool.
maxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
MAXIMUM_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
The maximum precision value you can set in HllCount.Init.Builder.withPrecision(int) is 24.
maximumLookback() - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
The maximum distance between the end of any main input window mainWindow and the end of the side input window returned by WindowMappingFn.getSideInputWindow(BoundedWindow).
maxInsertBlockSize() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
maxRetries() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
MaxStackTraceDepthToReportFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.MaxStackTraceDepthToReportFactory
 
maxTimestamp(Iterable<BoundedWindow>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
The end of the only window (max timestamp).
maxTimestamp() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
 
maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
Returns the inclusive upper bound of timestamps for values in this window.
maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the largest timestamp that can be included in this window.
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
For internal use only; no backwards-compatibility guarantees.
mayFinish() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
 
md5(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
 
md5Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
MD5(X)
md5String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
MD5(X)
Mean - Class in org.apache.beam.sdk.transforms
PTransforms for computing the arithmetic mean (a.k.a. average) of the elements in a PCollection, or the mean of the values associated with each key in a PCollection of KVs.
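A minimal sketch of the global mean (latencies is an assumed PCollection<Long>; the result element is a Double):

    import org.apache.beam.sdk.transforms.Mean;
    import org.apache.beam.sdk.values.PCollection;

    // Single-element PCollection holding the arithmetic mean of all inputs.
    PCollection<Double> meanLatency = latencies.apply(Mean.<Long>globally());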
MemoryMonitorOptions - Interface in org.apache.beam.sdk.options
Options that are used to control the Memory Monitor.
merge(Accumulator<MetricsContainerStepMap, MetricsContainerStepMap>) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
merge(AccumulatorV2<MetricsContainerStepMap, MetricsContainerStepMap>) - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
 
merge(AccumulatorV2<MetricsContainerStepMap, MetricsContainerStepMap>) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
merge(SequenceRangeAccumulator) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
merge(BoundedWindow, Iterable<? extends Instant>) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Merges the given timestamps, which may have originated in separate windows, into the context of the result window.
merge(BoundedWindow, Instant...) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
merge(Collection<W>, W) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
Signals to the framework that the windows in toBeMerged should be merged together to form mergeResult.
mergeAccumulator(AccumT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
Adds the input values represented by the given accumulator into this accumulator.
mergeAccumulators(Iterable<SequenceRangeAccumulator>) - Method in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
 
mergeAccumulators(Iterable<HyperLogLogPlus>) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
mergeAccumulators(Iterable<SketchFrequencies.Sketch<InputT>>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
mergeAccumulators(Iterable<MergingDigest>) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
mergeAccumulators(Iterable<long[]>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
 
mergeAccumulators(Iterable<CovarianceAccumulator>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
mergeAccumulators(Iterable<VarianceAccumulator>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
mergeAccumulators(Iterable<BeamBuiltinAggregations.BitXOr.Accum>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
 
mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
 
mergeAccumulators(Iterable<List<T>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
 
mergeAccumulators(Iterable<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
 
mergeAccumulators(Iterable<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
 
mergeAccumulators(Long, Iterable<Long>) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
 
mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
mergeAccumulators(AccumT, Iterable<AccumT>) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
mergeAccumulators(Iterable<List<String>>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
 
mergeAccumulators(Iterable<AccumT>) - Method in interface org.apache.beam.sdk.state.CombiningState
Merge the given accumulators according to the underlying Combine.CombineFn.
mergeAccumulators(Iterable<ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
Deprecated.
 
mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
mergeAccumulators(Iterable<double[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
mergeAccumulators(Iterable<Combine.Holder<V>>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
mergeAccumulators(Iterable<int[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
mergeAccumulators(Iterable<long[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
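To illustrate the contract, a minimal Combine.CombineFn sketch whose mergeAccumulators folds partial {sum, count} pairs (the class name and accumulator layout are placeholders):

    import org.apache.beam.sdk.transforms.Combine;

    class AverageFn extends Combine.CombineFn<Long, long[], Double> {
      @Override
      public long[] createAccumulator() {
        return new long[] {0L, 0L}; // {sum, count}
      }

      @Override
      public long[] addInput(long[] acc, Long input) {
        acc[0] += input;
        acc[1]++;
        return acc;
      }

      @Override
      public long[] mergeAccumulators(Iterable<long[]> accumulators) {
        // Fold every partial accumulator into a fresh one.
        long[] merged = createAccumulator();
        for (long[] acc : accumulators) {
          merged[0] += acc[0];
          merged[1] += acc[1];
        }
        return merged;
      }

      @Override
      public Double extractOutput(long[] acc) {
        return acc[1] == 0 ? 0.0 : ((double) acc[0]) / acc[1];
      }
    }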
mergeAccumulators(Iterable<List<V>>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
mergeAccumulators(Iterable<Object[]>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
mergeAccumulators(Iterable<Object[]>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
mergeAccumulators(Iterable<AccumT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
mergeAccumulators(Iterable<Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
MergeContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
 
MergeOverlappingIntervalWindows - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards compatibility guarantees.
MergeOverlappingIntervalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.MergeOverlappingIntervalWindows
 
mergeWideningNullable(Schema, Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaUtils
Given two schemas that have matching types, return a nullable-widened schema.
mergeWindows(WindowFn<?, IntervalWindow>.MergeContext) - Static method in class org.apache.beam.sdk.transforms.windowing.MergeOverlappingIntervalWindows
Merge overlapping IntervalWindows.
mergeWindows(WindowFn<T, W>.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
 
mergeWindows(WindowFn<Object, IntervalWindow>.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
mergeWindows(WindowFn<T, W>.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Does whatever merging of windows is necessary.
mergeWithOuter(ResourceHint) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
Reconciles values of a hint when the hint specified on a transform is also defined in an outer context, for example on a composite transform, or specified in the transform's execution environment.
mergeWithOuter(ResourceHints) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
 
message() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Underlying Message.
message() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
 
messageFromBeamRow(Descriptors.Descriptor, Row, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
Forwards changeSequenceNum to #messageFromBeamRow(Descriptor, Row, String, String) via Long.toHexString(long).
messageFromBeamRow(Descriptors.Descriptor, Row, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
Given a Beam Row object, returns a protocol-buffer message that can be used to write data using the BigQuery Storage streaming API.
messageFromGenericRecord(Descriptors.Descriptor, GenericRecord, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
Forwards to #messageFromGenericRecord(Descriptor, GenericRecord, String, String) via Long.toHexString(long).
messageFromGenericRecord(Descriptors.Descriptor, GenericRecord, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
Given an Avro GenericRecord object, returns a protocol-buffer message that can be used to write data using the BigQuery Storage streaming API.
messageFromMap(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, AbstractMap<String, Object>, boolean, boolean, TableRow, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
messageFromTableRow(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, TableRow, boolean, boolean, TableRow, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
Forwards to #messageFromTableRow(SchemaInformation, Descriptor, TableRow, boolean, boolean, TableRow, String, String) via Long.toHexString(long).
messageFromTableRow(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, TableRow, boolean, boolean, TableRow, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
Given a BigQuery TableRow, returns a protocol-buffer message that can be used to write data using the BigQuery Storage API.
messageId() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
 
messageName() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
 
MessageProducer - Interface in org.apache.beam.sdk.io.solace.broker
Base class for publishing messages to a Solace broker.
MessageProducerUtils - Class in org.apache.beam.sdk.io.solace.broker
 
MessageProducerUtils() - Constructor for class org.apache.beam.sdk.io.solace.broker.MessageProducerUtils
 
MessageReceiver - Interface in org.apache.beam.sdk.io.solace.broker
Interface for receiving messages from a Solace broker.
Metadata(long, Instant, Instant, long, MetricsContainerStepMap) - Constructor for class org.apache.beam.runners.spark.io.SparkUnboundedSource.Metadata
 
metadata() - Method in class org.apache.beam.sdk.io.fs.MatchResult
MatchResult.Metadata of matched files.
Metadata() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
METADATA - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
TupleTag for the main output.
MetadataCoder - Class in org.apache.beam.sdk.io.fs
MetadataCoderV2 - Class in org.apache.beam.sdk.io.fs
MetadataSpannerConfigFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
This class generates a SpannerConfig for the change stream metadata database by copying only the necessary fields from the SpannerConfig of the primary database.
MetadataSpannerConfigFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.MetadataSpannerConfigFactory
 
MetadataTableAdminDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object for creating and dropping the metadata table.
MetadataTableAdminDao(BigtableTableAdminClient, BigtableInstanceAdminClient, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
MetadataTableDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object for managing the state of the metadata Bigtable table.
MetadataTableDao(BigtableDataClient, String, ByteString) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
MetadataTableEncoder - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder
Helper methods that simplifies some conversion and extraction of metadata table content.
MetadataTableEncoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
 
metaStore(MetaStore) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
 
metaStore(MetaStore, boolean, PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
 
MetaStore - Interface in org.apache.beam.sdk.extensions.sql.meta.store
The interface to handle CRUD of BeamSql table metadata.
METHOD - Static variable in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
 
method - Variable in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
Method that implements the function.
method() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
 
Metric - Interface in org.apache.beam.sdk.metrics
Marker interface for all user-facing metrics.
MetricFiltering - Class in org.apache.beam.sdk.metrics
Implements matching for metrics filters.
MetricKey - Class in org.apache.beam.sdk.metrics
Metrics are keyed by the step name they are associated with and the name of the metric.
MetricKey() - Constructor for class org.apache.beam.sdk.metrics.MetricKey
 
metricName() - Method in class org.apache.beam.sdk.metrics.MetricKey
The name of the metric.
MetricName - Class in org.apache.beam.sdk.metrics
The name of a metric consists of a MetricName.getNamespace() and a MetricName.getName().
MetricName() - Constructor for class org.apache.beam.sdk.metrics.MetricName
 
MetricNameFilter - Class in org.apache.beam.sdk.metrics
The name of a metric.
MetricNameFilter() - Constructor for class org.apache.beam.sdk.metrics.MetricNameFilter
 
MetricQueryResults - Class in org.apache.beam.sdk.metrics
The results of a query for metrics.
MetricQueryResults() - Constructor for class org.apache.beam.sdk.metrics.MetricQueryResults
 
metricRegistry() - Method in class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
metricRegistry() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.SparkBeamMetricSource
 
MetricResult<T> - Class in org.apache.beam.sdk.metrics
The results of a single current metric.
MetricResult() - Constructor for class org.apache.beam.sdk.metrics.MetricResult
 
MetricResults - Class in org.apache.beam.sdk.metrics
Methods for interacting with the metrics of a pipeline that has been executed.
MetricResults() - Constructor for class org.apache.beam.sdk.metrics.MetricResults
 
metrics() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
metrics() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
metrics() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
metrics() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
metrics() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
Metrics - Class in org.apache.beam.runners.flink.metrics
Helper for pretty-printing Flink metrics.
Metrics() - Constructor for class org.apache.beam.runners.flink.metrics.Metrics
 
metrics() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
 
metrics() - Method in class org.apache.beam.runners.jet.JetPipelineResult
 
metrics() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
metrics() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
 
metrics() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
 
Metrics - Class in org.apache.beam.sdk.metrics
Metrics is a utility class for producing various kinds of metrics for reporting properties of an executing pipeline.
metrics() - Method in interface org.apache.beam.sdk.PipelineResult
Returns the object to access metrics from the pipeline.
METRICS_NAMESPACE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
METRICS_NAMESPACE - Static variable in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
 
MetricsAccumulator - Class in org.apache.beam.runners.flink.metrics
Accumulator of MetricsContainerStepMap.
MetricsAccumulator() - Constructor for class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
MetricsAccumulator - Class in org.apache.beam.runners.spark.metrics
For resilience, Accumulators are required to be wrapped in a Singleton.
MetricsAccumulator() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
MetricsAccumulator - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
AccumulatorV2 for Beam metrics captured in MetricsContainerStepMap.
MetricsAccumulator() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
MetricsAccumulator.AccumulatorCheckpointingSparkListener - Class in org.apache.beam.runners.spark.metrics
Spark Listener which checkpoints MetricsContainerStepMap values for fault-tolerance.
MetricsContainer - Interface in org.apache.beam.sdk.metrics
Holds the metrics for a single step.
MetricsContainerHolder() - Constructor for class org.apache.beam.sdk.metrics.MetricsEnvironment.MetricsContainerHolder
 
MetricsContainerStepMapAccumulator - Class in org.apache.beam.runners.spark.metrics
AccumulatorV2 implementation for MetricsContainerStepMap.
MetricsContainerStepMapAccumulator(MetricsContainerStepMap) - Constructor for class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
 
MetricsEnvironment - Class in org.apache.beam.sdk.metrics
Manages and provides the metrics container associated with each thread.
MetricsEnvironment() - Constructor for class org.apache.beam.sdk.metrics.MetricsEnvironment
 
MetricsEnvironment.MetricsContainerHolder - Class in org.apache.beam.sdk.metrics
 
MetricsEnvironment.MetricsEnvironmentState - Interface in org.apache.beam.sdk.metrics
Set the MetricsContainer for the associated MetricsEnvironment.
MetricsFilter - Class in org.apache.beam.sdk.metrics
Simple POJO representing a filter for querying metrics.
MetricsFilter() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter
 
MetricsFilter.Builder - Class in org.apache.beam.sdk.metrics
Builder for creating a MetricsFilter.
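Example (a minimal sketch of querying counters through a MetricsFilter after a run; `result` is assumed to be the PipelineResult of a finished pipeline, and the step name "ParseEvents" plus the metric name reuse the illustrative names above):
    MetricQueryResults metrics =
        result
            .metrics()
            .queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named("my-namespace", "processed"))
                    .addStep("ParseEvents")
                    .build());
    for (MetricResult<Long> counter : metrics.getCounters()) {
      System.out.println(counter.getName() + ": " + counter.getCommitted());
    }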
MetricsOptions - Interface in org.apache.beam.sdk.metrics
Extension of PipelineOptions that defines MetricsSink specific options.
MetricsOptions.NoOpMetricsSink - Class in org.apache.beam.sdk.metrics
A DefaultValueFactory that obtains the class of the NoOpMetricsSink if it exists on the classpath, and throws an exception otherwise.
MetricsSink - Interface in org.apache.beam.sdk.metrics
Interface for all metric sinks.
MicrobatchSource<T,CheckpointMarkT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.runners.spark.io
A Source that accommodates Spark's micro-batch oriented nature and wraps an UnboundedSource.
MicrobatchSource.Reader - Class in org.apache.beam.runners.spark.io
Mostly based on BoundedReadFromUnboundedSource's UnboundedToBoundedSourceAdapter, with some adjustments for Spark specifics.
microsecondToInstant(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
MicrosInstant - Class in org.apache.beam.sdk.schemas.logicaltypes
A timestamp represented as microseconds since the epoch.
MicrosInstant() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
 
MILLIS_PER_DAY - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
mimeType() - Method in class org.apache.beam.sdk.io.fs.CreateOptions
The file-like resource mime type.
Min - Class in org.apache.beam.sdk.transforms
PTransforms for computing the minimum of the elements in a PCollection, or the minimum of the values associated with each key in a PCollection of KVs.
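Example (a minimal sketch of both forms; `durations` is an assumed PCollection<Long> and `durationsPerUser` an assumed PCollection<KV<String, Long>>):
    // Global minimum over all elements.
    PCollection<Long> globalMin = durations.apply(Min.longsGlobally());
    // Minimum value associated with each key.
    PCollection<KV<String, Long>> minPerUser =
        durationsPerUser.apply(Min.<String>longsPerKey());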
MIN_HASH_KEY - Static variable in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
 
MIN_UNIX_MILLIS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
MINIMUM_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
The minimum precision value you can set in HllCount.Init.Builder.withPrecision(int) is 10.
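Example (a minimal sketch of initializing HLL++ sketches at the lowest allowed precision; `ints` is an assumed PCollection<Integer>):
    PCollection<byte[]> sketches =
        ints.apply(
            HllCount.Init.forIntegers()
                .withPrecision(HllCount.MINIMUM_PRECISION)
                .globally());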
MINIMUM_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws.options.S3Options.S3UploadBufferSizeBytesFactory
 
MINIMUM_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
 
MINIMUM_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
 
MINIMUM_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
 
minus(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
minus(NodeStats) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
 
Mod - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a modification in a table emitted within a DataChangeRecord.
Mod(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
Constructs a mod from the primary key values, the old state of the row and the new state of the row.
modeNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
modeToProtoMode(String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Modify the ack deadline for messages from subscription with ackIds to be deadlineSeconds from now.
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
modifyEnvironmentBeforeSubmission(Environment) - Method in class org.apache.beam.runners.dataflow.DataflowRunnerHooks
Allows the user to modify the environment of their job before their job is submitted to the service for execution.
ModType - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents the type of modification applied in the DataChangeRecord.
MongoDbGridFSIO - Class in org.apache.beam.sdk.io.mongodb
IO to read and write data on MongoDB GridFS.
MongoDbGridFSIO() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
 
MongoDbGridFSIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.mongodb
Encapsulate the MongoDB GridFS connection logic.
MongoDbGridFSIO.Parser<T> - Interface in org.apache.beam.sdk.io.mongodb
Interface for the parser that is used to parse the GridFSDBFile into the appropriate types.
MongoDbGridFSIO.ParserCallback<T> - Interface in org.apache.beam.sdk.io.mongodb
Callback for the parser to use to submit data.
MongoDbGridFSIO.Read<T> - Class in org.apache.beam.sdk.io.mongodb
A PTransform to read data from MongoDB GridFS.
MongoDbGridFSIO.Read.BoundedGridFSSource - Class in org.apache.beam.sdk.io.mongodb
A BoundedSource for MongoDB GridFS.
MongoDbGridFSIO.Write<T> - Class in org.apache.beam.sdk.io.mongodb
A PTransform to write data to MongoDB GridFS.
MongoDbGridFSIO.WriteFn<T> - Interface in org.apache.beam.sdk.io.mongodb
Function that is called to write the data to the given GridFS OutputStream.
MongoDbIO - Class in org.apache.beam.sdk.io.mongodb
IO to read and write data on MongoDB.
MongoDbIO.Read - Class in org.apache.beam.sdk.io.mongodb
A PTransform to read data from MongoDB.
MongoDbIO.Write - Class in org.apache.beam.sdk.io.mongodb
A PTransform to write to a MongoDB database.
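Example (a minimal sketch of a read; the connection URI, database, and collection names are illustrative, and the output elements are org.bson.Document values):
    PCollection<Document> docs =
        pipeline.apply(
            MongoDbIO.read()
                .withUri("mongodb://localhost:27017")
                .withDatabase("my_db")
                .withCollection("my_collection"));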
MongoDbTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
 
MongoDbTable.DocumentToRow - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
 
MongoDbTable.RowToDocument - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
 
MongoDbTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
MongoDbTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
 
Monitoring - Class in org.apache.beam.io.requestresponse
Configures Metrics throughout various features of RequestResponseIO.
Monitoring() - Constructor for class org.apache.beam.io.requestresponse.Monitoring
 
Monitoring.Builder - Class in org.apache.beam.io.requestresponse
 
MonitoringUtil - Class in org.apache.beam.runners.dataflow.util
A helper class for monitoring jobs submitted to the service.
MonitoringUtil(DataflowClient) - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil
Construct a helper for monitoring.
MonitoringUtil.JobMessagesHandler - Interface in org.apache.beam.runners.dataflow.util
An interface that can be used for defining callbacks to receive a list of JobMessages containing monitoring information.
MonitoringUtil.LoggingHandler - Class in org.apache.beam.runners.dataflow.util
A handler that logs monitoring messages.
MonitoringUtil.TimeStampComparator - Class in org.apache.beam.runners.dataflow.util
Comparator for sorting rows in increasing order based on timestamp.
MonotonicallyIncreasing(Instant) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
 
months(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by months.
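Example (a minimal sketch of calendar-month windowing; Event is a placeholder element type and `events` an assumed timestamped PCollection<Event>):
    PCollection<Event> monthly =
        events.apply(Window.<Event>into(CalendarWindows.months(1)));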
MOST_SIGNIFICANT_BITS_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
MoveOptions - Interface in org.apache.beam.sdk.io.fs
MoveOptions.StandardMoveOptions - Enum in org.apache.beam.sdk.io.fs
Defines the standard MoveOptions.
MqttIO - Class in org.apache.beam.sdk.io.mqtt
An unbounded source for an MQTT broker.
MqttIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.mqtt
A POJO describing an MQTT connection.
MqttIO.Read<T> - Class in org.apache.beam.sdk.io.mqtt
A PTransform to read from an MQTT broker.
MqttIO.Write<InputT> - Class in org.apache.beam.sdk.io.mqtt
A PTransform to write and send a message to an MQTT server.
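Example (a minimal sketch that reads raw payloads from one topic and republishes them to another; the broker URI and topic names are illustrative):
    pipeline
        .apply(
            MqttIO.read()
                .withConnectionConfiguration(
                    MqttIO.ConnectionConfiguration.create("tcp://localhost:1883", "READ_TOPIC")))
        .apply(
            MqttIO.write()
                .withConnectionConfiguration(
                    MqttIO.ConnectionConfiguration.create("tcp://localhost:1883", "WRITE_TOPIC")));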
MqttRecord - Class in org.apache.beam.sdk.io.mqtt
A container class for MQTT message metadata, including the topic name and payload.
MqttRecord() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttRecord
 
msgSpoolUsage() - Method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
 
MultiLanguageBuilderMethod - Annotation Type in org.apache.beam.sdk.expansion.service
 
MultiLanguageConstructorMethod - Annotation Type in org.apache.beam.sdk.expansion.service
 
multimap() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a MultimapState, optimized for key lookups, key puts, and clear.
multimap(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.multimap(), but with key and value coders explicitly supplied.
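Example (a minimal sketch of declaring and using multimap state in a stateful DoFn; the state id "index" is illustrative):
    class IndexingFn extends DoFn<KV<String, String>, Void> {
      @StateId("index")
      private final StateSpec<MultimapState<String, String>> indexSpec =
          StateSpecs.multimap(StringUtf8Coder.of(), StringUtf8Coder.of());

      @ProcessElement
      public void processElement(
          @Element KV<String, String> element,
          @StateId("index") MultimapState<String, String> index) {
        // Remember every value observed for this key.
        index.put(element.getKey(), element.getValue());
      }
    }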
multimap() - Static method in class org.apache.beam.sdk.transforms.Materializations
For internal use only; no backwards-compatibility guarantees.
MULTIMAP_MATERIALIZATION_URN - Static variable in class org.apache.beam.sdk.transforms.Materializations
The URN for a Materialization where the primitive view type is a multimap of fully specified windowed values.
MultimapState<K,V> - Interface in org.apache.beam.sdk.state
A ReadableState cell mapping keys to bags of values.
multimapView(PCollection<KV<K, V>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, Iterable<V>>> capable of processing elements windowed using the provided WindowingStrategy.
MultimapViewFn(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
Deprecated.
 
MultimapViewFn2(PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
 
multimapViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, KV<K, V>>>, PCollection<KV<Void, KV<K, V>>>, PCollectionViews.TypeDescriptorSupplier<K>, PCollectionViews.TypeDescriptorSupplier<V>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
multiOutputOverrideFactory(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
Returns a PTransformOverrideFactory that replaces a multi-output ParDo with a composite transform specialized for the DataflowRunner.
multiplexElements(BeamFnApi.Elements) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
Dispatches the data and timers from the elements to corresponding receivers.
multiply(double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
 
multiplyBy(double) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
mutablePairEncoder(Encoder<T1>, Encoder<T2>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a Spark Encoder for Spark's MutablePair of StructType with fields `_1` and `_2`.
MutableState<EventT,ResultT> - Interface in org.apache.beam.sdk.extensions.ordered
Mutable state mutates when events apply to it.
mutate(EventT) - Method in interface org.apache.beam.sdk.extensions.ordered.MutableState
The interface assumes that events will mutate the state without the possibility of throwing an error.
MutationGroup - Class in org.apache.beam.sdk.io.gcp.spanner
A bundle of mutations that must be submitted atomically.

N

name(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
name() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
name - Variable in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
name() - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
 
name() - Method in interface org.apache.beam.sdk.schemas.FieldValueSetter
Returns the name of the field.
name(TypeDescription) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.InjectPackageStrategy
 
name - Variable in class org.apache.beam.sdk.transforms.PTransform
The base name of this PTransform, e.g., from defaults, or null if not yet assigned.
named() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
 
named(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricName
 
named(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.MetricName
 
named(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
named(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
nameOf(int) - Method in class org.apache.beam.sdk.schemas.Schema
Return the name of the field at the given index.
names() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
NanosDuration - Class in org.apache.beam.sdk.schemas.logicaltypes
A duration represented in nanoseconds.
NanosDuration() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
 
NanosInstant - Class in org.apache.beam.sdk.schemas.logicaltypes
A timestamp represented as nanoseconds since the epoch.
NanosInstant() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
 
narrowing(Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
 
Narrowing() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
 
nativeSQL(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
Natural() - Constructor for class org.apache.beam.sdk.transforms.Top.Natural
 
naturalOrder(T) - Static method in class org.apache.beam.sdk.transforms.Max
 
naturalOrder() - Static method in class org.apache.beam.sdk.transforms.Max
 
naturalOrder(T) - Static method in class org.apache.beam.sdk.transforms.Min
 
naturalOrder() - Static method in class org.apache.beam.sdk.transforms.Min
 
navigationFirstValue() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
navigationLastValue() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
NeedsDocker - Interface in org.apache.beam.runners.fnexecution.environment.testing
Category for integration tests that require Docker.
needsMerge() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
NeedsRunner - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize TestPipeline for execution and expect to be executed by a PipelineRunner.
Neo4jIO - Class in org.apache.beam.sdk.io.neo4j
This is a Beam IO to read from, and write data to, Neo4j.
Neo4jIO() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO
 
Neo4jIO.DriverConfiguration - Class in org.apache.beam.sdk.io.neo4j
This describes all the information needed to create a Neo4j Session.
Neo4jIO.DriverProviderFromDriverConfiguration - Class in org.apache.beam.sdk.io.neo4j
Wraps a Neo4jIO.DriverConfiguration to provide a Driver.
Neo4jIO.ReadAll<ParameterT,OutputT> - Class in org.apache.beam.sdk.io.neo4j
This is the class which handles the work behind the Neo4jIO.readAll() method.
Neo4jIO.RowMapper<T> - Interface in org.apache.beam.sdk.io.neo4j
An interface used by Neo4jIO.ReadAll for converting each row of a Neo4j Result Record into an element of the resulting PCollection.
Neo4jIO.WriteUnwind<ParameterT> - Class in org.apache.beam.sdk.io.neo4j
This is the class which handles the work behind the Neo4jIO.writeUnwind() method.
NESTED - Static variable in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
The nested context: the value being encoded or decoded is (potentially) a part of a larger record/stream contents, and may have other parts encoded or decoded after it.
nested() - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
NestedBytesBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle
 
nestedFieldsById() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return the nested fields keyed by field ids.
nestedFieldsByName() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return the nested fields keyed by field name.
NestedIntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle
 
never() - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
Returns a Watch.Growth.TerminationCondition that never holds (i.e., poll each input until its output is complete).
Never - Class in org.apache.beam.sdk.transforms.windowing
A Trigger which never fires.
Never() - Constructor for class org.apache.beam.sdk.transforms.windowing.Never
 
Never.NeverTrigger - Class in org.apache.beam.sdk.transforms.windowing
The actual trigger class for Never triggers.
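Example (a minimal sketch of a windowing strategy that fires only when each window expires; Event is a placeholder element type and `events` an assumed PCollection<Event>):
    PCollection<Event> windowed =
        events.apply(
            Window.<Event>into(FixedWindows.of(Duration.standardMinutes(5)))
                .triggering(Never.ever())
                .withAllowedLateness(Duration.ZERO)
                .discardingFiredPanes());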
neverRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Never retry any failures.
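Example (a minimal sketch of disabling retries for a BigQuery streaming-insert write; the destination table string is illustrative and `rows` is an assumed PCollection<TableRow>):
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withFailedInsertRetryPolicy(InsertRetryPolicy.neverRetry()));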
NEW_PARTITION_PREFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.AnnotateText
 
newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
 
newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
 
newBuilder() - Static method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
 
newBuilder() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Factory method to return a new instance of RpcQosOptions.Builder with all values set to their initial default values.
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
Creates a builder for constructing a partition metadata instance.
newBuilder() - Static method in class org.apache.beam.sdk.io.splunk.SplunkEvent
Provides a builder for creating SplunkEvent objects.
newBuilder() - Static method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
Provides a builder for creating SplunkWriteError objects.
newBuilder() - Static method in class org.apache.beam.sdk.schemas.io.Failure
 
newBundle(Map<String, RemoteOutputReceiver<?>>, BundleProgressHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
newBundle(Map<String, RemoteOutputReceiver<?>>, StateRequestHandler, BundleProgressHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
newBundle(Map<String, RemoteOutputReceiver<?>>, Map<KV<String, String>, RemoteOutputReceiver<Timer<?>>>, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
newBundle(Map<String, RemoteOutputReceiver<?>>, Map<KV<String, String>, RemoteOutputReceiver<Timer<?>>>, StateRequestHandler, BundleProgressHandler, BundleSplitHandler, BundleCheckpointHandler, BundleFinalizationHandler) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
Start a new bundle for the given BeamFnApi.ProcessBundleDescriptor identifier.
newClient(String, String, PubsubOptions, String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
Construct a new Pubsub client.
newClient(String, String, PubsubOptions) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
 
newConfiguration(SerializableConfiguration) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
Returns a new populated Configuration object.
newConnection(UnregisteredDriver, AvaticaFactory, String, Properties, CalciteSchema, JavaTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
 
newDatabaseMetaData(AvaticaConnection) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
 
newDataflowClient(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.DataflowTransport
Returns a Google Cloud Dataflow client builder.
newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
 
newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
 
newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
 
newDlqTransform(String) - Method in interface org.apache.beam.sdk.schemas.io.GenericDlqProvider
Generate a DLQ output from the provided config value.
newGoogleAdsClient(GoogleAdsOptions, String, Long, Long) - Method in class org.apache.beam.sdk.io.googleads.DefaultGoogleAdsClientFactory
 
newGoogleAdsClient(GoogleAdsOptions, String, Long, Long) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsClientFactory
 
newJob(SerializableConfiguration) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
Returns a new configured Job object.
NewPartition - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
Represents a new partition created as a result of splits and merges.
NewPartition(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
NewPartition(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
newPluginInstance(String) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
newPopulation(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
newPopulation(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
newPopulation(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
newPopulation(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
newPreparedStatement(AvaticaConnection, Meta.StatementHandle, Meta.Signature, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
 
newProvider(T) - Method in class org.apache.beam.sdk.testing.TestPipeline
Returns a new ValueProvider that is inaccessible before TestPipeline.run(), but will be accessible while the pipeline runs.
newReader(PulsarClient, String) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
newResultSet(AvaticaStatement, QueryState, Meta.Signature, TimeZone, Meta.Frame) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
 
newResultSetMetaData(AvaticaStatement, Meta.Signature) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
 
newSample(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
newSample(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
newSample(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
newSample(SerializableFunction<BigDecimal, V>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
newStatement(AvaticaConnection, Meta.StatementHandle, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
 
newStorageClient(GcsOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
Returns a Cloud Storage client builder using the specified GcsOptions.
newTracker(KafkaSourceConsumerFn.OffsetHolder) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
newTracker(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
newTracker(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
newTracker() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
newTracker() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
newTracker() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.HasDefaultTracker
Creates a new tracker for this.
newTracker(Watch.GrowthState) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
 
NewVsCopy() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
newWatermarkEstimator() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.HasDefaultWatermarkEstimator
Creates a new watermark estimator for this.
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
 
next() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
 
next() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
 
next() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2EmptySource
 
next() - Method in class org.apache.beam.runners.twister2.translators.functions.ImpulseSource
 
next() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
 
next() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
 
next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
 
next() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Moves the pointer to the next record in the ResultSet if there is one.
next(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
Adds one nanosecond to the given timestamp.
nextBatch(TimestampedValue<T>...) - Method in class org.apache.beam.runners.spark.io.CreateStream
Enqueue next micro-batch elements.
nextBatch(T...) - Method in class org.apache.beam.runners.spark.io.CreateStream
For non-timestamped elements.
nextFieldId() - Method in class org.apache.beam.sdk.values.Row.Builder
 
NFA - Class in org.apache.beam.sdk.extensions.sql.impl.nfa
NFA is an implementation of non-deterministic finite automata.
NO_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
PaneInfo to use for elements on (and before) initial window assignment (including elements read from sources) before they have passed through a GroupByKey and are associated with a particular trigger firing.
NodeStats - Class in org.apache.beam.sdk.extensions.sql.impl.planner
This is a utility class to represent rowCount, rate and window.
NodeStats() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
 
NodeStatsMetadata - Interface in org.apache.beam.sdk.extensions.sql.impl.planner
This is a metadata used for row count and rate estimation.
NodeStatsMetadata.Handler - Interface in org.apache.beam.sdk.extensions.sql.impl.planner
Handler API.
NON_PARALLEL_INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
NonCumulativeCostImpl() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
 
NonDeterministicException(Coder<?>, String, Coder.NonDeterministicException) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
NonDeterministicException(Coder<?>, String) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
NonDeterministicException(Coder<?>, List<String>) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
NonDeterministicException(Coder<?>, List<String>, Coder.NonDeterministicException) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
NONE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
NONE - Static variable in interface org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler.Stats
 
none() - Static method in class org.apache.beam.sdk.schemas.Schema.Options
 
none() - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Default empty DisplayData instance.
NONE - Static variable in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
Constant Progress instance to be used when no work has been completed yet.
NonMergingWindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
Abstract base class for WindowFns that do not merge windows.
NonMergingWindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
 
nonSeekableInputIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
NOOP_CHECKPOINT_MARK - Static variable in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
 
NoopCheckpointMark() - Constructor for class org.apache.beam.sdk.io.UnboundedSource.CheckpointMark.NoopCheckpointMark
 
NoOpCounter - Class in org.apache.beam.sdk.metrics
A no-op implementation of Counter.
NoopCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
Construct an oauth credential to be used by the SDK and the SDK workers.
NoopCredentialFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
NoOpHistogram - Class in org.apache.beam.sdk.metrics
A no-op implementation of Histogram.
NoOpMetricsSink() - Constructor for class org.apache.beam.sdk.metrics.MetricsOptions.NoOpMetricsSink
 
NoopPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
For internal use only; no backwards compatibility guarantees.
NoOpStepContext - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
A StepContext for Spark Batch Runner execution.
NoOpStepContext() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.NoOpStepContext
 
NoOpStepContext - Class in org.apache.beam.runners.twister2.utils
A no-op StepContext for the Twister2 runner.
NoOpStepContext() - Constructor for class org.apache.beam.runners.twister2.utils.NoOpStepContext
 
normalize() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
NormalizedRange - Class in org.apache.beam.sdk.io.azure.cosmos
 
NormalizedRange(String, String) - Constructor for class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
 
NoSuchSchemaException - Exception in org.apache.beam.sdk.schemas
Indicates that we are missing a schema for a type.
NoSuchSchemaException() - Constructor for exception org.apache.beam.sdk.schemas.NoSuchSchemaException
 
not(SerializableMatcher<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
notEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Asserts that the value in question is not equal to the provided value, according to Object.equals(java.lang.Object).
notifyOfRemovedMetric(Metric, String, MetricGroup) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
 
notRegistered() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.UnusedFn
 
now() - Method in interface org.apache.beam.runners.direct.Clock
Returns the current time as an Instant.
now(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
 
nullable - Variable in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
 
nullable() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
nullable(TableSchema.TypeName) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
nullable(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a nullable field with the given name and type.
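Example (a minimal sketch of a schema with one required and one nullable field):
    Schema schema =
        Schema.builder()
            .addStringField("id")
            .addField(Schema.Field.nullable("note", Schema.FieldType.STRING))
            .build();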
NULLABLE_DATE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
NULLABLE_TIME - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
NULLABLE_TIMESTAMP - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
NULLABLE_TIMESTAMP_WITH_LOCAL_TZ - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
NullableCoder<T> - Class in org.apache.beam.sdk.coders
A NullableCoder encodes nullable values of type T using a nested Coder<T> that does not tolerate null values.
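Example (a minimal sketch of wrapping an element coder so that nulls can be encoded; the wrapped coder here is StringUtf8Coder):
    Coder<String> maybeNullCoder = NullableCoder.of(StringUtf8Coder.of());
    // e.g. lookups.setCoder(maybeNullCoder) for an assumed PCollection<String> that may contain nulls.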
nullContext() - Static method in class org.apache.beam.sdk.state.StateContexts
Returns a fake StateContext.
NullCredentialInitializer - Class in org.apache.beam.sdk.extensions.gcp.auth
A HttpRequestInitializer for requests that don't have credentials.
NullCredentialInitializer() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
nullRow(Schema) - Static method in class org.apache.beam.sdk.values.Row
Creates a new record filled with nulls.
nulls() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for nulls/Void.
NullSizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
NoOp implementation of a size estimator.
NullSizeEstimator() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.NullSizeEstimator
 
NullThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
NoOp implementation of a throughput estimator.
NullThroughputEstimator() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
 
nullValue() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
A SerializableMatcher with identical criteria to Matchers.nullValue().
NUM_QUERY_SPLITS_MAX - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
An upper bound on the number of splits for a query.
NUM_REDUCES - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
numberingDenseRank() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
numberingPercentRank() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
numberingRank() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
numberingRowNumber() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
 
numberOfRanges() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
numberOfRecordsForRate - Variable in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
NUMERIC_LITERAL_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
NUMERIC_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
numRetries(int) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
 
numRetries() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
 
numSupported() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
This is primarily used by the cost based optimization to determine the benefit of performing predicate push-down for an IOSourceRel.
numSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
 
numSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
 
numSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
 

O

OBJECT_TYPE_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ObjectPool<KeyT,ObjectT> - Class in org.apache.beam.sdk.io.aws2.common
Reference counting object pool to easily share & destroy objects.
ObjectPool(Function<KeyT, ObjectT>) - Constructor for class org.apache.beam.sdk.io.aws2.common.ObjectPool
 
ObjectPool(Function<KeyT, ObjectT>, ThrowingConsumer<Exception, ObjectT>) - Constructor for class org.apache.beam.sdk.io.aws2.common.ObjectPool
 
ObjectPool.ClientPool<ClientT extends SdkClient> - Class in org.apache.beam.sdk.io.aws2.common
Client pool to easily share AWS clients per configuration.
observe(RestrictionTracker<RestrictionT, PositionT>, RestrictionTrackers.ClaimObserver<PositionT>) - Static method in class org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers
Returns a thread safe RestrictionTracker which reports all claim attempts to the specified RestrictionTrackers.ClaimObserver.
observeTimestamp(Instant) - Method in interface org.apache.beam.sdk.transforms.splittabledofn.TimestampObservingWatermarkEstimator
Update watermark estimate with latest output timestamp.
observeTimestamp(Instant) - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
 
of(PTransform<PCollection<RequestT>, Result<KV<RequestT, ResponseT>>>, PTransform<PCollection<KV<RequestT, ResponseT>>, Result<KV<RequestT, ResponseT>>>) - Static method in class org.apache.beam.io.requestresponse.Cache.Pair
 
of(Caller<RequestT, ResponseT>, Coder<ResponseT>) - Static method in class org.apache.beam.io.requestresponse.RequestResponseIO
Instantiates a RequestResponseIO with a Caller and a ResponseT Coder.
of(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
 
of() - Static method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
of(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, Map<String, Coder>, Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, Map<String, Map<String, ProcessBundleDescriptors.BagUserStateSpec>>, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
of(String, String, RunnerApi.FunctionSpec, Coder<T>, Coder<W>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
of(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
of(Coder<T>, String) - Static method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
of() - Static method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
of(K, Coder<K>) - Static method in class org.apache.beam.runners.local.StructuralKey
Create a new Structural Key of the provided key that can be encoded by the provided coder.
of(T, CloseableResource.Closer<T>) - Static method in class org.apache.beam.runners.portability.CloseableResource
Creates a CloseableResource with the given resource and closer.
of(JobApi.MetricResults) - Static method in class org.apache.beam.runners.portability.PortableMetrics
 
of(Coder<T>, Duration, boolean) - Static method in class org.apache.beam.runners.spark.io.CreateStream
Creates a new Spark based stream intended for test purposes.
of(Coder<T>, Duration) - Static method in class org.apache.beam.runners.spark.io.CreateStream
Creates a new Spark based stream without forced watermark sync, intended for test purposes.
of(SideInputReader, Collection<PCollectionView<?>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.CachedSideInputReader
Creates a SideInputReader that caches results for costly Materializations if present, otherwise the SideInputReader is returned as is.
of(SideInputReader) - Static method in class org.apache.beam.runners.spark.util.CachedSideInputReader
Create a new cached SideInputReader.
of() - Static method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BitSetCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BooleanCoder
Returns the singleton instance of BooleanCoder.
of() - Static method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
of() - Static method in class org.apache.beam.sdk.coders.ByteCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.CollectionCoder
 
of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
 
of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.DequeCoder
 
of() - Static method in class org.apache.beam.sdk.coders.DoubleCoder
 
of() - Static method in class org.apache.beam.sdk.coders.DurationCoder
 
of() - Static method in class org.apache.beam.sdk.coders.FloatCoder
 
of() - Static method in class org.apache.beam.sdk.coders.InstantCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.IterableCoder
 
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.KvCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.ListCoder
 
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.MapCoder
Produces a MapCoder with the given keyCoder and valueCoder.
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.NullableCoder
 
of(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoder
 
of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
Returns a SerializableCoder instance for the provided element type.
of(Class<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
Returns a SerializableCoder instance for the provided element class.
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SetCoder
Produces a SetCoder with the given elementCoder.
of(Coder<KeyT>) - Static method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SnappyCoder
Wraps the given coder into a SnappyCoder.
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.SortedMapCoder
Produces a SortedMapCoder with the given keyCoder and valueCoder.
of(Class<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
of(Class<T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
of() - Static method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
of() - Static method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
of() - Static method in class org.apache.beam.sdk.coders.VarIntCoder
 
of() - Static method in class org.apache.beam.sdk.coders.VarLongCoder
 
of() - Static method in class org.apache.beam.sdk.coders.VoidCoder
 
of(Coder<T>, byte[], int) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
Wraps the given coder into a ZstdCoder.
of(Coder<T>, byte[]) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
Wraps the given coder into a ZstdCoder.
of(Coder<T>, int) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
Wraps the given coder into a ZstdCoder.
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.ZstdCoder
Wraps the given coder into a ZstdCoder.
of() - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServiceSchemaTransformProvider
 
of(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroGenericCoder instance for the Avro schema.
of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type.
of(TypeDescriptor<T>, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding.
of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element class.
of(Class<T>, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the given class, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding.
of(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type using the provided Avro schema.
of(AvroDatumFactory<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided AvroDatumFactory using the provided Avro schema.
of(Class<T>, Schema, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type using the provided Avro schema, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding.
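Example (a minimal sketch of the two most common factory forms; MyRecord is a placeholder Avro-compatible class and `avroSchema` an assumed org.apache.avro.Schema):
    AvroCoder<MyRecord> reflectCoder = AvroCoder.of(MyRecord.class);
    AvroGenericCoder genericCoder = AvroCoder.of(avroSchema);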
of(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroGenericCoder
 
of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
Returns an AvroDatumFactory instance for the provided element type.
of(Class<T>, boolean) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
Returns an AvroDatumFactory instance for the provided element type respecting Avro's Reflect* or Specific* suite for encoding and decoding.
of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
 
of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
 
of(Class<? extends InputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.AsJsons
Creates an AsJsons PTransform that will transform a PCollection<InputT> into a PCollection of JSON Strings representing those objects using a Jackson ObjectMapper.
of(Class<? extends OutputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
Creates a ParseJsons PTransform that will parse JSON Strings into a PCollection<OutputT> using a Jackson ObjectMapper.
of() - Static method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator.SequenceRangeAccumulatorCoder
 
of(long, long, Instant) - Static method in class org.apache.beam.sdk.extensions.ordered.ContiguousSequenceRange
 
of() - Static method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
of(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
Returns a DynamicProtoCoder for the Protocol Buffers DynamicMessage for the given Descriptors.Descriptor.
of(ProtoDomain, Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
Returns a DynamicProtoCoder for the Protocol Buffers DynamicMessage for the given Descriptors.Descriptor.
of(ProtoDomain, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
Returns a DynamicProtoCoder for the Protocol Buffers DynamicMessage for the given message name in a ProtoDomain.
of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a ProtoCoder for the given Protocol Buffers Message.
of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a ProtoCoder for the Protocol Buffers Message indicated by the given TypeDescriptor.
of(String) - Static method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
Instantiates a multi-language wrapper for a Python DataframeTransform with a given lambda function.
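Example (a minimal sketch of invoking the Python transform from Java; `rows` is an assumed schema-aware PCollection<Row> with a column "a", and a Python expansion service must be reachable):
    PCollection<Row> aggregated =
        rows.apply(DataframeTransform.of("lambda df: df.groupby('a').sum()").withIndexes());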
of(String, Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
Instantiates a multi-language wrapper for a Python RunInference with a given model loader.
of(String, Schema) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
Instantiates a multi-language wrapper for a Python RunInference with a given model loader.
of() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
of(BeamSqlTable) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
 
of(RexCall) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
 
of(RexPatternFieldRef) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
 
of(RexLiteral) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(Byte) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(Short) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(Integer) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(Long) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(BigDecimal) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(Float) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(ReadableDateTime) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(Boolean) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
 
of(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperation
 
of(SqlOperator) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
 
of(Schema, String, RexCall, Quantifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
 
of(RelFieldCollation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
 
of(Duration, Duration) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
 
of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
Convenient way to build a mocked bounded table.
of(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
Build a mocked bounded table with the specified type.
of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
Convenient way to build a mocked unbounded table.
of(FrameworkConfig, ExpressionConverter, RelOptCluster, QueryTrait) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ConversionContext
 
of(Duration, String...) - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
Construct the transform for the given duration and key fields.
of(Duration, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
Construct the transform for the given duration and key fields.
of() - Static method in class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoder
 
of() - Static method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
 
of(BigInteger, BigInteger) - Static method in class org.apache.beam.sdk.io.cassandra.RingRange
 
of(String, TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
of(String, TableSchema.ColumnType, TableSchema.DefaultType, Object) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
 
of(TableSchema.TypeName) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
of(TableSchema.Column...) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
 
of() - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
 
of() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
 
of(Coder<BoundedWindow>, Coder<DestinationT>) - Static method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
of() - Static method in class org.apache.beam.sdk.io.fs.MetadataCoder
Returns the singleton MetadataCoder instance.
of() - Static method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
Returns the singleton MetadataCoderV2 instance.
of() - Static method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
Creates a ResourceIdCoder.
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
of(TableRow, RowMutationInformation) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
of(RowMutationInformation.MutationType, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
Deprecated.
of(RowMutationInformation.MutationType, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
Instantiate RowMutationInformation with RowMutationInformation.MutationType and the change sequence number, which sets the BigQuery API _CHANGE_SEQUENCE_NUMBER pseudo column, enabling custom user-supplied ordering of RowMutations.
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
 
of(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
 
of(FhirBundleParameter, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
 
of(String, String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
Creates a FhirSearchParameter of type T.
of(String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
Creates a FhirSearchParameter of type T, without a key.
of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
 
of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
 
of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
 
of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
of(Class<HL7v2Message>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
of(String, HL7v2Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
Creates an HL7v2ReadResponse from the given metadata and hl7v2Message.
of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
of(Class<HL7v2ReadResponse>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
of(PubsubMessage, long, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
 
of(PubsubMessage, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
of(PubsubMessage, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
of(ByteString) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
of(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
Constructs a timestamp range.
of(Class<T>) - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
Returns a WritableCoder instance for the provided element class.
of(JdbcIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
 
of(JdbcIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
 
of(Schema) - Static method in class org.apache.beam.sdk.io.jdbc.SchemaUtil.BeamRowMapper
 
of(SerializableFunction<TopicPartition, Boolean>) - Static method in class org.apache.beam.sdk.io.kafka.CheckStopReadingFnWrapper
 
of(String, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
of(String, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
of(String, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
of(String, int, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
of(String, int, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
of(String, int, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
 
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
of(TopicPartition, Long, Instant, Long, Instant, List<String>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
 
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
of(Neo4jIO.DriverConfiguration) - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
 
of() - Static method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
 
of(String, Long, Long, MessageId, String, String) - Static method in class org.apache.beam.sdk.io.pulsar.PulsarSourceDescriptor
 
of(int...) - Static method in class org.apache.beam.sdk.io.range.ByteKey
Creates a new ByteKey backed by a copy of the specified int[].
of(ByteKey, ByteKey) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRange
Creates a new ByteKeyRange with the given start and end keys.
of(ByteKeyRange) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
Instantiates a new ByteKeyRangeTracker with the specified range.
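Illustrative sketch (not part of the generated Javadoc) combining these of() factories; the literal key bytes are placeholders.
    import org.apache.beam.sdk.io.range.ByteKey;
    import org.apache.beam.sdk.io.range.ByteKeyRange;
    import org.apache.beam.sdk.io.range.ByteKeyRangeTracker;

    // Keys are built from unsigned byte values (0-255); the range is half-open [start, end).
    ByteKey start = ByteKey.of(0x10);
    ByteKey end = ByteKey.of(0xf0);
    ByteKeyRange range = ByteKeyRange.of(start, end);
    ByteKeyRangeTracker tracker = ByteKeyRangeTracker.of(range);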
of() - Static method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
 
of(Coder<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.ReadableFileCoder
 
of() - Static method in class org.apache.beam.sdk.io.ReadableFileCoder
 
of(String, long, boolean) - Static method in class org.apache.beam.sdk.io.redis.RedisCursor
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDate
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDateTime
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTime
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestamp
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampLTZ
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampNTZ
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampTZ
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.geospatial.SnowflakeGeography
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.logical.SnowflakeBoolean
 
of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDecimal
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDouble
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeFloat
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeInteger
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumeric
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeReal
 
of(String, SnowflakeDataType) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
of(String, SnowflakeDataType, boolean) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
of(SnowflakeColumn...) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeArray
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeObject
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeVariant
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeChar
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
 
of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
 
of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarBinary
 
of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
of(SnowflakeIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
 
of() - Static method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
 
of(Class<T>, TProtocolFactory) - Static method in class org.apache.beam.sdk.io.thrift.ThriftCoder
Returns a ThriftCoder instance for the provided clazz and protocolFactory.
of(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.JAXBCoder
Create a coder for a given type of JAXB annotated objects.
of(ValueProvider<X>, SerializableFunction<X, T>) - Static method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
Creates a ValueProvider.NestedValueProvider that wraps the provided value.
of(T) - Static method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
Creates a ValueProvider.StaticValueProvider that wraps the provided value.
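Illustrative sketch (not part of the generated Javadoc); the table spec string and variable names are placeholders.
    import org.apache.beam.sdk.options.ValueProvider;
    import org.apache.beam.sdk.options.ValueProvider.NestedValueProvider;
    import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

    // Wrap a constant value, then derive a second provider by applying a SerializableFunction.
    ValueProvider<String> tableSpec = StaticValueProvider.of("project:dataset.table");
    ValueProvider<String> projectId = NestedValueProvider.of(tableSpec, spec -> spec.split(":")[0]);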
of(FieldAccessDescriptor.FieldDescriptor.ListQualifier) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
 
of(FieldAccessDescriptor.FieldDescriptor.MapQualifier) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
 
of(SerializableFunction<Row, byte[]>, SerializableFunction<byte[], Row>) - Static method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
 
of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
Return an instance of FixedBytes with specified byte array length.
of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
Return an instance of FixedBytes with specified byte array length.
of(int, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
Create a FixedPrecisionNumeric instance with specified precision and scale.
of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
Create a FixedPrecisionNumeric instance with specified scale and unspecified precision.
of(Row) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
Create a FixedPrecisionNumeric instance with specified argument row.
of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
Return an instance of FixedString with specified string length.
of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
Return an instance of FixedString with specified string length.
of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
Return an instance of VariableBytes with specified max byte array length.
of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
Return an instance of VariableBytes with specified max byte array length.
of(String, int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
Return an instance of VariableString with specified max string length.
of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
Return an instance of VariableString with specified max string length.
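Illustrative sketch (not part of the generated Javadoc): length-constrained logical types wrapped in Schema field types; the lengths are placeholders.
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.logicaltypes.FixedString;
    import org.apache.beam.sdk.schemas.logicaltypes.VariableBytes;

    // A 2-character fixed-length string and a byte array capped at 1024 bytes.
    Schema.FieldType countryCode = Schema.FieldType.logicalType(FixedString.of(2));
    Schema.FieldType payload = Schema.FieldType.logicalType(VariableBytes.of(1024));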
of(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a field with the given name and type.
of(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
Create a Schema.FieldType for the given type.
of(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.Schema
 
of(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
Returns a SchemaCoder for the specified class.
of(Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
Returns a SchemaCoder for Row instances with the given schema.
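Illustrative sketch (not part of the generated Javadoc): building a Schema from fields and deriving a Row coder; the field names are placeholders.
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.SchemaCoder;
    import org.apache.beam.sdk.values.Row;

    Schema person =
        Schema.of(
            Schema.Field.of("name", Schema.FieldType.STRING),
            Schema.Field.of("age", Schema.FieldType.INT32));
    // Coder for Row instances conforming to the schema.
    Coder<Row> coder = SchemaCoder.of(person);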
of() - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
 
of(Schema, Cast.Validator) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
 
of() - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
 
of(int) - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
 
of(SerializableFunction<Row, Integer>) - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout
 
of(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.WithKeys
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
 
of(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
Returns a CombineFn that uses the given SerializableBiFunction to combine values.
of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
Returns a CombineFn that uses the given SerializableFunction to combine values.
of(SerializableFunction<Iterable<V>, V>, int) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
Returns a CombineFn that uses the given SerializableFunction to combine values, attempting to buffer at least bufferSize values between invocations.
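Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<Integer> named numbers; IterableCombineFn.of is analogous but takes a SerializableFunction over an Iterable of values.
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.values.PCollection;

    // Combine all elements with a binary max function.
    PCollection<Integer> maxValue =
        numbers.apply(
            Combine.globally(Combine.BinaryCombineFn.of((Integer a, Integer b) -> a > b ? a : b)));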
of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
Deprecated.
Returns a CombineFn that uses the given SerializableFunction to combine values.
of(ClosureT, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
Constructs a pair of the given closure and its requirements.
of(Iterable<T>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces a PCollection containing elements of the provided Iterable.
of(T, T...) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces a PCollection containing the specified elements.
of(Map<K, V>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces a PCollection of KVs corresponding to the keys and values of the specified Map.
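Illustrative sketch (not part of the generated Javadoc): seeding a pipeline with in-memory elements.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    PCollection<String> colors = p.apply(Create.of("red", "green", "blue"));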
of(DisplayData.Path, Class<?>, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
of(CoGbkResultSchema, UnionCoder) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
Returns a CoGbkResult.CoGbkResultCoder for the given schema and UnionCoder.
of(TupleTag<V>, List<V>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns a new CoGbkResult that contains just the given tag and given data.
of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
of(TupleTag<InputT>, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns a new KeyedPCollectionTuple<K> with the given tag and initial PCollection.
of(String, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
A version of KeyedPCollectionTuple.of(TupleTag, PCollection) that takes in a string instead of a TupleTag.
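Illustrative sketch (not part of the generated Javadoc), assuming existing PCollection<KV<String, Integer>> orders and PCollection<KV<String, String>> names; the tuple feeds a CoGroupByKey.
    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    TupleTag<Integer> orderTag = new TupleTag<Integer>() {};
    TupleTag<String> nameTag = new TupleTag<String>() {};
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(orderTag, orders)
            .and(nameTag, names)
            .apply(CoGroupByKey.create());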
of(List<Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.join.UnionCoder
Builds a union coder with the given list of element coders.
of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
A CombineFn that computes the maximum of a collection of elements of type T using an arbitrary Comparator and identity, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
A CombineFn that computes the maximum of a collection of elements of type T using an arbitrary Comparator, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of() - Static method in class org.apache.beam.sdk.transforms.Mean
A Combine.CombineFn that computes the arithmetic mean (a.k.a. average) of a collection of numbers.
of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
A CombineFn that computes the minimum of a collection of elements of type T using an arbitrary Comparator and an identity, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
A CombineFn that computes the minimum of a collection of elements of type T using an arbitrary Comparator, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.ParDo
Creates a ParDo PTransform that will invoke the given DoFn function.
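Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<String> named lines.
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;

    // Map each line to its length with an anonymous DoFn.
    PCollection<Integer> lengths =
        lines.apply(
            ParDo.of(
                new DoFn<String, Integer>() {
                  @ProcessElement
                  public void processElement(@Element String line, OutputReceiver<Integer> out) {
                    out.output(line.length());
                  }
                }));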
of(int, Partition.PartitionWithSideInputsFn<? super T>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Partition
Returns a new Partition PTransform that divides its input PCollection into the given number of partitions, using the given partitioning function.
of(int, Partition.PartitionFn<? super T>) - Static method in class org.apache.beam.sdk.transforms.Partition
Returns a new Partition PTransform that divides its input PCollection into the given number of partitions, using the given partitioning function.
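Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<Integer> named numbers.
    import org.apache.beam.sdk.transforms.Partition;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionList;

    // Route each element to one of three partitions by value modulo the partition count.
    PCollectionList<Integer> parts =
        numbers.apply(
            Partition.of(
                3, (Partition.PartitionFn<Integer>) (elem, numPartitions) -> elem % numPartitions));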
of() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
 
of(ByteKeyRange) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
of(RestrictionT) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
Returns a RestrictionTracker.TruncateResult for the given restriction.
of(RestrictionT, RestrictionT) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
Returns a SplitResult for the specified primary and residual restrictions.
of(PTransform<PCollection<T>, ?>) - Static method in class org.apache.beam.sdk.transforms.Tee
Returns a new Tee PTransform that will apply an auxiliary transform to the input as well as pass it on.
of(Consumer<PCollection<T>>) - Static method in class org.apache.beam.sdk.transforms.Tee
Returns a new Tee PTransform that will apply an auxiliary transform to the input as well as pass it on.
of() - Static method in class org.apache.beam.sdk.transforms.ToJson
 
of(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the largest count elements of the input PCollection<T>, in decreasing order, sorted using the given Comparator<T>.
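Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<String> named words; the cast to SerializableComparator makes the comparator lambda serializable.
    import java.util.List;
    import org.apache.beam.sdk.transforms.SerializableComparator;
    import org.apache.beam.sdk.transforms.Top;
    import org.apache.beam.sdk.values.PCollection;

    // Keep the three largest elements under a length-based ordering.
    PCollection<List<String>> longest =
        words.apply(
            Top.of(
                3,
                (SerializableComparator<String>) (a, b) -> Integer.compare(a.length(), b.length())));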
of(PCollectionView<ViewT>) - Static method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
 
of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
Returns an AfterAll Trigger with the given subtriggers.
of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
Returns an AfterAll Trigger with the given subtriggers.
of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
Returns an AfterFirst Trigger with the given subtriggers.
of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
Returns an AfterFirst Trigger with the given subtriggers.
of() - Static method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
Returns the default trigger.
of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
Partitions the timestamp space into half-open intervals of the form [N * size, (N + 1) * size), where 0 is the epoch.
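Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<String> named events.
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Assign each element to a five-minute fixed window based on its timestamp.
    PCollection<String> windowed =
        events.apply(Window.into(FixedWindows.of(Duration.standardMinutes(5))));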
of() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
of() - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch.
of(T, Exception) - Static method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
 
of(OutputT, PCollection<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
of(PCollection<OutputElementT>, PCollection<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
of(PCollectionTuple, TupleTag<OutputElementT>, TupleTag<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
of(SerializableFunction<V, K>) - Static method in class org.apache.beam.sdk.transforms.WithKeys
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with a key computed from the value by invoking the given SerializableFunction.
of(K) - Static method in class org.apache.beam.sdk.transforms.WithKeys
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with the given key.
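Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<String> named words.
    import org.apache.beam.sdk.transforms.WithKeys;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Key each word by its length; withKeyType supplies the key TypeDescriptor for coder inference.
    PCollection<KV<Integer, String>> keyed =
        words.apply(
            WithKeys.of((String word) -> word.length()).withKeyType(TypeDescriptors.integers()));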
of(SerializableFunction<T, Instant>) - Static method in class org.apache.beam.sdk.transforms.WithTimestamps
For a SerializableFunction fn from T to Instant, outputs a PTransform that takes an input PCollection<T> and outputs a PCollection<T> containing every element v in the input where each element is output with a timestamp obtained as the result of fn.apply(v).
of(Coder<T>, Coder<ErrorT>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
of(T, Instant, BoundedWindow, PaneInfo, ErrorT) - Static method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
 
of(K, V) - Static method in class org.apache.beam.sdk.values.KV
Returns a KV with the given key and value.
of(PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionList
Returns a singleton PCollectionList containing the given PCollection.
of(Iterable<PCollection<T>>) - Static method in class org.apache.beam.sdk.values.PCollectionList
Returns a PCollectionList containing the given PCollections, in order.
of(String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
Returns a singleton PCollectionRowTuple containing the given PCollection keyed by the given tag.
of(String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
A version of PCollectionRowTuple.of(String, PCollection) that takes in two PCollections of the same type.
of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
A version of PCollectionRowTuple.of(String, PCollection) that takes in three PCollections of the same type.
of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
A version of PCollectionRowTuple.of(String, PCollection) that takes in four PCollections of the same type.
of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
A version of PCollectionRowTuple.of(String, PCollection) that takes in five PCollections of the same type.
of(TupleTag<T>, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
Returns a singleton PCollectionTuple containing the given PCollection keyed by the given TupleTag.
of(String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
A version of PCollectionTuple.of(TupleTag, PCollection) that takes in a String instead of a TupleTag.
of(String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
A version of PCollectionTuple.of(String, PCollection) that takes in two PCollections of the same type.
of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
A version of PCollectionTuple.of(String, PCollection) that takes in three PCollections of the same type.
of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
A version of PCollectionTuple.of(String, PCollection) that takes in four PCollections of the same type.
of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
A version of PCollectionTuple.of(String, PCollection) that takes in five PCollections of the same type.
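Illustrative sketch (not part of the generated Javadoc), assuming existing PCollection<String> valid and PCollection<String> invalid.
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.TupleTag;

    TupleTag<String> validTag = new TupleTag<String>() {};
    TupleTag<String> invalidTag = new TupleTag<String>() {};
    // Bundle the two collections under their tags, then retrieve one by tag.
    PCollectionTuple bundle = PCollectionTuple.of(validTag, valid).and(invalidTag, invalid);
    PCollection<String> validAgain = bundle.get(validTag);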
of(K, int) - Static method in class org.apache.beam.sdk.values.ShardedKey
 
of(TupleTag<?>, PCollection<?>) - Static method in class org.apache.beam.sdk.values.TaggedPValue
 
of(V, Instant) - Static method in class org.apache.beam.sdk.values.TimestampedValue
Returns a new TimestampedValue with the given value and timestamp.
of(Coder<T>) - Static method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
of(TupleTag<?>) - Static method in class org.apache.beam.sdk.values.TupleTagList
Returns a singleton TupleTagList containing the given TupleTag.
of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.values.TupleTagList
Returns a TupleTagList containing the given TupleTags, in order.
of(Class<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeDescriptor representing the given type.
of(Type) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeDescriptor representing the given type.
of(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
of(T, Instant, BoundedWindow, PaneInfo) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow
 
of(Coder<ValueT>) - Static method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
of(WindowFn<T, W>) - Static method in class org.apache.beam.sdk.values.WindowingStrategy
 
ofByteSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
Aim to create batches each with the specified byte size.
ofByteSize(long, SerializableFunction<InputT, Long>) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
Aim to create batches each with the specified byte size.
ofCallerAndSetupTeardown(CallerSetupTeardownT, Coder<ResponseT>) - Static method in class org.apache.beam.io.requestresponse.RequestResponseIO
Instantiates a RequestResponseIO with a ResponseT Coder and an implementation of both the Caller and SetupTeardown interfaces.
ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Max
ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Min
ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Sum
ofExpandedValue(PCollection<?>) - Static method in class org.apache.beam.sdk.values.TaggedPValue
 
offer(ArtifactRetrievalService, ArtifactStagingServiceGrpc.ArtifactStagingServiceStub, String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
Lazily stages artifacts by letting an ArtifactStagingService resolve and request artifacts.
offerCoders(Coder[]) - Method in interface org.apache.beam.sdk.state.StateSpec
For internal use only; no backwards-compatibility guarantees.
offeringClientsToPool(ControlClientPool.Sink, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
Creates a new FnApiControlClientPoolService which will enqueue and vend new SDK harness connections.
ofFirstElement() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
offset(Duration) - Method in interface org.apache.beam.sdk.state.Timer
Offsets the target timestamp used by Timer.setRelative() by the given duration.
OFFSET_INFINITY - Static variable in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Offset corresponding to infinity.
OffsetBasedReader(OffsetBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
OffsetBasedSource<T> - Class in org.apache.beam.sdk.io
A BoundedSource that uses offsets to define starting and ending positions.
OffsetBasedSource(long, long, long) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource
 
OffsetBasedSource.OffsetBasedReader<T> - Class in org.apache.beam.sdk.io
A Source.Reader that implements code common to readers of all OffsetBasedSources.
OffsetByteRangeCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
OffsetByteRangeCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
OffsetRange - Class in org.apache.beam.sdk.io.range
A restriction represented by a range of integers [from, to).
OffsetRange(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRange
 
OffsetRange.Coder - Class in org.apache.beam.sdk.io.range
A coder for OffsetRanges.
OffsetRangeTracker - Class in org.apache.beam.sdk.io.range
A RangeTracker for non-negative positions of type long.
OffsetRangeTracker(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRangeTracker
Creates an OffsetRangeTracker for the specified range.
OffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
A RestrictionTracker for claiming offsets in an OffsetRange in a monotonically increasing fashion.
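Illustrative sketch (not part of the generated Javadoc), using the constructor listed in the next entry; the offsets are placeholders.
    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;

    // Claim offsets from the half-open range [0, 100).
    OffsetRange restriction = new OffsetRange(0, 100);
    OffsetRangeTracker tracker = new OffsetRangeTracker(restriction);
    boolean first = tracker.tryClaim(0L);   // true: 0 lies inside the range
    boolean past = tracker.tryClaim(100L);  // false: 100 is at or past the end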
OffsetRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Max
ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Min
ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Sum
ofKVs(String, Schema.FieldType, Schema.FieldType, Coder<KeyT>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
Similar to RunInference.of(String, FieldType, FieldType) but the input is a PCollection of KVs.
ofKVs(String, Schema, Coder<KeyT>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
Similar to RunInference.of(String, Schema) but the input is a PCollection of KVs.
ofLongs() - Static method in class org.apache.beam.sdk.transforms.Max
ofLongs() - Static method in class org.apache.beam.sdk.transforms.Min
ofLongs() - Static method in class org.apache.beam.sdk.transforms.Sum
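Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<Long> named latencies; Min.ofLongs() and the Integer/Double variants follow the same pattern.
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<Long> total = latencies.apply("SumLatency", Combine.globally(Sum.ofLongs()));
    PCollection<Long> slowest = latencies.apply("MaxLatency", Combine.globally(Max.ofLongs()));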
ofNamed(Map<String, ?>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
 
ofNone() - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
 
ofPatientEverything(HealthcareApiClient, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
Instantiates a new GetPatientEverything FHIR resource pages iterator.
ofPositional(List) - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
 
ofPrimitiveOutputsInternal(Pipeline, TupleTagList, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, PCollection.IsBounded) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
For internal use only; no backwards-compatibility guarantees.
ofProvider(ValueProvider<T>, Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns an Create.OfValueProvider transform that produces a PCollection of a single element provided by the given ValueProvider.
ofSearch(HealthcareApiClient, String, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
Instantiates a new search FHIR resource pages iterator.
ofSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
Aim to create batches each with the specified element count.
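Illustrative sketch (not part of the generated Javadoc), assuming an existing PCollection<KV<String, String>> named keyedEvents.
    import org.apache.beam.sdk.transforms.GroupIntoBatches;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Emit per-key batches of up to 100 elements.
    PCollection<KV<String, Iterable<String>>> batched =
        keyedEvents.apply(GroupIntoBatches.<String, String>ofSize(100));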
OK - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
OLDEST_TIMESTAMP_SELECTOR - Static variable in class org.apache.beam.sdk.extensions.ordered.combiner.DefaultSequenceCombiner
 
on(Join.FieldsEqual.Impl) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
Join the PCollections using the provided predicate.
on(PCollection<?>...) - Static method in class org.apache.beam.sdk.transforms.Wait
Waits on the given signal collections.
on(List<PCollection<?>>) - Static method in class org.apache.beam.sdk.transforms.Wait
Waits on the given signal collections.
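Illustrative sketch (not part of the generated Javadoc), assuming existing PCollection<String> mainInput and PCollection<Void> signal.
    import org.apache.beam.sdk.transforms.Wait;
    import org.apache.beam.sdk.values.PCollection;

    // Elements of mainInput are emitted only once the corresponding windows of the signal collection close.
    PCollection<String> gated = mainInput.apply(Wait.on(signal));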
ON_TIME_AND_ONLY_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
PaneInfo to use when there will be exactly one firing and it is on time.
onAdvance(int, int) - Method in class org.apache.beam.sdk.fn.stream.AdvancingPhaser
 
onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
 
onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarkAdvancingStreamingListener
 
onBeforeRequest(String, String, Message) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsV17.RateLimitPolicy
Called before a request is sent.
onBeforeRequest(String, String, Message) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.SimpleRateLimitPolicy
 
onBundleSuccess() - Method in interface org.apache.beam.sdk.transforms.DoFn.BundleFinalizer.Callback
 
OnceTrigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
 
onCheckpoint(BeamFnApi.ProcessBundleResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleCheckpointHandler
 
onCheckpoint(BeamFnApi.ProcessBundleResponse) - Method in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
 
onClaimed(PositionT) - Method in interface org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers.ClaimObserver
Called when RestrictionTracker.tryClaim(PositionT) returns true.
onClaimFailed(PositionT) - Method in interface org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers.ClaimObserver
Called when RestrictionTracker.tryClaim(PositionT) returns false.
onClose(Consumer<FnApiControlClient>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
onCompleted(BeamFnApi.ProcessBundleResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
Handles the bundle's completion report.
onCompleted() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
onCompleted() - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
onCompleted() - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
onCompleted() - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
 
oneOfEncoder(List<Encoder<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a one-of Spark Encoder of StructType where each alternative is represented as a column / field named by its index, each with a separate Encoder.
OneOfType - Class in org.apache.beam.sdk.schemas.logicaltypes
A logical type representing a union of fields.
OneOfType.Value - Class in org.apache.beam.sdk.schemas.logicaltypes
Represents a single OneOf value.
onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
 
onError(String, String, Message, GoogleAdsError) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsV17.RateLimitPolicy
Called after a request fails with a retryable error.
onError(String, String, Message, GoogleAdsError) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.SimpleRateLimitPolicy
 
onGcTimer(Instant, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, ValueState<SortedMap<Instant, Long>>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinAssociateRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinPushThroughJoinRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
 
onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRule
 
onMerge(ReduceFn<K, T, Iterable<T>, W>.OnMergeContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
onNext(T) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
onNext(T) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
onNext(ReqT) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
onNext(V) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
 
onPollComplete(StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
Called by the Watch transform to compute a new termination state after every poll completion.
onProgress(BeamFnApi.ProcessBundleProgressResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
Handles a progress report from the bundle while it is executing.
onReceiverStart() - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
onSeenNewOutput(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
Called by the Watch transform to compute a new termination state, in case after calling the Watch.Growth.PollFn for the current input, the Watch.Growth.PollResult included a previously unseen OutputT.
onStartup() - Method in interface org.apache.beam.sdk.harness.JvmInitializer
Implement onStartup to run some custom initialization immediately after the JVM is launched for pipeline execution.
onSuccess(List<KinesisRecord>) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicy
Called after Kinesis records are successfully retrieved.
onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
 
onSuccess(String, String, Message) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsV17.RateLimitPolicy
Called after a request succeeds.
onSuccess(String, String, Message) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.SimpleRateLimitPolicy
 
onSuccess(List<KinesisRecord>) - Method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicy
Called after Kinesis records are successfully retrieved.
onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
 
onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
 
onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
 
onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
onThrottle(KinesisClientThrottledException) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicy
Called after the Kinesis client is throttled.
onThrottle(KinesisClientThrottledException) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
onThrottle(KinesisClientThrottledException) - Method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicy
Called after the Kinesis client is throttled.
onThrottle(KinesisClientThrottledException) - Method in class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
 
onTimer(String, String, KeyT, BoundedWindow, Instant, Instant, TimeDomain) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
onTimer(String, Instant, TimerMap, TimerMap, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, ValueState<SortedMap<Instant, Long>>, DoFn.OutputReceiver<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
 
OnTimerContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
 
onTrigger(ReduceFn<K, T, Iterable<T>, W>.OnTriggerContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
onWindowExpiration(BoundedWindow, Instant, KeyT) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
OnWindowExpirationContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnWindowExpirationContext
 
open(MetricConfig) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
 
open(WritableByteChannel) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
 
open(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Opens an object in GCS.
open(ClassLoaderFileSystem.ClassLoaderResourceId) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
open(String) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Opens a uniquely named temporary file and initializes the writer using FileBasedSink.Writer.prepareWrite(java.nio.channels.WritableByteChannel).
open() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
Returns a ReadableByteChannel reading the data from this file, potentially decompressing it using FileIO.ReadableFile.getCompression().
open(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileIO.Sink
Initializes writing to the given channel.
open(ResourceIdT) - Method in class org.apache.beam.sdk.io.FileSystem
Returns a read channel for the given ResourceIdT.
open(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a read channel for the given ResourceId.
open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
 
open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
 
open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
openSeekable() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
Returns a SeekableByteChannel equivalent to FileIO.ReadableFile.open(), but fails if this file is not seekable.
optimizedWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, enables new codepaths that are expected to use less resources while writing to BigQuery.
Options() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.jet.JetRunnerRegistrar.Options
 
options() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
 
Options() - Constructor for class org.apache.beam.runners.prism.PrismRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Options
 
options() - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
options - Variable in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
 
Options() - Constructor for class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
 
Options() - Constructor for class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
 
Options() - Constructor for class org.apache.beam.sdk.options.RemoteEnvironmentOptions.Options
 
Options() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.Options
 
OptionsRegistrar() - Constructor for class org.apache.beam.runners.portability.testing.TestUniversalRunner.OptionsRegistrar
 
Order - Class in org.apache.beam.sdk.extensions.sql.example.model
Describes an order.
Order(int, int) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
 
Order() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
 
OrderByKey() - Constructor for class org.apache.beam.sdk.values.KV.OrderByKey
 
OrderByValue() - Constructor for class org.apache.beam.sdk.values.KV.OrderByValue
 
OrderedEventProcessor<EventT,EventKeyT,ResultT,StateT extends MutableState<EventT,ResultT>> - Class in org.apache.beam.sdk.extensions.ordered
Transform for processing ordered events.
OrderedEventProcessor() - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessor
 
OrderedEventProcessorResult<KeyT,ResultT,EventT> - Class in org.apache.beam.sdk.extensions.ordered
The result of the ordered processing.
orderedList(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
OrderedListState<T> - Interface in org.apache.beam.sdk.state
A ReadableState cell containing a list of values sorted by timestamp.
OrderedProcessingGlobalSequenceHandler(Class<EventT>, Class<KeyT>, Class<StateT>, Class<ResultT>) - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler
 
OrderedProcessingHandler<EventT,KeyT,StateT extends MutableState<EventT,?>,ResultT> - Class in org.apache.beam.sdk.extensions.ordered
Parent class for Ordered Processing configuration handlers.
OrderedProcessingHandler(Class<EventT>, Class<KeyT>, Class<StateT>, Class<ResultT>) - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Provide concrete classes which will be used by the ordered processing transform.
OrderedProcessingHandler.OrderedProcessingGlobalSequenceHandler<EventT,KeyT,StateT extends MutableState<EventT,?>,ResultT> - Class in org.apache.beam.sdk.extensions.ordered
Parent class for Ordered Processing configuration handlers to handle processing of the events where global sequence is used.
OrderedProcessingStatus - Class in org.apache.beam.sdk.extensions.ordered
Indicates the status of ordered processing for a particular key.
OrderedProcessingStatus() - Constructor for class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus
 
OrderedProcessingStatus.Builder - Class in org.apache.beam.sdk.extensions.ordered
 
OrderKey - Class in org.apache.beam.sdk.extensions.sql.impl.cep
The OrderKey class stores the information to sort a column.
orFinally(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
Specify an ending condition for this trigger.
OrFinallyTrigger - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that executes according to its main trigger until its "finally" trigger fires.
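Illustrative sketch (not part of the generated Javadoc): a repeating element-count trigger capped by the watermark reaching the end of the window.
    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Trigger;

    Trigger trigger =
        Repeatedly.forever(AfterPane.elementCountAtLeast(100))
            .orFinally(AfterWatermark.pastEndOfWindow());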
org.apache.beam.io.debezium - package org.apache.beam.io.debezium
Transforms for reading from DebeziumIO.
org.apache.beam.io.requestresponse - package org.apache.beam.io.requestresponse
Package provides Beam I/O transform support for safely reading from and writing to Web APIs.
org.apache.beam.runners.dataflow - package org.apache.beam.runners.dataflow
Provides a Beam runner that executes pipelines on the Google Cloud Dataflow service.
org.apache.beam.runners.dataflow.options - package org.apache.beam.runners.dataflow.options
Provides PipelineOptions specific to Google Cloud Dataflow.
org.apache.beam.runners.dataflow.util - package org.apache.beam.runners.dataflow.util
Provides miscellaneous internal utilities used by the Google Cloud Dataflow runner.
org.apache.beam.runners.direct - package org.apache.beam.runners.direct
Defines the PipelineOptions.DirectRunner which executes both Bounded and Unbounded Pipelines on the local machine.
org.apache.beam.runners.flink - package org.apache.beam.runners.flink
Internal implementation of the Beam runner for Apache Flink.
org.apache.beam.runners.flink.adapter - package org.apache.beam.runners.flink.adapter
Adaptors for using Beam transforms in Apache Flink pipelines.
org.apache.beam.runners.flink.metrics - package org.apache.beam.runners.flink.metrics
Internal metrics implementation of the Beam runner for Apache Flink.
org.apache.beam.runners.fnexecution.artifact - package org.apache.beam.runners.fnexecution.artifact
Pipeline execution-time artifact-management services, including abstract implementations of the Artifact Retrieval Service.
org.apache.beam.runners.fnexecution.control - package org.apache.beam.runners.fnexecution.control
Utilities for a Beam runner to interact with the Fn API Control Service via Java abstractions.
org.apache.beam.runners.fnexecution.data - package org.apache.beam.runners.fnexecution.data
Utilities for a Beam runner to interact with the Fn API Data Service via Java abstractions.
org.apache.beam.runners.fnexecution.environment - package org.apache.beam.runners.fnexecution.environment
Classes used to instantiate and manage SDK harness environments.
org.apache.beam.runners.fnexecution.environment.testing - package org.apache.beam.runners.fnexecution.environment.testing
Test utilities for the environment management package.
org.apache.beam.runners.fnexecution.logging - package org.apache.beam.runners.fnexecution.logging
Classes used to log informational messages over the Beam Fn Logging Service.
org.apache.beam.runners.fnexecution.provisioning - package org.apache.beam.runners.fnexecution.provisioning
Provision api services.
org.apache.beam.runners.fnexecution.state - package org.apache.beam.runners.fnexecution.state
State API services.
org.apache.beam.runners.fnexecution.status - package org.apache.beam.runners.fnexecution.status
Worker Status API services.
org.apache.beam.runners.fnexecution.translation - package org.apache.beam.runners.fnexecution.translation
Shared utilities for a Beam runner to translate portable pipelines.
org.apache.beam.runners.fnexecution.wire - package org.apache.beam.runners.fnexecution.wire
Wire coders for communications between runner and SDK harness.
org.apache.beam.runners.jet - package org.apache.beam.runners.jet
Implementation of the Beam runner for Hazelcast Jet.
org.apache.beam.runners.jet.metrics - package org.apache.beam.runners.jet.metrics
Helper classes for implementing metrics in the Hazelcast Jet based runner.
org.apache.beam.runners.jet.processors - package org.apache.beam.runners.jet.processors
Individual DAG node processors used by the Beam runner for Hazelcast Jet.
org.apache.beam.runners.jobsubmission - package org.apache.beam.runners.jobsubmission
Job management services for use in Beam runners.
org.apache.beam.runners.local - package org.apache.beam.runners.local
Utilities useful when executing a pipeline on a single machine.
org.apache.beam.runners.portability - package org.apache.beam.runners.portability
Support for executing a pipeline locally over the Beam Fn API.
org.apache.beam.runners.portability.testing - package org.apache.beam.runners.portability.testing
Testing utilities for the reference runner.
org.apache.beam.runners.prism - package org.apache.beam.runners.prism
Support for executing a pipeline on Prism.
org.apache.beam.runners.spark - package org.apache.beam.runners.spark
Internal implementation of the Beam runner for Apache Spark.
org.apache.beam.runners.spark.coders - package org.apache.beam.runners.spark.coders
Beam coders and coder-related utilities for running on Apache Spark.
org.apache.beam.runners.spark.io - package org.apache.beam.runners.spark.io
Spark-specific transforms for I/O.
org.apache.beam.runners.spark.metrics - package org.apache.beam.runners.spark.metrics
Provides internal utilities for implementing Beam metrics using Spark accumulators.
org.apache.beam.runners.spark.metrics.sink - package org.apache.beam.runners.spark.metrics.sink
Spark sinks that support Beam metrics and aggregators.
org.apache.beam.runners.spark.stateful - package org.apache.beam.runners.spark.stateful
Spark-specific stateful operators.
org.apache.beam.runners.spark.structuredstreaming - package org.apache.beam.runners.spark.structuredstreaming
Internal implementation of the Beam runner for Apache Spark.
org.apache.beam.runners.spark.structuredstreaming.examples - package org.apache.beam.runners.spark.structuredstreaming.examples
 
org.apache.beam.runners.spark.structuredstreaming.io - package org.apache.beam.runners.spark.structuredstreaming.io
Spark-specific transforms for I/O.
org.apache.beam.runners.spark.structuredstreaming.metrics - package org.apache.beam.runners.spark.structuredstreaming.metrics
Provides internal utilities for implementing Beam metrics using Spark accumulators.
org.apache.beam.runners.spark.structuredstreaming.metrics.sink - package org.apache.beam.runners.spark.structuredstreaming.metrics.sink
Spark sinks that support Beam metrics and aggregators.
org.apache.beam.runners.spark.structuredstreaming.translation - package org.apache.beam.runners.spark.structuredstreaming.translation
Internal translators for running Beam pipelines on Spark.
org.apache.beam.runners.spark.structuredstreaming.translation.batch - package org.apache.beam.runners.spark.structuredstreaming.translation.batch
Internal utilities to translate Beam pipelines to Spark batching.
org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions - package org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
Internal implementation of the Beam runner for Apache Spark.
org.apache.beam.runners.spark.structuredstreaming.translation.helpers - package org.apache.beam.runners.spark.structuredstreaming.translation.helpers
Internal helpers to translate Beam pipelines to Spark streaming.
org.apache.beam.runners.spark.structuredstreaming.translation.utils - package org.apache.beam.runners.spark.structuredstreaming.translation.utils
Internal utils to translate Beam pipelines to Spark streaming.
org.apache.beam.runners.spark.util - package org.apache.beam.runners.spark.util
Internal utilities to translate Beam pipelines to Spark.
org.apache.beam.runners.twister2 - package org.apache.beam.runners.twister2
Internal implementation of the Beam runner for Twister2.
org.apache.beam.runners.twister2.translation.wrappers - package org.apache.beam.runners.twister2.translation.wrappers
Internal implementation of the Beam runner for Twister2.
org.apache.beam.runners.twister2.translators - package org.apache.beam.runners.twister2.translators
Internal implementation of the Beam runner for Twister2.
org.apache.beam.runners.twister2.translators.batch - package org.apache.beam.runners.twister2.translators.batch
Internal implementation of the Beam runner for Twister2.
org.apache.beam.runners.twister2.translators.functions - package org.apache.beam.runners.twister2.translators.functions
Internal implementation of the Beam runner for Twister2.
org.apache.beam.runners.twister2.translators.functions.internal - package org.apache.beam.runners.twister2.translators.functions.internal
Internal implementation of the Beam runner for Twister2.
org.apache.beam.runners.twister2.translators.streaming - package org.apache.beam.runners.twister2.translators.streaming
Internal implementation of the Beam runner for Twister2.
org.apache.beam.runners.twister2.utils - package org.apache.beam.runners.twister2.utils
Internal implementation of the Beam runner for Twister2.
org.apache.beam.sdk - package org.apache.beam.sdk
Provides a simple, powerful model for building both batch and streaming parallel data processing Pipelines.
org.apache.beam.sdk.annotations - package org.apache.beam.sdk.annotations
Defines annotations used across the SDK.
org.apache.beam.sdk.coders - package org.apache.beam.sdk.coders
Defines Coders to specify how data is encoded to and decoded from byte strings.
org.apache.beam.sdk.expansion - package org.apache.beam.sdk.expansion
Contains classes needed to expose transforms to other SDKs.
org.apache.beam.sdk.expansion.service - package org.apache.beam.sdk.expansion.service
Classes used to expand cross-language transforms.
org.apache.beam.sdk.extensions.arrow - package org.apache.beam.sdk.extensions.arrow
Extensions for using Apache Arrow with Beam.
org.apache.beam.sdk.extensions.avro - package org.apache.beam.sdk.extensions.avro
Defines Schema and other classes for representing schema'd data in a Pipeline using Apache Avro.
org.apache.beam.sdk.extensions.avro.coders - package org.apache.beam.sdk.extensions.avro.coders
Defines Coders to specify how data is encoded to and decoded from byte strings using Apache Avro.
org.apache.beam.sdk.extensions.avro.io - package org.apache.beam.sdk.extensions.avro.io
Defines transforms for reading and writing Avro storage format.
org.apache.beam.sdk.extensions.avro.schemas - package org.apache.beam.sdk.extensions.avro.schemas
Defines Schema and other classes for representing schema'd data in a Pipeline using Apache Avro.
org.apache.beam.sdk.extensions.avro.schemas.io.payloads - package org.apache.beam.sdk.extensions.avro.schemas.io.payloads
Provides abstractions for schema-aware AvroIO.
org.apache.beam.sdk.extensions.avro.schemas.utils - package org.apache.beam.sdk.extensions.avro.schemas.utils
Defines utilities for dealing with schemas using Apache Avro.
org.apache.beam.sdk.extensions.gcp.auth - package org.apache.beam.sdk.extensions.gcp.auth
Defines classes related to interacting with Credentials for pipeline creation and execution containing Google Cloud Platform components.
org.apache.beam.sdk.extensions.gcp.options - package org.apache.beam.sdk.extensions.gcp.options
Defines PipelineOptions for configuring pipeline execution for Google Cloud Platform components.
org.apache.beam.sdk.extensions.gcp.storage - package org.apache.beam.sdk.extensions.gcp.storage
Defines IO connectors for Google Cloud Storage.
org.apache.beam.sdk.extensions.gcp.util - package org.apache.beam.sdk.extensions.gcp.util
Defines Google Cloud Platform component utilities that can be used by Beam runners.
org.apache.beam.sdk.extensions.gcp.util.channels - package org.apache.beam.sdk.extensions.gcp.util.channels
Contains Java channel wrappers used with GCS.
org.apache.beam.sdk.extensions.gcp.util.gcsfs - package org.apache.beam.sdk.extensions.gcp.util.gcsfs
Defines utilities used to interact with Google Cloud Storage.
org.apache.beam.sdk.extensions.jackson - package org.apache.beam.sdk.extensions.jackson
Utilities for parsing and creating JSON serialized objects.
org.apache.beam.sdk.extensions.joinlibrary - package org.apache.beam.sdk.extensions.joinlibrary
Utilities for performing SQL-style joins of keyed PCollections.
org.apache.beam.sdk.extensions.ml - package org.apache.beam.sdk.extensions.ml
Provides DoFns for integration with Google Cloud AI Video Intelligence service.
org.apache.beam.sdk.extensions.ordered - package org.apache.beam.sdk.extensions.ordered
Provides a transform for ordered processing.
org.apache.beam.sdk.extensions.ordered.combiner - package org.apache.beam.sdk.extensions.ordered.combiner
Default implementation of the global sequence combiner used by OrderedEventProcessor when processing events using global sequences.
org.apache.beam.sdk.extensions.protobuf - package org.apache.beam.sdk.extensions.protobuf
Defines a Coder for Protocol Buffers messages, ProtoCoder.
org.apache.beam.sdk.extensions.python - package org.apache.beam.sdk.extensions.python
Extensions for invoking Python transforms from the Beam Java SDK.
org.apache.beam.sdk.extensions.python.transforms - package org.apache.beam.sdk.extensions.python.transforms
Extensions for invoking Python transforms from the Beam Java SDK.
org.apache.beam.sdk.extensions.sbe - package org.apache.beam.sdk.extensions.sbe
Extension for working with SBE messages in Beam.
org.apache.beam.sdk.extensions.schemaio.expansion - package org.apache.beam.sdk.extensions.schemaio.expansion
External Transform Registration for SchemaIOs.
org.apache.beam.sdk.extensions.sketching - package org.apache.beam.sdk.extensions.sketching
Utilities for computing statistical indicators using probabilistic sketches.
org.apache.beam.sdk.extensions.sorter - package org.apache.beam.sdk.extensions.sorter
Utility for performing local sort of potentially large sets of values.
org.apache.beam.sdk.extensions.sql - package org.apache.beam.sdk.extensions.sql
BeamSQL provides an interface for running SQL statements with Beam.
org.apache.beam.sdk.extensions.sql.example - package org.apache.beam.sdk.extensions.sql.example
Example of how to use the Data Catalog table provider.
org.apache.beam.sdk.extensions.sql.example.model - package org.apache.beam.sdk.extensions.sql.example.model
Java classes used for modeling the examples.
org.apache.beam.sdk.extensions.sql.expansion - package org.apache.beam.sdk.extensions.sql.expansion
External Transform Registration for Beam SQL.
org.apache.beam.sdk.extensions.sql.impl - package org.apache.beam.sdk.extensions.sql.impl
Implementation classes of BeamSql.
org.apache.beam.sdk.extensions.sql.impl.cep - package org.apache.beam.sdk.extensions.sql.impl.cep
Utilities for Complex Event Processing (CEP).
org.apache.beam.sdk.extensions.sql.impl.nfa - package org.apache.beam.sdk.extensions.sql.impl.nfa
Package of Non-deterministic Finite Automata (NFA) for MATCH_RECOGNIZE.
org.apache.beam.sdk.extensions.sql.impl.parser - package org.apache.beam.sdk.extensions.sql.impl.parser
Beam SQL parsing additions to Calcite SQL.
org.apache.beam.sdk.extensions.sql.impl.planner - package org.apache.beam.sdk.extensions.sql.impl.planner
BeamQueryPlanner is the main interface.
org.apache.beam.sdk.extensions.sql.impl.rel - package org.apache.beam.sdk.extensions.sql.impl.rel
BeamSQL-specific nodes that replace RelNode.
org.apache.beam.sdk.extensions.sql.impl.rule - package org.apache.beam.sdk.extensions.sql.impl.rule
RelOptRule to generate BeamRelNode.
org.apache.beam.sdk.extensions.sql.impl.schema - package org.apache.beam.sdk.extensions.sql.impl.schema
Defines table schemas that map to Beam IO components.
org.apache.beam.sdk.extensions.sql.impl.transform - package org.apache.beam.sdk.extensions.sql.impl.transform
PTransform used in a BeamSql pipeline.
org.apache.beam.sdk.extensions.sql.impl.transform.agg - package org.apache.beam.sdk.extensions.sql.impl.transform.agg
Implementation of standard SQL aggregation functions.
org.apache.beam.sdk.extensions.sql.impl.udaf - package org.apache.beam.sdk.extensions.sql.impl.udaf
UDAF classes.
org.apache.beam.sdk.extensions.sql.impl.udf - package org.apache.beam.sdk.extensions.sql.impl.udf
UDF classes.
org.apache.beam.sdk.extensions.sql.impl.utils - package org.apache.beam.sdk.extensions.sql.impl.utils
Utility classes.
org.apache.beam.sdk.extensions.sql.meta - package org.apache.beam.sdk.extensions.sql.meta
Metadata related classes.
org.apache.beam.sdk.extensions.sql.meta.provider - package org.apache.beam.sdk.extensions.sql.meta.provider
Table providers.
org.apache.beam.sdk.extensions.sql.meta.provider.avro - package org.apache.beam.sdk.extensions.sql.meta.provider.avro
Table schema for AvroIO.
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery - package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
Table schema for BigQuery.
org.apache.beam.sdk.extensions.sql.meta.provider.bigtable - package org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
Table schema for BigTable.
org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog - package org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
Table schema for Google Cloud Data Catalog.
org.apache.beam.sdk.extensions.sql.meta.provider.datastore - package org.apache.beam.sdk.extensions.sql.meta.provider.datastore
Table schema for DataStore.
org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog - package org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog
Table schema for HCatalog.
org.apache.beam.sdk.extensions.sql.meta.provider.kafka - package org.apache.beam.sdk.extensions.sql.meta.provider.kafka
Table schema for KafkaIO.
org.apache.beam.sdk.extensions.sql.meta.provider.mongodb - package org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
Table schema for MongoDb.
org.apache.beam.sdk.extensions.sql.meta.provider.parquet - package org.apache.beam.sdk.extensions.sql.meta.provider.parquet
Table schema for ParquetIO.
org.apache.beam.sdk.extensions.sql.meta.provider.pubsub - package org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
Table schema for PubsubIO.
org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite - package org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite
Provides abstractions for schema-aware IOs.
org.apache.beam.sdk.extensions.sql.meta.provider.seqgen - package org.apache.beam.sdk.extensions.sql.meta.provider.seqgen
Table schema for streaming sequence generator.
org.apache.beam.sdk.extensions.sql.meta.provider.test - package org.apache.beam.sdk.extensions.sql.meta.provider.test
Table schema for in-memory test data.
org.apache.beam.sdk.extensions.sql.meta.provider.text - package org.apache.beam.sdk.extensions.sql.meta.provider.text
Table schema for text files.
org.apache.beam.sdk.extensions.sql.meta.store - package org.apache.beam.sdk.extensions.sql.meta.store
Meta stores.
org.apache.beam.sdk.extensions.sql.provider - package org.apache.beam.sdk.extensions.sql.provider
Package containing UDF providers for testing.
org.apache.beam.sdk.extensions.sql.udf - package org.apache.beam.sdk.extensions.sql.udf
Provides interfaces for defining user-defined functions in Beam SQL.
org.apache.beam.sdk.extensions.sql.zetasql - package org.apache.beam.sdk.extensions.sql.zetasql
ZetaSQL Dialect package.
org.apache.beam.sdk.extensions.sql.zetasql.translation - package org.apache.beam.sdk.extensions.sql.zetasql.translation
Conversion logic between ZetaSQL resolved query nodes and Calcite rel nodes.
org.apache.beam.sdk.extensions.sql.zetasql.translation.impl - package org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
Java implementation of ZetaSQL functions.
org.apache.beam.sdk.extensions.sql.zetasql.unnest - package org.apache.beam.sdk.extensions.sql.zetasql.unnest
Temporary solution to support ZetaSQL UNNEST.
org.apache.beam.sdk.extensions.timeseries - package org.apache.beam.sdk.extensions.timeseries
Utilities for operating on timeseries data.
org.apache.beam.sdk.extensions.zetasketch - package org.apache.beam.sdk.extensions.zetasketch
PTransforms to compute statistical sketches on data streams based on the ZetaSketch implementation.
org.apache.beam.sdk.fn - package org.apache.beam.sdk.fn
The top level package for the Fn Execution Java libraries.
org.apache.beam.sdk.fn.channel - package org.apache.beam.sdk.fn.channel
gRPC channel management.
org.apache.beam.sdk.fn.data - package org.apache.beam.sdk.fn.data
Classes to interact with the portability framework data plane.
org.apache.beam.sdk.fn.server - package org.apache.beam.sdk.fn.server
gRPC server factory.
org.apache.beam.sdk.fn.splittabledofn - package org.apache.beam.sdk.fn.splittabledofn
Defines utilities related to executing splittable DoFn.
org.apache.beam.sdk.fn.stream - package org.apache.beam.sdk.fn.stream
gRPC stream management.
org.apache.beam.sdk.fn.test - package org.apache.beam.sdk.fn.test
Utilities for testing use of this package.
org.apache.beam.sdk.fn.windowing - package org.apache.beam.sdk.fn.windowing
Common utilities related to windowing during execution of a pipeline.
org.apache.beam.sdk.function - package org.apache.beam.sdk.function
Java 8 functional interface extensions.
org.apache.beam.sdk.harness - package org.apache.beam.sdk.harness
Utilities for configuring worker environment.
org.apache.beam.sdk.io - package org.apache.beam.sdk.io
Defines transforms for reading and writing common storage formats, including org.apache.beam.sdk.io.AvroIO and TextIO.
org.apache.beam.sdk.io.amqp - package org.apache.beam.sdk.io.amqp
Transforms for reading and writing using AMQP 1.0 protocol.
org.apache.beam.sdk.io.aws.coders - package org.apache.beam.sdk.io.aws.coders
Defines common coders for Amazon Web Services.
org.apache.beam.sdk.io.aws.dynamodb - package org.apache.beam.sdk.io.aws.dynamodb
Defines IO connectors for Amazon Web Services DynamoDB.
org.apache.beam.sdk.io.aws.options - package org.apache.beam.sdk.io.aws.options
Defines PipelineOptions for configuring pipeline execution for Amazon Web Services components.
org.apache.beam.sdk.io.aws.s3 - package org.apache.beam.sdk.io.aws.s3
Defines IO connectors for Amazon Web Services S3.
org.apache.beam.sdk.io.aws.sns - package org.apache.beam.sdk.io.aws.sns
Defines IO connectors for Amazon Web Services SNS.
org.apache.beam.sdk.io.aws.sqs - package org.apache.beam.sdk.io.aws.sqs
Defines IO connectors for Amazon Web Services SQS.
org.apache.beam.sdk.io.aws2.common - package org.apache.beam.sdk.io.aws2.common
Common code for AWS sources and sinks such as retry configuration.
org.apache.beam.sdk.io.aws2.dynamodb - package org.apache.beam.sdk.io.aws2.dynamodb
Defines IO connectors for Amazon Web Services DynamoDB.
org.apache.beam.sdk.io.aws2.kinesis - package org.apache.beam.sdk.io.aws2.kinesis
Transforms for reading from Amazon Kinesis.
org.apache.beam.sdk.io.aws2.options - package org.apache.beam.sdk.io.aws2.options
Defines PipelineOptions for configuring pipeline execution for Amazon Web Services components.
org.apache.beam.sdk.io.aws2.s3 - package org.apache.beam.sdk.io.aws2.s3
Defines IO connectors for Amazon Web Services S3.
org.apache.beam.sdk.io.aws2.schemas - package org.apache.beam.sdk.io.aws2.schemas
Schemas for AWS model classes.
org.apache.beam.sdk.io.aws2.sns - package org.apache.beam.sdk.io.aws2.sns
Defines IO connectors for Amazon Web Services SNS.
org.apache.beam.sdk.io.aws2.sqs - package org.apache.beam.sdk.io.aws2.sqs
Defines IO connectors for Amazon Web Services SQS.
org.apache.beam.sdk.io.azure.blobstore - package org.apache.beam.sdk.io.azure.blobstore
Defines IO connectors for Azure Blob Storage.
org.apache.beam.sdk.io.azure.cosmos - package org.apache.beam.sdk.io.azure.cosmos
Defines IO connectors for Azure Cosmos DB.
org.apache.beam.sdk.io.azure.options - package org.apache.beam.sdk.io.azure.options
Defines PipelineOptions for configuring pipeline execution for Microsoft Azure Blob Storage.
org.apache.beam.sdk.io.cassandra - package org.apache.beam.sdk.io.cassandra
Transforms for reading and writing from/to Apache Cassandra.
org.apache.beam.sdk.io.cdap - package org.apache.beam.sdk.io.cdap
Transforms for reading and writing from CDAP.
org.apache.beam.sdk.io.cdap.context - package org.apache.beam.sdk.io.cdap.context
Context for CDAP classes.
org.apache.beam.sdk.io.clickhouse - package org.apache.beam.sdk.io.clickhouse
Transform for writing to ClickHouse.
org.apache.beam.sdk.io.contextualtextio - package org.apache.beam.sdk.io.contextualtextio
Transforms for reading from files with contextual information.
org.apache.beam.sdk.io.csv - package org.apache.beam.sdk.io.csv
Transforms for reading and writing CSV files.
org.apache.beam.sdk.io.csv.providers - package org.apache.beam.sdk.io.csv.providers
Transforms for reading and writing CSV files.
org.apache.beam.sdk.io.elasticsearch - package org.apache.beam.sdk.io.elasticsearch
Common test utilities for Elasticsearch.
org.apache.beam.sdk.io.fileschematransform - package org.apache.beam.sdk.io.fileschematransform
Defines transforms for File reading and writing support with Schema Transform.
org.apache.beam.sdk.io.fs - package org.apache.beam.sdk.io.fs
Apache Beam FileSystem interfaces and their default implementations.
org.apache.beam.sdk.io.gcp.bigquery - package org.apache.beam.sdk.io.gcp.bigquery
Defines transforms for reading and writing from Google BigQuery.
org.apache.beam.sdk.io.gcp.bigquery.providers - package org.apache.beam.sdk.io.gcp.bigquery.providers
Defines SchemaTransformProviders for reading and writing from Google BigQuery.
org.apache.beam.sdk.io.gcp.bigtable - package org.apache.beam.sdk.io.gcp.bigtable
Defines transforms for reading and writing from Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams - package org.apache.beam.sdk.io.gcp.bigtable.changestreams
Change stream for Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.action - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
Business logic to process change stream for Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object for change stream for Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
DoFn and SDF definitions to process Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder
Encoders for writing and reading from Metadata Table for Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
Classes related to estimating the throughput of the change streams SDFs.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.model - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
User models for the Google Cloud Bigtable change stream API.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
Partition reconciler for Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
Custom RestrictionTracker for Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.common - package org.apache.beam.sdk.io.gcp.common
Defines common Google Cloud Platform IO support classes.
org.apache.beam.sdk.io.gcp.datastore - package org.apache.beam.sdk.io.gcp.datastore
Provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
org.apache.beam.sdk.io.gcp.firestore - package org.apache.beam.sdk.io.gcp.firestore
Provides an API for reading from and writing to Google Cloud Firestore.
org.apache.beam.sdk.io.gcp.healthcare - package org.apache.beam.sdk.io.gcp.healthcare
Provides an API for reading from and writing to the Google Cloud Healthcare API.
org.apache.beam.sdk.io.gcp.pubsub - package org.apache.beam.sdk.io.gcp.pubsub
Defines transforms for reading and writing from Google Cloud Pub/Sub.
org.apache.beam.sdk.io.gcp.pubsublite - package org.apache.beam.sdk.io.gcp.pubsublite
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
org.apache.beam.sdk.io.gcp.pubsublite.internal - package org.apache.beam.sdk.io.gcp.pubsublite.internal
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
org.apache.beam.sdk.io.gcp.spanner - package org.apache.beam.sdk.io.gcp.spanner
Provides an API for reading from and writing to Google Cloud Spanner.
org.apache.beam.sdk.io.gcp.spanner.changestreams - package org.apache.beam.sdk.io.gcp.spanner.changestreams
Provides an API for reading change stream data from Google Cloud Spanner.
org.apache.beam.sdk.io.gcp.spanner.changestreams.action - package org.apache.beam.sdk.io.gcp.spanner.changestreams.action
Action processors for each of the types of Change Stream records received.
org.apache.beam.sdk.io.gcp.spanner.changestreams.dao - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Database Access Objects for querying change streams and modifying the Connector's metadata tables.
org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
DoFn and SDF definitions to process Google Cloud Spanner Change Streams.
org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder - package org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
User model for the Spanner change stream API.
org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator - package org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
Classes related to estimating the throughput of the change streams SDFs.
org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper - package org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
Mapping related functionality, such as from ResultSets to Change Stream models.
org.apache.beam.sdk.io.gcp.spanner.changestreams.model - package org.apache.beam.sdk.io.gcp.spanner.changestreams.model
User models for the Spanner change stream API.
org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction - package org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
Custom restriction tracker related classes.
org.apache.beam.sdk.io.gcp.testing - package org.apache.beam.sdk.io.gcp.testing
Defines utilities for unit testing Google Cloud Platform components of Apache Beam pipelines.
org.apache.beam.sdk.io.googleads - package org.apache.beam.sdk.io.googleads
Defines transforms for reading from Google Ads.
org.apache.beam.sdk.io.hadoop - package org.apache.beam.sdk.io.hadoop
Classes shared by Hadoop-based IOs.
org.apache.beam.sdk.io.hadoop.format - package org.apache.beam.sdk.io.hadoop.format
Defines transforms for writing to data sinks that implement HadoopFormatIO.
org.apache.beam.sdk.io.hbase - package org.apache.beam.sdk.io.hbase
Transforms for reading and writing from/to Apache HBase.
org.apache.beam.sdk.io.hcatalog - package org.apache.beam.sdk.io.hcatalog
Transforms for reading and writing using HCatalog.
org.apache.beam.sdk.io.hdfs - package org.apache.beam.sdk.io.hdfs
FileSystem implementation for any Hadoop FileSystem.
org.apache.beam.sdk.io.iceberg - package org.apache.beam.sdk.io.iceberg
Iceberg connectors.
org.apache.beam.sdk.io.influxdb - package org.apache.beam.sdk.io.influxdb
Transforms for reading and writing from/to InfluxDB.
org.apache.beam.sdk.io.jdbc - package org.apache.beam.sdk.io.jdbc
Transforms for reading and writing from JDBC.
org.apache.beam.sdk.io.jms - package org.apache.beam.sdk.io.jms
Transforms for reading and writing from JMS (Java Messaging Service).
org.apache.beam.sdk.io.json - package org.apache.beam.sdk.io.json
Transforms for reading and writing JSON files.
org.apache.beam.sdk.io.json.providers - package org.apache.beam.sdk.io.json.providers
Transforms for reading and writing JSON files.
org.apache.beam.sdk.io.kafka - package org.apache.beam.sdk.io.kafka
Transforms for reading and writing from Apache Kafka.
org.apache.beam.sdk.io.kafka.serialization - package org.apache.beam.sdk.io.kafka.serialization
Kafka serializers and deserializers.
org.apache.beam.sdk.io.kafka.upgrade - package org.apache.beam.sdk.io.kafka.upgrade
A library to support upgrading Kafka transforms without upgrading the pipeline.
org.apache.beam.sdk.io.kinesis - package org.apache.beam.sdk.io.kinesis
Transforms for reading and writing from Amazon Kinesis.
org.apache.beam.sdk.io.kinesis.serde - package org.apache.beam.sdk.io.kinesis.serde
Defines serializers / deserializers for AWS.
org.apache.beam.sdk.io.kudu - package org.apache.beam.sdk.io.kudu
Transforms for reading and writing from/to Apache Kudu.
org.apache.beam.sdk.io.mongodb - package org.apache.beam.sdk.io.mongodb
Transforms for reading and writing from MongoDB.
org.apache.beam.sdk.io.mqtt - package org.apache.beam.sdk.io.mqtt
Transforms for reading and writing from MQTT.
org.apache.beam.sdk.io.neo4j - package org.apache.beam.sdk.io.neo4j
Transforms for reading from and writing to Neo4j.
org.apache.beam.sdk.io.parquet - package org.apache.beam.sdk.io.parquet
Transforms for reading and writing from Parquet.
org.apache.beam.sdk.io.pulsar - package org.apache.beam.sdk.io.pulsar
Transforms for reading and writing from Apache Pulsar.
org.apache.beam.sdk.io.rabbitmq - package org.apache.beam.sdk.io.rabbitmq
Transforms for reading and writing from RabbitMQ.
org.apache.beam.sdk.io.range - package org.apache.beam.sdk.io.range
Provides thread-safe helpers for implementing dynamic work rebalancing in position-based bounded sources.
org.apache.beam.sdk.io.redis - package org.apache.beam.sdk.io.redis
Transforms for reading and writing from Redis.
org.apache.beam.sdk.io.singlestore - package org.apache.beam.sdk.io.singlestore
Transforms for reading and writing from SingleStoreDB.
org.apache.beam.sdk.io.singlestore.schematransform - package org.apache.beam.sdk.io.singlestore.schematransform
SingleStoreIO SchemaTransforms.
org.apache.beam.sdk.io.snowflake - package org.apache.beam.sdk.io.snowflake
Snowflake IO transforms.
org.apache.beam.sdk.io.snowflake.crosslanguage - package org.apache.beam.sdk.io.snowflake.crosslanguage
Cross-language for SnowflakeIO.
org.apache.beam.sdk.io.snowflake.data - package org.apache.beam.sdk.io.snowflake.data
Snowflake IO data types.
org.apache.beam.sdk.io.snowflake.data.datetime - package org.apache.beam.sdk.io.snowflake.data.datetime
Snowflake IO date/time types.
org.apache.beam.sdk.io.snowflake.data.geospatial - package org.apache.beam.sdk.io.snowflake.data.geospatial
Snowflake IO geospatial types.
org.apache.beam.sdk.io.snowflake.data.logical - package org.apache.beam.sdk.io.snowflake.data.logical
Snowflake IO logical types.
org.apache.beam.sdk.io.snowflake.data.numeric - package org.apache.beam.sdk.io.snowflake.data.numeric
Snowflake IO numeric types.
org.apache.beam.sdk.io.snowflake.data.structured - package org.apache.beam.sdk.io.snowflake.data.structured
Snowflake IO structured types.
org.apache.beam.sdk.io.snowflake.data.text - package org.apache.beam.sdk.io.snowflake.data.text
Snowflake IO text types.
org.apache.beam.sdk.io.snowflake.enums - package org.apache.beam.sdk.io.snowflake.enums
Snowflake IO data types.
org.apache.beam.sdk.io.snowflake.services - package org.apache.beam.sdk.io.snowflake.services
Snowflake IO services and POJOs.
org.apache.beam.sdk.io.solace - package org.apache.beam.sdk.io.solace
Solace IO connector.
org.apache.beam.sdk.io.solace.broker - package org.apache.beam.sdk.io.solace.broker
Solace IO broker-related classes.
org.apache.beam.sdk.io.solace.data - package org.apache.beam.sdk.io.solace.data
Solace IO connector - data-related classes.
org.apache.beam.sdk.io.solace.read - package org.apache.beam.sdk.io.solace.read
Solace IO connector - read connector classes.
org.apache.beam.sdk.io.solace.write - package org.apache.beam.sdk.io.solace.write
SolaceIO Write connector.
org.apache.beam.sdk.io.solr - package org.apache.beam.sdk.io.solr
Transforms for reading and writing from/to Solr.
org.apache.beam.sdk.io.sparkreceiver - package org.apache.beam.sdk.io.sparkreceiver
Transforms for reading and writing from streaming CDAP plugins.
org.apache.beam.sdk.io.splunk - package org.apache.beam.sdk.io.splunk
Transforms for writing events to Splunk's HTTP Event Collector (HEC).
org.apache.beam.sdk.io.thrift - package org.apache.beam.sdk.io.thrift
Transforms for reading and writing to Thrift files.
org.apache.beam.sdk.io.tika - package org.apache.beam.sdk.io.tika
Transform for reading and parsing files with Apache Tika.
org.apache.beam.sdk.io.xml - package org.apache.beam.sdk.io.xml
Transforms for reading and writing XML files.
org.apache.beam.sdk.jmh.io - package org.apache.beam.sdk.jmh.io
Benchmarks for IO.
org.apache.beam.sdk.jmh.schemas - package org.apache.beam.sdk.jmh.schemas
Benchmarks for schemas.
org.apache.beam.sdk.jmh.util - package org.apache.beam.sdk.jmh.util
Benchmarks for core SDK utility classes.
org.apache.beam.sdk.managed - package org.apache.beam.sdk.managed
Managed reads and writes.
org.apache.beam.sdk.managed.testing - package org.apache.beam.sdk.managed.testing
Test transform for Managed API.
org.apache.beam.sdk.metrics - package org.apache.beam.sdk.metrics
Metrics allow exporting information about the execution of a pipeline.
org.apache.beam.sdk.options - package org.apache.beam.sdk.options
Defines PipelineOptions for configuring pipeline execution.
org.apache.beam.sdk.providers - package org.apache.beam.sdk.providers
Defines SchemaTransformProviders for transforms in the core module.
org.apache.beam.sdk.schemas - package org.apache.beam.sdk.schemas
Defines Schema and other classes for representing schema'd data in a Pipeline.
org.apache.beam.sdk.schemas.annotations - package org.apache.beam.sdk.schemas.annotations
Defines Schema and other classes for representing schema'd data in a Pipeline.
org.apache.beam.sdk.schemas.io - package org.apache.beam.sdk.schemas.io
Provides abstractions for schema-aware IOs.
org.apache.beam.sdk.schemas.io.payloads - package org.apache.beam.sdk.schemas.io.payloads
Provides abstractions for schema-aware IOs.
org.apache.beam.sdk.schemas.logicaltypes - package org.apache.beam.sdk.schemas.logicaltypes
A set of common LogicalTypes for use with schemas.
org.apache.beam.sdk.schemas.parser - package org.apache.beam.sdk.schemas.parser
Defines utilities for dealing with schemas.
org.apache.beam.sdk.schemas.parser.generated - package org.apache.beam.sdk.schemas.parser.generated
Defines utilities for dealing with schemas.
org.apache.beam.sdk.schemas.transforms - package org.apache.beam.sdk.schemas.transforms
Defines transforms that work on PCollections with schemas.
org.apache.beam.sdk.schemas.transforms.providers - package org.apache.beam.sdk.schemas.transforms.providers
Defines transforms that work on PCollections with schemas.
org.apache.beam.sdk.schemas.utils - package org.apache.beam.sdk.schemas.utils
Defines utilities for dealing with schemas.
org.apache.beam.sdk.state - package org.apache.beam.sdk.state
Classes and interfaces for interacting with state.
org.apache.beam.sdk.testing - package org.apache.beam.sdk.testing
Defines utilities for unit testing Apache Beam pipelines.
org.apache.beam.sdk.transforms - package org.apache.beam.sdk.transforms
Defines PTransforms for transforming data in a pipeline.
org.apache.beam.sdk.transforms.display - package org.apache.beam.sdk.transforms.display
Defines HasDisplayData for annotating components which provide display data used within UIs and diagnostic tools.
org.apache.beam.sdk.transforms.errorhandling - package org.apache.beam.sdk.transforms.errorhandling
Provides utilities for handling errors in Pipelines.
org.apache.beam.sdk.transforms.join - package org.apache.beam.sdk.transforms.join
Defines the CoGroupByKey transform for joining multiple PCollections.
org.apache.beam.sdk.transforms.resourcehints - package org.apache.beam.sdk.transforms.resourcehints
Defines ResourceHints for configuring pipeline execution.
org.apache.beam.sdk.transforms.splittabledofn - package org.apache.beam.sdk.transforms.splittabledofn
Defines utilities related to splittable DoFn.
org.apache.beam.sdk.transforms.windowing - package org.apache.beam.sdk.transforms.windowing
Defines the Window transform for dividing the elements in a PCollection into windows, and the Trigger for controlling when those elements are output.
org.apache.beam.sdk.transformservice.launcher - package org.apache.beam.sdk.transformservice.launcher
A library that can be used to start up a Docker Compose-based Beam transform service.
org.apache.beam.sdk.values - package org.apache.beam.sdk.values
Defines PCollection and other classes for representing data in a Pipeline.
ORPHANED_NEW_PARTITION_CLEANED_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of orphaned new partitions cleaned up.
OrphanedMetadataCleaner - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
 
OrphanedMetadataCleaner() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
 
out() - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
Prints elements from the PCollection to the console.
out(int) - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
Prints num elements from the PCollection to stdout.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
The tag for the main output of FHIR resources.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
The tag for the main output of FHIR Resources from a search.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
The tag for the main output of FHIR Resources from a GetPatientEverything request.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
The tag for the main output of HL7v2 read responses.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
The tag for the main output of HL7v2 Messages.
outbound(DataStreams.OutputChunkConsumer<ByteString>) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
Converts a single element delimited OutputStream into multiple ByteStrings.
outbound(DataStreams.OutputChunkConsumer<ByteString>, int) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
Converts a single element delimited OutputStream into multiple ByteStrings using the specified maximum chunk size.
OutboundObserverFactory - Class in org.apache.beam.sdk.fn.stream
Creates factories which determine an underlying StreamObserver implementation to use to interact with Fn execution APIs.
OutboundObserverFactory() - Constructor for class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
OutboundObserverFactory.BasicFactory<ReqT,RespT> - Interface in org.apache.beam.sdk.fn.stream
Creates an outbound observer for the given inbound observer.
outboundObserverFor(StreamObserver<ReqT>) - Method in interface org.apache.beam.sdk.fn.stream.OutboundObserverFactory.BasicFactory
 
outboundObserverFor(OutboundObserverFactory.BasicFactory<ReqT, RespT>, StreamObserver<ReqT>) - Method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
Creates an outbound observer for the given inbound observer by potentially inserting hooks into the inbound and outbound observers.
OUTER - Static variable in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
The outer context: the value being encoded or decoded takes up the remainder of the record/stream contents.
OutgoingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
OUTPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
output() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
 
output(T) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
Output the object.
output(T, Instant) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
Output the object using the specified timestamp.
output(OutputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
Adds the given element to the main output PCollection at the given timestamp in the given window.
output(TupleTag<T>, T, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
Adds the given element to the output PCollection with the given tag at the given timestamp in the given window.
output(T) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
 
output(OutputT) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the main output PCollection.
output(TupleTag<T>, T) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the output PCollection with the given tag.
output() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
 
OUTPUT_DIR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
OUTPUT_FORMAT_CLASS_ATTR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
OUTPUT_INFO - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
OUTPUT_KEY_CLASS - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
OUTPUT_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
 
OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
 
OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
OUTPUT_ROWS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
OUTPUT_SCHEMA - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
 
OUTPUT_SCHEMA - Static variable in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
OUTPUT_VALUE_CLASS - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
outputCollectionNames() - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformProvider
The expected PCollectionRowTuple output tags.
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryFileLoadsSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.iceberg.IcebergReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
outputCollectionNames() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
outputCollectionNames() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.FlattenTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider
 
outputCollectionNames() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
Returns the output collection names of this transform.
outputColumnMap - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
 
outputFormatProvider - Variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
This should be set after the SubmitterLifecycle.prepareRun(Object) call, passing this context object as a parameter.
outputOf(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Returns a type descriptor for the output of the given ProcessFunction, subject to Java type erasure: may contain unresolved type variables if the type was erased.
outputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Binary compatibility adapter for TypeDescriptors.outputOf(ProcessFunction).
outputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
Like TypeDescriptors.outputOf(ProcessFunction) but for Contextful.Fn.
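As a brief illustration (toLength is a hypothetical function), TypeDescriptors.outputOf can recover the output type of a function object; an anonymous class is used here because a lambda's type parameters may be erased and would then appear as unresolved type variables:

    SerializableFunction<String, Integer> toLength =
        new SerializableFunction<String, Integer>() {
          @Override
          public Integer apply(String input) {
            return input.length();
          }
        };
    // Yields a TypeDescriptor for Integer.
    TypeDescriptor<Integer> outputType = TypeDescriptors.outputOf(toLength);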
OutputRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
OutputReceiverFactory - Interface in org.apache.beam.runners.fnexecution.control
A factory that can create output receivers during an executable stage.
OutputReference - Class in org.apache.beam.runners.dataflow.util
A representation used by Steps to reference the output of other Steps.
OutputReference(String, String) - Constructor for class org.apache.beam.runners.dataflow.util.OutputReference
 
outputRuntimeOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptions
Returns a map of properties which correspond to ValueProvider.RuntimeValueProvider, keyed by the property name.
outputSchema() - Method in class org.apache.beam.sdk.schemas.transforms.Cast
 
outputSchemaCoder - Variable in class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
 
OutputTagFilter<OutputT,InputT> - Class in org.apache.beam.runners.twister2.translators.functions
Output tag filter.
OutputTagFilter() - Constructor for class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
 
OutputTagFilter(int) - Constructor for class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
 
outputWindowedValue(T, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
 
outputWindowedValue(OutputT, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the main output PCollection, with the given windowing metadata.
outputWindowedValue(TupleTag<T>, T, Instant, Collection<? extends BoundedWindow>, PaneInfo) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the main output PCollection, with the given windowing metadata.
outputWithTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
 
outputWithTimestamp(OutputT, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the main output PCollection, with the given timestamp.
outputWithTimestamp(TupleTag<T>, T, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the specified output PCollection, with the given timestamp.
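A minimal sketch (the events PCollection, element type, and tag names are hypothetical) of emitting to the main output and an additional tagged output from a DoFn, including an explicit timestamp; TupleTag, TupleTagList, PCollectionTuple, and KV come from org.apache.beam.sdk.values, and Instant from org.joda.time:

    final TupleTag<String> mainTag = new TupleTag<String>() {};
    final TupleTag<String> auditTag = new TupleTag<String>() {};

    PCollectionTuple results =
        events.apply(
            ParDo.of(
                    new DoFn<KV<String, Long>, String>() {
                      @ProcessElement
                      public void process(
                          @Element KV<String, Long> e,
                          @Timestamp Instant ts,
                          MultiOutputReceiver out) {
                        // Main output, keeping the element's original timestamp.
                        out.get(mainTag).outputWithTimestamp(e.getKey(), ts);
                        // Additional tagged output.
                        out.get(auditTag).output(e.getKey() + ":" + e.getValue());
                      }
                    })
                .withOutputTags(mainTag, TupleTagList.of(auditTag)));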
overlaps(ByteKeyRange) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns true if the specified ByteKeyRange overlaps this range.
overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.coders.RowCoder
Override encoding positions for the given schema.
overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
 
overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
Override encoding positions for the given schema.

P

PackageUtil - Class in org.apache.beam.runners.dataflow.util
Helper routines for packages.
PackageUtil.StagedFile - Class in org.apache.beam.runners.dataflow.util
 
pane() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns information about the pane within this window into which the input element has been assigned.
PaneInfo - Class in org.apache.beam.sdk.transforms.windowing
Provides information about the pane an element belongs to.
PaneInfo.PaneInfoCoder - Class in org.apache.beam.sdk.transforms.windowing
A Coder for encoding PaneInfo instances.
PaneInfo.Timing - Enum in org.apache.beam.sdk.transforms.windowing
Enumerates the possibilities for the timing of this pane firing relative to the input and output watermarks for its computation.
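A short sketch (inside a hypothetical DoFn whose input and output types are String) of inspecting the pane an element belongs to via DoFn.ProcessContext.pane():

    @ProcessElement
    public void processElement(ProcessContext c) {
      PaneInfo pane = c.pane();
      if (pane.getTiming() == PaneInfo.Timing.LATE) {
        // Element arrived after the on-time pane for its window had already fired.
      }
      c.output(c.element());
    }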
paneInfoFromBytes(byte[]) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
paneInfoToBytes(PaneInfo) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
PARALLEL_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ParameterListBuilder() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
 
parameters - Variable in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
Types of parameters for the function call.
Params() - Constructor for class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
Construct a default Params object.
ParamsCoder() - Constructor for class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
 
ParDo - Class in org.apache.beam.sdk.transforms
ParDo is the core element-wise transform in Apache Beam, invoking a user-specified function on each of the elements of the input PCollection to produce zero or more output elements, all of which are collected into the output PCollection.
ParDo() - Constructor for class org.apache.beam.sdk.transforms.ParDo
 
ParDo.MultiOutput<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A PTransform that, when applied to a PCollection<InputT>, invokes a user-specified DoFn<InputT, OutputT> on all its elements, which can emit elements to any of the PTransform's output PCollections, which are bundled into a result PCollectionTuple.
ParDo.SingleOutput<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A PTransform that, when applied to a PCollection<InputT>, invokes a user-specified DoFn<InputT, OutputT> on all its elements, with all its outputs collected into an output PCollection<OutputT>.
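A minimal ParDo.SingleOutput sketch (the words PCollection is hypothetical) mapping each input element to one output element:

    PCollection<Integer> lengths =
        words.apply(
            ParDo.of(
                new DoFn<String, Integer>() {
                  @ProcessElement
                  public void processElement(@Element String word, OutputReceiver<Integer> out) {
                    out.output(word.length());
                  }
                }));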
ParDoMultiOutputTranslatorBatch<InputT,OutputT> - Class in org.apache.beam.runners.twister2.translators.batch
ParDo translator.
ParDoMultiOutputTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.ParDoMultiOutputTranslatorBatch
 
ParDoMultiOverrideFactory<InputT,OutputT> - Class in org.apache.beam.runners.direct
A PTransformOverrideFactory that provides overrides for applications of a ParDo in the direct runner.
ParDoMultiOverrideFactory() - Constructor for class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
 
ParDoP<InputT,OutputT> - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for Beam's ParDo primitive (when no user-state is being used).
ParDoP.Supplier<InputT,OutputT> - Class in org.apache.beam.runners.jet.processors
Jet Processor supplier that will provide instances of ParDoP.
parent() - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
Type of parent node in a tree.
PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
The empty set representing the initial partition parent tokens.
ParquetConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration
 
parquetConfigurationBuilder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
ParquetIO - Class in org.apache.beam.sdk.io.parquet
IO to read and write Parquet files.
ParquetIO.Parse<T> - Class in org.apache.beam.sdk.io.parquet
ParquetIO.ParseFiles<T> - Class in org.apache.beam.sdk.io.parquet
ParquetIO.Read - Class in org.apache.beam.sdk.io.parquet
Implementation of ParquetIO.read(Schema).
ParquetIO.ReadFiles - Class in org.apache.beam.sdk.io.parquet
Implementation of ParquetIO.readFiles(Schema).
ParquetIO.ReadFiles.BlockTracker - Class in org.apache.beam.sdk.io.parquet
 
ParquetIO.Sink - Class in org.apache.beam.sdk.io.parquet
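A minimal read/write sketch for ParquetIO, assuming an Avro Schema named schema and illustrative file paths; ParquetIO.sink(...) is combined with FileIO for writing:

    // Read Parquet files matching a pattern into GenericRecords.
    PCollection<GenericRecord> records =
        pipeline.apply(ParquetIO.read(schema).from("/path/to/input-*.parquet"));

    // Write GenericRecords back out as Parquet via FileIO.
    records.apply(
        FileIO.<GenericRecord>write()
            .via(ParquetIO.sink(schema))
            .to("/path/to/output/"));
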
ParquetReadSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
 
ParquetReadSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.ParquetReadSchemaTransformFormatProvider
 
ParquetTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.parquet
TableProvider for ParquetIO for consumption by Beam SQL.
ParquetTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
 
ParquetWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
ParquetWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.ParquetWriteSchemaTransformFormatProvider
 
Parse() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
 
parse(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
Parse input SQL query, and return a SqlNode as grammar tree.
parse(String) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner
Parse input SQL query, and return a SqlNode as grammar tree.
parse(String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
parse(String) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
Parse string with ClickHouse type to TableSchema.ColumnType.
parse(String) - Static method in enum org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
 
parse(Class<T>, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
Instantiates a CsvIOParse for parsing CSV string records into a custom Schema-mapped Class<T> from the records' assumed CSVFormat.
parse(GridFSDBFile, MongoDbGridFSIO.ParserCallback<T>) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Parser
 
Parse() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
 
parse() - Static method in class org.apache.beam.sdk.io.tika.TikaIO
Parses files matching a given filepattern.
Parse() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO.Parse
 
parse(String) - Static method in class org.apache.beam.sdk.schemas.parser.FieldAccessDescriptorParser
 
parse(T) - Method in class org.apache.beam.sdk.testing.JsonMatcher
 
ParseAll() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
 
parseAllGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Deprecated.
You can achieve the functionality of AvroIO.parseAllGenericRecords(SerializableFunction) using FileIO matching plus AvroIO.parseFilesGenericRecords(SerializableFunction). This is the preferred method to make composition explicit. AvroIO.ParseAll will not receive upgrades and will be removed in a future version of Beam.
parseArgs(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
parseDate(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseDateToValue(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseDefaultExpression(TableSchema.ColumnType, String) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
Get default value of a column based on expression.
parseDicomWebpath(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
 
ParsedMetricName() - Constructor for class org.apache.beam.sdk.metrics.LabeledMetricNameUtils.ParsedMetricName
 
ParseException - Exception in org.apache.beam.sdk.extensions.sql.impl
Exception thrown when Beam SQL is unable to parse the statement.
ParseException(Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.impl.ParseException
 
ParseException(String, Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.impl.ParseException
 
ParseFiles() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
 
ParseFiles() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
 
parseFiles() - Static method in class org.apache.beam.sdk.io.tika.TikaIO
Parses files in a PCollection of FileIO.ReadableFile.
ParseFiles() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
parseFilesGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
parseFilesGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
Reads GenericRecord from Parquet files and converts to user defined type using provided parseFn.
parseGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Reads Avro file(s) containing records of an unspecified schema, converting each record to a custom type.
parseGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
Reads GenericRecord from a Parquet file (or multiple Parquet files matching the pattern) and converts to user defined type using provided parseFn.
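A sketch of the parse path above, assuming each record has a "name" field; depending on the output type, a coder may need to be set explicitly on the resulting PCollection:

    PCollection<String> names =
        pipeline.apply(
            ParquetIO.parseGenericRecords(
                    (SerializableFunction<GenericRecord, String>)
                        record -> record.get("name").toString())
                .from("/path/to/input-*.parquet"));
    // Coder inference can fail for lambdas, so set one explicitly.
    names.setCoder(StringUtf8Coder.of());
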
parseInitialContinuationTokens(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Return a list of initial token from a row.
ParseJsons<OutputT> - Class in org.apache.beam.sdk.extensions.jackson
PTransform for parsing JSON Strings.
ParseJsons.ParseJsonsWithFailures<FailureT> - Class in org.apache.beam.sdk.extensions.jackson
A PTransform that adds exception handling to ParseJsons.
parseLockUuid(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Returns the uuid from a row.
parseMetricName(String) - Static method in class org.apache.beam.sdk.metrics.LabeledMetricNameUtils
Parse a 'metric name' String that was created with 'MetricNameBuilder'.
ParsePayloadAsPubsubMessageProto() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
 
parseProperties(String) - Static method in class org.apache.beam.sdk.extensions.sql.TableUtils
 
ParsePubsubMessageProtoAsPayload() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
 
ParsePubsubMessageProtoAsPayloadFromWindowedValue() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue
 
parseQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
parseQuery(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
ParseResult - Class in org.apache.beam.sdk.io.tika
The result of parsing a single file with Tika: contains the file's location, metadata, extracted text, and optionally an error.
ParseResult() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
 
parseRows(Schema, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
Instantiates a CsvIOParse for parsing CSV string records into Rows from the records' assumed CsvFormat and expected Schema.
parseTableSpec(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Parse a table specification in the form "[project_id]:[dataset_id].[table_id]" or "[project_id].[dataset_id].[table_id]" or "[dataset_id].[table_id]".
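A short sketch of parsing a table specification (project, dataset, and table names are illustrative):

    TableReference table =
        BigQueryHelpers.parseTableSpec("my-project:my_dataset.my_table");
    // table.getProjectId() == "my-project"
    // table.getDatasetId() == "my_dataset"
    // table.getTableId()   == "my_table"
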
parseTableUrn(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
parseTime(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseTimestampAsMsSinceEpoch(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return timestamp as ms-since-unix-epoch corresponding to timestamp.
parseTimestampWithLocalTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseTimestampWithoutTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseTimestampWithTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseTimestampWithTZToValue(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseTimestampWithUTCTimeZone(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseTimeToValue(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
 
parseTokenFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Read the continuation token cell of a row from ReadRows.
parseWatermarkFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Read the watermark cell of a row from ReadRows.
parseWatermarkLastUpdatedFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Return the timestamp (the time it was updated) of the watermark cell.
ParseWithError() - Constructor for class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
 
Partition<T> - Class in org.apache.beam.sdk.transforms
Partition takes a PCollection<T> and a PartitionFn, uses the PartitionFn to split the elements of the input PCollection into N partitions, and returns a PCollectionList<T> that bundles N PCollection<T>s containing the split elements.
Partition.PartitionFn<T> - Interface in org.apache.beam.sdk.transforms
A function object that chooses an output partition for an element.
Partition.PartitionWithSideInputsFn<T> - Interface in org.apache.beam.sdk.transforms
A function object that chooses an output partition for an element.
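A minimal sketch of Partition, assuming a PCollection<Integer> named numbers; the partition function shown is illustrative:

    PCollectionList<Integer> parts =
        numbers.apply(
            Partition.of(
                3,
                (Partition.PartitionFn<Integer>)
                    (element, numPartitions) -> Math.abs(element.hashCode()) % numPartitions));
    PCollection<Integer> firstPartition = parts.get(0);
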
PARTITION_CREATED_TO_SCHEDULED_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Time in milliseconds that a partition took to transition from PartitionMetadata.State.CREATED to PartitionMetadata.State.SCHEDULED.
PARTITION_MERGE_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partition merges identified during the execution of the Connector.
PARTITION_RECONCILED_WITH_TOKEN_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partitions reconciled with continuation tokens.
PARTITION_RECONCILED_WITHOUT_TOKEN_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partitions reconciled without continuation tokens.
PARTITION_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of partitions identified during the execution of the Connector.
PARTITION_RECORD_MERGE_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of partition merges identified during the execution of the Connector.
PARTITION_RECORD_SPLIT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of partition splits / moves identified during the execution of the Connector.
PARTITION_SCHEDULED_TO_RUNNING_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Time in milliseconds that a partition took to transition from PartitionMetadata.State.SCHEDULED to PartitionMetadata.State.RUNNING.
PARTITION_SPLIT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partition splits / moves identified during the execution of the Connector.
PARTITION_STREAM_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of active partitions being streamed.
PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
The token of the initial partition.
PartitionContext() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
 
partitioner() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
PARTITIONER_CLASS_ATTR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
partitionFor(T, int) - Method in interface org.apache.beam.sdk.transforms.Partition.PartitionFn
Chooses the partition into which to put the given element.
partitionFor(T, int, Contextful.Fn.Context) - Method in interface org.apache.beam.sdk.transforms.Partition.PartitionWithSideInputsFn
Chooses the partition into which to put the given element.
PartitioningWindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that places each value into exactly one window based on its timestamp and never merges windows.
PartitioningWindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
PartitionMark(String, int, long, long) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
PartitionMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Model for the partition metadata database table used in the Connector.
PartitionMetadata(String, HashSet<String>, Timestamp, Timestamp, long, PartitionMetadata.State, Timestamp, Timestamp, Timestamp, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
PartitionMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Partition metadata builder for better user experience.
PartitionMetadata.State - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
The states a partition can be in within the system: CREATED: the partition has been created, but no query has been run against it yet.
PartitionMetadataAdminDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Data access object for creating and dropping the partition metadata table.
PartitionMetadataDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Data access object for the Connector metadata tables.
PartitionMetadataDao.InTransactionContext - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Represents the execution of a read / write transaction in Cloud Spanner.
PartitionMetadataDao.TransactionResult<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Represents a result from executing a Cloud Spanner read / write transaction.
partitionMetadataMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
Creates and returns a single instance of a mapper class capable of transforming a Struct into a PartitionMetadata class.
PartitionMetadataMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
This class is responsible for transforming a Struct to a PartitionMetadata.
PartitionMetadataTableNames - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Configuration for a partition metadata table.
PartitionMetadataTableNames(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
 
partitionQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for PartitionQueryRequest operations.
PartitionReconciler - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
There can be a race when many splits and merges happen to a single partition in quick succession.
PartitionReconciler(MetadataTableDao, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
 
PartitionRecord - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
Output result of DetectNewPartitionsDoFn containing information required to stream a partition.
PartitionRecord(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant, List<NewPartition>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
PartitionRecord(Range.ByteStringRange, Instant, Instant, List<NewPartition>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
PartitionRecord(Range.ByteStringRange, Instant, String, Instant, List<NewPartition>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
PartitionRecord(Range.ByteStringRange, List<ChangeStreamContinuationToken>, String, Instant, List<NewPartition>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
partitionsToString(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Convert partitions to a string for debugging.
PAssert - Class in org.apache.beam.sdk.testing
An assertion on the contents of a PCollection incorporated into the pipeline.
PAssert.DefaultConcludeTransform - Class in org.apache.beam.sdk.testing
Default transform to check that a PAssert was successful.
PAssert.GroupThenAssert<T> - Class in org.apache.beam.sdk.testing
A transform that applies an assertion-checking function over iterables of ActualT to the entirety of the contents of its input.
PAssert.GroupThenAssertForSingleton<T> - Class in org.apache.beam.sdk.testing
A transform that applies an assertion-checking function to the sole element of a PCollection.
PAssert.IterableAssert<T> - Interface in org.apache.beam.sdk.testing
Builder interface for assertions applicable to iterables and PCollection contents.
PAssert.MatcherCheckerFn<T> - Class in org.apache.beam.sdk.testing
Check that the passed-in matchers match the existing data.
PAssert.OneSideInputAssert<ActualT> - Class in org.apache.beam.sdk.testing
An assertion checker that takes a single PCollectionView<ActualT> and an assertion over ActualT, and checks it within a Beam pipeline.
PAssert.PAssertionSite - Class in org.apache.beam.sdk.testing
Track the place where an assertion is defined.
PAssert.PCollectionContentsAssert<T> - Class in org.apache.beam.sdk.testing
An PAssert.IterableAssert about the contents of a PCollection.
PAssert.PCollectionListContentsAssert<T> - Class in org.apache.beam.sdk.testing
An assert about the contents of each PCollection in the given PCollectionList.
PAssert.SingletonAssert<T> - Interface in org.apache.beam.sdk.testing
Builder interface for assertions applicable to a single value.
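A sketch of typical PAssert usage in a unit test, assuming a TestPipeline named p:

    PCollection<Integer> sum =
        p.apply(Create.of(1, 2, 3)).apply(Sum.integersGlobally());
    PAssert.thatSingleton(sum).isEqualTo(6);
    PAssert.that(p.apply("Words", Create.of("a", "b"))).containsInAnyOrder("b", "a");
    p.run().waitUntilFinish();
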
PassThroughLogicalType<T> - Class in org.apache.beam.sdk.schemas.logicaltypes
A base class for LogicalTypes that use the same Java type as the underlying base type.
PassThroughLogicalType(String, Schema.FieldType, Object, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
 
password(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService.Builder
 
password() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
The password to use for authentication.
password(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
Set Solace password.
password() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
The password to use for authentication.
password(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
Set Solace password.
passwordSecretName(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
The Secret Manager secret name where the password is stored.
passwordSecretName() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
passwordSecretVersion(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
Optional.
passwordSecretVersion() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
pastEndOfWindow() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark
Creates a trigger that fires when the watermark passes the end of the window.
pastFirstElementInPane() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Creates a trigger that fires when the current processing time passes the processing time at which this trigger saw the first element in a pane.
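A sketch combining the two triggers above, assuming a PCollection<KV<String, Long>> named input; window size and delay are illustrative:

    PCollection<KV<String, Long>> windowed =
        input.apply(
            Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardMinutes(5)))
                .triggering(
                    AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(
                            AfterProcessingTime.pastFirstElementInPane()
                                .plusDelayOf(Duration.standardSeconds(30))))
                .withAllowedLateness(Duration.ZERO)
                .discardingFiredPanes());
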
patchTableDescription(TableReference, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Patch BigQuery Table description.
patchTableDescription(TableReference, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
patchTableDescription(TableReference, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
path - Variable in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
 
path() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
 
path() - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
Field path from a root of a schema.
pathString - Variable in class org.apache.beam.sdk.jmh.io.TextSourceBenchmark.Data
 
PathValidator - Interface in org.apache.beam.sdk.extensions.gcp.storage
For internal use only; no backwards compatibility guarantees.
PathValidatorFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
 
PatientEverythingParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.PatientEverythingParameter
 
PatternCondition - Class in org.apache.beam.sdk.extensions.sql.impl.cep
PatternCondition stores the function to decide whether a row is a match of a single pattern.
payload() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
 
PAYLOAD_TOO_LARGE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
PayloadSerializer - Interface in org.apache.beam.sdk.schemas.io.payloads
 
PayloadSerializerKafkaTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
 
PayloadSerializerProvider - Interface in org.apache.beam.sdk.schemas.io.payloads
 
PayloadSerializers - Class in org.apache.beam.sdk.schemas.io.payloads
 
payloadToConfig(ExternalTransforms.ExternalConfigurationPayload, Class<ConfigT>) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionService
Attempt to create an instance of ConfigT from an ExternalTransforms.ExternalConfigurationPayload.
PBegin - Class in org.apache.beam.sdk.values
PBegin is the "input" to a root PTransform, such as Read or Create.
PBegin(Pipeline) - Constructor for class org.apache.beam.sdk.values.PBegin
Constructs a PBegin in the given Pipeline.
PCollection<T> - Class in org.apache.beam.sdk.values
A PCollection<T> is an immutable collection of values of type T.
PCollection.IsBounded - Enum in org.apache.beam.sdk.values
The enumeration of cases for whether a PCollection is bounded.
PCOLLECTION_NAME - Static variable in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
PCollectionContentsAssert(PCollection<T>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
PCollectionContentsAssert(PCollection<T>, PAssert.AssertionWindows, SimpleFunction<Iterable<ValueInSingleWindow<T>>, Iterable<T>>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
PCollectionList<T> - Class in org.apache.beam.sdk.values
A PCollectionList<T> is an immutable list of homogeneously typed PCollection<T>s.
PCollectionListContentsAssert(PCollectionList<T>) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionListContentsAssert
 
PCollectionRowTuple - Class in org.apache.beam.sdk.values
A PCollectionRowTuple is an immutable tuple of PCollections, "keyed" by a string tag.
pCollections() - Static method in class org.apache.beam.sdk.transforms.Flatten
Returns a PTransform that flattens a PCollectionList into a PCollection containing all the elements of all the PCollections in its input.
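A minimal sketch of Flatten.pCollections(), assuming two PCollection<String>s named first and second:

    PCollection<String> merged =
        PCollectionList.of(first).and(second).apply(Flatten.pCollections());
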
PCollectionTuple - Class in org.apache.beam.sdk.values
A PCollectionTuple is an immutable tuple of heterogeneously-typed PCollections, "keyed" by TupleTags.
PCollectionView<T> - Interface in org.apache.beam.sdk.values
A PCollectionView<T> is an immutable view of a PCollection as a value of type T that can be accessed as a side input to a ParDo transform.
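A sketch of the side-input pattern, assuming PCollections named totals (holding a single Long) and words; the names are illustrative:

    PCollectionView<Long> totalView = totals.apply(View.asSingleton());
    PCollection<String> annotated =
        words.apply(
            ParDo.of(
                    new DoFn<String, String>() {
                      @ProcessElement
                      public void processElement(ProcessContext c) {
                        // Read the side input value for the current window.
                        long total = c.sideInput(totalView);
                        c.output(c.element() + "/" + total);
                      }
                    })
                .withSideInputs(totalView));
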
PCollectionViews - Class in org.apache.beam.sdk.values
For internal use only; no backwards compatibility guarantees.
PCollectionViews() - Constructor for class org.apache.beam.sdk.values.PCollectionViews
 
PCollectionViews.HasDefaultValue<T> - Interface in org.apache.beam.sdk.values
 
PCollectionViews.InMemoryListFromMultimapViewFn<T> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt a multimap materialization to an in-memory List<T>.
PCollectionViews.InMemoryListViewFn<T> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt an iterable materialization to an in-memory List<T>.
PCollectionViews.InMemoryMapFromVoidKeyViewFn<K,V> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt a multimap materialization to an in-memory Map<K, V>.
PCollectionViews.InMemoryMapViewFn<K,V> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt an iterable materialization to an in-memory Map<K, V>.
PCollectionViews.InMemoryMultimapFromVoidKeyViewFn<K,V> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt a multimap materialization to an in-memory Map<K, Iterable<V>>.
PCollectionViews.InMemoryMultimapViewFn<K,V> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt an iterable materialization to an in-memory Map<K, Iterable<V>>.
PCollectionViews.IsSingletonView<T> - Interface in org.apache.beam.sdk.values
 
PCollectionViews.IterableBackedListViewFn<T> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt an iterable materialization to a List<T>.
PCollectionViews.IterableViewFn<T> - Class in org.apache.beam.sdk.values
PCollectionViews.IterableViewFn2<T> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt an iterable materialization to a Iterable<T>.
PCollectionViews.ListViewFn<T> - Class in org.apache.beam.sdk.values
PCollectionViews.ListViewFn2<T> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt a multimap materialization to a List<T>.
PCollectionViews.MapViewFn<K,V> - Class in org.apache.beam.sdk.values
Deprecated.
PCollectionViews.MapViewFn2<K,V> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt a multimap materialization to a Map<K, V>.
PCollectionViews.MultimapViewFn<K,V> - Class in org.apache.beam.sdk.values
PCollectionViews.MultimapViewFn2<K,V> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt a multimap materialization to a Map<K, Iterable<V>>.
PCollectionViews.SimplePCollectionView<ElemT,PrimitiveViewT,ViewT,W extends BoundedWindow> - Class in org.apache.beam.sdk.values
A class for PCollectionView implementations, with additional type parameters that are not visible at pipeline assembly time when the view is used as a side input.
PCollectionViews.SingletonViewFn<T> - Class in org.apache.beam.sdk.values
PCollectionViews.SingletonViewFn2<T> - Class in org.apache.beam.sdk.values
Implementation which is able to adapt an iterable materialization to a T.
PCollectionViews.TypeDescriptorSupplier<T> - Interface in org.apache.beam.sdk.values
 
PCollectionViews.ValueOrMetadata<T,MetaT> - Class in org.apache.beam.sdk.values
Stores values or metadata about values.
PCollectionViews.ValueOrMetadataCoder<T,MetaT> - Class in org.apache.beam.sdk.values
PCollectionViewTranslatorBatch<ElemT,ViewT> - Class in org.apache.beam.runners.twister2.translators.batch
PCollectionView translator.
PCollectionViewTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.PCollectionViewTranslatorBatch
 
PDone - Class in org.apache.beam.sdk.values
PDone is the output of a PTransform that has a trivial result, such as a WriteFiles.
peekOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
peekOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
peekOutputElementsInWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
peekOutputElementsInWindow(TupleTag<OutputT>, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
peekOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
perElement() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a PTransform that counts the number of occurrences of each element in its input PCollection.
PeriodicImpulse - Class in org.apache.beam.sdk.transforms
A PTransform which produces a sequence of elements at fixed runtime intervals.
PeriodicSequence - Class in org.apache.beam.sdk.transforms
A PTransform which generates a sequence of timestamped elements at given runtime intervals.
PeriodicSequence.OutputRangeTracker - Class in org.apache.beam.sdk.transforms
 
PeriodicSequence.SequenceDefinition - Class in org.apache.beam.sdk.transforms
 
PeriodicStatusPageDirectoryFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.PeriodicStatusPageDirectoryFactory
 
perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
Like ApproximateDistinct.globally() but per key, i.e., computes the approximate number of distinct values per key in a PCollection<KV<K, V>> and returns a PCollection<KV<K, Long>>.
perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
Like SketchFrequencies.globally() but per key, i.e., computes a Count-min sketch per key in a PCollection<KV<K, V>> and returns a PCollection<KV<K, CountMinSketch>>.
perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
Like TDigestQuantiles.globally(), but builds a digest for each key in the stream.
perKey() - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
 
PerKey() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
perKey() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Extract
Returns a PTransform that takes an input PCollection<KV<K, byte[]>> of (key, HLL++ sketch) pairs and returns a PCollection<KV<K, Long>> of (key, estimated count of distinct elements extracted from each sketch).
perKey() - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
Returns a Combine.PerKey PTransform that takes an input PCollection<KV<K, InputT>> and returns a PCollection<KV<K, byte[]>> which consists of the per-key HLL++ sketch computed from the values matching each key in the input PCollection.
perKey() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.MergePartial
Returns a Combine.PerKey PTransform that takes an input PCollection<KV<K, byte[]>> of (key, HLL++ sketch) pairs and returns a PCollection<KV<K, byte[]>> of (key, new sketch merged from the input sketches under the key).
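A sketch of the full HLL++ lifecycle for string values keyed by String, assuming an input PCollection<KV<String, String>> named input:

    // Build a per-key HLL++ sketch from the raw values.
    PCollection<KV<String, byte[]>> sketches =
        input.apply(HllCount.Init.forStrings().perKey());
    // Merge sketches per key (e.g. across sources or stages).
    PCollection<KV<String, byte[]>> merged =
        sketches.apply(HllCount.MergePartial.perKey());
    // Extract the estimated distinct count per key.
    PCollection<KV<String, Long>> estimates =
        merged.apply(HllCount.Extract.perKey());
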
perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Returns a PTransform that takes a PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to a List of the approximate N-tiles of the values associated with that key in the input PCollection.
perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Like ApproximateQuantiles.perKey(int, Comparator), but sorts values using their natural ordering.
perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Deprecated.
Returns a PTransform that takes a PCollection<KV<K, V>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to an estimate of the number of distinct values associated with that key in the input PCollection.
perKey(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Deprecated.
Like ApproximateUnique.perKey(int), but specifies the desired maximum estimation error instead of the sample size.
PerKey(int) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
Deprecated.
 
PerKey(double) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
Deprecated.
 
perKey(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.PerKey PTransform that first groups its input PCollection of KVs by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns a PCollection of KVs mapping each distinct key to its combined value for each window.
perKey(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.PerKey PTransform that first groups its input PCollection of KVs by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns a PCollection of KVs mapping each distinct key to its combined value for each window.
perKey(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.PerKey PTransform that first groups its input PCollection of KVs by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns a PCollection of KVs mapping each distinct key to its combined value for each window.
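A minimal sketch of the SerializableBiFunction overload, assuming a PCollection<KV<String, Integer>> named counts:

    PCollection<KV<String, Integer>> sums =
        counts.apply(Combine.perKey(Integer::sum));
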
perKey() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a PTransform that counts the number of elements associated with each key of its input PCollection.
perKey() - Static method in class org.apache.beam.sdk.transforms.Latest
Returns a PTransform that takes as input a PCollection<KV<K, V>> and returns a PCollection<KV<K, V>> whose contents is the latest element per-key according to its event time.
perKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains an output element mapping each distinct key in the input PCollection to the maximum according to the natural ordering of T of the values associated with that key in the input PCollection.
perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains one output element per key mapping each to the maximum of the values associated with that key in the input PCollection.
perKey() - Static method in class org.apache.beam.sdk.transforms.Mean
Returns a PTransform that takes an input PCollection<KV<K, N>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the mean of the values associated with that key in the input PCollection.
perKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains an output element mapping each distinct key in the input PCollection to the minimum according to the natural ordering of T of the values associated with that key in the input PCollection.
perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains one output element per key mapping each to the minimum of the values associated with that key in the input PCollection.
perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the largest count values associated with that key in the input PCollection<KV<K, V>>, in decreasing order, sorted using the given Comparator<V>.
PerKeyDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
 
PerKeyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
 
PerKeySketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
 
pin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
Pin this object.
PInput - Interface in org.apache.beam.sdk.values
The interface for things that might be input to a PTransform.
pipeline() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
 
Pipeline - Class in org.apache.beam.sdk
A Pipeline manages a directed acyclic graph of PTransforms, and the PCollections that the PTransforms consume and produce.
Pipeline(PipelineOptions) - Constructor for class org.apache.beam.sdk.Pipeline
 
Pipeline.PipelineExecutionException - Exception in org.apache.beam.sdk
Thrown during execution of a Pipeline, whenever user code within that Pipeline throws an exception.
Pipeline.PipelineVisitor - Interface in org.apache.beam.sdk
For internal use only; no backwards-compatibility guarantees.
Pipeline.PipelineVisitor.CompositeBehavior - Enum in org.apache.beam.sdk
Control enum indicating whether a traversal should process the contents of a composite transform.
Pipeline.PipelineVisitor.Defaults - Class in org.apache.beam.sdk
Default no-op Pipeline.PipelineVisitor that enters all composite transforms.
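A minimal end-to-end sketch of constructing, building, and running a Pipeline:

    PipelineOptions options = PipelineOptionsFactory.create();
    Pipeline p = Pipeline.create(options);
    p.apply(Create.of("a", "b", "b"))
        .apply(Count.perElement());
    p.run().waitUntilFinish();
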
PIPELINE_PROTO_CODER_ID - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PIPELINED - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
PipelineExecutionException(Throwable) - Constructor for exception org.apache.beam.sdk.Pipeline.PipelineExecutionException
PipelineMessageReceiver - Interface in org.apache.beam.runners.local
Handles failures in the form of exceptions.
pipelineOptions() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
PipelineOptions - Interface in org.apache.beam.sdk.options
PipelineOptions are used to configure Pipelines.
PipelineOptions.AtomicLongFactory - Class in org.apache.beam.sdk.options
DefaultValueFactory which supplies an ID that is guaranteed to be unique within the given process.
PipelineOptions.CheckEnabled - Enum in org.apache.beam.sdk.options
Enumeration of the possible states for a given check.
PipelineOptions.DirectRunner - Class in org.apache.beam.sdk.options
A DefaultValueFactory that obtains the class of the DirectRunner if it exists on the classpath, and throws an exception otherwise.
PipelineOptions.JobNameFactory - Class in org.apache.beam.sdk.options
Returns a normalized job name constructed from ApplicationNameOptions.getAppName(), the local system user name (if available), the current time, and a random integer.
PipelineOptions.UserAgentFactory - Class in org.apache.beam.sdk.options
Returns a user agent string constructed from ReleaseInfo.getName() and ReleaseInfo.getVersion(), in the format [name]/[version].
PipelineOptionsFactory - Class in org.apache.beam.sdk.options
Constructs a PipelineOptions or any derived interface that is composable to any other derived interface of PipelineOptions via the PipelineOptions.as(java.lang.Class<T>) method.
PipelineOptionsFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsFactory
 
PipelineOptionsFactory.Builder - Class in org.apache.beam.sdk.options
A fluent PipelineOptions builder.
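A sketch of defining and parsing a custom options interface (MyOptions and its property are hypothetical):

    public interface MyOptions extends PipelineOptions {
      @Description("Path to the input file")
      @Default.String("/tmp/input.txt")
      String getInputFile();

      void setInputFile(String value);
    }

    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
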
PipelineOptionsRegistrar - Interface in org.apache.beam.sdk.options
PipelineOptions creators have the ability to automatically have their PipelineOptions registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface.
PipelineOptionsValidator - Class in org.apache.beam.sdk.options
Validates that the PipelineOptions conforms to all the Validation criteria.
PipelineOptionsValidator() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsValidator
 
PipelineResult - Interface in org.apache.beam.sdk
Result of Pipeline.run().
PipelineResult.State - Enum in org.apache.beam.sdk
Possible job states, for both completed and ongoing jobs.
PipelineRunner<ResultT extends PipelineResult> - Class in org.apache.beam.sdk
PipelineRunner() - Constructor for class org.apache.beam.sdk.PipelineRunner
 
PipelineTranslator - Class in org.apache.beam.runners.spark.structuredstreaming.translation
The pipeline translator translates a Beam Pipeline into a Spark correspondence, that can then be evaluated.
PipelineTranslator() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
 
PipelineTranslator.TranslationState - Interface in org.apache.beam.runners.spark.structuredstreaming.translation
Shared, mutable state during the translation of a pipeline, discarded afterwards.
PipelineTranslator.UnresolvedTranslation<InT,T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation
Unresolved translation, allowing to optimize the generated Spark DAG.
PipelineTranslatorBatch - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch
PipelineTranslator for executing a Pipeline in Spark in batch mode.
PipelineTranslatorBatch() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.PipelineTranslatorBatch
 
PipelineTranslatorUtils - Class in org.apache.beam.runners.fnexecution.translation
Utilities for pipeline translation.
placementId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
Plugin<K,V> - Class in org.apache.beam.sdk.io.cdap
Class wrapper for a CDAP plugin.
Plugin() - Constructor for class org.apache.beam.sdk.io.cdap.Plugin
 
Plugin.Builder<K,V> - Class in org.apache.beam.sdk.io.cdap
Builder class for a Plugin.
pluginConfig - Variable in class org.apache.beam.sdk.io.cdap.Plugin
 
PluginConfigInstantiationUtils - Class in org.apache.beam.sdk.io.cdap
Class for getting any filled PluginConfig configuration object.
PluginConfigInstantiationUtils() - Constructor for class org.apache.beam.sdk.io.cdap.PluginConfigInstantiationUtils
 
PluginConstants - Class in org.apache.beam.sdk.io.cdap
Class for CDAP plugin constants.
PluginConstants() - Constructor for class org.apache.beam.sdk.io.cdap.PluginConstants
 
PluginConstants.Format - Enum in org.apache.beam.sdk.io.cdap
Format types.
PluginConstants.FormatProvider - Enum in org.apache.beam.sdk.io.cdap
Format provider types.
PluginConstants.Hadoop - Enum in org.apache.beam.sdk.io.cdap
Hadoop types.
PluginConstants.PluginType - Enum in org.apache.beam.sdk.io.cdap
Plugin types.
PLUS - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
plus(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
plus(NodeStats) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
 
PLUS_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
plusDelayOf(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Adds some delay to the original target time.
poisonInstructionId(String) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
Poisons an instruction id.
POJOUtils - Class in org.apache.beam.sdk.schemas.utils
A set of utilities to generate getter and setter classes for POJOs.
POJOUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.POJOUtils
 
PollFn() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth.PollFn
 
pollFor(Duration) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.PollingAssertion
 
pollJob(JobReference, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Waits until the job is Done, and returns the job.
pollJob(JobReference, int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
pollOperation(Operation, Long) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Poll operation.
pollOperation(Operation, Long) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
pooledClientFactory(BuilderT) - Static method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.CompressedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Populates the display data.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
Populates the display data.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.Match
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Bounded
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Populate the display data with connectionConfiguration details.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
populateDisplayData(SingleStoreIO.DataSourceConfiguration, DisplayData.Builder) - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Source
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.WriteFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
Deprecated.
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.PerKey
Deprecated.
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
populateDisplayData(DisplayData.Builder) - Method in interface org.apache.beam.sdk.transforms.display.HasDisplayData
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.DoFn
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Filter
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.FlatMapElements.FlatMapWithFailures
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.InferableFunction
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.MapElements.MapWithFailures
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.MapElements
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Partition
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.PTransform
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Register display data for the given transform or component.
PortableBigQueryDestinations - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
PortableBigQueryDestinations(Schema, BigQueryWriteConfiguration) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
portableMetrics() - Method in class org.apache.beam.runners.flink.FlinkPortableRunnerResult
 
portableMetrics() - Method in interface org.apache.beam.runners.jobsubmission.PortablePipelineResult
Returns the object to access monitoring infos from the pipeline.
PortableMetrics - Class in org.apache.beam.runners.portability
 
PortablePipelineJarCreator - Class in org.apache.beam.runners.jobsubmission
PortablePipelineRunner that bundles the input pipeline along with all dependencies, artifacts, etc.
PortablePipelineJarCreator(Class) - Constructor for class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
 
PortablePipelineJarUtils - Class in org.apache.beam.runners.jobsubmission
Contains common code for writing and reading portable pipeline jars.
PortablePipelineJarUtils() - Constructor for class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
 
PortablePipelineOptions - Interface in org.apache.beam.sdk.options
Pipeline options common to all portable runners.
PortablePipelineResult - Interface in org.apache.beam.runners.jobsubmission
PortablePipelineRunner - Interface in org.apache.beam.runners.jobsubmission
Runs a portable Beam pipeline on some execution engine.
PortableRunner - Class in org.apache.beam.runners.portability
A PipelineRunner that runs a Pipeline against a JobService.
PortableRunnerRegistrar - Class in org.apache.beam.runners.portability
Registrar for the portable runner.
PortableRunnerRegistrar() - Constructor for class org.apache.beam.runners.portability.PortableRunnerRegistrar
 
position() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
position(long) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
positional() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
 
PositionAwareCombineFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
 
PostProcessingMetricsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A DoFn class to gather metrics about the emitted DataChangeRecords.
PostProcessingMetricsDoFn(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
 
POutput - Interface in org.apache.beam.sdk.values
The interface for things that might be output from a PTransform.
PRE_DEFINED_WINDOW_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
 
precisionForRelativeError(double) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
Computes the precision based on the desired relative error.
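For illustration, a minimal sketch of deriving a precision from a target relative error (the 1% figure is only an example value, and how the result is subsequently used is an assumption):
    import org.apache.beam.sdk.extensions.sketching.ApproximateDistinct;

    // Precision large enough to keep the relative error of the distinct-count
    // estimate around 1%; the value would typically be handed to the transform's
    // precision setting (assumed usage).
    int precision = ApproximateDistinct.precisionForRelativeError(0.01);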
predictAll() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
 
prefetch() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
prefetch() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterable
Ensures that the next iterator returned has been prefetched.
prefetch() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
 
prefetch() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterator
If not PrefetchableIterator.isReady(), schedules the next expensive operation such that at some point in time in the future PrefetchableIterator.isReady() will return true.
PrefetchableIterable<T> - Interface in org.apache.beam.sdk.fn.stream
An Iterable that returns PrefetchableIterators.
PrefetchableIterables - Class in org.apache.beam.sdk.fn.stream
This class contains static utility functions that operate on or return objects of type PrefetchableIterable.
PrefetchableIterables() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterables
 
PrefetchableIterables.Default<T> - Class in org.apache.beam.sdk.fn.stream
A default implementation that caches an iterator to be returned when PrefetchableIterables.Default.prefetch() is invoked.
PrefetchableIterator<T> - Interface in org.apache.beam.sdk.fn.stream
Iterator that supports prefetching the next set of records.
PrefetchableIterators - Class in org.apache.beam.sdk.fn.stream
 
PrefetchableIterators() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterators
 
prefetchOnMerge(MergingStateAccessor<K, W>) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
prefetchOnTrigger(StateAccessor<K>) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
prepare(JobApi.PrepareJobRequest, StreamObserver<JobApi.PrepareJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
prepare(JobApi.PrepareJobRequest, StreamObserver<JobApi.PrepareJobResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
 
prepare(TSetContext) - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
 
prepareCall(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareCall(String, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareCall(String, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareFilesToStage(SparkCommonPipelineOptions) - Static method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
Classpath contains non jar files (eg.
prepareForProcessing() - Method in class org.apache.beam.sdk.transforms.DoFn
Deprecated.
use DoFn.Setup or DoFn.StartBundle instead. This method will be removed in a future release.
prepareForTranslation(RunnerApi.Pipeline) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
 
preparePrivateKey(String, String) - Static method in class org.apache.beam.sdk.io.snowflake.KeyPairUtils
 
PreparePubsubWriteDoFn<InputT> - Class in org.apache.beam.sdk.io.gcp.pubsub
 
prepareRun() - Method in class org.apache.beam.sdk.io.cdap.Plugin
Calls the SubmitterLifecycle.prepareRun(Object) method on the Plugin.cdapPluginObj, passing the needed configuration object as a parameter.
prepareStatement(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareStatement(String, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareStatement(String, int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareStatement(String, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareStatement(String, int[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareStatement(String, String[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
prepareWrite(WritableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Called with the channel that a subclass will write its header, footer, and values to.
PrepareWrite<InputT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Prepare an input PCollection for writing to BigQuery.
PrepareWrite(DynamicDestinations<InputT, DestinationT>, SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
 
PRESERVES_KEYS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
previous(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
 
primary() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
PrimitiveParDoSingleFactory<InputT,OutputT> - Class in org.apache.beam.runners.dataflow
A PTransformOverrideFactory that produces PrimitiveParDoSingleFactory.ParDoSingle instances from ParDo.SingleOutput instances.
PrimitiveParDoSingleFactory() - Constructor for class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
 
PrimitiveParDoSingleFactory.ParDoSingle<InputT,OutputT> - Class in org.apache.beam.runners.dataflow
A single-output primitive ParDo.
PrimitiveParDoSingleFactory.PayloadTranslator - Class in org.apache.beam.runners.dataflow
PrimitiveParDoSingleFactory.Registrar - Class in org.apache.beam.runners.dataflow
printHelp(PrintStream) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Outputs, to the given stream, the set of options registered with the PipelineOptionsFactory, with a description for each one where available.
printHelp(PrintStream, Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Outputs the set of options available to be set for the passed-in PipelineOptions interface.
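For illustration, a minimal sketch of both printHelp overloads, writing to standard output:
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class PrintHelpExample {
      public static void main(String[] args) {
        // All options registered with the factory, with descriptions where available.
        PipelineOptionsFactory.printHelp(System.out);
        // Only the options declared on the given PipelineOptions interface.
        PipelineOptionsFactory.printHelp(System.out, PipelineOptions.class);
      }
    }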
PrismPipelineOptions - Interface in org.apache.beam.runners.prism
PipelineOptions for running a Pipeline on the PrismRunner.
PrismRegistrar - Class in org.apache.beam.runners.prism
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the PrismRunner.
PrismRegistrar.Options - Class in org.apache.beam.runners.prism
PrismRegistrar.Runner - Class in org.apache.beam.runners.prism
Registers PrismRunner and TestPrismRunner with PipelineRunnerRegistrar.
PrismRunner - Class in org.apache.beam.runners.prism
A PipelineRunner that executes pipelines on Prism.
PrismRunner(PrismPipelineOptions) - Constructor for class org.apache.beam.runners.prism.PrismRunner
 
process(Map<String, String>, RestrictionTracker<KafkaSourceConsumerFn.OffsetHolder, Map<String, Object>>, DoFn.OutputReceiver<T>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
Process the retrieved element and format it for output.
process(List<JobMessage>) - Method in interface org.apache.beam.runners.dataflow.util.MonitoringUtil.JobMessagesHandler
Process the job messages.
process(List<JobMessage>) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
 
process(KV<Row, ValueT>, Instant, TimerMap, TimerMap, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, DoFn.OutputReceiver<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
 
process(PipelineOptions, KV<String, StorageApiFlushAndFinalizeDoFn.Operation>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
 
process(InputT, Instant, BoundedWindow, PaneInfo, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PreparePubsubWriteDoFn
 
process(SequencedMessage, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
process(Row, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
process(DataChangeRecord, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
 
process(byte[], DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
 
process(DoFn<KV<FileIO.ReadableFile, OffsetRange>, T>.ProcessContext) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn
 
process(DoFn<FileIO.ReadableFile, KV<FileIO.ReadableFile, OffsetRange>>.ProcessContext) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.SplitIntoRangesFn
 
process(DoFn<InputT, KV<InputT, List<TimestampedValue<OutputT>>>>.ProcessContext, RestrictionTracker<Watch.GrowthState, KV<Watch.Growth.PollResult<OutputT>, TerminationStateT>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
 
processArrayOfNestedStringField(RowBundles.ArrayOfNestedStringBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processArrayOfStringField(RowBundles.ArrayOfStringBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processBundle(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
processBundle(InputT...) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
ProcessBundleDescriptors - Class in org.apache.beam.runners.fnexecution.control
Utility methods for creating BeamFnApi.ProcessBundleDescriptor instances.
ProcessBundleDescriptors() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
 
ProcessBundleDescriptors.BagUserStateSpec<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.control
A container type storing references to the key, value, and window Coder used when handling bag user state requests.
ProcessBundleDescriptors.ExecutableProcessBundleDescriptor - Class in org.apache.beam.runners.fnexecution.control
 
ProcessBundleDescriptors.SideInputSpec<T,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.control
A container type storing references to the value and window Coder used when handling side input state requests.
ProcessBundleDescriptors.TimerSpec<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.control
A container type storing references to the key, timer and payload coders and the remote input destination used when handling timer requests.
processByteBufferField(RowBundles.ByteBufferBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processBytesField(RowBundles.BytesBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
ProcessContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.ProcessContext
 
ProcessContinuation() - Constructor for class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
 
processDateTimeField(RowBundles.DateTimeBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processElement(DoFn<KV<K, Iterable<KV<Instant, WindowedValue<KV<K, V>>>>>, OutputT>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
processElement(DoFn<Iterable<T>, T>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
processElement(WindowedValue<InputT>) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
processElement(Row, DoFn.OutputReceiver<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.SetOperatorFilteringDoFn
 
processElement(DoFn<Row, Void>.ProcessContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
 
processElement(DoFn<KV<DestinationT, ElementT>, KV<DestinationT, StorageApiWritePayload>>.ProcessContext, PipelineOptions, KV<DestinationT, ElementT>, Instant, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
 
processElement(Iterable<KV<DestinationT, WriteTables.Result>>, DoFn<Iterable<KV<DestinationT, WriteTables.Result>>, Iterable<KV<TableDestination, WriteTables.Result>>>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
processElement(InitialPipelineState, RestrictionTracker<OffsetRange, Long>, DoFn.OutputReceiver<PartitionRecord>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
processElement(KV<ByteString, ChangeStreamRecord>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamMutation>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.FilterForMutationDoFn
 
processElement(DoFn.OutputReceiver<InitialPipelineState>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.InitializeDoFn
 
processElement(PartitionRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
processElement(DoFn<T, T>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
Emit only as many elements as the exponentially increasing budget allows.
processElement(DoFn<HL7v2ReadParameter, HL7v2ReadResponse>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
Process element.
processElement(DoFn<String, HL7v2Message>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
Process element.
processElement(PubsubMessage, Instant) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
processElement(Row, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider.ErrorFn
 
processElement(PubSubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
processElement(PubSubMessage, DoFn.OutputReceiver<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn
 
processElement(DoFn.OutputReceiver<Void>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
 
processElement(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Main processing function for the DetectNewPartitionsDoFn.
processElement(DoFn.OutputReceiver<PartitionMetadata>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
 
processElement(DataChangeRecord, DoFn.OutputReceiver<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
Stage to measure a data record's latencies and metrics.
processElement(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
Performs a change stream query for a given partition.
processElement(DoFn<Void, SpannerSchema>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
 
processElement(PulsarSourceDescriptor, RestrictionTracker<OffsetRange, Long>, WatermarkEstimator, DoFn.OutputReceiver<PulsarMessage>) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
processElement(byte[]) - Method in class org.apache.beam.sdk.io.pulsar.WriteToPulsarDoFn
 
processElement(DoFn<Object, Object>.ProcessContext) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.CleanTmpFilesFromGcsFn
 
processElement(DoFn<String, String[]>.ProcessContext) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.MapCsvToStringArrayFn
 
processElement(Solace.Record, DoFn.OutputReceiver<KV<Integer, Solace.Record>>) - Method in class org.apache.beam.sdk.io.solace.write.AddShardKeyDoFn
 
processElement(Solace.Record, DoFn.OutputReceiver<Solace.PublishResult>) - Method in class org.apache.beam.sdk.io.solace.write.RecordToPublishResultDoFn
 
processElement(KV<Integer, Solace.Record>, Timer, Instant) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
 
processElement(KV<Integer, Solace.Record>, Instant, ValueState<Integer>) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedStreamingSolaceWriter
 
processElement(InputT) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
processElement(ErrorT, DoFn.OutputReceiver<ErrorT>) - Method in class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics.CountErrors
 
processElement(String, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
 
processElement(T, DoFn.OutputReceiver<KV<Integer, T>>) - Method in class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
 
processElement(DoFn<T, KV<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.View.ToListViewDoFn
 
processElement(DoFn<ValueWithRecordId<T>, T>.ProcessContext) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
 
processElementWithRunner(DoFnRunner<KV<?, ?>, OutputT>, WindowedValue<KV<?, ?>>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
 
ProcessEnvironment - Class in org.apache.beam.runners.fnexecution.environment
Environment for process-based execution.
ProcessEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
An EnvironmentFactory which forks processes based on the parameters in the Environment.
ProcessEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
Provider of ProcessEnvironmentFactory.
ProcessFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.transforms
A function that computes an output value of type OutputT from an input value of type InputT and is Serializable.
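A minimal sketch of defining and applying a ProcessFunction (the parsing function is a hypothetical example):
    import org.apache.beam.sdk.transforms.ProcessFunction;

    public class ProcessFunctionExample {
      public static void main(String[] args) throws Exception {
        // apply(...) is declared to throw Exception, so checked exceptions need no wrapping.
        ProcessFunction<String, Integer> parse = Integer::valueOf;
        System.out.println(parse.apply("42")); // prints 42
      }
    }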
PROCESSING_DELAY_FROM_COMMIT_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Distribution for measuring processing delay from commit timestamp.
processingStatuses() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
 
ProcessingTimeEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
 
ProcessingTimePolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
 
ProcessingTimeWatermarkPolicy() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
 
ProcessingTimeWatermarkPolicy() - Constructor for class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
 
processIntField(RowBundles.IntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
ProcessManager - Class in org.apache.beam.runners.fnexecution.environment
A simple process manager which forks processes and kills them if necessary.
ProcessManager.RunningProcess - Class in org.apache.beam.runners.fnexecution.environment
 
processMapOfIntField(RowBundles.MapOfIntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processMapOfNestedIntField(RowBundles.MapOfNestedIntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processNestedBytesField(RowBundles.NestedBytesBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processNestedIntField(RowBundles.NestedIntBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processNewPartition(NewPartition, DoFn.OutputReceiver<PartitionRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ProcessNewPartitionsAction
Process a single new partition.
processNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
 
ProcessNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
 
ProcessNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ProcessNewPartitionsAction
 
processNewRow(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.nfa.NFA
 
processRows(Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundle
Runs benchmark iteration on a bundle of rows.
processStringBuilderField(RowBundles.StringBuilderBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processStringField(RowBundles.StringBundle, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
 
processTimestampedElement(TimestampedValue<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
processValue(ReduceFn<K, T, Iterable<T>, W>.ProcessValueContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
processWindowedElement(InputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
produceResult() - Method in interface org.apache.beam.sdk.extensions.ordered.MutableState
This method is called after each state mutation.
ProducerRecordCoder<K,V> - Class in org.apache.beam.sdk.io.kafka
ProducerRecordCoder(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
program - Variable in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
 
Progress() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
 
project - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
PROJECT_ID_REGEXP - Static variable in class org.apache.beam.runners.dataflow.DataflowRunner
Project IDs must contain lowercase letters, digits, or dashes.
projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
 
projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
projectId() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
projectId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
ProjectionConsumer - Interface in org.apache.beam.sdk.schemas
A ProjectionConsumer is a Schema-aware operation (such as a DoFn or PTransform) that has a FieldAccessDescriptor describing which fields the operation accesses.
ProjectionProducer<T> - Interface in org.apache.beam.sdk.schemas
A factory for operations that execute a projection on a Schema-aware PCollection.
projectPathFromId(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
projectPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
ProjectSupport - Enum in org.apache.beam.sdk.extensions.sql.meta
 
properties(ObjectNode) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
properties() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
PROPERTY_BEAM_TEST_PIPELINE_OPTIONS - Static variable in class org.apache.beam.sdk.testing.TestPipeline
System property used to set TestPipelineOptions.
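A hedged sketch of setting this property programmatically; the JSON-array value format shown is an assumption about how TestPipeline parses the option strings:
    import org.apache.beam.sdk.testing.TestPipeline;

    // Equivalent to setting the same system property on the JVM command line
    // (property name taken from the constant, value format assumed).
    System.setProperty(
        TestPipeline.PROPERTY_BEAM_TEST_PIPELINE_OPTIONS,
        "[\"--runner=DirectRunner\"]");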
propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
Returns the property name associated with this provider.
propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
Returns the property name that corresponds to this provider.
PropertyNames - Class in org.apache.beam.runners.dataflow.util
Constant property names used by the SDK in CloudWorkflow specifications.
PropertyNames() - Constructor for class org.apache.beam.runners.dataflow.util.PropertyNames
 
ProtobufByteStringOutputStream() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream
 
ProtobufCoderProviderRegistrar - Class in org.apache.beam.sdk.extensions.protobuf
A CoderProviderRegistrar for standard types used with Google Protobuf.
ProtobufCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
 
ProtoByteUtils - Class in org.apache.beam.sdk.extensions.protobuf
Utility class for working with Protocol Buffer (Proto) data.
ProtoByteUtils() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoByteUtils
 
ProtoCoder<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.extensions.protobuf
A Coder using Google Protocol Buffers binary format.
ProtoCoder(Class<T>, Set<Class<?>>) - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Private constructor.
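A minimal sketch of obtaining a ProtoCoder via its static factory; the well-known Timestamp message is used only as an example, and `timestamps` stands in for an assumed existing PCollection<Timestamp>:
    import com.google.protobuf.Timestamp;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.extensions.protobuf.ProtoCoder;

    // Binary protobuf coder for a protobuf message type.
    Coder<Timestamp> coder = ProtoCoder.of(Timestamp.class);
    timestamps.setCoder(coder); // `timestamps` is an assumed PCollection<Timestamp>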
ProtoDomain - Class in org.apache.beam.sdk.extensions.protobuf
ProtoDomain is a container class for Protobuf descriptors.
ProtoDynamicMessageSchema<T> - Class in org.apache.beam.sdk.extensions.protobuf
 
ProtoFromBytes<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
ProtoMessageSchema - Class in org.apache.beam.sdk.extensions.protobuf
 
ProtoMessageSchema() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
 
protoModeToJsonMode(TableFieldSchema.Mode) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
ProtoPayloadSerializerProvider - Class in org.apache.beam.sdk.extensions.protobuf
 
ProtoPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
 
ProtoSchemaLogicalTypes - Class in org.apache.beam.sdk.extensions.protobuf
A set of Schema.LogicalType classes to represent protocol buffer types.
ProtoSchemaLogicalTypes() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes
 
ProtoSchemaLogicalTypes.DurationConvert - Class in org.apache.beam.sdk.extensions.protobuf
 
ProtoSchemaLogicalTypes.Fixed32 - Class in org.apache.beam.sdk.extensions.protobuf
A Fixed32 type.
ProtoSchemaLogicalTypes.Fixed64 - Class in org.apache.beam.sdk.extensions.protobuf
A Fixed64 type.
ProtoSchemaLogicalTypes.SFixed32 - Class in org.apache.beam.sdk.extensions.protobuf
A SFixed32 type.
ProtoSchemaLogicalTypes.SFixed64 - Class in org.apache.beam.sdk.extensions.protobuf
An SFixed64 type.
ProtoSchemaLogicalTypes.SInt32 - Class in org.apache.beam.sdk.extensions.protobuf
A SInt32 type.
ProtoSchemaLogicalTypes.SInt64 - Class in org.apache.beam.sdk.extensions.protobuf
A SInt64 type.
ProtoSchemaLogicalTypes.TimestampConvert - Class in org.apache.beam.sdk.extensions.protobuf
 
ProtoSchemaLogicalTypes.UInt32 - Class in org.apache.beam.sdk.extensions.protobuf
A UInt32 type.
ProtoSchemaLogicalTypes.UInt64 - Class in org.apache.beam.sdk.extensions.protobuf
A UInt64 type.
protoSchemaToTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
protoTableFieldToTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
protoTableSchemaFromAvroSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
Given an Avro Schema, returns a protocol-buffer TableSchema that can be used to write data through BigQuery Storage API.
ProtoToBytes<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
ProtoToBytes() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
 
protoTypeToJsonType(TableFieldSchema.Type) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
provide(String, Map<String, String>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
Provider(PipelineOptions) - Constructor for class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
 
Provider(PipelineOptions) - Constructor for class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
 
Provider() - Constructor for class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
 
Provider(PipelineOptions) - Constructor for class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
 
Provider(InstructionRequestHandler) - Constructor for class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory.Provider
 
provider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator
 
provider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator
 
provider() - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema.Customizer
 
provider() - Static method in class org.apache.beam.sdk.io.thrift.ThriftSchema
Schema provider that maps any thrift type to a Beam schema, assuming that any typedefs that might have been used in the thrift definitions preserve all required metadata to infer the Beam type (which is the case for any primitive typedefs and the like).
provider() - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
 
Providers - Class in org.apache.beam.sdk.schemas.io
Helpers for implementing the "Provider" pattern.
Providers.Identifyable - Interface in org.apache.beam.sdk.schemas.io
 
PROXY_HOST - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
PROXY_PASSWORD - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
PROXY_PORT - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
PROXY_USERNAME - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
PTransform<InputT extends PInput,OutputT extends POutput> - Class in org.apache.beam.sdk.transforms
A PTransform<InputT, OutputT> is an operation that takes an InputT (some subtype of PInput) and produces an OutputT (some subtype of POutput).
PTransform() - Constructor for class org.apache.beam.sdk.transforms.PTransform
 
PTransform(String) - Constructor for class org.apache.beam.sdk.transforms.PTransform
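As an illustration, a minimal sketch of a composite PTransform (the class name and logic are hypothetical):
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Upper-cases every element of a PCollection<String> by expanding to a MapElements transform.
    class ToUpperCase extends PTransform<PCollection<String>, PCollection<String>> {
      @Override
      public PCollection<String> expand(PCollection<String> input) {
        return input.apply(
            MapElements.into(TypeDescriptors.strings()).via((String s) -> s.toUpperCase()));
      }
    }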
 
PTransformErrorHandler(PTransform<PCollection<ErrorT>, OutputT>, Pipeline, Coder<ErrorT>) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler
Constructs a new ErrorHandler, but should not be called directly.
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Publish outgoingMessages to the Pubsub topic.
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
publish(List<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Publish messages to TestPubsub.topicPath().
publishBatch(List<Solace.Record>, boolean, SerializableFunction<Solace.Record, Destination>, DeliveryMode) - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
Publishes a batch of messages to the broker.
publishBatch(List<Solace.Record>, boolean, SerializableFunction<Solace.Record, Destination>, DeliveryMode) - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
 
PublisherOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
Options needed for a Pub/Sub Lite Publisher.
PublisherOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
 
PublisherOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
publishLatencyMetrics() - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
Publish latency metrics using Beam metrics.
PublishResult() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.PublishResult
 
PublishResultCoders - Class in org.apache.beam.sdk.io.aws.sns
Coders for SNS PublishResult.
PublishResultHandler - Class in org.apache.beam.sdk.io.solace.broker
This class handles callbacks from Solace, to find out whether messages were actually published or whether any kind of error occurred.
PublishResultHandler(Queue<Solace.PublishResult>) - Constructor for class org.apache.beam.sdk.io.solace.broker.PublishResultHandler
 
publishResults(UnboundedSolaceWriter.BeamContextWrapper) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
publishSingleMessage(Solace.Record, Destination, boolean, DeliveryMode) - Method in interface org.apache.beam.sdk.io.solace.broker.MessageProducer
Publishes a message to the broker.
publishSingleMessage(Solace.Record, Destination, boolean, DeliveryMode) - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
 
PUBSUB_DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_ID_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_SERIALIZED_ATTRIBUTES_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_SUBSCRIPTION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_SUBSCRIPTION_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_TIMESTAMP_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_TOPIC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_TOPIC_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PubsubClient - Class in org.apache.beam.sdk.io.gcp.pubsub
An (abstract) helper class for talking to Pubsub via an underlying transport.
PubsubClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
PubsubClient.IncomingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
A message received from Pubsub.
PubsubClient.OutgoingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
A message to be sent to Pubsub.
PubsubClient.ProjectPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a cloud project id.
PubsubClient.PubsubClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
Factory for creating clients.
PubsubClient.SchemaPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a Pubsub schema.
PubsubClient.SubscriptionPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a Pubsub subscription.
PubsubClient.TopicPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a Pubsub topic.
PubsubCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
A CoderProviderRegistrar for standard types used with PubsubIO.
PubsubCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
PubsubDlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
 
PubsubGrpcClient - Class in org.apache.beam.sdk.io.gcp.pubsub
A helper class for talking to Pubsub via grpc.
PubsubIO - Class in org.apache.beam.sdk.io.gcp.pubsub
Read and Write PTransforms for Cloud Pub/Sub streams.
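As an illustration, a minimal sketch of the read and write entry points (project and topic names are placeholders, and `pipeline` is an assumed Pipeline instance):
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read message payloads as strings from one topic and write them to another.
    PCollection<String> messages =
        pipeline.apply(PubsubIO.readStrings()
            .fromTopic("projects/my-project/topics/input-topic"));
    messages.apply(PubsubIO.writeStrings()
        .to("projects/my-project/topics/output-topic"));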
PubsubIO.PubsubSubscription - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Cloud Pub/Sub Subscription.
PubsubIO.PubsubTopic - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Cloud Pub/Sub Topic.
PubsubIO.Read<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
Implementation of read methods.
PubsubIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
Implementation of write methods.
PubsubIO.Write.PubsubBoundedWriter - Class in org.apache.beam.sdk.io.gcp.pubsub
Writer to Pubsub which batches messages from bounded collections.
PubsubJsonClient - Class in org.apache.beam.sdk.io.gcp.pubsub
A Pubsub client using JSON transport.
PubsubLiteIO - Class in org.apache.beam.sdk.io.gcp.pubsublite
I/O transforms for reading from Google Pub/Sub Lite.
PubsubLiteReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
PubsubLiteReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
PubsubLiteReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteSink - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A sink which publishes messages to Pub/Sub Lite.
PubsubLiteSink(PublisherOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
PubsubLiteTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite
Pub/Sub Lite table provider.
PubsubLiteTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
 
PubsubLiteWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
PubsubLiteWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Pub/Sub message.
PubsubMessage(byte[], Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
PubsubMessage(byte[], Map<String, String>, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
PubsubMessage(byte[], Map<String, String>, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
PubsubMessagePayloadOnlyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload.
PubsubMessagePayloadOnlyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
PubsubMessages - Class in org.apache.beam.sdk.io.gcp.pubsub
Common util functions for converting between PubsubMessage proto and PubsubMessage.
PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubMessages.ParsePayloadAsPubsubMessageProto - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubMessages.ParsePubsubMessageProtoAsPayload - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including all fields of a PubSub message from the server.
PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
PubsubMessageWithAttributesAndMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including attributes and the message id from the PubSub server.
PubsubMessageWithAttributesAndMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
PubsubMessageWithAttributesCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including attributes.
PubsubMessageWithAttributesCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
PubsubMessageWithMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload, with the message id from the PubSub server.
PubsubMessageWithMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
PubsubMessageWithTopicCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including the topic from the PubSub server.
PubsubMessageWithTopicCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
PubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
Properties that can be set when using Google Cloud Pub/Sub with the Apache Beam SDK.
PubSubPayloadTranslation - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubSubPayloadTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation
 
PubSubPayloadTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubSubPayloadTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
Configuration for reading from Pub/Sub.
PubsubReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
PubsubReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
An implementation of TypedSchemaTransformProvider for Pub/Sub reads configured using PubsubReadSchemaTransformConfiguration.
PubsubReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
PubsubSchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
An implementation of SchemaIOProvider for reading and writing JSON/AVRO payloads with PubsubIO.
PubsubSchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
PubsubTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
TableProvider for PubsubIO for consumption by Beam SQL.
PubsubTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
 
PubsubTestClient - Class in org.apache.beam.sdk.io.gcp.pubsub
A (partial) implementation of PubsubClient for use by unit tests.
PubsubTestClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
PubsubTestClient.PubsubTestClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
Closing the factory will validate all expected messages were processed.
PubsubUnboundedSink - Class in org.apache.beam.sdk.io.gcp.pubsub
A PTransform which streams messages to Pubsub.
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, int, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, int, int, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSource - Class in org.apache.beam.sdk.io.gcp.pubsub
Users should use PubsubIO#read instead.
PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Construct an unbounded source to consume from the Pubsub subscription.
PubsubUnboundedSource(Clock, PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Construct an unbounded source to consume from the Pubsub subscription.
PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Construct an unbounded source to consume from the Pubsub subscription.
PubsubWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
Configuration for writing to Pub/Sub.
PubsubWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
PubsubWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubWriteSchemaTransformConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
An implementation of TypedSchemaTransformProvider for Pub/Sub writes configured using PubsubWriteSchemaTransformConfiguration.
PubsubWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
PubsubWriteSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.gcp.pubsub
 
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Request the next batch of up to batchSize messages from subscription.
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
PulsarIO - Class in org.apache.beam.sdk.io.pulsar
Class for reading and writing from Apache Pulsar.
PulsarIO.Read - Class in org.apache.beam.sdk.io.pulsar
 
PulsarIO.Write - Class in org.apache.beam.sdk.io.pulsar
 
PulsarMessage - Class in org.apache.beam.sdk.io.pulsar
Class representing a Pulsar Message record.
PulsarMessage(String, Long, Object) - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarMessage
 
PulsarMessage(String, Long) - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarMessage
 
PulsarMessageCoder - Class in org.apache.beam.sdk.io.pulsar
 
PulsarMessageCoder() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
 
PulsarSourceDescriptor - Class in org.apache.beam.sdk.io.pulsar
 
PulsarSourceDescriptor() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarSourceDescriptor
 
PUSH_DOWN_OPTION - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
pushArrayBuffer(ArrayBuffer<?>, Option<Object>, Option<StreamBlockId>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
pushBytes(ByteBuffer, Option<Object>, Option<StreamBlockId>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
pushIterator(Iterator<?>, Option<Object>, Option<StreamBlockId>) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
pushSingle(Object) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
put(String, InstructionRequestHandler) - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool.Sink
Puts an InstructionRequestHandler into a client pool.
put(T) - Method in class org.apache.beam.sdk.fn.CancellableQueue
Adds an element to this queue.
put(K2, V2) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
put(K, V) - Method in interface org.apache.beam.sdk.state.MapState
Associates the specified value with the specified key in this state.
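For illustration, a hedged sketch (the class name, state id, and element types are assumed, not part of the API above) of calling MapState.put from a stateful DoFn over a keyed PCollection<KV<String, String>>:

  // Hypothetical stateful DoFn; "latest" is an assumed state id.
  class TrackLatestFn extends DoFn<KV<String, String>, Void> {
    @StateId("latest")
    private final StateSpec<MapState<String, String>> latestSpec =
        StateSpecs.map(StringUtf8Coder.of(), StringUtf8Coder.of());

    @ProcessElement
    public void process(
        @Element KV<String, String> element,
        @StateId("latest") MapState<String, String> latest) {
      // Associates the element's value with its key in per-key, per-window state.
      latest.put(element.getKey(), element.getValue());
    }
  }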
put(K, V) - Method in interface org.apache.beam.sdk.state.MultimapState
Associates the specified value with the specified key in this multimap.
putAll(Map<? extends K2, ? extends V2>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
putDataset(PCollection<T>, Dataset<WindowedValue<T>>, boolean) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
putDataset(PCollection<T>, Dataset<WindowedValue<T>>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
putDataset(PCollection<T>, Dataset<WindowedValue<T>>, boolean) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
putIfAbsent(K, V) - Method in interface org.apache.beam.sdk.state.MapState
A deferred read-followed-by-write.
putSchemaIfAbsent(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
Registers schema for a table if one is not already present.
putUnresolved(PCollection<OutT>, PipelineTranslator.UnresolvedTranslation<InT, OutT>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.TranslationState
 
putUnresolved(PCollection<T>, PipelineTranslator.UnresolvedTranslation<InputT, T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
PValue - Interface in org.apache.beam.sdk.values
For internal use.
PValueBase - Class in org.apache.beam.sdk.values
For internal use.
PValueBase(Pipeline) - Constructor for class org.apache.beam.sdk.values.PValueBase
 
PValueBase() - Constructor for class org.apache.beam.sdk.values.PValueBase
No-arg constructor to allow subclasses to implement Serializable.
PValues - Class in org.apache.beam.sdk.values
For internal use.
PythonCallable - Class in org.apache.beam.sdk.schemas.logicaltypes
A logical type for PythonCallableSource objects.
PythonCallable() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
 
PythonExternalTransform<InputT extends PInput,OutputT extends POutput> - Class in org.apache.beam.sdk.extensions.python
Wrapper for invoking external Python transforms.
PythonExternalTransformOptions - Interface in org.apache.beam.sdk.extensions.python
Pipeline options for PythonExternalTransform.
PythonExternalTransformOptionsRegistrar - Class in org.apache.beam.sdk.extensions.python
PythonExternalTransformOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.python.PythonExternalTransformOptionsRegistrar
 
PythonMap<InputT,OutputT> - Class in org.apache.beam.sdk.extensions.python.transforms
Wrapper for invoking external Python Map transforms.
PythonService - Class in org.apache.beam.sdk.extensions.python
Utility to bootstrap and start a Beam Python service.
PythonService(String, List<String>, List<String>) - Constructor for class org.apache.beam.sdk.extensions.python.PythonService
 
PythonService(String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.python.PythonService
 
PythonService(String, String...) - Constructor for class org.apache.beam.sdk.extensions.python.PythonService
 

Q

QMARK - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
QMARK_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
qualifiedComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
qualifiedComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
 
QualifiedComponentContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
 
Qualifier() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
 
QUALIFIER_DEFAULT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
 
qualifierList(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
 
qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
 
qualifierList(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
 
qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
 
qualifierList() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
QualifierListContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
 
QualifierListContext() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
 
QualifyComponentContext(FieldSpecifierNotationParser.DotExpressionComponentContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
 
Quantifier - Class in org.apache.beam.sdk.extensions.sql.impl.cep
The Quantifier class stores the quantifier information for a pattern variable.
query(String) - Static method in class org.apache.beam.sdk.extensions.sql.SqlTransform
Returns a SqlTransform representing an equivalent execution plan.
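A brief, hedged example of applying SqlTransform.query to a single schema-aware PCollection<Row> named rows (the column names are assumed; PCOLLECTION is the implicit table name for a single input):

  PCollection<Row> filtered =
      rows.apply(
          SqlTransform.query("SELECT name, amount FROM PCOLLECTION WHERE amount > 100"));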
query(MetricResults, Lineage.Type) - Static method in class org.apache.beam.sdk.metrics.Lineage
Query StringSet metrics from MetricResults.
QUERY_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of queries issued during the execution of the Connector.
queryChangeStreamAction(ChangeStreamDao, PartitionMetadataDao, ChangeStreamRecordMapper, PartitionMetadataMapper, DataChangeRecordAction, HeartbeatRecordAction, ChildPartitionsRecordAction, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a single instance of an action class capable of performing a change stream query for a given partition.
QueryChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
Main action class for querying a partition change stream.
queryMetrics(MetricsFilter) - Method in class org.apache.beam.runners.jet.metrics.JetMetricResults
 
queryMetrics(MetricsFilter) - Method in class org.apache.beam.runners.portability.PortableMetrics
 
queryMetrics(MetricsFilter) - Method in class org.apache.beam.sdk.metrics.MetricResults
Query for all metric values that match a given filter.
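A hedged sketch of querying counter results after a run; the namespace and counter name are placeholders:

  PipelineResult result = pipeline.run();
  result.waitUntilFinish();
  MetricQueryResults metrics =
      result.metrics()
          .queryMetrics(
              MetricsFilter.builder()
                  .addNameFilter(MetricNameFilter.named("my-namespace", "my-counter"))
                  .build());
  for (MetricResult<Long> counter : metrics.getCounters()) {
    // getAttempted() covers all attempts; getCommitted() only successfully committed bundles.
    System.out.println(counter.getName() + ": " + counter.getAttempted());
  }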
QueryParameters() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
 
QueryPlanner - Interface in org.apache.beam.sdk.extensions.sql.impl
An interface that planners should implement to convert sql statement to BeamRelNode or SqlNode.
QueryPlanner.Factory - Interface in org.apache.beam.sdk.extensions.sql.impl
 
QueryPlanner.QueryParameters - Class in org.apache.beam.sdk.extensions.sql.impl
 
QueryPlanner.QueryParameters.Kind - Enum in org.apache.beam.sdk.extensions.sql.impl
 
queryResultHasChecksum(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
QueryStatementConverter - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
Converts a resolved Zeta SQL query represented by a tree to corresponding Calcite representation.
QueryTrait - Class in org.apache.beam.sdk.extensions.sql.zetasql
QueryTrait.
QueryTrait() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
 
queryUnflattened(String, String, boolean, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Performs a query without flattening results.
queryUnflattened(String, String, boolean, boolean, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Performs a query without flattening results.
queryWithRetries(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
queryWithRetries(String, String, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
queryWithRetriesUsingStandardSql(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
Queue() - Constructor for class org.apache.beam.sdk.io.solace.data.Semp.Queue
 
QueueData() - Constructor for class org.apache.beam.sdk.io.solace.data.Semp.QueueData
 
queueName(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService.Builder
 
queueName() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
The name of the queue to receive messages from.
queueUrl(T) - Method in interface org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches.DynamicDestination
 
quoteIdentifier(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 

R

RabbitMqIO - Class in org.apache.beam.sdk.io.rabbitmq
An IO to publish or consume messages with a RabbitMQ broker.
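A hedged sketch of consuming with RabbitMqIO.read(); the URI and queue name are placeholders, and an existing Pipeline p is assumed:

  PCollection<RabbitMqMessage> messages =
      p.apply(
          RabbitMqIO.read()
              .withUri("amqp://guest:guest@localhost:5672")
              .withQueue("my-queue"));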
RabbitMqIO.Read - Class in org.apache.beam.sdk.io.rabbitmq
A PTransform to consume messages from RabbitMQ server.
RabbitMqIO.Write - Class in org.apache.beam.sdk.io.rabbitmq
A PTransform to publish messages to a RabbitMQ server.
RabbitMqMessage - Class in org.apache.beam.sdk.io.rabbitmq
It contains the message payload, and additional metadata like routing key or attributes.
RabbitMqMessage(byte[]) - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
RabbitMqMessage(String, GetResponse) - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
RabbitMqMessage(String, byte[], String, String, Map<String, Object>, Integer, Integer, String, String, String, String, Date, String, String, String, String) - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
 
RampupThrottlingFn<T> - Class in org.apache.beam.sdk.io.gcp.datastore
An implementation of a client-side throttler that enforces a gradual ramp-up, broadly in line with Datastore best practices.
RampupThrottlingFn(ValueProvider<Integer>, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
RampupThrottlingFn(int, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
random() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
RandomAccessData - Class in org.apache.beam.runners.dataflow.util
An elastic-sized byte array which allows you to manipulate it as a stream, or access it directly.
RandomAccessData() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
Constructs a RandomAccessData with a default buffer size.
RandomAccessData(byte[]) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
Constructs a RandomAccessData with the initial buffer.
RandomAccessData(int) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
Constructs a RandomAccessData with the given buffer size.
RandomAccessData.RandomAccessDataCoder - Class in org.apache.beam.runners.dataflow.util
A Coder which encodes the valid parts of this stream.
RandomAccessData.UnsignedLexicographicalComparator - Class in org.apache.beam.runners.dataflow.util
A Comparator that compares two byte arrays lexicographically.
RandomAccessDataCoder() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
range - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
range - Variable in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
RANGE_OFFSET - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
RangeTracker<PositionT> - Interface in org.apache.beam.sdk.io.range
A RangeTracker is a thread-safe helper object for implementing dynamic work rebalancing in position-based BoundedSource.BoundedReader subclasses.
Rate() - Constructor for class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate
 
RateLimitPolicy - Interface in org.apache.beam.sdk.io.aws2.kinesis
 
RateLimitPolicy - Interface in org.apache.beam.sdk.io.kinesis
 
RateLimitPolicyFactory - Interface in org.apache.beam.sdk.io.aws2.kinesis
Implement this interface to create a RateLimitPolicy.
RateLimitPolicyFactory - Interface in org.apache.beam.sdk.io.kinesis
Implement this interface to create a RateLimitPolicy.
RateLimitPolicyFactory.DefaultRateLimiter - Class in org.apache.beam.sdk.io.aws2.kinesis
Default rate limiter that throttles reading from a shard using an exponential backoff if the response is empty or if the consumer is throttled by AWS.
RateLimitPolicyFactory.DefaultRateLimiter - Class in org.apache.beam.sdk.io.kinesis
Default rate limiter that throttles reading from a shard using an exponential backoff if the response is empty or if the consumer is throttled by AWS.
RateLimitPolicyFactory.DelayIntervalRateLimiter - Class in org.apache.beam.sdk.io.aws2.kinesis
 
RateLimitPolicyFactory.DelayIntervalRateLimiter - Class in org.apache.beam.sdk.io.kinesis
 
RawUnionValue - Class in org.apache.beam.sdk.transforms.join
This corresponds to an integer union tag and value.
RawUnionValue(int, Object) - Constructor for class org.apache.beam.sdk.transforms.join.RawUnionValue
Constructs a partial union from the given union tag and value.
read() - Static method in class org.apache.beam.io.debezium.DebeziumIO
Read data from a Debezium source.
Read() - Constructor for class org.apache.beam.io.debezium.DebeziumIO.Read
 
read(JavaStreamingContext, SerializablePipelineOptions, UnboundedSource<T, CheckpointMarkT>, String) - Static method in class org.apache.beam.runners.spark.io.SparkUnboundedSource
 
read(Kryo, Input) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.BaseSideInputValues
 
read(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Reads records of the given type from an Avro file (or multiple Avro files matching a pattern).
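A hedged sketch; the generated Avro class and the file pattern are placeholders, and an existing Pipeline p is assumed:

  PCollection<AvroAutoGenClass> records =
      p.apply(
          AvroIO.read(AvroAutoGenClass.class)
              .from("gs://my_bucket/path/to/records-*.avro"));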
Read() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
 
read(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingReadableByteChannel
 
read(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
read(T) - Method in interface org.apache.beam.sdk.fn.stream.DataStreams.OutputChunkConsumer
 
read() - Static method in class org.apache.beam.sdk.io.amqp.AmqpIO
 
Read() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO
Deprecated.
 
Read() - Constructor for class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
read() - Static method in class org.apache.beam.sdk.io.aws.sqs.SqsIO
Deprecated.
 
Read() - Constructor for class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
Deprecated.
 
read() - Static method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
 
Read() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
Returns a new KinesisIO.Read transform for reading from Kinesis.
Read() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
 
Read() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
 
read(Class<T>) - Static method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO
Provide a CosmosIO.Read PTransform to read data from a Cosmos DB.
Read() - Constructor for class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
Provide a CassandraIO.Read PTransform to read data from a Cassandra database.
Read() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.cdap.CdapIO
 
Read() - Constructor for class org.apache.beam.sdk.io.cdap.CdapIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
A PTransform that reads from one or more text files and returns a bounded PCollection containing one element for each line in the input files.
Read() - Constructor for class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
Read() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
read(SerializableFunction<SchemaAndRecord, T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Reads from a BigQuery table or query and returns a PCollection with one element per each row of the table or query result, parsed from the BigQuery AVRO format using the specified function.
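A hedged sketch that extracts a single field from each row; the table and field names are placeholders, and an existing Pipeline p is assumed:

  PCollection<Long> wordCounts =
      p.apply(
          BigQueryIO.read(
                  (SchemaAndRecord record) -> (Long) record.getRecord().get("word_count"))
              .from("my-project:my_dataset.my_table")
              .withCoder(VarLongCoder.of()));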
read() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.Read.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
read() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.Read builder.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
read() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
The class returned by this method provides the ability to create PTransforms for read operations available in the Firestore V1 API provided by FirestoreStub.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
Instantiates a new Read.
read(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store.
read(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
 
Read() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
read(SubscriberOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Read messages from Pub/Sub Lite.
read(Object, Decoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
Deserializes a Timestamp from the given Decoder.
read() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Creates an uninitialized instance of SpannerIO.Read.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
read() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17
 
Read() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
 
read() - Static method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
Creates an uninitialized HadoopFormatIO.Read.
Read() - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
Creates an uninitialized HBaseIO.Read.
read() - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
Read data from Hive.
Read() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
 
Read() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Read data from a JDBC datasource.
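A hedged sketch of JdbcIO.read; the driver, connection string, query, and row mapper are placeholders, and an existing Pipeline p is assumed:

  PCollection<KV<Integer, String>> rows =
      p.apply(
          JdbcIO.<KV<Integer, String>>read()
              .withDataSourceConfiguration(
                  JdbcIO.DataSourceConfiguration.create(
                          "org.postgresql.Driver", "jdbc:postgresql://localhost/mydb")
                      .withUsername("user")
                      .withPassword("password"))
              .withQuery("SELECT id, name FROM my_table")
              .withCoder(KvCoder.of(VarIntCoder.of(), StringUtf8Coder.of()))
              .withRowMapper(rs -> KV.of(rs.getInt("id"), rs.getString("name"))));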
Read() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
Read() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
Creates an uninitialized KafkaIO.Read PTransform.
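A hedged sketch of KafkaIO.read; the broker list and topic are placeholders, the deserializers come from the Kafka client library, and an existing Pipeline p is assumed:

  PCollection<KV<Long, String>> records =
      p.apply(
          KafkaIO.<Long, String>read()
              .withBootstrapServers("broker_1:9092,broker_2:9092")
              .withTopic("my_topic")
              .withKeyDeserializer(LongDeserializer.class)
              .withValueDeserializer(StringDeserializer.class)
              .withoutMetadata());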
Read() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
read() - Static method in class org.apache.beam.sdk.io.kinesis.KinesisIO
Deprecated.
Returns a new KinesisIO.Read transform for reading from Kinesis.
Read() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
 
read() - Static method in class org.apache.beam.sdk.io.kudu.KuduIO
 
Read() - Constructor for class org.apache.beam.sdk.io.kudu.KuduIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
Read data from GridFS.
Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
Read data from MongoDB.
Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
Read() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
read(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
Reads GenericRecord from a Parquet file (or multiple Parquet files matching the pattern).
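A hedged sketch; the Avro Schema object schema and the file pattern are placeholders, and an existing Pipeline p is assumed:

  PCollection<GenericRecord> records =
      p.apply(ParquetIO.read(schema).from("gs://my_bucket/path/to/*.parquet"));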
Read() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.pulsar.PulsarIO
Read from Apache Pulsar.
Read() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO
 
Read() - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
 
Read - Class in org.apache.beam.sdk.io
A PTransform for reading from a Source.
Read() - Constructor for class org.apache.beam.sdk.io.Read
 
read() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
Read data from a Redis server.
Read() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
Read data from a SingleStoreDB datasource.
Read() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
read(SnowflakeBatchServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceImpl
Reading data from Snowflake tables in batch processing.
read(SnowflakeBatchServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.BatchService
 
read(SnowflakeStreamingServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.StreamingService
 
read(SnowflakeStreamingServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceImpl
Reading data from Snowflake in streaming mode is not supported.
read(SnowflakeServices) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
Read data from Snowflake via COPY statement using user-defined SnowflakeServices.
read() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
Read data from Snowflake via COPY statement using default SnowflakeBatchServiceImpl.
Read() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
Create a SolaceIO.Read transform, to read from Solace.
read(TypeDescriptor<T>, SerializableFunction<BytesXMLMessage, T>, SerializableFunction<T, Instant>) - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
Create a SolaceIO.Read transform, to read from Solace.
read() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
 
Read() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO
 
Read() - Constructor for class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.TextIO
A PTransform that reads from one or more text files and returns a bounded PCollection containing one element for each line of the input files.
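A minimal, hedged example; the file pattern is a placeholder, and an existing Pipeline p is assumed:

  PCollection<String> lines =
      p.apply(TextIO.read().from("gs://my_bucket/path/to/*.txt"));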
Read() - Constructor for class org.apache.beam.sdk.io.TextIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.TFRecordIO
A PTransform that reads from a TFRecord file (or multiple TFRecord files matching a pattern) and returns a PCollection containing the decoding of each of the records of the TFRecord file(s) as a byte array.
Read() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
Reads XML files as a PCollection of a given type mapped via JAXB.
Read() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Read
 
read(String) - Static method in class org.apache.beam.sdk.managed.Managed
Instantiates a Managed.ManagedTransform transform for the specified source.
read() - Method in interface org.apache.beam.sdk.state.BagState
 
read() - Method in interface org.apache.beam.sdk.state.CombiningState
 
read() - Method in interface org.apache.beam.sdk.state.ReadableState
Read the current value, blocking until it is available.
read() - Method in interface org.apache.beam.sdk.state.ValueState
Read the current value, blocking until it is available.
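A hedged sketch (state id and types assumed) of a per-key counter inside a stateful DoFn, illustrating that read() returns null before the first write:

  @StateId("count")
  private final StateSpec<ValueState<Integer>> countSpec = StateSpecs.value(VarIntCoder.of());

  @ProcessElement
  public void process(@StateId("count") ValueState<Integer> count) {
    Integer current = count.read();  // blocks until available; null if never written
    count.write(current == null ? 1 : current + 1);
  }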
Read.Bounded<T> - Class in org.apache.beam.sdk.io
PTransform that reads from a BoundedSource.
Read.Builder - Class in org.apache.beam.sdk.io
Helper class for building Read transforms.
Read.Unbounded<T> - Class in org.apache.beam.sdk.io
PTransform that reads from a UnboundedSource.
READ_DATA_URN - Static variable in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar
 
READ_JSON_URN - Static variable in class org.apache.beam.io.debezium.DebeziumTransformRegistrar
 
READ_TRANSFORMS - Static variable in class org.apache.beam.sdk.managed.Managed
 
READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
READ_URN - Static variable in class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
 
ReadableFileCoder - Class in org.apache.beam.sdk.io
ReadableState<T> - Interface in org.apache.beam.sdk.state
A State that can be read via ReadableState.read().
ReadableStates - Class in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
ReadableStates() - Constructor for class org.apache.beam.sdk.state.ReadableStates
 
readAll(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Deprecated.
You can achieve the functionality of AvroIO.readAll(java.lang.Class<T>) using FileIO matching plus AvroIO.readFiles(Class). This is the preferred method to make composition explicit. AvroIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
ReadAll() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
 
readAll() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
Provide a CassandraIO.ReadAll PTransform to read data from a Cassandra database.
ReadAll() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.ReadAll
 
readAll(List<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores.
readAll(ValueProvider<List<String>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores.
readAll() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
A PTransform that works like SpannerIO.read(), but executes read operations coming from a PCollection.
ReadAll() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
readAll() - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17
 
ReadAll() - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV17.ReadAll
 
readAll() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
A PTransform that works like HBaseIO.read(), but executes read operations coming from a PCollection of HBaseIO.Read.
readAll() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Like JdbcIO.read(), but executes multiple instances of the query substituting each element of a PCollection as query parameters.
ReadAll() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
readAll() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO
Read all rows using a Neo4j Cypher query.
ReadAll() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
readAll() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
 
ReadAll() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ReadAll
 
readAll() - Static method in class org.apache.beam.sdk.io.TextIO
Deprecated.
You can achieve the functionality of TextIO.readAll() using FileIO matching plus TextIO.readFiles(). This is the preferred method to make composition explicit. TextIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
ReadAll() - Constructor for class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
 
readAllGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Deprecated.
You can achieve the functionality of AvroIO.readAllGenericRecords(Schema) using FileIO matching plus AvroIO.readFilesGenericRecords(Schema). This is the preferred method to make composition explicit. AvroIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
readAllGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Deprecated.
You can achieve the functionality of AvroIO.readAllGenericRecords(String) using FileIO matching plus AvroIO.readFilesGenericRecords(String). This is the preferred method to make composition explicit. AvroIO.ReadAll will not receive upgrades and will be removed in a future version of Beam.
readAllRequests() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Retrieve all HL7v2 Messages from a PCollection of HL7v2ReadParameter.
readAllStreamPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Read all the StreamPartition rows and output PartitionRecord objects so the partitions can be streamed.
ReadAllViaFileBasedSource<T> - Class in org.apache.beam.sdk.io
Reads each file in the input PCollection of FileIO.ReadableFile using given parameters for splitting files into offset ranges and for creating a FileBasedSource for a file.
ReadAllViaFileBasedSource(long, SerializableFunction<String, ? extends FileBasedSource<T>>, Coder<T>) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
 
ReadAllViaFileBasedSource(long, SerializableFunction<String, ? extends FileBasedSource<T>>, Coder<T>, boolean, ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
 
ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler - Class in org.apache.beam.sdk.io
A class to handle errors which occur during file reads.
ReadAllViaFileBasedSourceTransform<InT,T> - Class in org.apache.beam.sdk.io
 
ReadAllViaFileBasedSourceTransform(long, SerializableFunction<String, ? extends FileBasedSource<InT>>, Coder<T>) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
ReadAllViaFileBasedSourceTransform(long, SerializableFunction<String, ? extends FileBasedSource<InT>>, Coder<T>, boolean, ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
ReadAllViaFileBasedSourceTransform.AbstractReadFileRangesFn<InT,T> - Class in org.apache.beam.sdk.io
 
ReadAllViaFileBasedSourceTransform.SplitIntoRangesFn - Class in org.apache.beam.sdk.io
 
ReadAllViaFileBasedSourceWithFilename<T> - Class in org.apache.beam.sdk.io
Reads each file of the input PCollection and outputs each element as the value of a KV, where the key is the filename from which that value came.
ReadAllViaFileBasedSourceWithFilename(long, SerializableFunction<String, ? extends FileBasedSource<T>>, Coder<KV<String, T>>) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceWithFilename
 
readAllWithFilter(List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores matching a filter.
readAllWithFilter(ValueProvider<List<String>>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores matching a filter.
readAsJson() - Static method in class org.apache.beam.io.debezium.DebeziumIO
Read data from Debezium source and convert a Kafka SourceRecord into a JSON string using SourceRecordJson.SourceRecordJsonMapper as default function mapper.
readAvroGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded Avro messages into the Avro GenericRecord type.
readAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded Avro messages of the given type from a Google Cloud Pub/Sub stream.
readAvrosWithBeamSchema(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded Avro messages of the specific type.
ReadBuilder() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder
 
ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
 
ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
 
ReadBuilder - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
 
ReadBuilder() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder
 
ReadBuilder.Configuration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
Parameters class to expose the transform to an external SDK.
readBytes() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
A specific instance of uninitialized KafkaIO.read() where key and values are bytes.
readCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
readChangeStream() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.ReadChangeStream.
ReadChangeStream() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
 
readChangeStream() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Creates an uninitialized instance of SpannerIO.ReadChangeStream.
ReadChangeStream() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
 
readChangeStreamPartition(PartitionRecord, StreamProgress, Instant, Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
Streams a partition.
readChangeStreamPartitionAction(MetadataTableDao, ChangeStreamDao, ChangeStreamMetrics, ChangeStreamAction, Duration, SizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing ReadChangeStreamPartitionDoFn.
ReadChangeStreamPartitionAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
This class is part of ReadChangeStreamPartitionDoFn SDF.
ReadChangeStreamPartitionAction(MetadataTableDao, ChangeStreamDao, ChangeStreamMetrics, ChangeStreamAction, Duration, SizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ReadChangeStreamPartitionAction
 
ReadChangeStreamPartitionDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
 
ReadChangeStreamPartitionDoFn(DaoFactory, ActionFactory, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
ReadChangeStreamPartitionDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A SDF (Splittable DoFn) class which is responsible for performing a change stream query for a given partition.
ReadChangeStreamPartitionDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
This class needs a DaoFactory to build DAOs to access the partition metadata tables and to perform the change streams query.
ReadChangeStreamPartitionProgressTracker - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
RestrictionTracker used by ReadChangeStreamPartitionDoFn to keep track of the progress of the stream and to split the restriction for runner initiated checkpoints.
ReadChangeStreamPartitionProgressTracker(StreamProgress) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Constructs a restriction tracker with the streamProgress.
ReadChangeStreamPartitionRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
This restriction tracker delegates most of its behavior to an internal TimestampRangeTracker.
ReadChangeStreamPartitionRangeTracker(PartitionMetadata, TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
Receives the partition that will be queried and the timestamp range that belongs to it.
readData() - Static method in class org.apache.beam.sdk.io.kinesis.KinesisIO
Deprecated.
A PTransform that reads from a Kinesis stream as bytes without metadata and returns a PCollection of byte[].
ReadDataBuilder() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder
 
readDecompressed(ReadableByteChannel) - Method in enum org.apache.beam.sdk.io.Compression
 
readDetectNewPartitionMissingPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Read and deserialize the missing partitions, and how long they have been missing, from the metadata table.
readDetectNewPartitionsState() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Read the low watermark of the pipeline from Detect New Partition row.
Reader() - Constructor for class org.apache.beam.sdk.io.Source.Reader
 
ReaderInvocationUtil<OutputT,ReaderT extends Source.Reader<OutputT>> - Class in org.apache.beam.runners.flink.metrics
Util for invoking Source.Reader methods that might require a MetricsContainerImpl to be active.
ReaderInvocationUtil(String, PipelineOptions, FlinkMetricContainerBase) - Constructor for class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
readExternal(ObjectInput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
readFhirResource(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Read fhir resource http body.
readFhirResource(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundle
Reads single field from row (of type RowWithGetters).
readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle
 
readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle
 
readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle
 
readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle
 
readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle
 
readField(Row, Blackhole) - Method in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle
 
ReadFileRangesFnExceptionHandler() - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler
 
readFiles(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
ReadFiles() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
 
readFiles() - Static method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO
Like ContextualTextIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, returned by FileIO.readMatches().
ReadFiles() - Constructor for class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
 
readFiles(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
Like ParquetIO.read(Schema), but reads each file in a PCollection of FileIO.ReadableFile, which allows more flexible usage.
ReadFiles() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
 
readFiles() - Static method in class org.apache.beam.sdk.io.TextIO
Like TextIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, returned by FileIO.readMatches().
ReadFiles() - Constructor for class org.apache.beam.sdk.io.TextIO.ReadFiles
 
readFiles() - Static method in class org.apache.beam.sdk.io.TFRecordIO
Like TFRecordIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, returned by FileIO.readMatches().
ReadFiles() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.ReadFiles
 
readFiles(Class<T>) - Static method in class org.apache.beam.sdk.io.thrift.ThriftIO
Reads each file in a PCollection of FileIO.ReadableFile, which allows more flexible usage.
ReadFiles() - Constructor for class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
 
readFiles() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
Like XmlIO.read(), but reads each file in a PCollection of FileIO.ReadableFile, which allows more flexible usage via different configuration options of FileIO.match() and FileIO.readMatches() that are not explicitly provided for XmlIO.read().
ReadFiles() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
readFilesGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
readFilesGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
readFrom(InputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Reads length bytes from the specified input stream writing them into the backing data store starting at offset.
readFromPort(BeamFnApi.RemoteGrpcPort, String) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
ReadFromPulsarDoFn - Class in org.apache.beam.sdk.io.pulsar
Transform for reading from Apache Pulsar.
ReadFromPulsarDoFn(PulsarIO.Read) - Constructor for class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
readFromSource(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Reads all elements from the given BoundedSource.
readFromSplitsOfSource(BoundedSource<T>, long, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
readFromStartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Reads all elements from the given started Source.Reader.
readFromUnstartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Reads all elements from the given unstarted Source.Reader.
readFullyAsBytes() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
Returns the full contents of the file as bytes.
readFullyAsUTF8String() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
Returns the full contents of the file as a String decoded as UTF-8.
readGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Reads Avro file(s) containing records of the specified schema.
readGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Reads Avro file(s) containing records of the specified schema.
readKeyPatterns() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
Like RedisIO.read() but executes multiple instances of the Redis query substituting each element of a PCollection as key pattern.
ReadKeyPatterns() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
 
readLater() - Method in interface org.apache.beam.sdk.state.BagState
 
readLater() - Method in interface org.apache.beam.sdk.state.CombiningState
 
readLater() - Method in interface org.apache.beam.sdk.state.GroupingState
 
readLater() - Method in interface org.apache.beam.sdk.state.ReadableState
Indicate that the value will be read later.
readLater() - Method in interface org.apache.beam.sdk.state.SetState
 
readLater() - Method in interface org.apache.beam.sdk.state.ValueState
 
readLater() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
 
readMatches() - Static method in class org.apache.beam.sdk.io.FileIO
Converts each result of FileIO.match() or FileIO.matchAll() to a FileIO.ReadableFile which can be used to read the contents of each file, optionally decompressing it.
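A hedged sketch of the match/readMatches/readFiles composition recommended by the deprecation notes for the ReadAll transforms above; the file pattern is a placeholder, and an existing Pipeline p is assumed:

  PCollection<String> lines =
      p.apply(FileIO.match().filepattern("hdfs://path/to/*.gz"))
          .apply(FileIO.readMatches().withCompression(Compression.GZIP))
          .apply(TextIO.readFiles());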
ReadMatches() - Constructor for class org.apache.beam.sdk.io.FileIO.ReadMatches
 
readMessage() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
readMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
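A hedged sketch of a streaming read; the subscription path is a placeholder, and an existing Pipeline p is assumed:

  PCollection<PubsubMessage> messages =
      p.apply(
          PubsubIO.readMessages()
              .fromSubscription("projects/my-project/subscriptions/my-subscription"));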
readMessagesWithAttributes() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithAttributesAndMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithAttributesAndMessageIdAndOrderingKey() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithAttributesWithCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream, mapping each PubsubMessage, with attributes, into type T using the supplied parse function and coder.
readMessagesWithCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream, mapping each PubsubMessage into type T using the supplied parse function and coder.
readMessagesWithMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readNewPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
readNewPartitionsIncludingDeleted() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
readNextBlock() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
 
readNextBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Read the next block from the input.
readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
Reads the next record from the block and returns true iff one exists.
readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Reads the next record from the current block if possible.
readNextRecord() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Reads the next record via the delegate reader.
readNextRecord() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
readNItemsFromStartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Read elements from a Source.Reader that has already had Source.Reader#start called on it, until n elements are read.
readNItemsFromUnstartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Read elements from a Source.Reader until n elements are read.
readOnly(String, Map<String, BeamSqlTable>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
This method creates BeamSqlEnv using empty Pipeline Options.
ReadOnlyTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
A ReadOnlyTableProvider provides an in-memory, read-only set of BeamSqlTables.
ReadOnlyTableProvider(String, Map<String, BeamSqlTable>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
ReadOperation - Class in org.apache.beam.sdk.io.gcp.spanner
Encapsulates a spanner read operation.
ReadOperation() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
readPrivateKeyFile(String) - Static method in class org.apache.beam.sdk.io.snowflake.KeyPairUtils
 
readProtoDynamicMessages(ProtoDomain, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded protobuf messages for the type specified by fullMessageName.
readProtoDynamicMessages(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Similar to PubsubIO.readProtoDynamicMessages(ProtoDomain, String) but for when the Descriptors.Descriptor is already known.
readProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded protobuf messages of the given type from a Google Cloud Pub/Sub stream.
readRange(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
Read a timestamp-limited subrange of the list.
readRangeLater(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
Call to indicate that a specific range will be read from the list, allowing runners to batch multiple range reads.
readRangesFn() - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
 
readRangesFn() - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
readRangesFn() - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceWithFilename
 
ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.ReadRegistrar
 
ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
 
ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.ReadRegistrar
 
ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.ReadRegistrar
 
ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.ReadRegistrar
 
readRemainingFromReader(Source.Reader<T>, boolean) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Read all remaining elements from a Source.Reader.
readResolve() - Method in class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToElemFunction
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunction
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ByteToWindowFunctionPrimitive
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.ElemToBytesFunction
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
readResolve() - Method in class org.apache.beam.runners.twister2.translators.functions.MapToTupleFunction
 
readResources() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Read resources from a PCollection of resource IDs (e.g.
readRows(ReadRowsRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
Read rows in the context of a specific read stream.
readRows(ReadRowsRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
 
readRows(IcebergCatalogConfig) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergIO
 
ReadRows() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergIO.ReadRows
 
readRows() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Read Beam Rows from a JDBC data source.
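A minimal sketch of such a read; the driver, URL, credentials, and query are placeholders:

    // Sketch only: connection details and query are placeholders.
    PCollection<Row> rows = pipeline.apply(
        JdbcIO.readRows()
            .withDataSourceConfiguration(
                JdbcIO.DataSourceConfiguration.create(
                        "org.postgresql.Driver", "jdbc:postgresql://host:5432/db")
                    .withUsername("user")
                    .withPassword("password"))
            .withQuery("SELECT id, name FROM users"));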
ReadRows() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
readRows() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
Read Beam Rows from a SingleStoreDB data source.
readSource(PipelineOptions, TupleTag<T>, DoFn.MultiOutputReceiver, BoundedSource<T>, BigQueryIO.TypedRead.ErrorHandlingParseFn<T>, BadRecordRouter) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
readSourceDescriptors() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
Creates an uninitialized KafkaIO.ReadSourceDescriptors PTransform.
ReadSourceDescriptors() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
ReadSourceTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
Source translator.
ReadSourceTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.ReadSourceTranslatorBatch
 
ReadSourceTranslatorStream<T> - Class in org.apache.beam.runners.twister2.translators.streaming
Source translator.
ReadSourceTranslatorStream() - Constructor for class org.apache.beam.runners.twister2.translators.streaming.ReadSourceTranslatorStream
 
ReadSpannerSchema - Class in org.apache.beam.sdk.io.gcp.spanner
This DoFn reads Cloud Spanner 'information_schema.*' tables to build the SpannerSchema.
ReadSpannerSchema(SpannerConfig, PCollectionView<Dialect>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
Constructor for creating an instance of the ReadSpannerSchema class.
ReadSpannerSchema(SpannerConfig, PCollectionView<Dialect>, Set<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
Constructor for creating an instance of the ReadSpannerSchema class.
readStreamPartitionsWithWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Returns the list of locked StreamPartitions and their watermarks.
readStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads UTF-8 encoded strings from a Google Cloud Pub/Sub stream.
readStudyMetadata() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
 
readTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Like BigQueryIO.read(SerializableFunction) but represents each row as a TableRow.
readTableRowsWithSchema() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Like BigQueryIO.readTableRows() but with Schema support.
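For example, a minimal sketch of a TableRow read; the table reference is a placeholder:

    // Sketch only: the table reference is a placeholder.
    PCollection<TableRow> rows = pipeline.apply(
        BigQueryIO.readTableRows().from("my-project:my_dataset.my_table"));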
readTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
Milliseconds to wait for a read on a socket before an exception is thrown.
readTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
Milliseconds to wait for a read on a socket before an exception is thrown.
readValue - Variable in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
 
readValue - Variable in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
 
readWithDatumReader(AvroSource.DatumReaderFactory<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Reads from a BigQuery table or query and returns a PCollection with one element per row of the table or query result.
readWithFilter(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store matching a filter.
readWithFilter(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store matching a filter.
readWithMetadata() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
readWithPartitions(TypeDescriptor<PartitionColumnT>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Like JdbcIO.readAll(), but executes multiple instances of the query on the same table (subquery) using ranges.
readWithPartitions(JdbcReadWithPartitionsHelper<PartitionColumnT>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Like JdbcIO.readAll(), but executes multiple instances of the query on the same table (subquery) using ranges.
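A rough sketch of a partitioned read, assuming the Row-output variant; the data source configuration, table, column, and partition count are placeholders:

    // Sketch only: dataSourceConfiguration, table, column, and partition count are placeholders.
    PCollection<Row> rows = pipeline.apply(
        JdbcIO.<Row>readWithPartitions()
            .withDataSourceConfiguration(dataSourceConfiguration)
            .withTable("users")
            .withPartitionColumn("id")
            .withNumPartitions(10)
            .withRowOutput());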
readWithPartitions() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
 
ReadWithPartitions() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
readWithPartitions() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
Like SingleStoreIO.read(), but executes multiple instances of the query on the same table for each database partition.
ReadWithPartitions() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
 
readWithPartitionsRows() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
Like SingleStoreIO.readRows(), but executes multiple instances of the query on the same table for each database partition.
readWithSchema() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
 
ReadWriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.ReadWriteRegistrar
 
receive() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
Receives a message from the broker.
receive() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
 
ReceiverBuilder<X,T extends org.apache.spark.streaming.receiver.Receiver<X>> - Class in org.apache.beam.sdk.io.sparkreceiver
Class for building an instance of a Receiver that uses Apache Beam mechanisms instead of the Spark environment.
ReceiverBuilder(Class<T>) - Constructor for class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
 
RecommendationAICreateCatalogItem - Class in org.apache.beam.sdk.extensions.ml
A PTransform using the Recommendations AI API (https://cloud.google.com/recommendations).
RecommendationAICreateCatalogItem() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
 
RecommendationAIImportCatalogItems - Class in org.apache.beam.sdk.extensions.ml
A PTransform connecting to the Recommendations AI API (https://cloud.google.com/recommendations) and creating CatalogItems.
RecommendationAIImportCatalogItems() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
RecommendationAIImportUserEvents - Class in org.apache.beam.sdk.extensions.ml
A PTransform connecting to the Recommendations AI API (https://cloud.google.com/recommendations) and creating UserEvents.
RecommendationAIImportUserEvents() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
RecommendationAIIO - Class in org.apache.beam.sdk.extensions.ml
The RecommendationAIIO class acts as a wrapper around the PTransforms that interact with the Recommendation AI API (https://cloud.google.com/recommendations).
RecommendationAIIO() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
 
RecommendationAIPredict - Class in org.apache.beam.sdk.extensions.ml
A PTransform using the Recommendations AI API (https://cloud.google.com/recommendations).
RecommendationAIPredict() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
RecommendationAIWriteUserEvent - Class in org.apache.beam.sdk.extensions.ml
A PTransform using the Recommendations AI API (https://cloud.google.com/recommendations).
RecommendationAIWriteUserEvent() - Constructor for class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
record(List<FieldOperation>) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
RECORD - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.PortableBigQueryDestinations
 
Record() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.Record
 
Record() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record
 
RECORD_NUM - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
RECORD_NUM_IN_OFFSET - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
RECORD_OFFSET - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
RecordAggregation() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation
 
recordId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Id to pass to the runner to distinguish this message from all others.
recordId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
If using an id attribute, the record id to associate with this record's metadata so the receiver can reject duplicates.
RECORDING_ROUTER - Static variable in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
 
RecordingBadRecordRouter() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.RecordingBadRecordRouter
 
RecordToPublishResultDoFn - Class in org.apache.beam.sdk.io.solace.write
This class transforms records into PublishResult so that the windowing can be captured with the right strategy.
RecordToPublishResultDoFn() - Constructor for class org.apache.beam.sdk.io.solace.write.RecordToPublishResultDoFn
 
RecordWithMetadata - Class in org.apache.beam.sdk.io.contextualtextio
Helper class based on Row; it provides metadata associated with each record when reading from file(s) using ContextualTextIO.
RecordWithMetadata() - Constructor for class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
recoverRecords(Consumer<HistoryRecord>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
 
RedisConnectionConfiguration - Class in org.apache.beam.sdk.io.redis
RedisConnectionConfiguration describes and wraps a connection configuration for a Redis server or cluster.
RedisConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
RedisCursor - Class in org.apache.beam.sdk.io.redis
 
RedisIO - Class in org.apache.beam.sdk.io.redis
An IO to manipulate Redis key/value database.
RedisIO.Read - Class in org.apache.beam.sdk.io.redis
Implementation of RedisIO.read().
RedisIO.ReadKeyPatterns - Class in org.apache.beam.sdk.io.redis
Implementation of RedisIO.readKeyPatterns().
RedisIO.Write - Class in org.apache.beam.sdk.io.redis
A PTransform to write to a Redis server.
RedisIO.Write.Method - Enum in org.apache.beam.sdk.io.redis
Determines the method used to insert data in Redis.
RedisIO.WriteStreams - Class in org.apache.beam.sdk.io.redis
A PTransform to write stream key pairs (https://redis.io/topics/streams-intro) to a Redis server.
Redistribute - Class in org.apache.beam.sdk.transforms
A family of PTransforms that returns a PCollection equivalent to its input but functions as an operational hint to a runner that redistributing the data in some way is likely useful.
Redistribute() - Constructor for class org.apache.beam.sdk.transforms.Redistribute
 
Redistribute.RedistributeArbitrarily<T> - Class in org.apache.beam.sdk.transforms
Noop transform that hints to the runner to try to redistribute the work evenly, or via whatever clever strategy the runner comes up with.
Redistribute.RedistributeByKey<K,V> - Class in org.apache.beam.sdk.transforms
 
Redistribute.Registrar - Class in org.apache.beam.sdk.transforms
Registers translators for the Redistribute family of transforms.
ReferenceCountingExecutableStageContextFactory - Class in org.apache.beam.runners.fnexecution.control
ExecutableStageContext.Factory which counts ExecutableStageContext references for bookkeeping.
ReferenceCountingExecutableStageContextFactory.Creator - Interface in org.apache.beam.runners.fnexecution.control
Interface for creator which extends Serializable.
referencesSingleField() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Returns true if this descriptor references only a single, non-wildcard field.
reflect(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
reflect(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
reflect(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
reflect(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
Returns an AvroDatumFactory instance for the provided element type respecting Avro's Reflect* suite for encoding and decoding.
ReflectDatumFactory(Class<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.ReflectDatumFactory
 
ReflectUtils - Class in org.apache.beam.sdk.schemas.utils
A set of reflection helper methods.
ReflectUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
ReflectUtils.ClassWithSchema - Class in org.apache.beam.sdk.schemas.utils
Represents a class and a schema.
ReflectUtils.TypeDescriptorWithSchema<T> - Class in org.apache.beam.sdk.schemas.utils
Represents a type descriptor and a schema.
refreshSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
 
refreshThread() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
 
Regex - Class in org.apache.beam.sdk.transforms
PTransforms to use Regular Expressions to process elements in a PCollection.
Regex.AllMatches - Class in org.apache.beam.sdk.transforms
Regex.AllMatches<String> takes a PCollection<String> and returns a PCollection<List<String>> representing all the Regex groups extracted from each input element that matches the Regex.
Regex.Find - Class in org.apache.beam.sdk.transforms
Regex.Find<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of each input element that contains a match.
Regex.FindAll - Class in org.apache.beam.sdk.transforms
Regex.FindAll<String> takes a PCollection<String> and returns a PCollection<List<String>> representing all the matches of the Regex found in each input element.
Regex.FindKV - Class in org.apache.beam.sdk.transforms
Regex.FindKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of each input element that contains a match.
Regex.FindName - Class in org.apache.beam.sdk.transforms
Regex.FindName<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the named Regex group of each input element that contains a match.
Regex.FindNameKV - Class in org.apache.beam.sdk.transforms
Regex.FindNameKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the named Regex groups of each input element that contains a match.
Regex.Matches - Class in org.apache.beam.sdk.transforms
Regex.Matches<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of each input element that matches the Regex.
Regex.MatchesKV - Class in org.apache.beam.sdk.transforms
Regex.MatchesKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of each input element that matches the Regex.
Regex.MatchesName - Class in org.apache.beam.sdk.transforms
Regex.MatchesName<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the named Regex group of each input element that matches the Regex.
Regex.MatchesNameKV - Class in org.apache.beam.sdk.transforms
Regex.MatchesNameKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the named Regex groups of each input element that matches the Regex.
Regex.ReplaceAll - Class in org.apache.beam.sdk.transforms
Regex.ReplaceAll<String> takes a PCollection<String> and returns a PCollection<String> in which all substrings matching the Regex have been replaced with the replacement string.
Regex.ReplaceFirst - Class in org.apache.beam.sdk.transforms
Regex.ReplaceFirst<String> takes a PCollection<String> and returns a PCollection<String> in which the first substring matching the Regex has been replaced with the replacement string.
Regex.Split - Class in org.apache.beam.sdk.transforms
Regex.Split<String> takes a PCollection<String> and returns a PCollection<String> with each input string split into individual items.
RegexMatcher - Class in org.apache.beam.sdk.testing
Hamcrest matcher to assert a string matches a pattern.
RegexMatcher(String) - Constructor for class org.apache.beam.sdk.testing.RegexMatcher
 
region(Region) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
Optional Region.
region() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
Optional Region.
register(WatchService, WatchEvent.Kind<?>[], WatchEvent.Modifier...) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
register(WatchService, WatchEvent.Kind<?>...) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
register(RelOptPlanner) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
register(RelOptPlanner) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
register(Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
This registers the interface with this factory.
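For example, assuming a custom options interface MyOptions that extends PipelineOptions:

    // Sketch only: MyOptions is a placeholder options interface.
    PipelineOptionsFactory.register(MyOptions.class);
    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);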
registerBadRecordErrorHandler(PTransform<PCollection<BadRecord>, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
 
registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.Coder
Notifies the ElementByteSizeObserver about the byte size of the encoded value using this Coder.
registerByteSizeObserver(ReadableDuration, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
registerByteSizeObserver(IterableT, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
registerByteSizeObserver(KV<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.KvCoder
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
registerByteSizeObserver(Map<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.MapCoder
 
registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.NullableCoder
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
registerByteSizeObserver(SortedMap<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
registerByteSizeObserver(RawUnionValue, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
registerByteSizeObserver(IntervalWindow, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
registerClasses(Kryo) - Method in class org.apache.beam.runners.spark.coders.SparkRunnerKryoRegistrator
 
registerClasses(Kryo) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory.SparkKryoRegistrator
 
registerCoderForClass(Class<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Registers the provided Coder for the given class.
registerCoderForType(TypeDescriptor<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Registers the provided Coder for the given type.
registerCoderProvider(CoderProvider) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Registers coderProvider as a potential CoderProvider which can produce Coder instances.
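A minimal sketch of both registration styles; MyType and MyTypeCoder are placeholders:

    // Sketch only: MyType and MyTypeCoder are placeholders.
    CoderRegistry registry = pipeline.getCoderRegistry();
    registry.registerCoderForClass(MyType.class, MyTypeCoder.of());
    registry.registerCoderProvider(
        CoderProviders.fromStaticMethods(MyType.class, MyTypeCoder.class));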
registerConsumer(String, CloseableFnDataReceiver<BeamFnApi.Elements>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
Registers a consumer for the specified instruction id.
registerEnvironment(String, RunnerApi.Environment) - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
 
registerFileSystemsOnce(PipelineOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
Registers file systems once, if not already done.
registerForProcessBundleInstructionId(String, StateRequestHandler) - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
 
registerForProcessBundleInstructionId(String, StateRequestHandler) - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator
Registers the supplied handler for the given process bundle instruction id for all BeamFnApi.StateRequests with a matching id.
registerJavaBean(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a JavaBean type for automatic schema inference.
registerJavaBean(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a JavaBean type for automatic schema inference.
registerJob(String, Map<String, List<RunnerApi.ArtifactInformation>>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
Registers a set of artifacts to be staged with this service.
registerKnownTableNames(List<TableName>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.CustomTableResolver
Register the table names as extracted from the FROM clause.
registerKnownTableNames(List<TableName>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
 
registerLineage(String, Schema) - Method in class org.apache.beam.sdk.io.cdap.context.StreamingSourceContextImpl
 
registerMetricsForPipelineResult() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
This should be called at the end of the Flink job and sets up an accumulator to push the metrics to the PipelineResult.
registerOutputDataLocation(String, Coder<T>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
Registers the outbound data logical endpoint and returns the FnDataReceiver for processing the endpoint's outbound data.
registerOutputTimersLocation(String, String, Coder<T>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
Registers the outbound timers logical endpoint and returns the FnDataReceiver for processing the endpoint's outbound timers data.
registerPOJO(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a POJO type for automatic schema inference.
registerPOJO(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a POJO type for automatic schema inference.
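For example, assuming a placeholder POJO class MyPojo with public fields:

    // Sketch only: MyPojo is a placeholder POJO.
    pipeline.getSchemaRegistry().registerPOJO(MyPojo.class);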
registerProcessBundleDescriptor(BeamFnApi.ProcessBundleDescriptor) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
registerProcessBundleDescriptor(BeamFnApi.ProcessBundleDescriptor) - Method in interface org.apache.beam.runners.fnexecution.control.InstructionRequestHandler
 
registerProvider(TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
registerProvider(TableProvider) - Method in interface org.apache.beam.sdk.extensions.sql.meta.store.MetaStore
Register a table provider.
registerReceiver(String, CloseableFnDataReceiver<BeamFnApi.Elements>) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
Registers a receiver for the provided instruction id.
registerReceiver(String, CloseableFnDataReceiver<BeamFnApi.Elements>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
registerSchemaForClass(Class<T>, Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a schema for a specific Class type.
registerSchemaForType(TypeDescriptor<T>, Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a schema for a specific TypeDescriptor type.
registerSchemaProvider(SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a SchemaProvider.
registerSchemaProvider(Class<T>, SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a SchemaProvider to be used for a specific type.
registerSchemaProvider(TypeDescriptor<T>, SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
Register a SchemaProvider to be used for a specific type.
registerTables(SchemaPlus, List<List<String>>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.TableResolution
Registers tables that will be resolved during query analysis, so table providers can eagerly pre-load metadata.
registerTransformTranslator(Class<TransformT>, TransformTranslator<? extends TransformT>) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Records that instances of the specified PTransform class should be translated by default by the corresponding TransformTranslator.
registerUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
Register a Combine.CombineFn as a UDAF used in this query.
registerUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
Register a UDF used in this query.
registerUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
Register a SerializableFunction as a UDF used in this query.
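A rough sketch of registering a UDF on a query; MyUpperFn is a placeholder class implementing BeamSqlUdf:

    // Sketch only: MyUpperFn is a placeholder BeamSqlUdf implementation.
    PCollection<Row> result = rows.apply(
        SqlTransform.query("SELECT my_upper(name) AS name FROM PCOLLECTION")
            .registerUdf("my_upper", MyUpperFn.class));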
Registrar() - Constructor for class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
 
Registrar() - Constructor for class org.apache.beam.sdk.transforms.Redistribute.Registrar
 
Reify - Class in org.apache.beam.sdk.transforms
PTransforms for converting between explicit and implicit form of various Beam values.
ReifyAsIterable<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
This transform turns a side input into a singleton PCollection that can be used as the main input for another transform.
ReifyAsIterable() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
 
rel(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
relativeErrorForPrecision(int) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
 
relativeFileNaming(ValueProvider<String>, FileIO.Write.FileNaming) - Static method in class org.apache.beam.sdk.io.FileIO.Write
 
relativize(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
relBuilder - Variable in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
 
release(ObjectT) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
Release a reference to a shared client instance.
releaseByKey(KeyT) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
Release a reference to a shared object instance using KeyT.
releaseDataset(Dataset) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
releaseJobIdLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
Deletes lock ids bound to the given job, if any exist.
releaseJobIdLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
 
releaseSavepoint(Savepoint) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
releaseStreamPartitionLockForDeletion(Range.ByteStringRange, String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the first step of the two-phase delete of a StreamPartition.
RelMdNodeStats - Class in org.apache.beam.sdk.extensions.sql.impl.planner
This is the implementation of NodeStatsMetadata.
RelMdNodeStats() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
 
RelType(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
remerge() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
Creates a Window PTransform that does not change assigned windows, but will cause windows to be merged again as part of the next GroupByKey.
RemoteBundle - Interface in org.apache.beam.runners.fnexecution.control
A bundle capable of handling input data elements for a bundle descriptor by forwarding them to a remote environment for processing.
RemoteEnvironment - Interface in org.apache.beam.runners.fnexecution.environment
A handle to an available remote RunnerApi.Environment.
RemoteEnvironment.SimpleRemoteEnvironment - Class in org.apache.beam.runners.fnexecution.environment
A RemoteEnvironment which uses the default RemoteEnvironment.close() behavior.
RemoteEnvironmentOptions - Interface in org.apache.beam.sdk.options
Options that are used to control configuration of the remote environment.
RemoteEnvironmentOptions.Options - Class in org.apache.beam.sdk.options
RemoteGrpcPortRead - Class in org.apache.beam.sdk.fn.data
An execution-time only RunnerApi.PTransform which represents an SDK harness reading from a BeamFnApi.RemoteGrpcPort.
RemoteGrpcPortRead() - Constructor for class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
RemoteGrpcPortWrite - Class in org.apache.beam.sdk.fn.data
An execution-time only RunnerApi.PTransform which represents a write from within an SDK harness to a BeamFnApi.RemoteGrpcPort.
RemoteGrpcPortWrite() - Constructor for class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
RemoteInputDestination<T> - Class in org.apache.beam.runners.fnexecution.data
A pair of Coder and BeamFnApi.Target which specifies the arguments to a FnDataService to send data to a remote harness.
RemoteInputDestination() - Constructor for class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
RemoteOutputReceiver<T> - Class in org.apache.beam.runners.fnexecution.control
A pair of Coder and FnDataReceiver which can be registered to receive elements for a LogicalEndpoint.
RemoteOutputReceiver() - Constructor for class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
remove(Collection<String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
 
remove() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
remove(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
remove(K) - Method in interface org.apache.beam.sdk.state.MapState
Remove the mapping for a key from this map if it is present.
remove(K) - Method in interface org.apache.beam.sdk.state.MultimapState
Removes all values associated with the key from this multimap.
remove(T) - Method in interface org.apache.beam.sdk.state.SetState
Removes the specified element from this set if it is present.
removeAllPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
removeBucket(Bucket) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Removes an empty Bucket in Cloud Storage or propagates an exception.
removeMetadata(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
removePipelineOption(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
removePrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
Remove prefix, e.g.
removeProperties(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
removeProperties(MetadataEntity, String...) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
removeStagedArtifacts(String) - Method in interface org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestinationProvider
 
removeStagedArtifacts(String) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
 
removeTags(MetadataEntity) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
removeTags(MetadataEntity, String...) - Method in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
 
removeTemporaryFiles(Collection<ResourceId>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
rename(Iterable<String>, Iterable<String>, MoveOptions...) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
 
rename(List<ClassLoaderFileSystem.ClassLoaderResourceId>, List<ClassLoaderFileSystem.ClassLoaderResourceId>, MoveOptions...) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
rename(List<ResourceIdT>, List<ResourceIdT>, MoveOptions...) - Method in class org.apache.beam.sdk.io.FileSystem
Renames a List of file-like resources from one location to another.
rename(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
Renames a List of file-like resources from one location to another.
rename(String, String) - Method in class org.apache.beam.sdk.schemas.transforms.RenameFields.Inner
Rename a specific field.
rename(FieldAccessDescriptor, String) - Method in class org.apache.beam.sdk.schemas.transforms.RenameFields.Inner
Rename a specific field.
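For example, a minimal sketch renaming one field on a schema-aware PCollection; the field names are placeholders:

    // Sketch only: field names are placeholders.
    PCollection<Row> renamed = input.apply(
        RenameFields.<Row>create().rename("userName", "user_name"));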
RenameFields - Class in org.apache.beam.sdk.schemas.transforms
A transform for renaming fields inside an existing schema.
RenameFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.RenameFields
 
RenameFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
The class implementing the actual PTransform.
REPEATABLE_ERROR_TYPES - Static variable in class org.apache.beam.io.requestresponse.RequestResponseIO
Set of UserCodeExecutionExceptions that warrant repeating.
Repeatedly - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that fires according to its subtrigger forever.
replace(Class<V>, T) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
replace(String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
REPLACE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
REPLACE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
REPLACE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
replaceAll(List<PTransformOverride>) - Method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
replaceAll(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces all matches with the replacement String.
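For example, collapsing runs of whitespace in each element into a single space:

    // Sketch only: replaces every whitespace run in each input string.
    PCollection<String> cleaned = lines.apply(Regex.replaceAll("\\s+", " "));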
replaceAll(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces all matches with the replacement String.
ReplaceAll(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceAll
 
ReplaceBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReplaceBuilder
 
replaceFirst(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceFirst PTransform that checks if a portion of the line matches the Regex and replaces the first match with the replacement String.
replaceFirst(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceFirst PTransform that checks if a portion of the line matches the Regex and replaces the first match with the replacement String.
ReplaceFirst(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
 
replaceTransforms(Pipeline, StreamingOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
 
replaceV1Transforms(Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
ReplicaInfo() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
 
report() - Method in class org.apache.beam.runners.spark.metrics.sink.CsvSink
 
report() - Method in class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
 
report() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
 
report() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
 
reportElementSize(long) - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
reportError(String, Throwable) - Method in class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
reportFailedRPCMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
reportFailedRPCMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
reportLineage(ResourceIdT, Lineage) - Method in class org.apache.beam.sdk.io.FileSystem
Report Lineage metrics for resource id at file level.
reportLineage(ResourceIdT, Lineage, FileSystem.LineageLevel) - Method in class org.apache.beam.sdk.io.FileSystem
Report Lineage metrics for resource id to a given level.
reportPendingMetrics() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
Call this method on the Work Item thread to report outstanding metrics.
reportSinkLineage(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
Report sink Lineage metrics for resource id.
reportSinkLineage(ResourceId, FileSystem.LineageLevel) - Static method in class org.apache.beam.sdk.io.FileSystems
Report sink Lineage metrics for resource id at given level.
reportSourceLineage(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
Report source Lineage metrics for resource id.
reportSourceLineage(ResourceId, FileSystem.LineageLevel) - Static method in class org.apache.beam.sdk.io.FileSystems
Report source Lineage metrics for resource id at given level.
reportSuccessfulRpcMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
reportSuccessfulRpcMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
reportWorkItemStatus(String, ReportWorkItemStatusRequest) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Reports the status of the work item for jobId.
REQUEST_TIMEOUT - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
requestProgress() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
Ask the remote bundle for progress.
requestProgress() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
 
RequestResponseIO<RequestT,ResponseT> - Class in org.apache.beam.io.requestresponse
PTransform for reading from and writing to Web APIs.
requestsFinalization(String) - Method in interface org.apache.beam.runners.fnexecution.control.BundleFinalizationHandler
This callback is invoked whenever an inflight bundle that is being processed requests finalization.
requestsFinalization(String) - Method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers.InMemoryFinalizer
 
requestsInProgress() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
 
requestTimeMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Timestamp (in system time) at which we requested the message (ms since epoch).
REQUIRED_MEMORY_FOR_DEFAULT_BUFFER_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
 
Requirements - Class in org.apache.beam.sdk.transforms
Describes the run-time requirements of a Contextful, such as access to side inputs.
requiresDataSchema() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSchemaIOProvider
 
requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Indicates whether this transform requires a specified data schema.
requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
requiresDataSchema() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
 
requiresDataSchema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
Indicates whether the dataSchema value is necessary.
requiresDeduping() - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
requiresDeduping() - Method in class org.apache.beam.sdk.io.UnboundedSource
Returns whether this source requires explicit deduping.
requiresSideInputs(Collection<PCollectionView<?>>) - Static method in class org.apache.beam.sdk.transforms.Requirements
Describes the need for access to the given side inputs.
requiresSideInputs(PCollectionView<?>...) - Static method in class org.apache.beam.sdk.transforms.Requirements
reset() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
 
reset() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
reset() - Method in class org.apache.beam.sdk.fn.CancellableQueue
Enables the queue to be re-used after it has been cancelled.
reset() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
Enables this receiver to be used again for another bundle.
reset() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
 
reset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
resetCache() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Resets the set of interfaces registered with this factory to the default state.
resetForNewKey() - Method in class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
Prepares previously emitted state handlers for processing a new key.
resetLocal() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
resetTo(int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Resets the end of the stream to the specified position.
Reshuffle<K,V> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards compatibility guarantees.
Reshuffle.AssignShardFn<T> - Class in org.apache.beam.sdk.transforms
 
Reshuffle.ViaRandomKey<T> - Class in org.apache.beam.sdk.transforms
Implementation of Reshuffle.viaRandomKey().
ReshuffleTrigger<W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards compatibility guarantees.
ReshuffleTrigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
resolve(Supplier<PipelineOptions>, Dataset<WindowedValue<InT>>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator.UnresolvedTranslation
 
resolve(String, ResolveOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
resolve(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
resolve(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
resolve(String, ResolveOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
 
resolve(String, ResolveOptions) - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns a child ResourceId under this.
resolve(Schema) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Resolve the FieldAccessDescriptor against a schema.
resolveAlias(ResolvedColumn) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
 
resolveArtifacts(RunnerApi.Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
resolveArtifacts(ArtifactApi.ResolveArtifactsRequest, StreamObserver<ArtifactApi.ResolveArtifactsResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
resolveCalciteTable(SchemaPlus, List<String>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.TableResolution
Resolves tablePath according to the given schemaPlus.
resolvedTables - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
 
ResolveOptions - Interface in org.apache.beam.sdk.io.fs
ResolveOptions.StandardResolveOptions - Enum in org.apache.beam.sdk.io.fs
Defines the standard resolve options.
resolveSibling(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
resolveSibling(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
resolveType(Type) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeDescriptor representing the given type, with type variables resolved according to the specialization in this type.
RESOURCE_HINTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
RESOURCE_ID - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
ResourceHint - Class in org.apache.beam.sdk.transforms.resourcehints
Provides a definition of a resource hint known to the SDK.
ResourceHint() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
 
resourceHints - Variable in class org.apache.beam.sdk.transforms.PTransform
 
ResourceHints - Class in org.apache.beam.sdk.transforms.resourcehints
Pipeline authors can use resource hints to provide additional information to runners about the desired aspects of the execution environment.
ResourceHintsOptions - Interface in org.apache.beam.sdk.transforms.resourcehints
Options used to specify resource hints for the pipeline's execution environment.
ResourceHintsOptions.EmptyListDefault - Class in org.apache.beam.sdk.transforms.resourcehints
 
ResourceHintsOptions.Options - Class in org.apache.beam.sdk.transforms.resourcehints
Register the ResourceHintsOptions.
resourceId() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
ResourceId - Interface in org.apache.beam.sdk.io.fs
An identifier which represents a file-like resource.
ResourceIdCoder - Class in org.apache.beam.sdk.io.fs
ResourceIdCoder() - Constructor for class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
ResourceIdTester - Class in org.apache.beam.sdk.io.fs
A utility to test ResourceId implementations.
responseMetadata() - Static method in class org.apache.beam.sdk.io.aws.coders.AwsCoders
Returns a new coder for ResponseMetadata.
responseReceivedEx(Object) - Method in class org.apache.beam.sdk.io.solace.broker.PublishResultHandler
 
RESTRICTION_CODER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
Deprecated.
Uses the incorrect terminology; use PropertyNames.RESTRICTION_ENCODING instead. Should be removed once the non-FnAPI SplittableDoFn expansion for Dataflow is removed.
RESTRICTION_ENCODING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
restrictionTracker(OffsetRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
restrictionTracker(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
restrictionTracker(PulsarSourceDescriptor, OffsetRange) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
RestrictionTracker<RestrictionT,PositionT> - Class in org.apache.beam.sdk.transforms.splittabledofn
Manages access to the restriction and keeps track of its claimed part for a splittable DoFn.
RestrictionTracker() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
RestrictionTracker.HasProgress - Interface in org.apache.beam.sdk.transforms.splittabledofn
All RestrictionTrackers SHOULD implement this interface to improve auto-scaling and splitting performance.
RestrictionTracker.IsBounded - Enum in org.apache.beam.sdk.transforms.splittabledofn
 
RestrictionTracker.Progress - Class in org.apache.beam.sdk.transforms.splittabledofn
A representation for the amount of known completed and remaining work.
RestrictionTracker.TruncateResult<RestrictionT> - Class in org.apache.beam.sdk.transforms.splittabledofn
A representation of the truncate result.
RestrictionTrackers - Class in org.apache.beam.sdk.fn.splittabledofn
Support utilities for interacting with RestrictionTrackers.
RestrictionTrackers() - Constructor for class org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers
 
RestrictionTrackers.ClaimObserver<PositionT> - Interface in org.apache.beam.sdk.fn.splittabledofn
Interface allowing a runner to observe the calls to RestrictionTracker.tryClaim(PositionT).
Result<ResponseT> - Class in org.apache.beam.io.requestresponse
The Result of processing request PCollection into response PCollection.
Result() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup.Result
 
Result() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.Result
 
resultBuilder() - Static method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
 
resume() - Static method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
Indicates that there is more work to be done for the current element.
resumeDelay() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
A minimum duration that should elapse between the end of this DoFn.ProcessElement call and the DoFn.ProcessElement call continuing processing of the same element.
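A rough sketch of a splittable DoFn's @ProcessElement returning a continuation; the restriction type and delay are illustrative:

    // Sketch only: fragment of a splittable DoFn; restriction type and delay are illustrative.
    @ProcessElement
    public ProcessContinuation process(
        RestrictionTracker<OffsetRange, Long> tracker, OutputReceiver<String> out) {
      // ... claim positions from the tracker and emit what is currently available ...
      return ProcessContinuation.resume().withResumeDelay(Duration.standardSeconds(10));
    }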
resumeFromPreviousPipelineAction(ChangeStreamMetrics, MetadataTableDao, Instant, ProcessNewPartitionsAction) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
 
ResumeFromPreviousPipelineAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
 
ResumeFromPreviousPipelineAction(ChangeStreamMetrics, MetadataTableDao, Instant, ProcessNewPartitionsAction) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ResumeFromPreviousPipelineAction
 
retain(AwsOptions, ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool.ClientPool
Retain a reference to a shared client instance.
retain(KeyT) - Method in class org.apache.beam.sdk.io.aws2.common.ObjectPool
Retain a reference to a shared client instance.
retrievalToken() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
retrieveDicomStudyMetadata(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Retrieve DicomStudyMetadata.
retrieveDicomStudyMetadata(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
retrieveFieldNames(List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
 
retrieveRexNode(ResolvedNodes.ResolvedProjectScan, List<RelDataTypeField>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
Extract expressions from a project scan node.
retrieveRexNodeFromOrderByScan(RelOptCluster, ResolvedNodes.ResolvedOrderByScan, List<RelDataTypeField>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
Extract expressions from order by scan node.
retry(RetryConfiguration) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
Optional RetryConfiguration for AWS clients.
retry(Consumer<RetryConfiguration.Builder>) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
Optional RetryConfiguration for AWS clients.
retry() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
Optional RetryConfiguration for AWS clients.
retryCallable(Callable<V>, Set<Class<? extends Exception>>) - Method in class org.apache.beam.sdk.io.solace.RetryCallableManager
Executes the callable argument, repeating the execution if it throws one of the exceptions from the exceptionsToIntercept Set.
RetryCallableManager - Class in org.apache.beam.sdk.io.solace
A class that manages retrying of callables based on the exceptions they throw.
RetryCallableManager() - Constructor for class org.apache.beam.sdk.io.solace.RetryCallableManager
 
RetryCallableManager.Builder - Class in org.apache.beam.sdk.io.solace
 
RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.RetryConfiguration
Deprecated.
 
RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsIO.RetryConfiguration
Deprecated.
 
RetryConfiguration - Class in org.apache.beam.sdk.io.aws2.common
Configuration of the retry behavior for AWS SDK clients.
RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
 
RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
 
RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.RetryConfiguration
 
RetryConfiguration - Class in org.apache.beam.sdk.io.jms
 
RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.jms.RetryConfiguration
 
RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
 
RetryConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
 
RetryHttpRequestInitializer - Class in org.apache.beam.sdk.extensions.gcp.util
Implements a request initializer that adds retry handlers to all HttpRequests.
RetryHttpRequestInitializer() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
RetryHttpRequestInitializer(Collection<Integer>) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
RetryHttpRequestInitializer(Collection<Integer>, HttpResponseInterceptor) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
retryTransientErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Retry all failures except for known persistent errors.
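For example, attaching this policy to a streaming BigQuery write; the table reference is a placeholder:

    // Sketch only: applies to streaming inserts; the table reference is a placeholder.
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));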
reverse(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
REVERSE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
REVERSE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
reverseArtifactRetrievalService(StreamObserver<ArtifactApi.ArtifactRequestWrapper>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
 
reverseBytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
Reversed() - Constructor for class org.apache.beam.sdk.transforms.Top.Reversed
 
reverseString(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
revision() - Method in interface org.apache.beam.sdk.options.PipelineOptions
A monotonically increasing revision number of this PipelineOptions object that can be used to detect changes.
RHS_TAG - Static variable in class org.apache.beam.sdk.schemas.transforms.Join
 
RIGHT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinPushThroughJoinRule
Instance of the rule that works on logical joins only, and pushes to the right.
right(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
 
right(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
 
right(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual.Impl
 
right(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
 
right(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
 
right(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
 
rightOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Right Outer Join of two collections of KV elements.
rightOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Right Outer Join of two collections of KV elements.
rightOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
Perform a right outer join.
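A minimal sketch of the schema-aware right outer join, assuming both inputs have schemas and share an illustrative "userId" field:
  PCollection<Row> users = ...;   // left-hand side
  PCollection<Row> orders = ...;  // right-hand side
  // Every order is kept; orders with no matching user get a null left side.
  PCollection<Row> joined =
      users.apply(Join.<Row, Row>rightOuterJoin(orders).using("userId"));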
RingRange - Class in org.apache.beam.sdk.io.cassandra
Models a Cassandra token range.
rollback() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
rollback(Savepoint) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
root() - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Path for display data registered by a top-level component.
roundRobinSubList(List<T>, int, int) - Static method in class org.apache.beam.runners.jet.Utils
Assigns the elements of the list to count sublists in a round-robin fashion.
route(DoFn.MultiOutputReceiver, RecordT, Coder<RecordT>, Exception, String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.RecordingBadRecordRouter
 
route(DoFn<?, ?>.FinishBundleContext, RecordT, Coder<RecordT>, Exception, String, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.RecordingBadRecordRouter
 
route(DoFn.MultiOutputReceiver, RecordT, Coder<RecordT>, Exception, String) - Method in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
 
route(DoFn<?, ?>.FinishBundleContext, RecordT, Coder<RecordT>, Exception, String, BoundedWindow) - Method in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
 
route(DoFn.MultiOutputReceiver, RecordT, Coder<RecordT>, Exception, String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.ThrowingBadRecordRouter
 
route(DoFn<?, ?>.FinishBundleContext, RecordT, Coder<RecordT>, Exception, String, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.ThrowingBadRecordRouter
 
ROW - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
row(Schema) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
Create a ROW type for the given schema.
Row - Class in org.apache.beam.sdk.values
Row is an immutable tuple-like schema to represent one element in a PCollection.
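A small sketch (field names are illustrative) showing Schema.FieldType.row together with the Row builder:
  Schema addressSchema =
      Schema.builder().addStringField("city").addStringField("zip").build();
  Schema personSchema =
      Schema.builder()
          .addStringField("name")
          .addField("address", Schema.FieldType.row(addressSchema))
          .build();
  Row address = Row.withSchema(addressSchema).addValues("Springfield", "12345").build();
  Row person = Row.withSchema(personSchema).addValues("Homer", address).build();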
Row.Builder - Class in org.apache.beam.sdk.values
Builder for Row.
Row.Equals - Class in org.apache.beam.sdk.values
 
Row.FieldValueBuilder - Class in org.apache.beam.sdk.values
Builder for Row that bases a row on another row.
ROW_PROPERTY_MUTATION_INFO - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
ROW_PROPERTY_MUTATION_SQN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
ROW_PROPERTY_MUTATION_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
ROW_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
ROW_SCHEMA_MUTATION_INFO - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
rowBag(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a BagState, optimized for adding values frequently and occasionally retrieving all the values that have been added.
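A hedged sketch of a stateful DoFn buffering Rows with StateSpecs.rowBag; ROW_SCHEMA is an assumed Schema constant:
  class BufferingFn extends DoFn<KV<String, Row>, Row> {
    @StateId("buffer")
    private final StateSpec<BagState<Row>> bufferSpec = StateSpecs.rowBag(ROW_SCHEMA);

    @ProcessElement
    public void process(
        @Element KV<String, Row> element, @StateId("buffer") BagState<Row> buffer) {
      buffer.add(element.getValue());
    }
  }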
RowBundle<T> - Class in org.apache.beam.sdk.jmh.schemas
Bundle of rows according to the configured Factory as input for benchmarks.
RowBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundle
 
RowBundle(Class<T>) - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundle
 
RowBundle.Action - Enum in org.apache.beam.sdk.jmh.schemas
 
RowBundles - Interface in org.apache.beam.sdk.jmh.schemas
 
RowBundles.ArrayOfNestedStringBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.ArrayOfNestedStringBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.ArrayOfStringBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.ArrayOfStringBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.ByteBufferBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.ByteBufferBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.BytesBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.BytesBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.DateTimeBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.DateTimeBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.IntBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.IntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.MapOfIntBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.MapOfIntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.MapOfNestedIntBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.MapOfNestedIntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.NestedBytesBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.NestedBytesBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.NestedIntBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.NestedIntBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.StringBuilderBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.StringBuilderBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.StringBundle - Class in org.apache.beam.sdk.jmh.schemas
 
RowBundles.StringBundle.Field - Class in org.apache.beam.sdk.jmh.schemas
 
RowCoder - Class in org.apache.beam.sdk.coders
A sub-class of SchemaCoder that can only encode Row instances.
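For example (assuming a Schema named schema is in scope), the coder is usually attached explicitly:
  PCollection<Row> rows = ...;
  rows.setCoder(RowCoder.of(schema));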
RowCoderCloudObjectTranslator - Class in org.apache.beam.runners.dataflow.util
Translator for row coders.
RowCoderCloudObjectTranslator() - Constructor for class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
 
RowCoderGenerator - Class in org.apache.beam.sdk.coders
A utility for automatically generating a Coder for Row objects corresponding to a specific schema.
RowCoderGenerator() - Constructor for class org.apache.beam.sdk.coders.RowCoderGenerator
 
rowFromProto(SchemaApi.Row, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
 
rowMap(Schema, Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a MapState, optimized for key lookups and writes.
RowMessages - Class in org.apache.beam.sdk.schemas
 
rowMultimap(Schema, Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a MultimapState, optimized for key lookups, key puts, and clear.
RowMutation - Class in org.apache.beam.sdk.io.gcp.bigquery
A convenience class for applying row updates to BigQuery using BigQueryIO.applyRowMutations().
RowMutation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
RowMutation.RowMutationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
RowMutationCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
RowMutationInformation - Class in org.apache.beam.sdk.io.gcp.bigquery
This class indicates how to apply a row update to BigQuery.
RowMutationInformation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
 
RowMutationInformation.MutationType - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
rowOrderedList(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
rowReceiver(DoFn<?, ?>.WindowedContext, TupleTag<T>, SchemaCoder<T>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
Returns a DoFn.OutputReceiver that automatically converts a Row to the user's output type and delegates to WindowedContextOutputReceiver.
rows() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
RowSchemaInformationProvider - Class in org.apache.beam.sdk.schemas.utils
 
RowSchemaInformationProvider() - Constructor for class org.apache.beam.sdk.schemas.utils.RowSchemaInformationProvider
 
RowSelector - Interface in org.apache.beam.sdk.schemas.utils
A selector interface for extracting fields from a row.
RowSelectorContainer(Schema, FieldAccessDescriptor, boolean) - Constructor for class org.apache.beam.sdk.schemas.utils.SelectHelpers.RowSelectorContainer
 
rowSet(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a SetState, optimized for checking membership.
rowsFromRecordBatch(Schema, VectorSchemaRoot) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
Returns an ArrowConversion.RecordBatchRowIterator backed by the Arrow record batch stored in vectorSchemaRoot.
rowsFromSerializedRecordBatch(Schema, InputStream, RootAllocator) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
 
rowToBytesFn(SchemaProvider, TypeDescriptor<T>, ProcessFunction<? super T, byte[]>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
 
rowToBytesFn(SchemaProvider, TypeDescriptor<T>, Coder<? super T>) - Static method in class org.apache.beam.sdk.schemas.RowMessages
 
RowToEntity - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform to perform a conversion of Row to Entity.
rowToProto(Row) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
 
RowUtils - Class in org.apache.beam.sdk.io.gcp.bigtable
 
RowUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
rowValue(Schema) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a row value with the specified schema.
RowWithGetters<T> - Class in org.apache.beam.sdk.values
A Concrete subclass of Row that delegates to a set of provided FieldValueGetters.
RowWithStorage - Class in org.apache.beam.sdk.values
Concrete subclass of Row that explicitly stores all fields of the row.
rpad(String, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
rpad(String, Long, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
rpad(byte[], Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
rpad(byte[], Long, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
RpcQosOptions - Class in org.apache.beam.sdk.io.gcp.firestore
Quality of Service manager options for Firestore RPCs.
RpcQosOptions.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
Mutable Builder class for creating instances of RpcQosOptions.
rtrim(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
rtrim(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
RTRIM - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
RTRIM_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
RULE_arrayQualifier - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
RULE_dotExpression - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
RULE_dotExpressionComponent - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
RULE_fieldSpecifier - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
RULE_mapQualifier - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
RULE_qualifiedComponent - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
RULE_qualifierList - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
ruleNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
ruleNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
run(Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.direct.DirectRunner
 
run(RunnerApi.Pipeline, JobInfo) - Method in class org.apache.beam.runners.flink.FlinkPipelineRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.flink.FlinkRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.flink.TestFlinkRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.jet.JetRunner
 
run(JobApi.RunJobRequest, StreamObserver<JobApi.RunJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
 
run() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
run(RunnerApi.Pipeline, JobInfo) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
Does not actually run the pipeline. Instead bundles the input pipeline along with all dependencies, artifacts, etc.
run(RunnerApi.Pipeline, JobInfo) - Method in interface org.apache.beam.runners.jobsubmission.PortablePipelineRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.portability.PortableRunner
 
run(JobApi.RunJobRequest, StreamObserver<JobApi.RunJobResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
 
run(Pipeline) - Method in class org.apache.beam.runners.portability.testing.TestPortableRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.prism.PrismRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.prism.TestPrismRunner
 
run(RunnerApi.Pipeline, JobInfo) - Method in class org.apache.beam.runners.spark.SparkPipelineRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger
 
run(Pipeline) - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.spark.TestSparkRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2Runner
 
run(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2TestRunner
 
run() - Method in interface org.apache.beam.sdk.function.ThrowingRunnable
 
run(PartitionRecord, ChangeStreamRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>, BytesThroughputEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ChangeStreamAction
This class processes ReadChangeStreamResponse messages from the Bigtable server.
run(RestrictionTracker<OffsetRange, Long>, DoFn.OutputReceiver<PartitionRecord>, ManualWatermarkEstimator<Instant>, InitialPipelineState) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.DetectNewPartitionsAction
Perform the necessary steps to manage initial set of partitions and new partitions.
run(DoFn.OutputReceiver<PartitionRecord>, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.GenerateInitialPartitionsAction
The very first step of the pipeline when there are no partitions being streamed yet.
run(PartitionRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ReadChangeStreamPartitionAction
Streams changes from a specific partition.
run(DoFn.OutputReceiver<PartitionRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ResumeFromPreviousPipelineAction
Resume from previously drained pipeline.
run(PartitionMetadata, ChildPartitionsRecord, RestrictionTracker<TimestampRange, Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ChildPartitionsRecordAction
This is the main processing function for a ChildPartitionsRecord.
run(PartitionMetadata, DataChangeRecord, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
This is the main processing function for a DataChangeRecord.
run(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
Executes the main logic to schedule new partitions.
run(PartitionMetadata, HeartbeatRecord, RestrictionTracker<TimestampRange, Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.HeartbeatRecordAction
This is the main processing function for a HeartbeatRecord.
run(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.QueryChangeStreamAction
This method will dispatch a change stream query for the given partition, delegate the processing of the records to one of the corresponding registered action classes, and keep the state of the partition up to date in the Connector's metadata table.
run() - Method in class org.apache.beam.sdk.Pipeline
Runs this Pipeline according to the PipelineOptions used to create the Pipeline via Pipeline.create(PipelineOptions).
run(PipelineOptions) - Method in class org.apache.beam.sdk.Pipeline
Runs this Pipeline using the given PipelineOptions, using the runner specified by the options.
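A minimal end-to-end sketch of constructing and running a pipeline (the transforms are illustrative):
  PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
  Pipeline p = Pipeline.create(options);
  p.apply(Create.of("hello", "beam"))
      .apply(MapElements.into(TypeDescriptors.integers()).via((String s) -> s.length()));
  p.run().waitUntilFinish();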
run(Pipeline) - Method in class org.apache.beam.sdk.PipelineRunner
Processes the given Pipeline, potentially asynchronously, returning a runner-specific type of result.
run(PTransform<PBegin, ?>, PipelineOptions) - Method in class org.apache.beam.sdk.PipelineRunner
Creates a Pipeline out of a single PTransform step, and executes it.
run(PTransform<PBegin, ?>) - Method in class org.apache.beam.sdk.PipelineRunner
Overloaded PTransform runner that runs with the default app PipelineOptions.
run(Pipeline) - Method in class org.apache.beam.sdk.testing.CrashingRunner
 
run() - Method in class org.apache.beam.sdk.testing.TestPipeline
Runs this TestPipeline, unwrapping any AssertionError that is raised during testing.
run(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipeline
Like TestPipeline.run() but with the given potentially modified options.
runBeforeProcessing(PipelineOptions) - Static method in class org.apache.beam.sdk.fn.JvmInitializers
Finds all registered implementations of JvmInitializer and executes their beforeProcessing methods.
RunInference<OutputT> - Class in org.apache.beam.sdk.extensions.python.transforms
Wrapper for invoking external Python RunInference.
runInTransaction(Function<PartitionMetadataDao.InTransactionContext, T>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Runs a given function in a transaction context.
runInTransaction(Function<PartitionMetadataDao.InTransactionContext, T>, String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
 
Runner() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.jet.JetRunnerRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.prism.PrismRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Runner
 
RunnerRegistrar() - Constructor for class org.apache.beam.runners.portability.testing.TestUniversalRunner.RunnerRegistrar
 
runOnStartup() - Static method in class org.apache.beam.sdk.fn.JvmInitializers
Finds all registered implementations of JvmInitializer and executes their onStartup methods.
runQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for RunQueryRequest operations.
runResourceIdBattery(ResourceId) - Static method in class org.apache.beam.sdk.io.fs.ResourceIdTester
Enforces that the ResourceId implementation of baseDirectory meets the ResourceId spec.
runTest(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2Runner
 
runWindowFn(WindowFn<T, W>, List<Long>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Runs the WindowFn over the provided input, returning a map of windows to the timestamps in those windows.
runWindowFnWithValue(WindowFn<T, W>, List<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Runs the WindowFn over the provided input, returning a map of windows to the timestamps in those windows.
runWithAdditionalOptionArgs(List<String>) - Method in class org.apache.beam.sdk.testing.TestPipeline
Runs this TestPipeline with additional cmd pipeline option args.

S

S3ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws.options
Construct AmazonS3ClientBuilder from S3 pipeline options.
S3ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws2.options
Construct S3ClientBuilder from S3 pipeline options.
S3FileSystemConfiguration - Class in org.apache.beam.sdk.io.aws.s3
Object used to configure S3FileSystem.
S3FileSystemConfiguration() - Constructor for class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
 
S3FileSystemConfiguration - Class in org.apache.beam.sdk.io.aws2.s3
Object used to configure S3FileSystem.
S3FileSystemConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
 
S3FileSystemConfiguration.Builder - Class in org.apache.beam.sdk.io.aws.s3
 
S3FileSystemConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.s3
 
S3FileSystemRegistrar - Class in org.apache.beam.sdk.io.aws.s3
AutoService registrar for the S3FileSystem.
S3FileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.s3.S3FileSystemRegistrar
 
S3FileSystemRegistrar - Class in org.apache.beam.sdk.io.aws2.s3
AutoService registrar for the S3FileSystem.
S3FileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.s3.S3FileSystemRegistrar
 
S3FileSystemSchemeRegistrar - Interface in org.apache.beam.sdk.io.aws.s3
A registrar that creates S3FileSystemConfiguration instances from PipelineOptions.
S3FileSystemSchemeRegistrar - Interface in org.apache.beam.sdk.io.aws2.s3
A registrar that creates S3FileSystemConfiguration instances from PipelineOptions.
S3Options - Interface in org.apache.beam.sdk.io.aws.options
Options used to configure Amazon Web Services S3.
S3Options - Interface in org.apache.beam.sdk.io.aws2.options
Options used to configure Amazon Web Services S3.
S3Options.S3UploadBufferSizeBytesFactory - Class in org.apache.beam.sdk.io.aws.options
Provides the default S3 upload buffer size in bytes: 64MB if more than 512MB of RAM is available, 5MB otherwise.
S3Options.S3UploadBufferSizeBytesFactory - Class in org.apache.beam.sdk.io.aws2.options
Provides the default S3 upload buffer size in bytes: 64MB if more than 512MB of RAM is available, 5MB otherwise.
S3Options.SSECustomerKeyFactory - Class in org.apache.beam.sdk.io.aws2.options
 
S3UploadBufferSizeBytesFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.S3Options.S3UploadBufferSizeBytesFactory
 
S3UploadBufferSizeBytesFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
 
Sample - Class in org.apache.beam.sdk.transforms
PTransforms for taking samples of the elements in a PCollection, or samples of the values associated with each key in a PCollection of KVs.
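For instance (the sample size is illustrative):
  PCollection<String> lines = ...;
  // Uniformly sample at most 10 elements of the whole collection.
  PCollection<Iterable<String>> sampled = lines.apply(Sample.fixedSizeGlobally(10));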
Sample() - Constructor for class org.apache.beam.sdk.transforms.Sample
 
Sample.FixedSizedSampleFn<T> - Class in org.apache.beam.sdk.transforms
CombineFn that computes a fixed-size sample of a collection of values.
SAMPLE_PARTITION - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
We use a bogus partition here to estimate the average size of a partition metadata record.
SampleAllFiles() - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator.SampleAllFiles
 
satisfies(RelTrait) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
satisfies(SerializableFunction<Iterable<T>, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Applies the provided checking function (presumably containing assertions) to the iterable in question.
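A short sketch in a test (the assertion body is illustrative and assumes a JUnit static import of assertTrue); the checking function must return null:
  PCollection<Integer> output = ...;
  PAssert.that(output)
      .satisfies(
          values -> {
            for (Integer v : values) {
              assertTrue(v >= 0);
            }
            return null;
          });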
satisfies(SerializableFunction<Iterable<T>, Void>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
satisfies(SerializableFunction<Iterable<T>, Void>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionListContentsAssert
Applies one SerializableFunction to check the elements of each PCollection in the PCollectionList.
satisfies(List<SerializableFunction<Iterable<T>, Void>>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionListContentsAssert
Takes a list of SerializableFunctions of the same size as PAssert.PCollectionListContentsAssert.pCollectionList, and applies each matcher to the PCollection at the same index in the PAssert.PCollectionListContentsAssert.pCollectionList.
satisfies(SerializableFunction<T, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Applies the provided checking function (presumably containing assertions) to the value in question.
saveAsync(T) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
This method is called for each save event.
SbeLogicalTypes - Class in org.apache.beam.sdk.extensions.sbe
Classes that represent various SBE semantic types.
SbeLogicalTypes.LocalMktDate - Class in org.apache.beam.sdk.extensions.sbe
Representation of SBE's LocalMktDate.
SbeLogicalTypes.TZTimeOnly - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's TimeOnly composite type.
SbeLogicalTypes.TZTimestamp - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's TZTimestamp composite type.
SbeLogicalTypes.Uint16 - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's uint16 type.
SbeLogicalTypes.Uint32 - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's uint32 type.
SbeLogicalTypes.Uint64 - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's uint64 type.
SbeLogicalTypes.Uint8 - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's uint8 type.
SbeLogicalTypes.UTCDateOnly - Class in org.apache.beam.sdk.extensions.sbe
Representation of SBE's UTCDateOnly.
SbeLogicalTypes.UTCTimeOnly - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's UTCTimeOnly composite type.
SbeLogicalTypes.UTCTimestamp - Class in org.apache.beam.sdk.extensions.sbe
Represents SBE's UTCTimestamp composite type.
SbeSchema - Class in org.apache.beam.sdk.extensions.sbe
Represents an SBE schema.
SbeSchema.IrOptions - Class in org.apache.beam.sdk.extensions.sbe
Options for configuring schema generation from an Ir.
SbeSchema.IrOptions.Builder - Class in org.apache.beam.sdk.extensions.sbe
Builder for SbeSchema.IrOptions.
ScalaInterop - Class in org.apache.beam.runners.spark.structuredstreaming.translation.utils
Utilities for easier interoperability with the Spark Scala API.
ScalaInterop.Fun1<T,V> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.utils
 
ScalaInterop.Fun2<T1,T2,V> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.utils
 
scalaIterator(Iterable<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
Scala Iterator of Java Iterable.
scalaIterator(Iterator<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
Scala Iterator of Java Iterator.
SCALAR_FIELD_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ScalarFn - Class in org.apache.beam.sdk.extensions.sql.udf
A scalar function that can be executed as part of a SQL query.
ScalarFn() - Constructor for class org.apache.beam.sdk.extensions.sql.udf.ScalarFn
 
ScalarFn.ApplyMethod - Annotation Type in org.apache.beam.sdk.extensions.sql.udf
Annotates the single method in a ScalarFn implementation that is to be applied to SQL function arguments.
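A minimal sketch of a user-defined scalar function (class and method names are assumptions); exactly one method carries the ApplyMethod annotation:
  public class IncrementFn extends ScalarFn {
    @ApplyMethod
    public Long increment(Long value) {
      return value + 1;
    }
  }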
ScalarFnReflector - Class in org.apache.beam.sdk.extensions.sql.impl
Reflection-based implementation logic for ScalarFn.
ScalarFnReflector() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ScalarFnReflector
 
ScalarFunctionImpl - Class in org.apache.beam.sdk.extensions.sql.impl
Beam-customized version of ScalarFunctionImpl, to address BEAM-5921.
ScalarFunctionImpl(Method, CallImplementor, String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
 
ScalarFunctionImpl(Method, CallImplementor) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
 
ScheduledExecutorServiceFactory() - Constructor for class org.apache.beam.sdk.options.ExecutorOptions.ScheduledExecutorServiceFactory
 
schema() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
schema - Variable in class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
 
schema(Schema) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
SCHEMA - Static variable in class org.apache.beam.sdk.io.ClassLoaderFileSystem
 
schema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
SCHEMA - Static variable in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
SCHEMA - Static variable in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
 
schema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIO
Returns the schema of the data.
Schema - Class in org.apache.beam.sdk.schemas
Schema describes the fields in Row.
Schema(List<Schema.Field>) - Constructor for class org.apache.beam.sdk.schemas.Schema
 
Schema(List<Schema.Field>, Schema.Options) - Constructor for class org.apache.beam.sdk.schemas.Schema
 
schema - Variable in class org.apache.beam.sdk.schemas.SchemaCoder
 
Schema.Builder - Class in org.apache.beam.sdk.schemas
Builder class for building Schema objects.
Schema.EquivalenceNullablePolicy - Enum in org.apache.beam.sdk.schemas
Control whether nullable is included in equivalence check.
Schema.Field - Class in org.apache.beam.sdk.schemas
Field of a row.
Schema.Field.Builder - Class in org.apache.beam.sdk.schemas
Builder for Schema.Field.
Schema.FieldType - Class in org.apache.beam.sdk.schemas
A descriptor of a single field type.
Schema.LogicalType<InputT,BaseT> - Interface in org.apache.beam.sdk.schemas
A LogicalType allows users to define a custom schema type.
Schema.Options - Class in org.apache.beam.sdk.schemas
 
Schema.Options.Builder - Class in org.apache.beam.sdk.schemas
 
Schema.TypeName - Enum in org.apache.beam.sdk.schemas
An enumerated list of type constructors.
SchemaAndRecord - Class in org.apache.beam.sdk.io.gcp.bigquery
A wrapper for a GenericRecord and the TableSchema representing the schema of the table (or query) it was generated from.
SchemaAndRecord(GenericRecord, TableSchema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
SchemaBaseBeamTable - Class in org.apache.beam.sdk.extensions.sql.meta
Each IO in Beam defines its one table schema by extending SchemaBaseBeamTable.
SchemaBaseBeamTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
 
SchemaCaseFormat - Annotation Type in org.apache.beam.sdk.schemas.annotations
When used on a POJO, Java Bean, or AutoValue class, the specified case format will be used for all the generated Schema fields.
schemaCoder(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a SchemaCoder instance for the provided element type.
schemaCoder(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a SchemaCoder instance for the provided element class.
schemaCoder(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a SchemaCoder instance for the Avro schema.
schemaCoder(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a SchemaCoder instance for the provided element type using the provided Avro schema.
schemaCoder(AvroCoder<T>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Returns a SchemaCoder instance based on the provided AvroCoder for the element type.
SchemaCoder<T> - Class in org.apache.beam.sdk.schemas
SchemaCoder is used as the coder for types that have schemas registered.
SchemaCoder(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Constructor for class org.apache.beam.sdk.schemas.SchemaCoder
 
SchemaCoderCloudObjectTranslator - Class in org.apache.beam.runners.dataflow.util
Translator for Schema coders.
SchemaCoderCloudObjectTranslator() - Constructor for class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
 
SchemaConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
 
SchemaCreate - Annotation Type in org.apache.beam.sdk.schemas.annotations
Can be put on a constructor or a static method, in which case that constructor or method will be used to create instances of the class by Beam's schema code.
SchemaFieldDescription - Annotation Type in org.apache.beam.sdk.schemas.annotations
When used on a POJO field, a Java Bean getter, or an AutoValue getter, the specified description is used for the generated schema field.
SchemaFieldName - Annotation Type in org.apache.beam.sdk.schemas.annotations
When used on a POJO field, a Java Bean getter, or an AutoValue getter, the specified name is used for the generated schema field.
SchemaFieldNumber - Annotation Type in org.apache.beam.sdk.schemas.annotations
When used on a POJO field, a Java Bean getter, or an AutoValue getter, the generated field will have the specified index.
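An illustrative POJO combining several of the annotations above (class and field names are assumptions, and DefaultSchema/JavaFieldSchema come from the same schemas packages):
  @DefaultSchema(JavaFieldSchema.class)
  public class Transaction {
    @SchemaFieldName("transaction_id")
    public final String id;

    @SchemaFieldDescription("Amount in minor currency units")
    public final long amountCents;

    @SchemaCreate
    public Transaction(String id, long amountCents) {
      this.id = id;
      this.amountCents = amountCents;
    }
  }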
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
 
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
 
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
 
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
 
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
 
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
 
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
 
schemaFor(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
Lookup a schema for the given type.
schemaFromClass(TypeDescriptor<?>, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
Infer a schema from a Java class.
schemaFromJavaBeanClass(TypeDescriptor<?>, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
Create a Schema for a Java Bean class.
schemaFromPojoClass(TypeDescriptor<?>, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
schemaFromProto(SchemaApi.Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
 
SchemaIgnore - Annotation Type in org.apache.beam.sdk.schemas.annotations
When used on a POJO field or a JavaBean getter, that field or getter is excluded from the inferred schema.
SchemaInformationProvider - Interface in org.apache.beam.sdk.schemas.utils
SchemaIO - Interface in org.apache.beam.sdk.schemas.io
An abstraction for creating schema-capable and schema-aware IOs.
SchemaIOProvider - Interface in org.apache.beam.sdk.schemas.io
Provider to create SchemaIO instances for use in Beam SQL and other SDKs.
SchemaIOTableProviderWrapper - Class in org.apache.beam.sdk.extensions.sql.meta.provider
A general TableProvider for IOs for consumption by Beam SQL.
SchemaIOTableProviderWrapper() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
 
SchemaLogicalType - Class in org.apache.beam.sdk.schemas.logicaltypes
A schema represented as serialized proto bytes.
SchemaLogicalType() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
 
schemaPathFromId(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
schemaPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
SchemaProvider - Interface in org.apache.beam.sdk.schemas
Concrete implementations of this class allow creation of schema service objects that vend a Schema for a specific type.
SchemaProviderRegistrar - Interface in org.apache.beam.sdk.schemas
Creators of a SchemaProvider can have their schemaProvider automatically registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface.
SchemaRegistry - Class in org.apache.beam.sdk.schemas
A SchemaRegistry allows registering Schemas for a given Java Class or a TypeDescriptor.
schemaToProto(Schema, boolean) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
 
schemaToProto(Schema, boolean, boolean) - Static method in class org.apache.beam.sdk.schemas.SchemaTranslation
 
schemaToProtoTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
SchemaTransform - Class in org.apache.beam.sdk.schemas.transforms
An abstraction representing schema-capable and schema-aware transforms.
SchemaTransform() - Constructor for class org.apache.beam.sdk.schemas.transforms.SchemaTransform
 
SchemaTransformConfiguration - Class in org.apache.beam.sdk.io.iceberg
 
SchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration
 
SchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.iceberg
 
SchemaTransformPayloadTranslator() - Constructor for class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
 
SchemaTransformProvider - Interface in org.apache.beam.sdk.schemas.transforms
Provider to create SchemaTransform instances for use in Beam SQL and other SDKs.
SchemaTransformTranslation - Class in org.apache.beam.sdk.schemas.transforms
A PTransformTranslation.TransformPayloadTranslator implementation that translates between a Java SchemaTransform and a protobuf payload for that transform.
SchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation
 
SchemaTransformTranslation.SchemaTransformPayloadTranslator<T extends SchemaTransform> - Class in org.apache.beam.sdk.schemas.transforms
 
SchemaTranslation - Class in org.apache.beam.sdk.schemas
Utility methods for translating schemas.
SchemaTranslation() - Constructor for class org.apache.beam.sdk.schemas.SchemaTranslation
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.AvroRecordSchema
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsSchemaProvider
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
 
schemaTypeCreator(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
new implementations should override GetterBasedSchemaProvider.schemaTypeCreator(TypeDescriptor, Schema) and make this method throw an UnsupportedOperationException
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
Delegates to GetterBasedSchemaProvider.schemaTypeCreator(Class, Schema) for backwards compatibility; override it if you want to use the richer type signature contained in the TypeDescriptor, which is not subject to type erasure.
schemaTypeCreator(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProviderV2
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
schemaTypeCreator(TypeDescriptor<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
 
SchemaUserTypeCreator - Interface in org.apache.beam.sdk.schemas
A creator interface for user types that have schemas.
SchemaUtil - Class in org.apache.beam.sdk.io.jdbc
Provides utility functions for working with Beam Schema types.
SchemaUtil() - Constructor for class org.apache.beam.sdk.io.jdbc.SchemaUtil
 
SchemaUtil.BeamRowMapper - Class in org.apache.beam.sdk.io.jdbc
A JdbcIO.RowMapper implementation that converts JDBC results into Beam Row objects.
SchemaUtils - Class in org.apache.beam.sdk.schemas
A set of utility functions for schemas.
SchemaUtils() - Constructor for class org.apache.beam.sdk.schemas.SchemaUtils
 
SchemaVerification - Class in org.apache.beam.sdk.values
 
SchemaVerification() - Constructor for class org.apache.beam.sdk.values.SchemaVerification
 
SchemaZipFold<T> - Class in org.apache.beam.sdk.schemas.utils
Visitor that zips schemas, and accepts pairs of fields and their types.
SchemaZipFold() - Constructor for class org.apache.beam.sdk.schemas.utils.SchemaZipFold
 
SchemaZipFold.Context - Class in org.apache.beam.sdk.schemas.utils
Context referring to a current position in a schema.
SCHEME - Static variable in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
scopedMetricsContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Set the MetricsContainer for the current thread.
SDF_PREFIX - Static variable in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
 
SdkCoreByteStringOutputStream() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream
 
sdkFields() - Method in class org.apache.beam.sdk.io.aws2.schemas.AwsBuilderFactory
 
SdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
A high-level client for an SDK harness.
SdkHarnessClient.BundleProcessor - Class in org.apache.beam.runners.fnexecution.control
A processor capable of creating bundles for some registered BeamFnApi.ProcessBundleDescriptor.
SdkHarnessClient.BundleProcessor.ActiveBundle - Class in org.apache.beam.runners.fnexecution.control
An active bundle for a particular BeamFnApi.ProcessBundleDescriptor.
SdkHarnessLogLevelOverrides() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
 
SdkHarnessOptions - Interface in org.apache.beam.sdk.options
Options that are used to control configuration of the SDK harness.
SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb - Class in org.apache.beam.sdk.options
The default implementation which detects how much memory to use for a process wide cache.
SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory - Class in org.apache.beam.sdk.options
A DefaultValueFactory which constructs an instance of the class specified by maxCacheMemoryUsageMbClass to compute the maximum amount of memory to allocate to the process wide cache within an SDK harness instance.
SdkHarnessOptions.LogLevel - Enum in org.apache.beam.sdk.options
The set of log levels that can be used in the SDK harness.
SdkHarnessOptions.MaxCacheMemoryUsageMb - Interface in org.apache.beam.sdk.options
Specifies the maximum amount of memory to use within the current SDK harness instance.
SdkHarnessOptions.SdkHarnessLogLevelOverrides - Class in org.apache.beam.sdk.options
Defines a log level override for a specific class, package, or name.
sdkHttpMetadata() - Static method in class org.apache.beam.sdk.io.aws.coders.AwsCoders
Returns a new coder for SdkHttpMetadata.
sdkHttpMetadataWithoutHeaders() - Static method in class org.apache.beam.sdk.io.aws.coders.AwsCoders
Returns a new coder for SdkHttpMetadata that does not serialize the response headers.
searchFhirResource(String, String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Searches a FHIR resource, returning the HTTP body of the response.
searchFhirResource(String, String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
searchResources(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Search resources from a Fhir store with String parameter values.
searchResourcesWithGenericParameters(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Search resources from a Fhir store with any type of parameter values.
secretManagerProjectId(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
Optional for Dataflow or VMs running on Google Cloud.
secretManagerProjectId() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
seekable(BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
Checks whether the BeamRelNode implements BeamSeekableTable.
seekableInputIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
seekRow(Row) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
Returns a list of Rows for the given key set.
Select - Class in org.apache.beam.sdk.schemas.transforms
A PTransform for selecting a subset of fields from a schema type.
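For example (field names are illustrative):
  PCollection<Row> purchases = ...;
  PCollection<Row> userAndAmount = purchases.apply(Select.fieldNames("userId", "amount"));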
Select() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select
 
select(Row) - Method in interface org.apache.beam.sdk.schemas.utils.RowSelector
 
select(Row) - Method in class org.apache.beam.sdk.schemas.utils.SelectHelpers.RowSelectorContainer
 
Select.Fields<T> - Class in org.apache.beam.sdk.schemas.transforms
 
Select.Flattened<T> - Class in org.apache.beam.sdk.schemas.transforms
A PTransform representing a flattened schema.
SelectHelpers - Class in org.apache.beam.sdk.schemas.utils
Helper methods to select subrows out of rows.
SelectHelpers() - Constructor for class org.apache.beam.sdk.schemas.utils.SelectHelpers
 
SelectHelpers.RowSelectorContainer - Class in org.apache.beam.sdk.schemas.utils
 
Semp - Class in org.apache.beam.sdk.io.solace.data
 
Semp() - Constructor for class org.apache.beam.sdk.io.solace.data.Semp
 
Semp.Queue - Class in org.apache.beam.sdk.io.solace.data
 
Semp.QueueData - Class in org.apache.beam.sdk.io.solace.data
 
SempClient - Interface in org.apache.beam.sdk.io.solace.broker
This interface defines methods for interacting with a Solace message broker using the Solace Element Management Protocol (SEMP).
SempClientFactory - Interface in org.apache.beam.sdk.io.solace.broker
This interface serves as a blueprint for creating SempClient objects, which are used to interact with a Solace message broker using the Solace Element Management Protocol (SEMP).
sendElements(BeamFnApi.Elements) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
 
SENDER_TIMESTAMP_FUNCTION - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO
 
sendOrCollectBufferedDataAndFinishOutboundStreams() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
Closes the streams for all registered outbound endpoints.
seqOf(T...) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
SequenceDefinition() - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
SequenceDefinition(Instant, Instant, Duration) - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
 
SequenceDefinition(Instant, Instant, Duration, boolean) - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
catchUpToNow is experimental; no backwards-compatibility guarantees.
SequenceRangeAccumulator - Class in org.apache.beam.sdk.extensions.ordered.combiner
Default accumulator used to combine sequence ranges.
SequenceRangeAccumulator() - Constructor for class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
SequenceRangeAccumulator.SequenceRangeAccumulatorCoder - Class in org.apache.beam.sdk.extensions.ordered.combiner
 
serde(T) - Static method in class org.apache.beam.runners.jet.Utils
Returns a deep clone of an object by serializing and deserializing it (ser-de).
SerializableBiConsumer<FirstInputT,SecondInputT> - Interface in org.apache.beam.sdk.transforms
A union of the BiConsumer and Serializable interfaces.
SerializableBiFunction<FirstInputT,SecondInputT,OutputT> - Interface in org.apache.beam.sdk.transforms
A union of the BiFunction and Serializable interfaces.
SerializableCoder<T extends java.io.Serializable> - Class in org.apache.beam.sdk.coders
A Coder for Java classes that implement Serializable.
SerializableCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.SerializableCoder
 
SerializableCoder.SerializableCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
A CoderProviderRegistrar which registers a CoderProvider which can handle serializable types.
SerializableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
 
SerializableComparator<T> - Interface in org.apache.beam.sdk.transforms
A Comparator that is also Serializable.
SerializableConfiguration - Class in org.apache.beam.sdk.io.hadoop
A wrapper to allow Hadoop Configurations to be serialized using Java's standard serialization mechanisms.
SerializableConfiguration() - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
SerializableConfiguration(Configuration) - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
SerializableFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.transforms
A function that computes an output value of type OutputT from an input value of type InputT, is Serializable, and does not allow checked exceptions to be declared.
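For example, a SerializableFunction can be passed to MapElements (the input PCollection is assumed):
  SerializableFunction<String, Integer> toLength = s -> s.length();
  PCollection<String> words = ...;
  PCollection<Integer> lengths =
      words.apply(MapElements.into(TypeDescriptors.integers()).via(toLength));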
SerializableFunctions - Class in org.apache.beam.sdk.transforms
Useful SerializableFunction overrides.
SerializableFunctions() - Constructor for class org.apache.beam.sdk.transforms.SerializableFunctions
 
SerializableIr - Class in org.apache.beam.sdk.extensions.sbe
A wrapper around Ir that fulfils Java's Serializable contract.
SerializableMatcher<T> - Interface in org.apache.beam.sdk.testing
A Matcher that is also Serializable.
SerializableMatchers - Class in org.apache.beam.sdk.testing
Static class for building and using SerializableMatcher instances.
SerializableRexFieldAccess - Class in org.apache.beam.sdk.extensions.sql.impl.utils
SerializableRexFieldAccess.
SerializableRexFieldAccess(RexFieldAccess) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexFieldAccess
 
SerializableRexInputRef - Class in org.apache.beam.sdk.extensions.sql.impl.utils
SerializableRexInputRef.
SerializableRexInputRef(RexInputRef) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexInputRef
 
SerializableRexNode - Class in org.apache.beam.sdk.extensions.sql.impl.utils
SerializableRexNode.
SerializableRexNode() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexNode
 
SerializableRexNode.Builder - Class in org.apache.beam.sdk.extensions.sql.impl.utils
SerializableRexNode.Builder.
SerializableSplit() - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
 
SerializableSplit(InputSplit) - Constructor for class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
 
serialize(String, Instant) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
serialize(AWSCredentialsProvider) - Static method in class org.apache.beam.sdk.io.kinesis.serde.AwsSerializableUtils
 
serialize(ValueProvider<?>, JsonGenerator, SerializerProvider) - Method in class org.apache.beam.sdk.options.ValueProvider.Serializer
 
serialize(Row) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
 
serializeAwsCredentialsProvider(AwsCredentialsProvider) - Static method in class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
 
SERIALIZED_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SERIALIZED_TEST_STREAM - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
serializeOneOf(Expression, List<Encoder<T>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
 
Serializer() - Constructor for class org.apache.beam.sdk.options.ValueProvider.Serializer
 
serializeTimers(Collection<TimerInternals.TimerData>, TimerInternals.TimerDataCoderV2) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
 
serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
 
serialVersionUID - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
 
serialVersionUID - Static variable in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
 
seriesId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
ServerConfiguration() - Constructor for class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
serverDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
Like OutboundObserverFactory.clientDirect() but for server-side RPCs.
ServerFactory - Class in org.apache.beam.sdk.fn.server
A gRPC server factory.
ServerFactory() - Constructor for class org.apache.beam.sdk.fn.server.ServerFactory
 
ServerFactory.InetSocketAddressServerFactory - Class in org.apache.beam.sdk.fn.server
Creates a gRPC Server using the default server factory.
ServerFactory.UrlFactory - Interface in org.apache.beam.sdk.fn.server
Factory that constructs client-accessible URLs from a local server address and port.
ServerInfo() - Constructor for class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.ServerInfo
 
SESSION_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
 
sessionBuilder(String) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
Creates a Spark session builder with some optimizations for local mode, e.g.
Sessions - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows values into sessions separated by periods with no input for at least the duration specified by Sessions.getGapDuration().
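A minimal sketch of session windowing, assuming a hypothetical keyed PCollection named clicks and the usual Window and Duration imports:
    PCollection<KV<String, Long>> sessioned =
        clicks.apply(Window.into(Sessions.withGapDuration(Duration.standardMinutes(10))));
    // Elements for the same key arriving within 10 minutes of each other share a session window.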
SessionService - Class in org.apache.beam.sdk.io.solace.broker
The SessionService class provides a set of methods for managing a session with the Solace messaging system.
SessionService() - Constructor for class org.apache.beam.sdk.io.solace.broker.SessionService
 
SessionServiceFactory - Class in org.apache.beam.sdk.io.solace.broker
This abstract class serves as a blueprint for creating `SessionServiceFactory` objects.
SessionServiceFactory() - Constructor for class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
 
set(long) - Method in class org.apache.beam.runners.jet.metrics.GaugeImpl
 
set(long) - Method in interface org.apache.beam.sdk.metrics.Gauge
Set current value for this gauge.
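A minimal sketch, assuming a gauge field declared on a DoFn via Metrics.gauge (the class and metric names below are hypothetical):
    private final Gauge queueDepth = Metrics.gauge(MyDoFn.class, "queueDepth");

    @ProcessElement
    public void process(@Element Long depth) {
      queueDepth.set(depth); // records the latest reported value for this gauge
    }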
set(ObjectT, ValueT) - Method in interface org.apache.beam.sdk.schemas.FieldValueSetter
Sets the specified field on object to value.
set() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a SetState, optimized for checking membership.
set(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.set(), but with an element coder explicitly supplied.
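A minimal sketch of a membership check in a stateful DoFn over KV elements (state id and types are hypothetical), declaring the SetState with StateSpecs.set(Coder):
    @StateId("seen")
    private final StateSpec<SetState<String>> seenSpec = StateSpecs.set(StringUtf8Coder.of());

    @ProcessElement
    public void process(@Element KV<String, String> e, @StateId("seen") SetState<String> seen) {
      if (!seen.contains(e.getValue()).read()) { // contains() returns a ReadableState<Boolean>
        seen.add(e.getValue());
      }
    }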
set(Instant) - Method in interface org.apache.beam.sdk.state.Timer
Sets or resets the time in the timer's TimeDomain at which it should fire.
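A minimal sketch of setting an event-time timer from a DoFn (the timer id and delay are hypothetical):
    @TimerId("expiry")
    private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

    @ProcessElement
    public void process(ProcessContext ctx, @TimerId("expiry") Timer expiry) {
      // Fire one minute after the current element's event timestamp.
      expiry.set(ctx.timestamp().plus(Duration.standardMinutes(1)));
    }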
set(String, Instant) - Method in interface org.apache.beam.sdk.state.TimerMap
 
set(long...) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
setAccessKey(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setAccountName(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setActiveWorkRefreshPeriodMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setAllowDuplicates(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setAllowNonRestoredState(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setAlsoStartLoopbackWorker(boolean) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
setApiRootUrl(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setAppend(Boolean) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
 
setAppName(String) - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
 
setArtifactPort(int) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
setAttachedMode(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setAttachmentBytes(byte[]) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setAttributeId(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setAttributeId(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setAttributeMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setAttributesMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setAttributesMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setAuthenticator(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setAutoBalanceWriteFilesShardingEnabled(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setAutoCommit(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setAutoOffsetResetConfig(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setAutoscalingAlgorithm(DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setAutoSharding(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setAutosharding(Boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setAutoWatermarkInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setAveragePartitionBytesSize(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Sets the average partition size, in bytes, used to estimate the backlog of this DoFn.
setAwsCredentialsProvider(AWSCredentialsProvider) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
setAwsCredentialsProvider(AwsCredentialsProvider) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
 
setAwsRegion(String) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
setAwsRegion(Region) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
 
setAwsServiceEndpoint(String) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
setAzureConnectionString(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setAzureCredentialsProvider(TokenCredential) - Method in interface org.apache.beam.sdk.io.azure.options.AzureOptions
 
setBatching(Boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setBatching(Boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setBatchIntervalMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setBatchRequestSupplier(Supplier<GcsUtil.BatchInterface>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
 
setBatchSize(Integer) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
 
setBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
setBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setBeamVersion(String) - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
Specifies the Beam version to get containers for the transform service.
setBigQueryEndpoint(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setBigQueryLocation(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
setBigQueryProject(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setBigQueryServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
 
setBigQueryServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform
 
setBigtableChangeStreamInstanceId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.BigtableChangeStreamTestOptions
 
setBlobServiceEndpoint(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setBlobstoreClientFactoryClass(Class<? extends BlobstoreClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setBlockOnRun(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setBlockOnRun(boolean) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
Sets the bootstrap servers for the Kafka consumer.
setBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setBqStreamingApiLoggingFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setBranch(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setBucketKeyEnabled(boolean) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setBucketKeyEnabled(boolean) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setBucketKeyEnabled(boolean) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setBucketKeyEnabled(boolean) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setBundleSize(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setCacheDisabled(boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setCallable(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
 
setCaseSensitive(Boolean) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setCatalog(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setCatalogConfig(IcebergCatalogConfig) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setCatalogName(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
 
setCatalogName(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setCatalogName(String) - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration.Builder
 
setCatalogProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
 
setCatalogProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setCatalogProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration.Builder
 
setChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setChannelzShowOnlyWindmillServiceChannels(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setCharset(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
The charset used to write the file.
setCheckpoint(Long) - Method in interface org.apache.beam.sdk.io.sparkreceiver.HasOffset
Some Receivers support a checkpoint mechanism (e.g.
setCheckpointDir(String) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
setCheckpointDurationMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setCheckpointingInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setCheckpointingInterval(Long) - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
 
setCheckpointingMode(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setCheckpointTimeoutMillis(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setChecksum(String) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setClientBuilderFactory(Class<? extends ClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
 
setClientConfiguration(ClientConfiguration) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
setClientFactory(PubsubTestClient.PubsubTestClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setClientInfo(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setClientInfo(Properties) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setClock(Clock) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setClock(PubsubIO.Read<T>, Clock) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient.PubsubTestClientFactory
 
setCloningBehavior(DoFnTester.CloningBehavior) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
setClusterName(String) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
setClusterType(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setCodeJarPathname(String) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
SetCoder<T> - Class in org.apache.beam.sdk.coders
A SetCoder encodes any Set using the format of IterableLikeCoder.
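A minimal sketch, assuming string elements:
    // Coder for Set<String> values, built from the element coder.
    Coder<Set<String>> setCoder = SetCoder.of(StringUtf8Coder.of());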
SetCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.SetCoder
 
setCoder(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
 
setCoder(Coder<T>) - Method in class org.apache.beam.sdk.values.PCollection
Sets the Coder used by this PCollection to encode and decode the values stored in it.
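A minimal sketch, assuming a Pipeline p and a Create transform:
    PCollection<String> words =
        p.apply(Create.of("a", "b", "c")).setCoder(StringUtf8Coder.of());
    // Fixes the coder explicitly instead of relying on coder inference.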
setCoderRegistry(CoderRegistry) - Method in class org.apache.beam.sdk.Pipeline
Deprecated.
this should never be used - every Pipeline has a registry throughout its lifetime.
setColumnDelimiter(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setColumnDelimiter(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
setColumnDelimiter(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setColumns(SnowflakeColumn[]) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
 
setCommitOffsetInFinalize(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setCompression(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
setCompression(Compression) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setCompressionCodecName(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
Specifies the compression codec.
setConfig(byte[]) - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
 
setConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergCatalogConfig.Builder
 
setConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration.Builder
 
setConfluentSchemaRegistrySubject(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setConfluentSchemaRegistryUrl(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setConnectionInitSql(List<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setConnectionInitSql(List<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setConnectionProperties(List<String>) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
 
setConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setConsumerConfig(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setConsumerConfigUpdates(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setConsumerPollingTimeout(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setContentType(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
 
setContext(BatchContextImpl) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setCosmosClientBuilder(CosmosClientBuilder) - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
 
setCosmosKey(String) - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
 
setCosmosServiceEndpoint(String) - Method in interface org.apache.beam.sdk.io.azure.cosmos.CosmosOptions
 
setCountBackoffs(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCacheReadFailures(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCacheReadNonNulls(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCacheReadNulls(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCacheReadRequests(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCacheWriteFailures(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCacheWriteRequests(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCacheWriteSuccesses(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountCalls(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountFailures(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountRequests(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountResponses(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountryOfResidence(String) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
setCountSetup(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountShouldBackoff(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountSleeps(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCountTeardown(Boolean) - Method in class org.apache.beam.io.requestresponse.Monitoring.Builder
 
setCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition was created.
setCreateDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setCreateDisposition(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
 
setCreateFromSnapshot(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setCredentialFactoryClass(Class<? extends CredentialFactory>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setCrossProduct(Boolean) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
 
setCsvConfiguration(FileWriteSchemaTransformConfiguration.CsvConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
Configures extra details related to writing CSV formatted files.
setCurrentBundleTimestamp(Instant) - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
setCurrentContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Set the MetricsContainer for the current thread.
setCurrentSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Set the current (default) schema.
setCurrentTransform(AppliedPTransform<?, ?, ?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
setCustomBeamRequirement(String) - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
Set a custom Beam version for the bootstrap Beam venv.
setCustomerId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
setCustomerProvidedKey(CustomerProvidedKey) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setCustomErrors(CustomHttpErrors) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
setDatabase(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
setDatabase(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setDatabase(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setDataCatalogEndpoint(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
 
setDataflowClient(Dataflow) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDataflowEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDataflowEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setDataflowJobFile(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDataflowKmsKey(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setDataflowServiceOptions(List<String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setDataflowWorkerJar(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setDataSchema(byte[]) - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
 
setDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
 
setDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
 
setDataType(SnowflakeDataType) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
setDebeziumConnectionProperties(List<String>) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
setDeduplicate(Deduplicate.KeyedValues<Uuid, SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
Set the deduplication transform.
setDefaultEnvironmentConfig(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setDefaultEnvironmentType(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setDefaultPipelineOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
Sets the default configuration in workers.
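A minimal sketch, assuming default options are sufficient for the worker process:
    // Installs the given options as the process-wide default used when configuring FileSystems.
    FileSystems.setDefaultPipelineOptions(PipelineOptionsFactory.create());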
setDefaultPipelineOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.metrics.Metrics
Initializes metrics flags if not already done.
setDefaultSdkHarnessLogLevel(SdkHarnessOptions.LogLevel) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setDefaultTimezone(String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
 
setDeidentifyConfig(DeidentifyConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setDeidentifyTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setDelimiter(String) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
 
setDelimiters(byte[]) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setDescription(String) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setDescription(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
setDescription(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
 
setDesiredNumUnboundedSourceSplits(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDestination(Solace.Destination) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setDialect(String) - Method in class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
 
setDirectoryTreatment(FileIO.ReadMatches.DirectoryTreatment) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setDisableAutoCommit(Boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setDisableMetrics(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setDiskSizeGb(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setDisplayData(List<DisplayData.ItemSpec<?>>) - Method in class org.apache.beam.sdk.transforms.PTransform
Set display data for your PTransform.
setDriverClassName(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setDriverClassName(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setDriverJars(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setDriverJars(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setDrop(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setDrop(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setDrop(List<String>) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
 
setDumpHeapOnOOM(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDuplicateCount(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setEarliestBufferedSequence(Long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setElasticsearchHttpPort(Integer) - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
 
setElasticsearchServer(String) - Method in interface org.apache.beam.sdk.io.elasticsearch.ElasticsearchIOITCommon.ElasticsearchPipelineOptions
 
setElements(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
 
setElementsPerPeriod(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
 
setElementType(FieldValueTypeInformation) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setEmulatorHost(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
Define a host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
setEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setEnableBucketReadMetricCounter(Boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setEnableBucketWriteMetricCounter(Boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setEnableHeapDumps(boolean) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setEnableSparkMetricSinks(Boolean) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
setEnableStableInputDrain(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setEnableStorageReadApiV2(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setEnableStreamingEngine(boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setEnableWebUI(Boolean) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
setEncodedRecord(byte[]) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
 
setEncodingPositions(Map<String, Integer>) - Method in class org.apache.beam.sdk.schemas.Schema
Sets the encoding positions for this schema.
setEnd(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
 
setEndAtTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setEndpoint(URI) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
 
setEndTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
setEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the end time of the partition.
setEnforceEncodability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setEnforceImmutability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setEnvironmentCacheMillis(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setEnvironmentExpirationMillis(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setEnvironmentOptions(List<String>) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setError(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
 
setError(String) - Method in class org.apache.beam.sdk.schemas.io.Failure.Builder
 
setErrorField(String) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
Adds the error message to the returned error Row.
setErrorHandling(BigQueryWriteConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setErrorHandling(PubsubReadSchemaTransformConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setErrorHandling(PubsubWriteSchemaTransformConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
 
setException(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
 
setExceptionStacktrace(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Failure.Builder
 
setExecutionModeForBatch(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setExecutionRetryDelay(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setExecutorService(ExecutorService) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
Deprecated.
use ExecutorOptions.setScheduledExecutorService(java.util.concurrent.ScheduledExecutorService) instead. If set, it may result in multiple ExecutorServices, and therefore thread pools, in the runtime.
setExpansionPort(int) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
setExpansionServiceConfig(ExpansionServiceConfig) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
setExpansionServiceConfigFile(String) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
setExpectedAssertions(Integer) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
setExpectFileToNotExist(boolean) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
 
setExpectFileToNotExist(Boolean) - Method in class org.apache.beam.sdk.io.fs.CreateOptions.Builder
 
setExperiments(List<String>) - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
 
setExpiration(long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setExpression(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
 
setExternalizedCheckpointsEnabled(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setExtraInteger(Integer) - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
 
setExtraString(String) - Method in class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider.Config.Builder
 
setFailOnCheckpointingErrors(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFailsafeTableRowPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
setFailToLock(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
setFailure(BadRecord.Failure) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
 
setFasterCopy(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFeatures(AnnotateTextRequest.Features) - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
 
setFetchSize(Short) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setField(Field) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setFieldId(Integer) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
 
setFieldName(String) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
 
setFieldRename(String) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
 
setFields(List<String>) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaExplodeTransformProvider.Configuration.Builder
 
setFields(Map<String, JavaRowUdf.Configuration>) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
 
setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setFileFormat(FileFormat) - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
 
setFileInputSplitMaxSizeMB(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFilenamePrefix(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
A common prefix to use for all generated filenames.
setFilenameSuffix(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
Configures the filename suffix for written files.
setFilepattern(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
 
setFilePattern(String) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setFilesToStage(List<String>) - Method in interface org.apache.beam.sdk.options.FileStagingOptions
 
setFileSystem(FileSystem) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
setFilter(Expression) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setFinishBundleBeforeCheckpointing(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFinishedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition finished running.
setFirestoreDb(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
Set the Firestore database ID to connect to.
setFirestoreHost(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
Define a host port pair to allow connecting to a Cloud Firestore instead of the default live service.
setFirestoreProject(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
Set the Firestore project ID; it overrides the value from GcpOptions.getProject().
setFlexRSGoal(DataflowPipelineOptions.FlexResourceSchedulingGoal) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setFlinkConfDir(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFlinkMaster(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setForceStreaming(boolean) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
setForceUnalignedCheckpointEnabled(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFormat(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
The format of the file content.
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setFormatClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setFormatProviderClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setFromSnapshotExclusive(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setFromSnapshotInclusive(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setFromSnapshotRefExclusive(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setFromSnapshotRefInclusive(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setGcpCredential(Credentials) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setGcpOauthScopes(List<String>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setGcpTempLocation(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setGcsEndpoint(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsHttpRequestReadTimeout(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsHttpRequestWriteTimeout(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsPerformanceMetrics(Boolean) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsReadCounterPrefix(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsRewriteDataOpBatchLimit(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsUploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsUploadBufferSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
setGcsUtil(GcsUtil) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsWriteCounterPrefix(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGCThrashingPercentagePerPeriod(Double) - Method in interface org.apache.beam.sdk.options.MemoryMonitorOptions
 
setGetOffsetFn(SerializableFunction<V, Long>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setGetReceiverArgsFromConfigFn(SerializableFunction<PluginConfig, Object[]>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setGlobalConfigRefreshPeriod(Duration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setGoogleAdsClientId(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
 
setGoogleAdsClientSecret(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
 
setGoogleAdsCredential(Credentials) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
 
setGoogleAdsCredentialFactoryClass(Class<? extends CredentialFactory>) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
 
setGoogleAdsDeveloperToken(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
 
setGoogleAdsEndpoint(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
 
setGoogleAdsRefreshToken(String) - Method in interface org.apache.beam.sdk.io.googleads.GoogleAdsOptions
 
setGoogleApiTrace(GoogleApiDebugOptions.GoogleApiTracer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
 
setGroupingTableMaxSizeMb(int) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setHdfsConfiguration(List<Configuration>) - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
 
setHeaderColumns(PCollectionView<List<String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setHeaderColumns(PCollectionView<List<String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
setHeaderColumns(PCollectionView<List<String>>) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setHeartbeatMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the heartbeat interval in millis.
setHoldability(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setHooks(DataflowRunnerHooks) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
Sets callbacks to invoke during execution; see DataflowRunnerHooks.
setHost(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
setHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setHotKeyLoggingEnabled(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setHttpClient(HttpClient) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setHttpClientConfiguration(HttpClientConfiguration) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
 
setHttpHeaders(Map<String, String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
setHttpPipeline(HttpPipeline) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setHTTPReadTimeout(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setHTTPWriteTimeout(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setHumanReadableJsonRecord(String) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Record.Builder
 
setId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
setId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
setIdleShutdownTimeout(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
setImpersonateServiceAccount(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setInferMaps(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
 
setInitialPositionInStream(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setInitialTimestampInStream(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setInput(Input) - Method in class org.apache.beam.sdk.io.cdap.context.BatchSourceContextImpl
Overrides the input configuration of this Batch job to the specified Input.
setInputFile(String) - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
 
setInsertBundleParallelism(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setInspectConfig(InspectConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setInspectConfig(InspectConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
setInspectConfig(InspectConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setInspectTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setInspectTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
setInspectTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setIsBoundedInternal(PCollection.IsBounded) - Method in class org.apache.beam.sdk.values.PCollection
For internal use only; no backwards-compatibility guarantees.
setIsReadSeekEfficient(boolean) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setIsWindmillServiceDirectPathEnabled(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setJavaAggregateFunctions(ImmutableMap<List<String>, Combine.CombineFn<?, ?, ?>>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
 
setJavaClassLookupAllowlist(JavaClassLookupTransformProvider.AllowList) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
setJavaClassLookupAllowlistFile(String) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
setJavaScalarFunctions(ImmutableMap<List<String>, UserFunctionDefinitions.JavaScalarFunction>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
 
setJdbcType(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setJdbcType(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setJdbcUrl(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setJdbcUrl(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setJdkAddOpenModules(List<String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setJdkAddOpenModules(List<String>) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setJetDefaultParallelism(Integer) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
setJetLocalMode(Integer) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
setJetProcessorsCooperative(Boolean) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
setJetServers(String) - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
 
setJfrRecordingDurationSec(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setJobCheckIntervalInSecs(int) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setJobEndpoint(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setJobFileZip(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setJobId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
setJobLabelsMap(Map<String, String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setJobName(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setJobServerConfig(String...) - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
 
setJobServerDriver(Class<JobServerDriver>) - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
 
setJobServerTimeout(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setJobType(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setJsonToRowWithErrFn(JsonToRow.JsonToRowWithErrFn) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError.Builder
 
setKeep(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setKeep(List<String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setKeep(JavaRowUdf.Configuration) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
 
setKeyDeserializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setKeySerializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
 
setKinesisIOConsumerArns(Map<String, String>) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisIOOptions
 
setKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setLabels(Map<String, String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setLanguage(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaFilterTransformProvider.Configuration.Builder
 
setLanguage(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaMapToFieldsTransformProvider.Configuration.Builder
 
setLanguageHint(String) - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText.Builder
 
setLastEventReceived(boolean) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setLastModifiedMillis(long) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setLastProcessedSequence(Long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setLatencyNanos(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
 
setLatencyTrackingInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setLatestBufferedSequence(Long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setLength(Long) - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
setLevel(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
 
setLineField(String) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
Sets the field name for the line field in the returned Row.
setListeners(List<JavaStreamingListener>) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
setLoadBalanceBundles(boolean) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setLocalJobServicePortFile(String) - Method in interface org.apache.beam.runners.portability.testing.TestUniversalRunner.Options
 
setLocalWindmillHostport(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setLocation(String) - Method in class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
 
setLocation(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setLocation(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setLocation(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setLocation(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setLoginTimeout(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setLogMdc(boolean) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setManifestListLocation(String) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setMapKeyType(FieldValueTypeInformation) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setMapping(Contextful<Contextful.Fn<T, Long>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
 
setMapping(Contextful<Contextful.Fn<KV<K, V>, KV<K, Long>>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
 
setMapValueType(FieldValueTypeInformation) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setMaxBufferingDurationMilliSec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMaxBundlesFromWindmillOutstanding(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setMaxBundleSize(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setMaxBundleTimeMills(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setMaxBytesFromWindmillOutstanding(long) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setMaxCacheMemoryUsageMb(int) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setMaxCacheMemoryUsageMbClass(Class<? extends SdkHarnessOptions.MaxCacheMemoryUsageMb>) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setMaxCacheMemoryUsagePercent(float) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setMaxCapacityPerShard(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setMaxConnectionPoolConnections(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMaxNumberOfRecords(Long) - Method in class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
 
setMaxNumRecords(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setMaxNumRecords(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setMaxNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setMaxOutputElementsPerBundle(int) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Overrides the default maximum number of output elements per bundle.
setMaxParallelism(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setMaxReadTime(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
 
setMaxReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setMaxReadTime(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setMaxReadTimeSeconds(Integer) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setMaxRecordsPerBatch(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setMaxStackTraceDepthToReport(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setMaxStreamingBatchSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMaxStreamingRowsToBatch(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMemoryMB(int) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
Sets the size of the memory buffer in megabytes.
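A hedged sketch of configuring the sorter through these setters (the buffer size, scratch path, and sorter type are illustrative values, and the chaining assumes each setter returns the Options instance):
```java
import org.apache.beam.sdk.extensions.sorter.ExternalSorter;

// Assumed: Options exposes a no-arg constructor and chainable setters as listed in this index.
ExternalSorter.Options sorterOptions =
    new ExternalSorter.Options()
        .setMemoryMB(256)                   // in-memory buffer before spilling to disk
        .setTempLocation("/tmp/beam-sort")  // hypothetical scratch directory
        .setSorterType(ExternalSorter.Options.SorterType.NATIVE);
```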
setMessageId(Long) - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
 
setMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
 
setMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
 
setMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setMessageName(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions.Builder
 
setMessageName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setMessageName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setMessageName(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setMessageName(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setMessageRecord(Object) - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
 
setMetadataTable(String) - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
Specifies the name of the metadata table.
setMethod(Method) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setMetricsGraphiteHost(String) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
setMetricsGraphitePort(Integer) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
setMetricsHttpSinkUrl(String) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
setMetricsPushPeriod(Long) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
setMetricsSink(Class<? extends MetricsSink>) - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
 
setMetricsSupported(boolean) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Called by the runner to indicate whether metrics reporting is supported.
setMimeType(String) - Method in class org.apache.beam.sdk.io.fs.CreateOptions.Builder
 
setMinConnectionPoolConnections(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMinCpuPlatform(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setMinPauseBetweenCheckpoints(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setMinReadTimeMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setName(String) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
setName(String) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
setName(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
 
setName(String) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setName(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
setName(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
 
setName(String) - Method in class org.apache.beam.sdk.values.PCollection
Sets the name of this PCollection.
setName(String) - Method in class org.apache.beam.sdk.values.PValueBase
Sets the name of this PValueBase.
setNetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setNetworkTimeout(Executor, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setNullable(boolean) - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
setNullable(boolean) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setNumber(Integer) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setNumberOfBufferedEvents(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setNumberOfExecutionRetries(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setNumberOfReceivedEvents(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setNumberOfWorkerHarnessThreads(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setNumConcurrentCheckpoints(int) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setNumFailuresExpected(int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
setNumSampledBytesPerFile(long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setNumShards(Integer) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
The number of output shards produced; a value of 1 disables sharding.
setNumStorageWriteApiStreamAppendClients(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setNumStorageWriteApiStreams(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setNumStreamingKeys(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setNumStreams(Integer) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setOAuthToken(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setOauthToken(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setObjectReuse(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setOnCreateMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setOneOfTypes(Map<String, FieldValueTypeInformation>) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setOnly(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setOnly(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setOnSuccessMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setOperation(String) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setOperatorChaining(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
SetOperatorFilteringDoFn(String, String, BeamSetOperatorRelBase.OpType, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.SetOperatorFilteringDoFn
 
setOption(String, Row) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
 
setOption(String, Schema.FieldType, Object) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
 
setOption(String, Schema.FieldType, Object) - Static method in class org.apache.beam.sdk.schemas.Schema.Options
 
setOption(String, Row) - Static method in class org.apache.beam.sdk.schemas.Schema.Options
 
setOptions(ImmutableMap<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
Sets the given options on the schema being built.
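To make the relationship between Schema.Options.Builder.setOption and Schema.Builder.setOptions concrete, here is a minimal sketch (the option name, field names, and values are illustrative assumptions):
```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.schemas.Schema.FieldType;

// Build a set of schema options and attach it to a schema under construction.
Schema.Options options =
    Schema.Options.builder()
        .setOption("description", FieldType.STRING, "user records") // illustrative option
        .build();

Schema schema =
    Schema.builder()
        .addStringField("id")
        .addInt64Field("createdAtMillis")
        .setOptions(options)
        .build();
```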
setOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
setOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
setOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
setOptionsId(long) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setOutput(String) - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
 
setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.ErrorHandling.Builder
 
setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
 
setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
 
setOutput(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.ErrorHandling.Builder
 
setOutputDataSet(PCollection<T>, TSet<WindowedValue<T>>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
setOutputExecutablePath(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setOutputFilePrefix(String) - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
 
setOutputParallelization(Boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setOutputParallelization(Boolean) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
 
setOverrideWindmillBinary(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setParallelism(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setParallelism(int) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setParam(String, Object) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
Sets a single parameter of the Plugin.
setParameters(T, PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.PreparedStatementSetter
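A hedged sketch of implementing this setter as a lambda on a JdbcIO write (the table, columns, and element type are assumptions; the data source configuration is elided):
```java
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.KV;

// Writes KV<Integer, String> elements; the SQL statement and column types are illustrative.
JdbcIO.Write<KV<Integer, String>> write =
    JdbcIO.<KV<Integer, String>>write()
        .withStatement("INSERT INTO users (id, name) VALUES (?, ?)")
        .withPreparedStatementSetter(
            (element, statement) -> {
              statement.setInt(1, element.getKey());
              statement.setString(2, element.getValue());
            });
// A real pipeline would also call withDataSourceConfiguration(...) before applying the write.
```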
 
setParameters(PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.StatementPreparator
 
setParameters(KV<PartitionT, PartitionT>, PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcReadWithPartitionsHelper
 
setParameters(PreparedStatement) - Method in interface org.apache.beam.sdk.io.singlestore.SingleStoreIO.StatementPreparator
 
setParentId(Long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setParentTokens(HashSet<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the collection of parent partition identifiers.
setParquetConfiguration(FileWriteSchemaTransformConfiguration.ParquetConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
Configures extra details related to writing Parquet formatted files.
setPartitionKey(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
 
setPartitionSpec(PartitionSpec) - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
 
setPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the unique partition identifier.
setPassword(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
setPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setPassword(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setPassword(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setPath(String) - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration.Builder
 
setPath(String) - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration.Builder
 
setPath(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration.Builder
 
setPathValidator(PathValidator) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setPathValidatorClass(Class<? extends PathValidator>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
setPayload(byte[]) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setPayload(byte[]) - Method in class org.apache.beam.sdk.schemas.io.Failure.Builder
 
setPeriod(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
 
setPeriodicStatusPageOutputDirectory(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setPerWorkerMetricsUpdateReportingPeriodMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setPipelineOption(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
setPipelineOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
 
setPipelineOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
 
setPipelineOptionsMap(Map<String, String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
Only called from the BeamCalciteSchema.
setPipelinePolicy(HttpPipelinePolicy) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setPipelineUrl(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setPlannerName(String) - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
 
setPluginClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setPluginType(PluginConstants.PluginType) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setPollIntervalMillis(Long) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
 
setPort(Integer) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
setPort(int) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
 
setPortNumber(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setPrecision(int) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally.Builder
 
setPrecision(Integer) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey.Builder
 
setPrecision(int) - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
setPredefinedCsvFormat(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.CsvConfiguration.Builder
The CSVFormat.Predefined name of the written CSV file.
setPrefix(String) - Method in class org.apache.beam.sdk.schemas.transforms.providers.LoggingTransformProvider.Configuration.Builder
 
setPrimaryKey(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setPrimaryKey(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
setPriority(int) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setPrismLocation(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
setPrismLogLevel(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
setPrismVersionOverride(String) - Method in interface org.apache.beam.runners.prism.PrismPipelineOptions
 
setPrivateKeyPassphrase(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setPrivateKeyPassphrase(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setPrivateKeyPath(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setPrivateKeyPath(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setProcessWideContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Set the MetricsContainer for the current process.
setProducerConfig(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
 
setProducerConfigUpdates(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setProducerProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
 
setProduceStatusUpdateOnEveryEvent(boolean) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Sets whether a status notification should be produced for every event.
setProfilingAgentConfiguration(DataflowProfilingOptions.DataflowProfilingAgentConfiguration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
setProject(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setProject(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setProject(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setProject(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setProvidedSparkContext(JavaSparkContext) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
setProviderRuntimeValues(ValueProvider<Map<String, Object>>) - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
 
setProxyConfiguration(ProxyConfiguration) - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
 
setPublished(Boolean) - Method in class org.apache.beam.sdk.io.solace.data.Solace.PublishResult.Builder
 
setPublishMonotonicNanos(long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.CorrelationKey.Builder
 
setPubsubRootUrl(String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
 
setQualifiers(List<FieldAccessDescriptor.FieldDescriptor.Qualifier>) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Builder
 
setQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
 
setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
Configures the BigQuery read job with the SQL query.
setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setQuery(String) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
 
setQuery(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setQuery(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
BigQuery geographic location where the query job will be executed.
setQueryPlannerClassName(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
 
setQueryString(String) - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
 
setQueue(Queue) - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
This method is called in the SolaceIO.Read.expand(org.apache.beam.sdk.values.PBegin) method to set the Queue reference.
setRamMegaBytes(int) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setRate(GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
 
setRateLimit(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setRawPrivateKey(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setRawPrivateKey(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setRawType(Class<?>) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setReaderCacheTimeoutSec(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setReadOnly(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setReadQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setReadTimeout(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
setReadTimePercentage(Double) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setReadTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setReceiverClass(Class<? extends Receiver<V>>) - Method in class org.apache.beam.sdk.io.cdap.Plugin.Builder
 
setReceiveTimestamp(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setRecord(BadRecord.Record) - Method in class org.apache.beam.sdk.transforms.errorhandling.BadRecord.Builder
 
setRecordJfrOnGcThrashing(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setRedelivered(boolean) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setRedistribute(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setRedistributeNumKeys(Integer) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setRegion(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setReidentifyConfig(DeidentifyConfig) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setReidentifyTemplateName(String) - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText.Builder
 
setReIterableGroupByKeyResult(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setRelative() - Method in interface org.apache.beam.sdk.state.Timer
Sets the timer relative to the current time, according to any offset and alignment specified.
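As a hedged illustration of how setRelative is typically used from a DoFn (the timer id "flush", the processing-time domain, and the one-minute offset are arbitrary choices for this sketch):
```java
import org.apache.beam.sdk.state.TimeDomain;
import org.apache.beam.sdk.state.Timer;
import org.apache.beam.sdk.state.TimerSpec;
import org.apache.beam.sdk.state.TimerSpecs;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;
import org.joda.time.Duration;

// Arms a per-key "flush" timer one minute after each element arrives.
class FlushFn extends DoFn<KV<String, String>, String> {

  @TimerId("flush")
  private final TimerSpec flushSpec = TimerSpecs.timer(TimeDomain.PROCESSING_TIME);

  @ProcessElement
  public void process(@Element KV<String, String> element, @TimerId("flush") Timer flushTimer) {
    // offset(...) records the offset; setRelative() arms the timer relative to the current time.
    flushTimer.offset(Duration.standardMinutes(1)).setRelative();
  }

  @OnTimer("flush")
  public void onFlush(OutputReceiver<String> out) {
    out.output("flushed");
  }
}
```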
setReplicationGroupMessageId(String) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setReplyTo(Solace.Destination) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setReportCheckpointDuration(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setRequestRecordsLimit(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setResourceHints(ResourceHints) - Method in class org.apache.beam.sdk.transforms.PTransform
Sets resource hints for the transform.
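A small, hedged sketch of attaching hints to a single transform (the step name, HeavyDoFn, the input PCollection, and the "6 GiB" value are assumptions for illustration):
```java
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.resourcehints.ResourceHints;
import org.apache.beam.sdk.values.PCollection;

// Hint that this step needs more memory than the default worker allocation.
PCollection<String> output =
    input.apply(
        "HeavyStep",
        ParDo.of(new HeavyDoFn()) // HeavyDoFn is an assumed DoFn<String, String>
            .setResourceHints(ResourceHints.create().withMinRam("6 GiB")));
```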
setResourceHints(List<String>) - Method in interface org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions
 
setResourceId(ResourceId) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setResultCount(long) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setRetainDockerContainers(boolean) - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
 
setRetainExternalizedCheckpointsOnCancellation(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setRole(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setRole(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setRootElement(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration.Builder
Sets the enclosing root element for the generated XML files.
setRowGroupSize(Integer) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.ParquetConfiguration.Builder
Specify row-group size; if not set or zero, a default is used by the underlying writer.
setRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setRowSchema(Schema) - Method in class org.apache.beam.sdk.values.PCollection
Sets a schema on this PCollection.
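A brief, hedged sketch of declaring a row schema on a PCollection of Rows (the field names and the upstream rawUsers collection are assumptions):
```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Declare the row schema so downstream schema-aware transforms (SQL, Select, ...) can use it.
Schema userSchema =
    Schema.builder().addStringField("name").addInt32Field("age").build();

PCollection<Row> users =
    rawUsers.setRowSchema(userSchema); // rawUsers is an assumed PCollection<Row> without a schema
```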
setRuleSets(Collection<RuleSet>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
Sets the rule set used by the query optimizer.
setRunner(Class<? extends PipelineRunner<?>>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setRunnerDeterminedSharding(boolean) - Method in interface org.apache.beam.runners.direct.DirectTestOptions
 
setRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition started running.
Sets - Class in org.apache.beam.sdk.transforms
The PTransforms for computing set operations (such as union, intersection, and difference) across PCollections.
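A hedged sketch of two of these operations (left and right are assumed PCollection<String> inputs with compatible coders):
```java
import org.apache.beam.sdk.transforms.Sets;
import org.apache.beam.sdk.values.PCollection;

// Distinct set operations between two collections of the same element type.
PCollection<String> inBoth = left.apply(Sets.intersectDistinct(right));
PCollection<String> inEither = left.apply(Sets.unionDistinct(right));
```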
Sets() - Constructor for class org.apache.beam.sdk.transforms.Sets
 
sets(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Set.
setS3ClientBuilder(AmazonS3ClientBuilder) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setS3ClientBuilder(S3ClientBuilder) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setS3ClientFactoryClass(Class<? extends S3ClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setS3ClientFactoryClass(Class<? extends S3ClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setS3StorageClass(String) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setS3StorageClass(String) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setS3StorageClass(String) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setS3StorageClass(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setS3ThreadPoolSize(int) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setS3ThreadPoolSize(int) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setS3ThreadPoolSize(int) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setS3ThreadPoolSize(int) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setS3UploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setS3UploadBufferSizeBytes(int) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setS3UploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setS3UploadBufferSizeBytes(int) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setSamplingStrategy(TextRowCountEstimator.SamplingStrategy) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setSasToken(String) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
 
setSaveHeapDumpsToGcsPath(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setSavepoint() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setSavepoint(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setSavepointPath(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setSaveProfilesToGcs(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
setScale(int) - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
setScanType(IcebergScanConfig.ScanType) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition was scheduled.
setScheduledExecutorService(ScheduledExecutorService) - Method in interface org.apache.beam.sdk.options.ExecutorOptions
 
setSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setSchema(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
 
setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setSchema(byte[]) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setSchema(Schema) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setSchema(Schema) - Method in class org.apache.beam.sdk.io.iceberg.IcebergTableCreateConfig.Builder
 
setSchema(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
 
setSchema(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setSchema(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setSchema(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.values.PCollection
Sets a Schema on this PCollection.
setSchemaId(Integer) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setSchemaIfNotPresent(String, Schema) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
 
setSchematizedData(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
setScheme(String) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setScheme(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setSdkContainerImage(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setSdkHarnessContainerImageOverrides(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setSdkHarnessLogLevelOverrides(SdkHarnessOptions.SdkHarnessLogLevelOverrides) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
setSdkWorkerParallelism(int) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
setSeconds(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Rate.Builder
 
setSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setSemiPersistDir(String) - Method in interface org.apache.beam.sdk.options.RemoteEnvironmentOptions
 
setSenderTimestamp(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setSequenceNumber(long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setSequenceNumber(Long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setSerializedWindowingStrategy(byte[]) - Method in class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider.Configuration.Builder
 
setServerName(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setServerName(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setServiceAccount(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
Uses the given ShardNameTemplate for naming output files.
setShouldFailRow(Function<TableRow, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
setShutdownSourcesAfterIdleMs(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setSideInput(PCollectionView<T>, BoundedWindow, T) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
setSideInputDataSet(String, BatchTSet<WindowedValue<ElemT>>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
 
setSideInputs(Map<PCollectionView<?>, Map<BoundedWindow, ?>>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
setSize(Long) - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
setSizeBytes(long) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setSizeEstimator(CoderSizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
Sets the estimator to track throughput for each DoFn instance.
setSkipHeaderLines(int) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.Builder
 
setSnapshot(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setSnapshotId(long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setSnowPipe(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setSorterType(ExternalSorter.Options.SorterType) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
Sets the sorter type.
setSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
setSparkMaster(String) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
setSql(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setSqlScalarFunctions(ImmutableMap<List<String>, ResolvedNodes.ResolvedCreateFunctionStmt>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
 
setSqlTableValuedFunctions(ImmutableMap<List<String>, ResolvedNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.Builder
 
setSSEAlgorithm(String) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setSSEAlgorithm(String) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setSSEAlgorithm(String) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setSSEAlgorithm(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setSSEAwsKeyManagementParams(SSEAwsKeyManagementParams) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setSSEAwsKeyManagementParams(SSEAwsKeyManagementParams) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setSSECustomerKey(SSECustomerKey) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
setSSECustomerKey(SSECustomerKey) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration.Builder
 
setSSECustomerKey(SSECustomerKey) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setSSECustomerKey(SSECustomerKey) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setSSEKMSKeyId(String) - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
 
setSSEKMSKeyId(String) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration.Builder
 
setStableUniqueNames(PipelineOptions.CheckEnabled) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setStager(Stager) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setStagerClass(Class<? extends Stager>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setStagingBucketName(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setStagingBucketName(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setStagingLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setStaleness(Long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setStart(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
 
setStart(Long) - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration.Builder
 
setStartAtTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setStartOffset(Long) - Method in interface org.apache.beam.sdk.io.sparkreceiver.HasOffset
 
setStartReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the start time of the partition.
setState(PipelineResult.State) - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
 
setState(PartitionMetadata.State) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the current state of the partition.
SetState<T> - Interface in org.apache.beam.sdk.state
A ReadableState cell containing a set of elements.
setStateBackend(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setStateBackendFactory(Class<? extends FlinkStateBackendFactory>) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
Deprecated.
Please use setStateBackend instead.
setStateBackendStoragePath(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setStatistics(BeamTableStatistics) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
setStatusDate(Instant) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingStatus.Builder
 
setStatusUpdateFrequency(Duration) - Method in class org.apache.beam.sdk.extensions.ordered.OrderedProcessingHandler
Changes the default status update frequency.
setStop(Long) - Method in class org.apache.beam.sdk.io.GenerateSequence.External.ExternalConfiguration
 
setStopPipelineWatermark(Long) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
setStopReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setStorageApiAppendThresholdBytes(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageApiAppendThresholdRecordCount(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageClient(Storage) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
 
setStorageIntegrationName(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setStorageIntegrationName(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setStorageLevel(String) - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
 
setStorageWriteApiMaxRequestSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteApiMaxRetries(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteApiTriggeringFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteMaxInflightBytes(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteMaxInflightRequests(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStreaming(boolean) - Method in interface org.apache.beam.sdk.options.StreamingOptions
 
setStreamingSideInputCacheExpirationMillis(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setStreamingSideInputCacheMb(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setStreamingTimeoutMs(Long) - Method in interface org.apache.beam.runners.spark.SparkPortableStreamingPipelineOptions
 
setStuckCommitDurationMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setSubmissionMode(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.broker.SessionServiceFactory
Called by the write connector to set the submission mode used to create the message producers.
setSubnetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setSubscriptionName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setSubscriptionPath(SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
 
setSummary(Map<String, String>) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setSupportKafkaMetrics(boolean) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
 
setSupportMetricsDeletion(boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
setSupportStreamingInsertsMetrics(boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
setTable(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setTable(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.iceberg.SchemaTransformConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setTable(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
settableArguments - Variable in class org.apache.beam.sdk.io.cdap.context.BatchContextImpl
This should be set after the SubmitterLifecycle.prepareRun(Object) call, passing this context object as a parameter.
setTableCreateConfig(IcebergTableCreateConfig) - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
 
setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder
 
setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setTableIdentifier(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergDestination.Builder
 
setTableIdentifier(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setTableIdentifier(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setTableIdentifier(String...) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setTableSchema(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
 
setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
Specifies a table for a BigQuery read job.
setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setTag(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setTargetDataset(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
setTargetParallelism(int) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setTempDatasetId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setTemplateLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setTempLocation(String) - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
Sets the path to a temporary location where the sorter writes intermediate files.
setTempLocation(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setTempRoot(String) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
SETTER_WITH_NULL_METHOD_ERROR - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
setTerminateAfterSecondsSinceNewOutput(Long) - Method in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration.Builder
 
SetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
 
setTestMode(boolean) - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
 
setTestTimeoutSeconds(Long) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setThroughputEstimator(BytesThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
Sets the estimator to calculate the backlog of this function.
setTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
setTimer(StateNamespace, String, String, Instant, Instant, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
setTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
Sets a timer to fire when the event time watermark, the current processing time, or the synchronized processing time watermark surpasses a given timestamp.
setTimer(Instant, Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
Set a timer with outputTimestamp.
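In user code, timers are normally declared and set through the closely related @TimerId/Timer API of a stateful DoFn rather than through this interface directly. A minimal sketch of that pattern, assuming standard Beam SDK imports; the timer id "expiry" and the element types are illustrative:

    new DoFn<KV<String, Long>, String>() {
      @TimerId("expiry")
      private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

      @ProcessElement
      public void process(ProcessContext c, @TimerId("expiry") Timer timer) {
        // Fire once the event-time watermark passes 5 minutes after this element's timestamp.
        timer.set(c.timestamp().plus(Duration.standardMinutes(5)));
      }

      @OnTimer("expiry")
      public void onExpiry(OutputReceiver<String> out) {
        out.output("expiry timer fired");
      }
    };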
setTimestamp(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
setTimestamp(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
setTimestampBoundMode(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setTimestampMillis(long) - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo.Builder
 
setTimestampPolicy(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setTimeSupplier(Supplier<Timestamp>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
setTimeToLive(long) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Record.Builder
 
setTimeUnit(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
 
setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
Sets the topic from which to read.
setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
 
setTopicName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setTopicPath(TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
 
setTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setToSnapshot(Long) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setToSnapshotRef(String) - Method in class org.apache.beam.sdk.io.iceberg.IcebergScanConfig.Builder
 
setTransactionIsolation(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setTransformNameMapping(Map<String, String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setTriggeringFrequencySeconds(Long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setTriggeringFrequencySeconds(Integer) - Method in class org.apache.beam.sdk.io.iceberg.IcebergWriteSchemaTransformProvider.Configuration.Builder
 
setTruncateTimestamps(boolean) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
 
setTruncateTimestamps(BigQueryUtils.ConversionOptions.TruncateTimestamps) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
 
setTSetEnvironment(TSetEnvironment) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setTwister2Home(String) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setType(Solace.DestinationType) - Method in class org.apache.beam.sdk.io.solace.data.Solace.Destination.Builder
 
setType(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation.Builder
 
setType(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
setTypeDescriptor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.values.PCollection
setTypeMap(Map<String, Class<?>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
setUint16Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
 
setUint32Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
 
setUint64Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
 
setUint8Behavior(UnsignedOptions.Behavior) - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Builder
 
setUnalignedCheckpointEnabled(boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setUnboundedReaderMaxElements(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setUnboundedReaderMaxReadTimeMs(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setUnboundedReaderMaxReadTimeSec(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setUnboundedReaderMaxWaitForElementsMs(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setUnknownFieldsPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
setup() - Method in interface org.apache.beam.io.requestresponse.SetupTeardown
Called during the DoFn's setup lifecycle method.
setup(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
setUp(Schema) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
Prepares the instance.
setup() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
setup() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
setup() - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Obtains the instance of DetectNewPartitionsAction.
setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
 
setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
setup() - Method in interface org.apache.beam.sdk.io.kafka.CheckStopReadingFn
 
setup() - Method in class org.apache.beam.sdk.io.pulsar.WriteToPulsarDoFn
 
setup() - Method in class org.apache.beam.sdk.jmh.schemas.RowBundle
 
setup() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
 
setup() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
 
setup(Blackhole) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.BlackholeOutput
 
setup() - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Bytes
 
setup() - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.Longs
 
setup() - Method in class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
 
setUpdate(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setUpdateCompatibilityVersion(String) - Method in interface org.apache.beam.sdk.options.StreamingOptions
 
setUploadBufferSizeBytes(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions.Builder
 
setupModule(Module.SetupContext) - Method in class org.apache.beam.sdk.io.aws2.options.AwsModule
 
SetupTeardown - Interface in org.apache.beam.io.requestresponse
Provided by the user and called within the DoFn.Setup and DoFn.Teardown lifecycle methods of Call's DoFn.
setUpToDateThreshold(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setUrl(String) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setUseActiveSparkSession(boolean) - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
 
setUseAltsServer(boolean) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
 
setUseAtLeastOnceSemantics(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setUseCdcWrites(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setUseDataStreamForBatch(Boolean) - Method in interface org.apache.beam.runners.flink.VersionDependentFlinkPipelineOptions
 
setUsePublicIps(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setUserAgent(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setUsername(String) - Method in class org.apache.beam.io.debezium.DebeziumReadSchemaTransformProvider.DebeziumReadSchemaTransformConfiguration.Builder
 
setUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration.Builder
 
setUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setUsername(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setUsername(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setUseSeparateWindmillHeartbeatStreams(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setUsesProvidedSparkContext(boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setUseStandardSql(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
Enables BigQuery's Standard SQL dialect when reading from a query.
setUseStorageApiConnectionPool(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setUseStorageWriteApi(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setUseStorageWriteApiAtLeastOnce(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setUseTransformService(boolean) - Method in interface org.apache.beam.sdk.extensions.python.PythonExternalTransformOptions
 
setUseWindmillIsolatedChannels(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setUuid(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
setUUID(UUID) - Method in class org.apache.beam.sdk.schemas.Schema
Set this schema's UUID.
setUuidExtractor(SerializableFunction<SequencedMessage, Uuid>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
 
SetUuidFn(String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn
 
SetUuidFromPubSubMessage(String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage
 
setValueDeserializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
 
setValueSerializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
 
setVerifyRowValues(Boolean) - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
 
setWarehouse(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
 
setWarehouse(ValueProvider<String>) - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
 
setWatermark(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the watermark (last processed timestamp) for the partition.
setWatermark(Instant) - Method in interface org.apache.beam.sdk.transforms.splittabledofn.ManualWatermarkEstimator
Sets a timestamp before or at the timestamps of all future elements produced by the associated DoFn.
setWatermark(Instant) - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
 
setWatermarkIdleDurationThreshold(Long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setWatermarkPolicy(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
 
setWindmillGetDataStreamCount(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillHarnessUpdateReportingPeriod(Duration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillMessagesBetweenIsReadyChecks(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServiceCommitThreads(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServiceEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServicePort(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServiceRpcChannelAliveTimeoutSec(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServiceStreamingLogEveryNStreamFailures(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServiceStreamingRpcBatchLimit(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServiceStreamingRpcHealthCheckPeriodMs(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindmillServiceStreamMaxBackoffMillis(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions
 
setWindowedWrites() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Indicates that the operation will be performing windowed writes.
setWindowingStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
For internal use only; no backwards-compatibility guarantees.
setWithAttributes(Boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setWithPartitions(Boolean) - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration.Builder
 
setWorkerCacheMb(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setWorkerCPUs(int) - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
 
setWorkerDiskType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setWorkerHarnessContainerImage(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
setWorkerId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
setWorkerLogLevelOverrides(DataflowWorkerLoggingOptions.WorkerLogLevelOverrides) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
 
setWorkerMachineType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setWorkerPool(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
setWorkerRegion(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setWorkerSystemErrMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
 
setWorkerSystemOutMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Deprecated.
 
setWorkerZone(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setWriteDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration.Builder
 
setWriteDisposition(String) - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
 
setWriteStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration.Builder
 
setWriteTimeout(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
 
setXmlConfiguration(FileWriteSchemaTransformConfiguration.XmlConfiguration) - Method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.Builder
Configures extra details related to writing XML formatted files.
setZetaSqlDefaultTimezone(String) - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
 
setZone(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
sha1Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
SHA1(X)
sha1String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
SHA1(X)
sha256Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
SHA256(X)
sha256String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
SHA256(X)
sha512Bytes(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
SHA512(X)
sha512String(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinHashFunctions
SHA512(X)
ShardedKey<K> - Class in org.apache.beam.sdk.values
A key and a shard number.
ShardedKeyCoder<KeyT> - Class in org.apache.beam.sdk.coders
A Coder for ShardedKey, using a wrapped key Coder.
ShardedKeyCoder(Coder<KeyT>) - Constructor for class org.apache.beam.sdk.coders.ShardedKeyCoder
 
ShardingFunction<UserT,DestinationT> - Interface in org.apache.beam.sdk.io
Function for assigning ShardedKeys to input elements for sharded WriteFiles.
ShardNameTemplate - Class in org.apache.beam.sdk.io
Standard shard naming templates.
ShardNameTemplate() - Constructor for class org.apache.beam.sdk.io.ShardNameTemplate
 
shardRefreshInterval(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.RecordAggregation.Builder
Refresh interval for shards.
shortCircuitReturnNull(StackManipulation, StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
shorts() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Short.
shouldConvertRaggedUnionTypesToVarying() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
shouldDefer(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
shouldPublishLatencyMetrics() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
shouldRepeat() - Method in exception org.apache.beam.io.requestresponse.UserCodeExecutionException
Reports whether this exception, when thrown, warrants repeat execution.
shouldRepeat() - Method in exception org.apache.beam.io.requestresponse.UserCodeQuotaException
Reports that quota errors should be repeated.
shouldRepeat() - Method in exception org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
Reports that remote system errors should be repeated.
shouldRepeat() - Method in exception org.apache.beam.io.requestresponse.UserCodeTimeoutException
Reports that timeouts should be repeated.
shouldResume() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
If false, the DoFn promises that there is no more work remaining for the current element, so the runner should not resume the DoFn.ProcessElement call.
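For reference, a splittable DoFn controls this by returning a ProcessContinuation from its @ProcessElement method. A minimal hedged sketch, assuming standard Beam SDK imports; the restriction handling and delay are illustrative:

    @ProcessElement
    public ProcessContinuation process(
        @Element String element, RestrictionTracker<OffsetRange, Long> tracker) {
      for (long pos = tracker.currentRestriction().getFrom(); tracker.tryClaim(pos); ++pos) {
        // ... produce output for position pos ...
      }
      // stop() promises no more work for this element; resume() asks the runner to call again later.
      return ProcessContinuation.resume().withResumeDelay(Duration.standardSeconds(10));
    }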
shouldRetry(InsertRetryPolicy.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Return true if this failure should be retried.
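A retry policy is usually attached to BigQuery streaming inserts. A minimal sketch using one of the built-in policies; the input collection and table spec are illustrative:

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
            .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));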
shutdown() - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
 
sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
Returns the value of a given side input.
sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the value of a given side input.
sideInput(PCollectionView<T>) - Method in interface org.apache.beam.sdk.state.StateContext
Returns the value of the side input for the corresponding state window.
sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
Returns the value of the side input for the window corresponding to the main input's window in which values are being combined.
sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.Contextful.Fn.Context
Accesses the given side input.
sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns the value of the side input.
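A minimal sketch of reading a singleton side input inside a DoFn, assuming standard Beam SDK imports; the collection and view names are illustrative:

    PCollectionView<Integer> maxView = maxima.apply(View.asSingleton());

    PCollection<Integer> filtered =
        values.apply(
            ParDo.of(
                    new DoFn<Integer, Integer>() {
                      @ProcessElement
                      public void process(ProcessContext c) {
                        int max = c.sideInput(maxView);  // value for the matching window
                        if (c.element() <= max) {
                          c.output(c.element());
                        }
                      }
                    })
                .withSideInputs(maxView));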
SideInputBroadcast<T> - Class in org.apache.beam.runners.spark.util
Broadcast helper for side inputs.
sideInputId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
sideInputJoin(PCollection<Row>, PCollection<Row>, FieldAccessDescriptor, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
 
SideInputSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
SideInputValues<T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
SideInputValues serves as a Kryo serializable container that contains a materialized view of side inputs.
SideInputValues.BaseSideInputValues<BinaryT,ValuesT,T> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
 
SideInputValues.ByWindow<T> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
General SideInputValues for BoundedWindows in two possible states.
SideInputValues.Global<T> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
Specialized SideInputValues for use with the GlobalWindow in two possible states.
SideInputValues.Loader<T> - Interface in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
Factory function for loading SideInputValues from a Dataset.
signalStart() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Outputs a message that the pipeline has started.
signalSuccessWhen(Coder<T>, SerializableFunction<T, String>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Outputs a success message when successPredicate is evaluated to true.
signalSuccessWhen(Coder<T>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
SimpleCombineFn(SerializableFunction<Iterable<V>, V>) - Constructor for class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
Deprecated.
 
SimpleFunction<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A SerializableFunction which is not a functional interface.
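Subclassing SimpleFunction (rather than passing a lambda) lets Beam infer output types. A minimal sketch with MapElements, assuming standard Beam SDK imports; the "words" collection is illustrative:

    PCollection<Integer> lengths =
        words.apply(
            MapElements.via(
                new SimpleFunction<String, Integer>() {
                  @Override
                  public Integer apply(String word) {
                    return word.length();
                  }
                }));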
SimpleFunction() - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
 
SimpleFunction(SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
 
SimpleIdentifierContext(FieldSpecifierNotationParser.DotExpressionComponentContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
 
SimpleRateLimitPolicy(double) - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV17.SimpleRateLimitPolicy
 
SimpleRateLimitPolicy(double, long, TimeUnit) - Constructor for class org.apache.beam.sdk.io.googleads.GoogleAdsV17.SimpleRateLimitPolicy
 
SimpleRemoteEnvironment() - Constructor for class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
 
singleByteEncodeDoLoopByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
singleByteEncodeDoLoopTwiddleByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
singleByteEncodeLoopByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
singleByteEncodeUnrolledByteString(VarIntBenchmark.Bytes, VarIntBenchmark.ByteStringOutput) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
SingleEnvironmentInstanceJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
Deprecated.
replace with a DefaultJobBundleFactory when appropriate if the EnvironmentFactory is a DockerEnvironmentFactory, or create an InProcessJobBundleFactory and inline the creation of the environment if appropriate.
singleOutputOverrideFactory() - Static method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
Returns a PTransformOverrideFactory that replaces a single-output ParDo with a composite transform specialized for the DataflowRunner.
SingleStoreIO - Class in org.apache.beam.sdk.io.singlestore
IO to read and write data on SingleStoreDB.
SingleStoreIO() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO
 
SingleStoreIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.singlestore
A POJO describing a SingleStoreDB DataSource by providing all properties needed to create it.
SingleStoreIO.Read<T> - Class in org.apache.beam.sdk.io.singlestore
A PTransform for reading data from SingleStoreDB.
SingleStoreIO.Read.SingleStoreRowMapperInitializationException - Exception in org.apache.beam.sdk.io.singlestore
 
SingleStoreIO.ReadWithPartitions<T> - Class in org.apache.beam.sdk.io.singlestore
A PTransform for reading data from SingleStoreDB.
SingleStoreIO.RowMapper<T> - Interface in org.apache.beam.sdk.io.singlestore
An interface used by SingleStoreIO.Read and SingleStoreIO.ReadWithPartitions for converting each row of the ResultSet into an element of the resulting PCollection.
SingleStoreIO.RowMapperWithCoder<T> - Interface in org.apache.beam.sdk.io.singlestore
A RowMapper that provides a Coder for resulting PCollection.
SingleStoreIO.RowMapperWithInit<T> - Interface in org.apache.beam.sdk.io.singlestore
A RowMapper that requires initialization.
SingleStoreIO.StatementPreparator - Interface in org.apache.beam.sdk.io.singlestore
An interface used by SingleStoreIO.Read to set the parameters of the PreparedStatement.
SingleStoreIO.UserDataMapper<T> - Interface in org.apache.beam.sdk.io.singlestore
An interface used by SingleStoreIO.Write to map data from each element of a PCollection to a List of Strings.
SingleStoreIO.Write<T> - Class in org.apache.beam.sdk.io.singlestore
A PTransform for writing data to SingleStoreDB.
SingleStoreSchemaTransformReadConfiguration - Class in org.apache.beam.sdk.io.singlestore.schematransform
Configuration for reading from SingleStoreDB.
SingleStoreSchemaTransformReadConfiguration() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
 
SingleStoreSchemaTransformReadConfiguration.Builder - Class in org.apache.beam.sdk.io.singlestore.schematransform
 
SingleStoreSchemaTransformReadProvider - Class in org.apache.beam.sdk.io.singlestore.schematransform
An implementation of TypedSchemaTransformProvider for SingleStoreDB read jobs configured using SingleStoreSchemaTransformReadConfiguration.
SingleStoreSchemaTransformReadProvider() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadProvider
 
SingleStoreSchemaTransformWriteConfiguration - Class in org.apache.beam.sdk.io.singlestore.schematransform
Configuration for writing to SingleStoreDB.
SingleStoreSchemaTransformWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
 
SingleStoreSchemaTransformWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.singlestore.schematransform
 
SingleStoreSchemaTransformWriteProvider - Class in org.apache.beam.sdk.io.singlestore.schematransform
An implementation of TypedSchemaTransformProvider for SingleStoreDB write jobs configured using SingleStoreSchemaTransformWriteConfiguration.
SingleStoreSchemaTransformWriteProvider() - Constructor for class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteProvider
 
singleTable(TableIdentifier, Schema) - Static method in interface org.apache.beam.sdk.io.iceberg.DynamicDestinations
 
singletonView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>, boolean, T, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<T> capable of processing elements windowed using the provided WindowingStrategy.
singletonViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, T>>, PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>, boolean, T, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
sinh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
SINH(X)
sink(Class<ElementT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
An AvroIO.Sink for use with FileIO.write() and FileIO.writeDynamic(), writing elements of the given generated class, like AvroIO.write(Class).
sink(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
An AvroIO.Sink for use with FileIO.write() and FileIO.writeDynamic(), writing elements with a given (common) schema, like AvroIO.writeGenericRecords(Schema).
sink(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
An AvroIO.Sink for use with FileIO.write() and FileIO.writeDynamic(), writing elements with a given (common) schema, like AvroIO.writeGenericRecords(String).
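A minimal sketch of pairing an AvroIO.Sink with FileIO.write(), assuming standard Beam SDK imports; the "records" collection, the "schema" variable (an Avro Schema defined elsewhere) and the output path are illustrative:

    records.apply(
        FileIO.<GenericRecord>write()
            .via(AvroIO.sink(schema))
            .to("/tmp/avro-output")
            .withSuffix(".avro"));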
Sink() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
 
sink - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
The Sink that this WriteOperation will write to.
sink(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
Creates a ParquetIO.Sink for use with FileIO.write().
Sink() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
sink() - Static method in class org.apache.beam.sdk.io.TextIO
Creates a TextIO.Sink that writes newline-delimited strings in UTF-8, for use with FileIO.write().
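A minimal sketch with FileIO.write(), assuming standard Beam SDK imports; the "lines" collection and output directory are illustrative:

    lines.apply(
        FileIO.<String>write()
            .via(TextIO.sink())
            .to("/tmp/text-output")
            .withSuffix(".txt"));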
Sink() - Constructor for class org.apache.beam.sdk.io.TextIO.Sink
 
sink() - Static method in class org.apache.beam.sdk.io.TFRecordIO
Sink() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Sink
 
sink(TProtocolFactory) - Static method in class org.apache.beam.sdk.io.thrift.ThriftIO
Sink() - Constructor for class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
 
sink(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.XmlIO
Outputs records as XML-formatted elements using JAXB.
Sink() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
SinkMetrics - Class in org.apache.beam.sdk.metrics
Standard Sink Metrics.
SinkMetrics() - Constructor for class org.apache.beam.sdk.metrics.SinkMetrics
 
sinkViaGenericRecords(Schema, AvroIO.RecordFormatter<ElementT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Deprecated.
RecordFormatter will be removed in future versions.
size() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns the number of bytes in the backing array that are valid.
size() - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
size() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
 
size() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
 
size() - Method in class org.apache.beam.sdk.fn.data.WeightedList
 
size() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
size() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
size() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the number of columns for this schema.
size() - Method in class org.apache.beam.sdk.values.PCollectionList
Returns the number of PCollections in this PCollectionList.
size() - Method in class org.apache.beam.sdk.values.TupleTagList
Returns the number of TupleTags in this TupleTagList.
sizeBytes() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
SizeEstimator<T> - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
 
SizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
This class is used to estimate the size in bytes of a given element.
SizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
 
sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.CoderSizeEstimator
Estimates the size in bytes of the given element with the configured Coder.
sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.NullSizeEstimator
 
sizeOf(T) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.SizeEstimator
 
sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
Estimates the size in bytes of the given element with the configured Coder.
Sketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
 
SketchFrequencies - Class in org.apache.beam.sdk.extensions.sketching
PTransforms to compute the estimated frequency of each element in a stream.
SketchFrequencies() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
 
SketchFrequencies.CountMinSketchFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
Implements the Combine.CombineFn of SketchFrequencies transforms.
SketchFrequencies.GlobalSketch<InputT> - Class in org.apache.beam.sdk.extensions.sketching
Implementation of SketchFrequencies.globally().
SketchFrequencies.PerKeySketch<K,V> - Class in org.apache.beam.sdk.extensions.sketching
Implementation of SketchFrequencies.perKey().
SketchFrequencies.Sketch<T> - Class in org.apache.beam.sdk.extensions.sketching
Wraps StreamLib's Count-Min Sketch to support counting all user types by hashing the encoded user type using the supplied deterministic coder.
skipIfEmpty() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Don't write any output files if the PCollection is empty.
skipInvalidRows() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Insert all valid rows of a request, even if invalid rows exist.
Slf4jLogWriter - Class in org.apache.beam.runners.fnexecution.logging
A LogWriter which uses an SLF4J Logger as the underlying log backend.
SLIDING_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
 
SlidingWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows values into possibly overlapping fixed-size timestamp-based windows.
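For example, ten-minute windows starting every minute; a sketch assuming standard Beam SDK imports, with the "counts" collection illustrative:

    PCollection<KV<String, Long>> windowed =
        counts.apply(
            Window.<KV<String, Long>>into(
                SlidingWindows.of(Duration.standardMinutes(10))
                    .every(Duration.standardMinutes(1))));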
SMALL_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
smallest(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the smallest count elements of the input PCollection<T>, in increasing order, sorted according to their natural order.
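A minimal sketch, with the "values" collection illustrative:

    PCollection<List<Integer>> bottomThree = values.apply(Top.smallest(3));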
Smallest() - Constructor for class org.apache.beam.sdk.transforms.Top.Smallest
Deprecated.
 
smallestDoublesFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the smallest count double values.
smallestFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the smallest count values.
smallestIntsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the smallest count int values.
smallestLongsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a Top.TopCombineFn that aggregates the smallest count long values.
smallestPerKey(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the smallest count values associated with that key in the input PCollection<KV<K, V>>, in increasing order, sorted according to their natural order.
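And the per-key variant on keyed input; a sketch with the "keyedValues" collection illustrative:

    PCollection<KV<String, List<Integer>>> smallestPerKey =
        keyedValues.apply(Top.smallestPerKey(3));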
SnappyCoder<T> - Class in org.apache.beam.sdk.coders
Wraps an existing coder with Snappy compression.
snapshot(SchemaVersion) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
SnapshotInfo - Class in org.apache.beam.sdk.io.iceberg
This is an AutoValue representation of an Iceberg Snapshot.
SnapshotInfo() - Constructor for class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
SnapshotInfo.Builder - Class in org.apache.beam.sdk.io.iceberg
 
SnowflakeArray - Class in org.apache.beam.sdk.io.snowflake.data.structured
 
SnowflakeArray() - Constructor for class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeArray
 
SnowflakeBatchServiceConfig - Class in org.apache.beam.sdk.io.snowflake.services
Class for preparing configuration for batch write and read.
SnowflakeBatchServiceConfig(SerializableFunction<Void, DataSource>, String, String, String, String, String, String, String) - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Creating a batch configuration for reading.
SnowflakeBatchServiceConfig(SerializableFunction<Void, DataSource>, List<String>, SnowflakeTableSchema, String, String, String, String, CreateDisposition, WriteDisposition, String, String, String) - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
Creating a batch configuration for writing.
SnowflakeBatchServiceImpl - Class in org.apache.beam.sdk.io.snowflake.services
Implementation of SnowflakeServices.BatchService used in production.
SnowflakeBatchServiceImpl() - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceImpl
 
SnowflakeBinary - Class in org.apache.beam.sdk.io.snowflake.data.text
 
SnowflakeBinary() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
SnowflakeBinary(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
SnowflakeBoolean - Class in org.apache.beam.sdk.io.snowflake.data.logical
 
SnowflakeBoolean() - Constructor for class org.apache.beam.sdk.io.snowflake.data.logical.SnowflakeBoolean
 
SnowflakeChar - Class in org.apache.beam.sdk.io.snowflake.data.text
 
SnowflakeChar() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeChar
 
SnowflakeColumn - Class in org.apache.beam.sdk.io.snowflake.data
POJO describing single Column within Snowflake Table.
SnowflakeColumn() - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
SnowflakeColumn(String, SnowflakeDataType) - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
SnowflakeColumn(String, SnowflakeDataType, boolean) - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
SnowflakeDataType - Interface in org.apache.beam.sdk.io.snowflake.data
Interface for data types to provide SQLs for themselves.
SnowflakeDate - Class in org.apache.beam.sdk.io.snowflake.data.datetime
 
SnowflakeDate() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDate
 
SnowflakeDateTime - Class in org.apache.beam.sdk.io.snowflake.data.datetime
 
SnowflakeDateTime() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDateTime
 
SnowflakeDecimal - Class in org.apache.beam.sdk.io.snowflake.data.numeric
 
SnowflakeDecimal() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDecimal
 
SnowflakeDecimal(int, int) - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDecimal
 
SnowflakeDouble - Class in org.apache.beam.sdk.io.snowflake.data.numeric
 
SnowflakeDouble() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDouble
 
SnowflakeFloat - Class in org.apache.beam.sdk.io.snowflake.data.numeric
 
SnowflakeFloat() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeFloat
 
SnowflakeGeography - Class in org.apache.beam.sdk.io.snowflake.data.geospatial
 
SnowflakeGeography() - Constructor for class org.apache.beam.sdk.io.snowflake.data.geospatial.SnowflakeGeography
 
SnowflakeInteger - Class in org.apache.beam.sdk.io.snowflake.data.numeric
 
SnowflakeInteger() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeInteger
 
SnowflakeIO - Class in org.apache.beam.sdk.io.snowflake
IO to read and write data on Snowflake.
SnowflakeIO() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO
 
SnowflakeIO.Concatenate - Class in org.apache.beam.sdk.io.snowflake
Combines a list of Strings into one String containing the paths where files were staged for writing.
SnowflakeIO.CsvMapper<T> - Interface in org.apache.beam.sdk.io.snowflake
Interface for a user-defined function mapping parts of a CSV line into T.
SnowflakeIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.snowflake
A POJO describing a DataSource, providing all properties needed to create a DataSource.
SnowflakeIO.DataSourceProviderFromDataSourceConfiguration - Class in org.apache.beam.sdk.io.snowflake
Wraps SnowflakeIO.DataSourceConfiguration to provide DataSource.
SnowflakeIO.Read<T> - Class in org.apache.beam.sdk.io.snowflake
Implementation of SnowflakeIO.read().
SnowflakeIO.Read.CleanTmpFilesFromGcsFn - Class in org.apache.beam.sdk.io.snowflake
Removes temporary staged files after reading.
SnowflakeIO.Read.MapCsvToStringArrayFn - Class in org.apache.beam.sdk.io.snowflake
Parses Strings from the incoming data in a PCollection into the proper format for CSV files.
SnowflakeIO.UserDataMapper<T> - Interface in org.apache.beam.sdk.io.snowflake
Interface for a user-defined function mapping T into an array of Objects.
SnowflakeIO.Write<T> - Class in org.apache.beam.sdk.io.snowflake
Implementation of SnowflakeIO.write().
SnowflakeNumber - Class in org.apache.beam.sdk.io.snowflake.data.numeric
 
SnowflakeNumber() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
SnowflakeNumber(int, int) - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
SnowflakeNumeric - Class in org.apache.beam.sdk.io.snowflake.data.numeric
 
SnowflakeNumeric() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumeric
 
SnowflakeNumeric(int, int) - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumeric
 
SnowflakeObject - Class in org.apache.beam.sdk.io.snowflake.data.structured
 
SnowflakeObject() - Constructor for class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeObject
 
SnowflakePipelineOptions - Interface in org.apache.beam.sdk.io.snowflake
 
SnowflakeReal - Class in org.apache.beam.sdk.io.snowflake.data.numeric
 
SnowflakeReal() - Constructor for class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeReal
 
SnowflakeServices - Interface in org.apache.beam.sdk.io.snowflake.services
Interface which defines common methods for interacting with Snowflake.
SnowflakeServices.BatchService - Interface in org.apache.beam.sdk.io.snowflake.services
 
SnowflakeServices.StreamingService - Interface in org.apache.beam.sdk.io.snowflake.services
 
SnowflakeServicesImpl - Class in org.apache.beam.sdk.io.snowflake.services
 
SnowflakeServicesImpl() - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
 
SnowflakeStreamingServiceConfig - Class in org.apache.beam.sdk.io.snowflake.services
Class for preparing configuration for streaming write.
SnowflakeStreamingServiceConfig(List<String>, String, SimpleIngestManager) - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
Constructor to create configuration for streaming write.
SnowflakeStreamingServiceImpl - Class in org.apache.beam.sdk.io.snowflake.services
Implementation of SnowflakeServices.StreamingService used in production.
SnowflakeStreamingServiceImpl() - Constructor for class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceImpl
 
SnowflakeString - Class in org.apache.beam.sdk.io.snowflake.data.text
 
SnowflakeString() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
 
SnowflakeString(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
 
SnowflakeTableSchema - Class in org.apache.beam.sdk.io.snowflake.data
POJO representing schema of Table in Snowflake.
SnowflakeTableSchema() - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
 
SnowflakeTableSchema(SnowflakeColumn...) - Constructor for class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
 
SnowflakeText - Class in org.apache.beam.sdk.io.snowflake.data.text
 
SnowflakeText() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
 
SnowflakeText(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
 
SnowflakeTime - Class in org.apache.beam.sdk.io.snowflake.data.datetime
 
SnowflakeTime() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTime
 
SnowflakeTimestamp - Class in org.apache.beam.sdk.io.snowflake.data.datetime
 
SnowflakeTimestamp() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestamp
 
SnowflakeTimestampLTZ - Class in org.apache.beam.sdk.io.snowflake.data.datetime
 
SnowflakeTimestampLTZ() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampLTZ
 
SnowflakeTimestampNTZ - Class in org.apache.beam.sdk.io.snowflake.data.datetime
 
SnowflakeTimestampNTZ() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampNTZ
 
SnowflakeTimestampTZ - Class in org.apache.beam.sdk.io.snowflake.data.datetime
 
SnowflakeTimestampTZ() - Constructor for class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampTZ
 
SnowflakeTransformRegistrar - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
Exposes SnowflakeIO.Read and SnowflakeIO.Write as an external transform for cross-language usage.
SnowflakeTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
 
SnowflakeVarBinary - Class in org.apache.beam.sdk.io.snowflake.data.text
 
SnowflakeVarBinary() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarBinary
 
SnowflakeVarchar - Class in org.apache.beam.sdk.io.snowflake.data.text
 
SnowflakeVarchar() - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
SnowflakeVarchar(long) - Constructor for class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
SnowflakeVariant - Class in org.apache.beam.sdk.io.snowflake.data.structured
 
SnowflakeVariant() - Constructor for class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeVariant
 
SnsCoderProviderRegistrar - Class in org.apache.beam.sdk.io.aws.sns
A CoderProviderRegistrar for standard types used with SnsIO.
SnsCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsCoderProviderRegistrar
 
SnsIO - Class in org.apache.beam.sdk.io.aws.sns
Deprecated.
Module beam-sdks-java-io-amazon-web-services is deprecated and will be eventually removed. Please migrate to SnsIO in module beam-sdks-java-io-amazon-web-services2.
SnsIO() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsIO
Deprecated.
 
SnsIO - Class in org.apache.beam.sdk.io.aws2.sns
IO to send notifications via SNS.
SnsIO() - Constructor for class org.apache.beam.sdk.io.aws2.sns.SnsIO
 
SnsIO.RetryConfiguration - Class in org.apache.beam.sdk.io.aws.sns
Deprecated.
A POJO encapsulating a configuration for retry behavior when issuing requests to SNS.
SnsIO.Write - Class in org.apache.beam.sdk.io.aws.sns
Deprecated.
Implementation of SnsIO.write().
SnsIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.sns
Implementation of SnsIO.write().
SOCKET_TIMEOUT - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
 
SocketAddressFactory - Class in org.apache.beam.sdk.fn.channel
Creates a SocketAddress based upon a supplied string.
SocketAddressFactory() - Constructor for class org.apache.beam.sdk.fn.channel.SocketAddressFactory
 
socketTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
Milliseconds to wait for data to be transferred over an established, open connection before the connection is timed out.
socketTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
Milliseconds to wait for data to be transferred over an established, open connection before the connection is timed out.
Solace - Class in org.apache.beam.sdk.io.solace.data
Provides core data models and utilities for working with Solace messages in the context of Apache Beam pipelines.
Solace() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace
 
Solace.CorrelationKey - Class in org.apache.beam.sdk.io.solace.data
The correlation key is an object that is passed back to the client during the event broker ack or nack.
Solace.CorrelationKey.Builder - Class in org.apache.beam.sdk.io.solace.data
 
Solace.Destination - Class in org.apache.beam.sdk.io.solace.data
Represents a Solace message destination (either a Topic or a Queue).
Solace.Destination.Builder - Class in org.apache.beam.sdk.io.solace.data
 
Solace.DestinationType - Enum in org.apache.beam.sdk.io.solace.data
Represents a Solace destination type.
Solace.PublishResult - Class in org.apache.beam.sdk.io.solace.data
The result of writing a message to Solace.
Solace.PublishResult.Builder - Class in org.apache.beam.sdk.io.solace.data
 
Solace.Queue - Class in org.apache.beam.sdk.io.solace.data
Represents a Solace queue.
Solace.Record - Class in org.apache.beam.sdk.io.solace.data
Represents a Solace message record with its associated metadata.
Solace.Record.Builder - Class in org.apache.beam.sdk.io.solace.data
 
Solace.SolaceRecordMapper - Class in org.apache.beam.sdk.io.solace.data
A utility class for mapping BytesXMLMessage instances to Solace.Record objects.
Solace.Topic - Class in org.apache.beam.sdk.io.solace.data
Represents a Solace topic.
SolaceCheckpointMark - Class in org.apache.beam.sdk.io.solace.read
Checkpoint for an unbounded Solace source.
SolaceIO - Class in org.apache.beam.sdk.io.solace
A PTransform to read and write from/to a Solace event broker.
SolaceIO() - Constructor for class org.apache.beam.sdk.io.solace.SolaceIO
 
SolaceIO.Read<T> - Class in org.apache.beam.sdk.io.solace
 
SolaceIO.SubmissionMode - Enum in org.apache.beam.sdk.io.solace
 
SolaceIO.Write<T> - Class in org.apache.beam.sdk.io.solace
 
SolaceIO.WriterType - Enum in org.apache.beam.sdk.io.solace
 
SolaceMessageProducer - Class in org.apache.beam.sdk.io.solace.broker
 
SolaceMessageProducer(XMLMessageProducer) - Constructor for class org.apache.beam.sdk.io.solace.broker.SolaceMessageProducer
 
SolaceMessageReceiver - Class in org.apache.beam.sdk.io.solace.broker
 
SolaceMessageReceiver(FlowReceiver) - Constructor for class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
 
SolaceOutput - Class in org.apache.beam.sdk.io.solace.write
The SolaceIO.Write transform returns this type as its output, containing the successful publishes (SolaceOutput.getSuccessfulPublish()).
SolaceRecordMapper() - Constructor for class org.apache.beam.sdk.io.solace.data.Solace.SolaceRecordMapper
 
solaceSessionServiceWithProducer() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
SolrIO - Class in org.apache.beam.sdk.io.solr
Transforms for reading and writing data from/to Solr.
SolrIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.solr
A POJO describing a connection configuration to Solr.
SolrIO.Read - Class in org.apache.beam.sdk.io.solr
A PTransform reading data from Solr.
SolrIO.ReadAll - Class in org.apache.beam.sdk.io.solr
 
SolrIO.ReplicaInfo - Class in org.apache.beam.sdk.io.solr
A POJO describing a replica of Solr.
SolrIO.RetryConfiguration - Class in org.apache.beam.sdk.io.solr
A POJO encapsulating a configuration for retry behavior when issuing requests to Solr.
SolrIO.Write - Class in org.apache.beam.sdk.io.solr
A PTransform writing data to Solr.
sort() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
SORT_VALUES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
sortBySchema(List<FieldValueTypeInformation>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
 
sorted() - Method in class org.apache.beam.sdk.schemas.Schema
Returns an identical Schema with lexicographically sorted fields.
sorted() - Method in class org.apache.beam.sdk.values.Row
Returns an equivalent Row with fields lexicographically sorted by their name.
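A minimal sketch of the field ordering this produces, assuming standard Beam SDK imports:

    Schema schema =
        Schema.builder()
            .addStringField("name")
            .addInt32Field("age")
            .build();
    Schema canonical = schema.sorted();  // field order becomes: age, name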
SortedMapCoder<K extends java.lang.Comparable<? super K>,V> - Class in org.apache.beam.sdk.coders
A Coder for Maps that encodes them according to provided coders for keys and values.
SortValues<PrimaryKeyT,SecondaryKeyT,ValueT> - Class in org.apache.beam.sdk.extensions.sorter
SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> takes a PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>> with elements consisting of a primary key and iterables over <secondary key, value> pairs, and returns a PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>> of the same elements but with values sorted by the secondary key.
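A minimal sketch of SortValues following a GroupByKey (assuming a Pipeline p; keys and values are illustrative):

    p.apply(Create.of(
            KV.of("user1", KV.of(3L, "c")),
            KV.of("user1", KV.of(1L, "a")),
            KV.of("user1", KV.of(2L, "b"))))
        // Group by the primary key, then sort each group's values by the secondary key.
        .apply(GroupByKey.<String, KV<Long, String>>create())
        .apply(SortValues.<String, Long, String>create(BufferedExternalSorter.options()));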
SOURCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
 
SOURCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
 
Source<T> - Class in org.apache.beam.sdk.io
Base class for defining input formats and creating a Source for reading the input.
Source() - Constructor for class org.apache.beam.sdk.io.Source
 
source() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
 
Source.Reader<T> - Class in org.apache.beam.sdk.io
The interface that readers of custom input sources must implement.
SOURCE_DOES_NOT_NEED_SPLITTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_ESTIMATED_SIZE_BYTES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_IS_INFINITE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_METADATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_SPEC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_STEP_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SourceMetrics - Class in org.apache.beam.sdk.metrics
Standard Source Metrics.
SourceMetrics() - Constructor for class org.apache.beam.sdk.metrics.SourceMetrics
 
sourceName() - Method in class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
sourceName() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.SparkBeamMetricSource
 
SourceRDD - Class in org.apache.beam.runners.spark.io
Classes implementing Beam Source RDDs.
SourceRDD() - Constructor for class org.apache.beam.runners.spark.io.SourceRDD
 
SourceRDD.Bounded<T> - Class in org.apache.beam.runners.spark.io
A SourceRDD.Bounded reads input from a BoundedSource and creates a Spark RDD.
SourceRDD.Unbounded<T,CheckpointMarkT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.runners.spark.io
A SourceRDD.Unbounded is the implementation of a micro-batch in a SourceDStream.
SourceRecordJson - Class in org.apache.beam.io.debezium
This class can be used as a mapper for each SourceRecord retrieved.
SourceRecordJson(SourceRecord) - Constructor for class org.apache.beam.io.debezium.SourceRecordJson
Initializer.
SourceRecordJson.SourceRecordJsonMapper - Class in org.apache.beam.io.debezium
SourceRecordJson implementation.
SourceRecordJsonMapper() - Constructor for class org.apache.beam.io.debezium.SourceRecordJson.SourceRecordJsonMapper
 
SourceRecordMapper<T> - Interface in org.apache.beam.io.debezium
Interface used to map a Kafka source record.
SourceTestUtils - Class in org.apache.beam.sdk.testing
Helper functions and test harnesses for checking correctness of Source implementations.
SourceTestUtils() - Constructor for class org.apache.beam.sdk.testing.SourceTestUtils
 
SourceTestUtils.ExpectedSplitOutcome - Enum in org.apache.beam.sdk.testing
sourceType() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
 
span(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the minimal window that includes both this window and the given window.
SpannerAccessor - Class in org.apache.beam.sdk.io.gcp.spanner
Manages lifecycle of DatabaseClient and Spanner instances.
SpannerChangestreamsReadConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
SpannerChangestreamsReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerChangestreamsReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerConfig - Class in org.apache.beam.sdk.io.gcp.spanner
Configuration for a Cloud Spanner client.
SpannerConfig() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
SpannerConfig.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
Builder for SpannerConfig.
SpannerIO - Class in org.apache.beam.sdk.io.gcp.spanner
Transforms for reading from and writing to Cloud Spanner.
SpannerIO.CreateTransaction - Class in org.apache.beam.sdk.io.gcp.spanner
A PTransform that creates a transaction.
SpannerIO.CreateTransaction.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
SpannerIO.FailureMode - Enum in org.apache.beam.sdk.io.gcp.spanner
A failure handling strategy.
SpannerIO.Read - Class in org.apache.beam.sdk.io.gcp.spanner
Implementation of SpannerIO.read().
SpannerIO.ReadAll - Class in org.apache.beam.sdk.io.gcp.spanner
Implementation of SpannerIO.readAll().
SpannerIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerIO.SpannerChangeStreamOptions - Interface in org.apache.beam.sdk.io.gcp.spanner
Interface to display the name of the metadata table on Dataflow UI.
SpannerIO.Write - Class in org.apache.beam.sdk.io.gcp.spanner
A PTransform that writes Mutation objects to Google Cloud Spanner.
SpannerIO.WriteGrouped - Class in org.apache.beam.sdk.io.gcp.spanner
Same as SpannerIO.Write but supports grouped mutations.
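As a sketch of typical read and write usage (assuming a Pipeline p, an existing PCollection<Mutation> named mutations, and placeholder instance/database ids):

    PCollection<Struct> rows = p.apply(
        SpannerIO.read()
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withQuery("SELECT id, name FROM users"));

    mutations.apply(
        SpannerIO.write()
            .withInstanceId("my-instance")
            .withDatabaseId("my-database"));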
SpannerReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
SpannerReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner
A provider for reading from Cloud Spanner using a Schema Transform Provider.
SpannerReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider
 
SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerSchema - Class in org.apache.beam.sdk.io.gcp.spanner
Encapsulates Cloud Spanner Schema.
SpannerSchema() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
SpannerSchema.Column - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerSchema.KeyPart - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerSchemaRetrievalException - Exception in org.apache.beam.sdk.io.gcp.spanner
Exception to signal that Spanner schema retrieval failed.
SpannerTransformRegistrar - Class in org.apache.beam.sdk.io.gcp.spanner
Exposes SpannerIO.WriteRows and SpannerIO.ReadRows as an external transform for cross-language usage.
SpannerTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
SpannerTransformRegistrar.CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.DeleteBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.InsertBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.InsertOrUpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.ReplaceBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.UpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerWriteResult - Class in org.apache.beam.sdk.io.gcp.spanner
The results of a SpannerIO.write() transform.
SpannerWriteResult(Pipeline, PCollection<Void>, PCollection<MutationGroup>, TupleTag<MutationGroup>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
SpannerWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
SpannerWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SparkBeamMetricSource - Class in org.apache.beam.runners.spark.metrics
A Spark Source that is tailored to expose a SparkBeamMetric, wrapping an underlying MetricResults instance.
SparkBeamMetricSource(String) - Constructor for class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
SparkBeamMetricSource - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
A Spark Source that is tailored to expose a SparkBeamMetric, wrapping an underlying MetricResults instance.
SparkBeamMetricSource(String, MetricsAccumulator) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.SparkBeamMetricSource
 
SparkCommonPipelineOptions - Interface in org.apache.beam.runners.spark
Spark runner PipelineOptions handles Spark execution-related configurations, such as the master address, and other user-related knobs.
SparkCommonPipelineOptions.StorageLevelFactory - Class in org.apache.beam.runners.spark
Returns Spark's default storage level for the Dataset or RDD API based on the respective runner.
SparkCommonPipelineOptions.TmpCheckpointDirFactory - Class in org.apache.beam.runners.spark
Returns the default checkpoint directory of /tmp/${job.name}.
SparkContextOptions - Interface in org.apache.beam.runners.spark
A custom PipelineOptions to work with properties related to JavaSparkContext.
SparkContextOptions.EmptyListenersList - Class in org.apache.beam.runners.spark
Returns an empty list, to avoid handling null.
SparkGroupAlsoByWindowViaWindowSet - Class in org.apache.beam.runners.spark.stateful
An implementation of GroupByKeyViaGroupByKeyOnly.GroupAlsoByWindow logic for grouping by windows and controlling trigger firings and pane accumulation.
SparkGroupAlsoByWindowViaWindowSet() - Constructor for class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
 
SparkGroupAlsoByWindowViaWindowSet.StateAndTimers - Class in org.apache.beam.runners.spark.stateful
State and Timers wrapper.
SparkJobInvoker - Class in org.apache.beam.runners.spark
Creates a job invocation to manage the Spark runner's execution of a portable pipeline.
SparkJobServerDriver - Class in org.apache.beam.runners.spark
Driver program that starts a job server for the Spark runner.
SparkJobServerDriver.SparkServerConfiguration - Class in org.apache.beam.runners.spark
Spark runner-specific Configuration for the jobServer.
SparkKryoRegistrator() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory.SparkKryoRegistrator
 
SparkNativePipelineVisitor - Class in org.apache.beam.runners.spark
Pipeline visitor for translating a Beam pipeline into equivalent Spark operations.
SparkPipelineOptions - Interface in org.apache.beam.runners.spark
Spark runner PipelineOptions handles Spark execution-related configurations, such as the master address, batch-interval, and other user-related knobs.
SparkPipelineResult - Class in org.apache.beam.runners.spark
Represents a Spark pipeline execution result.
SparkPipelineRunner - Class in org.apache.beam.runners.spark
Runs a portable pipeline on Apache Spark.
SparkPipelineRunner(SparkPipelineOptions) - Constructor for class org.apache.beam.runners.spark.SparkPipelineRunner
 
SparkPortableStreamingPipelineOptions - Interface in org.apache.beam.runners.spark
Pipeline options specific to the Spark portable runner running a streaming job.
SparkReceiverIO - Class in org.apache.beam.sdk.io.sparkreceiver
Streaming sources for Spark Receiver.
SparkReceiverIO() - Constructor for class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO
 
SparkReceiverIO.Read<V> - Class in org.apache.beam.sdk.io.sparkreceiver
A PTransform to read from Spark Receiver.
SparkRunner - Class in org.apache.beam.runners.spark
The SparkRunner translates operations defined on a pipeline into a representation executable by Spark, and then submits the job to Spark for execution.
SparkRunner.Evaluator - Class in org.apache.beam.runners.spark
Evaluator on the pipeline.
SparkRunnerDebugger - Class in org.apache.beam.runners.spark
Pipeline runner which translates a Beam pipeline into equivalent Spark operations, without running them.
SparkRunnerDebugger.DebugSparkPipelineResult - Class in org.apache.beam.runners.spark
PipelineResult of running a Pipeline using SparkRunnerDebugger. Use SparkRunnerDebugger.DebugSparkPipelineResult.getDebugString() to get a String representation of the Pipeline translated into Spark-native operations.
SparkRunnerKryoRegistrator - Class in org.apache.beam.runners.spark.coders
A custom KryoRegistrator for the Beam Spark runner that registers the classes used in Spark translation, for better serialization performance.
SparkRunnerKryoRegistrator() - Constructor for class org.apache.beam.runners.spark.coders.SparkRunnerKryoRegistrator
 
SparkRunnerRegistrar - Class in org.apache.beam.runners.spark
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the SparkRunner.
SparkRunnerRegistrar.Options - Class in org.apache.beam.runners.spark
Registers the SparkPipelineOptions.
SparkRunnerRegistrar.Runner - Class in org.apache.beam.runners.spark
Registers the SparkRunner.
SparkServerConfiguration() - Constructor for class org.apache.beam.runners.spark.SparkJobServerDriver.SparkServerConfiguration
 
SparkSessionFactory - Class in org.apache.beam.runners.spark.structuredstreaming.translation
 
SparkSessionFactory() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
 
SparkSessionFactory.SparkKryoRegistrator - Class in org.apache.beam.runners.spark.structuredstreaming.translation
KryoRegistrator for Spark to serialize broadcast variables used for side-inputs.
SparkSideInputReader - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
SideInputReader using broadcasted SideInputValues.
SparkSideInputReader - Class in org.apache.beam.runners.spark.util
A SideInputReader for the SparkRunner.
SparkSideInputReader(Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>) - Constructor for class org.apache.beam.runners.spark.util.SparkSideInputReader
 
SparkStructuredStreamingPipelineOptions - Interface in org.apache.beam.runners.spark.structuredstreaming
Spark runner PipelineOptions handles Spark execution-related configurations, such as the master address, and other user-related knobs.
SparkStructuredStreamingPipelineResult - Class in org.apache.beam.runners.spark.structuredstreaming
 
SparkStructuredStreamingRunner - Class in org.apache.beam.runners.spark.structuredstreaming
A Spark runner built on top of Spark's SQL Engine (Structured Streaming framework).
SparkStructuredStreamingRunnerRegistrar - Class in org.apache.beam.runners.spark.structuredstreaming
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the SparkStructuredStreamingRunner.
SparkStructuredStreamingRunnerRegistrar.Options - Class in org.apache.beam.runners.spark.structuredstreaming
SparkStructuredStreamingRunnerRegistrar.Runner - Class in org.apache.beam.runners.spark.structuredstreaming
SparkTimerInternals - Class in org.apache.beam.runners.spark.stateful
An implementation of TimerInternals for the SparkRunner.
SparkTransformOverrides - Class in org.apache.beam.runners.spark
PTransform overrides for Spark runner.
SparkTransformOverrides() - Constructor for class org.apache.beam.runners.spark.SparkTransformOverrides
 
SparkUnboundedSource - Class in org.apache.beam.runners.spark.io
A "composite" InputDStream implementation for UnboundedSources.
SparkUnboundedSource() - Constructor for class org.apache.beam.runners.spark.io.SparkUnboundedSource
 
SparkUnboundedSource.Metadata - Class in org.apache.beam.runners.spark.io
A metadata holder for an input stream partition.
SparkWatermarks(Instant, Instant, Instant) - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
specific(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
specific(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
specific(Class<T>, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Returns an AvroCoder instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
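For example, a generated Avro SpecificRecord class (here a hypothetical MyRecord) can be paired with the specific coder (assuming records is an existing PCollection<MyRecord>):

    records.setCoder(AvroCoder.specific(MyRecord.class));
    // Equivalent, using a TypeDescriptor:
    records.setCoder(AvroCoder.specific(TypeDescriptor.of(MyRecord.class)));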
specific(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
Returns an AvroDatumFactory instance for the provided element type respecting Avro's Specific* suite for encoding and decoding.
SpecificDatumFactory(Class<T>) - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory.SpecificDatumFactory
 
split(BeamFnApi.ProcessBundleSplitResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleSplitHandler
 
split(double) - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
Ask the remote bundle to split its current processing based upon its knowledge of remaining work.
split(double) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.BoundedCosmosBDSource
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
Splits the source into bundles of approximately desiredBundleSizeBytes.
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
split(int) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns a list of up to numSplits + 1 ByteKeys in ascending order, where the keys have been interpolated to form roughly equal sub-ranges of this ByteKeyRange, assuming a uniform distribution of keys within this range.
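For instance, the full key space can be divided into roughly equal sub-ranges (a small sketch; the interpolation assumes uniformly distributed keys):

    List<ByteKey> splitPoints = ByteKeyRange.ALL_KEYS.split(4);  // up to 5 boundary keys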
split(long, long) - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.UnboundedSource
Returns a list of UnboundedSource objects representing the instances of this source that should be used when executing the workflow.
split(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
split(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
split(String, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
split(Pattern, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
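A minimal sketch of splitting strings on a regular expression (assuming a Pipeline p; the sample input is illustrative):

    PCollection<String> tokens =
        p.apply(Create.of("one,two,,three"))
         .apply(Regex.split(","));
    // The boolean overload, e.g. Regex.split(",", true), controls whether empty strings are emitted.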
Split(Pattern, boolean) - Constructor for class org.apache.beam.sdk.transforms.Regex.Split
 
SPLIT_POINTS_UNKNOWN - Static variable in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
A constant to use as the return value for BoundedSource.BoundedReader.getSplitPointsConsumed() or BoundedSource.BoundedReader.getSplitPointsRemaining() when the exact value is unknown.
splitAtFraction(double) - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Tells the reader to narrow the range of the input it's going to read and give up the remainder, so that the new range would contain approximately the given fraction of the amount of data in the current range.
splitAtFraction(double) - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
SplitIntoRangesFn(long) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform.SplitIntoRangesFn
 
splitReadStream(SplitReadStreamRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
 
splitReadStream(SplitReadStreamRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
 
SplitResult<RestrictionT> - Class in org.apache.beam.sdk.transforms.splittabledofn
A representation of a split result.
SplitResult() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
 
SplunkEvent - Class in org.apache.beam.sdk.io.splunk
A SplunkEvent describes a single payload sent to Splunk's Http Event Collector (HEC) endpoint.
SplunkEvent() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkEvent
 
SplunkEvent.Builder - Class in org.apache.beam.sdk.io.splunk
A builder class for creating a SplunkEvent.
SplunkEventCoder - Class in org.apache.beam.sdk.io.splunk
A Coder for SplunkEvent objects.
SplunkEventCoder() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkEventCoder
 
SplunkIO - Class in org.apache.beam.sdk.io.splunk
An unbounded sink for Splunk's Http Event Collector (HEC).
SplunkIO.Write - Class in org.apache.beam.sdk.io.splunk
Class SplunkIO.Write provides a PTransform that allows writing SplunkEvent records into a Splunk HTTP Event Collector endpoint using HTTP POST requests.
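A write sketch, assuming events is an existing PCollection<SplunkEvent> and that SplunkIO.write(url, token) is the entry point (verify against the released API); the HEC URL and token below are placeholders:

    events.apply(SplunkIO.write("https://splunk-hec.example.com:8088", "my-hec-token"));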
SplunkWriteError - Class in org.apache.beam.sdk.io.splunk
A class for capturing errors that occur while writing SplunkEvent to Splunk's Http Event Collector (HEC) endpoint.
SplunkWriteError() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkWriteError
 
SplunkWriteError.Builder - Class in org.apache.beam.sdk.io.splunk
A builder class for creating a SplunkWriteError.
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDate
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTime
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampLTZ
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampNTZ
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampTZ
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.geospatial.SnowflakeGeography
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.logical.SnowflakeBoolean
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeFloat
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
 
sql() - Method in interface org.apache.beam.sdk.io.snowflake.data.SnowflakeDataType
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeArray
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeObject
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeVariant
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
 
sql() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
 
SqlAnalyzer - Class in org.apache.beam.sdk.extensions.sql.zetasql
Adapter for Analyzer to simplify the API for parsing the query and resolving the AST.
SqlCheckConstraint - Class in org.apache.beam.sdk.extensions.sql.impl.parser
Parse tree for UNIQUE, PRIMARY KEY constraints.
SqlColumnDeclaration - Class in org.apache.beam.sdk.extensions.sql.impl.parser
Parse tree for a column declaration.
SqlConversionException - Exception in org.apache.beam.sdk.extensions.sql.impl
Exception thrown when Beam SQL cannot convert a SQL query to a BeamRelNode.
SqlConversionException(Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.impl.SqlConversionException
 
SqlConversionException(String) - Constructor for exception org.apache.beam.sdk.extensions.sql.impl.SqlConversionException
 
SqlConversionException(String, Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.impl.SqlConversionException
 
SqlCreateExternalTable - Class in org.apache.beam.sdk.extensions.sql.impl.parser
Parse tree for CREATE EXTERNAL TABLE statement.
SqlCreateExternalTable(SqlParserPos, boolean, boolean, SqlIdentifier, List<Schema.Field>, SqlNode, SqlNode, SqlNode, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
Creates a SqlCreateExternalTable.
SqlCreateFunction - Class in org.apache.beam.sdk.extensions.sql.impl.parser
Parse tree for CREATE FUNCTION statement.
SqlCreateFunction(SqlParserPos, boolean, SqlIdentifier, SqlNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
Creates a SqlCreateFunction.
SqlDdlNodes - Class in org.apache.beam.sdk.extensions.sql.impl.parser
Utilities concerning SqlNode for DDL.
SqlDropTable - Class in org.apache.beam.sdk.extensions.sql.impl.parser
Parse tree for DROP TABLE statement.
SqlOperators - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
A separate SqlOperators table for functions that do not exist in, or are not compatible with, Calcite.
SqlOperators() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
sqlScalarFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
 
SqlSetOptionBeam - Class in org.apache.beam.sdk.extensions.sql.impl.parser
SQL parse tree node to represent SET and RESET statements.
SqlSetOptionBeam(SqlParserPos, String, SqlIdentifier, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlSetOptionBeam
 
sqlTableValuedFunctions() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
SQL-native user-defined table-valued functions that can be resolved by the Analyzer.
SqlTransform - Class in org.apache.beam.sdk.extensions.sql
SqlTransform is the DSL interface of Beam SQL.
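A small sketch of querying a schema-aware PCollection with SQL (assuming rows is an existing PCollection<Row> whose schema has name and amount fields; a single input is registered under the default PCOLLECTION table name):

    PCollection<Row> bigSales =
        rows.apply(SqlTransform.query(
            "SELECT name, amount FROM PCOLLECTION WHERE amount > 100"));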
SqlTransform() - Constructor for class org.apache.beam.sdk.extensions.sql.SqlTransform
 
SqlTransformSchemaTransformProvider - Class in org.apache.beam.sdk.extensions.sql.expansion
 
SqlTransformSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.SqlTransformSchemaTransformProvider
 
SqlTypes - Class in org.apache.beam.sdk.schemas.logicaltypes
Beam Schema.LogicalTypes corresponding to SQL data types.
sqlTypeWithAutoCast(RelDataTypeFactory, Type) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
SQL-Java type mapping, with Beam-specified rules.
SqsIO - Class in org.apache.beam.sdk.io.aws.sqs
Deprecated.
Module beam-sdks-java-io-amazon-web-services is deprecated and will eventually be removed. Please migrate to SqsIO in module beam-sdks-java-io-amazon-web-services2.
SqsIO - Class in org.apache.beam.sdk.io.aws2.sqs
IO to read (unbounded) from and write to SQS queues.
SqsIO.Read - Class in org.apache.beam.sdk.io.aws.sqs
Deprecated.
A PTransform to read/receive messages from SQS.
SqsIO.Read - Class in org.apache.beam.sdk.io.aws2.sqs
A PTransform to read/receive messages from SQS.
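A minimal read sketch against the aws2 module (assuming a Pipeline p; the queue URL is a placeholder and the builder method name should be verified against the released API):

    PCollection<SqsMessage> messages = p.apply(
        SqsIO.read().withQueueUrl("https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"));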
SqsIO.Write - Class in org.apache.beam.sdk.io.aws.sqs
Deprecated.
A PTransform to send messages to SQS.
SqsIO.Write - Class in org.apache.beam.sdk.io.aws2.sqs
Deprecated.
superseded by SqsIO.WriteBatches
SqsIO.WriteBatches<T> - Class in org.apache.beam.sdk.io.aws2.sqs
A PTransform to send messages to SQS.
SqsIO.WriteBatches.DynamicDestination<T> - Interface in org.apache.beam.sdk.io.aws2.sqs
 
SqsIO.WriteBatches.EntryMapperFn<T> - Interface in org.apache.beam.sdk.io.aws2.sqs
Mapper to create a SendMessageBatchRequestEntry from a unique batch entry id and the input T.
SqsIO.WriteBatches.EntryMapperFn.Builder<T> - Interface in org.apache.beam.sdk.io.aws2.sqs
A more convenient SqsIO.WriteBatches.EntryMapperFn variant that already sets the entry id.
SqsIO.WriteBatches.Result - Class in org.apache.beam.sdk.io.aws2.sqs
SqsMessage - Class in org.apache.beam.sdk.io.aws2.sqs
 
SqsMessage() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
 
src - Variable in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
 
src - Variable in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
 
SSECustomerKey - Class in org.apache.beam.sdk.io.aws2.s3
Customer provided key for use with Amazon S3 server-side encryption.
SSECustomerKey.Builder - Class in org.apache.beam.sdk.io.aws2.s3
 
SSECustomerKeyFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.S3Options.SSECustomerKeyFactory
 
stageArtifacts(RunnerApi.Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
StageBundleFactory - Interface in org.apache.beam.runners.fnexecution.control
A bundle factory scoped to a particular ExecutableStage, which has all of the resources it needs to provide new RemoteBundles.
StagedFile() - Constructor for class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
 
stageFiles(List<PackageUtil.StagedFile>) - Method in class org.apache.beam.runners.dataflow.util.GcsStager
Stages files to DataflowPipelineOptions.getStagingLocation(), suffixed with their md5 hash to avoid collisions.
stageFiles(List<PackageUtil.StagedFile>) - Method in interface org.apache.beam.runners.dataflow.util.Stager
Stage files and return a list of DataflowPackage objects describing the actual location at which each file was staged.
stagePackage(PackageUtil.PackageAttributes, Sleeper, CreateOptions) - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
Stages one file ("package") if necessary.
Stager - Interface in org.apache.beam.runners.dataflow.util
Interface for staging files needed for running a Dataflow pipeline.
StagerFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
 
stageToFile(byte[], String) - Method in class org.apache.beam.runners.dataflow.util.GcsStager
 
stageToFile(byte[], String, String, CreateOptions) - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
 
stageToFile(byte[], String) - Method in interface org.apache.beam.runners.dataflow.util.Stager
Stage bytes to a target file name wherever this stager stages things.
STAGING_TO_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
StagingLocationFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
 
StandardCreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
 
start() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
 
start() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Start the job.
start() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
start() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
start() - Method in class org.apache.beam.runners.spark.metrics.sink.CsvSink
 
start() - Method in class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
 
start() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
 
start() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
 
start() - Method in class org.apache.beam.sdk.extensions.python.PythonService
 
start() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
Starts the flushing daemon thread if data_buffer_time_limit_ms is set.
start() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
start() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
The AutoScaler is started when the JmsIO.UnboundedJmsReader is started.
start() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
 
start() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
start() - Method in interface org.apache.beam.sdk.io.solace.broker.MessageReceiver
Starts the message receiver.
start() - Method in class org.apache.beam.sdk.io.solace.broker.SolaceMessageReceiver
 
start() - Method in class org.apache.beam.sdk.io.Source.Reader
Initializes the reader and advances the reader to the first record.
start() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Initializes the reader and advances the reader to the first record.
start() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the start of this window, inclusive.
start() - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
 
START_WITHS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
startAt(Instant) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
Assign a timestamp when the pipeline starts to produce data.
startBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
startBundle(DoFn<Row, Row>.StartBundleContext, PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
 
startBundle(DoFn<Iterable<KV<DestinationT, WriteTables.Result>>, Iterable<KV<TableDestination, WriteTables.Result>>>.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
startBundle(DoFn<PubsubMessage, Void>.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
startBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
startBundle() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
startBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
StartBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
 
startCopyJob(JobReference, JobConfigurationTableCopy) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery copy job.
startCopyJob(JobReference, JobConfigurationTableCopy) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startExtractJob(JobReference, JobConfigurationExtract) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery extract job.
startExtractJob(JobReference, JobConfigurationExtract) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
startImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Initializes the OffsetBasedSource.OffsetBasedReader and advances to the first record, returning true if there is a record available to be read.
startLoadJob(JobReference, JobConfigurationLoad) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery load job.
startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery load job with stream content.
startLoadJob(JobReference, JobConfigurationLoad) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startProcess(String, String, List<String>, Map<String, String>) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
Forks a process with the given command, arguments, and additional environment variables.
startProcess(String, String, List<String>, Map<String, String>, File) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
 
startQueryJob(JobReference, JobConfigurationQuery) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery query job.
startQueryJob(JobReference, JobConfigurationQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource.AvroReader
 
startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Creates a decompressing channel from the input channel and passes it to its delegate reader's FileBasedReader#startReading(ReadableByteChannel).
startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
Performs any initialization of the subclass of FileBasedReader that involves IO operations.
startRunnerBundle(DoFnRunner<KV<?, ?>, OutputT>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
 
STARTS_WITH_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
startsWith(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
startsWith(String) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
startsWith(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
startsWith(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
startsWith(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
state(StreamObserver<BeamFnApi.StateResponse>) - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
 
State - Interface in org.apache.beam.sdk.state
A state cell, supporting a State.clear() operation.
STATE_CACHE_SIZE - Static variable in interface org.apache.beam.sdk.options.ExperimentalOptions
 
STATE_SAMPLING_PERIOD_MILLIS - Static variable in interface org.apache.beam.sdk.options.ExperimentalOptions
 
StateAndTimerBundleCheckpointHandler(TimerInternalsFactory<T>, StateInternalsFactory<T>, Coder<WindowedValue<T>>, Coder) - Constructor for class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
 
StateBinder - Interface in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
StateContext<W extends BoundedWindow> - Interface in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
StateContexts - Class in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
StateContexts() - Constructor for class org.apache.beam.sdk.state.StateContexts
 
StateDelegator - Interface in org.apache.beam.runners.fnexecution.state
The StateDelegator is able to delegate BeamFnApi.StateRequests to a set of registered handlers.
StateDelegator.Registration - Interface in org.apache.beam.runners.fnexecution.state
Allows callers to deregister from receiving further state requests.
StatefulParDoP<OutputT> - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for Beam's stateful ParDo primitive.
StatefulParDoP.Supplier<OutputT> - Class in org.apache.beam.runners.jet.processors
Jet Processor supplier that will provide instances of StatefulParDoP.
stateInternals() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.NoOpStepContext
 
stateInternals() - Method in class org.apache.beam.runners.twister2.utils.NoOpStepContext
 
StateKeySpec - Class in org.apache.beam.sdk.state
 
StateRequestHandler - Interface in org.apache.beam.runners.fnexecution.state
Handler for StateRequests.
StateRequestHandlers - Class in org.apache.beam.runners.fnexecution.state
A set of utility methods which construct StateRequestHandlers.
StateRequestHandlers() - Constructor for class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
 
StateRequestHandlers.BagUserStateHandler<K,V,W extends BoundedWindow> - Interface in org.apache.beam.runners.fnexecution.state
A handler for bag user state.
StateRequestHandlers.BagUserStateHandlerFactory<K,V,W extends BoundedWindow> - Interface in org.apache.beam.runners.fnexecution.state
A factory which constructs StateRequestHandlers.BagUserStateHandlers.
StateRequestHandlers.IterableSideInputHandler<V,W extends BoundedWindow> - Interface in org.apache.beam.runners.fnexecution.state
A handler for iterable side inputs.
StateRequestHandlers.MultimapSideInputHandler<K,V,W extends BoundedWindow> - Interface in org.apache.beam.runners.fnexecution.state
A handler for multimap side inputs.
StateRequestHandlers.SideInputHandler - Interface in org.apache.beam.runners.fnexecution.state
Marker interface that denotes some type of side input handler.
StateRequestHandlers.SideInputHandlerFactory - Interface in org.apache.beam.runners.fnexecution.state
StateSpec<StateT extends State> - Interface in org.apache.beam.sdk.state
A specification of a persistent state cell.
StateSpec.Cases<ResultT> - Interface in org.apache.beam.sdk.state
Cases for doing a "switch" on the type of StateSpec.
StateSpec.Cases.WithDefault<ResultT> - Class in org.apache.beam.sdk.state
A base class for a visitor with a default method for cases it is not interested in.
StateSpecFunctions - Class in org.apache.beam.runners.spark.stateful
A class containing StateSpec mappingFunctions.
StateSpecFunctions() - Constructor for class org.apache.beam.runners.spark.stateful.StateSpecFunctions
 
StateSpecs - Class in org.apache.beam.sdk.state
Static methods for working with StateSpecs.
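A sketch of declaring and using a per-key state cell in a stateful DoFn (a running count per key; the class and state names are illustrative):

    class CountPerKeyFn extends DoFn<KV<String, Integer>, KV<String, Integer>> {
      @StateId("count")
      private final StateSpec<ValueState<Integer>> countSpec = StateSpecs.value(VarIntCoder.of());

      @ProcessElement
      public void process(ProcessContext c, @StateId("count") ValueState<Integer> count) {
        Integer previous = count.read();                  // null on the first element for a key
        int updated = (previous == null ? 0 : previous) + c.element().getValue();
        count.write(updated);
        c.output(KV.of(c.element().getKey(), updated));
      }
    }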
StaticGrpcProvisionService - Class in org.apache.beam.runners.fnexecution.provisioning
A provision service that returns a static response to all calls.
StaticRemoteEnvironment - Class in org.apache.beam.runners.fnexecution.environment
A RemoteEnvironment that connects to a Dataflow runner harness.
StaticRemoteEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
An EnvironmentFactory that creates StaticRemoteEnvironment used by a runner harness that would like to use an existing InstructionRequestHandler.
StaticRemoteEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
Provider for StaticRemoteEnvironmentFactory.
StaticSchemaInference - Class in org.apache.beam.sdk.schemas.utils
A set of utilities for inferring a Beam Schema from static Java types.
StaticSchemaInference() - Constructor for class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
 
status() - Method in class org.apache.beam.sdk.io.fs.MatchResult
Status of the MatchResult.
status() - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
 
STATUS_BACKOFF_FACTORY - Static variable in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
statusCode() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
 
statusMessage() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError
 
stepName() - Method in class org.apache.beam.sdk.metrics.MetricKey
The step name that is associated with this metric, or null if none is associated.
steps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
stop() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
 
stop() - Method in class org.apache.beam.runners.spark.metrics.sink.CsvSink
 
stop() - Method in class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
 
stop() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
stop() - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
 
stop() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
 
stop() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
 
stop() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
The AutoScaler is stopped when the JmsIO.UnboundedJmsReader is closed.
stop() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
 
stop() - Static method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
Indicates that there is no more work to be done for the current element.
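Inside a splittable DoFn's @ProcessElement method, stop() is the usual return value once the claimed restriction is exhausted; a fragment sketch assuming an OffsetRange restriction:

    @ProcessElement
    public ProcessContinuation process(
        RestrictionTracker<OffsetRange, Long> tracker, OutputReceiver<Long> out) {
      for (long i = tracker.currentRestriction().getFrom(); tracker.tryClaim(i); ++i) {
        out.output(i);
      }
      // Either the range is exhausted or the remainder was split away; no more work here.
      return ProcessContinuation.stop();
    }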
stopAfter(Duration) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
For internal use only; no backwards-compatibility guarantees.
stopAt(Instant) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
Assign a timestamp when the pipeline stops producing data.
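A sketch combining startAt and stopAt with a firing interval (assuming a Pipeline p; the timestamps are illustrative):

    PCollection<Instant> ticks = p.apply(
        PeriodicImpulse.create()
            .startAt(Instant.parse("2024-01-01T00:00:00Z"))
            .stopAt(Instant.parse("2024-01-01T01:00:00Z"))
            .withInterval(Duration.standardMinutes(1)));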
stopProcess(String) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
Stops a previously started process identified by its unique id.
stopSampling(int, long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfFiles
 
stopSampling(int, long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.LimitNumberOfTotalBytes
 
stopSampling(int, long) - Method in class org.apache.beam.sdk.io.TextRowCountEstimator.SampleAllFiles
 
stopSampling(int, long) - Method in interface org.apache.beam.sdk.io.TextRowCountEstimator.SamplingStrategy
 
StorageApiCDC - Class in org.apache.beam.sdk.io.gcp.bigquery
Constants and variables for CDC support.
StorageApiCDC() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
StorageApiConvertMessages<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
A transform that converts messages to protocol buffers in preparation for writing to BigQuery.
StorageApiConvertMessages(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<KV<DestinationT, StorageApiWritePayload>>, Coder<BigQueryStorageApiInsertError>, Coder<KV<DestinationT, StorageApiWritePayload>>, SerializableFunction<ElementT, RowMutationInformation>, BadRecordRouter) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
 
StorageApiConvertMessages.ConvertMessagesDoFn<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
StorageApiDynamicDestinationsTableRow<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
StorageApiFlushAndFinalizeDoFn - Class in org.apache.beam.sdk.io.gcp.bigquery
This DoFn flushes and optionally (if requested) finalizes Storage API streams.
StorageApiFlushAndFinalizeDoFn(BigQueryServices) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
 
StorageApiLoads<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
This PTransform manages loads into BigQuery using the Storage API.
StorageApiLoads(Coder<DestinationT>, StorageApiDynamicDestinations<ElementT, DestinationT>, SerializableFunction<ElementT, RowMutationInformation>, BigQueryIO.Write.CreateDisposition, String, Duration, BigQueryServices, int, boolean, boolean, boolean, boolean, boolean, Predicate<String>, boolean, AppendRowsRequest.MissingValueInterpretation, BadRecordRouter, ErrorHandler<BadRecord, ?>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
StorageApiWritePayload - Class in org.apache.beam.sdk.io.gcp.bigquery
Class used to wrap elements being sent to the Storage API sinks.
StorageApiWritePayload() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
StorageApiWritePayload.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
StorageApiWriteRecordsInconsistent<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
A transform to write sharded records to BigQuery using the Storage API.
StorageApiWriteRecordsInconsistent(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Predicate<String>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, boolean, boolean, BigQueryIO.Write.CreateDisposition, String, boolean, AppendRowsRequest.MissingValueInterpretation) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
 
StorageApiWritesShardedRecords<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
A transform to write sharded records to BigQuery using the Storage API (Streaming).
StorageApiWritesShardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryIO.Write.CreateDisposition, String, BigQueryServices, Coder<DestinationT>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Predicate<String>, boolean, boolean, AppendRowsRequest.MissingValueInterpretation) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
 
StorageApiWriteUnshardedRecords<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Write records to the Storage API using a standard batch approach.
StorageApiWriteUnshardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Predicate<String>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, boolean, boolean, BigQueryIO.Write.CreateDisposition, String, boolean, AppendRowsRequest.MissingValueInterpretation) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
 
storageExists() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
 
StorageLevelFactory() - Constructor for class org.apache.beam.runners.spark.SparkCommonPipelineOptions.StorageLevelFactory
 
storageObject() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
Returns the StorageObject.
StorageObjectOrIOException() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
 
storeId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
storeRecord(HistoryRecord) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
 
STREAM_PARTITION_PREFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
STREAMING_ENGINE_EXPERIMENT - Static variable in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
Experiment to turn on the Streaming Engine experiment.
StreamingInserts<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
PTransform that performs streaming BigQuery write.
StreamingInserts(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>, Coder<ElementT>, SerializableFunction<ElementT, TableRow>, SerializableFunction<ElementT, TableRow>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
Constructor.
StreamingInsertsMetrics - Interface in org.apache.beam.sdk.io.gcp.bigquery
Stores and exports metrics for a batch of Streaming Inserts RPCs.
StreamingInsertsMetrics.NoOpStreamingInsertsMetrics - Class in org.apache.beam.sdk.io.gcp.bigquery
No-op implementation of StreamingInsertsMetrics.
StreamingInsertsMetrics.StreamingInsertsMetricsImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
Metrics of a batch of InsertAll RPCs.
StreamingInsertsMetricsImpl() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
 
StreamingIT - Interface in org.apache.beam.sdk.testing
Deprecated.
tests which use unbounded PCollections should be in the category UsesUnboundedPCollections. Beyond that, it is up to the runner and test configuration to decide whether to run in streaming mode.
StreamingLogLevel - Enum in org.apache.beam.sdk.io.snowflake.enums
 
StreamingOptions - Interface in org.apache.beam.sdk.options
Options used to configure streaming.
StreamingSideInputHandlerFactory - Class in org.apache.beam.runners.fnexecution.translation
StateRequestHandler that uses SideInputHandler to access the broadcast state that represents side inputs.
StreamingSourceContextImpl - Class in org.apache.beam.sdk.io.cdap.context
Class for creating context object of different CDAP classes with stream source type.
StreamingSourceContextImpl() - Constructor for class org.apache.beam.sdk.io.cdap.context.StreamingSourceContextImpl
 
StreamingWriteTables<ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
This transform takes in key-value pairs of TableRow entries and the TableDestination they should be written to.
StreamingWriteTables() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
StreamPartitionWithWatermark - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
 
StreamPartitionWithWatermark(Range.ByteStringRange, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
StreamProgress - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
StreamProgress() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
StreamProgress(ChangeStreamContinuationToken, Instant, BigDecimal, Instant, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
StreamProgress(CloseStream) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
streamToTable(SnowflakeServices, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
 
StreamTransformTranslator<TransformT extends PTransform> - Interface in org.apache.beam.runners.twister2.translators
Stream TransformTranslator interface.
STRING - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
STRING - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
The type of string fields.
STRING_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
StringAgg - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
Combine.CombineFns for aggregating strings or bytes with an optional delimiter (default comma).
StringAgg() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg
 
StringAgg.StringAggByte - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
A Combine.CombineFn that aggregates bytes with a byte array as delimiter.
StringAgg.StringAggString - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
A Combine.CombineFn that aggregates strings with a string as delimiter.
StringAggByte(byte[]) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
 
StringAggString(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
 
StringBuilderBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle
 
StringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle
 
StringCompiler - Class in org.apache.beam.sdk.schemas.transforms.providers
 
StringCompiler() - Constructor for class org.apache.beam.sdk.schemas.transforms.providers.StringCompiler
 
StringCompiler.CompileException - Exception in org.apache.beam.sdk.schemas.transforms.providers
 
StringDelegateCoder<T> - Class in org.apache.beam.sdk.coders
A Coder that wraps a Coder<String> and encodes/decodes values via string representations.
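As a brief, hedged illustration of the entry above, the following sketch wraps java.net.URI with StringDelegateCoder (assuming, as the coder requires, that the type round-trips through its string representation):

    import java.net.URI;
    import org.apache.beam.sdk.coders.StringDelegateCoder;

    // Encodes/decodes URI values via their toString() form; decoding uses the
    // single-String constructor (or valueOf) of the wrapped class.
    StringDelegateCoder<URI> uriCoder = StringDelegateCoder.of(URI.class);
    // e.g. uriCollection.setCoder(uriCoder);  // uriCollection is a PCollection<URI>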
StringDelegateCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.StringDelegateCoder
 
StringFunctions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
StringFunctions.
StringFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
strings() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for String.
stringSet(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that accumulates and reports the set of unique string values.
stringSet(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that accumulates and reports the set of unique string values.
StringSet - Interface in org.apache.beam.sdk.metrics
A metric that reports the set of unique string values.
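A minimal sketch of how the StringSet metric entries above might be used inside a DoFn; the namespace, name, and the add(String) call are illustrative assumptions:

    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.StringSet;
    import org.apache.beam.sdk.transforms.DoFn;

    class TrackCountriesFn extends DoFn<String, String> {
      // Accumulates the distinct values reported by this metric.
      private final StringSet countries = Metrics.stringSet("example", "countriesSeen");

      @ProcessElement
      public void processElement(@Element String country, OutputReceiver<String> out) {
        countries.add(country);  // assumed StringSet.add(String)
        out.output(country);
      }
    }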
StringSetImpl - Class in org.apache.beam.runners.jet.metrics
Implementation of StringSet.
StringSetImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.StringSetImpl
 
StringSetResult - Class in org.apache.beam.sdk.metrics
The result of a StringSet metric.
StringSetResult() - Constructor for class org.apache.beam.sdk.metrics.StringSetResult
 
StringSetResult.EmptyStringSetResult - Class in org.apache.beam.sdk.metrics
An immutable, empty StringSetResult representing that no values were reported.
StringUtf8Coder - Class in org.apache.beam.sdk.coders
A Coder that encodes Strings in UTF-8 encoding.
stripGetterPrefix(String) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
StripIdsDoFn() - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
 
stripPartitionDecorator(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Strip off any partition decorator information from a tablespec.
stripPrefix(String, String) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
stripSetterPrefix(String) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
Structs - Class in org.apache.beam.runners.dataflow.util
A collection of static methods for manipulating data structure representations transferred via the Dataflow API.
StructuralByteArray - Class in org.apache.beam.sdk.coders
A wrapper around a byte[] that uses structural, value-based equality rather than byte[]'s normal object identity.
StructuralByteArray(byte[]) - Constructor for class org.apache.beam.sdk.coders.StructuralByteArray
 
StructuralKey<K> - Class in org.apache.beam.runners.local
A (Key, Coder) pair that uses the structural value of the key (as provided by Coder.structuralValue(Object)) to perform equality and hashing.
structuralValue(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
Returns an object with an Object.equals() method that represents structural equality on the argument.
structuralValue(T) - Method in class org.apache.beam.sdk.coders.Coder
Returns an object with an Object.equals() method that represents structural equality on the argument.
structuralValue(T) - Method in class org.apache.beam.sdk.coders.DelegateCoder
Returns an object with an Object.equals() method that represents structural equality on the argument.
structuralValue(Deque<T>) - Method in class org.apache.beam.sdk.coders.DequeCoder
 
structuralValue(Iterable<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
 
structuralValue(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
 
structuralValue(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
 
structuralValue(Map<K, V>) - Method in class org.apache.beam.sdk.coders.MapCoder
 
structuralValue(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
structuralValue(T) - Method in class org.apache.beam.sdk.coders.SerializableCoder
The structural value of the object is the object itself.
structuralValue(SortedMap<K, V>) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
 
structuralValue(T) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
structuralValue(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
structuralValue(T) - Method in class org.apache.beam.sdk.coders.ZstdCoder
Returns an object with an Object.equals() method that represents structural equality on the argument.
structuralValue(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
structuralValue(ProducerRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
structuralValue(TimestampedValue<T>) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
structuralValueConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T> and values of type T, the structural values are equal if and only if the encoded bytes are equal.
structuralValueConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the structural values are equal if and only if the encoded bytes are equal, in any Coder.Context.
structuralValueDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T> and value of type T, the structural value is equal to the structural value yielded by encoding and decoding the original value.
structuralValueDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and value of type T, the structural value is equal to the structural value yielded by encoding and decoding the original value, in any Coder.Context.
structuralValueDecodeEncodeEqualIterable(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T> and value of type T, the structural value of the content of the Iterable is equal to the structural value yielded by encoding and decoding the original value.
structuralValueDecodeEncodeEqualIterableInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and value of type T, the structural value of the content of the Iterable is equal to the structural value yielded by encoding and decoding the original value, in any Coder.Context.
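For illustration, a small JUnit-style sketch exercising the structural-value properties above with a standard coder:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.CoderProperties;
    import org.junit.Test;

    public class StringUtf8CoderStructuralValueTest {
      @Test
      public void structuralValueProperties() throws Exception {
        // Structural-value equality agrees with encoded-bytes equality.
        CoderProperties.structuralValueConsistentWithEquals(
            StringUtf8Coder.of(), "hello", "hello");
        // The structural value survives an encode/decode round trip.
        CoderProperties.structuralValueDecodeEncodeEqual(StringUtf8Coder.of(), "hello");
      }
    }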
StructuredCoder<T> - Class in org.apache.beam.sdk.coders
An abstract base class to implement a Coder that defines equality, hashing, and printing via the class name and recursively using StructuredCoder.getComponents().
StructuredCoder() - Constructor for class org.apache.beam.sdk.coders.StructuredCoder
 
studyId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
subclassGetterInterface(ByteBuddy, Type, Type) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
 
subclassSetterInterface(ByteBuddy, Type, Type) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
 
submitFn - Variable in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
 
subpath(int, int) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
subPathMatches(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricFiltering
subPathMatches(haystack, needle) returns true if needle represents a path within haystack.
SubscriberOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
SubscriberOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
SubscriberOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
SubscribeTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
SubscribeTransform(SubscriberOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
 
SubscriptionPartition - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
SubscriptionPartition() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartition
 
SubscriptionPartitionCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
SubscriptionPartitionCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Subscription path used to listen for messages on TestPubsub.topicPath().
subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
subscriptionPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
subscriptionPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
substr(String, long, long) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
SUBSTR - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
SUBSTR_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
SUBSTR_PARAMETER_EXCEED_INTEGER - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
subTriggers - Variable in class org.apache.beam.sdk.transforms.windowing.Trigger
 
subTriggers() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
SUCCESS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
The tag for successful writes to the HL7v2 store.
success(String, String, Metadata) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
 
success(String, String) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
 
success() - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
SUCCESS_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
 
SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
 
SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
SUCCESS_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
SUCCESSFUL_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for successful writes to FHIR store.
SUCCESSFUL_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
The TupleTag used for bundles that were executed successfully.
SUCCESSFUL_PUBLISH_TAG - Static variable in class org.apache.beam.sdk.io.solace.SolaceIO.Write
 
SUCCESSFUL_WRITES - Static variable in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
SuccessOrFailure - Class in org.apache.beam.sdk.testing
Output of PAssert.
Sum() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
 
Sum - Class in org.apache.beam.sdk.transforms
PTransforms for computing the sum of the elements in a PCollection, or the sum of the values associated with each key in a PCollection of KVs.
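A short sketch of the Sum transforms in use (pipeline construction elided; element values are illustrative):

    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<Integer> numbers = pipeline.apply(Create.of(1, 2, 3, 4));
    // Globally: a single-element PCollection containing 10.
    PCollection<Integer> total = numbers.apply(Sum.integersGlobally());
    // Per key: Sum.integersPerKey() sums the values of a PCollection<KV<K, Integer>>.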
supplier(Coder, Coder, WindowingStrategy<T, BoundedWindow>, String) - Static method in class org.apache.beam.runners.jet.processors.AssignWindowP
 
supplier(BoundedSource<T>, SerializablePipelineOptions, Coder, String) - Static method in class org.apache.beam.runners.jet.processors.BoundedSourceP
 
Supplier(Map<String, Coder>, Coder, String) - Constructor for class org.apache.beam.runners.jet.processors.FlattenP.Supplier
 
supplier(Coder, String) - Static method in class org.apache.beam.runners.jet.processors.ImpulseP
 
Supplier(String, String, DoFn<InputT, OutputT>, WindowingStrategy<?, ?>, DoFnSchemaInformation, SerializablePipelineOptions, TupleTag<OutputT>, Set<TupleTag<OutputT>>, Coder<InputT>, Map<PCollectionView<?>, Coder<?>>, Map<TupleTag<?>, Coder<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, Collection<PCollectionView<?>>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.jet.processors.ParDoP.Supplier
 
Supplier(String, String, DoFn<KV<?, ?>, OutputT>, WindowingStrategy<?, ?>, DoFnSchemaInformation, SerializablePipelineOptions, TupleTag<OutputT>, Set<TupleTag<OutputT>>, Coder<KV<?, ?>>, Map<PCollectionView<?>, Coder<?>>, Map<TupleTag<?>, Coder<?>>, Coder<KV<?, ?>>, Map<TupleTag<?>, Coder<?>>, Collection<PCollectionView<?>>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.jet.processors.StatefulParDoP.Supplier
 
supplier(UnboundedSource<T, CmT>, SerializablePipelineOptions, Coder, String) - Static method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
 
supplier(Coder, Coder, WindowingStrategy<?, ?>, String) - Static method in class org.apache.beam.runners.jet.processors.ViewP
 
supplier(SerializablePipelineOptions, WindowedValue.WindowedValueCoder<KV<K, V>>, Coder, WindowingStrategy, String) - Static method in class org.apache.beam.runners.jet.processors.WindowGroupP
 
SUPPORTED_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
SUPPORTED_FORMATS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
SUPPORTED_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
SUPPORTED_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
 
supportsCondition() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
supportsProjectionPushdown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
supportsProjectionPushdown() - Method in interface org.apache.beam.sdk.schemas.ProjectionProducer
Whether this supports projection pushdown.
supportsProjects() - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
 
supportsProjects() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
Whether project push-down is supported by the IO API.
supportsProjects() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
 
SynchronizedStreamObserver<V> - Class in org.apache.beam.sdk.fn.stream
A StreamObserver which provides synchronous access to an underlying StreamObserver.
SystemReduceFnBuffering<K,T,W extends BoundedWindow> - Class in org.apache.beam.runners.twister2.translators.functions.internal
 
SystemReduceFnBuffering() - Constructor for class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 
SystemReduceFnBuffering(Coder<T>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
 

T

T__0 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
T__0 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
T__1 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
T__1 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
T__2 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
T__2 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
T__3 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
T__3 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
T__4 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
T__4 - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
table() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
 
Table - Class in org.apache.beam.sdk.extensions.sql.meta
Represents the metadata of a BeamSqlTable.
Table() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.Table
 
table() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
Table.Builder - Class in org.apache.beam.sdk.extensions.sql.meta
Builder class for Table.
TABLE_FIELD_SCHEMAS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
TABLE_ROW_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
TableAndQuery() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
TableAndRecord<T> - Class in org.apache.beam.sdk.io.kudu
A wrapper for a KuduTable and the T representing a typed record.
TableAndRecord(KuduTable, T) - Constructor for class org.apache.beam.sdk.io.kudu.TableAndRecord
 
TableDestination - Class in org.apache.beam.sdk.io.gcp.bigquery
Encapsulates a BigQuery table destination.
TableDestination(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, TimePartitioning, Clustering) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestinationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A coder for TableDestination objects.
TableDestinationCoderV2 - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder for TableDestination that includes time partitioning information.
TableDestinationCoderV2() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
TableDestinationCoderV3 - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder for TableDestination that includes time partitioning and clustering information.
TableDestinationCoderV3() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
tableExists() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Checks whether the metadata table already exists in the database.
tableFieldToProtoTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
TableName - Class in org.apache.beam.sdk.extensions.sql.impl
Represents a parsed table name that is specified in a FROM clause (and other places).
TableName() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.TableName
 
TableNameExtractionUtils - Class in org.apache.beam.sdk.extensions.sql
Helper class to extract table identifiers from the query.
TableNameExtractionUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.TableNameExtractionUtils
 
TableProvider - Interface in org.apache.beam.sdk.extensions.sql.meta.provider
A TableProvider handles the metadata CRUD for a specified kind of table.
tableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
TableResolution - Class in org.apache.beam.sdk.extensions.sql.zetasql
Utility methods to resolve a table, given a top-level Calcite schema and a table path.
TableResolution() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.TableResolution
 
tableRowFromBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
tableRowFromMessage(Message, boolean, Predicate<String>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
tableRowFromMessage(Message, boolean, Predicate<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
TableRowJsonCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder that encodes BigQuery TableRow objects in their native JSON format.
tableRows(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
tableRowToBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
TableRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for converting JSON TableRow objects to dynamic protocol message, for use with the Storage write API.
TableRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
TableRowToStorageApiProto.SchemaDoesntMatchException - Exception in org.apache.beam.sdk.io.gcp.bigquery
 
TableRowToStorageApiProto.SchemaTooNarrowException - Exception in org.apache.beam.sdk.io.gcp.bigquery
 
TableRowToStorageApiProto.SingleValueConversionException - Exception in org.apache.beam.sdk.io.gcp.bigquery
 
tables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
tableSchema() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
TableSchema - Class in org.apache.beam.sdk.io.clickhouse
A descriptor for ClickHouse table schema.
TableSchema() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema
 
TableSchema.Column - Class in org.apache.beam.sdk.io.clickhouse
A column in ClickHouse table.
TableSchema.ColumnType - Class in org.apache.beam.sdk.io.clickhouse
A descriptor for a column type.
TableSchema.DefaultType - Enum in org.apache.beam.sdk.io.clickhouse
An enumeration of possible kinds of default values in ClickHouse.
TableSchema.TypeName - Enum in org.apache.beam.sdk.io.clickhouse
An enumeration of possible types in ClickHouse.
TableSchemaCache - Class in org.apache.beam.sdk.io.gcp.bigquery
An updatable cache for table schemas.
TableSchemaUpdateUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
Helper utilities for handling schema-update responses.
TableSchemaUpdateUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaUpdateUtils
 
tableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
TableUtils - Class in org.apache.beam.sdk.extensions.sql
 
TableWithRows(long, Table) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.TableWithRows
 
TaggedKeyedPCollection(TupleTag<V>, PCollection<KV<K, V>>) - Constructor for class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
 
TaggedPValue - Class in org.apache.beam.sdk.values
For internal use only; no backwards-compatibility guarantees.
TaggedPValue() - Constructor for class org.apache.beam.sdk.values.TaggedPValue
 
take(String, Duration) - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool.Source
Retrieves the InstructionRequestHandler for the given worker id, blocking until available or the request times out.
take() - Method in class org.apache.beam.sdk.fn.CancellableQueue
Takes an element from this queue.
takeOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
takeOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
takeOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Deprecated.
Use TestPipeline with the DirectRunner.
tanh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
TANH(X)
targetForRootUrl(String) - Static method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
Internal only utility for converting PubsubOptions.getPubsubRootUrl() (e.g.
TDigestQuantiles - Class in org.apache.beam.sdk.extensions.sketching
PTransforms for getting information about quantiles in a stream.
TDigestQuantiles() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
 
TDigestQuantiles.GlobalDigest - Class in org.apache.beam.sdk.extensions.sketching
Implementation of TDigestQuantiles.globally().
TDigestQuantiles.PerKeyDigest<K> - Class in org.apache.beam.sdk.extensions.sketching
Implementation of TDigestQuantiles.perKey().
TDigestQuantiles.TDigestQuantilesFn - Class in org.apache.beam.sdk.extensions.sketching
Implements the Combine.CombineFn of TDigestQuantiles transforms.
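A rough sketch of applying TDigestQuantiles globally; the compression value and the shape of the output digest are assumptions based on the underlying t-digest library:

    import org.apache.beam.sdk.extensions.sketching.TDigestQuantiles;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<Double> measurements = pipeline.apply(Create.of(1.0, 2.0, 3.0));
    // Produces a PCollection holding a single merged digest of the input,
    // from which quantiles (e.g. the median) can be read downstream.
    measurements.apply(TDigestQuantiles.globally().withCompression(100));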
teardown() - Method in interface org.apache.beam.io.requestresponse.SetupTeardown
Called during the DoFn's teardown lifecycle method.
teardown() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
tearDown() - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
Cleans up resources of the instance.
teardown() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
 
teardown() - Method in interface org.apache.beam.sdk.io.kafka.CheckStopReadingFn
 
teardown() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
 
teardown() - Method in class org.apache.beam.sdk.io.pulsar.WriteToPulsarDoFn
 
teardown() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
tearDown() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream
 
tearDown() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream
 
tearDown(Blackhole) - Method in class org.apache.beam.sdk.jmh.util.VarIntBenchmark.ByteStringOutput
 
Tee<T> - Class in org.apache.beam.sdk.transforms
A PTransform that returns its input, but also applies its input to an auxiliary PTransform, akin to the shell tee command, which is named after the T-splitter used in plumbing.
TEMP_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for temp files for import to FHIR store.
test(RunnerApi.PTransform) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform
 
test(RunnerApi.PTransform) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform
 
TestBigQuery - Class in org.apache.beam.sdk.io.gcp.bigquery
Test rule which creates a new table with the specified schema and a randomized name, and exposes a few APIs to work with it.
TestBigQuery.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.bigquery
Interface to implement a polling assertion.
TestBigQuery.RowsAssertion - Class in org.apache.beam.sdk.io.gcp.bigquery
Interface for creating a polling eventual assertion.
TestBigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
TestBoundedTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
Mocked table for bounded data sources.
TestBoundedTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
testByteCount(Coder<T>, Coder.Context, T[]) - Static method in class org.apache.beam.sdk.testing.CoderProperties
A utility method that passes the given (unencoded) elements through the coder's registerByteSizeObserver() and encode() methods, and confirms they are mutually consistent.
testCombineFn(Combine.CombineFn<InputT, AccumT, OutputT>, List<InputT>, OutputT) - Static method in class org.apache.beam.sdk.testing.CombineFnTester
Tests that the Combine.CombineFn, when applied to the provided input, produces the provided output.
testCombineFn(Combine.CombineFn<InputT, AccumT, OutputT>, List<InputT>, Matcher<? super OutputT>) - Static method in class org.apache.beam.sdk.testing.CombineFnTester
 
testCopyArray(ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState, Blackhole) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy
 
TestDataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow
A set of options used to configure the TestPipeline.
TestDataflowRunner - Class in org.apache.beam.runners.dataflow
TestDataflowRunner is a pipeline runner that wraps a DataflowRunner when running tests against the TestPipeline.
TestElementByteSizeObserver() - Constructor for class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
TestExecutors - Class in org.apache.beam.sdk.fn.test
A TestRule that validates that all submitted tasks finished and were completed.
TestExecutors() - Constructor for class org.apache.beam.sdk.fn.test.TestExecutors
 
TestExecutors.TestExecutorService - Interface in org.apache.beam.sdk.fn.test
A union of the ExecutorService and TestRule interfaces.
TestFlinkRunner - Class in org.apache.beam.runners.flink
Test Flink runner.
testingPipelineOptions() - Static method in class org.apache.beam.sdk.testing.TestPipeline
Creates PipelineOptions for testing.
TestJobService - Class in org.apache.beam.runners.portability.testing
A JobService for tests.
TestJobService(Endpoints.ApiServiceDescriptor, String, String, JobApi.JobState.Enum, JobApi.MetricResults) - Constructor for class org.apache.beam.runners.portability.testing.TestJobService
 
testNewArray(ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState, Blackhole) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy
 
TestPipeline - Class in org.apache.beam.sdk.testing
A creator of test pipelines that can be used inside of tests that can be configured to run locally or against a remote pipeline runner.
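A conventional JUnit usage sketch for TestPipeline:

    import org.apache.beam.sdk.testing.TestPipeline;
    import org.junit.Rule;
    import org.junit.Test;

    public class MyTransformTest {
      // Runs against the local runner by default; a remote runner can be
      // configured through the test pipeline options.
      @Rule public final transient TestPipeline pipeline = TestPipeline.create();

      @Test
      public void runsThePipeline() {
        // ... apply transforms and PAssert checks here ...
        pipeline.run().waitUntilFinish();
      }
    }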
TestPipeline.AbandonedNodeException - Exception in org.apache.beam.sdk.testing
An exception thrown in case an abandoned PTransform is detected, that is, a PTransform that has not been run.
TestPipeline.PipelineRunMissingException - Exception in org.apache.beam.sdk.testing
An exception thrown in case a test finishes without invoking Pipeline.run().
TestPipeline.TestValueProviderOptions - Interface in org.apache.beam.sdk.testing
Implementation detail of TestPipeline.newProvider(T), do not use.
TestPipelineOptions - Interface in org.apache.beam.sdk.testing
TestPipelineOptions is a set of options for test pipelines.
TestPipelineOptions.AlwaysPassMatcher - Class in org.apache.beam.sdk.testing
Matcher which will always pass.
TestPipelineOptions.AlwaysPassMatcherFactory - Class in org.apache.beam.sdk.testing
Factory for PipelineResult matchers which always pass.
TestPortablePipelineOptions - Interface in org.apache.beam.runners.portability.testing
Options for TestPortableRunner.
TestPortablePipelineOptions.DefaultJobServerConfigFactory - Class in org.apache.beam.runners.portability.testing
Factory for default config.
TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar - Class in org.apache.beam.runners.portability.testing
TestPortablePipelineOptionsRegistrar() - Constructor for class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
 
TestPortableRunner - Class in org.apache.beam.runners.portability.testing
TestPortableRunner is a pipeline runner that wraps a PortableRunner when running tests against the TestPipeline.
TestPrismPipelineOptions - Interface in org.apache.beam.runners.prism
TestPrismRunner - Class in org.apache.beam.runners.prism
TestPrismRunner is the recommended PipelineRunner to use for tests that rely on sdks/go/cmd/prism.
testProtobufByteStringOutputStreamFewLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamFewMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamFewMixedWritesWithReuse(ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamFewSmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamFewTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamManyLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamManyMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamManyMixedWritesWithReuse(ByteStringOutputStreamBenchmark.ProtobufByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamManySmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testProtobufByteStringOutputStreamManyTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
TestPubsub - Class in org.apache.beam.sdk.io.gcp.pubsub
Test rule which creates a new topic and subscription with randomized names and exposes the APIs to work with them.
TestPubsub.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.pubsub
 
TestPubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
TestPubsubSignal - Class in org.apache.beam.sdk.io.gcp.pubsub
Test rule which observes elements of the PCollection and checks whether they match the success criteria.
TestSchemaTransformProvider - Class in org.apache.beam.sdk.managed.testing
 
TestSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.managed.testing.TestSchemaTransformProvider
 
TestSchemaTransformProvider.Config - Class in org.apache.beam.sdk.managed.testing
 
TestSchemaTransformProvider.Config.Builder - Class in org.apache.beam.sdk.managed.testing
 
testSdkCoreByteStringOutputStreamFewLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamFewMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamFewMixedWritesWithReuse(ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamFewSmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamFewTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamManyLargeWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamManyMixedWritesWithoutReuse() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamManyMixedWritesWithReuse(ByteStringOutputStreamBenchmark.SdkCoreByteStringOutputStream) - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamManySmallWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
testSdkCoreByteStringOutputStreamManyTinyWrites() - Method in class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark
 
TestSparkPipelineOptions - Interface in org.apache.beam.runners.spark
TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory - Class in org.apache.beam.runners.spark
A factory to provide the default watermark to stop a pipeline that reads from an unbounded source.
TestSparkRunner - Class in org.apache.beam.runners.spark
The SparkRunner translates operations defined on a pipeline into a representation executable by Spark, and then submits the job to Spark for execution.
TestStream<T> - Class in org.apache.beam.sdk.testing
A testing input that generates an unbounded PCollection of elements, advancing the watermark and processing time as elements are emitted.
TestStream.Builder<T> - Class in org.apache.beam.sdk.testing
An incomplete TestStream.
TestStream.ElementEvent<T> - Class in org.apache.beam.sdk.testing
A TestStream.Event that produces elements.
TestStream.Event<T> - Interface in org.apache.beam.sdk.testing
An event in a TestStream.
TestStream.EventType - Enum in org.apache.beam.sdk.testing
The types of TestStream.Event that are supported by TestStream.
TestStream.ProcessingTimeEvent<T> - Class in org.apache.beam.sdk.testing
A TestStream.Event that advances the processing time clock.
TestStream.TestStreamCoder<T> - Class in org.apache.beam.sdk.testing
Coder for TestStream.
TestStream.WatermarkEvent<T> - Class in org.apache.beam.sdk.testing
A TestStream.Event that advances the watermark.
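A minimal sketch of building the TestStream described above (element values and watermark positions are illustrative):

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Instant;

    TestStream<String> events =
        TestStream.create(StringUtf8Coder.of())
            .addElements("a", "b")
            .advanceWatermarkTo(new Instant(1000L))
            .addElements("c")
            .advanceWatermarkToInfinity();  // finishes the stream

    PCollection<String> input = pipeline.apply(events);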
TestStreams - Class in org.apache.beam.sdk.fn.test
Utility methods which enable testing of StreamObservers.
TestStreams() - Constructor for class org.apache.beam.sdk.fn.test.TestStreams
 
TestStreams.Builder<T> - Class in org.apache.beam.sdk.fn.test
A builder for a test CallStreamObserver that performs various callbacks.
TestTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
Base class for mocked table.
TestTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
 
TestTableFilter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
 
TestTableFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
 
TestTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
In-memory table provider for use in tests.
TestTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
TestTableProvider.PushDownOptions - Enum in org.apache.beam.sdk.extensions.sql.meta.provider.test
 
TestTableProvider.TableWithRows - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
TableWithRows.
TestTableUtils - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
Utility functions for mock classes.
TestTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
TestUnboundedTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
A mocked unbounded table.
TestUniversalRunner - Class in org.apache.beam.runners.portability.testing
A PipelineRunner that runs a Pipeline against a JobService.
TestUniversalRunner.Options - Interface in org.apache.beam.runners.portability.testing
 
TestUniversalRunner.OptionsRegistrar - Class in org.apache.beam.runners.portability.testing
TestUniversalRunner.RunnerRegistrar - Class in org.apache.beam.runners.portability.testing
Registrar for the portable runner.
TextIO - Class in org.apache.beam.sdk.io
PTransforms for reading and writing text files.
TextIO.CompressionType - Enum in org.apache.beam.sdk.io
Deprecated.
TextIO.Read - Class in org.apache.beam.sdk.io
Implementation of TextIO.read().
TextIO.ReadAll - Class in org.apache.beam.sdk.io
Deprecated.
See TextIO.readAll() for details.
TextIO.ReadFiles - Class in org.apache.beam.sdk.io
Implementation of TextIO.readFiles().
TextIO.Sink - Class in org.apache.beam.sdk.io
Implementation of TextIO.sink().
TextIO.TypedWrite<UserT,DestinationT> - Class in org.apache.beam.sdk.io
Implementation of TextIO.write().
TextIO.Write - Class in org.apache.beam.sdk.io
This class is used as the default return value of TextIO.write().
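An illustrative read/write sketch with TextIO (file paths are placeholders):

    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    // Each line of the matched files becomes one String element.
    PCollection<String> lines = pipeline.apply(TextIO.read().from("/path/to/input*.txt"));

    // Writes one element per line to sharded files with the given prefix and suffix.
    lines.apply(TextIO.write().to("/path/to/output").withSuffix(".txt"));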
TextJsonTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
TextJsonTable is a BeamSqlTable that reads text files and converts them according to the JSON format.
TextJsonTable(Schema, String, TextTableProvider.JsonToRow, TextTableProvider.RowToJson) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextJsonTable
 
TextMessageMapper - Class in org.apache.beam.sdk.io.jms
The TextMessageMapper takes a String value and a Session, and returns a TextMessage.
TextMessageMapper() - Constructor for class org.apache.beam.sdk.io.jms.TextMessageMapper
 
TextRowCountEstimator - Class in org.apache.beam.sdk.io
This returns a row count estimation for files associated with a file pattern.
TextRowCountEstimator() - Constructor for class org.apache.beam.sdk.io.TextRowCountEstimator
 
TextRowCountEstimator.Builder - Class in org.apache.beam.sdk.io
TextRowCountEstimator.LimitNumberOfFiles - Class in org.apache.beam.sdk.io
This strategy stops sampling once a sufficient number of files has been sampled.
TextRowCountEstimator.LimitNumberOfTotalBytes - Class in org.apache.beam.sdk.io
This strategy stops sampling when the total number of sampled bytes exceeds a given threshold.
TextRowCountEstimator.NoEstimationException - Exception in org.apache.beam.sdk.io
An exception thrown if the estimator cannot produce an estimate of the number of lines.
TextRowCountEstimator.SampleAllFiles - Class in org.apache.beam.sdk.io
This strategy samples all the files.
TextRowCountEstimator.SamplingStrategy - Interface in org.apache.beam.sdk.io
A sampling strategy that determines when to stop reading further files.
TextSource - Class in org.apache.beam.sdk.io
Implementation detail of TextIO.Read.
TextSource(ValueProvider<String>, EmptyMatchTreatment, byte[], int) - Constructor for class org.apache.beam.sdk.io.TextSource
 
TextSource(ValueProvider<String>, EmptyMatchTreatment, byte[]) - Constructor for class org.apache.beam.sdk.io.TextSource
 
TextSource(MatchResult.Metadata, long, long, byte[], int) - Constructor for class org.apache.beam.sdk.io.TextSource
 
TextSource(MatchResult.Metadata, long, long, byte[]) - Constructor for class org.apache.beam.sdk.io.TextSource
 
TextSourceBenchmark - Class in org.apache.beam.sdk.jmh.io
 
TextSourceBenchmark() - Constructor for class org.apache.beam.sdk.jmh.io.TextSourceBenchmark
 
TextSourceBenchmark.Data - Class in org.apache.beam.sdk.jmh.io
 
TextTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
TextTable is a BeamSqlTable that reads text files and converts them according to the specified format.
TextTable(Schema, String, PTransform<PCollection<String>, PCollection<Row>>, PTransform<PCollection<Row>, PCollection<String>>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
Text table with the specified read and write transforms.
TextTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
Text table provider.
TextTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
 
TextTableProvider.CsvToRow - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
Read-side converter for TextTable with format 'csv'.
TextTableProvider.LinesReadConverter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
Read-side converter for TextTable with format 'lines'.
TextTableProvider.LinesWriteConverter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
Write-side converter for TextTable with format 'lines'.
TextualIntegerCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes Integers as the ASCII bytes of their textual, decimal representation.
TextualIntegerCoder() - Constructor for class org.apache.beam.sdk.coders.TextualIntegerCoder
 
TFRecordIO - Class in org.apache.beam.sdk.io
PTransforms for reading and writing TensorFlow TFRecord files.
TFRecordIO.CompressionType - Enum in org.apache.beam.sdk.io
Deprecated.
TFRecordIO.Read - Class in org.apache.beam.sdk.io
Implementation of TFRecordIO.read().
TFRecordIO.ReadFiles - Class in org.apache.beam.sdk.io
Implementation of TFRecordIO.readFiles().
TFRecordIO.Sink - Class in org.apache.beam.sdk.io
TFRecordIO.Write - Class in org.apache.beam.sdk.io
Implementation of TFRecordIO.write().
that(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs an PAssert.IterableAssert for the elements of the provided PCollection.
that(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs an PAssert.IterableAssert for the elements of the provided PCollection with the specified reason.
thatFlattened(PCollectionList<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs an PAssert.IterableAssert for the elements of the flattened PCollectionList.
thatFlattened(String, PCollectionList<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs an PAssert.IterableAssert for the elements of the flattened PCollectionList with the specified reason.
thatList(PCollectionList<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
thatMap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection, which must have at most one value per key.
thatMap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection with the specified reason.
thatMultimap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection.
thatMultimap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection with the specified reason.
thatSingleton(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection<T>, which must be a singleton.
thatSingleton(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection<T> with the specified reason.
thatSingletonIterable(PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs an PAssert.IterableAssert for the value of the provided PCollection which must contain a single Iterable<T> value.
thatSingletonIterable(String, PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs an PAssert.IterableAssert for the value of the provided PCollection with the specified reason.
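A short sketch combining the PAssert entry points above inside a TestPipeline-based test (values are illustrative):

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> words = pipeline.apply(Create.of("hello", "world"));

    // Assert on the full contents, in any order.
    PAssert.that(words).containsInAnyOrder("world", "hello");

    // For a singleton PCollection (e.g. a global count), PAssert.thatSingleton(...)
    // could be used instead.

    pipeline.run().waitUntilFinish();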
threadSafe(WatermarkEstimator<WatermarkEstimatorStateT>) - Static method in class org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators
Returns a thread safe WatermarkEstimator which allows getting a snapshot of the current watermark and watermark estimator state.
ThriftCoder<T> - Class in org.apache.beam.sdk.io.thrift
A Coder using a Thrift TProtocol to serialize/deserialize elements.
ThriftCoder(Class<T>, TProtocolFactory) - Constructor for class org.apache.beam.sdk.io.thrift.ThriftCoder
 
ThriftIO - Class in org.apache.beam.sdk.io.thrift
PTransforms for reading and writing files containing Thrift encoded data.
ThriftIO.ReadFiles<T> - Class in org.apache.beam.sdk.io.thrift
ThriftIO.Sink<T extends org.apache.thrift.TBase<?,?>> - Class in org.apache.beam.sdk.io.thrift
ThriftIO.ThriftWriter<T extends org.apache.thrift.TBase<?,?>> - Class in org.apache.beam.sdk.io.thrift
Writer that writes Thrift objects to an OutputStream.
ThriftPayloadSerializerProvider - Class in org.apache.beam.sdk.io.thrift
 
ThriftPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
 
ThriftSchema - Class in org.apache.beam.sdk.io.thrift
Schema provider for generated thrift types.
ThriftSchema.Customizer - Class in org.apache.beam.sdk.io.thrift
 
THROTTLE_TIME_COUNTER_NAME - Static variable in class org.apache.beam.sdk.metrics.Metrics
 
THROTTLE_TIME_NAMESPACE - Static variable in class org.apache.beam.sdk.metrics.Metrics
 
THROTTLED_TIME - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
throttledBaseBackoff(Duration) - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration.Builder
 
throttledBaseBackoff() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
 
throttledTimeCounter(BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
THROUGHPUT_WINDOW_SECONDS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The sliding window size in seconds for throughput reporting.
ThroughputEstimator<T> - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
An estimator to calculate the throughput of the outputted elements from a DoFn.
throwable() - Method in class org.apache.beam.sdk.values.EncodableThrowable
Returns the underlying Throwable.
ThrowableHandler() - Constructor for class org.apache.beam.sdk.transforms.WithFailures.ThrowableHandler
 
throwableToGRPCCodeString(Throwable) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Converts a Throwable to a gRPC Status code.
THROWING_ROUTER - Static variable in interface org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter
 
ThrowingBadRecordRouter() - Constructor for class org.apache.beam.sdk.transforms.errorhandling.BadRecordRouter.ThrowingBadRecordRouter
 
ThrowingBiConsumer<T1,T2> - Interface in org.apache.beam.sdk.function
A BiConsumer which can throw Exceptions.
ThrowingBiFunction<T1,T2,T3> - Interface in org.apache.beam.sdk.function
A BiFunction which can throw Exceptions.
ThrowingConsumer<ExceptionT extends java.lang.Exception,T> - Interface in org.apache.beam.sdk.function
A Consumer which can throw Exceptions.
ThrowingFunction<T1,T2> - Interface in org.apache.beam.sdk.function
A Function which can throw Exceptions.
ThrowingRunnable - Interface in org.apache.beam.sdk.function
A Runnable which can throw Exceptions.
throwNullCredentialException() - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
TikaIO - Class in org.apache.beam.sdk.io.tika
Transforms for parsing arbitrary files using Apache Tika.
TikaIO() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO
 
TikaIO.Parse - Class in org.apache.beam.sdk.io.tika
Implementation of TikaIO.parse().
TikaIO.ParseFiles - Class in org.apache.beam.sdk.io.tika
Implementation of TikaIO.parseFiles().
TIME - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
time() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
 
TIME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
Beam LogicalType corresponding to ZetaSQL/CalciteSQL TIME type.
Time - Class in org.apache.beam.sdk.schemas.logicaltypes
A time without a time-zone.
Time() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.Time
 
TIME_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
TIME_WITH_LOCAL_TZ - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
TimeConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
 
TimeDomain - Enum in org.apache.beam.sdk.state
TimeDomain specifies whether an operation is based on timestamps of elements or current "real-world" time as reported while processing.
timeDomain() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
Returns the time domain of the current timer.
TimeMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
 
TimeMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeMicrosConversion
 
TimeMillisConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
 
timer(String, String, String) - Static method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
Timer - Interface in org.apache.beam.sdk.state
A timer for a specified time domain that can be set to register the desire for further processing at a particular time in that domain.
timer(TimeDomain) - Static method in class org.apache.beam.sdk.state.TimerSpecs
 
TimerEndpoint<T> - Class in org.apache.beam.sdk.fn.data
 
TimerEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.TimerEndpoint
 
timerId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
timerInternals() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.NoOpStepContext
 
timerInternals() - Method in class org.apache.beam.runners.twister2.utils.NoOpStepContext
 
TimerMap - Interface in org.apache.beam.sdk.state
 
timerMap(TimeDomain) - Static method in class org.apache.beam.sdk.state.TimerSpecs
 
TimerReceiverFactory - Class in org.apache.beam.runners.fnexecution.control
A factory that passes timers to TimerReceiverFactory.timerDataConsumer.
TimerReceiverFactory(StageBundleFactory, BiConsumer<Timer<?>, TimerInternals.TimerData>, Coder) - Constructor for class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
 
Timers - Interface in org.apache.beam.sdk.state
Interface for interacting with time.
TimerSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
TimerSpec - Interface in org.apache.beam.sdk.state
A specification for a Timer.
TimerSpecs - Class in org.apache.beam.sdk.state
Static methods for working with TimerSpecs.
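A compact sketch of declaring and firing an event-time timer with the Timer and TimerSpecs entries above; the timer id and the one-minute delay are illustrative:

    import org.apache.beam.sdk.state.TimeDomain;
    import org.apache.beam.sdk.state.Timer;
    import org.apache.beam.sdk.state.TimerSpec;
    import org.apache.beam.sdk.state.TimerSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;
    import org.joda.time.Duration;

    class ExpiryFn extends DoFn<KV<String, String>, String> {
      @TimerId("expiry")
      private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

      @ProcessElement
      public void process(ProcessContext ctx, @TimerId("expiry") Timer expiry) {
        // Register a firing roughly one minute past the element's event time.
        expiry.offset(Duration.standardMinutes(1)).setRelative();
      }

      @OnTimer("expiry")
      public void onExpiry(OnTimerContext ctx) {
        ctx.output("expired at " + ctx.timestamp());
      }
    }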
TimerSpecs() - Constructor for class org.apache.beam.sdk.state.TimerSpecs
 
TimerUtils - Class in org.apache.beam.runners.spark.util
 
TimerUtils() - Constructor for class org.apache.beam.runners.spark.util.TimerUtils
 
timestamp() - Method in class org.apache.beam.runners.twister2.utils.Twister2AssignContext
 
TIMESTAMP - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
timestamp(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
 
timestamp(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
 
timestamp(Integer) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
 
timestamp(Integer, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
 
TIMESTAMP - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
Beam LogicalType corresponding to ZetaSQL TIMESTAMP type.
timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
Returns the output timestamp of the current timer.
timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns the timestamp of the input element.
timestamp() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
Returns the timestamp of the current element.
TIMESTAMP_FIELD_NAME - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
TIMESTAMP_MAX_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
The maximum value for any Beam timestamp.
TIMESTAMP_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
TIMESTAMP_MICROS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
TIMESTAMP_MIN_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
The minimum value for any Beam timestamp.
TIMESTAMP_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
TIMESTAMP_WITH_LOCAL_TZ - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
timestampColumnIndex(int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
TimestampCombiner - Enum in org.apache.beam.sdk.transforms.windowing
Policies for combining timestamps that occur within a window.
TimeStampComparator() - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
 
TimestampConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
 
TimestampConvert() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.TimestampConvert
 
TimestampConverter - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Convert between different Timestamp and Instant classes.
TimestampConverter() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
timestamped(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.TimestampedValues transform that produces a PCollection containing the elements of the provided Iterable with the specified timestamps.
timestamped(TimestampedValue<T>, TimestampedValue<T>...) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.TimestampedValues transform that produces a PCollection containing the specified elements with the specified timestamps.
timestamped(Iterable<T>, Iterable<Long>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new root transform that produces a PCollection containing the specified elements with the specified timestamps.
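The Create.timestamped overloads pair elements with explicit event timestamps, which is useful for building deterministic test inputs. A minimal sketch, assuming an existing Pipeline named p and org.joda.time.Instant:
    // Illustrative only: p is an assumed Pipeline; timestamps use org.joda.time.Instant.
    PCollection<String> events =
        p.apply(Create.timestamped(
            TimestampedValue.of("open", new Instant(0L)),
            TimestampedValue.of("close", new Instant(60_000L))));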
TimestampedValue<V> - Class in org.apache.beam.sdk.values
An immutable pair of a value and a timestamp.
TimestampedValue(V, Instant) - Constructor for class org.apache.beam.sdk.values.TimestampedValue
 
TimestampedValue.TimestampedValueCoder<T> - Class in org.apache.beam.sdk.values
TimestampEncoding - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
This encoder/decoder writes a com.google.cloud.Timestamp object as a pair of long and int to Avro and reads a Timestamp object back from the same pair.
TimestampEncoding() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
 
TimestampFunctions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
TimestampFunctions.
TimestampFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.TimestampFunctions
 
TimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
 
TimestampMicrosConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampMicrosConversion
 
TimestampMillisConversion() - Constructor for class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
 
timestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Timestamp for element (ms since epoch).
TimestampObservingWatermarkEstimator<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
A WatermarkEstimator that observes the timestamps of all records output from a DoFn.
TimestampPolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
A timestamp policy to assign event time to messages in a Kafka partition and a watermark for it.
TimestampPolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy
 
TimestampPolicy.PartitionContext - Class in org.apache.beam.sdk.io.kafka
The context contains state maintained in the reader for the partition.
TimestampPolicyFactory<KeyT,ValueT> - Interface in org.apache.beam.sdk.io.kafka
An extendable factory used by the KafkaIO reader to create a TimestampPolicy for each partition at runtime.
TimestampPolicyFactory.LogAppendTimePolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
Assigns Kafka's log append time (server side ingestion time) to each record.
TimestampPolicyFactory.ProcessingTimePolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
A simple policy that uses current time for event time and watermark.
TimestampPolicyFactory.TimestampFnPolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
Internal policy to support deprecated withTimestampFn API.
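As a sketch of how these factories are typically wired into KafkaIO (the broker address, topic name, and Kafka deserializer classes below are placeholders, not values from this index):
    // Assumed placeholders: broker address, topic, and Kafka deserializer classes.
    KafkaIO.<Long, String>read()
        .withBootstrapServers("broker:9092")
        .withTopic("events")
        .withKeyDeserializer(LongDeserializer.class)
        .withValueDeserializer(StringDeserializer.class)
        // Assigns Kafka's log append time as the event time for each record.
        .withTimestampPolicyFactory(TimestampPolicyFactory.withLogAppendTime());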
TimestampPrefixingWindowCoder<T extends BoundedWindow> - Class in org.apache.beam.sdk.coders
A TimestampPrefixingWindowCoder wraps an arbitrary user-defined custom window coder.
TimestampRange - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
A restriction represented by a range of timestamps [from, to).
TimestampRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
A RestrictionTracker for claiming positions in a TimestampRange in a monotonically increasing fashion.
TimestampRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
timestamps() - Static method in class org.apache.beam.sdk.transforms.Reify
Create a PTransform that will output all inputs wrapped in a TimestampedValue.
timestampsInValue() - Static method in class org.apache.beam.sdk.transforms.Reify
Create a PTransform that will output all input KVs with the timestamp inside the value.
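A minimal sketch of Reify.timestamps(), assuming an existing PCollection<String> named words:
    // Illustrative only; words is an assumed PCollection<String>.
    PCollection<TimestampedValue<String>> stamped = words.apply(Reify.timestamps());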
TimestampTransform - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees.
TimestampTransform.AlignTo - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees.
TimestampTransform.Delay - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees.
TimestampUtils - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
Provides methods to convert a timestamp to its nanoseconds representation and back.
TimestampUtils() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
 
timeSupplier - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
TimeUtil - Class in org.apache.beam.runners.dataflow.util
A helper class for converting between Dataflow API and SDK time representations.
TimeUtil - Class in org.apache.beam.sdk.io.aws2.kinesis
Time conversion utilities.
TimeWithLocalTzType() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.TimeWithLocalTzType
 
TINY_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
TmpCheckpointDirFactory() - Constructor for class org.apache.beam.runners.spark.SparkCommonPipelineOptions.TmpCheckpointDirFactory
 
to(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Writes to file(s) with the given output prefix.
to(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Writes to file(s) with the given output prefix.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Writes to files named according to the given FileBasedSink.FilenamePolicy.
to(DynamicAvroDestinations<UserT, NewDestinationT, OutputT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Deprecated.
to(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
to(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
See TypedWrite#to(FilenamePolicy).
to(DynamicAvroDestinations<T, ?, T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
Deprecated.
to(SqsIO.WriteBatches.DynamicDestination<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
Dynamic, record-based destination to write to.
to(String) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
Queue url to write to.
to(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a common directory for all generated files.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
to(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the given table, specified in the format described in BigQueryHelpers.parseTableSpec(java.lang.String).
to(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the given table, specified as a TableReference.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
to(SerializableFunction<ValueInSingleWindow<T>, TableDestination>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to table specified by the specified table function.
to(DynamicDestinations<T, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the table and schema specified by the DynamicDestinations object.
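A minimal sketch of the static-table form of BigQueryIO.Write.to, assuming a PCollection<TableRow> named rows and a placeholder table spec:
    // Placeholder project, dataset, and table names.
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table"));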
to(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Publishes to the specified topic.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Like to(String) but with a ValueProvider.
to(SerializableFunction<ValueInSingleWindow<T>, String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Provides a function to dynamically specify the target topic per message.
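A minimal sketch of publishing to a fixed topic, assuming a PCollection<PubsubMessage> named messages and a placeholder topic path:
    // Placeholder topic path.
    messages.apply(PubsubIO.writeMessages().to("projects/my-project/topics/my-topic"));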
to(long) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies the maximum number to generate (exclusive).
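For example, combined with GenerateSequence.from, this yields a bounded source (the Pipeline variable p is assumed):
    // Produces the bounded range [0, 1000) as a PCollection<Long>.
    PCollection<Long> numbers = p.apply(GenerateSequence.from(0).to(1000));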
to(TableIdentifier) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
 
to(DynamicDestinations) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
 
to(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
The name of the table to write to in Snowflake.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
 
to(Solace.Topic) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
Write to a Solace topic.
to(Solace.Queue) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
Write to a Solace queue.
to(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
Provide the name of the collection to write to in Solr.
to(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Writes to text files with the given prefix.
to(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Writes to files named according to the given FileBasedSink.FilenamePolicy.
to(FileBasedSink.DynamicDestinations<UserT, NewDestinationT, String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Deprecated.
to(SerializableFunction<UserT, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Deprecated.
to(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
to(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.Write
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.TextIO.Write
See TypedWrite#to(FilenamePolicy).
to(FileBasedSink.DynamicDestinations<String, ?, String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
Deprecated.
to(SerializableFunction<String, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params) - Method in class org.apache.beam.sdk.io.TextIO.Write
Deprecated.
to(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes TFRecord file(s) with the given output prefix.
to(ResourceId) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes TFRecord file(s) with a prefix given by the specified resource.
to(FileBasedSink<UserT, DestinationT, OutputT>) - Static method in class org.apache.beam.sdk.io.WriteFiles
Creates a WriteFiles transform that writes to the given FileBasedSink, letting the runner control how many different shards are produced.
to(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Writes to files with the given path prefix.
to(Class<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
Convert a PCollection<InputT> to a PCollection<OutputT>.
to(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
Convert a PCollection<InputT> to a PCollection<OutputT>.
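A minimal sketch of Convert.to, assuming a schema-aware PCollection named input and a hypothetical schema-registered class MyPojo:
    // MyPojo is a hypothetical class with a registered schema (e.g. via @DefaultSchema).
    PCollection<MyPojo> converted = input.apply(Convert.to(MyPojo.class));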
toAbsolutePath() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
toAdditionalInputs(Iterable<PCollectionView<?>>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Expands a list of PCollectionView into the form needed for PTransform.getAdditionalInputs().
toAvroField(Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Get Avro Field from Beam Field.
toAvroSchema(Schema, String, String) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Converts a Beam Schema into an AVRO schema.
toAvroSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
 
toAvroType(String, String) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
Convert to an AVRO type.
toBaseType(OffsetTime) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
 
toBaseType(OffsetDateTime) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
 
toBaseType(LocalTime) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
 
toBaseType(Instant) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
 
toBaseType(LocalDate) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
 
toBaseType(LocalDateTime) - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
toBaseType(EnumerationType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
toBaseType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
 
toBaseType(Instant) - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
 
toBaseType(Duration) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
 
toBaseType(Instant) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
 
toBaseType(OneOfType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
toBaseType(T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
 
toBaseType(PythonCallableSource) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
 
toBaseType(Schema) - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
 
toBaseType(LocalTime) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
 
toBaseType(UUID) - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
toBaseType(InputT) - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
Convert the input type to the Java type used by the base Schema.FieldType.
toBeamField(Schema.Field) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Get Beam Field from Avro Field.
toBeamObject(Value, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
toBeamObject(Value, Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
toBeamRow(Value, Schema, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
toBeamRow(GenericRecord, Schema, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
toBeamRow(Schema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Tries to convert a JSON TableRow from BigQuery into a Beam Row.
toBeamRow(Schema, TableSchema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Tries to parse the JSON TableRow from BigQuery.
toBeamRow() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformReadConfiguration
Serializes configuration to a Row.
toBeamRow() - Method in class org.apache.beam.sdk.io.singlestore.schematransform.SingleStoreSchemaTransformWriteConfiguration
Serializes configuration to a Row.
toBeamRow(String, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
 
toBeamRow(String, Schema, boolean) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
 
toBeamRow(Map<String, Object>, Schema, boolean) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
 
toBeamRowStrict(GenericRecord, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Strict conversion from AVRO to Beam, strict because it doesn't do widening or narrowing during conversion.
toBeamSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
 
toBeamSchema(List<Field>) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
 
toBeamSchema(Class<?>) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Converts AVRO schema to Beam row schema.
toBeamSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Converts AVRO schema to Beam row schema.
toBeamSchema(ResultSetMetaData) - Static method in class org.apache.beam.sdk.io.jdbc.SchemaUtil
Infers the Beam Schema from ResultSetMetaData.
toBeamType() - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
Convert to a Beam type.
toBeamType(Type) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
ToBigtableRowFn(Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
 
toBuilder() - Method in class org.apache.beam.io.requestresponse.Monitoring
 
toBuilder() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
 
toBuilder() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
 
toBuilder() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
toBuilder() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
toBuilder() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
toBuilder() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
Creates a new S3FileSystemConfiguration.Builder with values initialized by this instance's properties.
toBuilder() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
 
toBuilder() - Method in class org.apache.beam.sdk.io.aws2.common.RetryConfiguration
 
toBuilder() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
 
toBuilder() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
Creates a new S3FileSystemConfiguration.Builder with values initialized by this instance's properties.
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Create a new RpcQosOptions.Builder initialized with the values from this instance.
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
Transforms the instance into a builder, so field values can be modified.
toBuilder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
 
toBuilder() - Method in class org.apache.beam.sdk.io.kinesis.WatermarkParameters
 
toBuilder() - Method in class org.apache.beam.sdk.io.solace.data.Semp.Queue
 
toBuilder() - Method in class org.apache.beam.sdk.io.solace.data.Semp.QueueData
 
toBuilder() - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
toBuilder() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
toBuilder() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
 
toBuilder() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
 
toByteArray(T, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for serializing an object using the specified coder.
toByteArray(T, Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.CoderHelpers
Utility method for serializing an object using the specified coder.
toByteArrays(Iterable<T>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for serializing an Iterable of values using the specified coder.
toByteArrayWithTs(T, Coder<T>, Instant) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for serializing an object using the specified coder, appending timestamp representation.
toByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting an object to a bytearray.
toByteFunction(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a key-value pair to a byte array pair.
toByteFunctionWithTs(Coder<K>, Coder<V>, Function<Tuple2<K, V>, Instant>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a key-value pair to a byte array pair, where the key in the resulting ByteArray contains (key, timestamp).
toBytes() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHint
Defines how to represent the hint as a bytestring.
toCalciteRowType(Schema, RelDataTypeFactory) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
Create an instance of RelDataType so it can be used to create a table.
toCalciteType(Type, boolean, RexBuilder) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
 
toCamelCase() - Method in class org.apache.beam.sdk.schemas.Schema
Recursively converts all field names to `lowerCamelCase`.
toCamelCase() - Method in class org.apache.beam.sdk.values.Row
Returns an equivalent Row with `lowerCamelCase` field names.
toChangeStreamRecords(PartitionMetadata, ChangeStreamResultSet, ChangeStreamResultSetMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.ChangeStreamRecordMapper
In GoogleSQL, change stream records are returned as an array of Struct.
toCloudDuration(ReadableDuration) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a ReadableDuration into a Dataflow API duration string.
toCloudObject(T, SdkComponents) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Converts the provided object into an equivalent CloudObject.
toCloudObject(RowCoder, SdkComponents) - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
Convert to a cloud object.
toCloudObject(SchemaCoder, SdkComponents) - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
Convert to a cloud object.
toCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
Transform messages read from Pub/Sub Lite to their equivalent Cloud Pub/Sub Message that would have been read from PubsubIO.
toCloudTime(ReadableInstant) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a ReadableInstant into a Dataflow API time value.
toConfigRow(BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryStorageReadSchemaTransformTranslator
 
toConfigRow(BigQueryWriteSchemaTransformProvider.BigQueryWriteSchemaTransform) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQuerySchemaTransformTranslation.BigQueryWriteSchemaTransformTranslator
 
toDefaultPolicies(SerializableFunction<UserT, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
Returns a FileBasedSink.DynamicDestinations that returns instances of DefaultFilenamePolicy configured with the given DefaultFilenamePolicy.Params.
toDuration(Row) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
ByteBuddy conversion for NanosDuration base type to Duration.
toEnumerable(BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
toFeedRange() - Method in class org.apache.beam.sdk.io.azure.cosmos.NormalizedRange
 
toField(RelDataTypeField) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
toField(String, RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
toFieldType(SqlTypeNameSpec) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
toFieldType(SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
toFieldType(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
toFile() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
toGcpBackOff(BackOff) - Static method in class org.apache.beam.sdk.extensions.gcp.util.BackOffAdapter
Returns an adapter to convert from the Beam SDK BackOff to the Google API client BackOff.
toGenericAvroSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a BigQuery TableSchema to an Avro Schema.
toGenericAvroSchema(TableSchema, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a BigQuery TableSchema to an Avro Schema.
toGenericAvroSchema(String, List<TableFieldSchema>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a list of BigQuery TableFieldSchema to Avro Schema.
toGenericAvroSchema(String, List<TableFieldSchema>, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a list of BigQuery TableFieldSchema to Avro Schema.
toGenericRecord(Row) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Convert from a Beam Row to an AVRO GenericRecord.
toGenericRecord(Row, Schema) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils
Convert from a Beam Row to an AVRO GenericRecord.
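A minimal sketch of the AvroUtils round-trip helpers, assuming an org.apache.avro.Schema named avroSchema and a Beam Row named row that matches it:
    // avroSchema and row are assumed inputs; fully qualified names avoid the Schema name clash.
    org.apache.beam.sdk.schemas.Schema beamSchema = AvroUtils.toBeamSchema(avroSchema);
    GenericRecord record = AvroUtils.toGenericRecord(row, avroSchema);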
toHex(byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
 
toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
 
toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
 
toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
 
toInputType(String) - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
 
toInputType(Long) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
 
toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
 
toInputType(Integer) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
toInputType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
 
toInputType(BigDecimal) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedPrecisionNumeric
 
toInputType(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
 
toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
 
toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
 
toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
 
toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
toInputType(T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
 
toInputType(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
 
toInputType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
 
toInputType(Long) - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
 
toInputType(Row) - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
toInputType(byte[]) - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
 
toInputType(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
 
toInputType(BaseT) - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
Convert the Java type used by the base Schema.FieldType to the input type.
toInt(LocalDate, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.DateConversion
 
toInt(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMillisConversion
 
toInt(LocalDate, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.DateConversion
 
toInt(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimeConversion
 
toJava(Instant) - Static method in class org.apache.beam.sdk.io.aws2.kinesis.TimeUtil
 
toJoda(Instant) - Static method in class org.apache.beam.sdk.io.aws2.kinesis.TimeUtil
 
toJodaTime(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
toJson() - Method in class org.apache.beam.io.debezium.SourceRecordJson
Transforms the extracted data to a JSON string.
ToJson<T> - Class in org.apache.beam.sdk.transforms
Creates a PTransform that serializes UTF-8 JSON objects from a Schema-aware PCollection (i.e. a PCollection with a schema).
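A minimal sketch of ToJson, assuming a schema-aware PCollection named pojos:
    // Each element is serialized to a UTF-8 JSON string.
    PCollection<String> json = pojos.apply(ToJson.of());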
toJsonString(Object) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
tokenNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
Deprecated.
tokenNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
Deprecated.
ToListViewDoFn() - Constructor for class org.apache.beam.sdk.transforms.View.ToListViewDoFn
 
toLogicalBaseType(Schema.LogicalType<InputT, BaseT>, InputT) - Static method in class org.apache.beam.sdk.schemas.SchemaUtils
Returns the base type given a logical type and the input type.
toLogicalInputType(Schema.LogicalType<InputT, BaseT>, BaseT) - Static method in class org.apache.beam.sdk.schemas.SchemaUtils
Returns the input type given a logical type and the base type.
toLong(LocalDateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicrosConversion
 
toLong(LocalDateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillisConversion
 
toLong(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimeMicrosConversion
 
toLong(Instant, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMicrosConversion
 
toLong(Instant, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.TimestampMillisConversion
 
toLong(LocalTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimeMicrosConversion
 
toLong(DateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.LossyTimestampMicrosConversion
 
toLong(DateTime, Schema, LogicalType) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJodaTimeConversions.TimestampConversion
 
toMap(ArrayData, ArrayData, DataType, DataType) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
toModel() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
To model message.
toModificationRel(RelOptCluster, RelOptTable, Prepare.CatalogReader, RelNode, TableModify.Operation, List<String>, List<RexNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
 
toNanos(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
Converts the given timestamp to its nanoseconds representation.
toNullableRecordField(Object[], int) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
Top - Class in org.apache.beam.sdk.transforms
PTransforms for finding the largest (or smallest) set of elements in a PCollection, or the largest (or smallest) set of values associated with each key in a PCollection of KVs.
Top.Largest<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
Deprecated.
use Top.Natural instead
Top.Natural<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
A Serializable Comparator that uses the compared elements' natural ordering.
Top.Reversed<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
Serializable Comparator that uses the reverse of the compared elements' natural ordering.
Top.Smallest<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
Deprecated.
use Top.Reversed instead
Top.TopCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
CombineFn for Top transforms that combines a bunch of Ts into a single count-long List<T>, using compareFn to choose the largest Ts.
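A minimal sketch of the Top transforms, assuming a PCollection<Integer> named scores:
    // Emits a single List<Integer> containing the three largest scores.
    PCollection<List<Integer>> top3 = scores.apply(Top.largest(3));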
toPCollection(Pipeline, BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
toPCollection(Pipeline, BeamRelNode, PTransform<PCollection<Row>, ? extends POutput>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
TopCombineFn(int, ComparatorT) - Constructor for class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
topic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
TopicPartitionCoder - Class in org.apache.beam.sdk.io.kafka
The Coder for encoding and decoding TopicPartition in Beam.
TopicPartitionCoder() - Constructor for class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
 
topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Topic path where events will be published to.
topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
 
topicPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
topicPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
toProto() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
Convert to JobApi.JobInfo.
toProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
 
toProvisionInfo() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
toPTransform() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
toPTransform() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
toRealPath(LinkOption...) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
toRecordField(Object[], int) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
toRel(RelOptTable.ToRelContext, RelOptTable) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
 
toRelDataType(RelDataTypeFactory, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.Write
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
toResourceName() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
toRexNode(Value, RexBuilder) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
 
toRow(Duration) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
ByteBuddy conversion for Duration to NanosDuration base type.
toRow(Timestamp) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.TimestampConvert
ByteBuddy conversion for Timestamp to NanosInstant base type.
toRow() - Method in class org.apache.beam.sdk.io.iceberg.SnapshotInfo
 
toRow(Schema) - Static method in class org.apache.beam.sdk.values.Row
Creates a Row from the list of values and Row.getSchema().
toRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
Given a type, return a function that converts that type to a Row object. If no schema exists, returns null.
toRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
Deprecated.
 
toRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
Given a type, return a function that converts that type to a Row object. If no schema exists, returns null.
toRowList(BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
toRowList(BeamRelNode, Map<String, String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
toRows() - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
Convert a PCollection<InputT> into a PCollection<Row>.
toSchema(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
Generate a Schema from a RelDataType, which is used to create a table.
toSchema() - Static method in class org.apache.beam.sdk.schemas.Schema
Collects a stream of Schema.Fields into a Schema.
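For example, the collector can be used with the Java streams API (java.util.stream.Stream is assumed to be imported):
    Schema schema =
        Stream.of(
                Schema.Field.of("id", Schema.FieldType.INT64),
                Schema.Field.of("name", Schema.FieldType.STRING))
            .collect(Schema.toSchema());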
toSeconds(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
toSeq(ArrayData) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
toSeq(Collection<Object>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
toSnakeCase() - Method in class org.apache.beam.sdk.schemas.Schema
Recursively converts all field names to `snake_case`.
toSnakeCase() - Method in class org.apache.beam.sdk.values.Row
Returns an equivalent Row with `snake_case` field names.
toSql(RexProgram, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
 
toSqlTypeName(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
toState(String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
toString() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
 
toString() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
toString() - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
 
toString() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
toString() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
toString() - Method in class org.apache.beam.runners.flink.FlinkRunner
 
toString() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
toString(Metric) - Static method in class org.apache.beam.runners.flink.metrics.Metrics
 
toString() - Method in class org.apache.beam.runners.portability.PortableRunner
 
toString() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
toString() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
toString() - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
toString() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
toString() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
toString() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
toString() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
toString() - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
toString() - Method in class org.apache.beam.sdk.coders.ZstdCoder
 
toString() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
toString() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
toString() - Method in class org.apache.beam.sdk.extensions.ordered.combiner.SequenceRangeAccumulator
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
 
toString() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
 
toString() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
 
toString() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
toString() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
 
toString() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
 
toString() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
toString() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
toString() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
toString() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns the string representation of this ResourceId.
toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataTableNames
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
toString() - Method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
toString() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
toString() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
toString() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
toString() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
toString() - Method in class org.apache.beam.sdk.io.solace.broker.BrokerResponse
 
toString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
toString() - Method in enum org.apache.beam.sdk.metrics.Lineage.Type
 
toString() - Method in class org.apache.beam.sdk.metrics.MetricKey
 
toString() - Method in class org.apache.beam.sdk.metrics.MetricName
 
toString() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
 
toString() - Method in class org.apache.beam.sdk.metrics.MetricResults
 
toString() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
toString() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
toString() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
toString() - Method in class org.apache.beam.sdk.Pipeline
 
toString() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
toString(EnumerationType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedString
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableBytes
 
toString() - Method in class org.apache.beam.sdk.schemas.logicaltypes.VariableString
 
toString() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
toString() - Method in class org.apache.beam.sdk.schemas.Schema.Options
 
toString() - Method in class org.apache.beam.sdk.schemas.Schema
 
toString() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
toString() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
 
toString() - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
 
toString() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
toString() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
toString() - Method in class org.apache.beam.sdk.testing.TestPipeline
 
toString() - Method in class org.apache.beam.sdk.transforms.Combine.Holder
 
toString() - Method in class org.apache.beam.sdk.transforms.Contextful
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
toString() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
toString() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
toString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
toString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
toString() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
toString() - Method in class org.apache.beam.sdk.transforms.PTransform
 
toString() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
toString() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
ToString - Class in org.apache.beam.sdk.transforms
toString() - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
toString(StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
Creates a human-readable representation of the given state of this condition.
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
toString() - Method in class org.apache.beam.sdk.values.KV
 
toString() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
toString() - Method in class org.apache.beam.sdk.values.PValueBase
 
toString() - Method in class org.apache.beam.sdk.values.Row
 
toString(boolean) - Method in class org.apache.beam.sdk.values.Row
Convert Row to String.
toString() - Method in class org.apache.beam.sdk.values.ShardedKey
 
toString() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
toString() - Method in class org.apache.beam.sdk.values.TupleTag
 
toString() - Method in class org.apache.beam.sdk.values.TupleTagList
 
toString() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
toString() - Method in class org.apache.beam.sdk.values.TypeParameter
 
toString() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
toString() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
toStringTimestamp(long) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamCodegenUtils
 
toStringUTF8(byte[]) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamCodegenUtils
 
toTableReference(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
toTableRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam Row to a BigQuery TableRow.
toTableRow(SerializableFunction<T, Row>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam schema type to a BigQuery TableRow.
toTableRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam Row to a BigQuery TableRow.
toTableSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam Schema to a BigQuery TableSchema.
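A minimal sketch of these BigQueryUtils conversions, assuming a Beam Row named beamRow and its Schema named beamSchema:
    TableRow tableRow = BigQueryUtils.toTableRow(beamRow);
    TableSchema tableSchema = BigQueryUtils.toTableSchema(beamSchema);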
toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Returns a canonical string representation of the TableReference.
toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
toThreetenInstant(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
toTimestamp(Row) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.TimestampConvert
ByteBuddy conversion for NanosInstant base type to Timestamp.
toTimestamp(BigDecimal) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
Converts nanoseconds to their respective timestamp.
toTreeMap(ArrayData, ArrayData, DataType, DataType) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
toUnsplittableSource(BoundedSource<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Returns an equivalent unsplittable BoundedSource<T>.
toUri() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
 
toZetaSqlStructType(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
toZetaSqlStructValue(Row, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
toZetaSqlType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
toZetaSqlType(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
 
toZetaSqlValue(Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlBeamTranslationUtils
 
TrackerWithProgress - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
TrackerWithProgress() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.TrackerWithProgress
 
Transaction - Class in org.apache.beam.sdk.io.gcp.spanner
A transaction object.
Transaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
transactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
TransactionResult(T, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
 
transfer() - Method in class org.apache.beam.runners.portability.CloseableResource
Returns a new CloseableResource that owns the underlying resource and relinquishes ownership from this CloseableResource.
transform(Function<T, V>) - Method in class org.apache.beam.sdk.metrics.MetricResult
 
TRANSFORM_URN - Static variable in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
 
TRANSFORM_URN - Static variable in class org.apache.beam.runners.spark.io.CreateStream
 
transformContainer(Iterable<FromT>, Function<FromT, DestT>) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
 
TransformExecutor - Interface in org.apache.beam.runners.direct
A Runnable that will execute a PTransform on some bundle of input.
transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
TransformingMap(Map<K1, V1>, Function<K1, K2>, Function<V1, V2>) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
TransformProvider<InputT extends PInput,OutputT extends POutput> - Interface in org.apache.beam.sdk.expansion.service
Provides a mapping of RunnerApi.FunctionSpec to a PTransform, together with mappings of its inputs and outputs to maps of PCollections.
TransformServiceLauncher - Class in org.apache.beam.sdk.transformservice.launcher
A utility that can be used to manage a Beam Transform Service.
transformTo(RelNode, Map<RelNode, RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
transformTo(RelNode, Map<RelNode, RelNode>, RelHintsPropagator) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
 
TransformTranslator<TransformT extends PTransform> - Interface in org.apache.beam.runners.dataflow
A TransformTranslator knows how to translate a particular subclass of PTransform for the Cloud Dataflow service.
TransformTranslator<InT extends PInput,OutT extends POutput,TransformT extends PTransform<InT,OutT>> - Class in org.apache.beam.runners.spark.structuredstreaming.translation
A TransformTranslator provides the capability to translate a specific primitive or composite PTransform into its Spark correspondence.
TransformTranslator(float) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
 
TransformTranslator.Context - Class in org.apache.beam.runners.spark.structuredstreaming.translation
Available mutable context to translate a PTransform.
TransformTranslator.StepTranslationContext - Interface in org.apache.beam.runners.dataflow
The interface for a TransformTranslator to build a Dataflow step.
TransformTranslator.TranslationContext - Interface in org.apache.beam.runners.dataflow
The interface provided to registered callbacks for interacting with the DataflowRunner, including reading and writing the values of PCollections and side inputs.
translate(Pipeline, RunnerApi.Pipeline, SdkComponents, DataflowRunner, List<DataflowPackage>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Translates a Pipeline into a JobSpecification.
translate(AppliedPTransform<?, ?, PrimitiveParDoSingleFactory.ParDoSingle<?, ?>>, SdkComponents) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
 
translate(TransformT, TransformTranslator.TranslationContext) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator
 
translate(PipelineNode.PTransformNode, RunnerApi.Pipeline, FlinkBatchPortablePipelineTranslator.BatchTranslationContext) - Method in interface org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.PTransformTranslator
Translate a PTransform into the given translation context.
translate(FlinkBatchPortablePipelineTranslator.BatchTranslationContext, RunnerApi.Pipeline) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
 
translate(T, RunnerApi.Pipeline) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
Translates the given pipeline.
translate(String, RunnerApi.Pipeline, T) - Method in interface org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.PTransformTranslator
 
translate(FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext, RunnerApi.Pipeline) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
 
translate(TransformHierarchy.Node, TransformT) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
Determine if this Node belongs to a Bounded branch of the pipeline, or Unbounded, and translate with the proper translator.
translate(Pipeline, SparkSession, SparkCommonPipelineOptions) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
Translates a Beam pipeline into its Spark correspondence using the Spark SQL / Dataset API.
translate(TransformT, TransformTranslator<InT, OutT, TransformT>.Context) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
 
translate(Pipeline) - Method in class org.apache.beam.runners.twister2.translators.Twister2PipelineTranslator
Translates the pipeline by passing this class as a visitor.
translate(Pipeline) - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
Translates the pipeline into a Twister2 TSet graph.
translate(AppliedPTransform<?, ?, T>, SdkComponents) - Method in class org.apache.beam.sdk.schemas.transforms.SchemaTransformTranslation.SchemaTransformPayloadTranslator
 
translateNode(Window.Assign<T>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.AssignWindowTranslatorBatch
 
translateNode(Flatten.PCollections<T>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.FlattenTranslatorBatch
 
translateNode(GroupByKey<K, V>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.GroupByKeyTranslatorBatch
 
translateNode(Impulse, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.ImpulseTranslatorBatch
 
translateNode(ParDo.MultiOutput<InputT, OutputT>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.ParDoMultiOutputTranslatorBatch
 
translateNode(View.CreatePCollectionView<ElemT, ViewT>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.PCollectionViewTranslatorBatch
 
translateNode(SplittableParDo.PrimitiveBoundedRead<T>, Twister2BatchTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.batch.ReadSourceTranslatorBatch
 
translateNode(TransformT, Twister2BatchTranslationContext) - Method in interface org.apache.beam.runners.twister2.translators.BatchTransformTranslator
 
translateNode(PTransform<PBegin, PCollection<T>>, Twister2StreamTranslationContext) - Method in class org.apache.beam.runners.twister2.translators.streaming.ReadSourceTranslatorStream
 
translateNode(TransformT, Twister2StreamTranslationContext) - Method in interface org.apache.beam.runners.twister2.translators.StreamTransformTranslator
 
TranslationUtils - Class in org.apache.beam.runners.twister2.utils
Utility methods used by the Twister2 runner during pipeline translation.
translator - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
Transport - Class in org.apache.beam.sdk.extensions.gcp.util
Helpers for cloud communication.
Transport() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.Transport
 
traverseTopologically(Pipeline.PipelineVisitor) - Method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
Trigger - Class in org.apache.beam.sdk.transforms.windowing
Triggers control when the elements for a specific key and window are output.
Trigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
 
Trigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
 
Trigger.OnceTrigger - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees.
triggering(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Sets a non-default trigger for this Window PTransform.
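A hedged sketch of triggering(Trigger) in context: a fixed-window strategy with an early-firing trigger, where the input PCollection and the window/firing durations are illustrative only:
  PCollection<KV<String, Long>> windowed =
      input.apply(
          Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardMinutes(1)))
              // Non-default trigger: fire at the watermark, with speculative early firings.
              .triggering(
                  AfterWatermark.pastEndOfWindow()
                      .withEarlyFirings(
                          AfterProcessingTime.pastFirstElementInPane()
                              .plusDelayOf(Duration.standardSeconds(30))))
              .withAllowedLateness(Duration.ZERO)
              .discardingFiredPanes());
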
trim(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
trim(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
 
TRIM - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
TRIM_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
 
trivial() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
Creates an OutboundObserverFactory that simply delegates to the base factory, with no flow control or synchronization.
trueLiteral() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
 
truncate(long) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
TruncateResult() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
 
tryAcquireJobLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
Tries to acquire lock for given job.
tryAcquireJobLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
 
tryClaim(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Claims a new StreamProgress to be processed.
tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
Attempts to claim the given position.
tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
Attempts to claim the given position.
tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Attempts to claim the given position.
tryClaim(Timestamp, PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
tryClaim(Long) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
tryClaim(ByteKey) - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
Attempts to claim the given key.
tryClaim(Long) - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
Attempts to claim the given offset.
tryClaim(PositionT) - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Attempts to claim the block of work in the current restriction identified by the given position.
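A minimal sketch of the usual tryClaim loop inside a splittable DoFn's @ProcessElement, assuming an OffsetRange restriction; the output type is arbitrary:
  @ProcessElement
  public void process(RestrictionTracker<OffsetRange, Long> tracker, OutputReceiver<Long> out) {
    // Claim each position before producing output for it; stop as soon as a claim fails,
    // leaving the unclaimed remainder available for trySplit/resume.
    for (long position = tracker.currentRestriction().getFrom();
        tracker.tryClaim(position);
        ++position) {
      out.output(position);
    }
  }
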
tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.AssignWindowP
 
tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.FlattenP
 
tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.ViewP
 
tryProcess() - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
 
tryProcess(int, Object) - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
 
tryProcessWatermark(Watermark) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
 
tryProcessWatermark(Watermark) - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
 
tryReturnRecordAt(boolean, ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
tryReturnRecordAt(boolean, Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
tryReturnRecordAt(boolean, long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
tryReturnRecordAt(boolean, PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Atomically determines whether a record at the given position can be returned and updates internal state.
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.DetectNewPartitionsTracker
 
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Splits the work that's left.
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
If the partition token is the InitialPartition.PARTITION_TOKEN, it does not allow for splits (returns null).
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Splits the restriction through the following algorithm:
trySplit(double) - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
 
trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
 
trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
trySplit(double) - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Splits current restriction based on fractionOfRemainder.
trySplitAtPosition(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
trySplitAtPosition(Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
trySplitAtPosition(long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
trySplitAtPosition(PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Atomically splits the current range [RangeTracker.getStartPosition(), RangeTracker.getStopPosition()) into a "primary" part [RangeTracker.getStartPosition(), splitPosition) and a "residual" part [splitPosition, RangeTracker.getStopPosition()), assuming the current last-consumed position is within [RangeTracker.getStartPosition(), splitPosition) (i.e., splitPosition has not been consumed yet).
tuple(T1, T2) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.ScalaInterop
 
TUPLE - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
tuple(Map<String, TableSchema.ColumnType>) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
TUPLE_TAGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
tupleEncoder(Encoder<T1>, Encoder<T2>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
TupleTag<V> - Class in org.apache.beam.sdk.values
A TupleTag is a typed tag to use as the key of a heterogeneously typed tuple, like PCollectionTuple.
TupleTag() - Constructor for class org.apache.beam.sdk.values.TupleTag
Constructs a new TupleTag, with a fresh unique id.
TupleTag(String) - Constructor for class org.apache.beam.sdk.values.TupleTag
Constructs a new TupleTag with the given id.
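A sketch of the typical multi-output use of TupleTag together with TupleTagList and PCollectionTuple; ValidationFn is a hypothetical DoFn that writes to both tags:
  final TupleTag<String> validTag = new TupleTag<String>() {};
  final TupleTag<String> invalidTag = new TupleTag<String>() {};

  PCollectionTuple results =
      input.apply(
          ParDo.of(new ValidationFn(validTag, invalidTag)) // hypothetical DoFn
              .withOutputTags(validTag, TupleTagList.of(invalidTag)));

  PCollection<String> valid = results.get(validTag);
  PCollection<String> invalid = results.get(invalidTag);
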
TupleTagList - Class in org.apache.beam.sdk.values
A TupleTagList is an immutable list of heterogeneously typed TupleTags.
tupleTypes() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
TVFSlidingWindowFn - Class in org.apache.beam.sdk.extensions.sql.impl
TVFSlidingWindowFn assigns window based on input row's "window_start" and "window_end" timestamps.
TVFSlidingWindowFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
 
TVFStreamingUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
Provides static constants or utils for TVF streaming.
TVFStreamingUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
 
Twister2AssignContext<T,W extends BoundedWindow> - Class in org.apache.beam.runners.twister2.utils
A WindowFn.AssignContext implementation used by the Twister2 runner.
Twister2AssignContext(WindowFn<T, W>, WindowedValue<T>) - Constructor for class org.apache.beam.runners.twister2.utils.Twister2AssignContext
 
Twister2BatchPipelineTranslator - Class in org.apache.beam.runners.twister2.translators
Twister pipeline translator for batch pipelines.
Twister2BatchPipelineTranslator(Twister2PipelineOptions, Twister2BatchTranslationContext) - Constructor for class org.apache.beam.runners.twister2.translators.Twister2BatchPipelineTranslator
 
Twister2BatchTranslationContext - Class in org.apache.beam.runners.twister2
Twister2BatchTranslationContext.
Twister2BatchTranslationContext(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
 
Twister2BoundedSource<T> - Class in org.apache.beam.runners.twister2.translation.wrappers
Twister2 wrapper for Bounded Source.
Twister2BoundedSource(BoundedSource<T>, Twister2TranslationContext, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translation.wrappers.Twister2BoundedSource
 
Twister2EmptySource<T> - Class in org.apache.beam.runners.twister2.translation.wrappers
Empty Source wrapper.
Twister2EmptySource() - Constructor for class org.apache.beam.runners.twister2.translation.wrappers.Twister2EmptySource
 
Twister2PipelineExecutionEnvironment - Class in org.apache.beam.runners.twister2
Twister2PipelineExecutionEnvironment.
Twister2PipelineExecutionEnvironment(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
 
Twister2PipelineOptions - Interface in org.apache.beam.runners.twister2
Twister2PipelineOptions.
Twister2PipelineResult - Class in org.apache.beam.runners.twister2
Represents a Twister2 pipeline execution result.
Twister2PipelineResult(Twister2JobState) - Constructor for class org.apache.beam.runners.twister2.Twister2PipelineResult
 
Twister2PipelineTranslator - Class in org.apache.beam.runners.twister2.translators
Twister2PipelineTranslator, both batch and streaming translators need to extend from this.
Twister2PipelineTranslator() - Constructor for class org.apache.beam.runners.twister2.translators.Twister2PipelineTranslator
 
Twister2Runner - Class in org.apache.beam.runners.twister2
A PipelineRunner that executes the operations in the pipeline by first translating them to a Twister2 Plan and then executing them either locally or on a Twister2 cluster, depending on the configuration.
Twister2Runner(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2Runner
 
Twister2RunnerRegistrar - Class in org.apache.beam.runners.twister2
AutoService registrar - will register Twister2Runner and Twister2Options as possible pipeline runner services.
Twister2RunnerRegistrar.Options - Class in org.apache.beam.runners.twister2
Pipeline options registrar.
Twister2RunnerRegistrar.Runner - Class in org.apache.beam.runners.twister2
Pipeline runner registrar.
Twister2SideInputReader - Class in org.apache.beam.runners.twister2.utils
 
Twister2SideInputReader(Map<TupleTag<?>, WindowingStrategy<?, ?>>, TSetContext) - Constructor for class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
 
Twister2SinkFunction<T> - Class in org.apache.beam.runners.twister2.translators.functions
Sink Function that collects results.
Twister2SinkFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
 
Twister2StreamPipelineTranslator - Class in org.apache.beam.runners.twister2.translators
Twister pipeline translator for stream pipelines.
Twister2StreamPipelineTranslator() - Constructor for class org.apache.beam.runners.twister2.translators.Twister2StreamPipelineTranslator
 
Twister2StreamTranslationContext - Class in org.apache.beam.runners.twister2
Twister2StreamingTranslationContext.
Twister2StreamTranslationContext(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2StreamTranslationContext
 
Twister2TestRunner - Class in org.apache.beam.runners.twister2
A PipelineRunner that executes the operations in the pipeline by first translating them to a Twister2 Plan and then executing them either locally or on a Twister2 cluster, depending on the configuration.
Twister2TestRunner(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2TestRunner
 
Twister2TranslationContext - Class in org.apache.beam.runners.twister2
Twister2TranslationContext.
Twister2TranslationContext(Twister2PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.Twister2TranslationContext
 
type - Variable in class org.apache.beam.runners.dataflow.util.OutputReference
 
type - Variable in class org.apache.beam.sdk.extensions.avro.io.AvroDatumFactory
 
type - Variable in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.TypeWithNullability
 
type(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
TypeCode - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a type of a column within Cloud Spanner.
TypeCode(String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
Constructs a type code from the given String code.
TypeConversion() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
 
TypedCombineFnDelegate<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.extensions.sql
A Combine.CombineFn delegating all relevant calls to given delegate.
TypedCombineFnDelegate(Combine.CombineFn<InputT, AccumT, OutputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
 
typedef(String, Schema.FieldType) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema.Customizer
 
TypeDescriptor<T> - Class in org.apache.beam.sdk.values
A description of a Java type, including actual generic parameters where possible.
TypeDescriptor() - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
Creates a TypeDescriptor representing the type parameter T.
TypeDescriptor(Object) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
Creates a TypeDescriptor representing the type parameter T, which should resolve to a concrete type in the context of the class clazz.
TypeDescriptor(Class<?>) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
Creates a TypeDescriptor representing the type parameter T, which should resolve to a concrete type in the context of the class clazz.
TypeDescriptors - Class in org.apache.beam.sdk.values
A utility class for creating TypeDescriptor objects for different types, such as Java primitive types, containers and KVs of other TypeDescriptor objects, and extracting type variables of parameterized types (e.g.
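For instance, a small sketch (with a hypothetical words PCollection of Strings) of supplying an output type to MapElements via TypeDescriptors:
  PCollection<KV<String, Long>> counts =
      words.apply(
          MapElements.into(
                  TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.longs()))
              .via((String word) -> KV.of(word, 1L)));
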
TypeDescriptors() - Constructor for class org.apache.beam.sdk.values.TypeDescriptors
 
TypeDescriptors.TypeVariableExtractor<InputT,OutputT> - Interface in org.apache.beam.sdk.values
 
TypeDescriptorWithSchema() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils.TypeDescriptorWithSchema
 
TypedRead() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
TypedSchemaTransformProvider<ConfigT> - Class in org.apache.beam.sdk.schemas.transforms
Like SchemaTransformProvider except uses a configuration object instead of Schema and Row.
TypedSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
 
TypedWrite() - Constructor for class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
 
TypedWrite() - Constructor for class org.apache.beam.sdk.io.TextIO.TypedWrite
 
typeName() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
TypeParameter<T> - Class in org.apache.beam.sdk.values
 
TypeParameter() - Constructor for class org.apache.beam.sdk.values.TypeParameter
 
typesEqual(Schema.Field) - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns true if two fields are equal, ignoring name and description.
typesEqual(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Returns true if two FieldTypes are equal.
typesEqual(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
Returns true if two schemas are equal ignoring field names and descriptions.
typeToProtoType(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
TZTimeOnly() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
 
TZTimestamp() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
 

U

UdafImpl<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.extensions.sql.impl
Implement AggregateFunction to take a Combine.CombineFn as UDAF.
UdafImpl(Combine.CombineFn<InputT, AccumT, OutputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
UDF_METHOD - Static variable in interface org.apache.beam.sdk.extensions.sql.BeamSqlUdf
 
UdfImplReflectiveFunctionBase - Class in org.apache.beam.sdk.extensions.sql.impl
Beam-customized version from ReflectiveFunctionBase, to address BEAM-5921.
UdfImplReflectiveFunctionBase(Method) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
UdfImplReflectiveFunctionBase constructor.
UdfImplReflectiveFunctionBase.ParameterListBuilder - Class in org.apache.beam.sdk.extensions.sql.impl
Helps build lists of FunctionParameter.
UdfProvider - Interface in org.apache.beam.sdk.extensions.sql.udf
Provider for user-defined functions written in Java.
UdfTestProvider - Class in org.apache.beam.sdk.extensions.sql.provider
Defines Java UDFs for use in tests.
UdfTestProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider
 
UdfTestProvider.DateIncrementAllFn - Class in org.apache.beam.sdk.extensions.sql.provider
 
UdfTestProvider.HelloWorldFn - Class in org.apache.beam.sdk.extensions.sql.provider
 
UdfTestProvider.IncrementFn - Class in org.apache.beam.sdk.extensions.sql.provider
 
UdfTestProvider.IsNullFn - Class in org.apache.beam.sdk.extensions.sql.provider
 
UdfTestProvider.MatchFn - Class in org.apache.beam.sdk.extensions.sql.provider
 
UdfTestProvider.Sum - Class in org.apache.beam.sdk.extensions.sql.provider
 
UdfTestProvider.UnusedFn - Class in org.apache.beam.sdk.extensions.sql.provider
 
UdfUdafProvider - Interface in org.apache.beam.sdk.extensions.sql.meta.provider
Provider for UDF and UDAF.
Uint16() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint16
 
UINT16 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
uint16Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
 
Uint32() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint32
 
UINT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
uint32Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
 
Uint64() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint64
 
UINT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
uint64Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
 
Uint8() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint8
 
UINT8 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
uint8Behavior() - Method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
 
Unbounded(SparkContext, SerializablePipelineOptions, MicrobatchSource<T, CheckpointMarkT>, int) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
unbounded() - Static method in class org.apache.beam.sdk.io.CountingSource
Deprecated.
use GenerateSequence instead
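A hedged sketch of the suggested replacement, assuming a Pipeline p; the bounds and rate are illustrative only:
  // Bounded: the numbers [0, 1000).
  PCollection<Long> bounded = p.apply(GenerateSequence.from(0).to(1000));

  // Unbounded: roughly 10 elements per second.
  PCollection<Long> unbounded =
      p.apply(GenerateSequence.from(0).withRate(10, Duration.standardSeconds(1)));
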
UNBOUNDED_UNKNOWN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
 
UnboundedBatchedSolaceWriter - Class in org.apache.beam.sdk.io.solace.write
This DoFn is responsible for writing to Solace in batch mode (holding messages back to send them in batches) and for emitting the corresponding output (success or failure; only for persistent messages), so the SolaceIO.Write connector can be composed with other subsequent transforms in the pipeline.
UnboundedBatchedSolaceWriter(SerializableFunction<Solace.Record, Destination>, SessionServiceFactory, DeliveryMode, SolaceIO.SubmissionMode, int, boolean) - Constructor for class org.apache.beam.sdk.io.solace.write.UnboundedBatchedSolaceWriter
 
UnboundedReader() - Constructor for class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
UnboundedReaderImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
UnboundedReaderMaxReadTimeFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.UnboundedReaderMaxReadTimeFactory
 
UnboundedSolaceSource<T> - Class in org.apache.beam.sdk.io.solace.read
 
UnboundedSolaceSource(Queue, SempClientFactory, SessionServiceFactory, Integer, boolean, Coder<T>, SerializableFunction<T, Instant>, Duration, SerializableFunction<BytesXMLMessage, T>) - Constructor for class org.apache.beam.sdk.io.solace.read.UnboundedSolaceSource
 
UnboundedSolaceWriter - Class in org.apache.beam.sdk.io.solace.write
This DoFn encapsulates common code used both for the UnboundedBatchedSolaceWriter and UnboundedStreamingSolaceWriter.
UnboundedSolaceWriter(SerializableFunction<Solace.Record, Destination>, SessionServiceFactory, DeliveryMode, SolaceIO.SubmissionMode, int, boolean) - Constructor for class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
UnboundedSource<OutputT,CheckpointMarkT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.sdk.io
A Source that reads an unbounded amount of input and, because of that, supports some additional operations such as checkpointing, watermarks, and record ids.
UnboundedSource() - Constructor for class org.apache.beam.sdk.io.UnboundedSource
 
UnboundedSource.CheckpointMark - Interface in org.apache.beam.sdk.io
A marker representing the progress and state of an UnboundedSource.UnboundedReader.
UnboundedSource.CheckpointMark.NoopCheckpointMark - Class in org.apache.beam.sdk.io
A checkpoint mark that does nothing when finalized.
UnboundedSource.UnboundedReader<OutputT> - Class in org.apache.beam.sdk.io
A Reader that reads an unbounded amount of input.
UnboundedSourceImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
UnboundedSourceP<T,CmT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for reading from an unbounded Beam source.
UnboundedStreamingSolaceWriter - Class in org.apache.beam.sdk.io.solace.write
This DoFn is responsible for writing to Solace in streaming mode (one message at a time, without holding messages back) and for emitting the corresponding output (success or failure; only for persistent messages), so the SolaceIO.Write connector can be composed with other subsequent transforms in the pipeline.
UnboundedStreamingSolaceWriter(SerializableFunction<Solace.Record, Destination>, SessionServiceFactory, DeliveryMode, SolaceIO.SubmissionMode, int, boolean) - Constructor for class org.apache.beam.sdk.io.solace.write.UnboundedStreamingSolaceWriter
 
unboundedWithTimestampFn(SerializableFunction<Long, Instant>) - Static method in class org.apache.beam.sdk.io.CountingSource
 
unboxedType - Variable in class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
 
union(Iterable<FieldAccessDescriptor>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
union(Contextful...) - Static method in class org.apache.beam.sdk.transforms.Requirements
 
unionAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform transform that follows SET ALL semantics to compute the unionAll with provided PCollection<T>.
unionAll() - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform transform that follows SET ALL semantics which takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the unionAll of collections done in order for all collections in PCollectionList<T>.
UnionCoder - Class in org.apache.beam.sdk.transforms.join
A UnionCoder encodes RawUnionValues.
unionDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform transform that follows SET DISTINCT semantics to compute the union with provided PCollection<T>.
unionDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
Returns a new PTransform transform that follows SET DISTINCT semantics which takes a PCollectionList<PCollection<T>> and returns a PCollection<T> containing the union of collections done in order for all collections in PCollectionList<T>.
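A minimal sketch of the two unionDistinct shapes listed above, assuming String PCollections named left, right, first, second, and third:
  // Binary form: distinct union of two collections.
  PCollection<String> union = left.apply(Sets.unionDistinct(right));

  // List form: distinct union over an ordered PCollectionList.
  PCollection<String> listUnion =
      PCollectionList.of(first).and(second).and(third).apply(Sets.unionDistinct());
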
UniqueIdGenerator - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Generate unique IDs that can be used to differentiate different jobs and partitions.
UniqueIdGenerator() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
 
UNKNOWN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
Returns an instance with all values set to INFINITY.
unknown() - Static method in class org.apache.beam.sdk.io.fs.MatchResult
 
UNKNOWN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
UnknownLogicalType<T> - Class in org.apache.beam.sdk.schemas.logicaltypes
A base class for logical types that are not understood by the Java SDK.
UnknownLogicalType(String, byte[], Schema.FieldType, Object, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.UnknownLogicalType
 
unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
 
unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
 
unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
 
unparse(SqlWriter, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
 
unparseCall(SqlWriter, SqlCall, int, int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
unparseDateTimeLiteral(SqlWriter, SqlAbstractDateTimeLiteral, int, int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
 
unparseSqlIntervalLiteral(SqlWriter, SqlIntervalLiteral, int, int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
BigQuery interval syntax: INTERVAL int64 time_unit.
unpersist() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
unpin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
Unpin this object.
UnprocessedEvent<EventT> - Class in org.apache.beam.sdk.extensions.ordered
Combines the source event which failed to process with the failure reason.
UnprocessedEvent() - Constructor for class org.apache.beam.sdk.extensions.ordered.UnprocessedEvent
 
UnprocessedEvent.Reason - Enum in org.apache.beam.sdk.extensions.ordered
 
unprocessedEvents() - Method in class org.apache.beam.sdk.extensions.ordered.OrderedEventProcessorResult
 
unregisterConsumer(String) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
Unregisters a previously registered consumer.
unregisterReceiver(String) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
Receivers are only expected to be unregistered when bundle processing has completed successfully.
unregisterReceiver(String) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
UNSAFELY_ATTEMPT_TO_PROCESS_UNBOUNDED_DATA_IN_BATCH_MODE - Static variable in class org.apache.beam.runners.dataflow.DataflowRunner
Experiment to "unsafely attempt to process unbounded data in batch mode".
UNSIGNED_LEXICOGRAPHICAL_COMPARATOR - Static variable in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
UnsignedOptions - Class in org.apache.beam.sdk.extensions.sbe
Options for controlling what to do with unsigned types, specifically whether to use a higher bit count or, in the case of uint64, a string.
UnsignedOptions() - Constructor for class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
 
UnsignedOptions.Behavior - Enum in org.apache.beam.sdk.extensions.sbe
Defines the exact behavior for unsigned types.
UnsignedOptions.Builder - Class in org.apache.beam.sdk.extensions.sbe
Builder for UnsignedOptions.
unsupported() - Static method in interface org.apache.beam.runners.fnexecution.control.BundleSplitHandler
Returns a bundle split handler that throws on any split response.
unsupported() - Static method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
 
unsupported() - Static method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandlerFactory
Throws a UnsupportedOperationException on the first access.
unsupported() - Static method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
Throws a UnsupportedOperationException on the first access.
UnusedFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.UnusedFn
 
unwindowedFilename(int, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
unwindowedFilename(int, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
When a sink has not requested windowed or triggered output, this method will be invoked to return the file resource to be created given the base output directory and a FileBasedSink.OutputFileHints containing information about the file, including a suggested (e.g.
unwrap(Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
 
update(UserCodeExecutionException) - Method in interface org.apache.beam.io.requestresponse.CallShouldBackoff
Update the state of whether to back off using information about the exception.
update(ResponseT) - Method in interface org.apache.beam.io.requestresponse.CallShouldBackoff
Update the state of whether to back off using information about the response.
update(long) - Method in class org.apache.beam.runners.jet.metrics.DistributionImpl
 
update(long, long, long, long) - Method in class org.apache.beam.runners.jet.metrics.DistributionImpl
 
update(KinesisRecord) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicy
 
update(KinesisRecord) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
 
update(KinesisRecord) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
 
update(KinesisRecord) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
 
update(Instant, T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
Updates the estimator with the bytes of records if it is selected to be sampled.
update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
Updates the estimator with the bytes of records.
update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
NoOp.
update(Timestamp, T) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
Updates the estimator with the size of the records.
update(KinesisRecord) - Method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicy
 
update(KinesisRecord) - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
 
update(KinesisRecord) - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
 
update(KinesisRecord) - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
 
update(long) - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
 
update(long, long, long, long) - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
 
update(double) - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
 
update(double...) - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
 
update(long) - Method in interface org.apache.beam.sdk.metrics.Distribution
Add an observation to this distribution.
update(long, long, long, long) - Method in interface org.apache.beam.sdk.metrics.Distribution
 
update(double) - Method in interface org.apache.beam.sdk.metrics.Histogram
Add an observation to this histogram.
update(double...) - Method in interface org.apache.beam.sdk.metrics.Histogram
Add observations to this histogram.
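As a sketch of the update(...) entries above, recording observations in a user-defined Distribution metric from inside a hypothetical DoFn (MyDoFn and the metric name are placeholders; the Histogram interface follows the same update pattern):
  private final Distribution latencyMs = Metrics.distribution(MyDoFn.class, "latencyMs");

  @ProcessElement
  public void process(@Element Long elapsedMillis) {
    // Each call adds one observation to the distribution (sum, count, min, max).
    latencyMs.update(elapsedMillis);
  }
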
update(double) - Method in class org.apache.beam.sdk.metrics.NoOpHistogram
 
update(double...) - Method in class org.apache.beam.sdk.metrics.NoOpHistogram
 
UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
UpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.UpdateBuilder
 
updateCacheCandidates(Pipeline, SparkPipelineTranslator, EvaluationContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
Evaluator that updates/populates the cache candidates.
updateCompatibilityVersionLessThan(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.StreamingOptions
 
UpdateConfiguration - Class in org.apache.beam.sdk.io.mongodb
Builds a MongoDB UpdateConfiguration object.
UpdateConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
 
updateConsumerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
updateConsumerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Deprecated.
as of version 2.13. Use KafkaIO.Read.withConsumerConfigUpdates(Map) instead
updateDataClientSettings(BigtableDataSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
Update BigtableDataSettings.Builder with custom configurations.
updateDataRecordCommittedToEmitted(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
 
updateDetectNewPartitionWatermark(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Update the watermark cell for the Detect New Partition step.
updateFailedRpcMetrics(Instant, Instant, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
 
updateFailedRpcMetrics(Instant, Instant, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
Record the rpc status and latency of a failed StreamingInserts RPC call.
updateFailedRpcMetrics(Instant, Instant, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
 
UpdateField - Class in org.apache.beam.sdk.io.mongodb
 
UpdateField() - Constructor for class org.apache.beam.sdk.io.mongodb.UpdateField
 
updateInstanceAdminClientSettings(BigtableInstanceAdminSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
Update BigtableInstanceAdminSettings.Builder with custom configurations.
updateJob(String, Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Updates the Dataflow Job with the given jobId.
updateKafkaMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
Export all metrics recorded in this instance to the underlying perWorkerMetrics containers.
updateKafkaMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
 
updateKafkaMetrics() - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
 
updatePartitionCreatedToScheduled(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_CREATED_TO_SCHEDULED_MS if the metric is enabled.
updatePartitionScheduledToRunning(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_SCHEDULED_TO_RUNNING_MS if the metric is enabled.
updateProcessingDelayFromCommitTimestamp(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Adds measurement of an instance for the ChangeStreamMetrics.PROCESSING_DELAY_FROM_COMMIT_TIMESTAMP.
updateProducerIndex() - Method in class org.apache.beam.sdk.io.solace.write.UnboundedSolaceWriter
 
updateProducerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Deprecated.
as of version 2.13. Use KafkaIO.Write.withProducerConfigUpdates(Map) instead.
updateProducerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Deprecated.
updateRetriedRowsWithStatus(String, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
 
updateRetriedRowsWithStatus(String, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
Update metrics for rows that were retried due to an RPC error.
updateRetriedRowsWithStatus(String, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
 
UpdateSchemaDestination<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Update destination schema based on data that is about to be copied into it.
UpdateSchemaDestination(BigQueryServices, PCollectionView<String>, ValueProvider<String>, BigQueryIO.Write.WriteDisposition, BigQueryIO.Write.CreateDisposition, int, String, Set<BigQueryIO.Write.SchemaUpdateOption>, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
updateSerializedOptions(String, Map<String, String>) - Static method in class org.apache.beam.sdk.options.ValueProviders
Deprecated.
updateStreamingInsertsMetrics(TableReference, int, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
 
updateStreamingInsertsMetrics(TableReference, int, int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
Export all metrics recorded in this instance to the underlying perWorkerMetrics containers.
updateStreamingInsertsMetrics(TableReference, int, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
 
updateSuccessfulRpcMetrics(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.NoOpStreamingInsertsMetrics
 
updateSuccessfulRpcMetrics(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics.StreamingInsertsMetricsImpl
Record the rpc status and latency of a successful StreamingInserts RPC call.
updateSuccessfulRpcMetrics(Instant, Instant) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.StreamingInsertsMetrics
 
updateSuccessfulRpcMetrics(String, Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
Record the rpc status and latency of a successful Kafka poll RPC call.
updateSuccessfulRpcMetrics(String, Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
 
updateSuccessfulRpcMetrics(String, Duration) - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
 
updateTableAdminClientSettings(BigtableTableAdminSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
Update BigtableTableAdminSettings.Builder with custom configurations.
updateTableSchema(String, String, String, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
updateTableSchema(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Updates a partition row to PartitionMetadata.State.FINISHED state.
updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Updates a partition row to PartitionMetadata.State.FINISHED state.
updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Updates a partition row to PartitionMetadata.State.RUNNING state.
updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Updates a partition row to PartitionMetadata.State.RUNNING state.
updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
updateWatermark(Range.ByteStringRange, Instant, ChangeStreamContinuationToken) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Update the metadata for the row key represented by the partition.
updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Update the partition watermark to the given timestamp.
updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Update the partition watermark to the given timestamp.
updateWindowingStrategy(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
UploadIdResponseInterceptor - Class in org.apache.beam.sdk.extensions.gcp.util
Implements a response interceptor that logs the upload id if the upload id header exists and it is the first request (does not have upload_id parameter in the request).
UploadIdResponseInterceptor() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.UploadIdResponseInterceptor
 
uploadToDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Upload to a Dicom Store.
uploadToDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
upTo(long) - Static method in class org.apache.beam.sdk.io.CountingSource
Deprecated.
use GenerateSequence instead
URL_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
 
URN - Static variable in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
URN - Static variable in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
 
URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
 
URN - Static variable in class org.apache.beam.sdk.io.GenerateSequence.External
 
URN - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
 
URN_WITH_METADATA - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
 
URN_WITHOUT_METADATA - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
 
USE_INDEXED_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
useAbstractConvertersForConversion(RelTraitSet, RelTraitSet) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Enables interpreting logical types into their corresponding types (ie.
useBeamSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, then the BigQuery schema will be inferred from the input schema.
USER_DEFINED_JAVA_AGGREGATE_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
 
USER_DEFINED_JAVA_SCALAR_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
 
USER_DEFINED_SQL_FUNCTIONS - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
 
USER_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
USER_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
UserAgentFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
 
UserCodeExecutionException - Exception in org.apache.beam.io.requestresponse
Base Exception for signaling errors in user custom code.
UserCodeExecutionException(String) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeExecutionException
 
UserCodeExecutionException(String, Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeExecutionException
 
UserCodeExecutionException(Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeExecutionException
 
UserCodeExecutionException(String, Throwable, boolean, boolean) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeExecutionException
 
UserCodeQuotaException - Exception in org.apache.beam.io.requestresponse
Extends UserCodeExecutionException to allow the user custom code to specifically signal a quota or API overuse related error.
UserCodeQuotaException(String) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeQuotaException
 
UserCodeQuotaException(String, Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeQuotaException
 
UserCodeQuotaException(Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeQuotaException
 
UserCodeQuotaException(String, Throwable, boolean, boolean) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeQuotaException
 
UserCodeRemoteSystemException - Exception in org.apache.beam.io.requestresponse
A UserCodeExecutionException that signals an error with a remote system.
UserCodeRemoteSystemException(String) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
 
UserCodeRemoteSystemException(String, Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
 
UserCodeRemoteSystemException(Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
 
UserCodeRemoteSystemException(String, Throwable, boolean, boolean) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeRemoteSystemException
 
UserCodeTimeoutException - Exception in org.apache.beam.io.requestresponse
An extension of UserCodeExecutionException to specifically signal a user code timeout.
UserCodeTimeoutException(String) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeTimeoutException
 
UserCodeTimeoutException(String, Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeTimeoutException
 
UserCodeTimeoutException(Throwable) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeTimeoutException
 
UserCodeTimeoutException(String, Throwable, boolean, boolean) - Constructor for exception org.apache.beam.io.requestresponse.UserCodeTimeoutException
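A sketch of how these exception types are intended to be thrown from user code, assuming the Caller interface from the same org.apache.beam.io.requestresponse package (whose call method declares UserCodeExecutionException); the remote-call helper is hypothetical:

  import java.io.IOException;
  import java.net.SocketTimeoutException;
  import org.apache.beam.io.requestresponse.Caller;
  import org.apache.beam.io.requestresponse.UserCodeExecutionException;
  import org.apache.beam.io.requestresponse.UserCodeRemoteSystemException;
  import org.apache.beam.io.requestresponse.UserCodeTimeoutException;

  class MyApiCaller implements Caller<String, String> {
    @Override
    public String call(String request) throws UserCodeExecutionException {
      try {
        return invokeRemoteService(request); // hypothetical helper that contacts the remote API
      } catch (SocketTimeoutException e) {
        throw new UserCodeTimeoutException("request timed out", e);
      } catch (IOException e) {
        throw new UserCodeRemoteSystemException("remote system error", e);
      } catch (RuntimeException e) {
        throw new UserCodeExecutionException("request failed", e);
      }
    }

    private String invokeRemoteService(String request) throws IOException {
      throw new IOException("not implemented in this sketch");
    }
  }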
 
userDefinedAggregateFunctions() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider
 
userDefinedAggregateFunctions() - Method in interface org.apache.beam.sdk.extensions.sql.udf.UdfProvider
Maps function names to aggregate function implementations.
userDefinedScalarFunctions() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider
 
userDefinedScalarFunctions() - Method in interface org.apache.beam.sdk.extensions.sql.udf.UdfProvider
Maps function names to scalar function implementations.
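A sketch of a UdfProvider exposing one scalar function, assuming userDefinedAggregateFunctions() keeps its default (empty) implementation; the function name "increment" and the AutoService registration are illustrative:

  import com.google.auto.service.AutoService;
  import java.util.Collections;
  import java.util.Map;
  import org.apache.beam.sdk.extensions.sql.udf.ScalarFn;
  import org.apache.beam.sdk.extensions.sql.udf.UdfProvider;

  @AutoService(UdfProvider.class)
  public class MyUdfProvider implements UdfProvider {
    // Scalar UDF: the annotated method is invoked once per input value.
    public static class Increment extends ScalarFn {
      @ApplyMethod
      public Long increment(Long value) {
        return value + 1;
      }
    }

    @Override
    public Map<String, ScalarFn> userDefinedScalarFunctions() {
      return Collections.singletonMap("increment", new Increment());
    }
  }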
useReflectApi() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
Deprecated.
kept for backward API compatibility only.
UserFunctionDefinitions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
Holds user defined function definitions.
UserFunctionDefinitions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions
 
UserFunctionDefinitions.Builder - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
 
UserFunctionDefinitions.JavaScalarFunction - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
 
username(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService.Builder
 
username() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
The username to use for authentication.
username(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
Set Solace username.
username() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
The username to use for authentication.
username(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
Set Solace username.
username(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
Username to be used to authenticate with the broker.
username() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 
userStateId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
USES_KEYED_STATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
UsesAttemptedMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Metrics.
UsesAttemptedMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesAttemptedMetrics
 
UsesBoundedSplittableParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize splittable ParDo with a DoFn.BoundedPerElement DoFn.
UsesBundleFinalizer - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which use DoFn.BundleFinalizer.
UsesCommittedMetrics - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Metrics.
UsesCounterMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Counter.
UsesCounterMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesCounterMetrics
 
UsesCustomWindowMerging - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize custom window merging.
UsesDistributionMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Distribution.
UsesDistributionMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesDistributionMetrics
 
usesErrorHandler() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
UsesExternalService - Interface in org.apache.beam.sdk.testing
Category tag for tests which rely on a pre-defined port, such as an expansion service or transform service.
UsesFailureMessage - Interface in org.apache.beam.sdk.testing
Category tag for tests which validate that the correct failure message is provided by a failed pipeline.
UsesGaugeMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Gauge.
UsesGaugeMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesGaugeMetrics
 
UsesImpulse - Class in org.apache.beam.sdk.testing
Category for tests that use Impulse transformations.
UsesImpulse() - Constructor for class org.apache.beam.sdk.testing.UsesImpulse
 
UsesJavaExpansionService - Interface in org.apache.beam.sdk.testing
Category tag for tests which use the expansion service in Java.
UsesKeyInParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which use key.
UsesKms - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize --tempRoot from TestPipelineOptions and expect a default KMS key enabled for the specified bucket.
UsesLoopingTimer - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize looping timers in ParDo.
UsesMapState - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize MapState.
UsesMetricsPusher - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize the metrics pusher feature.
UsesMultimapState - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize MultimapState.
UsesOnWindowExpiration - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize DoFn.OnWindowExpiration.
UsesOrderedListState - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize OrderedListState.
UsesOrderedListState() - Constructor for class org.apache.beam.sdk.testing.UsesOrderedListState
 
UsesParDoLifecycle - Interface in org.apache.beam.sdk.testing
Category tag for the ParDoLifecycleTest for exclusion (BEAM-3241).
UsesPerKeyOrderedDelivery - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which rely on a runner providing per-key ordering.
UsesPerKeyOrderInBundle - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which rely on a runner providing per-key ordering between transforms in the same ProcessBundleRequest.
UsesProcessingTimeTimers - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize timers in ParDo.
UsesPythonExpansionService - Interface in org.apache.beam.sdk.testing
Category tag for tests which use the expansion service in Python.
UsesRequiresTimeSortedInput - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize DoFn.RequiresTimeSortedInput in stateful ParDo.
usesReshuffle - Variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSourceTransform
 
UsesSchema - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize schemas.
UsesSdkHarnessEnvironment - Interface in org.apache.beam.sdk.testing
Category tag for tests which validate that the SDK harness executes in a well formed environment.
UsesSetState - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize SetState.
UsesSideInputs - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which use sideinputs.
UsesSideInputsWithDifferentCoders - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which use multiple side inputs with different coders.
UsesStatefulParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize stateful ParDo.
UsesStrictTimerOrdering - Interface in org.apache.beam.sdk.testing
Category for tests that enforce strict event-time ordering of fired timers, even in situations where multiple timers mutually set one another and the watermark hops arbitrarily far into the future.
UsesStringSetMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize StringSet.
UsesStringSetMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesStringSetMetrics
 
UsesSystemMetrics - Interface in org.apache.beam.sdk.testing
Category tag for tests that use System metrics.
UsesTestStream - Interface in org.apache.beam.sdk.testing
Category tag for tests that use TestStream, which is not a part of the Beam model but a special feature currently only implemented by the direct runner and the Flink Runner (streaming).
UsesTestStreamWithMultipleStages - Interface in org.apache.beam.sdk.testing
Subcategory for UsesTestStream tests which use TestStream across multiple stages.
UsesTestStreamWithOutputTimestamp - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which use outputTimestamp.
UsesTestStreamWithProcessingTime - Interface in org.apache.beam.sdk.testing
Subcategory for UsesTestStream tests which use the processing time feature of TestStream.
UsesTimerMap - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which use timerMap.
UsesTimersInParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize timers in ParDo.
UsesTriggeredSideInputs - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which use triggered sideinputs.
UsesUnboundedPCollections - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize at least one unbounded PCollection.
UsesUnboundedSplittableParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize splittable ParDo with a DoFn.UnboundedPerElement DoFn.
using(String...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
Perform a natural join between the PCollections.
using(Integer...) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
Perform a natural join between the PCollections.
using(FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
Perform a natural join between the PCollections.
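A minimal sketch of a natural join on a shared field; the PCollections and the "userId" field name are hypothetical, and both inputs must be schema-aware:

  import org.apache.beam.sdk.schemas.transforms.Join;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.beam.sdk.values.Row;

  class JoinExample {
    // Inner-joins users and orders on the common "userId" field.
    static PCollection<Row> joinOnUserId(PCollection<Row> users, PCollection<Row> orders) {
      return users.apply(Join.<Row, Row>innerJoin(orders).using("userId"));
    }
  }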
usingFnApiClient(InstructionRequestHandler, FnDataService) - Static method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
Creates a client for a particular SDK harness.
usingHigherBitSize(UnsignedOptions.Behavior) - Static method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
Returns options for using a higher bit count for unsigned types.
usingRedis(URI, Coder<RequestT>, Coder<ResponseT>, Duration) - Static method in class org.apache.beam.io.requestresponse.Cache
Builds a Cache.Pair using a Redis cache to read and write RequestT and ResponseT pairs.
usingSameBitSize() - Static method in class org.apache.beam.sdk.extensions.sbe.UnsignedOptions
Returns options for using the same bit size for all unsigned types.
usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Enables BigQuery's Standard SQL dialect when reading from a query.
usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
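A sketch of reading query results with Standard SQL; the query against a public sample table is illustrative only:

  import com.google.api.services.bigquery.model.TableRow;
  import org.apache.beam.sdk.Pipeline;
  import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
  import org.apache.beam.sdk.values.PCollection;

  class StandardSqlReadExample {
    static PCollection<TableRow> read(Pipeline p) {
      // Without usingStandardSql(), the query would be interpreted as legacy SQL.
      return p.apply(
          BigQueryIO.readTableRows()
              .fromQuery("SELECT word, word_count FROM `bigquery-public-data.samples.shakespeare`")
              .usingStandardSql());
    }
  }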
UTCDateOnly() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
 
UTCTimeOnly() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
 
UTCTimestamp() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
 
Utils - Class in org.apache.beam.runners.jet
Various common methods used by the Jet based runner.
Utils() - Constructor for class org.apache.beam.runners.jet.Utils
 
Utils() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers.Utils
 
Utils.ByteArrayKey - Class in org.apache.beam.runners.jet
A wrapper of byte[] that can be used as a hash-map key.
Uuid - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A Uuid storable in a Pub/Sub Lite attribute.
Uuid() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
UUID_SCHEMA - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 
UuidCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A coder for a Uuid.
UuidCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
UuidDeduplicationOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
Options for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
UuidDeduplicationOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
UuidDeduplicationOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
UuidDeduplicationTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A transform for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
UuidDeduplicationTransform(UuidDeduplicationOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
 
uuidExtractor() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
UuidLogicalType - Class in org.apache.beam.sdk.schemas.logicaltypes
Base class for types representing UUID as two long values.
UuidLogicalType() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
 

V

v1() - Static method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreIO
Returns a DatastoreV1 that provides an API for accessing Cloud Datastore through the v1 version of the Datastore client library.
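A sketch of a read through the v1 API; the project id and the "Task" kind are placeholders:

  import com.google.datastore.v1.Entity;
  import com.google.datastore.v1.KindExpression;
  import com.google.datastore.v1.Query;
  import org.apache.beam.sdk.Pipeline;
  import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
  import org.apache.beam.sdk.values.PCollection;

  class DatastoreReadExample {
    static PCollection<Entity> readTasks(Pipeline p) {
      // Query for every entity of the (placeholder) "Task" kind.
      Query query =
          Query.newBuilder().addKind(KindExpression.newBuilder().setName("Task")).build();
      return p.apply(
          DatastoreIO.v1().read().withProjectId("my-project").withQuery(query));
    }
  }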
v1() - Static method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreIO
 
v17() - Static method in class org.apache.beam.sdk.io.googleads.GoogleAdsIO
 
V1_READ_OVERRIDE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
 
VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
VALID_FIELD_TYPE_SET - Static variable in class org.apache.beam.sdk.io.csv.CsvIO
The valid Schema.FieldTypes from which CsvIO converts CSV records to fields.
VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
VALID_PROVIDERS - Static variable in class org.apache.beam.sdk.io.fileschematransform.FileReadSchemaTransformConfiguration
 
VALID_START_OFFSET_VALUES - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
validate() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
 
validate(Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMicros
 
validate(Schema) - Method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroJavaTimeConversions.LocalTimestampMillis
 
validate(AwsOptions, ClientConfiguration) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
Utility to validate if all necessary configuration is available to create clients using the ClientBuilderFactory configured in AwsOptions.
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.CompressedSource
Validates that the delegate source is a valid source and that the channel factory is not null.
validate() - Method in class org.apache.beam.sdk.io.csv.providers.CsvWriteTransformProvider.CsvWriteConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSink
 
validate() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryWriteConfiguration
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
validate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
Validates the configuration object.
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerReadSchemaTransformProvider.SpannerReadSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.jdbc.JdbcReadSchemaTransformProvider.JdbcReadSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.jdbc.JdbcWriteSchemaTransformProvider.JdbcWriteSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.json.providers.JsonWriteTransformProvider.JsonWriteConfiguration
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
validate() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
 
validate(T) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
 
validate() - Method in class org.apache.beam.sdk.io.Source
Checks that this source is valid, before it can be used in a pipeline.
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.WriteFiles
 
validate(Class<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.options.PipelineOptionsValidator
Validates that the passed PipelineOptions conforms to all the validation criteria from the passed in interface.
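A minimal sketch pairing Validation.Required with an explicit validation call; the options interface and the --inputFile property are hypothetical:

  import org.apache.beam.sdk.options.PipelineOptions;
  import org.apache.beam.sdk.options.PipelineOptionsFactory;
  import org.apache.beam.sdk.options.PipelineOptionsValidator;
  import org.apache.beam.sdk.options.Validation;

  public class ValidateOptionsExample {
    public interface MyOptions extends PipelineOptions {
      @Validation.Required
      String getInputFile();

      void setInputFile(String value);
    }

    public static void main(String[] args) {
      // Parse without validating, then validate explicitly; this throws if --inputFile is missing.
      MyOptions options = PipelineOptionsFactory.fromArgs(args).as(MyOptions.class);
      MyOptions validated = PipelineOptionsValidator.validate(MyOptions.class, options);
      System.out.println("input file: " + validated.getInputFile());
    }
  }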
validate() - Method in class org.apache.beam.sdk.providers.GenerateSequenceSchemaTransformProvider.GenerateSequenceConfiguration
 
validate() - Method in class org.apache.beam.sdk.schemas.transforms.providers.JavaRowUdf.Configuration
 
validate(PipelineOptions, Map<TupleTag<?>, PCollection<?>>, Map<TupleTag<?>, PCollection<?>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.PTransform
Called before running the Pipeline to verify this transform is fully and correctly specified.
validate(PipelineOptions, Map<TupleTag<?>, PCollection<?>>, Map<TupleTag<?>, PCollection<?>>) - Method in class org.apache.beam.sdk.transforms.PTransform
Called before running the Pipeline to verify this transform, its inputs, and outputs are fully and correctly specified.
VALIDATE_TIME_INTERVAL - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
VALIDATE_TIMESTAMP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
validateCli(Class<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.options.PipelineOptionsValidator
Validates that the passed PipelineOptions from command line interface (CLI) conforms to all the validation criteria from the passed in interface.
validateGetOutputTimestamps(WindowFn<T, W>, TimestampCombiner, List<List<Long>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Verifies that later-ending merged windows from any of the timestamps hold up output of earlier-ending windows, using the provided WindowFn and TimestampCombiner.
validateGetOutputTimestampsWithValue(WindowFn<T, W>, TimestampCombiner, List<List<TimestampedValue<T>>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Verifies that later-ending merged windows from any of the timestampValues hold up output of earlier-ending windows, using the provided WindowFn and TimestampCombiner.
validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
Validates the input GCS path is accessible and that the path is well formed.
validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
validateInputFilePatternSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validate that a file pattern is conforming.
validateJavaBean(List<FieldValueTypeInformation>, List<FieldValueTypeInformation>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
validateMaterializations(Iterable<PCollectionView<?>>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
 
validateMethod(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
 
validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
Validates the output GCS path is accessible and that the path is well formed.
validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
validateOutputFilePrefixSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validate that an output file prefix is conforming.
validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
validateOutputResourceSupported(ResourceId) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validates that an output path is conforming.
ValidatesRunner - Interface in org.apache.beam.sdk.testing
Category tag for tests which validate that a Beam runner is correctly implemented.
validateTimeInterval(Long, TimeUnit) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
This function validates that an interval is compatible with the ZetaSQL timestamp value range.
validateTimestamp(Long) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
This function validates that a Long representation of a timestamp is compatible with the ZetaSQL timestamp value range.
validateTimestampBounds(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
Validates that a given timestamp is within min and max bounds.
validateTransform() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Validates construction of this transform.
validateTransform() - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
 
Validation - Annotation Type in org.apache.beam.sdk.options
Validation represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the validation criteria to be used when validating with the PipelineOptionsValidator.
Validation.Required - Annotation Type in org.apache.beam.sdk.options
This criteria specifies that the value must be not null.
validator() - Method in class org.apache.beam.sdk.schemas.transforms.Cast
 
VALUE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
value() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
 
value() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
 
value(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
 
value(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
 
VALUE - Static variable in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
 
VALUE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
value() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
Value(int) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
 
Value(EnumerationType.Value, Object) - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
 
value() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a single value of type T.
value(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.value(), but with a coder explicitly supplied.
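A sketch of a stateful DoFn that declares a single value cell with an explicit coder; the element types and state id are illustrative:

  import org.apache.beam.sdk.coders.VarIntCoder;
  import org.apache.beam.sdk.state.StateSpec;
  import org.apache.beam.sdk.state.StateSpecs;
  import org.apache.beam.sdk.state.ValueState;
  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.values.KV;

  class CountPerKeyFn extends DoFn<KV<String, String>, KV<String, Integer>> {
    // One Integer cell per key and window, with the coder supplied explicitly.
    @StateId("count")
    private final StateSpec<ValueState<Integer>> countSpec = StateSpecs.value(VarIntCoder.of());

    @ProcessElement
    public void processElement(ProcessContext c, @StateId("count") ValueState<Integer> count) {
      Integer current = count.read();
      int updated = (current == null ? 0 : current) + 1;
      count.write(updated);
      c.output(KV.of(c.element().getKey(), updated));
    }
  }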
ValueCaptureType - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents the capture type of a change stream.
valueCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
valueCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
Returns the Coder to use for the elements of the resulting values iterable.
valueEncoderOf(KvCoder<K, V>) - Method in interface org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider
 
ValueInSingleWindow<T> - Class in org.apache.beam.sdk.values
An immutable tuple of value, timestamp, window, and pane.
ValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.ValueInSingleWindow
 
ValueInSingleWindow.Coder<T> - Class in org.apache.beam.sdk.values
A coder for ValueInSingleWindow.
valueOf(String) - Static method in enum org.apache.beam.io.debezium.Connectors
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
Deprecated.
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.runners.local.ExecutionDriver.DriverState
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options.SorterType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters.Kind
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.cassandra.CassandraIO.MutationType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Format
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.PluginType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Deprecated.
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.Compression
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Deprecated.
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileBasedSource.Mode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileIO.ReadMatches.DirectoryTreatment
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileSystem.LineageLevel
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.EmptyMatchTreatment
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.MatchResult.Status
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.iceberg.IcebergScanConfig.ScanType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.redis.RedisIO.Write.Method
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.snowflake.enums.StreamingLogLevel
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.solace.data.Solace.DestinationType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.solace.SolaceIO.WriterType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.TextIO.CompressionType
Deprecated.
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
Deprecated.
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
Deprecated.
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.jmh.schemas.RowBundle.Action
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.metrics.Lineage.Type
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.PipelineResult.State
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.ListQualifier
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.MapQualifier
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind
Returns the enum constant of this type with the specified name.
valueOf(String) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
Return an EnumerationType.Value corresponding to one of the enumeration strings.
valueOf(int) - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
Return an EnumerationType.Value corresponding to one of the enumeration integer values.
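A sketch of both lookups on an enumeration of placeholder color names:

  import org.apache.beam.sdk.schemas.logicaltypes.EnumerationType;
  import org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value;

  class EnumerationLookupExample {
    static void lookups() {
      EnumerationType colors = EnumerationType.create("RED", "GREEN", "BLUE");
      Value byName = colors.valueOf("GREEN");  // look up by enumeration string
      Value byIndex = colors.valueOf(1);       // look up by enumeration integer value
      int ordinal = byName.getValue();         // underlying integer value, here 1
    }
  }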
valueOf(String) - Static method in enum org.apache.beam.sdk.schemas.Schema.EquivalenceNullablePolicy
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.schemas.Schema.TypeName
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout.Kind
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.state.TimeDomain
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.testing.TestStream.EventType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.display.DisplayData.Type
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
Deprecated.
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.IsBounded
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.values.PCollection.IsBounded
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
Returns the enum constant of this type with the specified name.
ValueOrMetadata(boolean, T, MetaT) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
 
ValueProvider<T> - Interface in org.apache.beam.sdk.options
A ValueProvider abstracts the notion of fetching a value that may or may not be currently available.
ValueProvider.Deserializer - Class in org.apache.beam.sdk.options
For internal use only; no backwards compatibility guarantees.
ValueProvider.NestedValueProvider<T,X> - Class in org.apache.beam.sdk.options
ValueProvider.NestedValueProvider is an implementation of ValueProvider that allows for wrapping another ValueProvider object.
ValueProvider.RuntimeValueProvider<T> - Class in org.apache.beam.sdk.options
ValueProvider.RuntimeValueProvider is an implementation of ValueProvider that allows for a value to be provided at execution time rather than at graph construction time.
ValueProvider.Serializer - Class in org.apache.beam.sdk.options
For internal use only; no backwards compatibility guarantees.
ValueProvider.StaticValueProvider<T> - Class in org.apache.beam.sdk.options
ValueProvider.StaticValueProvider is an implementation of ValueProvider that allows for a static value to be provided.
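A sketch showing the static and nested flavors together; the path is a placeholder:

  import org.apache.beam.sdk.options.ValueProvider;
  import org.apache.beam.sdk.options.ValueProvider.NestedValueProvider;
  import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

  class ValueProviderExample {
    static void providers() {
      // A value already known at graph construction time, wrapped as a ValueProvider.
      ValueProvider<String> path = StaticValueProvider.of("gs://my-bucket/input.txt");
      // A second provider derived lazily from the first one.
      ValueProvider<Integer> pathLength = NestedValueProvider.of(path, String::length);
      System.out.println(pathLength.get());
    }
  }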
ValueProviders - Class in org.apache.beam.sdk.options
Utilities for working with the ValueProvider interface.
values() - Static method in enum org.apache.beam.io.debezium.Connectors
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
Deprecated.
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.runners.local.ExecutionDriver.DriverState
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.ordered.UnprocessedEvent.Reason
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sbe.UnsignedOptions.Behavior
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options.SorterType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sql.impl.cep.CEPKind
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters.Kind
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.PushDownOptions
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.cassandra.CassandraIO.MutationType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Format
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.FormatProvider
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.Hadoop
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.cdap.PluginConstants.PluginType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.clickhouse.TableSchema.DefaultType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.clickhouse.TableSchema.TypeName
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Deprecated.
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.Compression
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Deprecated.
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.FileBasedSource.Mode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.FileIO.ReadMatches.DirectoryTreatment
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.FileSystem.LineageLevel
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.fs.EmptyMatchTreatment
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.fs.MatchResult.Status
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.RpcMethod
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.iceberg.IcebergScanConfig.ScanType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Writes just the values to Kafka.
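A sketch of writing only values (no keys) to Kafka; the bootstrap servers and topic are placeholders:

  import org.apache.beam.sdk.io.kafka.KafkaIO;
  import org.apache.beam.sdk.values.PCollection;
  import org.apache.kafka.common.serialization.StringSerializer;

  class KafkaWriteValuesExample {
    static void writeValues(PCollection<String> lines) {
      lines.apply(
          KafkaIO.<Void, String>write()
              .withBootstrapServers("broker-1:9092") // placeholder brokers
              .withTopic("my-topic")                 // placeholder topic
              .withValueSerializer(StringSerializer.class)
              .values()); // each element becomes a record value with a null key
    }
  }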
values() - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.redis.RedisIO.Write.Method
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.snowflake.enums.CreateDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.snowflake.enums.StreamingLogLevel
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.snowflake.enums.WriteDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.solace.data.Solace.DestinationType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.solace.SolaceIO.SubmissionMode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.solace.SolaceIO.WriterType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.TextIO.CompressionType
Deprecated.
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
Deprecated.
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
Deprecated.
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.jmh.schemas.RowBundle.Action
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.metrics.Lineage.Type
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.PipelineResult.State
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.ListQualifier
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.MapQualifier
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.schemas.Schema.EquivalenceNullablePolicy
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.schemas.Schema.TypeName
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields.Fanout.Kind
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
 
values() - Method in interface org.apache.beam.sdk.state.MapState
Returns an Iterable over the values contained in this map.
values() - Static method in enum org.apache.beam.sdk.state.TimeDomain
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.testing.TestStream.EventType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in class org.apache.beam.sdk.transforms.Deduplicate
Returns a deduplication transform that deduplicates values for up to 10 minutes within the processing time domain.
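A minimal usage sketch for Deduplicate.values(), assuming an existing PCollection<String> named events (the name and element type are illustrative, not taken from this index):
    import org.apache.beam.sdk.transforms.Deduplicate;

    // Drops duplicate elements observed within the default 10-minute
    // processing-time window; later duplicates are discarded.
    PCollection<String> deduped = events.apply(Deduplicate.<String>values());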
values() - Static method in enum org.apache.beam.sdk.transforms.display.DisplayData.Type
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
Deprecated.
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.IsBounded
Returns an array containing the constants of this enum type, in the order they are declared.
Values<V> - Class in org.apache.beam.sdk.transforms
Values<V> takes a PCollection of KV<K, V>s and returns a PCollection<V> of the values.
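For example, assuming a PCollection<KV<String, Integer>> named wordCounts (a hypothetical name), the values can be extracted as follows:
    import org.apache.beam.sdk.transforms.Values;

    // Keeps only the V part of each KV<K, V>.
    PCollection<Integer> counts = wordCounts.apply(Values.<Integer>create());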
values() - Static method in enum org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.values.PCollection.IsBounded
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
Returns an array containing the constants of this enum type, in the order they are declared.
ValueState<T> - Interface in org.apache.beam.sdk.state
A ReadableState cell containing a single value.
ValueWithRecordId<ValueT> - Class in org.apache.beam.sdk.values
For internal use only; no backwards compatibility guarantees.
ValueWithRecordId(ValueT, byte[]) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId
 
ValueWithRecordId.StripIdsDoFn<T> - Class in org.apache.beam.sdk.values
DoFn to turn a ValueWithRecordId<T> back to the value T.
ValueWithRecordId.ValueWithRecordIdCoder<ValueT> - Class in org.apache.beam.sdk.values
A Coder for ValueWithRecordId, using a wrapped value Coder.
ValueWithRecordIdCoder(Coder<ValueT>) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
VARBINARY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
VARCHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
VariableBytes - Class in org.apache.beam.sdk.schemas.logicaltypes
A LogicalType representing a variable-length byte array with specified maximum length.
VariableString - Class in org.apache.beam.sdk.schemas.logicaltypes
A LogicalType representing a variable-length string with specified maximum length.
VarianceFn<T extends java.lang.Number> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
Combine.CombineFn for Variance on Number types.
VarIntBenchmark - Class in org.apache.beam.sdk.jmh.util
Benchmarks for VarInt and variants.
VarIntBenchmark() - Constructor for class org.apache.beam.sdk.jmh.util.VarIntBenchmark
 
VarIntBenchmark.BlackholeOutput - Class in org.apache.beam.sdk.jmh.util
Output to Blackhole.
VarIntBenchmark.Bytes - Class in org.apache.beam.sdk.jmh.util
Input from randomly generated bytes.
VarIntBenchmark.ByteStringOutput - Class in org.apache.beam.sdk.jmh.util
Output to ByteStringOutputStream.
VarIntBenchmark.Longs - Class in org.apache.beam.sdk.jmh.util
Input from randomly generated longs.
VarIntCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes Integers using between 1 and 5 bytes.
VarLongCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes Longs using between 1 and 10 bytes.
verifyBucketAccessible(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
Checks whether the GCS bucket exists.
verifyCompatibility(Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Throw IncompatibleWindowException if this WindowFn does not perform the same merging as the given WindowFn.
verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
verifyDeterministic() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.AtomicCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.Coder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic(Coder<?>, String, Iterable<Coder<?>>) - Static method in class org.apache.beam.sdk.coders.Coder
Verifies all of the provided coders are deterministic.
verifyDeterministic(Coder<?>, String, Coder<?>...) - Static method in class org.apache.beam.sdk.coders.Coder
Verifies all of the provided coders are deterministic.
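A typical delegation pattern in a composite coder's own verifyDeterministic(), a sketch modeled on how coders such as KvCoder use this helper (the component coder fields are assumptions):
    @Override
    public void verifyDeterministic() throws NonDeterministicException {
      // Defer to the static helper: the composite coder is deterministic
      // only if all of its component coders are.
      verifyDeterministic(this, "Key and value coders must be deterministic", keyCoder, valueCoder);
    }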
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.CustomCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DelegateCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DequeCoder
Deque sizes are always known, so DequeIterable may be deterministic while the general IterableLikeCoder is not.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DoubleCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DurationCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.FloatCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.InstantCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.KvCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
LengthPrefixCoder is deterministic if the nested Coder is.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ListCoder
List sizes are always known, so ListIterable may be deterministic while the general IterableLikeCoder is not.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.MapCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.NullableCoder
NullableCoder is deterministic if the nested Coder is.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SerializableCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SetCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SnappyCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VoidCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ZstdCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.avro.coders.AvroCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.splunk.SplunkEventCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
verifyFieldValue(Object, Schema.FieldType, String) - Static method in class org.apache.beam.sdk.values.SchemaVerification
 
verifyPAssertsSucceeded(Pipeline, PipelineResult) - Static method in class org.apache.beam.sdk.testing.TestPipeline
Verifies that all PAsserts in the pipeline have been executed and were successful.
verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
verifyPath(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validate that a path is well-formed and that the path is accessible.
VersionDependentFlinkPipelineOptions - Interface in org.apache.beam.runners.flink
 
via(ProcessFunction<T, Long>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
via(ProcessFunction<KV<K, V>, KV<K, Long>>) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
via(Contextful<Contextful.Fn<UserT, OutputT>>, Contextful<Contextful.Fn<DestinationT, FileIO.Sink<OutputT>>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies how to create a FileIO.Sink for a particular destination and how to map the element type to the sink's output type.
via(Contextful<Contextful.Fn<UserT, OutputT>>, FileIO.Sink<OutputT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Like FileIO.Write.via(Contextful, Contextful), but uses the same sink for all destinations.
via(Contextful<Contextful.Fn<DestinationT, FileIO.Sink<UserT>>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Like FileIO.Write.via(Contextful, Contextful), but the output type of the sink is the same as the type of the input collection.
via(FileIO.Sink<UserT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Like FileIO.Write.via(Contextful), but uses the same FileIO.Sink for all destinations.
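For instance, FileIO.Write.via(FileIO.Sink) can write a PCollection<String> named lines to text files with one sink for all destinations (the output path and suffix here are illustrative assumptions):
    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.TextIO;

    // Uses TextIO.sink() as the FileIO.Sink for every destination.
    lines.apply(FileIO.<String>write()
        .via(TextIO.sink())
        .to("/tmp/output")
        .withSuffix(".txt"));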
via(InferableFunction<? super InputT, ? extends Iterable<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
For an InferableFunction<InputT, ? extends Iterable<OutputT>> fn, returns a PTransform that applies fn to every element of the input PCollection<InputT> and outputs all of the elements to the output PCollection<OutputT>.
via(SimpleFunction<? super InputT, ? extends Iterable<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
Binary compatibility adapter for FlatMapElements.via(ProcessFunction).
via(ProcessFunction<NewInputT, ? extends Iterable<OutputT>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
For a ProcessFunction<InputT, ? extends Iterable<OutputT>> fn, returns a PTransform that applies fn to every element of the input PCollection<InputT> and outputs all of the elements to the output PCollection<OutputT>.
via(SerializableFunction<NewInputT, ? extends Iterable<OutputT>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
Binary compatibility adapter for FlatMapElements.via(ProcessFunction).
via(Contextful<Contextful.Fn<NewInputT, Iterable<OutputT>>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
Like FlatMapElements.via(ProcessFunction), but allows access to additional context.
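The canonical word-splitting pattern for FlatMapElements, sketched here with an assumed PCollection<String> named lines:
    import java.util.Arrays;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Each input line produces zero or more output words.
    PCollection<String> words = lines.apply(
        FlatMapElements.into(TypeDescriptors.strings())
            .via((String line) -> Arrays.asList(line.split("\\s+"))));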
via(InferableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
For InferableFunction<InputT, OutputT> fn, returns a PTransform that takes an input PCollection<InputT> and returns a PCollection<OutputT> containing fn.apply(v) for every element v in the input.
via(SimpleFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
Binary compatibility adapter for MapElements.via(InferableFunction).
via(ProcessFunction<NewInputT, OutputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
For a ProcessFunction<InputT, OutputT> fn and output type descriptor, returns a PTransform that takes an input PCollection<InputT> and returns a PCollection<OutputT> containing fn.apply(v) for every element v in the input.
via(SerializableFunction<NewInputT, OutputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
Binary compatibility adapter for MapElements.via(ProcessFunction).
via(Contextful<Contextful.Fn<NewInputT, OutputT>>) - Method in class org.apache.beam.sdk.transforms.MapElements
Like MapElements.via(ProcessFunction), but supports access to context, such as side inputs.
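For example, MapElements can compute word lengths from an assumed PCollection<String> named words:
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Applies the function to every element, producing one output per input.
    PCollection<Integer> lengths = words.apply(
        MapElements.into(TypeDescriptors.integers())
            .via((String word) -> word.length()));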
via(SerializableFunction<NewKeyT, K2>) - Method in class org.apache.beam.sdk.transforms.MapKeys
Returns a MapKeys<K1, K2, V> PTransform for a ProcessFunction<NewK1, K2> with predefined MapKeys.outputType.
via(SerializableFunction<NewValueT, V2>) - Method in class org.apache.beam.sdk.transforms.MapValues
Returns a MapValues transform for a ProcessFunction<NewV1, V2> with predefined MapValues.outputType.
viaFlatMapFn(String, Coder<?>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
 
viaMapFn(String, Coder<?>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
 
viaRandomKey() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
Encapsulates the sequence "pair input with unique key, apply Reshuffle.of(), drop the key" commonly used to break fusion.
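A minimal sketch of breaking fusion after an expensive expansion step using Reshuffle.viaRandomKey() (the input collection name is an assumption):
    import org.apache.beam.sdk.transforms.Reshuffle;

    // Redistributes elements so downstream work is not fused with the
    // upstream expansion; element values are unchanged.
    PCollection<String> redistributed = expanded.apply(Reshuffle.viaRandomKey());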
VideoIntelligence - Class in org.apache.beam.sdk.extensions.ml
Factory class for PTransforms integrating with Google Cloud AI - VideoIntelligence service.
VideoIntelligence() - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence
 
VideoIntelligence.AnnotateVideoFromBytes - Class in org.apache.beam.sdk.extensions.ml
A PTransform taking a PCollection of ByteString and an optional side input with a context map and emitting lists of VideoAnnotationResults for each element.
VideoIntelligence.AnnotateVideoFromBytesWithContext - Class in org.apache.beam.sdk.extensions.ml
A PTransform taking a PCollection of KV of ByteString and VideoContext and emitting lists of VideoAnnotationResults for each element.
VideoIntelligence.AnnotateVideoFromUri - Class in org.apache.beam.sdk.extensions.ml
A PTransform taking a PCollection of String and an optional side input with a context map and emitting lists of VideoAnnotationResults for each element.
VideoIntelligence.AnnotateVideoFromURIWithContext - Class in org.apache.beam.sdk.extensions.ml
A PTransform taking a PCollection of KV of String and VideoContext and emitting lists of VideoAnnotationResults for each element.
View - Class in org.apache.beam.sdk.transforms
Transforms for creating PCollectionViews from PCollections (to read them as side inputs).
View.AsIterable<T> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsList<T> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsMap<K,V> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsMultimap<K,V> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsSingleton<T> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.CreatePCollectionView<ElemT,ViewT> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.ToListViewDoFn<T> - Class in org.apache.beam.sdk.transforms
Provides an index to value mapping using a random starting index and also provides an offset range for each window seen.
viewAsValues(PCollectionView<V>, Coder<V>) - Static method in class org.apache.beam.sdk.transforms.Reify
Pairs each element in a collection with the value of a side input associated with the element's window.
ViewFn<PrimitiveViewT,ViewT> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
ViewFn() - Constructor for class org.apache.beam.sdk.transforms.ViewFn
 
viewInGlobalWindow(PCollectionView<V>, Coder<V>) - Static method in class org.apache.beam.sdk.transforms.Reify
Returns a PCollection consisting of a single element, containing the value of the given view in the global window.
ViewP - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for Beam's side input producing primitives.
visitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
visitArrayQualifier(FieldSpecifierNotationParser.ArrayQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.arrayQualifier().
visitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
visitArrayQualifierList(FieldSpecifierNotationParser.ArrayQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by the arrayQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
visitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.dotExpression().
visitDotExpression(FieldSpecifierNotationParser.DotExpressionContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.dotExpression().
visitErrorNode(ErrorNode) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
visitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
visitFieldSpecifier(FieldSpecifierNotationParser.FieldSpecifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.fieldSpecifier().
visitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
visitMapQualifier(FieldSpecifierNotationParser.MapQualifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by FieldSpecifierNotationParser.mapQualifier().
visitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
visitMapQualifierList(FieldSpecifierNotationParser.MapQualifierListContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by the mapQualifierList labeled alternative in FieldSpecifierNotationParser.qualifierList().
visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.twister2.translators.Twister2BatchPipelineTranslator
 
visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
visitPrimitiveTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each primitive transform after all of its topological predecessors and inputs have been visited.
visitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
visitQualifiedComponent(FieldSpecifierNotationParser.QualifiedComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
visitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
visitQualifyComponent(FieldSpecifierNotationParser.QualifyComponentContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by the qualifyComponent labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
visitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
visitSimpleIdentifier(FieldSpecifierNotationParser.SimpleIdentifierContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by the simpleIdentifier labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
visitTerminal(TerminalNode) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
visitValue(PValue, TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
visitValue(PValue, TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each value after the transform that produced the value has been visited.
visitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
Visit a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
visitWildcard(FieldSpecifierNotationParser.WildcardContext) - Method in interface org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationVisitor
Visit a parse tree produced by the wildcard labeled alternative in FieldSpecifierNotationParser.dotExpressionComponent().
VOCABULARY - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
VOCABULARY - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
VoidCoder - Class in org.apache.beam.sdk.coders
A Coder for Void.
voids() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
vpnName(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService.Builder
 
vpnName() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionService
The name of the VPN to connect to.
vpnName(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory.Builder
Set Solace vpn name.
vpnName() - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthJcsmpSessionServiceFactory
The name of the VPN to connect to.
vpnName(String) - Method in class org.apache.beam.sdk.io.solace.broker.BasicAuthSempClientFactory.Builder
Set Solace vpn name.
vpnName(String) - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory.Builder
Optional.
vpnName() - Method in class org.apache.beam.sdk.io.solace.broker.GCPSecretSessionServiceFactory
 

W

Wait - Class in org.apache.beam.sdk.transforms
Delays processing of each window in a PCollection until signaled.
Wait() - Constructor for class org.apache.beam.sdk.transforms.Wait
 
Wait.OnSignal<T> - Class in org.apache.beam.sdk.transforms
waitForCompletion() - Method in class org.apache.beam.sdk.io.aws2.common.AsyncBatchWriteHandler
Wait for all pending requests to complete and check for failures.
waitForNMessages(int, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Repeatedly pulls messages from TestPubsub.subscriptionPath(), returning after receiving n messages or after the timeoutDuration elapses.
waitForPort(String, int, int) - Static method in class org.apache.beam.sdk.extensions.python.PythonService
 
waitForStart(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Future that waits for a start signal for duration.
waitForSuccess(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Wait for a success signal for duration.
waitForUpTo(Duration) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.TestPubsub.PollingAssertion
 
waitTillUp(int) - Method in class org.apache.beam.sdk.transformservice.launcher.TransformServiceLauncher
 
waitUntilFinish() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
waitUntilFinish(Duration, MonitoringUtil.JobMessagesHandler) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Waits until the pipeline finishes and returns the final status.
waitUntilFinish() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
Waits until the pipeline finishes and returns the final status.
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
Waits until the pipeline finishes and returns the final status.
waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
 
waitUntilFinish() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.jet.JetPipelineResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.jet.JetPipelineResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
 
waitUntilFinish(Duration) - Method in interface org.apache.beam.sdk.PipelineResult
Waits until the pipeline finishes and returns the final status.
waitUntilFinish() - Method in interface org.apache.beam.sdk.PipelineResult
Waits until the pipeline finishes and returns the final status.
WallTime(Instant) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
 
Watch - Class in org.apache.beam.sdk.transforms
Given a "poll function" that produces a potentially growing set of outputs for an input, this transform simultaneously continuously watches the growth of output sets of all inputs, until a per-input termination condition is reached.
Watch() - Constructor for class org.apache.beam.sdk.transforms.Watch
 
Watch.Growth<InputT,OutputT,KeyT> - Class in org.apache.beam.sdk.transforms
Watch.Growth.PollFn<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A function that computes the current set of outputs for the given input, in the form of a Watch.Growth.PollResult.
Watch.Growth.PollResult<OutputT> - Class in org.apache.beam.sdk.transforms
The result of a single invocation of a Watch.Growth.PollFn.
Watch.Growth.TerminationCondition<InputT,StateT> - Interface in org.apache.beam.sdk.transforms
A strategy for determining whether it is time to stop polling the current input regardless of whether its output is complete or not.
Watch.WatchGrowthFn<InputT,OutputT,KeyT,TerminationStateT> - Class in org.apache.beam.sdk.transforms
 
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
Like Read#watchForNewFiles(Duration, TerminationCondition, boolean).
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
Like Read#watchForNewFiles(Duration, TerminationCondition).
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
Continuously watches for new files matching the filepattern, polling it at the given interval, until the given termination condition is reached.
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
Same as Read#watchForNewFiles(Duration, TerminationCondition, boolean) with matchUpdatedFiles=false.
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.TextIO.Read
See MatchConfiguration#continuously(Duration, TerminationCondition, boolean).
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.TextIO.Read
Same as Read#watchForNewFiles(Duration, TerminationCondition, boolean) with matchUpdatedFiles=false.
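A sketch of continuous file watching with TextIO.Read.watchForNewFiles (the bucket path and durations are illustrative assumptions):
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.transforms.Watch;
    import org.joda.time.Duration;

    // Polls the filepattern every minute and stops watching once no new
    // files have appeared for one hour.
    PCollection<String> lines = pipeline.apply(TextIO.read()
        .from("gs://my-bucket/logs/*.txt")
        .watchForNewFiles(
            Duration.standardMinutes(1),
            Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));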
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
Same as Read#watchForNewFiles(Duration, TerminationCondition, boolean).
watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
Same as Read#watchForNewFiles(Duration, TerminationCondition).
WatermarkAdvancingStreamingListener() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarkAdvancingStreamingListener
 
WatermarkEstimator<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
A WatermarkEstimator which is used for estimating output watermarks of a splittable DoFn.
WatermarkEstimators - Class in org.apache.beam.sdk.fn.splittabledofn
Support utilities for interacting with WatermarkEstimators.
WatermarkEstimators() - Constructor for class org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators
 
WatermarkEstimators - Class in org.apache.beam.sdk.transforms.splittabledofn
A set of WatermarkEstimators that users can use to advance the output watermark for their associated splittable DoFns.
WatermarkEstimators.Manual - Class in org.apache.beam.sdk.transforms.splittabledofn
Concrete implementation of a ManualWatermarkEstimator.
WatermarkEstimators.MonotonicallyIncreasing - Class in org.apache.beam.sdk.transforms.splittabledofn
A watermark estimator that observes timestamps of records output from a DoFn reporting the timestamp of the last element seen as the current watermark.
WatermarkEstimators.WallTime - Class in org.apache.beam.sdk.transforms.splittabledofn
A watermark estimator that tracks wall time.
WatermarkEstimators.WatermarkAndStateObserver<WatermarkEstimatorStateT> - Interface in org.apache.beam.sdk.fn.splittabledofn
Interface which allows for accessing the current watermark and watermark estimator state.
WatermarkEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
 
WatermarkHoldState - Interface in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
WatermarkParameters - Class in org.apache.beam.sdk.io.aws2.kinesis
WatermarkParameters contains the parameters used for watermark computation.
WatermarkParameters() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
 
WatermarkParameters - Class in org.apache.beam.sdk.io.kinesis
WatermarkParameters contains the parameters used for watermark computation.
WatermarkParameters() - Constructor for class org.apache.beam.sdk.io.kinesis.WatermarkParameters
 
WatermarkPolicy - Interface in org.apache.beam.sdk.io.aws2.kinesis
Implement this interface to define a custom watermark calculation heuristic.
WatermarkPolicy - Interface in org.apache.beam.sdk.io.kinesis
Implement this interface to define a custom watermark calculation heuristic.
WatermarkPolicyFactory - Interface in org.apache.beam.sdk.io.aws2.kinesis
Implement this interface to create a WatermarkPolicy.
WatermarkPolicyFactory - Interface in org.apache.beam.sdk.io.kinesis
Implement this interface to create a WatermarkPolicy.
WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy - Class in org.apache.beam.sdk.io.aws2.kinesis
ArrivalTimeWatermarkPolicy uses WatermarkPolicyFactory.CustomWatermarkPolicy for watermark computation.
WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy - Class in org.apache.beam.sdk.io.kinesis
ArrivalTimeWatermarkPolicy uses WatermarkPolicyFactory.CustomWatermarkPolicy for watermark computation.
WatermarkPolicyFactory.CustomWatermarkPolicy - Class in org.apache.beam.sdk.io.aws2.kinesis
CustomWatermarkPolicy uses parameters defined in WatermarkParameters to compute watermarks.
WatermarkPolicyFactory.CustomWatermarkPolicy - Class in org.apache.beam.sdk.io.kinesis
CustomWatermarkPolicy uses parameters defined in WatermarkParameters to compute watermarks.
WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy - Class in org.apache.beam.sdk.io.aws2.kinesis
Watermark policy where the processing time is used as the event time.
WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy - Class in org.apache.beam.sdk.io.kinesis
Watermark policy where the processing time is used as the event time.
watermarkStateInternal(TimestampCombiner) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
WebPathParser - Class in org.apache.beam.sdk.io.gcp.healthcare
 
WebPathParser() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
 
WebPathParser.DicomWebPath - Class in org.apache.beam.sdk.io.gcp.healthcare
 
weeks(int, int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by weeks.
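For example, an assumed PCollection<Event> (Event is a hypothetical element type) could be windowed into calendar weeks starting on Monday:
    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.DateTimeConstants;

    // One-week windows, each week beginning on Monday.
    PCollection<Event> weekly = events.apply(
        Window.<Event>into(CalendarWindows.weeks(1, DateTimeConstants.MONDAY)));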
WeightedList<T> - Class in org.apache.beam.sdk.fn.data
Facade for a List that keeps track of weight, for cache limit reasons.
WeightedList(List<T>, long) - Constructor for class org.apache.beam.sdk.fn.data.WeightedList
 
where(TypeParameter<X>, TypeDescriptor<X>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a new TypeDescriptor where the type variable represented by typeParameter are substituted by type.
where(Type, Type) - Method in class org.apache.beam.sdk.values.TypeDescriptor
A more general form of TypeDescriptor.where(TypeParameter, TypeDescriptor) that returns a new TypeDescriptor by matching formal against actual to resolve type variables in the current TypeDescriptor.
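A common pattern for TypeDescriptor.where, sketched here, is binding a type variable when building descriptors for generic types (the helper name listOf is an assumption):
    import java.util.List;
    import org.apache.beam.sdk.values.TypeDescriptor;
    import org.apache.beam.sdk.values.TypeParameter;

    // Builds a TypeDescriptor<List<T>> for a concrete element descriptor by
    // substituting the type variable T.
    static <T> TypeDescriptor<List<T>> listOf(TypeDescriptor<T> elementType) {
      return new TypeDescriptor<List<T>>() {}.where(new TypeParameter<T>() {}, elementType);
    }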
whereFieldId(int, SerializableFunction<FieldT, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
Set a predicate based on the value of a field, where the field is specified by id.
whereFieldIds(List<Integer>, SerializableFunction<Row, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
Set a predicate based on the value of multiple fields, specified by id.
whereFieldName(String, SerializableFunction<FieldT, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
Set a predicate based on the value of a field, where the field is specified by name.
whereFieldNames(List<String>, SerializableFunction<Row, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
Set a predicate based on the value of multiple fields, specified by name.
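A sketch using whereFieldName with the schema-aware Filter transform (the row collection name, field name, and threshold are assumptions):
    import org.apache.beam.sdk.schemas.transforms.Filter;
    import org.apache.beam.sdk.values.Row;

    // Keeps only rows whose "age" field is at least 18.
    PCollection<Row> adults = people.apply(
        Filter.<Row>create().whereFieldName("age", (Integer age) -> age >= 18));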
widening(Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
 
Widening() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.Widening
 
WILDCARD - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
WILDCARD - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
WILDCARD() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
 
WildcardContext(FieldSpecifierNotationParser.DotExpressionComponentContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
 
wildcardToRegexp(String) - Static method in class org.apache.beam.sdk.io.FileSystemUtils
Expands glob expressions to regular expressions.
WINDMILL_SERVICE_EXPERIMENT - Static variable in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
Deprecated.
Use STREAMING_ENGINE_EXPERIMENT instead.
WindmillServiceStreamingRpcBatchLimitFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowStreamingPipelineOptions.WindmillServiceStreamingRpcBatchLimitFactory
 
window() - Method in class org.apache.beam.runners.twister2.utils.Twister2AssignContext
 
window() - Method in interface org.apache.beam.sdk.state.StateContext
Returns the window corresponding to the state.
window() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
Returns the window in which the timer is firing.
window() - Method in class org.apache.beam.sdk.transforms.DoFn.OnWindowExpirationContext
Returns the window in which the window expiration is firing.
Window<T> - Class in org.apache.beam.sdk.transforms.windowing
Window logically divides up or groups the elements of a PCollection into finite windows according to a WindowFn.
Window() - Constructor for class org.apache.beam.sdk.transforms.windowing.Window
 
window() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
Returns the window of the current element prior to this WindowFn being called.
Window.Assign<T> - Class in org.apache.beam.sdk.transforms.windowing
A Primitive PTransform that assigns windows to elements based on a WindowFn.
Window.ClosingBehavior - Enum in org.apache.beam.sdk.transforms.windowing
Specifies the conditions under which a final pane will be created when a window is permanently closed.
Window.OnTimeBehavior - Enum in org.apache.beam.sdk.transforms.windowing
Specifies the conditions under which an on-time pane will be created when a window is closed.
WINDOW_END - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
 
WINDOW_START - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
 
windowCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
windowCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
windowCoder(PCollection<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator
 
windowCoder() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns the Coder used for serializing the windows used by this WindowFn.
WindowedContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
windowedEncoder(Coder<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
windowedEncoder(Encoder<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
windowedEncoder(Coder<T>, Coder<W>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
windowedFilename(int, int, BoundedWindow, PaneInfo, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
windowedFilename(int, int, BoundedWindow, PaneInfo, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
When a sink has requested windowed or triggered output, this method will be invoked to return the file resource to be created given the base output directory and a FileBasedSink.OutputFileHints containing information about the file, including a suggested extension.
windowedMultiReceiver(DoFn<?, ?>.WindowedContext, Map<TupleTag<?>, Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
Returns a DoFn.MultiOutputReceiver that delegates to a DoFn.WindowedContext.
windowedMultiReceiver(DoFn<?, ?>.WindowedContext) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
Returns a DoFn.MultiOutputReceiver that delegates to a DoFn.WindowedContext.
windowedReceiver(DoFn<?, ?>.WindowedContext, TupleTag<T>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
Returns a DoFn.OutputReceiver that delegates to a DoFn.WindowedContext.
windowedValueEncoder(Encoder<T>, Encoder<W>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
Creates a Spark Encoder for T of StructType with fields value, timestamp, windows and pane.
windowedValues(Iterable<WindowedValue<T>>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.WindowedValues transform that produces a PCollection containing the elements of the provided Iterable with the specified windowing metadata.
windowedValues(WindowedValue<T>, WindowedValue<T>...) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.WindowedValues transform that produces a PCollection containing the specified elements with the specified windowing metadata.
windowedWrites - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Whether windowed writes are being used.
windowEncoder() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.TransformTranslator.Context
 
WindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
The argument to the Window transform used to assign elements into windows and to determine how windows are merged.
WindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn
 
WindowFn.AssignContext - Class in org.apache.beam.sdk.transforms.windowing
WindowFn.MergeContext - Class in org.apache.beam.sdk.transforms.windowing
WindowFnTestUtils - Class in org.apache.beam.sdk.testing
A utility class for testing WindowFns.
WindowFnTestUtils() - Constructor for class org.apache.beam.sdk.testing.WindowFnTestUtils
 
WindowGroupP<K,V> - Class in org.apache.beam.runners.jet.processors
Jet Processor implementation for Beam's GroupByKeyOnly + GroupAlsoByWindow primitives.
WINDOWING_STRATEGY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
WindowingStrategy<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.values
A WindowingStrategy describes the windowing behavior for a specific collection of values.
WindowingStrategy.AccumulationMode - Enum in org.apache.beam.sdk.values
The accumulation modes that can be used with windowing.
WindowIntoTransformProvider - Class in org.apache.beam.sdk.expansion.service
An implementation of TypedSchemaTransformProvider for WindowInto.
WindowIntoTransformProvider() - Constructor for class org.apache.beam.sdk.expansion.service.WindowIntoTransformProvider
 
WindowIntoTransformProvider.Configuration - Class in org.apache.beam.sdk.expansion.service
 
WindowIntoTransformProvider.Configuration.Builder - Class in org.apache.beam.sdk.expansion.service
 
WindowMappingFn<TargetWindowT extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
A function that takes the windows of elements in a main input and maps them to the appropriate window in a PCollectionView consumed as a side input.
WindowMappingFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
Create a new WindowMappingFn with zero maximum lookback.
WindowMappingFn(Duration) - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
Create a new WindowMappingFn with the specified maximum lookback.
windowOnlyContext(W) - Static method in class org.apache.beam.sdk.state.StateContexts
 
windows() - Static method in class org.apache.beam.sdk.transforms.Reify
Create a PTransform that will reify information from the processing context into instances of ValueInSingleWindow.
windows() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
Returns the current set of windows.
windowsInValue() - Static method in class org.apache.beam.sdk.transforms.Reify
Create a PTransform that will output all input KVs with the window pane info inside the value.
WireCoders - Class in org.apache.beam.runners.fnexecution.wire
Helpers to construct coders for gRPC port reads and writes.
with(PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.FullOuterJoin
 
with(PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.InnerJoin
 
with(PCollection<KV<K, V2>>, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.LeftOuterJoin
 
with(PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join.RightOuterJoin
 
with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
Returns a CombineFns.ComposedCombineFn that can take additional GlobalCombineFns and apply them as a single combine function.
with(SimpleFunction<DataT, InputT>, Coder, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
Like #with(SimpleFunction, CombineFn, TupleTag) but with an explicit input coder.
with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
Returns a CombineFns.ComposedCombineFnWithContext that can take additional GlobalCombineFns and apply them as a single combine function.
with(SimpleFunction<DataT, InputT>, Coder, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
Like #with(SimpleFunction, CombineFnWithContext, TupleTag) but with input coder.
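A sketch of composing two combiners over the same input values with CombineFns.compose() (the tags, the identity extraction function, and the input collection scores, a PCollection<KV<String, Integer>>, are assumptions):
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.CombineFns;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.TupleTag;

    TupleTag<Integer> sumTag = new TupleTag<Integer>() {};
    TupleTag<Integer> maxTag = new TupleTag<Integer>() {};
    SimpleFunction<Integer, Integer> identity =
        new SimpleFunction<Integer, Integer>() {
          @Override
          public Integer apply(Integer v) {
            return v;
          }
        };
    // Computes the per-key sum and maximum in a single composed combine.
    PCollection<KV<String, CombineFns.CoCombineResult>> stats =
        scores.apply(
            Combine.perKey(
                CombineFns.compose()
                    .with(identity, Sum.ofIntegers(), sumTag)
                    .with(identity, Max.ofIntegers(), maxTag)));
    // Each CoCombineResult exposes the individual outputs via their tags,
    // e.g. result.get(sumTag) and result.get(maxTag).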
with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
Returns a CombineFns.ComposedCombineFn with an additional Combine.CombineFn.
with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
with(SimpleFunction<DataT, InputT>, Coder, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
Returns a CombineFns.ComposedCombineFn with an additional Combine.CombineFn.
with(SimpleFunction<DataT, InputT>, Coder, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
with(SimpleFunction<DataT, InputT>, CombineFnBase.GlobalCombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
with(SimpleFunction<DataT, InputT>, Coder<InputT>, CombineFnBase.GlobalCombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
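For orientation, a minimal sketch of composing combine functions with CombineFns.ComposeCombineFnBuilder.with as indexed above; the Metric type, its fields, and the tag names are hypothetical placeholders, not part of the API.
    // Sketch only: Metric and its fields are hypothetical; metricsByKey is an assumed
    // PCollection<KV<String, Metric>>. Uses Sum.ofLongs() and Max.ofDoubles() from
    // org.apache.beam.sdk.transforms.
    TupleTag<Long> totalTag = new TupleTag<Long>("total");
    TupleTag<Double> maxTag = new TupleTag<Double>("max");
    CombineFns.ComposedCombineFn<Metric> composed =
        CombineFns.compose()
            .with(new SimpleFunction<Metric, Long>() {
                  @Override public Long apply(Metric m) { return m.count; }
                }, Sum.ofLongs(), totalTag)
            .with(new SimpleFunction<Metric, Double>() {
                  @Override public Double apply(Metric m) { return m.latency; }
                }, Max.ofDoubles(), maxTag);
    // Each result element exposes the per-tag outputs via CoCombineResult.get(tag).
    PCollection<KV<String, CombineFns.CoCombineResult>> out =
        metricsByKey.apply(Combine.perKey(composed));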
with(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Flatten
Returns a PTransform that flattens the input PCollection with a given PCollection resulting in a PCollection containing all the elements of both PCollections as its output.
with(PTransform<PBegin, PCollection<T>>) - Static method in class org.apache.beam.sdk.transforms.Flatten
Returns a PTransform that flattens the input PCollection with the output of another PTransform, resulting in a PCollection that contains all elements of both the input PCollection and the given PTransform's output.
withAccelerator(String) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
Declares hardware accelerators that are desired in the execution environment.
withAccuracy(double, double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
Returns a new SketchFrequencies.CountMinSketchFn combiner with new precision accuracy parameters epsilon and confidence.
withAddresses(List<String>) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
Define the AMQP addresses where to receive messages.
withAdminUrl(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withAllFields() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
withAllowableResponseErrors(Set<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Provide a set of textual error types which can be contained in Bulk API response items[].error.type field.
withAllowableResponseErrors(Set<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withAllowDuplicates(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
withAllowDuplicates() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
withAllowDuplicates(boolean) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
 
withAllowDuplicates(boolean) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeByKey
 
withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Override the amount of lateness allowed for data elements in the output PCollection and downstream PCollections until explicitly set again.
withAllowedLateness(Duration, Window.ClosingBehavior) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Override the amount of lateness allowed for data elements in the pipeline.
withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the allowed lateness set to allowedLateness.
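As a minimal illustration of the Window.withAllowedLateness entries above (window size and lateness values are arbitrary; input is an assumed PCollection<String>):
    // Requires org.apache.beam.sdk.transforms.windowing.{Window, FixedWindows} and org.joda.time.Duration.
    PCollection<String> windowed = input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
            .withAllowedLateness(Duration.standardMinutes(30)));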
withAllowedTimestampSkew(Duration) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
Deprecated.
This method permits elements to be emitted behind the watermark. These elements are considered late and, if behind the allowed lateness of a downstream PCollection, may be silently dropped. See https://github.com/apache/beam/issues/18065 for details on a replacement.
withAlreadyMerged(boolean) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
withApiKey(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch authentication is enabled, provide an API key.
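A hedged sketch of attaching an API key to an Elasticsearch connection configuration; the addresses, index name, type, and key value are placeholders, and the create(...) overload shown is assumed rather than confirmed here.
    // Sketch only: host, index, type, and key are placeholders.
    ElasticsearchIO.ConnectionConfiguration connection =
        ElasticsearchIO.ConnectionConfiguration.create(
                new String[] {"https://es.example.com:9200"}, "my-index", "_doc")
            .withApiKey("base64-encoded-api-key");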
withAppendOnly(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide an instruction to control whether the target index should be considered append-only.
withAppendOnly(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withAppProfileId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read using the specified app profile id.
withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read using the specified app profile id.
withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the cluster specified by app profile id.
withAppProfileId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write using the specified app profile id.
withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write using the specified app profile id.
withApproximateTrim(boolean) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
If RedisIO.WriteStreams.withMaxLen(long) is used, set the "~" prefix to the MAXLEN value, indicating to the server that it should use "close enough" trimming.
withArgs(Object...) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Positional arguments for the Python cross-language transform.
withArrivalTimePolicy() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
Returns an ArrivalTimeWatermarkPolicy.
withArrivalTimePolicy(Duration) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
Returns an ArrivalTimeWatermarkPolicy.
withArrivalTimePolicy() - Static method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory
Returns an ArrivalTimeWatermarkPolicy.
withArrivalTimePolicy(Duration) - Static method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory
Returns an ArrivalTimeWatermarkPolicy.
withArrivalTimeWatermarkPolicy() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies the WatermarkPolicyFactory as ArrivalTimeWatermarkPolicyFactory.
withArrivalTimeWatermarkPolicy(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies the WatermarkPolicyFactory as ArrivalTimeWatermarkPolicyFactory.
withArrivalTimeWatermarkPolicy() - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies the WatermarkPolicyFactory as ArrivalTimeWatermarkPolicyFactory.
withArrivalTimeWatermarkPolicy(Duration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies the WatermarkPolicyFactory as ArrivalTimeWatermarkPolicyFactory.
withAttemptTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read with the attempt timeout.
withAttemptTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the attempt timeout.
withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Define the password to authenticate on the Redis server.
withAuth(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
 
withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
Use the redis AUTH command when connecting to the server; the format of the string can be either just a password or a username and password separated by a space.
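For instance, a sketch of passing credentials to RedisIO, assuming the usual withEndpoint(host, port) setter; host, port, and secret are placeholders, and per the WriteStreams entry above the string may also be "username password" for Redis 6 ACLs.
    // Sketch only: records is an assumed PCollection<KV<String, String>>.
    records.apply(RedisIO.write()
        .withEndpoint("redis.example.com", 6379)
        .withAuth("s3cr3t"));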
withAuthenticator(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets authenticator for Snowflake.
withAutoLoading(boolean) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withAutoScaler(AutoScaler) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Sets the AutoScaler to use for reporting backlog during the execution of this source.
withAutoSchemaUpdate(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, enables automatically detecting BigQuery table schema updates.
withAutoSharding() - Method in class org.apache.beam.sdk.io.FileIO.Write
 
withAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, enables using a dynamically determined number of shards to write to BigQuery.
withAutoSharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
withAutoSharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
If true, enables using a dynamically determined number of shards to write.
withAutoSharding() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
If true, enables using a dynamically determined number of shards to write.
withAutoSharding() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
withAutoSharding() - Method in class org.apache.beam.sdk.io.TextIO.Write
withAutoSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
 
withAvroDataModel(GenericData) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
Define the Avro data model; see AvroParquetReader.Builder#withDataModel(GenericData).
withAvroDataModel(GenericData) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
Define the Avro data model; see AvroParquetReader.Builder#withDataModel(GenericData).
withAvroDataModel(GenericData) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
Define the Avro data model; see AvroParquetWriter.Builder#withDataModel(GenericData).
withAvroFormatFunction(SerializableFunction<AvroWriteRequest<T>, GenericRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Formats the user's type into a GenericRecord to be written to BigQuery.
withAvroSchemaFactory(SerializableFunction<TableSchema, Schema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Uses the specified function to convert a TableSchema to a Schema.
withAvroWriter(SerializableFunction<Schema, DatumWriter<T>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes the user's type as avro using the supplied DatumWriter.
withAvroWriter(SerializableFunction<AvroWriteRequest<T>, AvroT>, SerializableFunction<Schema, DatumWriter<AvroT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Converts the user's type to an Avro record using the supplied avroFormatFunction.
withAwsClientsProvider(AwsClientsProvider) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
withAwsClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
withAwsClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
withAwsClientsProvider(AwsClientsProvider) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
 
withAwsClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
 
withAwsClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
 
withAWSClientsProvider(AwsClientsProvider) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Allows specifying a custom AwsClientsProvider.
withAWSClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Specify credential details and region to be used to write to SNS.
withAWSClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Specify credential details and region to be used to write to SNS.
withAWSClientsProvider(AWSClientsProvider) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Allows specifying a custom AWSClientsProvider.
withAWSClientsProvider(AWSCredentialsProvider, Regions) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify AWSCredentialsProvider and region to be used to read from Kinesis.
withAWSClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify credential details and region to be used to read from Kinesis.
withAWSClientsProvider(AWSCredentialsProvider, Regions, String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify AWSCredentialsProvider and region to be used to read from Kinesis.
withAWSClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify credential details and region to be used to read from Kinesis.
withAWSClientsProvider(AWSCredentialsProvider, Regions, String, boolean) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify AWSCredentialsProvider and region to be used to read from Kinesis.
withAWSClientsProvider(String, String, Regions, String, boolean) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify credential details and region to be used to read from Kinesis.
withAWSClientsProvider(AWSClientsProvider) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Allows specifying a custom AWSClientsProvider.
withAWSClientsProvider(AWSCredentialsProvider, Regions) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify AWSCredentialsProvider and region to be used to write to Kinesis.
withAWSClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify credential details and region to be used to write to Kinesis.
withAWSClientsProvider(AWSCredentialsProvider, Regions, String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify AWSCredentialsProvider and region to be used to write to Kinesis.
withAWSClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify credential details and region to be used to write to Kinesis.
withAWSClientsProvider(AWSCredentialsProvider, Regions, String, boolean) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify credential details and region to be used to write to Kinesis.
withAWSClientsProvider(String, String, Regions, String, boolean) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify credential details and region to be used to write to Kinesis.
withBackendVersion(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Use to set explicitly which version of Elasticsearch the destination cluster is running.
withBackendVersion(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withBacklogReplicationAdjustment(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that overrides the replication delay adjustment duration with the provided duration.
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
See FileIO.Write#withBadRecordErrorHandler(ErrorHandler) for details on usage.
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Configures a new FileIO.Write with an ErrorHandler.
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Configure a ErrorHandler.BadRecordErrorHandler for sending records to if they fail to serialize when being sent to Kafka.
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.WriteFiles
withBaseFilename(ResourceId) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
Sets the base filename.
withBaseFilename(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
withBasicCredentials(String, String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
If Solr basic authentication is enabled, provide the username and password.
withBatchCount(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
withBatchCount(Integer) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Sets batchCount for sending multiple events in a single request to the HEC.
withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
If true, the read uses the Cloud Spanner batch API.
withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
By default the PartitionQuery API is used to read data from Cloud Spanner.
withBatchInitialCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
withBatchMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Returns a new TextIO.TypedWrite that will batch the input records using specified max buffering duration.
withBatchMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.TextIO.Write
withBatchMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will batch the input records using specified max buffering duration.
withBatchMaxBytes(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Max.
withBatchMaxBytes(long) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the maximum number of bytes to include in a batch.
withBatchMaxCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the maximum number of writes to include in a batch.
withBatchMaxRecords(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Max.
withBatchSize(Integer) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
withBatchSize(Integer) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
withBatchSize(int) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
The batch size to use, default (and AWS limit) is 10.
withBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide a size for the scroll read.
withBatchSize(long) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
Sets batch size for the write operation.
withBatchSize(int) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
Number of elements to batch.
withBatchSize(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
withBatchSize(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
Provide a maximum batch size as a number of SQL statements.
withBatchSize(int) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Reads from the table in batches of the specified size.
withBatchSize(long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Define the size of the batch to group write operations.
withBatchSize(long) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withBatchSize(ValueProvider<Long>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withBatchSize(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
withBatchSize(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
 
withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
Provide a maximum number of rows that is written by one SQL statement.
withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Returns a new TextIO.TypedWrite that will batch the input records using specified batch size.
withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.TextIO.Write
withBatchSize(Integer) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will batch the input records using specified batch size.
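To illustrate the JdbcIO.Write.withBatchSize entry listed above, a minimal sketch; the driver, JDBC URL, table, and statement are placeholders.
    // Sketch only: rows is an assumed PCollection<KV<Integer, String>>.
    rows.apply(JdbcIO.<KV<Integer, String>>write()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
            "org.postgresql.Driver", "jdbc:postgresql://db.example.com/mydb"))
        .withStatement("INSERT INTO users (id, name) VALUES (?, ?)")
        .withPreparedStatementSetter((element, statement) -> {
            statement.setInt(1, element.getKey());
            statement.setString(2, element.getValue());
        })
        .withBatchSize(500L));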
withBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the batch size limit (max number of bytes mutated per batch).
withBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Returns a new TextIO.TypedWrite that will batch the input records using specified batch size in bytes.
withBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.io.TextIO.Write
withBatchSizeBytes(Integer) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will batch the input records using specified batch size in bytes.
withBatchTargetLatency(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Target latency for batch requests.
withBatchTimeout(Duration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
The duration to accumulate records before timing out, default is 3 secs.
withBatchTimeout(Duration, boolean) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
The duration to accumulate records before timing out, default is 3 secs.
withBeamRowConverters(TypeDescriptor<T>, BigQueryIO.TypedRead.ToBeamRowFunction<T>, BigQueryIO.TypedRead.FromBeamRowFunction<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Sets the functions to convert elements to/from Row objects.
withBeamRowConverters(TypeDescriptor<Struct>, SpannerIO.Read.ToBeamRowFunction, SpannerIO.Read.FromBeamRowFunction) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
If set to true, a Beam schema will be inferred from the AVRO schema.
withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
If set to true, a Beam schema will be inferred from the AVRO schema.
withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
If set to true, a Beam schema will be inferred from the AVRO schema.
withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
withBeamSchemas(boolean) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
 
withBearerToken(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch authentication is enabled, provide a bearer token.
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Deprecated.
please set the options directly in BigtableIO.
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Deprecated.
please set the options directly in BigtableIO.
withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
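The replacement pattern named in the deprecation notes above, spelled out as a sketch; project, instance, and table IDs are placeholders.
    BigtableIO.Read read = BigtableIO.read()
        .withProjectId("my-project")
        .withInstanceId("my-instance")
        .withTableId("my-table");
    // Credentials are taken from PipelineOptions rather than BigtableOptions.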
withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets the bootstrap servers for the Kafka consumer.
withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Sets the bootstrap servers to use for the Kafka consumer if unspecified via KafkaSourceDescriptor#getBootStrapServers().
withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withBootstrapServers(String), used to keep compatibility with the old API based on the KV type of element.
withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Returns a new KafkaIO.Write transform with Kafka producer pointing to bootstrapServers.
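A minimal read-side sketch for the withBootstrapServers entries above; broker addresses and topic name are placeholders.
    // Requires org.apache.kafka.common.serialization.StringDeserializer; p is an assumed Pipeline.
    p.apply(KafkaIO.<String, String>read()
        .withBootstrapServers("broker1:9092,broker2:9092")
        .withTopic("events")
        .withKeyDeserializer(StringDeserializer.class)
        .withValueDeserializer(StringDeserializer.class));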
withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withBucketAuto(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Sets whether to use $bucketAuto or not.
withBulkDirective(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
Sets the bulk directive representation of an input document.
withByteSize(long) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
withByteSize(long, SerializableFunction<InputT, Long>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
withCache(Cache.Pair<RequestT, ResponseT>) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
Configures RequestResponseIO for reading and writing RequestT and ResponseT pairs using a cache.
withCallShouldBackoff(CallShouldBackoff<ResponseT>) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
Overrides the private no-op implementation of CallShouldBackoff that determines whether the DoFn should hold RequestTs.
withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
 
withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
withCatalogName(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
withCdapPlugin(Plugin<K, V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Sets a CDAP Plugin.
withCdapPlugin(Plugin<K, V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
Sets a CDAP Plugin.
withCdapPluginClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Sets a CDAP Plugin class.
withCdapPluginClass(Class<?>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
Sets a CDAP Plugin class.
withChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that uses changeStreamName as prefix for the metadata table.
withChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the change stream name.
withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets the XML file charset.
withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Sets the charset used to write the file.
withCheckStopReadingFn(CheckStopReadingFn) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A custom CheckStopReadingFn that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
withCheckStopReadingFn(SerializableFunction<TopicPartition, Boolean>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A custom SerializableFunction that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
withCheckStopReadingFn(CheckStopReadingFn) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
A custom CheckStopReadingFn that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
withCheckStopReadingFn(SerializableFunction<TopicPartition, Boolean>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
A custom SerializableFunction that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
withChunkSize(Long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
Configuration of DynamoDB client.
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
Configuration of DynamoDB client.
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Configuration of Kinesis & Cloudwatch clients.
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Configuration of Kinesis client.
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
Configuration of SNS client.
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
Configuration of SQS client.
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Write
Deprecated.
Configuration of SQS client.
withClientConfiguration(ClientConfiguration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
Configuration of SQS client.
withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
The default client used to access Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
The default client to write to Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
withClientId(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
Set up the client ID prefix, which is used to construct a unique client ID.
withClientUrl(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withClientUrl(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
 
withCloseTimeout(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Sets the amount of time to wait for callbacks from the runner stating that the output has been durably persisted before closing the connection to the JMS broker.
withClosingBehavior(Window.ClosingBehavior) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
withClustering(Clustering) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies the clustering fields to use when writing to a single output table.
withClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
withCodec(CodecFactory) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
Specifies to use the given CodecFactory for each generated file.
withCodec(CodecFactory) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Writes to Avro file(s) compressed using specified codec.
withCodec(CodecFactory) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withCoder(Coder<T>) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
Applies a Coder to the connector.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
Sets a coder for the result of the parse function.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
Specifies the coder for the result of the parseFn.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
Specifies the coder for the result of the parseFn.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
Sets a coder for the result of the read function.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
Sets a coder for the result of the read function.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Specifies the coder for the result of the AvroSource.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
withCoder(Coder<PublishResult>) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Encode the PublishResult with the given coder.
withCoder(Coder<Message>) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
Deprecated.
Optionally set a custom Message output coder if you need to access further (message) attributes.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
 
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
Specify the Coder used to serialize the document in the PCollection.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Coder used to serialize the entity in the PCollection.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.ReadAll
Specify the Coder used to serialize the entity in the PCollection.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Sets a Coder for the result of the parse function.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
Deprecated.
JdbcIO is able to infer appropriate coders from other parameters.
withCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
Deprecated.
JdbcIO is able to infer appropriate coders from other parameters.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
Deprecated.
JdbcIO is able to infer appropriate coders from other parameters.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Sets a Coder for the result of the parse function.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
Specify the output coder to use for output of the ParseFn.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
Specify the output coder to use for output of the ParseFn.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
A Coder to be used by the output PCollection generated by the source.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
Returns a Create.TimestampedValues PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
Returns a Create.Values PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
Returns a Create.WindowedValues PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
withCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Causes the source to return a PubsubMessage that includes Pubsub attributes, and uses the given parsing function to transform the PubsubMessage into an output type.
withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Sets the collection to consider in the database.
withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Sets the collection where to write data in the database.
withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the commit deadline.
withCommitDeadline(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the commit deadline.
withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the deadline for the Commit API call.
withCommitRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the commit retry settings.
withCompression(double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
Sets the compression factor cf.
withCompression(double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
Sets the compression factor cf.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.CompressedSource
withCompression(Compression) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
Reads from input sources using the specified compression type.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Specifies the Compression of all generated shard files.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
Reads files using the given Compression.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies to compress all generated shard files using the given Compression and, by default, append the respective extension to the filename.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Specifies the Compression of all generated shard files.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.Read
Reads from input sources using the specified compression type.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
Reads from input sources using the specified compression type.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Returns a transform for writing to text files like this one but that compresses output using the given Compression.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.Write
withCompression(Compression) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Returns a transform for reading TFRecord files that decompresses all input files using the specified compression type.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes to output files using the specified compression type.
withCompression(Compression) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Decompresses all input files using the specified compression type.
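For example, a sketch of writing gzip-compressed text with TextIO and the Compression enum indexed above; the output path is a placeholder.
    // Sketch only: lines is an assumed PCollection<String>.
    lines.apply(TextIO.write()
        .to("gs://my-bucket/output/part")
        .withSuffix(".txt.gz")
        .withCompression(Compression.GZIP));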
withCompressionCodec(CompressionCodecName) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
Specifies compression codec.
withCompressionEnabled(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
Configure whether the REST client should compress requests using gzip content encoding and add the "Accept-Encoding: gzip" header.
withCompressionType(TextIO.CompressionType) - Method in class org.apache.beam.sdk.io.TextIO.Read
withCompressionType(TextIO.CompressionType) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
withCompressionType(XmlIO.Read.CompressionType) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
withConcurrentRequests(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Max number of concurrent batch write requests per bundle.
withConcurrentRequests(int) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
Max number of concurrent batch write requests per bundle, default is 5.
withConfidence(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
Sets the confidence value, i.e.
withConfidence(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
Sets the confidence value, i.e.
withConfig(PluginConfig) - Method in class org.apache.beam.sdk.io.cdap.Plugin
Sets a plugin config.
withConfig(Config) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withConfig(Map<String, Object>) - Method in class org.apache.beam.sdk.managed.Managed.ManagedTransform
Use the input Map of configuration arguments to build and instantiate the underlying transform.
withConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
Sets the configuration properties like metastore URI.
withConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
Sets the configuration properties like metastore URI.
withConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.FileIO.Match
withConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Reads from the source using the options provided by the given configuration.
withConfiguration(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.WriteBuilder
Writes to the sink using the options provided by the given hadoop configuration.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Reads from the HBase instance indicated by the given configuration.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
Writes to the HBase instance indicated by the given Configuration.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
Writes to the HBase instance indicated by the given Configuration.
withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
Specify Hadoop configuration for ParquetReader.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
Specify Hadoop configuration for ParquetReader.
withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
Specify Hadoop configuration for ParquetReader.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ParseFiles
Specify Hadoop configuration for ParquetReader.
withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
Specify Hadoop configuration for ParquetReader.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
Specify Hadoop configuration for ParquetReader.
withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
Specify Hadoop configuration for ParquetReader.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
Specify Hadoop configuration for ParquetReader.
withConfiguration(Map<String, String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
Specify Hadoop configuration for ParquetWriter.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
Specify Hadoop configuration for ParquetWriter.
withConfigurationTransform(PTransform<PCollection<? extends KV<KeyT, ValueT>>, PCollectionView<Configuration>>) - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.WriteBuilder
Writes to the sink using configuration created by provided configurationTransformation.
withConfigUrl(String) - Method in class org.apache.beam.sdk.managed.Managed.ManagedTransform
Like Managed.ManagedTransform.withConfig(Map), but instead extracts the configuration arguments from a specified YAML file location.
withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Provide the Elasticsearch connection configuration object.
withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide the Elasticsearch connection configuration object.
withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide the Elasticsearch connection configuration object.
withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
Define the MQTT connection configuration used to connect to the MQTT broker.
withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
Define MQTT connection configuration used to connect to the MQTT broker.
withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
 
withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
Predefine a RedisConnectionConfiguration and pass it to the builder.
withConnectionConfiguration(SolrIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
Provide the Solr connection configuration object.
withConnectionConfiguration(SolrIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
Provide the Solr connection configuration object.
withConnectionFactory(ConnectionFactory) - Method in interface org.apache.beam.sdk.io.jms.JmsIO.ConnectionFactoryContainer
 
withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Specify the JMS connection factory to connect to the JMS broker.
withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS connection factory to connect to the JMS broker.
withConnectionFactoryProviderFn(SerializableFunction<Void, ? extends ConnectionFactory>) - Method in interface org.apache.beam.sdk.io.jms.JmsIO.ConnectionFactoryContainer
 
withConnectionFactoryProviderFn(SerializableFunction<Void, ? extends ConnectionFactory>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Specify a JMS connection factory provider function to connect to the JMS broker.
withConnectionFactoryProviderFn(SerializableFunction<Void, ? extends ConnectionFactory>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify a JMS connection factory provider function to connect to the JMS broker.
withConnectionInitSqls(Collection<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Sets the connection init SQL statements used by driver.connect(...).
withConnectionInitSqls(ValueProvider<Collection<String>>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
withConnectionProperties(Map<String, String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets a custom property to be used within the connection to your database.
withConnectionProperties(ValueProvider<Map<String, String>>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets a custom property to be used within the connection to your database.
withConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Sets the connection properties passed to driver.connect(...).
withConnectionProperties(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
withConnectionProperties(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
Sets the connection properties passed to driver.connect(...).
withConnectionProperty(String, String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets a custom property to be used within the connection to your database.
withConnectorClass(Class<?>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Applies the connectorClass to be used to connect to your database.
withConnectorClass(ValueProvider<Class<?>>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the connectorClass to be used to connect to your database.
withConnectorConfiguration(DebeziumIO.ConnectorConfiguration) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
Applies the given configuration to the connector.
withConnectTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra client connect timeout in ms.
withConnectTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra client connect timeout in ms.
withConnectTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Cassandra client socket option for connect timeout in ms.
withConnectTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Cassandra client socket option for connect timeout in ms.
withConnectTimeout(Integer) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If set, overwrites the default connect timeout (1000ms) in the RequestConfig of the Elastic RestClient.
withConsistencyLevel(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the consistency level for the request (e.g.
withConsistencyLevel(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the consistency level for the request (e.g.
withConsistencyLevel(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the consistency level for the request (e.g.
withConsistencyLevel(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the consistency level for the request (e.g.
withConsistencyLevel(InfluxDB.ConsistencyLevel) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
Sets the consistency level to use.
withConstructorArgs(Object...) - Method in class org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder
Method for specifying constructor arguments for corresponding ReceiverBuilder.sparkReceiverClass.
withConsumerArn(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specify consumer ARN to enable Enhanced Fan-Out.
withConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Replaces the configuration for the main consumer.
withConsumerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Update configuration for the backend main consumer.
withConsumerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Updates configuration for the main consumer.
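A sketch of overriding consumer properties via withConsumerConfigUpdates; the property values, topic, and broker address are placeholders.
    // Requires java.util.{Map, HashMap} and org.apache.kafka.common.serialization.StringDeserializer.
    Map<String, Object> consumerUpdates = new HashMap<>();
    consumerUpdates.put("group.id", "my-consumer-group");
    consumerUpdates.put("auto.offset.reset", "earliest");
    KafkaIO.<String, String>read()
        .withBootstrapServers("broker1:9092")
        .withTopic("events")
        .withKeyDeserializer(StringDeserializer.class)
        .withValueDeserializer(StringDeserializer.class)
        .withConsumerConfigUpdates(consumerUpdates);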
withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A factory to create Kafka Consumer from consumer configuration.
withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
A factory to create Kafka Consumer from consumer configuration.
withConsumerFactoryFn(SerializableFunction<Map<String, Object>, ? extends Consumer<?, ?>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withConsumerFactoryFn(SerializableFunction), used to keep compatibility with the old API based on the KV type of element.
withConsumerFactoryFn(SerializableFunction<Map<String, Object>, ? extends Consumer<?, ?>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
When exactly-once semantics are enabled (see KafkaIO.WriteRecords.withEOS(int, String)), the sink needs to fetch previously stored state associated with the Kafka topic.
withConsumerPollingTimeout(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets the timeout time in seconds for Kafka consumer polling request in the ReadFromKafkaDoFn.
withConsumerPollingTimeout(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Sets the timeout time in seconds for Kafka consumer polling request in the ReadFromKafkaDoFn.
withContainer(String) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
Specify the Cosmos container to read from.
withContentTypeHint(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
Sets a content type hint to make the file parser detection more efficient.
withCPUCount(int) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
Sets desired minimal CPU or vCPU count to have in transform's execution environment.
withCreateDisposition(BigQueryIO.Write.CreateDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies whether the table should be created if it does not exist.
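For example, a hedged sketch of BigQueryIO.Write.withCreateDisposition (table name and schema are placeholders; rows is an assumed PCollection<TableRow>):

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")   // placeholder table
            .withSchema(tableSchema)                 // assumed TableSchema
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));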
withCreateDisposition(CreateDisposition) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
A disposition to be used during table preparation.
withCreateOrUpdateMetadataTable(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that, if set to true, will create or update the metadata table before launching the pipeline.
withCreateTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets the timestamps policy based on KafkaTimestampType.CREATE_TIME timestamp of the records.
withCreateTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Use the creation time of KafkaRecord as the output timestamp.
withCreateTime(Duration) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
withCreatWatermarkEstimatorFn(SerializableFunction<Instant, WatermarkEstimator<Instant>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
A function to create a WatermarkEstimator.
withCredentials(Credentials) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the credentials.
withCredentials(ValueProvider<Credentials>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the credentials.
withCsvMapper(SnowflakeIO.CsvMapper<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
User-defined function mapping CSV lines into user data.
withCustomBeamRequirement(String) - Method in class org.apache.beam.sdk.extensions.python.PythonService
Override the Beam version to be installed in the service environment.
withCustomGcsTempLocation(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Provides a custom location on GCS for storing temporary files to be loaded via BigQuery batch load jobs.
withCustomRateLimitPolicy(RateLimitPolicyFactory) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies the RateLimitPolicyFactory for a custom rate limiter.
withCustomRateLimitPolicy(RateLimitPolicyFactory) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies the RateLimitPolicyFactory for a custom rate limiter.
withCustomRecordParsing(String, SerializableFunction<String, OutputT>) - Method in class org.apache.beam.sdk.io.csv.CsvIOParse
Configures custom cell parsing.
withCustomWatermarkPolicy(WatermarkPolicyFactory) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies the WatermarkPolicyFactory as a custom watermarkPolicyFactory.
withCustomWatermarkPolicy(WatermarkParameters) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
Returns a custom WatermarkPolicyFactory.
withCustomWatermarkPolicy(WatermarkPolicyFactory) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies the WatermarkPolicyFactory as a custom watermarkPolicyFactory.
withCustomWatermarkPolicy(WatermarkParameters) - Static method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory
Returns a custom WatermarkPolicyFactory.
withCypher(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withCypher(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withCypher(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withCypher(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withCypherLogging() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withCypherLogging() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withDatabase(String) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
Specify the Cosmos database to read from.
withDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
Sets the database name.
withDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
Sets the database name.
withDatabase(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Reads from the specified database.
withDatabase(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
Writes to the specified database.
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Sets the database to use.
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Sets the database to use.
withDatabase(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
 
withDatabase(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets database to use.
withDatabase(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore for the specified database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the specified database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the Cloud Datastore for the specified database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore for the database id.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner database ID.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner database ID.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner database.
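For example, a minimal sketch combining the project, instance, and database settings on SpannerIO.Write (the IDs are placeholders; mutations is an assumed PCollection<Mutation>):

    mutations.apply(
        SpannerIO.write()
            .withProjectId("my-project")      // placeholder project
            .withInstanceId("my-instance")    // placeholder instance
            .withDatabaseId("my-database"));  // placeholder database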
withDatabaseRole(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner database role.
withDataBoostEnabled(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies whether the pipeline should run on an independent compute resource (Spanner Data Boost).
withDatasetService(FakeDatasetService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
withDataSourceConfiguration(InfluxDbIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Reads from the InfluxDB instance indicated by the given configuration.
withDataSourceConfiguration(InfluxDbIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
Writes to the InfluxDB instance indicated by the given configuration.
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
See WriteVoid#withDataSourceConfiguration(DataSourceConfiguration).
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
withDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
withDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
 
withDataSourceConfiguration(SingleStoreIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
 
withDataSourceConfiguration(SnowflakeIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
Sets connection information for the Snowflake server.
withDataSourceConfiguration(SnowflakeIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Sets connection information for the Snowflake server.
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
Sets a function that provides the SnowflakeIO.DataSourceConfiguration at runtime.
withDataSourceProviderFn(SerializableFunction<Void, DataSource>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Sets a function that provides the SnowflakeIO.DataSourceConfiguration at runtime.
withDatumReaderFactory(AvroSource.DatumReaderFactory<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
Sets a custom AvroSource.DatumReaderFactory for reading.
withDatumReaderFactory(AvroSource.DatumReaderFactory<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
Sets a custom AvroSource.DatumReaderFactory for reading.
withDatumReaderFactory(AvroSource.DatumReaderFactory<?>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Sets a custom AvroSource.DatumReaderFactory for reading.
withDatumWriterFactory(AvroSink.DatumWriterFactory<ElementT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
Sets a custom AvroSink.DatumWriterFactory for writing.
withDatumWriterFactory(AvroSink.DatumWriterFactory<OutputT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Specifies a AvroSink.DatumWriterFactory to use for creating DatumWriter instances.
withDatumWriterFactory(AvroSink.DatumWriterFactory<T>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withDdlString(String) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withDeadLetterQueue() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
 
withDeadLetterTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Creates and returns a transform for writing read failures out to a dead-letter topic.
withDeadLetterTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
withDebugMode(StreamingLogLevel) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Sets whether to log verbose info (or only errors) about loaded files while streaming.
withDecompression(CompressedSource.DecompressingChannelFactory) - Method in class org.apache.beam.sdk.io.CompressedSource
Return a CompressedSource that is like this one but will decompress its underlying file with the given CompressedSource.DecompressingChannelFactory.
withDeduplicateKeys(List<String>) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
 
withDeduplicateKeys(List<String>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
 
withDeduplicateRecords(boolean) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
Optional, default: false.
WithDefault() - Constructor for class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
withDefaultConfig(boolean) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withDefaultConfig(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withDefaultHeaders(Header[]) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
For authentication or custom requirements, provide a set of default headers for the client.
withDefaultMissingValueInterpretation(AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specify how missing values should be interpreted when there is a default value in the schema.
withDefaultRateLimiter() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
 
withDefaultRateLimiter(Duration, Duration, Duration) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
 
withDefaultRateLimiter() - Static method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
 
withDefaultRateLimiter(Duration, Duration, Duration) - Static method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
 
withDefaultTableProvider(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withDefaultThreadPool() - Static method in class org.apache.beam.runners.dataflow.util.PackageUtil
 
withDefaultValue(T) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
Default value to return for windows with no value in them.
withDelay(Supplier<Duration>) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
 
withDelay(Supplier<Duration>) - Static method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
 
withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
Set the custom delimiter to be used in place of the default ones ('\r', '\n' or '\r\n').
withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.ReadFiles
withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.TextIO.Read
Set the custom delimiter to be used in place of the default ones ('\r', '\n' or '\r\n').
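For example, a sketch reading records separated by a custom delimiter instead of newlines (the file pattern is a placeholder; assumes an existing Pipeline p):

    PCollection<String> records = p.apply(
        TextIO.read()
            .from("gs://my-bucket/records-*.txt")   // placeholder pattern
            .withDelimiter(new byte[] {'|'}));       // split on '|' instead of '\n'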
withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
Like Read#withDelimiter.
withDelimiter(char[]) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Specifies the delimiter after each string written.
withDelimiter(char[]) - Method in class org.apache.beam.sdk.io.TextIO.Write
withDeliveryMode(DeliveryMode) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
Set the delivery mode.
withDescription(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a copy of the Field with the description set.
withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
Set a value for the bundle size for parallel reads.
withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
Set a value for the bundle size for parallel reads.
withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
Set a value for the bundle size for parallel reads.
withDesiredBundleSizeBytes(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
Set a value for the bundle size for parallel reads.
withDestinationCoder(Coder<DestinationT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a Coder for the destination type, if it cannot be inferred from FileIO.Write.by(org.apache.beam.sdk.transforms.SerializableFunction<UserT, DestinationT>).
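For example, a hedged sketch of dynamic destinations where the destination coder cannot be inferred (Event, getCountry, and toCsvLine are hypothetical; the bucket path is a placeholder):

    events.apply(
        FileIO.<String, Event>writeDynamic()
            .by(event -> event.getCountry())              // hypothetical destination function
            .withDestinationCoder(StringUtf8Coder.of())   // coder for the String destination
            .via(Contextful.fn(event -> event.toCsvLine()), TextIO.sink())
            .to("gs://my-bucket/output")                  // placeholder location
            .withNaming(country -> FileIO.Write.defaultNaming("events-" + country, ".csv")));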
withDeterministicRecordIdFn(SerializableFunction<T, String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withDeveloperToken(String) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
Creates and returns a new GoogleAdsV17.Read transform with the specified developer token.
withDeveloperToken(String) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.ReadAll
Creates and returns a new GoogleAdsV17.ReadAll transform with the specified developer token.
withDialectView(PCollectionView<Dialect>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withDirectExecutor() - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
Returns a ManagedChannelFactory like this one, but will construct the channel to use the direct executor.
withDirectoryTreatment(FileIO.ReadMatches.DirectoryTreatment) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
Controls how to handle directories in the input PCollection.
withDirectWriteProtos(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
Whether to disable auto commit on read.
withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
Whether to disable auto commit on read.
withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
Whether to disable auto commit on read.
withDisableAutoCommit(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
Whether to disable auto commit on read.
withDisableCertificateValidation(boolean) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Disables SSL certificate validation.
withDisableCertificateValidation(boolean) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
Disables SSL certificate validation.
withDisableCertificateValidation(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
withDisableCertificateValidation(Boolean) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Disables SSL certificate validation.
withDocVersionFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide a function to extract the doc version from the document.
withDocVersionFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withDocVersionType(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Sets the document version type to use when indexing.
withDocVersionType(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withDriverClassLoader(ClassLoader) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Sets the class loader instance to be used to load the JDBC driver.
withDriverConfiguration(Neo4jIO.DriverConfiguration) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withDriverConfiguration(Neo4jIO.DriverConfiguration) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withDriverJars(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Comma-separated paths for JDBC drivers.
withDriverJars(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Same as JdbcIO.DataSourceConfiguration.withDriverJars(String) but accepting a ValueProvider.
withDuration(Duration) - Method in class org.apache.beam.sdk.transforms.Deduplicate.KeyedValues
Returns a KeyedValues PTransform like this one but with the specified duration.
withDuration(Duration) - Method in class org.apache.beam.sdk.transforms.Deduplicate.Values
Returns a Values PTransform like this one but with the specified duration.
withDuration(Duration) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
Return a WithRepresentativeValues PTransform that is like this one, but with the specified deduplication duration.
withDynamicDelayRateLimitPolicy(Supplier<Duration>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies a dynamic delay rate limit policy with the given function being called at each polling interval to get the next delay value.
withDynamicDelayRateLimitPolicy(Supplier<Duration>) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies a dynamic delay rate limit policy with the given function being called at each polling interval to get the next delay value.
withDynamicRead(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Configures the KafkaIO to use WatchForKafkaTopicPartitions to detect and emit any newly available TopicPartition for ReadFromKafkaDoFn to consume during pipeline execution.
withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
Creates a new Trigger like this one, except that it fires repeatedly whenever the given Trigger fires before the watermark has passed the end of the window.
withElementTimestamp() - Static method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
Returns a KafkaPublishTimestampFunction that returns the element timestamp from the ProcessContext.
withEmptyGlobalWindowDestination(DestinationT) - Method in class org.apache.beam.sdk.io.FileIO.Write
If FileIO.Write.withIgnoreWindowing() is specified, specifies a destination to be used in case the collection is empty, to generate the (only, empty) output file.
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
Configures whether or not a filepattern matching no files is allowed.
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
 
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.Match
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextIO.Read
withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
Same as Read#withEmptyMatchTreatment.
withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will use an official Bigtable emulator.
withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will use an official Bigtable emulator.
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner host, when an emulator is used.
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withEnableBatchLogs(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Method to enable batch logs.
withEnableBatchLogs(Boolean) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Same as Builder#withEnableBatchLogs(ValueProvider) but without a ValueProvider.
withEnableGzipHttpCompression(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Method to specify if HTTP requests sent to Splunk should be GZIP encoded.
withEnableGzipHttpCompression(Boolean) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Same as Builder#withEnableGzipHttpCompression(ValueProvider) but without a ValueProvider.
withEndKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns new ByteKeyRange like this one, but with the specified end key.
withEndMessageId(MessageId) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
 
withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
Set the hostname and port of the Redis server to connect to.
withEndTimestamp(Long) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withEntity(Class<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the entity class (annotated POJO).
withEntity(Class<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the entity class in the input PCollection.
withEntries - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
 
withEntryMapper(SqsIO.WriteBatches.EntryMapperFn<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
Optional mapper to create a batch entry from a unique entry id and the input T, otherwise inferred from the schema.
withEntryMapper(SqsIO.WriteBatches.EntryMapperFn.Builder<T>) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
Optional mapper to create a batch entry from the input T using a builder, otherwise inferred from the schema.
withEnvironmentId(String) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
withEOS(int, String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withEOS(int, String), used to keep the compatibility with old API based on KV type of element.
withEOS(int, String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Provides exactly-once semantics while writing to Kafka, which enables applications with end-to-end exactly-once guarantees on top of exactly-once semantics within Beam pipelines.
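For example, a sketch of an exactly-once Kafka sink (broker, topic, and sink group id are placeholders; kvs is an assumed PCollection<KV<String, String>>):

    kvs.apply(
        KafkaIO.<String, String>write()
            .withBootstrapServers("broker-1:9092")   // placeholder broker
            .withTopic("results")                    // placeholder topic
            .withKeySerializer(StringSerializer.class)
            .withValueSerializer(StringSerializer.class)
            .withEOS(1, "my-eos-sink-group"));       // numShards, sinkGroupId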
withEpsilon(double) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Returns an ApproximateQuantilesCombineFn that's like this one except that it uses the specified epsilon value.
withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Configures the PubSub read with an alternate error handler.
withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes any serialization failures out to the Error Handler.
withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
An optional error handler for handling records that failed to publish to Solace.
withErrorsTransformer(PTransform<PCollection<Row>, POutput>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
 
withErrorsTransformer(PTransform<PCollection<Row>, ? extends POutput>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withEvent(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Assigns the event payload to be sent to the HEC endpoint.
withEventStore(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
withEventStore(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
withEventStore(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
withEverythingCounted() - Method in class org.apache.beam.io.requestresponse.Monitoring
Turns on all monitoring.
withEverythingCountedExceptedCaching() - Method in class org.apache.beam.io.requestresponse.Monitoring
Turns on all monitoring except for cache related metrics.
withExceptionReporting(Schema) - Static method in class org.apache.beam.sdk.transforms.JsonToRow
Enable Exception Reporting support.
withExchange(String, String, String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
In AMQP, messages are published to an exchange and routed to queues based on the exchange type and a queue binding.
withExchange(String, String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
In AMQP, messages are published to an exchange and routed to queues based on the exchange type and a queue binding.
withExchange(String, String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
Defines the to-be-declared exchange where the messages will be sent.
withExchange(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
Defines the existing exchange where the messages will be sent.
withExecuteStreamingSqlRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the ExecuteStreamingSql retry settings.
withExecutorService(ExecutorService) - Static method in class org.apache.beam.runners.dataflow.util.PackageUtil
 
withExistingPipelineOptions(BigtableIO.ExistingPipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that decides what to do if an existing pipeline exists with the same change stream name.
withExpansionService(String) - Method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
Sets an expansion service endpoint for DataframeTransform.
withExpansionService(String) - Method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
 
withExpansionService(String) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
Sets an expansion service endpoint for RunInference.
withExpireTime(Long) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
withExtendedErrorInfo() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Enables extended error information by enabling WriteResult.getFailedInsertsWithErr()
withExtendedErrorInfo(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
Specify whether to use extended error info or not.
withExtendedErrorInfo() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
Adds the error message to the returned error Row.
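For example, a sketch combining withExceptionReporting and withExtendedErrorInfo (personSchema and jsonStrings are assumed to already exist):

    JsonToRow.ParseResult result = jsonStrings.apply(
        JsonToRow.withExceptionReporting(personSchema)   // assumed Schema
            .withExtendedErrorInfo());
    PCollection<Row> rows = result.getResults();
    PCollection<Row> failures = result.getFailedToParseLines();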
withExtensionsFrom(Iterable<Class<?>>) - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
Returns a DynamicProtoCoder like this one, but with the extensions from the given classes registered.
withExtensionsFrom(Iterable<Class<?>>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a ProtoCoder like this one, but with the extensions from the given classes registered.
withExtensionsFrom(Class<?>...) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
withExternalSorterType(ExternalSorter.Options.SorterType) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Sets the external sorter type.
withExternalSynchronization(ExternalSynchronization) - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.ExternalSynchronizationBuilder
Specifies class which will provide external synchronization required for hadoop write operation.
withExtractOutputTimestampFn(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
A function to calculate output timestamp for a given KafkaRecord.
withExtractOutputTimestampFn(SerializableFunction<Message<byte[]>, Instant>) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Specifies that the given Python packages are required for this transform, which will cause them to be installed in both the construction-time and execution-time environments.
withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.PythonService
Specifies that the given Python packages should be installed for this service environment.
withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.transforms.PythonMap
Specifies any extra packages required by the Python function.
withExtraPackages(List<String>) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
Specifies any extra packages required by the RunInference model handler.
withFailedInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies a policy for handling failed inserts.
withFailureMode(SpannerIO.FailureMode) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies failure mode.
WithFailures - Class in org.apache.beam.sdk.transforms
A collection of utilities for writing transforms that can handle exceptions raised during processing of elements.
WithFailures() - Constructor for class org.apache.beam.sdk.transforms.WithFailures
 
WithFailures.ExceptionAsMapHandler<T> - Class in org.apache.beam.sdk.transforms
A simple handler that extracts information from an exception to a Map<String, String> and returns a KV where the key is the input element that failed processing, and the value is the map of exception attributes.
WithFailures.ExceptionElement<T> - Class in org.apache.beam.sdk.transforms
The value type passed as input to exception handlers.
WithFailures.Result<OutputT extends POutput,FailureElementT> - Class in org.apache.beam.sdk.transforms
An intermediate output type for PTransforms that allows an output collection to live alongside a collection of elements that failed the transform.
WithFailures.ThrowableHandler<T> - Class in org.apache.beam.sdk.transforms
A handler that holds onto the Throwable that led to the exception, returning it along with the original value as a KV.
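For example, a sketch of the WithFailures pattern via MapElements.exceptionsVia (words is an assumed PCollection<String>):

    WithFailures.Result<PCollection<Integer>, KV<String, Map<String, String>>> result =
        words.apply(
            MapElements.into(TypeDescriptors.integers())
                .via((String word) -> 100 / word.length())   // throws on empty strings
                .exceptionsVia(new WithFailures.ExceptionAsMapHandler<String>() {}));
    PCollection<Integer> output = result.output();
    PCollection<KV<String, Map<String, String>>> failures = result.failures();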
withFanout(int) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
withFanout(int) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineGlobally
 
withFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but that uses an intermediate node to combine parts of the data to reduce load on the final global combine step.
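For example, a sketch adding an intermediate combine step for a global sum (numbers is an assumed PCollection<Integer>):

    PCollection<Integer> total = numbers.apply(
        Combine.globally(Sum.ofIntegers()).withFanout(16));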
withFaultTolerent(boolean) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Instructs the read scan to resume a scan on another tablet server if the current server fails and faultTolerant is set to true.
withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
Sets the size of the data that will be fetched and loaded into memory per database call.
withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
Sets the size of the data that will be fetched and loaded into memory per database call.
withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
Sets the size of the data that will be fetched and loaded into memory per database call.
withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
The number of rows to fetch from the database in the same ResultSet round-trip.
withFieldAccessDescriptors(Map<FieldAccessDescriptor, Object>) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
Sets field values using the FieldAccessDescriptors.
withFieldIds(FieldAccessDescriptor, Integer...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified field ids as nested subfields of the baseDescriptor.
withFieldIds(FieldAccessDescriptor, Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified field ids as nested subfields of the baseDescriptor.
withFieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified fields.
withFieldIds(Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified fields.
withFieldNameAs(String, String) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Add a field with a new name.
withFieldNameAs(String, String) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Fields
Add a single field to the selection, along with the name the field should take in the selected schema.
withFieldNameAs(String, String) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
Allows renaming a specific nested field.
withFieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified fields.
withFieldNames(Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified fields.
withFieldNames(FieldAccessDescriptor, String...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified field names as nested subfields of the baseDescriptor.
withFieldNames(FieldAccessDescriptor, Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified field names as nested subfields of the baseDescriptor.
withFieldNamesAs(Map<String, String>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified fields, renaming those fields.
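For example, a sketch selecting fields by name, including a nested field (the field names are placeholders):

    FieldAccessDescriptor descriptor =
        FieldAccessDescriptor.withFieldNames("userId", "location.city");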
withFieldReordering() - Method in enum org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
 
withFields(JsonObject) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Assigns fields value to the event metadata.
withFields(FieldAccessDescriptor.FieldDescriptor...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Returns a FieldAccessDescriptor that accesses the specified fields.
withFields(Iterable<FieldAccessDescriptor.FieldDescriptor>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Returns a FieldAccessDescriptor that accesses the specified fields.
withFieldValue(String, Object) - Method in class org.apache.beam.sdk.values.Row.Builder
Set a field value using the field name.
withFieldValue(Integer, Object) - Method in class org.apache.beam.sdk.values.Row.Builder
Set a field value using the field id.
withFieldValue(FieldAccessDescriptor, Object) - Method in class org.apache.beam.sdk.values.Row.Builder
Set a field value using a FieldAccessDescriptor.
withFieldValue(String, Object) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
Set a field value using the field name.
withFieldValue(Integer, Object) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
Set a field value using the field id.
withFieldValue(FieldAccessDescriptor, Object) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
Set a field value using a FieldAccessDescriptor.
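For example, a sketch building a Row by field name (personSchema and its field names and values are placeholders):

    Row row = Row.withSchema(personSchema)
        .withFieldValue("name", "Alice")   // placeholder field/value
        .withFieldValue("age", 42)
        .build();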
withFieldValueGetters(Factory<List<FieldValueGetter<T, Object>>>, T) - Method in class org.apache.beam.sdk.values.Row.Builder
 
withFieldValues(Map<String, Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
Sets field values using the field names.
withFieldValues(Map<String, Object>) - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
Sets field values using the field names.
withFileExceptionHandler(ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
Specifies if exceptions should be logged only for streaming pipelines.
withFileExceptionHandler(ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
Specifies if exceptions should be logged only for streaming pipelines.
withFilename(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withFileNameTemplate(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
A template name for files saved to GCP.
withFilter(Filter) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Filters the rows read from HBase using the given row filter.
withFilter(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
Sets the filter details.
withFilter(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withFilters(Bson) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
Sets the filters to find.
withFindKey(String) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
Sets the key used to find the documents to update.
withFixedDelay() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
 
withFixedDelay(Duration) - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
 
withFixedDelay() - Static method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
 
withFixedDelay(Duration) - Static method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
 
withFixedDelayRateLimitPolicy() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies a fixed delay rate limit policy with the default delay of 1 second.
withFixedDelayRateLimitPolicy(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies a fixed delay rate limit policy with the given delay.
withFixedDelayRateLimitPolicy() - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies a fixed delay rate limit policy with the default delay of 1 second.
withFixedDelayRateLimitPolicy(Duration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies a fixed delay rate limit policy with the given delay.
withFlowControl(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with flow control enabled if enableFlowControl is true.
withFlushRowLimit(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Sets the maximum number of rows saved to a staged file before it is loaded into Snowflake.
withFlushTimeLimit(Duration) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Sets how often staged files are created and then ingested by Snowflake during streaming.
withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Adds a footer string to each file.
withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
withFormat(DataFormat) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
See DataFormat.
withFormatFn(KuduIO.FormatFunction<T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
Writes using the given function to create the mutation operations from the input.
withFormatFunction(SourceRecordMapper<T>) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
Applies a SourceRecordMapper to the connector.
withFormatFunction(SerializableFunction<UserT, OutputT>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Specifies a format function to convert UserT to the output type.
withFormatFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Formats the user's type into a TableRow to be written to BigQuery.
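For example, a hedged sketch converting a user type to TableRow on write (Event and its getters are hypothetical; table name and schema are placeholders):

    events.apply(
        BigQueryIO.<Event>write()
            .to("my-project:my_dataset.events")   // placeholder table
            .withFormatFunction(
                e -> new TableRow().set("id", e.getId()).set("ts", e.getTimestamp()))
            .withSchema(eventTableSchema)          // assumed TableSchema
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));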
withFormatFunction(SerializableFunction<UserT, String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Deprecated.
withFormatRecordOnFailureFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If an insert failure occurs, this function is applied to the originally supplied T element.
withFromDateTime(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Read metric data from the fromDateTime.
withFullPublishResult() - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Encode the full PublishResult object, including sdkResponseMetadata and sdkHttpMetadata with the HTTP response headers.
withFullPublishResultWithoutHeaders() - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Encode the full PublishResult object, including sdkResponseMetadata and sdkHttpMetadata but excluding the HTTP response headers.
withGapDuration(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.Sessions
Creates a Sessions WindowFn with the specified gap duration.
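For example, a sketch applying session windows with a 10-minute gap (clicks is an assumed keyed PCollection; Duration is org.joda.time.Duration):

    PCollection<KV<String, Integer>> sessioned = clicks.apply(
        Window.into(Sessions.withGapDuration(Duration.standardMinutes(10))));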
withGCPApplicationDefaultCredentials() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Creates and sets the Application Default Credentials for a Kafka consumer.
withGCPApplicationDefaultCredentials() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Creates and sets the Application Default Credentials for a Kafka producer.
withGetOffsetFn(SerializableFunction<V, Long>) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
A function to get the offset from which the Receiver should start.
withGoogleAdsClientFactory(GoogleAdsClientFactory) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
Creates and returns a new GoogleAdsV17.Read transform with the specified client factory.
withGoogleAdsClientFactory(GoogleAdsClientFactory) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.ReadAll
Creates and returns a new GoogleAdsV17.ReadAll transform with the specified client factory.
withGroupingFactor(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the multiple of max mutation (in terms of both bytes per batch and cells per batch) that is used to select a set of mutations to sort by key for batching.
withHadoopConfiguration(Class<K>, Class<V>) - Method in class org.apache.beam.sdk.io.cdap.Plugin
Sets a plugin Hadoop configuration.
withHadoopConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.cdap.Plugin
Sets a plugin Hadoop configuration.
withHasError(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
Used to set whether or not there was an error for a given document as indicated by the response from Elasticsearch.
withHasMultilineCSVRecords(Boolean) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
When reading RFC4180 CSV files that have values that span multiple lines, set this to true.
withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Adds a header string to each file.
withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withHint(String, ResourceHint) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
Declares a custom resource hint that has a specified URN.
withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
Hints that the filepattern specified in AvroIO.Read.from(String) matches a very large number of files.
withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
Hints that the filepattern specified in ContextualTextIO.Read.from(String) matches a very large number of files.
withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.io.TextIO.Read
Hints that the filepattern specified in TextIO.Read.from(String) matches a very large number of files.
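For example, a sketch for reading a filepattern that expands to a very large number of files (the pattern is a placeholder; assumes an existing Pipeline p):

    PCollection<String> lines = p.apply(
        TextIO.read()
            .from("gs://my-bucket/logs/*/*.log")   // placeholder pattern
            .withHintMatchesManyFiles());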
withHintMaxNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Provide a hint to the QoS system for the intended max number of workers for a pipeline.
withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity with a different worker count hint for ramp-up throttling.
withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey with a different worker count hint for ramp-up throttling.
withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write with a different worker count hint for ramp-up throttling.
withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner host.
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Define the host name of the Redis server.
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
withHost(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Assigns host value to the event metadata.
withHostName(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the host name to be used on the database.
withHostName(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the host name to be used on the database.
withHosts(List<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the hosts of the Apache Cassandra instances.
withHosts(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the hosts of the Apache Cassandra instances.
withHosts(List<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the Cassandra instance hosts where to write data.
withHosts(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the hosts of the Apache Cassandra instances.
withHotKeyFanout(int) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
withHotKeyFanout(SerializableFunction<Row, Integer>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
withHotKeyFanout(SerializableFunction<? super K, Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
If a single key has disproportionately many values, it may become a bottleneck, especially in streaming mode.
withHotKeyFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Like Combine.PerKey.withHotKeyFanout(SerializableFunction), but returning the given constant value for every key.
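For example, a sketch spreading hot keys across intermediate combiners (events is an assumed PCollection<KV<String, Long>>):

    PCollection<KV<String, Long>> totals = events.apply(
        Combine.<String, Long, Long>perKey(Sum.ofLongs()).withHotKeyFanout(10));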
withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub, adding each record's unique identifier to the published messages in an attribute with the specified name.
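A sketch of the read side of withIdAttribute, assuming messages carry a "uniqueId" attribute; the subscription path is a placeholder.
  static PCollection<PubsubMessage> readDeduplicated(Pipeline pipeline) {
    return pipeline.apply(
        PubsubIO.readMessagesWithAttributes()
            .fromSubscription("projects/my-project/subscriptions/my-sub") // placeholder
            .withIdAttribute("uniqueId")); // attribute holding each record's unique identifier
  }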
withIdFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide a function to extract the id from the document.
withIdFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
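A sketch of withIdFn on ElasticsearchIO.Write, assuming the written documents are JSON strings containing an "id" field; the cluster address and index name are placeholders.
  static ElasticsearchIO.Write writeWithIds() {
    return ElasticsearchIO.write()
        .withConnectionConfiguration(
            ElasticsearchIO.ConnectionConfiguration.create(
                new String[] {"http://localhost:9200"}, "my-index")) // placeholders
        // Use the "id" field of each JSON document as the Elasticsearch document id.
        .withIdFn(doc -> doc.get("id").asText());
  }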
withIdGenerator(IdGenerator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
 
withIgnoreSSLCertificate(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Enable ignoreSSLCertificate for the SSL connection (allows self-signed certificates).
withIgnoreSSLCertificate(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Enable ignoreSSLCertificate for the SSL connection (allows self-signed certificates).
withIgnoreVersionConflicts(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Whether or not to suppress version conflict errors in a Bulk API response.
withIgnoreVersionConflicts(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withIgnoreWindowing() - Method in class org.apache.beam.sdk.io.FileIO.Write
Deprecated.
Avoid usage of this method: its effects are complex and it will be removed in future versions of Beam. Right now it exists for compatibility with WriteFiles.
withInclusiveEndAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the end time of the change stream.
withInclusiveStartAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the time that the change stream should be read from.
withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withIndex(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Assigns index value to the event metadata.
withIndexes() - Method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
Sets include_indexes option for DataframeTransform.
withIndexFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide a function to extract the target index from the document allowing for dynamic document routing.
withIndexFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withInitialBackoff(Duration) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
Set initial backoff duration.
withInitialBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the initial backoff duration to be used before retrying a request for the first time.
withInitialPositionInStream(InitialPositionInStream) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specify reading from a given initial position in the stream.
withInitialPositionInStream(InitialPositionInStream) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify reading from a given initial position in the stream.
withInitialSplitDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
 
withInitialTimestampInStream(Instant) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specify reading beginning at the given Instant.
withInitialTimestampInStream(Instant) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify reading beginning at the given Instant.
withInputDoc(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
Sets the input document.
withInputMetadata(Metadata) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
Sets the input metadata for Parser.parse(java.io.InputStream, org.xml.sax.ContentHandler, org.apache.tika.metadata.Metadata, org.apache.tika.parser.ParseContext).
withInputTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withInputTimestamp(), used to keep compatibility with the old API based on the KV element type.
withInputTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
The timestamp for each record being published is set to timestamp of the element in the pipeline.
withInsertDeduplicate(Boolean) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
For INSERT queries in the replicated table, specifies that deduplication of inserted blocks should be performed.
withInsertDistributedSync(Boolean) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
If this setting is enabled, an insert query into a distributed table waits until the data has been sent to all nodes in the cluster.
withInsertQuorum(Long) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
For INSERT queries in the replicated table, waits for writes to the specified number of replicas and linearizes the addition of the data.
withInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
Specify a retry policy for failed inserts.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.ReadChangeStream.withProjectId(java.lang.String) to be called to determine the project.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable instance indicated by given parameter, requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner instance ID.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner instance ID.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner instance.
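A sketch of a Spanner read configured with withInstanceId and its sibling setters; the project, instance, database IDs and the query are placeholders.
  static PCollection<Struct> readUsers(Pipeline pipeline) {
    return pipeline.apply(
        SpannerIO.read()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withQuery("SELECT user_id, name FROM users")); // placeholder query
  }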
withInterceptors(List<ClientInterceptor>) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
Returns a ManagedChannelFactory like this one, but which will apply the provided ClientInterceptors to any channel it creates.
withInterpolateFunction(SerializableFunction<FillGaps.InterpolateData<ValueT>, ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
This function can be used to modify elements before propagating to the next bucket.
withInterval(Duration) - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
 
withIsDeleteFn(ElasticsearchIO.Write.BooleanFieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide a function to extract the target operation (either upsert or delete) from the document fields, allowing a dynamic bulk operation decision.
withIsDeleteFn(ElasticsearchIO.Write.BooleanFieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withIsLocalChannelProvider(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies whether a local channel provider should be used.
withIsReady(Supplier<Boolean>) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
Returns a new TestStreams.Builder like this one with the specified CallStreamObserver.isReady() callback.
withIsUpsert(boolean) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
 
withJobService(BigQueryServices.JobService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
withJsonClustering(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
The same as BigQueryIO.Write.withClustering(Clustering), but takes a JSON-serialized Clustering object in a deferred ValueProvider.
withJsonSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Similar to BigQueryIO.Write.withSchema(TableSchema) but takes in a JSON-serialized TableSchema.
withJsonSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
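A sketch of withJsonSchema(String); the destination table and the two-field schema are placeholders.
  static WriteResult writeWithJsonSchema(PCollection<TableRow> rows) {
    String jsonSchema =
        "{\"fields\": [{\"name\": \"name\", \"type\": \"STRING\"},"
            + " {\"name\": \"count\", \"type\": \"INTEGER\"}]}";
    return rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder table spec
            .withJsonSchema(jsonSchema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
  }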
withJsonTimePartitioning(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
withKeyClass(Class<K>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Sets a key class.
withKeyClass(Class<K>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
Sets a key class.
withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer to interpret key bytes read from Kafka.
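A sketch of a Kafka read with key and value deserializers; the broker address and topic are placeholders, and StringDeserializer is org.apache.kafka.common.serialization.StringDeserializer.
  static PCollection<KV<String, String>> readKafka(Pipeline pipeline) {
    return pipeline.apply(
        KafkaIO.<String, String>read()
            .withBootstrapServers("broker-1:9092") // placeholder
            .withTopic("events") // placeholder
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata()); // drop Kafka metadata, keeping KV<key, value>
  }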
withKeyDeserializer(DeserializerProvider<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Sets a Kafka Deserializer to interpret key bytes read from Kafka.
withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer for interpreting key bytes read from Kafka along with a Coder for helping the Beam runner materialize key objects at runtime if necessary.
withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Sets a Kafka Deserializer for interpreting key bytes read from Kafka along with a Coder for helping the Beam runner materialize key objects at runtime if necessary.
withKeyDeserializerProvider(DeserializerProvider<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
withKeyDeserializerProviderAndCoder(DeserializerProvider<K>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
withKeyField(String) - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
 
withKeyField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
withKeyField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Set the name of the key field in the resulting schema.
withKeyPairAuth(String, PrivateKey) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairPathAuth(ValueProvider<String>, String, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairPathAuth(ValueProvider<String>, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairPathAuth(String, String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairPathAuth(String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairRawAuth(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairRawAuth(ValueProvider<String>, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairRawAuth(String, String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPairRawAuth(String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets key pair authentication.
withKeyPattern(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read only rows in the specified range.
withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Reads only rows in the specified range.
withKeyRange(byte[], byte[]) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Reads only rows in the specified range.
withKeyRanges(ValueProvider<List<ByteKeyRange>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read only rows in the specified ranges.
withKeyRanges(List<ByteKeyRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read only rows in the specified ranges.
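A sketch of withKeyRange(ByteKeyRange), restricting a Bigtable read to row keys in ["a", "m"); the project, instance, and table IDs are placeholders, and Row here is com.google.bigtable.v2.Row.
  static PCollection<Row> readKeyRange(Pipeline pipeline) {
    ByteKeyRange range =
        ByteKeyRange.of(
            ByteKey.copyFrom("a".getBytes(StandardCharsets.UTF_8)),
            ByteKey.copyFrom("m".getBytes(StandardCharsets.UTF_8)));
    return pipeline.apply(
        BigtableIO.read()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table")
            .withKeyRange(range)); // only rows whose keys fall in [a, m) are read
  }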
WithKeys<T> - Class in org.apache.beam.sdk.schemas.transforms
 
WithKeys<K,V> - Class in org.apache.beam.sdk.transforms
WithKeys<K, V> takes a PCollection<V>, and either a constant key of type K or a function from V to K, and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with either the constant key or a key computed from the value.
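A sketch of WithKeys with a key-extraction function; withKeyType is supplied because the key type cannot be inferred from the lambda.
  static PCollection<KV<Integer, String>> keyByLength(PCollection<String> words) {
    return words.apply(
        WithKeys.of((String word) -> word.length())
            .withKeyType(TypeDescriptors.integers()));
  }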
withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withKeySerializer(Class), used to keep compatibility with the old API based on the KV element type.
withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Sets a Serializer for serializing key (if any) to bytes.
withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withKeyspace(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra keyspace where to read data.
withKeyspace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra keyspace where to read data.
withKeyspace(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the Cassandra keyspace where to write data.
withKeyspace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the Cassandra keyspace where to write data.
withKeystorePassword(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch uses SSL/TLS with mutual authentication (via shield), provide the password to open the client keystore.
withKeystorePath(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch uses SSL/TLS with mutual authentication (via shield), provide the keystore containing the client key.
withKeyTranslation(SimpleFunction<?, K>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Transforms the keys read from the source using the given key translation function.
withKeyTranslation(SimpleFunction<?, K>, Coder<K>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Transforms the keys read from the source using the given key translation function.
withKeyType(TypeDescriptor<K>) - Method in class org.apache.beam.sdk.transforms.WithKeys
Return a WithKeys that is like this one with the specified key type descriptor.
withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
For query sources, use this Cloud KMS key to encrypt any temporary tables created.
withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withKwarg(String, Object) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Specifies a single keyword argument for the Python cross-language transform.
withKwarg(String, Object) - Method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
Sets keyword arguments for the model loader.
withKwargs(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Specifies keyword arguments for the Python cross-language transform.
withKwargs(Row) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Specifies keyword arguments as a Row objects.
withLabel(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
Set the item label.
withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
Creates a new Trigger like this one, except that it fires repeatedly whenever the given Trigger fires after the watermark has passed the end of the window.
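A sketch of withLateFirings on 10-minute fixed windows: the on-time pane fires at the watermark, and an extra pane fires for each late element within one hour of allowed lateness; the keyed-integer input is an assumption.
  static PCollection<KV<String, Integer>> sumWithLatePanes(PCollection<KV<String, Integer>> events) {
    return events
        .apply(
            Window.<KV<String, Integer>>into(FixedWindows.of(Duration.standardMinutes(10)))
                .triggering(
                    AfterWatermark.pastEndOfWindow()
                        .withLateFirings(AfterPane.elementCountAtLeast(1)))
                .withAllowedLateness(Duration.standardHours(1))
                .accumulatingFiredPanes())
        .apply(Sum.integersPerKey());
  }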
withLimit(int) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
Sets the limit of documents to find.
withLinkUrl(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
Set the item link url.
withLiteralGqlQuery(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads the results of the specified GQL query.
withLiteralGqlQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
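A sketch of withLiteralGqlQuery(String); the project ID, kind, and query text are placeholders.
  static PCollection<Entity> readOpenTasks(Pipeline pipeline) {
    return pipeline.apply(
        DatastoreIO.v1()
            .read()
            .withProjectId("my-project") // placeholder
            .withLiteralGqlQuery("SELECT * FROM Task WHERE done = false")); // placeholder GQL
  }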
withLoadJobProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Set the project the BigQuery load job will be initiated from.
withLoadJobProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withLocalDc(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the local DC used for the load balancing.
withLocalDc(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the local DC used for the load balancing.
withLocalDc(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the local DC used by the load balancing policy.
withLocalDc(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the local DC used for the load balancing.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from a Datastore Emulator running at the given localhost address.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore Emulator running locally on the specified host port.
withLocksDirPath(String) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
Sets path to directory where locks will be stored.
withLogAppendTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
withLogAppendTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Use the log append time as the output timestamp.
withLogAppendTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
A TimestampPolicy that assigns Kafka's log append time (server side ingestion time) to each record.
withLoginCustomerId(Long) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
Creates and returns a new GoogleAdsV17.Read transform with the specified login customer ID.
withLoginCustomerId(Long) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.ReadAll
Creates and returns a new GoogleAdsV17.ReadAll transform with the specified login customer ID.
withLoginTimeout(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets loginTimeout that will be used in SnowflakeBasicDataSource.setLoginTimeout(int).
withLowerBound(PartitionColumnT) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withManualWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Use the WatermarkEstimators.Manual as the watermark estimator.
withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
Use custom Jackson ObjectMapper instead of the default one.
withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
Use custom Jackson ObjectMapper instead of the default one.
withMapperFactoryFn(SerializableFunction<Session, Mapper>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
A factory to create a specific Mapper for a given Cassandra Session.
withMapperFactoryFn(SerializableFunction<Session, Mapper>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
withMasterAddresses(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Reads from the Kudu cluster on the specified master addresses.
withMasterAddresses(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
Writes to the Kudu cluster on the specified master addresses.
withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Parse
withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseAll
Deprecated.
withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Read
withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadAll
Deprecated.
withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.TextIO.Read
withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
Deprecated.
withMaxAttempts(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the maximum number of times a request will be attempted for a complete successful result.
withMaxBatchBufferingDuration(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
withMaxBatchBufferingDuration(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
Provide the maximum buffering time to batch elements before committing the SQL statement.
withMaxBatchBytesSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub are limited to 10 MB in general.
withMaxBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Provide a maximum batch size, in number of documents; see the Bulk API (https://www.elastic.co/guide/en/elasticsearch/reference/7.17/docs-bulk.html).
withMaxBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub are batched to efficiently send data.
withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
Provide a maximum size in number of documents for the batch.
withMaxBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Provide a maximum batch size, in bytes; see the Bulk API (https://www.elastic.co/guide/en/elasticsearch/reference/7.17/docs-bulk.html).
withMaxBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withMaxBufferElementCount(Integer) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will break up read requests into smaller batches.
withMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
If using ElasticsearchIO.BulkIO.withUseStatefulBatches(boolean), this can be used to set a maximum elapsed time before buffered elements are emitted to Elasticsearch as a Bulk API request.
withMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withMaxBufferingDuration(Duration) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
Sets a time limit (in processing time) on how long an incomplete batch of elements is allowed to be buffered.
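A sketch of withMaxBufferingDuration: batches of up to 100 values per key are emitted, or whatever has buffered after 30 seconds of processing time, whichever comes first; the String key and value types are assumptions.
  static PCollection<KV<String, Iterable<String>>> batch(PCollection<KV<String, String>> input) {
    return input.apply(
        GroupIntoBatches.<String, String>ofSize(100)
            .withMaxBufferingDuration(Duration.standardSeconds(30)));
  }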
withMaxBytesPerBatch(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max bytes a batch can have.
withMaxBytesPerPartition(long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how much data will be assigned to a single BigQuery load job.
withMaxCapacityPerShard(Integer) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies the maximum number of messages per one shard.
withMaxCapacityPerShard(Integer) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies the maximum number of messages per one shard.
withMaxCommitDelay(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
withMaxCommitDelay(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the max commit delay for high throughput writes.
withMaxCommitDelay(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the max commit delay for high throughput writes.
withMaxCommitDelay(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies max commit delay for the Commit API call for throughput optimized writes.
withMaxConnectionIdleTime(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Sets the maximum idle time for a pooled connection.
withMaxConnectionIdleTime(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Sets the maximum idle time for a pooled connection.
withMaxConnections(Integer) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Sets the maximum total number of connections.
withMaxConnections(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Same as JdbcIO.DataSourceConfiguration.withMaxConnections(Integer) but accepting a ValueProvider.
withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
Limits total time spent in backoff.
withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the maximum cumulative backoff.
withMaxCumulativeBackoff(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the maximum cumulative backoff.
withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the maximum cumulative backoff time when retrying after DEADLINE_EXCEEDED errors.
withMaxElementsPerBatch(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max elements a batch can have.
withMaxFileSize(long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Controls the maximum byte size per file to be loaded into BigQuery.
withMaxFilesPerBundle(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how many files will be written concurrently by a single worker when using BigQuery load jobs before spilling to a shuffle.
withMaxFilesPerPartition(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Controls how many files will be assigned to a single BigQuery load job.
withMaxGapFillBuckets(Long) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
 
withMaxInputSize(long) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Returns an ApproximateQuantilesCombineFn that's like this one except that it uses the specified maxNumElements value.
withMaxInsertBlockSize(long) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
The maximum block size for insertion, if we control the creation of blocks for insertion.
withMaxLen(long) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
When appending (XADD) to a stream, set a MAXLEN option.
withMaxNumberOfRecords(Integer) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
Once the specified number of records has been reached, it will stop fetching them.
withMaxNumConnections(Integer) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
Optional.
withMaxNumMutations(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the cell mutation limit (maximum number of mutated cells per batch).
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
Define the max number of records received by the AmqpIO.Read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
Deprecated.
Define the max number of records received by the SqsIO.Read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies to read at most a given number of records.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
Define the max number of records received by the SqsIO.Read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the max number of records that the source will read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies to read at most a given number of records.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
Define the max number of records received by the MqttIO.Read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
Define the max number of records received by the RabbitMqIO.Read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.Read.Unbounded
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
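A sketch that caps an otherwise unbounded Kafka read at 1,000 records, turning it into a bounded source (handy for tests); the broker address and topic are placeholders.
  static PCollection<KV<String, String>> readSample(Pipeline pipeline) {
    return pipeline.apply(
        KafkaIO.<String, String>read()
            .withBootstrapServers("broker-1:9092") // placeholder
            .withTopic("events") // placeholder
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withMaxNumRecords(1000) // stop after 1,000 records
            .withoutMetadata());
  }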
withMaxNumRows(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the row mutation limit (maximum number of mutated rows per batch).
withMaxNumWritersPerBundle(int) - Method in class org.apache.beam.sdk.io.WriteFiles
Set the maximum number of writers created in a bundle before spilling to shuffle.
withMaxOutstandingBytes(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max number of outstanding bytes allowed before enforcing flow control.
withMaxOutstandingElements(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max number of outstanding elements allowed before enforcing flow control.
withMaxParallelRequests(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
When using ElasticsearchIO.BulkIO.withUseStatefulBatches(boolean) (stateful processing), states, and therefore batches, are maintained per key and per window.
withMaxParallelRequests(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withMaxParallelRequestsPerWindow(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
withMaxParallelRequestsPerWindow(int) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
Define the max read time (duration) while the AmqpIO.Read will receive messages.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
Deprecated.
Define the max read time (duration) while the SqsIO.Read will receive messages.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies to read records during maxReadTime.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
Define the max read time (duration) while the SqsIO.Read will receive messages.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies to stop generating elements after the given time.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the max read time that the source will read.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies to read records during maxReadTime.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
Define the max read time (duration) while the MqttIO.Read will receive messages.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
Define the max read time (duration) while the RabbitMqIO.Read will receive messages.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.Read.Unbounded
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
withMaxRetries(int) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
Maximum number of retries per insert.
withMaxRetryJobs(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If set, this will set the max number of retry of batch load jobs.
withMaxTimeToRun(Long) - Method in class org.apache.beam.io.debezium.DebeziumIO.Read
Once the connector has run for the determined amount of time, it will stop.
withMemoryMB(int) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Sets the size of the memory buffer in megabytes.
withMergeFunction(SerializableBiFunction<TimestampedValue<ValueT>, TimestampedValue<ValueT>, TimestampedValue<ValueT>>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
If there are multiple values in a single timeseries bucket, this function is used to specify what to propagate to the next bucket.
withMessageMapper(JmsIO.MessageMapper<T>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
Specifies to put the given metadata into each generated file.
withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Writes to Avro file(s) with the specified metadata.
withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withMetadata() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Include metadata in result JSON documents.
withMetadata(Map<String, byte[]>) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
use schema options instead.
withMetadata(String, byte[]) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
use schema options instead.
withMetadata(String, String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
Deprecated.
use schema options instead.
withMetadataDatabase(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the metadata database.
withMetadataInstance(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the metadata instance.
withMetadataTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the metadata table name.
withMetadataTableAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use the cluster specified by app profile id to store the metadata of the stream.
withMetadataTableInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use the Cloud Bigtable instance indicated by given parameter to manage the metadata of the stream.
withMetadataTableProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use the Cloud Bigtable project indicated by given parameter to manage the metadata of the stream.
withMetadataTableTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use specified table to store the metadata of the stream.
withMethod(BigQueryIO.TypedRead.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withMethod(BigQueryIO.Write.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Choose the method used to write data to BigQuery.
withMethod(RedisIO.Write.Method) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
withMetric(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Sets the metric to use.
WithMetricsSupport - Class in org.apache.beam.runners.spark.metrics
A MetricRegistry decorator-like that supports AggregatorMetric and SparkBeamMetric as Gauges.
WithMetricsSupport - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
A MetricRegistry decorator-like that supports BeamMetricSets as Gauges.
withMinBundleSize(long) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Sets the minimum bundle size.
withMinBundleSize(long) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets a parameter minBundleSize for the minimum bundle size of the source.
withMinNumberOfSplits(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
It's possible that system.size_estimates isn't populated or that the number of splits computed by Beam is still too low for Cassandra to handle it.
withMinNumberOfSplits(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
It's possible that system.size_estimates isn't populated or that the number of splits computed by Beam is still too low for Cassandra to handle it.
withMinRam(long) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
Sets the desired minimum available RAM size for the transform's execution environment.
withMinRam(String) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
Sets the desired minimum available RAM size for the transform's execution environment.
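A sketch attaching a minimum-RAM hint to a single step via PTransform.setResourceHints; the pass-through DoFn and the 8GiB figure are placeholders.
  static PCollection<String> heavyStep(PCollection<String> input) {
    return input.apply(
        "HeavyStep",
        ParDo.of(
                new DoFn<String, String>() {
                  @ProcessElement
                  public void processElement(@Element String e, OutputReceiver<String> out) {
                    out.output(e); // placeholder for memory-hungry processing
                  }
                })
            .setResourceHints(ResourceHints.create().withMinRam("8GiB")));
  }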
withMode(WindowingStrategy.AccumulationMode) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the accumulation mode set to mode.
withMongoDbPipeline(List<BsonDocument>) - Method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
 
withMonitoringConfiguration(Monitoring) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
 
withMonotonicallyIncreasingWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Use the WatermarkEstimators.MonotonicallyIncreasing as the watermark estimator.
withName(String) - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
 
withName(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a copy of the Field with the name set.
withNamedParameters(Map<String, ?>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withNameOnlyQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
Update produced queries to only retrieve their __name__ thereby not retrieving any fields and reducing resource requirements.
withNamespace(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the given namespace.
withNamespace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
withNamespace(Class<?>) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
Set the item namespace from the given Class.
withNaming(FileIO.Write.FileNaming) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a custom strategy for generating filenames.
withNaming(SerializableFunction<DestinationT, FileIO.Write.FileNaming>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a custom strategy for generating filenames depending on the destination, similar to FileIO.Write.withNaming(FileNaming).
withNaming(Contextful<Contextful.Fn<DestinationT, FileIO.Write.FileNaming>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Like FileIO.Write.withNaming(SerializableFunction) but allows accessing context, such as side inputs, from the function.
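A sketch of withNaming(FileNaming) using the built-in default naming with a custom prefix and suffix; the output path is a placeholder.
  static WriteFilesResult<Void> writeNamed(PCollection<String> lines) {
    return lines.apply(
        FileIO.<String>write()
            .via(TextIO.sink())
            .to("gs://my-bucket/events") // placeholder output directory
            .withNaming(FileIO.Write.defaultNaming("events", ".txt")));
  }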
withNestedField(int, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified nested field.
withNestedField(String, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
Return a descriptor that accesses the specified nested field.
withNestedField(FieldAccessDescriptor.FieldDescriptor, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
withNestedFieldAs(String, String, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
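A sketch of withNestedField: selects the top-level "userId" field plus the "city" and "zip" subfields of a nested "address" row field; all field names are placeholders.
  FieldAccessDescriptor descriptor =
      FieldAccessDescriptor.withFields("userId")
          .withNestedField("address", FieldAccessDescriptor.withFields("city", "zip"));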
withNoOutputTimestamp() - Method in interface org.apache.beam.sdk.state.Timer
Asserts that there is no output timestamp.
withNoSpilling() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
withNoSpilling() - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Whether to skip the spilling of data.
withNoSpilling() - Method in class org.apache.beam.sdk.io.FileIO.Write
withNoSpilling() - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Whether to skip the spilling of data.
withNoSpilling() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
withNoSpilling() - Method in class org.apache.beam.sdk.io.TextIO.Write
withNoSpilling() - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
withNoSpilling() - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that writes all data without spilling, simplifying the pipeline.
withNullable(boolean) - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
 
withNullable(boolean) - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a copy of the Field with isNullable set.
withNullable(boolean) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
withNullBehavior(RowJson.RowJsonDeserializer.NullBehavior) - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn
Sets the behavior of the deserializer according to RowJson.RowJsonDeserializer.NullBehavior.
withNumberOfClientsPerWorker(int) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
The number of clients that each worker will create.
withNumberOfRecordsRead(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the number of records read in the partition change stream query before reading this record.
withNumBuckets(Integer) - Method in class org.apache.beam.sdk.transforms.Redistribute.RedistributeArbitrarily
 
withNumBuckets(Integer) - Method in class org.apache.beam.sdk.transforms.Reshuffle.ViaRandomKey
 
withNumFileShards(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how many file shards are written when using BigQuery load jobs.
withNumPartitions(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
The number of partitions.
withNumQuerySplits(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads by splitting the given query into numQuerySplits.
withNumShards(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Configures the number of output shards produced overall (when using unwindowed writes) or per-window (when using windowed writes).
withNumShards(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withNumShards(Integer) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Specifies to use a given fixed number of shards per window.
withNumShards(int) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies to use a given fixed number of shards per window.
withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.FileIO.Write
withNumShards(Integer) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Specifies to use a given fixed number of shards per window.
withNumShards(int) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
The number of workers used by the job to write to Solace.
withNumShards(int) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Configures the number of output shards produced overall (when using unwindowed writes) or per-window (when using windowed writes).
withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
withNumShards(int) - Method in class org.apache.beam.sdk.io.TextIO.Write
withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.TextIO.Write
withNumShards(int) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes to the provided number of shards.
withNumShards(int) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink using the specified number of shards.
withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink using the ValueProvider specified number of shards.
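A sketch of withNumShards on TextIO: writes the output to exactly 10 shard files; the output prefix is a placeholder.
  static void writeSharded(PCollection<String> lines) {
    lines.apply(
        TextIO.write()
            .to("gs://my-bucket/output/results") // placeholder prefix
            .withSuffix(".txt")
            .withNumShards(10));
  }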
withNumSplits(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Sets the user defined number of splits.
withNumStorageWriteApiStreams(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how many parallel streams are used when using Storage API writes.
withOAuth(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets OAuth authentication.
withOAuth(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets OAuth authentication.
withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
Partitions the timestamp space into half-open intervals of the form [N * size + offset, (N + 1) * size + offset), where 0 is the epoch.
withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Assigns timestamps into half-open intervals of the form [N * period + offset, N * period + offset + size).
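A sketch of withOffset on FixedWindows: hourly windows that begin 30 minutes past each hour, e.g. [00:30, 01:30); the String element type is an assumption.
  static PCollection<String> offsetWindows(PCollection<String> events) {
    return events.apply(
        Window.<String>into(
            FixedWindows.of(Duration.standardHours(1)).withOffset(Duration.standardMinutes(30))));
  }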
withOffsetConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Set additional configuration for the backend offset consumer.
withOffsetConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Set additional configuration for the offset consumer.
withOnCompleted(Runnable) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
Returns a new TestStreams.Builder like this one with the specified StreamObserver.onCompleted() callback.
withOnError(Runnable) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
Returns a new TestStreams.Builder like this one with the specified StreamObserver.onError(java.lang.Throwable) callback.
withOnError(Consumer<Throwable>) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
Returns a new TestStreams.Builder like this one with the specified StreamObserver.onError(java.lang.Throwable) consumer.
withOnNext(Consumer<T>) - Static method in class org.apache.beam.sdk.fn.test.TestStreams
Creates a test CallStreamObserver TestStreams.Builder that forwards StreamObserver.onNext(V) calls to the supplied Consumer.
withOnTimeBehavior(Window.OnTimeBehavior) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Override the default Window.OnTimeBehavior, to control whether to output an empty on-time pane.
withOnTimeBehavior(Window.OnTimeBehavior) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
withOperationTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read with the operation timeout.
withOperationTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the operation timeout.
withOptionalParticipation() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
Means that this field will participate in a join even when not present, similar to SQL outer-join semantics.
withOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a copy of the Field with the options set.
withOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a copy of the Field with the options set.
withOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema
Returns a copy of the Schema with the options set.
withOptions(Schema.Options.Builder) - Method in class org.apache.beam.sdk.schemas.Schema
Returns a copy of the Schema with the options set.
withOrdered(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Enables ordered bulk insertion (default: true).
withOrdinality - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
 
withoutDefaults() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but that does not attempt to provide a default value in the case of empty input.
withoutLimiter() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
 
withoutLimiter() - Static method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
 
withoutMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Returns a PTransform for a PCollection of KV, dropping Kafka metadata.
withoutPartitioning() - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.PartitionedWriterBuilder
Writes to the sink without the need to partition output into a specified number of partitions.
withOutputCoder(Coder<?>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Specifies the Coder of the output PCollections produced by this transform.
withOutputCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
Specifies a Coder to use for the outputs.
withOutputCoders(Map<String, Coder<?>>) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Specifies the keys and Coders of the output PCollections produced by this transform.
withOutputFilenames() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
Specify that output filenames are wanted.
withOutputFilenames() - Method in class org.apache.beam.sdk.io.TextIO.Write
Specify that output filenames are wanted.
withOutputKeyCoder(Coder<KeyT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
Specifies the coder for the output key.
withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
withOutputParallelization(boolean) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
withOutputParallelization(Boolean) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
withOutputs(List<TimestampedValue<OutputT>>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
Returns a new Watch.Growth.PollResult like this one with the provided outputs.
withOutputSchema(Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Fields
Rename all output fields to match the specified schema.
withOutputSchema(Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
Rename all output fields to match the specified schema.
withOutputTags(TupleTag<OutputT>, TupleTagList) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new multi-output ParDo PTransform that's like this PTransform but with the specified output tags.
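A sketch of a multi-output ParDo using withOutputTags; the tag names and the validation logic are illustrative only:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.TupleTag;
    import org.apache.beam.sdk.values.TupleTagList;

    final TupleTag<String> validTag = new TupleTag<String>() {};
    final TupleTag<String> invalidTag = new TupleTag<String>() {};

    PCollectionTuple results =
        lines.apply(
            ParDo.of(
                    new DoFn<String, String>() {
                      @ProcessElement
                      public void process(@Element String line, MultiOutputReceiver out) {
                        if (line.isEmpty()) {
                          out.get(invalidTag).output(line);   // route empty lines to the side output
                        } else {
                          out.get(validTag).output(line);     // main output
                        }
                      }
                    })
                .withOutputTags(validTag, TupleTagList.of(invalidTag)));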
withOutputTimestamp(Instant) - Method in interface org.apache.beam.sdk.state.Timer
Sets event time timer's output timestamp.
withoutRepeater() - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
Turns off repeated invocations (on by default) of SetupTeardown and Caller, via the Repeater, in the context of RequestResponseIO.REPEATABLE_ERROR_TYPES.
withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withoutSharding() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Forces a single file as output and empty shard name template.
withoutSharding() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withoutSharding() - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Forces a single file as output and empty shard name template.
withoutSharding() - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Forces a single file as output and empty shard name template.
withoutSharding() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Forces a single file as output and empty shard name template.
withoutSharding() - Method in class org.apache.beam.sdk.io.TextIO.Write
withoutSharding() - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Forces a single file as output.
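A minimal sketch of a single-shard text write (the output path is a placeholder); the same pattern applies to the AvroIO, CsvIO, JsonIO, and TFRecordIO variants above:

    import org.apache.beam.sdk.io.TextIO;

    lines.apply(
        TextIO.write()
            .to("/tmp/output/results")   // placeholder path
            .withSuffix(".txt")
            .withoutSharding());         // exactly one output file, empty shard name template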
withoutStrictParsing() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
During parsing of the arguments, we will skip over improperly formatted and unknown arguments.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Disable validation that the table exists or the query succeeds prior to pipeline submission.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Disables BigQuery table validation.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Disables validation that the table being read from exists.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Disables validation that the table being read and the metadata table exist, and that the app profile used is single-cluster and has single-row transactions enabled.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Disables validation that the table being written to exists.
withoutValidation() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Returns a transform for reading TFRecord files that has GCS path validation on pipeline creation disabled.
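A sketch that skips pre-submission validation for a BigQuery read; the table spec is a placeholder:

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    PCollection<TableRow> rows =
        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.my_table")   // placeholder table spec
                .withoutValidation());                    // skip table-existence check at submission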
withOverloadRatio(double) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
The target ratio between requests sent and successful requests.
withParallelism(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
withParallelism(Integer) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Sets the number of parallel http client connections to the HEC.
withParameterSetter(JdbcIO.PreparedStatementSetter<ParameterT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
withParametersFunction(SerializableFunction<ParameterT, Map<String, Object>>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withParametersFunction(SerializableFunction<ParameterT, Map<String, Object>>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withParams(Map<String, Object>) - Method in class org.apache.beam.sdk.io.cdap.ConfigWrapper
Sets a Plugin parameters Map.
withParent(Schema.TypeName) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
 
withParseFn(SerializableFunction<GenericRecord, X>, Coder<X>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Reads GenericRecords with an unspecified schema and maps them to instances of a custom type using the given parseFn, encoded using the given coder.
withParseFn(SerializableFunction<RowResult, T>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Provides the function to parse a row from Kudu into the typed object.
withParser(MongoDbGridFSIO.Parser<X>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withPartition(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
Sets the partition details.
withPartitionCols(List<String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
Set the names of the columns that are partitions.
withPartitionColumn(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
The name of a column of numeric type that will be used for partitioning.
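A sketch of a partitioned JDBC read split on a numeric column; the driver, URL, table, column, and row mapper are all assumptions:

    import org.apache.beam.sdk.io.jdbc.JdbcIO;

    PCollection<String> names =
        pipeline.apply(
            JdbcIO.<String>readWithPartitions()
                .withDataSourceConfiguration(
                    JdbcIO.DataSourceConfiguration.create(
                        "org.postgresql.Driver", "jdbc:postgresql://db-host/mydb"))  // placeholders
                .withTable("users")                 // placeholder table
                .withPartitionColumn("id")          // numeric column used to split the read
                .withNumPartitions(10)
                .withRowMapper(rs -> rs.getString("name")));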
withPartitionCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which this partition was first detected and created in the metadata table.
withPartitionEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the end time for the partition change stream query that originated this record.
withPartitioner(KinesisPartitioner<T>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Specify how to partition records among all stream shards (required).
withPartitioner(KinesisPartitioner) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Allows specifying a custom implementation of KinesisPartitioner.
withPartitioning() - Method in interface org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Write.PartitionedWriterBuilder
Writes to the sink with partitioning by Task Id.
withPartitionKey(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify default partition key.
withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Note that PartitionOptions are currently ignored.
withPartitionQueryTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionQuery timeout.
withPartitionQueryTimeout(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionQuery timeout.
withPartitionReadTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionRead timeout.
withPartitionReadTimeout(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionRead timeout.
withPartitionRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the connector started processing this partition.
withPartitionScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which this partition was scheduled to be queried.
withPartitionStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the start time for the partition change stream query that originated this record.
withPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the partition token where this record originated from.
withPassword(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the password to connect to your database.
withPassword(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the password to connect to your database.
withPassword(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the password used for authentication.
withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the password used for authentication.
withPassword(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the password used for authentication.
withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the password used for authentication.
withPassword(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch authentication is enabled, provide the password.
withPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the password to connect to the JMS broker (authenticated).
withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Define the password to connect to the JMS broker (authenticated).
withPassword(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
withPassword(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withPassword(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
 
withPathPart(String) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
 
withPathPrefix(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch is not running at the root path, e.g.
withPayload(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
Assigns the payload to be used for reprocessing.
withPayloadFn(SerializableFunction<InputT, byte[]>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
withPercision(Integer) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
 
withPercision(Integer) - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
 
withPlacementId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
withPluginConfig(PluginConfig) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Sets a PluginConfig.
withPluginConfig(PluginConfig) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
Sets a PluginConfig.
withPointInTimeSearch() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Configures the source to use Point In Time (PIT) search iteration while reading data from Elasticsearch.
withPointInTimeSearchAndSortConfiguration(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Similar to the default PIT search but sets a specific sorting configuration which Elasticsearch will use to sort the results.
withPointInTimeSearchAndTimestampSortProperty(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Similar to the default PIT search but sets an existing timestamp-based property name which Elasticsearch will use to sort the results.
withPollingInterval(Duration) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
If specified, polling for new partitions will happen at this periodicity.
withPollInterval(Duration) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
Specifies how long to wait after a call to Watch.Growth.PollFn before calling it again (if at all - according to Watch.Growth.PollResult and the Watch.Growth.TerminationCondition).
withPort(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the port on which your database is listening.
withPort(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the port on which your database is listening.
withPort(int) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the port number of the Apache Cassandra instances.
withPort(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the port number of the Apache Cassandra instances.
withPort(int) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the Cassandra instance port number where to write data.
withPort(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the port number of the Apache Cassandra instances.
withPort(int) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Define the port number of the Redis server.
withPort(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
withPortNumber(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets port number to use to connect to Snowflake.
withPositionalParameters(List<?>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
Returns an ApproximateDistinct.ApproximateDistinctFn combiner with a new precision p.
withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
Sets the precision p.
withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
Sets the precision p.
withPrecision(int) - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
Explicitly set the precision parameter used to compute HLL++ sketch.
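A sketch of an approximate distinct count with an explicit precision, using the sketching extension's ApproximateDistinct; the precision value of 16 is illustrative only:

    import org.apache.beam.sdk.extensions.sketching.ApproximateDistinct;

    PCollection<Long> distinctUsers =
        userIds.apply(
            ApproximateDistinct.<String>globally()
                .withPrecision(16));   // higher precision -> lower error, more memory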
withPrecombining(boolean) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Enable precombining.
withPredicates(List<KuduPredicate>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Filters the rows read from Kudu using the given predicates.
withPrefix(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a common prefix to use for all generated filenames, if using the default file naming.
withPrefix(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
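A sketch of FileIO's default file naming with an explicit prefix; the destination path is a placeholder:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.TextIO;

    lines.apply(
        FileIO.<String>write()
            .via(TextIO.sink())
            .to("gs://my-bucket/output/")   // placeholder destination
            .withPrefix("events")           // generated filenames start with "events"
            .withSuffix(".txt"));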
withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
See WriteVoid#withPreparedStatementSetter(PreparedStatementSetter).
withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
withPrimaryKey(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withProcessingTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
withProcessingTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Use the processing time as the output timestamp.
withProcessingTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
A TimestampPolicy that assigns processing time to each record.
withProcessingTime() - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withProcessingTimePolicy() - Static method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
Returns a ProcessingTimeWatermarkPolicy.
withProcessingTimePolicy() - Static method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory
Returns a ProcessingTimeWatermarkPolicy.
withProcessingTimeWatermarkPolicy() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies the WatermarkPolicyFactory as ProcessingTimeWatermarkPolicyFactory.
withProcessingTimeWatermarkPolicy() - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies the WatermarkPolicyFactory as ProcessingTimeWatermarkPolicyFactory.
withProducerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Update configuration for the producer.
withProducerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Update configuration for the producer.
withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withProducerFactoryFn(SerializableFunction), used to keep compatibility with the old API based on the KV element type.
withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Sets a custom function to create Kafka producer.
withProducerProperties(Properties) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify the configuration properties for Kinesis Producer Library (KPL).
withProjectedColumns(List<String>) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Filters the columns read from the table to include only those specified.
withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
 
withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
 
withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
 
withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
 
withProjectId(String) - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
 
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by given parameter, requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by given parameter, requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the Cloud Bigtable project indicated by given parameter, requires BigtableIO.ReadChangeStream.withInstanceId(java.lang.String) to be called to determine the instance.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by given parameter, requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by given parameter, requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore for the default database.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner project ID.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner project ID.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner project.
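A sketch of a Spanner read fully qualified by project, instance, and database; all identifiers and the query are placeholders:

    import com.google.cloud.spanner.Struct;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

    PCollection<Struct> rows =
        pipeline.apply(
            SpannerIO.read()
                .withProjectId("my-project")          // placeholder
                .withInstanceId("my-instance")        // placeholder
                .withDatabaseId("my-database")        // placeholder
                .withQuery("SELECT id, name FROM users"));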
withProjection(List<String>) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
Sets the projection.
withProjection(Schema, Schema) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
Enable the reading with projection.
withProjection(Schema, Schema) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
 
withPropagateSuccessfulStorageApiWrites(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If set to true, then all successful writes will be propagated to WriteResult and accessible via the WriteResult.getSuccessfulStorageApiInserts() method.
withPropagateSuccessfulStorageApiWrites(Predicate<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If called, then all successful writes will be propagated to WriteResult and accessible via the WriteResult.getSuccessfulStorageApiInserts() method.
withProtocol(TProtocolFactory) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ReadFiles
Specifies the TProtocolFactory to be used to decode Thrift objects.
withPublishRequestBuilder(SerializableFunction<T, PublishRequest.Builder>) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
Function to convert a message into a PublishRequest.Builder (mandatory).
withPublishRequestFn(SerializableFunction<T, PublishRequest>) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
withPublishTime() - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withPublishTimestampFunction(KafkaPublishTimestampFunction<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Deprecated.
use KafkaIO.WriteRecords and ProducerRecords to set publish timestamp.
withPublishTimestampFunction(KafkaPublishTimestampFunction<ProducerRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Deprecated.
use ProducerRecords to set publish timestamp.
withPubsubRootUrl(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
withPullFrequencySec(Long) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Delay in seconds between polls for new record updates.
withPullFrequencySec(Long) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
Delay in seconds between polls for new record updates.
withPulsarClient(SerializableFunction<String, PulsarClient>) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withQuery(String) - Method in class org.apache.beam.sdk.io.azure.cosmos.CosmosIO.Read
Specify the query to read data.
withQuery(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the query to read data.
withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the query to read data.
withQuery(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide a query used while reading from Elasticsearch.
withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide a ValueProvider that provides the query used while reading from Elasticsearch.
withQuery(Query) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads the results of the specified query.
withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withQuery(String) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
Creates and returns a new GoogleAdsV17.Read transform with the specified query.
withQuery(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Sets the query to use.
withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
withQuery(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
withQuery(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
 
withQuery(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
Provide a query used while reading from Solr.
withQueryFn(SerializableFunction<MongoCollection<Document>, MongoCursor<Document>>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Sets a queryFn.
withQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
BigQuery geographic location where the query job will be executed.
withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withQueryPlannerClass(Class<? extends QueryPlanner>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withQueryPriority(BigQueryIO.TypedRead.QueryPriority) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withQueryStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time that the change stream query which produced this record started.
withQueryTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Temporary dataset reference when using BigQueryIO.TypedRead.fromQuery(String).
withQueryTempProjectAndDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withQueryTransformation(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
A query to be executed in Snowflake.
withQueryTransformation(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
 
withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Specify the JMS queue destination name where to read messages from.
withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS queue destination name where to send messages to.
withQueue(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
If you want to directly consume messages from a specific queue, you just have to specify the queue name.
withQueue(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
Defines the queue where the messages will be sent.
withQueueDeclare(boolean) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
You can "force" the declaration of a queue on the RabbitMQ broker.
withQueueDeclare(boolean) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
If the queue is not declared by another application, RabbitMqIO can declare the queue itself.
withQueueUrl(String) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
Deprecated.
Define the queueUrl used by the SqsIO.Read to receive messages from SQS.
withQueueUrl(String) - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Read
Define the queueUrl used by the SqsIO.Read to receive messages from SQS.
withQuotationMark(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
Sets Snowflake-specific quotations around strings.
withQuotationMark(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
withQuotationMark(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Sets Snowflake-specific quotations around strings.
withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that does not throttle during ramp-up.
withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that does not throttle during ramp-up.
withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that does not throttle during ramp-up.
withRandomAccess() - Method in class org.apache.beam.sdk.transforms.View.AsList
Returns a PCollection view like this one, but whose resulting list will have RandomAccess (aka fast indexing).
withRandomAccess(boolean) - Method in class org.apache.beam.sdk.transforms.View.AsList
Returns a PCollection view like this one, but whose resulting list will have RandomAccess (aka fast indexing) according to the input parameter.
withRate(long, Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies to generate at most a given number of elements per a given period.
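A sketch of an unbounded synthetic source producing at most 100 elements per second; the rate is illustrative:

    import org.apache.beam.sdk.io.GenerateSequence;
    import org.joda.time.Duration;

    PCollection<Long> ticks =
        pipeline.apply(
            GenerateSequence.from(0)
                .withRate(100, Duration.standardSeconds(1)));   // at most 100 elements per second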
withRateLimitPolicy(GoogleAdsV17.RateLimitPolicyFactory) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.Read
Creates and returns a new GoogleAdsV17.Read transform with the specified rate limit policy factory.
withRateLimitPolicy(GoogleAdsV17.RateLimitPolicyFactory) - Method in class org.apache.beam.sdk.io.googleads.GoogleAdsV17.ReadAll
Creates and returns a new GoogleAdsV17.ReadAll transform with the specified rate limit policy factory.
withReadCommitted() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets "isolation_level" to "read_committed" in Kafka consumer configuration.
withReadCommitted() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Sets "isolation_level" to "read_committed" in Kafka consumer configuration.
withReadOperation(ReadOperation) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads at the specified readTime.
withReadTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra client read timeout in ms.
withReadTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra client read timeout in ms.
withReadTimeout(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Cassandra client socket option to set the read timeout in ms.
withReadTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Cassandra client socket option to set the read timeout in ms.
withReadTransaction() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withReceiveTimeout(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
If set, blocks for the given timeout Duration on each poll for a new JMS record when the previous poll returns no new record.
withRecordAggregation(KinesisIO.RecordAggregation) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Enable record aggregation that is compatible with the KPL / KCL.
withRecordAggregation(Consumer<KinesisIO.RecordAggregation.Builder>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Enable record aggregation that is compatible with the KPL / KCL.
withRecordAggregationDisabled() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Disable KPL / KCL like record aggregation.
withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets a JAXB annotated class that can be populated using a record of the provided XML file.
withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Writes objects of the given class mapped to XML elements using JAXB.
withRecordElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets name of the record element of the XML document.
withRecordElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
withRecordNumMetadata() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
Allows the user to opt into getting recordNums associated with each record.
withRecordReadAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the record was fully read.
withRecordStreamEndedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the record finished streaming.
withRecordStreamStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the record started to be streamed.
withRecordTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the timestamp of when this record occurred.
withRedistribute() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a redistribute transform that hints to the runner to try to redistribute the work evenly.
withRedistribute() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Enable Redistribute.
withRedistributeNumKeys(int) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
withRedistributeNumKeys(int) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
withRelativeError(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
Sets the relative error epsilon.
withRelativeError(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
Sets the relative error epsilon.
withReplicaInfo(SolrIO.ReplicaInfo) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
Read from a specific Replica (partition).
withReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Whether additional diagnostic metrics should be reported for a Transform.
withRepresentativeCoder(Coder<IdT>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
Return a WithRepresentativeValues PTransform that is like this one, but with the specified id type coder.
withRepresentativeType(TypeDescriptor<IdT>) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
Return a WithRepresentativeValues PTransform that is like this one, but with the specified id type descriptor.
withRepresentativeType(TypeDescriptor<IdT>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
Return a WithRepresentativeValues PTransform that is like this one, but with the specified output type descriptor.
withRepresentativeValueFn(SerializableFunction<T, IdT>) - Static method in class org.apache.beam.sdk.transforms.Deduplicate
Returns a deduplication transform that deduplicates values using the supplied representative value for up to 10 mins within the processing time domain.
withRepresentativeValueFn(SerializableFunction<T, IdT>) - Static method in class org.apache.beam.sdk.transforms.Distinct
Returns a Distinct<T, IdT> PTransform.
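A sketch of deduplication by a representative key (the same pattern applies to Distinct); the Event type and its getId() accessor are hypothetical:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.transforms.Deduplicate;

    PCollection<Event> deduped =
        events.apply(
            Deduplicate.<Event, String>withRepresentativeValueFn(event -> event.getId())
                .withRepresentativeCoder(StringUtf8Coder.of()));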
withRequestRecordsLimit(int) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies the maximum number of records in the GetRecordsResult returned by a GetRecords call, which is limited to 10K records.
withRequestRecordsLimit(int) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies the maximum number of records in the GetRecordsResult returned by a GetRecords call, which is limited to 10K records.
withRequiresDeduping() - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
If set, requires runner deduplication for the messages.
withResponseItemJson(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
Sets the element from Elasticsearch Bulk API response "items" pertaining to this WriteSummary.
withResultOutputTag(TupleTag<PublishResult>) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Tuple tag to store results.
withResults() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
Returns JdbcIO.WriteVoid transform which can be used in Wait.on(PCollection[]) to wait until all data is written.
withResumeDelay(Duration) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
Builder method to set the value of DoFn.ProcessContinuation.resumeDelay().
withRetained(boolean) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
Whether or not the publish message should be retained by the messaging engine.
withRetentionPolicy(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Sets the retention policy to use.
withRetentionPolicy(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
Sets the retention policy to use.
withRetryableCodes(ImmutableSet<StatusCode.Code>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the errors that will be retried by the client library for all operations.
withRetryConfiguration(DynamoDBIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
Provides configuration to retry a failed request to publish a set of records to DynamoDB.
withRetryConfiguration(SnsIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Provides configuration to retry a failed request to publish a message to SNS.
withRetryConfiguration(ElasticsearchIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Provides configuration to retry a failed batch call to Elasticsearch.
withRetryConfiguration(ElasticsearchIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withRetryConfiguration(JdbcIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
See WriteVoid#withRetryConfiguration(RetryConfiguration).
withRetryConfiguration(JdbcIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryConfiguration to exponentially back off and retry the statements.
withRetryConfiguration(JdbcIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryConfiguration to exponentially back off and retry the statements.
withRetryConfiguration(RetryConfiguration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS retry configuration.
withRetryConfiguration(SolrIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
Provides configuration to retry a failed batch call to Solr.
withRetryStrategy(JdbcIO.RetryStrategy) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
See WriteVoid#withRetryStrategy(RetryStrategy).
withRetryStrategy(JdbcIO.RetryStrategy) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryStrategy to determine if it will retry the statements.
withRetryStrategy(JdbcIO.RetryStrategy) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
When a SQL exception occurs, JdbcIO.Write uses this JdbcIO.RetryStrategy to determine if it will retry the statements.
withRingRanges(Set<RingRange>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
withRingRanges(ValueProvider<Set<RingRange>>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
withRole(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets user's role to be used when running queries on Snowflake.
withRole(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets user's role to be used when running queries on Snowflake.
withRootCaCertificatePath(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Method to set the root CA certificate.
withRootCaCertificatePath(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkIO.Write
Same as withRootCaCertificatePath(ValueProvider) but without a ValueProvider.
withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets name of the root element of the XML document.
withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Sets the enclosing root element for the generated XML files.
withRoutingFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide a function to extract the target routing from the document allowing for dynamic document routing.
withRoutingFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withRowFilter(ValueProvider<RowFilter>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will filter the rows read from Cloud Bigtable using the given row filter.
withRowFilter(RowFilter) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will filter the rows read from Cloud Bigtable using the given row filter.
withRowGroupSize(int) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
Specify row-group size; if not set or zero, a default is used by the underlying writer.
withRowMapper(JdbcIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withRowMapper(JdbcIO.RowMapper<OutputT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
withRowMapper(JdbcIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
withRowMapper(JdbcIO.RowMapper<V>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
withRowMapper(Neo4jIO.RowMapper<OutputT>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withRowMapper(SingleStoreIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
withRowMapper(SingleStoreIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
 
withRowMutationInformationFn(SerializableFunction<T, RowMutationInformation>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows upserting and deleting rows for tables with a primary key defined.
withRowOutput() - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
Data output type is Row, and schema is auto-inferred from the database.
withRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withRowRestriction(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Read only rows which match the specified filter, which must be a SQL expression compatible with Google standard SQL.
withRowSchema(Schema) - Method in class org.apache.beam.sdk.transforms.Create.Values
Returns a Create.Values PTransform like this one that uses the given Schema to represent objects.
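A sketch of creating a schema-aware PCollection of Rows with withRowSchema; the field names and values are illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder().addStringField("name").addInt32Field("age").build();

    PCollection<Row> people =
        pipeline.apply(
            Create.of(
                    Row.withSchema(schema).addValues("alice", 30).build(),
                    Row.withSchema(schema).addValues("bob", 25).build())
                .withRowSchema(schema));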
withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the RPC priority.
withRpcPriority(ValueProvider<Options.RpcPriority>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the RPC priority.
withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the priority of the change stream queries.
withRunnerDeterminedSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink with runner-determined sharding.
withSamplePeriod(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the length of time sampled request data will be retained.
withSamplePeriodBucketSize(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the size of buckets within the specified samplePeriod.
withScan(Scan) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Filters the rows read from HBase using the given scan.
withScanRequestFn(SerializableFunction<Void, ScanRequest>) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
Can't pass a ScanRequest object directly from the client since this object is not fully serializable.
withScanRequestFn(SerializableFunction<Void, ScanRequest>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
Can't pass a ScanRequest object directly from the client since this object is not fully serializable.
withScanResponseMapperFn(SerializableFunction<ScanResponse, T>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
 
withScanResultMapperFn(SerializableFunction<ScanResult, T>) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
Deprecated.
 
withSchema(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Sets the output schema.
withSchema(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withSchema(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Reads files containing records that conform to the given schema.
withSchema(Schema) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
withSchema(Class<X>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroSource
Reads files containing records of the given class.
withSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.DocumentToRow
 
withSchema(TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Uses the specified schema for rows to be written.
withSchema(ValueProvider<TableSchema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
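A sketch of a BigQuery write with an explicit TableSchema; the table spec and fields are placeholders:

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    TableSchema schema =
        new TableSchema()
            .setFields(
                Arrays.asList(
                    new TableFieldSchema().setName("name").setType("STRING"),
                    new TableFieldSchema().setName("age").setType("INTEGER")));

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")   // placeholder table spec
            .withSchema(schema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));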
withSchema(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets schema to use when connecting to Snowflake.
withSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
withSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
Returns a Create.TimestampedValues PTransform like this one that uses the given Schema to represent objects.
withSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
Returns a Create.Values PTransform like this one that uses the given Schema to represent objects.
withSchema(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
Returns a Create.WindowedValues PTransform like this one that uses the given Schema to represent objects.
withSchema(Schema) - Static method in class org.apache.beam.sdk.transforms.JsonToRow
 
withSchema(Schema) - Static method in class org.apache.beam.sdk.values.Row
Creates a row builder with specified Row.getSchema().
withSchemaAndNullBehavior(Schema, RowJson.RowJsonDeserializer.NullBehavior) - Static method in class org.apache.beam.sdk.transforms.JsonToRow
 
withSchemaFromView(PCollectionView<Map<String, String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows the schemas for each table to be computed within the pipeline itself.
withSchemaReadySignal(PCollection<?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies an optional input PCollection that can be used as the signal for Wait.OnSignal to indicate when the database schema is ready to be read.
withSchemaUpdateOptions(Set<BigQueryIO.Write.SchemaUpdateOption>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows the schema of the destination table to be updated as a side effect of the write.
withScrollKeepalive(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide a scroll keepalive.
withSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withSelectedFields(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Read only the specified fields (columns) from a BigQuery table.
withSempClientFactory(SempClientFactory) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
Set a factory that creates a SempClient.
withSerializer(SerializableFunction<T, byte[]>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Specify how to serialize records to bytes on the stream (required).
withServerName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets the name of the Snowflake server.
withServerName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
 
withServerUri(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
Set up the MQTT broker URI.
withSessionConfig(SessionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withSessionConfig(SessionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withSessionConfig(ValueProvider<SessionConfig>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withSessionServiceFactory(SessionServiceFactory) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
Set a factory that creates a SessionService.
withSessionServiceFactory(SessionServiceFactory) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
Set the provider used to obtain the properties to initialize a new session in the broker.
withShard(int) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
withShardedKey() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
Outputs batched elements associated with sharded input keys.
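A sketch of batching with sharded keys to spread hot keys across workers; the batch size and key/value types are illustrative:

    import org.apache.beam.sdk.transforms.GroupIntoBatches;

    // Input: PCollection<KV<String, String>> keyedEvents
    keyedEvents.apply(
        GroupIntoBatches.<String, String>ofSize(100)
            .withShardedKey());   // output keys become ShardedKey<String>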
withSharding(PTransform<PCollection<UserT>, PCollectionView<Integer>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a PTransform to use for computing the desired number of shards in each window.
withSharding(PTransform<PCollection<UserT>, PCollectionView<Integer>>) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink using the specified PTransform to compute the number of shards.
withShardingFunction(ShardingFunction<UserT, DestinationT>) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink using the specified sharding function to assign shards to inputs.
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Uses the given ShardNameTemplate for naming output files.
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Uses the given ShardNameTemplate for naming output files.
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Uses the given shard name template.
withShardsNumber(Integer) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Number of shards that are created per window.
withShardTemplate(String) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Uses the given ShardNameTemplate for naming output files.
withShardTemplate(String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
Sets the shard template.
withShardTemplate(String) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Uses the given ShardNameTemplate for naming output files.
withSideInput() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
 
withSideInput(String, PCollectionView<?>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
Returns a new multi-output ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInput(String, PCollectionView<?>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(List<PCollectionView<?>>) - Method in class org.apache.beam.sdk.io.WriteFiles
 
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
Returns a new multi-output ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
withSideInputs(Map<String, PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
Returns a new multi-output ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(Map<String, PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new ParDo PTransform that's like this PTransform but with the specified additional side inputs.
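As a hedged sketch of the side-input variants above (numbers is an assumed PCollection<Integer>, and maxView is an assumed PCollectionView<Integer> built elsewhere, e.g. via Combine.globally(Max.ofIntegers()).asSingletonView()):

    PCollection<Double> normalized =
        numbers.apply(
            ParDo.of(
                    new DoFn<Integer, Double>() {
                      @ProcessElement
                      public void processElement(ProcessContext c) {
                        int max = c.sideInput(maxView); // read the side input per element
                        c.output(c.element() / (double) max);
                      }
                    })
                .withSideInputs(maxView));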
withSingletonValues() - Method in class org.apache.beam.sdk.transforms.View.AsMap
Deprecated.
this method simply returns this AsMap unmodified
withSize(int) - Static method in class org.apache.beam.sdk.extensions.avro.schemas.utils.AvroUtils.FixedBytesField
Create an AvroUtils.FixedBytesField with the specified size.
withSize(long) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
withSkew(Duration) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withSkipHeaderLines(int) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
withSkipHeaderLines(int) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
 
withSkipIfEmpty(boolean) - Method in class org.apache.beam.sdk.io.WriteFiles
Set this sink to skip writing any files if the PCollection is empty.
withSkipIfEmpty() - Method in class org.apache.beam.sdk.io.WriteFiles
 
withSkipKeyClone(boolean) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Determines whether cloning of the key should be skipped (default is 'false').
withSkipValueClone(boolean) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Determines whether cloning of the value should be skipped (default is 'false').
withSnowflakeServices(SnowflakeServices) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
The SnowflakeServices implementation to be used.
withSnowPipe(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Sets the name of the SnowPipe, which can be created in the Snowflake dashboard or CLI.
withSnowPipe(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Same as withSnowPipe(String), but with a ValueProvider.
withSocketTimeout(Integer) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If set, overwrites the default max retry timeout (30000ms) in the Elastic RestClient and the default socket timeout (30000ms) in the RequestConfig of the Elastic RestClient.
withSource(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Assigns source value to the event metadata.
withSourceConnector(SourceConnector) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the SourceConnector to be used.
withSourceConnector(ValueProvider<SourceConnector>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
 
withSourceType(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Assigns sourceType value to the event metadata.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner configuration.
withSparkReceiverBuilder(ReceiverBuilder<V, ? extends Receiver<V>>) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
Sets ReceiverBuilder with value and custom Spark Receiver class.
withSparsePrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
Sets the sparse representation's precision sp.
withSparsePrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
Sets the sparse representation's precision sp.
withSparseRepresentation(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
Returns an ApproximateDistinct.ApproximateDistinctFn combiner with a new sparse representation's precision sp.
withSsl(SSLOptions) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Optionally, specify SSLOptions configuration to utilize SSL.
withSsl(ValueProvider<SSLOptions>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Optionally, specify SSLOptions configuration to utilize SSL.
withSsl(SSLOptions) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Optionally, specify SSLOptions configuration to utilize SSL.
withSsl(ValueProvider<SSLOptions>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Optionally, specify SSLOptions configuration to utilize SSL.
withSSL(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Define if a SSL connection to Redis server should be used.
withSSLEnabled(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Enable ssl for connection.
withSSLEnabled(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Enable ssl for connection.
withSSLInvalidHostNameAllowed(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Enable invalidHostNameAllowed for ssl for connection.
withSSLInvalidHostNameAllowed(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Enable invalidHostNameAllowed for ssl for connection.
withStagingBucketName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
Name of the cloud bucket (currently GCS) to use as a temporary location for CSVs during the COPY statement.
withStagingBucketName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
withStagingBucketName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Name of the cloud bucket (currently GCS) to use as a temporary location for CSVs during the COPY statement.
withStagingBucketName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
 
withStartingDay(int, int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
withStartingMonth(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
withStartingYear(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
withStartKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns new ByteKeyRange like this one, but with the specified start key.
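A small sketch of adjusting a range's start key (the key bytes are arbitrary placeholders):

    ByteKeyRange range =
        ByteKeyRange.ALL_KEYS.withStartKey(ByteKey.copyFrom(new byte[] {0x10, 0x00}));
    // The result starts at the new key and leaves the end of the range unchanged.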
withStartOffset(Long) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Inclusive start offset from which reading should begin.
withStartOffset(Long) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
Inclusive start offset from which reading should begin.
withStartPollTimeoutSec(Long) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Delay in seconds before polling starts.
withStartPollTimeoutSec(Long) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
Waiting time after the Receiver starts.
withStartReadTime(Instant) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Use timestamp to set up start offset.
withStartTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will start streaming at the specified start time.
withStartTimestamp(Long) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
withStatement(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
withStatement(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
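A hedged sketch of JdbcIO.Write.withStatement in context (driver, URL, credentials, and table name are placeholders; the element type KV<Integer, String> is assumed):

    rows.apply(
        JdbcIO.<KV<Integer, String>>write()
            .withDataSourceConfiguration(
                JdbcIO.DataSourceConfiguration.create(
                        "org.postgresql.Driver", "jdbc:postgresql://localhost/db")
                    .withUsername("user")
                    .withPassword("pass"))
            .withStatement("INSERT INTO example_table VALUES (?, ?)")
            .withPreparedStatementSetter(
                (element, statement) -> {
                  statement.setInt(1, element.getKey());
                  statement.setString(2, element.getValue());
                }));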
withStatementPreparator(JdbcIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withStatementPreparator(JdbcIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadRows
 
withStatementPreparator(SingleStoreIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
withStatusCode(Integer) - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
Assigns a return status code to assist with debugging.
withStatusMessage(String) - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
Assigns a return status message to assist with debugging.
withStopReadTime(Instant) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Use timestamp to set up stop offset.
withStopTime(Instant) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
 
withStorageClient(BigQueryServices.StorageClient) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
withStorageIntegrationName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
Name of the Storage Integration in Snowflake to be used.
withStorageIntegrationName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
 
withStorageIntegrationName(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Name of the Storage Integration in Snowflake to be used.
withStorageIntegrationName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
 
withStreamName(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specify reading from streamName.
withStreamName(String) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
Kinesis stream name which will be used for writing (required).
withStreamName(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specify reading from streamName.
withStreamName(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
Specify the Kinesis stream name to be used for writing; this name is required.
withSubmissionMode(SolaceIO.SubmissionMode) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
This setting controls the JCSMP property MESSAGE_CALLBACK_ON_REACTOR.
withSuccessfulInsertsPropagation(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, it enables the propagation of the successfully inserted TableRows on BigQuery as part of the WriteResult object when using BigQueryIO.Write.Method.STREAMING_INSERTS.
withSuffix(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Configures the filename suffix for written files.
withSuffix(String) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withSuffix(String) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Configures the filename suffix for written files.
withSuffix(String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
Sets the suffix.
withSuffix(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a common suffix to use for all generated filenames, if using the default file naming.
withSuffix(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
withSuffix(String) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Configures the filename suffix for written files.
withSuffix(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Configures the filename suffix for written files.
withSuffix(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
withSuffix(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes to the file(s) with the given filename suffix.
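For example, a sketch combining the withShardNameTemplate and withSuffix options above on TextIO (the output prefix and shard count are placeholders; lines is an assumed PCollection<String>):

    lines.apply(
        TextIO.write()
            .to("gs://my-bucket/output/part")
            .withShardNameTemplate("-SS-of-NN")
            .withSuffix(".txt")
            .withNumShards(3));
    // Produces part-00-of-03.txt, part-01-of-03.txt, part-02-of-03.txt.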
withSyncInterval(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Sets the approximate number of uncompressed bytes to write in each block for the AVRO container format.
withSyncInterval(int) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withTable(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra table where to read data.
withTable(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the Cassandra table where to read data.
withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTable(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
Sets the table name to read from.
withTable(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
Sets the table name to write to; the table should exist beforehand.
withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
Name of the table in the external database.
withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
withTable(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
withTable(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Read
Reads from the specified table.
withTable(String) - Method in class org.apache.beam.sdk.io.kudu.KuduIO.Write
Writes to the specified table.
withTable(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Read
 
withTable(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.ReadWithPartitions
 
withTable(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
 
withTableDescription(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies the table description.
withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the specified table.
withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write to the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write to the specified table.
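A hedged sketch of a Bigtable write using the table setters above (project, instance, and table ids are placeholders; mutations is an assumed PCollection<KV<ByteString, Iterable<Mutation>>>):

    mutations.apply(
        BigtableIO.write()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table"));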
withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Reads from the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
Writes to the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.WriteRowMutations
Writes to the specified table.
withTableProvider(TableProvider) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
This method creates BeamSqlEnv using empty Pipeline Options.
withTableProvider(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
withTableReference(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
withTableSchema(TableSchema) - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
Set TableSchema.
withTableSchema(SnowflakeTableSchema) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
Table schema to be used during creating table.
withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Set the base directory used to generate temporary files.
withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Set the base directory used to generate temporary files.
withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Set the base directory used to generate temporary files.
withTempDirectory(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
Specifies a directory into which all temporary files will be placed.
withTempDirectory(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Set the base directory used to generate temporary files.
withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Set the base directory used to generate temporary files.
withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Set the base directory used to generate temporary files.
withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.Write
withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.Write
withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Use new template-compatible source implementation.
withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
withTempLocation(String) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Sets the path to a temporary location where the sorter writes intermediate files.
withTerminationCondition(Watch.Growth.TerminationCondition<HCatalogIO.Read, ?>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
If specified, the poll function will stop polling after the termination condition has been satisfied.
withTerminationPerInput(Watch.Growth.TerminationCondition<InputT, ?>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
Specifies a Watch.Growth.TerminationCondition that will be independently used for every input.
withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withThrottleDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the amount of time an attempt will be throttled if deemed necessary based on previous success rate.
withThrottlingReportTargetMs(int) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
This method has been deprecated in Beam 2.60.0. It does not have an effect.
withThrottlingTargetMs(int) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
This method has been deprecated in Beam 2.60.0. It does not have an effect.
withThrowWriteErrors(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Whether to throw runtime exceptions when write (IO) errors occur.
withThrowWriteErrors(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withTikaConfigPath(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
withTikaConfigPath(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
Like with(tikaConfigPath).
withTime(Long) - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
Assigns time value to the event metadata.
withTimeDomain(TimeDomain) - Method in class org.apache.beam.sdk.transforms.Deduplicate.KeyedValues
Returns a KeyedValues PTransform like this one but with the specified time domain.
withTimeDomain(TimeDomain) - Method in class org.apache.beam.sdk.transforms.Deduplicate.Values
Returns a Values PTransform like this one but with the specified time domain.
withTimeDomain(TimeDomain) - Method in class org.apache.beam.sdk.transforms.Deduplicate.WithRepresentativeValues
Returns a WithRepresentativeValues PTransform like this one but with the specified time domain.
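A minimal sketch of deduplication in processing time (events is an assumed PCollection<String>, the 10-minute window is arbitrary, and withDuration is the companion setter assumed here):

    PCollection<String> deduped =
        events.apply(
            Deduplicate.<String>values()
                .withTimeDomain(TimeDomain.PROCESSING_TIME)
                .withDuration(Duration.standardMinutes(10)));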
withTimeout(Duration) - Method in class org.apache.beam.io.requestresponse.RequestResponseIO
Overrides RequestResponseIO.DEFAULT_TIMEOUT, the expected timeout of all user custom code.
withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
Define the Redis connection timeout.
withTimeout(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadKeyPatterns
 
withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
Set the connection timeout for the Redis server connection.
withTimePartitioning(TimePartitioning) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows newly created tables to include a TimePartitioning class.
withTimePartitioning(ValueProvider<TimePartitioning>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
withTimestamp(Instant) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
Sets the timestamp of the element in the PCollection, used to output the WriteSummary to the same window from which the inputDoc originated.
withTimestamp(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub and adds each record's timestamp to the published messages in an attribute with the specified name.
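Sketch of reading and writing Pub/Sub where the event time is carried in a "ts" attribute (subscription and topic names are placeholders):

    PCollection<String> messages =
        p.apply(
            PubsubIO.readStrings()
                .fromSubscription("projects/my-project/subscriptions/my-sub")
                .withTimestampAttribute("ts"));

    messages.apply(
        PubsubIO.writeStrings()
            .to("projects/my-project/topics/my-topic")
            .withTimestampAttribute("ts"));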
withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Override the default TimestampCombiner, to control the output timestamp of values output from a GroupByKey operation.
withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
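For instance, a sketch that keeps end-of-window timestamps after a later GroupByKey (input is an assumed PCollection<KV<String, Long>>; the one-minute window size is arbitrary):

    PCollection<KV<String, Long>> windowed =
        input.apply(
            Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardMinutes(1)))
                .withTimestampCombiner(TimestampCombiner.END_OF_WINDOW));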
withTimestampFn(SerializableFunction<KinesisRecord, Instant>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
Specify the SerializableFunction to extract the event time from a KinesisRecord.
withTimestampFn(SerializableFunction<Long, Instant>) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies the function to use to assign timestamps to the elements.
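A small sketch assigning synthetic event times to a generated sequence (the one-second spacing is arbitrary):

    PCollection<Long> ticks =
        p.apply(
            GenerateSequence.from(0)
                .to(1000)
                .withTimestampFn(n -> new Instant(n * 1000L)));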
withTimestampFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Deprecated.
withTimestampFn(SerializableFunction<KafkaRecord<K, V>, Instant>) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
withTimestampFn(SerializableFunction<KinesisRecord, Instant>) - Method in class org.apache.beam.sdk.io.kinesis.WatermarkParameters
Specify the SerializableFunction to extract the event time from a KinesisRecord.
withTimestampFn(SerializableFunction<T, Instant>) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
The timestamp function, used for estimating the watermark, maps the record T to an Instant.
withTimestampFn(SerializableFunction<V, Instant>) - Method in class org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO.Read
A function to calculate timestamp for a record.
withTimestampFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Deprecated.
withTimestampPolicyFactory(TimestampPolicyFactory<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Provide custom TimestampPolicyFactory to set event times and watermark for each partition.
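A hedged sketch using the built-in log-append-time policy (broker and topic are placeholders):

    KafkaIO.<Long, String>read()
        .withBootstrapServers("broker:9092")
        .withTopic("events")
        .withKeyDeserializer(LongDeserializer.class)
        .withValueDeserializer(StringDeserializer.class)
        .withTimestampPolicyFactory(TimestampPolicyFactory.withLogAppendTime());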
WithTimestamps<T> - Class in org.apache.beam.sdk.transforms
A PTransform for assigning timestamps to all the elements of a PCollection.
withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
withToDateTime(String) - Method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Read
Read metric data up to the toDateTime.
withTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Specify the JMS topic destination name where to receive messages from.
withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS topic destination name where to send messages to.
withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets the topic to read from.
withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withTopic(String), used to keep the compatibility with old API based on KV type of element.
withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Sets the default Kafka topic to write to.
withTopic(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
Set up the MQTT getTopic pattern.
withTopic(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Read
 
withTopic(String) - Method in class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
 
withTopicArn(String) - Method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
SNS topic ARN used for publishing to SNS.
withTopicFn(SerializableFunction<InputT, String>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
withTopicName(String) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
Specify the SNS topic to be used for writing; this name is mandatory.
withTopicNameMapper(SerializableFunction<EventT, String>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS topic destination name where to send messages to dynamically.
withTopicPartitions(List<TopicPartition>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a list of partitions to read from.
withTopicPattern(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Internally sets a Pattern of topics to read from.
withTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a list of topics to read from.
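Sketch of a read restricted by a topic list and start timestamp (broker, topics, and timestamp are placeholders), using withoutMetadata() to drop the KafkaRecord wrapper:

    PCollection<KV<Long, String>> records =
        p.apply(
            KafkaIO.<Long, String>read()
                .withBootstrapServers("broker:9092")
                .withTopics(Arrays.asList("clicks", "views"))
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                .withStartReadTime(Instant.parse("2024-01-01T00:00:00Z"))
                .withoutMetadata());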
withTotalStreamTimeMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the total streaming time (in millis) for this record.
withTraceSampleProbability(Double) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Deprecated.
This configuration has no effect, as tracing is not available.
withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withTransactionConfig(TransactionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
withTransactionConfig(TransactionConfig) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withTransactionConfig(ValueProvider<TransactionConfig>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withTrigger(Trigger) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the trigger set to wildcardTrigger.
withTriggeringFrequency(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Choose the frequency at which file writes are triggered.
withTriggeringFrequency(Duration) - Method in class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
Sets the frequency at which data is written to files and a new Snapshot is produced.
withTrustSelfSignedCerts(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch uses SSL/TLS then configure whether to trust self signed certs or not.
withType(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Field
Returns a copy of the Field with the Schema.FieldType set.
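A one-line sketch: rebuild a field with a widened type while keeping its name (the field name is a placeholder):

    Schema.Field ageAsLong =
        Schema.Field.of("age", Schema.FieldType.INT32).withType(Schema.FieldType.INT64);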
withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
Returns a Create.TimestampedValues PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
Returns a Create.Values PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
Returns a Create.WindowedValues PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
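A minimal sketch pinning the element type on Create so coder inference has something to work with (the element values are placeholders):

    PCollection<String> names =
        p.apply(Create.of("ada", "grace").withType(TypeDescriptor.of(String.class)));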
withTypeFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide a function to extract the target type from the document allowing for dynamic document routing.
withTypeFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withTypeHint(Class<?>, Schema.FieldType) - Method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
Specifies the field type of arguments.
withUnwindMapName(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withUnwindMapName(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
withUpdateConfiguration(UpdateConfiguration) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
withUpdateFields(UpdateField...) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
Sets the configurations for multiple updates.
withUpdateKey(String) - Method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
 
withUpperBound(PartitionColumnT) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadWithPartitions
 
withUpsertScript(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Whether to use scripted updates and what script to use.
withUpsertScript(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withUpToDateThreshold(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Read
Specifies how late records consumed by this source can be to still be considered on time.
withUpToDateThreshold(Duration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Deprecated.
Specifies how late records consumed by this source can be to still be considered on time.
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Define the location of the MongoDB instances using an URI.
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
Define the location of the MongoDB instances using an URI.
withUri(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
 
withUri(String) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
 
withUrl(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withUrl(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withUrl(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets the URL of the Snowflake server in the following format: jdbc:snowflake://.snowflakecomputing.com
withUrls(List<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withUrls(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withUseCorrelationId(boolean) - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Read
Toggles deduplication of messages based on the amqp correlation-id property on incoming messages.
withUsePartialUpdate(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
Provide an instruction to control whether partial updates or inserts (default) are issued to Elasticsearch.
withUsePartialUpdate(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withUserDataMapper(SingleStoreIO.UserDataMapper<T>) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
 
withUserDataMapper(SnowflakeIO.UserDataMapper<T>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
User-defined function mapping user data into CSV lines.
withUsername(String) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the username to connect to your database.
withUsername(ValueProvider<String>) - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
Sets the username to connect to your database.
withUsername(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the username for authentication.
withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
Specify the username for authentication.
withUsername(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the username used for authentication.
withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
Specify the username for authentication.
withUsername(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch authentication is enabled, provide the username.
withUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the username to connect to the JMS broker (authenticated).
withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Define the username to connect to the JMS broker (authenticated).
withUsername(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
withUsername(String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
 
withUsername(String) - Method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO.DataSourceConfiguration
 
withUsernamePasswordAuth(String, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets username/password authentication.
withUsernamePasswordAuth(ValueProvider<String>, ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets username/password authentication.
withUsesReshuffle(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ParseFiles
Specifies if a Reshuffle should run before file reads occur.
withUsesReshuffle(boolean) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.ReadFiles
Specifies if a Reshuffle should run before file reads occur.
withUseStatefulBatches(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BulkIO
Whether or not to use Stateful Processing to ensure bulk requests have the desired number of entities.
withUseStatefulBatches(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
withValidate(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withValidation() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Enable validation of the PubSub Read.
withValidation() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Enable validation of the PubSub Write.
withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
After creation we will validate that PipelineOptions conforms to all the validation criteria from <T>.
withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory
After creation we will validate that <T> conforms to all the validation criteria.
withValidationEventHandler(ValidationEventHandler) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets the ValidationEventHandler to use with JAXB.
withValidationEventHandler(ValidationEventHandler) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
withValueClass(Class<V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Read
Sets a value class.
withValueClass(Class<V>) - Method in class org.apache.beam.sdk.io.cdap.CdapIO.Write
Sets a value class.
withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
withValueDeserializer(DeserializerProvider<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer for interpreting value bytes read from Kafka along with a Coder for helping the Beam runner materialize value objects at runtime if necessary.
withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Sets a Kafka Deserializer for interpreting value bytes read from Kafka along with a Coder for helping the Beam runner materialize value objects at runtime if necessary.
withValueDeserializerProvider(DeserializerProvider<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
 
withValueDeserializerProviderAndCoder(DeserializerProvider<V>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
withValueField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
withValueMapper(SerializableBiFunction<EventT, Session, Message>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Map the EventT object to a Message.
withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Wrapper method over KafkaIO.WriteRecords.withValueSerializer(Class), used to keep the compatibility with old API based on KV type of element.
withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
Sets a Serializer for serializing value to bytes.
withValueTranslation(SimpleFunction<?, V>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Transforms the values read from the source using the given value translation function.
withValueTranslation(SimpleFunction<?, V>, Coder<V>) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
Transforms the values read from the source using the given value translation function.
withWallTimeWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
Use the WatermarkEstimators.WallTime as the watermark estimator.
withWarehouse(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets Snowflake Warehouse to use.
withWarehouse(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
Sets Snowflake Warehouse to use.
withWatermark(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
Returns a new Watch.Growth.PollResult like this one with the provided watermark.
withWatermarkFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Deprecated.
withWatermarkFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Deprecated.
withWatermarkIdleDurationThreshold(Duration) - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
Specify the watermark idle duration to consider before advancing the watermark.
withWatermarkIdleDurationThreshold(Duration) - Method in class org.apache.beam.sdk.io.kinesis.WatermarkParameters
Specify the watermark idle duration to consider before advancing the watermark.
withWatermarkIdleDurationThreshold(Duration) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Read
Optional.
withWindowCoder(Coder<? extends BoundedWindow>) - Method in class org.apache.beam.sdk.transforms.Create.WindowedValues
Returns a Create.WindowedValues PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
withWindowedWrites() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.TypedWrite
Preserves windowing of input elements and writes them to files based on the element's window.
withWindowedWrites() - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Write
withWindowedWrites() - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Preserves windowing of input elements and writes them to files based on the element's window.
withWindowedWrites() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
Specify that writes are windowed.
withWindowedWrites() - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Preserves windowing of input elements and writes them to files based on the element's window.
withWindowedWrites() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Preserves windowing of input elements and writes them to files based on the element's window.
withWindowedWrites() - Method in class org.apache.beam.sdk.io.TextIO.Write
withWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that preserves windowing on its input.
withWindowFn(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the window function set to wildcardWindowFn.
withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.csv.CsvIO.Write
Returns a transform for writing to text files like this one but that has the given FileBasedSink.WritableByteChannelFactory to be used by the FileBasedSink during output.
withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.json.JsonIO.Write
Returns a transform for writing to text files like this one but that has the given FileBasedSink.WritableByteChannelFactory to be used by the FileBasedSink during output.
withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
Returns a transform for writing to text files like this one but that has the given FileBasedSink.WritableByteChannelFactory to be used by the FileBasedSink during output.
withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.TextIO.Write
See TypedWrite#withWritableByteChannelFactory(WritableByteChannelFactory).
withWriteDisposition(BigQueryIO.Write.WriteDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies what to do with existing data in the table, in case the table already exists.
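A hedged sketch pairing the write disposition with a create disposition (the table spec is a placeholder; rows is an assumed PCollection<TableRow> and tableSchema an assumed TableSchema built elsewhere):

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withSchema(tableSchema) // assumed com.google.api.services.bigquery.model.TableSchema
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));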
withWriteDisposition(WriteDisposition) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
A disposition to be used during writing to table phase.
withWriteRequestMapperFn(SerializableFunction<T, KV<String, WriteRequest>>) - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
 
withWriteRequestMapperFn(SerializableFunction<T, KV<String, WriteRequest>>) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
 
withWriteResults() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a BigtableIO.WriteWithResults that will emit a BigtableWriteResult for each batch of rows written.
withWriteResults(JdbcIO.RowMapper<V>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
Returns a JdbcIO.WriteWithResults transform that can return a specific result.
withWriterType(SolaceIO.WriterType) - Method in class org.apache.beam.sdk.io.solace.SolaceIO.Write
Set the type of writer used by the connector.
withWriteTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Temporary dataset.
withWriteTransaction() - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.ReadAll
 
witValueField(String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
Set the name of the value field in the resulting schema.
WordCount - Class in org.apache.beam.runners.spark.structuredstreaming.examples
Duplicated from beam-examples-java to avoid dependency.
WordCount() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount
 
WordCount.CountWords - Class in org.apache.beam.runners.spark.structuredstreaming.examples
A PTransform that converts a PCollection containing lines of text into a PCollection of formatted word counts.
WordCount.FormatAsTextFn - Class in org.apache.beam.runners.spark.structuredstreaming.examples
A SimpleFunction that converts a Word and Count into a printable string.
WordCount.WordCountOptions - Interface in org.apache.beam.runners.spark.structuredstreaming.examples
Options supported by WordCount.
WorkerLogLevelOverrides() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Deprecated.
 
workerStatus(StreamObserver<BeamFnApi.WorkerStatusRequest>) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
 
wrap(Throwable) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
wrap(String) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
wrapDescriptorProto(DescriptorProtos.DescriptorProto) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
WrappedList(List<Object>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
 
WrappedMap(Map<Object, Object>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
 
WrappedRow(Row) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
 
WrappedSupervisor - Class in org.apache.beam.sdk.io.sparkreceiver
Wrapper class for ReceiverSupervisor that doesn't use Spark Environment.
WrappedSupervisor(Receiver<?>, SparkConf, SerializableFunction<Object[], Void>) - Constructor for class org.apache.beam.sdk.io.sparkreceiver.WrappedSupervisor
 
wrapping(StreamObserver<V>) - Static method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
Create a new SynchronizedStreamObserver which will delegate all calls to the underlying StreamObserver, synchronizing access to that observer.
wrapProcessContext(DoFn<InputT, ?>.ProcessContext) - Static method in class org.apache.beam.sdk.transforms.Contextful.Fn.Context
Convenience wrapper for creating a Contextful.Fn.Context from a DoFn.ProcessContext, to support the common case when a PTransform is invoking the closure from inside a DoFn.
wrapSegment(String) - Static method in class org.apache.beam.sdk.metrics.Lineage
Wraps a segment into a valid segment name.
WritableCoder<T extends Writable> - Class in org.apache.beam.sdk.io.hadoop
A WritableCoder is a Coder for a Java class that implements Writable.
WritableCoder(Class<T>) - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder
 
WritableCoder.WritableCoderProviderRegistrar - Class in org.apache.beam.sdk.io.hadoop
A CoderProviderRegistrar which registers a CoderProvider which can handle writable types.
WritableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
 
write(Kryo, Output) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues.BaseSideInputValues
 
write(ElementT) - Method in class org.apache.beam.sdk.extensions.avro.io.AvroIO.Sink
 
write(Class<T>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Writes a PCollection to an Avro file (or multiple Avro files matching a sharding pattern).
write(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingSeekableByteChannel
 
write(ByteBuffer) - Method in class org.apache.beam.sdk.extensions.gcp.util.channels.CountingWritableByteChannel
 
write(int) - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
write(byte[], int, int) - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
write() - Static method in class org.apache.beam.sdk.io.amqp.AmqpIO
 
Write() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO
Deprecated.
 
Write() - Constructor for class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Write
Deprecated.
 
write() - Static method in class org.apache.beam.sdk.io.aws.sns.SnsIO
Deprecated.
 
Write() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
Deprecated.
 
write() - Static method in class org.apache.beam.sdk.io.aws.sqs.SqsIO
Deprecated.
 
Write() - Constructor for class org.apache.beam.sdk.io.aws.sqs.SqsIO.Write
Deprecated.
 
write() - Static method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
 
Write() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO
Returns a new KinesisIO.Write transform for writing to Kinesis.
Write() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.aws2.sns.SnsIO
 
Write() - Constructor for class org.apache.beam.sdk.io.aws2.sns.SnsIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
Deprecated.
Use SqsIO.writeBatches() for more configuration options.
Write() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsIO.Write
Deprecated.
 
write() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
Provide a CassandraIO.Write PTransform to write data to a Cassandra database.
Write() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.cdap.CdapIO
 
Write() - Constructor for class org.apache.beam.sdk.io.cdap.CdapIO.Write
 
write(String, String) - Static method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
 
Write() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
 
write(String, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
Instantiates a CsvIO.Write for writing user types in CSVFormat format.
Write() - Constructor for class org.apache.beam.sdk.io.csv.CsvIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
Write() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
write(OutputT) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Called for each value in the bundle.
write(ElementT) - Method in interface org.apache.beam.sdk.io.FileIO.Sink
Appends a single element to the file.
write() - Static method in class org.apache.beam.sdk.io.FileIO
Writes elements to files using a FileIO.Sink.
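As a hedged sketch of the FileIO.write() path (output location, suffix, and shard count are placeholders; lines is an assumed PCollection<String>):

    lines.apply(
        FileIO.<String>write()
            .via(TextIO.sink())
            .to("gs://my-bucket/output")
            .withSuffix(".txt")
            .withNumShards(5));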
Write() - Constructor for class org.apache.beam.sdk.io.FileIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection to a BigQuery table.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.Write.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
write() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.Write builder.
write() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
The class returned by this method provides the ability to create PTransforms for write operations available in the Firestore V1 API provided by FirestoreStub.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
 
Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
 
Write() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
write(PublisherOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Write messages to Pub/Sub Lite.
write(Object, Encoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
Serializes a Timestamp received as datum to the output encoder out.
write() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Creates an uninitialized instance of SpannerIO.Write.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
Creates a Write.Builder for creating a Write transformation.
write() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
Creates an uninitialized HBaseIO.Write.
write() - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
Write data to Hive.
Write() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
 
Write() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Write data to a JDBC datasource.
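
A minimal sketch along the lines of the JdbcIO usage pattern, assuming an upstream PCollection<KV<Integer, String>> named kvs and placeholder driver, URL, credentials, and table:

    kvs.apply(JdbcIO.<KV<Integer, String>>write()
        .withDataSourceConfiguration(
            JdbcIO.DataSourceConfiguration.create(
                    "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb")
                .withUsername("user")
                .withPassword("password"))
        .withStatement("INSERT INTO results VALUES (?, ?)")
        .withPreparedStatementSetter((element, statement) -> {
          // Bind the key and value of each element to the INSERT statement.
          statement.setInt(1, element.getKey());
          statement.setString(2, element.getValue());
        }));
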
write() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
Write() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Write
 
write(String) - Static method in class org.apache.beam.sdk.io.json.JsonIO
Instantiates a JsonIO.Write for writing user types in JSON format.
Write() - Constructor for class org.apache.beam.sdk.io.json.JsonIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
Creates an uninitialized KafkaIO.Write PTransform.
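
A minimal sketch (assumed names, not from the Javadoc), writing an upstream PCollection<KV<Long, String>> named kvs to a placeholder topic:

    // Key/value serializers come from org.apache.kafka.common.serialization.
    kvs.apply(KafkaIO.<Long, String>write()
        .withBootstrapServers("broker-1:9092,broker-2:9092")
        .withTopic("results")
        .withKeySerializer(LongSerializer.class)
        .withValueSerializer(StringSerializer.class));
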
Write() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.kinesis.KinesisIO
Deprecated.
A PTransform writing data to Kinesis.
Write() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
Deprecated.
 
write() - Static method in class org.apache.beam.sdk.io.kudu.KuduIO
 
Write() - Constructor for class org.apache.beam.sdk.io.kudu.KuduIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
Write data to GridFS.
write(MongoDbGridFSIO.WriteFn<T>) - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
 
Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
write(T, OutputStream) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.WriteFn
Output the object to the given OutputStream.
write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
Write data to MongoDB.
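
A minimal sketch (assumed names, not from the Javadoc), writing an upstream PCollection<Document> named documents to a placeholder MongoDB instance:

    documents.apply(MongoDbIO.write()
        .withUri("mongodb://localhost:27017")   // placeholder connection URI
        .withDatabase("my_db")
        .withCollection("my_collection"));
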
Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
Write() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
write(GenericRecord) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
write() - Static method in class org.apache.beam.sdk.io.pulsar.PulsarIO
Write to Apache Pulsar.
Write() - Constructor for class org.apache.beam.sdk.io.pulsar.PulsarIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO
 
Write() - Constructor for class org.apache.beam.sdk.io.rabbitmq.RabbitMqIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
Write data to a Redis server.
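
A minimal sketch (assumed names, not from the Javadoc), writing an upstream PCollection<KV<String, String>> named kvs to a placeholder Redis endpoint:

    // Each KV element is written as a Redis key/value pair.
    kvs.apply(RedisIO.write().withEndpoint("redis-host", 6379));
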
Write() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
Write data to a SingleStoreDB datasource.
Write() - Constructor for class org.apache.beam.sdk.io.singlestore.SingleStoreIO.Write
 
write(SnowflakeBatchServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceImpl
Writes data to Snowflake in batch mode.
write(SnowflakeBatchServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.BatchService
 
write(SnowflakeStreamingServiceConfig) - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices.StreamingService
 
write(SnowflakeStreamingServiceConfig) - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceImpl
Writes data to Snowflake in streaming mode.
write() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO
Write data to Snowflake via a COPY statement.
Write() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Write
 
write(SerializableFunction<T, Solace.Record>) - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
Create a SolaceIO.Write transform, to write to Solace with a custom type.
write() - Static method in class org.apache.beam.sdk.io.solace.SolaceIO
Create a SolaceIO.Write transform, to write to Solace using Solace.Record objects.
Write() - Constructor for class org.apache.beam.sdk.io.solace.SolaceIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
 
Write() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.Write
 
write(String, String) - Static method in class org.apache.beam.sdk.io.splunk.SplunkIO
Write to Splunk's Http Event Collector (HEC).
write(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.splunk.SplunkIO
 
Write() - Constructor for class org.apache.beam.sdk.io.splunk.SplunkIO.Write
 
write(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
write() - Static method in class org.apache.beam.sdk.io.TextIO
A PTransform that writes a PCollection to a text file (or multiple text files matching a sharding pattern), with each element of the input collection encoded into its own line.
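
A minimal sketch (assumed names, not from the Javadoc), assuming an upstream PCollection<String> named lines and a placeholder output prefix:

    // With one shard and default naming this produces a file such as
    // /tmp/output/records-00000-of-00001.txt.
    lines.apply(TextIO.write()
        .to("/tmp/output/records")
        .withSuffix(".txt")
        .withNumShards(1));
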
write(byte[]) - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
 
write() - Static method in class org.apache.beam.sdk.io.TFRecordIO
A PTransform that writes a PCollection to a TFRecord file (or multiple TFRecord files matching a sharding pattern), with each element of the input collection encoded into its own record.
Write() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Write
 
write(T) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
 
write(T) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ThriftWriter
 
write(T) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
write() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
Writes all elements in the input PCollection to a single XML file using XmlIO.sink(java.lang.Class<T>).
Write() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Write
 
write(String) - Static method in class org.apache.beam.sdk.managed.Managed
Instantiates a Managed.ManagedTransform transform for the specified sink.
write(T) - Method in interface org.apache.beam.sdk.state.ValueState
Set the value.
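
A minimal stateful-DoFn sketch (assumed names, not from the Javadoc) showing ValueState.write(T) used to keep a per-key counter:

    new DoFn<KV<String, Long>, Void>() {
      @StateId("count")
      private final StateSpec<ValueState<Long>> countSpec =
          StateSpecs.value(VarLongCoder.of());

      @ProcessElement
      public void processElement(@StateId("count") ValueState<Long> count) {
        Long current = count.read();                        // null the first time a key is seen
        count.write((current == null ? 0L : current) + 1);  // store the updated count
      }
    };
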
WRITE_TRANSFORMS - Static variable in class org.apache.beam.sdk.managed.Managed
 
WRITE_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
WRITE_URN - Static variable in class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar
 
WRITE_URN - Static variable in class org.apache.beam.sdk.io.snowflake.crosslanguage.SnowflakeTransformRegistrar
 
writeArtifacts(RunnerApi.Pipeline, String) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
Stages all dependencies in pipeline into the jar file at outputStream, returning a new pipeline that references these artifacts as classpath artifacts.
writeAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
writeAvros(Class<T>, SerializableFunction<ValueInSingleWindow<T>, Map<String, String>>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
writeBatches() - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsIO
 
WriteBatches() - Constructor for class org.apache.beam.sdk.io.aws2.sqs.SqsIO.WriteBatches
 
WriteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
 
WriteBuilder() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.WriteBuilder
 
WriteBuilder - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
 
WriteBuilder() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder
 
WriteBuilder.Configuration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
Parameters class to expose the transform to an external SDK.
writeCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
writeCompressed(WritableByteChannel) - Method in enum org.apache.beam.sdk.io.Compression
 
writeCustomType() - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Deprecated.
Use AvroIO.writeCustomType(Class) instead and provide the custom record class.
writeCustomType(Class<OutputT>) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
A PTransform that writes a PCollection to an Avro file (or multiple Avro files matching a sharding pattern), with each element of the input collection encoded into its own record of type OutputT.
writeCustomType() - Static method in class org.apache.beam.sdk.io.TextIO
A PTransform that writes a PCollection to a text file (or multiple text files matching a sharding pattern), with each element of the input collection encoded into its own line.
writeCustomTypeToGenericRecords() - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Similar to AvroIO.writeCustomType(), but specialized for the case where the output type is GenericRecord.
writeDefaultJobName(JarOutputStream, String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
 
writeDetectNewPartitionMissingPartitions(HashMap<Range.ByteStringRange, Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Writes serialized missing partitions, and how long they have been missing, to the metadata table.
writeDetectNewPartitionVersion() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Set the version number for DetectNewPartition.
WriteDisposition - Enum in org.apache.beam.sdk.io.snowflake.enums
Enum containing all dispositions supported during the write-to-table phase.
writeDynamic() - Static method in class org.apache.beam.sdk.io.FileIO
Writes elements to files using a FileIO.Sink and grouping the elements using "dynamic destinations".
WriteErrorMetrics(String) - Constructor for class org.apache.beam.sdk.transforms.errorhandling.ErrorHandler.PTransformErrorHandler.WriteErrorMetrics
 
writeExternal(ObjectOutput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
WriteFailure(Write, WriteResult, Status) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
WriteFiles<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
A PTransform that writes to a FileBasedSink.
WriteFiles() - Constructor for class org.apache.beam.sdk.io.WriteFiles
 
WriteFilesResult<DestinationT> - Class in org.apache.beam.sdk.io
The result of a WriteFiles transform.
writeFooter() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Writes footer at the end of output files.
writeGenericRecords(Schema) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Writes Avro records of the specified schema.
writeGenericRecords(String) - Static method in class org.apache.beam.sdk.extensions.avro.io.AvroIO
Writes Avro records of the specified schema.
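
A minimal sketch (assumed names, not from the Javadoc), assuming an upstream PCollection<GenericRecord> named records, an org.apache.avro.Schema named schema, and a placeholder output prefix:

    records.apply(AvroIO.writeGenericRecords(schema)
        .to("/tmp/output/events")
        .withSuffix(".avro"));
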
writeGenericRecords() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection containing GenericRecords to a BigQuery table.
WriteGrouped(SpannerIO.Write) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
 
writeHeader() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Writes header at the beginning of output files.
WriteJmsResult<EventT> - Class in org.apache.beam.sdk.io.jms
Return type of JmsIO.Write transform.
WriteJmsResult(Pipeline, TupleTag<EventT>, PCollection<EventT>) - Constructor for class org.apache.beam.sdk.io.jms.WriteJmsResult
 
writeMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes to a Google Cloud Pub/Sub stream.
writeMessagesDynamic() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Enables dynamic destination topics.
writeMetrics(MetricQueryResults) - Method in interface org.apache.beam.sdk.metrics.MetricsSink
 
writeNewPartition(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
After a split or merge from a close stream, write the new partition's information to the metadata table.
WriteOperation(FileBasedSink<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Constructs a WriteOperation using the default strategy for generating a temporary directory from the base output filename.
WriteOperation(FileBasedSink<?, DestinationT, OutputT>, ResourceId) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Create a new WriteOperation.
writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection containing protocol buffer objects to a BigQuery table.
writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
writeProtos(Class<T>, SerializableFunction<ValueInSingleWindow<T>, Map<String, String>>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
Writer(FileBasedSink.WriteOperation<DestinationT, OutputT>, String) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.Writer
Construct a new FileBasedSink.Writer that will produce files of the given MIME type.
writeRecords() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
Creates an uninitialized KafkaIO.WriteRecords PTransform.
WriteRecords() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
 
WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.WriteRegistrar
 
WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
 
WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergSchemaTransformTranslation.WriteRegistrar
 
WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.WriteRegistrar
 
WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.upgrade.KafkaIOTranslation.WriteRegistrar
 
WriteResult - Class in org.apache.beam.sdk.io.gcp.bigquery
The result of a BigQueryIO.Write transform.
writeRowMutations() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
 
writeRows(String, CSVFormat) - Static method in class org.apache.beam.sdk.io.csv.CsvIO
Instantiates a CsvIO.Write for writing Rows in CSVFormat format.
writeRows(IcebergCatalogConfig) - Static method in class org.apache.beam.sdk.io.iceberg.IcebergIO
 
WriteRows() - Constructor for class org.apache.beam.sdk.io.iceberg.IcebergIO.WriteRows
 
writeRows(String) - Static method in class org.apache.beam.sdk.io.json.JsonIO
Instantiates a JsonIO.Write for writing Rows in JSON format.
writeRows() - Static method in class org.apache.beam.sdk.io.singlestore.SingleStoreIO
Write Beam Rows to a SingleStoreDB datasource.
writeStreams() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
Write stream data to a Redis server.
WriteStreams() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.WriteStreams
 
WriteStreamServiceImpl(BigQueryOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
writeStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes UTF-8 encoded strings to a Google Cloud Pub/Sub stream.
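
A minimal sketch (assumed names, not from the Javadoc), assuming an upstream PCollection<String> named messages and a placeholder topic:

    messages.apply(PubsubIO.writeStrings()
        .to("projects/my-project/topics/my-topic"));
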
WriteSuccessSummary(int, long) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
writeTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection containing TableRows to a BigQuery table.
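
A minimal sketch (assumed names, not from the Javadoc), assuming an upstream PCollection<TableRow> named rows, a TableSchema named tableSchema, and a placeholder table reference:

    rows.apply(BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")
        .withSchema(tableSchema)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
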
writeTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
Milliseconds to wait for a write on a socket before an exception is thrown.
writeTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
Milliseconds to wait for a write on a socket before an exception is thrown.
writeTo(OutputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Writes length bytes starting at offset from the backing data store to the specified output stream.
writeToPort(String, BeamFnApi.RemoteGrpcPort) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
Create a RemoteGrpcPortWrite which writes the RunnerApi.PCollection with the provided Pipeline id to the provided BeamFnApi.RemoteGrpcPort.
WriteToPulsarDoFn - Class in org.apache.beam.sdk.io.pulsar
Transform for writing to Apache Pulsar.
writeUnwind() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO
Write all rows using a Neo4j Cypher UNWIND statement.
WriteUnwind() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.WriteUnwind
 
writeUserEvent() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
 
writeVoid() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
 
WriteVoid() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteVoid
 
WriteWithResults() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.WriteWithResults
 
WS - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
WS - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 

X

XmlConfiguration() - Constructor for class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration.XmlConfiguration
 
xmlConfigurationBuilder() - Static method in class org.apache.beam.sdk.io.fileschematransform.FileWriteSchemaTransformConfiguration
 
XmlIO - Class in org.apache.beam.sdk.io.xml
Transforms for reading and writing XML files using JAXB mappers.
XmlIO() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO
 
XmlIO.Read<T> - Class in org.apache.beam.sdk.io.xml
Implementation of XmlIO.read().
XmlIO.Read.CompressionType - Enum in org.apache.beam.sdk.io.xml
Deprecated.
Use Compression instead.
XmlIO.ReadFiles<T> - Class in org.apache.beam.sdk.io.xml
Implementation of XmlIO.readFiles().
XmlIO.Sink<T> - Class in org.apache.beam.sdk.io.xml
 
XmlIO.Write<T> - Class in org.apache.beam.sdk.io.xml
Implementation of XmlIO.write().
XmlSource<T> - Class in org.apache.beam.sdk.io.xml
Implementation of XmlIO.read().
XmlWriteSchemaTransformFormatProvider - Class in org.apache.beam.sdk.io.fileschematransform
XmlWriteSchemaTransformFormatProvider() - Constructor for class org.apache.beam.sdk.io.fileschematransform.XmlWriteSchemaTransformFormatProvider
 

Y

yamlStringFromMap(Map<String, Object>) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
 
yamlStringToMap(String) - Static method in class org.apache.beam.sdk.schemas.utils.YamlUtils
 
YamlUtils - Class in org.apache.beam.sdk.schemas.utils
 
YamlUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.YamlUtils
 
years(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by years.

Z

ZERO_CURSOR - Static variable in class org.apache.beam.sdk.io.redis.RedisCursor
 
ZERO_KEY - Static variable in class org.apache.beam.sdk.io.redis.RedisCursor
 
ZETASQL_FUNCTION_GROUP_NAME - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCatalog
Same as Function.ZETASQL_FUNCTION_GROUP_NAME.
ZETASQL_NUMERIC_MAX_VALUE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
 
ZETASQL_NUMERIC_MIN_VALUE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
 
ZETASQL_NUMERIC_SCALE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlCalciteTranslationUtils
 
ZETASQL_TIMESTAMP_ADD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
 
ZetaSqlBeamTranslationUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql
Utility methods for ZetaSQL <=> Beam translation.
ZetaSqlCalciteTranslationUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql
Utility methods for ZetaSQL <=> Calcite translation.
ZetaSqlException - Exception in org.apache.beam.sdk.extensions.sql.zetasql
Exception to be thrown by the Beam ZetaSQL planner.
ZetaSqlException(StatusRuntimeException) - Constructor for exception org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlException
 
ZetaSqlException(String) - Constructor for exception org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlException
 
ZetaSQLQueryPlanner - Class in org.apache.beam.sdk.extensions.sql.zetasql
ZetaSQLQueryPlanner.
ZetaSQLQueryPlanner(FrameworkConfig) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
 
ZetaSQLQueryPlanner(JdbcConnection, Collection<RuleSet>) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
Called by BeamSqlEnv.instantiatePlanner() reflectively.
ZetaSqlScalarFunctionImpl - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
ZetaSQL-specific extension to ScalarFunctionImpl.
ZetaSqlUnnest - Class in org.apache.beam.sdk.extensions.sql.zetasql.unnest
This class is a copy of Uncollect.java in Calcite (https://github.com/apache/calcite/blob/calcite-1.20.0/core/src/main/java/org/apache/calcite/rel/core/Uncollect.java), except that deriveUncollectRowType() does not unwrap array elements of struct type.
ZetaSqlUnnest(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
Creates an Uncollect.
ZetaSqlUnnest(RelInput) - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
Creates an Uncollect by parsing serialized output.
ZetaSqlUserDefinedSQLNativeTableValuedFunction - Class in org.apache.beam.sdk.extensions.sql.impl
A class indicating that a TVF is a ZetaSQL SQL-native UDTVF.
ZetaSqlUserDefinedSQLNativeTableValuedFunction(SqlIdentifier, SqlReturnTypeInference, SqlOperandTypeInference, SqlOperandTypeChecker, List<RelDataType>, Function) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.ZetaSqlUserDefinedSQLNativeTableValuedFunction
 
ZstdCoder<T> - Class in org.apache.beam.sdk.coders
Wraps an existing coder with Zstandard compression.

_

_ATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
_ATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
_decisionToDFA - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
_decisionToDFA - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
_serializedATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
_serializedATN - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 
_sharedContextCache - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
 
_sharedContextCache - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
 