- abort() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
 
- 
De-registers the handler for all future requests for state for the registered process bundle
 instruction id.
 
- absolute(String, String...) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
- 
Construct a path from an absolute component path hierarchy.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array.BeamSqlArrayExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array.BeamSqlArrayItemExpression
 
-  
 
- accept(List<BeamSqlExpression>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlBinaryOperator
 
-  
 
- accept(BeamSqlExpression, BeamSqlExpression) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlBinaryOperator
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCaseExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCastExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCorrelVariableExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlDefaultExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlDotExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
- 
Assertion to make sure the input and output types are supported by this expression.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlInputRefExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlLocalRefExpression
 
-  
 
- accept(List<BeamSqlExpression>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlOperator
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlOperatorExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUdfExpression
 
-  
 
- accept(List<BeamSqlExpression>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUnaryOperator
 
-  
 
- accept(BeamSqlExpression) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUnaryOperator
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection.BeamSqlCardinalityExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection.BeamSqlSingleElementExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
 
- 
Compare operation must have 2 operands.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNotNullExpression
 
- 
Only one operand is required.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNullExpression
 
- 
Only one operand is required.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentDateExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimeExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimestampExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDatetimeMinusExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDatetimeMinusIntervalExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDatetimePlusExpression
 
- 
Requires exactly 2 operands.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlIntervalMultiplyExpression
 
- 
Requires exactly 2 operands.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlTimestampMinusIntervalExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlTimestampMinusTimestampExpression
 
- 
Requires exactly 2 operands.
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlLogicalExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map.BeamSqlMapExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map.BeamSqlMapItemExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathUnaryExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPiExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandIntegerExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.BeamSqlReinterpretExpression
 
-  
 
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.row.BeamSqlFieldAccessExpression
 
-  
 
- accept(WindowedValue<T>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver
 
-  
 
- accept(BeamFnApi.Elements.Data) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- accept(T) - Method in interface org.apache.beam.sdk.fn.data.FnDataReceiver
 
-  
 
- accept(T1, T2) - Method in interface org.apache.beam.sdk.fn.function.ThrowingBiConsumer
 
-  
 
- accept(T) - Method in interface org.apache.beam.sdk.fn.function.ThrowingConsumer
 
-  
 
- accept(T) - Method in class org.apache.beam.sdk.fn.stream.DataStreams.BlockingQueueIterator
 
-  
 
- accessPattern() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
-  
 
- AccumulatingCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
-  
 
- accumulatingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Returns a new Window PTransform that uses the registered WindowFn and triggering behavior, and that accumulates elements in a pane after they are triggered.
 
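As a sketch of how accumulatingFiredPanes() composes with a windowing configuration (the element values and one-minute window size are illustrative, not taken from the entry above):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

// Fixed one-minute windows whose panes, once fired by the processing-time
// trigger, accumulate previously emitted elements instead of discarding them.
Pipeline p = Pipeline.create();
PCollection<String> windowed =
    p.apply(Create.of("a", "b"))
        .apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
                .triggering(AfterProcessingTime.pastFirstElementInPane())
                .withAllowedLateness(Duration.ZERO)
                .accumulatingFiredPanes());
```

Note that a non-default trigger requires withAllowedLateness to be set as well.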
- ACCUMULATOR_NAME - Static variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
 
-  
 
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
 
-  
 
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
 
-  
 
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Return the ack deadline, in seconds, for subscription.
 
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Acknowledge messages from subscription with ackIds.
 
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
-  
 
- add(int, GlobalWatermarkHolder.SparkWatermarks) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
-  
 
- add(T, long, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
 
-  
 
- add(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
 
-  
 
- add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
-  
 
- add(List<ValueInSingleWindow<T>>, TableDataInsertAllResponse.InsertErrors, TableReference, ValueInSingleWindow<TableRow>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
-  
 
- add(InputT) - Method in interface org.apache.beam.sdk.state.GroupingState
 
- 
Add a value to the buffer.
 
- add(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
 
- 
For internal use only: no backwards compatibility guarantees.
 
- add(long) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
 
- 
Adds a value to the heap, returning whether the value is (large enough to be) in the heap.
 
- add(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
 
- 
Register the given display item.
 
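A minimal sketch of the add / addIfNotNull / addIfNotDefault family on DisplayData.Builder; the DoFn and item keys ("filePattern", "compression", "retries") are hypothetical:

```java
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.display.DisplayData;

// A hypothetical DoFn registering display metadata. Only "filePattern"
// survives: the null item is skipped by addIfNotNull, and the item equal
// to its default is skipped by addIfNotDefault.
class ReadFn extends DoFn<String, String> {
  private final String filePattern = "gs://bucket/input-*";

  @Override
  public void populateDisplayData(DisplayData.Builder builder) {
    super.populateDisplayData(builder);
    builder
        .add(DisplayData.item("filePattern", filePattern))
        .addIfNotNull(DisplayData.item("compression", (String) null))
        .addIfNotDefault(DisplayData.item("retries", 3), 3);
  }
}
```

DisplayData.from(new ReadFn()) would then collect the single registered item.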
- addAccum(AccumT) - Method in interface org.apache.beam.sdk.state.CombiningState
 
- 
Add an accumulator to this state cell.
 
- addAccumulator(NamedAggregators, NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.AggAccumParam
 
-  
 
- addAll(Map<Integer, Queue<GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
-  
 
- addArray(List<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
 
-  
 
- addArray(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
 
-  
 
- addArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addBoolean(Map<String, Object>, String, boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addCollectionToSingletonOutput(PCollection<?>, String, PCollectionView<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds an output to this CollectionToSingleton Dataflow step, consuming the specified
 input PValue and producing the specified output PValue.
 
- addDataSet(String, DataSet<T>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
-  
 
- addDataStream(String, DataStream<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
-  
 
- addDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addDouble(Map<String, Object>, String, Double) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addElements(T, T...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
 
- 
Adds the specified elements to the source with timestamp equal to the current watermark.
 
- addElements(TimestampedValue<T>, TimestampedValue<T>...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
 
- 
Adds the specified elements to the source with the provided timestamps.
 
- addEncodingInput(Coder<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Sets the encoding for this Dataflow step.
 
- addField(Schema.Field) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addFields(List<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addFields(Schema.Field...) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addIfAbsent(T) - Method in interface org.apache.beam.sdk.state.SetState
 
- 
Ensures a value is a member of the set, returning true if it was added and false otherwise.
 
- addIfNotDefault(DisplayData.ItemSpec<T>, T) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
 
- 
Register the given display item if the value is different than the specified default.
 
- addIfNotNull(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
 
- 
Register the given display item if the value is not null.
 
- addInPlace(NamedAggregators, NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.AggAccumParam
 
-  
 
- addInput(String, Boolean) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds an input with the given name and value to this Dataflow step.
 
- addInput(String, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds an input with the given name and value to this Dataflow step.
 
- addInput(String, Long) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds an input with the given name and value to this Dataflow step.
 
- addInput(String, PInput) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds an input with the given name to this Dataflow step, coming from the specified input
 PValue.
 
- addInput(String, Map<String, Object>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds an input that is a dictionary of strings to objects.
 
- addInput(String, List<? extends Map<String, Object>>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds an input that is a list of objects.
 
- addInput(HyperLogLogPlus, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
-  
 
- addInput(SketchFrequencies.Sketch<InputT>, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
-  
 
- addInput(MergingDigest, Double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
-  
 
- addInput(CovarianceAccumulator, KV<T, T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
-  
 
- addInput(VarianceAccumulator, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
-  
 
- addInput(BeamAggregationTransforms.AggregationAccumulator, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
 
-  
 
- addInput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique, T) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
-  
 
- addInput(InputT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
 
- 
Adds the given input value to this accumulator, modifying this accumulator.
 
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
-  
 
- addInput(double[], Double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
-  
 
- addInput(Combine.Holder<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
-  
 
- addInput(int[], Integer) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
-  
 
- addInput(long[], Long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
-  
 
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
 
- 
Adds the given input value to the given accumulator, returning the new accumulator value.
 
- addInput(List<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
-  
 
- addInput(Object[], DataT) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
-  
 
- addInput(Object[], DataT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
-  
 
- addInput(AccumT, InputT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
- 
Adds the given input value to the given accumulator, returning the new accumulator value.
 
- addInput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>, T) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
-  
 
- addInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addLengthPrefixedCoder(String, RunnerApi.Components.Builder, boolean) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
 
- 
 
- addList(Map<String, Object>, String, List<? extends Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addList(Map<String, Object>, String, T[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addLong(Map<String, Object>, String, long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addLongs(Map<String, Object>, String, long...) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addMessage(Message) - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
-  
 
- addMessageListener(Consumer<JobApi.JobMessage>) - Method in class org.apache.beam.runners.flink.FlinkJobInvocation
 
-  
 
- addMessageListener(Consumer<JobApi.JobMessage>) - Method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation
 
- 
Listen for job messages with a Consumer.
 
- addNameFilter(MetricNameFilter) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
- 
 
- addNull(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addNullableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addOutput(String, PCollection<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
 
- 
Adds a primitive output to this Dataflow step with the given name as the local output name,
 producing the specified output PValue, including its Coder if a TypedPValue.
 
- addOverrideForClass(Class<?>, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
 
- 
Deprecated.
Overrides the default log level for the passed in class.
 
- addOverrideForClass(Class<?>, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
 
- 
Overrides the default log level for the passed in class.
 
- addOverrideForName(String, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
 
- 
Deprecated.
Overrides the default log level for the passed in name.
 
- addOverrideForName(String, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
 
- 
Overrides the default log level for the passed in name.
 
- addOverrideForPackage(Package, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
 
- 
Deprecated.
Overrides the default log level for the passed in package.
 
- addOverrideForPackage(Package, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
 
- 
Overrides the default log level for the passed in package.
 
- addProperties(Configuration, Properties) - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
 
- 
Transfer the properties to the configuration object.
 
- addRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addRows(Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
- 
Add rows to the builder.
 
- addRows(String, Row...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- addRows(Duration, Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
- 
Add rows to the builder.
 
- addRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
 
- 
Creates a runner-side wire coder for a port read/write for the given PCollection.
 
- addSdkWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
 
- 
Creates an SDK-side wire coder for a port read/write for the given PCollection.
 
- addStateListener(Consumer<JobApi.JobState.Enum>) - Method in class org.apache.beam.runners.flink.FlinkJobInvocation
 
-  
 
- addStateListener(Consumer<JobApi.JobState.Enum>) - Method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation
 
- 
Listen for job state changes with a Consumer.
 
- addStep(PTransform<?, ?>, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Adds a step to the Dataflow workflow for the given transform, with the given Dataflow step
 type.
 
- addStep(String) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
- 
Add a step filter.
 
- addString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- addStringList(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- addTraceFor(AbstractGoogleClient, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
- 
 
- addTraceFor(AbstractGoogleClientRequest<?>, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
- 
 
- addValue(Object) - Method in class org.apache.beam.sdk.values.Row.Builder
 
-  
 
- addValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
 
-  
 
- addValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
 
-  
 
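The Row.Builder methods above pair with Schema.Builder (addStringField, addInt32Field, addNullableField, and friends, listed elsewhere in this index). A small sketch with illustrative field names:

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

// Build a schema, then a row whose values are supplied positionally,
// in schema field order.
Schema schema =
    Schema.builder()
        .addStringField("name")
        .addInt32Field("age")
        .addNullableField("nickname", Schema.FieldType.STRING)
        .build();

Row row = Row.withSchema(schema).addValues("alice", 30, null).build();
```

addValues matches values to fields by position, so the argument order must follow the schema's field order.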
- advance() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- advance(String) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
- 
Advances the watermarks to the next-in-line watermarks.
 
- advance() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
- 
 
- advance() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- advance() - Method in class org.apache.beam.sdk.io.Source.Reader
 
- 
Advances the reader to the next valid record.
 
- advance() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
Advances the reader to the next valid record.
 
- advanceBy(Duration) - Static method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
 
- 
For internal use only: no backwards compatibility guarantees.
 
- advanceImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
-  
 
- advanceImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
- 
Advances to the next record and returns true, or returns false if there is no next
 record.
 
- advanceNextBatchWatermarkToInfinity() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
- 
Advances the watermark in the next batch to the end-of-time.
 
- advanceProcessingTime(Duration) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
 
- 
Advance the processing time by the specified amount.
 
- advanceTo(Instant) - Static method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
 
- 
For internal use only: no backwards compatibility guarantees.
 
- advanceWatermark() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
- 
Advances the watermark.
 
- advanceWatermarkForNextBatch(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
 
- 
Advances the watermark in the next batch.
 
- advanceWatermarkTo(Instant) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
 
- 
Advance the watermark of this source to the specified instant.
 
- advanceWatermarkToInfinity() - Method in class org.apache.beam.sdk.testing.TestStream.Builder
 
- 
Advance the watermark to infinity, completing this TestStream.
 
 
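The TestStream.Builder methods indexed above (addElements, advanceWatermarkTo, advanceProcessingTime, advanceWatermarkToInfinity) chain into a deterministic event sequence; the element values and instants below are illustrative:

```java
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.testing.TestStream;
import org.apache.beam.sdk.values.TimestampedValue;
import org.joda.time.Duration;
import org.joda.time.Instant;

// Elements, watermark advances, and processing-time advances, terminated
// by advancing the watermark to the end of time.
TestStream<String> stream =
    TestStream.create(StringUtf8Coder.of())
        .addElements("a", "b")
        .advanceWatermarkTo(new Instant(100L))
        .addElements(TimestampedValue.of("late", new Instant(50L)))
        .advanceProcessingTime(Duration.standardMinutes(1))
        .advanceWatermarkToInfinity();
```

Each call records one event, so a runner replaying this stream sees two element batches, an intermediate watermark, a processing-time advance, and the final watermark.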
- AdvancingPhaser - Class in org.apache.beam.sdk.fn.stream
 
- 
A Phaser which never terminates.
 
- AdvancingPhaser(int) - Constructor for class org.apache.beam.sdk.fn.stream.AdvancingPhaser
 
-  
 
- AfterAll - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A composite Trigger that fires when all of its sub-triggers are ready.
 
 
- AfterEach - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A composite Trigger that executes its sub-triggers in order.
 
 
- AfterFirst - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A composite Trigger that fires once after at least one of its sub-triggers has fired.
 
 
- AfterPane - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A Trigger that fires at some point after a specified number of input elements have arrived.
 
 
- AfterProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A Trigger that fires at a specified point in processing time, relative to when input first arrives.
 
 
- AfterSynchronizedProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
 
- 
FOR INTERNAL USE ONLY.
 
- afterTimeSinceNewOutput(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
 
- afterTimeSinceNewOutput(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
 
- afterTotalOf(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
 
- afterTotalOf(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
 
- AfterWatermark - Class in org.apache.beam.sdk.transforms.windowing
 
- 
AfterWatermark triggers fire based on progress of the system watermark.
 
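The trigger classes indexed nearby (AfterPane, AfterProcessingTime, AfterWatermark, and AfterWatermark.AfterWatermarkEarlyAndLate) compose as in this sketch; the 30-second delay and one-element late firing are illustrative choices:

```java
import org.apache.beam.sdk.transforms.windowing.AfterPane;
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
import org.apache.beam.sdk.transforms.windowing.Trigger;
import org.joda.time.Duration;

// Fire when the watermark passes the end of the window, with speculative
// early panes every 30 seconds of processing time and one late pane per
// late element.
Trigger trigger =
    AfterWatermark.pastEndOfWindow()
        .withEarlyFirings(
            AfterProcessingTime.pastFirstElementInPane()
                .plusDelayOf(Duration.standardSeconds(30)))
        .withLateFirings(AfterPane.elementCountAtLeast(1));
```

Adding early or late firings yields an AfterWatermark.AfterWatermarkEarlyAndLate, the class listed below.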
- AfterWatermark.AfterWatermarkEarlyAndLate - Class in org.apache.beam.sdk.transforms.windowing
 
-  
 
- AfterWatermark.FromEndOfWindow - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A watermark trigger targeted relative to the end of the window.
 
- AggAccumParam - Class in org.apache.beam.runners.spark.aggregators
 
- 
Aggregator accumulator param.
 
- AggAccumParam() - Constructor for class org.apache.beam.runners.spark.aggregators.AggAccumParam
 
-  
 
- aggregate(Combine.CombineFn<InputT, ?, OutputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
- 
 
- aggregate(Combine.CombineFn<InputT, ?, OutputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
- 
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
- 
Build up an aggregation function over the input elements.
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
- 
Build up an aggregation function over the input elements.
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
- 
Build up an aggregation function over the input elements.
 
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
- 
Build up an aggregation function over the input elements.
 
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
 
- 
Build up an aggregation function over the input elements.
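The aggregateField/aggregateFields overloads above all build an aggregation of one or more fields over grouped input elements. As a rough, plain-Java sketch of what that means (java.util streams, not the Beam schema API; the Order record and field names are illustrative assumptions):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Conceptual sketch only: grouping by a field and aggregating another field,
// roughly what Group.byFields("user").aggregateField("amount", ..., "total")
// expresses in Beam. Record type and field names are hypothetical.
public class GroupAggregateSketch {
    public record Order(String user, long amount) {}

    public static Map<String, Long> sumAmountByUser(List<Order> orders) {
        return orders.stream().collect(
            Collectors.groupingBy(Order::user, Collectors.summingLong(Order::amount)));
    }

    public static void main(String[] args) {
        List<Order> orders = List.of(
            new Order("a", 3), new Order("b", 5), new Order("a", 4));
        // totals: a -> 7, b -> 5
        System.out.println(sumAmountByUser(orders));
    }
}
```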
 
- AggregationAccumulator() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAccumulator
 
-  
 
- AggregationAccumulatorCoder(List<Coder>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAccumulatorCoder
 
-  
 
- AggregationAdaptor(List<Pair<AggregateCall, String>>, Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
 
-  
 
- AggregationGroupByKeyFn(Schema, int, ImmutableBitSet) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationGroupByKeyFn
 
-  
 
- AggregatorMetric - Class in org.apache.beam.runners.spark.metrics
 
- 
 
- AggregatorMetricSource - Class in org.apache.beam.runners.spark.metrics
 
- 
 
- AggregatorMetricSource(String, NamedAggregators) - Constructor for class org.apache.beam.runners.spark.metrics.AggregatorMetricSource
 
-  
 
- AggregatorsAccumulator - Class in org.apache.beam.runners.spark.aggregators
 
- 
For resilience, Accumulators are required to be wrapped in a Singleton.
 
- AggregatorsAccumulator() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
 
-  
 
- AggregatorsAccumulator.AccumulatorCheckpointingSparkListener - Class in org.apache.beam.runners.spark.aggregators
 
- 
 
- align(Duration) - Method in interface org.apache.beam.sdk.state.Timer
 
- 
 
- alignedTo(Duration, Instant) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
- 
Aligns timestamps to the smallest multiple of period, counted from offset, that is
 greater than the timestamp.
 
- alignedTo(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
- 
Aligns the time to be the smallest multiple of period greater than the epoch boundary
 (aka new Instant(0)).
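The alignment described by these two overloads can be sketched in plain millisecond arithmetic (an assumption-level illustration, not the Beam implementation):

```java
// Conceptual sketch: the smallest multiple of `period` past `offset` that is
// strictly greater than `timestamp`. For the one-argument overload, offset is
// the epoch boundary (new Instant(0), i.e. 0 ms).
public class AlignSketch {
    public static long alignedTo(long timestampMillis, long periodMillis, long offsetMillis) {
        long elapsed = timestampMillis - offsetMillis;
        long periods = Math.floorDiv(elapsed, periodMillis) + 1; // next multiple, strictly greater
        return offsetMillis + periods * periodMillis;
    }

    public static void main(String[] args) {
        System.out.println(alignedTo(23, 10, 0)); // 30
        System.out.println(alignedTo(30, 10, 0)); // 40: already a multiple, so the next one
    }
}
```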
 
- alignTo(Duration, Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- alignTo(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- AlignTo() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
-  
 
- ALL_CONTEXTS - Static variable in class org.apache.beam.sdk.testing.CoderProperties
 
- 
All the contexts, for use in test cases.
 
- ALL_KEYS - Static variable in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
The range of all keys, with empty start and end keys.
 
- allFields() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
-  
 
- allMatches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- allMatches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- AllMatches(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.AllMatches
 
-  
 
- allocatePortAndCreate(BindableService, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.runners.fnexecution.InProcessServerFactory
 
-  
 
- allocatePortAndCreate(BindableService, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.runners.fnexecution.ServerFactory
 
- 
Creates an instance of this server using an ephemeral port chosen automatically.
 
- allocatePortAndCreate(BindableService, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.runners.fnexecution.ServerFactory.InetSocketAddressServerFactory
 
-  
 
- allocatePortAndCreateFor(ServiceT, ServerFactory) - Static method in class org.apache.beam.runners.fnexecution.GrpcFnServer
 
- 
 
- allOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- allOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- allOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
 
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
-  
 
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
-  
 
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
- 
Whether this reader should allow dynamic splitting of the offset ranges.
 
- AlwaysPassMatcher() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
-  
 
- AlwaysPassMatcherFactory() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
 
-  
 
- alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
 
- 
Always retry all failures.
 
- AmqpIO - Class in org.apache.beam.sdk.io.amqp
 
- 
AmqpIO supports AMQP 1.0 protocol using the Apache QPid Proton-J library.
 
- AmqpIO.Read - Class in org.apache.beam.sdk.io.amqp
 
- 
A 
PTransform to read/receive messages using AMQP 1.0 protocol.
 
 
- AmqpIO.Write - Class in org.apache.beam.sdk.io.amqp
 
- 
A 
PTransform to send messages using AMQP 1.0 protocol.
 
 
- AmqpMessageCoder - Class in org.apache.beam.sdk.io.amqp
 
- 
A coder for AMQP messages.
 
- AmqpMessageCoder() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
 
-  
 
- AmqpMessageCoderProviderRegistrar - Class in org.apache.beam.sdk.io.amqp
 
- 
 
- AmqpMessageCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
 
-  
 
- and(TupleTag<V>, List<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
- 
Returns a new 
CoGbkResult based on this, with the given tag and given data added to it.
 
 
- and(TupleTag<V>, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
- 
Returns a new KeyedPCollectionTuple<K> that is the same as this, appended with the
 given PCollection.
 
- and(PCollection.IsBounded) - Method in enum org.apache.beam.sdk.values.PCollection.IsBounded
 
- 
Returns the composed IsBounded property.
 
- and(PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionList
 
- 
 
- and(Iterable<PCollection<T>>) - Method in class org.apache.beam.sdk.values.PCollectionList
 
- 
 
- and(TupleTag<T>, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
- 
 
- and(TupleTag<?>) - Method in class org.apache.beam.sdk.values.TupleTagList
 
- 
 
- and(List<TupleTag<?>>) - Method in class org.apache.beam.sdk.values.TupleTagList
 
- 
 
- any(long) - Static method in class org.apache.beam.sdk.transforms.Sample
 
- 
Sample#any(long) takes a PCollection<T> and a limit, and produces a new PCollection<T> containing up to limit elements of the input PCollection.
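A minimal sketch of that contract in plain Java (not the Beam transform, which runs distributed and gives no guarantee about which elements are kept):

```java
import java.util.ArrayList;
import java.util.List;

// Conceptual sketch: "any(limit)" keeps up to `limit` elements of the input;
// which elements survive is unspecified. Here we simply take the first ones.
public class AnySample {
    public static <T> List<T> any(Iterable<T> input, long limit) {
        List<T> out = new ArrayList<>();
        for (T t : input) {
            if (out.size() >= limit) break; // stop once the limit is reached
            out.add(t);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(any(List.of(1, 2, 3, 4, 5), 3).size()); // 3
        System.out.println(any(List.of(1, 2), 5).size());          // 2: fewer inputs than limit
    }
}
```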
 
- anyCombineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
 
- 
Returns a 
Combine.CombineFn that computes a fixed-sized potentially non-uniform sample of its
 inputs.
 
 
- anyOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- anyOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- anything() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- ApexPipelineOptions - Interface in org.apache.beam.runners.apex
 
- 
Options that configure the Apex pipeline.
 
- ApexRunner - Class in org.apache.beam.runners.apex
 
- 
A 
PipelineRunner that translates the pipeline to an Apex DAG and executes it on an Apex
 cluster.
 
 
- ApexRunner(ApexPipelineOptions) - Constructor for class org.apache.beam.runners.apex.ApexRunner
 
-  
 
- ApexRunner.CreateApexPCollectionView<ElemT,ViewT> - Class in org.apache.beam.runners.apex
 
- 
 
- ApexRunnerRegistrar - Class in org.apache.beam.runners.apex
 
- 
 
- ApexRunnerRegistrar.Options - Class in org.apache.beam.runners.apex
 
- 
 
- ApexRunnerRegistrar.Runner - Class in org.apache.beam.runners.apex
 
- 
 
- ApexRunnerResult - Class in org.apache.beam.runners.apex
 
- 
Result of executing a 
Pipeline with Apex in embedded mode.
 
 
- ApexRunnerResult(DAG, Launcher.AppHandle) - Constructor for class org.apache.beam.runners.apex.ApexRunnerResult
 
-  
 
- ApexYarnLauncher - Class in org.apache.beam.runners.apex
 
- 
Proxy to launch the YARN application through the hadoop script to run in the pre-configured
 environment (class path, configuration, native libraries etc.).
 
- ApexYarnLauncher() - Constructor for class org.apache.beam.runners.apex.ApexYarnLauncher
 
-  
 
- ApexYarnLauncher.LaunchParams - Class in org.apache.beam.runners.apex
 
- 
Launch parameters that will be serialized and passed to the child process.
 
- ApexYarnLauncher.ProcessWatcher - Class in org.apache.beam.runners.apex
 
- 
Starts a command and waits for it to complete.
 
- append(K, W, Iterator<V>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
 
- 
Appends the values to the bag user state for the given key and window.
 
- applicableTo(PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
 
-  
 
- ApplicationNameOptions - Interface in org.apache.beam.sdk.options
 
- 
Options that allow setting the application name.
 
- apply(WindowFunction.Context<T2>) - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.GearpumpWindowFn
 
-  
 
- apply(InputT) - Method in interface org.apache.beam.sdk.coders.DelegateCoder.CodingFunction
 
-  
 
- apply(List<BeamSqlPrimitive>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlBinaryOperator
 
-  
 
- apply(BeamSqlPrimitive, BeamSqlPrimitive) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlBinaryOperator
 
-  
 
- apply(List<BeamSqlPrimitive>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlOperator
 
-  
 
- apply(List<BeamSqlPrimitive>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUnaryOperator
 
-  
 
- apply(BeamSqlPrimitive) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUnaryOperator
 
-  
 
- apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationGroupByKeyFn
 
-  
 
- apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.WindowTimestampFn
 
-  
 
- apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.ExtractJoinFields
 
-  
 
- apply(KV<Row, KV<Row, Row>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.JoinParts2WholeRow
 
-  
 
- apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
 
-  
 
- apply(T1, T2) - Method in interface org.apache.beam.sdk.fn.function.ThrowingBiFunction
 
-  
 
- apply(T1) - Method in interface org.apache.beam.sdk.fn.function.ThrowingFunction
 
-  
 
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
-  
 
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
 
-  
 
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
-  
 
- apply(SQLException) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
 
-  
 
- apply(SQLException) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RetryStrategy
 
-  
 
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
 
- 
 
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
 
- 
 
- apply(T) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert.MatcherCheckerFn
 
-  
 
- apply(Statement, Description) - Method in class org.apache.beam.sdk.testing.TestPipeline
 
-  
 
- apply(double, double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
- 
Applies the binary operation to the two operands, returning the result.
 
- apply(V, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
- 
Applies the binary operation to the two operands, returning the result.
 
- apply(int, int) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
- 
Applies the binary operation to the two operands, returning the result.
 
- apply(long, long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
- 
Applies the binary operation to the two operands, returning the result.
 
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
 
- 
Applies this CombineFn to a collection of input values to produce a combined output
 value.
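The lifecycle behind that one-liner can be sketched as a simplified fold (an assumption-level illustration of the CombineFn contract; the real interface also has mergeAccumulators for distributed merging):

```java
import java.util.List;

// Conceptual sketch of CombineFn.apply for a sum-of-longs combiner:
// createAccumulator, then addInput per element, then extractOutput.
public class SumCombine {
    static long createAccumulator() { return 0L; }
    static long addInput(long acc, long input) { return acc + input; }
    static long extractOutput(long acc) { return acc; }

    // Mirrors CombineFn.apply: run the full combine lifecycle over the inputs.
    public static long apply(List<Long> inputs) {
        long acc = createAccumulator();
        for (long v : inputs) acc = addInput(acc, v);
        return extractOutput(acc);
    }

    public static void main(String[] args) {
        System.out.println(apply(List.of(1L, 2L, 3L, 4L))); // 10
    }
}
```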
 
- apply(Iterable<? extends InputT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
- 
Applies this CombineFnWithContext to a collection of input values to produce a
 combined output value.
 
- apply(InputT, Contextful.Fn.Context) - Method in interface org.apache.beam.sdk.transforms.Contextful.Fn
 
- 
Invokes the function on the given input with the given context.
 
- apply(PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
- 
 
- apply(String, PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
- 
Applies the given 
PTransform to this input 
KeyedPCollectionTuple and returns
 its 
OutputT.
 
 
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.SerializableFunction
 
- 
Returns the result of invoking this function on the given input.
 
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
 
-  
 
- apply(PrimitiveViewT) - Method in class org.apache.beam.sdk.transforms.ViewFn
 
- 
A function to adapt a primitive view type to a desired view type.
 
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
 
- 
 
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
 
- 
Applies the given 
PTransform to this 
PBegin, using 
name to identify
 this specific application of the transform.
 
 
- apply(PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
 
- apply(String, PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
Applies the given 
PTransform to this input 
PCollection, using 
name to
 identify this specific application of the transform.
 
 
- apply(PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
 
- 
 
- apply(String, PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
 
- 
Applies the given 
PTransform to this input 
PCollectionList, using 
name
 to identify this specific application of the transform.
 
 
- apply(PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
- 
 
- apply(String, PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
- 
 
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
 
-  
 
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
 
-  
 
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
 
-  
 
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
 
-  
 
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
 
-  
 
- applyTransform(InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- applyTransform(String, InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- ApproximateDistinct - Class in org.apache.beam.sdk.extensions.sketching
 
- 
PTransforms for computing the approximate number of distinct elements in a stream.
 
 
- ApproximateDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
 
-  
 
- ApproximateDistinct.ApproximateDistinctFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- ApproximateDistinct.GloballyDistinct<InputT> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- ApproximateDistinct.HyperLogLogPlusCoder - Class in org.apache.beam.sdk.extensions.sketching
 
- 
Coder for HyperLogLogPlus class.
 
- ApproximateDistinct.PerKeyDistinct<K,V> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- ApproximateQuantiles - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for getting an idea of a PCollection's data distribution using
 approximate N-tiles (e.g.
 
- ApproximateQuantiles.ApproximateQuantilesCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
 
- 
The ApproximateQuantilesCombineFn combiner gives an idea of the distribution of a
 collection of values using approximate N-tiles.
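As a rough illustration of what "N-tiles" means (exact and in-memory here; the Beam combiner is approximate and bounded-memory, which is the point of the class):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Conceptual sketch: pick numQuantiles evenly spaced boundary values
// (including min and max) from a sorted copy. Requires numQuantiles >= 2.
public class QuantileSketch {
    public static List<Integer> quantiles(List<Integer> values, int numQuantiles) {
        List<Integer> sorted = new ArrayList<>(values);
        Collections.sort(sorted);
        List<Integer> out = new ArrayList<>();
        for (int i = 0; i < numQuantiles; i++) {
            // evenly spaced ranks from the minimum to the maximum
            int rank = (int) Math.round((double) i * (sorted.size() - 1) / (numQuantiles - 1));
            out.add(sorted.get(rank));
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(quantiles(List.of(5, 1, 9, 3, 7), 3)); // [min, median, max]
    }
}
```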
 
- ApproximateUnique - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for estimating the number of distinct elements in a PCollection, or
 the number of distinct values associated with each key in a PCollection of KVs.
 
- ApproximateUnique() - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique
 
-  
 
- ApproximateUnique.ApproximateUniqueCombineFn<T> - Class in org.apache.beam.sdk.transforms
 
- 
CombineFn that computes an estimate of the number of distinct values that were
 combined.
 
- ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique - Class in org.apache.beam.sdk.transforms
 
- 
A heap utility class to efficiently track the largest added elements.
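A plain-Java sketch of that utility's idea (an assumption based on the description, not Beam's actual implementation): keep only the k largest distinct values by evicting the smallest whenever the bound is exceeded.

```java
import java.util.TreeSet;

// Conceptual sketch: a bounded structure tracking the k largest distinct
// values. A TreeSet doubles as min-heap (first() is smallest) and dedup.
public class LargestK {
    private final TreeSet<Long> kept = new TreeSet<>();
    private final int k;

    public LargestK(int k) { this.k = k; }

    public void add(long value) {
        kept.add(value);                        // duplicates are ignored
        if (kept.size() > k) kept.pollFirst();  // evict the smallest
    }

    public long smallestKept() { return kept.first(); }

    public static void main(String[] args) {
        LargestK top = new LargestK(3);
        for (long v : new long[] {5, 1, 9, 9, 3, 7}) top.add(v);
        System.out.println(top.smallestKept()); // 5: the structure kept {5, 7, 9}
    }
}
```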
 
- ApproximateUniqueCombineFn(long, Coder<T>) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
-  
 
- array() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Returns the backing array.
 
- array(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
Create an array type for the given field type.
 
- arrayContaining(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayContaining(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayContaining(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayContaining(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayContainingInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayContainingInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayContainingInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayContainingInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- arrayWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- ArtifactRetrievalService - Interface in org.apache.beam.runners.fnexecution.artifact
 
- 
An implementation of the Beam Artifact Retrieval Service.
 
- ARTIFACTS - Static variable in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 
-  
 
- as(Class<T>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
- 
Transforms this object into an object of type <T> saving each property that has been
 manipulated.
 
- as(Class<T>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
Creates and returns an object that implements <T>.
 
- as(Class<T>) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
 
- 
Creates and returns an object that implements <T> using the values configured on this
 builder during construction.
 
- asCloudObject(Coder<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
 
- 
 
- asInputStream(int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Returns an InputStream wrapper which supplies the portion of this backing byte buffer
 starting at offset and up to length bytes.
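The stdlib already expresses that "window over a backing array" idea directly; a minimal sketch of the described behavior (not the RandomAccessData implementation itself):

```java
import java.io.ByteArrayInputStream;

// Conceptual sketch: an InputStream view over a portion of a backing byte
// array, starting at `offset` and supplying at most `length` bytes. No copy
// of the array is made.
public class SliceStream {
    public static ByteArrayInputStream asInputStream(byte[] backing, int offset, int length) {
        return new ByteArrayInputStream(backing, offset, length);
    }

    public static void main(String[] args) {
        byte[] data = {10, 20, 30, 40, 50};
        ByteArrayInputStream in = asInputStream(data, 1, 3);
        System.out.println(in.read()); // 20
        System.out.println(in.read()); // 30
        System.out.println(in.read()); // 40
        System.out.println(in.read()); // -1: end of the window, not of the array
    }
}
```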
 
- asIterable() - Static method in class org.apache.beam.sdk.transforms.View
 
- 
 
- AsJsons<InputT> - Class in org.apache.beam.sdk.extensions.jackson
 
- 
PTransform for serializing objects to JSON 
Strings.
 
 
- asList() - Static method in class org.apache.beam.sdk.transforms.View
 
- 
 
- asMap() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
-  
 
- asMap() - Static method in class org.apache.beam.sdk.transforms.View
 
- 
 
- asMultimap() - Static method in class org.apache.beam.sdk.transforms.View
 
- 
 
- asOutputReference(PValue, AppliedPTransform<?, ?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Encode a PValue reference as an output reference.
 
- asOutputStream() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Returns an output stream which writes to the backing buffer from the current position.
 
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
 
- 
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub
 API.
 
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
- 
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
 
- asResponseObserver() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
-  
 
- ASSERTION_ERROR - Static variable in class org.apache.beam.runners.apex.ApexRunner
 
- 
TODO: this isn't thread safe and may cause issues when tests run in parallel. Holds the
 most recent assertion error that was raised while processing elements.
 
- assertionError() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
-  
 
- assertSourcesEqualReferenceSource(BoundedSource<T>, List<? extends BoundedSource<T>>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Given a reference Source and a list of Sources, assert that the union of the
 records read from the list of sources is equal to the records read from the reference source.
 
- assertSplitAtFractionBehavior(BoundedSource<T>, int, double, SourceTestUtils.ExpectedSplitOutcome, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
 
- assertSplitAtFractionExhaustive(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Asserts that for each possible start position, BoundedSource.BoundedReader#splitAtFraction at every interesting fraction (halfway between two
 fractions that differ by at least one item) can be called successfully and the results are
 consistent if a split succeeds.
 
- assertSplitAtFractionFails(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Asserts that the source's reader fails to splitAtFraction(fraction) after
 reading numItemsToReadBeforeSplit items.
 
- assertSplitAtFractionSucceedsAndConsistent(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Verifies some consistency properties of BoundedSource.BoundedReader#splitAtFraction on
 the given source.
 
- assertThatAllRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
-  
 
- assertUnstartedReaderReadsSameAsItsSource(BoundedSource.BoundedReader<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Assert that a Reader returns a Source that, when read from, produces the same
 records as the reader.
 
- assign(BoundedWindow, Instant) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
 
- 
 
- assignableTo(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Returns true if this Schema can be assigned to another Schema.
 
- assignableToIgnoreNullable(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Returns true if this Schema can be assigned to another Schema, ignoring nullable.
 
- AssignContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
 
-  
 
- assignedWindows(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
 
- assignedWindowsWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
 
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
-  
 
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Returns true if this 
WindowFn always assigns an element to exactly one window.
 
 
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
-  
 
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
- 
Returns the single window to which elements with this timestamp belong.
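For a fixed-size partitioning, that single-window mapping is simple interval arithmetic; a sketch in plain milliseconds (an illustration of the idea, not the Beam FixedWindows implementation):

```java
// Conceptual sketch: a partitioning window fn maps each timestamp to exactly
// one [start, start + size) window; here we compute that window's start.
public class FixedWindowSketch {
    public static long windowStart(long timestampMillis, long sizeMillis) {
        // floorDiv handles timestamps before the epoch correctly
        return Math.floorDiv(timestampMillis, sizeMillis) * sizeMillis;
    }

    public static void main(String[] args) {
        // With 60s windows, timestamps 0..59999 share the window starting at 0.
        System.out.println(windowStart(59_999L, 60_000L)); // 0
        System.out.println(windowStart(60_000L, 60_000L)); // 60000
    }
}
```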
 
- assignWindows(WindowFn<Object, GlobalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- assignWindows(WindowFn<Object, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
-  
 
- assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
-  
 
- assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
-  
 
- assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- assignWindows(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Given a timestamp and element, returns the set of windows into which it should be placed.
 
- asSingleton() - Static method in class org.apache.beam.sdk.transforms.View
 
- 
 
- asSingletonView() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
Returns a PTransform that produces a PCollectionView whose elements are the result of combining elements per-window in the input PCollection.
 
 
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
 
- 
 
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
- 
 
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
 
- 
 
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
- 
 
- atMinimumTimestamp(V) - Static method in class org.apache.beam.sdk.values.TimestampedValue
 
- 
 
- AtomicCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
A Coder that has no component Coders or other configuration.
 
 
- AtomicCoder() - Constructor for class org.apache.beam.sdk.coders.AtomicCoder
 
-  
 
- AtomicLongFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
 
-  
 
- attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
-  
 
- attachValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
 
-  
 
- attributes - Variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
-  
 
- AUTO - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- autoCastField(Schema.Field, Object) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
 
-  
 
- AvailableParallelismFactory() - Constructor for class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
 
-  
 
- AvroCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
A Coder using the Avro binary format.
 
 
- AvroCoder(Class<T>, Schema) - Constructor for class org.apache.beam.sdk.coders.AvroCoder
 
-  
 
- AvroIO - Class in org.apache.beam.sdk.io
 
- 
 
- AvroIO.Parse<T> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroIO.ParseAll<T> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroIO.Read<T> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroIO.ReadAll<T> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroIO.RecordFormatter<ElementT> - Interface in org.apache.beam.sdk.io
 
- 
Formats an element of a user type into a record with the given schema.
 
- AvroIO.Sink<ElementT> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroIO.TypedWrite<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroIO.Write<T> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroReader(AvroSource<T>) - Constructor for class org.apache.beam.sdk.io.AvroSource.AvroReader
 
- 
Reads Avro records of type T from the specified source.
 
- AvroSource<T> - Class in org.apache.beam.sdk.io
 
- 
Do not use in pipelines directly: most users should use AvroIO.Read.
 
 
- AvroSource.AvroReader<T> - Class in org.apache.beam.sdk.io
 
- 
 
- AvroUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Utils to help convert Apache Avro types to Beam types.
 
- AvroUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroUtils
 
-  
 
- awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
 
-  
 
- awaitCompletion() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
 
- 
Block until the client has completed reading from the inbound stream.
 
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
 
-  
 
- AwsClientsProvider - Interface in org.apache.beam.sdk.io.aws.sns
 
- 
Provides instances of AWS clients.
 
- AWSClientsProvider - Interface in org.apache.beam.sdk.io.kinesis
 
- 
Provides instances of AWS clients.
 
- AwsModule - Class in org.apache.beam.sdk.io.aws.options
 
- 
 
- AwsModule() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsModule
 
-  
 
- AwsOptions - Interface in org.apache.beam.sdk.io.aws.options
 
- 
Options used to configure Amazon Web Services specific options such as credentials and region.
 
- AwsOptions.AwsUserCredentialsFactory - Class in org.apache.beam.sdk.io.aws.options
 
- 
Attempts to load AWS credentials.
 
- AwsOptions.ClientConfigurationFactory - Class in org.apache.beam.sdk.io.aws.options
 
- 
Default AWS client configuration.
 
- AwsPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.aws.options
 
- 
A registrar containing the default AWS options.
 
- AwsPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsPipelineOptionsRegistrar
 
-  
 
- AwsUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsUserCredentialsFactory
 
-  
 
- BACKLOG_UNKNOWN - Static variable in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
Constant representing an unknown amount of backlog.
 
- backlogBytes() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
 
- 
Gauge for source backlog in bytes.
 
- backlogBytesOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
 
- 
Gauge for source split backlog in bytes.
 
- backlogElements() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
 
- 
Gauge for source backlog in elements.
 
- backlogElementsOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
 
- 
Gauge for source split backlog in elements.
 
- bag() - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
Create a StateSpec for a BagState, optimized for adding values frequently and occasionally retrieving all the values that have been added.
 
 
- bag(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
 
- BagState<T> - Interface in org.apache.beam.sdk.state
 
- 
 
- BagUserStateSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
-  
 
- BaseBeamTable - Class in org.apache.beam.sdk.extensions.sql.impl.schema
 
- 
Each IO in Beam has one table schema, defined by extending BaseBeamTable.
 
 
- BaseBeamTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable
 
-  
 
- BatchStatefulParDoOverrides - Class in org.apache.beam.runners.dataflow
 
- 
PTransformOverrideFactories that expand to correctly implement stateful ParDo using window-unaware BatchViewOverrides.GroupByKeyAndSortValuesOnly to linearize processing per key.
 
 
- BatchStatefulParDoOverrides() - Constructor for class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
 
-  
 
- BatchStatefulParDoOverrides.BatchStatefulDoFn<K,V,OutputT> - Class in org.apache.beam.runners.dataflow
 
- 
A key-preserving DoFn that explodes an iterable that has been grouped by key and window.
 
 
- BEAM_FN_API_DATA_BUFFER_LIMIT - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver
 
-  
 
- BEAM_QUERYSTRING_PREFIX - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
- 
Querystring parameters that begin with "beam." will be interpreted as PipelineOptions.
 
 
- BeamAggregationRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
 
- BeamAggregationRel(RelOptCluster, RelTraitSet, RelNode, boolean, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>, WindowFn<Row, IntervalWindow>, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
-  
 
- BeamAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
Rule to detect the window/trigger settings.
 
- BeamAggregationRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
 
-  
 
- BeamAggregationTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Collections of PTransform and DoFn used to perform GROUP-BY operation.
 
- BeamAggregationTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms
 
-  
 
- BeamAggregationTransforms.AggregationAccumulator - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
A class to hold varied accumulator objects.
 
- BeamAggregationTransforms.AggregationAccumulatorCoder - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
 
- BeamAggregationTransforms.AggregationAdaptor - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
An adaptor class to invoke Calcite UDAF instances in Beam CombineFn.
 
- BeamAggregationTransforms.AggregationGroupByKeyFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Extracts group-by fields.
 
- BeamAggregationTransforms.MergeAggregationRecord - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Merges a KV into a single record.
 
- BeamAggregationTransforms.WindowTimestampFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Assigns the event timestamp.
 
- BeamBigQueryTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
 
- 
BeamBigQueryTable represents a BigQuery table as a target.
 
- BeamBigQueryTable(Schema, String) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQueryTable
 
-  
 
- BeamCalciteSchema - Class in org.apache.beam.sdk.extensions.sql.impl
 
- 
 
- BeamCalciteSchema(TableProvider) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- BeamCalciteSchemaFactory - Class in org.apache.beam.sdk.extensions.sql.impl
 
- 
 
- BeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace a Project node.
 
- BeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
-  
 
- BeamCalcRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
 
- BeamEnumerableConverter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace an Enumerable node.
 
- BeamEnumerableConverter(RelOptCluster, RelTraitSet, RelNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
-  
 
- BeamEnumerableConverterRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
A ConverterRule to convert BeamRelNode to EnumerableConvention.
 
 
- BeamFileSystemArtifactRetrievalService - Class in org.apache.beam.runners.fnexecution.artifact
 
- 
 
- BeamFileSystemArtifactRetrievalService() - Constructor for class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService
 
-  
 
- BeamFileSystemArtifactStagingService - Class in org.apache.beam.runners.fnexecution.artifact
 
- 
This implementation is experimental.
 
- BeamFileSystemArtifactStagingService() - Constructor for class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 
-  
 
- BeamFnDataBufferingOutboundObserver<T> - Class in org.apache.beam.sdk.fn.data
 
- 
 
- BeamFnDataGrpcMultiplexer - Class in org.apache.beam.sdk.fn.data
 
- 
A gRPC multiplexer for a specific Endpoints.ApiServiceDescriptor.
 
- BeamFnDataGrpcMultiplexer(Endpoints.ApiServiceDescriptor, OutboundObserverFactory, OutboundObserverFactory.BasicFactory<BeamFnApi.Elements, BeamFnApi.Elements>) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
-  
 
- BeamFnDataInboundObserver<T> - Class in org.apache.beam.sdk.fn.data
 
- 
Decodes individually consumed BeamFnApi.Elements.Data with the provided Coder, passing the individual decoded elements to the provided consumer.
 
 
- BeamFnDataInboundObserver(Coder<WindowedValue<T>>, FnDataReceiver<WindowedValue<T>>, InboundDataClient) - Constructor for class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- BeamIntersectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace an Intersect node.
 
- BeamIntersectRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
-  
 
- BeamIntersectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
ConverterRule to replace Intersect with BeamIntersectRel.
 
- BeamIOSinkRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace a TableModify node.
 
- BeamIOSinkRel(RelOptCluster, RelOptTable, Prepare.CatalogReader, RelNode, TableModify.Operation, List<String>, List<RexNode>, boolean, BeamSqlTable, Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
-  
 
- BeamIOSinkRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
 
- BeamIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace a TableScan node.
 
- BeamIOSourceRel(RelOptCluster, RelOptTable, BeamSqlTable, Map<String, String>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
-  
 
- BeamJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace a Join node.
 
- BeamJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
-  
 
- BeamJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
ConverterRule to replace Join with BeamJoinRel.
 
- BeamJoinTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Collections of PTransform and DoFn used to perform JOIN operation.
 
- BeamJoinTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
 
-  
 
- BeamJoinTransforms.ExtractJoinFields - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
A SimpleFunction to extract join fields from the specified row.
 
- BeamJoinTransforms.JoinAsLookup - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Transform to execute Join as Lookup.
 
- BeamJoinTransforms.JoinParts2WholeRow - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
A SimpleFunction to combine two rows into one.
 
- BeamJoinTransforms.SideInputJoinDoFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
A DoFn which implements the sideInput-JOIN.
 
- BeamKafkaCSVTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
 
- 
A Kafka topic that saves records in CSV format.
 
- BeamKafkaCSVTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
-  
 
- BeamKafkaCSVTable(Schema, String, List<String>, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
-  
 
- BeamKafkaCSVTable.CsvRecorderDecoder - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
 
- 
A PTransform to convert KV<byte[], byte[]> to Row.
 
 
- BeamKafkaCSVTable.CsvRecorderEncoder - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
 
- 
A PTransform to convert Row to KV<byte[], byte[]>.
 
 
- BeamKafkaTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.kafka
 
- 
BeamKafkaTable represents a Kafka topic, as a source or target.
 
- BeamKafkaTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- BeamKafkaTable(Schema, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- BeamKafkaTable(Schema, List<TopicPartition>, String) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- BeamLogicalConvention - Enum in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
Convention for Beam SQL.
 
- BeamMinusRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace a Minus node.
 
- BeamMinusRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
-  
 
- BeamMinusRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
ConverterRule to replace Minus with BeamMinusRel.
 
- BeamPCollectionTable<InputT> - Class in org.apache.beam.sdk.extensions.sql.impl.schema
 
- 
BeamPCollectionTable exposes a PCollection<Row> as a virtual table that a downstream query can query directly.
 
- BeamPCollectionTable(PCollection<InputT>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
 
-  
 
- BeamRelDataTypeSystem - Class in org.apache.beam.sdk.extensions.sql.impl.planner
 
- 
Customized data type system in Beam.
 
- BeamRelNode - Interface in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
A RelNode that can also give a PTransform that implements the expression.
 
 
- beamRow2CsvLine(Row, CSVFormat) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
 
-  
 
- BeamRuleSets - Class in org.apache.beam.sdk.extensions.sql.impl.planner
 
- 
RuleSet used in BeamQueryPlanner.
 
- BeamRuleSets() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
 
-  
 
- BeamSetOperatorRelBase - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
Delegate for Set operators: BeamUnionRel, BeamIntersectRel and BeamMinusRel.
 
- BeamSetOperatorRelBase(BeamRelNode, BeamSetOperatorRelBase.OpType, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
 
-  
 
- BeamSetOperatorRelBase.OpType - Enum in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
Set operator type.
 
- BeamSetOperatorsTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Collections of PTransform and DoFn used to perform Set operations.
 
- BeamSetOperatorsTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms
 
-  
 
- BeamSetOperatorsTransforms.BeamSqlRow2KvFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Transforms a BeamSqlRow to a KV<BeamSqlRow, BeamSqlRow>.
 
- BeamSetOperatorsTransforms.SetOperatorFilteringDoFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
Filter function used for Set operators.
 
- BeamSortRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace a Sort node.
 
- BeamSortRel(RelOptCluster, RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
-  
 
- BeamSortRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
ConverterRule to replace Sort with BeamSortRel.
 
- BeamSparkRunnerRegistrator - Class in org.apache.beam.runners.spark.coders
 
- 
Custom KryoRegistrators for the needs of Beam's Spark runner.
 
- BeamSparkRunnerRegistrator() - Constructor for class org.apache.beam.runners.spark.coders.BeamSparkRunnerRegistrator
 
-  
 
- BeamSqlAbsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'ABS' function.
 
- BeamSqlAbsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAbsExpression
 
-  
 
- BeamSqlAcosExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'ACOS' function.
 
- BeamSqlAcosExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAcosExpression
 
-  
 
- BeamSqlAndExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
 
- 
BeamSqlExpression for 'AND' operation.
 
- BeamSqlAndExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlAndExpression
 
-  
 
- BeamSqlArithmeticExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
 
- 
Base class for all arithmetic operators.
 
- BeamSqlArithmeticExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
 
-  
 
- BeamSqlArithmeticExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
 
-  
 
- BeamSqlArrayExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array
 
- 
Represents ARRAY expression in SQL.
 
- BeamSqlArrayExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array.BeamSqlArrayExpression
 
-  
 
- BeamSqlArrayItemExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array
 
- 
Implements array element access expression.
 
- BeamSqlArrayItemExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array.BeamSqlArrayItemExpression
 
-  
 
- BeamSqlAsinExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'ASIN' function.
 
- BeamSqlAsinExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAsinExpression
 
-  
 
- BeamSqlAtan2Expression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
 
- BeamSqlAtan2Expression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtan2Expression
 
-  
 
- BeamSqlAtanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'ATAN' function.
 
- BeamSqlAtanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtanExpression
 
-  
 
- BeamSqlBinaryOperator - Interface in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
An operator that is applied to already-evaluated arguments.
 
- BeamSqlCardinalityExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection
 
- 
Implements the CARDINALITY(collection) operation, which returns the number of elements in the collection.
 
- BeamSqlCardinalityExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection.BeamSqlCardinalityExpression
 
-  
 
- BeamSqlCaseExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
BeamSqlCaseExpression represents CASE, NULLIF, COALESCE in SQL.
 
- BeamSqlCaseExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCaseExpression
 
-  
 
- BeamSqlCastExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
Base class to support 'CAST' operations for all SqlTypeName.
 
- BeamSqlCastExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCastExpression
 
-  
 
- BeamSqlCeilExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'CEIL' function.
 
- BeamSqlCeilExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCeilExpression
 
-  
 
- BeamSqlCli - Class in org.apache.beam.sdk.extensions.sql
 
- 
BeamSqlCli provides methods to execute Beam SQL with an interactive client.
 
 
- BeamSqlCli() - Constructor for class org.apache.beam.sdk.extensions.sql.BeamSqlCli
 
-  
 
- BeamSqlCompareExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
 
- BeamSqlCompareExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
 
-  
 
- BeamSqlCorrelVariableExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
A primitive operation for dereferencing a correlation variable.
 
- BeamSqlCorrelVariableExpression(SqlTypeName, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCorrelVariableExpression
 
-  
 
- BeamSqlCosExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'COS' function.
 
- BeamSqlCosExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCosExpression
 
-  
 
- BeamSqlCotExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'COT' function.
 
- BeamSqlCotExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCotExpression
 
-  
 
- BeamSqlCurrentDateExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
BeamSqlExpression for CURRENT_DATE and LOCALTIME.
 
- BeamSqlCurrentDateExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentDateExpression
 
-  
 
- BeamSqlCurrentTimeExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
BeamSqlExpression for LOCALTIME and CURRENT_TIME.
 
- BeamSqlCurrentTimeExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimeExpression
 
-  
 
- BeamSqlCurrentTimestampExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
BeamSqlExpression for LOCALTIMESTAMP and CURRENT_TIMESTAMP.
 
- BeamSqlCurrentTimestampExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimestampExpression
 
-  
 
- BeamSqlDatetimeMinusExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
Infix '-' operation for timestamps.
 
- BeamSqlDatetimeMinusExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDatetimeMinusExpression
 
-  
 
- BeamSqlDatetimeMinusIntervalExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
Minus ('-') operator for 'datetime - interval' expressions.
 
- BeamSqlDatetimeMinusIntervalExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDatetimeMinusIntervalExpression
 
-  
 
- BeamSqlDatetimePlusExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
DATETIME_PLUS operation.
 
- BeamSqlDatetimePlusExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDatetimePlusExpression
 
-  
 
- BeamSqlDefaultExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
DEFAULT keyword for UDF with optional parameter.
 
- BeamSqlDefaultExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlDefaultExpression
 
-  
 
- BeamSqlDegreesExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'DEGREES' function.
 
- BeamSqlDegreesExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlDegreesExpression
 
-  
 
- BeamSqlDivideExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
 
- 
'/' operator.
 
- BeamSqlDivideExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlDivideExpression
 
-  
 
- BeamSqlDotExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
Implements DOT operator to access fields of dynamic ROWs.
 
- BeamSqlDotExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlDotExpression
 
-  
 
- BeamSqlEnv - Class in org.apache.beam.sdk.extensions.sql.impl
 
- 
Contains the metadata of tables/UDF functions, and exposes APIs to query/validate/optimize/translate SQL statements.
 
- BeamSqlEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for = operation.
 
- BeamSqlEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
 
-  
 
- BeamSqlExpExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'EXP' function.
 
- BeamSqlExpExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlExpExpression
 
-  
 
- BeamSqlExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
BeamSqlExpression is the BeamSQL equivalent of a Calcite RexNode.
 
- BeamSqlExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- BeamSqlExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- BeamSqlExpressionEnvironment - Interface in org.apache.beam.sdk.extensions.sql.impl.interpreter
 
- 
 
- BeamSqlExpressionEnvironments - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter
 
- 
 
- BeamSqlExpressionEnvironments() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionEnvironments
 
-  
 
- BeamSqlExpressionExecutor - Interface in org.apache.beam.sdk.extensions.sql.impl.interpreter
 
- 
BeamSqlExpressionExecutor fills the gap between relational expressions in Calcite SQL and executable code.
 
- BeamSqlFieldAccessExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.row
 
- 
Represents a field access expression.
 
- BeamSqlFieldAccessExpression(BeamSqlExpression, int, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.row.BeamSqlFieldAccessExpression
 
-  
 
- BeamSqlFloorExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'FLOOR' function.
 
- BeamSqlFloorExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlFloorExpression
 
-  
 
- BeamSqlFnExecutor - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter
 
- 
 
- BeamSqlFnExecutor(RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlFnExecutor
 
-  
 
- BeamSqlGreaterThanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for > operation.
 
- BeamSqlGreaterThanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
 
-  
 
- BeamSqlGreaterThanOrEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for >= operation.
 
- BeamSqlGreaterThanOrEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
 
-  
 
- BeamSqlInputRefExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
A primitive operation for direct field extraction.
 
- BeamSqlInputRefExpression(SqlTypeName, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlInputRefExpression
 
-  
 
- BeamSqlIntervalMultiplyExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
Multiplication operator for intervals.
 
- BeamSqlIntervalMultiplyExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlIntervalMultiplyExpression
 
-  
 
- BeamSqlIsNotNullExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for 'IS NOT NULL' operation.
 
- BeamSqlIsNotNullExpression(BeamSqlExpression) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNotNullExpression
 
-  
 
- BeamSqlIsNullExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for 'IS NULL' operation.
 
- BeamSqlIsNullExpression(BeamSqlExpression) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNullExpression
 
-  
 
- BeamSqlLessThanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for < operation.
 
- BeamSqlLessThanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
 
-  
 
- BeamSqlLessThanOrEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for <= operation.
 
- BeamSqlLessThanOrEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
 
-  
 
- BeamSqlLikeExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for 'LIKE' operation.
 
- BeamSqlLikeExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLikeExpression
 
-  
 
- BeamSqlLnExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'LN' function.
 
- BeamSqlLnExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLnExpression
 
-  
 
- BeamSqlLocalRefExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
A primitive operation for dereferencing a correlation variable.
 
- BeamSqlLocalRefExpression(SqlTypeName, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlLocalRefExpression
 
-  
 
- BeamSqlLogExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'LOG10' function.
 
- BeamSqlLogExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLogExpression
 
-  
 
- BeamSqlLogicalExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
 
- 
BeamSqlExpression for Logical operators.
 
- BeamSqlLogicalExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlLogicalExpression
 
-  
 
- BeamSqlMapExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map
 
- 
Represents MAP expression in SQL.
 
- BeamSqlMapExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map.BeamSqlMapExpression
 
-  
 
- BeamSqlMapItemExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map
 
- 
Implements the map key access expression.
 
- BeamSqlMapItemExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map.BeamSqlMapItemExpression
 
-  
 
- BeamSqlMathBinaryExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
Base class for all binary functions such as POWER, MOD, RAND_INTEGER, ATAN2, ROUND, TRUNCATE.
 
- BeamSqlMathBinaryExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
 
-  
 
- BeamSqlMathUnaryExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
Base class for all unary functions such as ABS, SQRT, LN, LOG10, EXP, CEIL, FLOOR, RAND, ACOS,
 ASIN, ATAN, COS, COT, DEGREES, RADIANS, SIGN, SIN, TAN.
 
- BeamSqlMathUnaryExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathUnaryExpression
 
-  
 
- BeamSqlMinusExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
 
- 
'-' operator.
 
- BeamSqlMinusExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMinusExpression
 
-  
 
- BeamSqlModExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
 
- 
'%' operator.
 
- BeamSqlModExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlModExpression
 
-  
 
- BeamSqlMultiplyExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
 
- 
'*' operator.
 
- BeamSqlMultiplyExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMultiplyExpression
 
-  
 
- BeamSqlNotEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for <> operation.
 
- BeamSqlNotEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
 
-  
 
- BeamSqlNotExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
 
- 
BeamSqlExpression for logical operator: NOT.
 
- BeamSqlNotExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlNotExpression
 
-  
 
- BeamSqlNotLikeExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
BeamSqlExpression for 'NOT LIKE' operation.
 
- BeamSqlNotLikeExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotLikeExpression
 
-  
 
- BeamSqlOperator - Interface in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
An operator that is applied to already-evaluated arguments.
 
- BeamSqlOperatorExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
A generic expression form for an operator applied to arguments.
 
- BeamSqlOperatorExpression(BeamSqlOperator, List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlOperatorExpression
 
-  
 
- BeamSqlOrExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
 
- 
BeamSqlExpression for 'OR' operation.
 
- BeamSqlOrExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlOrExpression
 
-  
 
- BeamSqlOutputToConsoleFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
A test PTransform to display output in the console.
 
- BeamSqlOutputToConsoleFn(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
 
-  
 
- BeamSqlPiExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
Base class for the PI function.
 
- BeamSqlPiExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPiExpression
 
-  
 
- BeamSqlPlusExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
 
- 
'+' operator.
 
- BeamSqlPlusExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlPlusExpression
 
-  
 
- BeamSqlPowerExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathBinaryExpression for 'POWER' function.
 
- BeamSqlPowerExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPowerExpression
 
-  
 
- BeamSqlPrimitive<T> - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
 
- BeamSqlRadiansExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'RADIANS' function.
 
- BeamSqlRadiansExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRadiansExpression
 
-  
 
- BeamSqlRandExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'RAND([seed])' function.
 
- BeamSqlRandExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandExpression
 
-  
 
- BeamSqlRandIntegerExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'RAND_INTEGER([seed, ] numeric)' function.
 
- BeamSqlRandIntegerExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandIntegerExpression
 
-  
 
- BeamSqlReinterpretExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
BeamSqlExpression for Reinterpret call.
 
- BeamSqlReinterpretExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.BeamSqlReinterpretExpression
 
-  
 
- BeamSqlRelUtils - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
Utilities for BeamRelNode.
 
- BeamSqlRelUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
-  
 
- BeamSqlRoundExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathBinaryExpression for 'ROUND' function.
 
- BeamSqlRoundExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRoundExpression
 
-  
 
- BeamSqlRow2KvFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
 
-  
 
- BeamSqlSeekableTable - Interface in org.apache.beam.sdk.extensions.sql
 
- 
A seekable table converts a JOIN operator to an inline lookup.
 
- BeamSqlSignExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'SIGN' function.
 
- BeamSqlSignExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSignExpression
 
-  
 
- BeamSqlSinExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'SIN' function.
 
- BeamSqlSinExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSinExpression
 
-  
 
- BeamSqlSingleElementExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection
 
- 
Implements the ELEMENT(collection) operation, which returns the single element of the collection.
 
- BeamSqlSingleElementExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection.BeamSqlSingleElementExpression
 
-  
 
- BeamSqlTable - Interface in org.apache.beam.sdk.extensions.sql
 
- 
This interface defines a Beam SQL table.
 
- BeamSqlTanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathUnaryExpression for 'TAN' function.
 
- BeamSqlTanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTanExpression
 
-  
 
- BeamSqlTimestampMinusIntervalExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
'-' operator for 'timestamp - interval' expressions.
 
- BeamSqlTimestampMinusIntervalExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlTimestampMinusIntervalExpression
 
-  
 
- BeamSqlTimestampMinusTimestampExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
Infix '-' operation for timestamps.
 
- BeamSqlTimestampMinusTimestampExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlTimestampMinusTimestampExpression
 
-  
 
- BeamSqlTruncateExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
BeamSqlMathBinaryExpression for 'TRUNCATE' function.
 
- BeamSqlTruncateExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTruncateExpression
 
-  
 
- BeamSqlUdf - Interface in org.apache.beam.sdk.extensions.sql
 
- 
Interface to create a UDF in Beam SQL.
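As a hedged sketch of how this interface is used: a UDF is a class implementing BeamSqlUdf that exposes a static eval method; the class name CubicInteger below is an illustrative assumption, not part of the SDK.

```java
// Hypothetical UDF: the BeamSqlUdf contract requires a static method named eval.
public static class CubicInteger implements BeamSqlUdf {
  // Computes input^3 for each invocation of the SQL function.
  public static Integer eval(Integer input) {
    return input * input * input;
  }
}
```

The class is then registered with the SQL query transform (the exact registration call varies by Beam release) so that it can be invoked by name from SQL.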
 
- BeamSqlUdfExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
Invokes a user-defined function (UDF).
 
- BeamSqlUdfExpression(Method, List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUdfExpression
 
-  
 
- BeamSqlUnaryOperator - Interface in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
An operator that is applied to already-evaluated arguments.
 
- BeamTableUtils - Class in org.apache.beam.sdk.extensions.sql.impl.schema
 
- 
Utility methods for working with BeamTable.
 
- BeamTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
 
-  
 
- BeamUncollectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to implement an uncorrelated 
Uncollect, aka UNNEST.
 
 
- BeamUncollectRel(RelOptCluster, RelTraitSet, RelNode, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
-  
 
- BeamUncollectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
 
- BeamUnionRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
 
- BeamUnionRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
-  
 
- BeamUnionRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
 
- BeamUnnestRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to implement UNNEST, supporting specifically only 
Correlate with
 
Uncollect.
 
 
- BeamUnnestRel(RelOptCluster, RelTraitSet, RelNode, RelDataType, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
-  
 
- BeamUnnestRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
 
- BeamValuesRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamRelNode to replace a Values node.
 
- BeamValuesRel(RelOptCluster, RelDataType, ImmutableList<ImmutableList<RexLiteral>>, RelTraitSet) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
-  
 
- BeamValuesRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
ConverterRule to replace Values with BeamValuesRel.
 
- beforeStart(ClientCallStreamObserver<RespT>) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
-  
 
- begin() - Method in class org.apache.beam.sdk.Pipeline
 
- 
Returns a 
PBegin owned by this Pipeline.
 
 
- beginningOnDay(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- beginningOnDay(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
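A hedged sketch of the beginningOnDay methods above, which shift the start of calendar-based windows; `input` is an assumed PCollection:

```java
// Hypothetical usage: assign elements to one-month calendar windows that
// each begin on day 10 of the month, rather than on day 1.
input.apply(Window.into(
    CalendarWindows.months(1).beginningOnDay(10)));
```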
 
-  
 
- BIG_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- BIG_QUERY_INSERT_ERROR_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
-  
 
- BigDecimalCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- BigDecimalConverter - Class in org.apache.beam.sdk.extensions.sql.impl.utils
 
- 
Provides converters from 
BigDecimal to other numeric types based on the input 
Schema.TypeName.
 
 
- BigDecimalConverter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
 
-  
 
- bigdecimals() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- BigEndianIntegerCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- BigEndianLongCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- BigEndianShortCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- BigIntegerCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- bigintegers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
-  
 
- BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
A set of helper functions and classes used by 
BigQueryIO.
 
 
- BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
-  
 
- BigQueryInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Model definition for BigQueryInsertError.
 
- BigQueryInsertError(TableRow, TableDataInsertAllResponse.InsertErrors, TableReference) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
-  
 
- BigQueryInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- BigQueryInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
-  
 
- BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- BigQueryIO.TypedRead<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- BigQueryIO.TypedRead.QueryPriority - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
- 
An enumeration type for the priority of a query.
 
- BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- BigQueryIO.Write.CreateDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
- 
An enumeration type for the BigQuery create disposition strings.
 
- BigQueryIO.Write.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Determines the method used to insert data in BigQuery.
 
- BigQueryIO.Write.WriteDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
- 
An enumeration type for the BigQuery write disposition strings.
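As a hedged sketch of how the BigQueryIO.Write enums above combine, assuming `rows` is a PCollection&lt;TableRow&gt; and the table spec is a placeholder:

```java
// Hypothetical write: create the table if needed, append to it, and load
// via batch file loads. CREATE_IF_NEEDED would also require a table schema.
rows.apply(BigQueryIO.writeTableRows()
    .to("project:dataset.table")
    .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
    .withMethod(BigQueryIO.Write.Method.FILE_LOADS));
```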
 
- BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Properties needed when using Google BigQuery with the Apache Beam SDK.
 
- BigQueryServices - Interface in org.apache.beam.sdk.io.gcp.bigquery
 
- 
An interface for real, mock, or fake implementations of Cloud BigQuery services.
 
- BigQueryServices.DatasetService - Interface in org.apache.beam.sdk.io.gcp.bigquery
 
- 
An interface to get, create and delete Cloud BigQuery datasets and tables.
 
- BigQueryServices.JobService - Interface in org.apache.beam.sdk.io.gcp.bigquery
 
- 
An interface for the Cloud BigQuery load service.
 
- BigQueryTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
 
- 
BigQuery table provider.
 
- BigQueryTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
 
-  
 
- BigQueryUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Utility methods for BigQuery related operations.
 
- BigQueryUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
-  
 
- BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
 
- 
Transforms for reading from and writing to Google Cloud Bigtable.
 
 
- BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
 
- 
A 
PTransform that reads from Google Cloud Bigtable.
 
 
- BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
 
- 
A 
PTransform that writes to Google Cloud Bigtable.
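A hedged sketch of BigtableIO.Read; the project, instance, and table ids are placeholders and `p` is an assumed Pipeline:

```java
// Hypothetical read: each element of the output is a Bigtable row.
PCollection<com.google.bigtable.v2.Row> rows = p.apply(
    BigtableIO.read()
        .withProjectId("my-project")
        .withInstanceId("my-instance")
        .withTableId("my-table"));
```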
 
 
- BinaryCombineDoubleFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
-  
 
- BinaryCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
-  
 
- BinaryCombineIntegerFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
-  
 
- BinaryCombineLongFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
-  
 
- bind(String, StateBinder) - Method in interface org.apache.beam.sdk.state.StateSpec
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
-  
 
- bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
-  
 
- bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
-  
 
- bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
-  
 
- bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
-  
 
- bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
-  
 
- bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in interface org.apache.beam.sdk.state.StateBinder
 
- 
 
- BitSetCoder - Class in org.apache.beam.sdk.coders
 
- 
Coder for BitSet.
 
- Block() - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.Block
 
-  
 
- BlockBasedReader(BlockBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
-  
 
- BlockBasedSource<T> - Class in org.apache.beam.sdk.io
 
- 
A 
BlockBasedSource is a 
FileBasedSource where a file consists of blocks of
 records.
 
 
- BlockBasedSource(String, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
 
- 
Creates a BlockBasedSource based on a file name or pattern.
 
- BlockBasedSource(String, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
 
- 
 
- BlockBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
 
- 
 
- BlockBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
 
- 
 
- BlockBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
 
- 
Creates a BlockBasedSource for a single file.
 
- BlockBasedSource.Block<T> - Class in org.apache.beam.sdk.io
 
- 
A Block represents a block of records that can be read.
 
- BlockBasedSource.BlockBasedReader<T> - Class in org.apache.beam.sdk.io
 
- 
 
- BlockingQueueIterator(BlockingQueue<T>) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.BlockingQueueIterator
 
-  
 
- BOOLEAN - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- BOOLEAN - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of boolean fields.
 
- BooleanCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- BooleanCoder() - Constructor for class org.apache.beam.sdk.coders.BooleanCoder
 
-  
 
- booleans() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
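A hedged sketch of how the TypeDescriptors factory methods above are used: they supply the output type that Java cannot infer from a lambda. `lines` is an assumed PCollection&lt;String&gt;:

```java
// Hypothetical usage: map each line to whether it is non-empty.
PCollection<Boolean> nonEmpty = lines.apply(
    MapElements.into(TypeDescriptors.booleans())
        .via((String s) -> !s.isEmpty()));
```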
 
- 
 
- Bounded(SparkContext, BoundedSource<T>, SerializablePipelineOptions, String) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
-  
 
- BoundedReader() - Constructor for class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
-  
 
- BoundedReadFromUnboundedSource<T> - Class in org.apache.beam.sdk.io
 
- 
PTransform that reads a bounded amount of data from an 
UnboundedSource, specified
 as one or both of a maximum number of elements or a maximum period of time to read.
 
 
- BoundedSource<T> - Class in org.apache.beam.sdk.io
 
- 
A 
Source that reads a finite amount of input and, because of that, supports some
 additional operations.
 
 
- BoundedSource() - Constructor for class org.apache.beam.sdk.io.BoundedSource
 
-  
 
- BoundedSource.BoundedReader<T> - Class in org.apache.beam.sdk.io
 
- 
A Reader that reads a bounded amount of input and supports some additional operations,
 such as progress estimation and dynamic work rebalancing.
 
- BoundedSourceWrapper<T> - Class in org.apache.beam.runners.gearpump.translators.io
 
- 
A wrapper over BoundedSource for the Gearpump DataSource API.
 
- BoundedSourceWrapper(BoundedSource<T>, PipelineOptions) - Constructor for class org.apache.beam.runners.gearpump.translators.io.BoundedSourceWrapper
 
-  
 
- BoundedWindow - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
BoundedWindow represents window information assigned to data elements.
 
 
- BoundedWindow() - Constructor for class org.apache.beam.sdk.transforms.windowing.BoundedWindow
 
-  
 
- boundedWindowToGearpumpWindow(BoundedWindow) - Static method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils
 
-  
 
- broadcast(JavaSparkContext) - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
-  
 
- BufferedExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
 
- 
Sorter that uses in-memory sorting until the values no longer fit into memory, and then
 falls back to external sorting.
 
- BufferedExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
 
- 
Contains configuration for the sorter.
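A hedged sketch of using the sorter above via SortValues, assuming `grouped` is a PCollection of the keyed, grouped shape that SortValues expects:

```java
// Hypothetical usage: sort the values within each key using the buffered
// external sorter's default configuration.
grouped.apply(SortValues.create(BufferedExternalSorter.options()));
```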
 
- BufferingStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
 
- 
A thread-safe StreamObserver that uses a bounded queue to pass elements to a processing
 thread responsible for interacting with the underlying CallStreamObserver.
 
- BufferingStreamObserver(Phaser, CallStreamObserver<T>, ExecutorService, int) - Constructor for class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
-  
 
- build() - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate.TimerUpdateBuilder
 
- 
 
- build() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.Reinterpreter.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
-  
 
- build() - Method in class org.apache.beam.sdk.values.Row.Builder
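A hedged sketch of the builder pattern these build() methods complete, for Schema and Row; the field names and values are illustrative:

```java
// Hypothetical schema with two fields, and a row conforming to it.
Schema schema = Schema.builder()
    .addStringField("name")
    .addInt32Field("age")
    .build();
Row row = Row.withSchema(schema)
    .addValues("alice", 30)
    .build();
```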
 
-  
 
- buildBeamSqlSchema(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
- 
Create a RowsBuilder with the specified row type info.
 
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
 
-  
 
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
 
-  
 
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonTableProvider
 
-  
 
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
-  
 
- buildBeamSqlTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
 
- 
 
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
 
-  
 
- buildBeamSqlTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
-  
 
- builder(StructuralKey<?>) - Static method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate
 
- 
Creates a new 
WatermarkManager.TimerUpdate builder with the provided completed timers; the set and
 deleted timers still need to be added to it.
 
 
- builder() - Static method in class org.apache.beam.runners.fnexecution.jobsubmission.JobPreparation
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
 
- 
 
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion.Builder
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.Reinterpreter
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.Reinterpreter.Builder
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubMessageToRow
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.Builder
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
 
- 
 
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.metrics.MetricsFilter
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
-  
 
- builder() - Static method in class org.apache.beam.sdk.schemas.Schema
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Builder
 
-  
 
- Builder() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
-  
 
- builderFor(Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
 
- 
Create a ManagedChannelBuilder for the provided Endpoints.ApiServiceDescriptor.
 
- builderFor(Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.test.InProcessManagedChannelFactory
 
-  
 
- buildIOReader(PBegin) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlTable
 
- 
Creates a PCollection<Row> from the source.
 
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
 
-  
 
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQueryTable
 
-  
 
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
-  
 
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
-  
 
- buildIOReader(PBegin) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
-  
 
- buildIOWriter(PCollection<Row>) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlTable
 
- 
Creates an IO.write() instance to write to the target.
 
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
 
-  
 
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQueryTable
 
-  
 
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
-  
 
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
 
-  
 
- buildIOWriter(PCollection<Row>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
-  
 
- buildPTransform() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
-  
 
- buildPTransform() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
-  
 
- buildRows(Schema, List<?>) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
- 
A convenient way to build BeamSqlRows.
 
- buildTemporaryFilename(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
Constructs a temporary file resource given the temporary directory and a filename.
 
- Bundle<T,CollectionT> - Interface in org.apache.beam.runners.local
 
- 
An immutable collection of elements which are part of a PCollection.
 
- BundleProgressHandler - Interface in org.apache.beam.runners.fnexecution.control
 
- 
A handler for bundle progress messages, both during bundle execution and on its completion.
 
- by(SerializableFunction<UserT, DestinationT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies how to partition elements into groups ("destinations").
 
- by(Contextful<Contextful.Fn<UserT, DestinationT>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- by(PredicateT) - Static method in class org.apache.beam.sdk.transforms.Filter
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that satisfy the given predicate.
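Filter.by keeps only the elements that satisfy the given predicate. A minimal plain-Java sketch of that semantics (the Beam PCollection and pipeline plumbing are omitted; FilterSketch and filterBy are illustrative names, not Beam API):

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class FilterSketch {
    // Keep only the elements that satisfy the predicate, preserving order,
    // mirroring the contract Filter.by applies to each element of a PCollection<T>.
    public static <T> List<T> filterBy(List<T> input, Predicate<T> predicate) {
        return input.stream().filter(predicate).collect(Collectors.toList());
    }
}
```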
 
- byFieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
 
- 
Returns a transform that groups all elements in the input 
PCollection keyed by the
 fields specified.
 
 
- byFieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
 
- 
Returns a transform that groups all elements in the input 
PCollection keyed by the list
 of fields specified.
 
 
- byFieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Group
 
- 
Returns a transform that groups all elements in the input 
PCollection keyed by the list
 of fields specified.
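The grouping semantics behind these Group factory methods can be sketched in plain Java by keying elements with an extracted key; here a key function stands in for the schema field list, and GroupByKeySketch is an illustrative name, not Beam API:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class GroupByKeySketch {
    // Groups elements by an extracted key, sketching what Group.byFieldNames
    // does for schema rows (field extraction is reduced to a key function).
    public static <T, K> Map<K, List<T>> groupBy(List<T> input, Function<T, K> keyFn) {
        Map<K, List<T>> groups = new LinkedHashMap<>();
        for (T element : input) {
            groups.computeIfAbsent(keyFn.apply(element), k -> new ArrayList<>()).add(element);
        }
        return groups;
    }
}
```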
 
 
- BYTE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of byte fields.
 
- ByteArray - Class in org.apache.beam.runners.spark.util
 
- 
Serializable byte array.
 
- ByteArray(byte[]) - Constructor for class org.apache.beam.runners.spark.util.ByteArray
 
-  
 
- ByteArrayCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- ByteCoder - Class in org.apache.beam.sdk.coders
 
- 
A 
ByteCoder encodes 
Byte values in 1 byte using Java serialization.
 
 
- ByteKey - Class in org.apache.beam.sdk.io.range
 
- 
A class representing a key consisting of an array of bytes.
 
- ByteKeyRange - Class in org.apache.beam.sdk.io.range
 
- 
A class representing a range of 
ByteKeys.
 
 
- ByteKeyRangeTracker - Class in org.apache.beam.sdk.io.range
 
- 
 
- ByteKeyRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
 
- 
 
- BYTES - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of bytes fields.
 
- bytes() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- bytesRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
 
- 
Counter of bytes read by a source.
 
- bytesReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
 
- 
Counter of bytes read by a source split.
 
- ByteStringCoder - Class in org.apache.beam.sdk.extensions.protobuf
 
- 
A 
Coder for 
ByteString objects based on their encoded Protocol Buffer form.
 
 
- bytesWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
 
- 
Counter of bytes written to a sink.
 
- CACHED_GETTERS - Static variable in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
-  
 
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
 
-  
 
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlDivideExpression
 
-  
 
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMinusExpression
 
-  
 
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlModExpression
 
-  
 
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMultiplyExpression
 
-  
 
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlPlusExpression
 
-  
 
- CalciteUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
 
- 
Utility methods for Calcite related operations.
 
- CalciteUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAbsExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAcosExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAsinExpression
 
-  
 
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtan2Expression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtanExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCeilExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCosExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCotExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlDegreesExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlExpExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlFloorExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLnExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLogExpression
 
-  
 
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
 
- 
The base method for implementing math binary functions.
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathUnaryExpression
 
- 
Handles operands of the numeric types in SqlTypeName.NUMERIC_TYPES.
 
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPowerExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRadiansExpression
 
-  
 
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRoundExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSignExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSinExpression
 
-  
 
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTanExpression
 
-  
 
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTruncateExpression
 
-  
 
- CalendarWindows - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A collection of WindowFns that window values into calendar-based windows such as spans of days, months, or years.
 
 
- CalendarWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.CalendarWindows
 
-  
 
- CalendarWindows.DaysWindows - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
WindowFn that windows elements into periods measured by days.
 
 
- CalendarWindows.MonthsWindows - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
WindowFn that windows elements into periods measured by months.
 
 
- CalendarWindows.YearsWindows - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
WindowFn that windows elements into periods measured by years.
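The window-assignment arithmetic behind these WindowFns can be sketched with java.time: for CalendarWindows.days(1), each timestamp maps to the [start, end) span of the calendar day containing it. This is a simplified sketch assuming UTC; the real WindowFn also supports start offsets, time zones, and multi-day periods, and DayWindowSketch is an illustrative name, not Beam API.

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneOffset;

public class DayWindowSketch {
    // Returns the [start, end) bounds of the UTC calendar day containing the
    // timestamp, mirroring the assignment done by a one-day calendar window.
    public static Instant[] dayWindowOf(Instant timestamp) {
        LocalDate day = timestamp.atZone(ZoneOffset.UTC).toLocalDate();
        Instant start = day.atStartOfDay(ZoneOffset.UTC).toInstant();
        Instant end = day.plusDays(1).atStartOfDay(ZoneOffset.UTC).toInstant();
        return new Instant[] {start, end};
    }
}
```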
 
 
- cancel() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
-  
 
- cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
-  
 
- cancel() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
-  
 
- cancel() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
-  
 
- cancel(JobApi.CancelJobRequest, StreamObserver<JobApi.CancelJobResponse>) - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- cancel() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
-  
 
- cancel() - Method in class org.apache.beam.runners.flink.FlinkJobInvocation
 
-  
 
- cancel() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
-  
 
- cancel(JobApi.CancelJobRequest, StreamObserver<JobApi.CancelJobResponse>) - Method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
-  
 
- cancel() - Method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation
 
- 
Cancel the job.
 
- cancel() - Method in class org.apache.beam.runners.gearpump.GearpumpPipelineResult
 
-  
 
- cancel() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- cancel() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- cancel() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
 
-  
 
- cancel() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
 
- 
Cancels the client, causing it to drop any future inbound data.
 
- cancel() - Method in interface org.apache.beam.sdk.PipelineResult
 
- 
Cancels the pipeline execution.
 
- cancelled() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
 
- 
Report that the pipeline has been cancelled.
 
- canConvert(SqlTypeName, SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.Reinterpreter
 
-  
 
- canConvertConvention(Convention) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
-  
 
- CannotProvideCoderException - Exception in org.apache.beam.sdk.coders
 
- 
 
- CannotProvideCoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
-  
 
- CannotProvideCoderException(String, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
-  
 
- CannotProvideCoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
-  
 
- CannotProvideCoderException(String, Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
-  
 
- CannotProvideCoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
-  
 
- CannotProvideCoderException(Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
-  
 
- CannotProvideCoderException.ReasonCode - Enum in org.apache.beam.sdk.coders
 
- 
Indicates the reason that 
Coder inference failed.
 
 
- canStopPolling(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
 
- 
 
- CassandraIO - Class in org.apache.beam.sdk.io.cassandra
 
- 
An IO to read from Apache Cassandra.
 
- CassandraIO.Read<T> - Class in org.apache.beam.sdk.io.cassandra
 
- 
 
- CassandraIO.Write<T> - Class in org.apache.beam.sdk.io.cassandra
 
- 
 
- CassandraService<T> - Interface in org.apache.beam.sdk.io.cassandra
 
- 
An interface for real or fake implementations of Cassandra.
 
- CassandraService.Writer<T> - Interface in org.apache.beam.sdk.io.cassandra
 
- 
Writer for an entity.
 
- CassandraServiceImpl<T> - Class in org.apache.beam.sdk.io.cassandra
 
- 
An implementation of the 
CassandraService that actually use a Cassandra instance.
 
 
- CassandraServiceImpl() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
 
-  
 
- CassandraServiceImpl.WriterImpl - Class in org.apache.beam.sdk.io.cassandra
 
- 
Writer storing an entity into Apache Cassandra database.
 
- CHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- CHAR_LENGTH - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
-  
 
- characters() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
-  
 
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
-  
 
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
- 
Called by the runner after DoFn.ProcessElement returns.
 
- checkIfAnySubscriptionExists(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
 
- 
Checks whether any subscription exists for the given topic.
 
- checkpoint() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
-  
 
- checkpoint() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
-  
 
- checkpoint() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
- 
Signals that the current 
DoFn.ProcessElement call should terminate as soon as possible:
 after this method returns, the tracker MUST refuse all future claim calls, and 
RestrictionTracker.checkDone() MUST succeed.
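The tryClaim/checkpoint contract described above can be sketched for an offset range in plain Java. This is a simplified illustration (names and API shape are illustrative; Beam's real OffsetRangeTracker is richer): after checkpoint() the residual range is split off, and since claims must be monotonically increasing, every future claim falls outside the shrunken range and is refused.

```java
public class OffsetRangeTrackerSketch {
    private final long from;
    private long to;              // exclusive upper bound of the current range
    private long lastClaimed = -1;

    public OffsetRangeTrackerSketch(long from, long to) {
        this.from = from;
        this.to = to;
    }

    // Attempts to claim the given offset; claims must be monotonically
    // increasing, and fail once the (possibly checkpointed) range is exhausted.
    public boolean tryClaim(long offset) {
        if (offset <= lastClaimed) {
            throw new IllegalArgumentException("claims must be monotonically increasing");
        }
        if (offset >= to) {
            return false;
        }
        lastClaimed = offset;
        return true;
    }

    // Splits off the unprocessed remainder [lastClaimed + 1, to) as the
    // residual and shrinks the current range so all future claims are refused.
    public long[] checkpoint() {
        long splitPoint = lastClaimed + 1;
        long[] residual = new long[] {splitPoint, to};
        this.to = splitPoint;
        return residual;
    }
}
```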
 
 
- classesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
 
- 
 
- classesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
-  
 
- classNamesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
 
- 
 
- classNamesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
-  
 
- CLASSPATH_SCHEME - Static variable in class org.apache.beam.runners.apex.ApexRunner
 
-  
 
- cleanup() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
-  
 
- cleanupOnCancelOrFinish() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
- 
Opportunity for a subclass to perform cleanup, such as removing temporary files.
 
- clear(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
 
- 
Clears the bag user state for the given key and window.
 
- clear() - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
 
-  
 
- clear() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
-  
 
- clear() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
-  
 
- clear() - Method in interface org.apache.beam.sdk.state.State
 
- 
Clear out the state location.
 
- clearCache() - Static method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
-  
 
- clearOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- clearOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- clientBuffered(ExecutorService) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
- 
Create a buffering 
OutboundObserverFactory for client-side RPCs with the specified
 
ExecutorService and the default buffer size.
 
 
- clientBuffered(ExecutorService, int) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
- 
Create a buffering 
OutboundObserverFactory for client-side RPCs with the specified
 
ExecutorService and buffer size.
 
 
- ClientConfigurationFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.ClientConfigurationFactory
 
-  
 
- clientDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
- 
Create the default 
OutboundObserverFactory for client-side RPCs, which uses basic
 unbuffered flow control and adds synchronization to provide thread safety of access to the
 returned observer.
 
 
- Clock - Interface in org.apache.beam.runners.direct
 
- 
Access to the current time.
 
- clone() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
 
-  
 
- clone() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
-  
 
- clonesOf(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
 
-  
 
- close() - Method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactRetrievalService
 
-  
 
- close() - Method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactStagerService
 
-  
 
- close() - Method in class org.apache.beam.runners.direct.portable.artifact.UnsupportedArtifactRetrievalService
 
-  
 
- close() - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.SimpleStageBundleFactory
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.WrappedSdkHarnessClient
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
 
-  
 
- close() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
 
- 
Closes this bundle.
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.ActiveBundle
 
- 
Blocks till bundle processing is finished.
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
 
- 
Deprecated.
  
- close() - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
-  
 
- close() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
 
- 
.
 
- close() - Method in interface org.apache.beam.runners.fnexecution.FnService
 
- 
.
 
- close() - Method in class org.apache.beam.runners.fnexecution.GrpcFnServer
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
 
-  
 
- close() - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
 
-  
 
- close() - Method in class org.apache.beam.runners.gearpump.translators.io.GearpumpSource
 
-  
 
- close() - Method in class org.apache.beam.runners.reference.CloseableResource
 
- 
Closes the underlying resource.
 
- close(T) - Method in interface org.apache.beam.runners.reference.CloseableResource.Closer
 
-  
 
- close() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- close() - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionExecutor
 
-  
 
- close() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlFnExecutor
 
-  
 
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver
 
-  
 
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
-  
 
- close() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
 
- 
.
 
- close() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.BlockingQueueIterator
 
-  
 
- close() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
-  
 
- close() - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl.WriterImpl
 
-  
 
- close() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
Closes the channel and returns the bundle result.
 
- close() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
- 
Closes any ReadableByteChannel created for the current reader.
 
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
- 
Gracefully close the underlying netty channel.
 
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
-  
 
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
-  
 
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
-  
 
- close() - Method in class org.apache.beam.sdk.io.Source.Reader
 
- 
Closes the reader.
 
- close() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- CloseableFnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
 
- 
A receiver of streamed data that can be closed.
 
- CloseableResource<T> - Class in org.apache.beam.runners.reference
 
- 
An AutoCloseable that wraps a resource that needs to be cleaned up but does not implement
 AutoCloseable itself.
 
- CloseableResource.CloseException - Exception in org.apache.beam.runners.reference
 
- 
An exception that wraps errors thrown while a resource is being closed.
 
- CloseableResource.Closer<T> - Interface in org.apache.beam.runners.reference
 
- 
A function that knows how to clean up after a resource.
 
- CloseableThrowingConsumer<T> - Interface in org.apache.beam.sdk.fn.function
 
- 
 
- closeTo(double, double) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- CloudDebuggerOptions - Interface in org.apache.beam.runners.dataflow.options
 
- 
Options for controlling Cloud Debugger.
 
- CloudObject - Class in org.apache.beam.runners.dataflow.util
 
- 
A representation of an arbitrary Java object to be instantiated by Dataflow workers.
 
- cloudObjectClassName() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
 
- 
 
- CloudObjects - Class in org.apache.beam.runners.dataflow.util
 
- 
 
- CloudObjectTranslator<T> - Interface in org.apache.beam.runners.dataflow.util
 
- 
A translator that takes an object and creates a 
CloudObject which can be converted back
 to the original object.
 
 
- CloudResourceManagerOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
 
- 
Properties needed when using Google CloudResourceManager with the Apache Beam SDK.
 
- CO_GBK_RESULT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- Coder<T> - Class in org.apache.beam.sdk.coders
 
- 
A 
Coder<T> defines how to encode and decode values of type 
T into
 byte streams.
 
 
- Coder() - Constructor for class org.apache.beam.sdk.coders.Coder
 
-  
 
- Coder.Context - Class in org.apache.beam.sdk.coders
 
- 
 
- Coder.NonDeterministicException - Exception in org.apache.beam.sdk.coders
 
- 
Exception thrown by 
Coder.verifyDeterministic() if the encoding is not deterministic,
 including details of why the encoding is not deterministic.
 
 
- CoderCloudObjectTranslatorRegistrar - Interface in org.apache.beam.runners.dataflow.util
 
- 
Coder authors have the ability to automatically have their 
Coder registered with
 the Dataflow Runner by creating a 
ServiceLoader entry and a concrete implementation of
 this interface.
 
 
- coderConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T> and values of type T, the values are equal
 if and only if the encoded bytes are equal.
 
- coderConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the values are equal if and only if the encoded bytes are equal, in any Coder.Context.
 
- coderDecodeEncodeContentsEqual(Coder<CollectionT>, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
 
- coderDecodeEncodeContentsEqualInContext(Coder<CollectionT>, Coder.Context, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context.
 
- coderDecodeEncodeContentsInSameOrder(Coder<IterableT>, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
 
- coderDecodeEncodeContentsInSameOrderInContext(Coder<IterableT>, Coder.Context, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<Iterable<T>>, and value of type Iterable<T>,
 encoding followed by decoding yields an equal value of type Collection<T>, in the given
 Coder.Context.
 
- coderDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T>, and value of type T, encoding followed by
 decoding yields an equal value of type T, in any Coder.Context.
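The round-trip property this helper verifies can be illustrated with a trivial UTF-8 string coder (a hypothetical stand-in, not a Beam Coder — the real check exercises an actual Coder<T> in each Coder.Context):

```java
import java.nio.charset.StandardCharsets;

public class CoderRoundTripSketch {
    // Encode/decode for a trivial UTF-8 string coder, used to illustrate the
    // decode(encode(x)).equals(x) property that coderDecodeEncodeEqual asserts.
    public static byte[] encode(String value) {
        return value.getBytes(StandardCharsets.UTF_8);
    }

    public static String decode(byte[] bytes) {
        return new String(bytes, StandardCharsets.UTF_8);
    }

    // The property check: encoding followed by decoding yields an equal value.
    public static boolean decodeEncodeEqual(String value) {
        return decode(encode(value)).equals(value);
    }
}
```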
 
- coderDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields an equal value of type T.
 
- coderDecodesBase64(Coder<T>, String, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
-  
 
- coderDecodesBase64(Coder<T>, List<String>, List<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
-  
 
- coderDecodesBase64ContentsEqual(Coder<IterableT>, String, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
-  
 
- coderDecodesBase64ContentsEqual(Coder<IterableT>, List<String>, List<IterableT>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
-  
 
- coderDeterministic(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T>, and values of type T, if the values are
 equal then the encoded bytes are equal, in any Coder.Context.
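The determinism property can be sketched with a trivial fixed-width encoder (the `DeterminismSketch` class is hypothetical, not a Beam Coder): equal input values must always produce byte-identical encodings.

```java
import java.util.Arrays;

// Illustrates the property coderDeterministic checks: if two values are
// equal, their encoded bytes must be equal. A big-endian int encoding
// trivially satisfies this, since the bytes are a pure function of the value.
public class DeterminismSketch {
    static byte[] encodeInt(int v) {
        return new byte[] {
            (byte) (v >>> 24), (byte) (v >>> 16), (byte) (v >>> 8), (byte) v
        };
    }

    public static void main(String[] args) {
        System.out.println(
            Arrays.equals(encodeInt(42), encodeInt(42))); // prints "true"
    }
}
```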
 
- coderDeterministicInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T>, Coder.Context, and values of type T, if the values are equal then the encoded bytes are equal.
 
- coderEncodesBase64(Coder<T>, T, String) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
-  
 
- coderEncodesBase64(Coder<T>, List<T>, List<String>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
-  
 
- CoderException - Exception in org.apache.beam.sdk.coders
 
- 
An Exception thrown if there is a problem encoding or decoding a value.
 
- CoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
-  
 
- CoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
-  
 
- CoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
-  
 
- coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.CoderProvider
 
- 
Returns a Coder<T> to use for values of a particular type, given the Coders for each of
 the type's generic parameter types.
 
- coderForFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.coders.RowCoder
 
- 
Returns the coder used for a given primitive type.
 
- coderFromCloudObject(CloudObject) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
 
-  
 
- CoderHelpers - Class in org.apache.beam.runners.spark.coders
 
- 
Serialization utility class.
 
- CoderProperties - Class in org.apache.beam.sdk.testing
 
- 
Properties for use in 
Coder tests.
 
 
- CoderProperties() - Constructor for class org.apache.beam.sdk.testing.CoderProperties
 
-  
 
- CoderProperties.TestElementByteSizeObserver - Class in org.apache.beam.sdk.testing
 
- 
An ElementByteSizeObserver that records the observed element sizes for testing
 purposes.
 
- CoderProvider - Class in org.apache.beam.sdk.coders
 
- 
 
- CoderProvider() - Constructor for class org.apache.beam.sdk.coders.CoderProvider
 
-  
 
- CoderProviderRegistrar - Interface in org.apache.beam.sdk.coders
 
- 
Coder creators can automatically have their 
coders
 registered with this SDK by creating a 
ServiceLoader entry and a concrete implementation
 of this interface.
 
 
- CoderProviders - Class in org.apache.beam.sdk.coders
 
- 
Static utility methods for creating and working with 
CoderProviders.
 
 
- CoderRegistry - Class in org.apache.beam.sdk.coders
 
- 
 
- coderSerializable(Coder<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that the given Coder<T> can be correctly serialized and deserialized.
 
- CoGbkResult - Class in org.apache.beam.sdk.transforms.join
 
- 
 
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
 
- 
 
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>, int) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
 
-  
 
- CoGbkResult.CoGbkResultCoder - Class in org.apache.beam.sdk.transforms.join
 
- 
 
- CoGbkResultSchema - Class in org.apache.beam.sdk.transforms.join
 
- 
 
- CoGbkResultSchema(TupleTagList) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
- 
Builds a schema from a tuple of TupleTag<?>s.
 
- CoGroupByKey<K> - Class in org.apache.beam.sdk.transforms.join
 
- 
 
- COLLECTION_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- CollectionCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
 
- CollectionCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.CollectionCoder
 
-  
 
- collectionId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
-  
 
- column(SqlParserPos, SqlIdentifier, SqlDataTypeSpec, SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
 
- 
Creates a column declaration.
 
- Combine - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for combining PCollection elements globally and per-key.
 
- combine(Iterable<? extends Instant>) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
 
- 
 
- combine(Instant...) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
 
- 
 
- Combine.AccumulatingCombineFn<InputT,AccumT extends Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT>,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
 
- 
The type of mutable accumulator values used by this AccumulatingCombineFn.
 
- Combine.BinaryCombineDoubleFn - Class in org.apache.beam.sdk.transforms
 
- 
An abstract subclass of 
Combine.CombineFn for implementing combiners that are more easily and
 efficiently expressed as binary operations on 
doubles.
 
 
- Combine.BinaryCombineFn<V> - Class in org.apache.beam.sdk.transforms
 
- 
An abstract subclass of 
Combine.CombineFn for implementing combiners that are more easily
 expressed as binary operations.
 
 
- Combine.BinaryCombineIntegerFn - Class in org.apache.beam.sdk.transforms
 
- 
An abstract subclass of 
Combine.CombineFn for implementing combiners that are more easily and
 efficiently expressed as binary operations on 
ints.
 
 
- Combine.BinaryCombineLongFn - Class in org.apache.beam.sdk.transforms
 
- 
An abstract subclass of 
Combine.CombineFn for implementing combiners that are more easily and
 efficiently expressed as binary operations on 
longs.
 
 
- Combine.CombineFn<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
A CombineFn<InputT, AccumT, OutputT> specifies how to combine a collection of input
 values of type InputT into a single output value of type OutputT.
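The CombineFn contract (createAccumulator / addInput / mergeAccumulators / extractOutput) can be sketched in plain Java as a mean combiner. `MeanSketch` is hypothetical and mirrors only the shape of the contract; it does not extend the Beam class.

```java
// A plain-Java sketch of the CombineFn lifecycle, computing a mean:
// accumulators are created empty, absorb inputs, merge associatively,
// and finally yield one output value.
public class MeanSketch {
    static class Accum { long sum; long count; }

    static Accum createAccumulator() { return new Accum(); }

    static Accum addInput(Accum acc, long input) {
        acc.sum += input;
        acc.count++;
        return acc;
    }

    static Accum mergeAccumulators(Accum a, Accum b) {
        a.sum += b.sum;
        a.count += b.count;
        return a;
    }

    static double extractOutput(Accum acc) {
        return acc.count == 0 ? 0.0 : (double) acc.sum / acc.count;
    }

    public static void main(String[] args) {
        // Two partial accumulators, e.g. from two bundles, merged at the end.
        Accum a = addInput(addInput(createAccumulator(), 1), 2);
        Accum b = addInput(createAccumulator(), 6);
        System.out.println(extractOutput(mergeAccumulators(a, b))); // prints "3.0"
    }
}
```

Mergeability is what lets a runner combine partial results from different workers or windows without reprocessing inputs.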
 
- Combine.Globally<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
Combine.Globally<InputT, OutputT> takes a 
PCollection<InputT> and returns a
 
PCollection<OutputT> whose elements are the result of combining all the elements in
 each window of the input 
PCollection, using a specified 
CombineFn<InputT, AccumT, OutputT>.
 
 
- Combine.GloballyAsSingletonView<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
Combine.GloballyAsSingletonView<InputT, OutputT> takes a 
PCollection<InputT>
 and returns a 
PCollectionView<OutputT> whose elements are the result of combining all
 the elements in each window of the input 
PCollection, using a specified 
CombineFn<InputT, AccumT, OutputT>.
 
 
- Combine.GroupedValues<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
GroupedValues<K, InputT, OutputT> takes a 
PCollection<KV<K, Iterable<InputT>>>,
 such as the result of 
GroupByKey, applies a specified 
CombineFn<InputT, AccumT, OutputT> to each of the input 
KV<K, Iterable<InputT>>
 elements to produce a combined output 
KV<K, OutputT> element, and returns a 
PCollection<KV<K, OutputT>> containing all the combined output elements.
 
 
- Combine.Holder<V> - Class in org.apache.beam.sdk.transforms
 
- 
Holds a single value of type V which may or may not be present.
 
- Combine.IterableCombineFn<V> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Combine.PerKey<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
PerKey<K, InputT, OutputT> takes a PCollection<KV<K, InputT>>, groups it by
 key, applies a combining function to the InputT values associated with each key to
 produce a combined OutputT value, and returns a PCollection<KV<K, OutputT>>
 representing a map from each distinct key of the input PCollection to the corresponding
 combined value.
 
- Combine.PerKeyWithHotKeyFanout<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Combine.SimpleCombineFn<V> - Class in org.apache.beam.sdk.transforms
 
- 
Deprecated.
 
- CombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.CombineFn
 
-  
 
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Count
 
- 
 
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Latest
 
- 
 
- combineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
 
- 
Returns a 
Combine.CombineFn that computes a fixed-sized uniform sample of its inputs.
 
 
- CombineFnBase - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- CombineFnBase() - Constructor for class org.apache.beam.sdk.transforms.CombineFnBase
 
-  
 
- CombineFnBase.GlobalCombineFn<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- CombineFns - Class in org.apache.beam.sdk.transforms
 
- 
Static utility methods that create combine function instances.
 
- CombineFns() - Constructor for class org.apache.beam.sdk.transforms.CombineFns
 
-  
 
- CombineFns.CoCombineResult - Class in org.apache.beam.sdk.transforms
 
- 
A tuple of outputs produced by a composed combine function.
 
- CombineFns.ComposeCombineFnBuilder - Class in org.apache.beam.sdk.transforms
 
- 
 
- CombineFns.ComposedCombineFn<DataT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- CombineFns.ComposedCombineFnWithContext<DataT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- CombineFnTester - Class in org.apache.beam.sdk.testing
 
- 
 
- CombineFnTester() - Constructor for class org.apache.beam.sdk.testing.CombineFnTester
 
-  
 
- CombineFnWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
-  
 
- CombineWithContext - Class in org.apache.beam.sdk.transforms
 
- 
This class contains combine functions that have access to PipelineOptions and side inputs
 through CombineWithContext.Context.
 
- CombineWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext
 
-  
 
- CombineWithContext.CombineFnWithContext<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
A combine function that has access to PipelineOptions and side inputs through CombineWithContext.Context.
 
- CombineWithContext.Context - Class in org.apache.beam.sdk.transforms
 
- 
Information accessible to all methods in CombineFnWithContext and KeyedCombineFnWithContext.
 
- CombineWithContext.RequiresContextInternal - Interface in org.apache.beam.sdk.transforms
 
- 
An internal interface for signaling that a GloballyCombineFn or a PerKeyCombineFn needs to access CombineWithContext.Context.
 
- combining(Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
 
- combining(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- combining(Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
Identical to combining(CombineFn), but with an accumulator coder explicitly supplied.
 
- combining(Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- combiningFromInputInternal(Coder<InputT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- CombiningState<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.state
 
- 
A 
ReadableState cell defined by a 
Combine.CombineFn, accepting multiple input values,
 combining them as specified into accumulators, and producing a single output value.
 
 
- comment(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
-  
 
- commit() - Method in class org.apache.beam.runners.fnexecution.splittabledofn.SDFFeederViaStateAndTimers
 
- 
Commits the state and timers: clears both if no checkpoint happened, or adjusts the restriction
 and sets a wake-up timer if a checkpoint happened.
 
- commitManifest(ArtifactApi.CommitManifestRequest, StreamObserver<ArtifactApi.CommitManifestResponse>) - Method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactStagerService
 
-  
 
- commitManifest(ArtifactApi.CommitManifestRequest, StreamObserver<ArtifactApi.CommitManifestResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 
-  
 
- commitOffsetsInFinalize() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Finalized offsets are committed to Kafka.
 
- commonPrefixLength(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
 
- 
Compute the length of the common prefix of the two provided sets of bytes.
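The common-prefix computation this entry describes can be sketched in a few lines of plain Java (the `PrefixSketch` class is hypothetical, not the Beam implementation):

```java
// Length of the common prefix of two byte arrays: the index of the
// first differing byte, or the shorter length if one is a prefix of
// the other.
public class PrefixSketch {
    static int commonPrefixLength(byte[] a, byte[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            if (a[i] != b[i]) {
                return i;
            }
        }
        return n;
    }

    public static void main(String[] args) {
        System.out.println(
            commonPrefixLength(new byte[] {1, 2, 3},
                               new byte[] {1, 2, 4})); // prints "2"
    }
}
```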
 
- compact(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
 
- 
Returns an accumulator that represents the same logical value as the input accumulator, but
 may have a more compact representation.
 
- compact(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
-  
 
- compact(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
-  
 
- compact(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
-  
 
- compact(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
- 
Returns an accumulator that represents the same logical value as the input accumulator, but
 may have a more compact representation.
 
- compare(JobMessage, JobMessage) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
 
-  
 
- compare(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
 
-  
 
- compare(RandomAccessData, RandomAccessData, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
 
- 
Compare the two sets of bytes starting at the given offset.
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
 
- 
Compare between String values, mapping to SqlTypeName.VARCHAR.
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
 
- 
Compare between Boolean values, mapping to SqlTypeName.BOOLEAN.
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
 
- 
Compare between Number values, including SqlTypeName.BIGINT, SqlTypeName.DECIMAL, SqlTypeName.DOUBLE, SqlTypeName.FLOAT, SqlTypeName.INTEGER, SqlTypeName.SMALLINT and SqlTypeName.TINYINT.
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
 
- 
Compare between DateTime values, mapping to SqlTypeName.DATETIME_TYPES.
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
 
-  
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
 
-  
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
 
-  
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
 
-  
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
 
-  
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLikeExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLikeExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLikeExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLikeExpression
 
-  
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
 
-  
 
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotLikeExpression
 
-  
 
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotLikeExpression
 
-  
 
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotLikeExpression
 
-  
 
- compare(DateTime, DateTime) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotLikeExpression
 
-  
 
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Largest
 
- 
Deprecated.
  
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Natural
 
-  
 
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Reversed
 
-  
 
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Smallest
 
- 
Deprecated.
  
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByKey
 
-  
 
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByValue
 
-  
 
- compareTo(ByteArray) - Method in class org.apache.beam.runners.spark.util.ByteArray
 
-  
 
- compareTo(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKey
 
- 
ByteKey implements 
Comparable<ByteKey> by comparing the arrays
 in lexicographic order.
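Lexicographic comparison of byte arrays, with bytes treated as unsigned values, can be sketched as follows. `ByteOrderSketch` is hypothetical and only illustrates the ordering described here, not Beam's implementation.

```java
// Unsigned lexicographic byte-array comparison: compare byte by byte
// as unsigned values (0..255); if one array is a prefix of the other,
// the shorter array sorts first.
public class ByteOrderSketch {
    static int compareLexicographic(byte[] a, byte[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            // Mask to compare as unsigned: (byte) 0xFF must sort after 0x01.
            int cmp = Integer.compare(a[i] & 0xFF, b[i] & 0xFF);
            if (cmp != 0) {
                return cmp;
            }
        }
        return Integer.compare(a.length, b.length);
    }

    public static void main(String[] args) {
        // Signed comparison would get this wrong, since (byte) 0xFF == -1.
        System.out.println(
            compareLexicographic(new byte[] {(byte) 0xFF},
                                 new byte[] {0x01}) > 0); // prints "true"
    }
}
```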
 
 
- compareTo(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
-  
 
- CompletableFutureInboundDataClient - Class in org.apache.beam.sdk.fn.data
 
- 
 
- complete() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- complete() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
 
-  
 
- complete() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
 
- 
Mark the client as completed.
 
- complete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
- 
Constructs a 
Watch.Growth.PollResult with the given outputs and declares that there will be no
 new outputs for the current input.
 
 
- complete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
- 
 
- completed() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
 
- 
Report that the pipeline has successfully completed.
 
- COMPONENT_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- compose() - Static method in class org.apache.beam.sdk.transforms.CombineFns
 
- 
 
- compose(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
 
- 
For a SerializableFunction<InputT, OutputT> fn, returns a PTransform
 given by applying fn.apply(v) to the input PCollection<InputT>.
 
- ComposeCombineFnBuilder() - Constructor for class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
 
-  
 
- COMPOSITE_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- CompositeSource - Class in org.apache.beam.runners.spark.metrics
 
- 
Composite source made up of several MetricRegistry instances.
 
- CompositeSource(String, MetricRegistry...) - Constructor for class org.apache.beam.runners.spark.metrics.CompositeSource
 
-  
 
- CompressedReader(CompressedSource<T>, FileBasedSource.FileBasedReader<T>) - Constructor for class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
- 
Create a CompressedReader from a CompressedSource and delegate reader.
 
- CompressedSource<T> - Class in org.apache.beam.sdk.io
 
- 
A Source that reads from compressed files.
 
- CompressedSource.CompressedReader<T> - Class in org.apache.beam.sdk.io
 
- 
 
- CompressedSource.CompressionMode - Enum in org.apache.beam.sdk.io
 
- 
 
- CompressedSource.DecompressingChannelFactory - Interface in org.apache.beam.sdk.io
 
- 
Factory interface for creating channels that decompress the content of an underlying channel.
 
- Compression - Enum in org.apache.beam.sdk.io
 
- 
Various compression types for reading/writing files.
 
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
-  
 
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
-  
 
- computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
-  
 
- CONCAT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
-  
 
- CONCAT_FIELD_NAMES - Static variable in class org.apache.beam.sdk.schemas.transforms.Unnest
 
- 
This is the default policy for naming fields.
 
- CONCRETE_CLASS - Static variable in class org.apache.beam.sdk.io.WriteFiles
 
- 
For internal use by runners.
 
- ConfigurationLocator() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
 
-  
 
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
-  
 
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
-  
 
- configure() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Returns a new builder for a 
Window transform for setting windowing parameters other
 than the windowing function.
 
 
- connect(String, Properties) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
-  
 
- connect(TableProvider) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
-  
 
- connect() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
- 
Connect to the Redis instance.
 
- CONNECT_STRING_PREFIX - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
-  
 
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ConnectionConfiguration
 
-  
 
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
-  
 
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
 
-  
 
- connectToSpanner() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ByteCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.Coder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DurationCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.FloatCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.InstantCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
- 
LengthPrefixCoder is consistent with equals if the nested Coder is.
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
- 
NullableCoder is consistent with equals if the nested Coder is.
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.RowCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
- 
Returns 
true if this 
Coder is injective with respect to 
Object.equals(java.lang.Object).
 
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
-  
 
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
-  
 
- ConsoleIO - Class in org.apache.beam.runners.spark.io
 
- 
Print to console.
 
- ConsoleIO.Write - Class in org.apache.beam.runners.spark.io
 
- 
Write to console.
 
- ConsoleIO.Write.Unbound<T> - Class in org.apache.beam.runners.spark.io
 
- 
 
- constant(FileBasedSink.FilenamePolicy, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
 
- 
 
- constant(FileBasedSink.FilenamePolicy) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
 
- 
A specialization of constant(FilenamePolicy, SerializableFunction) for the case where
 UserT and OutputT are the same type and the format function is the identity.
 
- constant(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
 
-  
 
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
-  
 
- contains(T) - Method in interface org.apache.beam.sdk.state.SetState
 
- 
Returns true if this set contains the specified element.
 
- contains(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- contains(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- contains(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- contains(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- contains(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
- 
Returns whether this window contains the given window.
 
- containsInAnyOrder(T...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Asserts that the iterable in question contains the provided elements.
 
- containsInAnyOrder(Iterable<T>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Asserts that the iterable in question contains the provided elements.
 
- containsInAnyOrder(T...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
- 
Checks that the Iterable contains the expected elements, in any order.
 
- containsInAnyOrder(Iterable<T>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
- 
Checks that the Iterable contains the expected elements, in any order.
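The "same elements, in any order" contract behind these assertions can be sketched outside Beam as a multiset comparison (this helper is illustrative, not the PAssert implementation):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class AnyOrderDemo {
    // True when actual and expected hold the same elements with the same
    // multiplicities, ignoring order.
    static <T> boolean containsInAnyOrder(List<T> actual, List<T> expected) {
        Map<T, Integer> counts = new HashMap<>();
        for (T t : actual) counts.merge(t, 1, Integer::sum);
        for (T t : expected) {
            Integer c = counts.get(t);
            if (c == null) return false;          // element missing from actual
            if (c == 1) counts.remove(t); else counts.put(t, c - 1);
        }
        return counts.isEmpty();                   // no surplus elements remain
    }

    public static void main(String[] args) {
        System.out.println(containsInAnyOrder(List.of(3, 1, 2), List.of(1, 2, 3))); // true
    }
}
```

Multiset semantics matter: duplicates must match in count, not just in membership.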
 
- containsInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- containsInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- containsInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- containsInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- containsKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
Returns true if the specified ByteKey is contained within this range.
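Byte-key ranges are half-open intervals over unsigned lexicographic byte order. A hedged sketch of that containment check (illustrative names, not the ByteKeyRange implementation):

```java
public class ByteKeyRangeDemo {
    // Unsigned lexicographic comparison of byte arrays; a shorter array
    // that is a prefix of a longer one sorts first.
    static int compare(byte[] a, byte[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            int d = (a[i] & 0xff) - (b[i] & 0xff); // treat bytes as unsigned
            if (d != 0) return d;
        }
        return a.length - b.length;
    }

    // Membership in the half-open range [start, end): start is included,
    // end is excluded.
    static boolean containsKey(byte[] start, byte[] end, byte[] key) {
        return compare(start, key) <= 0 && compare(key, end) < 0;
    }
}
```

Masking with `0xff` is essential: Java bytes are signed, but key order must treat `0x80` as greater than `0x7f`.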
 
 
- containsString(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- Context(boolean) - Constructor for class org.apache.beam.sdk.coders.Coder.Context
 
- 
Deprecated.
  
- Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
 
-  
 
- Context() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.Context
 
-  
 
- Context() - Constructor for class org.apache.beam.sdk.transforms.Contextful.Fn.Context
 
-  
 
- Contextful<ClosureT> - Class in org.apache.beam.sdk.transforms
 
- 
Pair of a bit of user code (a "closure") and the Requirements needed to run it.
 
 
- Contextful.Fn<InputT,OutputT> - Interface in org.apache.beam.sdk.transforms
 
- 
A function from an input to an output that may additionally access Contextful.Fn.Context when
 computing the result.
 
 
- Contextful.Fn.Context - Class in org.apache.beam.sdk.transforms
 
- 
 
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Match
 
- 
 
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
 
- 
 
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
- 
Continuously watches for new files at the given interval until the given termination
 condition is reached, where the input to the condition is the filepattern.
 
- control(StreamObserver<BeamFnApi.InstructionRequest>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
 
- 
Called by gRPC for each incoming connection from an SDK harness; enqueues an available SDK
 harness client.
 
- ControlClientPool - Interface in org.apache.beam.runners.fnexecution.control
 
- 
 
- ControlClientPool.Sink - Interface in org.apache.beam.runners.fnexecution.control
 
- 
 
- ControlClientPool.Source - Interface in org.apache.beam.runners.fnexecution.control
 
- 
 
- convert(Function<BeamSqlPrimitive, BeamSqlPrimitive>) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion.Builder
 
-  
 
- convert(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion
 
-  
 
- convert(SqlTypeName, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.Reinterpreter
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
 
-  
 
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
 
-  
 
- Convert - Class in org.apache.beam.sdk.schemas.transforms
 
- 
A set of utilities for converting between different objects supporting schemas.
 
- Convert() - Constructor for class org.apache.beam.sdk.schemas.transforms.Convert
 
-  
 
- convertAvroFormat(Schema.Field, Object) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroUtils
 
-  
 
- convertToBagSpecInternal(StateSpec<CombiningState<InputT, AccumT, OutputT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- convertToFileResourceIfPossible(String) - Static method in class org.apache.beam.sdk.io.FileBasedSink
 
- 
A helper function for converting a user-provided output filename prefix into a ResourceId for writing output files.
 
 
- copy() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Returns a copy of this RandomAccessData.
 
- copy(RelTraitSet, RelNode, boolean, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
 
-  
 
- copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
-  
 
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
-  
 
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
 
-  
 
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
-  
 
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
-  
 
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
 
-  
 
- copy(RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
-  
 
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
 
-  
 
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
 
-  
 
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
-  
 
- copy(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
 
- 
Copies a List of file-like resources from one location to another.
 
- copy(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
 
- 
Copies a List of file-like resources from one location to another.
 
- copyFrom(ByteBuffer) - Static method in class org.apache.beam.sdk.io.range.ByteKey
 
- 
Creates a new ByteKey backed by a copy of the data remaining in the specified ByteBuffer.
 
 
- copyFrom(byte[]) - Static method in class org.apache.beam.sdk.io.range.ByteKey
 
- 
Creates a new ByteKey backed by a copy of the specified byte[].
 
 
- copyWithLocalRefExprs(List<BeamSqlExpression>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionEnvironment
 
- 
An environment that shares the input row, window, and correlation variables, but in which local
 refs are replaced with the given unevaluated expressions.
 
- Count - Class in org.apache.beam.sdk.transforms
 
- 
 
- countAsserts(Pipeline) - Static method in class org.apache.beam.sdk.testing.PAssert
 
-  
 
- COUNTER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
 
-  
 
- Counter - Interface in org.apache.beam.sdk.metrics
 
- 
A metric that reports a single long value and can be incremented or decremented.
 
- counter(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
 
- 
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
 
- counter(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
 
- 
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
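"Aggregated by taking the sum" means increments and decrements reported from many workers are combined additively. A minimal stand-in for that behavior (not the Beam Counter implementation) using a thread-safe adder:

```java
import java.util.concurrent.atomic.LongAdder;

public class CounterDemo {
    // Increments and decrements, possibly from many threads, are
    // aggregated by summing.
    private final LongAdder sum = new LongAdder();

    public void inc() { sum.increment(); }
    public void dec() { sum.decrement(); }
    public long result() { return sum.sum(); }
}
```

Because only the sum is kept, the metric is cheap to merge across workers: each worker contributes a partial sum.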
 
- CounterMark(long, Instant) - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMark
 
- 
Creates a checkpoint mark reflecting the last emitted value.
 
- CountingSource - Class in org.apache.beam.sdk.io
 
- 
 
- CountingSource.CounterMark - Class in org.apache.beam.sdk.io
 
- 
The checkpoint for an unbounded CountingSource is simply the last value produced.
 
 
- CovarianceFn<T extends java.lang.Number> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
 
- 
Combine.CombineFn for Covariance on Number types.
 
- CrashingRunner - Class in org.apache.beam.sdk.testing
 
- 
 
- CrashingRunner() - Constructor for class org.apache.beam.sdk.testing.CrashingRunner
 
-  
 
- create(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowClient
 
-  
 
- create(PCollectionView<?>, Coder<T>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory
 
-  
 
- create() - Static method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
 
-  
 
- create() - Static method in class org.apache.beam.runners.direct.portable.artifact.UnsupportedArtifactRetrievalService
 
-  
 
- create(ServerFactory) - Static method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- create(Clock, ExecutableGraph<ExecutableT, ? super CollectionT>, Function<ExecutableT, String>) - Static method in class org.apache.beam.runners.direct.WatermarkManager
 
- 
 
- create(String, String, ListeningExecutorService, RunnerApi.Pipeline, FlinkPipelineOptions, List<String>) - Static method in class org.apache.beam.runners.flink.FlinkJobInvocation
 
-  
 
- create(ListeningExecutorService, FlinkJobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobInvoker
 
-  
 
- create(FlinkJobServerDriver.ServerConfiguration, ListeningExecutorService, ServerFactory, ServerFactory) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
-  
 
- create(boolean) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
 
-  
 
- create() - Static method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService
 
-  
 
- create(JobInfo, Map<String, EnvironmentFactory.Provider>) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
 
-  
 
- create() - Static method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
 
- 
 
- create(String) - Method in interface org.apache.beam.runners.fnexecution.control.OutputReceiverFactory
 
- 
 
- create(EnvironmentFactory, GrpcFnServer<GrpcDataService>, GrpcFnServer<GrpcStateService>) - Static method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
 
- 
Deprecated.
  
- create(ExecutorService, OutboundObserverFactory) - Static method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
-  
 
- create(PipelineOptions, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<FnApiControlClientPoolService>, ControlClientPool.Source) - Static method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
 
-  
 
- create(ProcessManager, RunnerApi.Environment, String, InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
-  
 
- create(ProcessManager, GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
 
-  
 
- create() - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
 
-  
 
- create(ServiceT, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.runners.fnexecution.GrpcFnServer
 
- 
Create a GrpcFnServer for the provided FnService which will run at the endpoint
 specified in the Endpoints.ApiServiceDescriptor.
 
 
- create() - Static method in class org.apache.beam.runners.fnexecution.InProcessServerFactory
 
-  
 
- create(BindableService, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.runners.fnexecution.InProcessServerFactory
 
-  
 
- create(Endpoints.ApiServiceDescriptor, Function<String, String>, ThrowingConsumer<String>, JobInvoker) - Static method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
- 
Creates an InMemoryJobService.
 
- create(String, String, String, Struct) - Static method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
-  
 
- create(ProvisionApi.ProvisionInfo) - Static method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
 
-  
 
- create(BindableService, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.runners.fnexecution.ServerFactory
 
- 
Creates an instance of this server at the address specified by the given service descriptor.
 
- create(BindableService, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.runners.fnexecution.ServerFactory.InetSocketAddressServerFactory
 
-  
 
- create() - Static method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
 
- 
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.reference.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkPipelineOptions.TmpCheckpointDirFactory
 
-  
 
- create() - Static method in class org.apache.beam.runners.spark.SparkRunner
 
- 
Creates and returns a new SparkRunner with default options.
 
- create(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
 
- 
Creates and returns a new SparkRunner with specified options.
 
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
 
-  
 
- create(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
 
-  
 
- create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
- 
 
- create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
- 
 
- create(double) - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
- 
 
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
-  
 
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.SortValues
 
- 
Returns a SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> PTransform.
 
 
- create(SchemaPlus, String, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchemaFactory
 
-  
 
- create() - Static method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
 
- 
 
- create(StreamObserver<ReqT>, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
-  
 
- create() - Static method in class org.apache.beam.sdk.fn.test.InProcessManagedChannelFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsUserCredentialsFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.ClientConfigurationFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.S3Options.S3UploadBufferSizeBytesFactory
 
-  
 
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.aws.sns.SnsIO.RetryConfiguration
 
-  
 
- create(String[], String, String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
- 
Creates a new Elasticsearch connection configuration.
 
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
 
- 
Creates a RetryConfiguration for ElasticsearchIO with the provided maxAttempts and
 maxDuration, using exponential-backoff-based retries.
 
 
- create(WritableByteChannel) - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
- 
Deprecated.
  
- create(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
 
-  
 
- create(EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
- 
 
- create(ResourceIdT, CreateOptions) - Method in class org.apache.beam.sdk.io.FileSystem
 
- 
Returns a write channel for the given ResourceIdT.
 
- create(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileSystems
 
- 
 
- create(ResourceId, CreateOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
 
- 
 
- create(MatchResult.Status, List<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
 
- 
 
- create(MatchResult.Status, IOException) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
 
- 
 
- create(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
- 
Creates an instance of this rule.
 
- create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
 
- 
Creates an instance of this rule.
 
- create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
- 
Creates an instance of this rule.
 
- create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
- 
Creates a new group.
 
- create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
-  
 
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- create(BatchTransactionId) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
 
-  
 
- create(DataSource) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- create(String, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- create(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- create(String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
- 
Describes a connection configuration to the MQTT broker.
 
- create(String, String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
- 
Describes a connection configuration to the MQTT broker.
 
- create() - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
-  
 
- create(String, int) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
-  
 
- create(String) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
 
- 
Creates a new Solr connection configuration.
 
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
 
-  
 
- create(long, long, long, long) - Static method in class org.apache.beam.sdk.metrics.DistributionResult
 
-  
 
- create(long, Instant) - Static method in class org.apache.beam.sdk.metrics.GaugeResult
 
-  
 
- create(PipelineOptions) - Method in interface org.apache.beam.sdk.options.DefaultValueFactory
 
- 
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.NoOpMetricsSink
 
-  
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
 
-  
 
- create() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
 
- 
Creates and returns an object that implements PipelineOptions using the values
 configured on this builder during construction.
 
 
- create() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
 
- create() - Static method in class org.apache.beam.sdk.Pipeline
 
- 
 
- create(PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
 
- 
 
- create() - Static method in class org.apache.beam.sdk.PipelineRunner
 
- 
 
- create() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
- 
 
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Filter
 
-  
 
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Unnest
 
-  
 
- create() - Static method in class org.apache.beam.sdk.testing.TestPipeline
 
- 
Creates and returns a new test pipeline.
 
- create(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
 
-  
 
- create(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream
 
- 
 
- create(Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.testing.TestStream
 
-  
 
- create(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
- 
Returns an approximate quantiles combiner with the given compareFn and desired number
 of quantiles.
 
- create(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
- 
 
- create(int, ComparatorT, long, double) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
- 
Creates an approximate quantiles combiner with the given compareFn and desired number
 of quantiles.
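For intuition, the exact quantiles that the approximate combiner estimates can be computed from a fully sorted dataset (this naive version is illustrative; the combiner exists precisely to avoid holding and sorting all data):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class QuantilesDemo {
    // Exact numQuantiles-quantiles of a small dataset, including the min
    // and max as the first and last values. Requires numQuantiles >= 2.
    static List<Integer> quantiles(List<Integer> data, int numQuantiles) {
        List<Integer> sorted = new ArrayList<>(data);
        Collections.sort(sorted);
        List<Integer> out = new ArrayList<>();
        for (int i = 0; i < numQuantiles; i++) {
            // Evenly spaced ranks across the sorted data.
            int idx = (int) Math.round((double) i * (sorted.size() - 1) / (numQuantiles - 1));
            out.add(sorted.get(idx));
        }
        return out;
    }
}
```

For example, the 3-quantiles of 1..9 are the minimum, median, and maximum: [1, 5, 9].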
 
- Create<T> - Class in org.apache.beam.sdk.transforms
 
- 
Create<T> takes a collection of elements of type T known when the pipeline is
 constructed and returns a PCollection<T> containing the elements.
 
- Create() - Constructor for class org.apache.beam.sdk.transforms.Create
 
-  
 
- create() - Static method in class org.apache.beam.sdk.transforms.Distinct
 
- 
Returns a Distinct<T> PTransform.
 
- create() - Static method in class org.apache.beam.sdk.transforms.GroupByKey
 
- 
Returns a GroupByKey<K, V> PTransform.
 
- create() - Static method in class org.apache.beam.sdk.transforms.Impulse
 
- 
 
- create() - Static method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
 
- 
Returns a CoGroupByKey<K> PTransform.
 
- create() - Static method in class org.apache.beam.sdk.transforms.Keys
 
- 
Returns a Keys<K> PTransform.
 
- create() - Static method in class org.apache.beam.sdk.transforms.KvSwap
 
- 
Returns a KvSwap<K, V> PTransform.
 
- create() - Static method in class org.apache.beam.sdk.transforms.Values
 
- 
Returns a Values<V> PTransform.
 
- Create.OfValueProvider<T> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Create.TimestampedValues<T> - Class in org.apache.beam.sdk.transforms
 
- 
A PTransform that creates a PCollection whose elements have associated
 timestamps.
 
- Create.Values<T> - Class in org.apache.beam.sdk.transforms
 
- 
A PTransform that creates a PCollection from a set of in-memory objects.
 
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
 
- 
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
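The accumulator returned here is the identity element of the combine, which is then mutated as inputs arrive. A hedged sketch of that lifecycle for a sum (static helpers standing in for the CombineFn methods; not the Beam class itself):

```java
import java.util.List;

public class SumCombineDemo {
    // createAccumulator: a new, mutable accumulator representing the
    // accumulation of zero inputs -- for a sum, zero.
    static long[] createAccumulator() { return new long[] {0L}; }

    // addInput: fold one element into the (mutable) accumulator.
    static long[] addInput(long[] acc, long input) {
        acc[0] += input;
        return acc;
    }

    // extractOutput: read the final result out of the accumulator.
    static long extractOutput(long[] acc) { return acc[0]; }

    static long combine(List<Long> inputs) {
        long[] acc = createAccumulator();
        for (long v : inputs) acc = addInput(acc, v);
        return extractOutput(acc);
    }
}
```

A runner may create many accumulators in parallel and merge them, so the zero-input accumulator must behave as an identity under merging.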
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
-  
 
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
-  
 
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
- 
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
-  
 
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
-  
 
- createBatchExecutionEnvironment(FlinkPipelineOptions, List<String>) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
 
- 
If the submitted job is a batch processing job, this method creates the appropriate Flink ExecutionEnvironment based on the user-specified options.
 
- createBuilder(S3Options) - Method in interface org.apache.beam.sdk.io.aws.options.S3ClientBuilderFactory
 
-  
 
- createBuilder(S3Options) - Method in class org.apache.beam.sdk.io.aws.s3.DefaultS3ClientBuilderFactory
 
-  
 
- createContextual(DeserializationContext, BeanProperty) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
 
-  
 
- CreateDataflowView<ElemT,ViewT> - Class in org.apache.beam.runners.dataflow
 
- 
 
- createDataset(String, String, String, String, Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Create a Dataset with the given location, description and default
 expiration time for tables in the dataset (if null, tables don't expire).
 
 
- createDecompressingChannel(ReadableByteChannel) - Method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
 
- 
Deprecated.
  
- createDecompressingChannel(ReadableByteChannel) - Method in interface org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory
 
- 
Given a channel, create a channel that decompresses the content read from the channel.
 
- createDefault() - Static method in class org.apache.beam.runners.fnexecution.ServerFactory
 
- 
 
- createDefault() - Static method in interface org.apache.beam.runners.fnexecution.ServerFactory.UrlFactory
 
-  
 
- createDefault() - Static method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
Creates a CoderRegistry containing registrations for all standard coders part of the core Java Apache Beam SDK and also any registrations provided by coder registrars.
 
- createDefault() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
 
-  
 
- createDefault() - Static method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
-  
 
- createEnvironment(RunnerApi.Environment) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory
 
- 
 
- createEnvironment(RunnerApi.Environment) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
 
-  
 
- createEnvironment(RunnerApi.Environment) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory
 
- 
Creates an active RunnerApi.Environment and returns a handle to it.
 
- createEnvironment(RunnerApi.Environment) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
 
- 
 
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
 
-  
 
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
 
-  
 
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
 
- 
 
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
 
-  
 
- createEpoll() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
 
-  
 
- createForSubrangeOfFile(String, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
 
- 
 
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
 
-  
 
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.BlockBasedSource
 
- 
Creates a BlockBasedSource for the specified range in a single file.
 
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.CompressedSource
 
- 
Creates a CompressedSource for a subrange of a file.
 
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
- 
Creates and returns a new FileBasedSource of the same type as the current FileBasedSource backed by a given file and an offset range.
 
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
-  
 
- createFrom(String) - Static method in class org.apache.beam.sdk.fn.channel.SocketAddressFactory
 
- 
Parse a SocketAddress from the given string.
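
A minimal sketch of what a SocketAddress factory like createFrom does for plain host:port strings. Beam's SocketAddressFactory also understands Unix domain socket addresses, which this sketch omits:

```java
import java.net.InetSocketAddress;
import java.net.SocketAddress;

// Sketch of SocketAddressFactory.createFrom for plain "host:port" strings only.
public class SimpleSocketAddressFactory {
    public static SocketAddress createFrom(String value) {
        int colon = value.lastIndexOf(':');  // lastIndexOf tolerates colons in the host part
        if (colon < 0) {
            throw new IllegalArgumentException("Expected host:port, got: " + value);
        }
        String host = value.substring(0, colon);
        int port = Integer.parseInt(value.substring(colon + 1));
        // createUnresolved avoids a DNS lookup at parse time.
        return InetSocketAddress.createUnresolved(host, port);
    }
}
```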
 
- CreateGearpumpPCollectionViewTranslator<ElemT,ViewT> - Class in org.apache.beam.runners.gearpump.translators
 
- 
CreateGearpumpPCollectionView bridges the input stream to downstream transforms.
 
- CreateGearpumpPCollectionViewTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.CreateGearpumpPCollectionViewTranslator
 
-  
 
- createGetters(Class<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetterFactory
 
- 
 
- createGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.utils.JavaBeanGetterFactory
 
-  
 
- createGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.utils.PojoValueGetterFactory
 
-  
 
- createInputFormatInstance() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
- 
Creates an instance of the InputFormat class.
 
- createJar(File, File) - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
 
- 
Create a jar file from the given directory.
 
- createJob(Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
 
- 
Creates the Dataflow Job.
 
- createKinesisProducer(KinesisProducerConfiguration) - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
 
-  
 
- CreateOptions - Class in org.apache.beam.sdk.io.fs
 
- 
An abstract class that contains common configuration options for creating resources.
 
- CreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions
 
-  
 
- CreateOptions.Builder<BuilderT extends CreateOptions.Builder<BuilderT>> - Class in org.apache.beam.sdk.io.fs
 
- 
 
- CreateOptions.StandardCreateOptions - Class in org.apache.beam.sdk.io.fs
 
- 
Standard configuration options with a builder.
 
- CreateOptions.StandardCreateOptions.Builder - Class in org.apache.beam.sdk.io.fs
 
- 
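
The recursive type parameter on CreateOptions.Builder&lt;BuilderT extends CreateOptions.Builder&lt;BuilderT&gt;&gt; is the self-type idiom that lets base-class setters return the concrete builder for fluent chaining; a sketch with hypothetical names:

```java
// Sketch of the self-referential builder generic used by CreateOptions.Builder:
// BuilderT is "the concrete builder's own type", so base-class setters can
// return the subclass and keep chaining. Names here are hypothetical.
public class FluentBuilders {
    public abstract static class BaseBuilder<BuilderT extends BaseBuilder<BuilderT>> {
        public String mimeType;

        @SuppressWarnings("unchecked")
        public BuilderT setMimeType(String mimeType) {
            this.mimeType = mimeType;
            return (BuilderT) this;  // returns the subclass type, not BaseBuilder
        }
    }

    public static final class TextBuilder extends BaseBuilder<TextBuilder> {
        public String charset;

        public TextBuilder setCharset(String charset) {
            this.charset = charset;
            return this;
        }
    }
}
```

Without the self type, setMimeType would return BaseBuilder and the chained setCharset call below would not compile.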
 
- createPane(boolean, boolean, PaneInfo.Timing) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
-  
 
- createPane(boolean, boolean, PaneInfo.Timing, long, long) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
Factory method to create a PaneInfo with the specified parameters.
 
- createPipelineOptions(Map<String, String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
-  
 
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollection
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>, TupleTag<?>) - Static method in class org.apache.beam.sdk.values.PCollection
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- createProcessContext(ValueInSingleWindow<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- createRandomSubscription(PubsubClient.ProjectPath, PubsubClient.TopicPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Create a random subscription for the given topic.
 
- createReader(PipelineOptions) - Method in class org.apache.beam.runners.gearpump.translators.io.BoundedSourceWrapper
 
-  
 
- createReader(PipelineOptions) - Method in class org.apache.beam.runners.gearpump.translators.io.GearpumpSource
 
-  
 
- createReader(PipelineOptions) - Method in class org.apache.beam.runners.gearpump.translators.io.UnboundedSourceWrapper
 
-  
 
- createReader(PipelineOptions, UnboundedSource.CheckpointMark) - Method in class org.apache.beam.runners.gearpump.translators.io.ValuesSource
 
-  
 
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
 
- 
 
- createReader(CassandraIO.CassandraSource<T>) - Method in interface org.apache.beam.sdk.io.cassandra.CassandraService
 
- 
 
- createReader(CassandraIO.CassandraSource<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
 
-  
 
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
-  
 
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
-  
 
- createReader(PipelineOptions, JmsCheckpointMark) - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
-  
 
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
-  
 
- createReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.sdk.io.UnboundedSource
 
- 
 
- createRunner(ReadyCheckingSideInputReader) - Method in class org.apache.beam.runners.gearpump.translators.utils.DoFnRunnerFactory
 
-  
 
- createSetters(Class<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.FieldValueSetterFactory
 
- 
 
- createSetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.utils.JavaBeanSetterFactory
 
-  
 
- createSetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.utils.PojoValueSetterFactory
 
-  
 
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.AvroSource
 
-  
 
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BlockBasedSource
 
- 
Creates a BlockBasedReader.
 
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.CompressedSource
 
- 
Creates a FileBasedReader to read a single file.
 
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
- 
Creates and returns an instance of a FileBasedReader implementation for the current
 source assuming the source represents a single file.
 
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
-  
 
- createSnsPublisher() - Method in interface org.apache.beam.sdk.io.aws.sns.AwsClientsProvider
 
-  
 
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
- 
 
- CreateStream<T> - Class in org.apache.beam.runners.spark.io
 
- 
Create an input stream from a Queue.
 
- createStreamExecutionEnvironment(FlinkPipelineOptions, List<String>) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
 
- 
If the submitted job is a stream processing job, this method creates the appropriate Flink StreamExecutionEnvironment depending on the user-specified options.
 
- createStructuralValues(Coder<T>, List<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Testing utilities below depend on standard assertions and matchers to compare elements read by
 sources.
 
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Create a subscription to the given topic.
 
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
-  
 
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
-  
 
- createTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
 
- 
Creates a table.
 
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
-  
 
- createTable(Table) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Creates the specified table if it does not exist.
 
- CreateTables<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Creates any tables needed before performing streaming writes to the tables.
 
- CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
 
-  
 
- createTimestampPolicy(TopicPartition, Optional<Instant>) - Method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
 
- 
Creates a TimestampPolicy for a partition.
 
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Create the given topic.
 
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
 
- 
Returns a transform that creates a batch transaction.
 
- CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
-  
 
- createTranslationContext(JobInfo, FlinkPipelineOptions, List<String>) - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
 
- 
Creates a batch translation context.
 
- createTranslationContext(JobInfo, FlinkPipelineOptions, List<String>) - Static method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
 
- 
Creates a streaming translation context.
 
- createTranslator() - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
 
- 
Creates a batch translator.
 
- createUrl(String, int) - Method in interface org.apache.beam.runners.fnexecution.ServerFactory.UrlFactory
 
-  
 
- createWithPortSupplier(Supplier<Integer>) - Static method in class org.apache.beam.runners.fnexecution.ServerFactory
 
- 
 
- createWithUrlFactory(ServerFactory.UrlFactory) - Static method in class org.apache.beam.runners.fnexecution.ServerFactory
 
- 
 
- createWithUrlFactoryAndPortSupplier(ServerFactory.UrlFactory, Supplier<Integer>) - Static method in class org.apache.beam.runners.fnexecution.ServerFactory
 
- 
Create a ServerFactory that uses the given URL factory and ports from a supplier.
 
- createWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink
 
- 
 
- createWriter(CassandraIO.Write<T>) - Method in interface org.apache.beam.sdk.io.cassandra.CassandraService
 
- 
 
- createWriter(CassandraIO.Write<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
 
-  
 
- createWriter() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
 
- CredentialFactory - Interface in org.apache.beam.sdk.extensions.gcp.auth
 
- 
Construct an OAuth credential to be used by the SDK and the SDK workers.
 
- csvLines2BeamRows(CSVFormat, String, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
 
- 
Decodes zero or more CSV records from the given string according to the specified CSVFormat, and converts them to Rows with the specified Schema.
 
- CsvRecorderDecoder(Schema, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable.CsvRecorderDecoder
 
-  
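
csvLines2BeamRows and the CsvRecorderDecoder above both turn CSV text into rows. A deliberately naive stdlib-only sketch of the idea (Beam delegates parsing to Apache Commons CSV and converts cells per the Schema; quoting and type conversion are ignored here):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Naive sketch of decoding CSV lines into "rows" (lists of String cells).
// Handles neither quoted fields nor schema-driven type conversion.
public class NaiveCsv {
    public static List<List<String>> linesToRows(String input) {
        List<List<String>> rows = new ArrayList<>();
        for (String line : input.split("\r?\n")) {
            if (line.isEmpty()) {
                continue;  // skip blank lines, as a CSV reader would
            }
            rows.add(Arrays.asList(line.split(",", -1)));  // -1 keeps trailing empty cells
        }
        return rows;
    }
}
```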
 
- CsvRecorderEncoder(Schema, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable.CsvRecorderEncoder
 
-  
 
- CsvSink - Class in org.apache.beam.runners.spark.metrics.sink
 
- 
 
- CsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
 
-  
 
- CsvToRow(Schema, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
 
-  
 
- ctxt - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
-  
 
- current() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
-  
 
- currentEventTime() - Method in interface org.apache.beam.sdk.state.Timers
 
- 
Returns the current event time.
 
- currentInputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- currentOutputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- currentProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- currentProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
 
- 
Returns the current processing time.
 
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
-  
 
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
-  
 
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
- 
Returns a restriction accurately describing the full range of work the current DoFn.ProcessElement call will do, including already completed work.
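
A sketch of the restriction-tracker contract described above, modeled on an offset range (simplified; Beam's real trackers add thread-safety and stricter checks):

```java
// Sketch of an OffsetRangeTracker-style restriction tracker. The restriction
// is a half-open offset range [from, to); tryClaim moves a cursor forward and
// currentRestriction() always reports the full range, claimed work included.
public class SimpleOffsetRangeTracker {
    private final long from;
    private long to;
    private long lastClaimed = Long.MIN_VALUE;

    public SimpleOffsetRangeTracker(long from, long to) {
        this.from = from;
        this.to = to;
    }

    // Returns true if the offset is inside the restriction and is now claimed.
    public boolean tryClaim(long offset) {
        if (offset >= to) {
            return false;  // past the range: the caller must stop processing
        }
        lastClaimed = offset;
        return true;
    }

    // [from, to) including already completed work, as the contract requires.
    public long[] currentRestriction() {
        return new long[] {from, to};
    }

    // Analogous to checkpointing: split off the unprocessed remainder.
    public long[] checkpoint() {
        long splitAt = lastClaimed == Long.MIN_VALUE ? from : lastClaimed + 1;
        long[] residual = new long[] {splitAt, to};
        to = splitAt;  // primary restriction shrinks to the claimed prefix
        return residual;
    }
}
```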
 
- currentSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- currentSynchronizedProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
 
- 
Returns the current synchronized processing time or null if unknown.
 
- currentWatermark - Variable in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
 
-  
 
- CUSTOM_SOURCE_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- CustomCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
 
- CustomCoder() - Constructor for class org.apache.beam.sdk.coders.CustomCoder
 
-  
 
- Customer - Class in org.apache.beam.sdk.extensions.sql.example.model
 
- 
Describes a customer.
 
- Customer(int, String, String) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- Customer() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- CustomTimestampPolicyWithLimitedDelay<K,V> - Class in org.apache.beam.sdk.io.kafka
 
- 
A policy for custom record timestamps where timestamps within a partition are expected to be
 roughly monotonically increasing with a cap on out of order event delays (say 1 minute).
 
- CustomTimestampPolicyWithLimitedDelay(SerializableFunction<KafkaRecord<K, V>, Instant>, Duration, Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
 
- 
A policy for custom record timestamps where timestamps are expected to be roughly monotonically
 increasing with out of order event delays less than maxDelay.
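
The bounded-delay policy described above amounts to holding the watermark maxDelay behind the largest event timestamp seen so far; a sketch with hypothetical names:

```java
// Sketch of a CustomTimestampPolicyWithLimitedDelay-style watermark:
// the watermark trails the maximum observed event time by maxDelayMillis,
// so records up to that much out of order still count as "on time".
public class BoundedDelayWatermark {
    private final long maxDelayMillis;
    private long maxObservedMillis = Long.MIN_VALUE;

    public BoundedDelayWatermark(long maxDelayMillis) {
        this.maxDelayMillis = maxDelayMillis;
    }

    // Called per record, analogous to assigning a record timestamp.
    public void observe(long eventTimeMillis) {
        maxObservedMillis = Math.max(maxObservedMillis, eventTimeMillis);
    }

    // Analogous to the policy's getWatermark.
    public long currentWatermarkMillis() {
        return maxObservedMillis == Long.MIN_VALUE
                ? Long.MIN_VALUE               // nothing read yet
                : maxObservedMillis - maxDelayMillis;
    }
}
```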
 
- data(StreamObserver<BeamFnApi.Elements>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
-  
 
- DataflowClient - Class in org.apache.beam.runners.dataflow
 
- 
Wrapper around the generated Dataflow client to provide common functionality.
 
- DataflowClientFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
 
-  
 
- DataflowJobAlreadyExistsException - Exception in org.apache.beam.runners.dataflow
 
- 
An exception that is thrown if the unique job name constraint of the Dataflow service is broken
 because an existing job with the same job name is currently active.
 
- DataflowJobAlreadyExistsException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException
 
- 
Create a new DataflowJobAlreadyExistsException with the specified DataflowPipelineJob and message.
 
- DataflowJobAlreadyUpdatedException - Exception in org.apache.beam.runners.dataflow
 
- 
An exception that is thrown if the existing job has already been updated within the Dataflow
 service and is no longer able to be updated.
 
- DataflowJobAlreadyUpdatedException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException
 
- 
Create a new DataflowJobAlreadyUpdatedException with the specified DataflowPipelineJob and message.
 
- DataflowJobException - Exception in org.apache.beam.runners.dataflow
 
- 
 
- DataflowPipelineDebugOptions - Interface in org.apache.beam.runners.dataflow.options
 
- 
Internal.
 
- DataflowPipelineDebugOptions.DataflowClientFactory - Class in org.apache.beam.runners.dataflow.options
 
- 
Returns the default Dataflow client built from the passed in PipelineOptions.
 
- DataflowPipelineDebugOptions.StagerFactory - Class in org.apache.beam.runners.dataflow.options
 
- 
 
- DataflowPipelineJob - Class in org.apache.beam.runners.dataflow
 
- 
A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
 
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
- 
Constructs the job.
 
- DataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
 
- 
 
- DataflowPipelineOptions.StagingLocationFactory - Class in org.apache.beam.runners.dataflow.options
 
- 
 
- DataflowPipelineRegistrar - Class in org.apache.beam.runners.dataflow
 
- 
 
- DataflowPipelineRegistrar.Options - Class in org.apache.beam.runners.dataflow
 
- 
 
- DataflowPipelineRegistrar.Runner - Class in org.apache.beam.runners.dataflow
 
- 
 
- DataflowPipelineTranslator - Class in org.apache.beam.runners.dataflow
 
- 
 
- DataflowPipelineTranslator.JobSpecification - Class in org.apache.beam.runners.dataflow
 
- 
The result of a job translation.
 
- DataflowPipelineWorkerPoolOptions - Interface in org.apache.beam.runners.dataflow.options
 
- 
Options that are used to configure the Dataflow pipeline worker pool.
 
- DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType - Enum in org.apache.beam.runners.dataflow.options
 
- 
Type of autoscaling algorithm to use.
 
- DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory - Class in org.apache.beam.runners.dataflow.options
 
- 
Returns the default Docker container image that executes Dataflow worker harness, residing in
 Google Container Registry.
 
- DataflowProfilingAgentConfiguration() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
 
-  
 
- DataflowProfilingOptions - Interface in org.apache.beam.runners.dataflow.options
 
- 
Options for controlling profiling of pipeline execution.
 
- DataflowProfilingOptions.DataflowProfilingAgentConfiguration - Class in org.apache.beam.runners.dataflow.options
 
- 
Configuration for the profiling agent.
 
- DataflowRunner - Class in org.apache.beam.runners.dataflow
 
- 
A PipelineRunner that executes the operations in the pipeline by first translating them to the Dataflow representation using the DataflowPipelineTranslator and then submitting them to a Dataflow service for execution.
 
- DataflowRunner(DataflowPipelineOptions) - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner
 
-  
 
- DataflowRunner.StreamingPCollectionViewWriterFn<T> - Class in org.apache.beam.runners.dataflow
 
- 
 
- DataflowRunnerHooks - Class in org.apache.beam.runners.dataflow
 
- 
An instance of this class can be passed to the DataflowRunner to add user defined hooks to be invoked at various times during pipeline execution.
 
- DataflowRunnerHooks() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunnerHooks
 
-  
 
- DataflowRunnerInfo - Class in org.apache.beam.runners.dataflow
 
- 
 
- DataflowServiceException - Exception in org.apache.beam.runners.dataflow
 
- 
Signals there was an error retrieving information about a job from the Cloud Dataflow Service.
 
- DataflowTemplateJob - Class in org.apache.beam.runners.dataflow.util
 
- 
 
- DataflowTemplateJob() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
-  
 
- DataflowTransport - Class in org.apache.beam.runners.dataflow.util
 
- 
Helpers for cloud communication.
 
- DataflowTransport() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTransport
 
-  
 
- DataflowWorkerHarnessOptions - Interface in org.apache.beam.runners.dataflow.options
 
- 
Options that are used exclusively within the Dataflow worker harness.
 
- DataflowWorkerLoggingOptions - Interface in org.apache.beam.runners.dataflow.options
 
- 
 
- DataflowWorkerLoggingOptions.Level - Enum in org.apache.beam.runners.dataflow.options
 
- 
Deprecated.
The set of log levels that can be used on the Dataflow worker.
 
- DataflowWorkerLoggingOptions.WorkerLogLevelOverrides - Class in org.apache.beam.runners.dataflow.options
 
- 
Deprecated.
Defines a log level override for a specific class, package, or name.
 
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
 
- 
 
- DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
 
- 
 
- DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
 
- 
A PTransform that deletes Entities from Cloud Datastore.
 
- DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
 
- 
A PTransform that deletes Entities associated with the given Keys from Cloud Datastore.
 
- DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
 
- 
A PTransform that reads the result rows of a Cloud Datastore query as Entity objects.
 
- DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
 
- 
A PTransform that writes Entity objects to Cloud Datastore.
 
- DataStreamDecoder(Coder<T>, InputStream) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
-  
 
- DataStreams - Class in org.apache.beam.sdk.fn.stream
 
- 
 
- DataStreams() - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams
 
-  
 
- DataStreams.BlockingQueueIterator<T> - Class in org.apache.beam.sdk.fn.stream
 
- 
Allows for one or more writing threads to append values to this iterator while one reading
 thread reads values.
 
- DataStreams.DataStreamDecoder<T> - Class in org.apache.beam.sdk.fn.stream
 
- 
An adapter which converts an InputStream to an Iterator of T values using the specified Coder.
 
- DataStreams.ElementDelimitedOutputStream - Class in org.apache.beam.sdk.fn.stream
 
- 
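
The DataStreamDecoder above adapts an InputStream to an Iterator of decoded values. A sketch of the same adapter shape over a hypothetical length-prefixed string format (in Beam, the element's Coder defines the actual wire format):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Iterator;
import java.util.NoSuchElementException;

// Sketch of a DataStreamDecoder-style adapter: pull one element per next()
// from an InputStream. Framing here is a 4-byte length prefix + UTF-8 bytes.
public class StringStreamDecoder implements Iterator<String> {
    private final DataInputStream in;
    private String next;  // lookahead element, null when exhausted

    public StringStreamDecoder(InputStream in) {
        this.in = new DataInputStream(in);
        advance();
    }

    private void advance() {
        try {
            int len = in.readInt();
            byte[] buf = new byte[len];
            in.readFully(buf);
            next = new String(buf, StandardCharsets.UTF_8);
        } catch (EOFException e) {
            next = null;  // clean end of stream
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    @Override public boolean hasNext() { return next != null; }

    @Override public String next() {
        if (next == null) throw new NoSuchElementException();
        String result = next;
        advance();
        return result;
    }

    // Demo helper: encode strings with the same framing.
    public static byte[] encode(String... values) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        for (String v : values) {
            byte[] b = v.getBytes(StandardCharsets.UTF_8);
            out.writeInt(b.length);
            out.write(b);
        }
        return bos.toByteArray();
    }
}
```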
 
- DataStreams.OutputChunkConsumer<T> - Interface in org.apache.beam.sdk.fn.stream
 
- 
 
- DATE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- DATE_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- DATE_TYPES_TO_BIGINT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.DatetimeReinterpretConversions
 
-  
 
- DateOperators - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
Date operator implementations.
 
- DateOperators() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.DateOperators
 
-  
 
- DATETIME - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of datetime fields.
 
- DATETIME_CEIL - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.DateOperators
 
- 
Implementation of CEIL(date or time TO unit).
 
- DATETIME_FLOOR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.DateOperators
 
- 
Implementation of FLOOR(date or time TO unit).
 
- DatetimeReinterpretConversions - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
Utility class to contain implementations of datetime SQL type conversions.
 
- DatetimeReinterpretConversions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.DatetimeReinterpretConversions
 
-  
 
- days(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
 
- 
Returns a WindowFn that windows elements into periods measured by days.
 
- dec() - Method in interface org.apache.beam.sdk.metrics.Counter
 
-  
 
- dec(long) - Method in interface org.apache.beam.sdk.metrics.Counter
 
-  
 
- DECIMAL - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- DECIMAL - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of decimal fields.
 
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.AvroCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.Coder
 
- 
Decodes a value of type T from the given input stream in the given context.
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
 
- 
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.RowCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAccumulatorCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
-  
 
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
-  
 
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
-  
 
- decodePane(byte) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
-  
 
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.CollectionCoder
 
- 
Builds an instance of IterableT, this coder's associated Iterable-like subtype,
 from a list of decoded elements.
 
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
 
-  
 
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
- 
Builds an instance of IterableT, this coder's associated Iterable-like subtype,
 from a list of decoded elements.
 
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
 
-  
 
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.SetCoder
 
- 
Builds an instance of IterableT, this coder's associated Iterable-like subtype,
 from a list of decoded elements.
 
- decrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
 
- 
Returns an 
IdGenerator that will provide successive decrementing longs.
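
The idea behind decrementingLongs() can be sketched with the JDK alone (the starting value of -1 here is an assumption for illustration, not Beam's documented behavior):

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.function.LongSupplier;

// Minimal sketch of an id generator handing out successive decrementing
// longs; AtomicLong makes the hand-out safe across threads.
class DecrementingIds {
  static LongSupplier decrementingLongs() {
    AtomicLong next = new AtomicLong(-1);
    return next::getAndDecrement; // -1, -2, -3, ...
  }
}
```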
 
 
- deduceOutputType(SqlTypeName, SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
 
-  
 
- Default - Annotation Type in org.apache.beam.sdk.options
 
- 
Default represents a set of annotations that can be used to annotate getter properties on
 
PipelineOptions with information representing the default value to be returned if no
 value is specified.
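
The mechanism behind annotations like these — a runtime-retained annotation on a getter, read back reflectively when no value was supplied — can be sketched as follows. The DefaultInt annotation and SampleOptions interface are hypothetical stand-ins, not Beam types.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// Sketch of how an options framework can recover a default from a getter
// annotation, in the spirit of @Default.Integer.
class DefaultLookup {
  @Retention(RetentionPolicy.RUNTIME)
  @Target(ElementType.METHOD)
  @interface DefaultInt {
    int value();
  }

  interface SampleOptions {
    @DefaultInt(42)
    int getNumWorkers();
  }

  // Return the annotated default for a getter, or fall back to zero.
  static int defaultFor(String getter) {
    try {
      Method m = SampleOptions.class.getMethod(getter);
      DefaultInt d = m.getAnnotation(DefaultInt.class);
      return d == null ? 0 : d.value();
    } catch (NoSuchMethodException e) {
      throw new IllegalArgumentException(e);
    }
  }
}
```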
 
 
- Default.Boolean - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified boolean primitive value.
 
- Default.Byte - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified byte primitive value.
 
- Default.Character - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified char primitive value.
 
- Default.Class - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified Class value.
 
- Default.Double - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified double primitive value.
 
- Default.Enum - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified enum.
 
- Default.Float - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified float primitive value.
 
- Default.InstanceFactory - Annotation Type in org.apache.beam.sdk.options
 
- 
 
- Default.Integer - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified int primitive value.
 
- Default.Long - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified long primitive value.
 
- Default.Short - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified short primitive value.
 
- Default.String - Annotation Type in org.apache.beam.sdk.options
 
- 
This represents that the default of the option is the specified String value.
 
- DEFAULT_BYTE_ARRAY_CODER - Static variable in class org.apache.beam.sdk.io.TFRecordIO
 
- 
The default coder, which returns each record of the input file as a byte array.
 
- DEFAULT_MAX_NUM_ELEMENTS - Static variable in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
- 
The cost (in time and space) to compute quantiles to a given accuracy is a function of the
 total number of elements in the data set.
 
- DEFAULT_OUTBOUND_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.stream.DataStreams
 
-  
 
- DEFAULT_SCHEME - Static variable in class org.apache.beam.sdk.io.FileSystems
 
-  
 
- DEFAULT_UNWINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
- 
The default sharding name template.
 
- DEFAULT_WINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
- 
The default windowed sharding name template used when writing windowed files.
 
- DefaultCoder - Annotation Type in org.apache.beam.sdk.coders
 
- 
The 
DefaultCoder annotation specifies a 
Coder class to handle encoding and
 decoding instances of the annotated class.
 
 
- DefaultCoder.DefaultCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
 
- 
 
- DefaultCoderCloudObjectTranslatorRegistrar - Class in org.apache.beam.runners.dataflow.util
 
- 
 
- DefaultCoderCloudObjectTranslatorRegistrar() - Constructor for class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
-  
 
- DefaultCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
 
-  
 
- DefaultConcludeTransform() - Constructor for class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
 
-  
 
- DefaultFilenamePolicy - Class in org.apache.beam.sdk.io
 
- 
 
- DefaultFilenamePolicy.Params - Class in org.apache.beam.sdk.io
 
- 
 
- DefaultFilenamePolicy.ParamsCoder - Class in org.apache.beam.sdk.io
 
- 
 
- DefaultJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
 
- 
 
- DefaultJobBundleFactory.SimpleStageBundleFactory - Class in org.apache.beam.runners.fnexecution.control
 
- 
A simple stage bundle factory for remotely processing bundles.
 
- DefaultJobBundleFactory.WrappedSdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
 
- 
Holder for an 
SdkHarnessClient along with its associated state and data servers.
 
 
- DefaultJobServerConfigFactory() - Constructor for class org.apache.beam.runners.reference.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
 
-  
 
- defaultNaming(String, String) - Static method in class org.apache.beam.sdk.io.FileIO.Write
 
-  
 
- defaultNaming(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.FileIO.Write
 
-  
 
- DefaultPipelineOptionsRegistrar - Class in org.apache.beam.sdk.options
 
- 
 
- DefaultPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
 
-  
 
- DefaultProjectFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
 
-  
 
- DefaultRetryStrategy() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
 
-  
 
- Defaults() - Constructor for class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
-  
 
- DefaultS3ClientBuilderFactory - Class in org.apache.beam.sdk.io.aws.s3
 
- 
Constructs an AmazonS3ClientBuilder with default values for S3 client properties such as
 path-style access, accelerated mode, etc.
 
- DefaultS3ClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws.s3.DefaultS3ClientBuilderFactory
 
-  
 
- DefaultSchema - Annotation Type in org.apache.beam.sdk.schemas
 
- 
 
- DefaultSchema.DefaultSchemaProvider - Class in org.apache.beam.sdk.schemas
 
- 
 
- DefaultSchema.DefaultSchemaProviderRegistrar - Class in org.apache.beam.sdk.schemas
 
- 
Registrar for default schemas.
 
- DefaultSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.DefaultSchema.DefaultSchemaProvider
 
-  
 
- DefaultSchemaProviderRegistrar() - Constructor for class org.apache.beam.sdk.schemas.DefaultSchema.DefaultSchemaProviderRegistrar
 
-  
 
- DefaultStopPipelineWatermarkFactory() - Constructor for class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
 
-  
 
- DefaultTrigger - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A trigger that is equivalent to Repeatedly.forever(AfterWatermark.pastEndOfWindow()).
 
- defaultValue() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
 
- 
Returns the default value when there are no values added to the accumulator.
 
- defaultValue() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
 
- 
Returns the default value when there are no values added to the accumulator.
 
- defaultValue() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
-  
 
- defaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
 
- 
Returns the default value of this transform, or null if there isn't one.
 
- DefaultValueFactory<T> - Interface in org.apache.beam.sdk.options
 
- 
 
- delay(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- Delay() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
 
-  
 
- delegate() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
-  
 
- delegate(HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
 
- 
Register display data from the specified component on behalf of the current component.
 
- delegateBasedUponType(EnumMap<BeamFnApi.StateKey.TypeCase, StateRequestHandler>) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
 
- 
Returns a 
StateRequestHandler which delegates to the supplied handler depending on the
 
BeamFnApi.StateRequests 
type.
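
The dispatch-by-type idea behind this method — an EnumMap routing each request kind to the handler registered for it — can be sketched without Beam's types. The RequestType enum and string-based handler signature below are hypothetical simplifications of the real StateRequestHandler machinery.

```java
import java.util.EnumMap;
import java.util.function.Function;

// Sketch of delegation based upon a request's type case: look up the
// registered handler for the enum constant and forward the request to it.
class TypeDispatcher {
  enum RequestType { BAG_USER_STATE, MULTIMAP_SIDE_INPUT }

  static String handle(
      EnumMap<RequestType, Function<String, String>> handlers,
      RequestType type,
      String request) {
    Function<String, String> handler = handlers.get(type);
    if (handler == null) {
      throw new IllegalStateException("No handler registered for " + type);
    }
    return handler.apply(request);
  }
}
```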
 
 
- DelegateCoder<T,IntermediateT> - Class in org.apache.beam.sdk.coders
 
- 
A 
DelegateCoder<T, IntermediateT> wraps a 
Coder for 
IntermediateT and
 encodes/decodes values of type 
T by converting to/from 
IntermediateT and then
 encoding/decoding using the underlying 
Coder<IntermediateT>.
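
The wrapping pattern described here — encode T by first converting it to an intermediate type handled by an existing coder, and invert the conversion on decode — can be sketched with the JDK alone. The Codec interface is a hypothetical stand-in for Coder, not Beam's API.

```java
import java.util.function.Function;

// Stdlib-only sketch of the delegate-coder idea: all byte-level work is
// delegated to the inner codec for the intermediate type I.
class DelegatingCodec<T, I> {
  interface Codec<X> {
    byte[] encode(X value);
    X decode(byte[] bytes);
  }

  private final Codec<I> inner;
  private final Function<T, I> toIntermediate;
  private final Function<I, T> fromIntermediate;

  DelegatingCodec(Codec<I> inner, Function<T, I> to, Function<I, T> from) {
    this.inner = inner;
    this.toIntermediate = to;
    this.fromIntermediate = from;
  }

  byte[] encode(T value) {
    return inner.encode(toIntermediate.apply(value)); // T -> I -> bytes
  }

  T decode(byte[] bytes) {
    return fromIntermediate.apply(inner.decode(bytes)); // bytes -> I -> T
  }
}
```

For example, an Integer codec can delegate to a String codec by converting with Integer.toString and Integer.parseInt.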
 
 
- DelegateCoder(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.DelegateCoder
 
-  
 
- DelegateCoder.CodingFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.coders
 
- 
 
- delete(Collection<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
 
- 
Deletes a collection of resources.
 
- delete(Collection<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
 
- 
Deletes a collection of resources.
 
- deleteDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Deletes the dataset specified by the datasetId value.
 
- deletedTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate.TimerUpdateBuilder
 
- 
Adds the provided timer to the collection of deleted timers, removing it from set timers if
 it has previously been set.
 
- deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
 
- 
 
- deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
 
- 
 
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Delete subscription.
 
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- deleteTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Deletes the table specified by tableId from the dataset.
 
- deleteTimer(StateNamespace, String, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- deleteTimer(StateNamespace, String) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- deleteTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
 
- 
Removes the timer set in this context for the timestamp and timeDomain.
 
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- delimitElement() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
-  
 
- dependsOnlyOnEarliestTimestamp() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
 
- 
Returns true if the result of combination of many output timestamps actually depends
 only on the earliest.
 
- dependsOnlyOnWindow() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
 
- 
Returns true if the result does not depend on what outputs were combined but only the
 window they are in.
 
- deregister() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
 
- 
De-registers the handler for all future requests for state for the registered process bundle
 instruction id.
 
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
 
-  
 
- describeMismatchSafely(PipelineResult, Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
-  
 
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
-  
 
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.RegexMatcher
 
-  
 
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
-  
 
- Description - Annotation Type in org.apache.beam.sdk.options
 
- 
Descriptions are used to generate human-readable output when the --help command is
 specified.
 
- deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
-  
 
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
 
-  
 
- deserializeTimers(Collection<byte[]>, TimerInternals.TimerDataCoder) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- detect(String) - Static method in enum org.apache.beam.sdk.io.Compression
 
-  
 
- DirectOptions - Interface in org.apache.beam.runners.direct
 
- 
 
- DirectOptions.AvailableParallelismFactory - Class in org.apache.beam.runners.direct
 
- 
 
- DIRECTORY_CONTAINER - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
 
- 
Shard is a file within a directory.
 
- DirectRegistrar - Class in org.apache.beam.runners.direct
 
- 
 
- DirectRegistrar.Options - Class in org.apache.beam.runners.direct
 
- 
 
- DirectRegistrar.Runner - Class in org.apache.beam.runners.direct
 
- 
 
- DirectRunner - Class in org.apache.beam.runners.direct
 
- 
 
- DirectRunner() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
 
-  
 
- DirectRunner.DirectPipelineResult - Class in org.apache.beam.runners.direct
 
- 
 
- DirectStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
 
- 
A StreamObserver which uses synchronization on the underlying CallStreamObserver
 to provide thread safety.
 
- DirectStreamObserver(Phaser, CallStreamObserver<T>) - Constructor for class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
-  
 
- DirectTestOptions - Interface in org.apache.beam.runners.direct
 
- 
Internal-only options for tweaking the behavior of the 
DirectRunner in ways that users
 should never do.
 
 
- DISALLOW_COMBINER_LIFTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- discardingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Returns a new Window PTransform that uses the registered WindowFn and
 Triggering behavior, and that discards elements in a pane after they are triggered.
 
- dispatchBag(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
-  
 
- dispatchBag(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
-  
 
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
-  
 
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
-  
 
- dispatchDefault() - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
-  
 
- dispatchMap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
-  
 
- dispatchMap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
-  
 
- dispatchSet(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
-  
 
- dispatchSet(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
-  
 
- dispatchValue(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
 
-  
 
- dispatchValue(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
-  
 
- DISPLAY_DATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- DisplayData - Class in org.apache.beam.sdk.transforms.display
 
- 
Static display data associated with a pipeline component.
 
- DisplayData.Builder - Interface in org.apache.beam.sdk.transforms.display
 
- 
Utility to build up display data from a component and its included subcomponents.
 
- DisplayData.Identifier - Class in org.apache.beam.sdk.transforms.display
 
- 
Unique identifier for a display data item within a component.
 
- DisplayData.Item - Class in org.apache.beam.sdk.transforms.display
 
- 
Items are the unit of display data.
 
 
- DisplayData.ItemSpec<T> - Class in org.apache.beam.sdk.transforms.display
 
- 
 
- DisplayData.Path - Class in org.apache.beam.sdk.transforms.display
 
- 
Structured path of registered display data within a component hierarchy.
 
- DisplayData.Type - Enum in org.apache.beam.sdk.transforms.display
 
- 
Display data type.
 
- Distinct<T> - Class in org.apache.beam.sdk.transforms
 
- 
Distinct<T> takes a PCollection<T> and returns a PCollection<T> that has
 all distinct elements of the input.
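
What Distinct<T> computes can be sketched for a plain in-memory collection (Beam performs the deduplication in parallel and per window; this stdlib version just preserves first-seen order):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

// Sketch of the Distinct contract: the input with duplicates removed.
class DistinctSketch {
  static <T> List<T> distinct(List<T> input) {
    return new ArrayList<>(new LinkedHashSet<>(input));
  }
}
```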
 
- Distinct() - Constructor for class org.apache.beam.sdk.transforms.Distinct
 
-  
 
- Distinct.WithRepresentativeValues<T,IdT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Distribution - Interface in org.apache.beam.sdk.metrics
 
- 
A metric that reports information about the distribution of reported values.
 
- distribution(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
 
- 
Create a metric that records various statistics about the distribution of reported values.
 
- distribution(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
 
- 
Create a metric that records various statistics about the distribution of reported values.
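
The statistics a Distribution metric tracks can be sketched with a plain accumulator. The field set below (count, sum, min, max) mirrors what Beam's DistributionResult exposes, but the class itself is a hypothetical illustration, not the Metrics API.

```java
// Sketch of a distribution accumulator: each update() folds one reported
// value into the running count, sum, min, and max.
class DistributionSketch {
  long count;
  long sum;
  long min = Long.MAX_VALUE;
  long max = Long.MIN_VALUE;

  void update(long value) {
    count++;
    sum += value;
    min = Math.min(min, value);
    max = Math.max(max, value);
  }

  double mean() {
    return count == 0 ? 0.0 : (double) sum / count;
  }
}
```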
 
- DistributionResult - Class in org.apache.beam.sdk.metrics
 
- 
 
- DistributionResult() - Constructor for class org.apache.beam.sdk.metrics.DistributionResult
 
-  
 
- doChecks(PAssert.PAssertionSite, ActualT, SerializableFunction<ActualT, Void>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
-  
 
- DockerEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
 
- 
 
- DockerEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
 
- 
Provider for DockerEnvironmentFactory.
 
- DoFn<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
The argument to 
ParDo providing the code to use to process elements of the input 
PCollection.
 
 
- DoFn() - Constructor for class org.apache.beam.sdk.transforms.DoFn
 
-  
 
- DoFn.BoundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.Element - Annotation Type in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.FieldAccess - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for specifying specific fields that are accessed in a Schema PCollection.
 
- DoFn.FinishBundle - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method to use to finish processing a batch of elements.
 
- DoFn.FinishBundleContext - Class in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.GetInitialRestriction - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method that maps an element to an initial restriction for a 
splittable DoFn.
 
 
- DoFn.GetRestrictionCoder - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method that returns the coder to use for the restriction of a 
splittable DoFn.
 
 
- DoFn.MultiOutputReceiver - Interface in org.apache.beam.sdk.transforms
 
- 
Receives tagged output for a multi-output function.
 
- DoFn.NewTracker - Annotation Type in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.OnTimer - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for registering a callback for a timer.
 
- DoFn.OnTimerContext - Class in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.OnWindowExpiration - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method to use for performing actions on window expiration.
 
- DoFn.OutputReceiver<T> - Interface in org.apache.beam.sdk.transforms
 
- 
Receives values of the given type.
 
- DoFn.ProcessContext - Class in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.ProcessContinuation - Class in org.apache.beam.sdk.transforms
 
- 
When used as a return value of 
DoFn.ProcessElement, indicates whether there is more work to
 be done for the current element.
 
 
- DoFn.ProcessElement - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method to use for processing elements.
 
- DoFn.RequiresStableInput - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Experimental - no backwards compatibility guarantees.
 
- DoFn.Setup - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method to use to prepare an instance for processing bundles of elements.
 
- DoFn.SplitRestriction - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method that splits restriction of a 
splittable DoFn into multiple parts to
 be processed in parallel.
 
 
- DoFn.StartBundle - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method to use to prepare an instance for processing a batch of elements.
 
- DoFn.StartBundleContext - Class in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.StateId - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for declaring and dereferencing state cells.
 
- DoFn.Teardown - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for the method to use to clean up this instance before it is discarded.
 
- DoFn.TimerId - Annotation Type in org.apache.beam.sdk.transforms
 
- 
Annotation for declaring and dereferencing timers.
 
- DoFn.Timestamp - Annotation Type in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.UnboundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
 
- 
 
- DoFn.WindowedContext - Class in org.apache.beam.sdk.transforms
 
- 
Information accessible to all methods in this 
DoFn where the context is in some window.
 
 
- DoFnFunction<InputT,OutputT> - Class in org.apache.beam.runners.gearpump.translators.functions
 
- 
Gearpump 
FlatMapFunction wrapper over Beam 
DoFn.
 
 
- DoFnFunction(GearpumpPipelineOptions, DoFn<InputT, OutputT>, WindowingStrategy<?, ?>, Collection<PCollectionView<?>>, Map<String, PCollectionView<?>>, TupleTag<OutputT>, Map<TupleTag<?>, Coder<?>>, List<TupleTag<?>>) - Constructor for class org.apache.beam.runners.gearpump.translators.functions.DoFnFunction
 
-  
 
- DoFnOutputReceivers - Class in org.apache.beam.sdk.transforms
 
- 
 
- DoFnOutputReceivers() - Constructor for class org.apache.beam.sdk.transforms.DoFnOutputReceivers
 
-  
 
- DoFnRunnerFactory<InputT,OutputT> - Class in org.apache.beam.runners.gearpump.translators.utils
 
- 
A serializable SimpleDoFnRunner.

 
- DoFnRunnerFactory(GearpumpPipelineOptions, DoFn<InputT, OutputT>, Collection<PCollectionView<?>>, DoFnRunners.OutputManager, TupleTag<OutputT>, List<TupleTag<?>>, StepContext, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>) - Constructor for class org.apache.beam.runners.gearpump.translators.utils.DoFnRunnerFactory
 
-  
 
- DoFnRunnerWithMetricsUpdate<InputT,OutputT> - Class in org.apache.beam.runners.flink.metrics
 
- 
DoFnRunner decorator which registers MetricsContainerImpl.
 
- DoFnRunnerWithMetricsUpdate(String, DoFnRunner<InputT, OutputT>, RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
-  
 
- DoFnTester<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- DoFnTester.CloningBehavior - Enum in org.apache.beam.sdk.transforms
 
- 
 
- DOUBLE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- DOUBLE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of double fields.
 
- DoubleCoder - Class in org.apache.beam.sdk.coders
 
- 
A 
DoubleCoder encodes 
Double values in 8 bytes using Java serialization.
 
 
- doubles() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<Double> and returns a
 PCollection<Double> whose contents is the maximum of the input PCollection's
 elements, or Double.NEGATIVE_INFINITY if there are no elements.
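
The contract above can be sketched for an in-memory list; Double.NEGATIVE_INFINITY is the natural result for an empty input because it is the identity element of max, so combining empty shards with non-empty ones still yields the right answer.

```java
import java.util.List;

// Sketch of Max.doublesGlobally(): the maximum of all elements, or
// Double.NEGATIVE_INFINITY if there are none.
class MaxDoubles {
  static double maxOrNegativeInfinity(List<Double> values) {
    double max = Double.NEGATIVE_INFINITY;
    for (double v : values) {
      max = Math.max(max, v);
    }
    return max;
  }
}
```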
 
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<Double> and returns a
 PCollection<Double> whose contents is the minimum of the input PCollection's
 elements, or Double.POSITIVE_INFINITY if there are no elements.
 
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
 
- 
Returns a PTransform that takes an input PCollection<Double> and returns a
 PCollection<Double> whose contents is the sum of the input PCollection's
 elements, or 0 if there are no elements.
 
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns
 a PCollection<KV<K, Double>> that contains an output element mapping each distinct key
 in the input PCollection to the maximum of the values associated with that key in the
 input PCollection.
 
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns
 a PCollection<KV<K, Double>> that contains an output element mapping each distinct key
 in the input PCollection to the minimum of the values associated with that key in the
 input PCollection.
 
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
 
- 
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns
 a PCollection<KV<K, Double>> that contains an output element mapping each distinct key
 in the input PCollection to the sum of the values associated with that key in the input
 PCollection.
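The per-key contract shared by the doublesPerKey() transforms — each distinct key maps to one combined value — can be sketched with Map.merge standing in for Beam's grouping (class and method names are hypothetical, not Beam API):

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for Sum.doublesPerKey(): values sharing a key are combined
// into a single output value per distinct key. Not Beam code.
public class PerKeyCombineDemo {
    static Map<String, Double> sumPerKey(String[] keys, double[] values) {
        Map<String, Double> out = new HashMap<>();
        for (int i = 0; i < keys.length; i++) {
            // merge() inserts on first sight, combines on repeats
            out.merge(keys[i], values[i], Double::sum);
        }
        return out;
    }
}
```

Swapping `Double::sum` for `Double::max` or `Double::min` gives the Max/Min per-key variants.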
 
- drive() - Method in interface org.apache.beam.runners.local.ExecutionDriver
 
-  
 
- dropTable(SqlParserPos, boolean, SqlIdentifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
 
- 
Creates a DROP TABLE.
 
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
-  
 
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
-  
 
- dropTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
 
- 
Drops a table.
 
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
-  
 
- dryRunQuery(String, JobConfigurationQuery, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
 
- 
Dry runs the query in the given project.
 
- DurationCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- DynamicAvroDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
 
- 
 
- DynamicAvroDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicAvroDestinations
 
-  
 
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
-  
 
- DynamicDestinations<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
This class provides the most general way of specifying dynamic BigQuery table destinations.
 
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
-  
 
- DynamicFileDestinations - Class in org.apache.beam.sdk.io
 
- 
 
- DynamicFileDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicFileDestinations
 
-  
 
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
- 
Factory for creating Pubsub clients using gRPC transport.
 
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
- 
Factory for creating Pubsub clients using Json transport.
 
- fail(Throwable) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- fail(Throwable) - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
 
-  
 
- fail(Throwable) - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
 
- 
Mark the client as completed with an exception.
 
- failed(Exception) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
 
- 
Report that a failure has occurred.
 
- failed(Error) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
 
- 
Report that a failure has occurred.
 
- failure(String, String, Metadata, Throwable) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
 
-  
 
- failure(PAssert.PAssertionSite, Throwable) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
-  
 
- FAILURE_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
 
-  
 
- fewKeys() - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
- 
Returns whether it groups just a few keys.
 
- Field() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field
 
-  
 
- fieldAccess(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
 
- 
 
- FieldAccessDescriptor - Class in org.apache.beam.sdk.schemas
 
- 
Used inside of a 
DoFn to describe which fields in a schema
 type need to be accessed for processing.
 
 
- FieldAccessDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
-  
 
- fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
 
- 
Select a set of top-level field ids from the row.
 
- fieldIdsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
-  
 
- fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
 
- 
Select a set of top-level field names from the row.
 
- FieldType() - Constructor for class org.apache.beam.sdk.schemas.Schema.FieldType
 
-  
 
- FieldTypeDescriptors - Class in org.apache.beam.sdk.schemas
 
- 
Utilities for converting between 
Schema field types and 
TypeDescriptors that
 define Java objects which can represent these field types.
 
 
- FieldTypeDescriptors() - Constructor for class org.apache.beam.sdk.schemas.FieldTypeDescriptors
 
-  
 
- fieldTypeForJavaType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.FieldTypeDescriptors
 
- 
 
- FieldValueGetter<ObjectT,ValueT> - Interface in org.apache.beam.sdk.schemas
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- FieldValueGetterFactory - Interface in org.apache.beam.sdk.schemas
 
- 
A factory interface for creating 
FieldValueGetter objects
 corresponding to a class.
 
 
- fieldValueGetterFactory() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
-  
 
- fieldValueGetterFactory() - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
 
-  
 
- FieldValueSetter<ObjectT,ValueT> - Interface in org.apache.beam.sdk.schemas
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- FieldValueSetterFactory - Interface in org.apache.beam.sdk.schemas
 
- 
A factory interface for creating 
FieldValueSetter objects
 corresponding to a class.
 
 
- fieldValueSetterFactory() - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
-  
 
- fieldValueSetterFactory() - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
 
-  
 
- FileBasedReader(FileBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
- 
Subclasses should not perform IO operations in the constructor.
 
- FileBasedSink<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
 
- 
Abstract class for file-based output.
 
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
 
- 
Construct a 
FileBasedSink with the given temp directory, producing uncompressed files.
 
 
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, FileBasedSink.WritableByteChannelFactory) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
 
- 
Construct a 
FileBasedSink with the given temp directory and output channel type.
 
 
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, Compression) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
 
- 
Construct a 
FileBasedSink with the given temp directory and output channel type.
 
 
- FileBasedSink.CompressionType - Enum in org.apache.beam.sdk.io
 
- 
 
- FileBasedSink.DynamicDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
 
- 
 
- FileBasedSink.FilenamePolicy - Class in org.apache.beam.sdk.io
 
- 
A naming policy for output files.
 
- FileBasedSink.FileResult<DestinationT> - Class in org.apache.beam.sdk.io
 
- 
Result of a single bundle write.
 
- FileBasedSink.FileResultCoder<DestinationT> - Class in org.apache.beam.sdk.io
 
- 
 
- FileBasedSink.OutputFileHints - Interface in org.apache.beam.sdk.io
 
- 
Provides hints about how to generate output files, such as a suggested filename suffix (e.g.
 
- FileBasedSink.WritableByteChannelFactory - Interface in org.apache.beam.sdk.io
 
- 
 
- FileBasedSink.WriteOperation<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
 
- 
Abstract operation that manages the process of writing to 
FileBasedSink.
 
 
- FileBasedSink.Writer<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
 
- 
 
- FileBasedSource<T> - Class in org.apache.beam.sdk.io
 
- 
A common base class for all file-based 
Sources.
 
 
- FileBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
 
- 
Create a FileBasedSource based on a file or a file pattern specification, with the given
 strategy for treating filepatterns that do not match any files.
 
- FileBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
 
- 
 
- FileBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
 
- 
Create a FileBasedSource based on a single file.
 
- FileBasedSource.FileBasedReader<T> - Class in org.apache.beam.sdk.io
 
- 
A 
reader that implements code common to readers of 
FileBasedSources.
 
 
- FileBasedSource.Mode - Enum in org.apache.beam.sdk.io
 
- 
A given FileBasedSource represents a file resource of one of these types.
 
- FileChecksumMatcher - Class in org.apache.beam.sdk.testing
 
- 
Matcher to verify file checksums in E2E tests.
 
- FileChecksumMatcher(String, String) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
 
- 
Constructor that uses default shard template.
 
- FileChecksumMatcher(String, String, Pattern) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
 
- 
Constructor using a custom shard template.
 
- FileChecksumMatcher(String, ShardedFile) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
 
- 
Constructor using an entirely custom ShardedFile implementation.
 
- FileIO - Class in org.apache.beam.sdk.io
 
- 
General-purpose transforms for working with files: listing files (matching), reading and writing.
 
- FileIO() - Constructor for class org.apache.beam.sdk.io.FileIO
 
-  
 
- FileIO.Match - Class in org.apache.beam.sdk.io
 
- 
 
- FileIO.MatchAll - Class in org.apache.beam.sdk.io
 
- 
 
- FileIO.MatchConfiguration - Class in org.apache.beam.sdk.io
 
- 
Describes configuration for matching filepatterns, such as 
EmptyMatchTreatment and
 continuous watching for matching files.
 
 
- FileIO.ReadableFile - Class in org.apache.beam.sdk.io
 
- 
A utility class for accessing a potentially compressed file.
 
- FileIO.ReadMatches - Class in org.apache.beam.sdk.io
 
- 
 
- FileIO.Sink<ElementT> - Interface in org.apache.beam.sdk.io
 
- 
 
- FileIO.Write<DestinationT,UserT> - Class in org.apache.beam.sdk.io
 
- 
 
- FileIO.Write.FileNaming - Interface in org.apache.beam.sdk.io
 
- 
A policy for generating names for shard files.
 
- FilenamePolicy() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
 
-  
 
- filepattern(String) - Method in class org.apache.beam.sdk.io.FileIO.Match
 
- 
Matches the given filepattern.
 
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Match
 
- 
 
- filepattern(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
 
- 
Matches the given filepattern.
 
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
 
- 
 
- FileResult(ResourceId, int, BoundedWindow, PaneInfo, DestinationT) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- FileResultCoder(Coder<BoundedWindow>, Coder<DestinationT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
-  
 
- FileSystem<ResourceIdT extends ResourceId> - Class in org.apache.beam.sdk.io
 
- 
File system interface in Beam.
 
- FileSystem() - Constructor for class org.apache.beam.sdk.io.FileSystem
 
-  
 
- FileSystemRegistrar - Interface in org.apache.beam.sdk.io
 
- 
 
- FileSystems - Class in org.apache.beam.sdk.io
 
- 
 
- FileSystems() - Constructor for class org.apache.beam.sdk.io.FileSystems
 
-  
 
- Filter - Class in org.apache.beam.sdk.schemas.transforms
 
- 
A 
PTransform for filtering a collection of schema types.
 
 
- Filter() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter
 
-  
 
- Filter<T> - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for filtering from a PCollection the elements satisfying a predicate,
 or satisfying an inequality with a given value based on the elements' natural ordering.
 
- Filter.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
Implementation of the filter.
 
- finalizeCheckpoint() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
-  
 
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
 
-  
 
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
- 
Acknowledges all outstanding messages.
 
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
-  
 
- finalizeCheckpoint() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
 
- 
Called by the system to signal that this checkpoint mark has been committed along with all
 the records which have been read from the 
UnboundedSource.UnboundedReader since the previous
 checkpoint was taken.
 
 
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.UnboundedSource.CheckpointMark.NoopCheckpointMark
 
-  
 
- finalizeDestination(DestinationT, BoundedWindow, Integer, Collection<FileBasedSink.FileResult<DestinationT>>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
-  
 
- find(String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- find(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- find(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- find(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- find(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- find(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
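The find(pattern, group) overloads emit a chosen capturing group of the first match in each element. This stdlib mirror shows the single-element semantics (the demo class is hypothetical; the Beam transform applies the same logic across a PCollection):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Single-element sketch of Regex.find(regex, group): return the given
// capturing group of the first match, or null when nothing matches.
public class RegexFindDemo {
    static String findGroup(String input, String regex, int group) {
        Matcher m = Pattern.compile(regex).matcher(input);
        return m.find() ? m.group(group) : null;
    }
}
```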
 
- Find(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Find
 
-  
 
- findAll(String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- findAll(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- FindAll(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindAll
 
-  
 
- findExpressionOfType(List<BeamSqlExpression>, SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.SqlTypeUtils
 
- 
Finds an operand with provided type.
 
- findExpressionOfType(List<BeamSqlExpression>, Collection<SqlTypeName>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.SqlTypeUtils
 
- 
Finds an operand with the type in typesToFind.
 
- findKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- findKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- findKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
 
- findKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
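The findKV overloads build a key/value pair from two capturing groups of a match. Map.Entry stands in for Beam's KV type in this stdlib-only sketch (demo class name is hypothetical):

```java
import java.util.AbstractMap.SimpleImmutableEntry;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Single-element sketch of Regex.findKV(regex, keyGroup, valueGroup):
// the two named groups of the first match become a key/value pair.
public class RegexFindKVDemo {
    static Map.Entry<String, String> findKV(
            String input, String regex, int keyGroup, int valueGroup) {
        Matcher m = Pattern.compile(regex).matcher(input);
        return m.find()
            ? new SimpleImmutableEntry<>(m.group(keyGroup), m.group(valueGroup))
            : null;
    }
}
```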
 
- FindKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindKV
 
-  
 
- FindName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindName
 
-  
 
- FindNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindNameKV
 
-  
 
- finishBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
-  
 
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
-  
 
- finishBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- FinishBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
 
-  
 
- finishSpecifying() - Method in interface org.apache.beam.sdk.state.StateSpec
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
After building, finalizes this 
PValue to make it ready for running.
 
 
- finishSpecifying(PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.PValue
 
- 
After building, finalizes this 
PValue to make it ready for being used as an input to a
 
PTransform.
 
 
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
 
-  
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
-  
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
-  
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.WriteFilesResult
 
-  
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
 
-  
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionList
 
-  
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
-  
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PDone
 
- 
Does nothing; there is nothing to finish specifying.
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.POutput
 
- 
As part of applying the producing 
PTransform, finalizes this output to make it ready
 for being used as an input and for running.
 
 
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
 
-  
 
- finishWrite() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
 
- fixDefaults() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
- 
Fixes all the defaults so that equals can be used to check that two strategies are the same,
 regardless of the state of "defaulted-ness".
 
- fixedSizeGlobally(int) - Static method in class org.apache.beam.sdk.transforms.Sample
 
- 
Returns a PTransform that takes a PCollection<T>, selects sampleSize
 elements, uniformly at random, and returns a PCollection<Iterable<T>> containing the
 selected elements.
 
- fixedSizePerKey(int) - Static method in class org.apache.beam.sdk.transforms.Sample
 
- 
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a
 PCollection<KV<K, Iterable<V>>> that contains an output element mapping each distinct
 key in the input PCollection to a sample of sampleSize values associated with
 that key in the input PCollection, taken uniformly at random.
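The uniform-sampling contract of fixedSizeGlobally(int) can be sketched as shuffle-and-take (a stand-in only — Beam's real implementation is a distributed sampling algorithm, and the demo class name is hypothetical):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Contract sketch for Sample.fixedSizeGlobally(n): pick up to n elements
// uniformly at random; if fewer than n exist, return them all.
public class SampleDemo {
    static <T> List<T> sample(List<T> input, int sampleSize, Random rng) {
        List<T> copy = new ArrayList<>(input);
        Collections.shuffle(copy, rng);
        return copy.subList(0, Math.min(sampleSize, copy.size()));
    }
}
```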
 
- FixedWindows - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
WindowFn that windows values into fixed-size timestamp-based windows.
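The fixed-window assignment boils down to floor-aligning a timestamp to a window grid of a given size and offset. Shown here on plain millisecond longs rather than Beam's Instant type (demo class name is hypothetical):

```java
// Standard start-of-window calculation for fixed windows: each timestamp
// falls in exactly one window [start, start + size), aligned to offset.
public class FixedWindowDemo {
    static long windowStart(long timestampMillis, long sizeMillis, long offsetMillis) {
        // Double-mod keeps the phase non-negative even for timestamps
        // earlier than the offset.
        long phase = ((timestampMillis - offsetMillis) % sizeMillis + sizeMillis) % sizeMillis;
        return timestampMillis - phase;
    }
}
```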
 
 
- flatMap(List<TranslatorUtils.RawUnionValue>) - Method in class org.apache.beam.runners.gearpump.translators.functions.DoFnFunction
 
-  
 
- flatMap(WindowedValue<T>) - Method in class org.apache.beam.runners.gearpump.translators.WindowAssignTranslator.AssignWindows
 
-  
 
- FlatMapElements<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for mapping a simple function that returns iterables over the elements of a
 
PCollection and merging the results.
 
 
- Flatten - Class in org.apache.beam.sdk.transforms
 
- 
Flatten<T> takes multiple PCollection<T>s bundled into a PCollectionList<T> and returns a single PCollection<T> containing all the elements in
 all the input PCollections.
 
- Flatten() - Constructor for class org.apache.beam.sdk.transforms.Flatten
 
-  
 
- Flatten.Iterables<T> - Class in org.apache.beam.sdk.transforms
 
- 
FlattenIterables<T> takes a PCollection<Iterable<T>> and returns a PCollection<T> that contains all the elements from each iterable.
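java.util.stream's flatMap has the same shape as Flatten.Iterables — every inner iterable's contents are concatenated into one flat collection. A stand-in sketch, not Beam code (demo class name is hypothetical):

```java
import java.util.List;
import java.util.stream.Collectors;

// Shape of Flatten.Iterables<T>: List<List<T>> in, flat List<T> out,
// preserving the order of the inner lists' elements.
public class FlattenDemo {
    static <T> List<T> flatten(List<List<T>> nested) {
        return nested.stream().flatMap(List::stream).collect(Collectors.toList());
    }
}
```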
 
- Flatten.PCollections<T> - Class in org.apache.beam.sdk.transforms
 
- 
 
- FlattenPCollectionsTranslator<T> - Class in org.apache.beam.runners.gearpump.translators
 
- 
Flatten.FlattenPCollectionList is translated to a Gearpump merge function.
 
- FlattenPCollectionsTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.FlattenPCollectionsTranslator
 
-  
 
- flattenRel(RelStructuredTypeFlattener) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
-  
 
- FlattenWithHeterogeneousCoders - Interface in org.apache.beam.sdk.testing
 
- 
 
- FlinkBatchPortablePipelineTranslator - Class in org.apache.beam.runners.flink
 
- 
A translator that translates bounded portable pipelines into executable Flink pipelines.
 
- FlinkBatchPortablePipelineTranslator.BatchTranslationContext - Class in org.apache.beam.runners.flink
 
- 
Batch translation context.
 
- FlinkDetachedRunnerResult - Class in org.apache.beam.runners.flink
 
- 
Result of a detached execution of a 
Pipeline with Flink.
 
 
- FlinkExecutionEnvironments - Class in org.apache.beam.runners.flink
 
- 
Utilities for Flink execution environments.
 
- FlinkExecutionEnvironments() - Constructor for class org.apache.beam.runners.flink.FlinkExecutionEnvironments
 
-  
 
- FlinkJobInvocation - Class in org.apache.beam.runners.flink
 
- 
 
- FlinkJobInvoker - Class in org.apache.beam.runners.flink
 
- 
 
- FlinkJobServerDriver - Class in org.apache.beam.runners.flink
 
- 
Driver program that starts a job server.
 
- FlinkJobServerDriver.ServerConfiguration - Class in org.apache.beam.runners.flink
 
- 
Configuration for the jobServer.
 
- FlinkMetricContainer - Class in org.apache.beam.runners.flink.metrics
 
- 
Helper class for holding a MetricsContainerImpl and forwarding Beam metrics to Flink
 accumulators and metrics.
 
- FlinkMetricContainer(RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
 
-  
 
- FlinkMetricContainer.FlinkDistributionGauge - Class in org.apache.beam.runners.flink.metrics
 
- 
 
- FlinkMetricContainer.FlinkGauge - Class in org.apache.beam.runners.flink.metrics
 
- 
 
- FlinkPipelineOptions - Interface in org.apache.beam.runners.flink
 
- 
Options which can be used to configure a Flink PipelineRunner.
 
- FlinkPortablePipelineTranslator<T extends FlinkPortablePipelineTranslator.TranslationContext> - Interface in org.apache.beam.runners.flink
 
- 
Interface for portable Flink translators.
 
- FlinkPortablePipelineTranslator.TranslationContext - Interface in org.apache.beam.runners.flink
 
- 
The context used for pipeline translation.
 
- FlinkRunner - Class in org.apache.beam.runners.flink
 
- 
A 
PipelineRunner that executes the operations in the pipeline by first translating them
 to a Flink Plan and then executing them either locally or on a Flink cluster, depending on the
 configuration.
 
 
- FlinkRunnerRegistrar - Class in org.apache.beam.runners.flink
 
- 
AutoService registrar - will register FlinkRunner and FlinkOptions as possible pipeline runner
 services.
 
- FlinkRunnerRegistrar.Options - Class in org.apache.beam.runners.flink
 
- 
Pipeline options registrar.
 
- FlinkRunnerRegistrar.Runner - Class in org.apache.beam.runners.flink
 
- 
Pipeline runner registrar.
 
- FlinkRunnerResult - Class in org.apache.beam.runners.flink
 
- 
Result of executing a 
Pipeline with Flink.
 
 
- FlinkStreamingPortablePipelineTranslator - Class in org.apache.beam.runners.flink
 
- 
Translate an unbounded portable pipeline representation into a Flink pipeline representation.
 
- FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext - Class in org.apache.beam.runners.flink
 
- 
Streaming translation context.
 
- FlinkTransformOverrides - Class in org.apache.beam.runners.flink
 
- 
 
- FlinkTransformOverrides() - Constructor for class org.apache.beam.runners.flink.FlinkTransformOverrides
 
-  
 
- FLOAT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- FLOAT - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of float fields.
 
- FloatCoder - Class in org.apache.beam.sdk.coders
 
- 
A 
FloatCoder encodes 
Float values in 4 bytes using Java serialization.
 
 
- floats() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- flush() - Method in class org.apache.beam.sdk.io.AvroIO.Sink
 
-  
 
- flush() - Method in interface org.apache.beam.sdk.io.FileIO.Sink
 
- 
Flushes the buffered state (if any) before the channel is closed.
 
- flush() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
-  
 
- flush() - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
-  
 
- flush() - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
 
-  
 
- flush() - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
-  
 
- fn(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
 
- 
 
- fn(Contextful.Fn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
 
- 
 
- FnApiControlClient - Class in org.apache.beam.runners.fnexecution.control
 
- 
A client for the control plane of an SDK harness, which can issue requests to it over the Fn API.
 
- FnApiControlClientPoolService - Class in org.apache.beam.runners.fnexecution.control
 
- 
A Fn API control service which adds incoming SDK harness connections to a sink.
 
- FnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
 
- 
A receiver of streamed data.
 
- FnDataService - Interface in org.apache.beam.runners.fnexecution.data
 
- 
The 
FnDataService is able to forward inbound elements to a consumer and is also a
 consumer of outbound elements.
 
 
- FnService - Interface in org.apache.beam.runners.fnexecution
 
- 
An interface sharing common behavior with services used during execution of user Fns.
 
- fold(KV<Instant, WindowedValue<KV<K, List<V>>>>, KV<Instant, WindowedValue<KV<K, V>>>) - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.Merge
 
-  
 
- forBagUserStateHandlerFactory(ProcessBundleDescriptors.ExecutableProcessBundleDescriptor, StateRequestHandlers.BagUserStateHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
 
- 
 
- forBatch(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
-  
 
- forBoolean(Boolean) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject representing the given value.
 
- forClass(Class<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject to be used for serializing an instance of the supplied class
 for transport via the Dataflow API.
 
- forClassName(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject to be used for serializing data to be deserialized using the
supplied class name for transport via the Dataflow API.
 
- forCoder(TypeDescriptor<?>, Coder<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
 
- 
Creates a 
CoderProvider that always returns the given coder for the specified type.
 
 
- forConsumer(Coder<WindowedValue<T>>, FnDataReceiver<WindowedValue<T>>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- forDescriptor(Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
 
-  
 
- forEncoding(ByteString) - Static method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
 
-  
 
- forever(Trigger) - Static method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
- 
Create a composite trigger that repeatedly executes the trigger repeated, firing each
 time it fires and ignoring any indications to finish.
 
- forField(Field) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference.TypeInformation
 
- 
 
- forFloat(Float) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject representing the given value.
 
- forFloat(Double) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject representing the given value.
 
- forGetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference.TypeInformation
 
- 
 
- forHandler(RunnerApi.Environment, InstructionRequestHandler) - Static method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
 
- 
 
- forInteger(Long) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject representing the given value.
 
- forInteger(Integer) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject representing the given value.
 
- forKnownType(Object) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject representing the given value of a well-known cloud object
 type.
 
- forLocation(LogicalEndpoint, Coder<WindowedValue<T>>, StreamObserver<BeamFnApi.Elements>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver
 
-  
 
- forLocationWithBufferLimit(int, LogicalEndpoint, Coder<WindowedValue<T>>, StreamObserver<BeamFnApi.Elements>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver
 
-  
 
- FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- formatRecord(ElementT, Schema) - Method in interface org.apache.beam.sdk.io.AvroIO.RecordFormatter
 
-  
 
- formatRecord(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
Convert an input record type into the output type.
 
- formatTimestamp(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
 
- 
 
- forNewInput(Instant, InputT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
 
- 
Called by the 
Watch transform to create a new independent termination state for a
 newly arrived 
InputT.
 
 
- forOrdinal(int) - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
-  
 
- forPipeline(RunnerApi.Pipeline, Struct, File) - Static method in class org.apache.beam.runners.direct.portable.ReferenceRunner
 
-  
 
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
-  
 
- forRequestObserver(String, StreamObserver<BeamFnApi.InstructionRequest>) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
- 
 
- forRootDirectory(File) - Static method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactRetrievalService
 
-  
 
- forRootDirectory(File) - Static method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactStagerService
 
-  
 
- forRow(Row, BoundedWindow) - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionEnvironments
 
- 
An environment with a fixed row and window but no expressions or correlation variables.
 
- forRowAndCorrelVariables(Row, BoundedWindow, List<Row>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionEnvironments
 
- 
An environment with a fixed row and window and correlation variables.
 
- forSetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference.TypeInformation
 
- 
 
- forSideInput(String, String, RunnerApi.FunctionSpec, Coder<T>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
 
- 
 
- forSideInputHandlerFactory(Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, StateRequestHandlers.SideInputHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
 
- 
 
- forSqlType(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
 
-  
 
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
 
-  
 
- forStage(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.JobBundleFactory
 
-  
 
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
 
- 
Deprecated.
  
- forStreamFromSources(List<Integer>, Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
- 
Build the TimerInternals according to the feeding streams.
 
- forStreaming(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
-  
 
- forString(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject representing the given value.
 
- forTransformHierarchy(TransformHierarchy, PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
 
-  
 
- forTypeName(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
-  
 
- forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandlerFactory
 
-  
 
- ForwardingClientResponseObserver<ReqT,RespT> - Class in org.apache.beam.sdk.fn.stream
 
- 
A ClientResponseObserver which delegates all StreamObserver calls.
 
- forWriter(LogWriter) - Static method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
 
-  
 
- from(Map<String, String>) - Static method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
 
- 
Deprecated.
Expects a map keyed by logger Names with values representing Levels.
 
- from(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion.Builder
 
-  
 
- from(Collection<SqlTypeName>) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion.Builder
 
-  
 
- from(SqlTypeName...) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion.Builder
 
-  
 
- from() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion
 
-  
 
- from(ExecutorService) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
 
-  
 
- from(Supplier<ExecutorService>) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
 
-  
 
- from(String) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
- 
Reads from the given filename or filepattern.
 
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
- 
 
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
- 
Reads from the given filename or filepattern.
 
- from(String) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
- 
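A rough sketch of reading Avro files with the `from()` overloads above (the bucket path and the inline schema are placeholders; in practice the schema usually comes from a generated class):

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.AvroIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class AvroReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // A minimal record schema, inlined for the sake of the sketch.
    String schemaJson =
        "{\"type\":\"record\",\"name\":\"User\","
            + "\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}";
    // from() accepts a single filename or a glob pattern.
    PCollection<GenericRecord> records =
        p.apply(AvroIO.readGenericRecords(schemaJson).from("gs://my-bucket/users-*.avro"));
    p.run().waitUntilFinish();
  }
}
```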
 
- from(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.AvroSource
 
- 
Reads from the given file name or pattern ("glob").
 
- from(String) - Static method in class org.apache.beam.sdk.io.AvroSource
 
- 
 
- from(FileBasedSource<T>) - Static method in class org.apache.beam.sdk.io.CompressedSource
 
- 
Creates a CompressedSource from an underlying FileBasedSource.
 
- from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
Reads a BigQuery table specified as "[project_id]:[dataset_id].[table_id]" or "[dataset_id].[table_id]" for tables within the current project.
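As a hedged sketch of the table and query read variants (project, dataset, and table names are placeholders):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class BigQueryReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Read an entire table, identified as "[project_id]:[dataset_id].[table_id]".
    p.apply("ReadTable",
        BigQueryIO.readTableRows().from("my-project:my_dataset.my_table"));
    // Or read the result set of a query instead of a whole table.
    p.apply("ReadQuery",
        BigQueryIO.readTableRows().fromQuery("SELECT year, name FROM [my_dataset.names]"));
    p.run().waitUntilFinish();
  }
}
```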
 
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
 
- from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
 
- from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- from(long) - Static method in class org.apache.beam.sdk.io.GenerateSequence
 
- 
Specifies the minimum number to generate (inclusive).
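A minimal sketch of the bounded form of this transform:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.GenerateSequence;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class SequenceExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // from() is inclusive and to() is exclusive, so this yields 0..99.
    PCollection<Long> numbers = p.apply(GenerateSequence.from(0).to(100));
    p.run().waitUntilFinish();
  }
}
```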
 
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
- 
Reads from the given filename or filepattern.
 
- from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
- 
 
- from(BoundedSource<T>) - Method in class org.apache.beam.sdk.io.Read.Builder
 
- 
Returns a new Read.Bounded PTransform reading from the given BoundedSource.
 
- from(UnboundedSource<T, ?>) - Method in class org.apache.beam.sdk.io.Read.Builder
 
- 
Returns a new Read.Unbounded PTransform reading from the given UnboundedSource.
 
- from(BoundedSource<T>) - Static method in class org.apache.beam.sdk.io.Read
 
- 
Returns a new Read.Bounded PTransform reading from the given BoundedSource.
 
- from(UnboundedSource<T, ?>) - Static method in class org.apache.beam.sdk.io.Read
 
- 
 
- from(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
 
- 
Provide name of collection while reading from Solr.
 
- from(String) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
Reads from the text file(s) with the given filename or filename pattern.
 
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
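A minimal sketch of a text read using the `from()` overloads above (the glob path is a placeholder):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class TextReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Each element of the resulting PCollection is one line of text.
    PCollection<String> lines =
        p.apply(TextIO.read().from("gs://my-bucket/logs/*.txt"));
    p.run().waitUntilFinish();
  }
}
```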
 
- from(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
- 
Returns a transform for reading TFRecord files that reads from the file(s) with the given
 filename or filename pattern.
 
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
- 
 
- from(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
 
- from(Map<String, String>) - Static method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
 
- 
Expects a map keyed by logger Names with values representing LogLevels.
 
- from(HasDisplayData) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
 
- fromArgs(String...) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
 
- 
Sets the command line arguments to parse when constructing the 
PipelineOptions.
 
 
- fromArgs(String...) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
Sets the command line arguments to parse when constructing the 
PipelineOptions.
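The usual entry point for this static method, sketched:

```java
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsExample {
  public static void main(String[] args) {
    // Parses flags such as --runner=... from the command line;
    // withValidation() rejects unknown or malformed arguments.
    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().create();
  }
}
```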
 
 
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
Utility method for deserializing a byte array using the specified coder.
 
- fromByteArrays(Collection<byte[]>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
Utility method for deserializing an Iterable of byte arrays using the specified coder.
 
- fromByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
A function wrapper for converting a byte array to an object.
 
- fromByteFunction(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
A function wrapper for converting a byte array pair to a key-value pair.
 
- fromByteFunctionIterable(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
A function wrapper for converting a byte array pair to a key-value pair, where values are
 Iterable.
 
- fromCanonical(Compression) - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
- 
Deprecated.
  
- fromCloudDuration(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
 
- 
Converts a Dataflow API duration string into a 
Duration.
 
 
- fromCloudObject(CloudObject) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
 
- 
Converts back into the original object from a provided 
CloudObject.
 
 
- fromCloudTime(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
 
- 
Converts a time value received via the Dataflow API into the corresponding 
Instant.
 
 
- fromConfig(FlinkJobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
-  
 
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
 
- 
Note that the BeamFnApi.ProcessBundleDescriptor is constructed by adding gRPC read and
 write nodes and wiring them to the specified data endpoint.
 
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
 
-  
 
- fromFile(File, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.apex.ApexRunner
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.apex.TestApexRunner
 
-  
 
- fromOptions(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
 
- 
Constructs a translator from the provided options.
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
 
- 
Construct a runner from the provided options.
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.TestDataflowRunner
 
- 
Constructs a runner from the provided options.
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.GcsStager
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.direct.DirectRunner
 
- 
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkRunner
 
- 
Construct a runner from the provided options.
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.gearpump.GearpumpRunner
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.gearpump.TestGearpumpRunner
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.reference.PortableRunner
 
- 
Constructs a runner from the provided options.
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.reference.testing.TestPortableRunner
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
 
- 
Creates and returns a new SparkRunner with specified options.
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunnerDebugger
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.TestSparkRunner
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
-  
 
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
-  
 
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemRegistrar
 
-  
 
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.FileSystemRegistrar
 
- 
 
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
 
-  
 
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.LocalFileSystemRegistrar
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.PipelineRunner
 
- 
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.CrashingRunner
 
-  
 
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
 
-  
 
- fromParams(String[]) - Static method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobServer
 
-  
 
- fromParams(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
-  
 
- fromParams(DefaultFilenamePolicy.Params) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
- 
 
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
 
- 
Creates a class representing a Pub/Sub subscription from the specified subscription path.
 
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
- 
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
 
- fromPath(Path, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
-  
 
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
-  
 
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
-  
 
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
Reads results received after executing the given query.
 
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
 
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- fromRawEvents(Coder<T>, List<TestStream.Event<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream
 
- 
For internal use only.
 
- FromRawUnionValue() - Constructor for class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils.FromRawUnionValue
 
-  
 
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.DefaultSchema.DefaultSchemaProvider
 
- 
Given a type, returns a function that converts from a 
Row object to that type.
 
 
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
 
-  
 
- fromRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
 
- 
Given a type, returns a function that converts from a 
Row object to that type.
 
 
- fromRows(Class<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
 
- 
 
- fromRows(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
 
- 
 
- fromSerializableFunctionWithOutputType(SerializableFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.SimpleFunction
 
-  
 
- fromSpec(Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Constructs a CloudObject by copying the supplied serialized object spec, which must
 represent an SDK object serialized for transport via the Dataflow API.
 
- fromStandardParameters(ValueProvider<ResourceId>, String, String, boolean) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
- 
 
- fromStaticMethods(Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
 
- 
Creates a 
CoderProvider from a class's 
static <T> Coder<T> of(TypeDescriptor<T>,
 List<Coder<?>>) method.
 
 
- fromString(String, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
-  
 
- fromString(ValueProvider<String>, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
-  
 
- fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
- 
Reads from the given subscription.
 
- fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
- 
 
- fromSupplier(SerializableMatchers.SerializableSupplier<Matcher<T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
- 
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
 
- fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
- 
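A hedged sketch of the subscription and topic read variants above (the resource paths are placeholders):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class PubsubReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Read from an existing subscription.
    p.apply("FromSubscription",
        PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-sub"));
    // Reading from a topic creates a fresh subscription to it for this pipeline.
    p.apply("FromTopic",
        PubsubIO.readStrings().fromTopic("projects/my-project/topics/my-topic"));
    p.run().waitUntilFinish();
  }
}
```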
 
- fullOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
 
- 
Full Outer Join of two collections of KV elements.
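A small sketch of this join (the key/value literals are placeholders; the last two arguments supply the stand-in values used when a key appears on only one side, since Beam KVs may not hold nulls):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.joinlibrary.Join;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class FullOuterJoinExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    PCollection<KV<String, String>> left =
        p.apply("Left", Create.of(KV.of("a", "1"), KV.of("b", "2")));
    PCollection<KV<String, String>> right =
        p.apply("Right", Create.of(KV.of("b", "x"), KV.of("c", "y")));
    // Produces one result per key in the union of both key sets: a, b, c.
    PCollection<KV<String, KV<String, String>>> joined =
        Join.fullOuterJoin(left, right, "noLeft", "noRight");
    p.run().waitUntilFinish();
  }
}
```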
 
- Gauge - Interface in org.apache.beam.sdk.metrics
 
- 
A metric that reports the latest value out of reported values.
 
- gauge(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
 
- 
Create a metric that can have its new value set, and is aggregated by taking the last reported
 value.
 
- gauge(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
 
- 
Create a metric that can have its new value set, and is aggregated by taking the last reported
 value.
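A sketch of declaring and updating a gauge inside a DoFn (the metric name and class are illustrative):

```java
import org.apache.beam.sdk.metrics.Gauge;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.transforms.DoFn;

public class GaugeExample extends DoFn<Long, Long> {
  // Namespaced by the enclosing class; the runner reports the latest value set.
  private final Gauge queueDepth = Metrics.gauge(GaugeExample.class, "queueDepth");

  @ProcessElement
  public void processElement(ProcessContext c) {
    queueDepth.set(c.element()); // set() records the current value
    c.output(c.element());
  }
}
```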
 
- GaugeResult - Class in org.apache.beam.sdk.metrics
 
- 
The result of a 
Gauge metric.
 
 
- GaugeResult() - Constructor for class org.apache.beam.sdk.metrics.GaugeResult
 
-  
 
- GaugeResult.EmptyGaugeResult - Class in org.apache.beam.sdk.metrics
 
- 
 
- GcpCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
 
- 
Construct an OAuth credential to be used by the SDK and the SDK workers.
 
- GcpCredentialFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
 
-  
 
- GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
 
- 
A registrar containing the default GCP options.
 
- GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
-  
 
- GcpOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
 
- 
Options used to configure Google Cloud Platform specific options such as the project and
 credentials.
 
- GcpOptions.DefaultProjectFactory - Class in org.apache.beam.sdk.extensions.gcp.options
 
- 
Attempts to infer the default project based upon the environment this application is executing
 within.
 
- GcpOptions.GcpTempLocationFactory - Class in org.apache.beam.sdk.extensions.gcp.options
 
- 
 
- GcpOptions.GcpUserCredentialsFactory - Class in org.apache.beam.sdk.extensions.gcp.options
 
- 
Attempts to load the GCP credentials.
 
- GcpPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.gcp.options
 
- 
A registrar containing the default GCP options.
 
- GcpPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
 
-  
 
- GcpTempLocationFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
 
-  
 
- GcpUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
 
-  
 
- GcsCreateOptions - Class in org.apache.beam.sdk.extensions.gcp.storage
 
- 
An abstract class that contains common configuration options for creating resources.
 
- GcsCreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
 
-  
 
- GcsCreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.storage
 
- 
 
- GcsFileSystemRegistrar - Class in org.apache.beam.sdk.extensions.gcp.storage
 
- 
AutoService registrar for the GcsFileSystem.
 
- GcsFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
 
-  
 
- GcsOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
 
- 
Options used to configure Google Cloud Storage.
 
- GcsOptions.ExecutorServiceFactory - Class in org.apache.beam.sdk.extensions.gcp.options
 
- 
Returns the default ExecutorService to use within the Apache Beam SDK.
 
- GcsOptions.PathValidatorFactory - Class in org.apache.beam.sdk.extensions.gcp.options
 
- 
 
- GcsPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
 
- 
 
- GcsResourceId - Class in org.apache.beam.sdk.extensions.gcp.storage
 
- 
ResourceId implementation for Google Cloud Storage.
 
 
- GcsStager - Class in org.apache.beam.runners.dataflow.util
 
- 
Utility class for staging files to GCS.
 
- gcsUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
 
- 
The buffer size (in bytes) to use when uploading files to GCS.
 
- GearpumpPipelineOptions - Interface in org.apache.beam.runners.gearpump
 
- 
Options that configure the Gearpump pipeline.
 
- GearpumpPipelineResult - Class in org.apache.beam.runners.gearpump
 
- 
Result of executing a 
Pipeline with Gearpump.
 
 
- GearpumpPipelineResult(ClientContext, RunningApplication) - Constructor for class org.apache.beam.runners.gearpump.GearpumpPipelineResult
 
-  
 
- GearpumpPipelineTranslator - Class in org.apache.beam.runners.gearpump.translators
 
- 
 
- GearpumpPipelineTranslator(TranslationContext) - Constructor for class org.apache.beam.runners.gearpump.translators.GearpumpPipelineTranslator
 
-  
 
- GearpumpRunner - Class in org.apache.beam.runners.gearpump
 
- 
A 
PipelineRunner that executes the operations in the pipeline by first translating them
 to Gearpump Stream DSL and then executing them on a Gearpump cluster.
 
 
- GearpumpRunner(GearpumpPipelineOptions) - Constructor for class org.apache.beam.runners.gearpump.GearpumpRunner
 
-  
 
- GearpumpRunnerRegistrar - Class in org.apache.beam.runners.gearpump
 
- 
 
- GearpumpRunnerRegistrar.Options - Class in org.apache.beam.runners.gearpump
 
- 
 
- GearpumpRunnerRegistrar.Runner - Class in org.apache.beam.runners.gearpump
 
- 
 
- GearpumpSource<T> - Class in org.apache.beam.runners.gearpump.translators.io
 
- 
 
- GearpumpWindowFn(boolean) - Constructor for class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.GearpumpWindowFn
 
-  
 
- generate(Schema, UUID) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
 
-  
 
- GenerateSequence - Class in org.apache.beam.sdk.io
 
- 
A 
PTransform that produces longs starting from the given value, either up to the given
 limit or 
Long.MAX_VALUE, or until the given time elapses.
 
 
- GenerateSequence() - Constructor for class org.apache.beam.sdk.io.GenerateSequence
 
-  
 
- generateStagingSessionToken(String, String) - Static method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 
- 
 
- get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
 
- 
Returns an Iterable of values representing the bag user state for the given key and
 window.
 
- get(byte[], W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandler
 
- 
Returns an Iterable of values representing the side input for the given key and
 window.
 
- get() - Method in class org.apache.beam.runners.reference.CloseableResource
 
- 
Gets the underlying resource.
 
- get() - Static method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
-  
 
- get(Long) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
- 
 
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
-  
 
- get() - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
-  
 
- get() - Method in interface org.apache.beam.sdk.options.ValueProvider
 
- 
 
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
-  
 
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
-  
 
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
-  
 
- get(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
 
-  
 
- get(K) - Method in interface org.apache.beam.sdk.state.MapState
 
- 
A deferred lookup.
 
- get() - Method in interface org.apache.beam.sdk.testing.SerializableMatchers.SerializableSupplier
 
-  
 
- get(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
 
- 
Returns the value represented by the given 
TupleTag.
 
 
- get(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
 
- 
 
- get(K) - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
 
-  
 
- get(int) - Method in class org.apache.beam.sdk.values.PCollectionList
 
- 
Returns the 
PCollection at the given index (origin zero).
 
 
- get(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
- 
 
- get(int) - Method in class org.apache.beam.sdk.values.TupleTagList
 
- 
Returns the 
TupleTag at the given index (origin zero).
 
 
- getAccum() - Method in interface org.apache.beam.sdk.state.CombiningState
 
- 
Read the merged accumulator for this state cell.
 
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<KV<T, T>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
 
- 
Returns the Coder to use for accumulator AccumT values, or null if it cannot
 be inferred.
 
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
-  
 
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
-  
 
- getAdditionalInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
-  
 
- getAdditionalInputs() - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
 
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
- 
 
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
- 
 
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
- 
 
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
 
- getAdditionalOutputTags() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
-  
 
- getAddresses() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- getAlgorithm() - Method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
 
- 
Returns the string representation of this type.
 
- getAll(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
- 
Returns the values from the table represented by the given TupleTag<V> as an Iterable<V> (which may be empty if there are no results).
 
- getAll() - Method in class org.apache.beam.sdk.values.PCollectionList
 
- 
 
- getAll() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
- 
 
- getAll() - Method in class org.apache.beam.sdk.values.TupleTagList
 
- 
 
- getAllowedLateness() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.DoFn
 
- 
 
- getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.WithTimestamps
 
- 
 
- getApexDAG() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
- 
Return the DAG executed by the pipeline.
 
- getApexLauncher() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.LaunchParams
 
-  
 
- getApiRootUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
The root URL for the Dataflow API.
 
- getApiServiceDescriptor() - Method in class org.apache.beam.runners.fnexecution.GrpcFnServer
 
- 
Get an 
Endpoints.ApiServiceDescriptor describing the endpoint this 
GrpcFnServer is bound
 to.
 
 
- getApplicationName() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- getApplicationName() - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- getAppliedFn(CoderRegistry, Coder<? extends KV<K, ? extends Iterable<InputT>>>, WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
- 
 
- getAppName() - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
 
- 
Name of application, for display purposes.
 
- getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getArgumentTypes(Method) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns a list of argument types for the given method, which must be a part of the class.
 
- getArray(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get an array value by field name; an IllegalStateException is thrown if the schema
 doesn't match.
 
- getArray(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get an array value by field index; an IllegalStateException is thrown if the schema doesn't match.
 
- getArtifact(ArtifactApi.GetArtifactRequest, StreamObserver<ArtifactApi.ArtifactChunk>) - Method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactRetrievalService
 
-  
 
- getArtifact(ArtifactApi.GetArtifactRequest, StreamObserver<ArtifactApi.ArtifactChunk>) - Method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService
 
-  
 
- getAttempted() - Method in interface org.apache.beam.sdk.metrics.MetricResult
 
- 
Return the value of this metric across all attempts of executing all parts of the pipeline.
 
- getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
- 
Returns the given attribute value.
 
- getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
- 
Returns the full map of attributes.
 
- getAutoscalingAlgorithm() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
[Experimental] The autoscaling algorithm to use for the worker pool.
 
- getAwsCredentialsProvider() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
- 
The credential instance that should be used to authenticate against AWS services.
 
- getAwsRegion() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
- 
AWS region used by the AWS client.
 
- getAwsServiceEndpoint() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
- 
The AWS service endpoint used by the AWS client.
 
- getBacklogCheckTime() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
 
- 
The time at which the latest offset for the partition was fetched in order to calculate the backlog.
 
- getBagUserStateSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
- 
Get a mapping from PTransform id to user state input id to the bag user states that are used during execution.
 
 
- getBatchClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
-  
 
- getBatchDuration() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
-  
 
- getBatches() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
- 
Get the underlying queue representing the mock stream of micro-batches.
 
- getBatchIntervalMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getBeamRelInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
-  
 
- getBeamSqlTable() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
-  
 
- getBeamSqlUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
 
- 
 
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- getBoolean(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getBoolean(Map<String, Object>, String, Boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getBoolean() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getBoolean(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getBoolean(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Boolean value by field index; a ClassCastException is thrown if the schema doesn't match.
 
- getBootstrapServers() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- getBufferSize() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
-  
 
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.SimpleStageBundleFactory
 
-  
 
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
 
- 
Get a new bundle for processing the data in an executable stage.
 
 
- getBundleSize() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getByte() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getByte(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.BYTE value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getByte(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.BYTE value by field index; a ClassCastException is thrown if the schema doesn't match.
 
 
- getBytes(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getBytes(Map<String, Object>, String, byte[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getBytes() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
- 
Returns a newly-allocated byte[] representing this ByteKey.
 
 
- getBytes(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.BYTES value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getBytes(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getBytesPerOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
- 
Returns approximately how many bytes of data correspond to a single offset in this source.
 
- getCause() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
- 
Returns the reason that this WindowFn is invalid.
 
 
- getChannelFactory() - Method in class org.apache.beam.sdk.io.CompressedSource
 
-  
 
- getCheckpointDir() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getCheckpointDurationMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getCheckpointingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getCheckpointingMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getCheckpointMark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- getCheckpointMark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
 
- getCheckpointMarkCoder() - Method in class org.apache.beam.runners.gearpump.translators.io.ValuesSource
 
-  
 
- getCheckpointMarkCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
-  
 
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
-  
 
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.UnboundedSource
 
- 
Returns a Coder for encoding and decoding the checkpoints for this source.
 
 
- getCheckpointTimeoutMillis() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getClasses() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns a set of TypeDescriptors, one for each superclass (including this class).
 
 
- getClassName() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
 
- 
Gets the name of the Java class that this CloudObject represents.
 
- getClientConfiguration() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
- 
The client configuration instance that should be used to configure AWS service clients.
 
- getClientContext() - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- getCloningBehavior() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- getClosingBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- getClosure() - Method in class org.apache.beam.sdk.transforms.Contextful
 
- 
Returns the closure.
 
- getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.aws.sns.AwsClientsProvider
 
-  
 
- getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
 
-  
 
- getCmd() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.LaunchParams
 
-  
 
- getCodec(DestinationT) - Method in class org.apache.beam.sdk.io.DynamicAvroDestinations
 
- 
Return an AVRO codec for a given destination.
 
- getCoder() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
-  
 
- getCoder() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
-  
 
- getCoder(Class<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
Returns the Coder to use for values of the given class.
 
 
- getCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
Returns the Coder to use for values of the given type.
 
 
- getCoder(TypeDescriptor<OutputT>, TypeDescriptor<InputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
 
- getCoder(Class<? extends T>, Class<T>, Map<Type, ? extends Coder<?>>, TypeVariable<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
 
- getCoder() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
- 
Returns the coder used to encode/decode the intermediate values produced/consumed by the coding
 functions of this DelegateCoder.
 
- getCoder() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
- 
 
- getCoder() - Method in class org.apache.beam.sdk.values.PCollection
 
- 
Returns the Coder used by this PCollection to encode and decode the values stored in it.
 
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.AtomicCoder
 
- 
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters, in the same order they appear within the parameterized type's type signature.
 
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.Coder
 
- 
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters, in the same order they appear within the parameterized type's type signature.
 
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.CustomCoder
 
- 
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters, in the same order they appear within the parameterized type's type signature.
 
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.MapCoder
 
- 
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters, in the same order they appear within the parameterized type's type signature.
 
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.SnappyCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
-  
 
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
-  
 
- getCoderInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
 
- 
 
- getCoderInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
-  
 
- getCoderProvider() - Static method in class org.apache.beam.sdk.coders.AvroCoder
 
- 
 
- getCoderProvider() - Static method in class org.apache.beam.sdk.coders.SerializableCoder
 
- 
 
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
- 
 
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
- 
 
- getCoderProviders() - Method in interface org.apache.beam.sdk.coders.CoderProviderRegistrar
 
- 
 
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.io.aws.sns.SnsCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
 
-  
 
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hbase.HBaseCoderProviderRegistrar
 
-  
 
- getCoderRegistry() - Method in class org.apache.beam.sdk.Pipeline
 
- 
 
- getCoGbkResultSchema() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
- 
 
- getCollection() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
 
- 
Returns the underlying PCollection of this TaggedKeyedPCollection.
 
- getCollectionElementType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
-  
 
- getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- getCombineFn() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
-  
 
- getCombineFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
-  
 
- getCombineFn() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
-  
 
- getComment() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- getCommitted() - Method in interface org.apache.beam.sdk.metrics.MetricResult
 
- 
Return the value of this metric across all successfully completed parts of the pipeline.
 
- getCompletedTimers() - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate
 
-  
 
- getComponents() - Method in class org.apache.beam.sdk.coders.AtomicCoder
 
- 
Returns the list of Coders that are components of this Coder.
 
 
- getComponents() - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
- 
Returns the list of Coders that are components of this Coder.
 
 
- getComponents() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
- 
The hierarchical list of component paths making up the full path, starting with the top-level child component path.
 
- getComponents() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
-  
 
- getComponents() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- getComponents() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
-  
 
- getComponentType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns the component type if this type is an array type, otherwise returns null.
 
- getCompression() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
- 
 
- getComputeNumShards() - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- getConfigFile() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
-  
 
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
-  
 
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
-  
 
- getConnectStringPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
-  
 
- getContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
 
- 
Provides the container version that will be used for constructing harness image paths.
 
- getContent() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
- 
Returns the extracted text.
 
- getContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
-  
 
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
-  
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
- 
Deprecated.
  
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
- 
Return a trigger to use after a GroupByKey to preserve the intention of this trigger.
 
 
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
- 
 
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
 
-  
 
- getCorrectlyTypedResult(BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
 
-  
 
- getCorrelVariable(int) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionEnvironment
 
- 
Gets the value for a correlation variable.
 
- getCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
-  
 
- getCount() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
-  
 
- getCount() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
-  
 
- getCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
 
- 
Return the Counter that should be used for implementing the given metricName in this container.
 
 
- getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
-  
 
- getCounters() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
 
- 
Return the metric results for the counters that matched the filter.
 
- getCountryOfResidence() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- getCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.auth.CredentialFactory
 
-  
 
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
 
- 
Returns default GCP credentials, or null when credential retrieval fails.
 
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
-  
 
- getCredentialFactoryClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
- 
The class of the credential factory that should be created and used to create credentials.
 
- getCsvFormat() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
 
-  
 
- getCurrent() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- getCurrent() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
-  
 
- getCurrent() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
- 
Gets the current record from the delegate reader.
 
- getCurrent() - Method in class org.apache.beam.sdk.io.Source.Reader
 
- 
 
- getCurrentBlock() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
-  
 
- getCurrentBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
- 
 
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
-  
 
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
- 
Returns the largest offset such that starting to read from that offset includes the current
 block.
 
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
-  
 
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
- 
Returns the size of the current block in bytes as it is represented in the underlying file,
 if possible.
 
- getCurrentContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
 
- 
 
- getCurrentDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
-  
 
- getCurrentDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
 
- 
Returns the ResourceId that represents the current directory of this ResourceId.
 
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
-  
 
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
-  
 
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
- 
 
- getCurrentParent() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Gets the parent composite transform of the current transform, if one exists.
 
- getCurrentRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
 
- 
Returns the current record.
 
- getCurrentRecordId() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
Returns a unique identifier for the current record.
 
- getCurrentSource() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- getCurrentSource() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
-  
 
- getCurrentSource() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
- 
Returns a Source describing the same input that this Reader currently reads
 (including items already read).
 
- getCurrentSource() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
-  
 
- getCurrentSource() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- getCurrentSource() - Method in class org.apache.beam.sdk.io.Source.Reader
 
- 
Returns a Source describing the same input that this Reader currently reads
 (including items already read).
 
- getCurrentSource() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
 
- getCurrentTimestamp() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
- 
By default, returns the minimum possible timestamp.
 
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
-  
 
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.Source.Reader
 
- 
Returns the timestamp associated with the current data item.
 
- getCurrentTransform() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
-  
 
- getCustomerId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
-  
 
- getDanglingDataSets() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
-  
 
- getData() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getDataAsBytes() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getDatabaseAdminClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
-  
 
- getDatabaseClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
-  
 
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- getDataCoder() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
-  
 
- getDataflowClient() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
An instance of the Dataflow client.
 
- getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
Dataflow endpoint to use.
 
- getDataflowJobFile() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
The path to which the translated Dataflow job specification is written at job submission time.
 
- getDataflowOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
-  
 
- getDataflowRunnerInfo() - Static method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
 
- 
 
- getDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Gets the specified Dataset resource by dataset ID.
 
 
- getDataSetOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
-  
 
- getDatasetService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
 
- 
 
- getDataStreamOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
-  
 
- getDate() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getDateTime(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getDateTime(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- getDebuggee() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
- 
The Cloud Debugger debuggee to associate with.
 
- getDecimal() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getDecimal(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getDecimal(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a BigDecimal value by field index; a ClassCastException is thrown if the schema doesn't match.
 
- getDefault() - Static method in class org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter
 
-  
 
- getDefaultCoder(TypeDescriptor<?>, CoderRegistry) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
- 
Returns the default coder for a given type descriptor.
 
- getDefaultDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
Returns the default destination.
 
- getDefaultEnvironmentConfig() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- getDefaultEnvironmentType() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- getDefaultJavaEnvironmentUrl() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- getDefaultOutputCoder() - Method in class org.apache.beam.runners.gearpump.translators.io.ValuesSource
 
-  
 
- getDefaultOutputCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
-  
 
- getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Source
 
- 
 
- getDefaultOutputCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
-  
 
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
-  
 
- getDefaultOutputCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
-  
 
- getDefaultOutputCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
-  
 
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
 
- 
Returns the Coder to use by default for output OutputT values, or null if it cannot be inferred.
 
- getDefaultOutputCoder() - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
 
- getDefaultOutputCoder(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
 
- getDefaultOutputCoder(InputT, PCollection<T>) - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
 
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
-  
 
- getDefaultOverrides(boolean) - Static method in class org.apache.beam.runners.flink.FlinkTransformOverrides
 
-  
 
- getDefaultOverrides(boolean) - Static method in class org.apache.beam.runners.spark.SparkTransformOverrides
 
-  
 
- getDefaultSchema(CalciteConnection) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
-  
 
- getDefaultSdkHarnessLogLevel() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
- 
This option controls the default log level of all loggers without a log level override.
 
- getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
 
- 
Returns the default value that was specified.
 
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
-  
 
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
-  
 
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
-  
 
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
- 
Return a WindowMappingFn that returns the earliest window that contains the end of the main-input window.
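A short sketch of how the WindowMappingFn obtained from getDefaultWindowMappingFn maps a main-input window into a side-input window. The window sizes are hypothetical, and a Beam Java SDK plus Joda-Time on the classpath are assumed:

```java
import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
import org.apache.beam.sdk.transforms.windowing.WindowMappingFn;
import org.joda.time.Duration;
import org.joda.time.Instant;

public class SideInputWindowMapping {
  public static void main(String[] args) {
    // Hypothetical sliding windows: 10 minutes long, starting every 5 minutes.
    SlidingWindows windowFn =
        SlidingWindows.of(Duration.standardMinutes(10)).every(Duration.standardMinutes(5));
    WindowMappingFn<IntervalWindow> mapping = windowFn.getDefaultWindowMappingFn();

    // A 1-minute main-input window starting at the epoch.
    IntervalWindow mainWindow =
        new IntervalWindow(new Instant(0), Duration.standardMinutes(1));

    // The mapping picks the earliest sliding window containing the main window's end.
    IntervalWindow sideWindow = mapping.getSideInputWindow(mainWindow);
    System.out.println(sideWindow);
  }
}
```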
 
 
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Returns the default WindowMappingFn to use to map main input windows to side input windows.
 
 
- getDefaultWorkerLogLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
This option controls the default log level of all loggers without a log level override.
 
- getDelay() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
 
-  
 
- getDeletedTimers() - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate
 
-  
 
- getDescription() - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
Returns the field's description.
 
- getDestination(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
Returns an object that represents at a high level the destination being written to.
 
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
Return the user destination object for this writer.
 
- getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
- 
Returns an object that represents at a high level which table is being written to.
 
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
Returns the coder for DestinationT.
 
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
- 
Returns the coder for DestinationT.
 
- getDestinationFile(boolean, FileBasedSink.DynamicDestinations<?, DestinationT, ?>, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- getDictionary(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getDiskSizeGb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
Remote worker disk size, in gigabytes, or 0 to use the default size.
 
- getDistribution(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
 
- 
Return the Distribution that should be used for implementing the given metricName in this container.
 
 
- getDistributions() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
 
- 
Return the metric results for the distributions that matched the filter.
 
- getDouble() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getDouble(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getDouble(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getDumpHeapOnOOM() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
If true, save a heap dump before killing a thread or process which is GC thrashing
 or out of memory.
 
- getDynamicDestinations() - Method in class org.apache.beam.sdk.io.FileBasedSink
 
- 
 
- getEarlyTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- getElemCoder() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
-  
 
- getElementCoders() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
-  
 
- getElementCount() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
- 
The number of elements after which this trigger may fire.
 
- getElements() - Method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
 
-  
 
- getElements() - Method in class org.apache.beam.sdk.transforms.Create.Values
 
-  
 
- getEmbeddedCluster() - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- getEnableCloudDebugger() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
- 
Whether to enable the Cloud Debugger snapshot agent for the current job.
 
- getEnableMetrics() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getEnableSparkMetricSinks() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getEncodedElementByteSize(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- getEncodedElementByteSize(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
-  
 
- getEncodedElementByteSize(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
-  
 
- getEncodedElementByteSize(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.Coder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
-  
 
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
- 
Overridden to short-circuit the default StructuredCoder behavior of encoding and
 counting the bytes.
 
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
- 
Overridden to short-circuit the default StructuredCoder behavior of encoding and
 counting the bytes.
 
- getEncodedElementByteSize(String) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
- 
Returns the size in bytes of the encoded value using this coder.
 
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
-  
 
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
-  
 
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
-  
 
- getEncodedElementByteSize(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
-  
 
- getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
-  
 
- getEncodedElementByteSize(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
-  
 
- getEncodedElementByteSize(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
-  
 
- getEncodedElementByteSize(BigQueryInsertError) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
-  
 
- getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.Coder
 
- 
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.CollectionCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DurationCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.FloatCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.InstantCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.IterableCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ListCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.MapCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SetCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VoidCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
-  
 
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- getEncodedWindow() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
 
-  
 
- getEndKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
 
- getEndOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
- 
Returns the specified ending offset of the source.
 
- getEnv() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.LaunchParams
 
-  
 
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
-  
 
- getEnvironment() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
 
- 
Return the environment that the remote handles.
 
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
 
-  
 
- getError() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
-  
 
- getError() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
- 
Returns the parse error, if the file was parsed unsuccessfully.
 
- getErrorAsString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
- 
 
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
 
- 
An estimate of the total size (in bytes) of the data that would be read from this source.
 
- getEstimatedSizeBytes(CassandraIO.Read<T>) - Method in interface org.apache.beam.sdk.io.cassandra.CassandraService
 
- 
Returns an estimation of the size that could be read.
 
- getEstimatedSizeBytes(CassandraIO.Read<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
 
-  
 
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
-  
 
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
-  
 
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
-  
 
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
-  
 
- getEvents() - Method in class org.apache.beam.sdk.testing.TestStream
 
- 
 
- getExecutable() - Method in class org.apache.beam.runners.direct.WatermarkManager.FiredTimers
 
-  
 
- getExecutables() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
-  
 
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
-  
 
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
-  
 
- getExecutionRetryDelay() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getExecutorService() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
- 
The ExecutorService instance to use to create threads; can be overridden to specify an ExecutorService that is compatible with the user's environment.
 
- getExpectedAssertions() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
-  
 
- getExperiments() - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
 
-  
 
- getExplicitHashKey(byte[]) - Method in interface org.apache.beam.sdk.io.kinesis.KinesisPartitioner
 
-  
 
- getExpression(SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getExtensionHosts() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
-  
 
- getExtensionRegistry() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
- 
Returns the ExtensionRegistry listing all known Protocol Buffers extension messages to T registered with this ProtoCoder.
 
 
- getFailedInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
- 
 
- getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
- 
 
- getFailedMutations() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
-  
 
- getFanout() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
-  
 
- getField(int) - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Return a field by index.
 
- getField(String) - Method in class org.apache.beam.sdk.schemas.Schema
 
-  
 
- getFieldCount() - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Return the count of fields.
 
- getFieldCount() - Method in class org.apache.beam.sdk.values.Row
 
- 
Return the size of data fields.
 
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
-  
 
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithStorage
 
-  
 
- getFieldNames() - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Return the list of all field names.
 
- getFields() - Method in class org.apache.beam.sdk.schemas.Schema
 
-  
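The Schema accessors listed above (getField, getFieldCount, getFieldNames, getFields) can be sketched in a minimal example. The two-field schema and its field names are hypothetical, and a Beam Java SDK with the schemas package is assumed on the classpath:

```java
import org.apache.beam.sdk.schemas.Schema;

public class SchemaFieldAccess {
  public static void main(String[] args) {
    // Hypothetical two-field schema built with the Beam schema builder.
    Schema schema =
        Schema.builder().addInt32Field("id").addStringField("name").build();

    System.out.println(schema.getFieldCount());        // 2
    System.out.println(schema.getField(0).getName());  // id
    System.out.println(schema.getFieldNames());        // [id, name]
    // Fields can also be looked up by name.
    System.out.println(schema.getField("name").getType());
  }
}
```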
 
- getFileLocation() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
- 
Returns the absolute path to the input file.
 
- getFilename() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
-  
 
- getFilename(BoundedWindow, PaneInfo, int, int, Compression) - Method in interface org.apache.beam.sdk.io.FileIO.Write.FileNaming
 
- 
Generates the filename.
 
- getFilename() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
 
- 
Returns the name of the file or directory denoted by this ResourceId.
 
- getFilenamePolicy(DestinationT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
 
- getFileOrPatternSpec() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- getFileOrPatternSpecProvider() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- getFilePattern() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
-  
 
- getFilesToStage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
List of local files to make available to workers.
 
- getFilesToStage() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
- 
List of local files to make available to workers.
 
- getFilesToStage() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
- 
List of local files to make available to workers.
 
- getFilesToStage() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
- 
List of local files to make available to workers.
 
- getFlatJsonRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
- 
Loads rows from BigQuery into Rows with the given Schema.
 
 
- getFlinkMaster() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
- 
The url of the Flink JobManager on which to execute pipelines.
 
- getFloat() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getFloat(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.FLOAT value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getFloat(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getFn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
-  
 
- getFn() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
-  
 
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
 
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
- 
 
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
- 
 
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
-  
 
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
-  
 
- getFnApiEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
 
- 
Provides the FnAPI environment's major version number.
 
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
-  
 
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
- 
Returns a value in [0, 1] representing approximately what fraction of the current source this reader has read so far, or null if such an estimate is not available.
 
 
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- getFractionConsumed() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
 
- 
 
- getFractionOfBlockConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
 
- 
Returns the fraction of the block already consumed, if possible, as a value in [0, 1].
 
- getFrom() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
-  
 
- getFromRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
- 
Returns the fromRow conversion function.
 
- getFromRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Retrieve the function that converts a Row object to the specified type.
 
 
- getFromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Retrieve the function that converts a Row object to the specified type.
 
 
- getFromRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
 
- 
Returns the attached schema's fromRowFunction.
 
- getFullName(PTransform<?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Returns the full name of the currently being translated transform.
 
- getFunctionNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getFunctions(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getGapDuration() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
-  
 
- getGauge(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
 
- 
Return the Gauge that should be used for implementing the given metricName in this container.
 
 
- getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
-  
 
- getGauges() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
 
- 
Return the metric results for the gauges that matched the filter.
 
- getGcloudCancelCommand(DataflowPipelineOptions, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
-  
 
- getGcpCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
- 
The credential instance that should be used to authenticate against GCP services.
 
- getGcpTempLocation() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
- 
A GCS path for storing temporary files in GCP.
 
- getGcsEndpoint() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
- 
GCS endpoint to use.
 
- getGcsUploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
- 
The buffer size (in bytes) to use when uploading files to GCS.
 
- getGcsUtil() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
- 
The GcsUtil instance that should be used to communicate with Google Cloud Storage.
 
- getGetters(Class<?>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
- 
 
- getGetters(Class<?>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
-  
 
- getGetters() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
-  
 
- getGetterTarget() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
-  
 
- getGoogleApiTrace() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
 
- 
This option enables tracing of API calls to Google services used within the Apache Beam SDK.
 
- getHdfsConfiguration() - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
 
-  
 
- getHeaderAccessor() - Static method in class org.apache.beam.runners.fnexecution.GrpcContextHeaderAccessorProvider
 
-  
 
- getHeaders() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
-  
 
- getHighWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
-  
 
- getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
-  
 
- getHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- getId() - Method in class org.apache.beam.runners.flink.FlinkJobInvocation
 
-  
 
- getId() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
 
- 
Get an id used to represent this bundle.
 
- getId() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.ActiveBundle
 
- 
Returns an id used to represent this bundle.
 
- getId() - Method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation
 
-  
 
- getId() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
-  
 
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
-  
 
- getId() - Method in interface org.apache.beam.sdk.fn.IdGenerator
 
-  
 
- getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
-  
 
- getId() - Method in class org.apache.beam.sdk.values.TupleTag
 
- 
Returns the id of this TupleTag.
 
- getId() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
-  
 
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
- 
Get the id attribute.
 
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
Get the id attribute.
 
- getImplementor(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
-  
 
- getInboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
-  
 
- getIncompatibleGlobalWindowErrorMessage() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
 
- 
Returns the error message for not supported default values in Combine.globally().
 
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
-  
 
- getIndex() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- getIndex(TupleTag<?>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
- 
Returns the index for the given tuple tag if the tag is present in this schema, or -1 if it isn't.
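A minimal sketch of the getIndex lookup on a CoGbkResultSchema; the tag names are hypothetical and the Beam Java SDK is assumed on the classpath:

```java
import org.apache.beam.sdk.transforms.join.CoGbkResultSchema;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.beam.sdk.values.TupleTagList;

public class CoGbkIndexLookup {
  public static void main(String[] args) {
    // Hypothetical tags for two joined collections.
    TupleTag<String> emails = new TupleTag<>("emails");
    TupleTag<String> phones = new TupleTag<>("phones");
    CoGbkResultSchema schema =
        new CoGbkResultSchema(TupleTagList.of(emails).and(phones));

    // Indices follow the order of the tag list; unknown tags yield -1.
    System.out.println(schema.getIndex(emails));                    // 0
    System.out.println(schema.getIndex(phones));                    // 1
    System.out.println(schema.getIndex(new TupleTag<>("missing"))); // -1
  }
}
```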
 
- getIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
The zero-based index of this trigger firing that produced this pane.
 
- getInput(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
-  
 
- getInput() - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- getinputFormatClass() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getinputFormatKeyClass() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getinputFormatValueClass() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getInputReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
 
- 
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
 
 
- getInputReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.ActiveBundle
 
- 
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the remote environment.
 
 
- getInputRef() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlInputRefExpression
 
-  
 
- getInputs(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Returns the input of the currently being translated transform.
 
- getInputs() - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- getInputStream(PValue) - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- getInputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
 
- 
Returns a TypeDescriptor capturing what is known statically about the input type of this CombineFn instance's most-derived class.
 
 
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
 
- 
Returns a TypeDescriptor capturing what is known statically about the input type of this DoFn instance's most-derived class.
 
 
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.SimpleFunction
 
- 
 
- getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
 
- 
Returns the Coder of the values of the input to this transform.
 
- getInputWatermark() - Method in class org.apache.beam.runners.direct.WatermarkManager.TransformWatermarks
 
- 
Returns the input watermark of the AppliedPTransform.
 
- getInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
-  
 
- getInstance() - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
 
-  
 
- getInstance() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
-  
 
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- getInstructionId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
-  
 
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
 
-  
 
- getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
 
- 
 
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
 
-  
 
- getInt(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getInt(Map<String, Object>, String, Integer) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getInt16(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.INT16 value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getInt16(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getInt32(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.INT32 value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getInt32(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getInt64(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.INT64 value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getInt64(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
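The typed Row getters above can be sketched as follows. This is a minimal illustration, assuming the Schema.builder() and Row.withSchema() factory APIs from the same SDK; the field names are made up for the example.

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class RowGetterExample {
  public static void main(String[] args) {
    // Hypothetical schema with two numeric fields.
    Schema schema = Schema.builder()
        .addInt32Field("age")
        .addInt64Field("visits")
        .build();
    Row row = Row.withSchema(schema).addValues(42, 1000L).build();

    // Typed access by field name or by field index.
    int age = row.getInt32("age");
    long visits = row.getInt64(1);

    // Asking for a type the schema doesn't declare would throw:
    // row.getInt16("age");  // field is INT32, so IllegalStateException
  }
}
```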
- getInteger() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getInterface() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
-  
 
- getInterfaces() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns a set of TypeDescriptors, one for each interface implemented by this class.
 
 
- getJAXBClass() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
-  
 
- getJmsCorrelationID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsDeliveryMode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsDestination() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsExpiration() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsMessageID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsPriority() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsRedelivered() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsReplyTo() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJmsType() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getJob(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
 
- 
Gets the Dataflow Job with the given jobId.
 
- getJob() - Method in exception org.apache.beam.runners.dataflow.DataflowJobException
 
- 
Returns the failed job.
 
- getJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
-  
 
- getJob(JobReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
 
- 
 
- getJobEndpoint() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- getJobId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
- 
Get the id of this job.
 
- getJobId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
- 
The identity of the Dataflow job.
 
- getJobId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
-  
 
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
-  
 
- getJobInfo() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
 
-  
 
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
-  
 
- getJobMessages(String, long) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
- 
Return job messages sorted in ascending order by timestamp.
 
- getJobMetrics(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
 
- 
Gets the JobMetrics with the given jobId.
 
- getJobMonitoringPageURL(String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
- 
 
- getJobMonitoringPageURL(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
-  
 
- getJobName() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- getJobServerConfig() - Method in interface org.apache.beam.runners.reference.testing.TestPortablePipelineOptions
 
-  
 
- getJobServerDriver() - Method in interface org.apache.beam.runners.reference.testing.TestPortablePipelineOptions
 
-  
 
- getJobService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
 
- 
 
- getJsonTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- getKey() - Method in class org.apache.beam.runners.direct.WatermarkManager.FiredTimers
 
-  
 
- getKey() - Method in interface org.apache.beam.runners.local.Bundle
 
- 
Returns the key that was output in the most recent GroupByKey in the execution of this
 bundle.
 
- getKey() - Method in class org.apache.beam.runners.local.StructuralKey
 
- 
 
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
-  
 
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
The key for the display item.
 
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
The key for the display item.
 
- getKey() - Method in class org.apache.beam.sdk.values.KV
 
- 
Returns the key of this KV.
 
 
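KV.getKey() and its companion getValue() can be shown in a two-line sketch:

```java
import org.apache.beam.sdk.values.KV;

public class KvExample {
  public static void main(String[] args) {
    // An immutable key/value pair.
    KV<String, Integer> kv = KV.of("user-1", 7);
    String key = kv.getKey();      // "user-1"
    Integer value = kv.getValue(); // 7
  }
}
```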
- getKey() - Method in class org.apache.beam.sdk.values.ShardedKey
 
-  
 
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
 
-  
 
- getKeyCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
 
- 
Returns the Coder of the keys of the input to this transform, which is also used as the
 Coder of the keys of the output of this transform.
 
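The relationship between GroupByKey's input and output coders can be sketched as below; a rough illustration assuming these static helpers are callable with a KvCoder built from the standard atomic coders.

```java
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarIntCoder;
import org.apache.beam.sdk.transforms.GroupByKey;
import org.apache.beam.sdk.values.KV;

public class GbkCoderExample {
  public static void main(String[] args) {
    Coder<KV<String, Integer>> inputCoder =
        KvCoder.of(StringUtf8Coder.of(), VarIntCoder.of());

    // The key coder is shared between the input and the output.
    Coder<String> keyCoder = GroupByKey.getKeyCoder(inputCoder);

    // The output coder keeps the key coder and wraps the values
    // in an iterable coder, matching KV<K, Iterable<V>>.
    Coder<KV<String, Iterable<Integer>>> outputCoder =
        GroupByKey.getOutputKvCoder(inputCoder);
  }
}
```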
- getKeyCoder() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
- 
 
- getKeyedCollections() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
- 
 
- getKeyRange() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
- 
Returns the range of keys that will be read from the table.
 
- getKeyRanges() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
Returns the range of keys that will be read from the table.
 
- getKeySet() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- getKeystorePassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- getKeystorePath() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- getKeyTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getKeyTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getKind() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
 
- 
Return the display name for this factory.
 
- getKindString() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Bounded
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
Returns the name to use by default for this PTransform (not including the names of any
 enclosing PTransforms).
 
- getKindString() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
-  
 
- getKindString() - Method in class org.apache.beam.sdk.values.PValueBase
 
- 
Returns a String capturing the kind of this PValueBase.
 
 
- getKinesisClient() - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
 
-  
 
- getKV() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
-  
 
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
Retrieve the optional label for an item.
 
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
The optional label for an item.
 
- getLabels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
- 
Labels that will be applied to the billing records for this job.
 
- getLastEmitted() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
 
- 
Returns the last value emitted by the reader.
 
- getLastWatermarkedBatchTime() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
-  
 
- getLatencyTrackingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getLateTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- getLegacyEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
 
- 
Provides the legacy environment's major version number.
 
- getLimitCountOfSortRel() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
-  
 
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
Retrieve the optional link URL for an item.
 
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
The optional link URL for an item.
 
- getListeners() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
-  
 
- getListOfMaps(Map<String, Object>, String, List<Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- getLocalRef(int) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionEnvironment
 
- 
Gets the value for a local variable reference.
 
- getLocalValue() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
-  
 
- getLocation() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- getLong(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getLong(Map<String, Object>, String, Long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getLong() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getLowWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
-  
 
- getMainOutputTag() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
-  
 
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
-  
 
- getMainTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
- 
The main trigger, which will continue firing until the "until" trigger fires.
 
- getManifest(ArtifactApi.GetManifestRequest, StreamObserver<ArtifactApi.GetManifestResponse>) - Method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactRetrievalService
 
-  
 
- getManifest(ArtifactApi.GetManifestRequest, StreamObserver<ArtifactApi.GetManifestResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService
 
-  
 
- getMap(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a MAP value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
- getMap(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a MAP value by field index; an IllegalStateException is thrown if the schema doesn't
 match.
 
- getMapKeyType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
-  
 
- getMapValueType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
-  
 
- getMaterialization() - Method in class org.apache.beam.sdk.transforms.ViewFn
 
- 
Gets the materialization of this ViewFn.
 
 
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
 
-  
 
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
 
-  
 
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
 
-  
 
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
 
-  
 
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
 
-  
 
- getMax() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
-  
 
- getMaxBundleSize() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getMaxBundleTimeMills() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getMaxConditionCost() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
- 
The maximum cost (as a ratio of CPU time) allowed for evaluating conditional snapshots.
 
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
- 
Returns the actual ending offset of the current source.
 
- getMaxNumericPrecision() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
-  
 
- getMaxNumericScale() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
-  
 
- getMaxNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
The maximum number of workers to use for the worker pool.
 
- getMaxRecordsPerBatch() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getMean() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
-  
 
- getMean() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
-  
 
- getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
 
- 
Returns the configured size of the memory buffer.
 
- getMessage() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
-  
 
- getMessage() - Method in exception org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
 
-  
 
- getMessageBacklog() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
 
- 
Current backlog in messages (latest offset of the partition - last processed record offset).
 
- getMessages() - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
-  
 
- getMessageStream(JobApi.JobMessagesRequest, StreamObserver<JobApi.JobMessagesResponse>) - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- getMessageStream(JobApi.JobMessagesRequest, StreamObserver<JobApi.JobMessagesResponse>) - Method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
-  
 
- getMessageType() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
- 
Returns the Protocol Buffers Message type this ProtoCoder supports.
 
 
- getMetadata(DestinationT) - Method in class org.apache.beam.sdk.io.DynamicAvroDestinations
 
- 
Return AVRO file metadata for a given destination.
 
- getMetadata() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
- 
Returns the MatchResult.Metadata of the file.
 
- getMetadata() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
- 
Returns the extracted metadata.
 
- getMetadata() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
Returns optional extra metadata.
 
- getMetaStore() - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
 
-  
 
- getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
-  
 
- getMetricsHttpSinkUrl() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- getMetricsPushPeriod() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- getMetricsSink() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- getMimeType() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
- 
Deprecated.
  
- getMimeType() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
 
- 
Returns the MIME type that should be used for the files that will hold the output data.
 
- getMin() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
-  
 
- getMinBundleSize() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
- 
Returns the minimum bundle size that should be used when splitting the source into sub-sources.
 
- getMinCpuPlatform() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
Specifies a Minimum CPU platform for VM instances.
 
- getMinimumTimestamp() - Method in interface org.apache.beam.runners.local.Bundle
 
- 
Return the minimum timestamp among elements in this bundle.
 
- getMinPauseBetweenCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getMinReadTimeMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getMode() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- getMode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- getMonthOfYear() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- getMutableOutput(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- getName() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- getName() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
-  
 
- getName() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
-  
 
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
-  
 
- getName() - Method in interface org.apache.beam.sdk.metrics.Metric
 
- 
 
- getName() - Method in class org.apache.beam.sdk.metrics.MetricName
 
- 
The name of this metric.
 
- getName() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
- 
 
- getName() - Method in interface org.apache.beam.sdk.metrics.MetricResult
 
- 
Return the name of the metric.
 
- getName() - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
Returns the field name.
 
- getName() - Method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference.TypeInformation
 
-  
 
- getName() - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
Returns the transform name.
 
- getName() - Method in class org.apache.beam.sdk.values.PCollection
 
- 
 
- getName() - Method in interface org.apache.beam.sdk.values.PValue
 
- 
Returns the name of this PValue.
 
 
- getName() - Method in class org.apache.beam.sdk.values.PValueBase
 
- 
 
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
-  
 
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
-  
 
- getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricName
 
- 
The namespace associated with this metric.
 
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
- 
 
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
-  
 
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
The namespace for the display item.
 
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
The namespace for the display item.
 
- getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
-  
 
- getNetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
 
- getNextOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
-  
 
- getNonSpeculativeIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
The zero-based index of this trigger firing among non-speculative panes.
 
- getNullable() - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
Returns whether the field supports null values.
 
- getNum() - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
 
-  
 
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- getNumberOfExecutionRetries() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getNumberOfWorkerHarnessThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
Number of threads to use on the Dataflow worker harness.
 
- getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- getNumShardsProvider() - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- getNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
Number of workers to use when executing the Dataflow job.
 
- getObject(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getObjectReuse() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
-  
 
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
-  
 
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
-  
 
- getOldestPendingTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
-  
 
- getOnCreateMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- getOnly(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
- 
If there is a singleton value for the given tag, returns it.
 
- getOnly(TupleTag<V>, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
- 
If there is a singleton value for the given tag, returns it.
 
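The two getOnly overloads above can be sketched as follows. This assumes the CoGbkResult.of(tag, values) factory for building a result directly; in a real pipeline the CoGbkResult would come from a CoGroupByKey.

```java
import java.util.Collections;
import org.apache.beam.sdk.transforms.join.CoGbkResult;
import org.apache.beam.sdk.values.TupleTag;

public class GetOnlyExample {
  public static void main(String[] args) {
    TupleTag<String> emailTag = new TupleTag<>();

    // A result holding exactly one value for the tag.
    CoGbkResult one =
        CoGbkResult.of(emailTag, Collections.singletonList("a@example.com"));
    String email = one.getOnly(emailTag); // the singleton value

    // An empty result falls back to the supplied default.
    CoGbkResult none = CoGbkResult.of(emailTag, Collections.<String>emptyList());
    String fallback = none.getOnly(emailTag, "unknown"); // "unknown"
  }
}
```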
- getOnSuccessMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- getOnTimeBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
 
-  
 
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
 
-  
 
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
 
-  
 
- getOperands() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
 
-  
 
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
 
-  
 
- getOptions() - Method in class org.apache.beam.sdk.Pipeline
 
-  
 
- getOptions() - Method in class org.apache.beam.sdk.testing.TestPipeline
 
-  
 
- getOptionsId() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
- 
Provides a process-wide unique ID for this PipelineOptions object, assigned at graph
 construction time.
 
 
- getOrCreateReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
-  
 
- getOriginalWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
- 
Returns the original windowFn that this InvalidWindows replaced.
 
- getOutboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
-  
 
- getOutName(int) - Method in class org.apache.beam.sdk.values.TupleTag
 
- 
If this TupleTag is tagging output outputIndex of a PTransform, returns
 the name that should be used by default for the output.
 
- getOutput(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
-  
 
- getOutput() - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
-  
 
- getOutputCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
-  
 
- getOutputCoder(SerializableFunction<InputT, OutputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.AvroSource
 
-  
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.CompressedSource
 
- 
Returns the delegate source's output coder.
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
-  
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
-  
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
-  
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
-  
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.Source
 
- 
Returns the Coder to use for the data read from this source.
 
- getOutputCoder() - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
-  
 
- getOutputFile() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
-  
 
- getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
 
- 
Returns the Coder of the output of this transform.
 
- getOutputs(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Returns the output of the currently being translated transform.
 
- getOutputs() - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- getOutputStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
 
- getOutputTargetCoders() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
- 
 
- getOutputTime(Instant, GlobalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- getOutputTime(Instant, W) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
-  
 
- getOutputTime(Instant, W) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
-  
 
- getOutputTime(Instant, IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
- 
Ensures that later sliding windows have an output time that is past the end of earlier windows.
 
- getOutputTime(Instant, W) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Returns the output timestamp to use for data depending on the given inputTimestamp in
 the specified window.
 
- getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- getOutputType() - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlOperator
 
-  
 
- getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getOutputType() - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators.StringOperator
 
-  
 
- getOutputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
 
- 
Returns a TypeDescriptor capturing what is known statically about the output type of
 this CombineFn instance's most-derived class.
 
 
- getOutputTypeDescriptor() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
-  
 
- getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
 
- 
Returns a TypeDescriptor capturing what is known statically about the output type of
 this DoFn instance's most-derived class.
 
 
- getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.SimpleFunction
 
- 
 
- getOutputWatermark() - Method in class org.apache.beam.runners.direct.WatermarkManager.TransformWatermarks
 
- 
Returns the output watermark of the AppliedPTransform.
 
- getOverrides() - Method in class org.apache.beam.runners.apex.ApexRunner
 
-  
 
- getOverrideWindmillBinary() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
Custom windmill_main binary to use with the streaming runner.
 
- getPane() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
 
- 
Returns the pane of this ValueInSingleWindow in its window.
 
- getPaneInfo() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- getParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getParallelism() - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
-  
 
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
-  
 
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
-  
 
- getPartitionKey(byte[]) - Method in interface org.apache.beam.sdk.io.kinesis.KinesisPartitioner
 
-  
 
- getPartitionKey() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
-  
 
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
-  
 
- getPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
-  
 
- getPassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
-  
 
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
-  
 
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
-  
 
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
-  
 
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
The path for the display item within a component hierarchy.
 
- getPathValidator() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
- 
The path validator instance that should be used to validate paths.
 
- getPathValidatorClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
- 
The class of the validator that should be created and used to validate paths.
 
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
- 
Returns the main PubSub message.
 
- getPayload() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getPCollection() - Method in interface org.apache.beam.runners.local.Bundle
 
- 
Returns the PCollection that the elements of this bundle belong to.
 
- getPCollection() - Method in interface org.apache.beam.sdk.values.PCollectionView
 
- 
For internal use only.
 
- getPCollection() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
-  
 
- getPCollectionInputs() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
-  
 
- getPCollectionInputs() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
 
-  
 
- getPerDestinationOutputFilenames() - Method in class org.apache.beam.sdk.io.WriteFilesResult
 
- 
Returns a PCollection of all output filenames generated by this WriteFiles, organized by user destination type.
 
 
- getPerElementConsumers(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
-  
 
- getPerElementInputs(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
-  
 
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.io.WriteFilesResult
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.values.PBegin
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionList
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
-  
 
- getPipeline() - Method in class org.apache.beam.sdk.values.PDone
 
-  
 
- getPipeline() - Method in interface org.apache.beam.sdk.values.PInput
 
- 
 
- getPipeline() - Method in interface org.apache.beam.sdk.values.POutput
 
- 
 
- getPipeline() - Method in class org.apache.beam.sdk.values.PValueBase
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.apex.ApexRunnerRegistrar.Options
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
 
-  
 
- getPipelineOptions() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Returns the configured pipeline options.
 
- getPipelineOptions() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Options
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
 
-  
 
- getPipelineOptions() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunner
 
- 
For testing.
 
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.TestFlinkRunner
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.gearpump.GearpumpRunnerRegistrar.Options
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.reference.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
 
-  
 
- getPipelineOptions() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
 
- 
Perform a DFS (depth-first search) to find the PipelineOptions config.
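
The depth-first search described for BeamRelNode.getPipelineOptions() can be sketched in plain Java. This is a hypothetical illustration of the traversal only — the Node type, its fields, and findOptions are made up for this sketch and are not Beam's actual API:

```java
import java.util.List;

// Illustrative sketch: walk a relational-node tree depth-first and
// return the first non-null options object found. Node is a stand-in
// for Beam's actual rel-node classes.
public class PipelineOptionsSearch {
    static class Node {
        Object options;        // non-null only on the node holding the config
        List<Node> inputs;     // child nodes to search

        Node(Object options, List<Node> inputs) {
            this.options = options;
            this.inputs = inputs;
        }
    }

    static Object findOptions(Node node) {
        if (node.options != null) {
            return node.options;               // config found on this node
        }
        for (Node input : node.inputs) {
            Object found = findOptions(input); // recurse depth-first
            if (found != null) {
                return found;
            }
        }
        return null;                           // nothing in this subtree
    }
}
```
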
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws.options.AwsPipelineOptionsRegistrar
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
 
-  
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
 
-  
 
- getPipelineOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptionsRegistrar
 
-  
 
- getPipelineOptions() - Method in interface org.apache.beam.sdk.state.StateContext
 
- 
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
 
- 
Returns the PipelineOptions specified with the PipelineRunner invoking this KeyedCombineFn.
 
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
 
- 
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
 
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
 
- 
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
 
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
- 
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
 
 
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
-  
 
- getPipelineRunners() - Method in class org.apache.beam.runners.apex.ApexRunnerRegistrar.Runner
 
-  
 
- getPipelineRunners() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
 
-  
 
- getPipelineRunners() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Runner
 
-  
 
- getPipelineRunners() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
 
-  
 
- getPipelineRunners() - Method in class org.apache.beam.runners.gearpump.GearpumpRunnerRegistrar.Runner
 
-  
 
- getPipelineRunners() - Method in class org.apache.beam.runners.reference.PortableRunnerRegistrar
 
-  
 
- getPipelineRunners() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
 
-  
 
- getPipelineUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
- 
The URL of the staged portable pipeline.
 
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
-  
 
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
-  
 
- getPositionForFractionConsumed(double) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
- 
Returns a position P such that the range [start, P) represents approximately
 the given fraction of the range [start, end).
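
The contract above is simple interpolation over the offset range. The sketch below is a back-of-the-envelope illustration of that arithmetic under the documented contract, not Beam's actual OffsetRangeTracker implementation (the method name is hypothetical):

```java
// Given a range [start, end) and a fraction in [0, 1], return a
// position P such that [start, P) covers approximately that fraction
// of the range. Rounding up means a non-zero fraction always claims
// at least one whole offset.
public class OffsetInterpolation {
    public static long positionForFraction(long start, long end, double fraction) {
        return (long) Math.ceil(start + fraction * (end - start));
    }
}
```
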
 
- getProcessBundleDescriptor() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
-  
 
- getProcessingTimeAdvance() - Method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
 
-  
 
- getProcessor(BeamFnApi.ProcessBundleDescriptor, Map<String, RemoteInputDestination<WindowedValue<?>>>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
 
- 
Provides a SdkHarnessClient.BundleProcessor that is capable of processing bundles not containing any state accesses such as: side inputs, user state, and remote references.
 
 
 
- getProcessor(BeamFnApi.ProcessBundleDescriptor, Map<String, RemoteInputDestination<WindowedValue<?>>>, StateDelegator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
 
- 
 
- getProduced(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
-  
 
- getProducer(PValue) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
- 
Get the AppliedPTransform that produced the provided PValue.
 
 
- getProducer(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
-  
 
- getProfilingAgentConfiguration() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
-  
 
- getProject() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- getProject() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
- 
Project id to use when launching jobs.
 
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
Get the project path.
 
- getProjectId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
- 
Get the project this job exists in.
 
- getProjectId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
-  
 
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- getProperties() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
 
-  
 
- getProperties() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- getProperties() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
-  
 
- getProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
-  
 
- getProviderRuntimeValues() - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
 
-  
 
- getProvisionInfo(ProvisionApi.GetProvisionInfoRequest, StreamObserver<ProvisionApi.GetProvisionInfoResponse>) - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
 
-  
 
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
-  
 
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
 
-  
 
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
 
- 
Root URL for use with the Google Cloud Pub/Sub API.
 
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- getRange() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
- 
Returns the current range.
 
- getRawType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns the Class underlying the Type represented by this TypeDescriptor.
 
 
- getReadDurationMillis() - Method in class org.apache.beam.runners.spark.io.SparkUnboundedSource.Metadata
 
-  
 
- getReadTime() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getReadTimePercentage() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getReason() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
-  
 
- getReasons() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
-  
 
- getReceiver() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
-  
 
- getRecord() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
-  
 
- getRecordType() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
-  
 
- getRegion() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
- 
Get the region this job exists in.
 
- getRegion() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
- 
The Google Compute Engine region for creating Dataflow jobs.
 
 
- getRegisteredOptions() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
-  
 
- getRegistrationFuture() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor
 
-  
 
- getRemoteInputDestinations() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
- 
 
- getRepeatedTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
-  
 
- getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
- 
 
- getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
-  
 
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollection<OutputT>, ParDo.SingleOutput<InputT, OutputT>>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
 
-  
 
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollectionTuple, PTransform<PCollection<? extends InputT>, PCollectionTuple>>) - Method in class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
 
-  
 
- getRequirements() - Method in class org.apache.beam.sdk.transforms.Contextful
 
- 
Returns the requirements needed to run the closure.
 
- getRetainDockerContainers() - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
 
-  
 
- getRetainExternalizedCheckpointsOnCancellation() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
 
-  
 
- getRootCause() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
- 
 
- getRootTransforms() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
 
-  
 
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
-  
 
- getRow(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Schema.TypeName.ROW value by field name; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getRow(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a Row value by field index; an IllegalStateException is thrown if the schema doesn't match.
 
 
- getRowReceiver(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
 
- 
 
- getRowSchema() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
-  
 
- getRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
 
-  
 
- getRunMillis() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- getRunner() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
- 
The pipeline runner that will be used to execute the pipeline.
 
- getS3ClientFactoryClass() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- getS3StorageClass() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- getS3ThreadPoolSize() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- getS3UploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- getSaveHeapDumpsToGcsPath() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
CAUTION: This option implies dumpHeapOnOOM, and has similar caveats.
 
- getSaveProfilesToGcs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
-  
 
- getScan() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
-  
 
- getScan() - Method in class org.apache.beam.sdk.io.hbase.HBaseQuery
 
-  
 
- getSchema() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
- 
Returns the schema used by this coder.
 
- getSchema() - Method in class org.apache.beam.sdk.coders.RowCoder
 
-  
 
- getSchema() - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlTable
 
- 
Get the schema info of the table.
 
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable
 
-  
 
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.DynamicAvroDestinations
 
- 
Return an AVRO schema for a given destination.
 
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
- 
Returns the table schema for the destination.
 
- getSchema() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
- 
Returns the schema associated with this type.
 
- getSchema(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Get a schema for a given Class type.
 
- getSchema(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
 
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
-  
 
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
- 
 
- getSchema() - Method in class org.apache.beam.sdk.values.PCollection
 
- 
Returns the attached schema.
 
- getSchema() - Method in class org.apache.beam.sdk.values.Row
 
- 
Return the Schema which describes the fields.
 
 
- getSchemaProviders() - Method in class org.apache.beam.sdk.schemas.DefaultSchema.DefaultSchemaProviderRegistrar
 
-  
 
- getSchemaProviders() - Method in interface org.apache.beam.sdk.schemas.SchemaProviderRegistrar
 
- 
 
- getSchemaRegistry() - Method in class org.apache.beam.sdk.Pipeline
 
-  
 
- getScheme() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
-  
 
- getScheme() - Method in class org.apache.beam.sdk.io.FileSystem
 
- 
Get the URI scheme which defines the namespace of the FileSystem.
 
 
- getScheme() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
 
- 
Get the scheme which defines the namespace of the ResourceId.
 
 
- getSdkComponents() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
-  
 
- getSdkHarnessLogLevelOverrides() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
- 
This option controls the log levels for specifically named loggers.
 
- getSdkWorkerId() - Method in interface org.apache.beam.runners.fnexecution.HeaderAccessor
 
- 
This method should be called from the request method.
 
- getSdkWorkerParallelism() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getSerializableFunctionUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
 
- 
 
- getSerializers() - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- getServer() - Method in class org.apache.beam.runners.fnexecution.GrpcFnServer
 
- 
 
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
 
-  
 
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
 
-  
 
- getServerFactory() - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
 
- 
 
- getService() - Method in class org.apache.beam.runners.fnexecution.GrpcFnServer
 
- 
 
- getServiceAccount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
- 
Run the job as a specific service account, instead of the default GCE robot.
 
- getSetters(Class<?>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
- 
 
- getSetters(Class<?>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
-  
 
- getSetTimers() - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate
 
-  
 
- getShard() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- getShardId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getShardNumber() - Method in class org.apache.beam.sdk.values.ShardedKey
 
-  
 
- getShort() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
Return the optional short value for an item, or null if none is provided.
 
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
The optional short value for an item, or null if none is provided.
 
- getSideInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
-  
 
- getSideInputs(ExecutableStage) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
 
-  
 
- getSideInputs() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
Override to specify that this object needs access to one or more side inputs.
 
- getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
- 
Specifies that this object needs access to one or more side inputs.
 
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
Returns the side inputs used by this Combine operation.
 
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
-  
 
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
- 
Returns the side inputs used by this Combine operation.
 
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
-  
 
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
-  
 
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Requirements
 
- 
The side inputs that this Contextful needs access to.
 
 
- getSideInputSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
- 
Get a mapping from PTransform id to side input id to side inputs that are used during execution.
 
 
- getSideInputWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
 
- 
Returns the window of the side input corresponding to the given window of the main input.
 
- getSingleFileMetadata() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
- 
Returns the information about the single file that this source is reading from.
 
- getSink() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
 
- 
Sink for control clients.
 
- getSink() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
 
-  
 
- getSink() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
Returns the FileBasedSink for this write operation.
 
- getSink() - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
-  
 
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- getSource() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
 
- 
Source of control clients.
 
- getSource() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
 
-  
 
- getSource() - Method in class org.apache.beam.sdk.io.Read.Bounded
 
- 
Returns the BoundedSource used to create this Read PTransform.
 
- getSource() - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
- 
Returns the UnboundedSource used to create this Read PTransform.
 
- getSource() - Method in class org.apache.beam.sdk.io.TextIO.Read
 
-  
 
- getSource() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
-  
 
- getSourceStream(DataSource) - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- getSparkMaster() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getSplit() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableSplit
 
-  
 
- getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
Returns the size of the backlog of unread data in the underlying data source represented by
 this split of this source.
 
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
- 
 
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
-  
 
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- getSplitPointsProcessed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
- 
Returns the total number of split points that have been processed.
 
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
-  
 
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
- 
 
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
-  
 
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- getSSEAlgorithm() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- getSSEAwsKeyManagementParams() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- getSSECustomerKey() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- getStableUniqueNames() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
- 
Whether to check for stable unique names on each transform.
 
- getStager() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
The resource stager instance that should be used to stage resources.
 
- getStagerClass() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
The class responsible for staging resources to be accessible by workers during job execution.
 
- getStagingLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
- 
GCS path for staging local files, e.g.
 
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- getStartKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
 
- getStartOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
- 
Returns the starting offset of the source.
 
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- getStartPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
 
- 
Returns the starting position of the current range, inclusive.
 
- getStartTime() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
 
- 
Returns the time the reader was started.
 
- getState() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
-  
 
- getState() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
-  
 
- getState() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
-  
 
- getState() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
-  
 
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.GetJobStateResponse>) - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- getState() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
-  
 
- getState() - Method in class org.apache.beam.runners.flink.FlinkJobInvocation
 
-  
 
- getState() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
-  
 
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.GetJobStateResponse>) - Method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
-  
 
- getState() - Method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation
 
- 
Retrieve the job's current state.
 
- getState() - Method in class org.apache.beam.runners.gearpump.GearpumpPipelineResult
 
-  
 
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.GetJobStateResponse>) - Method in class org.apache.beam.runners.reference.testing.TestJobService
 
-  
 
- getState() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- getState() - Method in interface org.apache.beam.sdk.PipelineResult
 
- 
Retrieves the current state of the pipeline execution.
 
- getStateBackend() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
- 
State backend to store Beam's state during computation.
 
- getStateCoder() - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
 
- 
 
- getStateStream(JobApi.GetJobStateRequest, StreamObserver<JobApi.GetJobStateResponse>) - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- getStateStream(JobApi.GetJobStateRequest, StreamObserver<JobApi.GetJobStateResponse>) - Method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
-  
 
- getStep() - Method in interface org.apache.beam.sdk.metrics.MetricResult
 
- 
Return the step context to which this metric result applies.
 
- getStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
- 
Returns the mapping of AppliedPTransforms to the internal step name for that AppliedPTransform.
 
- getStopPipelineWatermark() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
-  
 
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- getStopPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
 
- 
Returns the ending position of the current range, exclusive.
 
- getStorageLevel() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getStreamName() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getString(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getString() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getString(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
 
- getString(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a String value by field index; a ClassCastException is thrown if the schema
 doesn't match.
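The cast-on-access contract described above can be sketched in plain Java — a hypothetical stand-in for Row's typed getters, not Beam's implementation: values are stored untyped and cast when read, so a type mismatch surfaces as a ClassCastException.

```java
import java.util.Arrays;
import java.util.List;

public class RowGetStringSketch {
  // Stand-in for Row.getString(int): the stored value is cast on access.
  public static String getString(List<Object> values, int fieldIndex) {
    return (String) values.get(fieldIndex);  // throws ClassCastException on mismatch
  }

  public static void main(String[] args) {
    List<Object> row = Arrays.asList("alice", 42);
    System.out.println(getString(row, 0));  // alice
    try {
      getString(row, 1);                    // field 1 holds an Integer, not a String
    } catch (ClassCastException e) {
      System.out.println("ClassCastException");
    }
  }
}
```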
 
- getStrings(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
-  
 
- getSubnetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
 
- getSubSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getSubSchemaNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
Get the subscription being read from.
 
- getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
 
- getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getSuggestedFilenameSuffix() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
- 
Deprecated.
  
- getSuggestedFilenameSuffix() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
 
-  
 
- getSuggestedSuffix() - Method in enum org.apache.beam.sdk.io.Compression
 
-  
 
- getSum() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
-  
 
- getSum() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
-  
 
- getSumAndReset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
-  
 
- getSupertype(Class<? super T>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns the generic form of a supertype.
 
- getSupportedClass() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
 
- 
 
- getSynchronizedProcessingInputTime() - Method in class org.apache.beam.runners.direct.WatermarkManager.TransformWatermarks
 
- 
Returns the synchronized processing input time of the AppliedPTransform.
 
- getSynchronizedProcessingOutputTime() - Method in class org.apache.beam.runners.direct.WatermarkManager.TransformWatermarks
 
- 
Returns the synchronized processing output time of the AppliedPTransform.
 
- getSynchronizedProcessingOutputWatermark() - Method in interface org.apache.beam.runners.local.Bundle
 
- 
Returns the processing time output watermark at the time the producing Executable
 committed this bundle.
 
- getSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
-  
 
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
-  
 
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
Returns the table to read, or null if reading from a query instead.
 
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Returns the table reference, or null.
 
- getTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Gets the specified 
Table resource by table ID.
 
 
- getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
- 
 
- getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
Returns the table being read from.
 
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
-  
 
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
-  
 
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseQuery
 
-  
 
- getTableNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getTableProvider() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
Returns the table to read, or null if reading from a query instead.
 
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
-  
 
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
-  
 
- getTables() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
 
- 
Get all tables from this provider.
 
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
-  
 
- getTableSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
-  
 
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
 
-  
 
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
 
-  
 
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonTableProvider
 
-  
 
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
-  
 
- getTableType() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
 
- 
Gets the table type this provider handles.
 
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
 
-  
 
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
-  
 
- getTag(int) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
- 
Returns the tuple tag at the given index.
 
- getTag() - Method in class org.apache.beam.sdk.values.TaggedPValue
 
- 
Returns the local tag associated with the 
PValue.
 
 
- getTagInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
 
- 
 
- getTagInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
- 
 
- getTagsToSideInputs(Collection<PCollectionView<?>>) - Static method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils
 
-  
 
- getTarget() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
-  
 
- getTarget() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
-  
 
- getTargetDataset() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
-  
 
- getTargetParallelism() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
-  
 
- getTempDirectoryProvider() - Method in class org.apache.beam.sdk.io.FileBasedSink
 
- 
 
- getTempFilename() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- getTemplateLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
- 
Where the runner should generate a template file.
 
- getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
 
- 
Returns the configured temporary location.
 
- getTempLocation() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
- 
A pipeline level default location for storing temporary files.
 
- getTempRoot() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- GetterBasedSchemaProvider - Class in org.apache.beam.sdk.schemas
 
- 
 
- GetterBasedSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
 
-  
 
- getTestTimeoutSeconds() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- getTimeDomain() - Method in interface org.apache.beam.sdk.state.TimerSpec
 
-  
 
- getTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- getTimers() - Method in class org.apache.beam.runners.direct.WatermarkManager.FiredTimers
 
- 
Gets all of the timers that have fired within the provided 
TimeDomain.
 
 
- getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
-  
 
- getTimerSpec() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
-  
 
- getTimerSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
- 
Get a mapping from PTransform id to timer id to 
timer specs that are used
 during execution.
 
 
- getTimes() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
- 
 
- getTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
 
- 
Returns the timestamp for the element being published to Kafka.
 
- getTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
-  
 
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
 
-  
 
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult
 
-  
 
- getTimestamp() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
-  
 
- getTimestamp() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
 
- 
Returns the timestamp of this ValueInSingleWindow.
 
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
- 
Get the timestamp attribute.
 
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
Get the timestamp attribute.
 
- getTimestampCombiner() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
 
- 
Return the 
TimestampCombiner which will be used to determine a watermark hold time
 given an element timestamp, and to combine watermarks from windows which are about to be
 merged.
 
 
- getTimestampCombiner() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
 
-  
 
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
 
- 
Returns the record timestamp (aka event time).
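As an illustration, the bookkeeping behind a policy such as CustomTimestampPolicyWithLimitedDelay can be sketched in plain Java: track the maximum observed event time and report the watermark as that maximum minus the allowed delay. This is an assumption-level sketch using java.time; Beam's TimestampPolicy uses Joda-Time Instants and is driven by KafkaIO.

```java
import java.time.Duration;
import java.time.Instant;

public class LimitedDelayWatermarkSketch {
  private final Duration maxDelay;
  private Instant maxEventTime = Instant.EPOCH;

  public LimitedDelayWatermarkSketch(Duration maxDelay) {
    this.maxDelay = maxDelay;
  }

  // Analogous to TimestampPolicy.getTimestampForRecord(...): note the record's
  // event time and advance the observed maximum.
  public Instant onRecord(Instant eventTime) {
    if (eventTime.isAfter(maxEventTime)) {
      maxEventTime = eventTime;
    }
    return eventTime;
  }

  // Analogous to TimestampPolicy.getWatermark(...): lag the maximum event time
  // by the configured delay so modestly late records are not considered late.
  public Instant watermark() {
    return maxEventTime.minus(maxDelay);
  }

  public static void main(String[] args) {
    LimitedDelayWatermarkSketch p = new LimitedDelayWatermarkSketch(Duration.ofSeconds(10));
    p.onRecord(Instant.ofEpochSecond(100));
    p.onRecord(Instant.ofEpochSecond(95));  // out-of-order record; max is unchanged
    System.out.println(p.watermark().getEpochSecond());  // 90
  }
}
```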
 
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
 
-  
 
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
 
-  
 
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
 
-  
 
- getTimestampTransforms() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
- 
The transforms applied to the arrival time of an element to determine when this trigger allows
 output.
 
- getTimestampType() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
-  
 
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- getTiming() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
Return the timing of this pane.
 
- getTo() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
-  
 
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
- 
Get the topic being written to.
 
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
Get the topic being read from.
 
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
-  
 
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
-  
 
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
- 
 
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
 
- getTopics() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
 
-  
 
- getToRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
- 
Returns the toRow conversion function.
 
- getToRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Retrieve the function that converts an object of the specified type to a 
Row object.
 
 
- getToRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Retrieve the function that converts an object of the specified type to a 
Row object.
 
 
- getToRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
 
- 
Returns the attached schema's toRowFunction.
 
- getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
Returns the size of the backlog of unread data in the underlying data source represented by
 all splits of this source.
 
- getTraitDef() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
-  
 
- getTransformNameMapping() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}.
 
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
 
-  
 
- getTransformTranslator(Class<TransformT>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
 
- 
Returns the 
TransformTranslator to use for instances of the specified PTransform class,
 or null if none registered.
 
 
- getTranslator() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
- 
Returns the DataflowPipelineTranslator associated with this object.
 
- getTrigger() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- getTupleTag() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
 
- 
Returns the TupleTag of this TaggedKeyedPCollection.
 
- getTupleTagList() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
- 
Returns the TupleTagList tuple associated with this schema.
 
- getType() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
- 
Returns the type this coder encodes/decodes.
 
- getType(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getType() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
- 
Type of the table.
 
- getType() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- getType() - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
 
- getType() - Method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference.TypeInformation
 
-  
 
- getType() - Method in interface org.apache.beam.sdk.testing.TestStream.Event
 
-  
 
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
 
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
 
- getType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
 
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollection
 
- 
 
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.TupleTag
 
- 
Returns a TypeDescriptor capturing what is known statically about the type of this
 TupleTag instance's most-derived class.
 
- getTypeName() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
-  
 
- getTypeNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- getTypeParameter(String) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns a TypeVariable for the named type parameter.
 
- getTypes() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns a set of 
TypeDescriptors, one for each superclass as well as each
 interface implemented by this class.
 
 
- getUdafs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
 
-  
 
- getUnderlyingDoFn() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
-  
 
- getUnionCoder() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
-  
 
- getUnionTag() - Method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils.RawUnionValue
 
-  
 
- getUnionTag() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
-  
 
- getUniqueId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
-  
 
- getUntilTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
- 
The trigger that signals termination of this trigger.
 
- getUrn(PrimitiveParDoSingleFactory.ParDoSingle<?, ?>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
 
-  
 
- getUrn() - Method in interface org.apache.beam.sdk.transforms.Materialization
 
- 
 
- getUsePublicIps() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
Specifies whether worker pools should be started with public IP addresses.
 
- getUserAgent() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
- 
A user agent string as per RFC2616, describing the pipeline to external services.
 
- getUsername() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
-  
 
- getUsesProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- getUUID() - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Get this schema's UUID.
 
- getV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
-  
 
- getV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
-  
 
- getValue() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer.FlinkDistributionGauge
 
-  
 
- getValue() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer.FlinkGauge
 
-  
 
- getValue() - Method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils.RawUnionValue
 
-  
 
- getValue(String, Class<T>) - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
 
-  
 
- getValue() - Method in class org.apache.beam.runners.spark.util.ByteArray
 
-  
 
- getValue() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
- 
Returns a read-only 
ByteBuffer representing this 
ByteKey.
 
 
- getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
- 
Retrieve the value of the display item.
 
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
The value of the display item.
 
- getValue() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.values.KV
 
- 
Returns the value of this 
KV.
 
 
- getValue(int) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a value by field index; a ClassCastException is thrown if the schema doesn't match.
 
- getValue(String) - Method in class org.apache.beam.sdk.values.Row
 
- 
Get a value by field name; a ClassCastException is thrown if the type doesn't match.
 
- getValue(int) - Method in class org.apache.beam.sdk.values.RowWithGetters
 
-  
 
- getValue(int) - Method in class org.apache.beam.sdk.values.RowWithStorage
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.values.TaggedPValue
 
- 
 
- getValue() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
-  
 
- getValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
 
- 
Returns the value of this ValueInSingleWindow.
 
- getValue() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
-  
 
- getValueCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- getValueCoder() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
- 
Gets the value coder that will be prefixed by the length.
 
- getValueCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
 
-  
 
- getValueCoder() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
- 
 
- getValueCoder() - Method in class org.apache.beam.sdk.testing.TestStream
 
-  
 
- getValueCoder() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- getValueCoder() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
-  
 
- getValues() - Method in class org.apache.beam.sdk.values.Row
 
- 
Return the list of data values.
 
- getValues() - Method in class org.apache.beam.sdk.values.RowWithGetters
 
-  
 
- getValues() - Method in class org.apache.beam.sdk.values.RowWithStorage
 
-  
 
- getValueTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getValueTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- getView() - Method in class org.apache.beam.runners.apex.ApexRunner.CreateApexPCollectionView
 
-  
 
- getView() - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
-  
 
- getView() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
-  
 
- getView() - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
 
- 
 
- getViewFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
 
- 
 
- getViewFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
-  
 
- getWatermark() - Method in class org.apache.beam.runners.gearpump.translators.io.GearpumpSource
 
-  
 
- getWatermark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
 
-  
 
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
 
- 
Returns watermark for the partition.
 
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
 
-  
 
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
 
-  
 
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
 
-  
 
- getWatermark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
Returns a timestamp before or at the timestamps of all future elements read by this reader.
 
- getWatermark() - Method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
 
-  
 
- getWatermarkMillis() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
-  
 
- getWatermarks(ExecutableT) - Method in class org.apache.beam.runners.direct.WatermarkManager
 
- 
Gets the input and output watermarks for an AppliedPTransform.
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
-  
 
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
- 
Deprecated.
  
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- getWindmillServiceEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
- 
Custom windmill service endpoint.
 
- getWindmillServicePort() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- getWindow() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- getWindow() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
 
- 
Returns the window of this ValueInSingleWindow.
 
- getWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
-  
 
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
-  
 
- getWindowFn() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- getWindowingStrategy() - Method in class org.apache.beam.sdk.values.PCollection
 
- 
 
- getWindowingStrategyInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
 
- 
 
- getWindowingStrategyInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
- 
 
- getWindowMappingFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
 
- 
For internal use only.
 
- getWindowMappingFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
-  
 
- getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Returns a 
TypeDescriptor capturing what is known statically about the window type of
 this 
WindowFn instance's most-derived class.
 
 
- getWorkerCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
- 
The size of the worker's in-memory cache, in megabytes.
 
- getWorkerDiskType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
Specifies what type of persistent disk is used.
 
- getWorkerHarnessContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
Docker container image that executes the Dataflow worker harness, residing in Google Container
 Registry.
 
- getWorkerId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
- 
The identity of the worker running this pipeline.
 
- getWorkerId() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
-  
 
- getWorkerLogLevelOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
This option controls the log levels for specifically named loggers.
 
- getWorkerMachineType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
Machine type to use when creating Dataflow worker VMs.
 
- getWorkerSystemErrMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
Controls the log level given to messages printed to System.err.
 
- getWorkerSystemOutMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
Controls the log level given to messages printed to System.out.
 
- getWritableByteChannelFactory() - Method in class org.apache.beam.sdk.io.FileBasedSink
 
- 
 
- getWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
Return the WriteOperation that this Writer belongs to.
 
- getYarnDeployDependencies() - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
 
- 
From the current classpath, find the jar files that need to be deployed with the application to
 run on YARN.
 
- getZone() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
- 
 
- getZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
- 
 
- global(Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
- 
Build a global TimerInternals for all feeding streams.
 
- Global() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.Global
 
-  
 
- globalDefault() - Static method in class org.apache.beam.sdk.values.WindowingStrategy
 
- 
Return a fully specified, default windowing strategy.
 
- GlobalDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
 
-  
 
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
 
- 
Computes the approximate number of distinct elements in the input PCollection<InputT>
 and returns a PCollection<Long>.
 
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
 
- 
Create the 
PTransform that will build a Count-min sketch for keeping track of the
 frequency of the elements in the whole stream.
 
 
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
 
- 
Compute the stream in order to build a T-Digest structure (MergingDigest) for keeping track of
 the stream distribution and returns a PCollection<MergingDigest>.
 
- globally() - Static method in class org.apache.beam.sdk.schemas.transforms.Group
 
- 
Returns a transform that groups all elements in the input PCollection.
 
 
- globally(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
 
- 
Returns a PTransform that takes a PCollection<T> and returns a PCollection<List<T>> whose single value is a List of the approximate N-tiles
 of the elements of the input PCollection.
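The N-tile output shape can be modeled in plain Java. This is a hedged sketch: it computes exact quantiles by sorting, whereas Beam's ApproximateQuantiles estimates the same list in a single pass; the class and method names here are hypothetical, not part of the Beam API.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Computes exact N-tiles by sorting: a list of numQuantiles values running
// from the minimum to the maximum of the input. Beam's ApproximateQuantiles
// produces a list of the same shape, but approximately and without sorting.
public class QuantilesSketch {
    public static <T extends Comparable<T>> List<T> quantiles(List<T> in, int numQuantiles) {
        List<T> sorted = new ArrayList<>(in);
        Collections.sort(sorted);
        List<T> out = new ArrayList<>();
        for (int i = 0; i < numQuantiles; i++) {
            // Evenly spaced rank positions between index 0 and size - 1.
            int idx = (int) Math.round((double) i * (sorted.size() - 1) / (numQuantiles - 1));
            out.add(sorted.get(idx));
        }
        return out;
    }
}
```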
 
- globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
 
- 
 
- globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
 
- 
Returns a PTransform that takes a PCollection<T> and returns a PCollection<Long> containing a single value that is an estimate of the number of distinct
 elements in the input PCollection.
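The quantity being estimated can be written exactly in plain Java. This sketch (hypothetical names, not the Beam API) computes the true distinct count; ApproximateUnique estimates the same number within an error bound controlled by its sample size.

```java
import java.util.HashSet;
import java.util.List;

// Counts distinct elements exactly. ApproximateUnique estimates this same
// quantity without holding the full element set in memory.
public class DistinctCountSketch {
    public static <T> long distinctCount(List<T> in) {
        return new HashSet<>(in).size();
    }
}
```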
 
- globally(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
 
- 
 
- globally(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
 
- 
Returns a Combine.Globally PTransform that uses the given SerializableFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
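The combine step can be modeled in plain Java. This is a sketch of the semantics only (hypothetical names, not the Beam API): all elements reduce to a single value with the supplied function.

```java
import java.util.List;
import java.util.function.BinaryOperator;

// Reduces all elements to a single value, mirroring what Combine.Globally
// does within each window. Beam requires the combining function to be
// associative and commutative; the same caveat applies here whenever the
// input order is not fixed.
public class CombineGloballySketch {
    public static <V> V combine(List<V> in, BinaryOperator<V> fn) {
        return in.stream().reduce(fn).orElseThrow();
    }
}
```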
 
 
- globally(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
 
- 
Returns a Combine.Globally PTransform that uses the given GloballyCombineFn to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
 
 
- globally() - Static method in class org.apache.beam.sdk.transforms.Count
 
- 
 
- globally() - Static method in class org.apache.beam.sdk.transforms.Latest
 
- 
Returns a PTransform that takes as input a PCollection<T> and returns a PCollection<T> whose contents is the latest element according to its event time, or null if there are no elements.
 
 
- globally() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum according to the natural ordering of T of
 the input PCollection's elements, or null if there are no elements.
 
- globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum of the input PCollection's elements, or
 null if there are no elements.
 
- globally() - Static method in class org.apache.beam.sdk.transforms.Mean
 
- 
Returns a PTransform that takes an input PCollection<NumT> and returns a PCollection<Double> whose contents is the mean of the input PCollection's elements, or
 0 if there are no elements.
 
- globally() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum according to the natural ordering of T of
 the input PCollection's elements, or null if there are no elements.
 
- globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum of the input PCollection's elements, or
 null if there are no elements.
 
- GloballyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
 
-  
 
- GlobalSketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
 
-  
 
- GlobalWatermarkHolder - Class in org.apache.beam.runners.spark.util
 
- 
A store to hold the global watermarks for a micro-batch.
 
- GlobalWatermarkHolder() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
-  
 
- GlobalWatermarkHolder.SparkWatermarks - Class in org.apache.beam.runners.spark.util
 
- 
 
- GlobalWatermarkHolder.WatermarkAdvancingStreamingListener - Class in org.apache.beam.runners.spark.util
 
- 
Advances the watermarks on the onBatchCompleted event.
 
- GlobalWindow - Class in org.apache.beam.sdk.transforms.windowing
 
- 
The default window into which all data is placed (via GlobalWindows).
 
 
- GlobalWindow.Coder - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- GlobalWindows - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A WindowFn that assigns all data to the same window.
 
 
- GlobalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- GoogleApiDebugOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
 
- 
These options configure debug settings for Google API clients created within the Apache Beam SDK.
 
- GoogleApiDebugOptions.GoogleApiTracer - Class in org.apache.beam.sdk.extensions.gcp.options
 
- 
 
- GoogleApiTracer() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
-  
 
- GraphiteSink - Class in org.apache.beam.runners.spark.metrics.sink
 
- 
 
- GraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
 
-  
 
- greaterThan(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- greaterThan(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- greaterThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than a given value, based on the elements'
 natural ordering.
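The selection rule can be modeled in plain Java. This sketch (hypothetical names, not the Beam Filter API) keeps exactly the elements strictly greater than the given value under their natural ordering.

```java
import java.util.List;
import java.util.stream.Collectors;

// Keeps elements strictly greater than the given value under natural
// ordering, mirroring the selection rule of Filter.greaterThan.
public class FilterGreaterThanSketch {
    public static <T extends Comparable<T>> List<T> greaterThan(List<T> in, T value) {
        return in.stream().filter(t -> t.compareTo(value) > 0).collect(Collectors.toList());
    }
}
```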
 
- greaterThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than or equal to a given value, based on the
 elements' natural ordering.
 
- greaterThanOrEqualTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- greaterThanOrEqualTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- Group - Class in org.apache.beam.sdk.schemas.transforms
 
- 
 
- Group() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group
 
-  
 
- Group.ByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
A PTransform that groups schema elements based on the given fields.
 
 
- Group.CombineByFields<InputT,OutputT> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
 
- Group.CombineFieldsByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
A PTransform that does a per-key combine using an aggregation built up by calls to aggregateField and aggregateFields.
 
 
- Group.CombineFieldsGlobally<InputT> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
A PTransform that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields.
 
 
- Group.CombineGlobally<InputT,OutputT> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
 
- Group.Global<InputT> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
A PTransform for doing global aggregations on schema PCollections.
 
 
- groupAlsoByWindow(JavaDStream<WindowedValue<KV<K, Iterable<WindowedValue<InputT>>>>>, Coder<K>, Coder<WindowedValue<InputT>>, WindowingStrategy<?, W>, SerializablePipelineOptions, List<Integer>, String) - Static method in class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
 
-  
 
- groupBy(WindowedValue<KV<K, V>>) - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.GroupByFn
 
-  
 
- GroupByKey<K,V> - Class in org.apache.beam.sdk.transforms
 
- 
GroupByKey<K, V> takes a PCollection<KV<K, V>>, groups the values by key and
 windows, and returns a PCollection<KV<K, Iterable<V>>> representing a map from each
 distinct key and window of the input PCollection to an Iterable over all the
 values associated with that key in the input per window.
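The mapping described above can be modeled in plain Java. This sketch mirrors only the output shape of GroupByKey (hypothetical names, not the Beam API); the per-window grouping Beam performs is omitted.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Models GroupByKey's result shape: each distinct key maps to an Iterable
// over all values associated with it. Beam additionally groups per window,
// which this sketch omits.
public class GroupByKeySketch {
    public static <K, V> Map<K, List<V>> groupByKey(List<Map.Entry<K, V>> kvs) {
        Map<K, List<V>> grouped = new LinkedHashMap<>();
        for (Map.Entry<K, V> kv : kvs) {
            grouped.computeIfAbsent(kv.getKey(), k -> new ArrayList<>()).add(kv.getValue());
        }
        return grouped;
    }
}
```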
 
- GroupByKeyTranslator<K,V> - Class in org.apache.beam.runners.gearpump.translators
 
- 
GroupByKey is translated to the Gearpump groupBy function.
 
 
- GroupByKeyTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator
 
-  
 
- GroupByKeyTranslator.GearpumpWindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.runners.gearpump.translators
 
- 
A transform used internally to translate Beam's Window to Gearpump's Window.
 
- GroupByKeyTranslator.GroupByFn<K,V> - Class in org.apache.beam.runners.gearpump.translators
 
- 
A transform used internally to group KV message by its key.
 
- GroupByKeyTranslator.KeyedByTimestamp<K,V> - Class in org.apache.beam.runners.gearpump.translators
 
- 
A transform used internally to transform WindowedValue to KV.
 
- GroupByKeyTranslator.Merge<K,V> - Class in org.apache.beam.runners.gearpump.translators
 
- 
A transform used internally by Gearpump which encapsulates the merge logic.
 
- grouped() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
 
- groupedValues(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
 
- 
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given SerializableFunction to combine all the values associated with a key, ignoring the key.
 
 
- groupedValues(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
 
- 
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given CombineFn to combine all the values associated with a key, ignoring the key.
 
 
- GroupingState<InputT,OutputT> - Interface in org.apache.beam.sdk.state
 
- 
A ReadableState cell that combines multiple input values and outputs a single value of a different type.
 
 
- GroupIntoBatches<K,InputT> - Class in org.apache.beam.sdk.transforms
 
- 
A PTransform that batches inputs to a desired batch size.
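The batching idea can be modeled in plain Java. This sketch (hypothetical names, not the Beam API) splits a list into consecutive batches of at most batchSize elements; GroupIntoBatches applies the same idea per key and per window.

```java
import java.util.ArrayList;
import java.util.List;

// Splits the input into consecutive batches of at most batchSize elements.
// A final partial batch is emitted if the input size is not a multiple of
// batchSize.
public class BatchSketch {
    public static <T> List<List<T>> batch(List<T> in, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < in.size(); i += batchSize) {
            batches.add(new ArrayList<>(in.subList(i, Math.min(i + batchSize, in.size()))));
        }
        return batches;
    }
}
```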
 
 
- Growth() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth
 
-  
 
- growthOf(Watch.Growth.PollFn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Watch
 
- 
Watches the growth of the given poll function.
 
- growthOf(Watch.Growth.PollFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch
 
- 
Watches the growth of the given poll function.
 
- growthOf(Contextful<Watch.Growth.PollFn<InputT, OutputT>>, SerializableFunction<OutputT, KeyT>) - Static method in class org.apache.beam.sdk.transforms.Watch
 
- 
Watches the growth of the given poll function, using the given "key function" to deduplicate
 outputs.
 
- GrpcContextHeaderAccessorProvider - Class in org.apache.beam.runners.fnexecution
 
- 
A HeaderAccessorProvider which intercepts the headers in a gRPC request and exposes the relevant fields.
 
- GrpcContextHeaderAccessorProvider() - Constructor for class org.apache.beam.runners.fnexecution.GrpcContextHeaderAccessorProvider
 
-  
 
- GrpcDataService - Class in org.apache.beam.runners.fnexecution.data
 
- 
 
- GrpcFnServer<ServiceT extends FnService> - Class in org.apache.beam.runners.fnexecution
 
- 
A gRPC Server which manages a single FnService.
 
 
- GrpcLoggingService - Class in org.apache.beam.runners.fnexecution.logging
 
- 
An implementation of the Beam Fn Logging Service over gRPC.
 
- GrpcStateService - Class in org.apache.beam.runners.fnexecution.state
 
- 
An implementation of the Beam Fn State service.
 
- id() - Method in class org.apache.beam.runners.fnexecution.jobsubmission.JobPreparation
 
-  
 
- id - Variable in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
-  
 
- Identifier() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
-  
 
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
- 
Returns the identity element of this operation, i.e.
 
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
- 
Returns the value that should be used for the combine of the empty set.
 
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
- 
Returns the identity element of this operation, i.e.
 
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
- 
Returns the identity element of this operation, i.e.
 
- identity() - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
 
-  
 
- IDENTITY_ELEMENT - Static variable in class org.apache.beam.sdk.metrics.DistributionResult
 
- 
The IDENTITY_ELEMENT is used to start accumulating distributions.
 
- IdGenerator - Interface in org.apache.beam.sdk.fn
 
- 
A generator of unique IDs.
 
- IdGenerators - Class in org.apache.beam.sdk.fn
 
- 
 
- IdGenerators() - Constructor for class org.apache.beam.sdk.fn.IdGenerators
 
-  
 
- ignoreInput(Watch.Growth.TerminationCondition<?, StateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
Wraps a given input-independent Watch.Growth.TerminationCondition as an equivalent condition with a given input type, passing null to the original condition as input.
 
 
- ignoreUnknownValues() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Accept rows that contain values that do not match the schema.
 
- immediate(T) - Static method in class org.apache.beam.sdk.state.ReadableStates
 
- 
A ReadableState constructed from a constant value, hence immediately available.
 
 
- immutableNames() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
-  
 
- immutableNamesBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
-  
 
- immutableSteps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
-  
 
- immutableStepsBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
-  
 
- implement(EnumerableRelImplementor, EnumerableRel.Prefer) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
-  
 
- Impulse - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- IMPULSE_ELEMENT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- in(Pipeline) - Static method in class org.apache.beam.sdk.values.PBegin
 
- 
 
- in(Pipeline) - Static method in class org.apache.beam.sdk.values.PDone
 
- 
 
- inbound(Iterator<ByteString>) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
 
- 
Converts multiple ByteStrings into a single InputStream.
 
- InboundDataClient - Interface in org.apache.beam.sdk.fn.data
 
- 
A client representing some stream of inbound data.
 
- inc() - Method in interface org.apache.beam.sdk.metrics.Counter
 
- 
Increment the counter.
 
- inc(long) - Method in interface org.apache.beam.sdk.metrics.Counter
 
- 
Increment the counter by the given amount.
 
- include(String, HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
 
- 
Register display data from the specified subcomponent at the given path.
 
- inCombinedNonLatePanes(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window across all panes that were not produced by the arrival of late data.
 
 
- inCombinedNonLatePanes(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- IncompatibleWindowException - Exception in org.apache.beam.sdk.transforms.windowing
 
- 
 
- IncompatibleWindowException(WindowFn<?, ?>, String) - Constructor for exception org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
 
-  
 
- incomplete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
- 
Constructs a Watch.Growth.PollResult with the given outputs and declares that new outputs might appear for the current input.
 
 
- incomplete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
- 
 
- increment() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Returns a RandomAccessData that is the smallest value of the same length which is strictly greater than this.
 
- incrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
 
- 
Returns an IdGenerator which provides successive incrementing longs.
 
 
- INDEX_OF_MAX - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
 
- 
Shard name containing the index and max.
 
- indexOf(String) - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Find the index of a given field.
 
- inEarlyGlobalWindowPanes() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
 
- inEarlyGlobalWindowPanes() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window across all panes that were produced by the arrival of early data.
 
 
- inEarlyPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
 
- 
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on early panes for each key.
 
 
- inferType(Object) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
 
- inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key.
 
 
- inFinalPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
 
- 
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key.
 
 
- init() - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.Merge
 
-  
 
- init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
 
- 
Init aggregators accumulator if it has not been initialized.
 
- init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
- 
Init metrics accumulator if it has not been initialized.
 
- INIT_CAP - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
- 
INITCAP.
 
- initAccumulators(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
 
- 
Init Metrics/Aggregators accumulators.
 
- initialize(Map<ExecutableT, ? extends Iterable<Bundle<?, CollectionT>>>) - Method in class org.apache.beam.runners.direct.WatermarkManager
 
-  
 
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
-  
 
- initialize(AbstractGoogleClientRequest<?>) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
-  
 
- initialSystemTimeAt(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
 
- 
Set the initial synchronized processing time.
 
- inMemory(TableProvider...) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
-  
 
- InMemoryJobService - Class in org.apache.beam.runners.fnexecution.jobsubmission
 
- 
An InMemoryJobService that prepares and runs jobs on behalf of a client using a JobInvoker.
 
 
- InMemoryMetaStore - Class in org.apache.beam.sdk.extensions.sql.meta.store
 
- 
A MetaStore which stores the meta info in memory.
 
 
- InMemoryMetaStore() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
-  
 
- InMemoryMetaTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
 
- 
An InMemoryMetaTableProvider is an abstract TableProvider for in-memory types.
 
- InMemoryMetaTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
 
-  
 
- inNamespace(String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
-  
 
- inNamespace(Class<?>) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
-  
 
- Inner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter.Inner
 
-  
 
- Inner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Unnest.Inner
 
-  
 
- innerJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
 
- 
Inner join of two collections of KV elements.
 
- inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
 
- 
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window.
 
 
- inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window.
 
 
- inOnTimePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
 
- 
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the on-time pane for each key.
 
 
- inOrder(Trigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
- 
Returns an AfterEach Trigger with the given subtriggers.
 
- inOrder(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
- 
Returns an AfterEach Trigger with the given subtriggers.
 
- InProcessManagedChannelFactory - Class in org.apache.beam.sdk.fn.test
 
- 
 
- InProcessServerFactory - Class in org.apache.beam.runners.fnexecution
 
- 
A ServerFactory which creates servers with the InProcessServerBuilder.
 
 
- inputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
Returns a type descriptor for the input of the given SerializableFunction, subject to Java type erasure: may contain unresolved type variables if the type was erased.
 
 
- inputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- insertAll(TableReference, List<ValueInSingleWindow<TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Inserts TableRows with the specified insertIds if not null.
 
 
- InsertRetryPolicy - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
A retry policy for streaming BigQuery inserts.
 
- InsertRetryPolicy() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
 
-  
 
- InsertRetryPolicy.Context - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Contains information about a failed insert.
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchemaFactory
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
-  
 
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
- 
 
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
-  
 
- InstantCoder - Class in org.apache.beam.sdk.coders
 
- 
A Coder for joda Instant that encodes it as a big endian Long shifted such that lexicographic ordering of the bytes corresponds to chronological order.
 
 
- InstantDeserializer - Class in org.apache.beam.sdk.io.kafka.serialization
 
- 
 
- InstantDeserializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
-  
 
- instantiateRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
 
- 
Instantiates a runner-side wire coder for the given PCollection.
 
- InstantSerializer - Class in org.apache.beam.sdk.io.kafka.serialization
 
- 
 
- InstantSerializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
-  
 
- InstructionRequestHandler - Interface in org.apache.beam.runners.fnexecution.control
 
- 
Interface for any function that can handle a Fn API BeamFnApi.InstructionRequest.
 
- INT16 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of int16 fields.
 
- INT32 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of int32 fields.
 
- INT64 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of int64 fields.
 
- INTEGER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- INTEGER_TYPES_TO_BIGINT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.IntegerReinterpretConversions
 
-  
 
- IntegerReinterpretConversions - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
Utility class to contain implementations of SQL integer type conversions.
 
- IntegerReinterpretConversions() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.IntegerReinterpretConversions
 
-  
 
- integers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<Integer> and returns a
 PCollection<Integer> whose contents is the maximum of the input PCollection's
 elements, or Integer.MIN_VALUE if there are no elements.
 
- integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<Integer> and returns a
 PCollection<Integer> whose contents is a single value that is the minimum of the input
 PCollection's elements, or Integer.MAX_VALUE if there are no elements.
 
- integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
 
- 
Returns a PTransform that takes an input PCollection<Integer> and returns a
 PCollection<Integer> whose contents is the sum of the input PCollection's
 elements, or 0 if there are no elements.
 
- integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and
 returns a PCollection<KV<K, Integer>> that contains an output element mapping each
 distinct key in the input PCollection to the maximum of the values associated with that
 key in the input PCollection.
 
- integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and
 returns a PCollection<KV<K, Integer>> that contains an output element mapping each
 distinct key in the input PCollection to the minimum of the values associated with that
 key in the input PCollection.
 
- integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
 
- 
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and
 returns a PCollection<KV<K, Integer>> that contains an output element mapping each
 distinct key in the input PCollection to the sum of the values associated with that key
 in the input PCollection.
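The per-key output described above can be modeled in plain Java. This sketch (hypothetical names, not the Beam API) maps each distinct key to the sum of its associated integer values.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Maps each distinct key to the sum of its associated values, mirroring
// the per-key output described for Sum.integersPerKey. Beam additionally
// computes this per window, which is omitted here.
public class SumPerKeySketch {
    public static <K> Map<K, Integer> sumPerKey(List<Map.Entry<K, Integer>> kvs) {
        Map<K, Integer> sums = new LinkedHashMap<>();
        for (Map.Entry<K, Integer> kv : kvs) {
            sums.merge(kv.getKey(), kv.getValue(), Integer::sum);
        }
        return sums;
    }
}
```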
 
- interceptor() - Static method in class org.apache.beam.runners.fnexecution.GrpcContextHeaderAccessorProvider
 
-  
 
- Internal - Annotation Type in org.apache.beam.sdk.annotations
 
- 
Signifies that a publicly accessible API (public class, method or field) is intended for internal
 use only and not for public consumption.
 
- interpolateKey(double) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
Returns a ByteKey key such that [startKey, key) represents approximately the specified fraction of the range [startKey, endKey).
 
 
- intersects(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
- 
Returns whether this window intersects the given window.
 
- INTERVALS_DURATIONS_TYPES - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.TimeUnitUtils
 
- 
Supported interval and duration types.
 
- IntervalWindow - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- IntervalWindow(Instant, Instant) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
- 
Creates a new IntervalWindow that represents the half-open time interval [start, end).
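The half-open interval semantics can be sketched in plain Java. This is a model only (hypothetical names, not the Beam API): millisecond longs stand in for joda-time Instants, and a timestamp belongs to the window iff start <= t && t < end.

```java
// Models the half-open interval [start, end): the start boundary is
// included, the end boundary is excluded, so adjacent windows such as
// [0, 10) and [10, 20) partition the timeline without overlap.
public class IntervalWindowSketch {
    public static boolean contains(long start, long end, long t) {
        return start <= t && t < end;
    }
}
```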
 
- IntervalWindow(Instant, ReadableDuration) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
-  
 
- IntervalWindow.IntervalWindowCoder - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- IntervalWindowCoder() - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
-  
 
- into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
 
- 
 
- into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
 
- 
 
- into(WindowFn<? super T, ?>) - Static method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Creates a 
Window PTransform that uses the given 
WindowFn to window the
 data.
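A sketch of building a windowing transform with Window.into; FixedWindows and the 5-minute size are illustrative choices:

```java
import org.joda.time.Duration;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;

class WindowIntoSketch {
  // Builds a PTransform that assigns each element to a fixed 5-minute window;
  // it would typically be applied to a PCollection<String> elsewhere.
  static Window<String> fiveMinuteWindows() {
    return Window.into(FixedWindows.of(Duration.standardMinutes(5)));
  }
}
```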
 
 
- InvalidWindows<W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
WindowFn that represents an invalid pipeline state.
 
 
- InvalidWindows(String, WindowFn<?, W>) - Constructor for class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
-  
 
- invoke(RunnerApi.Pipeline, Struct, String) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
 
-  
 
- invoke(RunnerApi.Pipeline, Struct, String) - Method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvoker
 
- 
Start running a job, abstracting its state as a 
JobInvocation instance.
 
 
- invokeAdvance(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
-  
 
- invokeStart(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
-  
 
- inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Creates a new 
PAssert.IterableAssert like this one, but with the assertion restricted to only
 run on the provided window.
 
 
- inWindow(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- IS_MERGING_WINDOW_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- IS_PAIR_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- IS_STREAM_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- IS_WRAPPER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- isAccessible() - Method in interface org.apache.beam.sdk.options.ValueProvider
 
- 
 
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
-  
 
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
-  
 
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
-  
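isAccessible() varies by ValueProvider implementation; a StaticValueProvider is always accessible, as this sketch shows (the value "fixed-value" is arbitrary):

```java
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

class ValueProviderSketch {
  static boolean staticIsAccessible() {
    ValueProvider<String> provider = StaticValueProvider.of("fixed-value");
    // Static values are known at construction time, so get() is safe to call.
    return provider.isAccessible() && "fixed-value".equals(provider.get());
  }
}
```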
 
- isAllowedLatenessSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- isArray() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns true if this type is known to be an array type.
 
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
- 
Returns true if the reader is at a split point.
 
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
- 
Returns true only for the first record; compressed sources cannot be split.
 
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
- 
 
- isBlockOnRun() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- isBounded() - Method in class org.apache.beam.sdk.values.PCollection
 
-  
 
- isBoundedCollection(Collection<PValue>) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
-  
 
- isCollectionType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
-  
 
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
-  
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
-  
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
-  
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
- 
InvalidWindows objects with the same originalWindowFn are compatible.
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
-  
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
 
- isCompositeType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- isCompressed(String) - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
 
- 
Deprecated.
Returns whether the file's extension matches one of the known compression formats.
 
- isCompressed(String) - Method in enum org.apache.beam.sdk.io.Compression
 
-  
 
- isDateType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- isDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
-  
 
- isDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
-  
 
- isDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
 
- 
Returns 
true if this 
ResourceId represents a directory, false otherwise.
 
 
- isDisjoint(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
- 
Returns whether this window is disjoint from the given window.
 
- isDone() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
 
-  
 
- isDone() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
 
-  
 
- isDone() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
 
- 
Returns true if the client is done, either via completing successfully or by being cancelled.
 
- isDone() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
- 
 
- isDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- isDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- isEmbeddedExecution() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- isEmbeddedExecutionDebugMode() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- isEmpty() - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
-  
 
- isEmpty() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
- 
Returns 
true if the 
byte[] backing this 
ByteKey is of length 0.
 
 
- isEmpty() - Method in interface org.apache.beam.sdk.state.GroupingState
 
- 
 
- isEmpty() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
-  
 
- isEmpty() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
-  
 
- isEmpty() - Method in class org.apache.beam.sdk.transforms.Requirements
 
- 
Whether this is an empty set of requirements.
 
- isEnforceEncodability() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- isEnforceImmutability() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
 
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- isEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
 
- 
Asserts that the value in question is equal to the provided value, according to Object.equals(java.lang.Object).
 
- isExternalizedCheckpointsEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
- 
Enables or disables externalized checkpoints.
 
- isFinished() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.ProcessWatcher
 
-  
 
- isFirst() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
Return true if this is the first pane produced for the associated window.
 
- isFnApi() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
 
-  
 
- isForceStreaming() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
-  
 
- isForceWatermarkSync() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
-  
 
- isIn(Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- isIn(Coder<T>, Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- isIn(T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- isIn(Coder<T>, T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- isInputSortRelAndLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
 
-  
 
- isInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
Returns whether or not this transformation applies a default value.
 
- isLast() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
Return true if this is the last pane that will be produced in the associated window.
 
- isLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
 
-  
 
- isMapType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- isMetricsSupported() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
 
- 
Indicates whether metrics reporting is supported.
 
- isModeSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- isMutable() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- isNonMerging() - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.GearpumpWindowFn
 
-  
 
- isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
 
-  
 
- isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Returns true if this WindowFn never needs to merge any windows.
 
- isNullable() - Method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference.TypeInformation
 
-  
 
- isNumericType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- isOneOf(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- isOneOf(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- isOperandNumeric(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
 
- 
Checks whether the operands are numeric.
 
- isParDoFusionEnabled() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- isPrimitiveType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- isReadSeekEfficient() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
-  
 
- isRegisterByteSizeObserverCheap(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- isRegisterByteSizeObserverCheap(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
- 
 
- isRegisterByteSizeObserverCheap(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
 
-  
 
- isRegisterByteSizeObserverCheap(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
 
- 
 
- isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.Coder
 
- 
 
- isRegisterByteSizeObserverCheap(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
- 
 
- isRegisterByteSizeObserverCheap(ReadableDuration) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
- 
 
- isRegisterByteSizeObserverCheap(IterableT) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
- 
 
- isRegisterByteSizeObserverCheap(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
 
- 
Returns whether both keyCoder and valueCoder are considered not expensive.
 
- isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
- 
LengthPrefixCoder is cheap if valueCoder is cheap.
 
- isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
- 
NullableCoder is cheap if valueCoder is cheap.
 
- isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
- 
 
- isRegisterByteSizeObserverCheap(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
- 
 
- isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
- 
 
- isRegisterByteSizeObserverCheap(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
-  
 
- isRegisterByteSizeObserverCheap(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
 
-  
 
- isRegisterByteSizeObserverCheap(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
-  
 
- isRegisterByteSizeObserverCheap(RawUnionValue) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
- 
Since this coder uses elementCoders.get(index) and coders that are known to run in constant
 time, we defer the return value to that coder.
 
- isRunnerDeterminedSharding() - Method in interface org.apache.beam.runners.direct.DirectTestOptions
 
-  
 
- isShutdownSourcesOnFinalWatermark() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
- 
Whether to shutdown sources when their watermark reaches +Inf.
 
- isSplittable() - Method in class org.apache.beam.sdk.io.CompressedSource
 
- 
Determines whether a single file represented by this source is splittable.
 
- isSplittable() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
- 
Determines whether a file represented by this source can be split into bundles.
 
- isStarted() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
- 
 
- isStarted() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- isStreaming() - Method in interface org.apache.beam.sdk.options.StreamingOptions
 
- 
Set to true if running a streaming pipeline.
 
- isStringType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- isSubtypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Return true if this type is a subtype of the given type.
 
- isSuccess() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
- 
Returns whether this file was parsed successfully.
 
- isSuccess() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
-  
 
- isSupertypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns true if this type is assignable from the given type.
 
- isTableEmpty(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Returns true if the table is empty.
 
- isTermainal() - Method in enum org.apache.beam.runners.local.ExecutionDriver.DriverState
 
-  
 
- isTerminal() - Method in enum org.apache.beam.sdk.PipelineResult.State
 
-  
 
- isTerminated(JobApi.JobState.Enum) - Static method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation
 
-  
 
- isTimestampCombinerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- isTriggerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- isTupleTracingEnabled() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- isUnknown() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
Return true if there is no timing information for the current 
PaneInfo.
 
 
- isUpdate() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
- 
Whether to update the currently running pipeline with the same name as this one.
 
- isWholeStream - Variable in class org.apache.beam.sdk.coders.Coder.Context
 
- 
Deprecated.
Whether the encoded or decoded value fills the remainder of the output or input (resp.)
 record/stream contents.
 
- item(String, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and string value.
 
- item(String, ValueProvider<?>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
 
- item(String, Integer) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and integer value.
 
- item(String, Long) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and integer value.
 
- item(String, Float) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and floating point value.
 
- item(String, Double) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and floating point value.
 
- item(String, Boolean) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and boolean value.
 
- item(String, Instant) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and timestamp value.
 
- item(String, Duration) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and duration value.
 
- item(String, Class<T>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key and class value.
 
- item(String, DisplayData.Type, T) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
 
- 
Create a display item for the specified key, type, and value.
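The item(...) factory methods are typically used inside populateDisplayData; a minimal sketch constructing one item spec (the key "tableName" and its value are arbitrary examples):

```java
import org.apache.beam.sdk.transforms.display.DisplayData;

class DisplayDataSketch {
  static DisplayData.ItemSpec<String> tableItem() {
    // Builds a display item spec for the illustrative key "tableName".
    return DisplayData.item("tableName", "my_table");
  }
}
```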
 
- Item() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
-  
 
- items() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
-  
 
- ItemSpec() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
-  
 
- IterableCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
 
- IterableCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.IterableCoder
 
-  
 
- IterableLikeCoder<T,IterableT extends java.lang.Iterable<T>> - Class in org.apache.beam.sdk.coders
 
- 
An abstract base class with functionality for assembling a 
Coder for a class that
 implements 
Iterable.
 
 
- IterableLikeCoder(Coder<T>, String) - Constructor for class org.apache.beam.sdk.coders.IterableLikeCoder
 
-  
 
- iterables() - Static method in class org.apache.beam.sdk.transforms.Flatten
 
- 
Returns a PTransform that takes a PCollection<Iterable<T>> and returns a PCollection<T> containing all the elements from all the Iterables.
 
- iterables() - Static method in class org.apache.beam.sdk.transforms.ToString
 
- 
Transforms each item in the iterable of the input 
PCollection to a 
String using
 the 
Object.toString() method followed by a "," until the last element in the iterable.
 
 
- iterables(String) - Static method in class org.apache.beam.sdk.transforms.ToString
 
- 
Transforms each item in the iterable of the input 
PCollection to a 
String using
 the 
Object.toString() method followed by the specified delimiter until the last element
 in the iterable.
 
 
- iterables(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- iterableView(PCollection<KV<Void, T>>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
 
- 
Returns a 
PCollectionView<Iterable<T>> capable of processing elements windowed using
 the provided 
WindowingStrategy.
 
 
- IterableViewFn() - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
 
-  
 
- iterableWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- iterableWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- iterator() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
-  
 
- OBJECT_TYPE_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- of(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.apex.ApexRunner.CreateApexPCollectionView
 
-  
 
- of() - Static method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- of(BeamFnApi.ProcessBundleDescriptor, Map<String, RemoteInputDestination<WindowedValue<?>>>, Map<BeamFnApi.Target, Coder<WindowedValue<?>>>, Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, Map<String, Map<String, ProcessBundleDescriptors.BagUserStateSpec>>, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
 
-  
 
- of(String, String, RunnerApi.FunctionSpec, Coder<T>, Coder<W>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
-  
 
- of(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
-  
 
- of(Coder<T>, BeamFnApi.Target) - Static method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
-  
 
- of(K, Coder<K>) - Static method in class org.apache.beam.runners.local.StructuralKey
 
- 
Create a new Structural Key of the provided key that can be encoded by the provided coder.
 
- of(T, CloseableResource.Closer<T>) - Static method in class org.apache.beam.runners.reference.CloseableResource
 
- 
 
- of(Coder<T>, Duration, boolean) - Static method in class org.apache.beam.runners.spark.io.CreateStream
 
- 
Creates a new Spark based stream intended for test purposes.
 
- of(Coder<T>, Duration) - Static method in class org.apache.beam.runners.spark.io.CreateStream
 
- 
Creates a new Spark based stream without forced watermark sync, intended for test purposes.
 
- of(NamedAggregators) - Static method in class org.apache.beam.runners.spark.metrics.AggregatorMetric
 
-  
 
- of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.AvroCoder
 
- 
Returns an AvroCoder instance for the provided element type.
 
- of(Class<T>) - Static method in class org.apache.beam.sdk.coders.AvroCoder
 
- 
Returns an AvroCoder instance for the provided element class.
 
- of(Schema) - Static method in class org.apache.beam.sdk.coders.AvroCoder
 
- 
Returns an AvroCoder instance for the Avro schema.
 
- of(Class<T>, Schema) - Static method in class org.apache.beam.sdk.coders.AvroCoder
 
- 
Returns an AvroCoder instance for the provided element type using the provided Avro
 schema.
 
- of() - Static method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.BitSetCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.BooleanCoder
 
- 
 
- of() - Static method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.ByteCoder
 
-  
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.CollectionCoder
 
-  
 
- of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
 
-  
 
- of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.DoubleCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.DurationCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.FloatCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.InstantCoder
 
-  
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.IterableCoder
 
-  
 
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.KvCoder
 
-  
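A round-trip sketch with KvCoder.of, using CoderUtils for encoding; the component coders are illustrative choices:

```java
import org.apache.beam.sdk.coders.CoderException;
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarIntCoder;
import org.apache.beam.sdk.util.CoderUtils;
import org.apache.beam.sdk.values.KV;

class KvCoderSketch {
  // Encodes a KV to bytes and decodes it back with the same composite coder.
  static KV<String, Integer> roundTrip(KV<String, Integer> kv) throws CoderException {
    KvCoder<String, Integer> coder = KvCoder.of(StringUtf8Coder.of(), VarIntCoder.of());
    byte[] bytes = CoderUtils.encodeToByteArray(coder, kv);
    return CoderUtils.decodeFromByteArray(coder, bytes);
  }
}
```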
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
-  
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.ListCoder
 
-  
 
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.MapCoder
 
- 
Produces a MapCoder with the given keyCoder and valueCoder.
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.NullableCoder
 
-  
 
- of(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoder
 
-  
 
- of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
 
- 
 
- of(Class<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
 
- 
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SetCoder
 
- 
Produces a 
SetCoder with the given 
elementCoder.
 
 
- of(Coder<KeyT>) - Static method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
-  
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SnappyCoder
 
- 
 
- of(Class<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- of(Class<T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.VarIntCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.VarLongCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.coders.VoidCoder
 
-  
 
- of(Class<? extends OutputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.AsJsons
 
- 
 
- of(Class<? extends OutputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
 
- 
 
- of() - Static method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
-  
 
- of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
- 
Returns a 
ProtoCoder for the given Protocol Buffers 
Message.
 
 
- of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
- 
 
- of() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
 
-  
 
- of(SqlTypeName, T) - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
- 
A builder function to create a BeamSqlPrimitive directly from a type and value.
 
- of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
- 
Convenient way to build a mocked bounded table.
 
- of(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
- 
Build a mocked bounded table with the specified type.
 
- of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
- 
Convenient way to build a mocked unbounded table.
 
- of(String, BeamFnApi.Target) - Static method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
 
-  
 
- of() - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
 
-  
 
- of(Coder<BoundedWindow>, Coder<DestinationT>) - Static method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.io.fs.MetadataCoder
 
- 
 
- of() - Static method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
- 
 
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
-  
 
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
-  
 
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
-  
 
- of(Class<T>) - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
- 
Returns a WritableCoder instance for the provided element class.
 
- of(String, Scan) - Static method in class org.apache.beam.sdk.io.hbase.HBaseQuery
 
-  
 
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
-  
 
- of(int...) - Static method in class org.apache.beam.sdk.io.range.ByteKey
 
- 
Creates a new 
ByteKey backed by a copy of the specified 
int[].
 
 
- of(ByteKey, ByteKey) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
Creates a new 
ByteKeyRange with the given start and end keys.
 
 
- of(ByteKeyRange) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
- 
 
- of() - Static method in class org.apache.beam.sdk.io.ReadableFileCoder
 
- 
 
- of(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
- 
Create a coder for a given type of JAXB annotated objects.
 
- of(ValueProvider<X>, SerializableFunction<X, T>) - Static method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
- 
 
- of(T) - Static method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
- 
 
- of(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
Returns a field with the given name and type.
 
- of(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
 
- of(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.Schema
 
-  
 
- of(Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
 
- 
 
- of(Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
 
- 
 
- of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
- 
Returns a CombineFn that uses the given SerializableFunction to combine
 values.
 
- of(SerializableFunction<Iterable<V>, V>, int) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
- 
Returns a CombineFn that uses the given SerializableFunction to combine
 values, attempting to buffer at least bufferSize values between invocations.
 
- of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
 
- 
Deprecated.
Returns a CombineFn that uses the given SerializableFunction to combine
 values.
 
- of(ClosureT, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
 
- 
Constructs a pair of the given closure and its requirements.
 
- of(Iterable<T>) - Static method in class org.apache.beam.sdk.transforms.Create
 
- 
Returns a new 
Create.Values transform that produces a 
PCollection containing
 elements of the provided 
Iterable.
 
 
- of(T, T...) - Static method in class org.apache.beam.sdk.transforms.Create
 
- 
Returns a new 
Create.Values transform that produces a 
PCollection containing
 the specified elements.
 
 
- of(Map<K, V>) - Static method in class org.apache.beam.sdk.transforms.Create
 
- 
Returns a new 
Create.Values transform that produces a 
PCollection of 
KVs corresponding to the keys and values of the specified 
Map.
 
 
- of(DisplayData.Path, Class<?>, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
-  
 
- of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- of(CoGbkResultSchema, UnionCoder) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
- 
 
- of(TupleTag<V>, List<V>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
- 
Returns a new CoGbkResult that contains just the given tag and given data.
 
- of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
-  
 
- of(TupleTag<InputT>, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
- 
Returns a new KeyedPCollectionTuple<K> with the given tag and initial PCollection.
 
- of(List<Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
- 
Builds a union coder with the given list of element coders.
 
- of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
 
- of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
 
- of() - Static method in class org.apache.beam.sdk.transforms.Mean
 
- 
A Combine.CombineFn that computes the arithmetic mean (a.k.a. average).
 
- of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
 
- of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
 
- of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.ParDo
 
- 
 
- of(int, Partition.PartitionFn<? super T>) - Static method in class org.apache.beam.sdk.transforms.Partition
 
- 
Returns a new Partition PTransform that divides its input PCollection
 into the given number of partitions, using the given partitioning function.
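The routing described above can be sketched in plain Java (an illustration of the semantics, not the Beam Partition transform; `partition` is a hypothetical helper):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class PartitionDemo {
    // Routes each element to one of numPartitions output lists via fn,
    // which must return a value in [0, numPartitions).
    static <T> List<List<T>> partition(List<T> input, int numPartitions, Function<T, Integer> fn) {
        List<List<T>> out = new ArrayList<>();
        for (int i = 0; i < numPartitions; i++) {
            out.add(new ArrayList<>());
        }
        for (T elem : input) {
            out.get(fn.apply(elem)).add(elem);
        }
        return out;
    }

    public static void main(String[] args) {
        // Split integers by parity into two partitions.
        System.out.println(partition(List.of(1, 2, 3, 4), 2, x -> x % 2));
        // [[2, 4], [1, 3]]
    }
}
```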
 
- of() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
 
- 
Deprecated.
  
- of(ByteKeyRange) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
-  
 
- of(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the largest count elements of
 the input PCollection<T>, in decreasing order, sorted using the given Comparator<T>.
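The "largest count elements, in decreasing order" semantics can be sketched with stdlib streams (an illustration only, not the Beam Top transform; `top` is a hypothetical helper):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class TopDemo {
    // Keeps the `count` largest elements per `cmp`, largest first.
    static <T> List<T> top(List<T> input, int count, Comparator<T> cmp) {
        return input.stream()
            .sorted(cmp.reversed())   // largest first
            .limit(count)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(top(Arrays.asList(3, 1, 4, 1, 5), 3, Comparator.<Integer>naturalOrder()));
        // [5, 4, 3]
    }
}
```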
 
- of(PCollectionView<ViewT>) - Static method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
 
-  
 
- of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
- 
Returns an AfterAll Trigger with the given subtriggers.
 
- of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
- 
Returns an AfterAll Trigger with the given subtriggers.
 
- of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
- 
Returns an AfterFirst Trigger with the given subtriggers.
 
- of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
- 
Returns an AfterFirst Trigger with the given subtriggers.
 
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
- 
Returns the default trigger.
 
- of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
- 
Partitions the timestamp space into half-open intervals of the form [N * size, (N + 1) * size),
 where 0 is the epoch.
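The half-open interval arithmetic above can be sketched in plain Java (an illustration of the windowing math only, not the Beam FixedWindows implementation; `windowStart` is a hypothetical helper):

```java
public class FixedWindowDemo {
    // Start of the fixed window [N * size, (N + 1) * size) containing `timestamp`.
    // floorMod keeps the result correct for timestamps before the epoch.
    static long windowStart(long timestamp, long size) {
        return timestamp - Math.floorMod(timestamp, size);
    }

    public static void main(String[] args) {
        // A timestamp of 7 with size 5 falls in the window [5, 10).
        System.out.println(windowStart(7, 5));  // 5
        // A timestamp of -3 falls in [-5, 0).
        System.out.println(windowStart(-3, 5)); // -5
    }
}
```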
 
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
-  
 
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
-  
 
- of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
- 
Assigns timestamps into half-open intervals of the form [N * period, N * period + size), where
 0 is the epoch.
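Because sliding windows overlap, a single timestamp belongs to size/period windows. The assignment can be sketched in plain Java (an illustration of the math only, not the Beam SlidingWindows implementation; `windowStarts` is a hypothetical helper):

```java
import java.util.ArrayList;
import java.util.List;

public class SlidingWindowDemo {
    // Starts of all windows [N * period, N * period + size) containing `timestamp`.
    static List<Long> windowStarts(long timestamp, long size, long period) {
        List<Long> starts = new ArrayList<>();
        long lastStart = timestamp - Math.floorMod(timestamp, period);
        // Walk back by `period` while the window still covers the timestamp.
        for (long s = lastStart; s > timestamp - size; s -= period) {
            starts.add(s);
        }
        return starts;
    }

    public static void main(String[] args) {
        // With size 10 and period 5, timestamp 12 is in [10, 20) and [5, 15).
        System.out.println(windowStarts(12, 10, 5));  // [10, 5]
    }
}
```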
 
- of(SerializableFunction<V, K>) - Static method in class org.apache.beam.sdk.transforms.WithKeys
 
- 
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been
 paired with a key computed from the value by invoking the given SerializableFunction.
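The key-extraction step described above can be sketched with stdlib types (an illustration of the semantics, not the Beam WithKeys transform; `withKeys` is a hypothetical helper standing in for the key function application):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class WithKeysDemo {
    // Pairs each value with a key computed by keyFn, yielding (key, value) entries.
    static <K, V> List<Map.Entry<K, V>> withKeys(List<V> values, Function<V, K> keyFn) {
        return values.stream()
            .map(v -> new SimpleEntry<>(keyFn.apply(v), v))
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Key each word by its length.
        System.out.println(withKeys(List.of("a", "bb"), String::length));
        // [1=a, 2=bb]
    }
}
```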
 
- of(K) - Static method in class org.apache.beam.sdk.transforms.WithKeys
 
- 
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been
 paired with the given key.
 
- of(SerializableFunction<T, Instant>) - Static method in class org.apache.beam.sdk.transforms.WithTimestamps
 
- 
 
- of(K, V) - Static method in class org.apache.beam.sdk.values.KV
 
- 
Returns a 
KV with the given key and value.
 
 
- of(PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionList
 
- 
 
- of(Iterable<PCollection<T>>) - Static method in class org.apache.beam.sdk.values.PCollectionList
 
- 
 
- of(TupleTag<T>, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
 
- 
 
- of(K, int) - Static method in class org.apache.beam.sdk.values.ShardedKey
 
-  
 
- of(TupleTag<?>, PValue) - Static method in class org.apache.beam.sdk.values.TaggedPValue
 
-  
 
- of(V, Instant) - Static method in class org.apache.beam.sdk.values.TimestampedValue
 
- 
Returns a new TimestampedValue with the given value and timestamp.
 
- of(Coder<T>) - Static method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- of(TupleTag<?>) - Static method in class org.apache.beam.sdk.values.TupleTagList
 
- 
 
- of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.values.TupleTagList
 
- 
 
- of(Class<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
 
- of(Type) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
 
- of(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
-  
 
- of(T, Instant, BoundedWindow, PaneInfo) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow
 
-  
 
- of(Coder<ValueT>) - Static method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
-  
 
- of(WindowFn<T, W>) - Static method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
 
- ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
 
- ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Sum
 
- 
 
- ofExpandedValue(PValue) - Static method in class org.apache.beam.sdk.values.TaggedPValue
 
-  
 
- offerCoders(Coder[]) - Method in interface org.apache.beam.sdk.state.StateSpec
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- offeringClientsToPool(ControlClientPool.Sink, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
 
- 
 
- ofFirstElement() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
-  
 
- offset(Duration) - Method in interface org.apache.beam.sdk.state.Timer
 
- 
 
- OFFSET_INFINITY - Static variable in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
- 
Offset corresponding to infinity.
 
- OffsetBasedReader(OffsetBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- OffsetBasedSource<T> - Class in org.apache.beam.sdk.io
 
- 
A 
BoundedSource that uses offsets to define starting and ending positions.
 
 
- OffsetBasedSource(long, long, long) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource
 
-  
 
- OffsetBasedSource.OffsetBasedReader<T> - Class in org.apache.beam.sdk.io
 
- 
 
- OffsetRange - Class in org.apache.beam.sdk.io.range
 
- 
A restriction represented by a range of integers [from, to).
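The half-open [from, to) convention can be sketched in plain Java (an illustration of the range semantics, not the Beam OffsetRange class):

```java
public class OffsetRangeDemo {
    final long from;
    final long to;

    OffsetRangeDemo(long from, long to) {
        this.from = from;
        this.to = to;
    }

    // Half-open containment: `from` is included, `to` is excluded.
    boolean contains(long offset) {
        return from <= offset && offset < to;
    }

    public static void main(String[] args) {
        OffsetRangeDemo r = new OffsetRangeDemo(0, 10);
        System.out.println(r.contains(0) + " " + r.contains(10));  // true false
    }
}
```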
 
- OffsetRange(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRange
 
-  
 
- OffsetRangeTracker - Class in org.apache.beam.sdk.io.range
 
- 
 
- OffsetRangeTracker(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
- 
Creates an OffsetRangeTracker for the specified range.
 
- OffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
 
- 
 
- OffsetRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
-  
 
- ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
 
- ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
 
- ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Sum
 
- 
 
- ofLongs() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
 
- ofLongs() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
 
- ofLongs() - Static method in class org.apache.beam.sdk.transforms.Sum
 
- 
 
- ofPrimitiveOutputsInternal(Pipeline, TupleTagList, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, PCollection.IsBounded) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- ofProvider(ValueProvider<T>, Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
 
- 
 
- ofSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
-  
 
- on(PCollection<?>...) - Static method in class org.apache.beam.sdk.transforms.Wait
 
- 
Waits on the given signal collections.
 
- on(List<PCollection<?>>) - Static method in class org.apache.beam.sdk.transforms.Wait
 
- 
Waits on the given signal collections.
 
- ON_TIME_AND_ONLY_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
- 
PaneInfo to use when there will be exactly one firing and it is on time.
 
- onAdvance(int, int) - Method in class org.apache.beam.sdk.fn.stream.AdvancingPhaser
 
-  
 
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
 
-  
 
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
 
-  
 
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarkAdvancingStreamingListener
 
-  
 
- OnceTrigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
 
-  
 
- onClaimed(PositionT) - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.ClaimObserver
 
- 
 
- onClaimFailed(PositionT) - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.ClaimObserver
 
- 
 
- onClose(Consumer<FnApiControlClient>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
 
-  
 
- onCompleted(BeamFnApi.ProcessBundleResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
 
- 
Handles the bundle's completion report.
 
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
-  
 
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
-  
 
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
-  
 
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
 
-  
 
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
-  
 
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
-  
 
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
-  
 
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
 
-  
 
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
 
-  
 
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
 
-  
 
- onNext(T) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
 
-  
 
- onNext(T) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
 
-  
 
- onNext(ReqT) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
 
-  
 
- onNext(V) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
 
-  
 
- onProgress(BeamFnApi.ProcessBundleProgressResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
 
- 
Handles a progress report from the bundle while it is executing.
 
- onSeenNewOutput(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
 
- 
 
- onTimer(String, BoundedWindow, Instant, TimeDomain) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
-  
 
- OnTimerContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
 
-  
 
- op(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- open(TaskContext, Instant) - Method in class org.apache.beam.runners.gearpump.translators.io.GearpumpSource
 
-  
 
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.AvroIO.Sink
 
-  
 
- open(String) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
 
- open() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
- 
 
- open(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileIO.Sink
 
- 
Initializes writing to the given channel.
 
- open(ResourceIdT) - Method in class org.apache.beam.sdk.io.FileSystem
 
- 
Returns a read channel for the given ResourceIdT.
 
- open(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
 
- 
 
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
-  
 
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
-  
 
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
 
-  
 
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
-  
 
- openSeekable() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
- 
 
- operands - Variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- Options() - Constructor for class org.apache.beam.runners.apex.ApexRunnerRegistrar.Options
 
-  
 
- Options() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
 
-  
 
- Options() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Options
 
-  
 
- Options() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
 
-  
 
- options() - Method in class org.apache.beam.runners.fnexecution.jobsubmission.JobPreparation
 
-  
 
- Options() - Constructor for class org.apache.beam.runners.gearpump.GearpumpRunnerRegistrar.Options
 
-  
 
- Options() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
 
-  
 
- options() - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
-  
 
- Options() - Constructor for class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
 
-  
 
- opType(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- opValueEvaluated(int, Row, BoundedWindow, BeamSqlExpressionEnvironment) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- Order - Class in org.apache.beam.sdk.extensions.sql.example.model
 
- 
Describes an order.
 
- Order(int, int) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
 
-  
 
- Order() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
 
-  
 
- OrderByKey() - Constructor for class org.apache.beam.sdk.values.KV.OrderByKey
 
-  
 
- OrderByValue() - Constructor for class org.apache.beam.sdk.values.KV.OrderByValue
 
-  
 
- orFinally(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
- 
Specify an ending condition for this trigger.
 
- OrFinallyTrigger - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
Trigger that executes according to its main trigger until its "finally" trigger fires.
 
 
- org.apache.beam.runners.apex - package org.apache.beam.runners.apex
 
- 
Implementation of the Beam runner for Apache Apex.
 
- org.apache.beam.runners.dataflow - package org.apache.beam.runners.dataflow
 
- 
Provides a Beam runner that executes pipelines on the Google Cloud Dataflow service.
 
- org.apache.beam.runners.dataflow.options - package org.apache.beam.runners.dataflow.options
 
- 
 
- org.apache.beam.runners.dataflow.util - package org.apache.beam.runners.dataflow.util
 
- 
Provides miscellaneous internal utilities used by the Google Cloud Dataflow runner.
 
- org.apache.beam.runners.direct - package org.apache.beam.runners.direct
 
- 
 
- org.apache.beam.runners.direct.portable - package org.apache.beam.runners.direct.portable
 
- 
 
- org.apache.beam.runners.direct.portable.artifact - package org.apache.beam.runners.direct.portable.artifact
 
- 
Provides local implementations of the Artifact API services.
 
- org.apache.beam.runners.direct.portable.job - package org.apache.beam.runners.direct.portable.job
 
- 
An execution engine for Beam Pipelines that uses the Java Runner harness and the Fn API to
 execute.
 
- org.apache.beam.runners.flink - package org.apache.beam.runners.flink
 
- 
Internal implementation of the Beam runner for Apache Flink.
 
- org.apache.beam.runners.flink.metrics - package org.apache.beam.runners.flink.metrics
 
- 
Internal metrics implementation of the Beam runner for Apache Flink.
 
- org.apache.beam.runners.fnexecution - package org.apache.beam.runners.fnexecution
 
- 
Utilities used by runners to interact with the fn execution components of the Beam Portability
 Framework.
 
- org.apache.beam.runners.fnexecution.artifact - package org.apache.beam.runners.fnexecution.artifact
 
- 
Pipeline execution-time artifact-management services, including abstract implementations of the
 Artifact Retrieval Service.
 
- org.apache.beam.runners.fnexecution.control - package org.apache.beam.runners.fnexecution.control
 
- 
Utilities for a Beam runner to interact with the Fn API Control Service via Java abstractions.
 
- org.apache.beam.runners.fnexecution.data - package org.apache.beam.runners.fnexecution.data
 
- 
Utilities for a Beam runner to interact with the Fn API Data Service via Java abstractions.
 
- org.apache.beam.runners.fnexecution.environment - package org.apache.beam.runners.fnexecution.environment
 
- 
Classes used to instantiate and manage SDK harness environments.
 
- org.apache.beam.runners.fnexecution.environment.testing - package org.apache.beam.runners.fnexecution.environment.testing
 
- 
Test utilities for the environment management package.
 
- org.apache.beam.runners.fnexecution.jobsubmission - package org.apache.beam.runners.fnexecution.jobsubmission
 
- 
Job management services for use in Beam runners.
 
- org.apache.beam.runners.fnexecution.logging - package org.apache.beam.runners.fnexecution.logging
 
- 
Classes used to log informational messages over the Beam Fn Logging Service.
 
- org.apache.beam.runners.fnexecution.provisioning - package org.apache.beam.runners.fnexecution.provisioning
 
- 
Provision API services.
 
- org.apache.beam.runners.fnexecution.splittabledofn - package org.apache.beam.runners.fnexecution.splittabledofn
 
- 
Utilities for a Beam runner to interact with a remotely running splittable DoFn.
 
- org.apache.beam.runners.fnexecution.state - package org.apache.beam.runners.fnexecution.state
 
- 
State API services.
 
- org.apache.beam.runners.fnexecution.wire - package org.apache.beam.runners.fnexecution.wire
 
- 
Wire coders for communications between runner and SDK harness.
 
- org.apache.beam.runners.gearpump - package org.apache.beam.runners.gearpump
 
- 
Internal implementation of the Beam runner for Apache Gearpump.
 
- org.apache.beam.runners.gearpump.translators - package org.apache.beam.runners.gearpump.translators
 
- 
Gearpump specific translators.
 
- org.apache.beam.runners.gearpump.translators.functions - package org.apache.beam.runners.gearpump.translators.functions
 
- 
Gearpump specific wrappers for Beam DoFn.
 
- org.apache.beam.runners.gearpump.translators.io - package org.apache.beam.runners.gearpump.translators.io
 
- 
Gearpump specific wrappers for Beam I/O.
 
- org.apache.beam.runners.gearpump.translators.utils - package org.apache.beam.runners.gearpump.translators.utils
 
- 
Utilities for translators.
 
- org.apache.beam.runners.local - package org.apache.beam.runners.local
 
- 
Utilities useful when executing a pipeline on a single machine.
 
- org.apache.beam.runners.reference - package org.apache.beam.runners.reference
 
- 
Support for executing a pipeline locally over the Beam Fn API.
 
- org.apache.beam.runners.reference.testing - package org.apache.beam.runners.reference.testing
 
- 
Testing utilities for the reference runner.
 
- org.apache.beam.runners.spark - package org.apache.beam.runners.spark
 
- 
Internal implementation of the Beam runner for Apache Spark.
 
- org.apache.beam.runners.spark.aggregators - package org.apache.beam.runners.spark.aggregators
 
- 
Provides internal utilities for implementing Beam aggregators using Spark accumulators.
 
- org.apache.beam.runners.spark.coders - package org.apache.beam.runners.spark.coders
 
- 
Beam coders and coder-related utilities for running on Apache Spark.
 
- org.apache.beam.runners.spark.io - package org.apache.beam.runners.spark.io
 
- 
Spark-specific transforms for I/O.
 
- org.apache.beam.runners.spark.metrics - package org.apache.beam.runners.spark.metrics
 
- 
Provides internal utilities for implementing Beam metrics using Spark accumulators.
 
- org.apache.beam.runners.spark.metrics.sink - package org.apache.beam.runners.spark.metrics.sink
 
- 
Spark sinks that support Beam metrics and aggregators.
 
- org.apache.beam.runners.spark.stateful - package org.apache.beam.runners.spark.stateful
 
- 
Spark-specific stateful operators.
 
- org.apache.beam.runners.spark.util - package org.apache.beam.runners.spark.util
 
- 
Internal utilities to translate Beam pipelines to Spark.
 
- org.apache.beam.sdk - package org.apache.beam.sdk
 
- 
Provides a simple, powerful model for building both batch and streaming parallel data processing
 
Pipelines.
 
 
- org.apache.beam.sdk.annotations - package org.apache.beam.sdk.annotations
 
- 
Defines annotations used across the SDK.
 
- org.apache.beam.sdk.coders - package org.apache.beam.sdk.coders
 
- 
Defines 
Coders to specify how data is encoded to and
 decoded from byte strings.
 
 
- org.apache.beam.sdk.extensions.gcp.auth - package org.apache.beam.sdk.extensions.gcp.auth
 
- 
Defines classes related to interacting with Credentials for pipeline
 creation and execution containing Google Cloud Platform components.
 
- org.apache.beam.sdk.extensions.gcp.options - package org.apache.beam.sdk.extensions.gcp.options
 
- 
Defines 
PipelineOptions for configuring pipeline execution
 for Google Cloud Platform components.
 
 
- org.apache.beam.sdk.extensions.gcp.storage - package org.apache.beam.sdk.extensions.gcp.storage
 
- 
Defines IO connectors for Google Cloud Storage.
 
- org.apache.beam.sdk.extensions.jackson - package org.apache.beam.sdk.extensions.jackson
 
- 
Utilities for parsing and creating JSON serialized objects.
 
- org.apache.beam.sdk.extensions.joinlibrary - package org.apache.beam.sdk.extensions.joinlibrary
 
- 
Utilities for performing SQL-style joins of keyed 
PCollections.
 
 
- org.apache.beam.sdk.extensions.protobuf - package org.apache.beam.sdk.extensions.protobuf
 
- 
Defines a 
Coder for Protocol Buffers messages, 
ProtoCoder.
 
 
- org.apache.beam.sdk.extensions.sketching - package org.apache.beam.sdk.extensions.sketching
 
- 
Utilities for computing statistical indicators using probabilistic sketches.
 
- org.apache.beam.sdk.extensions.sorter - package org.apache.beam.sdk.extensions.sorter
 
- 
Utility for performing local sort of potentially large sets of values.
 
- org.apache.beam.sdk.extensions.sql - package org.apache.beam.sdk.extensions.sql
 
- 
BeamSQL provides a new interface to run a SQL statement with Beam.
 
- org.apache.beam.sdk.extensions.sql.example.model - package org.apache.beam.sdk.extensions.sql.example.model
 
- 
Java classes used for modeling the examples.
 
- org.apache.beam.sdk.extensions.sql.impl - package org.apache.beam.sdk.extensions.sql.impl
 
- 
Implementation classes of BeamSql.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter - package org.apache.beam.sdk.extensions.sql.impl.interpreter
 
- 
The interpreter generates runnable 'code' to execute SQL relational expressions.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
Implementation for operators in SqlStdOperatorTable.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
 
- 
Arithmetic operators.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.array
 
- 
Expressions implementing array operations.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.collection
 
- 
Expressions implementing collections operations.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
 
- 
Comparison operators.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
Date functions.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
 
- 
Logical operators.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.map
 
- 
Expressions implementing map operations.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
 
- 
Math functions/operators.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
Implementation for Reinterpret type conversions.
 
- org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.row - package org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.row
 
- 
Support for fields of type ROW.
 
- org.apache.beam.sdk.extensions.sql.impl.parser - package org.apache.beam.sdk.extensions.sql.impl.parser
 
- 
Beam SQL parsing additions to Calcite SQL.
 
- org.apache.beam.sdk.extensions.sql.impl.planner - package org.apache.beam.sdk.extensions.sql.impl.planner
 
- 
BeamQueryPlanner is the main interface.
 
- org.apache.beam.sdk.extensions.sql.impl.rel - package org.apache.beam.sdk.extensions.sql.impl.rel
 
- 
BeamSQL-specific nodes that replace RelNode.
 
- org.apache.beam.sdk.extensions.sql.impl.rule - package org.apache.beam.sdk.extensions.sql.impl.rule
 
- 
 
- org.apache.beam.sdk.extensions.sql.impl.schema - package org.apache.beam.sdk.extensions.sql.impl.schema
 
- 
Defines table schemas to map to Beam IO components.
 
- org.apache.beam.sdk.extensions.sql.impl.transform - package org.apache.beam.sdk.extensions.sql.impl.transform
 
- 
 
- org.apache.beam.sdk.extensions.sql.impl.transform.agg - package org.apache.beam.sdk.extensions.sql.impl.transform.agg
 
- 
Implementation of standard SQL aggregation functions, e.g.
 
- org.apache.beam.sdk.extensions.sql.impl.utils - package org.apache.beam.sdk.extensions.sql.impl.utils
 
- 
Utility classes.
 
- org.apache.beam.sdk.extensions.sql.meta - package org.apache.beam.sdk.extensions.sql.meta
 
- 
Metadata related classes.
 
- org.apache.beam.sdk.extensions.sql.meta.provider - package org.apache.beam.sdk.extensions.sql.meta.provider
 
- 
Table providers.
 
- org.apache.beam.sdk.extensions.sql.meta.provider.bigquery - package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
 
- 
Table schema for BigQuery.
 
- org.apache.beam.sdk.extensions.sql.meta.provider.kafka - package org.apache.beam.sdk.extensions.sql.meta.provider.kafka
 
- 
Table schema for KafkaIO.
 
- org.apache.beam.sdk.extensions.sql.meta.provider.pubsub - package org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
 
- 
 
- org.apache.beam.sdk.extensions.sql.meta.provider.test - package org.apache.beam.sdk.extensions.sql.meta.provider.test
 
- 
Table schema for in-memory test data.
 
- org.apache.beam.sdk.extensions.sql.meta.provider.text - package org.apache.beam.sdk.extensions.sql.meta.provider.text
 
- 
Table schema for text files.
 
- org.apache.beam.sdk.extensions.sql.meta.store - package org.apache.beam.sdk.extensions.sql.meta.store
 
- 
Meta stores.
 
- org.apache.beam.sdk.fn - package org.apache.beam.sdk.fn
 
- 
The top level package for the Fn Execution Java libraries.
 
- org.apache.beam.sdk.fn.channel - package org.apache.beam.sdk.fn.channel
 
- 
gRPC channel management.
 
- org.apache.beam.sdk.fn.data - package org.apache.beam.sdk.fn.data
 
- 
Classes to interact with the portability framework data plane.
 
- org.apache.beam.sdk.fn.function - package org.apache.beam.sdk.fn.function
 
- 
Java 8 functional interface extensions.
 
- org.apache.beam.sdk.fn.stream - package org.apache.beam.sdk.fn.stream
 
- 
gRPC stream management.
 
- org.apache.beam.sdk.fn.test - package org.apache.beam.sdk.fn.test
 
- 
Utilities for testing use of this package.
 
- org.apache.beam.sdk.fn.windowing - package org.apache.beam.sdk.fn.windowing
 
- 
Common utilities related to windowing during execution of a pipeline.
 
- org.apache.beam.sdk.io - package org.apache.beam.sdk.io
 
- 
Defines transforms for reading and writing common storage formats, including 
AvroIO and 
TextIO.
 
 
- org.apache.beam.sdk.io.amqp - package org.apache.beam.sdk.io.amqp
 
- 
Transforms for reading and writing using AMQP 1.0 protocol.
 
- org.apache.beam.sdk.io.aws.options - package org.apache.beam.sdk.io.aws.options
 
- 
Defines 
PipelineOptions for configuring pipeline execution
 for Amazon Web Services components.
 
 
- org.apache.beam.sdk.io.aws.s3 - package org.apache.beam.sdk.io.aws.s3
 
- 
Defines IO connectors for Amazon Web Services S3.
 
- org.apache.beam.sdk.io.aws.sns - package org.apache.beam.sdk.io.aws.sns
 
- 
Defines IO connectors for Amazon Web Services SNS.
 
- org.apache.beam.sdk.io.aws.sqs - package org.apache.beam.sdk.io.aws.sqs
 
- 
Defines IO connectors for Amazon Web Services SQS.
 
- org.apache.beam.sdk.io.cassandra - package org.apache.beam.sdk.io.cassandra
 
- 
Transforms for reading and writing from/to Apache Cassandra.
 
- org.apache.beam.sdk.io.elasticsearch - package org.apache.beam.sdk.io.elasticsearch
 
- 
Transforms for reading and writing from Elasticsearch.
 
- org.apache.beam.sdk.io.fs - package org.apache.beam.sdk.io.fs
 
- 
Apache Beam FileSystem interfaces and their default implementations.
 
- org.apache.beam.sdk.io.gcp.bigquery - package org.apache.beam.sdk.io.gcp.bigquery
 
- 
Defines transforms for reading and writing from Google BigQuery.
 
- org.apache.beam.sdk.io.gcp.bigtable - package org.apache.beam.sdk.io.gcp.bigtable
 
- 
Defines transforms for reading and writing from Google Cloud Bigtable.
 
- org.apache.beam.sdk.io.gcp.common - package org.apache.beam.sdk.io.gcp.common
 
- 
Defines common Google Cloud Platform IO support classes.
 
- org.apache.beam.sdk.io.gcp.datastore - package org.apache.beam.sdk.io.gcp.datastore
 
- 
Provides an API for reading from and writing to 
Google Cloud Datastore over different
 versions of the Cloud Datastore Client libraries.
 
 
- org.apache.beam.sdk.io.gcp.pubsub - package org.apache.beam.sdk.io.gcp.pubsub
 
- 
 
- org.apache.beam.sdk.io.gcp.spanner - package org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- org.apache.beam.sdk.io.hadoop - package org.apache.beam.sdk.io.hadoop
 
- 
Classes shared by Hadoop based IOs.
 
- org.apache.beam.sdk.io.hadoop.inputformat - package org.apache.beam.sdk.io.hadoop.inputformat
 
- 
Defines transforms for reading from Data sources which implement Hadoop Input Format.
 
- org.apache.beam.sdk.io.hbase - package org.apache.beam.sdk.io.hbase
 
- 
Transforms for reading and writing from/to Apache HBase.
 
- org.apache.beam.sdk.io.hcatalog - package org.apache.beam.sdk.io.hcatalog
 
- 
Transforms for reading and writing using HCatalog.
 
- org.apache.beam.sdk.io.hdfs - package org.apache.beam.sdk.io.hdfs
 
- 
FileSystem implementation for any Hadoop 
FileSystem.
 
 
- org.apache.beam.sdk.io.jdbc - package org.apache.beam.sdk.io.jdbc
 
- 
Transforms for reading and writing from JDBC.
 
- org.apache.beam.sdk.io.jms - package org.apache.beam.sdk.io.jms
 
- 
Transforms for reading and writing from JMS (Java Messaging Service).
 
- org.apache.beam.sdk.io.kafka - package org.apache.beam.sdk.io.kafka
 
- 
Transforms for reading and writing from Apache Kafka.
 
- org.apache.beam.sdk.io.kafka.serialization - package org.apache.beam.sdk.io.kafka.serialization
 
- 
Kafka serializers and deserializers.
 
- org.apache.beam.sdk.io.kinesis - package org.apache.beam.sdk.io.kinesis
 
- 
Transforms for reading and writing from Amazon Kinesis.
 
- org.apache.beam.sdk.io.mongodb - package org.apache.beam.sdk.io.mongodb
 
- 
Transforms for reading and writing from MongoDB.
 
- org.apache.beam.sdk.io.mqtt - package org.apache.beam.sdk.io.mqtt
 
- 
Transforms for reading and writing from MQTT.
 
- org.apache.beam.sdk.io.parquet - package org.apache.beam.sdk.io.parquet
 
- 
Transforms for reading and writing from Parquet.
 
- org.apache.beam.sdk.io.range - package org.apache.beam.sdk.io.range
 
- 
Provides thread-safe helpers for implementing dynamic work rebalancing in position-based bounded
 sources.
 
- org.apache.beam.sdk.io.redis - package org.apache.beam.sdk.io.redis
 
- 
Transforms for reading and writing from Redis.
 
- org.apache.beam.sdk.io.solr - package org.apache.beam.sdk.io.solr
 
- 
Transforms for reading and writing from/to Solr.
 
- org.apache.beam.sdk.io.tika - package org.apache.beam.sdk.io.tika
 
- 
Transform for reading and parsing files with Apache Tika.
 
- org.apache.beam.sdk.io.xml - package org.apache.beam.sdk.io.xml
 
- 
Transforms for reading and writing XML files.
 
- org.apache.beam.sdk.metrics - package org.apache.beam.sdk.metrics
 
- 
Metrics allow exporting information about the execution of a pipeline.
 
- org.apache.beam.sdk.options - package org.apache.beam.sdk.options
 
- 
 
- org.apache.beam.sdk.schemas - package org.apache.beam.sdk.schemas
 
- 
Defines 
Schema and other classes for representing schema'd
 data in a 
Pipeline.
 
 
- org.apache.beam.sdk.schemas.transforms - package org.apache.beam.sdk.schemas.transforms
 
- 
Defines transforms that work on PCollections with schemas.
 
- org.apache.beam.sdk.schemas.utils - package org.apache.beam.sdk.schemas.utils
 
- 
Defines utilities for dealing with schemas.
 
- org.apache.beam.sdk.state - package org.apache.beam.sdk.state
 
- 
Classes and interfaces for interacting with state.
 
- org.apache.beam.sdk.testing - package org.apache.beam.sdk.testing
 
- 
Defines utilities for unit testing Apache Beam pipelines.
 
- org.apache.beam.sdk.transforms - package org.apache.beam.sdk.transforms
 
- 
Defines 
PTransforms for transforming data in a pipeline.
 
 
- org.apache.beam.sdk.transforms.display - package org.apache.beam.sdk.transforms.display
 
- 
 
- org.apache.beam.sdk.transforms.join - package org.apache.beam.sdk.transforms.join
 
- 
Defines the 
CoGroupByKey transform for joining
 multiple PCollections.
 
 
- org.apache.beam.sdk.transforms.splittabledofn - package org.apache.beam.sdk.transforms.splittabledofn
 
- 
 
- org.apache.beam.sdk.transforms.windowing - package org.apache.beam.sdk.transforms.windowing
 
- 
Defines the 
Window transform for dividing the
 elements in a PCollection into windows, and the 
Trigger for controlling when those elements are output.
 
 
- org.apache.beam.sdk.values - package org.apache.beam.sdk.values
 
- 
 
- out() - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
 
- 
 
- out(int) - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
 
- 
 
- outbound(DataStreams.OutputChunkConsumer<ByteString>) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
 
- 
Converts a single element delimited OutputStream into multiple ByteStrings.
 
- outbound(DataStreams.OutputChunkConsumer<ByteString>, int) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
 
- 
Converts a single element delimited OutputStream into multiple ByteStrings using the specified maximum chunk size.
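The chunking behavior described above can be sketched in plain Java. This is an illustrative stand-in (the class and method names here are hypothetical, not the Beam `DataStreams` API): the input bytes are split into consecutive chunks, none larger than the given maximum size.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of max-chunk-size splitting, illustrating the behavior
// described for DataStreams.outbound; not the Beam implementation.
public class ChunkSketch {
    // Splits the input into consecutive chunks of at most maxChunkSize bytes.
    public static List<byte[]> chunk(byte[] input, int maxChunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int start = 0; start < input.length; start += maxChunkSize) {
            int end = Math.min(start + maxChunkSize, input.length);
            chunks.add(Arrays.copyOfRange(input, start, end));
        }
        return chunks;
    }
}
```

For example, five bytes with a maximum chunk size of two yield three chunks, the last holding the single leftover byte.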
 
- OutboundObserverFactory - Class in org.apache.beam.sdk.fn.stream
 
- 
Creates factories that determine the underlying StreamObserver implementation to use when
 interacting with fn execution APIs.
 
- OutboundObserverFactory() - Constructor for class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
-  
 
- OutboundObserverFactory.BasicFactory<ReqT,RespT> - Interface in org.apache.beam.sdk.fn.stream
 
- 
Creates an outbound observer for the given inbound observer.
 
- outboundObserverFor(StreamObserver<ReqT>) - Method in interface org.apache.beam.sdk.fn.stream.OutboundObserverFactory.BasicFactory
 
-  
 
- outboundObserverFor(OutboundObserverFactory.BasicFactory<ReqT, RespT>, StreamObserver<ReqT>) - Method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
- 
Creates an outbound observer for the given inbound observer by potentially inserting hooks into
 the inbound and outbound observers.
 
- OUTER - Static variable in class org.apache.beam.sdk.coders.Coder.Context
 
- 
Deprecated.
The outer context: the value being encoded or decoded takes up the remainder of the
 record/stream contents.
 
- OutgoingMessage(byte[], Map<String, String>, long, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
-  
 
- OUTPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- output(T) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
 
- 
Output the object.
 
- output(T, Instant) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
 
- 
Output the object using the specified timestamp.
 
- output(OutputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
 
- 
Adds the given element to the main output PCollection at the given timestamp in the
 given window.
 
- output(TupleTag<T>, T, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
 
- 
Adds the given element to the output PCollection with the given tag at the given
 timestamp in the given window.
 
- output(T) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
 
-  
 
- output(OutputT) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
- 
Adds the given element to the main output PCollection.
 
- output(TupleTag<T>, T) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
- 
Adds the given element to the output PCollection with the given tag.
 
- OUTPUT_INFO - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- OUTPUT_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- outputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
Returns a type descriptor for the output of the given 
SerializableFunction, subject to
 Java type erasure: may contain unresolved type variables if the type was erased.
 
 
- outputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- OutputReceiverFactory - Interface in org.apache.beam.runners.fnexecution.control
 
- 
A factory that can create output receivers during an executable stage.
 
- OutputReference - Class in org.apache.beam.runners.dataflow.util
 
- 
A representation used by Steps to reference the
 output of other Steps.
 
- OutputReference(String, String) - Constructor for class org.apache.beam.runners.dataflow.util.OutputReference
 
-  
 
- outputRuntimeOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
- 
 
- outputTarget() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
-  
 
- outputType - Variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
 
-  
 
- outputWithTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
 
-  
 
- outputWithTimestamp(OutputT, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
- 
Adds the given element to the main output PCollection, with the given timestamp.
 
- outputWithTimestamp(TupleTag<T>, T, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
- 
Adds the given element to the specified output PCollection, with the given timestamp.
 
- overlaps(ByteKeyRange) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
Returns 
true if the specified 
ByteKeyRange overlaps this range.
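The overlap test above follows the usual half-open-interval rule. A minimal sketch, using ints in place of byte keys (the class name is hypothetical, not the Beam API):

```java
// Hypothetical sketch of half-open range overlap, illustrating the semantics
// described for ByteKeyRange.overlaps; real ByteKeyRanges hold byte-array keys.
public class RangeOverlapSketch {
    // Two half-open ranges [start1, end1) and [start2, end2) overlap
    // iff each range starts before the other one ends.
    public static boolean overlaps(int start1, int end1, int start2, int end2) {
        return start1 < end2 && start2 < end1;
    }
}
```

Note that adjacent ranges such as [0, 5) and [5, 10) do not overlap under this rule.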
 
 
- OVERLAY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
-  
 
- pane() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
 
- 
Returns information about the pane within this window into which the input element has been
 assigned.
 
- PaneInfo - Class in org.apache.beam.sdk.transforms.windowing
 
- 
Provides information about the pane an element belongs to.
 
- PaneInfo.PaneInfoCoder - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A Coder for encoding PaneInfo instances.
 
- PaneInfo.Timing - Enum in org.apache.beam.sdk.transforms.windowing
 
- 
Enumerates the possibilities for the timing of this pane firing related to the input and output
 watermarks for its computation.
 
- PARALLEL_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- Params() - Constructor for class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
- 
Construct a default Params object.
 
- ParamsCoder() - Constructor for class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
 
-  
 
- ParDo - Class in org.apache.beam.sdk.transforms
 
- 
ParDo is the core element-wise transform in Apache Beam, invoking a user-specified
 function on each of the elements of the input 
PCollection to produce zero or more output
 elements, all of which are collected into the output 
PCollection.
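The element-wise contract described above (zero or more outputs per input, all collected into the output) can be sketched without the Beam runtime. This is an assumption-laden illustration, not the Beam API; real pipelines apply ParDo.of(DoFn) to a PCollection.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Hypothetical sketch of ParDo's element-wise contract: the user function
// may emit zero or more outputs per input element, and every emitted output
// is collected. Lists stand in for PCollections here.
public class ParDoSketch {
    public static <I, O> List<O> applyElementWise(List<I> input, Function<I, List<O>> fn) {
        List<O> output = new ArrayList<>();
        for (I element : input) {
            output.addAll(fn.apply(element)); // zero or more outputs per element
        }
        return output;
    }
}
```

Splitting lines into words shows the variable fan-out: each input element may produce any number of output elements.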
 
 
- ParDo() - Constructor for class org.apache.beam.sdk.transforms.ParDo
 
-  
 
- ParDo.MultiOutput<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
A 
PTransform that, when applied to a 
PCollection<InputT>, invokes a
 user-specified 
DoFn<InputT, OutputT> on all its elements, which can emit elements to
 any of the 
PTransform's output 
PCollections, which are bundled into a result
 
PCollectionTuple.
 
 
- ParDo.SingleOutput<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
A 
PTransform that, when applied to a 
PCollection<InputT>, invokes a
 user-specified 
DoFn<InputT, OutputT> on all its elements, with all its outputs
 collected into an output 
PCollection<OutputT>.
 
 
- ParDoMultiOutputTranslator<InputT,OutputT> - Class in org.apache.beam.runners.gearpump.translators
 
- 
ParDo.MultiOutput is translated to a Gearpump flatMap function with 
DoFn wrapped in
 
DoFnFunction.
 
 
- ParDoMultiOutputTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.ParDoMultiOutputTranslator
 
-  
 
- ParDoMultiOverrideFactory<InputT,OutputT> - Class in org.apache.beam.runners.direct
 
- 
A 
PTransformOverrideFactory that provides overrides for applications of a 
ParDo
 in the direct runner.
 
 
- ParDoMultiOverrideFactory() - Constructor for class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
 
-  
 
- ParquetIO - Class in org.apache.beam.sdk.io.parquet
 
- 
IO to read and write Parquet files.
 
- ParquetIO.Read - Class in org.apache.beam.sdk.io.parquet
 
- 
 
- ParquetIO.ReadFiles - Class in org.apache.beam.sdk.io.parquet
 
- 
 
- ParquetIO.Sink - Class in org.apache.beam.sdk.io.parquet
 
- 
 
- Parse() - Constructor for class org.apache.beam.sdk.io.AvroIO.Parse
 
-  
 
- parse(GridFSDBFile, MongoDbGridFSIO.ParserCallback<T>) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Parser
 
-  
 
- parse() - Static method in class org.apache.beam.sdk.io.tika.TikaIO
 
- 
Parses files matching a given filepattern.
 
- Parse() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO.Parse
 
-  
 
- ParseAll() - Constructor for class org.apache.beam.sdk.io.AvroIO.ParseAll
 
-  
 
- parseAllGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- ParseException - Exception in org.apache.beam.sdk.extensions.sql.impl
 
- 
Exception thrown when Beam SQL is unable to parse the statement.
 
- ParseException(Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.impl.ParseException
 
-  
 
- ParseException(String, Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.impl.ParseException
 
-  
 
- parseFiles() - Static method in class org.apache.beam.sdk.io.tika.TikaIO
 
- 
 
- ParseFiles() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
-  
 
- parseGenericRecords(SerializableFunction<GenericRecord, T>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
Reads Avro file(s) containing records of an unspecified schema and converts each record to a
 custom type.
 
- ParseJsons<OutputT> - Class in org.apache.beam.sdk.extensions.jackson
 
- 
 
- parseQuery(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
-  
 
- ParseResult - Class in org.apache.beam.sdk.io.tika
 
- 
The result of parsing a single file with Tika: contains the file's location, metadata, extracted
 text, and optionally an error.
 
- parseTableSpec(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
- 
Parse a table specification in the form "[project_id]:[dataset_id].[table_id]" or
 "[dataset_id].[table_id]".
 
- Partition<T> - Class in org.apache.beam.sdk.transforms
 
- 
Partition takes a PCollection<T> and a PartitionFn, uses the PartitionFn to split the elements of the input PCollection into N partitions,
 and returns a PCollectionList<T> that bundles N PCollection<T>s
 containing the split elements.
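The splitting described above can be sketched in plain Java, with lists standing in for PCollection and PCollectionList. This is a semantic illustration under that assumption, not the Beam transform itself.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of Partition semantics: a partition function assigns
// each element an index in [0, numPartitions), and elements are bundled into
// one list per partition (standing in for PCollectionList<T>).
public class PartitionSketch {
    public interface PartitionFn<T> {
        int partitionFor(T element, int numPartitions);
    }

    public static <T> List<List<T>> partition(List<T> input, int numPartitions, PartitionFn<T> fn) {
        List<List<T>> partitions = new ArrayList<>();
        for (int i = 0; i < numPartitions; i++) {
            partitions.add(new ArrayList<>());
        }
        for (T element : input) {
            int index = fn.partitionFor(element, numPartitions);
            partitions.get(index).add(element); // route element to its partition
        }
        return partitions;
    }
}
```

Partitioning integers by parity into two buckets, for example, routes the evens to partition 0 and the odds to partition 1.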
 
- Partition.PartitionFn<T> - Interface in org.apache.beam.sdk.transforms
 
- 
A function object that chooses an output partition for an element.
 
- PartitionContext() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
 
-  
 
- partitioner() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
-  
 
- partitionFor(T, int) - Method in interface org.apache.beam.sdk.transforms.Partition.PartitionFn
 
- 
Chooses the partition into which to put the given element.
 
- PartitioningWindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
WindowFn that places each value into exactly one window based on its timestamp and
 never merges windows.
 
 
- PartitioningWindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
-  
 
- PartitionMark(String, int, long, long) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
-  
 
- PAssert - Class in org.apache.beam.sdk.testing
 
- 
An assertion on the contents of a 
PCollection incorporated into the pipeline.
 
 
- PAssert.DefaultConcludeTransform - Class in org.apache.beam.sdk.testing
 
- 
Default transform to check that a PAssert was successful.
 
- PAssert.GroupThenAssert<T> - Class in org.apache.beam.sdk.testing
 
- 
A transform that applies an assertion-checking function over iterables of ActualT to
 the entirety of the contents of its input.
 
- PAssert.GroupThenAssertForSingleton<T> - Class in org.apache.beam.sdk.testing
 
- 
A transform that applies an assertion-checking function to a single iterable contained as the
 sole element of a 
PCollection.
 
 
- PAssert.IterableAssert<T> - Interface in org.apache.beam.sdk.testing
 
- 
Builder interface for assertions applicable to iterables and PCollection contents.
 
- PAssert.OneSideInputAssert<ActualT> - Class in org.apache.beam.sdk.testing
 
- 
An assertion checker that takes a single 
PCollectionView<ActualT>
 and an assertion over 
ActualT, and checks it within a Beam pipeline.
 
 
- PAssert.PAssertionSite - Class in org.apache.beam.sdk.testing
 
- 
Track the place where an assertion is defined.
 
- PAssert.PCollectionContentsAssert<T> - Class in org.apache.beam.sdk.testing
 
- 
 
- PAssert.PCollectionContentsAssert.MatcherCheckerFn<T> - Class in org.apache.beam.sdk.testing
 
- 
Check that the passed-in matchers match the existing data.
 
- PAssert.SingletonAssert<T> - Interface in org.apache.beam.sdk.testing
 
- 
Builder interface for assertions applicable to a single value.
 
- pastEndOfWindow() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark
 
- 
Creates a trigger that fires when the watermark passes the end of the window.
 
- pastFirstElementInPane() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
- 
Creates a trigger that fires when the current processing time passes the processing time at
 which this trigger saw the first element in a pane.
 
- patchTableDescription(TableReference, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
- 
Patch BigQuery 
Table description.
 
 
- PathValidator - Interface in org.apache.beam.sdk.extensions.gcp.storage
 
- 
For internal use only; no backwards compatibility guarantees.
 
- PathValidatorFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
 
-  
 
- PBegin - Class in org.apache.beam.sdk.values
 
- 
 
- PBegin(Pipeline) - Constructor for class org.apache.beam.sdk.values.PBegin
 
- 
 
- PCollection<T> - Class in org.apache.beam.sdk.values
 
- 
 
- PCollection.IsBounded - Enum in org.apache.beam.sdk.values
 
- 
The enumeration of cases for whether a 
PCollection is bounded.
 
 
- PCollectionContentsAssert(PCollection<T>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- PCollectionContentsAssert(PCollection<T>, PAssert.AssertionWindows, SimpleFunction<Iterable<ValueInSingleWindow<T>>, Iterable<T>>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- PCollectionList<T> - Class in org.apache.beam.sdk.values
 
- 
 
- pCollections() - Static method in class org.apache.beam.sdk.transforms.Flatten
 
- 
 
- PCollectionTuple - Class in org.apache.beam.sdk.values
 
- 
 
- PCollectionView<T> - Interface in org.apache.beam.sdk.values
 
- 
 
- PCollectionViews - Class in org.apache.beam.sdk.values
 
- 
For internal use only; no backwards compatibility guarantees.
 
- PCollectionViews() - Constructor for class org.apache.beam.sdk.values.PCollectionViews
 
-  
 
- PCollectionViews.IterableViewFn<T> - Class in org.apache.beam.sdk.values
 
- 
Implementation which is able to adapt a multimap materialization to an Iterable<T>.
 
- PCollectionViews.ListViewFn<T> - Class in org.apache.beam.sdk.values
 
- 
Implementation which is able to adapt a multimap materialization to a List<T>.
 
- PCollectionViews.MapViewFn<K,V> - Class in org.apache.beam.sdk.values
 
- 
Implementation which is able to adapt a multimap materialization to a Map<K, V>.
 
- PCollectionViews.MultimapViewFn<K,V> - Class in org.apache.beam.sdk.values
 
- 
Implementation which is able to adapt a multimap materialization to a Map<K,
 Iterable<V>>.
 
- PCollectionViews.SimplePCollectionView<ElemT,PrimitiveViewT,ViewT,W extends BoundedWindow> - Class in org.apache.beam.sdk.values
 
- 
A class for 
PCollectionView implementations, with additional type parameters that are
 not visible at pipeline assembly time when the view is used as a side input.
 
 
- PCollectionViews.SingletonViewFn<T> - Class in org.apache.beam.sdk.values
 
- 
Implementation which is able to adapt a multimap materialization to a T.
 
- PDone - Class in org.apache.beam.sdk.values
 
- 
 
- peekOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- peekOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- peekOutputElementsInWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- peekOutputElementsInWindow(TupleTag<OutputT>, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- peekOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- perElement() - Static method in class org.apache.beam.sdk.transforms.Count
 
- 
 
- perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
 
- 
Like 
ApproximateDistinct.globally() but per key, i.e., computes the approximate number of distinct values per
 key in a 
PCollection<KV<K, V>> and returns 
PCollection<KV<K, Long>>.
 
 
- perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
 
- 
Like 
SketchFrequencies.globally() but per key, i.e., a Count-min sketch per key in 
PCollection<KV<K, V>> and returns a 
PCollection<KV<K, CountMinSketch>>.
 
 
- perKey() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
 
- 
 
- perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
 
- 
Returns a PTransform that takes a PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the
 input PCollection to a List of the approximate N-tiles of the values
 associated with that key in the input PCollection.
 
- perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
 
- 
 
- perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
 
- 
Returns a PTransform that takes a PCollection<KV<K, V>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the
 input PCollection to an estimate of the number of distinct values associated with that
 key in the input PCollection.
 
- perKey(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
 
- 
 
- perKey(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
 
- 
Returns a 
Combine.PerKey PTransform that first groups its input 
PCollection of 
KVs by keys and windows, then invokes the given function on each of the
 values lists to produce a combined value, and then returns a 
PCollection of 
KVs
 mapping each distinct key to its combined value for each window.
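The group-then-combine behavior described above can be sketched for a single window in plain Java. This is an illustrative reduction under the assumption that the combining function is associative (as Beam requires); the real transform also handles windowing and distributes the combine across workers.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.BinaryOperator;

// Hypothetical sketch of Combine.perKey within one window: values are grouped
// by key and reduced pairwise with the supplied function. Map.Entry stands in
// for Beam's KV, and a Map stands in for the output PCollection of KVs.
public class CombinePerKeySketch {
    public static <K, V> Map<K, V> combinePerKey(List<Map.Entry<K, V>> input, BinaryOperator<V> fn) {
        Map<K, V> result = new LinkedHashMap<>();
        for (Map.Entry<K, V> kv : input) {
            result.merge(kv.getKey(), kv.getValue(), fn); // combine values per key
        }
        return result;
    }
}
```

Summing integers per key, for instance, collapses all values sharing a key into a single combined value.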
 
 
- perKey(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
 
- 
Returns a 
Combine.PerKey PTransform that first groups its input 
PCollection of 
KVs by keys and windows, then invokes the given function on each of the
 values lists to produce a combined value, and then returns a 
PCollection of 
KVs
 mapping each distinct key to its combined value for each window.
 
 
- perKey() - Static method in class org.apache.beam.sdk.transforms.Count
 
- 
Returns a 
PTransform that counts the number of elements associated with each key of its
 input 
PCollection.
 
 
- perKey() - Static method in class org.apache.beam.sdk.transforms.Latest
 
- 
Returns a 
PTransform that takes as input a 
PCollection<KV<K, V>> and returns a
 
PCollection<KV<K, V>> whose contents is the latest element per-key according to its
 event time.
 
 
- perKey() - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a
 PCollection<KV<K, T>> that contains an output element mapping each distinct key in the
 input PCollection to the maximum according to the natural ordering of T of the
 values associated with that key in the input PCollection.
 
- perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
 
- 
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a
 PCollection<KV<K, T>> that contains one output element per key mapping each to the
 maximum of the values associated with that key in the input PCollection.
 
- perKey() - Static method in class org.apache.beam.sdk.transforms.Mean
 
- 
Returns a PTransform that takes an input PCollection<KV<K, N>> and returns a
 PCollection<KV<K, Double>> that contains an output element mapping each distinct key in
 the input PCollection to the mean of the values associated with that key in the input
 PCollection.
 
- perKey() - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a
 PCollection<KV<K, T>> that contains an output element mapping each distinct key in the
 input PCollection to the minimum according to the natural ordering of T of the
 values associated with that key in the input PCollection.
 
- perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
 
- 
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a
 PCollection<KV<K, T>> that contains one output element per key mapping each to the
 minimum of the values associated with that key in the input PCollection.
 
- perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a
 PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key
 in the input PCollection to the largest count values associated with that key
 in the input PCollection<KV<K, V>>, in decreasing order, sorted using the given Comparator<V>.
 
- PerKeyDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
 
-  
 
- PerKeyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
 
-  
 
- PerKeySketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
 
-  
 
- PInput - Interface in org.apache.beam.sdk.values
 
- 
The interface for things that might be input to a 
PTransform.
 
 
- pipeline() - Method in class org.apache.beam.runners.fnexecution.jobsubmission.JobPreparation
 
-  
 
- Pipeline - Class in org.apache.beam.sdk
 
- 
 
- Pipeline(PipelineOptions) - Constructor for class org.apache.beam.sdk.Pipeline
 
-  
 
- Pipeline.PipelineExecutionException - Exception in org.apache.beam.sdk
 
- 
Thrown during execution of a 
Pipeline, whenever user code within that 
Pipeline
 throws an exception.
 
 
- Pipeline.PipelineVisitor - Interface in org.apache.beam.sdk
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- Pipeline.PipelineVisitor.CompositeBehavior - Enum in org.apache.beam.sdk
 
- 
Control enum indicating whether a traversal should process the contents of a
 composite transform.
 
- Pipeline.PipelineVisitor.Defaults - Class in org.apache.beam.sdk
 
- 
 
- pipelineExecution - Variable in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- PipelineExecutionException(Throwable) - Constructor for exception org.apache.beam.sdk.Pipeline.PipelineExecutionException
 
- 
 
- PipelineMessageReceiver - Interface in org.apache.beam.runners.local
 
- 
Handles failures in the form of exceptions.
 
- pipelineOptions() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
-  
 
- PipelineOptions - Interface in org.apache.beam.sdk.options
 
- 
PipelineOptions are used to configure Pipelines.
 
- PipelineOptions.AtomicLongFactory - Class in org.apache.beam.sdk.options
 
- 
DefaultValueFactory which supplies an ID that is guaranteed to be unique within the
 given process.
 
 
- PipelineOptions.CheckEnabled - Enum in org.apache.beam.sdk.options
 
- 
Enumeration of the possible states for a given check.
 
- PipelineOptions.DirectRunner - Class in org.apache.beam.sdk.options
 
- 
A 
DefaultValueFactory that obtains the class of the 
DirectRunner if it exists
 on the classpath, and throws an exception otherwise.
 
 
- PipelineOptions.JobNameFactory - Class in org.apache.beam.sdk.options
 
- 
 
- PipelineOptions.NoOpMetricsSink - Class in org.apache.beam.sdk.options
 
- 
A 
DefaultValueFactory that obtains the class of the 
NoOpMetricsSink if it
 exists on the classpath, and throws an exception otherwise.
 
 
- PipelineOptions.UserAgentFactory - Class in org.apache.beam.sdk.options
 
- 
Returns a user agent string constructed from ReleaseInfo.getName() and ReleaseInfo.getVersion(), in the format [name]/[version].
 
- PipelineOptionsFactory - Class in org.apache.beam.sdk.options
 
- 
 
- PipelineOptionsFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsFactory
 
-  
 
- PipelineOptionsFactory.Builder - Class in org.apache.beam.sdk.options
 
- 
 
- PipelineOptionsRegistrar - Interface in org.apache.beam.sdk.options
 
- 
PipelineOptions creators can automatically have their 
PipelineOptions registered with this SDK by creating a 
ServiceLoader entry and a
 concrete implementation of this interface.
 
 
- PipelineOptionsValidator - Class in org.apache.beam.sdk.options
 
- 
 
- PipelineOptionsValidator() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsValidator
 
-  
 
- PipelineResult - Interface in org.apache.beam.sdk
 
- 
 
- PipelineResult.State - Enum in org.apache.beam.sdk
 
- 
Possible job states, for both completed and ongoing jobs.
 
- PipelineRunner<ResultT extends PipelineResult> - Class in org.apache.beam.sdk
 
- 
 
- PipelineRunner() - Constructor for class org.apache.beam.sdk.PipelineRunner
 
-  
 
- plusDelayOf(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
- 
Adds some delay to the original target time.
 
- POJOUtils - Class in org.apache.beam.sdk.schemas.utils
 
- 
A set of utilities to generate getter and setter classes for POJOs.
 
- POJOUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.POJOUtils
 
-  
 
- PojoValueGetterFactory - Class in org.apache.beam.sdk.schemas.utils
 
- 
 
- PojoValueGetterFactory() - Constructor for class org.apache.beam.sdk.schemas.utils.PojoValueGetterFactory
 
-  
 
- PojoValueSetterFactory - Class in org.apache.beam.sdk.schemas.utils
 
- 
 
- PojoValueSetterFactory() - Constructor for class org.apache.beam.sdk.schemas.utils.PojoValueSetterFactory
 
-  
 
- PollFn() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth.PollFn
 
-  
 
- pollFor(Duration) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.PollingAssertion
 
-  
 
- pollJob(JobReference, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
 
- 
Waits until the job is Done, and returns the job.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.ParseAll
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.ReadAll
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.CompressedSource
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
Populates the display data.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
 
- 
Populates the display data.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Bounded
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
- 
Populate the display data with connectionConfiguration details.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Source
 
- 
Register display data for the given transform or component.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in interface org.apache.beam.sdk.transforms.display.HasDisplayData
 
- 
Register display data for the given transform or component.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.DoFn
 
- 
Register display data for the given transform or component.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Filter
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.MapElements
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
- 
Register display data for the given transform or component.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Partition
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
Register display data for the given transform or component.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
 
- 
Register display data for the given transform or component.
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
-  
 
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Register display data for the given transform or component.
 
- PortablePipelineOptions - Interface in org.apache.beam.sdk.options
 
- 
Pipeline options common to all portable runners.
 
- PortableRunner - Class in org.apache.beam.runners.reference
 
- 
 
- PortableRunnerRegistrar - Class in org.apache.beam.runners.reference
 
- 
Registrar for the portable runner.
 
- PortableRunnerRegistrar() - Constructor for class org.apache.beam.runners.reference.PortableRunnerRegistrar
 
-  
 
- POSITION - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
-  
 
- POutput - Interface in org.apache.beam.sdk.values
 
- 
The interface for things that might be output from a 
PTransform.
 
 
- precisionForRelativeError(double) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
 
- 
Computes the precision based on the desired relative error.
 
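The precision computation above follows from the standard HyperLogLog error bound, relativeError ≈ 1.04 / sqrt(2^p). A minimal sketch of that relation — a hypothetical stand-in for ApproximateDistinct.precisionForRelativeError, assuming it uses the standard formula:

```java
public class HllPrecisionSketch {
    // Standard HyperLogLog bound: relativeError ≈ 1.04 / sqrt(2^p),
    // hence p = ceil(2 * log2(1.04 / relativeError)).
    static int precisionForRelativeError(double relativeError) {
        return (int) Math.ceil(2.0 * Math.log(1.04 / relativeError) / Math.log(2));
    }

    public static void main(String[] args) {
        // A 1% relative error requires precision 14 (2^14 = 16384 registers).
        System.out.println(precisionForRelativeError(0.01));
    }
}
```

Lower target errors grow the precision only logarithmically, which is what makes the sketch cheap at scale.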
- prepare(JobApi.PrepareJobRequest, StreamObserver<JobApi.PrepareJobResponse>) - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- prepare(JobApi.PrepareJobRequest, StreamObserver<JobApi.PrepareJobResponse>) - Method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
-  
 
- prepare(JobApi.PrepareJobRequest, StreamObserver<JobApi.PrepareJobResponse>) - Method in class org.apache.beam.runners.reference.testing.TestJobService
 
-  
 
- prepare() - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionExecutor
 
- 
Invoked before data processing.
 
- prepare() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlFnExecutor
 
-  
 
- prepareForProcessing() - Method in class org.apache.beam.sdk.transforms.DoFn
 
- 
 
- prepareWrite(WritableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
Called with the channel that a subclass will write its header, footer, and values to.
 
- PrepareWrite<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- PrepareWrite(DynamicDestinations<T, DestinationT>, SerializableFunction<T, TableRow>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
 
-  
 
- primary() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
-  
 
- PrimitiveParDoSingleFactory<InputT,OutputT> - Class in org.apache.beam.runners.dataflow
 
- 
 
- PrimitiveParDoSingleFactory() - Constructor for class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
 
-  
 
- PrimitiveParDoSingleFactory.ParDoSingle<InputT,OutputT> - Class in org.apache.beam.runners.dataflow
 
- 
A single-output primitive 
ParDo.
 
 
- PrimitiveParDoSingleFactory.PayloadTranslator - Class in org.apache.beam.runners.dataflow
 
- 
 
- PrimitiveParDoSingleFactory.Registrar - Class in org.apache.beam.runners.dataflow
 
- 
 
- printHelp(PrintStream) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
Outputs to the given stream the set of options registered with the PipelineOptionsFactory,
 with a description for each one if available.
 
- printHelp(PrintStream, Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
Outputs the set of options available to be set for the passed in 
PipelineOptions
 interface.
 
 
- process(List<JobMessage>) - Method in interface org.apache.beam.runners.dataflow.util.MonitoringUtil.JobMessagesHandler
 
- 
Process the messages.
 
- process(List<JobMessage>) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
 
-  
 
- processBundle(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- processBundle(InputT...) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- ProcessBundleDescriptors - Class in org.apache.beam.runners.fnexecution.control
 
- 
Utility methods for creating BeamFnApi.ProcessBundleDescriptor instances.
 
- ProcessBundleDescriptors() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
 
-  
 
- ProcessBundleDescriptors.BagUserStateSpec<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.control
 
- 
A container type storing references to the key, value, and window 
Coder used when
 handling bag user state requests.
 
 
- ProcessBundleDescriptors.ExecutableProcessBundleDescriptor - Class in org.apache.beam.runners.fnexecution.control
 
-  
 
- ProcessBundleDescriptors.SideInputSpec<K,T,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.control
 
- 
A container type storing references to the key, value, and window 
Coder used when
 handling side input state requests.
 
 
- ProcessBundleDescriptors.TimerSpec<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.control
 
- 
A container type storing references to the key, timer and payload coders and the remote input
 destination used when handling timer requests.
 
- ProcessContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.ProcessContext
 
-  
 
- ProcessContinuation() - Constructor for class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
 
-  
 
- processElement(DoFn<KV<K, Iterable<KV<Instant, WindowedValue<KV<K, V>>>>>, OutputT>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
-  
 
- processElement(DoFn<Iterable<T>, T>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
 
-  
 
- processElement(WindowedValue<InputT>) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
-  
 
- processElement(DoFn<KV<Row, Row>, Row>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.MergeAggregationRecord
 
-  
 
- processElement(DoFn<KV<Row, Row>, Row>.ProcessContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.SideInputJoinDoFn
 
-  
 
- processElement(DoFn<KV<Row, CoGbkResult>, Row>.ProcessContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.SetOperatorFilteringDoFn
 
-  
 
- processElement(DoFn<Row, Void>.ProcessContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
 
-  
 
- processElement(DoFn<PubsubMessage, Row>.ProcessContext) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubMessageToRow
 
-  
 
- processElement(DoFn<T, Void>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
-  
 
- processElement(InputT) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- processElement(DoFn<ValueWithRecordId<T>, T>.ProcessContext) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
 
-  
 
- ProcessEnvironment - Class in org.apache.beam.runners.fnexecution.environment
 
- 
Environment for process-based execution.
 
- ProcessEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
 
- 
 
- ProcessEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
 
- 
Provider of ProcessEnvironmentFactory.
 
- ProcessingTimeEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
 
-  
 
- ProcessingTimePolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
 
-  
 
- ProcessManager - Class in org.apache.beam.runners.fnexecution.environment
 
- 
A simple process manager which forks processes and kills them if necessary.
 
- processTimestampedElement(TimestampedValue<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- ProcessWatcher(Process) - Constructor for class org.apache.beam.runners.apex.ApexYarnLauncher.ProcessWatcher
 
-  
 
- processWindowedElement(InputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- PROJECT_ID_REGEXP - Static variable in class org.apache.beam.runners.dataflow.DataflowRunner
 
- 
Project IDs may contain only lowercase letters, digits, or dashes.
 
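As an illustration only (the actual PROJECT_ID_REGEXP value is not reproduced here), a hypothetical pattern enforcing that constraint — lowercase letters, digits, or dashes, starting with a letter — could look like:

```java
import java.util.regex.Pattern;

public class ProjectIdCheckSketch {
    // Hypothetical pattern, NOT the real PROJECT_ID_REGEXP: permits only
    // lowercase letters, digits, and dashes, and requires a leading letter.
    static final Pattern PROJECT_ID = Pattern.compile("[a-z][-a-z0-9]*");

    static boolean isValid(String id) {
        return PROJECT_ID.matcher(id).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValid("my-project-123")); // accepted
        System.out.println(isValid("My_Project"));     // rejected: uppercase and underscore
    }
}
```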
- projectPathFromId(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- projectPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- properties(JSONObject) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
-  
 
- PROPERTY_BEAM_TEST_PIPELINE_OPTIONS - Static variable in class org.apache.beam.sdk.testing.TestPipeline
 
- 
 
- propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
- 
Returns the property name associated with this provider.
 
- propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
- 
Returns the property name that corresponds to this provider.
 
- PropertyNames - Class in org.apache.beam.runners.dataflow.util
 
- 
Constant property names used by the SDK in CloudWorkflow specifications.
 
- PropertyNames() - Constructor for class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- ProtobufCoderProviderRegistrar - Class in org.apache.beam.sdk.extensions.protobuf
 
- 
 
- ProtobufCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
 
-  
 
- ProtoCoder<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.extensions.protobuf
 
- 
A 
Coder using Google Protocol Buffers binary format.
 
 
- Provider(PipelineOptions) - Constructor for class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
 
-  
 
- Provider() - Constructor for class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
 
-  
 
- Provider() - Constructor for class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
 
-  
 
- PTransform<InputT extends PInput,OutputT extends POutput> - Class in org.apache.beam.sdk.transforms
 
- 
A 
PTransform<InputT, OutputT> is an operation that takes an 
InputT (some subtype
 of 
PInput) and produces an 
OutputT (some subtype of 
POutput).
 
 
- PTransform() - Constructor for class org.apache.beam.sdk.transforms.PTransform
 
-  
 
- PTransform(String) - Constructor for class org.apache.beam.sdk.transforms.PTransform
 
-  
 
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Publish outgoingMessages to the Pubsub topic.
 
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- publish(List<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
 
- 
 
- PublishResultCoder - Class in org.apache.beam.sdk.io.aws.sns
 
- 
A custom Coder for handling the publish result.
 
- PUBSUB_ID_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- PUBSUB_SERIALIZED_ATTRIBUTES_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- PUBSUB_SUBSCRIPTION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- PUBSUB_SUBSCRIPTION_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- PUBSUB_TIMESTAMP_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- PUBSUB_TOPIC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- PUBSUB_TOPIC_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- PubsubClient - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
An (abstract) helper class for talking to Pubsub via an underlying transport.
 
- PubsubClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- PubsubClient.OutgoingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
A message to be sent to Pubsub.
 
- PubsubClient.ProjectPath - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Path representing a cloud project id.
 
- PubsubClient.PubsubClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Factory for creating clients.
 
- PubsubClient.SubscriptionPath - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Path representing a Pubsub subscription.
 
- PubsubClient.TopicPath - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Path representing a Pubsub topic.
 
- PubsubCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
 
- PubsubCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
-  
 
- PubsubGrpcClient - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
A helper class for talking to Pubsub via grpc.
 
- PubsubIO - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Read and Write 
PTransforms for Cloud Pub/Sub streams.
 
 
- PubsubIO.PubsubSubscription - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Class representing a Cloud Pub/Sub Subscription.
 
- PubsubIO.PubsubTopic - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Class representing a Cloud Pub/Sub Topic.
 
- PubsubIO.Read<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
 
- PubsubIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
 
- PubsubIO.Write.PubsubBoundedWriter - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Writer to Pubsub which batches messages from bounded collections.
 
- PubsubJsonClient - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
A Pubsub client using JSON transport.
 
- PubsubJsonTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
 
- 
 
- PubsubJsonTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonTableProvider
 
-  
 
- PubsubMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Class representing a Pub/Sub message.
 
- PubsubMessage(byte[], Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
-  
 
- PubsubMessagePayloadOnlyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload.
 
- PubsubMessagePayloadOnlyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
-  
 
- PubsubMessageToRow - Class in org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
 
- 
 
- PubsubMessageToRow() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubMessageToRow
 
-  
 
- PubsubMessageWithAttributesCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
A coder for PubsubMessage including attributes.
 
- PubsubMessageWithAttributesCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
-  
 
- PubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Properties that can be set when using Google Cloud Pub/Sub with the Apache Beam SDK.
 
- PubsubUnboundedSink - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
A PTransform which streams messages to Pubsub.
 
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
-  
 
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, int, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
-  
 
- PubsubUnboundedSource - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
 
- PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
- 
Construct an unbounded source to consume from the Pubsub subscription.
 
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
- 
Request the next batch of up to batchSize messages from the subscription.
 
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
-  
 
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
-  
 
- put(String, InstructionRequestHandler) - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool.Sink
 
- 
 
- put(K, V) - Method in interface org.apache.beam.sdk.state.MapState
 
- 
Associates the specified value with the specified key in this state.
 
- putArtifact(StreamObserver<ArtifactApi.PutArtifactResponse>) - Method in class org.apache.beam.runners.direct.portable.artifact.LocalFileSystemArtifactStagerService
 
-  
 
- putArtifact(StreamObserver<ArtifactApi.PutArtifactResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 
-  
 
- putIfAbsent(K, V) - Method in interface org.apache.beam.sdk.state.MapState
 
- 
A deferred read-followed-by-write.
 
- PValue - Interface in org.apache.beam.sdk.values
 
- 
For internal use.
 
- PValueBase - Class in org.apache.beam.sdk.values
 
- 
For internal use.
 
- PValueBase(Pipeline) - Constructor for class org.apache.beam.sdk.values.PValueBase
 
-  
 
- PValueBase() - Constructor for class org.apache.beam.sdk.values.PValueBase
 
- 
No-arg constructor to allow subclasses to implement Serializable.
 
- RandomAccessData - Class in org.apache.beam.runners.dataflow.util
 
- 
An elastic-sized byte array that can be manipulated as a stream or accessed directly.
 
- RandomAccessData() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Constructs a RandomAccessData with a default buffer size.
 
- RandomAccessData(byte[]) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Constructs a RandomAccessData with the initial buffer.
 
- RandomAccessData(int) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Constructs a RandomAccessData with the given buffer size.
 
- RandomAccessData.RandomAccessDataCoder - Class in org.apache.beam.runners.dataflow.util
 
- 
A 
Coder which encodes the valid parts of this stream.
 
 
- RandomAccessData.UnsignedLexicographicalComparator - Class in org.apache.beam.runners.dataflow.util
 
- 
A Comparator that compares two byte arrays lexicographically.
 
- RandomAccessDataCoder() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- RangeTracker<PositionT> - Interface in org.apache.beam.sdk.io.range
 
- 
A 
RangeTracker is a thread-safe helper object for implementing dynamic work rebalancing
 in position-based 
BoundedSource.BoundedReader subclasses.
 
 
- RawUnionValue(String, Object) - Constructor for class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils.RawUnionValue
 
- 
Constructs a partial union from the given union tag and value.
 
- RawUnionValue - Class in org.apache.beam.sdk.transforms.join
 
- 
This corresponds to an integer union tag and value.
 
- RawUnionValue(int, Object) - Constructor for class org.apache.beam.sdk.transforms.join.RawUnionValue
 
- 
Constructs a partial union from the given union tag and value.
 
- read() - Method in class org.apache.beam.runners.gearpump.translators.io.GearpumpSource
 
-  
 
- read(JavaStreamingContext, SerializablePipelineOptions, UnboundedSource<T, CheckpointMarkT>, String) - Static method in class org.apache.beam.runners.spark.io.SparkUnboundedSource
 
-  
 
- read(T) - Method in interface org.apache.beam.sdk.fn.stream.DataStreams.OutputChunkConsumer
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.amqp.AmqpIO
 
-  
 
- Read() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
-  
 
- read(Class<T>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
Reads records of the given type from an Avro file (or multiple Avro files matching a pattern).
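
For instance, a minimal sketch of reading Avro records (the bucket path and schema are hypothetical; readGenericRecords is the schema-string variant of this method):

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.AvroIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class AvroReadExample {
  // Hypothetical schema for the records stored in the files.
  private static final String SCHEMA_JSON =
      "{\"type\":\"record\",\"name\":\"User\",\"fields\":"
          + "[{\"name\":\"name\",\"type\":\"string\"}]}";

  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // One GenericRecord per Avro record across all files matching the pattern.
    PCollection<GenericRecord> records =
        p.apply(AvroIO.readGenericRecords(SCHEMA_JSON).from("gs://my-bucket/users-*.avro"));
    p.run().waitUntilFinish();
  }
}
```

With a generated Avro class, AvroIO.read(MyClass.class) yields a typed PCollection<MyClass> instead.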
 
- Read() - Constructor for class org.apache.beam.sdk.io.AvroIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.aws.sqs.SqsIO
 
-  
 
- Read() - Constructor for class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
-  
 
- Read() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
 
- 
 
- read(SerializableFunction<SchemaAndRecord, T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
 
- 
Reads from a BigQuery table or query and returns a 
PCollection with one element per row of the table or query result, parsed from the BigQuery
 AVRO format using the specified function.
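
A sketch of this overload, extracting one field per row (the project, dataset, table, and column names are hypothetical):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class BigQueryReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // The parse function receives each row as an Avro GenericRecord.
    PCollection<String> names =
        p.apply(
            BigQueryIO.read(
                    (SchemaAndRecord elem) -> elem.getRecord().get("name").toString())
                .from("my-project:my_dataset.users")
                .withCoder(StringUtf8Coder.of()));
    p.run().waitUntilFinish();
  }
}
```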
 
 
- read() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
-  
 
- read() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
 
- 
 
- read() - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
 
- 
Read data from Hive.
 
- Read() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
 
- 
Read data from a JDBC datasource.
 
- Read() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
-  
 
- Read() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.kinesis.KinesisIO
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
 
- 
Read data from GridFS.
 
- Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
 
- 
Read data from MongoDB.
 
- Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
-  
 
- Read() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
-  
 
- read(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
 
- 
Reads 
GenericRecord from a Parquet file (or multiple Parquet files matching the
 pattern).
 
 
- Read() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Read
 
-  
 
- Read - Class in org.apache.beam.sdk.io
 
- 
 
- Read() - Constructor for class org.apache.beam.sdk.io.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
 
- 
Read data from a Redis server.
 
- Read() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
 
-  
 
- Read() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.TextIO
 
- 
A 
PTransform that reads from one or more text files and returns a bounded 
PCollection containing one element for each line of the input files.
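
For instance (the file pattern is hypothetical):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class TextReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // One String element per line across all files matching the pattern.
    PCollection<String> lines = p.apply(TextIO.read().from("gs://my-bucket/logs/*.txt"));
    p.run().waitUntilFinish();
  }
}
```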
 
 
- Read() - Constructor for class org.apache.beam.sdk.io.TextIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.TFRecordIO
 
- 
A 
PTransform that reads from a TFRecord file (or multiple TFRecord files matching a
 pattern) and returns a 
PCollection containing the decoding of each of the records of
 the TFRecord file(s) as a byte array.
 
 
- Read() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Read
 
-  
 
- read() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
 
- 
Reads XML files as a 
PCollection of a given type mapped via JAXB.
 
 
- Read() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Read
 
-  
 
- read() - Method in interface org.apache.beam.sdk.state.BagState
 
-  
 
- read() - Method in interface org.apache.beam.sdk.state.CombiningState
 
-  
 
- read() - Method in interface org.apache.beam.sdk.state.ReadableState
 
- 
Read the current value, blocking until it is available.
 
- Read.Bounded<T> - Class in org.apache.beam.sdk.io
 
- 
 
- Read.Builder - Class in org.apache.beam.sdk.io
 
- 
Helper class for building 
Read transforms.
 
 
- Read.Unbounded<T> - Class in org.apache.beam.sdk.io
 
- 
 
- ReadableFileCoder - Class in org.apache.beam.sdk.io
 
- 
 
- ReadableFileCoder() - Constructor for class org.apache.beam.sdk.io.ReadableFileCoder
 
-  
 
- ReadableState<T> - Interface in org.apache.beam.sdk.state
 
- 
 
- ReadableStates - Class in org.apache.beam.sdk.state
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- ReadableStates() - Constructor for class org.apache.beam.sdk.state.ReadableStates
 
-  
 
- readAll(Class<T>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- ReadAll() - Constructor for class org.apache.beam.sdk.io.AvroIO.ReadAll
 
-  
 
- readAll() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
 
- 
 
- ReadAll() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
-  
 
- readAll() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
 
- 
 
- readAll() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
 
- 
Like 
JdbcIO.read(), but executes multiple instances of the query, substituting each element of a
 
PCollection as query parameters.
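
A sketch of the per-element query pattern (the driver, connection URL, table, and userIds input are hypothetical; userIds is an assumed PCollection&lt;Integer&gt;):

```java
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarIntCoder;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

// One query execution per input element, with the element bound as the parameter.
PCollection<KV<Integer, String>> users =
    userIds.apply(
        JdbcIO.<Integer, KV<Integer, String>>readAll()
            .withDataSourceConfiguration(
                JdbcIO.DataSourceConfiguration.create(
                    "org.postgresql.Driver", "jdbc:postgresql://localhost/db"))
            .withQuery("SELECT id, name FROM users WHERE id = ?")
            .withParameterSetter((id, stmt) -> stmt.setInt(1, id))
            .withRowMapper(rs -> KV.of(rs.getInt("id"), rs.getString("name")))
            .withCoder(KvCoder.of(VarIntCoder.of(), StringUtf8Coder.of())));
```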
 
 
- ReadAll() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- readAll() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
 
- 
Like 
RedisIO.read(), but executes multiple instances of the Redis query, substituting each
 element of a 
PCollection as the key pattern.
 
 
- ReadAll() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.ReadAll
 
-  
 
- readAll() - Static method in class org.apache.beam.sdk.io.TextIO
 
- 
 
- ReadAll() - Constructor for class org.apache.beam.sdk.io.TextIO.ReadAll
 
-  
 
- readAllGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- readAllGenericRecords(String) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- ReadAllViaFileBasedSource<T> - Class in org.apache.beam.sdk.io
 
- 
 
- ReadAllViaFileBasedSource(long, SerializableFunction<String, ? extends FileBasedSource<T>>, Coder<T>) - Constructor for class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
 
-  
 
- readAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that continuously reads binary-encoded Avro messages of the given
 type from a Google Cloud Pub/Sub stream.
 
 
- ReadBoundedTranslator<T> - Class in org.apache.beam.runners.gearpump.translators
 
- 
Read.Bounded is translated to a Gearpump source function, and 
BoundedSource is
 wrapped into a Gearpump 
DataSource.
 
 
- ReadBoundedTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.ReadBoundedTranslator
 
-  
 
- readBytes() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
 
- 
A specific instance of uninitialized 
KafkaIO.read() where keys and values are bytes.
 
 
- readDecompressed(ReadableByteChannel) - Method in enum org.apache.beam.sdk.io.Compression
 
-  
 
- Reader() - Constructor for class org.apache.beam.sdk.io.Source.Reader
 
-  
 
- ReaderInvocationUtil<OutputT,ReaderT extends Source.Reader<OutputT>> - Class in org.apache.beam.runners.flink.metrics
 
- 
Utility for invoking Source.Reader methods that may require an active
 MetricsContainerImpl.
 
- ReaderInvocationUtil(String, PipelineOptions, FlinkMetricContainer) - Constructor for class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
-  
 
- readExternal(ObjectInput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
-  
 
- readFiles(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
 
- 
 
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles
 
-  
 
- readFiles() - Static method in class org.apache.beam.sdk.io.TextIO
 
- 
 
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.TextIO.ReadFiles
 
-  
 
- readFiles() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
 
- 
 
- ReadFiles() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
-  
 
- readFrom(InputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Reads length bytes from the specified input stream writing them into the backing data
 store starting at offset.
 
- readFromPort(BeamFnApi.RemoteGrpcPort, String) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
-  
 
- readFromSource(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
 
- readFromSplitsOfSource(BoundedSource<T>, long, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
-  
 
- readFromStartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Reads all elements from the given started Source.Reader.
 
- readFromUnstartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Reads all elements from the given unstarted Source.Reader.
 
- readFullyAsBytes() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
- 
Returns the full contents of the file as bytes.
 
- readFullyAsUTF8String() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
- 
Returns the full contents of the file as a String decoded as UTF-8.
 
- readGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
Reads Avro file(s) containing records of the specified schema.
 
- readGenericRecords(String) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
Reads Avro file(s) containing records of the specified schema.
 
- readLater() - Method in interface org.apache.beam.sdk.state.BagState
 
-  
 
- readLater() - Method in interface org.apache.beam.sdk.state.CombiningState
 
-  
 
- readLater() - Method in interface org.apache.beam.sdk.state.GroupingState
 
-  
 
- readLater() - Method in interface org.apache.beam.sdk.state.ReadableState
 
- 
Indicate that the value will be read later.
 
- readLater() - Method in interface org.apache.beam.sdk.state.SetState
 
-  
 
- readLater() - Method in interface org.apache.beam.sdk.state.ValueState
 
-  
 
- readLater() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
 
-  
 
- readMatches() - Static method in class org.apache.beam.sdk.io.FileIO
 
- 
 
- ReadMatches() - Constructor for class org.apache.beam.sdk.io.FileIO.ReadMatches
 
-  
 
- readMessage() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
-  
 
- readMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that continuously reads from a Google Cloud Pub/Sub stream.
 
 
- readMessagesWithAttributes() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that continuously reads from a Google Cloud Pub/Sub stream, including each message's attributes.
 
 
- readNextBlock() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
-  
 
- readNextBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
- 
Read the next block from the input.
 
- readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
 
- 
Reads the next record from the block and returns true iff one exists.
 
- readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
- 
 
- readNextRecord() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
- 
Reads the next record via the delegate reader.
 
- readNextRecord() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
- 
 
- readNItemsFromStartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Read elements from a Source.Reader that has already had Source.Reader#start
 called on it, until n elements are read.
 
- readNItemsFromUnstartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Read elements from a Source.Reader until n elements are read.
 
- readOnly(String, Map<String, BeamSqlTable>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
-  
 
- ReadOnlyTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
 
- 
A ReadOnlyTableProvider provides an in-memory, read-only set of
 BeamSqlTables.
 
- ReadOnlyTableProvider(String, Map<String, BeamSqlTable>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
 
-  
 
- ReadOperation - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
Encapsulates a Spanner read operation.
 
- ReadOperation() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- readProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that continuously reads binary-encoded protobuf messages of the
 given type from a Google Cloud Pub/Sub stream.
 
 
- readRemainingFromReader(Source.Reader<T>, boolean) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Read all remaining elements from a Source.Reader.
 
- readStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that continuously reads UTF-8 encoded strings from a Google Cloud
 Pub/Sub stream.
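
For instance (the project and subscription names are hypothetical):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class PubsubReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Unbounded PCollection of message payloads decoded as UTF-8 strings.
    PCollection<String> messages =
        p.apply(
            PubsubIO.readStrings()
                .fromSubscription("projects/my-project/subscriptions/my-subscription"));
    p.run().waitUntilFinish();
  }
}
```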
 
 
- readTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
 
- 
 
- ReadUnboundedTranslator<T> - Class in org.apache.beam.runners.gearpump.translators
 
- 
Read.Unbounded is translated to a Gearpump source function, and 
UnboundedSource is
 wrapped into a Gearpump 
DataSource.
 
 
- ReadUnboundedTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.ReadUnboundedTranslator
 
-  
 
- receive(LogicalEndpoint, Coder<WindowedValue<T>>, FnDataReceiver<WindowedValue<T>>) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
 
- 
Registers a receiver to be notified upon any incoming elements.
 
- receive(LogicalEndpoint, Coder<WindowedValue<T>>, FnDataReceiver<WindowedValue<T>>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
-  
 
- recordId - Variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
- 
If using an id attribute, the record id to associate with this record's metadata so the
 receiver can reject duplicates.
 
- RedisConnectionConfiguration - Class in org.apache.beam.sdk.io.redis
 
- 
RedisConnectionConfiguration describes and wraps a connection configuration for a Redis
 server or cluster.
 
- RedisConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
-  
 
- RedisIO - Class in org.apache.beam.sdk.io.redis
 
- 
An IO to manipulate a Redis key/value database.
 
- RedisIO.Read - Class in org.apache.beam.sdk.io.redis
 
- 
 
- RedisIO.ReadAll - Class in org.apache.beam.sdk.io.redis
 
- 
 
- RedisIO.Write - Class in org.apache.beam.sdk.io.redis
 
- 
 
- RedisIO.Write.Method - Enum in org.apache.beam.sdk.io.redis
 
- 
Determines the method used to insert data in Redis.
 
- ReferenceRunner - Class in org.apache.beam.runners.direct.portable
 
- 
The "ReferenceRunner" engine implementation.
 
- ReferenceRunnerJobServer - Class in org.apache.beam.runners.direct.portable.job
 
- 
 
- ReferenceRunnerJobService - Class in org.apache.beam.runners.direct.portable.job
 
- 
The ReferenceRunner uses the portability framework to execute a Pipeline on a single machine.
 
- ReflectUtils - Class in org.apache.beam.sdk.schemas.utils
 
- 
A set of reflection helper methods.
 
- ReflectUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils
 
-  
 
- refreshAll() - Method in class org.apache.beam.runners.direct.WatermarkManager
 
- 
Refresh the watermarks contained within this 
WatermarkManager, causing all watermarks
 to be advanced as far as possible.
 
 
- Regex - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms that use regular expressions to process elements in a 
PCollection.
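
A sketch of the core factory methods (lines is an assumed PCollection&lt;String&gt;):

```java
import org.apache.beam.sdk.transforms.Regex;
import org.apache.beam.sdk.values.PCollection;

// Keep only elements that look like e-mail addresses (whole-element match).
PCollection<String> emails = lines.apply(Regex.matches("[\\w.+-]+@[\\w.-]+"));

// Split every element into words on non-word characters.
PCollection<String> words = lines.apply(Regex.split("\\W+"));
```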
 
 
- Regex.AllMatches - Class in org.apache.beam.sdk.transforms
 
- 
Regex.AllMatches<String> takes a PCollection<String> and returns a PCollection<List<String>> containing the values extracted from all the regex groups of each
 element that matches the pattern.
 
- Regex.Find - Class in org.apache.beam.sdk.transforms
 
- 
Regex.Find<String> takes a PCollection<String> and returns a PCollection<String> containing the value extracted from the regex group of each element in which the pattern is found.
 
- Regex.FindAll - Class in org.apache.beam.sdk.transforms
 
- 
Regex.FindAll<String> takes a PCollection<String> and returns a PCollection<List<String>> containing the values extracted from all the regex groups of each
 element in which the pattern is found.
 
- Regex.FindKV - Class in org.apache.beam.sdk.transforms
 
- 
Regex.FindKV<KV<String, String>> takes a PCollection<String> and returns a
 PCollection<KV<String, String>> containing the key and value extracted from the regex
 groups of each element in which the pattern is found.
 
- Regex.FindName - Class in org.apache.beam.sdk.transforms
 
- 
Regex.FindName<String> takes a PCollection<String> and returns a PCollection<String> containing the value extracted from the named regex group of each element in which the pattern is found.
 
- Regex.FindNameKV - Class in org.apache.beam.sdk.transforms
 
- 
Regex.FindNameKV<KV<String, String>> takes a PCollection<String> and returns a
 PCollection<KV<String, String>> containing the key and value extracted from the named
 regex groups of each element in which the pattern is found.
 
- Regex.Matches - Class in org.apache.beam.sdk.transforms
 
- 
Regex.Matches<String> takes a PCollection<String> and returns a PCollection<String> containing the value extracted from the regex group of each element that matches the pattern in its entirety.
 
- Regex.MatchesKV - Class in org.apache.beam.sdk.transforms
 
- 
Regex.MatchesKV<KV<String, String>> takes a PCollection<String> and returns a
 PCollection<KV<String, String>> containing the key and value extracted from the regex
 groups of each element that matches the pattern in its entirety.
 
- Regex.MatchesName - Class in org.apache.beam.sdk.transforms
 
- 
Regex.MatchesName<String> takes a PCollection<String> and returns a PCollection<String> containing the value extracted from the named regex group of each element that matches the pattern in its entirety.
 
- Regex.MatchesNameKV - Class in org.apache.beam.sdk.transforms
 
- 
Regex.MatchesNameKV<KV<String, String>> takes a PCollection<String> and returns
 a PCollection<KV<String, String>> containing the key and value extracted from the named
 regex groups of each element that matches the pattern in its entirety.
 
- Regex.ReplaceAll - Class in org.apache.beam.sdk.transforms
 
- 
Regex.ReplaceAll<String> takes a PCollection<String> and returns a PCollection<String> in which every substring matching the regex has been replaced with the
 replacement string.
 
- Regex.ReplaceFirst - Class in org.apache.beam.sdk.transforms
 
- 
Regex.ReplaceFirst<String> takes a PCollection<String> and returns a PCollection<String> in which the first substring matching the regex has been replaced with the
 replacement string.
 
- Regex.Split - Class in org.apache.beam.sdk.transforms
 
- 
Regex.Split<String> takes a PCollection<String> and returns a PCollection<String> with each input string split into individual items.
 
- RegexMatcher - Class in org.apache.beam.sdk.testing
 
- 
Hamcrest matcher to assert a string matches a pattern.
 
- RegexMatcher(String) - Constructor for class org.apache.beam.sdk.testing.RegexMatcher
 
-  
 
- register(RelOptPlanner) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
 
-  
 
- register(RelOptPlanner) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
-  
 
- register(Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
This registers the interface with this factory.
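
A sketch of registering a custom options interface (MyOptions and myOption are hypothetical names):

```java
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsExample {
  // A custom options interface; registering it makes --myOption parseable and validated.
  public interface MyOptions extends PipelineOptions {
    String getMyOption();

    void setMyOption(String value);
  }

  public static void main(String[] args) {
    PipelineOptionsFactory.register(MyOptions.class);
    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
    System.out.println(options.getMyOption());
  }
}
```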
 
- registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.Coder
 
- 
Notifies the ElementByteSizeObserver about the byte size of the encoded value using
 this Coder.
 
- registerByteSizeObserver(ReadableDuration, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
-  
 
- registerByteSizeObserver(IterableT, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
-  
 
- registerByteSizeObserver(KV<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.KvCoder
 
- 
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
 
- registerByteSizeObserver(Map<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.MapCoder
 
-  
 
- registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
- 
Overridden to short-circuit the default StructuredCoder behavior of encoding and
 counting the bytes.
 
- registerByteSizeObserver(RawUnionValue, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
- 
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
 
- registerClasses(Kryo) - Method in class org.apache.beam.runners.spark.coders.BeamSparkRunnerRegistrator
 
-  
 
- registerCoderForClass(Class<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
Registers the provided 
Coder for the given class.
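
For instance, a sketch with a hypothetical user type (SerializableCoder works here because the type is Serializable):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.SerializableCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class CoderRegistryExample {
  // A hypothetical user type.
  static class MyRecord implements java.io.Serializable {
    String name;
  }

  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    // Every PCollection<MyRecord> in this pipeline now defaults to this coder.
    p.getCoderRegistry()
        .registerCoderForClass(MyRecord.class, SerializableCoder.of(MyRecord.class));
  }
}
```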
 
 
- registerCoderForType(TypeDescriptor<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
Registers the provided 
Coder for the given type.
 
 
- registerCoderProvider(CoderProvider) - Method in class org.apache.beam.sdk.coders.CoderRegistry
 
- 
Registers 
coderProvider as a potential 
CoderProvider which can produce 
Coder instances.
 
 
- registerConsumer(LogicalEndpoint, Consumer<BeamFnApi.Elements.Data>) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
-  
 
- registerForProcessBundleInstructionId(String, StateRequestHandler) - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
 
-  
 
- registerForProcessBundleInstructionId(String, StateRequestHandler) - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator
 
- 
Registers the supplied handler for the given process bundle instruction id for all BeamFnApi.StateRequests with a matching id.
 
- registerJavaBean(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Register a JavaBean type for automatic schema inference.
 
- registerJavaBean(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Register a JavaBean type for automatic schema inference.
 
- registerPOJO(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Register a POJO type for automatic schema inference.
 
- registerPOJO(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Register a POJO type for automatic schema inference.
 
- registerProvider(TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
 
-  
 
- registerProvider(TableProvider) - Method in interface org.apache.beam.sdk.extensions.sql.meta.store.MetaStore
 
- 
Register a table provider.
 
- registerSchemaForClass(Class<T>, Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
Register a schema for a specific Class type.
 
- registerSchemaForType(TypeDescriptor<T>, Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
 
- registerSchemaProvider(SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
 
- registerSchemaProvider(Class<T>, SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
 
- registerSchemaProvider(TypeDescriptor<T>, SchemaProvider) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
 
- 
 
- registerTransformTranslator(Class<TransformT>, TransformTranslator<? extends TransformT>) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
 
- 
Records that instances of the specified PTransform class should be translated by default by the
 corresponding 
TransformTranslator.
 
 
- registerUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
- 
Register a UDAF that can be used in a GROUP-BY expression.
 
- registerUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
- 
Register a Combine.CombineFn as a UDAF to be used in this query.
 
- registerUdf(String, Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
- 
Register a UDF that can be used in a SQL expression.
 
- registerUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
- 
Register a UDF that can be used in a SQL expression.
 
- registerUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
- 
 
- registerUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
- 
Register a UDF to be used in this query.
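
A sketch of registering and using a UDF (the function name cubic and column f_int are hypothetical; rows is an assumed PCollection&lt;Row&gt;):

```java
import org.apache.beam.sdk.extensions.sql.BeamSqlUdf;
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// A UDF implements BeamSqlUdf and provides a static eval method.
public class CubicUdf implements BeamSqlUdf {
  public static Integer eval(Integer x) {
    return x * x * x;
  }
}

// Applying it in a query:
PCollection<Row> result =
    rows.apply(
        SqlTransform.query("SELECT cubic(f_int) FROM PCOLLECTION")
            .registerUdf("cubic", CubicUdf.class));
```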
 
- registerUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
- 
 
- Registrar() - Constructor for class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
 
-  
 
- Reify - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for converting between explicit and implicit forms of various Beam
 values.
 
 
- ReifyAsIterable<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
This transform turns a side input into a singleton PCollection that can be used as the main
 input for another transform.
 
- ReifyAsIterable() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
 
-  
 
- ReinterpretConversion - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
Defines a conversion between two SQL types.
 
- ReinterpretConversion.Builder - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
 
- Reinterpreter - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
Class that tracks conversions between SQL types.
 
- Reinterpreter.Builder - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret
 
- 
Builder for Reinterpreter.
 
- relativeErrorForPrecision(int) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
 
-  
 
- relativeFileNaming(ValueProvider<String>, FileIO.Write.FileNaming) - Static method in class org.apache.beam.sdk.io.FileIO.Write
 
-  
 
- remerge() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Creates a 
Window PTransform that does not change assigned windows, but will
 cause windows to be merged again as part of the next 
GroupByKey.
 
 
- RemoteBundle - Interface in org.apache.beam.runners.fnexecution.control
 
- 
A bundle capable of handling input data elements for a bundle descriptor by
 forwarding them to a remote environment for processing.
 
- RemoteEnvironment - Interface in org.apache.beam.runners.fnexecution.environment
 
- 
A handle to an available remote RunnerApi.Environment.
 
- RemoteEnvironment.SimpleRemoteEnvironment - Class in org.apache.beam.runners.fnexecution.environment
 
- 
 
- RemoteGrpcPortRead - Class in org.apache.beam.sdk.fn.data
 
- 
An execution-time only RunnerApi.PTransform which represents an SDK harness reading from a BeamFnApi.RemoteGrpcPort.
 
- RemoteGrpcPortRead() - Constructor for class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
-  
 
- RemoteGrpcPortWrite - Class in org.apache.beam.sdk.fn.data
 
- 
An execution-time only RunnerApi.PTransform which represents a write from within an SDK harness to
 a BeamFnApi.RemoteGrpcPort.
 
- RemoteGrpcPortWrite() - Constructor for class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
-  
 
- RemoteInputDestination<T> - Class in org.apache.beam.runners.fnexecution.data
 
- 
A pair of 
Coder and 
BeamFnApi.Target which specifies the arguments to a 
FnDataService to send data to a remote harness.
 
 
- RemoteInputDestination() - Constructor for class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
 
-  
 
- RemoteOutputReceiver<T> - Class in org.apache.beam.runners.fnexecution.control
 
- 
 
- RemoteOutputReceiver() - Constructor for class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
 
-  
 
- remove() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.BlockingQueueIterator
 
-  
 
- remove() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
 
-  
 
- remove(K) - Method in interface org.apache.beam.sdk.state.MapState
 
- 
Remove the mapping for a key from this map if it is present.
 
- remove(T) - Method in interface org.apache.beam.sdk.state.SetState
 
- 
Removes the specified element from this set if it is present.
 
- removeArtifacts(String) - Method in class org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 
-  
 
- removeTemporaryFiles(Collection<ResourceId>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
-  
 
- rename(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
 
- 
Renames a List of file-like resources from one location to another.
 
- rename(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
 
- 
Renames a List of file-like resources from one location to another.
 
- render() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
-  
 
- renderAll() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
 
-  
 
- Repeatedly - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A 
Trigger that fires according to its subtrigger forever.
 
 
- replaceAll(List<PTransformOverride>) - Method in class org.apache.beam.sdk.Pipeline
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- replaceAll(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.ReplaceAll PTransform that checks if a portion of the line
 matches the Regex and replaces all matches with the replacement String.
 
 
- replaceAll(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.ReplaceAll PTransform that checks if a portion of the line
 matches the Regex and replaces all matches with the replacement String.
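
A minimal sketch of applying the `Regex.replaceAll` transform in a pipeline (the input strings and replacement are illustrative, and the pipeline uses the default runner):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.Regex;
import org.apache.beam.sdk.values.PCollection;

public class RegexReplaceAllExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    PCollection<String> lines = p.apply(Create.of("order 12 of 34", "no digits"));
    // Replace every run of digits in each element with '#'.
    // "order 12 of 34" becomes "order # of #"; "no digits" is unchanged.
    PCollection<String> masked = lines.apply(Regex.replaceAll("\\d+", "#"));
    p.run().waitUntilFinish();
  }
}
```

The regex semantics follow `java.util.regex`, so the same pattern behaves identically with `String.replaceAll`.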
 
 
- ReplaceAll(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceAll
 
-  
 
- replaceFirst(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.ReplaceFirst PTransform that checks if a portion of the line
 matches the Regex and replaces the first match with the replacement String.
 
 
- replaceFirst(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.ReplaceFirst PTransform that checks if a portion of the line
 matches the Regex and replaces the first match with the replacement String.
 
 
- ReplaceFirst(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
 
-  
 
- replaceTransforms(Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
-  
 
- reportElementSize(long) - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
-  
 
- reportWorkItemStatus(String, ReportWorkItemStatusRequest) - Method in class org.apache.beam.runners.dataflow.DataflowClient
 
- 
Reports the status of the work item for jobId.
 
- Requirements - Class in org.apache.beam.sdk.transforms
 
- 
Describes the run-time requirements of a 
Contextful, such as access to side inputs.
 
 
- requiresDeduping() - Method in class org.apache.beam.sdk.io.UnboundedSource
 
- 
Returns whether this source requires explicit deduping.
 
- requiresSideInputs(Collection<PCollectionView<?>>) - Static method in class org.apache.beam.sdk.transforms.Requirements
 
- 
Describes the need for access to the given side inputs.
 
- requiresSideInputs(PCollectionView<?>...) - Static method in class org.apache.beam.sdk.transforms.Requirements
 
- 
 
- reset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
-  
 
- resetCache() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
Resets the set of interfaces registered with this factory to the default state.
 
- resetLocal() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
-  
 
- resetTo(int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Resets the end of the stream to the specified position.
 
- Reshuffle<K,V> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Reshuffle.ViaRandomKey<T> - Class in org.apache.beam.sdk.transforms
 
- 
Deprecated.
 
- ReshuffleTrigger<W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- ReshuffleTrigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
- 
Deprecated.
  
- resolve(String, ResolveOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
-  
 
- resolve(String, ResolveOptions) - Method in interface org.apache.beam.sdk.io.fs.ResourceId
 
- 
Returns a child ResourceId under this.
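
A minimal sketch of resolving a child file under a directory `ResourceId` (the paths are placeholders, and this assumes the default local filesystem is registered):

```java
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions;
import org.apache.beam.sdk.io.fs.ResourceId;

public class ResolveExample {
  public static void main(String[] args) {
    // Treat the path as a directory, then resolve a file name beneath it.
    ResourceId dir = FileSystems.matchNewResource("/tmp/output", true /* isDirectory */);
    ResourceId file = dir.resolve("part-00000.txt", StandardResolveOptions.RESOLVE_FILE);
    System.out.println(file); // e.g. /tmp/output/part-00000.txt
  }
}
```

`RESOLVE_FILE` versus `RESOLVE_DIRECTORY` controls whether the resulting resource is treated as a leaf or as another directory that can be resolved further.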
 
- resolve(Schema) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
-  
 
- ResolveOptions - Interface in org.apache.beam.sdk.io.fs
 
- 
 
- ResolveOptions.StandardResolveOptions - Enum in org.apache.beam.sdk.io.fs
 
- 
Defines the standard resolve options.
 
- resolveType(Type) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns a 
TypeDescriptor representing the given type, with type variables resolved
 according to the specialization in this type.
 
 
- resourceId() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
-  
 
- ResourceId - Interface in org.apache.beam.sdk.io.fs
 
- 
An identifier which represents a file-like resource.
 
- ResourceIdCoder - Class in org.apache.beam.sdk.io.fs
 
- 
 
- ResourceIdCoder() - Constructor for class org.apache.beam.sdk.io.fs.ResourceIdCoder
 
-  
 
- ResourceIdTester - Class in org.apache.beam.sdk.io.fs
 
- 
 
- RESTRICTION_CODER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- RestrictionTracker<RestrictionT,PositionT> - Class in org.apache.beam.sdk.transforms.splittabledofn
 
- 
Manages concurrent access to the restriction and keeps track of its claimed part for a 
splittable DoFn.
 
 
- RestrictionTracker() - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
-  
 
- RestrictionTracker.ClaimObserver<PositionT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
 
- 
 
- resultCoder() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandler
 
- 
Returns the 
Coder to use for the elements of the resulting values iterable.
 
 
- resume(TimerInternals.TimerData) - Method in class org.apache.beam.runners.fnexecution.splittabledofn.SDFFeederViaStateAndTimers
 
- 
Resumes from a timer and returns the current element/restriction pair (with an up-to-date value
 of the restriction).
 
- resume() - Static method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
 
- 
Indicates that there is more work to be done for the current element.
 
- resumeDelay() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
 
- 
 
- retrievalToken() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
-  
 
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsIO.RetryConfiguration
 
-  
 
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
 
-  
 
- RetryConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
 
-  
 
- retryTransientErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
 
- 
Retry all failures except for known persistent errors.
 
- Reversed() - Constructor for class org.apache.beam.sdk.transforms.Top.Reversed
 
-  
 
- rightOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
 
- 
Right Outer Join of two collections of KV elements.
 
- root() - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
- 
Path for display data registered by a top-level component.
 
- row(Schema) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
Create a row type with the given schema.
 
- Row - Class in org.apache.beam.sdk.values
 
- 
Row is an immutable tuple-like type representing one element in a 
PCollection.
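
A minimal sketch of building a Row against a Schema (the field names and values are illustrative):

```java
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.values.Row;

public class RowExample {
  public static void main(String[] args) {
    // Define a two-field schema, then build a Row conforming to it.
    Schema schema =
        Schema.builder().addStringField("name").addInt32Field("age").build();
    // addValues must supply one value per schema field, in declaration order.
    Row row = Row.withSchema(schema).addValues("alice", 30).build();
    System.out.println(row.getString("name") + " " + row.getInt32("age"));
  }
}
```

Rows are typically not constructed by hand in pipelines; they are produced by schema-aware transforms, but the builder is useful in tests.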
 
 
- Row.Builder - Class in org.apache.beam.sdk.values
 
- 
 
- RowCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- RowCoderGenerator - Class in org.apache.beam.sdk.coders
 
- 
A utility for automatically generating a 
Coder for 
Row objects corresponding to a
 specific schema.
 
 
- RowCoderGenerator() - Constructor for class org.apache.beam.sdk.coders.RowCoderGenerator
 
-  
 
- rowReceiver(DoFn<?, ?>.WindowedContext, TupleTag<T>, SchemaCoder<T>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
 
- 
Returns a 
DoFn.OutputReceiver that automatically converts a 
Row to the user's output
 type and delegates to 
WindowedContextOutputReceiver.
 
 
- rows() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- RowWithGetters - Class in org.apache.beam.sdk.values
 
- 
 
- RowWithStorage - Class in org.apache.beam.sdk.values
 
- 
Concrete subclass of 
Row that explicitly stores all fields of the row.
 
 
- run(Pipeline) - Method in class org.apache.beam.runners.apex.ApexRunner
 
-  
 
- run() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.ProcessWatcher
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.apex.TestApexRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.direct.DirectRunner
 
-  
 
- run(JobApi.RunJobRequest, StreamObserver<JobApi.RunJobResponse>) - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- run() - Method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.flink.FlinkRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.flink.TestFlinkRunner
 
-  
 
- run(JobApi.RunJobRequest, StreamObserver<JobApi.RunJobResponse>) - Method in class org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.gearpump.GearpumpRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.gearpump.TestGearpumpRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.reference.PortableRunner
 
-  
 
- run(JobApi.RunJobRequest, StreamObserver<JobApi.RunJobResponse>) - Method in class org.apache.beam.runners.reference.testing.TestJobService
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.reference.testing.TestPortableRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunner
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger
 
-  
 
- run(Pipeline) - Method in class org.apache.beam.runners.spark.TestSparkRunner
 
-  
 
- run() - Method in interface org.apache.beam.sdk.fn.function.ThrowingRunnable
 
-  
 
- run() - Method in class org.apache.beam.sdk.Pipeline
 
- 
 
- run(PipelineOptions) - Method in class org.apache.beam.sdk.Pipeline
 
- 
 
- run(Pipeline) - Method in class org.apache.beam.sdk.PipelineRunner
 
- 
Processes the given 
Pipeline, potentially asynchronously, returning a runner-specific
 type of result.
 
 
- run(PTransform<PBegin, ?>, PipelineOptions) - Method in class org.apache.beam.sdk.PipelineRunner
 
- 
 
- run(PTransform<PBegin, ?>) - Method in class org.apache.beam.sdk.PipelineRunner
 
- 
 
- run(Pipeline) - Method in class org.apache.beam.sdk.testing.CrashingRunner
 
-  
 
- run() - Method in class org.apache.beam.sdk.testing.TestPipeline
 
- 
Runs this 
TestPipeline, unwrapping any 
AssertionError that is raised during
 testing.
 
 
- run(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipeline
 
- 
 
- Runner() - Constructor for class org.apache.beam.runners.apex.ApexRunnerRegistrar.Runner
 
-  
 
- Runner() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
 
-  
 
- Runner() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Runner
 
-  
 
- Runner() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
 
-  
 
- Runner() - Constructor for class org.apache.beam.runners.gearpump.GearpumpRunnerRegistrar.Runner
 
-  
 
- Runner() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
 
-  
 
- runResourceIdBattery(ResourceId) - Static method in class org.apache.beam.sdk.io.fs.ResourceIdTester
 
- 
 
- runWindowFn(WindowFn<T, W>, List<Long>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Runs the 
WindowFn over the provided input, returning a map of windows to the timestamps
 in those windows.
 
 
- runWindowFnWithValue(WindowFn<T, W>, List<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Runs the 
WindowFn over the provided input, returning a map of windows to the timestamps
 in those windows.
 
 
- S3ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws.options
 
- 
Construct AmazonS3ClientBuilder from S3 pipeline options.
 
- S3FileSystemRegistrar - Class in org.apache.beam.sdk.io.aws.s3
 
- 
AutoService registrar for the S3FileSystem.
 
- S3FileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.s3.S3FileSystemRegistrar
 
-  
 
- S3Options - Interface in org.apache.beam.sdk.io.aws.options
 
- 
Options used to configure Amazon Web Services S3.
 
- S3Options.S3UploadBufferSizeBytesFactory - Class in org.apache.beam.sdk.io.aws.options
 
- 
Provide the default S3 upload buffer size in bytes: 64 MB if more than 512 MB of RAM are
 available, and 5 MB otherwise.
 
- S3UploadBufferSizeBytesFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.S3Options.S3UploadBufferSizeBytesFactory
 
-  
 
- Sample - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for taking samples of the elements in a PCollection, or samples of
 the values associated with each key in a PCollection of KVs.
 
- Sample() - Constructor for class org.apache.beam.sdk.transforms.Sample
 
-  
 
- Sample.FixedSizedSampleFn<T> - Class in org.apache.beam.sdk.transforms
 
- 
CombineFn that computes a fixed-size sample of a collection of values.
 
- satisfies(RelTrait) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
-  
 
- satisfies(SerializableFunction<Iterable<T>, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
 
- 
Applies the provided checking function (presumably containing assertions) to the iterable in
 question.
 
- satisfies(SerializableFunction<Iterable<T>, Void>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
-  
 
- satisfies(SerializableFunction<T, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
 
- 
Applies the provided checking function (presumably containing assertions) to the value in
 question.
 
- SCALAR_FIELD_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- schema - Variable in class org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable
 
-  
 
- schema(Schema) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
-  
 
- Schema - Class in org.apache.beam.sdk.schemas
 
- 
 
- Schema(List<Schema.Field>) - Constructor for class org.apache.beam.sdk.schemas.Schema
 
-  
 
- Schema.Builder - Class in org.apache.beam.sdk.schemas
 
- 
Builder class for building 
Schema objects.
 
 
- Schema.Field - Class in org.apache.beam.sdk.schemas
 
- 
Field of a row.
 
- Schema.Field.Builder - Class in org.apache.beam.sdk.schemas
 
- 
 
- Schema.FieldType - Class in org.apache.beam.sdk.schemas
 
- 
A descriptor of a single field type.
 
- Schema.TypeName - Enum in org.apache.beam.sdk.schemas
 
- 
An enumerated list of type constructors.
 
- SchemaAndRecord - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
A wrapper for a 
GenericRecord and the 
TableSchema representing the schema of the
 table (or query) it was generated from.
 
 
- SchemaAndRecord(GenericRecord, TableSchema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
-  
 
- SchemaCoder<T> - Class in org.apache.beam.sdk.schemas
 
- 
SchemaCoder is used as the coder for types that have schemas registered.
 
 
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.DefaultSchema.DefaultSchemaProvider
 
-  
 
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
 
-  
 
- schemaFor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
 
-  
 
- schemaFor(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
 
- 
Lookup a schema for the given type.
 
- schemaFromClass(Class<?>, Function<Class, List<StaticSchemaInference.TypeInformation>>) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
 
- 
Infer a schema from a Java class.
 
- schemaFromJavaBeanClass(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
 
- 
Create a 
Schema for a Java Bean class.
 
 
- schemaFromPojoClass(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
 
-  
 
- SchemaProvider - Interface in org.apache.beam.sdk.schemas
 
- 
Concrete implementations of this class allow creation of schema service objects that vend a
 
Schema for a specific type.
 
 
- SchemaProviderRegistrar - Interface in org.apache.beam.sdk.schemas
 
- 
SchemaProvider implementations can be registered automatically with this SDK by creating a 
ServiceLoader entry
 and a concrete implementation of this interface.
 
 
- SchemaRegistry - Class in org.apache.beam.sdk.schemas
 
- 
 
- scopedMetricsContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
 
- 
 
- SDFFeederViaStateAndTimers<InputT,RestrictionT> - Class in org.apache.beam.runners.fnexecution.splittabledofn
 
- 
Helper class for feeding element/restriction pairs into a PTransformTranslation.SPLITTABLE_PROCESS_ELEMENTS_URN transform, implementing checkpointing
 only, by using state and timers to store the last element/restriction pair, similarly to
 SplittableParDoViaKeyedWorkItems.ProcessFn but in a portable
 fashion.
 
- SDFFeederViaStateAndTimers(StateInternals, TimerInternals, Coder<InputT>, Coder<RestrictionT>, Coder<BoundedWindow>) - Constructor for class org.apache.beam.runners.fnexecution.splittabledofn.SDFFeederViaStateAndTimers
 
- 
Initializes the feeder.
 
- SDK_WORKER_PARALLELISM_PIPELINE - Static variable in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- SDK_WORKER_PARALLELISM_STAGE - Static variable in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- SdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
 
- 
A high-level client for an SDK harness.
 
- SdkHarnessClient.ActiveBundle - Class in org.apache.beam.runners.fnexecution.control
 
- 
An active bundle for a particular BeamFnApi.ProcessBundleDescriptor.
 
- SdkHarnessClient.BundleProcessor - Class in org.apache.beam.runners.fnexecution.control
 
- 
A processor capable of creating bundles for some registered BeamFnApi.ProcessBundleDescriptor.
 
- SdkHarnessLogLevelOverrides() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
 
-  
 
- SdkHarnessOptions - Interface in org.apache.beam.sdk.options
 
- 
Options that are used to control configuration of the SDK harness.
 
- SdkHarnessOptions.LogLevel - Enum in org.apache.beam.sdk.options
 
- 
The set of log levels that can be used in the SDK harness.
 
- SdkHarnessOptions.SdkHarnessLogLevelOverrides - Class in org.apache.beam.sdk.options
 
- 
Defines a log level override for a specific class, package, or name.
 
- seed(WindowedValue<KV<InputT, RestrictionT>>) - Method in class org.apache.beam.runners.fnexecution.splittabledofn.SDFFeederViaStateAndTimers
 
- 
Passes the initial element/restriction pair.
 
- seekRow(Row) - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
 
- 
Return a list of Rows with the given key set.
 
- Select<T> - Class in org.apache.beam.sdk.schemas.transforms
 
- 
A 
PTransform for selecting a subset of fields from a schema type.
 
 
- send(LogicalEndpoint, Coder<WindowedValue<T>>) - Method in interface org.apache.beam.runners.fnexecution.data.FnDataService
 
- 
Creates a receiver to which you can write data values and have them sent over this data plane
 service.
 
- send(LogicalEndpoint, Coder<WindowedValue<T>>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
 
-  
 
- SerializableCoder<T extends java.io.Serializable> - Class in org.apache.beam.sdk.coders
 
- 
A 
Coder for Java classes that implement 
Serializable.
 
 
- SerializableCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.SerializableCoder
 
-  
 
- SerializableCoder.SerializableCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
 
- 
 
- SerializableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
 
-  
 
- SerializableComparator<T> - Interface in org.apache.beam.sdk.transforms
 
- 
A Comparator that is also Serializable.
 
- SerializableConfiguration - Class in org.apache.beam.sdk.io.hadoop
 
- 
A wrapper to allow Hadoop Configurations to be serialized using Java's standard
 serialization mechanisms.
 
- SerializableConfiguration() - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
-  
 
- SerializableConfiguration(Configuration) - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
-  
 
- SerializableFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.transforms
 
- 
A function that computes an output value of type OutputT from an input value of type
 InputT and is Serializable.
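
A minimal sketch of a SerializableFunction expressed as a lambda and used with MapElements (the collection contents are illustrative):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

public class SerializableFunctionExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    PCollection<String> words = p.apply(Create.of("beam", "pipeline"));
    // A SerializableFunction<String, Integer>; the lambda must be serializable
    // because it is shipped to workers.
    SerializableFunction<String, Integer> length = (String w) -> w.length();
    PCollection<Integer> lengths =
        words.apply(MapElements.into(TypeDescriptors.integers()).via(length));
    p.run().waitUntilFinish();
  }
}
```

The `into(...)` call supplies the output TypeDescriptor that lambdas erase at runtime.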
 
- SerializableFunctions - Class in org.apache.beam.sdk.transforms
 
- 
 
- SerializableFunctions() - Constructor for class org.apache.beam.sdk.transforms.SerializableFunctions
 
-  
 
- SerializableMatcher<T> - Interface in org.apache.beam.sdk.testing
 
- 
A 
Matcher that is also 
Serializable.
 
 
- SerializableMatchers - Class in org.apache.beam.sdk.testing
 
- 
 
- SerializableMatchers.SerializableSupplier<T> - Interface in org.apache.beam.sdk.testing
 
- 
Supplies values of type T, and is serializable.
 
- SerializableSplit() - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableSplit
 
-  
 
- SerializableSplit(InputSplit) - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableSplit
 
-  
 
- serialize(String, Instant) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
-  
 
- serialize(ValueProvider<?>, JsonGenerator, SerializerProvider) - Method in class org.apache.beam.sdk.options.ValueProvider.Serializer
 
-  
 
- SERIALIZED_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- Serializer() - Constructor for class org.apache.beam.sdk.options.ValueProvider.Serializer
 
-  
 
- serializeTimers(Collection<TimerInternals.TimerData>, TimerInternals.TimerDataCoder) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- ServerConfiguration() - Constructor for class org.apache.beam.runners.flink.FlinkJobServerDriver.ServerConfiguration
 
-  
 
- serverDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
- 
 
- ServerFactory - Class in org.apache.beam.runners.fnexecution
 
- 
A gRPC server factory.
 
- ServerFactory() - Constructor for class org.apache.beam.runners.fnexecution.ServerFactory
 
-  
 
- ServerFactory.InetSocketAddressServerFactory - Class in org.apache.beam.runners.fnexecution
 
- 
Creates a gRPC Server using the default server factory.
 
- ServerFactory.UrlFactory - Interface in org.apache.beam.runners.fnexecution
 
- 
Factory that constructs client-accessible URLs from a local server address and port.
 
- Sessions - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- set(long) - Method in interface org.apache.beam.sdk.metrics.Gauge
 
- 
Set current value for this gauge.
 
- set(ObjectT, ValueT) - Method in interface org.apache.beam.sdk.schemas.FieldValueSetter
 
- 
Sets the specified field on object to value.
 
- set() - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
 
- set(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
 
- set(Instant) - Method in interface org.apache.beam.sdk.state.Timer
 
- 
Sets or resets the time in the timer's 
TimeDomain at which it should fire.
 
 
- set(long...) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
 
- setApiRootUrl(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setApplicationName(String) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- setApplicationName(String) - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- setAppName(String) - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
 
-  
 
- setAutoscalingAlgorithm(DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setAwsCredentialsProvider(AWSCredentialsProvider) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
-  
 
- setAwsRegion(String) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
-  
 
- setAwsServiceEndpoint(String) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
-  
 
- setBatchIntervalMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setBlockOnRun(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- setBundleSize(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setCheckpointDir(String) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setCheckpointDurationMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setCheckpointingInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setCheckpointingMode(CheckpointingMode) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setCheckpointTimeoutMillis(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setClaimObserver(RestrictionTracker.ClaimObserver<PositionT>) - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
- 
 
- setClientConfiguration(ClientConfiguration) - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
 
-  
 
- setClientContext(ClientContext) - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- setCloningBehavior(DoFnTester.CloningBehavior) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- SetCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
 
- SetCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.SetCoder
 
-  
 
- setCoder(Coder<T>) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
Sets the 
Coder used by this 
PCollection to encode and decode the values stored
 in it.
 
 
- setCoderRegistry(CoderRegistry) - Method in class org.apache.beam.sdk.Pipeline
 
- 
 
- setConfigFile(String) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- setCountryOfResidence(String) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- setCredentialFactoryClass(Class<? extends CredentialFactory>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
-  
 
- setCurrentContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
 
- 
 
- setCurrentTransform(TransformHierarchy.Node, Pipeline) - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- setCustomerId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
-  
 
- setDataflowClient(Dataflow) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setDataflowEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setDataflowJobFile(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setDebuggee(Debuggee) - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
-  
 
- setDefaultEnvironmentConfig(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- setDefaultEnvironmentType(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- setDefaultJavaEnvironmentUrl(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- setDefaultPipelineOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
 
- 
Sets the default configuration in workers.
 
- setDefaultSdkHarnessLogLevel(SdkHarnessOptions.LogLevel) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
-  
 
- setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
  
- setDescription(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
-  
 
- setDiskSizeGb(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setDumpHeapOnOOM(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setEmbeddedCluster(EmbeddedCluster) - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- setEmbeddedExecution(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- setEmbeddedExecutionDebugMode(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- setEnableCloudDebugger(boolean) - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
-  
 
- setEnableMetrics(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setEnableSparkMetricSinks(Boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setEnforceEncodability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- setEnforceImmutability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- setExecutionRetryDelay(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setExecutorService(ExecutorService) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
-  
 
- setExpectedAssertions(Integer) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
-  
 
- setExperiments(List<String>) - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
 
-  
 
- setExternalizedCheckpointsEnabled(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setFilesToStage(List<String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setFilesToStage(List<String>) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setFilesToStage(List<String>) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setFilesToStage(List<String>) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- setFlinkMaster(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setForceStreaming(boolean) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
-  
 
- setGcpCredential(Credentials) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
-  
 
- setGcpTempLocation(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
-  
 
- setGcsEndpoint(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
-  
 
- setGcsUploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
-  
 
- setGcsUploadBufferSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
-  
 
- setGcsUtil(GcsUtil) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
-  
 
- setGoogleApiTrace(GoogleApiDebugOptions.GoogleApiTracer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
 
-  
 
- setHdfsConfiguration(List<Configuration>) - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
 
-  
 
- setHooks(DataflowRunnerHooks) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
- 
Sets callbacks to invoke during execution; see DataflowRunnerHooks.
 
- setId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- setId(int) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
 
-  
 
- setIsBoundedInternal(PCollection.IsBounded) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- setIsReadSeekEfficient(boolean) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
-  
 
- setJobEndpoint(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- setJobId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
-  
 
- setJobName(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setJobServerConfig(String...) - Method in interface org.apache.beam.runners.reference.testing.TestPortablePipelineOptions
 
-  
 
- setJobServerDriver(Class) - Method in interface org.apache.beam.runners.reference.testing.TestPortablePipelineOptions
 
-  
 
- setLabels(Map<String, String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- setLatencyTrackingInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setListeners(List<JavaStreamingListener>) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
-  
 
- setMaxBundleSize(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setMaxBundleTimeMills(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setMaxConditionCost(double) - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
-  
 
- setMaxNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setMaxRecordsPerBatch(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setMetricsHttpSinkUrl(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setMetricsPushPeriod(Long) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setMetricsSink(Class<? extends MetricsSink>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setMetricsSupported(boolean) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
 
- 
Called by the runner to indicate whether metrics reporting is supported.
 
- setMimeType(String) - Method in class org.apache.beam.sdk.io.fs.CreateOptions.Builder
 
-  
 
- setMinCpuPlatform(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setMinPauseBetweenCheckpoints(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setMinReadTimeMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setName(String) - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
 
-  
 
- setName(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
-  
 
- setName(String) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
 
- setName(String) - Method in class org.apache.beam.sdk.values.PValueBase
 
- 
 
- setNetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setNullable(Boolean) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
-  
 
- setNumberOfExecutionRetries(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setNumberOfWorkerHarnessThreads(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setObjectReuse(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setOnCreateMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- setOnSuccessMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- SetOperatorFilteringDoFn(TupleTag<Row>, TupleTag<Row>, BeamSetOperatorRelBase.OpType, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.SetOperatorFilteringDoFn
 
-  
 
- setOptionsId(long) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setOutputStream(PValue, JavaStream<OutputT>) - Method in class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- setOverrideWindmillBinary(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setParallelism(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setParallelism(int) - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- setParameters(T, PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.PreparedStatementSetter
 
-  
 
- setParameters(PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.StatementPreparator
 
-  
 
- setParDoFusionEnabled(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- setPathValidator(PathValidator) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
-  
 
- setPathValidatorClass(Class<? extends PathValidator>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
-  
 
- setPipelineUrl(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- setProfilingAgentConfiguration(DataflowProfilingOptions.DataflowProfilingAgentConfiguration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
-  
 
- setProject(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- setProject(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
-  
 
- setProvidedSparkContext(JavaSparkContext) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
-  
 
- setProviderRuntimeValues(ValueProvider<Map<String, Object>>) - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
 
-  
 
- setPubsubRootUrl(String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
 
-  
 
- setReadTimePercentage(Double) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setRegion(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- setRelative() - Method in interface org.apache.beam.sdk.state.Timer
 
- 
Sets the timer relative to the current time, according to any offset and alignment specified.
 
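The "offset and alignment" rule combines as: take the current time, add the offset, then round up to the next alignment boundary. A plain-Java sketch of that arithmetic (illustrative only; `TimerMath` and its parameters are hypothetical, and the real Timer API expresses offsets and alignment through its own methods):

```java
public class TimerMath {
    // Target firing time: current time plus offset, rounded up to the next
    // multiple of alignToMillis (all values in milliseconds).
    public static long relativeFireTime(long nowMillis, long offsetMillis, long alignToMillis) {
        long target = nowMillis + offsetMillis;
        if (alignToMillis <= 0) {
            return target; // no alignment requested
        }
        long remainder = Math.floorMod(target, alignToMillis);
        return remainder == 0 ? target : target + (alignToMillis - remainder);
    }
}
```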
- setResourceId(ResourceId) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
-  
 
- setRetainDockerContainers(boolean) - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
 
-  
 
- setRetainExternalizedCheckpointsOnCancellation(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setRowSchema(Schema) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
Sets a schema on this PCollection.
 
- setRunMillis(long) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- setRunner(Class<? extends PipelineRunner<?>>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setRunnerDeterminedSharding(boolean) - Method in interface org.apache.beam.runners.direct.DirectTestOptions
 
-  
 
- sets(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- setS3ClientFactoryClass(Class<? extends S3ClientBuilderFactory>) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- setS3StorageClass(String) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- setS3ThreadPoolSize(int) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- setS3UploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- setSaveHeapDumpsToGcsPath(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setSaveProfilesToGcs(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
-  
 
- setSchema(Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
 
- setSdkHarnessLogLevelOverrides(SdkHarnessOptions.SdkHarnessLogLevelOverrides) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
 
-  
 
- setSdkWorkerParallelism(String) - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
 
-  
 
- setSerializers(Map<String, String>) - Method in interface org.apache.beam.runners.gearpump.GearpumpPipelineOptions
 
-  
 
- setServiceAccount(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- setShutdownSourcesOnFinalWatermark(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setSideInput(PCollectionView<T>, BoundedWindow, T) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- setSideInputs(Map<PCollectionView<?>, Map<BoundedWindow, ?>>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- setSizeBytes(long) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
-  
 
- setSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
-  
 
- setSparkMaster(String) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setSSEAlgorithm(String) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- setSSEAwsKeyManagementParams(SSEAwsKeyManagementParams) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- setSSECustomerKey(SSECustomerKey) - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
 
-  
 
- setStableUniqueNames(PipelineOptions.CheckEnabled) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setStager(Stager) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setStagerClass(Class<? extends Stager>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setStagingLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- SetState<T> - Interface in org.apache.beam.sdk.state
 
- 
 
- setStateBackend(StateBackend) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
-  
 
- setStopPipelineWatermark(Long) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
-  
 
- setStorageLevel(String) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setStreaming(boolean) - Method in interface org.apache.beam.sdk.options.StreamingOptions
 
-  
 
- setSubnetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setTargetDataset(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
-  
 
- setTargetParallelism(int) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
-  
 
- setTempDatasetId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
-  
 
- setTemplateLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- setTempLocation(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setTempRoot(String) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- setTestTimeoutSeconds(Long) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
-  
 
- setTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate.TimerUpdateBuilder
 
- 
Adds the provided timer to the collection of set timers, removing it from deleted timers if
 it has previously been deleted.
 
- setTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- setTimer(StateNamespace, String, Instant, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- setTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
 
- 
Sets a timer to fire when the event time watermark, the current processing time, or the
 synchronized processing time watermark surpasses a given timestamp.
 
- setTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
-  
 
- setTransformNameMapping(Map<String, String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setTupleTracingEnabled(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
-  
 
- setType(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Field.Builder
 
-  
 
- setTypeDescriptor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
 
- setup() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
-  
 
- setup() - Method in class org.apache.beam.runners.gearpump.translators.functions.DoFnFunction
 
-  
 
- setUp() - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
 
- 
Prepares the instance.
 
- setUpdate(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
-  
 
- setUsePublicIps(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setUserAgent(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
-  
 
- setUsesProvidedSparkContext(boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
-  
 
- setUUID(UUID) - Method in class org.apache.beam.sdk.schemas.Schema
 
- 
Set this schema's UUID.
 
- setWindmillServiceEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setWindmillServicePort(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
-  
 
- setWindowedWrites(boolean) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
Indicates that the operation will be performing windowed writes.
 
- setWindowingStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- setWorkerCacheMb(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
-  
 
- setWorkerDiskType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setWorkerHarnessContainerImage(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setWorkerId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
-  
 
- setWorkerLogLevelOverrides(DataflowWorkerLoggingOptions.WorkerLogLevelOverrides) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
  
- setWorkerMachineType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setWorkerSystemErrMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
  
- setWorkerSystemOutMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
- 
Deprecated.
  
- setZone(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
-  
 
- setZone(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
-  
 
- ShardedKey<K> - Class in org.apache.beam.sdk.values
 
- 
A key and a shard number.
 
- ShardedKeyCoder<KeyT> - Class in org.apache.beam.sdk.coders
 
- 
 
- ShardedKeyCoder(Coder<KeyT>) - Constructor for class org.apache.beam.sdk.coders.ShardedKeyCoder
 
-  
 
- ShardNameTemplate - Class in org.apache.beam.sdk.io
 
- 
Standard shard naming templates.
 
- ShardNameTemplate() - Constructor for class org.apache.beam.sdk.io.ShardNameTemplate
 
-  
 
- shorts() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- shouldDefer(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
-  
 
- shouldResume() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
 
- 
If false, the DoFn promises that there is no more work remaining for the current element, so the runner should not resume the DoFn.ProcessElement call.
 
 
- shouldRetry(InsertRetryPolicy.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
 
- 
Returns true if this failure should be retried.
 
- sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
 
- 
Returns the value of a given side input.
 
- sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 
- 
Returns the value of a given side input.
 
- sideInput(PCollectionView<T>) - Method in interface org.apache.beam.sdk.state.StateContext
 
- 
Returns the value of the side input for the corresponding state window.
 
- sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
 
- 
Returns the value of the side input for the window corresponding to the main input's window
 in which values are being combined.
 
- sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.Contextful.Fn.Context
 
- 
Accesses the given side input.
 
- sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
 
- 
Returns the value of the side input.
 
- SideInputBroadcast<T> - Class in org.apache.beam.runners.spark.util
 
- 
Broadcast helper for side inputs.
 
- sideInputId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
-  
 
- sideInputJoin(PCollection<KV<Row, Row>>, PCollection<KV<Row, Row>>, Schema, Schema) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
 
-  
 
- SideInputJoinDoFn(JoinRelType, Row, PCollectionView<Map<Row, Iterable<Row>>>, boolean, Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.SideInputJoinDoFn
 
-  
 
- SideInputSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
-  
 
- signalStart() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
- 
Outputs a message that the pipeline has started.
 
- signalSuccessWhen(Coder<T>, SerializableFunction<T, String>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
- 
Outputs a success message when the successPredicate evaluates to true.
 
- signalSuccessWhen(Coder<T>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
- 
 
- SimpleCombineFn(SerializableFunction<Iterable<V>, V>) - Constructor for class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
 
- 
Deprecated.
  
- SimpleFunction<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- SimpleFunction() - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
 
-  
 
- SimpleFunction(SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
 
-  
 
- SimpleRemoteEnvironment() - Constructor for class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
 
-  
 
- SingleEnvironmentInstanceJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
 
- 
 
- singleOutputOverrideFactory(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
 
- 
Returns a PTransformOverrideFactory that replaces a single-output ParDo with a composite transform specialized for the DataflowRunner.
 
 
- singletonView(PCollection<KV<Void, T>>, WindowingStrategy<?, W>, boolean, T, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
 
- 
Returns a PCollectionView<T> capable of processing elements windowed using the provided WindowingStrategy.
 
 
- sink(Class<ElementT>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- Sink() - Constructor for class org.apache.beam.sdk.io.AvroIO.Sink
 
-  
 
- sink - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
The Sink that this WriteOperation will write to.
 
- sink(Schema) - Static method in class org.apache.beam.sdk.io.parquet.ParquetIO
 
- 
 
- Sink() - Constructor for class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
-  
 
- sink() - Static method in class org.apache.beam.sdk.io.TextIO
 
- 
 
- Sink() - Constructor for class org.apache.beam.sdk.io.TextIO.Sink
 
-  
 
- sink() - Static method in class org.apache.beam.sdk.io.TFRecordIO
 
- 
 
- Sink() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Sink
 
-  
 
- sink(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.XmlIO
 
- 
Outputs records as XML-formatted elements using JAXB.
 
- Sink() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
-  
 
- SinkMetrics - Class in org.apache.beam.sdk.metrics
 
- 
Standard Sink Metrics.
 
- SinkMetrics() - Constructor for class org.apache.beam.sdk.metrics.SinkMetrics
 
-  
 
- sinkViaGenericRecords(Schema, AvroIO.RecordFormatter<ElementT>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- size() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Returns the number of bytes in the backing array that are valid.
 
- size() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
- 
Returns the number of columns for this schema.
 
- size() - Method in class org.apache.beam.sdk.values.PCollectionList
 
- 
 
- size() - Method in class org.apache.beam.sdk.values.TupleTagList
 
- 
Returns the number of TupleTags in this TupleTagList.
 
- sizeBytes() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
-  
 
- Sketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
 
-  
 
- SketchFrequencies - Class in org.apache.beam.sdk.extensions.sketching
 
- 
PTransforms to compute the estimated frequency of each element in a stream.
 
- SketchFrequencies() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
 
-  
 
- SketchFrequencies.CountMinSketchFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- SketchFrequencies.GlobalSketch<InputT> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- SketchFrequencies.PerKeySketch<K,V> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- SketchFrequencies.Sketch<T> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
Wraps StreamLib's Count-Min sketch to support counting all user types by hashing the encoded
 user type using the supplied deterministic coder.
 
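The Count-Min sketch underlying SketchFrequencies estimates per-element frequencies in bounded memory, never undercounting. A minimal pure-Java sketch of the data structure (illustrative only: no Beam or StreamLib dependency, and the `CountMin` class and its hash scheme are hypothetical, not Beam's API):

```java
// Minimal Count-Min sketch: each element increments one counter per row;
// the estimate is the minimum counter across rows, so it can overcount
// (hash collisions) but never undercount.
public class CountMin {
    private final long[][] table;
    private final int depth, width;

    public CountMin(int depth, int width) {
        this.depth = depth;
        this.width = width;
        this.table = new long[depth][width];
    }

    // One cheap per-row hash derived from the element's hashCode.
    private int bucket(int row, Object element) {
        int h = element.hashCode() * (31 * row + 17);
        return Math.floorMod(h, width);
    }

    public void add(Object element) {
        for (int row = 0; row < depth; row++) {
            table[row][bucket(row, element)]++;
        }
    }

    public long estimate(Object element) {
        long min = Long.MAX_VALUE;
        for (int row = 0; row < depth; row++) {
            min = Math.min(min, table[row][bucket(row, element)]);
        }
        return min;
    }
}
```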
- skipInvalidRows() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Insert all valid rows of a request, even if invalid rows exist.
 
- Slf4jLogWriter - Class in org.apache.beam.runners.fnexecution.logging
 
- 
A LogWriter which uses an SLF4J Logger as the underlying log backend.
 
 
- SlidingWindows - Class in org.apache.beam.sdk.transforms.windowing
 
- 
A WindowFn that windows values into possibly overlapping fixed-size timestamp-based windows.
 
 
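The assignment rule for sliding windows can be sketched in plain arithmetic: an element with timestamp t belongs to every window of length `size` whose start is a multiple of `period` and satisfies start <= t < start + size. An illustrative pure-Java sketch of that rule (the `Sliding` class is hypothetical, not the Beam WindowFn API):

```java
import java.util.ArrayList;
import java.util.List;

public class Sliding {
    // Returns the start timestamps of all windows of length `size`
    // (sliding every `period`) that contain timestamp t, newest first.
    public static List<Long> windowStartsFor(long t, long size, long period) {
        List<Long> starts = new ArrayList<>();
        long lastStart = t - Math.floorMod(t, period); // latest window start <= t
        for (long start = lastStart; start > t - size; start -= period) {
            starts.add(start);
        }
        return starts;
    }
}
```

For size 10 and period 5, each element lands in two windows, which is why sliding windows can duplicate elements across windows.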
- SMALL_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- smallest(int) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the smallest count elements of
 the input PCollection<T>, in increasing order, sorted according to their natural order.
 
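The result of smallest(count) — a single list holding the count smallest inputs in increasing order — can be mimicked with plain Java collections. This is an illustrative sketch of the output semantics only (the `TopDemo` class is hypothetical), not the PTransform itself:

```java
import java.util.List;
import java.util.stream.Collectors;

public class TopDemo {
    // Mirrors what Top.smallest(count) produces as its single output element:
    // the `count` smallest elements, in increasing natural order.
    public static <T extends Comparable<T>> List<T> smallest(List<T> input, int count) {
        return input.stream()
            .sorted()
            .limit(count)
            .collect(Collectors.toList());
    }
}
```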
- Smallest() - Constructor for class org.apache.beam.sdk.transforms.Top.Smallest
 
- 
Deprecated.
  
- smallestDoublesFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
 
- smallestFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
 
- smallestIntsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
 
- smallestLongsFn(int) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
 
- smallestPerKey(int) - Static method in class org.apache.beam.sdk.transforms.Top
 
- 
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a
 PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key
 in the input PCollection to the smallest count values associated with that key
 in the input PCollection<KV<K, V>>, in increasing order, sorted according to their
 natural order.
 
- SnappyCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
Wraps an existing coder with Snappy compression.
 
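The wrapping idea — encode with the inner coder, then compress the resulting bytes — can be shown with stdlib GZIP standing in for Snappy, since Snappy is not in the JDK. This is an analogous sketch under that substitution (the `CompressingCoder` class is hypothetical, not SnappyCoder's code, and the "inner coder" here is simply UTF-8 for strings):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class CompressingCoder {
    // Encode: inner encoding (UTF-8) followed by compression.
    public static byte[] encode(String value) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(bytes)) {
            gzip.write(value.getBytes(StandardCharsets.UTF_8));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bytes.toByteArray();
    }

    // Decode: decompression followed by the inner decoding.
    public static String decode(byte[] encoded) {
        try (GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream(encoded))) {
            return new String(gzip.readAllBytes(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```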
- snapshot(SchemaVersion) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
 
-  
 
- SnsCoderProviderRegistrar - Class in org.apache.beam.sdk.io.aws.sns
 
- 
 
- SnsCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsCoderProviderRegistrar
 
-  
 
- SnsIO - Class in org.apache.beam.sdk.io.aws.sns
 
- 
 
- SnsIO() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsIO
 
-  
 
- SnsIO.RetryConfiguration - Class in org.apache.beam.sdk.io.aws.sns
 
- 
A POJO encapsulating a configuration for retry behavior when issuing requests to SNS.
 
- SnsIO.Write - Class in org.apache.beam.sdk.io.aws.sns
 
- 
 
- SocketAddressFactory - Class in org.apache.beam.sdk.fn.channel
 
- 
Creates a SocketAddress based upon a supplied string.
 
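The general job this factory performs — turning a "host:port" string into a SocketAddress — can be sketched with a standalone parser. Illustrative only: the `AddressParse` class is hypothetical and, unlike the real factory, handles only TCP-style addresses:

```java
import java.net.InetSocketAddress;

public class AddressParse {
    // Splits "host:port" at the last ':' (so hosts containing colons keep
    // them), then builds an unresolved InetSocketAddress.
    public static InetSocketAddress parse(String value) {
        int idx = value.lastIndexOf(':');
        if (idx < 0) {
            throw new IllegalArgumentException("expected host:port, got " + value);
        }
        String host = value.substring(0, idx);
        int port = Integer.parseInt(value.substring(idx + 1));
        return InetSocketAddress.createUnresolved(host, port);
    }
}
```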
- SocketAddressFactory() - Constructor for class org.apache.beam.sdk.fn.channel.SocketAddressFactory
 
-  
 
- SolrIO - Class in org.apache.beam.sdk.io.solr
 
- 
Transforms for reading and writing data from/to Solr.
 
- SolrIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.solr
 
- 
A POJO describing a connection configuration to Solr.
 
- SolrIO.Read - Class in org.apache.beam.sdk.io.solr
 
- 
 
- SolrIO.RetryConfiguration - Class in org.apache.beam.sdk.io.solr
 
- 
A POJO encapsulating a configuration for retry behavior when issuing requests to Solr.
 
- SolrIO.Write - Class in org.apache.beam.sdk.io.solr
 
- 
 
- sort() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
-  
 
- SORT_VALUES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- SortValues<PrimaryKeyT,SecondaryKeyT,ValueT> - Class in org.apache.beam.sdk.extensions.sorter
 
- 
SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> takes a PCollection<KV<PrimaryKeyT,
 Iterable<KV<SecondaryKeyT, ValueT>>>> with elements consisting of a primary key and iterables
 over <secondary key, value> pairs, and returns a PCollection<KV<PrimaryKeyT,
 Iterable<KV<SecondaryKeyT, ValueT>>> of the same elements but with values sorted by a secondary
 key.
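The per-key sorting contract described above can be sketched in plain Java. This illustrates only the semantics (sorting each primary key's iterable by secondary key), not the Beam transform itself; the class and method names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class SortValuesDemo {
    // Illustrates the semantics of SortValues: within each primary key's
    // iterable, sort the <secondary key, value> pairs by secondary key.
    static List<Map.Entry<String, Integer>> sortBySecondaryKey(
            List<Map.Entry<String, Integer>> pairs) {
        List<Map.Entry<String, Integer>> sorted = new ArrayList<>(pairs);
        sorted.sort(Map.Entry.comparingByKey());
        return sorted;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = List.of(
                Map.entry("b", 2), Map.entry("a", 1), Map.entry("c", 3));
        System.out.println(sortBySecondaryKey(pairs));
    }
}
```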
 
- Source<T> - Class in org.apache.beam.sdk.io
 
- 
Base class for defining input formats and creating a Source for reading the input.
 
- Source() - Constructor for class org.apache.beam.sdk.io.Source
 
-  
 
- Source.Reader<T> - Class in org.apache.beam.sdk.io
 
- 
The interface that readers of custom input sources must implement.
 
- SOURCE_DOES_NOT_NEED_SPLITTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- SOURCE_ESTIMATED_SIZE_BYTES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- SOURCE_IS_INFINITE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- SOURCE_METADATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- SOURCE_SPEC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- SOURCE_STEP_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- SourceMetrics - Class in org.apache.beam.sdk.metrics
 
- 
 
- SourceMetrics() - Constructor for class org.apache.beam.sdk.metrics.SourceMetrics
 
-  
 
- sourceName() - Method in class org.apache.beam.runners.spark.metrics.AggregatorMetricSource
 
-  
 
- sourceName() - Method in class org.apache.beam.runners.spark.metrics.CompositeSource
 
-  
 
- sourceName() - Method in class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
-  
 
- SourceRDD - Class in org.apache.beam.runners.spark.io
 
- 
Classes implementing Beam 
Source RDDs.
 
 
- SourceRDD() - Constructor for class org.apache.beam.runners.spark.io.SourceRDD
 
-  
 
- SourceRDD.Bounded<T> - Class in org.apache.beam.runners.spark.io
 
- 
 
- SourceRDD.Unbounded<T,CheckpointMarkT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.runners.spark.io
 
- 
 
- SourceTestUtils - Class in org.apache.beam.sdk.testing
 
- 
Helper functions and test harnesses for checking correctness of 
Source implementations.
 
 
- SourceTestUtils() - Constructor for class org.apache.beam.sdk.testing.SourceTestUtils
 
-  
 
- SourceTestUtils.ExpectedSplitOutcome - Enum in org.apache.beam.sdk.testing
 
- 
 
- span(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
- 
Returns the minimal window that includes both this window and the given window.
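The span operation on half-open intervals reduces to a min over starts and a max over ends. A minimal sketch using longs in place of Instants (hypothetical helper, not the Beam API):

```java
public class SpanDemo {
    // Sketch of span on half-open [start, end) intervals: the minimal
    // interval covering both inputs.
    static long[] span(long[] a, long[] b) {
        return new long[] {Math.min(a[0], b[0]), Math.max(a[1], b[1])};
    }

    public static void main(String[] args) {
        long[] merged = span(new long[] {0, 10}, new long[] {5, 20});
        System.out.println(merged[0] + ".." + merged[1]); // 0..20
    }
}
```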
 
- SpannerAccessor - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
Manages lifecycle of DatabaseClient and Spanner instances.
 
- SpannerConfig - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
Configuration for a Cloud Spanner client.
 
- SpannerConfig() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- SpannerConfig.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerIO - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerIO.CreateTransaction - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerIO.CreateTransaction.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerIO.FailureMode - Enum in org.apache.beam.sdk.io.gcp.spanner
 
- 
A failure handling strategy.
 
- SpannerIO.Read - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerIO.ReadAll - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerIO.Write - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
A 
PTransform that writes 
Mutation objects to Google Cloud Spanner.
 
 
- SpannerIO.WriteGrouped - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerWriteResult - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
 
- SpannerWriteResult(Pipeline, PCollection<Void>, PCollection<MutationGroup>, TupleTag<MutationGroup>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
-  
 
- SparkBeamMetricSource - Class in org.apache.beam.runners.spark.metrics
 
- 
A Spark 
Source that is tailored to expose a 
SparkBeamMetric, wrapping an
 underlying 
MetricResults instance.
 
 
- SparkBeamMetricSource(String) - Constructor for class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
-  
 
- SparkContextOptions - Interface in org.apache.beam.runners.spark
 
- 
A custom 
PipelineOptions to work with properties related to 
JavaSparkContext.
 
 
- SparkContextOptions.EmptyListenersList - Class in org.apache.beam.runners.spark
 
- 
Returns an empty list, to avoid handling null.
 
- SparkGroupAlsoByWindowViaWindowSet - Class in org.apache.beam.runners.spark.stateful
 
- 
An implementation of GroupByKeyViaGroupByKeyOnly.GroupAlsoByWindow logic for grouping by windows and controlling
 trigger firings and pane accumulation.
 
- SparkGroupAlsoByWindowViaWindowSet() - Constructor for class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
 
-  
 
- SparkNativePipelineVisitor - Class in org.apache.beam.runners.spark
 
- 
Pipeline visitor for translating a Beam pipeline into equivalent Spark operations.
 
- SparkPipelineOptions - Interface in org.apache.beam.runners.spark
 
- 
Spark runner 
PipelineOptions handles Spark execution-related configurations, such as the
 master address, batch-interval, and other user-related knobs.
 
 
- SparkPipelineOptions.TmpCheckpointDirFactory - Class in org.apache.beam.runners.spark
 
- 
Returns the default checkpoint directory of /tmp/${job.name}.
 
- SparkPipelineResult - Class in org.apache.beam.runners.spark
 
- 
Represents a Spark pipeline execution result.
 
- SparkRunner - Class in org.apache.beam.runners.spark
 
- 
The SparkRunner translates operations defined on a pipeline to a representation executable by
 Spark, and then submits the job to Spark to be executed.
 
- SparkRunner.Evaluator - Class in org.apache.beam.runners.spark
 
- 
Evaluator on the pipeline.
 
- SparkRunnerDebugger - Class in org.apache.beam.runners.spark
 
- 
Pipeline runner which translates a Beam pipeline into equivalent Spark operations, without
 running them.
 
- SparkRunnerDebugger.DebugSparkPipelineResult - Class in org.apache.beam.runners.spark
 
- 
 
- SparkRunnerRegistrar - Class in org.apache.beam.runners.spark
 
- 
 
- SparkRunnerRegistrar.Options - Class in org.apache.beam.runners.spark
 
- 
 
- SparkRunnerRegistrar.Runner - Class in org.apache.beam.runners.spark
 
- 
 
- SparkSideInputReader - Class in org.apache.beam.runners.spark.util
 
- 
A SideInputReader for the SparkRunner.
 
- SparkSideInputReader(Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>) - Constructor for class org.apache.beam.runners.spark.util.SparkSideInputReader
 
-  
 
- SparkTimerInternals - Class in org.apache.beam.runners.spark.stateful
 
- 
An implementation of TimerInternals for the SparkRunner.
 
- SparkTransformOverrides - Class in org.apache.beam.runners.spark
 
- 
 
- SparkTransformOverrides() - Constructor for class org.apache.beam.runners.spark.SparkTransformOverrides
 
-  
 
- SparkUnboundedSource - Class in org.apache.beam.runners.spark.io
 
- 
 
- SparkUnboundedSource() - Constructor for class org.apache.beam.runners.spark.io.SparkUnboundedSource
 
-  
 
- SparkUnboundedSource.Metadata - Class in org.apache.beam.runners.spark.io
 
- 
A metadata holder for an input stream partition.
 
- SparkWatermarks(Instant, Instant, Instant) - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
-  
 
- split(BeamFnApi.BundleSplit) - Method in class org.apache.beam.runners.fnexecution.splittabledofn.SDFFeederViaStateAndTimers
 
- 
Signals that a split happened.
 
- split(int, PipelineOptions) - Method in class org.apache.beam.runners.gearpump.translators.io.ValuesSource
 
-  
 
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
 
- 
Splits the source into bundles of approximately desiredBundleSizeBytes.
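The bundle arithmetic behind such a split can be sketched over an offset range. This is an illustrative helper under the assumption of a simple [start, end) byte range, not the BoundedSource implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class SplitDemo {
    // Sketch: carve [start, end) into bundles of approximately
    // desiredBundleSize, with the last bundle absorbing the remainder.
    static List<long[]> split(long start, long end, long desiredBundleSize) {
        List<long[]> bundles = new ArrayList<>();
        for (long s = start; s < end; s += desiredBundleSize) {
            bundles.add(new long[] {s, Math.min(s + desiredBundleSize, end)});
        }
        return bundles;
    }

    public static void main(String[] args) {
        for (long[] b : split(0, 100, 30)) {
            System.out.println(b[0] + ".." + b[1]); // 0..30, 30..60, 60..90, 90..100
        }
    }
}
```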
 
- split(CassandraIO.Read<T>, long) - Method in interface org.apache.beam.sdk.io.cassandra.CassandraService
 
- 
Split a table read into several sources.
 
- split(CassandraIO.Read<T>, long) - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
 
-  
 
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
-  
 
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
-  
 
- split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
-  
 
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
-  
 
- split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
-  
 
- split(int) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
Returns a list of up to 
numSplits + 1 ByteKeys in ascending order,
 where the keys have been interpolated to form roughly equal sub-ranges of this 
ByteKeyRange, assuming a uniform distribution of keys within this range.
 
 
- split(long, long) - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
-  
 
- split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.UnboundedSource
 
- 
Returns a list of UnboundedSource objects representing the instances of this source
 that should be used when executing the workflow.
 
- split(String) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.Split PTransform that splits a string on the regular expression
 and then outputs each item.
 
 
- split(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.Split PTransform that splits a string on the regular expression
 and then outputs each item.
 
 
- split(String, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.Split PTransform that splits a string on the regular expression
 and then outputs each item.
 
 
- split(Pattern, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
 
- 
Returns a 
Regex.Split PTransform that splits a string on the regular expression
 and then outputs each item.
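The per-element behavior of these Regex.Split overloads can be sketched with java.util.regex alone; the boolean parameter is assumed here to control whether empty items are emitted (hypothetical helper, not the transform itself):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

public class RegexSplitDemo {
    // Sketch: split the input on the regex; drop empty items unless
    // outputEmpty is true. Limit -1 keeps trailing empty strings.
    static List<String> split(Pattern pattern, String input, boolean outputEmpty) {
        List<String> out = new ArrayList<>();
        for (String item : pattern.split(input, -1)) {
            if (outputEmpty || !item.isEmpty()) {
                out.add(item);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Pattern comma = Pattern.compile(",");
        System.out.println(split(comma, "a,,b", false)); // [a, b]
        System.out.println(split(comma, "a,,b", true));  // [a, , b]
    }
}
```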
 
 
- Split(Pattern, boolean) - Constructor for class org.apache.beam.sdk.transforms.Regex.Split
 
-  
 
- SPLIT_POINTS_UNKNOWN - Static variable in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
- 
 
- splitAtFraction(double) - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
- 
Tells the reader to narrow the range of the input it's going to read and give up the
 remainder, so that the new range would contain approximately the given fraction of the amount
 of data in the current range.
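For an offset-based range, the arithmetic behind picking such a split point is simple interpolation. A sketch under the assumption of a uniform [start, end) offset range (hypothetical helper, not the reader API):

```java
public class SplitAtFractionDemo {
    // Sketch: the offset at the given fraction of the current
    // [start, end) range, rounded up so the primary keeps at least
    // the requested fraction's worth of offsets.
    static long splitPosition(long start, long end, double fraction) {
        return start + (long) Math.ceil(fraction * (end - start));
    }

    public static void main(String[] args) {
        System.out.println(splitPosition(0, 100, 0.5)); // 50
    }
}
```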
 
- splitAtFraction(double) - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- SqlCheckConstraint - Class in org.apache.beam.sdk.extensions.sql.impl.parser
 
- 
Parse tree for UNIQUE, PRIMARY KEY constraints.
 
- SqlColumnDeclaration - Class in org.apache.beam.sdk.extensions.sql.impl.parser
 
- 
Parse tree for column.
 
- SqlCreateExternalTable - Class in org.apache.beam.sdk.extensions.sql.impl.parser
 
- 
Parse tree for CREATE EXTERNAL TABLE statement.
 
- SqlCreateExternalTable(SqlParserPos, boolean, boolean, SqlIdentifier, List<Schema.Field>, SqlNode, SqlNode, SqlNode, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
 
- 
Creates a SqlCreateExternalTable.
 
- SqlDdlNodes - Class in org.apache.beam.sdk.extensions.sql.impl.parser
 
- 
Utilities concerning SqlNode for DDL.
 
- SqlDropTable - Class in org.apache.beam.sdk.extensions.sql.impl.parser
 
- 
Parse tree for DROP TABLE statement.
 
- SqlSetOptionBeam - Class in org.apache.beam.sdk.extensions.sql.impl.parser
 
- 
SQL parse tree node to represent SET and RESET statements.
 
- SqlSetOptionBeam(SqlParserPos, String, SqlIdentifier, SqlNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.parser.SqlSetOptionBeam
 
-  
 
- SqlTransform - Class in org.apache.beam.sdk.extensions.sql
 
- 
 
- SqlTransform() - Constructor for class org.apache.beam.sdk.extensions.sql.SqlTransform
 
-  
 
- SqlTypeUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
 
- 
Utils to help with SqlTypes.
 
- SqlTypeUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.SqlTypeUtils
 
-  
 
- SqsIO - Class in org.apache.beam.sdk.io.aws.sqs
 
- 
An unbounded source for Amazon Simple Queue Service (SQS).
 
- SqsIO.Read - Class in org.apache.beam.sdk.io.aws.sqs
 
- 
 
- SqsIO.Write - Class in org.apache.beam.sdk.io.aws.sqs
 
- 
 
- StageBundleFactory - Interface in org.apache.beam.runners.fnexecution.control
 
- 
A bundle factory scoped to a particular 
ExecutableStage, which has all of the resources
 it needs to provide new 
RemoteBundles.
 
 
- stageDefaultFiles() - Method in class org.apache.beam.runners.dataflow.util.GcsStager
 
- 
 
- stageDefaultFiles() - Method in interface org.apache.beam.runners.dataflow.util.Stager
 
- 
Stage default files and return a list of DataflowPackage objects describing the actual
 location at which each file was staged.
 
- stageFiles(List<String>) - Method in class org.apache.beam.runners.dataflow.util.GcsStager
 
- 
 
- stageFiles(List<String>) - Method in interface org.apache.beam.runners.dataflow.util.Stager
 
- 
Stage files and return a list of DataflowPackage objects describing the actual
 location at which each file was staged.
 
- Stager - Interface in org.apache.beam.runners.dataflow.util
 
- 
Interface for staging files needed for running a Dataflow pipeline.
 
- StagerFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
 
-  
 
- stageToFile(byte[], String) - Method in class org.apache.beam.runners.dataflow.util.GcsStager
 
-  
 
- stageToFile(byte[], String) - Method in interface org.apache.beam.runners.dataflow.util.Stager
 
- 
Stage bytes to a target file name wherever this stager stages things.
 
- StagingLocationFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
 
-  
 
- StandardCreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
 
-  
 
- start() - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobServer
 
-  
 
- start() - Method in class org.apache.beam.runners.flink.FlinkJobInvocation
 
-  
 
- start() - Method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
-  
 
- start() - Method in interface org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation
 
- 
Start the job.
 
- start() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
-  
 
- start() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
-  
 
- start() - Method in class org.apache.beam.sdk.io.Source.Reader
 
- 
Initializes the reader and advances the reader to the first record.
 
- start() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
- 
Initializes the reader and advances the reader to the first record.
 
- start() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
- 
Returns the start of this window, inclusive.
 
- startBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
-  
 
- startBundle(DoFn<T, Void>.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
-  
 
- startBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- StartBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
 
-  
 
- startCopyJob(JobReference, JobConfigurationTableCopy) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
 
- 
Start a BigQuery copy job.
 
- startExtractJob(JobReference, JobConfigurationExtract) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
 
- 
Start a BigQuery extract job.
 
- startImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
-  
 
- startImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
- 
 
- startLoadJob(JobReference, JobConfigurationLoad) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
 
- 
Start a BigQuery load job.
 
- startProcess(String, String, List<String>, Map<String, String>) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
 
- 
Forks a process with the given command, arguments, and additional environment variables.
 
- startQueryJob(JobReference, JobConfigurationQuery) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
 
- 
Start a BigQuery query job.
 
- startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
-  
 
- startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
- 
Creates a decompressing channel from the input channel and passes it to its delegate reader's
 FileBasedReader#startReading(ReadableByteChannel).
 
- startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
- 
Performs any initialization of the subclass of FileBasedReader that involves IO
 operations.
 
- startsWith(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
 
- 
 
- state(StreamObserver<BeamFnApi.StateResponse>) - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
 
-  
 
- state - Variable in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- State - Interface in org.apache.beam.sdk.state
 
- 
 
- StateBinder - Interface in org.apache.beam.sdk.state
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- StateContext<W extends BoundedWindow> - Interface in org.apache.beam.sdk.state
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- StateContexts - Class in org.apache.beam.sdk.state
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- StateContexts() - Constructor for class org.apache.beam.sdk.state.StateContexts
 
-  
 
- StateDelegator - Interface in org.apache.beam.runners.fnexecution.state
 
- 
The 
StateDelegator is able to delegate 
BeamFnApi.StateRequests to a set of registered
 handlers.
 
 
- StateDelegator.Registration - Interface in org.apache.beam.runners.fnexecution.state
 
- 
Allows callers to deregister from receiving further state requests.
 
- stateInternals() - Method in class org.apache.beam.runners.gearpump.translators.utils.NoOpStepContext
 
-  
 
- StateRequestHandler - Interface in org.apache.beam.runners.fnexecution.state
 
- 
Handler for StateRequests.
 
- StateRequestHandlers - Class in org.apache.beam.runners.fnexecution.state
 
- 
 
- StateRequestHandlers() - Constructor for class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
 
-  
 
- StateRequestHandlers.BagUserStateHandler<K,V,W extends BoundedWindow> - Interface in org.apache.beam.runners.fnexecution.state
 
- 
A handler for bag user state.
 
- StateRequestHandlers.BagUserStateHandlerFactory - Interface in org.apache.beam.runners.fnexecution.state
 
- 
 
- StateRequestHandlers.SideInputHandler<V,W extends BoundedWindow> - Interface in org.apache.beam.runners.fnexecution.state
 
- 
A handler for side inputs.
 
- StateRequestHandlers.SideInputHandlerFactory - Interface in org.apache.beam.runners.fnexecution.state
 
- 
 
- StateSpec<StateT extends State> - Interface in org.apache.beam.sdk.state
 
- 
A specification of a persistent state cell.
 
- StateSpec.Cases<ResultT> - Interface in org.apache.beam.sdk.state
 
- 
Cases for doing a "switch" on the type of 
StateSpec.
 
 
- StateSpec.Cases.WithDefault<ResultT> - Class in org.apache.beam.sdk.state
 
- 
A base class for a visitor with a default method for cases it is not interested in.
 
- StateSpecFunctions - Class in org.apache.beam.runners.spark.stateful
 
- 
A class containing StateSpec mappingFunctions.
 
- StateSpecFunctions() - Constructor for class org.apache.beam.runners.spark.stateful.StateSpecFunctions
 
-  
 
- StateSpecs - Class in org.apache.beam.sdk.state
 
- 
 
- StaticGrpcProvisionService - Class in org.apache.beam.runners.fnexecution.provisioning
 
- 
A provision service that returns a static response to all calls.
 
- StaticSchemaInference - Class in org.apache.beam.sdk.schemas.utils
 
- 
A set of utilities for inferring a Beam 
Schema from static Java types.
 
 
- StaticSchemaInference() - Constructor for class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
 
-  
 
- StaticSchemaInference.TypeInformation - Class in org.apache.beam.sdk.schemas.utils
 
- 
Relevant information about a Java type.
 
- status() - Method in class org.apache.beam.sdk.io.fs.MatchResult
 
- 
 
- STATUS_BACKOFF_FACTORY - Static variable in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
-  
 
- steps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
-  
 
- stop() - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobServer
 
-  
 
- stop() - Method in class org.apache.beam.runners.flink.FlinkJobServerDriver
 
-  
 
- stop() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- stop() - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
 
-  
 
- stop() - Static method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
 
- 
Indicates that there is no more work to be done for the current element.
 
- stopProcess(String) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
 
- 
Stops a previously started process identified by its unique id.
 
- StreamingInserts<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
PTransform that performs streaming BigQuery write.
 
- StreamingInserts(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
 
- 
Constructor.
 
- StreamingIT - Interface in org.apache.beam.sdk.testing
 
- 
 
- StreamingOptions - Interface in org.apache.beam.sdk.options
 
- 
Options used to configure streaming.
 
- StreamingWriteTables - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- StreamingWriteTables() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
-  
 
- STRING - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
The type of string fields.
 
- STRING_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
-  
 
- StringDelegateCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
A 
Coder that wraps a 
Coder<String> and encodes/decodes values via string
 representations.
 
 
- StringDelegateCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- StringOperators - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
String operator implementations.
 
- StringOperators() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
-  
 
- StringOperators.StringOperator - Interface in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
 
- 
 
- strings() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
- 
 
- StringUtf8Coder - Class in org.apache.beam.sdk.coders
 
- 
A 
Coder that encodes 
Strings in UTF-8 encoding.
 
 
- StripIdsDoFn() - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
 
-  
 
- stripPartitionDecorator(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
- 
Strip off any partition decorator information from a tablespec.
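A sketch of this stripping, assuming the decorator is the "$partition" suffix of a BigQuery tablespec (e.g. "project:dataset.table$20180101"); the helper name is hypothetical:

```java
public class PartitionDecoratorDemo {
    // Sketch: drop everything from the '$' decorator onward, if present.
    static String strip(String tableSpec) {
        int idx = tableSpec.indexOf('$');
        return idx < 0 ? tableSpec : tableSpec.substring(0, idx);
    }

    public static void main(String[] args) {
        System.out.println(strip("project:dataset.table$20180101")); // project:dataset.table
    }
}
```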
 
- Structs - Class in org.apache.beam.runners.dataflow.util
 
- 
A collection of static methods for manipulating data structure representations transferred via the
 Dataflow API.
 
- StructuralByteArray - Class in org.apache.beam.sdk.coders
 
- 
A wrapper around a byte[] that uses structural, value-based equality rather than byte[]'s normal
 object identity.
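The contrast between structural and identity equality on byte[] can be shown with a minimal stdlib analogue (this is an illustrative class, not the Beam one):

```java
import java.util.Arrays;

public class StructuralBytesDemo {
    // Minimal analogue of StructuralByteArray: value-based equality over
    // byte[], in contrast to byte[]'s reference equality.
    final byte[] value;

    StructuralBytesDemo(byte[] value) { this.value = value; }

    @Override public boolean equals(Object o) {
        return o instanceof StructuralBytesDemo
                && Arrays.equals(value, ((StructuralBytesDemo) o).value);
    }

    @Override public int hashCode() { return Arrays.hashCode(value); }

    public static void main(String[] args) {
        byte[] a = {1, 2}, b = {1, 2};
        System.out.println(a.equals(b));                  // false: identity
        System.out.println(new StructuralBytesDemo(a)
                .equals(new StructuralBytesDemo(b)));     // true: structural
    }
}
```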
 
- StructuralByteArray(byte[]) - Constructor for class org.apache.beam.sdk.coders.StructuralByteArray
 
-  
 
- StructuralKey<K> - Class in org.apache.beam.runners.local
 
- 
 
- structuralValue(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
- 
Returns an object with an Object.equals() method that represents structural equality on
 the argument.
 
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.Coder
 
- 
Returns an object with an Object.equals() method that represents structural equality on
 the argument.
 
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
- 
Returns an object with an Object.equals() method that represents structural equality on
 the argument.
 
- structuralValue(Iterable<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
 
-  
 
- structuralValue(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
-  
 
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
- 
The structural value of the object is the object itself.
 
- structuralValue(T) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- structuralValue(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
-  
 
- structuralValue(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
-  
 
- structuralValue(TimestampedValue<T>) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- structuralValueConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T> and values of type T, the structural
 values are equal if and only if the encoded bytes are equal.
 
- structuralValueConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the structural values are equal if and only if the encoded bytes are equal, in any Coder.Context.
 
- structuralValueDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T> and value of type T, the structural value
 is equal to the structural value yielded by encoding and decoding the original value.
 
- structuralValueDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
Verifies that for the given Coder<T>, Coder.Context, and value of type T, the structural value is equal to the structural value yielded by encoding and decoding the
 original value, in any Coder.Context.
 
- StructuredCoder<T> - Class in org.apache.beam.sdk.coders
 
- 
 
- StructuredCoder() - Constructor for class org.apache.beam.sdk.coders.StructuredCoder
 
-  
 
- subscriptionPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- subscriptionPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- SUBSTRING - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
-  
 
- subTriggers - Variable in class org.apache.beam.sdk.transforms.windowing.Trigger
 
-  
 
- subTriggers() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
-  
 
- success(String, String, Metadata) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
 
-  
 
- success(String, String) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
 
-  
 
- success() - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
-  
 
- SUCCESS_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
 
-  
 
- SuccessOrFailure - Class in org.apache.beam.sdk.testing
 
- 
 
- Sum - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for computing the sum of the elements in a PCollection, or the sum of
 the values associated with each key in a PCollection of KVs.
 
- SynchronizedStreamObserver<V> - Class in org.apache.beam.sdk.fn.stream
 
- 
A StreamObserver which provides synchronous access to an underlying StreamObserver.
 
- Table - Class in org.apache.beam.sdk.extensions.sql.meta
 
- 
Represents the metadata of a BeamSqlTable.
 
- Table() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- Table.Builder - Class in org.apache.beam.sdk.extensions.sql.meta
 
- 
 
- TABLE_ROW_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
-  
 
- TableDestination - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Encapsulates a BigQuery table destination.
 
- TableDestination(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- TableDestination(TableReference, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- TableDestination(TableReference, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- TableDestination(String, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- TableDestination(TableReference, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- TableDestination(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- TableDestinationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- TableDestinationCoderV2 - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- TableDestinationCoderV2() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
-  
 
- TableProvider - Interface in org.apache.beam.sdk.extensions.sql.meta.provider
 
- 
A TableProvider handles the metadata CRUD for a specific kind of table.
 
- TableRowJsonCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
A 
Coder that encodes BigQuery 
TableRow objects in their native JSON format.
 
 
- tableRows(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- tables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- tableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
-  
 
- TaggedKeyedPCollection(TupleTag<V>, PCollection<KV<K, V>>) - Constructor for class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
 
-  
 
- TaggedPValue - Class in org.apache.beam.sdk.values
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- TaggedPValue() - Constructor for class org.apache.beam.sdk.values.TaggedPValue
 
-  
 
- take(String, Duration) - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool.Source
 
- 
 
- takeOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- takeOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- takeOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
- 
 
- TDigestQuantiles - Class in org.apache.beam.sdk.extensions.sketching
 
- 
PTransforms for getting information about quantiles in a stream.
 
- TDigestQuantiles() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
 
-  
 
- TDigestQuantiles.GlobalDigest - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- TDigestQuantiles.PerKeyDigest<K> - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- TDigestQuantiles.TDigestQuantilesFn - Class in org.apache.beam.sdk.extensions.sketching
 
- 
 
- teardown() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
-  
 
- teardown() - Method in class org.apache.beam.runners.gearpump.translators.functions.DoFnFunction
 
-  
 
- tearDown() - Method in interface org.apache.beam.sdk.extensions.sql.BeamSqlSeekableTable
 
- 
Cleans up resources of the instance.
 
- tempDirectory - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
Directory for temporary output files.
 
- TestApexRunner - Class in org.apache.beam.runners.apex
 
- 
 
- TestBigQuery - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Test rule which creates a new table with the specified schema and a randomized name, and exposes a few
 APIs to work with it.
 
- TestBigQuery.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Interface to implement a polling assertion.
 
- TestBigQuery.RowsAssertion - Interface in org.apache.beam.sdk.io.gcp.bigquery
 
- 
Interface for creating a polling eventual assertion.
 
- TestBigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- TestBoundedTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
 
- 
Mocked table for bounded data sources.
 
- TestBoundedTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
 
-  
 
- testByteCount(Coder<T>, Coder.Context, T[]) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
- 
A utility method that passes the given (unencoded) elements through the coder's
 registerByteSizeObserver() and encode() methods, and confirms they are mutually consistent.
 
- testCombineFn(Combine.CombineFn<InputT, AccumT, OutputT>, List<InputT>, OutputT) - Static method in class org.apache.beam.sdk.testing.CombineFnTester
 
- 
Tests that the 
Combine.CombineFn, when applied to the provided input, produces the provided
 output.
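A minimal sketch of how testCombineFn might be invoked (assuming the Beam Java SDK is on the classpath; Sum.ofIntegers() is the SDK's stock integer-summing CombineFn):

```java
import java.util.Arrays;
import org.apache.beam.sdk.testing.CombineFnTester;
import org.apache.beam.sdk.transforms.Sum;

public class SumCombineFnCheck {
  public static void main(String[] args) {
    // Exercises the CombineFn over many shardings and orderings of the input,
    // failing if any of them does not produce the expected output (10).
    CombineFnTester.testCombineFn(Sum.ofIntegers(), Arrays.asList(1, 2, 3, 4), 10);
  }
}
```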
 
 
- testCombineFn(Combine.CombineFn<InputT, AccumT, OutputT>, List<InputT>, Matcher<? super OutputT>) - Static method in class org.apache.beam.sdk.testing.CombineFnTester
 
-  
 
- TestDataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow
 
- 
 
- TestDataflowRunner - Class in org.apache.beam.runners.dataflow
 
- 
 
- TestElementByteSizeObserver() - Constructor for class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
-  
 
- TestExecutors - Class in org.apache.beam.sdk.fn.test
 
- 
A 
TestRule that validates that all submitted tasks finished and were completed.
 
 
- TestExecutors() - Constructor for class org.apache.beam.sdk.fn.test.TestExecutors
 
-  
 
- TestExecutors.TestExecutorService - Interface in org.apache.beam.sdk.fn.test
 
- 
A union of the 
ExecutorService and 
TestRule interfaces.
 
 
- TestFlinkRunner - Class in org.apache.beam.runners.flink
 
- 
Test Flink runner.
 
- TestGearpumpRunner - Class in org.apache.beam.runners.gearpump
 
- 
 
- testingPipelineOptions() - Static method in class org.apache.beam.sdk.testing.TestPipeline
 
- 
 
- TestJobService - Class in org.apache.beam.runners.reference.testing
 
- 
A JobService for tests.
 
- TestJobService(Endpoints.ApiServiceDescriptor, String, String, JobApi.JobState.Enum) - Constructor for class org.apache.beam.runners.reference.testing.TestJobService
 
-  
 
- TestPipeline - Class in org.apache.beam.sdk.testing
 
- 
A creator of test pipelines, for use inside tests, that can be configured to run
 locally or against a remote pipeline runner.
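A typical JUnit usage sketch (assuming the Beam Java SDK and JUnit 4 on the classpath):

```java
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.junit.Rule;
import org.junit.Test;

public class WordCountTest {
  // As a TestRule, TestPipeline also verifies the test actually runs the pipeline.
  @Rule public final transient TestPipeline p = TestPipeline.create();

  @Test
  public void countsElements() {
    PCollection<Long> count =
        p.apply(Create.of("a", "b", "c")).apply(Count.globally());
    PAssert.thatSingleton(count).isEqualTo(3L);
    p.run().waitUntilFinish();
  }
}
```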
 
- TestPipeline.AbandonedNodeException - Exception in org.apache.beam.sdk.testing
 
- 
An exception thrown in case an abandoned 
PTransform is
 detected, that is, a 
PTransform that has not been run.
 
 
- TestPipeline.PipelineRunMissingException - Exception in org.apache.beam.sdk.testing
 
- 
An exception thrown in case a test finishes without invoking 
Pipeline.run().
 
 
- TestPipeline.TestValueProviderOptions - Interface in org.apache.beam.sdk.testing
 
- 
 
- TestPipelineOptions - Interface in org.apache.beam.sdk.testing
 
- 
 
- TestPipelineOptions.AlwaysPassMatcher - Class in org.apache.beam.sdk.testing
 
- 
Matcher which will always pass.
 
- TestPipelineOptions.AlwaysPassMatcherFactory - Class in org.apache.beam.sdk.testing
 
- 
 
- TestPortablePipelineOptions - Interface in org.apache.beam.runners.reference.testing
 
- 
 
- TestPortablePipelineOptions.DefaultJobServerConfigFactory - Class in org.apache.beam.runners.reference.testing
 
- 
Factory for default config.
 
- TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar - Class in org.apache.beam.runners.reference.testing
 
- 
 
- TestPortablePipelineOptionsRegistrar() - Constructor for class org.apache.beam.runners.reference.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
 
-  
 
- TestPortableRunner - Class in org.apache.beam.runners.reference.testing
 
- 
 
- TestPubsub - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Test rule which creates a new topic with a randomized name and exposes the APIs to work with it.
 
- TestPubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
 
- 
 
- TestPubsubSignal - Class in org.apache.beam.sdk.io.gcp.pubsub
 
- 
Test rule which observes elements of the 
PCollection and checks whether they match the
 success criteria.
 
 
- TestSparkPipelineOptions - Interface in org.apache.beam.runners.spark
 
- 
 
- TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory - Class in org.apache.beam.runners.spark
 
- 
A factory to provide the default watermark to stop a pipeline that reads from an unbounded
 source.
 
- TestSparkRunner - Class in org.apache.beam.runners.spark
 
- 
The SparkRunner translates operations defined on a pipeline to a representation executable by
 Spark, and then submits the job to Spark for execution.
 
- TestStream<T> - Class in org.apache.beam.sdk.testing
 
- 
A testing input that generates an unbounded 
PCollection of elements, advancing the
 watermark and processing time as elements are emitted.
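A brief sketch of building a TestStream (assuming the Beam Java SDK on the classpath):

```java
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.testing.TestStream;
import org.joda.time.Instant;

public class TestStreamExample {
  static TestStream<String> events() {
    // Emits "a" and "b", advances the watermark, emits "c" (now late relative
    // to the watermark), then closes the stream at the end of time.
    return TestStream.create(StringUtf8Coder.of())
        .addElements("a", "b")
        .advanceWatermarkTo(new Instant(1000L))
        .addElements("c")
        .advanceWatermarkToInfinity();
  }
}
```

The resulting TestStream is applied to a pipeline like any other root transform.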
 
 
- TestStream.Builder<T> - Class in org.apache.beam.sdk.testing
 
- 
 
- TestStream.ElementEvent<T> - Class in org.apache.beam.sdk.testing
 
- 
 
- TestStream.Event<T> - Interface in org.apache.beam.sdk.testing
 
- 
 
- TestStream.EventType - Enum in org.apache.beam.sdk.testing
 
- 
 
- TestStream.ProcessingTimeEvent<T> - Class in org.apache.beam.sdk.testing
 
- 
 
- TestStream.WatermarkEvent<T> - Class in org.apache.beam.sdk.testing
 
- 
 
- TestStreams - Class in org.apache.beam.sdk.fn.test
 
- 
Utility methods which enable testing of StreamObservers.
 
- TestStreams() - Constructor for class org.apache.beam.sdk.fn.test.TestStreams
 
-  
 
- TestStreams.Builder<T> - Class in org.apache.beam.sdk.fn.test
 
- 
A builder for a test CallStreamObserver that performs various callbacks.
 
- TestTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
 
- 
Base class for mocked table.
 
- TestTable(Schema) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
 
-  
 
- TestTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
 
- 
In-memory table provider for use in tests.
 
- TestTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
 
-  
 
- TestTableUtils - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
 
- 
Utility functions for mock classes.
 
- TestTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
-  
 
- TestUnboundedTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.test
 
- 
A mocked unbounded table.
 
- TextIO - Class in org.apache.beam.sdk.io
 
- 
 
- TextIO.CompressionType - Enum in org.apache.beam.sdk.io
 
- 
 
- TextIO.Read - Class in org.apache.beam.sdk.io
 
- 
 
- TextIO.ReadAll - Class in org.apache.beam.sdk.io
 
- 
 
- TextIO.ReadFiles - Class in org.apache.beam.sdk.io
 
- 
 
- TextIO.Sink - Class in org.apache.beam.sdk.io
 
- 
 
- TextIO.TypedWrite<UserT,DestinationT> - Class in org.apache.beam.sdk.io
 
- 
 
- TextIO.Write - Class in org.apache.beam.sdk.io
 
- 
 
- TextTable - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
 
- 
 
- TextTable(Schema, String, PTransform<PCollection<String>, PCollection<Row>>, PTransform<PCollection<Row>, PCollection<String>>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
 
- 
Text table with the specified read and write transforms.
 
- TextTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
 
- 
Text table provider.
 
- TextTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
 
-  
 
- TextTableProvider.CsvToRow - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
 
- 
Read-side converter for 
TextTable with format 
'csv'.
 
 
- TextTableProvider.LinesReadConverter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
 
- 
Read-side converter for 
TextTable with format 
'lines'.
 
 
- TextTableProvider.LinesWriteConverter - Class in org.apache.beam.sdk.extensions.sql.meta.provider.text
 
- 
Write-side converter for 
TextTable with format 
'lines'.
 
 
- TextualIntegerCoder - Class in org.apache.beam.sdk.coders
 
- 
A 
Coder that encodes 
Integers as the ASCII bytes of their textual,
 decimal, representation.
 
 
- TextualIntegerCoder() - Constructor for class org.apache.beam.sdk.coders.TextualIntegerCoder
 
-  
 
- TFRecordIO - Class in org.apache.beam.sdk.io
 
- 
PTransforms for reading and writing TensorFlow TFRecord files.
 
 
- TFRecordIO.CompressionType - Enum in org.apache.beam.sdk.io
 
- 
 
- TFRecordIO.Read - Class in org.apache.beam.sdk.io
 
- 
 
- TFRecordIO.Sink - Class in org.apache.beam.sdk.io
 
- 
 
- TFRecordIO.Write - Class in org.apache.beam.sdk.io
 
- 
 
- that(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- that(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- thatMap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- thatMap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- thatMultimap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- thatMultimap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- thatSingleton(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
Constructs a 
PAssert.SingletonAssert for the value of the provided 
PCollection<T>, which must be a singleton.
 
 
- thatSingleton(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
Constructs a 
PAssert.SingletonAssert for the value of the provided 
PCollection<T> with the specified reason.
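A short sketch of a singleton assertion (assuming the Beam Java SDK on the classpath; the input PCollection is supplied by the caller):

```java
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.values.PCollection;

public class SingletonAssertExample {
  static void assertThreeWords(PCollection<String> words) {
    // Count.globally() emits exactly one element, so SingletonAssert applies.
    PAssert.thatSingleton("global count", words.apply(Count.globally()))
        .isEqualTo(3L);
  }
}
```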
 
 
- thatSingletonIterable(PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- thatSingletonIterable(String, PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
- 
 
- ThrowingBiConsumer<T1,T2> - Interface in org.apache.beam.sdk.fn.function
 
- 
A BiConsumer which can throw Exceptions.
 
- ThrowingBiFunction<T1,T2,T3> - Interface in org.apache.beam.sdk.fn.function
 
- 
A BiFunction which can throw Exceptions.
 
- ThrowingConsumer<T> - Interface in org.apache.beam.sdk.fn.function
 
- 
A Consumer which can throw Exceptions.
 
- ThrowingFunction<T1,T2> - Interface in org.apache.beam.sdk.fn.function
 
- 
A Function which can throw Exceptions.
 
- ThrowingRunnable - Interface in org.apache.beam.sdk.fn.function
 
- 
A Runnable which can throw Exceptions.
 
- throwNullCredentialException() - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
-  
 
- TikaIO - Class in org.apache.beam.sdk.io.tika
 
- 
Transforms for parsing arbitrary files using 
Apache Tika.
 
 
- TikaIO() - Constructor for class org.apache.beam.sdk.io.tika.TikaIO
 
-  
 
- TikaIO.Parse - Class in org.apache.beam.sdk.io.tika
 
- 
 
- TikaIO.ParseFiles - Class in org.apache.beam.sdk.io.tika
 
- 
 
- TIME - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- TIME_TO_BIGINT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.DatetimeReinterpretConversions
 
-  
 
- TimeDomain - Enum in org.apache.beam.sdk.state
 
- 
TimeDomain specifies whether an operation is based on timestamps of elements or current
 "real-world" time as reported while processing.
 
 
- timeDomain() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
 
- 
Returns the time domain of the current timer.
 
- Timer - Interface in org.apache.beam.sdk.state
 
- 
A timer for a specified time domain that can be set to register the desire for further processing
 at a particular time in its specified time domain.
 
- timer(TimeDomain) - Static method in class org.apache.beam.sdk.state.TimerSpecs
 
-  
 
- timerId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
-  
 
- timerInternals() - Method in class org.apache.beam.runners.gearpump.translators.utils.NoOpStepContext
 
-  
 
- Timers - Interface in org.apache.beam.sdk.state
 
- 
Interface for interacting with time.
 
- TimerSpec() - Constructor for class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
-  
 
- TimerSpec - Interface in org.apache.beam.sdk.state
 
- 
A specification for a 
Timer.
 
 
- TimerSpecs - Class in org.apache.beam.sdk.state
 
- 
 
- TimerSpecs() - Constructor for class org.apache.beam.sdk.state.TimerSpecs
 
-  
 
- TIMESTAMP - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
 
- 
Returns the timestamp of the current timer.
 
- timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
 
- 
Returns the timestamp of the input element.
 
- timestamp() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
 
- 
Returns the timestamp of the current element.
 
- TIMESTAMP_MAX_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
 
- 
The maximum value for any Beam timestamp.
 
- TIMESTAMP_MIN_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
 
- 
The minimum value for any Beam timestamp.
 
- timestampColumnIndex(int) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
 
-  
 
- TimestampCombiner - Enum in org.apache.beam.sdk.transforms.windowing
 
- 
Policies for combining timestamps that occur within a window.
 
- TimeStampComparator() - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
 
-  
 
- timestamped(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.transforms.Create
 
- 
 
- timestamped(TimestampedValue<T>, TimestampedValue<T>...) - Static method in class org.apache.beam.sdk.transforms.Create
 
- 
 
- timestamped(Iterable<T>, Iterable<Long>) - Static method in class org.apache.beam.sdk.transforms.Create
 
- 
Returns a new root transform that produces a 
PCollection containing the specified
 elements with the specified timestamps.
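A minimal sketch of creating timestamped elements (assuming the Beam Java SDK on the classpath; the element values and instants are illustrative):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TimestampedValue;
import org.joda.time.Instant;

public class TimestampedCreateExample {
  static PCollection<String> events(Pipeline pipeline) {
    // Each element carries an explicit event-time timestamp.
    return pipeline.apply(
        Create.timestamped(
            TimestampedValue.of("open", new Instant(0L)),
            TimestampedValue.of("close", new Instant(60_000L))));
  }
}
```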
 
 
- TimestampedValue<V> - Class in org.apache.beam.sdk.values
 
- 
An immutable pair of a value and a timestamp.
 
- TimestampedValue(V, Instant) - Constructor for class org.apache.beam.sdk.values.TimestampedValue
 
-  
 
- TimestampedValue.TimestampedValueCoder<T> - Class in org.apache.beam.sdk.values
 
- 
 
- timestampMsSinceEpoch - Variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
- 
Timestamp for element (ms since epoch).
 
- TimestampPolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
 
- 
A timestamp policy to assign event time for messages in a Kafka partition and a watermark for it.
 
- TimestampPolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy
 
-  
 
- TimestampPolicy.PartitionContext - Class in org.apache.beam.sdk.io.kafka
 
- 
The context contains state maintained in the reader for the partition.
 
- TimestampPolicyFactory<KeyT,ValueT> - Interface in org.apache.beam.sdk.io.kafka
 
- 
An extendable factory to create a 
TimestampPolicy for each partition at runtime by
 the KafkaIO reader.
 
 
- TimestampPolicyFactory.LogAppendTimePolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
 
- 
Assigns Kafka's log append time (server side ingestion time) to each record.
 
- TimestampPolicyFactory.ProcessingTimePolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
 
- 
A simple policy that uses current time for event time and watermark.
 
- TimestampPolicyFactory.TimestampFnPolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
 
- 
Internal policy to support the deprecated withTimestampFn API.
 
- timestamps() - Static method in class org.apache.beam.sdk.transforms.Reify
 
- 
 
- timestampsInValue() - Static method in class org.apache.beam.sdk.transforms.Reify
 
- 
Create a 
PTransform that will output all input 
KVs with the timestamp inside
 the value.
 
 
- TimestampTransform - Class in org.apache.beam.sdk.transforms.windowing
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- TimestampTransform.AlignTo - Class in org.apache.beam.sdk.transforms.windowing
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- TimestampTransform.Delay - Class in org.apache.beam.sdk.transforms.windowing
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- timeUnitInternalMultiplier(SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.TimeUnitUtils
 
-  
 
- TimeUnitUtils - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
 
- 
Utils to convert between Calcite's TimeUnit and SQL intervals.
 
- TimeUnitUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.TimeUnitUtils
 
-  
 
- TimeUtil - Class in org.apache.beam.runners.dataflow.util
 
- 
A helper class for converting between Dataflow API and SDK time representations.
 
- TINY_INT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- TmpCheckpointDirFactory() - Constructor for class org.apache.beam.runners.spark.SparkPipelineOptions.TmpCheckpointDirFactory
 
-  
 
- to(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion.Builder
 
-  
 
- to() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion
 
-  
 
- to(String) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Writes to file(s) with the given output prefix.
 
- to(ResourceId) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Writes to file(s) with the given output prefix.
 
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
 
- to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
 
- to(DynamicAvroDestinations<UserT, NewDestinationT, OutputT>) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
 
- to(String) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- to(ResourceId) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
See TypedWrite#to(FilenamePolicy).
 
- to(DynamicAvroDestinations<T, ?, T>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- to(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies a common directory for all generated files.
 
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- to(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- to(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- to(SerializableFunction<ValueInSingleWindow<T>, TableDestination>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Writes to table specified by the specified table function.
 
- to(DynamicDestinations<T, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- to(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
- 
Publishes to the specified topic.
 
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
- 
 
- to(long) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
- 
Specifies the maximum number to generate (exclusive).
 
- to(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
 
- 
Provides the name of the Solr collection to write to.
 
- to(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Writes to text files with the given prefix.
 
- to(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- to(FileBasedSink.DynamicDestinations<UserT, NewDestinationT, String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- to(SerializableFunction<UserT, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- to(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- to(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- to(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
See TypedWrite#to(FilenamePolicy).
 
- to(FileBasedSink.DynamicDestinations<String, ?, String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- to(SerializableFunction<String, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- to(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
Writes TFRecord file(s) with the given output prefix.
 
- to(ResourceId) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
Writes TFRecord file(s) with a prefix given by the specified resource.
 
- to(FileBasedSink<UserT, DestinationT, OutputT>) - Static method in class org.apache.beam.sdk.io.WriteFiles
 
- 
Creates a 
WriteFiles transform that writes to the given 
FileBasedSink, letting
 the runner control how many different shards are produced.
 
 
- to(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
- 
Writes to files with the given path prefix.
 
- to(Class<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
 
- 
 
- to(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
 
- 
 
- toAdditionalInputs(Iterable<PCollectionView<?>>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
 
- 
 
- toBeamRow(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
- 
 
- toBeamRow(GenericRecord, Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
-  
 
- toBeamRow(Schema, TableSchema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
- 
Tries to parse the JSON 
TableRow from BigQuery.
 
 
- toBuilder() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
 
-  
 
- toBuilder() - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
-  
 
- toByteArray(T, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
Utility method for serializing an object using the specified coder.
 
- toByteArrays(Iterable<T>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
Utility method for serializing an Iterable of values using the specified coder.
 
- toByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
A function wrapper for converting an object to a bytearray.
 
- toByteFunction(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
 
- 
A function wrapper for converting a key-value pair to a byte array pair.
 
- toCalciteRowType(Schema, RelDataTypeFactory) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
- 
Create an instance of RelDataType so it can be used to create a table.
 
- toCloudDuration(ReadableDuration) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
 
- 
 
- toCloudObject(T) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
 
- 
Converts the provided object into an equivalent 
CloudObject.
 
 
- toCloudTime(ReadableInstant) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
 
- 
 
- toDefaultPolicies(SerializableFunction<UserT, DefaultFilenamePolicy.Params>, DefaultFilenamePolicy.Params, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
 
- 
 
- toEnumerable(BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
 
-  
 
- toField(RelDataTypeField) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- toField(String, RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- toFieldType(SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- toFieldType(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- toList(JavaStream<TranslatorUtils.RawUnionValue>) - Static method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils
 
-  
 
- Top - Class in org.apache.beam.sdk.transforms
 
- 
PTransforms for finding the largest (or smallest) set of elements in a PCollection, or the largest (or smallest) set of values associated with each key in a PCollection of KVs.
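Two illustrative sketches of the Top transforms (assuming the Beam Java SDK on the classpath; input PCollections are supplied by the caller):

```java
import java.util.List;
import org.apache.beam.sdk.transforms.Top;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class TopExample {
  // Largest 3 elements of the input, emitted as a single List<Integer>.
  static PCollection<List<Integer>> top3(PCollection<Integer> input) {
    return input.apply(Top.largest(3));
  }

  // Per-key variant: the largest 2 values associated with each key.
  static PCollection<KV<String, List<Integer>>> top2PerKey(
      PCollection<KV<String, Integer>> keyed) {
    return keyed.apply(Top.largestPerKey(2));
  }
}
```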
 
- Top.Largest<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Top.Natural<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
 
- 
A Serializable Comparator that uses the compared elements' natural
 ordering.
 
- Top.Reversed<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
 
- 
A Serializable Comparator that uses the reverse of the compared elements'
 natural ordering.
 
- Top.Smallest<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Top.TopCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
 
- 
CombineFn for Top transforms that combines a bunch of Ts into a single
 count-long List<T>, using compareFn to choose the largest Ts.
 
- toPCollection(Pipeline, BeamRelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
 
-  
 
- TopCombineFn(int, ComparatorT) - Constructor for class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
-  
 
- topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
 
- 
Topic path where events will be published to.
 
- topicPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- topicPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
-  
 
- toProvisionInfo() - Method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
 
-  
 
- toPTransform() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
 
-  
 
- toPTransform() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
-  
 
- toRecordField(Object[], int) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableUtils
 
-  
 
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
 
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
 
- toRow(Schema) - Static method in class org.apache.beam.sdk.values.Row
 
- 
 
- toRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.DefaultSchema.DefaultSchemaProvider
 
- 
Given a type, return a function that converts that type to a 
Row object. If no schema
 exists, returns null.
 
 
- toRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
 
-  
 
- toRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
 
- 
Given a type, return a function that converts that type to a 
Row object. If no schema
 exists, returns null.
 
 
- toRows() - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
 
- 
 
- toSchema(RelDataType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
- 
Generate 
Schema from 
RelDataType, which is used to create a table.
 
 
- toSchema() - Static method in class org.apache.beam.sdk.schemas.Schema
 
- 
 
- toSqlTypeName(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- toState(String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
-  
 
- toString() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
 
-  
 
- toString() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
-  
 
- toString() - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
 
-  
 
- toString() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
-  
 
- toString() - Method in class org.apache.beam.runners.direct.WatermarkManager.FiredTimers
 
-  
 
- toString() - Method in class org.apache.beam.runners.direct.WatermarkManager.TransformWatermarks
 
-  
 
- toString() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
-  
 
- toString() - Method in class org.apache.beam.runners.flink.FlinkRunner
 
-  
 
- toString() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
-  
 
- toString() - Method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils.RawUnionValue
 
-  
 
- toString() - Method in class org.apache.beam.runners.reference.PortableRunner
 
-  
 
- toString() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
 
-  
 
- toString() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
-  
 
- toString() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
-  
 
- toString() - Method in class org.apache.beam.sdk.coders.Coder.Context
 
- 
Deprecated.
  
- toString() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
-  
 
- toString() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- toString() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
-  
 
- toString() - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
-  
 
- toString() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
-  
 
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
 
-  
 
- toString() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.ReinterpretConversion
 
-  
 
- toString() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
-  
 
- toString() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
 
-  
 
- toString() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
 
- 
Returns the string representation of this 
ResourceId.
 
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.hbase.HBaseQuery
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
-  
 
- toString() - Method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.range.OffsetRange
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- toString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
 
-  
 
- toString() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
-  
 
- toString() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
-  
 
- toString() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
-  
 
- toString() - Method in class org.apache.beam.sdk.Pipeline
 
-  
 
- toString() - Method in class org.apache.beam.sdk.schemas.Schema
 
-  
 
- toString() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
-  
 
- toString() - Method in class org.apache.beam.sdk.testing.TestPipeline
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.Contextful
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.PTransform
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
-  
 
- ToString - Class in org.apache.beam.sdk.transforms
 
- 
 
- toString(StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
 
- 
Creates a human-readable representation of the given state of this condition.
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
-  
 
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
 
- 
Deprecated.
  
- toString() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.KV
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.PValueBase
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.Row
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.ShardedKey
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.TupleTag
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.TupleTagList
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.TypeParameter
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
-  
 
- toString() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- toTableRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
- 
 
- toTableRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
-  
 
- toTableSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
- 
 
- toTableSchema(PCollection<Row>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
- 
 
- toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
- 
 
- toUnsplittableSource(BoundedSource<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
 
- 
Returns an equivalent unsplittable BoundedSource<T>.
 
- Transaction - Class in org.apache.beam.sdk.io.gcp.spanner
 
- 
A transaction object.
 
- Transaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
-  
 
- transactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
-  
 
- transfer() - Method in class org.apache.beam.runners.reference.CloseableResource
 
- 
 
- TransformExecutor - Interface in org.apache.beam.runners.direct.portable
 
- 
A Runnable that will execute a PTransform on some bundle of input.
 
- TransformExecutor - Interface in org.apache.beam.runners.direct
 
- 
A Runnable that will execute a PTransform on some bundle of input.
 
- transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
-  
 
- transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
-  
 
- transformId() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
 
-  
 
- transformStepNames - Variable in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
-  
 
- TransformTranslator<TransformT extends PTransform> - Interface in org.apache.beam.runners.dataflow
 
- 
 
- TransformTranslator<T extends PTransform> - Interface in org.apache.beam.runners.gearpump.translators
 
- 
 
- TransformTranslator.StepTranslationContext - Interface in org.apache.beam.runners.dataflow
 
- 
 
- TransformTranslator.TranslationContext - Interface in org.apache.beam.runners.dataflow
 
- 
The interface provided to registered callbacks for interacting with the 
DataflowRunner,
 including reading and writing the values of 
PCollections and side inputs.
 
 
- translate(Pipeline, ApexPipelineOptions) - Static method in class org.apache.beam.runners.apex.TestApexRunner
 
-  
 
- translate(Pipeline, DataflowRunner, List<DataflowPackage>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
 
- 
Translates a 
Pipeline into a 
JobSpecification.
 
 
- translate(AppliedPTransform<?, ?, PrimitiveParDoSingleFactory.ParDoSingle<?, ?>>, SdkComponents) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
 
-  
 
- translate(TransformT, TransformTranslator.TranslationContext) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator
 
-  
 
- translate(FlinkBatchPortablePipelineTranslator.BatchTranslationContext, RunnerApi.Pipeline) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
 
-  
 
- translate(T, RunnerApi.Pipeline) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
 
- 
Translates the given pipeline.
 
- translate(FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext, RunnerApi.Pipeline) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
 
-  
 
- translate(CreateStreamingGearpumpView.CreateGearpumpPCollectionView<ElemT, ViewT>, TranslationContext) - Method in class org.apache.beam.runners.gearpump.translators.CreateGearpumpPCollectionViewTranslator
 
-  
 
- translate(Flatten.PCollections<T>, TranslationContext) - Method in class org.apache.beam.runners.gearpump.translators.FlattenPCollectionsTranslator
 
-  
 
- translate(Pipeline) - Method in class org.apache.beam.runners.gearpump.translators.GearpumpPipelineTranslator
 
-  
 
- translate(GroupByKey<K, V>, TranslationContext) - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator
 
-  
 
- translate(ParDo.MultiOutput<InputT, OutputT>, TranslationContext) - Method in class org.apache.beam.runners.gearpump.translators.ParDoMultiOutputTranslator
 
-  
 
- translate(Read.Bounded<T>, TranslationContext) - Method in class org.apache.beam.runners.gearpump.translators.ReadBoundedTranslator
 
-  
 
- translate(Read.Unbounded<T>, TranslationContext) - Method in class org.apache.beam.runners.gearpump.translators.ReadUnboundedTranslator
 
-  
 
- translate(T, TranslationContext) - Method in interface org.apache.beam.runners.gearpump.translators.TransformTranslator
 
-  
 
- translate(Window.Assign<T>, TranslationContext) - Method in class org.apache.beam.runners.gearpump.translators.WindowAssignTranslator
 
-  
 
- translate(TransformHierarchy.Node, TransformT, Class<TransformT>) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
- 
Determine if this Node belongs to a Bounded branch of the pipeline, or Unbounded, and
 translate with the proper translator.
 
- translateOnly - Variable in class org.apache.beam.runners.apex.ApexRunner
 
-  
 
- TranslationContext - Class in org.apache.beam.runners.gearpump.translators
 
- 
 
- TranslationContext(JavaStreamApp, GearpumpPipelineOptions) - Constructor for class org.apache.beam.runners.gearpump.translators.TranslationContext
 
-  
 
- translator - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
-  
 
- TranslatorUtils - Class in org.apache.beam.runners.gearpump.translators.utils
 
- 
Utility methods for translators.
 
- TranslatorUtils() - Constructor for class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils
 
-  
 
- TranslatorUtils.FromRawUnionValue<OutputT> - Class in org.apache.beam.runners.gearpump.translators.utils
 
- 
Converts RawUnionValue to WindowedValue.
 
- TranslatorUtils.RawUnionValue - Class in org.apache.beam.runners.gearpump.translators.utils
 
- 
This is copied from org.apache.beam.sdk.transforms.join.RawUnionValue.
 
- traverseTopologically(Pipeline.PipelineVisitor) - Method in class org.apache.beam.sdk.Pipeline
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- Trigger - Class in org.apache.beam.sdk.transforms.windowing
 
- 
Triggers control when the elements for a specific key and window are output.
 
- Trigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
 
-  
 
- Trigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
 
-  
 
- Trigger.OnceTrigger - Class in org.apache.beam.sdk.transforms.windowing
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- triggering(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Sets a non-default trigger for this Window PTransform.
 
- TRIM - Static variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.StringOperators
 
-  
 
- trivial() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
 
- 
Creates an 
OutboundObserverFactory that simply delegates to the base factory, with no
 flow control or synchronization.
 
 
- tryClaim(PositionT) - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
- 
Attempts to claim the block of work in the current restriction identified by the given
 position.
 
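The tryClaim contract described above can be illustrated with a simplified, Beam-independent sketch (this is an assumption-laden illustration, not Beam's actual RestrictionTracker implementation): offsets must be claimed in increasing order, and a claim fails once the position falls outside the half-open range, signalling that processing should stop.

```java
// Simplified sketch of the tryClaim contract used by splittable DoFn
// restriction trackers. Not the Beam implementation: a hypothetical
// stand-in class to show the claim semantics only.
public class OffsetRangeSketch {
    private final long from; // inclusive start of the restriction
    private final long to;   // exclusive end of the restriction
    private long lastClaimed = Long.MIN_VALUE;

    public OffsetRangeSketch(long from, long to) {
        this.from = from;
        this.to = to;
    }

    // Attempts to claim the given offset; returns false once the offset
    // lies outside [from, to), meaning the caller must stop processing.
    public boolean tryClaim(long offset) {
        if (offset <= lastClaimed) {
            throw new IllegalArgumentException("Offsets must be claimed in increasing order");
        }
        if (offset < from || offset >= to) {
            return false;
        }
        lastClaimed = offset;
        return true;
    }

    public static void main(String[] args) {
        OffsetRangeSketch tracker = new OffsetRangeSketch(0, 3);
        System.out.println(tracker.tryClaim(0)); // true
        System.out.println(tracker.tryClaim(2)); // true
        System.out.println(tracker.tryClaim(3)); // false: past the end of the range
    }
}
```

A failed claim is how a splittable DoFn learns that its restriction has been split or exhausted; the real trackers additionally support checkpointing and splitting, which this sketch omits.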
- tryClaimImpl(ByteKey) - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
 
- 
Attempts to claim the given key.
 
- tryClaimImpl(Long) - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
- 
Attempts to claim the given offset.
 
- tryClaimImpl(PositionT) - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
 
- 
 
- tryReturnRecordAt(boolean, ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- tryReturnRecordAt(boolean, Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- tryReturnRecordAt(boolean, long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- tryReturnRecordAt(boolean, PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
 
- 
Atomically determines whether a record at the given position can be returned and updates
 internal state.
 
- trySplitAtPosition(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
-  
 
- trySplitAtPosition(Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- trySplitAtPosition(long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
-  
 
- trySplitAtPosition(PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
 
- 
 
- TUPLE_TAGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- TupleTag<V> - Class in org.apache.beam.sdk.values
 
- 
 
- TupleTag() - Constructor for class org.apache.beam.sdk.values.TupleTag
 
- 
Constructs a new TupleTag, with a fresh unique id.
 
- TupleTag(String) - Constructor for class org.apache.beam.sdk.values.TupleTag
 
- 
Constructs a new TupleTag with the given id.
 
- TupleTagList - Class in org.apache.beam.sdk.values
 
- 
 
- type - Variable in class org.apache.beam.runners.dataflow.util.OutputReference
 
-  
 
- type(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
 
-  
 
- type() - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
 
-  
 
- type() - Method in interface org.apache.beam.sdk.schemas.FieldValueSetter
 
- 
Returns the field type.
 
- TypeDescriptor<T> - Class in org.apache.beam.sdk.values
 
- 
A description of a Java type, including actual generic parameters where possible.
 
- TypeDescriptor() - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
 
- 
 
- TypeDescriptor(Object) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Creates a 
TypeDescriptor representing the type parameter 
T, which should
 resolve to a concrete type in the context of the class 
clazz.
 
 
- TypeDescriptor(Class<?>) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Creates a 
TypeDescriptor representing the type parameter 
T, which should
 resolve to a concrete type in the context of the class 
clazz.
 
 
- TypeDescriptors - Class in org.apache.beam.sdk.values
 
- 
A utility class for creating 
TypeDescriptor objects for different types, such as Java
 primitive types, containers and 
KVs of other 
TypeDescriptor objects, and
 extracting type variables of parameterized types (e.g.
 
 
- TypeDescriptors() - Constructor for class org.apache.beam.sdk.values.TypeDescriptors
 
-  
 
- TypeDescriptors.TypeVariableExtractor<InputT,OutputT> - Interface in org.apache.beam.sdk.values
 
- 
 
- TypedRead() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
-  
 
- TypedWrite() - Constructor for class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
-  
 
- TypedWrite() - Constructor for class org.apache.beam.sdk.io.TextIO.TypedWrite
 
-  
 
- TypeParameter<T> - Class in org.apache.beam.sdk.values
 
- 
 
- TypeParameter() - Constructor for class org.apache.beam.sdk.values.TypeParameter
 
-  
 
- v1() - Static method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreIO
 
- 
Returns a DatastoreV1 that provides an API for accessing Cloud Datastore through the v1 version of the Datastore client library.
 
 
- validate() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
-  
 
- validate() - Method in class org.apache.beam.sdk.io.AvroSource
 
-  
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
-  
 
- validate() - Method in class org.apache.beam.sdk.io.CompressedSource
 
- 
Validates that the delegate source is a valid source and that the channel factory is not null.
 
- validate() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
 
-  
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSink
 
-  
 
- validate() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
-  
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
-  
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
-  
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
-  
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
-  
 
- validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- validate() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
-  
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
-  
 
- validate(T) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- validate() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
-  
 
- validate() - Method in class org.apache.beam.sdk.io.Source
 
- 
Checks that this source is valid, before it can be used in a pipeline.
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- validate(Class<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.options.PipelineOptionsValidator
 
- 
Validates that the passed 
PipelineOptions conforms to all the validation criteria from
 the passed in interface.
 
 
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.PTransform
 
- 
Called before running the Pipeline to verify this transform is fully and correctly specified.
 
- validateCli(Class<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.options.PipelineOptionsValidator
 
- 
Validates that the passed 
PipelineOptions from command line interface (CLI) conforms to
 all the validation criteria from the passed in interface.
 
 
- validateGetOutputTimestamp(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Assigns the given timestamp to windows using the specified windowFn, and verifies that the result of windowFn.getOutputTime for later windows (as defined by maxTimestamp) won't prevent the watermark from passing the end of earlier windows.
 
 
- validateGetOutputTimestamps(WindowFn<T, W>, TimestampCombiner, List<List<Long>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Verifies that later-ending merged windows from any of the timestamps hold up output of
 earlier-ending windows, using the provided 
WindowFn and 
TimestampCombiner.
 
 
- validateGetOutputTimestampsWithValue(WindowFn<T, W>, TimestampCombiner, List<List<TimestampedValue<T>>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Verifies that later-ending merged windows from any of the timestampValues hold up output of
 earlier-ending windows, using the provided 
WindowFn and 
TimestampCombiner.
 
 
- validateGetOutputTimestampWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Assigns the given timestampedValue to windows using the specified windowFn, and verifies that the result of windowFn.getOutputTime for later windows (as defined by maxTimestamp) won't prevent the watermark from passing the end of earlier windows.
 
 
- validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
- 
Validates that the input GCS path is accessible and that the path is well formed.
 
- validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
-  
 
- validateInputFilePatternSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
 
- 
Validate that a file pattern is conforming.
 
- validateNonInterferingOutputTimes(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Assigns the given timestamp to windows using the specified windowFn, and
 verifies that the result of windowFn.getOutputTimestamp for each window is within the
 proper bound.
 
- validateNonInterferingOutputTimesWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
- 
Assigns the given timestampedValue to windows using the specified windowFn, and
 verifies that the result of windowFn.getOutputTimestamp for each window is within the
 proper bound.
 
- validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
- 
Validates that the output GCS path is accessible and that the path is well formed.
 
- validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
-  
 
- validateOutputFilePrefixSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
 
- 
Validate that an output file prefix is conforming.
 
- validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
-  
 
- validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
-  
 
- validateOutputResourceSupported(ResourceId) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
 
- 
Validates that an output path is conforming.
 
- ValidatesRunner - Interface in org.apache.beam.sdk.testing
 
- 
Category tag for tests which validate that a Beam runner is correctly implemented.
 
- Validation - Annotation Type in org.apache.beam.sdk.options
 
- 
 
- Validation.Required - Annotation Type in org.apache.beam.sdk.options
 
- 
This criterion specifies that the value must not be null.
 
- VALUE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- value() - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
Create a 
StateSpec for a single value of type 
T.
 
 
- value(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
 
- valueCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
-  
 
- ValueInSingleWindow<T> - Class in org.apache.beam.sdk.values
 
- 
An immutable tuple of value, timestamp, window, and pane.
 
- ValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.ValueInSingleWindow
 
-  
 
- ValueInSingleWindow.Coder<T> - Class in org.apache.beam.sdk.values
 
- 
 
- valueOf(String) - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
 
- 
Deprecated.
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.runners.local.ExecutionDriver.DriverState
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.annotations.Experimental.Kind
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
 
- 
Deprecated.
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.Compression
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
- 
Deprecated.
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileBasedSource.Mode
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.EmptyMatchTreatment
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.MatchResult.Status
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.redis.RedisIO.Write.Method
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.TextIO.CompressionType
 
- 
Deprecated.
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
 
- 
Deprecated.
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
 
- 
Deprecated.
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.PipelineResult.State
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.state.TimeDomain
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.testing.TestStream.EventType
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.display.DisplayData.Type
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
 
- 
Deprecated.
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.values.PCollection.IsBounded
 
- 
Returns the enum constant of this type with the specified name.
 
- valueOf(String) - Static method in enum org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
 
- 
Returns the enum constant of this type with the specified name.
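The many `valueOf(String)` and `values()` entries in this index all follow the standard Java enum contract. A minimal JDK-only sketch (the `State` enum here is a hypothetical stand-in, not a Beam class such as `PipelineResult.State`):

```java
// Hypothetical enum standing in for any of the Beam enums indexed here;
// the valueOf/values contract is identical for all of them.
enum State { RUNNING, DONE, FAILED, CANCELLED }

public class EnumLookupDemo {
    // valueOf(String) resolves a constant by its exact declared name and
    // throws IllegalArgumentException for any other string (it is case-sensitive).
    static State parse(String name) {
        return State.valueOf(name);
    }

    public static void main(String[] args) {
        System.out.println(parse("DONE"));          // DONE
        // values() returns the constants in declaration order.
        System.out.println(State.values().length);  // 4
    }
}
```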
 
- ValueProvider<T> - Interface in org.apache.beam.sdk.options
 
- 
A 
ValueProvider abstracts the notion of fetching a value that may or may not be currently
 available.
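The idea behind `ValueProvider` can be sketched without Beam at all. The classes below are illustrative analogues only (not the Beam types): a static provider whose value is known at construction, and a late-bound one that mirrors the runtime case.

```java
// Illustrative sketch of the ValueProvider idea: a handle to a value that
// may not be available until run time. These are NOT the Beam interfaces.
interface Provider<T> {
    T get();
    boolean isAccessible();   // may get() be called right now?
}

// Analogue of a static provider: the value is known immediately.
final class StaticProvider<T> implements Provider<T> {
    private final T value;
    StaticProvider(T value) { this.value = value; }
    public T get() { return value; }
    public boolean isAccessible() { return true; }
}

// Analogue of a runtime provider: the value is bound later; calling get()
// before it is accessible is an error.
final class LateProvider<T> implements Provider<T> {
    private T value;
    private boolean bound;
    void bind(T v) { value = v; bound = true; }
    public T get() {
        if (!bound) throw new IllegalStateException("value not available yet");
        return value;
    }
    public boolean isAccessible() { return bound; }
}
```

Code that consumes a `Provider<T>` can be written once and run unchanged whether the value was fixed at construction time or only supplied at run time, which is the point of the abstraction.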
 
 
- ValueProvider.Deserializer - Class in org.apache.beam.sdk.options
 
- 
For internal use only; no backwards compatibility guarantees.
 
- ValueProvider.NestedValueProvider<T,X> - Class in org.apache.beam.sdk.options
 
- 
 
- ValueProvider.RuntimeValueProvider<T> - Class in org.apache.beam.sdk.options
 
- 
 
- ValueProvider.Serializer - Class in org.apache.beam.sdk.options
 
- 
For internal use only; no backwards compatibility guarantees.
 
- ValueProvider.StaticValueProvider<T> - Class in org.apache.beam.sdk.options
 
- 
 
- ValueProviders - Class in org.apache.beam.sdk.options
 
- 
 
- values() - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
 
- 
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.runners.local.ExecutionDriver.DriverState
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.annotations.Experimental.Kind
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.OpType
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
 
- 
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.Compression
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
- 
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.FileBasedSource.Mode
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.fs.EmptyMatchTreatment
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.fs.MatchResult.Status
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
Writes just the values to Kafka.
 
- values() - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.redis.RedisIO.Write.Method
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.TextIO.CompressionType
 
- 
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
 
- 
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
 
- 
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.options.SdkHarnessOptions.LogLevel
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.PipelineResult.State
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.schemas.Schema.TypeName
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Method in interface org.apache.beam.sdk.state.MapState
 
- 
Returns an Iterable over the values contained in this map.
 
- values() - Static method in enum org.apache.beam.sdk.state.TimeDomain
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.testing.TestStream.EventType
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.transforms.display.DisplayData.Type
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
 
- 
Deprecated.
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- Values<V> - Class in org.apache.beam.sdk.transforms
 
- 
Values<V> takes a PCollection of KV<K, V>s and returns a PCollection<V> of the values.
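As a plain-Java analogue of what the `Values` transform does (keeping only the value side of each key-value pair), using a `Map` in place of a `PCollection` of `KV`s:

```java
import java.util.*;
import java.util.stream.*;

public class ValuesDemo {
    // JDK-only analogue of the Values transform: drop the keys, keep the values.
    static <K, V> List<V> values(Map<K, V> kvs) {
        return kvs.values().stream().collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Integer> kvs = new LinkedHashMap<>();
        kvs.put("a", 1);
        kvs.put("b", 2);
        System.out.println(values(kvs));   // [1, 2]
    }
}
```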
 
- values() - Static method in enum org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.OnTimeBehavior
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.values.PCollection.IsBounded
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- values() - Static method in enum org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
 
- 
Returns an array containing the constants of this enum type, in
the order they are declared.
 
- ValuesSource<T> - Class in org.apache.beam.runners.gearpump.translators.io
 
- 
An unbounded source that reads from a Java Iterable.
 
- ValuesSource(Iterable<T>, Coder<T>) - Constructor for class org.apache.beam.runners.gearpump.translators.io.ValuesSource
 
-  
 
- ValueState<T> - Interface in org.apache.beam.sdk.state
 
- 
 
- ValueWithRecordId<ValueT> - Class in org.apache.beam.sdk.values
 
- 
For internal use only; no backwards compatibility guarantees.
 
- ValueWithRecordId(ValueT, byte[]) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId
 
-  
 
- ValueWithRecordId.StripIdsDoFn<T> - Class in org.apache.beam.sdk.values
 
- 
DoFn to turn a 
ValueWithRecordId<T> back to the value 
T.
 
 
- ValueWithRecordId.ValueWithRecordIdCoder<ValueT> - Class in org.apache.beam.sdk.values
 
- 
A 
Coder for 
ValueWithRecordId, using a wrapped value 
Coder.
 
 
- ValueWithRecordIdCoder(Coder<ValueT>) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
-  
 
- VARCHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
 
-  
 
- VarianceFn<T extends java.lang.Number> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
 
- 
Combine.CombineFn for Variance on Number types.
 
- VarIntCoder - Class in org.apache.beam.sdk.coders
 
- 
A 
Coder that encodes 
Integers using between 1 and 5 bytes.
 
 
- VarLongCoder - Class in org.apache.beam.sdk.coders
 
- 
A 
Coder that encodes 
Longs using between 1 and 10 bytes.
 
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- verifyCompatibility(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.AtomicCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.Coder
 
- 
 
- verifyDeterministic(Coder<?>, String, Iterable<Coder<?>>) - Static method in class org.apache.beam.sdk.coders.Coder
 
- 
Verifies all of the provided coders are deterministic.
 
- verifyDeterministic(Coder<?>, String, Coder<?>...) - Static method in class org.apache.beam.sdk.coders.Coder
 
- 
Verifies all of the provided coders are deterministic.
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.CustomCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DurationCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.FloatCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.InstantCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.KvCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
- 
LengthPrefixCoder is deterministic if the nested Coder is.
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ListCoder
 
- 
List sizes are always known, so ListCoder may be deterministic while the general
 IterableLikeCoder is not.
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.MapCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
- 
NullableCoder is deterministic if the nested Coder is.
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.RowCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SetCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SnappyCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VoidCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
- 
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
-  
 
- verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
-  
 
- verifyPAssertsSucceeded(Pipeline, PipelineResult) - Static method in class org.apache.beam.sdk.testing.TestPipeline
 
- 
Verifies all PAsserts in the pipeline have been executed and were successful.
 
 
- verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
-  
 
- verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
-  
 
- verifyPath(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
 
- 
Validates that a path is well-formed and that it is accessible.
 
- via(Contextful<Contextful.Fn<UserT, OutputT>>, Contextful<Contextful.Fn<DestinationT, FileIO.Sink<OutputT>>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies how to create a 
FileIO.Sink for a particular destination and how to map the
 element type to the sink's output type.
 
 
- via(Contextful<Contextful.Fn<UserT, OutputT>>, FileIO.Sink<OutputT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- via(Contextful<Contextful.Fn<DestinationT, FileIO.Sink<UserT>>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- via(FileIO.Sink<UserT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- via(SimpleFunction<? super InputT, ? extends Iterable<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
 
- 
For a 
SimpleFunction<InputT, ? extends Iterable<OutputT>> fn, return a 
PTransform that applies 
fn to every element of the input 
PCollection<InputT>
 and outputs all of the elements to the output 
PCollection<OutputT>.
 
 
- via(SerializableFunction<NewInputT, ? extends Iterable<OutputT>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
 
- 
For a 
SerializableFunction<InputT, ? extends Iterable<OutputT>> fn, returns a
 
PTransform that applies 
fn to every element of the input 
PCollection<InputT> and outputs all of the elements to the output 
PCollection<OutputT>.
 
 
- via(Contextful<Contextful.Fn<NewInputT, Iterable<OutputT>>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
 
- 
 
- via(SimpleFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
 
- 
For a SimpleFunction<InputT, OutputT> fn, returns a PTransform that
 takes an input PCollection<InputT> and returns a PCollection<OutputT>
 containing fn.apply(v) for every element v in the input.
 
- via(SerializableFunction<NewInputT, OutputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
 
- 
For a SerializableFunction<InputT, OutputT> fn and output type descriptor,
 returns a PTransform that takes an input PCollection<InputT> and returns a
 PCollection<OutputT> containing fn.apply(v) for every element v in the
 input.
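The `MapElements` and `FlatMapElements` entries above describe element-wise transforms. Their shapes can be illustrated with JDK streams alone (this is an analogue, not Beam code):

```java
import java.util.*;
import java.util.stream.*;

public class ElementsDemo {
    // MapElements analogue: exactly one output element per input element.
    static List<Integer> lengths(List<String> words) {
        return words.stream().map(String::length).collect(Collectors.toList());
    }

    // FlatMapElements analogue: zero or more outputs per input element,
    // all flattened into a single output collection.
    static List<String> letters(List<String> words) {
        return words.stream()
                .flatMap(w -> w.chars().mapToObj(c -> String.valueOf((char) c)))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> words = Arrays.asList("hello", "beam");
        System.out.println(lengths(words));        // [5, 4]
        System.out.println(letters(words).size()); // 9
    }
}
```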
 
- via(Contextful<Contextful.Fn<NewInputT, OutputT>>) - Method in class org.apache.beam.sdk.transforms.MapElements
 
- 
 
- viaRandomKey() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
 
- 
Deprecated.
Encapsulates the sequence "pair input with unique key, apply 
Reshuffle.of(), drop the
 key" commonly used to break fusion.
 
 
- View - Class in org.apache.beam.sdk.transforms
 
- 
 
- View.AsIterable<T> - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- View.AsList<T> - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- View.AsMap<K,V> - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- View.AsMultimap<K,V> - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- View.AsSingleton<T> - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- View.CreatePCollectionView<ElemT,ViewT> - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- viewAsValues(PCollectionView<V>, Coder<V>) - Static method in class org.apache.beam.sdk.transforms.Reify
 
- 
Pairs each element in a collection with the value of a side input associated with the element's
 window.
 
- ViewFn<PrimitiveViewT,ViewT> - Class in org.apache.beam.sdk.transforms
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- ViewFn() - Constructor for class org.apache.beam.sdk.transforms.ViewFn
 
-  
 
- viewInGlobalWindow(PCollectionView<V>, Coder<V>) - Static method in class org.apache.beam.sdk.transforms.Reify
 
- 
Returns a 
PCollection consisting of a single element, containing the value of the given
 view in the global window.
 
 
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.gearpump.translators.GearpumpPipelineTranslator
 
-  
 
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
-  
 
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
-  
 
- visitPrimitiveTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
 
- 
Called for each primitive transform after all of its topological predecessors and inputs have
 been visited.
 
- visitValue(PValue, TransformHierarchy.Node) - Method in class org.apache.beam.runners.gearpump.translators.GearpumpPipelineTranslator
 
-  
 
- visitValue(PValue, TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
-  
 
- visitValue(PValue, TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
 
- 
Called for each value after the transform that produced the value has been visited.
 
- VoidCoder - Class in org.apache.beam.sdk.coders
 
- 
 
- voids() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
 
-  
 
- Wait - Class in org.apache.beam.sdk.transforms
 
- 
Delays processing of each window in a 
PCollection until signaled.
 
 
- Wait() - Constructor for class org.apache.beam.sdk.transforms.Wait
 
-  
 
- Wait.OnSignal<T> - Class in org.apache.beam.sdk.transforms
 
- 
 
- waitForStart(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
- 
Returns a future that waits for a start signal for the given duration.
 
- waitForSuccess(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
- 
Waits for a success signal for the given duration.
 
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
-  
 
- waitUntilFinish() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
-  
 
- waitUntilFinish() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
-  
 
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
-  
 
- waitUntilFinish(Duration, MonitoringUtil.JobMessagesHandler) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
- 
Waits until the pipeline finishes and returns the final status.
 
- waitUntilFinish() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
- 
Waits until the pipeline finishes and returns the final status.
 
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
- 
Waits until the pipeline finishes and returns the final status.
 
- waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
-  
 
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
-  
 
- waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
-  
 
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
-  
 
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.gearpump.GearpumpPipelineResult
 
-  
 
- waitUntilFinish() - Method in class org.apache.beam.runners.gearpump.GearpumpPipelineResult
 
-  
 
- waitUntilFinish() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- waitUntilFinish(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
-  
 
- waitUntilFinish(Duration) - Method in interface org.apache.beam.sdk.PipelineResult
 
- 
Waits until the pipeline finishes and returns the final status.
 
- waitUntilFinish() - Method in interface org.apache.beam.sdk.PipelineResult
 
- 
Waits until the pipeline finishes and returns the final status.
 
- Watch - Class in org.apache.beam.sdk.transforms
 
- 
Given a "poll function" that produces a potentially growing set of outputs for an input, this
 transform continuously watches the growth of the output sets of all inputs, until a
 per-input termination condition is reached.
 
- Watch() - Constructor for class org.apache.beam.sdk.transforms.Watch
 
-  
 
- Watch.Growth<InputT,OutputT,KeyT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Watch.Growth.PollFn<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Watch.Growth.PollResult<OutputT> - Class in org.apache.beam.sdk.transforms
 
- 
 
- Watch.Growth.TerminationCondition<InputT,StateT> - Interface in org.apache.beam.sdk.transforms
 
- 
A strategy for determining whether it is time to stop polling the current input regardless of
 whether its output is complete or not.
 
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
- 
Like Read#watchForNewFiles.
 
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.AvroIO.ParseAll
 
- 
Like Read#watchForNewFiles.
 
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
- 
Continuously watches for new files matching the filepattern, polling it at the given
 interval, until the given termination condition is reached.
 
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.AvroIO.ReadAll
 
- 
Like Read#watchForNewFiles.
 
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
 
- watchForNewFiles(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
 
- 
Same as Read#watchForNewFiles(Duration, TerminationCondition).
 
- WatermarkAdvancingStreamingListener() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarkAdvancingStreamingListener
 
-  
 
- WatermarkEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
 
-  
 
- WatermarkHoldState - Interface in org.apache.beam.sdk.state
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- WatermarkManager<ExecutableT,CollectionT> - Class in org.apache.beam.runners.direct
 
- 
Manages watermarks of 
PCollections and input and output watermarks of 
AppliedPTransforms to provide event-time and completion tracking for in-memory
 execution.
 
 
- WatermarkManager.FiredTimers<ExecutableT> - Class in org.apache.beam.runners.direct
 
- 
A pair of TimerInternals.TimerData and key which can be delivered to the appropriate AppliedPTransform.
 
- WatermarkManager.TimerUpdate - Class in org.apache.beam.runners.direct
 
- 
A collection of newly set, deleted, and completed timers.
 
- WatermarkManager.TimerUpdate.TimerUpdateBuilder - Class in org.apache.beam.runners.direct
 
- 
 
- WatermarkManager.TransformWatermarks - Class in org.apache.beam.runners.direct
 
- 
A reference to the input and output watermarks of an AppliedPTransform.
 
- watermarkStateInternal(TimestampCombiner) - Static method in class org.apache.beam.sdk.state.StateSpecs
 
- 
For internal use only; no backwards-compatibility guarantees.
 
- weeks(int, int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
 
- 
Returns a 
WindowFn that windows elements into periods measured by weeks.
 
 
- where(TypeParameter<X>, TypeDescriptor<X>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
Returns a new TypeDescriptor where the type variable represented by typeParameter is substituted by type.
 
- where(Type, Type) - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
- 
 
- whereFieldId(int, SerializableFunction<?, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
 
- 
Set a predicate based on the value of a field, where the field is specified by id.
 
- whereFieldIds(List<Integer>, SerializableFunction<Row, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
 
- 
Set a predicate based on the value of multiple fields, specified by id.
 
- whereFieldName(String, SerializableFunction<?, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
 
- 
Set a predicate based on the value of a field, where the field is specified by name.
 
- whereFieldNames(List<String>, SerializableFunction<Row, Boolean>) - Method in class org.apache.beam.sdk.schemas.transforms.Filter.Inner
 
- 
Set a predicate based on the value of multiple fields, specified by name.
 
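A minimal sketch of a field-based schema filter, assuming a `PCollection<Row>` named `rows` whose schema contains a `Double` field called `temperature` (both names are illustrative):

```java
import org.apache.beam.sdk.schemas.transforms.Filter;
import org.apache.beam.sdk.values.Row;

// Keep only rows whose (assumed) "temperature" field exceeds 100.0.
// whereFieldId(int, ...) could be used instead to reference the field by id.
rows.apply(Filter.<Row>create()
    .whereFieldName("temperature", (Double t) -> t > 100.0));
```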
- window() - Method in interface org.apache.beam.sdk.state.StateContext
 
- 
Returns the window corresponding to the state.
 
- window() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
 
- 
Returns the window in which the timer is firing.
 
- Window<T> - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- Window() - Constructor for class org.apache.beam.sdk.transforms.windowing.Window
 
-  
 
- window() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
 
- 
Returns the window of the current element prior to this WindowFn being called.
 
- Window.Assign<T> - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- Window.ClosingBehavior - Enum in org.apache.beam.sdk.transforms.windowing
 
- 
Specifies the conditions under which a final pane will be created when a window is permanently
 closed.
 
- Window.OnTimeBehavior - Enum in org.apache.beam.sdk.transforms.windowing
 
- 
Specifies the conditions under which an on-time pane will be created when a window is closed.
 
- WindowAssignTranslator<T> - Class in org.apache.beam.runners.gearpump.translators
 
- 
Window.Assign is translated to a Gearpump flatMap function.
 
- WindowAssignTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.WindowAssignTranslator
 
-  
 
- WindowAssignTranslator.AssignWindows<T> - Class in org.apache.beam.runners.gearpump.translators
 
- 
A Function used internally by Gearpump to wrap the actual Beam WindowFn.
 
- windowCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.BagUserStateSpec
 
-  
 
- windowCoder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
-  
 
- windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
 
- 
Returns the 
Coder used for serializing the windows used by this WindowFn.
 
 
- WindowedContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
-  
 
- windowedFilename(int, int, BoundedWindow, PaneInfo, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
-  
 
- windowedFilename(int, int, BoundedWindow, PaneInfo, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
 
- 
When a sink has requested windowed or triggered output, this method will be invoked to return
 the file 
resource to be created given the base output directory and a
 
FileBasedSink.OutputFileHints containing information about the file, including a suggested
 extension (e.g.
 
 
- windowedMultiReceiver(DoFn<?, ?>.WindowedContext, Map<TupleTag<?>, Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
 
- 
 
- windowedMultiReceiver(DoFn<?, ?>.WindowedContext) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
 
- 
 
- windowedReceiver(DoFn<?, ?>.WindowedContext, TupleTag<T>) - Static method in class org.apache.beam.sdk.transforms.DoFnOutputReceivers
 
- 
 
- windowedWrites - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
Whether windowed writes are being used.
 
- WindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
 
- 
The argument to the 
Window transform used to assign elements into windows and to
 determine how windows are merged.
 
 
- WindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn
 
-  
 
- WindowFn.AssignContext - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- WindowFn.MergeContext - Class in org.apache.beam.sdk.transforms.windowing
 
- 
 
- WindowFnTestUtils - Class in org.apache.beam.sdk.testing
 
- 
 
- WindowFnTestUtils() - Constructor for class org.apache.beam.sdk.testing.WindowFnTestUtils
 
-  
 
- WINDOWING_STRATEGY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
-  
 
- WindowingStrategy<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.values
 
- 
A WindowingStrategy describes the windowing behavior for a specific collection of values.
 
- WindowingStrategy.AccumulationMode - Enum in org.apache.beam.sdk.values
 
- 
The accumulation modes that can be used with windowing.
 
- WindowMappingFn<TargetWindowT extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
 
- 
Experimental! This will be ready for users eventually, but should be considered internal for
 now.
 
- WindowMappingFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
 
- 
 
- WindowMappingFn(Duration) - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
 
- 
 
- windowOnlyContext(W) - Static method in class org.apache.beam.sdk.state.StateContexts
 
-  
 
- windows() - Static method in class org.apache.beam.sdk.transforms.Reify
 
- 
 
- windows() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
 
- 
Returns the current set of windows.
 
- windowsInValue() - Static method in class org.apache.beam.sdk.transforms.Reify
 
- 
Create a 
PTransform that will output all input 
KVs with the window pane info
 inside the value.
 
 
- WindowTimestampFn(int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.WindowTimestampFn
 
-  
 
- WireCoders - Class in org.apache.beam.runners.fnexecution.wire
 
- 
Helpers to construct coders for gRPC port reads and writes.
 
- with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
 
- 
 
- with(SimpleFunction<DataT, InputT>, Coder, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
 
- 
Like #with(SimpleFunction, CombineFn, TupleTag) but with an explicit input coder.
 
- with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
 
- 
 
- with(SimpleFunction<DataT, InputT>, Coder, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
 
- 
Like #with(SimpleFunction, CombineFnWithContext, TupleTag) but with an explicit input coder.
 
- with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
- 
 
- with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
- 
 
- with(SimpleFunction<DataT, InputT>, Coder, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
- 
 
- with(SimpleFunction<DataT, InputT>, Coder, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
- 
 
- with(SimpleFunction<DataT, InputT>, CombineFnBase.GlobalCombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
- 
 
- with(SimpleFunction<DataT, InputT>, Coder<InputT>, CombineFnBase.GlobalCombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
- 
 
- withAccuracy(double, double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
 
- 
 
- withAddresses(List<String>) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
- 
Define the AMQP addresses from which to receive messages.
 
- withAllFields() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
-  
 
- withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Override the amount of lateness allowed for data elements in the output 
PCollection and
 downstream 
PCollections until explicitly set again.
 
 
- withAllowedLateness(Duration, Window.ClosingBehavior) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
Override the amount of lateness allowed for data elements in the pipeline.
 
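A minimal sketch combining allowed lateness with a closing behavior, assuming a `PCollection<String>` named `input` exists:

```java
import org.joda.time.Duration;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;

// Fixed 10-minute windows that accept data up to 2 minutes late and
// always emit a final pane when the window closes, even if it is empty.
// "input" is an assumed PCollection<String>.
input.apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
    .withAllowedLateness(Duration.standardMinutes(2),
        Window.ClosingBehavior.FIRE_ALWAYS));
```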
- withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
- 
Returns a 
WindowingStrategy identical to 
this but with the allowed lateness set
 to 
allowedLateness.
 
 
- withAllowedTimestampSkew(Duration) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
 
- 
 
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
- 
Define the password to authenticate on the Redis server.
 
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadAll
 
-  
 
- withAuth(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
-  
 
- withAutoUdfUdafLoad(boolean) - Method in class org.apache.beam.sdk.extensions.sql.SqlTransform
 
-  
 
- withAWSClientsProvider(AwsClientsProvider) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
 
- 
 
- withAWSClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
 
- 
Specify credential details and region to be used to write to SNS.
 
- withAWSClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
 
- 
Specify credential details and region to be used to write to SNS.
 
- withAWSClientsProvider(AWSClientsProvider) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
 
- withAWSClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specify credential details and region to be used to read from Kinesis.
 
- withAWSClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specify credential details and region to be used to read from Kinesis.
 
- withAWSClientsProvider(AWSClientsProvider) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
- 
 
- withAWSClientsProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
- 
Specify credential details and region to be used to write to Kinesis.
 
- withAWSClientsProvider(String, String, Regions, String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
- 
Specify credential details and region to be used to write to Kinesis.
 
- withBaseFilename(ResourceId) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
- 
Sets the base filename.
 
- withBaseFilename(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
- 
 
- withBasicCredentials(String, String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
 
- 
If Solr basic authentication is enabled, provide the username and password.
 
- withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
If true, uses the Cloud Spanner batch API.
 
- withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
By default, the Batch API is used to read data from Cloud Spanner.
 
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
- 
Provide a size for the scroll read.
 
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
- 
Sets batch size for the write operation.
 
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
- 
Provide a maximum batch size, as a number of SQL statements.
 
- withBatchSize(long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
- 
Define the size of the batch to group write operations.
 
- withBatchSize(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- withBatchSize(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadAll
 
-  
 
- withBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the batch size limit.
 
- withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
Returns a new 
BigtableIO.Read that will read from the Cloud Bigtable instance with
 customized options provided by the given configurator.
 
 
- withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
Returns a new 
BigtableIO.Write that will write to the Cloud Bigtable instance with
 customized options provided by the given configurator.
 
 
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets the bootstrap servers for the Kafka consumer.
 
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
Returns a new 
KafkaIO.Write transform with Kafka producer pointing to 
bootstrapServers.
 
 
- withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- withCassandraService(CassandraService<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify an instance of 
CassandraService used to connect and read from the Cassandra
 database.
 
 
- withCassandraService(CassandraService<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
Specify the 
CassandraService used to connect and write into the Cassandra database.
 
 
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Sets the XML file charset.
 
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
- 
 
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
-  
 
- withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
- 
Sets the charset used to write the file.
 
- withChunkSize(Long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- withClosingBehavior(Window.ClosingBehavior) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- withCodec(CodecFactory) - Method in class org.apache.beam.sdk.io.AvroIO.Sink
 
- 
Specifies to use the given 
CodecFactory for each generated file.
 
 
- withCodec(CodecFactory) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Writes to Avro file(s) compressed using specified codec.
 
- withCodec(CodecFactory) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
- 
Sets a coder for the result of the parse function.
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.AvroIO.ParseAll
 
- 
Specifies the coder for the result of the parseFn.
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
Sets a 
Coder for the result of the parse function.
 
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- withCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
-  
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
 
- 
Returns a 
Create.TimestampedValues PTransform like this one that uses the given
 
Coder<T> to decode each of the objects into a value of type 
T.
 
 
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
 
- 
Returns a 
Create.Values PTransform like this one that uses the given 
Coder<T>
 to decode each of the objects into a value of type 
T.
 
 
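A minimal sketch of supplying an explicit coder to Create instead of relying on coder inference, assuming a `Pipeline` named `pipeline` exists:

```java
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.transforms.Create;

// Build an in-memory PCollection<String> with an explicit coder.
// "pipeline" is an assumed Pipeline instance.
pipeline.apply(Create.of("a", "b", "c").withCoder(StringUtf8Coder.of()));
```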
- withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
- 
Sets the collection to consider in the database.
 
- withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
- 
Sets the collection in the database to write data to.
 
- withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withCompletedTimers(Iterable<TimerInternals.TimerData>) - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate.TimerUpdateBuilder
 
- 
 
- withCompletedTimers(Iterable<TimerInternals.TimerData>) - Method in class org.apache.beam.runners.direct.WatermarkManager.TimerUpdate
 
- 
 
- withCompression(double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
 
- 
Sets the compression factor cf.
 
- withCompression(double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.PerKeyDigest
 
- 
Sets the compression factor cf.
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.CompressedSource
 
- 
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
 
- 
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies to compress all generated shard files using the given 
Compression and, by
 default, append the respective extension to the filename.
 
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
Reads from input sources using the specified compression type.
 
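A minimal sketch of reading compressed text, assuming a `Pipeline` named `pipeline` exists; the file path is purely illustrative:

```java
import org.apache.beam.sdk.io.Compression;
import org.apache.beam.sdk.io.TextIO;

// Read gzip-compressed text files; the glob below is a hypothetical location.
pipeline.apply(TextIO.read()
    .from("gs://my-bucket/logs/*.gz")
    .withCompression(Compression.GZIP));
```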
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
 
- 
Reads from input sources using the specified compression type.
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Returns a transform for writing to text files like this one but that compresses output using
 the given 
Compression.
 
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
- 
Returns a transform for reading TFRecord files that decompresses all input files using the
 specified compression type.
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
Writes to output files using the specified compression type.
 
- withCompression(Compression) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Decompresses all input files using the specified compression type.
 
- withCompressionType(TextIO.CompressionType) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
 
- withCompressionType(TextIO.CompressionType) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
 
- 
 
- withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
- 
 
- withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
 
- withCompressionType(XmlIO.Read.CompressionType) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
 
- withConfidence(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
 
- 
Sets the confidence value, i.e.
 
- withConfidence(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
 
- 
Sets the confidence value, i.e.
 
- withConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
- 
Sets the configuration properties like metastore URI.
 
- withConfigProperties(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
- 
Sets the configuration properties like metastore URI.
 
- withConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.FileIO.Match
 
- 
 
- withConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
 
- 
 
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
- 
Reads from the source using the options provided by the given configuration.
 
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
- 
Reads from the HBase instance indicated by the given configuration.
 
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.ReadAll
 
- 
Reads from the HBase instance indicated by the given configuration.
 
- withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
- 
Writes to the HBase instance indicated by the given Configuration.
 
- withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
- 
Provide the Elasticsearch connection configuration object.
 
- withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provide the Elasticsearch connection configuration object.
 
- withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
- 
Define the MQTT connection configuration used to connect to the MQTT broker.
 
- withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
- 
Define MQTT connection configuration used to connect to the MQTT broker.
 
- withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadAll
 
-  
 
- withConnectionConfiguration(RedisConnectionConfiguration) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
-  
 
- withConnectionConfiguration(SolrIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
 
- 
Provide the Solr connection configuration object.
 
- withConnectionConfiguration(SolrIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
 
- 
Provide the Solr connection configuration object.
 
- withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
- 
Specify the JMS connection factory to connect to the JMS broker.
 
- withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
- 
Specify the JMS connection factory to connect to the JMS broker.
 
- withConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
- 
Sets the connection properties passed to driver.connect(...).
 
- withConnectionProperties(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
- 
 
- withConsistencyLevel(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
-  
 
- withConsistencyLevel(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
-  
 
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
A factory to create a Kafka Consumer from the consumer configuration.
 
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, ? extends Consumer<?, ?>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
 
- withContentTypeHint(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
- 
Sets a content type hint to make the file parser detection more efficient.
 
- withConversion(ReinterpretConversion) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.reinterpret.Reinterpreter.Builder
 
-  
 
- withCreateDisposition(BigQueryIO.Write.CreateDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Specifies whether the table should be created if it does not exist.
 
- withCreateTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withCreateTime(Duration) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
 
- 
 
- withCustomGcsTempLocation(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Provides a custom location on GCS for storing temporary files to be loaded via BigQuery batch
 load jobs.
 
- withDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
- 
Sets the database name.
 
- withDatabase(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
- 
Sets the database name.
 
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
- 
Sets the database to use.
 
- withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
- 
Sets the database to use.
 
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner database.
 
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner database.
 
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner database.
 
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner database.
 
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner database.
 
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner database.
 
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner database.
 
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner database.
 
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
-  
 
- withDecompression(CompressedSource.DecompressingChannelFactory) - Method in class org.apache.beam.sdk.io.CompressedSource
 
- 
 
- WithDefault() - Constructor for class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
 
-  
 
- withDefaultValue(T) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
 
- 
Default value to return for windows with no value in them.
 
- withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
Set the custom delimiter to be used in place of the default ones ('\r', '\n' or '\r\n').
 
- withDelimiter(byte[]) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
 
- 
Like Read#withDelimiter.
 
- withDelimiter(char[]) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Specifies the delimiter after each string written.
 
- withDelimiter(char[]) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withDescription(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
Returns a copy of the Field with the description set.
 
- withDestinationCoder(Coder<DestinationT>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withDirectoryTreatment(FileIO.ReadMatches.DirectoryTreatment) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
 
- 
Controls how to handle directories in the input 
PCollection.
 
 
- withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
- 
Creates a new Trigger like this one, except that it fires repeatedly whenever the
 given Trigger fires before the watermark has passed the end of the window.
 
- withElementTimestamp() - Static method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
 
- 
 
- withEmptyGlobalWindowDestination(DestinationT) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
- 
Like Read#withEmptyMatchTreatment.
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.AvroIO.ParseAll
 
- 
Like Read#withEmptyMatchTreatment.
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
- 
Configures whether or not a filepattern matching no files is allowed.
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.AvroIO.ReadAll
 
- 
Like Read#withEmptyMatchTreatment.
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.AvroSource
 
-  
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.Match
 
- 
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
 
- 
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
 
- 
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
 
- withEmptyMatchTreatment(EmptyMatchTreatment) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
 
- 
Same as Read#withEmptyMatchTreatment.
 
- withEndKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
Returns new 
ByteKeyRange like this one, but with the specified end key.
 
 
- withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadAll
 
-  
 
- withEndpoint(String, int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
-  
 
- withEntity(Class<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the entity class (annotated POJO).
 
- withEntity(Class<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
 
- withEOS(int, String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
Provides exactly-once semantics while writing to Kafka, which enables applications with
 end-to-end exactly-once guarantees on top of exactly-once semantics within Beam
 pipelines.
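 A minimal sketch of such a sink (broker address, topic, shard count, and sink group id are all assumptions):

```java
// Hypothetical sketch: exactly-once Kafka sink with 5 shards and the sink
// group id "my-eos-sink". Broker address and topic are assumptions.
input.apply(
    KafkaIO.<String, String>write()
        .withBootstrapServers("broker:9092")
        .withTopic("results")
        .withKeySerializer(StringSerializer.class)
        .withValueSerializer(StringSerializer.class)
        .withEOS(5, "my-eos-sink"));
```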
 
- withEpsilon(double) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
- 
Returns an ApproximateQuantilesCombineFn that's like this one except that it uses the
 specified epsilon value.
 
- withExpireTime(Long) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
-  
 
- withExtendedErrorInfo() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- withExtendedErrorInfo(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
 
- 
Specify whether to use extended error info or not.
 
- withExtensionsFrom(Iterable<Class<?>>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
- 
Returns a 
ProtoCoder like this one, but with the extensions from the given classes
 registered.
 
 
- withExtensionsFrom(Class<?>...) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
- 
 
- withFailedInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Specifies a policy for handling failed inserts.
 
- withFailureMode(SpannerIO.FailureMode) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies failure mode.
 
- withFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
Returns a 
PTransform identical to this, but that uses an intermediate node to combine
 parts of the data to reduce load on the final global combine step.
 
 
- withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
- 
Sets the size of the data to be fetched and loaded into memory per database call.
 
- withFetchSize(int) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
- 
Sets the size of the data to be fetched and loaded into memory per database call.
 
- withFieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
- 
Return a descriptor that accesses the specified fields.
 
- withFieldIds(Iterable<Integer>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
- 
Return a descriptor that accesses the specified fields.
 
- withFieldNameFunction(SerializableFunction<List<String>, String>) - Method in class org.apache.beam.sdk.schemas.transforms.Unnest.Inner
 
- 
Sets a policy for naming deeply-nested fields.
 
- withFieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
- 
Return a descriptor that accesses the specified fields.
 
- withFieldNames(Iterable<String>) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
- 
Return a descriptor that accesses the specified fields.
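 A minimal sketch of building such a descriptor (the field names are assumptions):

```java
// Hypothetical sketch: select the "userId" and "country" fields of a
// schema'd row. The field names are assumptions.
FieldAccessDescriptor fields =
    FieldAccessDescriptor.withFieldNames("userId", "country");
```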
 
- withFieldValueGetters(FieldValueGetterFactory, Object) - Method in class org.apache.beam.sdk.values.Row.Builder
 
-  
 
- withFilename(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- withFilter(Filter) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
- 
Filters the rows read from HBase using the given row filter.
 
- withFilter(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
- 
Sets the filter details.
 
- withFilter(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- withFilter(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
- 
Sets a filter on the documents in a collection.
 
- withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
-  
 
- withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Adds a footer string to each file.
 
- withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withFormatFunction(SerializableFunction<UserT, OutputT>) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Specifies a format function to convert UserT to the output type.
 
- withFormatFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Formats the user's type into a 
TableRow to be written to BigQuery.
 
 
- withFormatFunction(SerializableFunction<UserT, String>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- withGapDuration(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
- 
Creates a 
Sessions WindowFn with the specified gap duration.
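 A minimal sketch of applying session windowing (the input PCollection "events" and the 10-minute gap are assumptions):

```java
// Hypothetical sketch: session windows that close after 10 minutes of
// inactivity per key. "events" is an assumed PCollection<KV<String, Long>>.
events.apply(
    Window.<KV<String, Long>>into(
        Sessions.withGapDuration(Duration.standardMinutes(10))));
```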
 
 
- withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
-  
 
- withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Adds a header string to each file.
 
- withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
- 
Like Read#withHintMatchesManyFiles().
 
- withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
- 
 
- withHintMatchesManyFiles() - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
 
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner host.
 
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
-  
 
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner host.
 
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner host.
 
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
-  
 
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner host.
 
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner host.
 
- withHost(String) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
- 
Define the host name of the Redis server.
 
- withHosts(List<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the hosts of the Apache Cassandra instances.
 
- withHosts(List<String>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
Specify the Cassandra instance hosts where to write data.
 
- withHotKeyFanout(SerializableFunction<? super K, Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
- 
If a single key has disproportionately many values, it may become a bottleneck, especially in
 streaming mode.
 
- withHotKeyFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
- 
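 A minimal sketch of the function-based overload (the key name "hot" and fanout of 10 are assumptions):

```java
// Hypothetical sketch: fan out a known hot key across 10 intermediate
// combiners while other keys use no extra fanout.
counts.apply(
    Combine.<String, Long, Long>perKey(Sum.ofLongs())
        .withHotKeyFanout(key -> key.equals("hot") ? 10 : 1));
```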
 
- withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
- 
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub
 message attributes, specifies the name of the attribute containing the unique identifier.
 
- withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
- 
Writes to Pub/Sub, adding each record's unique identifier to the published messages in an
 attribute with the specified name.
 
- withIdFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provide a function to extract the id from the document.
 
- withIdGenerator(IdGenerator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
 
-  
 
- withIgnoreWindowing() - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withIndexFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provide a function to extract the target index from the document allowing for dynamic
 document routing.
 
- withInitialPositionInStream(InitialPositionInStream) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specify reading from some initial position in stream.
 
- withInitialTimestampInStream(Instant) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specify reading beginning at given 
Instant.
 
 
- withInputMetadata(Metadata) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
- 
Sets the input metadata for Parser.parse(java.io.InputStream, org.xml.sax.ContentHandler, org.apache.tika.metadata.Metadata, org.apache.tika.parser.ParseContext).
 
- withInputTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
The timestamp for each record being published is set to the timestamp of the element in
 the pipeline.
 
- withInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
 
- 
Specify a retry policy for failed inserts.
 
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner instance.
 
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner instance.
 
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner instance.
 
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner instance.
 
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner instance.
 
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner instance.
 
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner instance.
 
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner instance.
 
- withInterceptors(List<ClientInterceptor>) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
 
- 
Returns a 
ManagedChannelFactory like this one, but which will apply the provided 
ClientInterceptors to any channel it creates.
 
 
- withIsReady(Supplier<Boolean>) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
 
- 
Returns a new 
TestStreams.Builder like this one with the specified 
CallStreamObserver.isReady() callback.
 
 
- withJsonSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- withJsonSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- withJsonTimePartitioning(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- withKeepAlive(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
- 
Sets whether socket keep alive is enabled.
 
- withKeepAlive(boolean) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
- 
Sets whether socket keep alive is enabled.
 
- withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets a Kafka Deserializer to interpret key bytes read from Kafka.
 
- withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets a Kafka 
Deserializer for interpreting key bytes read from Kafka along with a
 
Coder for helping the Beam runner materialize key objects at runtime if necessary.
 
 
- withKeyPattern(String) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
Returns a new 
BigtableIO.Read that will read only rows in the specified range.
 
 
- withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
- 
Reads only rows in the specified range.
 
- withKeyRange(byte[], byte[]) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
- 
Reads only rows in the specified range.
 
- withKeyRanges(List<ByteKeyRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
Returns a new 
BigtableIO.Read that will read only rows in the specified ranges.
 
 
- WithKeys<K,V> - Class in org.apache.beam.sdk.transforms
 
- 
WithKeys<K, V> takes a PCollection<V>, and either a constant key of type K or a function from V to K, and returns a PCollection<KV<K, V>>, where
 each of the values in the input PCollection has been paired with either the constant key
 or a key computed from the value.
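 A minimal sketch using a key-extraction function ("words" is an assumed PCollection&lt;String&gt;):

```java
// Hypothetical sketch: key each word by its length. withKeyType helps
// coder inference when a lambda is used as the key function.
PCollection<KV<Integer, String>> keyed =
    words.apply(
        WithKeys.of((String word) -> word.length())
            .withKeyType(TypeDescriptors.integers()));
```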
 
- withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
Sets a Serializer for serializing the key (if any) to bytes.
 
- withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withKeyspace(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the Cassandra keyspace where to read data.
 
- withKeyspace(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
Specify the Cassandra keyspace where to write data.
 
- withKeystorePassword(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
- 
If Elasticsearch uses SSL/TLS with mutual authentication (via shield), provide the password
 to open the client keystore.
 
- withKeystorePath(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
- 
If Elasticsearch uses SSL/TLS with mutual authentication (via shield), provide the keystore
 containing the client key.
 
- withKeyTranslation(SimpleFunction<?, K>) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
- 
Transforms the keys read from the source using the given key translation function.
 
- withKeyType(TypeDescriptor<K>) - Method in class org.apache.beam.sdk.transforms.WithKeys
 
- 
Return a 
WithKeys that is like this one with the specified key type descriptor.
 
 
- withLabel(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
 
- withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
-  
 
- withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
- 
Creates a new Trigger like this one, except that it fires repeatedly whenever the
 given Trigger fires after the watermark has passed the end of the window.
 
- withLinkUrl(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
 
- withLiteralGqlQuery(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
Returns a new 
DatastoreV1.Read that reads the results of the specified GQL query.
 
 
- withLiteralGqlQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
 
- withLoadJobProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Set the project the BigQuery load job will be initiated from.
 
- withLoadJobProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
-  
 
- withLocalDc(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the local DC used for the load balancing.
 
- withLocalDc(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
Specify the local DC used by the load balancing policy.
 
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
 
- 
Returns a new 
DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore Emulator
 running locally on the specified host port.
 
 
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
 
- 
Returns a new 
DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore Emulator
 running locally on the specified host port.
 
 
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
Returns a new 
DatastoreV1.Read that reads from a Datastore Emulator running at the
 given localhost address.
 
 
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
 
- 
Returns a new 
DatastoreV1.Write that writes to the Cloud Datastore Emulator running locally on
 the specified host port.
 
 
- withLogAppendTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withLogAppendTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
 
- 
A 
TimestampPolicy that assigns Kafka's log append time (server side ingestion time) to
 each record.
 
 
- withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
 
- 
 
- withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
 
- 
 
- withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
 
- 
 
- withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.AvroIO.ParseAll
 
- 
 
- withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
- 
 
- withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.AvroIO.ReadAll
 
- 
 
- withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
- 
 
- withMatchConfiguration(FileIO.MatchConfiguration) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
 
- 
 
- withMaxBatchBytesSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
- 
Writes to Pub/Sub are generally limited to 10 MB.
 
- withMaxBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provide a maximum size in number of documents for the batch see bulk API
 (https://www.elastic.co/guide/en/elasticsearch/reference/2.4/docs-bulk.html).
 
- withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
- 
Writes to Pub/Sub are batched to efficiently send data.
 
- withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
 
- 
Provide a maximum size in number of documents for the batch.
 
- withMaxBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provide a maximum size in bytes for the batch see bulk API
 (https://www.elastic.co/guide/en/elasticsearch/reference/2.4/docs-bulk.html).
 
- withMaxConnectionIdleTime(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
- 
Sets the maximum idle time for a pooled connection.
 
- withMaxConnectionIdleTime(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
- 
Sets the maximum idle time for a pooled connection.
 
- withMaxInputSize(long) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
- 
Returns an ApproximateQuantilesCombineFn that's like this one except that it uses the
 specified maxNumElements value.
 
- withMaxNumMutations(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the cell mutation limit.
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
- 
Define the max number of records received by the 
AmqpIO.Read.
 
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
 
- 
Define the max number of records received by the 
SqsIO.Read.
 
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
- 
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
- 
Define the max number of records that the source will read.
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specifies to read at most a given number of records.
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
- 
Define the max number of records received by the 
MqttIO.Read.
 
 
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
- 
 
- withMaxNumWritersPerBundle(int) - Method in class org.apache.beam.sdk.io.WriteFiles
 
- 
Set the maximum number of writers created in a bundle before spilling to shuffle.
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
 
- 
Define the max read time (duration) while the 
AmqpIO.Read will receive messages.
 
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
 
- 
Define the max read time (duration) while the 
SqsIO.Read will receive messages.
 
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
- 
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
- 
Specifies to stop generating elements after the given time.
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
- 
Define the max read time that the source will read.
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specifies to read records during maxReadTime.
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
- 
Define the max read time (duration) while the 
MqttIO.Read will receive messages.
 
 
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
- 
 
- withMemoryMB(int) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
 
- 
Sets the size of the memory buffer in megabytes.
 
- withMessageMapper(JmsIO.MessageMapper<T>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
-  
 
- withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.io.AvroIO.Sink
 
- 
Specifies to put the given metadata into each generated file.
 
- withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Writes to Avro file(s) with the specified metadata.
 
- withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withMetadata() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
- 
Include metadata in result json documents.
 
- withMetadata(byte[]) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
Returns a copy of the descriptor with metadata set.
 
- withMetadata(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
 
- 
Returns a copy of the descriptor with metadata set.
 
- withMethod(BigQueryIO.Write.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Choose the method used to write data to BigQuery.
 
- withMethod(RedisIO.Write.Method) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
-  
 
- WithMetricsSupport - Class in org.apache.beam.runners.spark.metrics
 
- 
A 
MetricRegistry decorator that exposes 
AggregatorMetric and 
SparkBeamMetric as 
Gauges.
 
 
- withMinBundleSize(long) - Method in class org.apache.beam.sdk.io.AvroSource
 
- 
Sets the minimum bundle size.
 
- withMinBundleSize(long) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Sets a parameter minBundleSize for the minimum bundle size of the source.
 
- withMinNumberOfSplits(Integer) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
It's possible that system.size_estimates isn't populated or that the number of splits
 computed by Beam is still too low for Cassandra to handle it.
 
- withMode(WindowingStrategy.AccumulationMode) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
- 
Returns a 
WindowingStrategy identical to 
this but with the accumulation mode
 set to 
mode.
 
 
- withName(String) - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
Returns a copy of the Field with the name set.
 
- withNamespace(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
 
- withNamespace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
 
- withNamespace(Class<?>) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
- 
 
- withNaming(FileIO.Write.FileNaming) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies a custom strategy for generating filenames.
 
- withNaming(SerializableFunction<DestinationT, FileIO.Write.FileNaming>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withNaming(Contextful<Contextful.Fn<DestinationT, FileIO.Write.FileNaming>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withNestedField(int, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
- 
Return a descriptor that accesses the specified nested field.
 
- withNestedField(String, FieldAccessDescriptor) - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
 
- 
Return a descriptor that accesses the specified nested field.
 
- withNullable(boolean) - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
Returns a copy of the Field with isNullable set.
 
- withNumFileShards(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Control how many file shards are written when using BigQuery load jobs.
 
- withNumQuerySplits(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
Returns a new 
DatastoreV1.Read that reads by splitting the given 
query into
 
numQuerySplits.
 
 
- withNumShards(int) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Configures the number of output shards produced overall (when using unwindowed writes) or
 per-window (when using windowed writes).
 
- withNumShards(int) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withNumShards(int) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies to use a given fixed number of shards per window.
 
- withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withNumShards(int) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Configures the number of output shards produced overall (when using unwindowed writes) or
 per-window (when using windowed writes).
 
- withNumShards(int) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withNumShards(int) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
Writes to the provided number of shards.
 
- withNumShards(int) - Method in class org.apache.beam.sdk.io.WriteFiles
 
- 
 
- withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.WriteFiles
 
- 
 
- withNumSplits(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
- 
Sets the user defined number of splits.
 
- withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
- 
Partitions the timestamp space into half-open intervals of the form [N * size + offset, (N + 1)
 * size + offset), where 0 is the epoch.
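 A plain-Java sketch of the interval arithmetic described above (no Beam dependency; the class name is hypothetical, and Beam itself works with Instant/Duration rather than raw longs):

```java
// Illustrates which window of the form [N * size + offset, (N + 1) * size + offset)
// a given timestamp falls into.
public class FixedWindowDemo {
  static long windowStart(long t, long size, long offset) {
    // Start of the window containing t (assumes t >= offset for simplicity).
    return t - ((t - offset) % size);
  }

  public static void main(String[] args) {
    // Size 60, offset 5: windows are [5, 65), [65, 125), ...
    if (windowStart(70, 60, 5) != 65) throw new AssertionError();
    if (windowStart(64, 60, 5) != 5) throw new AssertionError();
    System.out.println("ok");
  }
}
```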
 
- withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
- 
Assigns timestamps into half-open intervals of the form [N * period + offset, N * period +
 offset + size).
 
- withOnCompleted(Runnable) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
 
- 
Returns a new 
TestStreams.Builder like this one with the specified 
StreamObserver.onCompleted() callback.
 
 
- withOnError(Runnable) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
 
- 
Returns a new 
TestStreams.Builder like this one with the specified 
StreamObserver.onError(java.lang.Throwable)
 callback.
 
 
- withOnError(Consumer<Throwable>) - Method in class org.apache.beam.sdk.fn.test.TestStreams.Builder
 
- 
Returns a new 
TestStreams.Builder like this one with the specified 
StreamObserver.onError(java.lang.Throwable)
 consumer.
 
 
- withOnNext(Consumer<T>) - Static method in class org.apache.beam.sdk.fn.test.TestStreams
 
- 
Creates a test 
CallStreamObserver TestStreams.Builder that forwards 
StreamObserver.onNext(V) calls to the supplied 
Consumer.
 
 
- withOnTimeBehavior(Window.OnTimeBehavior) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
(Experimental) Override the default 
Window.OnTimeBehavior, to control whether to
 output an empty on-time pane.
 
 
- withOnTimeBehavior(Window.OnTimeBehavior) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- withoutDefaults() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
Returns a 
PTransform identical to this, but that does not attempt to provide a
 default value in the case of empty input.
 
 
- withoutMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Returns a 
PTransform for PCollection of 
KV, dropping Kafka metadata.
 
 
- withOutputCoder(Coder<OutputT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
Specifies a 
Coder to use for the outputs.
 
 
- withOutputFilenames() - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
Specify that output filenames are wanted.
 
- withOutputFilenames() - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
Specify that output filenames are wanted.
 
- withOutputKeyCoder(Coder<KeyT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
Specifies the coder for the output key.
 
- withOutputTags(TupleTag<OutputT>, TupleTagList) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
- 
 
- withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
 
- withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- withoutSharding() - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Forces a single file as output and an empty shard name template.
 
- withoutSharding() - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withoutSharding() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Forces a single file as output and an empty shard name template.
 
- withoutSharding() - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withoutSharding() - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
Forces a single file as output.
 
- withoutStrictParsing() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
 
- 
During argument parsing, improperly formatted and unknown arguments are skipped over.
 
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
Disable validation that the table exists or the query succeeds prior to pipeline submission.
 
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Disables BigQuery table validation.
 
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
Disables validation that the table being read from exists.
 
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
Disables validation that the table being written to exists.
 
- withoutValidation() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
- 
Returns a transform for reading TFRecord files that has GCS path validation on pipeline
 creation disabled.
 
- withParameterSetter(JdbcIO.PreparedStatementSetter<ParameterT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- withParseFn(SerializableFunction<GenericRecord, X>, Coder<X>) - Method in class org.apache.beam.sdk.io.AvroSource
 
- 
Reads 
GenericRecord of unspecified schema and maps them to instances of a custom type
 using the given 
parseFn and encoded using the given coder.
 
 
- withParser(MongoDbGridFSIO.Parser<X>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- withPartition(Map<String, String>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
- 
Sets the partition details.
 
- withPartitioner(KinesisPartitioner) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
- 
 
- withPartitionKey(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
- 
Specify the default partition key.
 
- withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withPassword(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the password for authentication.
 
- withPassword(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
Specify the password used for authentication.
 
- withPassword(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
- 
If Elasticsearch authentication is enabled, provide the password.
 
- withPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- withPassword(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
- 
Define the password to connect to the JMS broker (authenticated).
 
- withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
- 
Define the password to connect to the JMS broker (authenticated).
 
- withPassword(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
-  
 
- withPollInterval(Duration) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
 
- withPort(int) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the port number of the Apache Cassandra instances.
 
- withPort(int) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
Specify the port number of the Cassandra instance to write data to.
 
- withPort(int) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
- 
Define the port number of the Redis server.
 
- withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
- 
 
- withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
 
- 
Sets the precision p.
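As background for the precision parameter: a HyperLogLog-style sketch with precision p uses 2^p registers, giving an expected relative error of roughly 1.04 / sqrt(2^p). This follows standard HyperLogLog analysis; the helper below is an illustration of that trade-off, not Beam code:

```java
public class HllPrecision {
  // Expected relative error of a HyperLogLog-style sketch with 2^p registers.
  static double relativeError(int p) {
    return 1.04 / Math.sqrt(1L << p);
  }

  public static void main(String[] args) {
    // p = 12 (4096 registers) gives roughly 1.6% relative error.
    System.out.println(relativeError(12));
  }
}
```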
 
- withPrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
 
- 
Sets the precision p.
 
- withPrefix(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies a common prefix to use for all generated filenames, if using the default file
 naming.
 
- withPrefix(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
-  
 
- withProcessingTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withProcessingTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
 
- 
 
- withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
Sets a custom function to create the Kafka producer.
 
- withProducerProperties(Properties) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
- 
Specify the configuration properties for Kinesis Producer Library (KPL).
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
 
- 
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
 
- 
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
 
- 
Returns a new 
DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the
 specified project.
 
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
 
- 
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
Returns a new 
DatastoreV1.Read that reads from the Cloud Datastore for the specified
 project.
 
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
 
- 
Returns a new 
DatastoreV1.Write that writes to the Cloud Datastore for the specified project.
 
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
 
- 
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
-  
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner project.
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner project.
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner project.
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner project.
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner project.
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner project.
 
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner project.
 
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner project.
 
- withPublishTimestampFunction(KafkaPublishTimestampFunction<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
A function to provide the timestamp for records being published.
 
- withQuery(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
- 
Provide a query used while reading from Elasticsearch.
 
- withQuery(Query) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
- 
 
- withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- withQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- withQuery(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
 
- 
Provide a query used while reading from Solr.
 
- withQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
BigQuery geographic location where the query 
job will be
 executed.
 
 
- withQueryPriority(BigQueryIO.TypedRead.QueryPriority) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
- 
 
- withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
- 
Specify the JMS queue destination name to read messages from.
 
- withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
- 
Specify the JMS queue destination name to send messages to.
 
- withQueueUrl(String) - Method in class org.apache.beam.sdk.io.aws.sqs.SqsIO.Read
 
- 
Define the queueUrl used by the 
SqsIO.Read to receive messages from SQS.
 
 
- withRate(long, Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
- 
Specifies to generate at most a given number of elements per a given period.
 
- withReadCommitted() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets "isolation_level" to "read_committed" in Kafka consumer configuration.
 
- withReadOperation(ReadOperation) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Sets a JAXB-annotated class that can be populated using a record of the provided XML file.
 
- withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
- 
 
- withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
- 
Writes objects of the given class mapped to XML elements using JAXB.
 
- withRecordElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Sets name of the record element of the XML document.
 
- withRecordElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
- 
 
- withRelativeError(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
 
- 
Sets the relative error epsilon.
 
- withRelativeError(double) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.PerKeySketch
 
- 
Sets the relative error epsilon.
 
- withRepresentativeType(TypeDescriptor<IdT>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
 
- 
Return a 
WithRepresentativeValues PTransform that is like this one, but with
 the specified output type descriptor.
 
 
- withRepresentativeValueFn(SerializableFunction<T, IdT>) - Static method in class org.apache.beam.sdk.transforms.Distinct
 
- 
Returns a Distinct<T, IdT> PTransform.
 
- withRequestRecordsLimit(int) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specifies the maximum number of records in the GetRecordsResult returned by each GetRecords call, which is limited to 10,000 records.
 
- withResultOutputTag(TupleTag<PublishResult>) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
 
- 
Tuple tag to store results.
 
- withResumeDelay(Duration) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContinuation
 
- 
 
- withRetained(boolean) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
- 
Whether the published message should be retained by the messaging engine.
 
- withRetryConfiguration(SnsIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
 
- 
Provides configuration to retry a failed request to publish a message to SNS.
 
- withRetryConfiguration(ElasticsearchIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provides configuration to retry a failed batch call to Elasticsearch.
 
- withRetryConfiguration(SolrIO.RetryConfiguration) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
 
- 
Provides configuration to retry a failed batch call to Solr.
 
- withRetryStrategy(JdbcIO.RetryStrategy) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
- 
 
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Sets name of the root element of the XML document.
 
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
- 
 
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
-  
 
- withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
- 
Sets the enclosing root element for the generated XML files.
 
- withRowFilter(RowFilter) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
Returns a new 
BigtableIO.Read that will filter the rows read from Cloud Bigtable
 using the given row filter.
 
 
- withRowMapper(JdbcIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- withRowMapper(JdbcIO.RowMapper<OutputT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
 
-  
 
- withRowSchema(Schema) - Method in class org.apache.beam.sdk.transforms.Create.Values
 
- 
Returns a 
Create.Values PTransform like this one that uses the given 
Schema
 to represent objects.
 
 
- withRunnerDeterminedSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
 
- 
 
- withScan(Scan) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
- 
Filters the rows read from HBase using the given scan.
 
- withSchema(Schema) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Sets the output schema.
 
- withSchema(Schema) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withSchema(String) - Method in class org.apache.beam.sdk.io.AvroSource
 
- 
Reads files containing records that conform to the given schema.
 
- withSchema(Schema) - Method in class org.apache.beam.sdk.io.AvroSource
 
- 
 
- withSchema(Class<X>) - Method in class org.apache.beam.sdk.io.AvroSource
 
- 
Reads files containing records of the given class.
 
- withSchema(TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Uses the specified schema for rows to be written.
 
- withSchema(ValueProvider<TableSchema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- withSchema(Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
 
- 
 
- withSchema(Schema, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
 
- 
Returns a 
Create.Values PTransform like this one that uses the given 
Schema
 to represent objects.
 
 
- withSchema(Schema) - Static method in class org.apache.beam.sdk.transforms.JsonToRow
 
-  
 
- withSchema(Schema) - Static method in class org.apache.beam.sdk.values.Row
 
- 
 
- withSchemaFromView(PCollectionView<Map<String, String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Allows the schemas for each table to be computed within the pipeline itself.
 
- withScrollKeepalive(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
- 
Provide a scroll keepalive.
 
- withShard(int) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
-  
 
- withSharding(PTransform<PCollection<UserT>, PCollectionView<Integer>>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies a 
PTransform to use for computing the desired number of shards in each
 window.
 
 
- withSharding(PTransform<PCollection<UserT>, PCollectionView<Integer>>) - Method in class org.apache.beam.sdk.io.WriteFiles
 
- 
 
- withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
 
- withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
Uses the given shard name template.
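In shard name templates such as Beam's default "-SSSSS-of-NNNNN", runs of 'S' stand for the zero-padded shard index and runs of 'N' for the zero-padded shard count. A hypothetical expander sketching that convention (an illustration of the naming scheme, not the Beam implementation):

```java
public class ShardTemplate {
  // Expands a shard name template: each run of 'S' becomes the zero-padded
  // shard index, each run of 'N' the zero-padded shard count; other
  // characters are copied through unchanged.
  static String expand(String template, int shard, int numShards) {
    StringBuilder out = new StringBuilder();
    int i = 0;
    while (i < template.length()) {
      char c = template.charAt(i);
      if (c == 'S' || c == 'N') {
        int j = i;
        while (j < template.length() && template.charAt(j) == c) {
          j++;
        }
        // Pad to the width of the run, e.g. "SSSSS" -> "%05d".
        out.append(String.format("%0" + (j - i) + "d", c == 'S' ? shard : numShards));
        i = j;
      } else {
        out.append(c);
        i++;
      }
    }
    return out.toString();
  }

  public static void main(String[] args) {
    System.out.println(expand("-SSSSS-of-NNNNN", 2, 10)); // -00002-of-00010
  }
}
```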
 
- withShardTemplate(String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
- 
Sets the shard template.
 
- withSideInputs(List<PCollectionView<?>>) - Method in class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
 
- withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
- 
 
- withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
-  
 
- withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
-  
 
- withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
- 
 
- withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
- 
 
- withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
- 
 
- withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
- 
 
- withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
- 
 
- withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
- 
 
- withSideInputStream(TranslationContext, JavaStream<WindowedValue<InputT>>, Map<String, PCollectionView<?>>) - Static method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils
 
-  
 
- withSingletonValues() - Method in class org.apache.beam.sdk.transforms.View.AsMap
 
- 
 
- withSkew(Duration) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
- 
Specifies the Cloud Spanner configuration.
 
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
- 
Specifies the Cloud Spanner configuration.
 
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
- 
Specifies the Cloud Spanner configuration.
 
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
- 
Specifies the Cloud Spanner configuration.
 
- withSparsePrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
 
- 
Sets the sparse representation's precision sp.
 
- withSparsePrecision(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.PerKeyDistinct
 
- 
Sets the sparse representation's precision sp.
 
- withSparseRepresentation(int) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
 
- 
 
- withStagingPathSupplier(Callable<Path>) - Method in class org.apache.beam.runners.direct.portable.job.ReferenceRunnerJobService
 
-  
 
- withStartingDay(int, int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- withStartingMonth(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- withStartingYear(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- withStartKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
- 
Returns new 
ByteKeyRange like this one, but with the specified start key.
 
 
- withStartReadTime(Instant) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Uses the given timestamp to set the start offset.
 
- withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
-  
 
- withStatementPreparator(JdbcIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
-  
 
- withStreamName(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specify the Kinesis stream name to read from.
 
- withStreamName(String) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
- 
Specify the Kinesis stream name to write to; this name is required.
 
- withSuffix(String) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Configures the filename suffix for written files.
 
- withSuffix(String) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withSuffix(String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
- 
Sets the suffix.
 
- withSuffix(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies a common suffix to use for all generated filenames, if using the default file
 naming.
 
- withSuffix(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withSuffix(String) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Configures the filename suffix for written files.
 
- withSuffix(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withSuffix(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
- 
Writes to the file(s) with the given filename suffix.
 
- withTable(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the Cassandra table to read data from.
 
- withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
-  
 
- withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withTable(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
 
- 
Sets the table name to read from.
 
- withTable(String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
- 
Sets the table name to write to; the table must exist beforehand.
 
- withTableDescription(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Specifies the table description.
 
- withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
- 
 
- withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
- 
 
- withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
- 
Reads from the specified table.
 
- withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
- 
Writes to the specified table.
 
- withTableProvider(TableProvider) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
 
-  
 
- withTableReference(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
-  
 
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Set the base directory used to generate temporary files.
 
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Set the base directory used to generate temporary files.
 
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withTempDirectory(String) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
Specifies a directory into which all temporary files will be placed.
 
- withTempDirectory(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Write
 
- 
 
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Set the base directory used to generate temporary files.
 
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Set the base directory used to generate temporary files.
 
- withTempDirectory(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withTempDirectory(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
- 
Uses the new template-compatible source implementation.
 
- withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
-  
 
- withTempLocation(String) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
 
- 
Sets the path to a temporary location where the sorter writes intermediate files.
 
- withTerminationPerInput(Watch.Growth.TerminationCondition<InputT, ?>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
 
- 
 
- withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
-  
 
- withTikaConfigPath(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
- 
 
- withTikaConfigPath(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.ParseFiles
 
- 
Like withTikaConfigPath(String).
 
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
 
- 
Define the Redis connection timeout.
 
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
 
-  
 
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadAll
 
-  
 
- withTimeout(int) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
 
-  
 
- withTimePartitioning(TimePartitioning) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- withTimePartitioning(ValueProvider<TimePartitioning>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
 
- withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
-  
 
- withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
- 
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message
 attributes, specifies the name of the attribute that contains the timestamp.
 
- withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
- 
Writes to Pub/Sub and adds each record's timestamp to the published messages in an attribute
 with the specified name.
 
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
-  
 
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
-  
 
- withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
- 
 
- withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
-  
 
- withTimestampFn(SerializableFunction<Long, Instant>) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
- 
Specifies the function to use to assign timestamps to the elements.
 
- withTimestampFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withTimestampFn(SerializableFunction<KafkaRecord<K, V>, Instant>) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
 
- 
Used by the Read transform to support old timestamp functions API.
 
- withTimestampFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withTimestampPolicyFactory(TimestampPolicyFactory<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- WithTimestamps<T> - Class in org.apache.beam.sdk.transforms
 
- 
 
- withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
-  
 
- withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
-  
 
- withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
-  
 
- withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
- 
Specify the JMS topic destination name from which to receive messages.
 
- withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
- 
Specify the JMS topic destination name to which to send messages.
 
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets the topic to read from.
 
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
Sets the Kafka topic to write to.
 
- withTopicName(String) - Method in class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
 
- 
Specify the SNS topic to be used for writing; this name is mandatory.
 
- withTopicPartitions(List<TopicPartition>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets a list of partitions to read from.
 
- withTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets a list of topics to read from.
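The `KafkaIO.Read` builder methods indexed above (`withTopic`, `withTopics`, `withTopicPartitions`, `withValueDeserializer`) compose into a single read chain. A hedged sketch, assuming Beam's Kafka IO module and the Kafka client library are available (broker addresses and topic name are illustrative):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaReadSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    // Bootstrap servers and topic name are hypothetical.
    PCollection<KV<Long, String>> records =
        p.apply(KafkaIO.<Long, String>read()
            .withBootstrapServers("broker1:9092,broker2:9092")
            .withTopic("events")  // or withTopics(...) / withTopicPartitions(...)
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            // Drop Kafka metadata, yielding plain key/value pairs.
            .withoutMetadata());

    p.run();
  }
}
```

`withTopic`, `withTopics`, and `withTopicPartitions` are mutually exclusive ways of naming the source; the deserializer classes determine the element types of the resulting collection.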
 
- withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
-  
 
- withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
-  
 
- withTrigger(Trigger) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
- 
Returns a WindowingStrategy identical to this but with the trigger set to the given Trigger.
 
 
- withTriggeringFrequency(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Choose the frequency at which file writes are triggered.
 
- withType(Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Field
 
- 
 
- withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
 
- 
Returns a 
Create.TimestampedValues PTransform like this one that uses the given
 
TypeDescriptor<T> to determine the 
Coder to use to decode each of the objects
 into a value of type 
T.
 
 
- withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
 
- 
Returns a 
Create.Values PTransform like this one that uses the given 
TypeDescriptor<T> to determine the 
Coder to use to decode each of the objects into a
 value of type 
T.
 
 
- withTypeFn(ElasticsearchIO.Write.FieldValueExtractFn) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provide a function to extract the target type from the document, allowing for dynamic document
 routing.
 
- withUpToDateThreshold(Duration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
- 
Specifies how late records consumed by this source may be and still be considered on time.
 
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
-  
 
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
- 
Define the location of the MongoDB instances using a URI.
 
- withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
- 
Define the location of the MongoDB instances using a URI.
 
- withUsePartialUpdate(boolean) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
- 
Provide an instruction to control whether partial updates or inserts (default) are issued to
 Elasticsearch.
 
- withUsername(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
 
- 
Specify the username for authentication.
 
- withUsername(String) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
- 
Specify the username used for authentication.
 
- withUsername(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
- 
If Elasticsearch authentication is enabled, provide the username.
 
- withUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- withUsername(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
-  
 
- withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
- 
Define the username to connect to the JMS broker (authenticated).
 
- withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
- 
Define the username to connect to the JMS broker (authenticated).
 
- withUsername(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
-  
 
- withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
 
- 
After creation we will validate that 
PipelineOptions conforms to all the validation
 criteria from 
<T>.
 
 
- withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
- 
After creation we will validate that <T> conforms to all the validation criteria.
 
- withValidationEventHandler(ValidationEventHandler) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
- 
Sets the ValidationEventHandler to use with JAXB.
 
- withValidationEventHandler(ValidationEventHandler) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
 
- 
 
- withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
 
- withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
Sets a Kafka 
Deserializer for interpreting value bytes read from Kafka along with a
 
Coder for helping the Beam runner materialize value objects at runtime if necessary.
 
 
- withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
- 
Sets a Serializer for serializing value to bytes.
 
- withValueTranslation(SimpleFunction<?, V>) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
- 
Transforms the values read from the source using the given value translation function.
 
- withWatermark(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
 
- 
Sets the watermark - an approximate lower bound on timestamps of future new outputs from
 this 
Watch.Growth.PollFn.
 
 
- withWatermarkFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withWatermarkFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
- 
 
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
 
- 
Preserves windowing of input elements and writes them to files based on the element's window.
 
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
- 
 
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
 
- 
Specify that writes are windowed.
 
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
Preserves windowing of input elements and writes them to files based on the element's window.
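In practice this option pairs with an upstream `Window` transform, and unbounded inputs also need an explicit shard count. A sketch under those assumptions (window size and output prefix are illustrative):

```java
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

public class WindowedWriteSketch {
  // 'lines' stands in for an unbounded PCollection<String> from a streaming source.
  static void writeWindowed(PCollection<String> lines) {
    lines
        // Group elements into five-minute windows before writing.
        .apply(Window.into(FixedWindows.of(Duration.standardMinutes(5))))
        // One set of output files per window; unbounded collections require
        // a fixed number of shards.
        .apply(TextIO.write()
            .to("/tmp/windowed")
            .withWindowedWrites()
            .withNumShards(3));
  }
}
```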
 
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
 
- withWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
 
- 
Returns a new WriteFiles that preserves windowing on its input.
 
 
- withWindowFn(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
- 
Returns a WindowingStrategy identical to this but with the window function set to the given WindowFn.
 
 
- withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
 
- 
 
- withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
- 
See TypedWrite#withWritableByteChannelFactory(WritableByteChannelFactory).
 
- withWriteDisposition(BigQueryIO.Write.WriteDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
- 
Specifies what to do with existing data in the table, in case the table already exists.
 
- WorkerHarnessContainerImageFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory
 
-  
 
- WorkerLogLevelOverrides() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
 
- 
Deprecated.
  
- wrap(Throwable) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
-  
 
- wrap(String) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
-  
 
- wrapping(StreamObserver<V>) - Static method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
 
- 
Create a new 
SynchronizedStreamObserver which will delegate all calls to the underlying
 
StreamObserver, synchronizing access to that observer.
 
 
- wrapProcessContext(DoFn<InputT, ?>.ProcessContext) - Static method in class org.apache.beam.sdk.transforms.Contextful.Fn.Context
 
- 
 
- WritableCoder<T extends org.apache.hadoop.io.Writable> - Class in org.apache.beam.sdk.io.hadoop
 
- 
A 
WritableCoder is a 
Coder for a Java class that implements 
Writable.
 
 
- WritableCoder(Class<T>) - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder
 
-  
 
- WritableCoder.WritableCoderProviderRegistrar - Class in org.apache.beam.sdk.io.hadoop
 
- 
 
- WritableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
 
-  
 
- write(int) - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
-  
 
- write(byte[], int, int) - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.amqp.AmqpIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpIO.Write
 
-  
 
- write(ElementT) - Method in class org.apache.beam.sdk.io.AvroIO.Sink
 
-  
 
- write(Class<T>) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
Writes a 
PCollection to an Avro file (or multiple Avro files matching a sharding
 pattern).
 
 
- write() - Static method in class org.apache.beam.sdk.io.aws.sns.SnsIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.aws.sns.SnsIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.aws.sqs.SqsIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.aws.sqs.SqsIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
 
-  
 
- write(T) - Method in interface org.apache.beam.sdk.io.cassandra.CassandraService.Writer
 
- 
This method should be synchronous.
 
- write(T) - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl.WriterImpl
 
- 
Write the entity to the Cassandra instance, using Mapper obtained with the MappingManager.
 
- write() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
-  
 
- write(OutputT) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
Called for each value in the bundle.
 
- write(ElementT) - Method in interface org.apache.beam.sdk.io.FileIO.Sink
 
- 
Appends a single element to the file.
 
- write() - Static method in class org.apache.beam.sdk.io.FileIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.FileIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
-  
 
- write() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
 
- 
 
- write() - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO
 
- 
Write data to Hive.
 
- Write() - Constructor for class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
 
- 
Write data to a JDBC datasource.
 
- Write() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.kinesis.KinesisIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
 
- 
Write data to GridFS.
 
- write(MongoDbGridFSIO.WriteFn<T>) - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
-  
 
- write(T, OutputStream) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.WriteFn
 
- 
Output the object to the given OutputStream.
 
- write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
 
- 
Write data to MongoDB.
 
- Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
-  
 
- write(GenericRecord) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.redis.RedisIO
 
- 
Write data to a Redis server.
 
- Write() - Constructor for class org.apache.beam.sdk.io.redis.RedisIO.Write
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.solr.SolrIO
 
-  
 
- Write() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.Write
 
-  
 
- write(String) - Method in class org.apache.beam.sdk.io.TextIO.Sink
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.TextIO
 
- 
A 
PTransform that writes a 
PCollection to a text file (or multiple text files
 matching a sharding pattern), with each element of the input collection encoded into its own
 line.
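A minimal sketch of this transform, assuming the Beam Java core SDK is on the classpath (the output prefix is illustrative; a sharding suffix such as `-00000-of-00003` is appended to each file name by default):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.transforms.Create;

public class TextWriteSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    // Each element becomes one line in the output files.
    p.apply(Create.of("first line", "second line", "third line"))
        .apply(TextIO.write().to("/tmp/lines"));

    p.run().waitUntilFinish();
  }
}
```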
 
 
- write(byte[]) - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.TFRecordIO
 
- 
A 
PTransform that writes a 
 PCollection to a TFRecord file (or multiple TFRecord
 files matching a sharding pattern), with each element of the input collection encoded into its
 own record.
 
 
- Write() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Write
 
-  
 
- write(T) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
 
-  
 
- write() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
 
- 
 
- Write() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Write
 
-  
 
- write(T) - Method in interface org.apache.beam.sdk.state.ValueState
 
- 
Set the value.
 
- writeAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that writes binary encoded Avro messages of a given type to a
 Google Cloud Pub/Sub stream.
 
 
- writeCompressed(WritableByteChannel) - Method in enum org.apache.beam.sdk.io.Compression
 
-  
 
- writeCustomType() - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
A 
PTransform that writes a 
PCollection to an Avro file (or multiple Avro files
 matching a sharding pattern), with each element of the input collection encoded into its own
 record of type OutputT.
 
 
- writeCustomType() - Static method in class org.apache.beam.sdk.io.TextIO
 
- 
A 
PTransform that writes a 
PCollection to a text file (or multiple text files
 matching a sharding pattern), with each element of the input collection encoded into its own
 line.
 
 
- writeCustomTypeToGenericRecords() - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
 
- writeDynamic() - Static method in class org.apache.beam.sdk.io.FileIO
 
- 
Writes elements to files using a 
FileIO.Sink and grouping the elements using "dynamic
 destinations".
 
 
- writeExternal(ObjectOutput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
-  
 
- WriteFiles<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
 
- 
 
- WriteFiles() - Constructor for class org.apache.beam.sdk.io.WriteFiles
 
-  
 
- WriteFilesResult<DestinationT> - Class in org.apache.beam.sdk.io
 
- 
 
- writeFooter() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
Writes footer at the end of output files.
 
- writeGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
Writes Avro records of the specified schema.
 
- writeGenericRecords(String) - Static method in class org.apache.beam.sdk.io.AvroIO
 
- 
Writes Avro records of the specified schema.
 
- WriteGrouped(SpannerIO.Write) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
 
-  
 
- writeHeader() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
Writes header at the beginning of output files.
 
- writeMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that writes to a Google Cloud Pub/Sub stream.
 
 
- writeMetrics(MetricQueryResults) - Method in interface org.apache.beam.sdk.metrics.MetricsSink
 
-  
 
- WriteOperation(FileBasedSink<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
Constructs a WriteOperation using the default strategy for generating a temporary directory
 from the base output filename.
 
- WriteOperation(FileBasedSink<?, DestinationT, OutputT>, ResourceId) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
- 
Create a new WriteOperation.
 
- writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that writes binary encoded protobuf messages of a given type to a
 Google Cloud Pub/Sub stream.
 
 
- Writer(FileBasedSink.WriteOperation<DestinationT, OutputT>, String) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.Writer
 
- 
 
- WriteResult - Class in org.apache.beam.sdk.io.gcp.bigquery
 
- 
 
- writeStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
- 
Returns a 
PTransform that writes UTF-8 encoded strings to a Google Cloud Pub/Sub
 stream.
 
 
- writeTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
 
- 
 
- writeTo(OutputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
- 
Writes length bytes starting at offset from the backing data store to the
 specified output stream.
 
- writeToPort(String, BeamFnApi.RemoteGrpcPort) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
 
- 
Create a 
RemoteGrpcPortWrite which writes the 
RunnerApi.PCollection with the provided
 Pipeline id to the provided 
BeamFnApi.RemoteGrpcPort.