A

absolute(String, String...) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Construct a path from an absolute component path hierarchy.
AccumulatingCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
accumulatingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
Returns a new Window PTransform that uses the registered WindowFn and Triggering behavior, and that accumulates elements in a pane after they are triggered.
ACCUMULATOR_NAME - Static variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
 
AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
 
AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
 
add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
add(int, GlobalWatermarkHolder.SparkWatermarks) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
add(InputT) - Method in interface org.apache.beam.sdk.state.GroupingState
Add a value to the buffer.
add(Long) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
Adds a value to the heap, returning whether the value is (large enough to be) in the heap.
add(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register the given display item.
addAccum(AccumT) - Method in interface org.apache.beam.sdk.state.CombiningState
Add an accumulator to this state cell.
addAccumulator(NamedAggregators, NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.AggAccumParam
 
addAll(Map<Integer, Queue<GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
addBoolean(Map<String, Object>, String, boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addDouble(Map<String, Object>, String, Double) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addElements(T, T...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Adds the specified elements to the source with timestamp equal to the current watermark.
addElements(TimestampedValue<T>, TimestampedValue<T>...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Adds the specified elements to the source with the provided timestamps.
addIfAbsent(T) - Method in interface org.apache.beam.sdk.state.SetState
Ensures a value is a member of the set, returning true if it was added and false otherwise.
addIfNotDefault(DisplayData.ItemSpec<T>, T) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register the given display item if the value is different than the specified default.
addIfNotNull(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register the given display item if the value is not null.
addInPlace(NamedAggregators, NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.AggAccumParam
 
addInput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique, T) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
addInput(InputT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
Adds the given input value to this accumulator, modifying this accumulator.
addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
addInput(double[], Double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
addInput(Combine.Holder<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
addInput(int[], Integer) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
addInput(long[], Long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Adds the given input value to the given accumulator, returning the new accumulator value.
addInput(List<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
addInput(Object[], DataT) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
addInput(Object[], DataT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
addInput(AccumT, InputT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Adds the given input value to the given accumulator, returning the new accumulator value.
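
A minimal sketch of the Combine.CombineFn contract summarized by the addInput entries above (createAccumulator, addInput, mergeAccumulators, extractOutput); the averaging logic and the Accum class are illustrative, not taken from the index.

    import java.io.Serializable;
    import org.apache.beam.sdk.transforms.Combine;

    public class AverageFn extends Combine.CombineFn<Integer, AverageFn.Accum, Double> {
      // Mutable accumulator holding a running sum and count; Serializable so the
      // default SerializableCoder can encode it in this sketch.
      public static class Accum implements Serializable {
        long sum = 0;
        long count = 0;
      }

      @Override
      public Accum createAccumulator() {
        return new Accum();
      }

      @Override
      public Accum addInput(Accum accum, Integer input) {
        // Adds the given input value to the given accumulator, returning the new accumulator value.
        accum.sum += input;
        accum.count++;
        return accum;
      }

      @Override
      public Accum mergeAccumulators(Iterable<Accum> accums) {
        Accum merged = createAccumulator();
        for (Accum a : accums) {
          merged.sum += a.sum;
          merged.count += a.count;
        }
        return merged;
      }

      @Override
      public Double extractOutput(Accum accum) {
        return accum.count == 0 ? 0.0 : ((double) accum.sum) / accum.count;
      }
    }
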
addInput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>, T) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
addList(Map<String, Object>, String, List<? extends Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addList(Map<String, Object>, String, T[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addLong(Map<String, Object>, String, long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addLongs(Map<String, Object>, String, long...) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addMessage(Message) - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
addNameFilter(MetricNameFilter) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
Add a MetricNameFilter.
addNull(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addOverrideForClass(Class<?>, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Overrides the default log level for the passed in class.
addOverrideForName(String, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Overrides the default log level for the passed in name.
addOverrideForPackage(Package, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Overrides the default log level for the passed in package.
addProperties(Configuration, Properties) - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
Transfer the properties to the configuration object.
addStep(String) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
Add a step filter.
addString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addStringList(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
addTraceFor(AbstractGoogleClient, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
Creates a GoogleApiDebugOptions.GoogleApiTracer that sets the trace destination on all calls that match the given client type.
addTraceFor(AbstractGoogleClientRequest<?>, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
Creates a GoogleApiDebugOptions.GoogleApiTracer that sets the trace destination on all calls that match the given request type.
advance() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
advance(JavaSparkContext) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
Advances the watermarks to the next-in-line watermarks.
advance() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
advance() - Method in class org.apache.beam.sdk.io.Source.Reader
Advances the reader to the next valid record.
advance() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Advances the reader to the next valid record.
advanceImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
advanceImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Advances to the next record and returns true, or returns false if there is no next record.
advanceNextBatchWatermarkToInfinity() - Method in class org.apache.beam.runners.spark.io.CreateStream
Advances the watermark in the next batch to the end-of-time.
advanceProcessingTime(Duration) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Advance the processing time by the specified amount.
advanceWatermark() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
Advances the watermark.
advanceWatermarkForNextBatch(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
Advances the watermark in the next batch.
advanceWatermarkTo(Instant) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Advance the watermark of this source to the specified instant.
advanceWatermarkToInfinity() - Method in class org.apache.beam.sdk.testing.TestStream.Builder
Advance the watermark to infinity, completing this TestStream.
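
A minimal sketch combining the TestStream.Builder methods above (addElements, advanceProcessingTime, advanceWatermarkTo, advanceWatermarkToInfinity); the element values, timestamps, and variable name are illustrative.

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    TestStream<String> events =
        TestStream.create(StringUtf8Coder.of())
            .addElements("a", "b")                                     // timestamped at the current watermark
            .advanceProcessingTime(Duration.standardMinutes(1))
            .advanceWatermarkTo(new Instant(0).plus(Duration.standardMinutes(5)))
            .addElements(TimestampedValue.of("late", new Instant(0)))  // explicit timestamp
            .advanceWatermarkToInfinity();                             // completes the TestStream
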
AfterAll - Class in org.apache.beam.sdk.transforms.windowing
A composite Trigger that fires when all of its sub-triggers are ready.
AfterEach - Class in org.apache.beam.sdk.transforms.windowing
A composite Trigger that executes its sub-triggers in order.
AfterFirst - Class in org.apache.beam.sdk.transforms.windowing
A composite Trigger that fires once after at least one of its sub-triggers has fired.
AfterPane - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that fires at some point after a specified number of input elements have arrived.
AfterProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that fires at a specified point in processing time, relative to when input first arrives.
AfterSynchronizedProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
FOR INTERNAL USE ONLY.
AfterWatermark - Class in org.apache.beam.sdk.transforms.windowing
AfterWatermark triggers fire based on progress of the system watermark.
AfterWatermark.AfterWatermarkEarlyAndLate - Class in org.apache.beam.sdk.transforms.windowing
 
AfterWatermark.FromEndOfWindow - Class in org.apache.beam.sdk.transforms.windowing
A watermark trigger targeted relative to the end of the window.
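
A minimal sketch wiring the trigger classes above into a Window transform, together with accumulatingFiredPanes from this index; 'input', the element types, and the durations are illustrative.

    PCollection<KV<String, Long>> windowed =
        input.apply(
            Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardMinutes(10)))
                .triggering(
                    AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(
                            AfterProcessingTime.pastFirstElementInPane()
                                .plusDelayOf(Duration.standardMinutes(1)))
                        .withLateFirings(AfterPane.elementCountAtLeast(1)))
                .withAllowedLateness(Duration.standardHours(1))
                .accumulatingFiredPanes());
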
AggAccumParam - Class in org.apache.beam.runners.spark.aggregators
Aggregator accumulator param.
AggAccumParam() - Constructor for class org.apache.beam.runners.spark.aggregators.AggAccumParam
 
AggregatorMetric - Class in org.apache.beam.runners.spark.metrics
An adapter between the NamedAggregators and Codahale's Metric interface.
AggregatorMetricSource - Class in org.apache.beam.runners.spark.metrics
A Spark Source that is tailored to expose an AggregatorMetric, wrapping an underlying NamedAggregators instance.
AggregatorMetricSource(String, NamedAggregators) - Constructor for class org.apache.beam.runners.spark.metrics.AggregatorMetricSource
 
AggregatorsAccumulator - Class in org.apache.beam.runners.spark.aggregators
For resilience, Accumulators are required to be wrapped in a Singleton.
AggregatorsAccumulator() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
 
AggregatorsAccumulator.AccumulatorCheckpointingSparkListener - Class in org.apache.beam.runners.spark.aggregators
Spark Listener which checkpoints NamedAggregators values for fault-tolerance.
align(Duration) - Method in interface org.apache.beam.sdk.state.Timer
Aligns the target timestamp used by Timer.setRelative() to the next boundary of period.
alignedTo(Duration, Instant) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Aligns timestamps to the smallest multiple of period since the offset greater than the timestamp.
alignedTo(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Aligns the time to be the smallest multiple of period greater than the epoch boundary (aka new Instant(0)).
alignTo(Duration, Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
For internal use only; no backwards-compatibility guarantees.
alignTo(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
For internal use only; no backwards-compatibility guarantees.
AlignTo() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
ALL_CONTEXTS - Static variable in class org.apache.beam.sdk.testing.CoderProperties
All the contexts, for use in test cases.
ALL_KEYS - Static variable in class org.apache.beam.sdk.io.range.ByteKeyRange
The range of all keys, with empty start and end keys.
allMatches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.AllMatches PTransform that checks if the entire line matches the Regex.
allMatches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.AllMatches PTransform that checks if the entire line matches the Regex.
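
A minimal sketch of applying Regex.allMatches to a PCollection of lines; the pattern and variable names are illustrative, and the output is assumed to be a PCollection<List<String>> of regex groups.

    PCollection<List<String>> groups =
        lines.apply(Regex.allMatches("(\\w+):(\\d+)"));
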
AllMatches(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.AllMatches
 
ALLOWED_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Whether this reader should allow dynamic splitting of the offset ranges.
AlwaysPassMatcher() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
AlwaysPassMatcherFactory() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
 
and(TupleTag<V>, List<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns a new CoGbkResult based on this, with the given tag and given data added to it.
and(TupleTag<V>, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns a new KeyedPCollectionTuple<K> that is the same as this, appended with the given PCollection.
and(PCollection.IsBounded) - Method in enum org.apache.beam.sdk.values.PCollection.IsBounded
Returns the composed IsBounded property.
and(PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionList
Returns a new PCollectionList that has all the PCollections of this PCollectionList plus the given PCollection appended to the end.
and(Iterable<PCollection<T>>) - Method in class org.apache.beam.sdk.values.PCollectionList
Returns a new PCollectionList that has all the PCollections of this PCollectionList plus the given PCollections appended to the end, in order.
and(TupleTag<T>, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns a new PCollectionTuple that has each PCollection and TupleTag of this PCollectionTuple plus the given PCollection associated with the given TupleTag.
and(TupleTag<?>) - Method in class org.apache.beam.sdk.values.TupleTagList
Returns a new TupleTagList that has all the TupleTags of this TupleTagList plus the given TupleTag appended to the end.
and(List<TupleTag<?>>) - Method in class org.apache.beam.sdk.values.TupleTagList
Returns a new TupleTagList that has all the TupleTags of this TupleTagList plus the given TupleTags appended to the end, in order.
any(long) - Static method in class org.apache.beam.sdk.transforms.Sample
Sample.any(long) takes a PCollection<T> and a limit, and produces a new PCollection<T> containing up to limit elements of the input PCollection.
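
A minimal sketch of Sample.any; the element type, limit, and variable names are illustrative.

    PCollection<String> sampled = lines.apply(Sample.<String>any(10));
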
ApexPipelineOptions - Interface in org.apache.beam.runners.apex
Options that configure the Apex pipeline.
ApexRunner - Class in org.apache.beam.runners.apex
A PipelineRunner that translates the pipeline to an Apex DAG and executes it on an Apex cluster.
ApexRunner(ApexPipelineOptions) - Constructor for class org.apache.beam.runners.apex.ApexRunner
 
ApexRunner.CreateApexPCollectionView<ElemT,ViewT> - Class in org.apache.beam.runners.apex
Creates a primitive PCollectionView.
ApexRunnerRegistrar - Class in org.apache.beam.runners.apex
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the ApexRunner.
ApexRunnerRegistrar.Options - Class in org.apache.beam.runners.apex
Registers the ApexPipelineOptions.
ApexRunnerRegistrar.Runner - Class in org.apache.beam.runners.apex
Registers the ApexRunner.
ApexRunnerResult - Class in org.apache.beam.runners.apex
Result of executing a Pipeline with Apex in embedded mode.
ApexRunnerResult(DAG, Launcher.AppHandle) - Constructor for class org.apache.beam.runners.apex.ApexRunnerResult
 
ApexYarnLauncher - Class in org.apache.beam.runners.apex
Proxy to launch the YARN application through the hadoop script to run in the pre-configured environment (class path, configuration, native libraries, etc.).
ApexYarnLauncher() - Constructor for class org.apache.beam.runners.apex.ApexYarnLauncher
 
ApexYarnLauncher.LaunchParams - Class in org.apache.beam.runners.apex
Launch parameters that will be serialized and passed to the child process.
ApexYarnLauncher.ProcessWatcher - Class in org.apache.beam.runners.apex
Starts a command and waits for it to complete.
APPEND_TRAILING_NEWLINES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
applicableTo(PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
 
ApplicationNameOptions - Interface in org.apache.beam.sdk.options
Options that allow setting the application name.
apply(InputT) - Method in interface org.apache.beam.sdk.coders.DelegateCoder.CodingFunction
 
apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
Like Pipeline.apply(String, PTransform) but the transform node in the Pipeline graph will be named according to PTransform.getName().
apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
Adds a root PTransform, such as Read or Create, to this Pipeline.
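
A minimal sketch of the two Pipeline.apply overloads above, using Create (a root transform listed in this index); 'options' is assumed to be an existing PipelineOptions instance.

    Pipeline p = Pipeline.create(options);
    PCollection<String> unnamed = p.apply(Create.of("hello", "world"));        // node named from PTransform.getName()
    PCollection<String> named = p.apply("CreateGreetings", Create.of("hi"));   // node named explicitly
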
apply(T) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert.MatcherCheckerFn
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.testing.TestPipeline
 
apply(double, double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
Applies the binary operation to the two operands, returning the result.
apply(V, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
Applies the binary operation to the two operands, returning the result.
apply(int, int) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
Applies the binary operation to the two operands, returning the result.
apply(long, long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
Applies the binary operation to the two operands, returning the result.
apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Applies this CombineFn to a collection of input values to produce a combined output value.
apply(Iterable<? extends InputT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Applies this CombineFnWithContext to a collection of input values to produce a combined output value.
apply(PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Like KeyedPCollectionTuple.apply(String, PTransform) but defaulting to the name provided by the PTransform.
apply(String, PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Applies the given PTransform to this input KeyedPCollectionTuple and returns its OutputT.
apply(InputT) - Method in interface org.apache.beam.sdk.transforms.SerializableFunction
Returns the result of invoking this function on the given input.
apply(InputT) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
 
apply(PrimitiveViewT) - Method in class org.apache.beam.sdk.transforms.ViewFn
A function to adapt a primitive view type to a desired view type.
apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
Like PBegin.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
Applies the given PTransform to this PBegin, using name to identify this specific application of the transform.
apply(PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
Like PCollection.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
Applies the given PTransform to this input PCollection, using name to identify this specific application of the transform.
apply(PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
Like PCollectionList.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
Applies the given PTransform to this input PCollectionList, using name to identify this specific application of the transform.
apply(PTransform<PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Like PCollectionTuple.apply(String, PTransform) but defaulting to the name of the PTransform.
apply(String, PTransform<PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Applies the given PTransform to this input PCollectionTuple, using name to identify this specific application of the transform.
apply(Iterable<WindowedValue<T>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
Deprecated.
 
apply(Iterable<WindowedValue<T>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
apply(Iterable<WindowedValue<KV<K, V>>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
Deprecated.
Input iterable must actually be Iterable<WindowedValue<KV<K, V>>>.
apply(Iterable<WindowedValue<KV<K, V>>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
Deprecated.
 
apply(Iterable<WindowedValue<T>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
 
applyTransform(InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
applyTransform(String, InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
ApproximateQuantiles - Class in org.apache.beam.sdk.transforms
PTransforms for getting an idea of a PCollection's data distribution using approximate N-tiles (e.g. quartiles, percentiles), either globally or per-key.
ApproximateQuantiles.ApproximateQuantilesCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
The ApproximateQuantilesCombineFn combiner gives an idea of the distribution of a collection of values using approximate N-tiles.
ApproximateUnique - Class in org.apache.beam.sdk.transforms
PTransforms for estimating the number of distinct elements in a PCollection, or the number of distinct values associated with each key in a PCollection of KVs.
ApproximateUnique() - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique
 
ApproximateUnique.ApproximateUniqueCombineFn<T> - Class in org.apache.beam.sdk.transforms
CombineFn that computes an estimate of the number of distinct values that were combined.
ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique - Class in org.apache.beam.sdk.transforms
A heap utility class to efficiently track the largest added elements.
ApproximateUniqueCombineFn(long, Coder<T>) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
array() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns the backing array.
as(Class<T>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
Transforms this object into an object of type <T>, saving each property that has been manipulated.
as(Class<T>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Creates and returns an object that implements <T>.
as(Class<T>) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
Creates and returns an object that implements <T> using the values configured on this builder during construction.
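
A minimal sketch of the as(Class) methods above; MyOptions is a hypothetical options sub-interface and 'args' is assumed to be the program's command-line arguments.

    public interface MyOptions extends PipelineOptions {
      String getInputFile();
      void setInputFile(String value);
    }

    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
    // Any registered options interface can be viewed over the same underlying properties:
    ApplicationNameOptions appNameOptions = options.as(ApplicationNameOptions.class);
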
asCloudObject(Coder<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
Convert the provided Coder into a CloudObject.
asInputStream(int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns an InputStream wrapper which supplies the portion of this backing byte buffer starting at offset and up to length bytes.
asIterable() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsIterable transform that takes a PCollection as input and produces a PCollectionView mapping each window to an Iterable of the values in that window.
AsJsons<InputT> - Class in org.apache.beam.sdk.extensions.jackson
PTransform for serializing objects to JSON Strings.
asList() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsList transform that takes a PCollection and returns a PCollectionView mapping each window to a List containing all of the elements in the window.
asMap() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
asMap() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsMap transform that takes a PCollection<KV<K, V>> as input and produces a PCollectionView mapping each window to a Map<K, V>.
asMultimap() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsMultimap transform that takes a PCollection<KV<K, V>> as input and produces a PCollectionView mapping each window to its contents as a Map<K, Iterable<V>> for use as a side input.
asOutputStream() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns an output stream which writes to the backing buffer from the current position.
asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub API.
asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
ASSERTION_ERROR - Static variable in class org.apache.beam.runners.apex.ApexRunner
Holds the most recent assertion error raised while processing elements. TODO: this isn't thread-safe and may cause issues when tests run in parallel.
assertionError() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
assertSourcesEqualReferenceSource(BoundedSource<T>, List<? extends BoundedSource<T>>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Given a reference Source and a list of Sources, assert that the union of the records read from the list of sources is equal to the records read from the reference source.
assertSplitAtFractionBehavior(BoundedSource<T>, int, double, SourceTestUtils.ExpectedSplitOutcome, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Asserts that the source's reader either fails to splitAtFraction(fraction) after reading numItemsToReadBeforeSplit items, or succeeds in a way that is consistent according to SourceTestUtils.assertSplitAtFractionSucceedsAndConsistent(org.apache.beam.sdk.io.BoundedSource<T>, int, double, org.apache.beam.sdk.options.PipelineOptions).
assertSplitAtFractionExhaustive(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Asserts that for each possible start position, BoundedSource.BoundedReader.splitAtFraction at every interesting fraction (halfway between two fractions that differ by at least one item) can be called successfully and the results are consistent if a split succeeds.
assertSplitAtFractionFails(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Asserts that the source's reader fails to splitAtFraction(fraction) after reading numItemsToReadBeforeSplit items.
assertSplitAtFractionSucceedsAndConsistent(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Verifies some consistency properties of BoundedSource.BoundedReader.splitAtFraction on the given source.
assertUnstartedReaderReadsSameAsItsSource(BoundedSource.BoundedReader<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Assert that a Reader returns a Source that, when read from, produces the same records as the reader.
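
A minimal sketch of the SourceTestUtils assertions above inside a test method that declares throws Exception; 'source' (a BoundedSource) and 'options' are assumed to exist in the test.

    SourceTestUtils.assertUnstartedReaderReadsSameAsItsSource(
        source.createReader(options), options);
    SourceTestUtils.assertSplitAtFractionExhaustive(source, options);
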
assign(BoundedWindow, Instant) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
AssignContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
 
assignedWindows(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
Returns the single window to which elements with this timestamp belong.
assignWindows(WindowFn<Object, GlobalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
assignWindows(WindowFn<Object, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
assignWindows(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Given a timestamp and element, returns the set of windows into which it should be placed.
asSingleton() - Static method in class org.apache.beam.sdk.transforms.View
Returns a View.AsSingleton transform that takes a PCollection with a single value per window as input and produces a PCollectionView that returns the value in the main input window when read as a side input.
asSingletonView() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform that produces a PCollectionView whose elements are the result of combining elements per-window in the input PCollection.
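
A minimal sketch of creating a side-input view with View.asMap and reading it from a DoFn; 'keyedCounts', 'words', and the key/value types are illustrative.

    PCollectionView<Map<String, Integer>> countsView =
        keyedCounts.apply(View.<String, Integer>asMap());

    PCollection<String> annotated =
        words.apply(
            ParDo.of(
                    new DoFn<String, String>() {
                      @ProcessElement
                      public void processElement(ProcessContext c) {
                        // Read the side input for the element's window.
                        Map<String, Integer> counts = c.sideInput(countsView);
                        Integer n = counts.get(c.element());
                        c.output(c.element() + ":" + (n == null ? 0 : n));
                      }
                    })
                .withSideInputs(countsView));
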
asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Deprecated.
the v1beta1 API for Cloud Pub/Sub is deprecated.
asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Deprecated.
the v1beta1 API for Cloud Pub/Sub is deprecated.
asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Deprecated.
the v1beta2 API for Cloud Pub/Sub is deprecated.
asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Deprecated.
the v1beta2 API for Cloud Pub/Sub is deprecated.
atMinimumTimestamp(V) - Static method in class org.apache.beam.sdk.values.TimestampedValue
Returns a new TimestampedValue with the minimum timestamp.
AtomicCoder<T> - Class in org.apache.beam.sdk.coders
A Coder that has no component Coders or other configuration.
AtomicCoder() - Constructor for class org.apache.beam.sdk.coders.AtomicCoder
 
AtomicLongFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
 
attempted() - Method in interface org.apache.beam.sdk.metrics.MetricResult
Return the value of this metric across all attempts of executing all parts of the pipeline.
AvailableParallelismFactory() - Constructor for class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
 
AvroCoder<T> - Class in org.apache.beam.sdk.coders
A Coder using Avro binary format.
AvroCoder(Class<T>, Schema) - Constructor for class org.apache.beam.sdk.coders.AvroCoder
 
AvroIO - Class in org.apache.beam.sdk.io
PTransforms for reading and writing Avro files.
AvroIO.Read<T> - Class in org.apache.beam.sdk.io
AvroIO.Write<T> - Class in org.apache.beam.sdk.io
AvroReader(AvroSource<T>) - Constructor for class org.apache.beam.sdk.io.AvroSource.AvroReader
Reads Avro records of type T from the specified source.
AvroSource<T> - Class in org.apache.beam.sdk.io
Do not use in pipelines directly: most users should use AvroIO.Read.
AvroSource.AvroReader<T> - Class in org.apache.beam.sdk.io
A BlockBasedSource.BlockBasedReader for reading blocks from Avro files.
awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
 

B

BACKLOG_UNKNOWN - Static variable in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Constant representing an unknown amount of backlog.
backlogBytes() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source backlog in bytes.
backlogBytesOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source split backlog in bytes.
backlogElements() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source backlog in elements.
backlogElementsOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Gauge for source split backlog in elements.
bag() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a BagState, optimized for adding values frequently and occasionally retrieving all the values that have been added.
bag(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.bag(), but with an element coder explicitly supplied.
BagState<T> - Interface in org.apache.beam.sdk.state
A ReadableState cell containing a bag of values.
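
A minimal sketch of declaring a BagState with StateSpecs.bag and adding to it via GroupingState.add, as described above; the state id, element types, and output type are illustrative.

    new DoFn<KV<String, Integer>, Void>() {
      @StateId("buffer")
      private final StateSpec<BagState<Integer>> bufferSpec = StateSpecs.bag(VarIntCoder.of());

      @ProcessElement
      public void processElement(
          ProcessContext c, @StateId("buffer") BagState<Integer> buffer) {
        buffer.add(c.element().getValue());        // GroupingState.add(InputT)
        Iterable<Integer> soFar = buffer.read();   // everything added so far for this key
      }
    };
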
BatchStatefulParDoOverrides - Class in org.apache.beam.runners.dataflow
PTransformOverrideFactories that expand to correctly implement stateful ParDo using window-unaware BatchViewOverrides.GroupByKeyAndSortValuesOnly to linearize processing per key.
BatchStatefulParDoOverrides() - Constructor for class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
 
BatchStatefulParDoOverrides.BatchStatefulDoFn<K,V,OutputT> - Class in org.apache.beam.runners.dataflow
A key-preserving DoFn that explodes an iterable that has been grouped by key and window.
BeamSparkRunnerRegistrator - Class in org.apache.beam.runners.spark.coders
A custom KryoRegistrator for the needs of Beam's Spark runner.
BeamSparkRunnerRegistrator() - Constructor for class org.apache.beam.runners.spark.coders.BeamSparkRunnerRegistrator
 
begin() - Method in class org.apache.beam.sdk.Pipeline
Returns a PBegin owned by this Pipeline.
beginningOnDay(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
beginningOnDay(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
BigDecimalCoder - Class in org.apache.beam.sdk.coders
A BigDecimalCoder encodes a BigDecimal as an integer scale encoded with VarIntCoder and a BigInteger encoded using BigIntegerCoder.
bigdecimals() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for BigDecimal.
BigEndianIntegerCoder - Class in org.apache.beam.sdk.coders
A BigEndianIntegerCoder encodes Integers in 4 bytes, big-endian.
BigEndianLongCoder - Class in org.apache.beam.sdk.coders
A BigEndianLongCoder encodes Longs in 8 bytes, big-endian.
BigIntegerCoder - Class in org.apache.beam.sdk.coders
A BigIntegerCoder encodes a BigInteger as a byte array containing the big endian two's-complement representation, encoded via ByteArrayCoder.
bigintegers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for BigInteger.
BIGQUERY_CREATE_DISPOSITION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_DATASET - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_EXPORT_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_EXPORT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_FLATTEN_RESULTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_PROJECT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_QUERY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_TABLE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_USE_LEGACY_SQL - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BIGQUERY_WRITE_DISPOSITION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
A CoderProviderRegistrar for standard types used with BigQueryIO.
BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
A set of helper functions and classes used by BigQueryIO.
BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
PTransforms for reading and writing BigQuery tables.
BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
Implementation of BigQueryIO.read().
BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
Implementation of BigQueryIO.write().
BigQueryIO.Write.CreateDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery create disposition strings.
BigQueryIO.Write.WriteDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery write disposition strings.
BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
Properties needed when using Google BigQuery with the Apache Beam SDK.
BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
Transforms for reading from and writing to Google Cloud Bigtable.
BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that reads from Google Cloud Bigtable.
BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that writes to Google Cloud Bigtable.
BinaryCombineDoubleFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
BinaryCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
BinaryCombineIntegerFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
BinaryCombineLongFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
bind(String, StateBinder) - Method in interface org.apache.beam.sdk.state.StateSpec
For internal use only; no backwards-compatibility guarantees.
bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
 
bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in interface org.apache.beam.sdk.state.StateBinder
Bind to a watermark StateSpec.
BitSetCoder - Class in org.apache.beam.sdk.coders
Coder for BitSet.
Block() - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.Block
 
BlockBasedReader(BlockBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
BlockBasedSource<T> - Class in org.apache.beam.sdk.io
A BlockBasedSource is a FileBasedSource where a file consists of blocks of records.
BlockBasedSource(String, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedSource based on a file name or pattern.
BlockBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedSource for a single file.
BlockBasedSource.Block<T> - Class in org.apache.beam.sdk.io
A Block represents a block of records that can be read.
BlockBasedSource.BlockBasedReader<T> - Class in org.apache.beam.sdk.io
A Reader that reads records from a BlockBasedSource.
booleans() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Boolean.
Bounded(SparkContext, BoundedSource<T>, SparkRuntimeContext, String) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
BoundedReader() - Constructor for class org.apache.beam.sdk.io.BoundedSource.BoundedReader
 
BoundedReadFromUnboundedSource<T> - Class in org.apache.beam.sdk.io
PTransform that reads a bounded amount of data from an UnboundedSource, specified as one or both of a maximum number of elements or a maximum period of time to read.
BoundedSource<T> - Class in org.apache.beam.sdk.io
A Source that reads a finite amount of input and, because of that, supports some additional operations.
BoundedSource() - Constructor for class org.apache.beam.sdk.io.BoundedSource
 
BoundedSource.BoundedReader<T> - Class in org.apache.beam.sdk.io
A Reader that reads a bounded amount of input and supports some additional operations, such as progress estimation and dynamic work rebalancing.
BoundedWindow - Class in org.apache.beam.sdk.transforms.windowing
A BoundedWindow represents a finite grouping of elements, with an upper bound (larger timestamps represent more recent data) on the timestamps of elements that can be placed in the window.
BoundedWindow() - Constructor for class org.apache.beam.sdk.transforms.windowing.BoundedWindow
 
broadcast(JavaSparkContext) - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
BufferedExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
Sorter that will use in-memory sorting until the values can't fit into memory and will then fall back to external sorting.
BufferedExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
Contains configuration for the sorter.
build() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
build() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
builder() - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
Builder() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
builder() - Static method in class org.apache.beam.sdk.metrics.MetricsFilter
 
Builder() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
buildOutputFilenames(Iterable<FileBasedSink.FileResult>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
buildTemporaryFilename(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Constructs a temporary file resource given the temporary directory and a filename.
by(PredicateT) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that satisfy the given predicate.
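
A minimal sketch of Filter.by with a SerializableFunction predicate; 'lines' and the predicate are illustrative.

    PCollection<String> nonEmpty =
        lines.apply(Filter.by((SerializableFunction<String, Boolean>) s -> !s.isEmpty()));
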
ByteArray - Class in org.apache.beam.runners.spark.util
Serializable byte array.
ByteArray(byte[]) - Constructor for class org.apache.beam.runners.spark.util.ByteArray
 
ByteArrayCoder - Class in org.apache.beam.sdk.coders
A Coder for byte[].
ByteCoder - Class in org.apache.beam.sdk.coders
A ByteCoder encodes Byte values in 1 byte using Java serialization.
ByteKey - Class in org.apache.beam.sdk.io.range
A class representing a key consisting of an array of bytes.
ByteKeyRange - Class in org.apache.beam.sdk.io.range
A class representing a range of ByteKeys.
ByteKeyRangeTracker - Class in org.apache.beam.sdk.io.range
A RangeTracker for ByteKeys in ByteKeyRanges.
bytes() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Byte.
bytesRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of bytes read by a source.
bytesReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of bytes read by a source split.
ByteStringCoder - Class in org.apache.beam.sdk.extensions.protobuf
A Coder for ByteString objects based on their encoded Protocol Buffer form.
bytesWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
Counter of bytes written to a sink.

C

CalendarWindows - Class in org.apache.beam.sdk.transforms.windowing
A collection of WindowFns that windows values into calendar-based windows such as spans of days, months, or years.
CalendarWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.CalendarWindows
 
CalendarWindows.DaysWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows elements into periods measured by days.
CalendarWindows.MonthsWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows elements into periods measured by months.
CalendarWindows.YearsWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows elements into periods measured by years.
cancel() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
cancel() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
cancel() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
cancel() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
cancel() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
cancel() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
cancel() - Method in interface org.apache.beam.sdk.PipelineResult
Cancels the pipeline execution.
CannotProvideCoderException - Exception in org.apache.beam.sdk.coders
The exception thrown when a CoderRegistry or CoderProvider cannot provide a Coder that has been requested.
CannotProvideCoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(String, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(String, Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException(Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
CannotProvideCoderException.ReasonCode - Enum in org.apache.beam.sdk.coders
Indicates the reason that Coder inference failed.
characters() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Character.
checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
checkDone() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Called by the runner after DoFn.ProcessElement returns.
checkpoint() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
checkpoint() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Signals that the current DoFn.ProcessElement call should terminate as soon as possible.
classesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
Gets a map from Coder to a CloudObjectTranslator that can translate that Coder.
classesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
classNamesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
Gets a map from the name returned by CloudObject.getClassName() to a translator that can convert into the equivalent Coder.
classNamesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
CLASSPATH_SCHEME - Static variable in class org.apache.beam.runners.apex.ApexRunner
 
cleanup() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
 
clear() - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
 
clear() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
clear() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
clear() - Method in interface org.apache.beam.sdk.state.State
Clear out the state location.
clearCache() - Static method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
clearOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Clears the record of the elements output so far to the main output.
clearOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Clears the record of the elements output so far to the output with the given tag.
clone() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
 
clone() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
close() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
close() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Closes the channel and returns the bundle result.
close() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
Closes any ReadableByteChannel created for the current reader.
close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
close() - Method in class org.apache.beam.sdk.io.Source.Reader
Closes the reader.
close() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
CloudDebuggerOptions - Interface in org.apache.beam.runners.dataflow.options
Options for controlling Cloud Debugger.
CloudObject - Class in org.apache.beam.runners.dataflow.util
A representation of an arbitrary Java object to be instantiated by Dataflow workers.
cloudObjectClassName() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Gets the class name that will represent the CloudObject created by this CloudObjectTranslator.
CloudObjects - Class in org.apache.beam.runners.dataflow.util
Utilities for converting an object to a CloudObject.
CloudObjectTranslator<T> - Interface in org.apache.beam.runners.dataflow.util
A translator that takes an object and creates a CloudObject which can be converted back to the original object.
CloudResourceManagerOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
Properties needed when using Google CloudResourceManager with the Apache Beam SDK.
CO_GBK_RESULT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
Coder<T> - Class in org.apache.beam.sdk.coders
A Coder<T> defines how to encode and decode values of type T into byte streams.
Coder() - Constructor for class org.apache.beam.sdk.coders.Coder
 
Coder.Context - Class in org.apache.beam.sdk.coders
Deprecated.
Coder.NonDeterministicException - Exception in org.apache.beam.sdk.coders
Exception thrown by Coder.verifyDeterministic() if the encoding is not deterministic, including details of why the encoding is not deterministic.
CoderCloudObjectTranslatorRegistrar - Interface in org.apache.beam.runners.dataflow.util
Coder authors have the ability to automatically have their Coder registered with the Dataflow Runner by creating a ServiceLoader entry and a concrete implementation of this interface.
coderConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T> and values of type T, the values are equal if and only if the encoded bytes are equal.
coderConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the values are equal if and only if the encoded bytes are equal, in any Coder.Context.
coderDecodeEncodeContentsEqual(Coder<CollectionT>, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
coderDecodeEncodeContentsEqualInContext(Coder<CollectionT>, Coder.Context, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Collection<T>>, and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context.
coderDecodeEncodeContentsInSameOrder(Coder<IterableT>, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Iterable<T>>, and value of type Iterable<T>, encoding followed by decoding yields an equal value of type Iterable<T> with the elements in the same order, in any Coder.Context.
coderDecodeEncodeContentsInSameOrderInContext(Coder<IterableT>, Coder.Context, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<Iterable<T>>, Coder.Context, and value of type Iterable<T>, encoding followed by decoding yields an equal value of type Iterable<T> with the elements in the same order, in the given Coder.Context.
coderDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, and value of type T, encoding followed by decoding yields an equal value of type T, in any Coder.Context.
coderDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields an equal value of type T.
coderDecodesBase64(Coder<T>, String, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDecodesBase64(Coder<T>, List<String>, List<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDecodesBase64ContentsEqual(Coder<IterableT>, String, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDecodesBase64ContentsEqual(Coder<IterableT>, List<String>, List<IterableT>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderDeterministic(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, and values of type T, if the values are equal then the encoded bytes are equal, in any Coder.Context.
coderDeterministicInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and values of type T, if the values are equal then the encoded bytes are equal.
coderEncodesBase64(Coder<T>, T, String) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
coderEncodesBase64(Coder<T>, List<T>, List<String>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
 
CoderException - Exception in org.apache.beam.sdk.coders
An Exception thrown if there is a problem encoding or decoding a value.
CoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
CoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
CoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
 
coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.CoderProvider
Returns a Coder<T> to use for values of a particular type, given the Coders for each of the type's generic parameter types.
coderFromCloudObject(CloudObject) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
 
CoderHelpers - Class in org.apache.beam.runners.spark.coders
Serialization utility class.
CoderProperties - Class in org.apache.beam.sdk.testing
Properties for use in Coder tests.
CoderProperties() - Constructor for class org.apache.beam.sdk.testing.CoderProperties
 
CoderProperties.TestElementByteSizeObserver - Class in org.apache.beam.sdk.testing
An ElementByteSizeObserver that records the observed element sizes for testing purposes.
CoderProvider - Class in org.apache.beam.sdk.coders
A CoderProvider provides Coders.
CoderProvider() - Constructor for class org.apache.beam.sdk.coders.CoderProvider
 
CoderProviderRegistrar - Interface in org.apache.beam.sdk.coders
Coder creators can have their coders automatically registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface.
CoderProviders - Class in org.apache.beam.sdk.coders
Static utility methods for creating and working with CoderProviders.
CoderRegistry - Class in org.apache.beam.sdk.coders
A CoderRegistry allows creating a Coder for a given Java class or type descriptor.
coderSerializable(Coder<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that the given Coder<T> can be correctly serialized and deserialized.
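For example, a minimal unit-test sketch (the test class, coder choice, and sample values are illustrative) exercising a few of the CoderProperties helpers above against VarIntCoder:

    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.testing.CoderProperties;
    import org.junit.Test;

    public class VarIntCoderPropertiesTest {
      @Test
      public void testVarIntCoderProperties() throws Exception {
        VarIntCoder coder = VarIntCoder.of();
        // Encoding then decoding a sample value yields an equal value.
        CoderProperties.coderDecodeEncodeEqual(coder, 42);
        // Equal values encode to equal bytes.
        CoderProperties.coderDeterministic(coder, 42, 42);
        // The coder itself survives Java serialization.
        CoderProperties.coderSerializable(coder);
      }
    }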
CoGbkResult - Class in org.apache.beam.sdk.transforms.join
A row result of a CoGroupByKey.
CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
A row in the PCollection resulting from a CoGroupByKey transform.
CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>, int) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
 
CoGbkResult.CoGbkResultCoder - Class in org.apache.beam.sdk.transforms.join
CoGbkResultSchema - Class in org.apache.beam.sdk.transforms.join
A schema for the results of a CoGroupByKey.
CoGbkResultSchema(TupleTagList) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Builds a schema from a tuple of TupleTag<?>s.
CoGroupByKey<K> - Class in org.apache.beam.sdk.transforms.join
A PTransform that performs a CoGroupByKey on a tuple of tables.
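For example, a minimal sketch (the ContactJoin class, tag names, and output formatting are illustrative) of joining two keyed PCollections with CoGroupByKey and reading the per-tag values from each CoGbkResult:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    class ContactJoin {
      // Tags identify each input within the CoGbkResult.
      static final TupleTag<String> EMAILS = new TupleTag<String>() {};
      static final TupleTag<String> PHONES = new TupleTag<String>() {};

      static PCollection<String> join(
          PCollection<KV<String, String>> emails, PCollection<KV<String, String>> phones) {
        PCollection<KV<String, CoGbkResult>> grouped =
            KeyedPCollectionTuple.of(EMAILS, emails)
                .and(PHONES, phones)
                .apply(CoGroupByKey.<String>create());

        return grouped.apply(
            ParDo.of(
                new DoFn<KV<String, CoGbkResult>, String>() {
                  @ProcessElement
                  public void processElement(ProcessContext c) {
                    CoGbkResult row = c.element().getValue();
                    // One Iterable per input tag, holding all values for this key.
                    Iterable<String> userEmails = row.getAll(EMAILS);
                    Iterable<String> userPhones = row.getAll(PHONES);
                    c.output(c.element().getKey() + ": " + userEmails + " " + userPhones);
                  }
                }));
      }
    }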
CollectionCoder<T> - Class in org.apache.beam.sdk.coders
A CollectionCoder encodes Collections in the format of IterableLikeCoder.
CollectionCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.CollectionCoder
 
Combine - Class in org.apache.beam.sdk.transforms
PTransforms for combining PCollection elements globally and per-key.
combine(Iterable<? extends Instant>) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Combines the given times, which must be from the same window and must have been passed through TimestampCombiner.merge(org.apache.beam.sdk.transforms.windowing.BoundedWindow, java.lang.Iterable<? extends org.joda.time.Instant>).
combine(Instant...) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Combine.AccumulatingCombineFn<InputT,AccumT extends Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT>,OutputT> - Class in org.apache.beam.sdk.transforms
A CombineFn that uses a subclass of Combine.AccumulatingCombineFn.Accumulator as its accumulator type.
Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
The type of mutable accumulator values used by this AccumulatingCombineFn.
Combine.BinaryCombineDoubleFn - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on doubles.
Combine.BinaryCombineFn<V> - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily expressed as binary operations.
Combine.BinaryCombineIntegerFn - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on ints.
Combine.BinaryCombineLongFn - Class in org.apache.beam.sdk.transforms
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on longs.
Combine.CombineFn<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
A CombineFn<InputT, AccumT, OutputT> specifies how to combine a collection of input values of type InputT into a single output value of type OutputT.
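For example, a minimal sketch of a custom CombineFn (the AverageFn below is illustrative and not part of the SDK) showing the four methods every CombineFn provides:

    import java.io.Serializable;
    import org.apache.beam.sdk.transforms.Combine;

    public class AverageFn extends Combine.CombineFn<Integer, AverageFn.Accum, Double> {
      // Mutable accumulator; Serializable so a default coder can usually be inferred,
      // though an explicit accumulator coder may be preferable in practice.
      public static class Accum implements Serializable {
        long sum = 0;
        long count = 0;
      }

      @Override
      public Accum createAccumulator() {
        return new Accum();
      }

      @Override
      public Accum addInput(Accum accum, Integer input) {
        accum.sum += input;
        accum.count++;
        return accum;
      }

      @Override
      public Accum mergeAccumulators(Iterable<Accum> accums) {
        Accum merged = createAccumulator();
        for (Accum a : accums) {
          merged.sum += a.sum;
          merged.count += a.count;
        }
        return merged;
      }

      @Override
      public Double extractOutput(Accum accum) {
        return accum.count == 0 ? 0.0 : ((double) accum.sum) / accum.count;
      }
    }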
Combine.Globally<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
Combine.Globally<InputT, OutputT> takes a PCollection<InputT> and returns a PCollection<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
Combine.GloballyAsSingletonView<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
Combine.GloballyAsSingletonView<InputT, OutputT> takes a PCollection<InputT> and returns a PCollectionView<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
Combine.GroupedValues<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
GroupedValues<K, InputT, OutputT> takes a PCollection<KV<K, Iterable<InputT>>>, such as the result of GroupByKey, applies a specified CombineFn<InputT, AccumT, OutputT> to each of the input KV<K, Iterable<InputT>> elements to produce a combined output KV<K, OutputT> element, and returns a PCollection<KV<K, OutputT>> containing all the combined output elements.
Combine.Holder<V> - Class in org.apache.beam.sdk.transforms
Holds a single value of type V which may or may not be present.
Combine.IterableCombineFn<V> - Class in org.apache.beam.sdk.transforms
Converts a SerializableFunction from Iterable<V>s to Vs into a simple Combine.CombineFn over Vs.
Combine.PerKey<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
PerKey<K, InputT, OutputT> takes a PCollection<KV<K, InputT>>, groups it by key, applies a combining function to the InputT values associated with each key to produce a combined OutputT value, and returns a PCollection<KV<K, OutputT>> representing a map from each distinct key of the input PCollection to the corresponding combined value.
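For example, a usage sketch (the helper class and element types are illustrative) that sums Long values per key with Combine.perKey and the SDK's Sum combine function:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    class PerKeyTotals {
      // Groups the input by key and sums the Long values associated with each key.
      static PCollection<KV<String, Long>> totalPerKey(PCollection<KV<String, Long>> scores) {
        return scores.apply(Combine.<String, Long, Long>perKey(Sum.ofLongs()));
      }
    }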
Combine.PerKeyWithHotKeyFanout<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
Like Combine.PerKey, but sharding the combining of hot keys.
Combine.SimpleCombineFn<V> - Class in org.apache.beam.sdk.transforms
Deprecated.
COMBINE_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
CombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.CombineFn
 
combineFn() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a Combine.CombineFn that counts the number of its inputs.
combineFn() - Static method in class org.apache.beam.sdk.transforms.Latest
Returns a Combine.CombineFn that selects the latest element among its inputs.
combineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a Combine.CombineFn that computes a fixed-sized sample of its inputs.
CombineFnBase - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
CombineFnBase() - Constructor for class org.apache.beam.sdk.transforms.CombineFnBase
 
CombineFnBase.GlobalCombineFn<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
CombineFns - Class in org.apache.beam.sdk.transforms
Static utility methods that create combine function instances.
CombineFns() - Constructor for class org.apache.beam.sdk.transforms.CombineFns
 
CombineFns.CoCombineResult - Class in org.apache.beam.sdk.transforms
A tuple of outputs produced by a composed combine function.
CombineFns.ComposeCombineFnBuilder - Class in org.apache.beam.sdk.transforms
A builder class to construct a composed CombineFnBase.GlobalCombineFn.
CombineFns.ComposedCombineFn<DataT> - Class in org.apache.beam.sdk.transforms
A composed Combine.CombineFn that applies multiple CombineFns.
CombineFns.ComposedCombineFnWithContext<DataT> - Class in org.apache.beam.sdk.transforms
A composed CombineWithContext.CombineFnWithContext that applies multiple CombineFnWithContexts.
CombineFnWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
CombineFunctionState(Combine.CombineFn<InputT, InterT, OutputT>, Coder<InputT>, SparkRuntimeContext) - Constructor for class org.apache.beam.runners.spark.aggregators.NamedAggregators.CombineFunctionState
 
CombineWithContext - Class in org.apache.beam.sdk.transforms
This class contains combine functions that have access to PipelineOptions and side inputs through CombineWithContext.Context.
CombineWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext
 
CombineWithContext.CombineFnWithContext<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
A combine function that has access to PipelineOptions and side inputs through CombineWithContext.Context.
CombineWithContext.Context - Class in org.apache.beam.sdk.transforms
Information accessible to all methods in CombineFnWithContext and KeyedCombineFnWithContext.
CombineWithContext.RequiresContextInternal - Interface in org.apache.beam.sdk.transforms
An internal interface for signaling that a GloballyCombineFn or a PerKeyCombineFn needs to access CombineWithContext.Context.
combining(Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a CombiningState which uses a Combine.CombineFn to automatically merge multiple values of type InputT into a single resulting OutputT.
combining(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a CombiningState which uses a CombineWithContext.CombineFnWithContext to automatically merge multiple values of type InputT into a single resulting OutputT.
combining(Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to #combining(CombineFn), but with an accumulator coder explicitly supplied.
combining(Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to #combining(CombineFnWithContext), but with an accumulator coder explicitly supplied.
combiningFromInputInternal(Coder<InputT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
CombiningState<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.state
A ReadableState cell defined by a Combine.CombineFn, accepting multiple input values, combining them as specified into accumulators, and producing a single output value.
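For example, a minimal stateful-DoFn sketch (the RunningSumFn class and state id are illustrative; the int[] accumulator type is assumed to match Sum.ofIntegers()) declaring a CombiningState cell with StateSpecs.combining and adding to it per element:

    import org.apache.beam.sdk.state.CombiningState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;

    class RunningSumFn extends DoFn<KV<String, Integer>, Integer> {
      @StateId("runningSum")
      private final StateSpec<CombiningState<Integer, int[], Integer>> sumSpec =
          StateSpecs.combining(Sum.ofIntegers());

      @ProcessElement
      public void processElement(
          ProcessContext c,
          @StateId("runningSum") CombiningState<Integer, int[], Integer> sum) {
        sum.add(c.element().getValue()); // buffers the input into the combining state
        c.output(sum.read());            // emits the current combined value for this key
      }
    }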
committed() - Method in interface org.apache.beam.sdk.metrics.MetricResult
Return the value of this metric across all successfully completed parts of the pipeline.
commonPrefixLength(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
Compute the length of the common prefix of the two provided sets of bytes.
compact(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
compact(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
compact(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
compact(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
compact(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns an accumulator that represents the same logical value as the input accumulator, but may have a more compact representation.
compare(JobMessage, JobMessage) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
 
compare(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
 
compare(RandomAccessData, RandomAccessData, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
Compare the two sets of bytes starting at the given offset.
compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Largest
 
compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Smallest
 
compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByKey
 
compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByValue
 
compareTo(ByteArray) - Method in class org.apache.beam.runners.spark.util.ByteArray
 
compareTo(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKey
ByteKey implements Comparable<ByteKey> by comparing the arrays in lexicographic order.
compareTo(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
COMPONENT_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
compose() - Static method in class org.apache.beam.sdk.transforms.CombineFns
ComposeCombineFnBuilder() - Constructor for class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
 
CompositeSource - Class in org.apache.beam.runners.spark.metrics
Composite source made up of several MetricRegistry instances.
CompositeSource(String, MetricRegistry...) - Constructor for class org.apache.beam.runners.spark.metrics.CompositeSource
 
CompressedReader(CompressedSource<T>, FileBasedSource.FileBasedReader<T>) - Constructor for class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Create a CompressedReader from a CompressedSource and delegate reader.
CompressedSource<T> - Class in org.apache.beam.sdk.io
A Source that reads from compressed files.
CompressedSource.CompressedReader<T> - Class in org.apache.beam.sdk.io
Reader for a CompressedSource.
CompressedSource.CompressionMode - Enum in org.apache.beam.sdk.io
Default compression types supported by the CompressedSource.
CompressedSource.DecompressingChannelFactory - Interface in org.apache.beam.sdk.io
Factory interface for creating channels that decompress the content of an underlying channel.
COMPRESSION_TYPE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
CONCAT_SOURCE_BASE_SPECS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
CONCAT_SOURCE_SOURCES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ConfigurationLocator() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
 
configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
configure() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
Returns a new builder for a Window transform for setting windowing parameters other than the windowing function.
ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
 
ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ConnectionConfiguration
 
ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ByteCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.Coder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DoubleCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DurationCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.InstantCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.KvCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
LengthPrefixCoder is consistent with equals if the nested Coder is.
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.NullableCoder
NullableCoder is consistent with equals if the nested Coder is.
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StructuredCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarIntCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarLongCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VoidCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
ConsoleIO - Class in org.apache.beam.runners.spark.io
Print to console.
ConsoleIO.Write - Class in org.apache.beam.runners.spark.io
Write to the console.
ConsoleIO.Write.Unbound<T> - Class in org.apache.beam.runners.spark.io
A PTransform that writes a PCollection to the console.
constructName(String, String, String, int, int) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
Constructs a fully qualified name from components.
constructUsingStandardParameters(ValueProvider<ResourceId>, String, String) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
A helper function to construct a DefaultFilenamePolicy using the standard filename parameters, namely a provided ResourceId for the output prefix, and possibly-null shard name template and suffix.
contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
contains(T) - Method in interface org.apache.beam.sdk.state.SetState
Returns true if this set contains the specified element.
contains(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns whether this window contains the given window.
containsInAnyOrder(T...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Asserts that the iterable in question contains the provided elements.
containsInAnyOrder(Iterable<T>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Asserts that the iterable in question contains the provided elements.
containsInAnyOrder(T...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Checks that the Iterable contains the expected elements, in any order.
containsInAnyOrder(Iterable<T>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Checks that the Iterable contains the expected elements, in any order.
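For example, a minimal JUnit sketch (the test class and sample values are illustrative) asserting a PCollection's contents with PAssert and TestPipeline:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class WordsTest {
      // TestPipeline is used as a JUnit rule; transient keeps it out of serialization.
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void outputContainsExpectedElements() {
        PCollection<String> words = p.apply(Create.of("hello", "beam"));
        PAssert.that(words).containsInAnyOrder("beam", "hello");
        p.run().waitUntilFinish();
      }
    }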
containsKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns true if the specified ByteKey is contained within this range.
Context(boolean) - Constructor for class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
Context(int, int) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.Context
 
Context() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.Context
 
convertToArgs(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
 
convertToBagSpecInternal(StateSpec<CombiningState<InputT, AccumT, OutputT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
convertToFileResourceIfPossible(String) - Static method in class org.apache.beam.sdk.io.FileBasedSink
A helper function that converts a user-provided output filename prefix into a ResourceId for writing output files.
copy() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns a copy of this RandomAccessData.
copy(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
Copies a List of file-like resources from one location to another.
copy(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
Copies a List of file-like resources from one location to another.
copyFrom(ByteBuffer) - Static method in class org.apache.beam.sdk.io.range.ByteKey
Creates a new ByteKey backed by a copy of the data remaining in the specified ByteBuffer.
copyFrom(byte[]) - Static method in class org.apache.beam.sdk.io.range.ByteKey
Creates a new ByteKey backed by a copy of the specified byte[].
count() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
Count - Class in org.apache.beam.sdk.transforms
PTransforms to count the elements in a PCollection.
countAsserts(Pipeline) - Static method in class org.apache.beam.sdk.testing.PAssert
 
Counter - Interface in org.apache.beam.sdk.metrics
A metric that reports a single long value and can be incremented or decremented.
counter(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
counter(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
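For example, a minimal sketch (the CountingFn class and metric name are illustrative) of declaring a Counter in a DoFn and incrementing it per element:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class CountingFn extends DoFn<String, String> {
      private final Counter elements = Metrics.counter(CountingFn.class, "elementsProcessed");

      @ProcessElement
      public void processElement(ProcessContext c) {
        elements.inc();          // aggregated by summing across all workers
        c.output(c.element());
      }
    }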
CounterMark(long, Instant) - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMark
Creates a checkpoint mark reflecting the last emitted value.
counters() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
Return the metric results for the counters that matched the filter.
CountingSource - Class in org.apache.beam.sdk.io
Most users should use GenerateSequence instead.
CountingSource.CounterMark - Class in org.apache.beam.sdk.io
The checkpoint for an unbounded CountingSource is simply the last value produced.
CrashingRunner - Class in org.apache.beam.sdk.testing
A PipelineRunner that applies no overrides and throws an exception on calls to Pipeline.run().
CrashingRunner() - Constructor for class org.apache.beam.sdk.testing.CrashingRunner
 
create(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowClient
 
create(PCollectionView<?>, Coder<T>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
Deprecated.
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
 
create(PipelineOptions) - Method in class org.apache.beam.runners.flink.DefaultParallelismFactory
 
create(boolean) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
 
create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
 
create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkPipelineOptions.TmpCheckpointDirFactory
 
create() - Static method in class org.apache.beam.runners.spark.SparkRunner
Creates and returns a new SparkRunner with default options.
create(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
Creates and returns a new SparkRunner with specified options.
create(PipelineOptions) - Method in class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
 
create(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
 
create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.SortValues
Returns a SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> PTransform.
create(String[], String, String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
Creates a new Elasticsearch connection configuration.
create(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
 
create(ResourceIdT, CreateOptions) - Method in class org.apache.beam.sdk.io.FileSystem
Returns a write channel for the given ResourceIdT.
create(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a write channel for the given ResourceId.
create(ResourceId, CreateOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a write channel for the given ResourceId with CreateOptions.
create(MatchResult.Status, List<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
Returns a MatchResult given the MatchResult.Status and MatchResult.Metadata.
create(MatchResult.Status, IOException) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
Returns a MatchResult given the MatchResult.Status and IOException.
create(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
 
create(DataSource) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
create(String, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
create(String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
Describe a connection configuration to the MQTT broker.
create(String, String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
Describe a connection configuration to the MQTT broker.
create(long, long, long, long) - Static method in class org.apache.beam.sdk.metrics.DistributionResult
 
create(long, Instant) - Static method in class org.apache.beam.sdk.metrics.GaugeResult
 
create(PipelineOptions) - Method in interface org.apache.beam.sdk.options.DefaultValueFactory
Creates a default value for a getter marked with Default.InstanceFactory.
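For example, a minimal sketch (MyOptions and TempDirFactory are illustrative) of wiring a DefaultValueFactory to a PipelineOptions getter with Default.InstanceFactory:

    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.DefaultValueFactory;
    import org.apache.beam.sdk.options.PipelineOptions;

    public interface MyOptions extends PipelineOptions {
      @Default.InstanceFactory(TempDirFactory.class)
      String getTempDir();

      void setTempDir(String value);

      // Invoked lazily the first time getTempDir() is called without an explicit value.
      class TempDirFactory implements DefaultValueFactory<String> {
        @Override
        public String create(PipelineOptions options) {
          return System.getProperty("java.io.tmpdir");
        }
      }
    }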
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
 
create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
 
create() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
Creates and returns an object that implements PipelineOptions using the values configured on this builder during construction.
create() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Creates and returns an object that implements PipelineOptions.
create() - Static method in class org.apache.beam.sdk.Pipeline
Constructs a pipeline from default PipelineOptions.
create(PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
Constructs a pipeline from the provided PipelineOptions.
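For example, a minimal sketch (the MinimalPipeline class and transform chain are illustrative) of parsing options, constructing a Pipeline, and running it:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;

    public class MinimalPipeline {
      public static void main(String[] args) {
        // Parses flags such as --runner from the command line and validates them.
        PipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().create();
        Pipeline p = Pipeline.create(options);

        p.apply(Create.of("hello", "beam", "hello"))
         .apply(Count.perElement());

        p.run().waitUntilFinish();
      }
    }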
create() - Static method in class org.apache.beam.sdk.testing.TestPipeline
Creates and returns a new test pipeline.
create(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
 
create(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream
Create a new TestStream.Builder with no elements and watermark equal to BoundedWindow.TIMESTAMP_MIN_VALUE.
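For example, a minimal sketch (the helper class, element values, and timestamps are illustrative; TestStream requires a runner that supports it, such as the direct runner) of building a TestStream with explicit watermark advancement:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Instant;

    class TestStreamExample {
      static PCollection<String> testInput(Pipeline p) {
        TestStream<String> stream =
            TestStream.create(StringUtf8Coder.of())
                .addElements("a", "b")                    // emitted at the current watermark
                .advanceWatermarkTo(new Instant(1000L))   // move event time forward
                .addElements("c")
                .advanceWatermarkToInfinity();            // finalize the stream
        return p.apply(stream);
      }
    }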
create(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Returns an approximate quantiles combiner with the given compareFn and desired number of quantiles.
create(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Like ApproximateQuantiles.ApproximateQuantilesCombineFn.create(int, Comparator), but sorts values using their natural ordering.
create(int, ComparatorT, long, double) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Creates an approximate quantiles combiner with the given compareFn and desired number of quantiles.
Create<T> - Class in org.apache.beam.sdk.transforms
Create<T> takes a collection of elements of type T known when the pipeline is constructed and returns a PCollection<T> containing the elements.
Create() - Constructor for class org.apache.beam.sdk.transforms.Create
 
create() - Static method in class org.apache.beam.sdk.transforms.Distinct
Returns a Distinct<T> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns a GroupByKey<K, V> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
Returns a CoGroupByKey<K> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.Keys
Returns a Keys<K> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.KvSwap
Returns a KvSwap<K, V> PTransform.
create() - Static method in class org.apache.beam.sdk.transforms.Values
Returns a Values<V> PTransform.
Create.TimestampedValues<T> - Class in org.apache.beam.sdk.transforms
A PTransform that creates a PCollection whose elements have associated timestamps.
Create.Values<T> - Class in org.apache.beam.sdk.transforms
A PTransform that creates a PCollection from a set of in-memory objects.
createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
createAccumulator() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
createContextual(DeserializationContext, BeanProperty) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
 
CreateDataflowView<ElemT,ViewT> - Class in org.apache.beam.runners.dataflow
A DataflowRunner marker class for creating a PCollectionView.
createDecompressingChannel(ReadableByteChannel) - Method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
 
createDecompressingChannel(ReadableByteChannel) - Method in interface org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory
Given a channel, create a channel that decompresses the content read from the channel.
createDefault() - Static method in class org.apache.beam.sdk.coders.CoderRegistry
Creates a CoderRegistry containing registrations for all standard coders that are part of the core Java Apache Beam SDK, and also any registrations provided by coder registrars.
createForSubrangeOfFile(String, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
Deprecated.
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
 
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedSource for the specified range in a single file.
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.CompressedSource
Creates a CompressedSource for a subrange of a file.
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
Creates and returns a new FileBasedSource of the same type as the current FileBasedSource backed by a given file and an offset range.
createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
createInputFormatInstance() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
Creates an instance of the InputFormat class.
createJar(File, File) - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
Create a jar file from the given directory.
createJob(Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Creates the Dataflow Job.
CreateOptions - Class in org.apache.beam.sdk.io.fs
An abstract class that contains common configuration options for creating resources.
CreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions
 
CreateOptions.Builder<BuilderT extends CreateOptions.Builder<BuilderT>> - Class in org.apache.beam.sdk.io.fs
An abstract builder for CreateOptions.
CreateOptions.StandardCreateOptions - Class in org.apache.beam.sdk.io.fs
Standard configuration options with a builder.
CreateOptions.StandardCreateOptions.Builder - Class in org.apache.beam.sdk.io.fs
createPane(boolean, boolean, PaneInfo.Timing) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
createPane(boolean, boolean, PaneInfo.Timing, long, long) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Factory method to create a PaneInfo with the specified parameters.
createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded) - Static method in class org.apache.beam.sdk.values.PCollection
For internal use only; no backwards-compatibility guarantees.
createProcessContext(ValueInSingleWindow<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
Returns a new BoundedSource.BoundedReader that reads from this source.
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
createReader(PipelineOptions, JmsCheckpointMark) - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
createReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.sdk.io.UnboundedSource
Create a new UnboundedSource.UnboundedReader to read from this source, resuming from the given checkpoint if present.
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.AvroSource
 
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BlockBasedSource
Creates a BlockBasedReader.
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.CompressedSource
Creates a FileBasedReader to read a single file.
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
Creates and returns an instance of a FileBasedReader implementation for the current source assuming the source represents a single file.
createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns an OffsetBasedSource for a subrange of the current source.
CreateStream<T> - Class in org.apache.beam.runners.spark.io
Create an input stream from Queue.
createStructuralValues(Coder<T>, List<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Testing utilities below depend on standard assertions and matchers to compare elements read by sources.
CreateTables<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Creates any tables needed before performing streaming writes to the tables.
CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
 
createWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink
Return a subclass of FileBasedSink.WriteOperation that will manage the write to the sink.
createWriter() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Clients must implement to return a subclass of FileBasedSink.Writer.
CredentialFactory - Interface in org.apache.beam.sdk.extensions.gcp.auth
Constructs an OAuth credential to be used by the SDK and the SDK workers.
CsvSink - Class in org.apache.beam.runners.spark.metrics.sink
A Spark Sink that is tailored to report AggregatorMetric metrics to a CSV file.
CsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
 
ctxt - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
current() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators.CombineFunctionState
 
current() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
currentEventTime() - Method in interface org.apache.beam.sdk.state.Timers
Returns the current event time.
currentInputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentOutputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
Returns the current processing time.
currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
currentRestriction() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
Returns a restriction accurately describing the full range of work the current DoFn.ProcessElement call will do, including already completed work.
currentSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
currentSynchronizedProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
Returns the current synchronized processing time or null if unknown.
CUSTOM_SOURCE_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
CustomCoder<T> - Class in org.apache.beam.sdk.coders
An abstract base class that implements all methods of Coder except Coder.encode(T, java.io.OutputStream) and Coder.decode(java.io.InputStream).
CustomCoder() - Constructor for class org.apache.beam.sdk.coders.CustomCoder
 

D

DataflowClient - Class in org.apache.beam.runners.dataflow
Wrapper around the generated Dataflow client to provide common functionality.
DataflowClientFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
 
DataflowJobAlreadyExistsException - Exception in org.apache.beam.runners.dataflow
An exception that is thrown if the unique job name constraint of the Dataflow service is broken because an existing job with the same job name is currently active.
DataflowJobAlreadyExistsException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException
Create a new DataflowJobAlreadyExistsException with the specified DataflowPipelineJob and message.
DataflowJobAlreadyUpdatedException - Exception in org.apache.beam.runners.dataflow
An exception that is thrown if the existing job has already been updated within the Dataflow service and is no longer able to be updated.
DataflowJobAlreadyUpdatedException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException
Create a new DataflowJobAlreadyUpdatedException with the specified DataflowPipelineJob and message.
DataflowJobException - Exception in org.apache.beam.runners.dataflow
A RuntimeException that contains information about a DataflowPipelineJob.
DataflowPipelineDebugOptions - Interface in org.apache.beam.runners.dataflow.options
Internal.
DataflowPipelineDebugOptions.DataflowClientFactory - Class in org.apache.beam.runners.dataflow.options
Returns the default Dataflow client built from the passed in PipelineOptions.
DataflowPipelineDebugOptions.StagerFactory - Class in org.apache.beam.runners.dataflow.options
Creates a Stager object using the class specified in DataflowPipelineDebugOptions.getStagerClass().
DataflowPipelineJob - Class in org.apache.beam.runners.dataflow
A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
Constructs the job.
DataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
Options that can be used to configure the DataflowRunner.
DataflowPipelineOptions.StagingLocationFactory - Class in org.apache.beam.runners.dataflow.options
Returns a default staging location under GcpOptions.getGcpTempLocation().
DataflowPipelineRegistrar - Class in org.apache.beam.runners.dataflow
Contains the PipelineOptionsRegistrar and PipelineRunnerRegistrar for the DataflowRunner.
DataflowPipelineRegistrar.Options - Class in org.apache.beam.runners.dataflow
DataflowPipelineRegistrar.Runner - Class in org.apache.beam.runners.dataflow
Register the DataflowRunner.
DataflowPipelineTranslator - Class in org.apache.beam.runners.dataflow
DataflowPipelineTranslator knows how to translate Pipeline objects into Cloud Dataflow Service API Jobs.
DataflowPipelineTranslator.JobSpecification - Class in org.apache.beam.runners.dataflow
The result of a job translation.
DataflowPipelineWorkerPoolOptions - Interface in org.apache.beam.runners.dataflow.options
Options that are used to configure the Dataflow pipeline worker pool.
DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType - Enum in org.apache.beam.runners.dataflow.options
Type of autoscaling algorithm to use.
DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory - Class in org.apache.beam.runners.dataflow.options
Returns the default Docker container image that executes the Dataflow worker harness, residing in Google Container Registry.
DataflowProfilingAgentConfiguration() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
 
DataflowProfilingOptions - Interface in org.apache.beam.runners.dataflow.options
Options for controlling profiling of pipeline execution.
DataflowProfilingOptions.DataflowProfilingAgentConfiguration - Class in org.apache.beam.runners.dataflow.options
Configuration for the profiling agent.
DataflowRunner - Class in org.apache.beam.runners.dataflow
A PipelineRunner that executes the operations in the pipeline by first translating them to the Dataflow representation using the DataflowPipelineTranslator and then submitting them to a Dataflow service for execution.
DataflowRunner(DataflowPipelineOptions) - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner
 
DataflowRunner.StreamingPCollectionViewWriterFn<T> - Class in org.apache.beam.runners.dataflow
Deprecated.
DataflowRunnerHooks - Class in org.apache.beam.runners.dataflow
An instance of this class can be passed to the DataflowRunner to add user defined hooks to be invoked at various times during pipeline execution.
DataflowRunnerHooks() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunnerHooks
 
DataflowRunnerInfo - Class in org.apache.beam.runners.dataflow
Populates versioning and other information for DataflowRunner.
DataflowServiceException - Exception in org.apache.beam.runners.dataflow
Signals there was an error retrieving information about a job from the Cloud Dataflow Service.
DataflowTemplateJob - Class in org.apache.beam.runners.dataflow.util
A DataflowPipelineJob that is returned when --templateRunner is set.
DataflowTemplateJob() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
DataflowTransport - Class in org.apache.beam.runners.dataflow.util
Helpers for cloud communication.
DataflowTransport() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTransport
 
DataflowWorkerHarnessOptions - Interface in org.apache.beam.runners.dataflow.options
Options that are used exclusively within the Dataflow worker harness.
DataflowWorkerLoggingOptions - Interface in org.apache.beam.runners.dataflow.options
Options that are used to control logging configuration on the Dataflow worker.
DataflowWorkerLoggingOptions.Level - Enum in org.apache.beam.runners.dataflow.options
The set of log levels that can be used on the Dataflow worker.
DataflowWorkerLoggingOptions.WorkerLogLevelOverrides - Class in org.apache.beam.runners.dataflow.options
Defines a log level override for a specific class, package, or name.
DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
DatastoreIO provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
DatastoreV1 provides an API to Read, Write and Delete PCollections of Google Cloud Datastore version v1 Entity objects.
DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that deletes Entities from Cloud Datastore.
DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that deletes Entities associated with the given Keys from Cloud Datastore.
DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that reads the result rows of a Cloud Datastore query as Entity objects.
DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that writes Entity objects to Cloud Datastore.
days(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by days.
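For example, a usage sketch (the DailyWindows helper is illustrative) applying daily calendar windows to a PCollection:

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    class DailyWindows {
      // Assigns each element to the calendar day containing its timestamp.
      static <T> PCollection<T> windowIntoDays(PCollection<T> events) {
        return events.apply(Window.<T>into(CalendarWindows.days(1)));
      }
    }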
dec() - Method in interface org.apache.beam.sdk.metrics.Counter
 
dec(long) - Method in interface org.apache.beam.sdk.metrics.Counter
 
decode(InputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.AvroCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.Coder
Decodes a value of type T from the given input stream in the given context.
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
Deprecated.
decode(InputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
decodePane(byte) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.CollectionCoder
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
 
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
 
decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.SetCoder
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
Default - Annotation Type in org.apache.beam.sdk.options
Default represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the default value to be returned if no value is specified.
Default.Boolean - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified boolean primitive value.
Default.Byte - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified byte primitive value.
Default.Character - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified char primitive value.
Default.Class - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified Class value.
Default.Double - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified double primitive value.
Default.Enum - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified enum.
Default.Float - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified float primitive value.
Default.InstanceFactory - Annotation Type in org.apache.beam.sdk.options
Value must be of type DefaultValueFactory and have a default constructor.
Default.Integer - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified int primitive value.
Default.Long - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified long primitive value.
Default.Short - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified short primitive value.
Default.String - Annotation Type in org.apache.beam.sdk.options
This represents that the default of the option is the specified String value.
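For illustration, a minimal sketch (not part of the Javadoc) of a hypothetical PipelineOptions interface using the Default.* annotations above; the option names and default values are invented:
    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public interface MyOptions extends PipelineOptions {
      @Description("Input file pattern")
      @Default.String("gs://my-bucket/input/*.txt")   // hypothetical default path
      String getInputPattern();
      void setInputPattern(String value);

      @Description("Number of output shards")
      @Default.Integer(10)
      int getNumShards();
      void setNumShards(int value);
    }

    // MyOptions options = PipelineOptionsFactory.fromArgs(args).as(MyOptions.class);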
DEFAULT_BYTE_ARRAY_CODER - Static variable in class org.apache.beam.sdk.io.TFRecordIO
The default coder, which returns each record of the input file as a byte array.
DEFAULT_MAX_NUM_ELEMENTS - Static variable in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
The cost (in time and space) to compute quantiles to a given accuracy is a function of the total number of elements in the data set.
DEFAULT_SCHEME - Static variable in class org.apache.beam.sdk.io.FileSystems
 
DEFAULT_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
DefaultCoder - Annotation Type in org.apache.beam.sdk.coders
The DefaultCoder annotation specifies a Coder class to handle encoding and decoding instances of the annotated class.
DefaultCoder.DefaultCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
A CoderProviderRegistrar that registers a CoderProvider which can use the @DefaultCoder annotation to provide coder providers that create Coders.
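For illustration, a minimal sketch (not part of the Javadoc) of annotating a hypothetical user type so that AvroCoder is inferred for it:
    import org.apache.beam.sdk.coders.AvroCoder;
    import org.apache.beam.sdk.coders.DefaultCoder;

    // Elements of type MyRecord in a pipeline will be encoded with AvroCoder
    // unless a coder is set explicitly.
    @DefaultCoder(AvroCoder.class)
    public class MyRecord {
      public String id;             // hypothetical fields; Avro needs a no-arg constructor
      public long timestampMillis;
      public MyRecord() {}
    }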
DefaultCoderCloudObjectTranslatorRegistrar - Class in org.apache.beam.runners.dataflow.util
The CoderCloudObjectTranslatorRegistrar containing the default collection of Coder Cloud Object Translators.
DefaultCoderCloudObjectTranslatorRegistrar() - Constructor for class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
 
DefaultCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
 
DefaultConcludeTransform() - Constructor for class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
 
DefaultFilenamePolicy - Class in org.apache.beam.sdk.io
A default FileBasedSink.FilenamePolicy for unwindowed files.
DefaultParallelismFactory - Class in org.apache.beam.runners.flink
DefaultValueFactory for getting a default value for the parallelism option on FlinkPipelineOptions.
DefaultParallelismFactory() - Constructor for class org.apache.beam.runners.flink.DefaultParallelismFactory
 
DefaultPipelineOptionsRegistrar - Class in org.apache.beam.sdk.options
A PipelineOptionsRegistrar containing the PipelineOptions subclasses available by default.
DefaultPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
 
DefaultProjectFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
 
Defaults() - Constructor for class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
DefaultStopPipelineWatermarkFactory() - Constructor for class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
 
DefaultTrigger - Class in org.apache.beam.sdk.transforms.windowing
A trigger that is equivalent to Repeatedly.forever(AfterWatermark.pastEndOfWindow()).
defaultValue() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns the default value when there are no values added to the accumulator.
defaultValue() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the default value when there are no values added to the accumulator.
defaultValue() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
 
defaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
Returns the default value of this transform, or null if there isn't one.
DefaultValueFactory<T> - Interface in org.apache.beam.sdk.options
An interface used with the Default.InstanceFactory annotation to specify the class that will be an instance factory to produce default values for a given getter on PipelineOptions.
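For illustration, a minimal sketch (not part of the Javadoc) pairing a hypothetical DefaultValueFactory with the Default.InstanceFactory annotation:
    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.DefaultValueFactory;
    import org.apache.beam.sdk.options.PipelineOptions;

    public interface MyOptions extends PipelineOptions {
      @Default.InstanceFactory(TempDirFactory.class)
      String getTempDir();
      void setTempDir(String value);

      // Computes the default lazily, with access to the other options if needed.
      class TempDirFactory implements DefaultValueFactory<String> {
        @Override
        public String create(PipelineOptions options) {
          return System.getProperty("java.io.tmpdir");  // hypothetical default
        }
      }
    }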
delay(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
For internal use only; no backwards-compatibility guarantees.
Delay() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
 
delegate() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
delegate(HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register display data from the specified component on behalf of the current component.
DelegateCoder<T,IntermediateT> - Class in org.apache.beam.sdk.coders
A DelegateCoder<T, IntermediateT> wraps a Coder for IntermediateT and encodes/decodes values of type T by converting to/from IntermediateT and then encoding/decoding using the underlying Coder<IntermediateT>.
DelegateCoder(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.DelegateCoder
 
DelegateCoder.CodingFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.coders
A CodingFunction<InputT, OutputT> is a serializable function from InputT to OutputT that may throw any Exception.
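For illustration, a minimal sketch (not part of the Javadoc) of building a DelegateCoder that encodes java.net.URI values through their String form:
    import java.net.URI;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.DelegateCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    // URI <-> String conversion functions plus StringUtf8Coder yield a Coder<URI>.
    Coder<URI> uriCoder =
        DelegateCoder.of(
            StringUtf8Coder.of(),
            new DelegateCoder.CodingFunction<URI, String>() {
              @Override
              public String apply(URI uri) { return uri.toString(); }
            },
            new DelegateCoder.CodingFunction<String, URI>() {
              @Override
              public URI apply(String encoded) { return URI.create(encoded); }
            });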
delete(Collection<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
Deletes a collection of resources.
delete(Collection<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
Deletes a collection of resources.
deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.DeleteEntity builder.
deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.DeleteKey builder.
deleteTimer(StateNamespace, String, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
deleteTimer(StateNamespace, String) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
deleteTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
Removes the timer set in this context for the timestamp and timeDomain.
dependsOnlyOnEarliestTimestamp() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns true if the result of combination of many output timestamps actually depends only on the earliest.
dependsOnlyOnWindow() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns true if the result does not depend on what outputs were combined but only the window they are in.
describeMismatchSafely(PipelineResult, Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.testing.RegexMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
Description - Annotation Type in org.apache.beam.sdk.options
Descriptions are used to generate human readable output when the --help command is specified.
deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
deserialize(JsonParser, DeserializationContext) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
 
deserializeTimers(Collection<byte[]>, TimerInternals.TimerDataCoder) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
detectClassPathResourcesToStage(ClassLoader) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
Attempts to detect all the resources the class loader has access to.
detectClassPathResourcesToStage(ClassLoader) - Static method in class org.apache.beam.runners.flink.FlinkRunner
Attempts to detect all the resources the class loader has access to.
DirectOptions - Interface in org.apache.beam.runners.direct
Options that can be used to configure the DirectRunner.
DirectOptions.AvailableParallelismFactory - Class in org.apache.beam.runners.direct
A DefaultValueFactory that returns the result of Runtime.availableProcessors() from the DirectOptions.AvailableParallelismFactory.create(PipelineOptions) method.
DIRECTORY_CONTAINER - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
Shard is a file within a directory.
DirectRegistrar - Class in org.apache.beam.runners.direct
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the DirectRunner.
DirectRegistrar.Options - Class in org.apache.beam.runners.direct
Registers the DirectOptions.
DirectRegistrar.Runner - Class in org.apache.beam.runners.direct
Registers the DirectRunner.
DirectRunner - Class in org.apache.beam.runners.direct
A PipelineRunner that executes a Pipeline within the process that constructed the Pipeline.
DirectRunner() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
 
DirectRunner.DirectPipelineResult - Class in org.apache.beam.runners.direct
The result of running a Pipeline with the DirectRunner.
DISALLOW_COMBINER_LIFTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
discardingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
Returns a new Window PTransform that uses the registered WindowFn and Triggering behavior, and that discards elements in a pane after they are triggered.
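For illustration, a minimal sketch (not part of the Javadoc) applying discarding mode together with a speculative trigger to an assumed PCollection<String> named input:
    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // One-minute fixed windows that fire a pane every 100 elements and discard
    // each pane's elements once it has been emitted.
    PCollection<String> windowed =
        input.apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
                .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(100)))
                .withAllowedLateness(Duration.ZERO)
                .discardingFiredPanes());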
DISPLAY_DATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
DisplayData - Class in org.apache.beam.sdk.transforms.display
Static display data associated with a pipeline component.
DisplayData.Builder - Interface in org.apache.beam.sdk.transforms.display
Utility to build up display data from a component and its included subcomponents.
DisplayData.Identifier - Class in org.apache.beam.sdk.transforms.display
Unique identifier for a display data item within a component.
DisplayData.Item - Class in org.apache.beam.sdk.transforms.display
Items are the unit of display data.
DisplayData.ItemSpec<T> - Class in org.apache.beam.sdk.transforms.display
Specifies an DisplayData.Item to register as display data.
DisplayData.Path - Class in org.apache.beam.sdk.transforms.display
Structured path of registered display data within a component hierarchy.
DisplayData.Type - Enum in org.apache.beam.sdk.transforms.display
Display data type.
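For illustration, a minimal sketch (not part of the Javadoc) of a hypothetical DoFn that registers display data through DisplayData.Builder:
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.display.DisplayData;

    class ParseFn extends DoFn<String, String> {
      private final String delimiter = ",";  // hypothetical configuration value

      @ProcessElement
      public void processElement(ProcessContext c) {
        c.output(c.element());
      }

      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        super.populateDisplayData(builder);
        // Surfaces the configured delimiter in monitoring UIs.
        builder.add(DisplayData.item("delimiter", delimiter).withLabel("Field delimiter"));
      }
    }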
Distinct<T> - Class in org.apache.beam.sdk.transforms
Distinct<T> takes a PCollection<T> and returns a PCollection<T> that has all distinct elements of the input.
Distinct() - Constructor for class org.apache.beam.sdk.transforms.Distinct
 
Distinct.WithRepresentativeValues<T,IdT> - Class in org.apache.beam.sdk.transforms
A Distinct PTransform that uses a SerializableFunction to obtain a representative value for each input element.
Distribution - Interface in org.apache.beam.sdk.metrics
A metric that reports information about the distribution of reported values.
distribution(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that records various statistics about the distribution of reported values.
distribution(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that records various statistics about the distribution of reported values.
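For illustration, a minimal sketch (not part of the Javadoc) of a hypothetical DoFn that reports element sizes to a Distribution metric:
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class MeasureSizeFn extends DoFn<String, String> {
      private final Distribution lineLengths =
          Metrics.distribution(MeasureSizeFn.class, "lineLengths");

      @ProcessElement
      public void processElement(ProcessContext c) {
        lineLengths.update(c.element().length());  // tracks min, max, sum and count
        c.output(c.element());
      }
    }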
DistributionResult - Class in org.apache.beam.sdk.metrics
The result of a Distribution metric.
DistributionResult() - Constructor for class org.apache.beam.sdk.metrics.DistributionResult
 
distributions() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
Return the metric results for the distributions that matched the filter.
doChecks(PAssert.PAssertionSite, ActualT, SerializableFunction<ActualT, Void>) - Static method in class org.apache.beam.sdk.testing.PAssert
 
DoFn<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
The argument to ParDo providing the code to use to process elements of the input PCollection.
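For illustration, a minimal sketch (not part of the Javadoc) of a hypothetical DoFn applied with ParDo in a small standalone pipeline:
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;

    public class UpperCaseExample {
      // A DoFn that upper-cases each input String.
      static class UpperCaseFn extends DoFn<String, String> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(c.element().toUpperCase());
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<String> shouted =
            p.apply(Create.of(Arrays.asList("hello", "beam")))
             .apply(ParDo.of(new UpperCaseFn()));
        p.run().waitUntilFinish();
      }
    }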
DoFn() - Constructor for class org.apache.beam.sdk.transforms.DoFn
 
DoFn.BoundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
Annotation on a splittable DoFn specifying that the DoFn performs a bounded amount of work per input element, so applying it to a bounded PCollection will also produce a bounded PCollection.
DoFn.FinishBundle - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to finish processing a batch of elements.
DoFn.FinishBundleContext - Class in org.apache.beam.sdk.transforms
Information accessible while within the DoFn.FinishBundle method.
DoFn.GetInitialRestriction - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that maps an element to an initial restriction for a splittable DoFn.
DoFn.GetRestrictionCoder - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that returns the coder to use for the restriction of a splittable DoFn.
DoFn.NewTracker - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that creates a new RestrictionTracker for the restriction of a splittable DoFn.
DoFn.OnTimer - Annotation Type in org.apache.beam.sdk.transforms
Annotation for registering a callback for a timer.
DoFn.OnTimerContext - Class in org.apache.beam.sdk.transforms
Information accessible when running a DoFn.OnTimer method.
DoFn.OutputReceiver<T> - Interface in org.apache.beam.sdk.transforms
Receives values of the given type.
DoFn.ProcessContext - Class in org.apache.beam.sdk.transforms
Information accessible when running a DoFn.ProcessElement method.
DoFn.ProcessElement - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use for processing elements.
DoFn.Setup - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to prepare an instance for processing bundles of elements.
DoFn.SplitRestriction - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method that splits the restriction of a splittable DoFn into multiple parts to be processed in parallel.
DoFn.StartBundle - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to prepare an instance for processing a batch of elements.
DoFn.StartBundleContext - Class in org.apache.beam.sdk.transforms
Information accessible while within the DoFn.StartBundle method.
DoFn.StateId - Annotation Type in org.apache.beam.sdk.transforms
Annotation for declaring and dereferencing state cells.
DoFn.Teardown - Annotation Type in org.apache.beam.sdk.transforms
Annotation for the method to use to clean up this instance after processing bundles of elements.
DoFn.TimerId - Annotation Type in org.apache.beam.sdk.transforms
Annotation for declaring and dereferencing timers.
DoFn.UnboundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
Annotation on a splittable DoFn specifying that the DoFn performs an unbounded amount of work per input element, so applying it to a bounded PCollection will produce an unbounded PCollection.
DoFn.WindowedContext - Class in org.apache.beam.sdk.transforms
Information accessible to all methods in this DoFn where the context is in some window.
DoFnInfo<InputT,OutputT> - Class in org.apache.beam.runners.dataflow.util
Wrapper class holding the necessary information to serialize a DoFn.
DoFnRunnerWithMetricsUpdate<InputT,OutputT> - Class in org.apache.beam.runners.flink.metrics
DoFnRunner decorator which registers MetricsContainerImpl.
DoFnRunnerWithMetricsUpdate(String, DoFnRunner<InputT, OutputT>, RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
DoFnTester<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A harness for unit-testing a DoFn.
DoFnTester.CloningBehavior - Enum in org.apache.beam.sdk.transforms
When a DoFnTester should clone the DoFn under test and how it should manage the lifecycle of the DoFn.
DoubleCoder - Class in org.apache.beam.sdk.coders
A DoubleCoder encodes Double values in 8 bytes using Java serialization.
doubles() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Double.
doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents are the maximum of the input PCollection's elements, or Double.NEGATIVE_INFINITY if there are no elements.
doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents are the minimum of the input PCollection's elements, or Double.POSITIVE_INFINITY if there are no elements.
doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents are the sum of the input PCollection's elements, or 0 if there are no elements.
doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
DurationCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes a Joda-Time Duration as a Long using the format of VarLongCoder.
DynamicDestinations<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
This class provides the most general way of specifying dynamic BigQuery table destinations.
DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 

E

ElasticsearchIO - Class in org.apache.beam.sdk.io.elasticsearch
Transforms for reading and writing data from/to Elasticsearch.
ElasticsearchIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
A POJO describing a connection configuration to Elasticsearch.
ElasticsearchIO.Read - Class in org.apache.beam.sdk.io.elasticsearch
A PTransform reading data from Elasticsearch.
ElasticsearchIO.Write - Class in org.apache.beam.sdk.io.elasticsearch
A PTransform writing data to Elasticsearch.
ELEMENT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
element() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns the input element to be processed.
element() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
Returns the current element.
elementCountAtLeast(int) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterPane
Creates a trigger that fires when the pane contains at least countElems elements.
ElementEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ElementEvent
 
ELEMENTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
elements() - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each element of the input PCollection to a String using the Object.toString() method.
elementsRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of elements read by a source.
elementsReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
Counter of elements read by a source split.
elementsWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
Counter of elements written to a sink.
EMPTY - Static variable in class org.apache.beam.sdk.io.range.ByteKey
An empty key.
empty() - Static method in class org.apache.beam.sdk.metrics.GaugeResult
 
empty() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Asserts that the iterable in question is empty.
empty() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
empty(Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces an empty PCollection.
empty(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces an empty PCollection.
empty() - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns an empty CoGbkResult.
empty(Pipeline) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns an empty KeyedPCollectionTuple<K> on the given pipeline.
empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionList
Returns an empty PCollectionList that is part of the given Pipeline.
empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
Returns an empty PCollectionTuple that is part of the given Pipeline.
empty() - Static method in class org.apache.beam.sdk.values.TupleTagList
Returns an empty TupleTagList.
emptyBatch() - Method in class org.apache.beam.runners.spark.io.CreateStream
Adds an empty batch.
EmptyCheckpointMark - Class in org.apache.beam.runners.spark.io
Passing null values to Spark's Java API may cause problems because of Guava preconditions.
EmptyListenersList() - Constructor for class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
 
enableAbandonedNodeEnforcement(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
Enables the abandoned node detection.
enableAutoRunIfMissing(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
If enabled, a pipeline.run() statement will be added automatically in case it is missing in the test.
encode(RandomAccessData, OutputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
encode(RandomAccessData, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.AvroCoder
 
encode(BigDecimal, OutputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
encode(BigDecimal, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
encode(BigInteger, OutputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
encode(BigInteger, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
encode(BitSet, OutputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
encode(BitSet, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
encode(byte[], OutputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
encode(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
encode(Byte, OutputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.Coder
Encodes the given value of type T onto the given output stream.
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
Deprecated.
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
encode(Double, OutputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
encode(ReadableDuration, OutputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
encode(Instant, OutputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
encode(IterableT, OutputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
encode(KV<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
 
encode(KV<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
encode(Map<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
 
encode(Map<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
encode(String, OutputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
encode(String, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
encode(Integer, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
encode(Void, OutputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
encode(ByteString, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
encode(FileBasedSink.FileResult, OutputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
encode(TableRow, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
encode(TableRow, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
encode(KafkaRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
encode(KafkaRecord<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
encode(CoGbkResult, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
encode(RawUnionValue, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
encode(RawUnionValue, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
encode(GlobalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
encode(IntervalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
encode(PaneInfo, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
encode(TimestampedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
encode(ValueInSingleWindow<T>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
encode(ValueInSingleWindow<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
encode(ValueWithRecordId<ValueT>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
encode(ValueWithRecordId<ValueT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
encodeAndOwn(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
Encodes the provided value with the identical encoding to ByteArrayCoder.encode(byte[], java.io.OutputStream), but with optimizations that take ownership of the value.
ENCODING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ENCODING_ID - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
end() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the end of this window, exclusive.
END_INDEX - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
END_OFFSET - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
END_SHUFFLE_POSITION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkNativePipelineVisitor
 
enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
enterCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each composite transform after all topological predecessors have been visited but before any of its component transforms.
entries() - Method in interface org.apache.beam.sdk.state.MapState
Returns an Iterable over the key-value pairs contained in this map.
ENVIRONMENT_VERSION_JOB_TYPE_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ENVIRONMENT_VERSION_MAJOR_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
equal(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are equal to the given value.
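For illustration, a minimal sketch (not part of the Javadoc) of keeping only the elements equal to a given value, on an assumed PCollection<String> named words:
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;

    // Keeps only the elements that equal "beam" (comparison uses the elements' equals()).
    PCollection<String> onlyBeam = words.apply(Filter.equal("beam"));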
equals(Object) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
equals(Object) - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
equals(Object) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
equals(Object) - Method in class org.apache.beam.runners.spark.util.ByteArray
 
equals(Object) - Method in class org.apache.beam.sdk.coders.AtomicCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.AvroCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
equals(Object) - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
equals(Object) - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
equals(Object) - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
equals(Object) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
equals(Object) - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKey
 
equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
equals(Object) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Deprecated.
Object.equals(Object) is not supported on PAssert objects. If you meant to test object equality, use a variant of PAssert.PCollectionContentsAssert.containsInAnyOrder(T...) instead.
equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRange
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
equals(Object) - Method in class org.apache.beam.sdk.values.KV
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionList
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
equals(Object) - Method in class org.apache.beam.sdk.values.TimestampedValue
 
equals(Object) - Method in class org.apache.beam.sdk.values.TupleTag
 
equals(Object) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Two type descriptors are equal if and only if they represent the same type.
equals(Object) - Method in class org.apache.beam.sdk.values.TypeParameter
 
equals(Object) - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
equals(Object) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
estimateFractionForKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns the fraction of this range [startKey, endKey) that is in the interval [startKey, key).
Evaluator(SparkPipelineTranslator, EvaluationContext) - Constructor for class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
ever() - Static method in class org.apache.beam.sdk.transforms.windowing.Never
Returns a trigger which never fires.
every(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Returns a new SlidingWindows with the original size, that assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch.
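For illustration, a minimal sketch (not part of the Javadoc) of thirty-minute sliding windows starting every five minutes, on an assumed PCollection<String> named input:
    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // Each element lands in six overlapping 30-minute windows, one per 5-minute period.
    PCollection<String> sliding =
        input.apply(
            Window.<String>into(
                SlidingWindows.of(Duration.standardMinutes(30))
                    .every(Duration.standardMinutes(5))));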
ExecutorServiceFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
 
expand(PCollection<List<ElemT>>) - Method in class org.apache.beam.runners.apex.ApexRunner.CreateApexPCollectionView
 
expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
expand(PCollection<T>) - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
 
expand(PBegin) - Method in class org.apache.beam.runners.spark.io.CreateStream
 
expand(PInput) - Method in class org.apache.beam.runners.spark.util.SinglePrimitiveOutputPTransform
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
 
expand(PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>>) - Method in class org.apache.beam.sdk.extensions.sorter.SortValues
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
expand(PCollection<KV<DestinationT, TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
 
expand(PCollection<KV<DestinationT, TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
 
expand(PCollection<KV<TableDestination, TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
expand() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
expand(PCollection<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
expand(PCollection<KV<byte[], Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
expand(PCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Bounded
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.WriteFiles
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
expand(PCollection<SuccessOrFailure>) - Method in class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssert
 
expand(PCollection<Iterable<T>>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssertForSingleton
 
expand(PBegin) - Method in class org.apache.beam.sdk.testing.PAssert.OneSideInputAssert
 
expand(PBegin) - Method in class org.apache.beam.sdk.testing.TestStream
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
expand(PCollection<? extends KV<K, ? extends Iterable<InputT>>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
 
expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.Values
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Filter
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
 
expand(PCollection<? extends Iterable<T>>) - Method in class org.apache.beam.sdk.transforms.Flatten.Iterables
 
expand(PCollectionList<T>) - Method in class org.apache.beam.sdk.transforms.Flatten.PCollections
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
expand(KeyedPCollectionTuple<K>) - Method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
 
expand() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Expands the component PCollections, stripping off any tag-specific information.
expand(PCollection<? extends KV<K, ?>>) - Method in class org.apache.beam.sdk.transforms.Keys
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.KvSwap
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Partition
 
expand(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
Applies this PTransform on the given InputT, and returns its Output.
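For illustration, a minimal sketch (not part of the Javadoc) of a hypothetical composite PTransform whose expand method wires together existing transforms:
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Counting words is expressed entirely in terms of an existing transform.
    class CountWords extends PTransform<PCollection<String>, PCollection<KV<String, Long>>> {
      @Override
      public PCollection<KV<String, Long>> expand(PCollection<String> words) {
        return words.apply(Count.<String>perElement());
      }
    }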
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.AllMatches
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Find
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindAll
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindName
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindNameKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Matches
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesName
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceAll
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Split
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Reshuffle
Deprecated.
 
expand(PCollection<? extends KV<?, V>>) - Method in class org.apache.beam.sdk.transforms.Values
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsIterable
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsList
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMap
 
expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
 
expand(PCollection<ElemT>) - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
expand(PCollection<V>) - Method in class org.apache.beam.sdk.transforms.WithKeys
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
 
expand() - Method in class org.apache.beam.sdk.values.PBegin
 
expand() - Method in class org.apache.beam.sdk.values.PCollectionList
 
expand() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
expand() - Method in class org.apache.beam.sdk.values.PDone
A PDone contains no PValues.
expand() - Method in interface org.apache.beam.sdk.values.PInput
Expands this PInput into a list of its component output PValues.
expand() - Method in interface org.apache.beam.sdk.values.POutput
Expands this POutput into a list of its component output PValues.
expand() - Method in interface org.apache.beam.sdk.values.PValue
Deprecated.
expand() - Method in class org.apache.beam.sdk.values.PValueBase
 
Experimental - Annotation Type in org.apache.beam.sdk.annotations
Signifies that a public API (public class, method or field) is subject to incompatible changes, or even removal, in a future release.
Experimental.Kind - Enum in org.apache.beam.sdk.annotations
An enumeration of various kinds of experimental APIs.
extend(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Extend the path by appending a sub-component path.
extractOrderedList() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
Returns the values in the heap, ordered largest to smallest.
extractOutput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
extractOutput() - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
Returns the output value that is the result of combining all the input values represented by this accumulator.
extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
extractOutput(double[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
extractOutput(Combine.Holder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
extractOutput(int[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
extractOutput(long[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns the output value that is the result of combining all the input values represented by the given accumulator.
extractOutput(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
extractOutput(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
extractOutput(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
extractOutput(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns the output value that is the result of combining all the input values represented by the given accumulator.
extractOutput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 

F

failure(PAssert.PAssertionSite, String) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
failure(PAssert.PAssertionSite) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
FAILURE_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
 
fewKeys() - Method in class org.apache.beam.sdk.transforms.GroupByKey
Returns whether it groups just a few keys.
FileBasedReader(FileBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
Subclasses should not perform IO operations in the constructor.
FileBasedSink<T> - Class in org.apache.beam.sdk.io
Abstract class for file-based output.
FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.FilenamePolicy) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
Construct a FileBasedSink with the given filename policy, producing uncompressed files.
FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.FilenamePolicy, FileBasedSink.WritableByteChannelFactory) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
Construct a FileBasedSink with the given filename policy and output channel type.
FileBasedSink.CompressionType - Enum in org.apache.beam.sdk.io
Directly supported file output compression types.
FileBasedSink.FilenamePolicy - Class in org.apache.beam.sdk.io
A naming policy for output files.
FileBasedSink.FilenamePolicy.Context - Class in org.apache.beam.sdk.io
Context used for generating a name based on the shard number and the number of shards.
FileBasedSink.FilenamePolicy.WindowedContext - Class in org.apache.beam.sdk.io
Context used for generating a name based on the window, pane, shard number, and number of shards.
FileBasedSink.FileResult - Class in org.apache.beam.sdk.io
Result of a single bundle write.
FileBasedSink.FileResultCoder - Class in org.apache.beam.sdk.io
A coder for FileBasedSink.FileResult objects.
FileBasedSink.WritableByteChannelFactory - Interface in org.apache.beam.sdk.io
Implementations create instances of WritableByteChannel used by FileBasedSink and related classes to allow decorating, or otherwise transforming, the raw data that would normally be written directly to the WritableByteChannel passed into FileBasedSink.WritableByteChannelFactory.create(WritableByteChannel).
FileBasedSink.WriteOperation<T> - Class in org.apache.beam.sdk.io
Abstract operation that manages the process of writing to FileBasedSink.
FileBasedSink.Writer<T> - Class in org.apache.beam.sdk.io
Abstract writer that writes a bundle to a FileBasedSink.
FileBasedSource<T> - Class in org.apache.beam.sdk.io
A common base class for all file-based Sources.
FileBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
Create a FileBasedSource based on a file or a file pattern specification.
FileBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
Create a FileBasedSource based on a single file.
FileBasedSource.FileBasedReader<T> - Class in org.apache.beam.sdk.io
A reader that implements code common to readers of FileBasedSources.
FileBasedSource.Mode - Enum in org.apache.beam.sdk.io
A given FileBasedSource represents a file resource of one of these types.
FileChecksumMatcher - Class in org.apache.beam.sdk.testing
Matcher to verify file checksum in E2E test.
FileChecksumMatcher(String, String) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
Constructor that uses default shard template.
FileChecksumMatcher(String, String, Pattern) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
Constructor using a custom shard template.
FileChecksumMatcher(String, ShardedFile) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
Constructor using an entirely custom ShardedFile implementation.
FILENAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
FILENAME_PREFIX - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
FILENAME_SUFFIX - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
FilenamePolicy() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
 
FILEPATTERN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
FileResult(ResourceId, int, BoundedWindow, PaneInfo) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
FileResultCoder(Coder<BoundedWindow>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
FileSystem<ResourceIdT extends ResourceId> - Class in org.apache.beam.sdk.io
File system interface in Beam.
FileSystem() - Constructor for class org.apache.beam.sdk.io.FileSystem
 
FileSystemRegistrar - Interface in org.apache.beam.sdk.io
A registrar that creates FileSystem instances from PipelineOptions.
FileSystems - Class in org.apache.beam.sdk.io
Client-facing FileSystem utility.
FileSystems() - Constructor for class org.apache.beam.sdk.io.FileSystems
 
Filter<T> - Class in org.apache.beam.sdk.transforms
PTransforms for filtering from a PCollection the elements satisfying a predicate, or satisfying an inequality with a given value based on the elements' natural ordering.
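For illustration (not part of the original Javadoc), a minimal sketch of Filter.by; "numbers" is a hypothetical PCollection<Integer> built earlier in the pipeline:

    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;

    // Keep only the elements for which the predicate returns true.
    // "numbers" is a hypothetical PCollection<Integer> produced elsewhere.
    PCollection<Integer> positives = numbers.apply(Filter.by((Integer x) -> x > 0));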
finalize(Iterable<FileBasedSink.FileResult>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Finalizes writing by copying temporary output files to their final location and optionally removing temporary files.
finalizeCheckpoint() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
 
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
Acknowledges all outstanding messages.
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
finalizeCheckpoint() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
Called by the system to signal that this checkpoint mark has been committed along with all the records which have been read from the UnboundedSource.UnboundedReader since the previous checkpoint was taken.
find(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Find PTransform that checks if a portion of the line matches the Regex.
find(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex.
find(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindName PTransform that checks if a portion of the line matches the Regex.
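A brief usage sketch (an assumption, not taken from this index) of the single-argument Regex.find; "lines" is a hypothetical PCollection<String>:

    import org.apache.beam.sdk.transforms.Regex;
    import org.apache.beam.sdk.values.PCollection;

    // For each input line that contains a match, emit the matched substring (group 0).
    // "lines" is a hypothetical PCollection<String> produced earlier in the pipeline.
    PCollection<String> words = lines.apply(Regex.find("[A-Za-z']+"));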
Find(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Find
 
findAll(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex.
findAll(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindAll PTransform that checks if a portion of the line matches the Regex.
FindAll(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindAll
 
findKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex.
findKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindKV PTransform that checks if a portion of the line matches the Regex.
findKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex.
findKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.FindNameKV PTransform that checks if a portion of the line matches the Regex.
FindKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindKV
 
FindName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindName
 
FindNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindNameKV
 
finishBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
finishBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Calls the DoFn.FinishBundle method of the DoFn under test.
FinishBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
 
finishSpecifying() - Method in interface org.apache.beam.sdk.state.StateSpec
For internal use only; no backwards-compatibility guarantees.
finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
After building, finalizes this PValue to make it ready for running.
finishSpecifying(PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.PValue
After building, finalizes this PValue to make it ready for being used as an input to a PTransform.
finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionList
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PDone
Does nothing; there is nothing to finish specifying.
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.POutput
As part of applying the producing PTransform, finalizes this output to make it ready for being used as an input and for running.
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
 
finishWrite() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
fixDefaults() - Method in class org.apache.beam.sdk.values.WindowingStrategy
Fixes all the defaults so that equals can be used to check that two strategies are the same, regardless of the state of "defaulted-ness".
fixedSizeGlobally(int) - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a PTransform that takes a PCollection<T>, selects sampleSize elements, uniformly at random, and returns a PCollection<Iterable<T>> containing the selected elements.
fixedSizePerKey(int) - Static method in class org.apache.beam.sdk.transforms.Sample
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, Iterable<V>>> that contains an output element mapping each distinct key in the input PCollection to a sample of sampleSize values associated with that key in the input PCollection, taken uniformly at random.
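For illustration (hedged, not from this index), a sketch of Sample.fixedSizeGlobally; "events" is a hypothetical PCollection<String>:

    import org.apache.beam.sdk.transforms.Sample;
    import org.apache.beam.sdk.values.PCollection;

    // Pick 10 elements uniformly at random from the whole collection.
    // "events" is a hypothetical PCollection<String>.
    PCollection<Iterable<String>> sample = events.apply(Sample.fixedSizeGlobally(10));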
FixedWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows values into fixed-size timestamp-based windows.
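A minimal windowing sketch (an assumption for illustration), assigning elements of a hypothetical PCollection<String> "events" to one-minute fixed windows:

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Assign each element to a one-minute fixed window based on its timestamp.
    // "events" is a hypothetical PCollection<String>.
    PCollection<String> windowed =
        events.apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));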
FlatMapElements<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
PTransforms for mapping a simple function that returns iterables over the elements of a PCollection and merging the results.
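For illustration (not part of the original Javadoc), a word-splitting sketch with FlatMapElements; "lines" is a hypothetical PCollection<String>:

    import java.util.Arrays;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Split each line into words and flatten the resulting iterables into one PCollection.
    // "lines" is a hypothetical PCollection<String>.
    PCollection<String> words =
        lines.apply(FlatMapElements.into(TypeDescriptors.strings())
            .via((String line) -> Arrays.asList(line.split("[^a-zA-Z']+"))));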
Flatten - Class in org.apache.beam.sdk.transforms
Flatten<T> takes multiple PCollection<T>s bundled into a PCollectionList<T> and returns a single PCollection<T> containing all the elements in all the input PCollections.
Flatten() - Constructor for class org.apache.beam.sdk.transforms.Flatten
 
Flatten.Iterables<T> - Class in org.apache.beam.sdk.transforms
Flatten.Iterables<T> takes a PCollection<Iterable<T>> and returns a PCollection<T> that contains all the elements from each iterable.
Flatten.PCollections<T> - Class in org.apache.beam.sdk.transforms
A PTransform that flattens a PCollectionList into a PCollection containing all the elements of all the PCollections in its input.
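A hedged usage sketch of Flatten.pCollections; "logs2016" and "logs2017" are hypothetical PCollection<String>s with compatible coders:

    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionList;

    // Merge two collections of the same type into a single PCollection.
    // "logs2016" and "logs2017" are hypothetical PCollection<String>s.
    PCollection<String> allLogs =
        PCollectionList.of(logs2016).and(logs2017).apply(Flatten.pCollections());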
FlattenWithHeterogeneousCoders - Interface in org.apache.beam.sdk.testing
Category tag for tests that use a Flatten where the input PCollectionList contains PCollections with heterogeneous coders.
FlinkDetachedRunnerResult - Class in org.apache.beam.runners.flink
Result of a detached execution of a Pipeline with Flink.
FlinkMetricContainer - Class in org.apache.beam.runners.flink.metrics
Helper class for holding a MetricsContainerImpl and forwarding Beam metrics to Flink accumulators and metrics.
FlinkMetricContainer(RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
 
FlinkMetricContainer.FlinkDistributionGauge - Class in org.apache.beam.runners.flink.metrics
Flink Gauge for DistributionResult.
FlinkMetricContainer.FlinkGauge - Class in org.apache.beam.runners.flink.metrics
Flink Gauge for GaugeResult.
FlinkPipelineOptions - Interface in org.apache.beam.runners.flink
Options which can be used to configure a Flink PipelineRunner.
FlinkRunner - Class in org.apache.beam.runners.flink
A PipelineRunner that executes the operations in the pipeline by first translating them to a Flink Plan and then executing them either locally or on a Flink cluster, depending on the configuration.
FlinkRunnerRegistrar - Class in org.apache.beam.runners.flink
AutoService registrar - will register FlinkRunner and FlinkOptions as possible pipeline runner services.
FlinkRunnerRegistrar.Options - Class in org.apache.beam.runners.flink
Pipeline options registrar.
FlinkRunnerRegistrar.Runner - Class in org.apache.beam.runners.flink
Pipeline runner registrar.
FlinkRunnerResult - Class in org.apache.beam.runners.flink
Result of executing a Pipeline with Flink.
floats() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Float.
FOOTER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
forBoolean(Boolean) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forClass(Class<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject to be used for serializing an instance of the supplied class for transport via the Dataflow API.
forClassName(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject to be used for serializing data to be deserialized using the supplied class name, for transport via the Dataflow API.
forCoder(TypeDescriptor<?>, Coder<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
Creates a CoderProvider that always returns the given coder for the specified type.
forever(Trigger) - Static method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
Create a composite trigger that repeatedly executes the given trigger, firing each time it fires and ignoring any indications to finish.
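For illustration only, a sketch of Repeatedly.forever combined with a processing-time trigger; "events" is a hypothetical unbounded PCollection<String>, and the window size and delay are arbitrary:

    import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Fire repeatedly, one minute of processing time after the first element of each pane.
    // "events" is a hypothetical unbounded PCollection<String>.
    PCollection<String> triggered =
        events.apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
            .triggering(Repeatedly.forever(
                AfterProcessingTime.pastFirstElementInPane()
                    .plusDelayOf(Duration.standardMinutes(1))))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());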
forFloat(Float) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forFloat(Double) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forFn(DoFn<InputT, OutputT>, WindowingStrategy<?, ?>, Iterable<PCollectionView<?>>, Coder<InputT>, long, Map<Long, TupleTag<?>>) - Static method in class org.apache.beam.runners.dataflow.util.DoFnInfo
Creates a DoFnInfo for the given DoFn.
forFn(Serializable, WindowingStrategy<?, ?>, Iterable<PCollectionView<?>>, Coder<InputT>, long, Map<Long, TupleTag<?>>) - Static method in class org.apache.beam.runners.dataflow.util.DoFnInfo
Deprecated.
forInteger(Long) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forInteger(Integer) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
forKnownType(Object) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value of a well-known cloud object type.
FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
formatTimestamp(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
Formats an Instant timestamp with additional Beam-specific metadata, such as indicating whether the timestamp is the end of the global window or one of the distinguished values BoundedWindow.TIMESTAMP_MIN_VALUE or BoundedWindow.TIMESTAMP_MAX_VALUE.
forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
forStreamFromSources(List<Integer>, Broadcast<Map<Integer, GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
Build the TimerInternals according to the feeding streams.
forString(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject representing the given value.
from(Map<String, String>) - Static method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
Expects a map keyed by logger Names with values representing Levels.
from(String) - Method in class org.apache.beam.sdk.io.AvroIO.Read
Reads from the given filename or filepattern.
from(String) - Static method in class org.apache.beam.sdk.io.AvroSource
Creates an AvroSource that reads from the given file name or pattern ("glob").
from(FileBasedSource<T>) - Static method in class org.apache.beam.sdk.io.CompressedSource
Creates a CompressedSource from an underlying FileBasedSource.
from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Reads a BigQuery table specified as "[project_id]:[dataset_id].[table_id]" or "[dataset_id].[table_id]" for tables within the current project.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Same as from(String), but with a ValueProvider.
from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Read from table specified by a TableReference.
from(long) - Static method in class org.apache.beam.sdk.io.GenerateSequence
Specifies the minimum number to generate (inclusive).
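A minimal sketch (an assumption for illustration) of GenerateSequence producing a bounded range of longs:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    // Produce the bounded range [0, 100) as a PCollection<Long>.
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    PCollection<Long> numbers = p.apply(GenerateSequence.from(0).to(100));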
from(String, InitialPositionInStream) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Specify reading from streamName at some initial position.
from(String, Instant) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Specify reading from streamName beginning at given Instant.
from(BoundedSource<T>) - Method in class org.apache.beam.sdk.io.Read.Builder
Returns a new Read.Bounded PTransform reading from the given BoundedSource.
from(UnboundedSource<T, ?>) - Method in class org.apache.beam.sdk.io.Read.Builder
Returns a new Read.Unbounded PTransform reading from the given UnboundedSource.
from(BoundedSource<T>) - Static method in class org.apache.beam.sdk.io.Read
Returns a new Read.Bounded PTransform reading from the given BoundedSource.
from(UnboundedSource<T, ?>) - Static method in class org.apache.beam.sdk.io.Read
Returns a new Read.Unbounded PTransform reading from the given UnboundedSource.
from(String) - Method in class org.apache.beam.sdk.io.TextIO.Read
Reads text files from the file(s) with the given filename or filename pattern.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Read
Same as from(filepattern), but accepting a ValueProvider.
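For illustration (not from this index), reading matching text files line by line; "p" is an assumed existing Pipeline and the path is a placeholder:

    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read every line of the files matching the glob into a PCollection<String>.
    // "p" is an assumed Pipeline; the bucket and path are placeholders.
    PCollection<String> lines =
        p.apply(TextIO.read().from("gs://my-bucket/input/*.txt"));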
from(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Returns a transform for reading TFRecord files that reads from the file(s) with the given filename or filename pattern.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Same as from(filepattern), but accepting a ValueProvider.
from(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
from(HasDisplayData) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Collect the DisplayData from a component.
fromArgs(String...) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
Sets the command line arguments to parse when constructing the PipelineOptions.
fromArgs(String...) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Sets the command line arguments to parse when constructing the PipelineOptions.
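A hedged sketch of the usual options-parsing entry point built around fromArgs; the surrounding main method is illustrative:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Parse command-line arguments (e.g. --runner=DirectRunner) into PipelineOptions.
    public static void main(String[] args) {
      PipelineOptions options =
          PipelineOptionsFactory.fromArgs(args).withValidation().create();
      Pipeline p = Pipeline.create(options);
      // ... build and run the pipeline ...
    }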
fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for deserializing a byte array using the specified coder.
fromByteArrays(Collection<byte[]>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for deserializing an Iterable of byte arrays using the specified coder.
fromByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a byte array to an object.
fromByteFunction(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a byte array pair to a key-value pair.
fromByteFunctionIterable(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a byte array pair to a key-value pair, where values are Iterable.
fromCloudDuration(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a Dataflow API duration string into a Duration.
fromCloudObject(CloudObject) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Converts back into the original object from a provided CloudObject.
fromCloudTime(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a time value received via the Dataflow API into the corresponding Instant.
fromFile(File, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.apex.ApexRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.apex.TestApexRunner
 
fromOptions(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Constructs a translator from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
Construct a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.TestDataflowRunner
Constructs a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.GcsStager
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.direct.DirectRunner
Construct a DirectRunner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkRunner
Construct a runner from the provided options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
Creates and returns a new SparkRunner with specified options.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunnerDebugger
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.TestSparkRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.FileSystemRegistrar
Create zero or more filesystems from the given PipelineOptions.
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
 
fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.LocalFileSystemRegistrar
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.PipelineRunner
Constructs a runner from the provided PipelineOptions.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.CrashingRunner
 
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
 
fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Creates a class representing a Pub/Sub subscription from the specified subscription path.
fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
fromPath(Path, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Reads results received after executing the given query.
fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Same as fromQuery(String), but with a ValueProvider.
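For illustration only, a sketch of reading BigQuery query results; "p" is an assumed Pipeline and the project, dataset, and SQL are placeholders:

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read the rows produced by a query; project, dataset, and SQL are placeholders.
    PCollection<TableRow> rows =
        p.apply(BigQueryIO.read()
            .fromQuery("SELECT year, SUM(count) AS total FROM samples.names GROUP BY year"));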
fromSerializableFunctionWithOutputType(SerializableFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.SimpleFunction
 
fromSpec(Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
Constructs a CloudObject by copying the supplied serialized object spec, which must represent an SDK object serialized for transport via the Dataflow API.
fromStaticMethods(Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
Creates a CoderProvider from a class's static <T> Coder<T> of(TypeDescriptor<T>, List<Coder<?>>) method.
fromString(String, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromString(ValueProvider<String>, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
 
fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Reads from the given subscription.
fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Like subscription() but with a ValueProvider.
fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Like topic() but with a ValueProvider.
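A brief usage sketch (an assumption, not from this index) of reading Pub/Sub payloads as strings; "p" is an assumed Pipeline and the subscription path is a placeholder:

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read message payloads as UTF-8 strings from a subscription (path is a placeholder).
    PCollection<String> messages =
        p.apply(PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-subscription"));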

G

Gauge - Interface in org.apache.beam.sdk.metrics
A metric that reports the latest value out of reported values.
gauge(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
gauge(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
Create a metric that can have its new value set, and is aggregated by taking the last reported value.
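For illustration (hypothetical DoFn and metric name), a sketch of declaring and updating a Gauge from user code:

    import org.apache.beam.sdk.metrics.Gauge;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Report the latest observed value of some quantity from inside a DoFn.
    class MyDoFn extends DoFn<Long, Long> {
      private final Gauge backlog = Metrics.gauge(MyDoFn.class, "backlogElements");

      @ProcessElement
      public void processElement(ProcessContext c) {
        backlog.set(c.element());  // keeps only the most recently reported value
        c.output(c.element());
      }
    }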
GaugeResult - Class in org.apache.beam.sdk.metrics
The result of a Gauge metric.
GaugeResult() - Constructor for class org.apache.beam.sdk.metrics.GaugeResult
 
GaugeResult.EmptyGaugeResult - Class in org.apache.beam.sdk.metrics
Empty GaugeResult, representing no values reported.
gauges() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
Return the metric results for the gauges that matched the filter.
GcpCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
Construct an OAuth credential to be used by the SDK and the SDK workers.
GcpCredentialFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
 
GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
A registrar containing the default GCP options.
GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
GcpOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
Options used to configure Google Cloud Platform specific options such as the project and credentials.
GcpOptions.DefaultProjectFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Attempts to infer the default project based upon the environment this application is executing within.
GcpOptions.GcpTempLocationFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Returns PipelineOptions.getTempLocation() as the default GCP temp location.
GcpOptions.GcpUserCredentialsFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Attempts to load the GCP credentials.
GcpPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.gcp.options
A registrar containing the default GCP options.
GcpPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
 
GcpTempLocationFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
 
GcpUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
 
GcsCreateOptions - Class in org.apache.beam.sdk.extensions.gcp.storage
An abstract class that contains common configuration options for creating resources.
GcsCreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
 
GcsCreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.storage
A builder for GcsCreateOptions.
GcsFileSystemRegistrar - Class in org.apache.beam.sdk.extensions.gcp.storage
AutoService registrar for the GcsFileSystem.
GcsFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
 
GcsOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
Options used to configure Google Cloud Storage.
GcsOptions.ExecutorServiceFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Returns the default ExecutorService to use within the Apache Beam SDK.
GcsOptions.PathValidatorFactory - Class in org.apache.beam.sdk.extensions.gcp.options
Creates a PathValidator object using the class specified in GcsOptions.getPathValidatorClass().
GcsPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
GCP implementation of PathValidator.
GcsResourceId - Class in org.apache.beam.sdk.extensions.gcp.storage
ResourceId implementation for Google Cloud Storage.
GcsStager - Class in org.apache.beam.runners.dataflow.util
Utility class for staging files to GCS.
gcsUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
The buffer size (in bytes) to use when uploading files to GCS.
GenerateSequence - Class in org.apache.beam.sdk.io
A PTransform that produces longs starting from the given value, either up to the given limit or up to Long.MAX_VALUE, or until the given time elapses.
GenerateSequence() - Constructor for class org.apache.beam.sdk.io.GenerateSequence
 
get() - Static method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
get() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
Returns the Broadcast containing the GlobalWatermarkHolder.SparkWatermarks mapped to their sources.
get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
get() - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
get() - Method in interface org.apache.beam.sdk.options.ValueProvider
Return the value wrapped by this ValueProvider.
get() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
get() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
get() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
get(K) - Method in interface org.apache.beam.sdk.state.MapState
A deferred lookup.
get(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
Returns the value represented by the given TupleTag.
get(int) - Method in class org.apache.beam.sdk.values.PCollectionList
Returns the PCollection at the given index (origin zero).
get(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns the PCollection associated with the given TupleTag in this PCollectionTuple.
get(int) - Method in class org.apache.beam.sdk.values.TupleTagList
Returns the TupleTag at the given index (origin zero).
getAccum() - Method in interface org.apache.beam.sdk.state.CombiningState
Read the merged accumulator for this state cell.
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
getAccumulatorCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the Coder to use for accumulator AccumT values, or null if it is not able to be inferred.
getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
getAdaptedSource() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
Returns an adapted BoundedSource wrapping the underlying UnboundedSource, with the specified bounds on number of records and read time.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns the side inputs of this Combine, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns the side inputs of this Combine, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
Returns the side inputs of this ParDo, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns the side inputs of this ParDo, tagged with the tag of the PCollectionView.
getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns all PValues that are consumed as inputs to this PTransform that are independent of the expansion of the InputT within PTransform.expand(PInput).
getAdditionalOutputTags() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getAlgorithm() - Method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
Returns the string representation of this type.
getAll(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns the values from the table represented by the given TupleTag<V> as an Iterable<V> (which may be empty if there are no results).
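For illustration only, a join sketch showing where getAll(tag) is used; "emails" and "phones" are hypothetical PCollection<KV<String, String>>s:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    // Join two keyed collections, then read each side of the join via getAll(tag).
    // "emails" and "phones" are hypothetical PCollection<KV<String, String>>s.
    final TupleTag<String> emailsTag = new TupleTag<>();
    final TupleTag<String> phonesTag = new TupleTag<>();
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(emailsTag, emails)
            .and(phonesTag, phones)
            .apply(CoGroupByKey.<String>create());
    // Inside a downstream DoFn:
    //   Iterable<String> userEmails = c.element().getValue().getAll(emailsTag);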
getAll() - Method in class org.apache.beam.sdk.values.PCollectionList
Returns an immutable List of all the PCollections in this PCollectionList.
getAll() - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns an immutable Map from TupleTag to corresponding PCollection, for all the members of this PCollectionTuple.
getAll() - Method in class org.apache.beam.sdk.values.TupleTagList
Returns an immutable List of all the TupleTags in this TupleTagList.
getAllowedLateness() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.DoFn
Deprecated.
This method permits a DoFn to emit elements behind the watermark. These elements are considered late, and if behind the allowed lateness of a downstream PCollection may be silently dropped. See https://issues.apache.org/jira/browse/BEAM-644 for details on a replacement.
getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.WithTimestamps
Deprecated.
This method permits elements to be emitted behind the watermark. These elements are considered late, and if behind the allowed lateness of a downstream PCollection may be silently dropped. See https://issues.apache.org/jira/browse/BEAM-644 for details on a replacement.
getApexDAG() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
Return the DAG executed by the pipeline.
getApexLauncher() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.LaunchParams
 
getApiRootUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The root URL for the Dataflow API.
getApplicationName() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
getAppliedFn(CoderRegistry, Coder<? extends KV<K, ? extends Iterable<InputT>>>, WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
Returns the Combine.CombineFn bound to its coders.
getAppName() - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
Name of application, for display purposes.
getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getArgumentTypes(Method) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a list of argument types for the given method, which must be a part of the class.
getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the given attribute value.
getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the full map of attributes.
getAutoscalingAlgorithm() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
[Experimental] The autoscaling algorithm to use for the worker pool.
getBaseOutputDirectoryProvider() - Method in class org.apache.beam.sdk.io.FileBasedSink
Returns the base directory inside which files will be written according to the configured FileBasedSink.FilenamePolicy.
getBatches() - Method in class org.apache.beam.runners.spark.io.CreateStream
Get the underlying queue representing the mock stream of micro-batches.
getBatchIntervalMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns the Google Cloud Bigtable instance being read from, and other parameters.
getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns the Google Cloud Bigtable instance being written to, and other parameters.
getBoolean(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBoolean(Map<String, Object>, String, Boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBytes(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBytes(Map<String, Object>, String, byte[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getBytes() - Method in class org.apache.beam.sdk.io.range.ByteKey
Returns a newly-allocated byte[] representing this ByteKey.
getBytesPerOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns approximately how many bytes of data correspond to a single offset in this source.
getCause() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
Returns the reason that this WindowFn is invalid.
getChannelFactory() - Method in class org.apache.beam.sdk.io.CompressedSource
 
getCheckpointDir() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getCheckpointDurationMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getCheckpointingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getCheckpointMark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCheckpointMark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns a UnboundedSource.CheckpointMark representing the progress of this UnboundedReader.
getCheckpointMarkCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.UnboundedSource
Returns a Coder for encoding and decoding the checkpoints for this source.
getClasses() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a set of TypeDescriptors, one for each superclass (including this class).
getClassName() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
Gets the name of the Java class that this CloudObject represents.
getCloningBehavior() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Indicates whether this DoFnTester will clone the DoFn under test.
getClosingBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getCmd() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.LaunchParams
 
getCoder(Class<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Returns the Coder to use for values of the given class.
getCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Returns the Coder to use for values of the given type.
getCoder(TypeDescriptor<OutputT>, TypeDescriptor<InputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Deprecated.
getCoder(Class<? extends T>, Class<T>, Map<Type, ? extends Coder<?>>, TypeVariable<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Deprecated.
getCoder() - Method in class org.apache.beam.sdk.coders.DelegateCoder
Returns the coder used to encode/decode the intermediate values produced/consumed by the coding functions of this DelegateCoder.
getCoder() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns a Coder suitable for IntervalWindow.
getCoder() - Method in class org.apache.beam.sdk.values.PCollection
Returns the Coder used by this PCollection to encode and decode the values stored in it.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.AtomicCoder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.Coder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.CustomCoder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.MapCoder
If this is a Coder for a parameterized type, returns the list of Coders being used for each of the parameters in the same order they appear within the parameterized type's type signature.
getCoderArguments() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
getCoderInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The PCollection underlying a side input, including its Coder, is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
getCoderInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getCoderProvider() - Static method in class org.apache.beam.sdk.coders.AvroCoder
Returns a CoderProvider which uses the AvroCoder if possible for all types.
getCoderProvider() - Static method in class org.apache.beam.sdk.coders.SerializableCoder
Returns a CoderProvider which uses the SerializableCoder if possible for all types.
getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a CoderProvider which uses the ProtoCoder for proto messages.
getCoderProvider() - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
Returns a CoderProvider which uses the WritableCoder for Hadoop writable types.
getCoderProviders() - Method in interface org.apache.beam.sdk.coders.CoderProviderRegistrar
Returns a list of coder providers which will be registered by default within each coder registry instance.
getCoderProviders() - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
 
getCoderRegistry() - Method in class org.apache.beam.sdk.Pipeline
Returns the CoderRegistry that this Pipeline uses.
getCoGbkResultSchema() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns the CoGbkResultSchema associated with this KeyedPCollectionTuple.
getCollection() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
Returns the underlying PCollection of this TaggedKeyedPCollection.
getCombineFn() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators.CombineFunctionState
 
getCombineFn() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
getCombineFn() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
getComponents() - Method in class org.apache.beam.sdk.coders.AtomicCoder
Returns the list of Coders that are components of this Coder.
getComponents() - Method in class org.apache.beam.sdk.coders.StructuredCoder
Returns the list of Coders that are components of this Coder.
getComponents() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Hierarchy list of component paths making up the full path, starting with the top-level child component path.
getComponents() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
getComponents() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getComponents() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
getComponentType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the component type if this type is an array type, otherwise returns null.
getConfigFile() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
getContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the container version that will be used for constructing harness image paths.
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
Deprecated.
 
getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
Return a trigger to use after a GroupByKey to preserve the intention of this trigger.
getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
Subclasses should override this to return the Trigger.getContinuationTrigger() of this Trigger.
getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
 
getCount() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Counter that should be used for implementing the given metricName in this container.
getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.auth.CredentialFactory
 
getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
Returns the default GCP Credentials, or null if they cannot be obtained.
getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
getCredentialFactoryClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
The class of the credential factory that should be created and used to create credentials.
getCurrent() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCurrent() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
getCurrent() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Gets the current record from the delegate reader.
getCurrent() - Method in class org.apache.beam.sdk.io.Source.Reader
Returns the value of the data item that was read by the last Source.Reader.start() or Source.Reader.advance() call.
getCurrentBlock() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
getCurrentBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns the current block (the block that was read by the last successful call to BlockBasedSource.BlockBasedReader.readNextBlock()).
getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns the largest offset such that starting to read from that offset includes the current block.
getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns the size of the current block in bytes as it is represented in the underlying file, if possible.
getCurrentContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Return the MetricsContainer for the current thread.
getCurrentDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
getCurrentDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns the ResourceId that represents the current directory of this ResourceId.
getCurrentOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
getCurrentOffset() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
getCurrentOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Returns the starting offset of the current record, which has been read by the last successful Source.Reader.start() or Source.Reader.advance() call.
getCurrentRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
Returns the current record.
getCurrentRecordId() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns a unique identifier for the current record.
getCurrentSource() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns a Source describing the same input that this Reader currently reads (including items already read).
getCurrentSource() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getCurrentSource() - Method in class org.apache.beam.sdk.io.Source.Reader
Returns a Source describing the same input that this Reader currently reads (including items already read).
getCurrentSource() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns the UnboundedSource that created this reader.
getCurrentTimestamp() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
By default, returns the minimum possible timestamp.
getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.Source.Reader
Returns the timestamp associated with the current data item.
getData() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getDataAsBytes() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getDataCoder() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
Deprecated.
 
getDataflowClient() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
An instance of the Dataflow client.
getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Dataflow endpoint to use.
getDataflowJobFile() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The path to which the translated Dataflow job specification is written at job submission time.
getDataflowRunnerInfo() - Static method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Returns an instance of DataflowRunnerInfo.
getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getDebuggee() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
The Cloud Debugger debuggee to associate with.
getDefaultCoder(TypeDescriptor<?>, CoderRegistry) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
Returns the default coder for a given type descriptor.
getDefaultOutputCoder() - Method in class org.apache.beam.runners.spark.io.CreateStream
 
getDefaultOutputCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getDefaultOutputCoder(PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>>) - Method in class org.apache.beam.sdk.extensions.sorter.SortValues
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.AvroSource
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.CompressedSource
Returns the delegate source's default output coder.
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Read.Bounded
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Source
Returns the default Coder to use for the data read from this source.
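As a hedged sketch, a custom source can advertise its record coder by overriding this method; MyStringSource below is hypothetical and leaves the remaining abstract members unimplemented:

  import org.apache.beam.sdk.coders.Coder;
  import org.apache.beam.sdk.coders.StringUtf8Coder;
  import org.apache.beam.sdk.io.BoundedSource;

  // Hypothetical source that reads UTF-8 strings; only the coder hook is shown.
  abstract class MyStringSource extends BoundedSource<String> {
    @Override
    public Coder<String> getDefaultOutputCoder() {
      return StringUtf8Coder.of();
    }
  }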
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.TextIO.Read
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.TextIO.Write
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
getDefaultOutputCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
getDefaultOutputCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
getDefaultOutputCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
getDefaultOutputCoder(PCollection<? extends KV<K, ? extends Iterable<InputT>>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the Coder to use by default for output OutputT values, or null if it cannot be inferred.
getDefaultOutputCoder(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
 
getDefaultOutputCoder(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.Values
 
getDefaultOutputCoder(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Filter
 
getDefaultOutputCoder(PCollectionList<T>) - Method in class org.apache.beam.sdk.transforms.Flatten.PCollections
 
getDefaultOutputCoder(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getDefaultOutputCoder(PCollection<? extends InputT>, PCollection<T>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getDefaultOutputCoder(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
getDefaultOutputCoder() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns the default Coder to use for the output of this single-output PTransform.
getDefaultOutputCoder(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
Returns the default Coder to use for the output of this single-output PTransform when applied to the given input.
getDefaultOutputCoder(InputT, PCollection<T>) - Method in class org.apache.beam.sdk.transforms.PTransform
Returns the default Coder to use for the given output of this single-output PTransform when applied to the given input.
getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
getDefaultOutputCoder(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
Returns the default value that was specified.
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Return a WindowMappingFn that returns the earliest window that contains the end of the main-input window.
getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns the default WindowMappingFn to use to map main input windows to side input windows.
getDefaultWorkerLogLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
This option controls the default log level of all loggers without a log level override.
getDelay() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
 
getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns an object that represents, at a high level, which table is being written to.
getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the coder for DestinationT.
getDestinationFile(FileBasedSink.FilenamePolicy, ResourceId, int, String) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getDictionary(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getDiskSizeGb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Remote worker disk size, in gigabytes, or 0 to use the default size.
getDistribution(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Distribution that should be used for implementing the given metricName in this container.
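MetricsContainer is runner-facing; user code normally reports through the Metrics factory methods, which resolve against the container for the current thread (see MetricsEnvironment.getCurrentContainer()). A minimal sketch, with RecordSizes and the metric name chosen here purely for illustration:

  import org.apache.beam.sdk.metrics.Distribution;
  import org.apache.beam.sdk.metrics.Metrics;
  import org.apache.beam.sdk.transforms.DoFn;

  // Reports the length of each element to a Distribution metric, then passes the element through.
  class RecordSizes extends DoFn<String, String> {
    private final Distribution sizes = Metrics.distribution(RecordSizes.class, "elementSize");

    @ProcessElement
    public void process(ProcessContext c) {
      sizes.update(c.element().length());
      c.output(c.element());
    }
  }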
getDoFn() - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
Returns the embedded function.
getDumpHeapOnOOM() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
If true, save a heap dump before killing a thread or process which is GC thrashing or out of memory.
getEarlyTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getElemCoder() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
getElementCoders() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
getElementCount() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
The number of elements after which this trigger may fire.
getElements() - Method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
 
getElements() - Method in class org.apache.beam.sdk.transforms.Create.Values
 
getEnableCloudDebugger() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
Whether to enable the Cloud Debugger snapshot agent for the current job.
getEnableMetrics() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getEnableSparkMetricSinks() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getEncodedElementByteSize(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
getEncodedElementByteSize(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
getEncodedElementByteSize(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.Coder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
getEncodedElementByteSize(String) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
Returns the size in bytes of the encoded value using this coder.
getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
getEncodedElementByteSize(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
 
getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.Coder
Returns the TypeDescriptor for the type encoded.
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.CollectionCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DoubleCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DurationCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.InstantCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.IterableCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ListCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.MapCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.NullableCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SetCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VoidCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getEndKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns the ByteKey representing the upper bound of this ByteKeyRange.
getEndOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the specified ending offset of the source.
getEnv() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.LaunchParams
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
An estimate of the total size (in bytes) of the data that would be read from this source.
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
getEvents() - Method in class org.apache.beam.sdk.testing.TestStream
Returns the sequence of Events in this TestStream.
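A minimal sketch of how those Events come to exist: each builder call below contributes one event, and advanceWatermarkToInfinity() finalizes the stream (the element values and times are arbitrary):

  import org.apache.beam.sdk.coders.StringUtf8Coder;
  import org.apache.beam.sdk.testing.TestStream;
  import org.joda.time.Duration;
  import org.joda.time.Instant;

  class TestStreamExample {
    // Builds a TestStream whose getEvents() returns the steps below, in order.
    static TestStream<String> build() {
      return TestStream.create(StringUtf8Coder.of())
          .addElements("a", "b")                              // ElementEvent
          .advanceProcessingTime(Duration.standardSeconds(5)) // ProcessingTimeEvent
          .advanceWatermarkTo(new Instant(100L))              // WatermarkEvent
          .advanceWatermarkToInfinity();                      // final WatermarkEvent
    }
  }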
getExecutionRetryDelay() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getExecutorService() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The ExecutorService instance to use to create threads; can be overridden to specify an ExecutorService that is compatible with the user's environment.
getExpectedAssertions() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
getExperiments() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The list of backend experiments to enable.
getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getExtension() - Method in class org.apache.beam.sdk.io.FileBasedSink
Returns the extension that will be written to the produced files.
getExtensionHosts() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
getExtensionRegistry() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns the ExtensionRegistry listing all known Protocol Buffers extension messages to T registered with this ProtoCoder.
getFanout() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
getFilename() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
getFilename() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns the name of the file or directory denoted by this ResourceId.
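A hedged sketch of these ResourceId accessors for a hypothetical GCS path (assumes the GCS filesystem from the gcp extension is on the classpath so the gs scheme resolves):

  import org.apache.beam.sdk.io.FileSystems;
  import org.apache.beam.sdk.io.fs.ResourceId;

  class ResourceIdDemo {
    static void demo() {
      ResourceId file =
          FileSystems.matchNewResource("gs://my-bucket/logs/part-00000.txt", false /* isDirectory */);
      ResourceId dir = file.getCurrentDirectory(); // gs://my-bucket/logs/
      String name = file.getFilename();            // "part-00000.txt"
      String scheme = file.getScheme();            // "gs"
      System.out.println(scheme + " " + dir + " " + name);
    }
  }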
getFilenamePolicy() - Method in class org.apache.beam.sdk.io.FileBasedSink
Returns the policy by which files will be named inside of the base output directory.
getFilenameSuffix() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
getFilenameSuffix() - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
 
getFileOrPatternSpec() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getFileOrPatternSpecProvider() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getFilesToStage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
List of local files to make available to workers.
getFilesToStage() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
List of local files to make available to workers.
getFlinkMaster() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
The URL of the Flink JobManager on which to execute pipelines.
getFn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
getFn() - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
Deprecated.
getFn() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
getFn() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
getFn() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns the CombineFnBase.GlobalCombineFn used by this Combine operation.
getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
getFnApiEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the FnAPI environment's major version number.
getFractionConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
 
getFractionConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns a value in [0, 1] representing approximately what fraction of the current source this reader has read so far, or null if such an estimate is not available.
getFractionConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
getFractionConsumed() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Returns the approximate fraction of positions in the source that have been consumed by successful RangeTracker.tryReturnRecordAt(boolean, PositionT) calls, or 0.0 if no such calls have happened.
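A minimal sketch using OffsetRangeTracker (the offsets are arbitrary): the reported fraction grows as records at increasing positions are claimed.

  import org.apache.beam.sdk.io.range.OffsetRangeTracker;

  class FractionConsumedDemo {
    static double demo() {
      OffsetRangeTracker tracker = new OffsetRangeTracker(0L, 100L);
      tracker.tryReturnRecordAt(true, 25L);  // claim a record starting at offset 25 (a split point)
      return tracker.getFractionConsumed();  // roughly 0.25 of the range [0, 100)
    }
  }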
getFractionOfBlockConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
Returns the fraction of the block already consumed, if possible, as a value in [0, 1].
getFrom() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRange
 
getGapDuration() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
getGauge(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
Return the Gauge that should be used for implementing the given metricName in this container.
getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getGcloudCancelCommand(DataflowPipelineOptions, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
getGcpCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
The credential instance that should be used to authenticate against GCP services.
getGcpTempLocation() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
A GCS path for storing temporary files in GCP.
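A minimal sketch of reading these GCP options after parsing command-line flags such as --project and --gcpTempLocation (the flag values are supplied by the caller):

  import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
  import org.apache.beam.sdk.options.PipelineOptionsFactory;

  class GcpOptionsExample {
    static void describe(String[] args) {
      GcpOptions gcp = PipelineOptionsFactory.fromArgs(args).withValidation().as(GcpOptions.class);
      System.out.println("project: " + gcp.getProject());
      System.out.println("temp location: " + gcp.getGcpTempLocation());
    }
  }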
getGcsEndpoint() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
GCS endpoint to use.
getGcsUploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The buffer size (in bytes) to use when uploading files to GCS.
getGcsUtil() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The GcsUtil instance that should be used to communicate with Google Cloud Storage.
getGoogleApiTrace() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
This option enables tracing of API calls to Google services used within the Apache Beam SDK.
getHadoopConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableConfiguration
 
getHdfsConfiguration() - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
 
getHighWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getId() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getId() - Method in class org.apache.beam.sdk.values.TupleTag
Returns the id of this TupleTag.
getId() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the id attribute.
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the id attribute.
getIncompatibleGlobalWindowErrorMessage() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
Returns the error message for unsupported default values in Combine.globally().
getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
getIndex(TupleTag<?>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the index for the given tuple tag if the tag is present in this schema, or -1 if it isn't.
getIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
The zero-based index of this trigger firing that produced this pane.
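A minimal sketch of reading pane metadata inside a DoFn; TagWithPaneIndex is a hypothetical name:

  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.transforms.windowing.PaneInfo;

  // Prefixes each element with the zero-based index of the trigger firing that produced its pane.
  class TagWithPaneIndex extends DoFn<String, String> {
    @ProcessElement
    public void process(ProcessContext c) {
      PaneInfo pane = c.pane();
      c.output(pane.getIndex() + ":" + c.element());
    }
  }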
getInputCoder() - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
 
getinputFormatClass() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getinputFormatKeyClass() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getinputFormatValueClass() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
Returns a TypeDescriptor capturing what is known statically about the input type of this DoFn instance's most-derived class.
getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.SimpleFunction
Returns a TypeDescriptor capturing what is known statically about the input type of this SimpleFunction instance's most-derived class.
getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns the Coder of the values of the input to this transform.
getInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
getInstance() - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
 
getInstance() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
getInt(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getInt(Map<String, Object>, String, Integer) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getInterfaces() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a set of TypeDescriptors, one for each interface implemented by this class.
getJAXBClass() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
getJmsCorrelationID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsDeliveryMode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsDestination() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsExpiration() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsMessageID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsPriority() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsRedelivered() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsReplyTo() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJmsType() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getJob(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Gets the Dataflow Job with the given jobId.
getJob() - Method in exception org.apache.beam.runners.dataflow.DataflowJobException
Returns the failed job.
getJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
getJobId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Get the id of this job.
getJobId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
The identity of the Dataflow job.
getJobId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getJobMessages(String, long) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
Return job messages sorted in ascending order by timestamp.
getJobMetrics(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Gets the JobMetrics with the given jobId.
getJobMonitoringPageURL(String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
getJobName() - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
The key for the display item.
getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The key for the display item.
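A minimal sketch of walking the registered display items of any component through these accessors (the component is supplied by the caller):

  import org.apache.beam.sdk.transforms.display.DisplayData;
  import org.apache.beam.sdk.transforms.display.HasDisplayData;

  class DisplayDataDump {
    // Prints namespace, key, value, and label for every registered display item.
    static void print(HasDisplayData component) {
      for (DisplayData.Item item : DisplayData.from(component).items()) {
        System.out.println(item.getNamespace() + ":" + item.getKey()
            + " = " + item.getValue() + " (" + item.getLabel() + ")");
      }
    }
  }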
getKey() - Method in class org.apache.beam.sdk.values.KV
Returns the key of this KV.
getKeyCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getKeyCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
 
getKeyCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns the Coder of the keys of the input to this transform, which is also used as the Coder of the keys of the output of this transform.
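A minimal sketch of the component-coder accessors this relies on, using a KvCoder built from standard coders:

  import org.apache.beam.sdk.coders.Coder;
  import org.apache.beam.sdk.coders.KvCoder;
  import org.apache.beam.sdk.coders.StringUtf8Coder;
  import org.apache.beam.sdk.coders.VarLongCoder;

  class KvCoderComponents {
    static void demo() {
      KvCoder<String, Long> kvCoder = KvCoder.of(StringUtf8Coder.of(), VarLongCoder.of());
      Coder<String> keyCoder = kvCoder.getKeyCoder();   // StringUtf8Coder
      Coder<Long> valueCoder = kvCoder.getValueCoder(); // VarLongCoder
      System.out.println(keyCoder + " / " + valueCoder);
    }
  }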
getKeyCoder() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns the key Coder for all PCollections in this KeyedPCollectionTuple.
getKeyedCollections() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns a list of TaggedKeyedPCollections for the PCollections contained in this KeyedPCollectionTuple.
getKeyRange() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns the range of keys that will be read from the table.
getKeyRange() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns the range of keys that will be read from the table.
getKeyTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getKeyTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getKindString() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
getKindString() - Method in class org.apache.beam.sdk.io.Read.Bounded
 
getKindString() - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
getKindString() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns the name to use by default for this PTransform (not including the names of any enclosing PTransforms).
getKindString() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
getKindString() - Method in class org.apache.beam.sdk.values.PValueBase
Returns a String capturing the kind of this PValueBase.
getKV() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the optional label for an item.
getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The optional label for an item.
getLastEmitted() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
Returns the last value emitted by the reader.
getLateTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getLegacyEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
Provides the legacy environment's major version number.
getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the optional link URL for an item.
getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The optional link URL for an item.
getListeners() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
getListOfMaps(Map<String, Object>, String, List<Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getLocalValue() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
getLong(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getLong(Map<String, Object>, String, Long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getLowWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
getMainOutput() - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
 
getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getMainTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
The main trigger, which will continue firing until the "until" trigger fires.
getMaterialization() - Method in class org.apache.beam.sdk.transforms.ViewFn
Gets the materialization of this ViewFn.
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
Deprecated.
 
getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
Deprecated.
 
getMaxConditionCost() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
The maximum cost (as a ratio of CPU time) allowed for evaluating conditional snapshots.
getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the actual ending offset of the current source.
getMaxNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
The maximum number of workers to use for the worker pool.
getMaxRecordsPerBatch() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getMean() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Returns the configured size of the memory buffer.
getMessage() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
getMessages() - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
getMessageType() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns the Protocol Buffers Message type this ProtoCoder supports.
getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getMimeType() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
 
getMimeType() - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
Returns the MIME type that should be used for the files that will hold the output data.
getMinBundleSize() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the minimum bundle size that should be used when splitting the source into sub-sources.
getMinReadTimeMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getMode() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
getMode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getMonthOfYear() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getMutableOutput(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
getName() - Method in interface org.apache.beam.sdk.metrics.Metric
The MetricName given to this metric.
getName() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
If set, the metric must have this name to match this MetricNameFilter.
getName() - Method in class org.apache.beam.sdk.transforms.PTransform
Returns the transform name.
getName() - Method in class org.apache.beam.sdk.values.PCollection
Returns the name of this PCollection.
getName() - Method in interface org.apache.beam.sdk.values.PValue
Returns the name of this PValue.
getName() - Method in class org.apache.beam.sdk.values.PValueBase
Returns the name of this PValueBase.
getNameOverride() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
getNameOverride() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
The namespace (inNamespace) that a metric must be in to match this MetricNameFilter.
getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
The namespace for the display item.
getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The namespace for the display item.
getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
getNetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
GCE network for launching workers.
getNextOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
getNonSpeculativeIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
The zero-based index of this trigger firing among non-speculative panes.
getNum() - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
 
getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getNumberOfExecutionRetries() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getNumberOfWorkerHarnessThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Number of threads to use on the Dataflow worker harness.
getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getNumShards() - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.Context
 
getNumShards() - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.WindowedContext
 
getNumShards() - Method in class org.apache.beam.sdk.io.WriteFiles
 
getNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Number of workers to use when executing the Dataflow job.
getObject(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getObjectReuse() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
getOldestPendingTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
getOnCreateMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
getOnly(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
If there is a singleton value for the given tag, returns it.
getOnly(TupleTag<V>, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
If there is a singleton value for the given tag, returns it.
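A minimal sketch of consuming a CoGroupByKey result with these accessors; emailsTag and phonesTag are hypothetical and must be the same TupleTag instances used to build the upstream KeyedPCollectionTuple:

  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.transforms.join.CoGbkResult;
  import org.apache.beam.sdk.values.KV;
  import org.apache.beam.sdk.values.TupleTag;

  class FormatJoin extends DoFn<KV<String, CoGbkResult>, String> {
    private final TupleTag<String> emailsTag;
    private final TupleTag<String> phonesTag;

    FormatJoin(TupleTag<String> emailsTag, TupleTag<String> phonesTag) {
      this.emailsTag = emailsTag;
      this.phonesTag = phonesTag;
    }

    @ProcessElement
    public void process(ProcessContext c) {
      CoGbkResult joined = c.element().getValue();
      String email = joined.getOnly(emailsTag, "none");    // default used if the tag has no value
      Iterable<String> phones = joined.getAll(phonesTag);  // possibly empty
      c.output(c.element().getKey() + "," + email + "," + phones);
    }
  }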
getOnSuccessMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
getOptions() - Method in class org.apache.beam.sdk.testing.TestPipeline
 
getOptionsId() - Method in interface org.apache.beam.sdk.options.PipelineOptions
Provides a unique ID for this PipelineOptions object, assigned at graph construction time.
getOrCreateReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
getOriginalWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
Returns the original windowFn that this InvalidWindows replaced.
getOutName(int) - Method in class org.apache.beam.sdk.values.TupleTag
If this TupleTag is tagging output outputIndex of a PTransform, returns the name that should be used by default for the output.
getOutputCoder(SerializableFunction<InputT, OutputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Deprecated.
getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
Returns the Coder of the output of this transform.
getOutputMap() - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
 
getOutputStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Get the output strategy of this Window PTransform.
getOutputTime(Instant, GlobalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
getOutputTime(Instant, W) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
getOutputTime(Instant, W) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
getOutputTime(Instant, IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Ensures that later sliding windows have an output time that is past the end of earlier windows.
getOutputTime(Instant, W) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns the output timestamp to use for data depending on the given inputTimestamp in the specified window.
getOutputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns a TypeDescriptor capturing what is known statically about the output type of this CombineFn instance's most-derived class.
getOutputTypeDescriptor() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
Returns a TypeDescriptor capturing what is known statically about the output type of this DoFn instance's most-derived class.
getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.SimpleFunction
Returns a TypeDescriptor capturing what is known statically about the output type of this SimpleFunction instance's most-derived class.
getOverrideWindmillBinary() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Custom windmill_main binary to use with the streaming runner.
getPane() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the pane of this ValueInSingleWindow in its window.
getPaneInfo() - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.WindowedContext
 
getPaneInfo() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getPartitionKey() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
 
getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
getPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
The path for the display item within a component hierarchy.
getPathValidator() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The path validator instance that should be used to validate paths.
getPathValidatorClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
The class of the validator that should be created and used to validate paths.
getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the payload of the PubSub message.
getPayload() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getPCollection() - Method in interface org.apache.beam.sdk.values.PCollectionView
For internal use only.
getPCollection() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
getPipeline() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
getPipeline() - Method in class org.apache.beam.sdk.values.PBegin
 
getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionList
 
getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
getPipeline() - Method in class org.apache.beam.sdk.values.PDone
 
getPipeline() - Method in interface org.apache.beam.sdk.values.PInput
Returns the owning Pipeline of this PInput.
getPipeline() - Method in interface org.apache.beam.sdk.values.POutput
Returns the owning Pipeline of this POutput.
getPipeline() - Method in class org.apache.beam.sdk.values.PValueBase
 
getPipelineOptions() - Method in class org.apache.beam.runners.apex.ApexRunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.direct.DirectRunner
Returns the PipelineOptions used to create this DirectRunner.
getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunner
For testing.
getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.runners.flink.TestFlinkRunner
 
getPipelineOptions() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
 
getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
 
getPipelineOptions() - Method in class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
 
getPipelineOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptionsRegistrar
 
getPipelineOptions() - Method in interface org.apache.beam.sdk.state.StateContext
Returns the PipelineOptions specified with the PipelineRunner.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
Returns the PipelineOptions specified with the PipelineRunner invoking this KeyedCombineFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFnTester
 
getPipelineRunners() - Method in class org.apache.beam.runners.apex.ApexRunnerRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
 
getPipelineRunners() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
 
getPositionForFractionConsumed(double) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Returns a position P such that the range [start, P) represents approximately the given fraction of the range [start, end).
getProcessingTimeAdvance() - Method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
 
getProfilingAgentConfiguration() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
getProject() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
getProject() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
Project id to use when launching jobs.
getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the project path.
getProjectId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Get the project this job exists in.
getProjectId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getProperties() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
getProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
Root URL for use with the Google Cloud Pub/Sub API.
getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getRange() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
Returns the current range.
getRawType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the Class underlying the Type represented by this TypeDescriptor.
getReadDurationMillis() - Method in class org.apache.beam.runners.spark.io.SparkUnboundedSource.Metadata
 
getReadTime() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getReadTimePercentage() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getReason() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
 
getReasons() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
getRecordType() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
getRegion() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
The Google Compute Engine region for creating Dataflow jobs.
getRegisteredOptions() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
 
getRepeatedTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Returns a new DataflowPipelineJob for the job that replaced this one, if applicable.
getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollection<OutputT>, ParDo.SingleOutput<InputT, OutputT>>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
 
getRetainExternalizedCheckpointsOnCancellation() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
getRootCause() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
Returns the inner-most CannotProvideCoderException when they are deeply nested.
getRunMillis() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
getRunner() - Method in interface org.apache.beam.sdk.options.PipelineOptions
The pipeline runner that will be used to execute the pipeline.
getSaveProfilesToGcs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
getSchema() - Method in class org.apache.beam.sdk.coders.AvroCoder
Returns the schema used by this coder.
getSchema() - Method in class org.apache.beam.sdk.io.AvroSource
 
getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the table schema for the destination.
getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns the schema used by this CoGbkResult.
getScheme() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
getScheme() - Method in class org.apache.beam.sdk.io.FileSystem
Get the URI scheme which defines the namespace of the FileSystem.
getScheme() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Get the scheme which defines the namespace of the ResourceId.
getSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getServiceAccount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Run the job as a specific service account, instead of the default GCE robot.
getShard() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getShardId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
Gets the PTransform that will be used to determine sharding.
getShardNumber() - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.Context
 
getShardNumber() - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.WindowedContext
 
getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Return the optional short value for an item, or null if none is provided.
getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The optional short value for an item, or null if none is provided.
getSideInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
 
getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Specifies that this object needs access to one or more side inputs.
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns the side inputs used by this Combine operation.
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns the side inputs used by this Combine operation.
getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
 
getSideInputViews() - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
 
getSideInputWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
Returns the window of the side input corresponding to the given window of the main input.
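A minimal sketch of mapping a main-input window to a side-input window, here using the mapping that SlidingWindows provides (the window bounds are arbitrary):

  import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
  import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
  import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
  import org.apache.beam.sdk.transforms.windowing.WindowMappingFn;
  import org.joda.time.Duration;
  import org.joda.time.Instant;

  class SideInputWindowDemo {
    static BoundedWindow mapToSideInputWindow() {
      SlidingWindows windowFn =
          SlidingWindows.of(Duration.standardMinutes(10)).every(Duration.standardMinutes(1));
      WindowMappingFn<IntervalWindow> mapping = windowFn.getDefaultWindowMappingFn();
      IntervalWindow mainWindow = new IntervalWindow(new Instant(0L), new Instant(300000L));
      return mapping.getSideInputWindow(mainWindow); // earliest sliding window containing mainWindow's end
    }
  }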
getSingleFileMetadata() - Method in class org.apache.beam.sdk.io.FileBasedSource
Returns the information about the single file that this source is reading from.
getSink() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Returns the FileBasedSink for this write operation.
getSink() - Method in class org.apache.beam.sdk.io.WriteFiles
Returns the FileBasedSink associated with this PTransform.
getSize() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
getSize() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
getSource() - Method in class org.apache.beam.sdk.io.Read.Bounded
Returns the BoundedSource used to create this Read PTransform.
getSource() - Method in class org.apache.beam.sdk.io.Read.Unbounded
Returns the UnboundedSource used to create this Read PTransform.
getSource() - Method in class org.apache.beam.sdk.io.TextIO.Read
 
getSource() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
getSparkMaster() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getSplit() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableSplit
 
getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns the size of the backlog of unread data in the underlying data source represented by this split of this source.
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns the total amount of parallelism in the consumed (returned and processed) range of this reader's current BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getSplitPointsProcessed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Returns the total number of split points that have been processed.
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Returns the total amount of parallelism in the unprocessed part of this reader's current BoundedSource (as would be returned by BoundedSource.BoundedReader.getCurrentSource()).
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
 
getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
getStableUniqueNames() - Method in interface org.apache.beam.sdk.options.PipelineOptions
Whether to check for stable unique names on each transform.
getStager() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The resource stager instance that should be used to stage resources.
getStagerClass() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
The class responsible for staging resources to be accessible by workers during job execution.
getStagingLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
GCS path for staging local files, e.g. a gs:// bucket path.
getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getStartKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns the ByteKey representing the lower bound of this ByteKeyRange.
getStartOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
Returns the starting offset of the source.
getStartPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getStartPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
getStartPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Returns the starting position of the current range, inclusive.
getStartTime() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
Returns the time the reader was started.
getState() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
getState() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
getState() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
 
getState() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
getState() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
getState() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
getState() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
getState() - Method in interface org.apache.beam.sdk.PipelineResult
Retrieves the current state of the pipeline execution.
getStateBackend() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
State backend to store Beam's state during computation.
getStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
Returns the mapping of AppliedPTransforms to the internal step name for that AppliedPTransform.
getStopPipelineWatermark() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
getStopPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
getStopPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
getStopPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Returns the ending position of the current range, exclusive.
getStorageLevel() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getStreamName() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getString(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getStrings(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
 
getSubnetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
GCE subnetwork for launching workers.
getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the subscription being read from.
getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the ValueProvider for the subscription being read from.
getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getSum() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getSumAndReset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
getSupertype(Class<? super T>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the generic form of a supertype.
getSupportedClass() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Gets the class this CloudObjectTranslator is capable of converting.
getSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Returns the table to read, or null if reading from a query instead.
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Returns the table reference, or null.
getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns a TableDestination object for the destination.
getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns the table being read from.
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns the table being written to.
getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Returns the table to read, or null if reading from a query instead.
getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTag(int) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the tuple tag at the given index.
getTag() - Method in class org.apache.beam.sdk.values.TaggedPValue
Returns the local tag associated with the PValue.
getTagInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The PCollection underlying a side input is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
getTagInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
Returns a unique TupleTag identifying this PCollectionView.
getTargetParallelism() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getTempFilename() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getTemplateLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Where the runner should generate a template file.
getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Returns the configured temporary location.
getTempLocation() - Method in interface org.apache.beam.sdk.options.PipelineOptions
A pipeline level default location for storing temporary files.
getTempRoot() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
getTestTimeoutSeconds() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
getTimeDomain() - Method in interface org.apache.beam.sdk.state.TimerSpec
 
getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
 
getTimes() - Method in class org.apache.beam.runners.spark.io.CreateStream
Get times so they can be pushed into the GlobalWatermarkHolder.
getTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getTimestamp() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
getTimestamp() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the timestamp of this ValueInSingleWindow.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the timestamp attribute.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the timestamp attribute.
getTimestampCombiner() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
Return the TimestampCombiner which will be used to determine a watermark hold time given an element timestamp, and to combine watermarks from windows which are about to be merged.
getTimestampCombiner() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getTimestampTransforms() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
The transforms applied to the arrival time of an element to determine when this trigger allows output.
getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
getTiming() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return the timing of this pane.
getTo() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRange
 
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the topic being written to.
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the topic being read from.
getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the ValueProvider for the topic being written to.
getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the ValueProvider for the topic being read from.
getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns the size of the backlog of unread data in the underlying data source represented by all splits of this source.
getTransformNameMapping() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}.
getTransformTranslator(Class<TransformT>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Returns the TransformTranslator to use for instances of the specified PTransform class, or null if none registered.
getTranslator() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
Returns the DataflowPipelineTranslator associated with this object.
getTrigger() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getTupleTag() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
Returns the TupleTag of this TaggedKeyedPCollection.
getTupleTagList() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the TupleTagList tuple associated with this schema.
getType() - Method in class org.apache.beam.sdk.coders.AvroCoder
Returns the type this coder encodes/decodes.
getType() - Method in interface org.apache.beam.sdk.testing.TestStream.Event
 
getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the DisplayData.Type of display data.
getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The DisplayData.Type of display data.
getType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns the Type represented by this TypeDescriptor.
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollection
Returns a TypeDescriptor<T> with some reflective information about T, if possible.
getTypeDescriptor() - Method in class org.apache.beam.sdk.values.TupleTag
Returns a TypeDescriptor capturing what is known statically about the type of this TupleTag instance's most-derived class.
getTypeParameter(String) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeVariable for the named type parameter.
getTypes() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a set of TypeDescriptor, one for each superclass as well as each interface implemented by this class.
getUnderlyingDoFn() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
getUnionCoder() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
getUnionTag() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
getUniqueId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
getUntilTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
The trigger that signals termination of this trigger.
getUrn() - Method in interface org.apache.beam.sdk.transforms.Materialization
Gets the URN describing this Materialization.
getUsePublicIps() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Specifies whether worker pools should be started with public IP addresses.
getUsesProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
getValue() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer.FlinkDistributionGauge
 
getValue() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer.FlinkGauge
 
getValue(String, Class<T>) - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
 
getValue() - Method in class org.apache.beam.runners.spark.util.ByteArray
 
getValue() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
getValue() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
getValue() - Method in class org.apache.beam.sdk.io.range.ByteKey
Returns a read-only ByteBuffer representing this ByteKey.
getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
Retrieve the value of the display item.
getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
The value of the display item.
getValue() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
getValue() - Method in class org.apache.beam.sdk.values.KV
Returns the value of this KV.
getValue() - Method in class org.apache.beam.sdk.values.TaggedPValue
Returns the PValue.
getValue() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
getValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the value of this ValueInSingleWindow.
getValue() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
getValueCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
 
getValueCoder() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
Gets the value coder that will be prefixed by the length.
getValueCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
 
getValueCoder() - Method in class org.apache.beam.sdk.coders.NullableCoder
Returns the inner Coder wrapped by this NullableCoder instance.
getValueCoder() - Method in class org.apache.beam.sdk.testing.TestStream
 
getValueCoder() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
getValueCoder() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
getValueTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getValueTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
getView() - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
getView() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
Deprecated.
 
getView() - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
Deprecated.
getViewFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The ViewFn for a side input is an attribute of the side input's specification with a ParDo transform, which will obtain this specification via a package-private channel.
getViewFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getWatermark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
getWatermark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Returns a timestamp before or at the timestamps of all future elements read by this reader.
getWatermark() - Method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
Deprecated.
 
getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
For internal use only; no backwards-compatibility guarantees.
getWindmillServiceEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
Custom windmill service endpoint.
getWindmillServicePort() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
getWindow() - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.WindowedContext
 
getWindow() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
getWindow() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
Returns the window of this ValueInSingleWindow.
getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
getWindowFn() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
getWindowingStrategy() - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
 
getWindowingStrategy() - Method in class org.apache.beam.sdk.values.PCollection
Returns the WindowingStrategy of this PCollection.
getWindowingStrategyInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
Deprecated.
this method will be removed entirely. The PCollection underlying a side input, including its WindowingStrategy, is part of the side input's specification with a ParDo transform, which will obtain that information via a package-private channel.
getWindowingStrategyInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
Returns the WindowingStrategy of this PCollectionView, which should be that of the underlying PCollection.
getWindowMappingFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
For internal use only.
getWindowMappingFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns a TypeDescriptor capturing what is known statically about the window type of this WindowFn instance's most-derived class.
getWorkerCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
The size of the worker's in-memory cache, in megabytes.
getWorkerDiskType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Specifies what type of persistent disk is used.
getWorkerHarnessContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Docker container image that executes the Dataflow worker harness, residing in Google Container Registry.
getWorkerId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
The identity of the worker running this pipeline.
getWorkerLogLevelOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
This option controls the log levels for specifically named loggers.
getWorkerMachineType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
Machine type to use when creating Dataflow worker VMs.
getWorkerSystemErrMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Controls the log level given to messages printed to System.err.
getWorkerSystemOutMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
Controls the log level given to messages printed to System.out.
getWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Return the WriteOperation that this Writer belongs to.
getYarnDeployDependencies() - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
From the current classpath, find the jar files that need to be deployed with the application to run on YARN.
getZone() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
GCE availability zone for launching workers.
getZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
GCP availability zone for operations.
global(Broadcast<Map<Integer, GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
Build a global TimerInternals for all feeding streams.
globalDefault() - Static method in class org.apache.beam.sdk.values.WindowingStrategy
Return a fully specified, default windowing strategy.
globally(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Returns a PTransform that takes a PCollection<T> and returns a PCollection<List<T>> whose single value is a List of the approximate N-tiles of the elements of the input PCollection.
globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Like ApproximateQuantiles.globally(int, Comparator), but sorts using the elements' natural ordering.
globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Returns a PTransform that takes a PCollection<T> and returns a PCollection<Long> containing a single value that is an estimate of the number of distinct elements in the input PCollection.
globally(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Like ApproximateUnique.globally(int), but specifies the desired maximum estimation error instead of the sample size.
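A minimal usage sketch for the two ApproximateUnique.globally overloads above, assuming a Pipeline p and the standard Beam SDK imports; the collection and variable names are illustrative only:

    PCollection<String> userIds = p.apply(Create.of("a", "b", "a", "c"));
    // Estimate the number of distinct elements from a sample of at most 16384 values.
    PCollection<Long> bySampleSize =
        userIds.apply(ApproximateUnique.<String>globally(16384));
    // Or bound the relative estimation error (here ~1%) and let the sample size follow.
    PCollection<Long> byMaxError =
        userIds.apply(ApproximateUnique.<String>globally(0.01));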
globally(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.Globally PTransform that uses the given SerializableFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
globally(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.Globally PTransform that uses the given GloballyCombineFn to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
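A sketch of Combine.globally with a SerializableFunction lambda, assuming a Pipeline p; the combining function (a plain maximum) is only an illustration:

    PCollection<Integer> numbers = p.apply(Create.of(3, 1, 4, 1, 5));
    // Combine all elements in each window into a single output value.
    PCollection<Integer> max =
        numbers.apply(Combine.globally((Iterable<Integer> xs) -> {
          int best = Integer.MIN_VALUE;
          for (int x : xs) {
            best = Math.max(best, x);
          }
          return best;
        }));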
globally() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a PTransform that counts the number of elements in its input PCollection.
globally() - Static method in class org.apache.beam.sdk.transforms.Latest
Returns a PTransform that takes as input a PCollection<T> and returns a PCollection<T> whose single value is the latest element according to its event time, or null if there are no elements.
globally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose single value is the maximum, according to the natural ordering of T, of the input PCollection's elements, or null if there are no elements.
globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose single value is the maximum of the input PCollection's elements, or null if there are no elements.
globally() - Static method in class org.apache.beam.sdk.transforms.Mean
Returns a PTransform that takes an input PCollection<NumT> and returns a PCollection<Double> whose single value is the mean of the input PCollection's elements, or 0 if there are no elements.
globally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose single value is the minimum, according to the natural ordering of T, of the input PCollection's elements, or null if there are no elements.
globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose single value is the minimum of the input PCollection's elements, or null if there are no elements.
GlobalWatermarkHolder - Class in org.apache.beam.runners.spark.util
A Broadcast variable to hold the global watermarks for a micro-batch.
GlobalWatermarkHolder() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
 
GlobalWatermarkHolder.SparkWatermarks - Class in org.apache.beam.runners.spark.util
A GlobalWatermarkHolder.SparkWatermarks holds the watermarks and batch time relevant to a micro-batch input from a specific source.
GlobalWatermarkHolder.WatermarksListener - Class in org.apache.beam.runners.spark.util
Advances the watermarks on the onBatchCompleted event.
GlobalWindow - Class in org.apache.beam.sdk.transforms.windowing
The default window into which all data is placed (via GlobalWindows).
GlobalWindow.Coder - Class in org.apache.beam.sdk.transforms.windowing
GlobalWindow.Coder for encoding and decoding GlobalWindows.
GlobalWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that assigns all data to the same window.
GlobalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
GoogleApiDebugOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
These options configure debug settings for Google API clients created within the Apache Beam SDK.
GoogleApiDebugOptions.GoogleApiTracer - Class in org.apache.beam.sdk.extensions.gcp.options
A GoogleClientRequestInitializer that adds the trace destination to Google API calls.
GoogleApiTracer() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
GraphiteSink - Class in org.apache.beam.runners.spark.metrics.sink
A Spark Sink that is tailored to report AggregatorMetric metrics to Graphite.
GraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
 
greaterThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than a given value, based on the elements' natural ordering.
greaterThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than or equal to a given value, based on the elements' natural ordering.
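A sketch of the two Filter factory methods above, assuming a Pipeline p; the threshold of 10 is arbitrary:

    PCollection<Integer> values = p.apply(Create.of(3, 10, 42));
    // Keep only elements strictly greater than 10.
    PCollection<Integer> above = values.apply(Filter.greaterThan(10));
    // Keep elements greater than or equal to 10.
    PCollection<Integer> atLeast = values.apply(Filter.greaterThanEq(10));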
groupAlsoByWindow(JavaDStream<WindowedValue<KV<K, Iterable<WindowedValue<InputT>>>>>, Coder<K>, Coder<WindowedValue<InputT>>, WindowingStrategy<?, W>, SparkRuntimeContext, List<Integer>) - Static method in class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
 
GroupByKey<K,V> - Class in org.apache.beam.sdk.transforms
GroupByKey<K, V> takes a PCollection<KV<K, V>>, groups the values by key and windows, and returns a PCollection<KV<K, Iterable<V>>> representing a map from each distinct key and window of the input PCollection to an Iterable over all the values associated with that key in the input per window.
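A sketch of GroupByKey on a keyed collection, assuming a Pipeline p; the keys and values are illustrative:

    PCollection<KV<String, Integer>> scores = p.apply(Create.of(
        KV.of("alice", 1), KV.of("bob", 2), KV.of("alice", 3)));
    // Collect all values for each key (per window) into a single Iterable.
    PCollection<KV<String, Iterable<Integer>>> grouped =
        scores.apply(GroupByKey.<String, Integer>create());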
groupedValues(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given SerializableFunction to combine all the values associated with a key, ignoring the key.
groupedValues(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given CombineFn to combine all the values associated with a key, ignoring the key.
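Continuing the GroupByKey sketch above, Combine.groupedValues can then collapse each Iterable of values; Sum.ofIntegers() is one concrete CombineFn from this SDK:

    // Sum the grouped values for each key, ignoring the key itself.
    PCollection<KV<String, Integer>> totals = grouped.apply(
        Combine.<String, Integer, Integer>groupedValues(Sum.ofIntegers()));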
GroupingState<InputT,OutputT> - Interface in org.apache.beam.sdk.state
A ReadableState cell that combines multiple input values and outputs a single value of a different type.
GroupIntoBatches<K,InputT> - Class in org.apache.beam.sdk.transforms
A PTransform that batches inputs to a desired batch size.
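A sketch of GroupIntoBatches, reusing the keyed scores collection from the GroupByKey sketch above; the batch size of 100 is arbitrary:

    // Buffer up to 100 values per key before emitting them together as one batch.
    PCollection<KV<String, Iterable<Integer>>> batched =
        scores.apply(GroupIntoBatches.<String, Integer>ofSize(100));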

H

HadoopFileSystemModule - Class in org.apache.beam.sdk.io.hdfs
A Jackson Module that registers a JsonSerializer and JsonDeserializer for a Hadoop Configuration.
HadoopFileSystemModule() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemModule
 
HadoopFileSystemOptions - Interface in org.apache.beam.sdk.io.hdfs
PipelineOptions which encapsulate Hadoop Configuration for the HadoopFileSystem.
HadoopFileSystemOptions.ConfigurationLocator - Class in org.apache.beam.sdk.io.hdfs
A DefaultValueFactory which locates a Hadoop Configuration.
HadoopFileSystemOptionsRegistrar - Class in org.apache.beam.sdk.io.hdfs
AutoService registrar for HadoopFileSystemOptions.
HadoopFileSystemOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
 
HadoopFileSystemRegistrar - Class in org.apache.beam.sdk.io.hdfs
AutoService registrar for the HadoopFileSystem.
HadoopFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
 
HadoopInputFormatBoundedSource(HadoopInputFormatIO.SerializableConfiguration, Coder<K>, Coder<V>, SimpleFunction<?, K>, SimpleFunction<?, V>, HadoopInputFormatIO.SerializableSplit) - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
HadoopInputFormatIO - Class in org.apache.beam.sdk.io.hadoop.inputformat
A HadoopInputFormatIO is a Transform for reading data from any source which implements Hadoop InputFormat.
HadoopInputFormatIO() - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO
 
HadoopInputFormatIO.HadoopInputFormatBoundedSource<K,V> - Class in org.apache.beam.sdk.io.hadoop.inputformat
Bounded source implementation for HadoopInputFormatIO.
HadoopInputFormatIO.Read<K,V> - Class in org.apache.beam.sdk.io.hadoop.inputformat
A PTransform that reads from any data source which implements Hadoop InputFormat.
HadoopInputFormatIO.SerializableConfiguration - Class in org.apache.beam.sdk.io.hadoop.inputformat
A wrapper to allow Hadoop Configuration to be serialized using Java's standard serialization mechanisms.
HadoopInputFormatIO.SerializableSplit - Class in org.apache.beam.sdk.io.hadoop.inputformat
A wrapper to allow Hadoop InputSplit to be serialized using Java's standard serialization mechanisms.
has(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
Returns whether this PCollectionTuple contains a PCollection with the given tag.
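A small sketch of probing a PCollectionTuple before retrieving a tagged output, assuming results is a PCollectionTuple (e.g. from a multi-output ParDo); the tag name is illustrative:

    TupleTag<String> errorsTag = new TupleTag<String>() {};
    if (results.has(errorsTag)) {
      // Only fetch the PCollection when the tuple actually carries this tag.
      PCollection<String> errors = results.get(errorsTag);
    }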
HasDefaultTracker<RestrictionT extends HasDefaultTracker<RestrictionT,TrackerT>,TrackerT extends RestrictionTracker<RestrictionT>> - Interface in org.apache.beam.sdk.transforms.splittabledofn
Interface for restrictions for which a default implementation of DoFn.NewTracker is available, depending only on the restriction itself.
hasDefaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
Returns whether this transform has a default value.
HasDisplayData - Interface in org.apache.beam.sdk.transforms.display
Marker interface for PTransforms and components to specify display data used within UIs and diagnostic tools.
hasExperiment(DataflowPipelineDebugOptions, String) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
Returns true if the specified experiment is enabled, handling null experiments.
hashCode() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
hashCode() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
 
hashCode() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
hashCode() - Method in class org.apache.beam.runners.spark.util.ByteArray
 
hashCode() - Method in class org.apache.beam.sdk.coders.AtomicCoder
hashCode() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
hashCode() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.SerializableCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
hashCode() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
hashCode() - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
hashCode() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
hashCode() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
hashCode() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
hashCode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
hashCode() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
hashCode() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
 
hashCode() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
Deprecated.
Object.hashCode() is not supported on PAssert objects.
hashCode() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
hashCode() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
hashCode() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
hashCode() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
hashCode() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
hashCode() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRange
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
hashCode() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
hashCode() - Method in class org.apache.beam.sdk.values.KV
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionList
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionTuple
 
hashCode() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
hashCode() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
hashCode() - Method in class org.apache.beam.sdk.values.TupleTag
 
hashCode() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
hashCode() - Method in class org.apache.beam.sdk.values.TypeParameter
 
hashCode() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
hashCode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
hasReplacementJob() - Method in enum org.apache.beam.sdk.PipelineResult.State
 
HBaseIO - Class in org.apache.beam.sdk.io.hbase
A bounded source and sink for HBase.
HBaseIO.Read - Class in org.apache.beam.sdk.io.hbase
A PTransform that reads from HBase.
HBaseIO.Write - Class in org.apache.beam.sdk.io.hbase
A PTransform that writes to HBase.
HEADER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
Hidden - Annotation Type in org.apache.beam.sdk.options
Methods and/or interfaces annotated with @Hidden will be suppressed from being output when --help is specified on the command-line.

I

Identifier() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
Returns the identity element of this operation, i.e. an element e such that combining e with any value x, in either order, yields x.
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
Returns the value that should be used for the combine of the empty set.
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
Returns the identity element of this operation, i.e. an element e such that combining e with any value x, in either order, yields x.
identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
Returns the identity element of this operation, i.e. an element e such that combining e with any value x, in either order, yields x.
immediate(T) - Static method in class org.apache.beam.sdk.state.ReadableStates
A ReadableState constructed from a constant value, hence immediately available.
immutableNames() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
immutableNamesBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
immutableSteps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
immutableStepsBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
 
in(Pipeline) - Static method in class org.apache.beam.sdk.values.PBegin
Returns a PBegin in the given Pipeline.
in(Pipeline) - Static method in class org.apache.beam.sdk.values.PDone
Creates a PDone in the given Pipeline.
inc() - Method in interface org.apache.beam.sdk.metrics.Counter
Increment the counter.
inc(long) - Method in interface org.apache.beam.sdk.metrics.Counter
Increment the counter by the given amount.
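A sketch of a user-defined Counter incremented from a DoFn; Metrics.counter is the factory in this SDK, and the class and metric names are illustrative:

    class CountingFn extends DoFn<String, String> {
      private final Counter processed = Metrics.counter(CountingFn.class, "processed");

      @ProcessElement
      public void processElement(ProcessContext c) {
        processed.inc();      // increment by one
        processed.inc(9L);    // or by an arbitrary amount
        c.output(c.element());
      }
    }

At pipeline-result time such counters can be queried through MetricsFilter together with MetricNameFilter.inNamespace, both of which also appear in this index.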
include(String, HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
Register display data from the specified subcomponent at the given path.
inCombinedNonLatePanes(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window across all panes that were not produced by the arrival of late data.
inCombinedNonLatePanes(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
increment() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns a RandomAccessData that is the smallest value of the same length that is strictly greater than this one.
INDEX_OF_MAX - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
Shard name containing the index and max.
inEarlyGlobalWindowPanes() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on panes in the GlobalWindow that were emitted before the GlobalWindow closed.
inEarlyGlobalWindowPanes() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inferType(Object) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Infer the DisplayData.Type for the given object.
inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key.
inFinalPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the final pane for each key.
init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
Initializes the aggregators accumulator if it has not already been initialized.
init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
Initializes the metrics accumulator if it has not already been initialized.
initAccumulators(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
Initializes the Metrics/Aggregators accumulators.
initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
initialize(AbstractGoogleClientRequest<?>) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
 
initialSystemTimeAt(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
Set the initial synchronized processing time.
inNamespace(String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
inNamespace(Class<?>) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
innerJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Inner join of two collections of KV elements.
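A sketch of the join-library inner join, assuming a Pipeline p; the element types are illustrative:

    PCollection<KV<String, Integer>> left =
        p.apply("Left", Create.of(KV.of("a", 1), KV.of("b", 2)));
    PCollection<KV<String, String>> right =
        p.apply("Right", Create.of(KV.of("a", "x")));
    // Keys present in both inputs are paired; each result value is a KV of (left, right).
    PCollection<KV<String, KV<Integer, String>>> joined = Join.innerJoin(left, right);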
inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window.
inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window.
inOnTimePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Creates a new PAssert.SingletonAssert like this one, but with the assertion restricted to only run on the provided window, running the checker only on the on-time pane for each key.
inOrder(Trigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
Returns an AfterEach Trigger with the given subtriggers.
inOrder(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
Returns an AfterEach Trigger with the given subtriggers.
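A sketch of wiring AfterEach.inOrder into a windowing configuration, assuming a PCollection<Integer> numbers; the sub-triggers, window size, and lateness are illustrative only:

    PCollection<Integer> triggered = numbers.apply(
        Window.<Integer>into(FixedWindows.of(Duration.standardMinutes(10)))
            // First fire once 100 elements arrive, then fire when the watermark passes the window.
            .triggering(AfterEach.inOrder(
                AfterPane.elementCountAtLeast(100),
                AfterWatermark.pastEndOfWindow()))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());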
INPUT_CODER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
Singleton instance of GlobalWindow.
INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
InstantCoder - Class in org.apache.beam.sdk.coders
A Coder for Joda-Time Instant that encodes it as a big-endian Long shifted such that lexicographic ordering of the bytes corresponds to chronological order.
InstantDeserializer - Class in org.apache.beam.sdk.io.kafka.serialization
Kafka Deserializer for Instant.
InstantDeserializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
 
InstantSerializer - Class in org.apache.beam.sdk.io.kafka.serialization
Kafka Serializer for Instant.
InstantSerializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
integers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Integer.
integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<Integer> and returns a PCollection<Integer> whose single value is the maximum of the input PCollection's elements, or Integer.MIN_VALUE if there are no elements.
integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<Integer> and returns a PCollection<Integer> whose single value is the minimum of the input PCollection's elements, or Integer.MAX_VALUE if there are no elements.
integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<Integer> and returns a PCollection<Integer> whose single value is the sum of the input PCollection's elements, or 0 if there are no elements.
integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and returns a PCollection<KV<K, Integer>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and returns a PCollection<KV<K, Integer>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<KV<K, Integer>> and returns a PCollection<KV<K, Integer>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
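A sketch of the per-key and global integer aggregations listed here, reusing the keyed scores collection from the earlier GroupByKey sketch:

    // One output element per distinct key, carrying the sum (or maximum) of that key's values.
    PCollection<KV<String, Integer>> sumPerKey = scores.apply(Sum.<String>integersPerKey());
    PCollection<KV<String, Integer>> maxPerKey = scores.apply(Max.<String>integersPerKey());
    // Or aggregate all values of the (windowed) collection at once.
    PCollection<Integer> total =
        scores.apply(Values.<Integer>create()).apply(Sum.integersGlobally());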
Internal - Annotation Type in org.apache.beam.sdk.annotations
Signifies that a publicly accessible API (public class, method or field) is intended for internal use only and not for public consumption.
interpolateKey(double) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns a ByteKey key such that [startKey, key) represents approximately the specified fraction of the range [startKey, endKey).
intersects(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns whether this window intersects the given window.
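A sketch of constructing IntervalWindows directly and checking for overlap; the instants and durations are arbitrary, and Instant/Duration come from joda-time:

    Instant start = new Instant(0);
    // Half-open interval [start, start + 10 minutes).
    IntervalWindow first =
        new IntervalWindow(start, start.plus(Duration.standardMinutes(10)));
    // Equivalent constructor taking a duration instead of an end instant.
    IntervalWindow second =
        new IntervalWindow(start.plus(Duration.standardMinutes(5)), Duration.standardMinutes(10));
    boolean overlaps = first.intersects(second); // true: [0, 10) and [5, 15) overlap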
IntervalWindow - Class in org.apache.beam.sdk.transforms.windowing
An implementation of BoundedWindow that represents an interval from IntervalWindow.start (inclusive) to IntervalWindow.end (exclusive).
IntervalWindow(Instant, Instant) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Creates a new IntervalWindow that represents the half-open time interval [start, end).
IntervalWindow(Instant, ReadableDuration) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
IntervalWindow.IntervalWindowCoder - Class in org.apache.beam.sdk.transforms.windowing
Encodes an IntervalWindow as a pair of its upper bound and duration.
IntervalWindowCoder() - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
Returns a new FlatMapElements transform with the given type descriptor for the output type, but the mapping function yet to be specified using FlatMapElements.via(SerializableFunction).
into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
Returns a new MapElements transform with the given type descriptor for the output type, but the mapping function yet to be specified using MapElements.via(SerializableFunction).
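A sketch of the into(...).via(...) idiom for MapElements (FlatMapElements is analogous), assuming Java 8 lambdas and a PCollection<String> lines:

    // Declare the output type first, then supply the mapping function.
    PCollection<Integer> lengths = lines.apply(
        MapElements.into(TypeDescriptors.integers())
            .via((String line) -> line.length()));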
into(WindowFn<? super T, ?>) - Static method in class org.apache.beam.sdk.transforms.windowing.Window
Creates a Window PTransform that uses the given WindowFn to window the data.
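A sketch of Window.into with a fixed-size WindowFn, again assuming a PCollection<String> lines and a joda-time Duration:

    // Place each element into a one-minute fixed window based on its timestamp.
    PCollection<String> windowed = lines.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));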
InvalidWindows<W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that represents an invalid pipeline state.
InvalidWindows(String, WindowFn<?, W>) - Constructor for class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
invokeAdvance(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
invokeStart(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Creates a new PAssert.IterableAssert like this one, but with the assertion restricted to only run on the provided window.
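A test-time sketch of restricting a PAssert to a single window, assuming the fixed-windowed collection from the Window.into sketch above and a TestPipeline; the expected contents are illustrative:

    IntervalWindow firstMinute =
        new IntervalWindow(new Instant(0), Duration.standardMinutes(1));
    PAssert.that(windowed)
        .inWindow(firstMinute)
        .containsInAnyOrder("red", "blue");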
inWindow(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
IS_GENERATED - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
IS_MERGING_WINDOW_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
IS_PAIR_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
IS_STREAM_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
IS_WRAPPER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
isAccessible() - Method in interface org.apache.beam.sdk.options.ValueProvider
Whether the contents of this ValueProvider is available to routines that run at graph construction time.
isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
isAllowedLatenessSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isArray() - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns true if this type is known to be an array type.
isAtSplitPoint() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Returns true if the reader is at a split point.
isAtSplitPoint() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Returns true only for the first record; compressed sources cannot be split.
isAtSplitPoint() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Returns whether the current record is at a split point (i.e., whether the current record would be the first record to be read by a source with a specified start offset of OffsetBasedSource.OffsetBasedReader.getCurrentOffset()).
isBlockOnRun() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
isBounded() - Method in class org.apache.beam.sdk.values.PCollection
 
isBoundedCollection(Collection<PValue>) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
InvalidWindows objects with the same originalWindowFn are compatible.
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
For internal use only; no backwards-compatibility guarantees.
isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns whether this performs the same merging as the given WindowFn.
isDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
isDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns true if this ResourceId represents a directory, false otherwise.
isDisjoint(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns whether this window is disjoint from the given window.
isDone() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
isDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
isDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
isEmbeddedExecution() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
isEmbeddedExecutionDebugMode() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
isEmpty() - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
 
isEmpty() - Method in class org.apache.beam.sdk.io.range.ByteKey
Returns true if the byte[] backing this ByteKey is of length 0.
isEmpty() - Method in interface org.apache.beam.sdk.state.GroupingState
Return true if this state is empty.
isEmpty() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
isEmpty() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
 
isEnforceEncodability() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
isEnforceImmutability() - Method in interface org.apache.beam.runners.direct.DirectOptions
 
isEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Asserts that the value in question is equal to the provided value, according to Object.equals(java.lang.Object).
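A short sketch, under the usual TestPipeline conventions, of how isEqualTo is used through PAssert.thatSingleton; the input values are illustrative:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class SingletonAssertTest {
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void sumIsComputed() {
        PCollection<Long> total =
            p.apply(Create.of(1L, 2L, 3L)).apply(Sum.longsGlobally());
        // isEqualTo compares the single element via Object.equals.
        PAssert.thatSingleton(total).isEqualTo(6L);
        p.run().waitUntilFinish();
      }
    }
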
isExternalizedCheckpointsEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
Enables or disables externalized checkpoints.
isFinished() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.ProcessWatcher
 
isFirst() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return true if this is the first pane produced for the associated window.
isForceStreaming() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
isInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns whether or not this transformation applies a default value.
isLast() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return true if this is the last pane that will be produced in the associated window.
isMetricsSupported() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Indicates whether metrics reporting is supported.
isModeSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
 
isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns true if this WindowFn never needs to merge any windows.
isParDoFusionEnabled() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
isReadSeekEfficient() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
isRegisterByteSizeObserverCheap(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
isRegisterByteSizeObserverCheap(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.Coder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(ReadableDuration) - Method in class org.apache.beam.sdk.coders.DurationCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(IterableT) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
Returns whether both keyCoder and valueCoder are considered not expensive.
isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
LengthPrefixCoder is cheap if valueCoder is cheap.
isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
NullableCoder is cheap if valueCoder is cheap.
isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
Returns whether Coder.registerByteSizeObserver(T, org.apache.beam.sdk.util.common.ElementByteSizeObserver) is cheap enough to call for every element, that is, if this Coder can calculate the byte size of the element to be coded in roughly constant time (or lazily).
isRegisterByteSizeObserverCheap(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
isRegisterByteSizeObserverCheap(RawUnionValue) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
Since this coder uses elementCoders.get(index) and coders that are known to run in constant time, we defer the return value to that coder.
isSplittable() - Method in class org.apache.beam.sdk.io.CompressedSource
Determines whether a single file represented by this source is splittable.
isSplittable() - Method in class org.apache.beam.sdk.io.FileBasedSource
Determines whether a file represented by this source can be split into bundles.
isStarted() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Returns true if there has been a call to OffsetBasedSource.OffsetBasedReader.start().
isStarted() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
isStreaming() - Method in interface org.apache.beam.sdk.options.StreamingOptions
Set to true if running a streaming pipeline.
isSubtypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Return true if this type is a subtype of the given type.
isSuccess() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
isSupertypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns true if this type is assignable from the given type.
isTerminal() - Method in enum org.apache.beam.sdk.PipelineResult.State
 
isTimestampCombinerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isTriggerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
isTupleTracingEnabled() - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
isUnknown() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
Return true if there is no timing information for the current PaneInfo.
isUpdate() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
Whether to update the currently running pipeline with the same name as this one.
isWholeStream - Variable in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
Whether the encoded or decoded value fills the remainder of the output or input (resp.) record/stream contents.
item(String, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and string value.
item(String, ValueProvider<?>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and ValueProvider.
item(String, Integer) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and integer value.
item(String, Long) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and integer value.
item(String, Float) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and floating point value.
item(String, Double) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and floating point value.
item(String, Boolean) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and boolean value.
item(String, Instant) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and timestamp value.
item(String, Duration) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and duration value.
item(String, Class<T>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key and class value.
item(String, DisplayData.Type, T) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Create a display item for the specified key, type, and value.
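As a sketch of how these item(...) factories are typically combined with DisplayData.Builder (the DoFn, keys, and values here are hypothetical):

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.display.DisplayData;

    public class TaggedFn extends DoFn<String, String> {
      private final String tableName; // hypothetical configuration

      public TaggedFn(String tableName) {
        this.tableName = tableName;
      }

      @ProcessElement
      public void processElement(ProcessContext c) {
        c.output(c.element());
      }

      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        super.populateDisplayData(builder);
        builder
            .add(DisplayData.item("table", tableName))   // String item
            .add(DisplayData.item("batchSize", 500));    // Integer item
      }
    }
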
Item() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
items() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
ItemSpec() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
iterable() - Static method in class org.apache.beam.sdk.transforms.Materializations
For internal use only; no backwards-compatibility guarantees.
ITERABLE_MATERIALIZATION_URN - Static variable in class org.apache.beam.sdk.transforms.Materializations
The URN for a Materialization where the primitive view type is an iterable of fully specified windowed values.
IterableCoder<T> - Class in org.apache.beam.sdk.coders
An IterableCoder encodes any Iterable in the format of IterableLikeCoder.
IterableCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.IterableCoder
 
IterableLikeCoder<T,IterableT extends java.lang.Iterable<T>> - Class in org.apache.beam.sdk.coders
An abstract base class with functionality for assembling a Coder for a class that implements Iterable.
IterableLikeCoder(Coder<T>, String) - Constructor for class org.apache.beam.sdk.coders.IterableLikeCoder
 
iterables() - Static method in class org.apache.beam.sdk.transforms.Flatten
Returns a PTransform that takes a PCollection<Iterable<T>> and returns a PCollection<T> containing all the elements from all the Iterables.
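A brief sketch of Flatten.iterables(); the element values are illustrative and the coder is set explicitly to keep inference simple:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.IterableCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.values.PCollection;

    public class FlattenIterablesExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<Iterable<String>> nested =
            p.apply(Create.<Iterable<String>>of(
                    Arrays.asList("a", "b"), Arrays.asList("c"))
                .withCoder(IterableCoder.of(StringUtf8Coder.of())));
        // Emits "a", "b", "c" as individual elements.
        PCollection<String> flattened = nested.apply(Flatten.<String>iterables());
        p.run().waitUntilFinish();
      }
    }
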
iterables() - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each item in the iterables of the input PCollection to a String using the Object.toString() method, separating items with a "," delimiter (no delimiter after the last element of each iterable).
iterables(String) - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each item in the iterables of the input PCollection to a String using the Object.toString() method, separating items with the specified delimiter (no delimiter after the last element of each iterable).
iterables(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Iterable.
iterableView(PCollection<T>, WindowingStrategy<?, W>, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Iterable<T>> capable of processing elements encoded using the provided Coder and windowed using the provided WindowingStrategy.
IterableViewFn() - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
Deprecated.
 

J

javaSparkContext - Variable in class org.apache.beam.runners.spark.SparkPipelineResult
 
JAXBCoder<T> - Class in org.apache.beam.sdk.io.xml
A coder for JAXB annotated objects.
JdbcIO - Class in org.apache.beam.sdk.io.jdbc
IO to read and write data on JDBC.
JdbcIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.jdbc
A POJO describing a DataSource, either wrapping a DataSource directly or holding all the properties needed to create one.
JdbcIO.PreparedStatementSetter<T> - Interface in org.apache.beam.sdk.io.jdbc
An interface used by JdbcIO.Write to set the parameters of the PreparedStatement used to write data into the database.
JdbcIO.Read<T> - Class in org.apache.beam.sdk.io.jdbc
A PTransform to read data from a JDBC datasource.
JdbcIO.RowMapper<T> - Interface in org.apache.beam.sdk.io.jdbc
An interface used by JdbcIO.Read for converting each row of the ResultSet into an element of the resulting PCollection.
JdbcIO.StatementPreparator - Interface in org.apache.beam.sdk.io.jdbc
An interface used by JdbcIO.Read to set the parameters of the PreparedStatement used to query the database.
JdbcIO.Write<T> - Class in org.apache.beam.sdk.io.jdbc
A PTransform to write to a JDBC datasource.
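To make the JdbcIO.Read pieces above concrete, a hedged sketch; the driver class, connection URL, query, and column names are hypothetical:

    import java.sql.ResultSet;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.KvCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.io.jdbc.JdbcIO;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class JdbcReadExample {
      public static PCollection<KV<String, Long>> read(Pipeline p) {
        return p.apply(
            JdbcIO.<KV<String, Long>>read()
                .withDataSourceConfiguration(
                    JdbcIO.DataSourceConfiguration.create(
                        "org.postgresql.Driver",                  // hypothetical driver
                        "jdbc:postgresql://localhost:5432/demo")) // hypothetical URL
                .withQuery("SELECT name, score FROM results")     // hypothetical query
                .withCoder(KvCoder.of(StringUtf8Coder.of(), VarLongCoder.of()))
                .withRowMapper(new JdbcIO.RowMapper<KV<String, Long>>() {
                  @Override
                  public KV<String, Long> mapRow(ResultSet resultSet) throws Exception {
                    // Convert one ResultSet row into one PCollection element.
                    return KV.of(resultSet.getString("name"), resultSet.getLong("score"));
                  }
                }));
      }
    }
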
JmsCheckpointMark - Class in org.apache.beam.sdk.io.jms
Checkpoint for an unbounded JmsIO.Read.
JmsCheckpointMark() - Constructor for class org.apache.beam.sdk.io.jms.JmsCheckpointMark
 
JmsIO - Class in org.apache.beam.sdk.io.jms
An unbounded source for JMS destinations (queues or topics).
JmsIO.Read - Class in org.apache.beam.sdk.io.jms
A PTransform to read from a JMS destination.
JmsIO.UnboundedJmsSource - Class in org.apache.beam.sdk.io.jms
An unbounded JMS source.
JmsIO.Write - Class in org.apache.beam.sdk.io.jms
A PTransform to write to a JMS queue.
JmsRecord - Class in org.apache.beam.sdk.io.jms
JmsRecord contains message payload of the record as well as metadata (JMS headers and properties).
JmsRecord(String, long, String, Destination, Destination, int, boolean, String, long, int, Map<String, Object>, String) - Constructor for class org.apache.beam.sdk.io.jms.JmsRecord
 
jobId - Variable in class org.apache.beam.runners.dataflow.DataflowPipelineJob
The id for the job.
JobNameFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
 
JobSpecification(Job, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
 
jobToString(Job) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Renders a Job as a string.
Join - Class in org.apache.beam.sdk.extensions.joinlibrary
Utility class with different versions of joins.
Join() - Constructor for class org.apache.beam.sdk.extensions.joinlibrary.Join
 

K

KafkaCheckpointMark - Class in org.apache.beam.sdk.io.kafka
Checkpoint for an unbounded KafkaIO.Read.
KafkaCheckpointMark(List<KafkaCheckpointMark.PartitionMark>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
KafkaCheckpointMark.PartitionMark - Class in org.apache.beam.sdk.io.kafka
A tuple to hold topic, partition, and offset that comprise the checkpoint for a single partition.
KafkaIO - Class in org.apache.beam.sdk.io.kafka
An unbounded source and a sink for Kafka topics.
KafkaIO.Read<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform to read from Kafka topics.
KafkaIO.TypedWithoutMetadata<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform to read from Kafka topics.
KafkaIO.Write<K,V> - Class in org.apache.beam.sdk.io.kafka
A PTransform to write to a Kafka topic.
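A hedged sketch of wiring up KafkaIO.Read with withoutMetadata() (which yields the KafkaIO.TypedWithoutMetadata form above); the exact builder methods are assumed from typical KafkaIO usage, and the broker address and topic name are hypothetical:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class KafkaReadExample {
      public static PCollection<KV<Long, String>> read(Pipeline p) {
        return p.apply(
            KafkaIO.<Long, String>read()
                .withBootstrapServers("broker-1:9092")       // hypothetical brokers
                .withTopic("events")                         // hypothetical topic
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                // Drop Kafka metadata and keep only the key/value pairs.
                .withoutMetadata());
      }
    }
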
KafkaRecord<K,V> - Class in org.apache.beam.sdk.io.kafka
KafkaRecord contains key and value of the record as well as metadata for the record (topic name, partition id, and offset).
KafkaRecord(String, int, long, long, K, V) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
 
KafkaRecord(String, int, long, long, KV<K, V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
 
KafkaRecordCoder<K,V> - Class in org.apache.beam.sdk.io.kafka
KafkaRecordCoder(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
KeyedPCollectionTuple<K> - Class in org.apache.beam.sdk.transforms.join
An immutable tuple of keyed PCollections with key type K.
KeyedPCollectionTuple.TaggedKeyedPCollection<K,V> - Class in org.apache.beam.sdk.transforms.join
A utility class to help ensure coherence of tag and input PCollection types.
keys() - Method in interface org.apache.beam.sdk.state.MapState
Returns an Iterable over the keys contained in this map.
Keys<K> - Class in org.apache.beam.sdk.transforms
Keys<K> takes a PCollection of KV<K, V>s and returns a PCollection<K> of the keys.
KinesisIO - Class in org.apache.beam.sdk.io.kinesis
PTransforms for reading from Kinesis streams.
KinesisIO() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisIO
 
KinesisIO.Read - Class in org.apache.beam.sdk.io.kinesis
Implementation of KinesisIO.read().
KinesisRecord - Class in org.apache.beam.sdk.io.kinesis
UserRecord enhanced with utility methods.
KinesisRecord(UserRecord, String, String) - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
KinesisRecord(ByteBuffer, String, long, String, Instant, Instant, String, String) - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisRecord
 
KV<K,V> - Class in org.apache.beam.sdk.values
An immutable key/value pair.
KV.OrderByKey<K extends java.lang.Comparable<? super K>,V> - Class in org.apache.beam.sdk.values
A Comparator that orders KVs by the natural ordering of their keys.
KV.OrderByValue<K,V extends java.lang.Comparable<? super V>> - Class in org.apache.beam.sdk.values
A Comparator that orders KVs by the natural ordering of their values.
KvCoder<K,V> - Class in org.apache.beam.sdk.coders
A KvCoder encodes KVs.
kvs() - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each element of the input PCollection to a String by calling Object.toString() on the key, followed by a ",", followed by Object.toString() on the value.
kvs(String) - Static method in class org.apache.beam.sdk.transforms.ToString
Transforms each element of the input PCollection to a String by calling Object.toString() on the key, followed by the specified delimiter, followed by Object.toString() on the value.
kvs(TypeDescriptor<K>, TypeDescriptor<V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for KV.
KvSwap<K,V> - Class in org.apache.beam.sdk.transforms
KvSwap<K, V> takes a PCollection<KV<K, V>> and returns a PCollection<KV<V, K>>, where all the keys and values have been swapped.

L

largest(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the largest count elements of the input PCollection<T>, in decreasing order, sorted according to their natural order.
Largest() - Constructor for class org.apache.beam.sdk.transforms.Top.Largest
 
largestPerKey(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the largest count values associated with that key in the input PCollection<KV<K, V>>, in decreasing order, sorted according to their natural order.
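A short sketch of both Top forms; the count, numbers, and key type are illustrative:

    import java.util.List;
    import org.apache.beam.sdk.transforms.Top;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class TopExamples {
      public static PCollection<List<Long>> top3(PCollection<Long> scores) {
        // Single output element: the 3 largest values, in decreasing order.
        return scores.apply(Top.<Long>largest(3));
      }

      public static PCollection<KV<String, List<Long>>> top3PerUser(
          PCollection<KV<String, Long>> scoresPerUser) {
        // One output element per key: that key's 3 largest values.
        return scoresPerUser.apply(Top.<String, Long>largestPerKey(3));
      }
    }
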
LargestUnique(long) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
Creates a heap to track the largest sampleSize elements.
Latest - Class in org.apache.beam.sdk.transforms
PTransform and Combine.CombineFn for computing the latest element in a PCollection.
launchApp(StreamingApplication, Properties) - Method in class org.apache.beam.runners.apex.ApexYarnLauncher
 
launchApp(ApexYarnLauncher.LaunchParams) - Method in class org.apache.beam.runners.apex.ApexYarnLauncher
 
LaunchParams(DAG, Attribute.AttributeMap, Properties) - Constructor for class org.apache.beam.runners.apex.ApexYarnLauncher.LaunchParams
 
leaseWorkItem(String, LeaseWorkItemRequest) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Leases the work item for jobId.
leaveCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
leaveCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each composite transform after all of its component transforms and their outputs have been visited.
leftOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Left Outer Join of two collections of KV elements.
LengthPrefixCoder<T> - Class in org.apache.beam.sdk.coders
A Coder which is able to take any existing coder and wrap it such that it is only invoked in the outer context.
lessThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection and returns a PCollection with elements that are less than a given value, based on the elements' natural ordering.
lessThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are less than or equal to a given value, based on the elements' natural ordering.
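For instance, a sketch with illustrative bounds:

    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;

    public class FilterExamples {
      public static PCollection<Integer> strictlyBelowTen(PCollection<Integer> numbers) {
        return numbers.apply(Filter.lessThan(10));    // keeps elements < 10
      }

      public static PCollection<Integer> atMostTen(PCollection<Integer> numbers) {
        return numbers.apply(Filter.lessThanEq(10));  // keeps elements <= 10
      }
    }
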
ListCoder<T> - Class in org.apache.beam.sdk.coders
A Coder for List, using the format of IterableLikeCoder.
ListCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.ListCoder
 
listJobMessages(String, String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Lists job messages with the given jobId.
listJobs(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Lists Dataflow Jobs in the project associated with the DataflowPipelineOptions.
lists(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for List.
listView(PCollection<T>, WindowingStrategy<?, W>, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<List<T>> capable of processing elements encoded using the provided Coder and windowed using the provided WindowingStrategy.
ListViewFn() - Constructor for class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
Deprecated.
 
LocalFileSystemRegistrar - Class in org.apache.beam.sdk.io
AutoService registrar for the LocalFileSystem.
LocalFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.LocalFileSystemRegistrar
 
LocalResources - Class in org.apache.beam.sdk.io
Helper functions for producing a ResourceId that references a local file or directory.
LoggingHandler() - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
 
longs() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Long.
longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<Long> and returns a PCollection<Long> whose contents are the maximum of the input PCollection's elements, or Long.MIN_VALUE if there are no elements.
longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<Long> and returns a PCollection<Long> whose contents are the minimum of the input PCollection's elements, or Long.MAX_VALUE if there are no elements.
longsGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<Long> and returns a PCollection<Long> whose contents are the sum of the input PCollection's elements, or 0 if there are no elements.
longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, Long>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, Long>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
longsPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
Returns a PTransform that takes an input PCollection<KV<K, Long>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
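A compact sketch combining the globally and per-key long variants above; the key type is illustrative:

    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Min;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class LongCombines {
      public static void combine(
          PCollection<Long> values, PCollection<KV<String, Long>> keyedValues) {
        PCollection<Long> max = values.apply(Max.longsGlobally());
        PCollection<Long> min = values.apply(Min.longsGlobally());
        PCollection<KV<String, Long>> totalsPerKey = keyedValues.apply(Sum.longsPerKey());
      }
    }
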

M

main(String[]) - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
The main method expects the serialized DAG and will launch the YARN application.
map() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a MapState, optimized for key lookups and writes.
map(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.map(), but with key and value coders explicitly supplied.
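A hedged sketch of declaring a MapState cell with explicit coders inside a stateful DoFn; MapState support is runner-dependent, and the state id, types, and DoFn name are hypothetical:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.state.MapState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    public class RecordLatestFn extends DoFn<KV<String, Long>, Void> {
      @StateId("latest")
      private final StateSpec<MapState<String, Long>> latestSpec =
          StateSpecs.map(StringUtf8Coder.of(), VarLongCoder.of());

      @ProcessElement
      public void processElement(
          ProcessContext c, @StateId("latest") MapState<String, Long> latest) {
        // Overwrites any previous value stored under this element's key.
        latest.put(c.element().getKey(), c.element().getValue());
      }
    }
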
MapCoder<K,V> - Class in org.apache.beam.sdk.coders
A Coder for Maps that encodes them according to provided coders for keys and values.
MapElements<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
PTransforms for mapping a simple function over the elements of a PCollection.
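As a quick sketch of the usual into(...)/via(...) idiom (the lambda body is illustrative):

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class WordLengths {
      public static PCollection<Integer> lengths(PCollection<String> words) {
        return words.apply(
            MapElements.into(TypeDescriptors.integers())
                .via((String word) -> word.length()));
      }
    }
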
mapRow(ResultSet) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RowMapper
 
mapSourceFunction(SparkRuntimeContext, String) - Static method in class org.apache.beam.runners.spark.stateful.StateSpecFunctions
A StateSpec function to support reading from an UnboundedSource.
MapState<K,V> - Interface in org.apache.beam.sdk.state
A ReadableState cell mapping keys to values.
mapView(PCollection<KV<K, V>>, WindowingStrategy<?, W>, Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, V>> capable of processing elements encoded using the provided Coder and windowed using the provided WindowingStrategy.
MapViewFn() - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
Deprecated.
 
markDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
Marks this range tracker as being done.
markDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Marks this range tracker as being done.
markDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
Marks that there are no more offsets to be claimed in the range.
match(List<String>) - Method in class org.apache.beam.sdk.io.FileSystem
This is the entry point to convert user-provided specs to ResourceIds.
match(List<String>) - Static method in class org.apache.beam.sdk.io.FileSystems
This is the entry point to convert user-provided specs to ResourceIds.
match(String) - Static method in class org.apache.beam.sdk.io.FileSystems
Like FileSystems.match(List), but for a single resource specification.
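A minimal sketch of the single-spec form; the glob is hypothetical:

    import java.io.IOException;
    import org.apache.beam.sdk.io.FileSystems;
    import org.apache.beam.sdk.io.fs.MatchResult;

    public class MatchExample {
      public static void listCsvFiles() throws IOException {
        // Expands the (hypothetical) glob into concrete file metadata.
        MatchResult result = FileSystems.match("/tmp/data/*.csv");
        for (MatchResult.Metadata metadata : result.metadata()) {
          System.out.println(metadata.resourceId() + " : " + metadata.sizeBytes() + " bytes");
        }
      }
    }
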
MatcherCheckerFn(SerializableMatcher<T>) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert.MatcherCheckerFn
 
matches(String) - Method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Returns true if the given file name implies that the contents are compressed according to the compression embodied by this factory.
matches(String) - Method in enum org.apache.beam.sdk.io.TextIO.CompressionType
Determine if a given filename matches a compression type based on its extension.
matches(String) - Method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
Determine if a given filename matches a compression type based on its extension.
matches(String) - Method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
Determine if a given filename matches a compression type based on its extension.
matches(Object) - Method in class org.apache.beam.sdk.testing.RegexMatcher
 
matches(String) - Static method in class org.apache.beam.sdk.testing.RegexMatcher
 
matches(Object) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
 
matches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
matches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
matches(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
matches(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Matches PTransform that checks if the entire line matches the Regex.
matches(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesName PTransform that checks if the entire line matches the Regex.
matches(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesName PTransform that checks if the entire line matches the Regex.
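A sketch of the String-pattern forms; the patterns are illustrative, and elements whose entire line does not match are dropped:

    import org.apache.beam.sdk.transforms.Regex;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class RegexExamples {
      public static PCollection<String> wordsOnly(PCollection<String> lines) {
        // Keeps only lines consisting entirely of letters.
        return lines.apply(Regex.matches("[A-Za-z]+"));
      }

      public static PCollection<KV<String, String>> keyValueLines(PCollection<String> lines) {
        // Keeps lines shaped like "key:value", emitting group 1 as key and group 2 as value.
        return lines.apply(Regex.matchesKV("(\\w+):(\\w+)", 1, 2));
      }
    }
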
Matches(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Matches
 
matchesKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesKV PTransform that checks if the entire line matches the Regex.
matchesKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesKV PTransform that checks if the entire line matches the Regex.
matchesKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesNameKV PTransform that checks if the entire line matches the Regex.
matchesKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.MatchesNameKV PTransform that checks if the entire line matches the Regex.
MatchesKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesKV
 
MatchesName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesName
 
MatchesNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
 
matchesSafely(PipelineResult) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
 
matchNewResource(String, boolean) - Method in class org.apache.beam.sdk.io.FileSystem
Returns a new ResourceId for this filesystem that represents the named resource.
matchNewResource(String, boolean) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a new ResourceId that represents the named resource of a type corresponding to the resource type.
matchResources(List<ResourceId>) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns MatchResults for the given resourceIds.
MatchResult - Class in org.apache.beam.sdk.io.fs
MatchResult.Metadata - Class in org.apache.beam.sdk.io.fs
MatchResult.Metadata of a matched file.
MatchResult.Metadata.Builder - Class in org.apache.beam.sdk.io.fs
Builder class for MatchResult.Metadata.
MatchResult.Status - Enum in org.apache.beam.sdk.io.fs
Status of a MatchResult.
matchSingleFileSpec(String) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns the MatchResult.Metadata for a single file resource.
Materialization<T> - Interface in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
Materializations - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
Materializations() - Constructor for class org.apache.beam.sdk.transforms.Materializations
 
max() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
Max - Class in org.apache.beam.sdk.transforms
PTransforms for computing the maximum of the elements in a PCollection, or the maximum of the values associated with each key in a PCollection of KVs.
maximumLookback() - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
The maximum distance between the end of any main input window mainWindow and the end of the side input window returned by WindowMappingFn.getSideInputWindow(BoundedWindow)
maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
Returns the inclusive upper bound of timestamps for values in this window.
maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
 
maxTimestamp() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the largest timestamp that can be included in this window.
mean() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
Mean - Class in org.apache.beam.sdk.transforms
PTransforms for computing the arithmetic mean (a.k.a. average) of the elements in a PCollection, or the mean of the values associated with each key in a PCollection of KVs.
merge(Accumulator<MetricsContainerStepMap, MetricsContainerStepMap>) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
merge(NamedAggregators.State<InputT, InterT, OutputT>) - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators.CombineFunctionState
 
merge(NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
Merges another NamedAggregators instance with this instance.
merge(NamedAggregators.State<InputT, InterT, OutputT>) - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
merge(BoundedWindow, Iterable<? extends Instant>) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Merges the given timestamps, which may have originated in separate windows, into the context of the result window.
merge(BoundedWindow, Instant...) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
merge(Collection<W>, W) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
Signals to the framework that the windows in toBeMerged should be merged together to form mergeResult.
mergeAccumulator(AccumT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
Adds the input values represented by the given accumulator into this accumulator.
mergeAccumulators(Iterable<AccumT>) - Method in interface org.apache.beam.sdk.state.CombiningState
Merge the given accumulators according to the underlying Combine.CombineFn.
mergeAccumulators(Iterable<ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
 
mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
 
mergeAccumulators(Iterable<double[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
 
mergeAccumulators(Iterable<Combine.Holder<V>>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
 
mergeAccumulators(Iterable<int[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
 
mergeAccumulators(Iterable<long[]>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
 
mergeAccumulators(Iterable<AccumT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
mergeAccumulators(Iterable<List<V>>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
mergeAccumulators(Iterable<Object[]>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
mergeAccumulators(Iterable<Object[]>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
mergeAccumulators(Iterable<AccumT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
Returns an accumulator representing the accumulation of all the input values accumulated in the merging accumulators.
mergeAccumulators(Iterable<Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
MergeContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
 
MergeOverlappingIntervalWindows - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards compatibility guarantees.
MergeOverlappingIntervalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.MergeOverlappingIntervalWindows
 
mergeWindows(WindowFn<Object, W>.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
mergeWindows(WindowFn<?, IntervalWindow>.MergeContext) - Static method in class org.apache.beam.sdk.transforms.windowing.MergeOverlappingIntervalWindows
Merge overlapping IntervalWindows.
mergeWindows(WindowFn<T, W>.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
 
mergeWindows(WindowFn<Object, IntervalWindow>.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
mergeWindows(WindowFn<T, W>.MergeContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Does whatever merging of windows is necessary.
Metadata(long, Instant, Instant, long, MetricsContainerStepMap) - Constructor for class org.apache.beam.runners.spark.io.SparkUnboundedSource.Metadata
 
metadata() - Method in class org.apache.beam.sdk.io.fs.MatchResult
MatchResult.Metadata of matched files.
Metadata() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
Metric - Interface in org.apache.beam.sdk.metrics
Marker interface for all user-facing metrics.
MetricName - Class in org.apache.beam.sdk.metrics
The name of a metric consists of a MetricName.namespace() and a MetricName.name().
MetricName() - Constructor for class org.apache.beam.sdk.metrics.MetricName
 
MetricNameFilter - Class in org.apache.beam.sdk.metrics
The name of a metric.
MetricNameFilter() - Constructor for class org.apache.beam.sdk.metrics.MetricNameFilter
 
MetricQueryResults - Interface in org.apache.beam.sdk.metrics
The results of a query for metrics.
metricRegistry() - Method in class org.apache.beam.runners.spark.metrics.AggregatorMetricSource
 
metricRegistry() - Method in class org.apache.beam.runners.spark.metrics.CompositeSource
 
metricRegistry() - Method in class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
MetricResult<T> - Interface in org.apache.beam.sdk.metrics
The results of a single current metric.
MetricResults - Class in org.apache.beam.sdk.metrics
Methods for interacting with the metrics of a pipeline that has been executed.
MetricResults() - Constructor for class org.apache.beam.sdk.metrics.MetricResults
 
metrics() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
metrics() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
metrics() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
 
metrics() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
metrics() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
metrics() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
Metrics - Class in org.apache.beam.sdk.metrics
The Metrics is a utility class for producing various kinds of metrics for reporting properties of an executing pipeline.
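A minimal sketch of declaring and updating a user counter from Metrics inside a DoFn; the namespace class and counter name are hypothetical:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    public class CountEmptyLinesFn extends DoFn<String, String> {
      private final Counter emptyLines = Metrics.counter(CountEmptyLinesFn.class, "emptyLines");

      @ProcessElement
      public void processElement(ProcessContext c) {
        if (c.element().isEmpty()) {
          emptyLines.inc();
        }
        c.output(c.element());
      }
    }
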
metrics() - Method in interface org.apache.beam.sdk.PipelineResult
Returns the object to access metrics from the pipeline.
MetricsAccumulator - Class in org.apache.beam.runners.flink.metrics
Accumulator of MetricsContainerStepMap.
MetricsAccumulator() - Constructor for class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
MetricsAccumulator - Class in org.apache.beam.runners.spark.metrics
For resilience, Accumulators are required to be wrapped in a Singleton.
MetricsAccumulator() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator
 
MetricsAccumulator.AccumulatorCheckpointingSparkListener - Class in org.apache.beam.runners.spark.metrics
Spark Listener which checkpoints MetricsContainerStepMap values for fault-tolerance.
MetricsContainer - Interface in org.apache.beam.sdk.metrics
Holds the metrics for a single step and unit-of-commit (bundle).
MetricsEnvironment - Class in org.apache.beam.sdk.metrics
Manages and provides the metrics container associated with each thread.
MetricsEnvironment() - Constructor for class org.apache.beam.sdk.metrics.MetricsEnvironment
 
MetricsFilter - Class in org.apache.beam.sdk.metrics
Simple POJO representing a filter for querying metrics.
MetricsFilter() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter
 
MetricsFilter.Builder - Class in org.apache.beam.sdk.metrics
Builder for creating a MetricsFilter.
MicrobatchSource<T,CheckpointMarkT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.runners.spark.io
A Source that accommodates Spark's micro-batch oriented nature and wraps an UnboundedSource.
MicrobatchSource.Reader - Class in org.apache.beam.runners.spark.io
Mostly based on BoundedReadFromUnboundedSource's UnboundedToBoundedSourceAdapter, with some adjustments for Spark specifics.
mimeType() - Method in class org.apache.beam.sdk.io.fs.CreateOptions
The file-like resource mime type.
min() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
Min - Class in org.apache.beam.sdk.transforms
PTransforms for computing the minimum of the elements in a PCollection, or the minimum of the values associated with each key in a PCollection of KVs.
modifyEnvironmentBeforeSubmission(Environment) - Method in class org.apache.beam.runners.dataflow.DataflowRunnerHooks
Allows the user to modify the environment of their job before their job is submitted to the service for execution.
MongoDbGridFSIO - Class in org.apache.beam.sdk.io.mongodb
IO to read and write data on MongoDB GridFS.
MongoDbGridFSIO() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
 
MongoDbGridFSIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.mongodb
Encapsulate the MongoDB GridFS connection logic.
MongoDbGridFSIO.Parser<T> - Interface in org.apache.beam.sdk.io.mongodb
Interface for the parser that is used to parse the GridFSDBFile into the appropriate types.
MongoDbGridFSIO.ParserCallback<T> - Interface in org.apache.beam.sdk.io.mongodb
Callback for the parser to use to submit data.
MongoDbGridFSIO.Read<T> - Class in org.apache.beam.sdk.io.mongodb
A PTransform to read data from MongoDB GridFS.
MongoDbGridFSIO.Read.BoundedGridFSSource - Class in org.apache.beam.sdk.io.mongodb
A BoundedSource for MongoDB GridFS.
MongoDbGridFSIO.Write<T> - Class in org.apache.beam.sdk.io.mongodb
A PTransform to write data to MongoDB GridFS.
MongoDbGridFSIO.WriteFn<T> - Interface in org.apache.beam.sdk.io.mongodb
Function that is called to write the data to the give GridFS OutputStream.
MongoDbIO - Class in org.apache.beam.sdk.io.mongodb
IO to read and write data on MongoDB.
MongoDbIO.Read - Class in org.apache.beam.sdk.io.mongodb
A PTransform to read data from MongoDB.
MongoDbIO.Write - Class in org.apache.beam.sdk.io.mongodb
A PTransform to write to a MongoDB database.
MonitoringUtil - Class in org.apache.beam.runners.dataflow.util
A helper class for monitoring jobs submitted to the service.
MonitoringUtil(DataflowClient) - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil
Construct a helper for monitoring.
MonitoringUtil.JobMessagesHandler - Interface in org.apache.beam.runners.dataflow.util
An interface that can be used for defining callbacks to receive a list of JobMessages containing monitoring information.
MonitoringUtil.LoggingHandler - Class in org.apache.beam.runners.dataflow.util
A handler that logs monitoring messages.
MonitoringUtil.TimeStampComparator - Class in org.apache.beam.runners.dataflow.util
Comparator for sorting rows in increasing order based on timestamp.
months(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by months.
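For example, a sketch of windowing keyed events into calendar months before summing; the element types are illustrative:

    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class MonthlyTotals {
      public static PCollection<KV<String, Long>> monthlyTotals(
          PCollection<KV<String, Long>> events) {
        return events
            // One window per calendar month, based on element timestamps.
            .apply(Window.<KV<String, Long>>into(CalendarWindows.months(1)))
            .apply(Sum.longsPerKey());
      }
    }
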
MoveOptions - Interface in org.apache.beam.sdk.io.fs
MoveOptions.StandardMoveOptions - Enum in org.apache.beam.sdk.io.fs
Defines the standard MoveOptions.
MqttIO - Class in org.apache.beam.sdk.io.mqtt
An unbounded source for an MQTT broker.
MqttIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.mqtt
A POJO describing an MQTT connection.
MqttIO.Read - Class in org.apache.beam.sdk.io.mqtt
A PTransform to read from an MQTT broker.
MqttIO.Write - Class in org.apache.beam.sdk.io.mqtt
A PTransform to write and send a message to an MQTT server.
multimapView(PCollection<KV<K, V>>, WindowingStrategy<?, W>, Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<Map<K, Iterable<V>>> capable of processing elements encoded using the provided Coder and windowed using the provided WindowingStrategy.
MultimapViewFn() - Constructor for class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
Deprecated.
 
multiOutputOverrideFactory() - Static method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
Returns a PTransformOverrideFactory that replaces a multi-output ParDo with a composite transform specialized for the DataflowRunner.

N

name() - Method in class org.apache.beam.sdk.metrics.MetricName
The name of this metric.
name() - Method in interface org.apache.beam.sdk.metrics.MetricResult
Return the name of the metric.
name - Variable in class org.apache.beam.sdk.transforms.PTransform
The base name of this PTransform, e.g., from defaults, or null if not yet assigned.
named(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricName
 
named(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.MetricName
 
named(String, String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
named(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
 
NamedAggregators - Class in org.apache.beam.runners.spark.aggregators
This class wraps a map of named aggregators.
NamedAggregators() - Constructor for class org.apache.beam.runners.spark.aggregators.NamedAggregators
Constructs a new NamedAggregators instance.
NamedAggregators(String, NamedAggregators.State<?, ?, ?>) - Constructor for class org.apache.beam.runners.spark.aggregators.NamedAggregators
Constructs a new named aggregators instance that contains a mapping from the specified name to the associated initial state.
NamedAggregators.CombineFunctionState<InputT,InterT,OutputT> - Class in org.apache.beam.runners.spark.aggregators
 
NamedAggregators.State<InputT,InterT,OutputT> - Interface in org.apache.beam.runners.spark.aggregators
 
names() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
namespace() - Method in class org.apache.beam.sdk.metrics.MetricName
The namespace associated with this metric.
naturalOrder(T) - Static method in class org.apache.beam.sdk.transforms.Max
 
naturalOrder() - Static method in class org.apache.beam.sdk.transforms.Max
 
naturalOrder(T) - Static method in class org.apache.beam.sdk.transforms.Min
 
naturalOrder() - Static method in class org.apache.beam.sdk.transforms.Min
 
NeedsRunner - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize TestPipeline for execution and expect to be executed by a PipelineRunner.
NESTED - Static variable in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
The nested context: the value being encoded or decoded is (potentially) a part of a larger record/stream contents, and may have other parts encoded or decoded after it.
nested() - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
Never - Class in org.apache.beam.sdk.transforms.windowing
A Trigger which never fires.
Never() - Constructor for class org.apache.beam.sdk.transforms.windowing.Never
 
Never.NeverTrigger - Class in org.apache.beam.sdk.transforms.windowing
The actual trigger class for Never triggers.
newClouddebuggerClient(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.DataflowTransport
 
newConfiguration(SerializableConfiguration) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
Returns a new populated Configuration object.
newDataflowClient(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.DataflowTransport
Returns a Google Cloud Dataflow client builder.
newJob(SerializableConfiguration) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
Returns a new configured Job object.
newTracker() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.HasDefaultTracker
Creates a new tracker for this.
newTracker() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRange
 
nextBatch(TimestampedValue<T>...) - Method in class org.apache.beam.runners.spark.io.CreateStream
Enqueue next micro-batch elements.
nextBatch(T...) - Method in class org.apache.beam.runners.spark.io.CreateStream
For non-timestamped elements.
NO_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
PaneInfo to use for elements on (and before) initial window assignment (including elements read from sources) before they have passed through a GroupByKey and are associated with a particular trigger firing.
NON_PARALLEL_INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
NonDeterministicException(Coder<?>, String, Coder.NonDeterministicException) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
NonDeterministicException(Coder<?>, String) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
NonDeterministicException(Coder<?>, List<String>) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
NonDeterministicException(Coder<?>, List<String>, Coder.NonDeterministicException) - Constructor for exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
 
none() - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
Default empty DisplayData instance.
NonMergingWindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
Abstract base class for WindowFns that do not merge windows.
NonMergingWindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
 
NoopCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
Construct an OAuth credential to be used by the SDK and the SDK workers.
NoopCredentialFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
 
NoopPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
For internal use only; no backwards compatibility guarantees.
notEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Asserts that the value in question is not equal to the provided value, according to Object.equals(java.lang.Object).
NullableCoder<T> - Class in org.apache.beam.sdk.coders
A NullableCoder encodes nullable values of type T using a nested Coder<T> that does not tolerate null values.
nullContext() - Static method in class org.apache.beam.sdk.state.StateContexts
Returns a fake StateContext.
NullCredentialInitializer - Class in org.apache.beam.sdk.extensions.gcp.auth
An HttpRequestInitializer for requests that don't have credentials.
NullCredentialInitializer() - Constructor for class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
nulls() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for nulls/Void.
NUM_METADATA_SHARD_CODERS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
NUM_QUERY_SPLITS_MAX - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
An upper bound on the number of splits for a query.
NUM_SHARD_CODERS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
NUM_SHARDS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 

O

OBJECT_TYPE_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
of(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.apex.ApexRunner.CreateApexPCollectionView
 
of(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
 
of() - Static method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
of(Coder<T>, Duration) - Static method in class org.apache.beam.runners.spark.io.CreateStream
Set the batch interval for the stream.
of(NamedAggregators) - Static method in class org.apache.beam.runners.spark.metrics.AggregatorMetric
 
of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.AvroCoder
Returns an AvroCoder instance for the provided element type.
of(Class<T>) - Static method in class org.apache.beam.sdk.coders.AvroCoder
Returns an AvroCoder instance for the provided element class.
of(Schema) - Static method in class org.apache.beam.sdk.coders.AvroCoder
Returns an AvroCoder instance for the Avro schema.
of(Class<T>, Schema) - Static method in class org.apache.beam.sdk.coders.AvroCoder
Returns an AvroCoder instance for the provided element type using the provided Avro schema.
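For orientation, a minimal sketch of the AvroCoder.of factories in use; MyRecord is a hypothetical Avro-compatible class and p is an assumed Pipeline:

    // Explicitly attach an AvroCoder to a PCollection of a custom type.
    PCollection<MyRecord> records = p.apply(
        Create.of(new MyRecord("a"), new MyRecord("b"))
            .withCoder(AvroCoder.of(MyRecord.class)));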
of() - Static method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
of() - Static method in class org.apache.beam.sdk.coders.BitSetCoder
 
of() - Static method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
of() - Static method in class org.apache.beam.sdk.coders.ByteCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.CollectionCoder
 
of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
 
of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
 
of() - Static method in class org.apache.beam.sdk.coders.DoubleCoder
 
of() - Static method in class org.apache.beam.sdk.coders.DurationCoder
 
of() - Static method in class org.apache.beam.sdk.coders.InstantCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.IterableCoder
 
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.KvCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.LengthPrefixCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.ListCoder
 
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.MapCoder
Produces a MapCoder with the given keyCoder and valueCoder.
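These coder factories compose; a small illustrative sketch of building a coder for KV<String, Map<String, Long>> values:

    // Nested coders: KV of String keys and Map<String, Long> values.
    Coder<KV<String, Map<String, Long>>> coder =
        KvCoder.of(
            StringUtf8Coder.of(),
            MapCoder.of(StringUtf8Coder.of(), VarLongCoder.of()));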
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.NullableCoder
 
of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
Returns a SerializableCoder instance for the provided element type.
of(Class<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
Returns a SerializableCoder instance for the provided element class.
of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SetCoder
Produces a SetCoder with the given elementCoder.
of(Class<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
of(Class<T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
of() - Static method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
of() - Static method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
of() - Static method in class org.apache.beam.sdk.coders.VarIntCoder
 
of() - Static method in class org.apache.beam.sdk.coders.VarLongCoder
 
of() - Static method in class org.apache.beam.sdk.coders.VoidCoder
 
of(Class<? extends OutputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.AsJsons
Creates an AsJsons PTransform that will transform a PCollection<InputT> into a PCollection of JSON Strings representing those objects using a Jackson ObjectMapper.
of(Class<? extends OutputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
Creates a ParseJsons PTransform that will parse JSON Strings into a PCollection<OutputT> using a Jackson ObjectMapper.
of() - Static method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a ProtoCoder for the given Protocol Buffers Message.
of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a ProtoCoder for the Protocol Buffers Message indicated by the given TypeDescriptor.
of(Coder<BoundedWindow>) - Static method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
of(Class<T>) - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
Returns a WritableCoder instance for the provided element class.
of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
of(int...) - Static method in class org.apache.beam.sdk.io.range.ByteKey
Creates a new ByteKey backed by a copy of the specified int[].
of(ByteKey, ByteKey) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRange
Creates a new ByteKeyRange with the given start and end keys.
of(ByteKeyRange) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
Instantiates a new ByteKeyRangeTracker with the specified range.
of(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.JAXBCoder
Create a coder for a given type of JAXB annotated objects.
of(ValueProvider<X>, SerializableFunction<X, T>) - Static method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
Creates a ValueProvider.NestedValueProvider that wraps the provided value.
of(T) - Static method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
Creates a ValueProvider.StaticValueProvider that wraps the provided value.
of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
Returns a CombineFn that uses the given SerializableFunction to combine values.
of(SerializableFunction<Iterable<V>, V>, int) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
Returns a CombineFn that uses the given SerializableFunction to combine values, attempting to buffer at least bufferSize values between invocations.
of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
Deprecated.
of(Iterable<T>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces a PCollection containing elements of the provided Iterable.
of(T, T...) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces a PCollection containing the specified elements.
of(Map<K, V>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.Values transform that produces a PCollection of KVs corresponding to the keys and values of the specified Map.
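A brief sketch of the Create.of variants above (p is an assumed Pipeline; java.util.Arrays is assumed imported):

    PCollection<String> words = p.apply(Create.of("hello", "world", "beam"));
    PCollection<Integer> nums = p.apply(Create.of(Arrays.asList(1, 2, 3)));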
of(DisplayData.Path, Class<?>, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.DoFnTester
Returns a DoFnTester supporting unit-testing of the given DoFn.
of(CoGbkResultSchema, UnionCoder) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
Returns a CoGbkResult.CoGbkResultCoder for the given schema and UnionCoder.
of(TupleTag<V>, List<V>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
Returns a new CoGbkResult that contains just the given tag and given data.
of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
of(TupleTag<InputT>, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
Returns a new KeyedPCollectionTuple<K> with the given tag and initial PCollection.
of(List<Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.join.UnionCoder
Builds a union coder with the given list of element coders.
of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
A CombineFn that computes the maximum of a collection of elements of type T using an arbitrary Comparator and identity, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
A CombineFn that computes the maximum of a collection of elements of type T using an arbitrary Comparator, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of() - Static method in class org.apache.beam.sdk.transforms.Mean
A Combine.CombineFn that computes the arithmetic mean (a.k.a. average) of its input values.
of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
A CombineFn that computes the minimum of a collection of elements of type T using an arbitrary Comparator and an identity, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
A CombineFn that computes the minimum of a collection of elements of type T using an arbitrary Comparator, useful as an argument to Combine.globally(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>) or Combine.perKey(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Iterable<V>, V>).
of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.ParDo
Creates a ParDo PTransform that will invoke the given DoFn function.
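A minimal ParDo.of sketch; the DoFn body is illustrative and words is the PCollection<String> assumed above:

    // Map each word to its length.
    PCollection<Integer> lengths = words.apply(
        ParDo.of(new DoFn<String, Integer>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            c.output(c.element().length());
          }
        }));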
of(int, Partition.PartitionFn<? super T>) - Static method in class org.apache.beam.sdk.transforms.Partition
Returns a new Partition PTransform that divides its input PCollection into the given number of partitions, using the given partitioning function.
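A hypothetical even/odd split using Partition.of:

    PCollectionList<Integer> parts = nums.apply(
        Partition.of(2, new Partition.PartitionFn<Integer>() {
          @Override
          public int partitionFor(Integer n, int numPartitions) {
            return Math.floorMod(n, 2);  // partition 0 = even, 1 = odd
          }
        }));
    PCollection<Integer> evens = parts.get(0);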
of() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
Deprecated.
 
of(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the largest count elements of the input PCollection<T>, in decreasing order, sorted using the given Comparator<T>.
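Sketch of Top.of with a user-supplied comparator; the comparator must also be Serializable, and LongestFirst is declared here purely for illustration (assume a static context so no enclosing instance is captured):

    class LongestFirst implements Comparator<String>, Serializable {
      @Override
      public int compare(String a, String b) {
        return Integer.compare(a.length(), b.length());
      }
    }
    // The three longest words, in decreasing order of length.
    PCollection<List<String>> longest3 = words.apply(Top.of(3, new LongestFirst()));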
of(PCollectionView<ViewT>) - Static method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
 
of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
Returns an AfterAll Trigger with the given subtriggers.
of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
Returns an AfterAll Trigger with the given subtriggers.
of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
Returns an AfterFirst Trigger with the given subtriggers.
of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
Returns an AfterFirst Trigger with the given subtriggers.
of() - Static method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
Returns the default trigger.
of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
Partitions the timestamp space into half-open intervals of the form [N * size, (N + 1) * size), where 0 is the epoch.
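For example, one-minute fixed windows (Duration here is org.joda.time.Duration, as used throughout the SDK):

    PCollection<String> windowed = words.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));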
of() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
of() - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch.
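Similarly, a sketch of hour-long windows sliding every ten minutes:

    PCollection<String> sliding = words.apply(
        Window.<String>into(
            SlidingWindows.of(Duration.standardHours(1))
                .every(Duration.standardMinutes(10))));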
of(SerializableFunction<V, K>) - Static method in class org.apache.beam.sdk.transforms.WithKeys
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with a key computed from the value by invoking the given SerializableFunction.
of(K) - Static method in class org.apache.beam.sdk.transforms.WithKeys
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with the given key.
of(SerializableFunction<T, Instant>) - Static method in class org.apache.beam.sdk.transforms.WithTimestamps
For a SerializableFunction fn from T to Instant, returns a PTransform that takes an input PCollection<T> and outputs a PCollection<T> containing every element v of the input, where each element is output with the timestamp obtained from fn.apply(v).
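A compact sketch of the two preceding entries; the constant key and the use of Instant.now() are illustrative only (real pipelines usually derive the timestamp from the element):

    PCollection<KV<String, String>> keyed =
        words.apply(WithKeys.<String, String>of("all"));
    PCollection<String> stamped = words.apply(
        WithTimestamps.of(new SerializableFunction<String, Instant>() {
          @Override
          public Instant apply(String s) {
            return Instant.now();  // placeholder timestamp function
          }
        }));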
of(K, V) - Static method in class org.apache.beam.sdk.values.KV
Returns a KV with the given key and value.
of(PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionList
Returns a singleton PCollectionList containing the given PCollection.
of(Iterable<PCollection<T>>) - Static method in class org.apache.beam.sdk.values.PCollectionList
Returns a PCollectionList containing the given PCollections, in order.
of(TupleTag<T>, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
Returns a singleton PCollectionTuple containing the given PCollection keyed by the given TupleTag.
of(TupleTag<?>, PValue) - Static method in class org.apache.beam.sdk.values.TaggedPValue
 
of(V, Instant) - Static method in class org.apache.beam.sdk.values.TimestampedValue
Returns a new TimestampedValue with the given value and timestamp.
of(Coder<T>) - Static method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
of(TupleTag<?>) - Static method in class org.apache.beam.sdk.values.TupleTagList
Returns a singleton TupleTagList containing the given TupleTag.
of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.values.TupleTagList
Returns a TupleTagList containing the given TupleTags, in order.
of(Class<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeDescriptor representing the given type.
of(Type) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeDescriptor representing the given type.
of(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
of(T, Instant, BoundedWindow, PaneInfo) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow
 
of(Coder<ValueT>) - Static method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
of(WindowFn<T, W>) - Static method in class org.apache.beam.sdk.values.WindowingStrategy
 
ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Max
ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Min
ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Sum
ofExpandedValue(PValue) - Static method in class org.apache.beam.sdk.values.TaggedPValue
 
offerCoders(Coder[]) - Method in interface org.apache.beam.sdk.state.StateSpec
For internal use only; no backwards-compatibility guarantees.
ofFirstElement() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
offset(Duration) - Method in interface org.apache.beam.sdk.state.Timer
Offsets the target timestamp used by Timer.setRelative() by the given duration.
OFFSET_INFINITY - Static variable in class org.apache.beam.sdk.io.range.OffsetRangeTracker
Offset corresponding to infinity.
OffsetBasedReader(OffsetBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
OffsetBasedSource<T> - Class in org.apache.beam.sdk.io
A BoundedSource that uses offsets to define starting and ending positions.
OffsetBasedSource(long, long, long) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource
 
OffsetBasedSource.OffsetBasedReader<T> - Class in org.apache.beam.sdk.io
A Source.Reader that implements code common to readers of all OffsetBasedSources.
OffsetRange - Class in org.apache.beam.sdk.transforms.splittabledofn
A restriction represented by a range of integers [from, to).
OffsetRange(long, long) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.OffsetRange
 
OffsetRangeTracker - Class in org.apache.beam.sdk.io.range
A RangeTracker for non-negative positions of type long.
OffsetRangeTracker(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRangeTracker
Creates an OffsetRangeTracker for the specified range.
OffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
A RestrictionTracker for claiming offsets in an OffsetRange in a monotonically increasing fashion.
OffsetRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
 
ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Max
ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Min
ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Sum
ofLongs() - Static method in class org.apache.beam.sdk.transforms.Max
ofLongs() - Static method in class org.apache.beam.sdk.transforms.Min
ofLongs() - Static method in class org.apache.beam.sdk.transforms.Sum
ofPrimitiveOutputsInternal(Pipeline, TupleTagList, WindowingStrategy<?, ?>, PCollection.IsBounded) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
For internal use only; no backwards-compatibility guarantees.
ofSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
 
ON_TIME_AND_ONLY_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
PaneInfo to use when there will be exactly one firing and it is on time.
onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
 
onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
 
onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarksListener
 
OnceTrigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
 
onTimer(String, BoundedWindow, Instant, TimeDomain) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
OnTimerContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
 
open(ResourceIdT) - Method in class org.apache.beam.sdk.io.FileSystem
Returns a read channel for the given ResourceIdT.
open(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
Returns a read channel for the given ResourceId.
openUnwindowed(String, int) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
openWindowed(String, BoundedWindow, PaneInfo, int) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Performs bundle initialization.
Options() - Constructor for class org.apache.beam.runners.apex.ApexRunnerRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
 
Options() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
 
options() - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
OrderByKey() - Constructor for class org.apache.beam.sdk.values.KV.OrderByKey
 
OrderByValue() - Constructor for class org.apache.beam.sdk.values.KV.OrderByValue
 
orFinally(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
Specify an ending condition for this trigger.
OrFinallyTrigger - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that executes according to its main trigger until its "finally" trigger fires.
org.apache.beam.runners.apex - package org.apache.beam.runners.apex
Implementation of the Beam runner for Apache Apex.
org.apache.beam.runners.dataflow - package org.apache.beam.runners.dataflow
Provides a Beam runner that executes pipelines on the Google Cloud Dataflow service.
org.apache.beam.runners.dataflow.options - package org.apache.beam.runners.dataflow.options
Provides PipelineOptions specific to Google Cloud Dataflow.
org.apache.beam.runners.dataflow.util - package org.apache.beam.runners.dataflow.util
Provides miscellaneous internal utilities used by the Google Cloud Dataflow runner.
org.apache.beam.runners.direct - package org.apache.beam.runners.direct
Defines the PipelineOptions.DirectRunner which executes both Bounded and Unbounded Pipelines on the local machine.
org.apache.beam.runners.flink - package org.apache.beam.runners.flink
Internal implementation of the Beam runner for Apache Flink.
org.apache.beam.runners.flink.metrics - package org.apache.beam.runners.flink.metrics
Internal metrics implementation of the Beam runner for Apache Flink.
org.apache.beam.runners.spark - package org.apache.beam.runners.spark
Internal implementation of the Beam runner for Apache Spark.
org.apache.beam.runners.spark.aggregators - package org.apache.beam.runners.spark.aggregators
Provides internal utilities for implementing Beam aggregators using Spark accumulators.
org.apache.beam.runners.spark.aggregators.metrics - package org.apache.beam.runners.spark.aggregators.metrics
Defines classes for integrating with Spark's metrics mechanism (Sinks, Sources, etc.).
org.apache.beam.runners.spark.coders - package org.apache.beam.runners.spark.coders
Beam coders and coder-related utilities for running on Apache Spark.
org.apache.beam.runners.spark.io - package org.apache.beam.runners.spark.io
Spark-specific transforms for I/O.
org.apache.beam.runners.spark.metrics - package org.apache.beam.runners.spark.metrics
Provides internal utilities for implementing Beam metrics using Spark accumulators.
org.apache.beam.runners.spark.metrics.sink - package org.apache.beam.runners.spark.metrics.sink
Spark sinks that support Beam metrics and aggregators.
org.apache.beam.runners.spark.stateful - package org.apache.beam.runners.spark.stateful
Spark-specific stateful operators.
org.apache.beam.runners.spark.util - package org.apache.beam.runners.spark.util
Internal utilities to translate Beam pipelines to Spark.
org.apache.beam.sdk - package org.apache.beam.sdk
Provides a simple, powerful model for building both batch and streaming parallel data processing Pipelines.
org.apache.beam.sdk.annotations - package org.apache.beam.sdk.annotations
Defines annotations used across the SDK.
org.apache.beam.sdk.coders - package org.apache.beam.sdk.coders
Defines Coders to specify how data is encoded to and decoded from byte strings.
org.apache.beam.sdk.extensions.gcp.auth - package org.apache.beam.sdk.extensions.gcp.auth
Defines classes related to interacting with Credentials for pipeline creation and execution containing Google Cloud Platform components.
org.apache.beam.sdk.extensions.gcp.options - package org.apache.beam.sdk.extensions.gcp.options
Defines PipelineOptions for configuring pipeline execution for Google Cloud Platform components.
org.apache.beam.sdk.extensions.gcp.storage - package org.apache.beam.sdk.extensions.gcp.storage
Defines IO connectors for Google Cloud Storage.
org.apache.beam.sdk.extensions.jackson - package org.apache.beam.sdk.extensions.jackson
Utilities for parsing and creating JSON serialized objects.
org.apache.beam.sdk.extensions.joinlibrary - package org.apache.beam.sdk.extensions.joinlibrary
Utilities for performing SQL-style joins of keyed PCollections.
org.apache.beam.sdk.extensions.protobuf - package org.apache.beam.sdk.extensions.protobuf
Defines a Coder for Protocol Buffers messages, ProtoCoder.
org.apache.beam.sdk.extensions.sorter - package org.apache.beam.sdk.extensions.sorter
Utility for performing local sort of potentially large sets of values.
org.apache.beam.sdk.io - package org.apache.beam.sdk.io
Defines transforms for reading and writing common storage formats, including AvroIO and TextIO.
org.apache.beam.sdk.io.elasticsearch - package org.apache.beam.sdk.io.elasticsearch
Transforms for reading from and writing to Elasticsearch.
org.apache.beam.sdk.io.fs - package org.apache.beam.sdk.io.fs
Apache Beam FileSystem interfaces and their default implementations.
org.apache.beam.sdk.io.gcp.bigquery - package org.apache.beam.sdk.io.gcp.bigquery
Defines transforms for reading from and writing to Google BigQuery.
org.apache.beam.sdk.io.gcp.bigtable - package org.apache.beam.sdk.io.gcp.bigtable
Defines transforms for reading from and writing to Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.common - package org.apache.beam.sdk.io.gcp.common
Defines common Google Cloud Platform IO support classes.
org.apache.beam.sdk.io.gcp.datastore - package org.apache.beam.sdk.io.gcp.datastore
Provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
org.apache.beam.sdk.io.gcp.pubsub - package org.apache.beam.sdk.io.gcp.pubsub
Defines transforms for reading from and writing to Google Cloud Pub/Sub.
org.apache.beam.sdk.io.gcp.testing - package org.apache.beam.sdk.io.gcp.testing
Defines utilities for unit testing Google Cloud Platform components of Apache Beam pipelines.
org.apache.beam.sdk.io.hadoop - package org.apache.beam.sdk.io.hadoop
Classes shared by Hadoop based IOs.
org.apache.beam.sdk.io.hadoop.inputformat - package org.apache.beam.sdk.io.hadoop.inputformat
Defines transforms for reading from data sources that implement the Hadoop InputFormat interface.
org.apache.beam.sdk.io.hbase - package org.apache.beam.sdk.io.hbase
Defines transforms for reading from and writing to HBase.
org.apache.beam.sdk.io.hdfs - package org.apache.beam.sdk.io.hdfs
FileSystem implementation for any Hadoop FileSystem.
org.apache.beam.sdk.io.jdbc - package org.apache.beam.sdk.io.jdbc
Transforms for reading from and writing to JDBC.
org.apache.beam.sdk.io.jms - package org.apache.beam.sdk.io.jms
Transforms for reading from and writing to JMS (Java Message Service).
org.apache.beam.sdk.io.kafka - package org.apache.beam.sdk.io.kafka
Transforms for reading from and writing to Apache Kafka.
org.apache.beam.sdk.io.kafka.serialization - package org.apache.beam.sdk.io.kafka.serialization
Kafka serializers and deserializers.
org.apache.beam.sdk.io.kinesis - package org.apache.beam.sdk.io.kinesis
Transforms for reading from and writing to Amazon Kinesis.
org.apache.beam.sdk.io.mongodb - package org.apache.beam.sdk.io.mongodb
Transforms for reading from and writing to MongoDB.
org.apache.beam.sdk.io.mqtt - package org.apache.beam.sdk.io.mqtt
Transforms for reading from and writing to MQTT.
org.apache.beam.sdk.io.range - package org.apache.beam.sdk.io.range
Provides thread-safe helpers for implementing dynamic work rebalancing in position-based bounded sources.
org.apache.beam.sdk.io.xml - package org.apache.beam.sdk.io.xml
Transforms for reading and writing XML files.
org.apache.beam.sdk.metrics - package org.apache.beam.sdk.metrics
Metrics allow exporting information about the execution of a pipeline.
org.apache.beam.sdk.options - package org.apache.beam.sdk.options
Defines PipelineOptions for configuring pipeline execution.
org.apache.beam.sdk.state - package org.apache.beam.sdk.state
Classes and interfaces for interacting with state.
org.apache.beam.sdk.testing - package org.apache.beam.sdk.testing
Defines utilities for unit testing Apache Beam pipelines.
org.apache.beam.sdk.transforms - package org.apache.beam.sdk.transforms
Defines PTransforms for transforming data in a pipeline.
org.apache.beam.sdk.transforms.display - package org.apache.beam.sdk.transforms.display
Defines HasDisplayData for annotating components which provide display data used within UIs and diagnostic tools.
org.apache.beam.sdk.transforms.join - package org.apache.beam.sdk.transforms.join
Defines the CoGroupByKey transform for joining multiple PCollections.
org.apache.beam.sdk.transforms.splittabledofn - package org.apache.beam.sdk.transforms.splittabledofn
Defines utilities related to splittable DoFn.
org.apache.beam.sdk.transforms.windowing - package org.apache.beam.sdk.transforms.windowing
Defines the Window transform for dividing the elements in a PCollection into windows, and the Trigger for controlling when those elements are output.
org.apache.beam.sdk.values - package org.apache.beam.sdk.values
Defines PCollection and other classes for representing data in a Pipeline.
out() - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
 
out(int) - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
 
OUTER - Static variable in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
The outer context: the value being encoded or decoded takes up the remainder of the record/stream contents.
OUTPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
output(T) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
Output the object.
output(T, Instant) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
Output the object using the specified timestamp.
output(OutputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
Adds the given element to the main output PCollection at the given timestamp in the given window.
output(TupleTag<T>, T, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
Adds the given element to the output PCollection with the given tag at the given timestamp in the given window.
output(T) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
 
output(OutputT) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the main output PCollection.
output(TupleTag<T>, T) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the output PCollection with the given tag.
OUTPUT_INFO - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
OUTPUT_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
OutputReference - Class in org.apache.beam.runners.dataflow.util
A representation used by Steps to reference the output of other Steps.
OutputReference(String, String) - Constructor for class org.apache.beam.runners.dataflow.util.OutputReference
 
outputRuntimeOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptions
Returns a map of properties which correspond to ValueProvider.RuntimeValueProvider, keyed by the property name.
outputWithTimestamp(OutputT, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the main output PCollection, with the given timestamp.
outputWithTimestamp(TupleTag<T>, T, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
Adds the given element to the specified output PCollection, with the given timestamp.
overlaps(ByteKeyRange) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns true if the specified ByteKeyRange overlaps this range.

P

pane() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns information about the pane within this window into which the input element has been assigned.
PaneInfo - Class in org.apache.beam.sdk.transforms.windowing
Provides information about the pane an element belongs to.
PaneInfo.PaneInfoCoder - Class in org.apache.beam.sdk.transforms.windowing
A Coder for encoding PaneInfo instances.
PaneInfo.Timing - Enum in org.apache.beam.sdk.transforms.windowing
Enumerates the possibilities for the timing of this pane firing related to the input and output watermarks for its computation.
PARALLEL_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ParDo - Class in org.apache.beam.sdk.transforms
ParDo is the core element-wise transform in Apache Beam, invoking a user-specified function on each of the elements of the input PCollection to produce zero or more output elements, all of which are collected into the output PCollection.
ParDo() - Constructor for class org.apache.beam.sdk.transforms.ParDo
 
ParDo.MultiOutput<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A PTransform that, when applied to a PCollection<InputT>, invokes a user-specified DoFn<InputT, OutputT> on all its elements, which can emit elements to any of the PTransform's output PCollections, which are bundled into a result PCollectionTuple.
ParDo.SingleOutput<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A PTransform that, when applied to a PCollection<InputT>, invokes a user-specified DoFn<InputT, OutputT> on all its elements, with all its outputs collected into an output PCollection<OutputT>.
parse(GridFSDBFile, MongoDbGridFSIO.ParserCallback<T>) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Parser
 
ParseJsons<OutputT> - Class in org.apache.beam.sdk.extensions.jackson
PTransform for parsing JSON Strings.
parseTableSpec(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Parse a table specification in the form "[project_id]:[dataset_id].[table_id]" or "[dataset_id].[table_id]".
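A short illustration (the table spec string is a placeholder; the returned value is a BigQuery TableReference):

    TableReference ref =
        BigQueryHelpers.parseTableSpec("my-project:my_dataset.my_table");
    String dataset = ref.getDatasetId();  // "my_dataset"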
Partition<T> - Class in org.apache.beam.sdk.transforms
Partition takes a PCollection<T> and a PartitionFn, uses the PartitionFn to split the elements of the input PCollection into N partitions, and returns a PCollectionList<T> that bundles N PCollection<T>s containing the split elements.
Partition.PartitionFn<T> - Interface in org.apache.beam.sdk.transforms
A function object that chooses an output partition for an element.
partitioner() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
partitionFor(T, int) - Method in interface org.apache.beam.sdk.transforms.Partition.PartitionFn
Chooses the partition into which to put the given element.
PartitioningWindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that places each value into exactly one window based on its timestamp and never merges windows.
PartitioningWindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
 
PartitionMark(String, int, long) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
PAssert - Class in org.apache.beam.sdk.testing
An assertion on the contents of a PCollection incorporated into the pipeline.
PAssert.DefaultConcludeTransform - Class in org.apache.beam.sdk.testing
Default transform to check that a PAssert was successful.
PAssert.GroupThenAssert<T> - Class in org.apache.beam.sdk.testing
A transform that applies an assertion-checking function over iterables of ActualT to the entirety of the contents of its input.
PAssert.GroupThenAssertForSingleton<T> - Class in org.apache.beam.sdk.testing
A transform that applies an assertion-checking function to a single iterable contained as the sole element of a PCollection.
PAssert.IterableAssert<T> - Interface in org.apache.beam.sdk.testing
Builder interface for assertions applicable to iterables and PCollection contents.
PAssert.OneSideInputAssert<ActualT> - Class in org.apache.beam.sdk.testing
An assertion checker that takes a single PCollectionView<ActualT> and an assertion over ActualT, and checks it within a Beam pipeline.
PAssert.PAssertionSite - Class in org.apache.beam.sdk.testing
Track the place where an assertion is defined.
PAssert.PCollectionContentsAssert<T> - Class in org.apache.beam.sdk.testing
A PAssert.IterableAssert about the contents of a PCollection.
PAssert.PCollectionContentsAssert.MatcherCheckerFn<T> - Class in org.apache.beam.sdk.testing
Check that the passed-in matchers match the existing data.
PAssert.SingletonAssert<T> - Interface in org.apache.beam.sdk.testing
Builder interface for assertions applicable to a single value.
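A minimal PAssert sketch, assuming p is a TestPipeline:

    PCollection<Integer> out = p.apply(Create.of(1, 2, 3));
    PAssert.that(out).containsInAnyOrder(3, 1, 2);
    PAssert.thatSingleton(out.apply(Sum.integersGlobally())).isEqualTo(6);
    p.run();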
pastEndOfWindow() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark
Creates a trigger that fires when the watermark passes the end of the window.
pastFirstElementInPane() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Creates a trigger that fires when the current processing time passes the processing time at which this trigger saw the first element in a pane.
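A sketch combining the two triggers above with windowing; the specific durations are illustrative:

    PCollection<String> triggered = words.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
            .triggering(
                AfterWatermark.pastEndOfWindow()
                    .withEarlyFirings(
                        AfterProcessingTime.pastFirstElementInPane()
                            .plusDelayOf(Duration.standardMinutes(1))))
            .withAllowedLateness(Duration.standardMinutes(5))
            .discardingFiredPanes());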
PathValidator - Interface in org.apache.beam.sdk.extensions.gcp.storage
For internal use only; no backwards compatibility guarantees.
PathValidatorFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
 
PBegin - Class in org.apache.beam.sdk.values
PBegin is the "input" to a root PTransform, such as Read or Create.
PBegin(Pipeline) - Constructor for class org.apache.beam.sdk.values.PBegin
Constructs a PBegin in the given Pipeline.
PCollection<T> - Class in org.apache.beam.sdk.values
A PCollection<T> is an immutable collection of values of type T.
PCollection.IsBounded - Enum in org.apache.beam.sdk.values
The enumeration of cases for whether a PCollection is bounded.
PCollectionContentsAssert(PCollection<T>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
PCollectionContentsAssert(PCollection<T>, PAssert.AssertionWindows, SimpleFunction<Iterable<ValueInSingleWindow<T>>, Iterable<T>>, PAssert.PAssertionSite) - Constructor for class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
PCollectionList<T> - Class in org.apache.beam.sdk.values
A PCollectionList<T> is an immutable list of homogeneously typed PCollection<T>s.
pCollections() - Static method in class org.apache.beam.sdk.transforms.Flatten
Returns a PTransform that flattens a PCollectionList into a PCollection containing all the elements of all the PCollections in its input.
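For example, merging two PCollections of the same type (pc1 and pc2 are assumed):

    PCollection<String> merged =
        PCollectionList.of(pc1).and(pc2)
            .apply(Flatten.<String>pCollections());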
PCollectionTuple - Class in org.apache.beam.sdk.values
A PCollectionTuple is an immutable tuple of heterogeneously-typed PCollections, "keyed" by TupleTags.
PCollectionView<T> - Interface in org.apache.beam.sdk.values
A PCollectionView<T> is an immutable view of a PCollection as a value of type T that can be accessed as a side input to a ParDo transform.
PCollectionViews - Class in org.apache.beam.sdk.values
For internal use only; no backwards compatibility guarantees.
PCollectionViews() - Constructor for class org.apache.beam.sdk.values.PCollectionViews
 
PCollectionViews.IterableViewFn<T> - Class in org.apache.beam.sdk.values
Deprecated.
Beam views are migrating off of Iterable<WindowedValue<T>> as a primitive view type.
PCollectionViews.ListViewFn<T> - Class in org.apache.beam.sdk.values
Deprecated.
Beam views are migrating off of Iterable<WindowedValue<T>> as a primitive view type.
PCollectionViews.MapViewFn<K,V> - Class in org.apache.beam.sdk.values
Deprecated.
Beam views are migrating off of Iterable<WindowedValue<T>> as a primitive view type.
PCollectionViews.MultimapViewFn<K,V> - Class in org.apache.beam.sdk.values
Deprecated.
Beam views are migrating off of Iterable<WindowedValue<T>> as a primitive view type.
PCollectionViews.SimplePCollectionView<ElemT,ViewT,W extends BoundedWindow> - Class in org.apache.beam.sdk.values
A class for PCollectionView implementations, with additional type parameters that are not visible at pipeline assembly time when the view is used as a side input.
PCollectionViews.SingletonViewFn<T> - Class in org.apache.beam.sdk.values
Deprecated.
Beam views are migrating off of Iterable<WindowedValue<T>> as a primitive view type.
PDone - Class in org.apache.beam.sdk.values
PDone is the output of a PTransform that has a trivial result, such as a WriteFiles.
peekOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the main output.
peekOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the output with the given tag.
peekOutputElementsInWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the main output in the provided window with associated timestamps.
peekOutputElementsInWindow(TupleTag<OutputT>, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the specified output in the provided window with associated timestamps.
peekOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the main output with associated timestamps.
perElement() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a PTransform that counts the number of occurrences of each element in its input PCollection.
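For example:

    // Word counts: each distinct word paired with its number of occurrences.
    PCollection<KV<String, Long>> counts = words.apply(Count.<String>perElement());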
perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Returns a PTransform that takes a PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to a List of the approximate N-tiles of the values associated with that key in the input PCollection.
perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
Like ApproximateQuantiles.perKey(int, Comparator), but sorts values using their natural ordering.
perKey(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Returns a PTransform that takes a PCollection<KV<K, V>> and returns a PCollection<KV<K, Long>> that contains an output element mapping each distinct key in the input PCollection to an estimate of the number of distinct values associated with that key in the input PCollection.
perKey(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
Like ApproximateUnique.perKey(int), but specifies the desired maximum estimation error instead of the sample size.
perKey(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.PerKey PTransform that first groups its input PCollection of KVs by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns a PCollection of KVs mapping each distinct key to its combined value for each window.
perKey(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
Returns a Combine.PerKey PTransform that first groups its input PCollection of KVs by keys and windows, then invokes the given function on each of the values lists to produce a combined value, and then returns a PCollection of KVs mapping each distinct key to its combined value for each window.
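A brief sketch, assuming keyedInts is a PCollection<KV<String, Integer>>; the explicit type arguments are included only to make the types visible:

    PCollection<KV<String, Integer>> totals = keyedInts.apply(
        Combine.<String, Integer, Integer>perKey(Sum.ofIntegers()));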
perKey() - Static method in class org.apache.beam.sdk.transforms.Count
Returns a PTransform that counts the number of elements associated with each key of its input PCollection.
perKey() - Static method in class org.apache.beam.sdk.transforms.Latest
Returns a PTransform that takes as input a PCollection<KV<K, V>> and returns a PCollection<KV<K, V>> whose contents are the latest element per key, according to its event time.
perKey() - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains an output element mapping each distinct key in the input PCollection to the maximum according to the natural ordering of T of the values associated with that key in the input PCollection.
perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains one output element per key mapping each to the maximum of the values associated with that key in the input PCollection.
perKey() - Static method in class org.apache.beam.sdk.transforms.Mean
Returns a PTransform that takes an input PCollection<KV<K, N>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the mean of the values associated with that key in the input PCollection.
perKey() - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains an output element mapping each distinct key in the input PCollection to the minimum according to the natural ordering of T of the values associated with that key in the input PCollection.
perKey(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
Returns a PTransform that takes an input PCollection<KV<K, T>> and returns a PCollection<KV<K, T>> that contains one output element per key mapping each to the minimum of the values associated with that key in the input PCollection.
perKey(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the largest count values associated with that key in the input PCollection<KV<K, V>>, in decreasing order, sorted using the given Comparator<V>.
PHASE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PInput - Interface in org.apache.beam.sdk.values
The interface for things that might be input to a PTransform.
Pipeline - Class in org.apache.beam.sdk
A Pipeline manages a directed acyclic graph of PTransforms, and the PCollections that the PTransforms consume and produce.
Pipeline(PipelineOptions) - Constructor for class org.apache.beam.sdk.Pipeline
 
Pipeline.PipelineExecutionException - Exception in org.apache.beam.sdk
Thrown during execution of a Pipeline, whenever user code within that Pipeline throws an exception.
Pipeline.PipelineVisitor - Interface in org.apache.beam.sdk
For internal use only; no backwards-compatibility guarantees.
Pipeline.PipelineVisitor.CompositeBehavior - Enum in org.apache.beam.sdk
Control enum indicating whether a traversal should process the contents of a composite transform.
Pipeline.PipelineVisitor.Defaults - Class in org.apache.beam.sdk
Default no-op Pipeline.PipelineVisitor that enters all composite transforms.
pipelineExecution - Variable in class org.apache.beam.runners.spark.SparkPipelineResult
 
PipelineExecutionException(Throwable) - Constructor for exception org.apache.beam.sdk.Pipeline.PipelineExecutionException
PipelineOptions - Interface in org.apache.beam.sdk.options
PipelineOptions are used to configure Pipelines.
PipelineOptions.AtomicLongFactory - Class in org.apache.beam.sdk.options
DefaultValueFactory which supplies an ID that is guaranteed to be unique within the given process.
PipelineOptions.CheckEnabled - Enum in org.apache.beam.sdk.options
Enumeration of the possible states for a given check.
PipelineOptions.DirectRunner - Class in org.apache.beam.sdk.options
A DefaultValueFactory that obtains the class of the DirectRunner if it exists on the classpath, and throws an exception otherwise.
PipelineOptions.JobNameFactory - Class in org.apache.beam.sdk.options
Returns a normalized job name constructed from ApplicationNameOptions.getAppName(), the local system user name (if available), the current time, and a random integer.
PipelineOptionsFactory - Class in org.apache.beam.sdk.options
Constructs a PipelineOptions or any derived interface that is composable to any other derived interface of PipelineOptions via the PipelineOptions.as(java.lang.Class<T>) method.
PipelineOptionsFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsFactory
 
PipelineOptionsFactory.Builder - Class in org.apache.beam.sdk.options
A fluent PipelineOptions builder.
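The usual construction pattern, sketched:

    public static void main(String[] args) {
      PipelineOptions options =
          PipelineOptionsFactory.fromArgs(args).withValidation().create();
      Pipeline p = Pipeline.create(options);
      // ... build and run the pipeline ...
      p.run();
    }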
PipelineOptionsRegistrar - Interface in org.apache.beam.sdk.options
Creators of PipelineOptions can have their PipelineOptions automatically registered with this SDK by providing a concrete implementation of this interface along with a ServiceLoader entry for it.
PipelineOptionsValidator - Class in org.apache.beam.sdk.options
Validates that the PipelineOptions conforms to all the Validation criteria.
PipelineOptionsValidator() - Constructor for class org.apache.beam.sdk.options.PipelineOptionsValidator
 
PipelineResult - Interface in org.apache.beam.sdk
Result of Pipeline.run().
PipelineResult.State - Enum in org.apache.beam.sdk
Possible job states, for both completed and ongoing jobs.
PipelineRunner<ResultT extends PipelineResult> - Class in org.apache.beam.sdk
PipelineRunner() - Constructor for class org.apache.beam.sdk.PipelineRunner
 
plusDelayOf(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
Adds some delay to the original target time.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.AvroIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.CompressedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
Populates the display data.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSink
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.GenerateSequence
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Bounded
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Read.Unbounded
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.Source
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TextIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.WriteFiles
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
 
populateDisplayData(DisplayData.Builder) - Method in interface org.apache.beam.sdk.transforms.display.HasDisplayData
Register display data for the given transform or component.
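A small illustrative override of populateDisplayData; the keys, values, and the query field are hypothetical:

    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      builder
          .add(DisplayData.item("tableSpec", "my_dataset.my_table")
              .withLabel("Output Table"))
          .addIfNotNull(DisplayData.item("query", query));  // query: hypothetical field
    }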
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.DoFn
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Filter
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.MapElements
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Partition
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.PTransform
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
Register display data for the given transform or component.
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.Window
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Register display data for the given transform or component.
POutput - Interface in org.apache.beam.sdk.values
The interface for things that might be output from a PTransform.
prepareForProcessing() - Method in class org.apache.beam.sdk.transforms.DoFn
Deprecated.
prepareWrite(WritableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Called with the channel that a subclass will write its header, footer, and values to.
PrepareWrite<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Prepare an input PCollection for writing to BigQuery.
PrepareWrite(DynamicDestinations<T, DestinationT>, SerializableFunction<T, TableRow>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
 
PrimitiveParDoSingleFactory<InputT,OutputT> - Class in org.apache.beam.runners.dataflow
A PTransformOverrideFactory that produces PrimitiveParDoSingleFactory.ParDoSingle instances from ParDo.SingleOutput instances.
PrimitiveParDoSingleFactory() - Constructor for class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
 
PrimitiveParDoSingleFactory.ParDoSingle<InputT,OutputT> - Class in org.apache.beam.runners.dataflow
A single-output primitive ParDo.
printHelp(PrintStream) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Outputs to the given stream the set of options registered with the PipelineOptionsFactory, with a description for each one when available.
printHelp(PrintStream, Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Outputs the set of options available to be set for the passed in PipelineOptions interface.
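A short sketch exercising both printHelp overloads against System.out; the wrapper class is illustrative only.

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ShowOptionsHelp {
      public static void main(String[] args) {
        // List every registered options interface with its descriptions...
        PipelineOptionsFactory.printHelp(System.out);
        // ...or restrict the listing to a single interface.
        PipelineOptionsFactory.printHelp(System.out, PipelineOptions.class);
      }
    }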
process(List<JobMessage>) - Method in interface org.apache.beam.runners.dataflow.util.MonitoringUtil.JobMessagesHandler
Processes the given job messages.
process(List<JobMessage>) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
 
processBundle(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
A convenience operation that first calls DoFnTester.startBundle(), then calls DoFnTester.processElement(InputT) on each of the input elements, then calls DoFnTester.finishBundle(), then returns the result of DoFnTester.takeOutputElements().
processBundle(InputT...) - Method in class org.apache.beam.sdk.transforms.DoFnTester
A convenience method for testing DoFns with bundles of elements.
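A sketch of testing a hypothetical DoubleFn with DoFnTester; processBundle wraps the start/process/finish cycle and returns the collected outputs.

    import java.util.List;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.DoFnTester;

    public class DoubleFnExample {
      // Hypothetical DoFn under test: doubles every input integer.
      static class DoubleFn extends DoFn<Integer, Integer> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(c.element() * 2);
        }
      }

      public static void main(String[] args) throws Exception {
        DoFnTester<Integer, Integer> tester = DoFnTester.of(new DoubleFn());
        List<Integer> outputs = tester.processBundle(1, 2, 3);
        System.out.println(outputs); // expected: [2, 4, 6]
      }
    }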
ProcessContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.ProcessContext
 
processElement(DoFn<KV<K, Iterable<KV<Instant, WindowedValue<KV<K, V>>>>>, OutputT>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
 
processElement(DoFn<Iterable<T>, T>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
Deprecated.
 
processElement(WindowedValue<InputT>) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
processElement(DoFn<T, Void>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
processElement(InputT) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Calls the DoFn.ProcessElement method on the DoFn under test, in a context where DoFn.ProcessContext.element() returns the given element and the element is in the global window.
processElement(DoFn<ValueWithRecordId<T>, T>.ProcessContext) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
 
ProcessingTimeEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
 
processTimestampedElement(TimestampedValue<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Calls DoFn.ProcessElement on the DoFn under test, in a context where DoFn.ProcessContext.element() returns the given element and timestamp and the element is in the global window.
ProcessWatcher(Process) - Constructor for class org.apache.beam.runners.apex.ApexYarnLauncher.ProcessWatcher
 
processWindowedElement(InputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Calls DoFn.ProcessElement on the DoFn under test, in a context where DoFn.ProcessContext.element() returns the given element and timestamp and the element is in the given window.
PROJECT_ID_REGEXP - Static variable in class org.apache.beam.runners.dataflow.DataflowRunner
Project IDs must contain lowercase letters, digits, or dashes.
PROPERTY_BEAM_TEST_PIPELINE_OPTIONS - Static variable in class org.apache.beam.sdk.testing.TestPipeline
System property used to set TestPipelineOptions.
propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
Returns the property name associated with this provider.
propertyName() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
Returns the property name that corresponds to this provider.
PropertyNames - Class in org.apache.beam.runners.dataflow.util
Constant property names used by the SDK in CloudWorkflow specifications.
PropertyNames() - Constructor for class org.apache.beam.runners.dataflow.util.PropertyNames
 
ProtobufCoderProviderRegistrar - Class in org.apache.beam.sdk.extensions.protobuf
A CoderProviderRegistrar for standard types used with Google Protobuf.
ProtobufCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
 
ProtoCoder<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.extensions.protobuf
A Coder using Google Protocol Buffers binary format.
PTransform<InputT extends PInput,OutputT extends POutput> - Class in org.apache.beam.sdk.transforms
A PTransform<InputT, OutputT> is an operation that takes an InputT (some subtype of PInput) and produces an OutputT (some subtype of POutput).
PTransform() - Constructor for class org.apache.beam.sdk.transforms.PTransform
 
PTransform(String) - Constructor for class org.apache.beam.sdk.transforms.PTransform
 
PUBSUB_ID_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_SERIALIZED_ATTRIBUTES_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_SUBSCRIPTION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_SUBSCRIPTION_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_TIMESTAMP_ATTRIBUTE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_TOPIC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PUBSUB_TOPIC_OVERRIDE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
PubsubBoundedWriter() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
PubsubCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
A CoderProviderRegistrar for standard types used with PubsubIO.
PubsubCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
PubsubIO - Class in org.apache.beam.sdk.io.gcp.pubsub
Read and Write PTransforms for Cloud Pub/Sub streams.
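A rough sketch of a pipeline that forwards messages between two topics, assuming the readStrings()/writeStrings() factories and the fromTopic()/to() setters behave as their names suggest; the project and topic names are placeholders.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PubsubForwarder {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read UTF-8 strings from one topic and republish them to another.
        PCollection<String> messages =
            p.apply(PubsubIO.readStrings()
                .fromTopic("projects/my-project/topics/input-topic"));
        messages.apply(PubsubIO.writeStrings()
            .to("projects/my-project/topics/output-topic"));

        p.run();
      }
    }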
PubsubIO.PubsubSubscription - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Cloud Pub/Sub Subscription.
PubsubIO.PubsubTopic - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Cloud Pub/Sub Topic.
PubsubIO.Read<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
Implementation of PubsubIO.read().
PubsubIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
Implementation of PubsubIO.write().
PubsubIO.Write.PubsubBoundedWriter - Class in org.apache.beam.sdk.io.gcp.pubsub
Writer to Pubsub which batches messages from bounded collections.
PubsubMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Pub/Sub message.
PubsubMessage(byte[], Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
PubsubMessagePayloadOnlyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload.
PubsubMessagePayloadOnlyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
PubsubMessageWithAttributesCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including attributes.
PubsubMessageWithAttributesCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
PubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
Properties that can be set when using Google Cloud Pub/Sub with the Apache Beam SDK.
PubsubUnboundedSink - Class in org.apache.beam.sdk.io.gcp.pubsub
A PTransform which streams messages to Pubsub.
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSource - Class in org.apache.beam.sdk.io.gcp.pubsub
Users should use PubsubIO.read() instead.
PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Construct an unbounded source to consume from the Pubsub subscription.
put(K, V) - Method in interface org.apache.beam.sdk.state.MapState
Associates the specified value with the specified key in this state.
putIfAbsent(K, V) - Method in interface org.apache.beam.sdk.state.MapState
A deferred read-followed-by-write.
PValue - Interface in org.apache.beam.sdk.values
For internal use.
PValueBase - Class in org.apache.beam.sdk.values
For internal use.
PValueBase(Pipeline) - Constructor for class org.apache.beam.sdk.values.PValueBase
 
PValueBase() - Constructor for class org.apache.beam.sdk.values.PValueBase
No-arg constructor for Java serialization only.

Q

queryMetrics(MetricsFilter) - Method in class org.apache.beam.sdk.metrics.MetricResults
Query for all metric values that match a given filter.
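A sketch of querying one counter after a run; the namespace and counter name are hypothetical, and MetricNameFilter.named, MetricQueryResults.counters(), and MetricResult.committed() are assumed from the metrics API (committed values are not supported by every runner).

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class QueryCounterExample {
      // Prints the committed value of a counter once the pipeline has run.
      static void printCounter(PipelineResult result) {
        MetricQueryResults metrics = result.metrics().queryMetrics(
            MetricsFilter.builder()
                .addNameFilter(MetricNameFilter.named("my-namespace", "my-counter"))
                .build());
        for (MetricResult<Long> counter : metrics.counters()) {
          System.out.println(counter.name() + " = " + counter.committed());
        }
      }
    }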

R

RandomAccessData - Class in org.apache.beam.runners.dataflow.util
An elastic-sized byte array which allows you to manipulate it as a stream, or access it directly.
RandomAccessData() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
Constructs a RandomAccessData with a default buffer size.
RandomAccessData(byte[]) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
Constructs a RandomAccessData with the initial buffer.
RandomAccessData(int) - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData
Constructs a RandomAccessData with the given buffer size.
RandomAccessData.RandomAccessDataCoder - Class in org.apache.beam.runners.dataflow.util
A Coder which encodes the valid parts of this stream.
RandomAccessData.UnsignedLexicographicalComparator - Class in org.apache.beam.runners.dataflow.util
A Comparator that compares two byte arrays lexicographically.
RandomAccessDataCoder() - Constructor for class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
RangeTracker<PositionT> - Interface in org.apache.beam.sdk.io.range
A RangeTracker is a thread-safe helper object for implementing dynamic work rebalancing in position-based BoundedSource.BoundedReader subclasses.
RawUnionValue - Class in org.apache.beam.sdk.transforms.join
This corresponds to an integer union tag and value.
RawUnionValue(int, Object) - Constructor for class org.apache.beam.sdk.transforms.join.RawUnionValue
Constructs a partial union from the given union tag and value.
read(JavaStreamingContext, SparkRuntimeContext, UnboundedSource<T, CheckpointMarkT>, String) - Static method in class org.apache.beam.runners.spark.io.SparkUnboundedSource
 
read(Class<T>) - Static method in class org.apache.beam.sdk.io.AvroIO
Reads records of the given type from an Avro file (or multiple Avro files matching a pattern).
Read() - Constructor for class org.apache.beam.sdk.io.AvroIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
Read() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that reads from a BigQuery table and returns a PCollection of TableRows containing each of the rows of the table.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.Read.
read() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.Read builder.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
Read() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO
Creates an uninitialized HadoopInputFormatIO.Read.
Read() - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
Creates an uninitialized HBaseIO.Read.
read() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Read data from a JDBC datasource.
Read() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
Read() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
Creates an uninitialized KafkaIO.Read PTransform.
Read() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.kinesis.KinesisIO
Returns a new KinesisIO.Read transform for reading from Kinesis.
Read() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
Read data from GridFS.
Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
Read data from MongoDB.
Read() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
Read() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
Read - Class in org.apache.beam.sdk.io
A PTransform for reading from a Source.
Read() - Constructor for class org.apache.beam.sdk.io.Read
 
read() - Static method in class org.apache.beam.sdk.io.TextIO
A PTransform that reads from one or more text files and returns a bounded PCollection containing one element for each line of the input files.
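A minimal read-then-write sketch; the gs:// paths are placeholders, and the write() factory with its to() setter is assumed from the adjacent TextIO.Write entry.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class CopyTextFiles {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // One element per line across every file matching the pattern.
        PCollection<String> lines =
            p.apply(TextIO.read().from("gs://my-bucket/input/*.txt"));

        // Write the lines back out as sharded text files with this prefix.
        lines.apply(TextIO.write().to("gs://my-bucket/output/lines"));

        p.run();
      }
    }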
Read() - Constructor for class org.apache.beam.sdk.io.TextIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.TFRecordIO
A PTransform that reads from a TFRecord file (or multiple TFRecord files matching a pattern) and returns a PCollection containing the decoding of each of the records of the TFRecord file(s) as a byte array.
Read() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Read
 
read() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
Reads XML files.
Read() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Read
 
read() - Method in interface org.apache.beam.sdk.state.ReadableState
Read the current value, blocking until it is available.
Read.Bounded<T> - Class in org.apache.beam.sdk.io
PTransform that reads from a BoundedSource.
Read.Builder - Class in org.apache.beam.sdk.io
Helper class for building Read transforms.
Read.Unbounded<T> - Class in org.apache.beam.sdk.io
PTransform that reads from an UnboundedSource.
ReadableState<T> - Interface in org.apache.beam.sdk.state
A State that can be read via ReadableState.read().
ReadableStates - Class in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
ReadableStates() - Constructor for class org.apache.beam.sdk.state.ReadableStates
 
readAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded Avro messages of the given type from a Google Cloud Pub/Sub stream.
readBytes() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
Creates an uninitialized KafkaIO.Read PTransform.
Reader() - Constructor for class org.apache.beam.sdk.io.Source.Reader
 
ReaderInvocationUtil<OutputT,ReaderT extends Source.Reader<OutputT>> - Class in org.apache.beam.runners.flink.metrics
Util for invoking Source.Reader methods that might require a MetricsContainerImpl to be active.
ReaderInvocationUtil(String, PipelineOptions, FlinkMetricContainer) - Constructor for class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
 
readExternal(ObjectInput) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableConfiguration
 
readExternal(ObjectInput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
readFrom(InputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Reads length bytes from the specified input stream, writing them into the backing data store starting at offset.
readFromSource(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Reads all elements from the given BoundedSource.
readFromStartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Reads all elements from the given started Source.Reader.
readFromUnstartedReader(Source.Reader<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Reads all elements from the given unstarted Source.Reader.
readGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.AvroIO
Reads Avro file(s) containing records of the specified schema.
readGenericRecords(String) - Static method in class org.apache.beam.sdk.io.AvroIO
Reads Avro file(s) containing records of the specified schema.
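A sketch reading GenericRecords with an inline schema; the schema, the file pattern, and the from() setter are illustrative assumptions.

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.AvroIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class ReadAvroExample {
      // Hypothetical record schema for the files being read.
      private static final String SCHEMA_JSON =
          "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
              + "{\"name\":\"name\",\"type\":\"string\"},"
              + "{\"name\":\"age\",\"type\":\"int\"}]}";

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);

        // Every matching Avro file is decoded into GenericRecords of this schema.
        PCollection<GenericRecord> users =
            p.apply(AvroIO.readGenericRecords(schema).from("gs://my-bucket/users-*.avro"));

        p.run();
      }
    }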
readLater() - Method in interface org.apache.beam.sdk.state.BagState
 
readLater() - Method in interface org.apache.beam.sdk.state.CombiningState
 
readLater() - Method in interface org.apache.beam.sdk.state.GroupingState
 
readLater() - Method in interface org.apache.beam.sdk.state.ReadableState
Indicate that the value will be read later.
readLater() - Method in interface org.apache.beam.sdk.state.SetState
 
readLater() - Method in interface org.apache.beam.sdk.state.ValueState
 
readLater() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
 
readMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithAttributes() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream, preserving each message's attributes.
readNextBlock() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
readNextBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Read the next block from the input.
readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
Reads the next record from the block and returns true iff one exists.
readNextRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
Reads the next record from the current block if possible.
readNextRecord() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Reads the next record via the delegate reader.
readNextRecord() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
readNItemsFromStartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Read elements from a Source.Reader that has already had Source.Reader.start() called on it, until n elements are read.
readNItemsFromUnstartedReader(Source.Reader<T>, int) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Read elements from a Source.Reader until n elements are read.
readProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded protobuf messages of the given type from a Google Cloud Pub/Sub stream.
readRemainingFromReader(Source.Reader<T>, boolean) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Read all remaining elements from a Source.Reader.
readStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads UTF-8 encoded strings from a Google Cloud Pub/Sub stream.
Regex - Class in org.apache.beam.sdk.transforms
PTransforms that use regular expressions to process elements in a PCollection.
Regex.AllMatches - Class in org.apache.beam.sdk.transforms
Regex.AllMatches<String> takes a PCollection<String> and returns a PCollection<List<String>> representing the values extracted from all the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.Find - Class in org.apache.beam.sdk.transforms
Regex.Find<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.FindAll - Class in org.apache.beam.sdk.transforms
Regex.FindAll<String> takes a PCollection<String> and returns a PCollection<List<String>> representing the values extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.FindKV - Class in org.apache.beam.sdk.transforms
Regex.FindKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.FindName - Class in org.apache.beam.sdk.transforms
Regex.FindName<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.FindNameKV - Class in org.apache.beam.sdk.transforms
Regex.FindNameKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.Matches - Class in org.apache.beam.sdk.transforms
Regex.Matches<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.MatchesKV - Class in org.apache.beam.sdk.transforms
Regex.MatchesKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.MatchesName - Class in org.apache.beam.sdk.transforms
Regex.MatchesName<String> takes a PCollection<String> and returns a PCollection<String> representing the value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.MatchesNameKV - Class in org.apache.beam.sdk.transforms
Regex.MatchesNameKV<KV<String, String>> takes a PCollection<String> and returns a PCollection<KV<String, String>> representing the key and value extracted from the Regex groups of the input PCollection, for each time the element occurs in the input.
Regex.ReplaceAll - Class in org.apache.beam.sdk.transforms
Regex.ReplaceAll<String> takes a PCollection<String> and returns a PCollection<String> in which every substring matching the Regex has been replaced with the replacement string.
Regex.ReplaceFirst - Class in org.apache.beam.sdk.transforms
Regex.ReplaceFirst<String> takes a PCollection<String> and returns a PCollection<String> in which the first substring matching the Regex has been replaced with the replacement string.
Regex.Split - Class in org.apache.beam.sdk.transforms
Regex.Split<String> takes a PCollection<String> and returns a PCollection<String> in which each input string has been split into individual output elements.
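A hedged sketch combining two of the transforms above; the static factory Regex.matches(String) is assumed to construct Regex.Matches, while Regex.replaceAll(String, String) appears later in this index.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Regex;
    import org.apache.beam.sdk.values.PCollection;

    public class RegexExamples {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<String> lines = p.apply(Create.of("beam  rocks", "  hello   world  "));

        // Keep only elements consisting entirely of letters and spaces.
        PCollection<String> onlyWords = lines.apply(Regex.matches("[A-Za-z ]+"));

        // Collapse runs of whitespace in every element to a single space.
        PCollection<String> normalized = lines.apply(Regex.replaceAll("\\s+", " "));

        p.run();
      }
    }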
RegexMatcher - Class in org.apache.beam.sdk.testing
Hamcrest matcher to assert a string matches a pattern.
RegexMatcher(String) - Constructor for class org.apache.beam.sdk.testing.RegexMatcher
 
register(Class<? extends PipelineOptions>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
Registers the given interface with this factory.
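A sketch of registering a hypothetical options interface and binding it from command-line arguments; fromArgs, withValidation, and as are assumed from the PipelineOptionsFactory builder API.

    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class CustomOptionsExample {
      // Hypothetical options interface with a single string-valued flag.
      public interface MyOptions extends PipelineOptions {
        @Description("Path of the file to read from")
        @Default.String("/tmp/input.txt")
        String getInputFile();
        void setInputFile(String value);
      }

      public static void main(String[] args) {
        // Registering makes the interface visible to --help output and validation.
        PipelineOptionsFactory.register(MyOptions.class);
        MyOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
        System.out.println("inputFile=" + options.getInputFile());
      }
    }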
registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.Coder
Notifies the ElementByteSizeObserver about the byte size of the encoded value using this Coder.
registerByteSizeObserver(ReadableDuration, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.DurationCoder
 
registerByteSizeObserver(Instant, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.InstantCoder
 
registerByteSizeObserver(IterableT, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
 
registerByteSizeObserver(KV<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.KvCoder
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
registerByteSizeObserver(Map<K, V>, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.MapCoder
 
registerByteSizeObserver(T, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.coders.NullableCoder
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
registerByteSizeObserver(RawUnionValue, ElementByteSizeObserver) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
Notifies ElementByteSizeObserver about the byte size of the encoded value using this coder.
registerClasses(Kryo) - Method in class org.apache.beam.runners.spark.coders.BeamSparkRunnerRegistrator
 
registerCoderForClass(Class<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Registers the provided Coder for the given class.
registerCoderForType(TypeDescriptor<?>, Coder<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Registers the provided Coder for the given type.
registerCoderProvider(CoderProvider) - Method in class org.apache.beam.sdk.coders.CoderRegistry
Registers coderProvider as a potential CoderProvider which can produce Coder instances.
registerTransformTranslator(Class<TransformT>, TransformTranslator<? extends TransformT>) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Records that instances of the specified PTransform class should be translated by default by the corresponding TransformTranslator.
remerge() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
Creates a Window PTransform that does not change assigned windows, but will cause windows to be merged again as part of the next GroupByKey.
remove(K) - Method in interface org.apache.beam.sdk.state.MapState
Remove the mapping for a key from this map if it is present.
remove(T) - Method in interface org.apache.beam.sdk.state.SetState
Removes the specified element from this set if it is present.
rename(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
Renames a List of file-like resources from one location to another.
rename(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
Renames a List of file-like resources from one location to another.
render() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators.CombineFunctionState
 
render() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
renderAll() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
 
Repeatedly - Class in org.apache.beam.sdk.transforms.windowing
A Trigger that fires according to its subtrigger forever.
replaceAll(List<PTransformOverride>) - Method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
replaceAll(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces all matches with the replacement String.
replaceAll(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces all matches with the replacement String.
ReplaceAll(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceAll
 
replaceFirst(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces the first match with the replacement String.
replaceFirst(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.ReplaceAll PTransform that checks if a portion of the line matches the Regex and replaces the first match with the replacement String.
ReplaceFirst(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
 
reportElementSize(long) - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
reportWorkItemStatus(String, ReportWorkItemStatusRequest) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Reports the status of the work item for jobId.
requiresDeduping() - Method in class org.apache.beam.sdk.io.UnboundedSource
Returns whether this source requires explicit deduping.
reset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
resetLocal() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
 
resetTo(int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Resets the end of the stream to the specified position.
Reshuffle<K,V> - Class in org.apache.beam.sdk.transforms
Deprecated.
this transform's intended side effects are not portable; it will likely be removed
ReshuffleTrigger<W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
Deprecated.
The intended side effect of Reshuffle is not portable; it will likely be removed
ReshuffleTrigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
Deprecated.
 
resolve(String, ResolveOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
resolve(String, ResolveOptions) - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns a child ResourceId under this.
ResolveOptions - Interface in org.apache.beam.sdk.io.fs
ResolveOptions.StandardResolveOptions - Enum in org.apache.beam.sdk.io.fs
Defines the standard resolve options.
resolveType(Type) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a TypeDescriptor representing the given type, with type variables resolved according to the specialization in this type.
resourceId() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
ResourceId - Interface in org.apache.beam.sdk.io.fs
An identifier which represents a file-like resource.
RestrictionTracker<RestrictionT> - Interface in org.apache.beam.sdk.transforms.splittabledofn
Manages concurrent access to the restriction and keeps track of its claimed part for a splittable DoFn.
rightOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
Right Outer Join of two collections of KV elements.
root() - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
Path for display data registered by a top-level component.
run(Pipeline) - Method in class org.apache.beam.runners.apex.ApexRunner
 
run() - Method in class org.apache.beam.runners.apex.ApexYarnLauncher.ProcessWatcher
 
run(Pipeline) - Method in class org.apache.beam.runners.apex.TestApexRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.direct.DirectRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.flink.FlinkRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.flink.TestFlinkRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunner
 
run(Pipeline) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger
 
run(Pipeline) - Method in class org.apache.beam.runners.spark.TestSparkRunner
 
run() - Method in class org.apache.beam.sdk.Pipeline
Runs this Pipeline according to the PipelineOptions used to create the Pipeline via Pipeline.create(PipelineOptions).
run(PipelineOptions) - Method in class org.apache.beam.sdk.Pipeline
Runs this Pipeline using the given PipelineOptions, using the runner specified by the options.
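A minimal end-to-end sketch; Create, Count, and PipelineResult.waitUntilFinish() are assumed to be available as elsewhere in the SDK.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;

    public class RunExample {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline p = Pipeline.create(options);

        p.apply(Create.of("a", "b", "a"))
         .apply(Count.<String>perElement());

        // run() hands the pipeline to the runner chosen in the options and may
        // return before it finishes; waitUntilFinish() blocks until it does.
        PipelineResult result = p.run();
        result.waitUntilFinish();
      }
    }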
run(Pipeline) - Method in class org.apache.beam.sdk.PipelineRunner
Processes the given Pipeline, potentially asynchronously, returning a runner-specific type of result.
run(Pipeline) - Method in class org.apache.beam.sdk.testing.CrashingRunner
 
run() - Method in class org.apache.beam.sdk.testing.TestPipeline
Runs this TestPipeline, unwrapping any AssertionError that is raised during testing.
Runner() - Constructor for class org.apache.beam.runners.apex.ApexRunnerRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
 
Runner() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
 
runWindowFn(WindowFn<T, W>, List<Long>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Runs the WindowFn over the provided input, returning a map of windows to the timestamps in those windows.

S

Sample - Class in org.apache.beam.sdk.transforms
PTransforms for taking samples of the elements in a PCollection, or samples of the values associated with each key in a PCollection of KVs.
Sample() - Constructor for class org.apache.beam.sdk.transforms.Sample
 
Sample.FixedSizedSampleFn<T> - Class in org.apache.beam.sdk.transforms
CombineFn that computes a fixed-size sample of a collection of values.
satisfies(SerializableFunction<Iterable<T>, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
Applies the provided checking function (presumably containing assertions) to the iterable in question.
satisfies(SerializableFunction<Iterable<T>, Void>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
 
satisfies(SerializableFunction<T, Void>) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
Applies the provided checking function (presumably containing assertions) to the value in question.
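A sketch of an Iterable-level satisfies check inside a JUnit test; the TestPipeline rule usage and the evenness assertion are illustrative.

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.SerializableFunction;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class SatisfiesExampleTest {
      @Rule public final TestPipeline p = TestPipeline.create();

      @Test
      public void allValuesAreEven() {
        PCollection<Integer> values = p.apply(Create.of(2, 4, 6));

        // The checking function must return null (Void) and throw on failure.
        PAssert.that(values).satisfies(
            new SerializableFunction<Iterable<Integer>, Void>() {
              @Override
              public Void apply(Iterable<Integer> input) {
                for (int v : input) {
                  if (v % 2 != 0) {
                    throw new AssertionError("odd value: " + v);
                  }
                }
                return null;
              }
            });

        p.run();
      }
    }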
SCALAR_FIELD_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
scopedMetricsContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Set the MetricsContainer for the current thread.
SerializableCoder<T extends java.io.Serializable> - Class in org.apache.beam.sdk.coders
A Coder for Java classes that implement Serializable.
SerializableCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.SerializableCoder
 
SerializableCoder.SerializableCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
A CoderProviderRegistrar which registers a CoderProvider which can handle serializable types.
SerializableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
 
SerializableComparator<T> - Interface in org.apache.beam.sdk.transforms
A Comparator that is also Serializable.
SerializableConfiguration() - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableConfiguration
 
SerializableConfiguration(Configuration) - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableConfiguration
 
SerializableConfiguration - Class in org.apache.beam.sdk.io.hadoop
A wrapper to allow Hadoop Configurations to be serialized using Java's standard serialization mechanisms.
SerializableConfiguration() - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
SerializableConfiguration(Configuration) - Constructor for class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
SerializableFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.transforms
A function that computes an output value of type OutputT from an input value of type InputT and is Serializable.
SerializableMatcher<T> - Interface in org.apache.beam.sdk.testing
A Matcher that is also Serializable.
SerializableSplit() - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableSplit
 
SerializableSplit(InputSplit) - Constructor for class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableSplit
 
serialize(String, Instant) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
 
serialize(ValueProvider<?>, JsonGenerator, SerializerProvider) - Method in class org.apache.beam.sdk.options.ValueProvider.Serializer
 
SERIALIZED_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
Serializer() - Constructor for class org.apache.beam.sdk.options.ValueProvider.Serializer
 
serializeTimers(Collection<TimerInternals.TimerData>, TimerInternals.TimerDataCoder) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
Sessions - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows values into sessions separated by periods with no input for at least the duration specified by Sessions.getGapDuration().
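A sketch applying session and fixed windowing via Window.into; the ten-minute gap and one-minute width are arbitrary example values.

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Sessions;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class WindowingExamples {
      // Groups events into per-key sessions that close after 10 minutes of silence.
      static PCollection<KV<String, Long>> sessionize(PCollection<KV<String, Long>> events) {
        return events.apply(
            Window.<KV<String, Long>>into(Sessions.withGapDuration(Duration.standardMinutes(10))));
      }

      // Alternative: fixed one-minute windows.
      static PCollection<KV<String, Long>> fixedWindows(PCollection<KV<String, Long>> events) {
        return events.apply(
            Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardMinutes(1))));
      }
    }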
set(long) - Method in interface org.apache.beam.sdk.metrics.Gauge
Set current value for this gauge.
set() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a SetState, optimized for checking membership.
set(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.set(), but with an element coder explicitly supplied.
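A heavily hedged sketch of a stateful DoFn keeping a per-key SetState of previously seen values; DistinctPerKeyFn is hypothetical, and the @StateId/@ProcessElement parameter-injection pattern plus SetState.addIfAbsent(...).read() are assumed from the Beam state API. Stateful DoFns must be applied to a keyed PCollection<KV<...>>.

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.state.SetState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    // Emits each (key, value) pair only the first time that value is seen for the key.
    class DistinctPerKeyFn extends DoFn<KV<String, String>, KV<String, String>> {

      @StateId("seen")
      private final StateSpec<SetState<String>> seenSpec = StateSpecs.set(StringUtf8Coder.of());

      @ProcessElement
      public void processElement(ProcessContext c, @StateId("seen") SetState<String> seen) {
        // addIfAbsent is a deferred read-modify-write; read() resolves the result.
        boolean added = seen.addIfAbsent(c.element().getValue()).read();
        if (added) {
          c.output(c.element());
        }
      }
    }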
set(Instant) - Method in interface org.apache.beam.sdk.state.Timer
Sets or resets the time in the timer's TimeDomain at which it should fire.
set(long...) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
setApiRootUrl(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setApplicationName(String) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
setAppName(String) - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
 
setAutoscalingAlgorithm(DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setBatchIntervalMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setBlockOnRun(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setCheckpointDir(String) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setCheckpointDurationMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setCheckpointingInterval(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setCloningBehavior(DoFnTester.CloningBehavior) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Instruct this DoFnTester whether or not to clone the DoFn under test.
SetCoder<T> - Class in org.apache.beam.sdk.coders
A SetCoder encodes any Set using the format of IterableLikeCoder.
SetCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.SetCoder
 
setCoder(Coder<T>) - Method in class org.apache.beam.sdk.values.PCollection
Sets the Coder used by this PCollection to encode and decode the values stored in it.
setCoderRegistry(CoderRegistry) - Method in class org.apache.beam.sdk.Pipeline
Deprecated.
this should never be used - every Pipeline has a registry throughout its lifetime.
setConfigFile(String) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
setCredentialFactoryClass(Class<? extends CredentialFactory>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setCurrentContainer(MetricsContainer) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Set the MetricsContainer for the current thread.
setDataflowClient(Dataflow) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDataflowEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDataflowJobFile(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setDebuggee(Debuggee) - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
setDefaultConfigInWorkers(PipelineOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
Deprecated.
to be removed.
setDefaultPipelineOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
Sets the default configuration in workers.
setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
setDiskSizeGb(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setDumpHeapOnOOM(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setEmbeddedExecution(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
setEmbeddedExecutionDebugMode(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
setEnableCloudDebugger(boolean) - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
setEnableMetrics(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setEnableSparkMetricSinks(Boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setEnforceEncodability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setEnforceImmutability(boolean) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setExecutionRetryDelay(Long) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setExecutorService(ExecutorService) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setExpectedAssertions(Integer) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
setExperiments(List<String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setExternalizedCheckpointsEnabled(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFilesToStage(List<String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setFilesToStage(List<String>) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setFlinkMaster(String) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setForceStreaming(boolean) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
setGcpCredential(Credentials) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setGcpTempLocation(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setGcsEndpoint(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsUploadBufferSizeBytes(Integer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGcsUploadBufferSizeBytes(Integer) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
 
setGcsUtil(GcsUtil) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setGoogleApiTrace(GoogleApiDebugOptions.GoogleApiTracer) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
 
setHdfsConfiguration(List<Configuration>) - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
 
setHooks(DataflowRunnerHooks) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
Sets callbacks to invoke during execution; see DataflowRunnerHooks.
setIsBoundedInternal(PCollection.IsBounded) - Method in class org.apache.beam.sdk.values.PCollection
For internal use only; no backwards-compatibility guarantees.
setIsReadSeekEfficient(boolean) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setJobId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
setJobName(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setListeners(List<JavaStreamingListener>) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
setMaxConditionCost(double) - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
 
setMaxNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setMaxRecordsPerBatch(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setMetricsSupported(boolean) - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
Called by the runner to indicate whether metrics reporting is supported.
setMimeType(String) - Method in class org.apache.beam.sdk.io.fs.CreateOptions.Builder
 
setMinReadTimeMillis(Long) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setName(String) - Method in class org.apache.beam.sdk.values.PCollection
Sets the name of this PCollection.
setName(String) - Method in class org.apache.beam.sdk.values.PValueBase
Sets the name of this PValueBase.
setNetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setNumberOfExecutionRetries(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setNumberOfWorkerHarnessThreads(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setNumWorkers(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setObjectReuse(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setOnCreateMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setOnSuccessMatcher(SerializableMatcher<PipelineResult>) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setOptionsId(long) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setOverrideWindmillBinary(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setParallelism(Integer) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setParameters(T, PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.PreparedStatementSetter
 
setParameters(PreparedStatement) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.StatementPreparator
 
setParDoFusionEnabled(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
setPathValidator(PathValidator) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setPathValidatorClass(Class<? extends PathValidator>) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
 
setProfilingAgentConfiguration(DataflowProfilingOptions.DataflowProfilingAgentConfiguration) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
setProject(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setProject(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
setProvidedSparkContext(JavaSparkContext) - Method in interface org.apache.beam.runners.spark.SparkContextOptions
 
setPubsubRootUrl(String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
 
setReadTimePercentage(Double) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setRegion(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setRelative() - Method in interface org.apache.beam.sdk.state.Timer
Sets the timer relative to the current time, according to any offset and alignment specified.
setResourceId(ResourceId) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setRetainExternalizedCheckpointsOnCancellation(Boolean) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setRunMillis(long) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
setRunner(Class<? extends PipelineRunner<?>>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
sets(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Set.
setSaveProfilesToGcs(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
 
setServiceAccount(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setSideInput(PCollectionView<T>, BoundedWindow, T) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Registers the values of a side input PCollectionView to pass to the DoFn under test.
setSideInputs(Map<PCollectionView<?>, Map<BoundedWindow, ?>>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Registers the tuple of values of the side input PCollectionViews to pass to the DoFn under test.
setSizeBytes(long) - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
 
setSparkMaster(String) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setStableUniqueNames(PipelineOptions.CheckEnabled) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setStager(Stager) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setStagerClass(Class<? extends Stager>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setStagingLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
SetState<T> - Interface in org.apache.beam.sdk.state
A ReadableState cell containing a set of elements.
setStateBackend(AbstractStateBackend) - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
 
setStopPipelineWatermark(Long) - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
 
setStorageLevel(String) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setStreaming(boolean) - Method in interface org.apache.beam.sdk.options.StreamingOptions
 
setSubnetwork(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setTargetParallelism(int) - Method in interface org.apache.beam.runners.direct.DirectOptions
 
setTempDatasetId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setTemplateLocation(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setTempLocation(String) - Method in interface org.apache.beam.sdk.options.PipelineOptions
 
setTempRoot(String) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setTestTimeoutSeconds(Long) - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
 
setTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
setTimer(StateNamespace, String, Instant, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
 
setTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
Sets a timer to fire when the event time watermark, the current processing time, or the synchronized processing time watermark surpasses a given timestamp.
setTransformNameMapping(Map<String, String>) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setTupleTracingEnabled(boolean) - Method in interface org.apache.beam.runners.apex.ApexPipelineOptions
 
setTypeDescriptor(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.values.PCollection
 
setUpdate(boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
 
setUsePublicIps(Boolean) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setUsesProvidedSparkContext(boolean) - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
 
setWindmillServiceEndpoint(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setWindmillServicePort(int) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
 
setWindowedWrites(boolean) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Indicates that the operation will be performing windowed writes.
setWindowingStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
For internal use only; no backwards-compatibility guarantees.
setWorkerCacheMb(Integer) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
setWorkerDiskType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setWorkerHarnessContainerImage(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setWorkerId(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
 
setWorkerLogLevelOverrides(DataflowWorkerLoggingOptions.WorkerLogLevelOverrides) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
setWorkerMachineType(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setWorkerSystemErrMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
setWorkerSystemOutMessageLevel(DataflowWorkerLoggingOptions.Level) - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
 
setZone(String) - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
 
setZone(String) - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
 
SHARD_NAME_TEMPLATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
ShardNameTemplate - Class in org.apache.beam.sdk.io
Standard shard naming templates.
ShardNameTemplate() - Constructor for class org.apache.beam.sdk.io.ShardNameTemplate
 
shorts() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for Short.
shouldDefer(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
SHUFFLE_KIND - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SHUFFLE_READER_CONFIG - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SHUFFLE_WRITER_CONFIG - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the value of a given side input.
sideInput(PCollectionView<T>) - Method in interface org.apache.beam.sdk.state.StateContext
Returns the value of the side input for the corresponding state window.
sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
Returns the value of the side input for the window corresponding to the main input's window in which values are being combined.
sideInput(PCollectionView<T>) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns the value of the side input.
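As a sketch of how a side input is typically consumed (names are illustrative; assumes a Pipeline p, a PCollection<Integer> numbers, and the usual SDK imports):

    // A singleton side input holding a threshold value.
    final PCollectionView<Integer> threshold =
        p.apply("Threshold", Create.of(10)).apply(View.asSingleton());

    PCollection<Integer> aboveThreshold = numbers.apply(
        ParDo.of(new DoFn<Integer, Integer>() {
          @ProcessElement
          public void process(ProcessContext c) {
            // Reads the side-input value for this element's window.
            if (c.element() > c.sideInput(threshold)) {
              c.output(c.element());
            }
          }
        }).withSideInputs(threshold));
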
SideInputBroadcast<T> - Class in org.apache.beam.runners.spark.util
Broadcast helper for side inputs.
SimpleCombineFn(SerializableFunction<Iterable<V>, V>) - Constructor for class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
Deprecated.
 
SimpleFunction<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
A SerializableFunction which is not a functional interface.
SimpleFunction() - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
 
SimpleFunction(SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.transforms.SimpleFunction
 
SimplePCollectionView() - Constructor for class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
For serialization only.
singleOutputOverrideFactory() - Static method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
Returns a PTransformOverrideFactory that replaces a single-output ParDo with a composite transform specialized for the DataflowRunner.
SinglePrimitiveOutputPTransform<T> - Class in org.apache.beam.runners.spark.util
A PTransform wrapping another transform.
SinglePrimitiveOutputPTransform(PTransform<PInput, PCollection<T>>) - Constructor for class org.apache.beam.runners.spark.util.SinglePrimitiveOutputPTransform
 
singletonView(PCollection<T>, WindowingStrategy<?, W>, boolean, T, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
Returns a PCollectionView<T> capable of processing elements encoded using the provided Coder and windowed using the provided WindowingStrategy.
sink - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
The Sink that this WriteOperation will write to.
SinkMetrics - Class in org.apache.beam.sdk.metrics
Standard Sink Metrics.
SinkMetrics() - Constructor for class org.apache.beam.sdk.metrics.SinkMetrics
 
size() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Returns the number of bytes in the backing array that are valid.
size() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
Returns the number of columns for this schema.
size() - Method in class org.apache.beam.sdk.values.PCollectionList
Returns the number of PCollections in this PCollectionList.
size() - Method in class org.apache.beam.sdk.values.TupleTagList
Returns the number of TupleTags in this TupleTagList.
sizeBytes() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
 
SlidingWindows - Class in org.apache.beam.sdk.transforms.windowing
A WindowFn that windows values into possibly overlapping fixed-size timestamp-based windows.
smallest(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the smallest count elements of the input PCollection<T>, in increasing order, sorted according to their natural order.
Smallest() - Constructor for class org.apache.beam.sdk.transforms.Top.Smallest
 
smallestPerKey(int) - Static method in class org.apache.beam.sdk.transforms.Top
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, List<V>>> that contains an output element mapping each distinct key in the input PCollection to the smallest count values associated with that key in the input PCollection<KV<K, V>>, in increasing order, sorted according to their natural order.
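A minimal sketch of the unkeyed variant, assuming a Pipeline p and java.util.List imported:

    PCollection<Integer> nums = p.apply(Create.of(5, 3, 9, 1, 7));
    // Single-element PCollection<List<Integer>> containing [1, 3, 5].
    PCollection<List<Integer>> bottomThree = nums.apply(Top.<Integer>smallest(3));
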
sort() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
 
SORT_VALUES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SortValues<PrimaryKeyT,SecondaryKeyT,ValueT> - Class in org.apache.beam.sdk.extensions.sorter
SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> takes a PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>> with elements consisting of a primary key and iterables over <secondary key, value> pairs, and returns a PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>> of the same elements but with values sorted by a secondary key.
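A hedged sketch, assuming grouped is a PCollection<KV<String, Iterable<KV<String, Integer>>>> produced upstream (for example by GroupByKey):

    // Sorts each primary key's <secondary key, value> pairs by secondary key.
    PCollection<KV<String, Iterable<KV<String, Integer>>>> sorted =
        grouped.apply(SortValues.<String, String, Integer>create(
            BufferedExternalSorter.options()));
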
Source<T> - Class in org.apache.beam.sdk.io
Base class for defining input formats and creating a Source for reading the input.
Source() - Constructor for class org.apache.beam.sdk.io.Source
 
Source.Reader<T> - Class in org.apache.beam.sdk.io
The interface that readers of custom input sources must implement.
SOURCE_DOES_NOT_NEED_SPLITTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_ESTIMATED_SIZE_BYTES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_IS_INFINITE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_METADATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_PRODUCES_SORTED_KEYS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_SPEC - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SOURCE_STEP_INPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
SourceMetrics - Class in org.apache.beam.sdk.metrics
Standard Source Metrics.
SourceMetrics() - Constructor for class org.apache.beam.sdk.metrics.SourceMetrics
 
sourceName() - Method in class org.apache.beam.runners.spark.metrics.AggregatorMetricSource
 
sourceName() - Method in class org.apache.beam.runners.spark.metrics.CompositeSource
 
sourceName() - Method in class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
SourceRDD - Class in org.apache.beam.runners.spark.io
Classes implementing Beam Source RDDs.
SourceRDD() - Constructor for class org.apache.beam.runners.spark.io.SourceRDD
 
SourceRDD.Bounded<T> - Class in org.apache.beam.runners.spark.io
A SourceRDD.Bounded reads input from a BoundedSource and creates a Spark RDD.
SourceRDD.Unbounded<T,CheckpointMarkT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.runners.spark.io
A SourceRDD.Unbounded is the implementation of a micro-batch in a SourceDStream.
SourceTestUtils - Class in org.apache.beam.sdk.testing
Helper functions and test harnesses for checking correctness of Source implementations.
SourceTestUtils() - Constructor for class org.apache.beam.sdk.testing.SourceTestUtils
 
SourceTestUtils.ExpectedSplitOutcome - Enum in org.apache.beam.sdk.testing
span(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the minimal window that includes both this window and the given window.
SparkBeamMetricSource - Class in org.apache.beam.runners.spark.metrics
A Spark Source that is tailored to expose a SparkBeamMetric, wrapping an underlying MetricResults instance.
SparkBeamMetricSource(String) - Constructor for class org.apache.beam.runners.spark.metrics.SparkBeamMetricSource
 
SparkContextOptions - Interface in org.apache.beam.runners.spark
A custom PipelineOptions to work with properties related to JavaSparkContext.
SparkContextOptions.EmptyListenersList - Class in org.apache.beam.runners.spark
Returns an empty list, to avoid handling null.
SparkGroupAlsoByWindowViaWindowSet - Class in org.apache.beam.runners.spark.stateful
An implementation of GroupAlsoByWindowViaWindowSetDoFn logic for grouping by windows and controlling trigger firings and pane accumulation.
SparkGroupAlsoByWindowViaWindowSet() - Constructor for class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
 
SparkNativePipelineVisitor - Class in org.apache.beam.runners.spark
Pipeline visitor for translating a Beam pipeline into equivalent Spark operations.
SparkPipelineOptions - Interface in org.apache.beam.runners.spark
Spark runner PipelineOptions handles Spark execution-related configurations, such as the master address, batch-interval, and other user-related knobs.
SparkPipelineOptions.TmpCheckpointDirFactory - Class in org.apache.beam.runners.spark
Returns the default checkpoint directory of /tmp/${job.name}.
SparkPipelineResult - Class in org.apache.beam.runners.spark
Represents a Spark pipeline execution result.
SparkRunner - Class in org.apache.beam.runners.spark
The SparkRunner translates operations defined on a pipeline into a representation executable by Spark, and then submits the job to Spark for execution.
SparkRunner.Evaluator - Class in org.apache.beam.runners.spark
Evaluator on the pipeline.
SparkRunnerDebugger - Class in org.apache.beam.runners.spark
Pipeline runner which translates a Beam pipeline into equivalent Spark operations, without running them.
SparkRunnerDebugger.DebugSparkPipelineResult - Class in org.apache.beam.runners.spark
PipelineResult of running a Pipeline using SparkRunnerDebugger. Use SparkRunnerDebugger.DebugSparkPipelineResult.getDebugString() to get a String representation of the Pipeline translated into Spark native operations.
SparkRunnerRegistrar - Class in org.apache.beam.runners.spark
Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the SparkRunner.
SparkRunnerRegistrar.Options - Class in org.apache.beam.runners.spark
Registers the SparkPipelineOptions.
SparkRunnerRegistrar.Runner - Class in org.apache.beam.runners.spark
Registers the SparkRunner.
SparkSideInputReader - Class in org.apache.beam.runners.spark.util
A SideInputReader for the SparkRunner.
SparkSideInputReader(Map<TupleTag<?>, KV<WindowingStrategy<?, ?>, SideInputBroadcast<?>>>) - Constructor for class org.apache.beam.runners.spark.util.SparkSideInputReader
 
SparkTimerInternals - Class in org.apache.beam.runners.spark.stateful
An implementation of TimerInternals for the SparkRunner.
SparkUnboundedSource - Class in org.apache.beam.runners.spark.io
A "composite" InputDStream implementation for UnboundedSources.
SparkUnboundedSource() - Constructor for class org.apache.beam.runners.spark.io.SparkUnboundedSource
 
SparkUnboundedSource.Metadata - Class in org.apache.beam.runners.spark.io
A metadata holder for an input stream partition.
SparkWatermarks(Instant, Instant, Instant) - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
Splits the source into bundles of approximately desiredBundleSizeBytes.
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
split(long, PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
split(int) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns a list of up to numSplits + 1 ByteKeys in ascending order, where the keys have been interpolated to form roughly equal sub-ranges of this ByteKeyRange, assuming a uniform distribution of keys within this range.
split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.UnboundedSource
Returns a list of UnboundedSource objects representing the instances of this source that should be used when executing the workflow.
split(String) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
split(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
split(String, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
split(Pattern, boolean) - Static method in class org.apache.beam.sdk.transforms.Regex
Returns a Regex.Split PTransform that splits a string on the regular expression and then outputs each item.
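For example, a small sketch assuming a Pipeline p:

    // Splits each input string on a comma (with optional surrounding whitespace),
    // emitting one output element per token: "a", "b", "c".
    PCollection<String> tokens =
        p.apply(Create.of("a, b,c"))
         .apply(Regex.split("\\s*,\\s*"));
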
Split(Pattern, boolean) - Constructor for class org.apache.beam.sdk.transforms.Regex.Split
 
SPLIT_POINTS_UNKNOWN - Static variable in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
A constant to use as the return value for BoundedSource.BoundedReader.getSplitPointsConsumed() or BoundedSource.BoundedReader.getSplitPointsRemaining() when the exact value is unknown.
splitAtFraction(double) - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
Tells the reader to narrow the range of the input it's going to read and give up the remainder, so that the new range would contain approximately the given fraction of the amount of data in the current range.
splitAtFraction(double) - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
stageFiles() - Method in class org.apache.beam.runners.dataflow.util.GcsStager
 
stageFiles() - Method in interface org.apache.beam.runners.dataflow.util.Stager
 
Stager - Interface in org.apache.beam.runners.dataflow.util
Interface for staging files needed for running a Dataflow pipeline.
StagerFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
 
StagingLocationFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
 
StandardCreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
 
start() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
 
start() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
 
start() - Method in class org.apache.beam.sdk.io.Source.Reader
Initializes the reader and advances the reader to the first record.
start() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
Initializes the reader and advances the reader to the first record.
start() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
Returns the start of this window, inclusive.
START_INDEX - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
START_OFFSET - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
START_SHUFFLE_POSITION - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
startBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
 
startBundle(DoFn<T, Void>.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
startBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Calls the DoFn.StartBundle method on the DoFn under test.
StartBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
 
startImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
 
startImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
Initializes the OffsetBasedSource.OffsetBasedReader and advances to the first record, returning true if there is a record available to be read.
startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
 
startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
Creates a decompressing channel from the input channel and passes it to its delegate reader's FileBasedReader#startReading(ReadableByteChannel).
startReading(ReadableByteChannel) - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
Performs any initialization of the subclass of FileBasedReader that involves IO operations.
state - Variable in class org.apache.beam.runners.spark.SparkPipelineResult
 
State - Interface in org.apache.beam.sdk.state
A state cell, supporting a State.clear() operation.
StateBinder - Interface in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
StateContext<W extends BoundedWindow> - Interface in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
StateContexts - Class in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
StateContexts() - Constructor for class org.apache.beam.sdk.state.StateContexts
 
StateSpec<StateT extends State> - Interface in org.apache.beam.sdk.state
A specification of a persistent state cell.
StateSpecFunctions - Class in org.apache.beam.runners.spark.stateful
A class containing StateSpec mappingFunctions.
StateSpecFunctions() - Constructor for class org.apache.beam.runners.spark.stateful.StateSpecFunctions
 
StateSpecs - Class in org.apache.beam.sdk.state
Static methods for working with StateSpecs.
status() - Method in class org.apache.beam.sdk.io.fs.MatchResult
Status of the MatchResult.
STATUS_BACKOFF_FACTORY - Static variable in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
step() - Method in interface org.apache.beam.sdk.metrics.MetricResult
Return the step context to which this metric result applies.
steps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
 
stop() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
stop() - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
 
StreamingInserts<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
PTransform that performs a streaming write to BigQuery.
StreamingIT - Interface in org.apache.beam.sdk.testing
Deprecated.
StreamingOptions - Interface in org.apache.beam.sdk.options
Options used to configure streaming.
StreamingWriteTables - Class in org.apache.beam.sdk.io.gcp.bigquery
This transform takes in key-value pairs of TableRow entries and the TableDestination they should be written to.
StreamingWriteTables() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
StringDelegateCoder<T> - Class in org.apache.beam.sdk.coders
A Coder that wraps a Coder<String> and encodes/decodes values via string representations.
StringDelegateCoder(Class<T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.StringDelegateCoder
 
strings() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
The TypeDescriptor for String.
StringUtf8Coder - Class in org.apache.beam.sdk.coders
A Coder that encodes Strings in UTF-8 encoding.
STRIP_TRAILING_NEWLINES - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
StripIdsDoFn() - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.StripIdsDoFn
 
Structs - Class in org.apache.beam.runners.dataflow.util
A collection of static methods for manipulating data structure representations transferred via the Dataflow API.
StructuralByteArray - Class in org.apache.beam.sdk.coders
A wrapper around a byte[] that uses structural, value-based equality rather than byte[]'s normal object identity.
StructuralByteArray(byte[]) - Constructor for class org.apache.beam.sdk.coders.StructuralByteArray
 
structuralValue(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
Returns an object with an Object.equals() method that represents structural equality on the argument.
structuralValue(T) - Method in class org.apache.beam.sdk.coders.Coder
Returns an object with an Object.equals() method that represents structural equality on the argument.
structuralValue(T) - Method in class org.apache.beam.sdk.coders.DelegateCoder
Returns an object with an Object.equals() method that represents structural equality on the argument.
structuralValue(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
 
structuralValue(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
 
structuralValue(T) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
structuralValue(T) - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
structuralValue(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
structuralValueConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T> and values of type T, the structural values are equal if and only if the encoded bytes are equal.
structuralValueConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the structural values are equal if and only if the encoded bytes are equal, in any Coder.Context.
structuralValueDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T> and value of type T, the structural value is equal to the structural value yielded by encoding and decoding the original value.
structuralValueDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
Verifies that for the given Coder<T>, Coder.Context, and value of type T, the structural value is equal to the structural value yielded by encoding and decoding the original value, in any Coder.Context.
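A minimal sketch of using these checks in a coder unit test (the test method must declare throws Exception; values are illustrative):

    CoderProperties.structuralValueConsistentWithEquals(
        StringUtf8Coder.of(), "left", "right");
    CoderProperties.structuralValueDecodeEncodeEqual(StringUtf8Coder.of(), "value");
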
StructuredCoder<T> - Class in org.apache.beam.sdk.coders
An abstract base class to implement a Coder that defines equality, hashing, and printing via the class name and recursively using StructuredCoder.getComponents().
StructuredCoder() - Constructor for class org.apache.beam.sdk.coders.StructuredCoder
 
subTriggers - Variable in class org.apache.beam.sdk.transforms.windowing.Trigger
 
subTriggers() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
success() - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
SUCCESS_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
 
SuccessOrFailure - Class in org.apache.beam.sdk.testing
Output of PAssert.
sum() - Method in class org.apache.beam.sdk.metrics.DistributionResult
 
Sum - Class in org.apache.beam.sdk.transforms
PTransforms for computing the sum of the elements in a PCollection, or the sum of the values associated with each key in a PCollection of KVs.

T

TableDestination - Class in org.apache.beam.sdk.io.gcp.bigquery
Encapsulates a BigQuery table destination.
TableDestination(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestinationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A coder for TableDestination objects.
TableDestinationCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
TableRowJsonCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder that encodes BigQuery TableRow objects in their native JSON format.
TaggedKeyedPCollection(TupleTag<V>, PCollection<KV<K, V>>) - Constructor for class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
 
TaggedPValue - Class in org.apache.beam.sdk.values
For internal use only; no backwards-compatibility guarantees.
TaggedPValue() - Constructor for class org.apache.beam.sdk.values.TaggedPValue
 
takeOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the main output.
takeOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the output with the given tag.
takeOutputElementsWithTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFnTester
Returns the elements output so far to the main output with associated timestamps.
tempDirectory - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Directory for temporary output files.
TestApexRunner - Class in org.apache.beam.runners.apex
Apex PipelineRunner for testing.
testByteCount(Coder<T>, Coder.Context, T[]) - Static method in class org.apache.beam.sdk.testing.CoderProperties
A utility method that passes the given (unencoded) elements through coder's registerByteSizeObserver() and encode() methods, and confirms they are mutually consistent.
TestDataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow
A set of options used to configure the TestPipeline.
TestDataflowRunner - Class in org.apache.beam.runners.dataflow
TestDataflowRunner is a pipeline runner that wraps a DataflowRunner when running tests against the TestPipeline.
TestElementByteSizeObserver() - Constructor for class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
 
TestFlinkRunner - Class in org.apache.beam.runners.flink
Test Flink runner.
testingPipelineOptions() - Static method in class org.apache.beam.sdk.testing.TestPipeline
Creates PipelineOptions for testing.
TestPipeline - Class in org.apache.beam.sdk.testing
A creator of test pipelines for use inside tests; the resulting pipelines can be configured to run locally or against a remote pipeline runner.
TestPipeline.AbandonedNodeException - Exception in org.apache.beam.sdk.testing
An exception thrown in case an abandoned PTransform is detected, that is, a PTransform that has not been run.
TestPipeline.PipelineRunMissingException - Exception in org.apache.beam.sdk.testing
An exception thrown in case a test finishes without invoking Pipeline.run().
TestPipelineOptions - Interface in org.apache.beam.sdk.testing
TestPipelineOptions is a set of options for test pipelines.
TestPipelineOptions.AlwaysPassMatcher - Class in org.apache.beam.sdk.testing
Matcher which will always pass.
TestPipelineOptions.AlwaysPassMatcherFactory - Class in org.apache.beam.sdk.testing
Factory for PipelineResult matchers which always pass.
TestSparkPipelineOptions - Interface in org.apache.beam.runners.spark
TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory - Class in org.apache.beam.runners.spark
A factory to provide the default watermark to stop a pipeline that reads from an unbounded source.
TestSparkRunner - Class in org.apache.beam.runners.spark
The SparkRunner translates operations defined on a pipeline into a representation executable by Spark, and then submits the job to Spark for execution.
TestStream<T> - Class in org.apache.beam.sdk.testing
A testing input that generates an unbounded PCollection of elements, advancing the watermark and processing time as elements are emitted.
TestStream.Builder<T> - Class in org.apache.beam.sdk.testing
An incomplete TestStream.
TestStream.ElementEvent<T> - Class in org.apache.beam.sdk.testing
A TestStream.Event that produces elements.
TestStream.Event<T> - Interface in org.apache.beam.sdk.testing
An event in a TestStream.
TestStream.EventType - Enum in org.apache.beam.sdk.testing
The types of TestStream.Event that are supported by TestStream.
TestStream.ProcessingTimeEvent<T> - Class in org.apache.beam.sdk.testing
A TestStream.Event that advances the processing time clock.
TestStream.WatermarkEvent<T> - Class in org.apache.beam.sdk.testing
A TestStream.Event that advances the watermark.
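A short sketch of building a TestStream, assuming a TestPipeline p and joda-time imports; note that TestStream is currently supported only by runners that implement it, such as the direct runner:

    TestStream<String> events = TestStream.create(StringUtf8Coder.of())
        .addElements("on-time")
        .advanceWatermarkTo(new Instant(0).plus(Duration.standardMinutes(5)))
        .addElements(TimestampedValue.of("late", new Instant(0)))
        .advanceProcessingTime(Duration.standardMinutes(1))
        .advanceWatermarkToInfinity();

    PCollection<String> input = p.apply(events);
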
TextIO - Class in org.apache.beam.sdk.io
PTransforms for reading and writing text files.
TextIO.CompressionType - Enum in org.apache.beam.sdk.io
Possible text file compression types.
TextIO.Read - Class in org.apache.beam.sdk.io
Implementation of TextIO.read().
TextIO.Write - Class in org.apache.beam.sdk.io
Implementation of TextIO.write().
TextualIntegerCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes Integers as the ASCII bytes of their textual, decimal representation.
TextualIntegerCoder() - Constructor for class org.apache.beam.sdk.coders.TextualIntegerCoder
 
TFRecordIO - Class in org.apache.beam.sdk.io
PTransforms for reading and writing TensorFlow TFRecord files.
TFRecordIO.CompressionType - Enum in org.apache.beam.sdk.io
Possible TFRecord file compression types.
TFRecordIO.Read - Class in org.apache.beam.sdk.io
Implementation of TFRecordIO.read().
TFRecordIO.Write - Class in org.apache.beam.sdk.io
Implementation of TFRecordIO.write().
that(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.IterableAssert for the elements of the provided PCollection.
that(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.IterableAssert for the elements of the provided PCollection with the specified reason.
thatMap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection, which must have at most one value per key.
thatMap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection with the specified reason.
thatMultimap(PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection.
thatMultimap(String, PCollection<KV<K, V>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection with the specified reason.
thatSingleton(PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection PCollection<T>, which must be a singleton.
thatSingleton(String, PCollection<T>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.SingletonAssert for the value of the provided PCollection PCollection<T> with the specified reason.
thatSingletonIterable(PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.IterableAssert for the value of the provided PCollection which must contain a single Iterable<T> value.
thatSingletonIterable(String, PCollection<? extends Iterable<T>>) - Static method in class org.apache.beam.sdk.testing.PAssert
Constructs a PAssert.IterableAssert for the value of the provided PCollection with the specified reason.
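A minimal usage sketch, assuming a TestPipeline p inside a unit test:

    PCollection<String> words = p.apply(Create.of("hello", "beam"));

    // Assert on the full contents and on a derived singleton value.
    PAssert.that(words).containsInAnyOrder("beam", "hello");
    PAssert.thatSingleton(words.apply(Count.globally())).isEqualTo(2L);

    p.run().waitUntilFinish();
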
throwNullCredentialException() - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
 
TimeDomain - Enum in org.apache.beam.sdk.state
TimeDomain specifies whether an operation is based on timestamps of elements or current "real-world" time as reported while processing.
timeDomain() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
Returns the time domain of the current timer.
Timer - Interface in org.apache.beam.sdk.state
A timer for a specified time domain that can be set to register the desire for further processing at a particular time in its specified time domain.
timer(TimeDomain) - Static method in class org.apache.beam.sdk.state.TimerSpecs
 
Timers - Interface in org.apache.beam.sdk.state
Interface for interacting with time.
TimerSpec - Interface in org.apache.beam.sdk.state
A specification for a Timer.
TimerSpecs - Class in org.apache.beam.sdk.state
Static methods for working with TimerSpecs.
TimerSpecs() - Constructor for class org.apache.beam.sdk.state.TimerSpecs
 
timestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
 
timestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult
 
timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
Returns the timestamp of the current timer.
timestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Returns the timestamp of the input element.
timestamp() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
Returns the timestamp of the current element.
TIMESTAMP_MAX_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
The maximum value for any Beam timestamp.
TIMESTAMP_MIN_VALUE - Static variable in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
The minimum value for any Beam timestamp.
TimestampCombiner - Enum in org.apache.beam.sdk.transforms.windowing
Policies for combining timestamps that occur within a window.
TimeStampComparator() - Constructor for class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
 
timestamped(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.TimestampedValues transform that produces a PCollection containing the elements of the provided Iterable with the specified timestamps.
timestamped(TimestampedValue<T>, TimestampedValue<T>...) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new Create.TimestampedValues transform that produces a PCollection containing the specified elements with the specified timestamps.
timestamped(Iterable<T>, Iterable<Long>) - Static method in class org.apache.beam.sdk.transforms.Create
Returns a new root transform that produces a PCollection containing the specified elements with the specified timestamps.
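For example (timestamps are joda-time Instants; values are illustrative):

    PCollection<String> events = p.apply(Create.timestamped(
        TimestampedValue.of("open", new Instant(0L)),
        TimestampedValue.of("close", new Instant(60_000L))));
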
TimestampedValue<V> - Class in org.apache.beam.sdk.values
An immutable pair of a value and a timestamp.
TimestampedValue(V, Instant) - Constructor for class org.apache.beam.sdk.values.TimestampedValue
 
TimestampedValue.TimestampedValueCoder<T> - Class in org.apache.beam.sdk.values
TimestampTransform - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees.
TimestampTransform.AlignTo - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees.
TimestampTransform.Delay - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees.
TimeUtil - Class in org.apache.beam.runners.dataflow.util
A helper class for converting between Dataflow API and SDK time representations.
TmpCheckpointDirFactory() - Constructor for class org.apache.beam.runners.spark.SparkPipelineOptions.TmpCheckpointDirFactory
 
to(String) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Writes to file(s) with the given output prefix.
to(ResourceId) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Writes to file(s) with the given output prefix.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
to(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the given table, specified in the format described in BigQueryHelpers.parseTableSpec(java.lang.String).
to(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the given table, specified as a TableReference.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
to(SerializableFunction<ValueInSingleWindow<T>, TableDestination>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to table specified by the specified table function.
to(DynamicDestinations<T, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the table and schema specified by the DynamicDestinations object.
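A hedged sketch of the common case, assuming rows is a PCollection<TableRow>, schema is a TableSchema built elsewhere, and the table spec is a placeholder:

    rows.apply(BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")
        .withSchema(schema)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
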
to(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Publishes to the specified topic.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Like topic() but with a ValueProvider.
to(long) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies the maximum number to generate (exclusive).
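For example, assuming a Pipeline p:

    // Bounded sequence 0..99; the bound passed to to() is exclusive.
    PCollection<Long> seq = p.apply(GenerateSequence.from(0).to(100));
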
to(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
Writes to text files with the given prefix.
to(ResourceId) - Method in class org.apache.beam.sdk.io.TextIO.Write
Writes to text files with prefix from the given resource.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
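A minimal sketch, assuming lines is a PCollection<String>; the bucket and prefix are placeholders:

    // With the default shard template this writes files such as
    // gs://my-bucket/output/part-00000-of-00001.txt
    lines.apply(TextIO.write()
        .to("gs://my-bucket/output/part")
        .withSuffix(".txt")
        .withNumShards(1));
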
to(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes TFRecord file(s) with the given output prefix.
to(ResourceId) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes TFRecord file(s) with a prefix given by the specified resource.
to(FileBasedSink<T>) - Static method in class org.apache.beam.sdk.io.WriteFiles
Creates a WriteFiles transform that writes to the given FileBasedSink, letting the runner control how many different shards are produced.
to(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Writes to files with the given path prefix.
toByteArray(T, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for serializing an object using the specified coder.
toByteArrays(Iterable<T>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
Utility method for serializing a Iterable of values using the specified coder.
toByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting an object to a bytearray.
toByteFunction(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
A function wrapper for converting a key-value pair to a byte array pair.
toCloudDuration(ReadableDuration) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a ReadableDuration into a Dataflow API duration string.
toCloudObject(T) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
Converts the provided object into an equivalent CloudObject.
toCloudTime(ReadableInstant) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
Converts a ReadableInstant into a Dataflow API time value.
Top - Class in org.apache.beam.sdk.transforms
PTransforms for finding the largest (or smallest) set of elements in a PCollection, or the largest (or smallest) set of values associated with each key in a PCollection of KVs.
Top.Largest<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
A Serializable Comparator that uses the compared elements' natural ordering.
Top.Smallest<T extends java.lang.Comparable<? super T>> - Class in org.apache.beam.sdk.transforms
Serializable Comparator that uses the reverse of the compared elements' natural ordering.
Top.TopCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
CombineFn for Top transforms that combines a bunch of Ts into a single count-long List<T>, using compareFn to choose the largest Ts.
TopCombineFn(int, ComparatorT) - Constructor for class org.apache.beam.sdk.transforms.Top.TopCombineFn
 
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TextIO.Write
toResource(ValueProvider<ResourceId>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
toState(String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
 
toString() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
 
toString() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
 
toString() - Method in class org.apache.beam.runners.dataflow.TestDataflowRunner
 
toString() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
toString() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
toString() - Method in class org.apache.beam.runners.flink.FlinkRunner
 
toString() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
toString() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
 
toString() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
 
toString() - Method in class org.apache.beam.sdk.coders.Coder.Context
Deprecated.
 
toString() - Method in class org.apache.beam.sdk.coders.DelegateCoder
 
toString() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
toString() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
 
toString() - Method in class org.apache.beam.sdk.coders.StructuredCoder
 
toString() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
 
toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
toString() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
 
toString() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
toString() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
Returns the string representation of this ResourceId.
toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
 
toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
 
toString() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
toString() - Method in class org.apache.beam.sdk.io.range.ByteKey
 
toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
 
toString() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
toString() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
toString() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
 
toString() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
 
toString() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
 
toString() - Method in class org.apache.beam.sdk.Pipeline
 
toString() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
 
toString() - Method in class org.apache.beam.sdk.testing.TestPipeline
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
 
toString() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
 
toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
 
toString() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
 
toString() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
 
toString() - Method in class org.apache.beam.sdk.transforms.PTransform
 
toString() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRange
 
ToString - Class in org.apache.beam.sdk.transforms
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
Deprecated.
 
toString() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
 
toString() - Method in class org.apache.beam.sdk.values.KV
 
toString() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
 
toString() - Method in class org.apache.beam.sdk.values.PValueBase
 
toString() - Method in class org.apache.beam.sdk.values.TimestampedValue
 
toString() - Method in class org.apache.beam.sdk.values.TupleTag
 
toString() - Method in class org.apache.beam.sdk.values.TupleTagList
 
toString() - Method in class org.apache.beam.sdk.values.TypeDescriptor
 
toString() - Method in class org.apache.beam.sdk.values.TypeParameter
 
toString() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
 
toString() - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Returns a canonical string representation of the TableReference.
toUnsplittableSource(BoundedSource<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
Returns an equivalent unsplittable BoundedSource<T>.
transformStepNames - Variable in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
translate(Pipeline, ApexPipelineOptions) - Static method in class org.apache.beam.runners.apex.TestApexRunner
 
translate(Pipeline, DataflowRunner, List<DataflowPackage>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
Translates a Pipeline into a JobSpecification.
translate(TransformHierarchy.Node, TransformT, Class<TransformT>) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
Determine if this Node belongs to a Bounded branch of the pipeline, or Unbounded, and translate with the proper translator.
translateOnly - Variable in class org.apache.beam.runners.apex.ApexRunner
 
translator - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
traverseTopologically(Pipeline.PipelineVisitor) - Method in class org.apache.beam.sdk.Pipeline
For internal use only; no backwards-compatibility guarantees.
Trigger - Class in org.apache.beam.sdk.transforms.windowing
Triggers control when the elements for a specific key and window are output.
Trigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
 
Trigger() - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger
 
Trigger.OnceTrigger - Class in org.apache.beam.sdk.transforms.windowing
For internal use only; no backwards-compatibility guarantees. Triggers that are guaranteed to fire at most once should extend Trigger.OnceTrigger rather than the general Trigger class to indicate that behavior.
triggering(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Sets a non-default trigger for this Window PTransform.
tryClaim(long) - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
Attempts to claim the given offset.
tryReturnRecordAt(boolean, ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
tryReturnRecordAt(boolean, Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
tryReturnRecordAt(boolean, long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
tryReturnRecordAt(boolean, PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Atomically determines whether a record at the given position can be returned and updates internal state.
trySplitAtPosition(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
 
trySplitAtPosition(Long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
trySplitAtPosition(long) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
 
trySplitAtPosition(PositionT) - Method in interface org.apache.beam.sdk.io.range.RangeTracker
Atomically splits the current range [RangeTracker.getStartPosition(), RangeTracker.getStopPosition()) into a "primary" part [RangeTracker.getStartPosition(), splitPosition) and a "residual" part [splitPosition, RangeTracker.getStopPosition()), assuming the current last-consumed position is within [RangeTracker.getStartPosition(), splitPosition) (i.e., splitPosition has not been consumed yet).
TUPLE_TAGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
TupleTag<V> - Class in org.apache.beam.sdk.values
A TupleTag is a typed tag to use as the key of a heterogeneously typed tuple, like PCollectionTuple.
TupleTag() - Constructor for class org.apache.beam.sdk.values.TupleTag
Constructs a new TupleTag, with a fresh unique id.
TupleTag(String) - Constructor for class org.apache.beam.sdk.values.TupleTag
Constructs a new TupleTag with the given id.
TupleTagList - Class in org.apache.beam.sdk.values
A TupleTagList is an immutable list of heterogeneously typed TupleTags.
type - Variable in class org.apache.beam.runners.dataflow.util.OutputReference
 
TypeDescriptor<T> - Class in org.apache.beam.sdk.values
A description of a Java type, including actual generic parameters where possible.
TypeDescriptor() - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
Creates a TypeDescriptor representing the type parameter T.
TypeDescriptor(Object) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
Creates a TypeDescriptor representing the type parameter T, which should resolve to a concrete type in the context of the class clazz.
TypeDescriptor(Class<?>) - Constructor for class org.apache.beam.sdk.values.TypeDescriptor
Creates a TypeDescriptor representing the type parameter T, which should resolve to a concrete type in the context of the class clazz.
TypeDescriptors - Class in org.apache.beam.sdk.values
A utility class containing the Java primitives for TypeDescriptor equivalents.
TypeDescriptors() - Constructor for class org.apache.beam.sdk.values.TypeDescriptors
 
TypeParameter<T> - Class in org.apache.beam.sdk.values
TypeParameter() - Constructor for class org.apache.beam.sdk.values.TypeParameter
 

U

Unbounded(SparkContext, SparkRuntimeContext, MicrobatchSource<T, CheckpointMarkT>, int) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
 
unbounded() - Static method in class org.apache.beam.sdk.io.CountingSource
Deprecated.
use GenerateSequence instead
UnboundedJmsSource(JmsIO.Read) - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
UnboundedReader() - Constructor for class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
 
UnboundedSource<OutputT,CheckpointMarkT extends UnboundedSource.CheckpointMark> - Class in org.apache.beam.sdk.io
A Source that reads an unbounded amount of input and, because of that, supports some additional operations such as checkpointing, watermarks, and record ids.
UnboundedSource() - Constructor for class org.apache.beam.sdk.io.UnboundedSource
 
UnboundedSource.CheckpointMark - Interface in org.apache.beam.sdk.io
A marker representing the progress and state of an UnboundedSource.UnboundedReader.
UnboundedSource.UnboundedReader<OutputT> - Class in org.apache.beam.sdk.io
A Reader that reads an unbounded amount of input.
unboundedWithTimestampFn(SerializableFunction<Long, Instant>) - Static method in class org.apache.beam.sdk.io.CountingSource
UnionCoder - Class in org.apache.beam.sdk.transforms.join
A UnionCoder encodes RawUnionValues.
unknown() - Static method in class org.apache.beam.sdk.io.fs.MatchResult
unpersist() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
 
UNSIGNED_LEXICOGRAPHICAL_COMPARATOR - Static variable in class org.apache.beam.runners.dataflow.util.RandomAccessData
 
unwindowedFilename(ResourceId, FileBasedSink.FilenamePolicy.Context, String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
unwindowedFilename(ResourceId, FileBasedSink.FilenamePolicy.Context, String) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
When a sink has not requested windowed or triggered output, this method will be invoked to return the file resource to be created given the base output directory and a (possibly empty) extension applied by additional FileBasedSink configuration (e.g., FileBasedSink.CompressionType).
update(InputT) - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators.CombineFunctionState
 
update(InputT) - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
 
update(long) - Method in interface org.apache.beam.sdk.metrics.Distribution
Add an observation to this distribution.
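A minimal sketch of updating a Distribution from inside a DoFn (class and metric names are illustrative):

    class RecordSizeFn extends DoFn<String, String> {
      // Results surface through the pipeline's MetricResults.
      private final Distribution sizes =
          Metrics.distribution(RecordSizeFn.class, "recordSizeChars");

      @ProcessElement
      public void process(ProcessContext c) {
        sizes.update(c.element().length());
        c.output(c.element());
      }
    }
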
updateCacheCandidates(Pipeline, SparkPipelineTranslator, EvaluationContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
Evaluator that updates/populates the cache candidates.
updateConsumerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Update consumer configuration with new properties.
updateJob(String, Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
Updates the Dataflow Job with the given jobId.
updateProducerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Adds the given producer properties, overriding old values of properties with the same key.
updateWatermark(Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
Gives the runner a (best-effort) lower bound about the timestamps of future output associated with the current element.
updateWindowingStrategy(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
 
upTo(long) - Static method in class org.apache.beam.sdk.io.CountingSource
Deprecated.
use GenerateSequence instead
USE_INDEXED_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
USER_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
USER_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
USES_KEYED_STATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
UsesAttemptedMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Metrics.
UsesAttemptedMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesAttemptedMetrics
 
UsesCommittedMetrics - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Metrics.
UsesCounterMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Counter.
UsesCounterMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesCounterMetrics
 
UsesDistributionMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Distribution.
UsesDistributionMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesDistributionMetrics
 
UsesGaugeMetrics - Class in org.apache.beam.sdk.testing
Category tag for validation tests which utilize Gauge.
UsesGaugeMetrics() - Constructor for class org.apache.beam.sdk.testing.UsesGaugeMetrics
 
UsesMapState - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize MapState.
UsesSetState - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize SetState.
UsesSplittableParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize splittable ParDo.
UsesSplittableParDoWithWindowedSideInputs - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize splittable ParDo and use windowed side inputs.
UsesStatefulParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize stateful ParDo.
UsesTestStream - Interface in org.apache.beam.sdk.testing
Category tag for tests that use TestStream, which is not a part of the Beam model but a special feature currently only implemented by the direct runner.
UsesTimersInParDo - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize timers in ParDo.
UsesUnboundedPCollections - Interface in org.apache.beam.sdk.testing
Category tag for validation tests which utilize at least one unbounded PCollection.
usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Enables BigQuery's Standard SQL dialect when reading from a query.
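A minimal sketch of combining usingStandardSql() with a query read, assuming an existing Pipeline p; the public Shakespeare sample table is used purely as an example:

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows = p.apply(
        BigQueryIO.read()
            .fromQuery("SELECT word, word_count FROM `bigquery-public-data.samples.shakespeare`")
            .usingStandardSql()); // interpret the query as Standard SQL rather than legacy SQL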

V

v1() - Static method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreIO
Returns a DatastoreV1 that provides an API for accessing Cloud Datastore through the v1 version of the Datastore client library.
validate() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
 
validate() - Method in class org.apache.beam.sdk.io.AvroSource
 
validate() - Method in class org.apache.beam.sdk.io.CompressedSource
Validates that the delegate source is a valid source and that the channel factory is not null.
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSink
 
validate() - Method in class org.apache.beam.sdk.io.FileBasedSource
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
validate() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
 
validate() - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
 
validate(T) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
 
validate() - Method in class org.apache.beam.sdk.io.Source
Checks that this source is valid, before it can be used in a pipeline.
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.WriteFiles
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.xml.XmlSource
 
validate(Class<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.options.PipelineOptionsValidator
Validates that the passed PipelineOptions conforms to all the validation criteria from the passed-in interface.
validate(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.PTransform
Called before running the Pipeline to verify this transform is fully and correctly specified.
VALIDATE_SINK - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
VALIDATE_SOURCE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
validateGetOutputTimestamp(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Assigns the given timestamp to windows using the specified windowFn, and verifies that the result of windowFn.getOutputTime for later windows (as defined by maxTimestamp) won't prevent the watermark from passing the end of earlier windows.
validateGetOutputTimestamps(WindowFn<T, W>, TimestampCombiner, List<List<Long>>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Verifies that later-ending merged windows from any of the timestamps hold up output of earlier-ending windows, using the provided WindowFn and TimestampCombiner.
validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
Validates that the input GCS path is accessible and that the path is well formed.
validateInputFilePatternSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
validateInputFilePatternSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validate that a file pattern is conforming.
validateNonInterferingOutputTimes(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
Assigns the given timestamp to windows using the specified windowFn, and verifies that the result of windowFn.getOutputTimestamp for each window is within the proper bound.
validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
Validates that the output GCS path is accessible and that the path is well formed.
validateOutputFilePrefixSupported(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
validateOutputFilePrefixSupported(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validate that an output file prefix is conforming.
validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
validateOutputResourceSupported(ResourceId) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
validateOutputResourceSupported(ResourceId) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validates that an output path is conforming.
ValidatesRunner - Interface in org.apache.beam.sdk.testing
Category tag for tests which validate that a Beam runner is correctly implemented.
Validation - Annotation Type in org.apache.beam.sdk.options
Validation represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the validation criteria to be used when validating with the PipelineOptionsValidator.
Validation.Required - Annotation Type in org.apache.beam.sdk.options
This criterion specifies that the value must not be null.
VALUE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
 
value() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
 
value() - Method in class org.apache.beam.sdk.metrics.GaugeResult
 
value() - Static method in class org.apache.beam.sdk.state.StateSpecs
Create a StateSpec for a single value of type T.
value(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
Identical to StateSpecs.value(), but with a coder explicitly supplied.
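A minimal sketch of declaring a ValueState cell with an explicit coder inside a stateful DoFn; the state id "count" and the class name CountPerKeyFn are placeholders:

    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.state.ValueState;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    // Hypothetical per-key counter backed by a ValueState<Integer> cell.
    class CountPerKeyFn extends DoFn<KV<String, String>, KV<String, Integer>> {
      @StateId("count")
      private final StateSpec<ValueState<Integer>> countSpec = StateSpecs.value(VarIntCoder.of());

      @ProcessElement
      public void processElement(ProcessContext c, @StateId("count") ValueState<Integer> count) {
        Integer stored = count.read();                // null the first time a key is seen
        int updated = (stored == null ? 0 : stored) + 1;
        count.write(updated);
        c.output(KV.of(c.element().getKey(), updated));
      }
    }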
ValueInSingleWindow<T> - Class in org.apache.beam.sdk.values
An immutable tuple of value, timestamp, window, and pane.
ValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.ValueInSingleWindow
 
ValueInSingleWindow.Coder<T> - Class in org.apache.beam.sdk.values
A coder for ValueInSingleWindow.
valueOf(String) - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.annotations.Experimental.Kind
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.FileBasedSource.Mode
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.MatchResult.Status
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.TextIO.CompressionType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.PipelineResult.State
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.state.TimeDomain
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.testing.TestStream.EventType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.display.DisplayData.Type
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.values.PCollection.IsBounded
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
Returns the enum constant of this type with the specified name.
ValueProvider<T> - Interface in org.apache.beam.sdk.options
A ValueProvider abstracts the notion of fetching a value that may or may not be currently available.
ValueProvider.Deserializer - Class in org.apache.beam.sdk.options
For internal use only; no backwards compatibility guarantees.
ValueProvider.NestedValueProvider<T,X> - Class in org.apache.beam.sdk.options
ValueProvider.NestedValueProvider is an implementation of ValueProvider that allows for wrapping another ValueProvider object.
ValueProvider.RuntimeValueProvider<T> - Class in org.apache.beam.sdk.options
ValueProvider.RuntimeValueProvider is an implementation of ValueProvider that allows for a value to be provided at execution time rather than at graph construction time.
ValueProvider.Serializer - Class in org.apache.beam.sdk.options
For internal use only; no backwards compatibility guarantees.
ValueProvider.StaticValueProvider<T> - Class in org.apache.beam.sdk.options
ValueProvider.StaticValueProvider is an implementation of ValueProvider that allows for a static value to be provided.
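A minimal sketch of the ValueProvider pattern: a runtime-supplied option on a PipelineOptions subinterface, plus a StaticValueProvider for a value that is already known. The interface MyOptions, the option name, and the GCS path are placeholders:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.ValueProvider;
    import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

    // Hypothetical options interface whose value may only be provided at execution time.
    interface MyOptions extends PipelineOptions {
      ValueProvider<String> getInputFile();
      void setInputFile(ValueProvider<String> value);
    }

    class ValueProviderExample {
      public static void main(String[] args) {
        // A fixed value wrapped for APIs that expect a ValueProvider.
        ValueProvider<String> input = StaticValueProvider.of("gs://my-bucket/input.txt");
        System.out.println(input.isAccessible() + " " + input.get());
      }
    }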
values() - Static method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.Level
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.annotations.Experimental.Kind
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.coders.CannotProvideCoderException.ReasonCode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.FileBasedSource.Mode
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.fs.MatchResult.Status
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Writes just the values to Kafka.
values() - Static method in enum org.apache.beam.sdk.io.TextIO.CompressionType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.TFRecordIO.CompressionType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.options.PipelineOptions.CheckEnabled
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.PipelineResult.State
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Method in interface org.apache.beam.sdk.state.MapState
Returns an Iterable over the values contained in this map.
values() - Static method in enum org.apache.beam.sdk.state.TimeDomain
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.testing.SourceTestUtils.ExpectedSplitOutcome
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.testing.TestStream.EventType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.display.DisplayData.Type
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior
Returns an array containing the constants of this enum type, in the order they are declared.
Values<V> - Class in org.apache.beam.sdk.transforms
Values<V> takes a PCollection of KV<K, V>s and returns a PCollection<V> of the values.
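For instance, assuming wordCounts is an existing PCollection<KV<String, Long>>, a sketch of extracting just the values:

    import org.apache.beam.sdk.transforms.Values;
    import org.apache.beam.sdk.values.PCollection;

    // Drop the keys, keeping only the Long counts.
    PCollection<Long> counts = wordCounts.apply(Values.<Long>create());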
values() - Static method in enum org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.transforms.windowing.Window.ClosingBehavior
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.values.PCollection.IsBounded
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.values.WindowingStrategy.AccumulationMode
Returns an array containing the constants of this enum type, in the order they are declared.
ValueState<T> - Interface in org.apache.beam.sdk.state
A ReadableState cell containing a single value.
ValueWithRecordId<ValueT> - Class in org.apache.beam.sdk.values
For internal use only; no backwards compatibility guarantees.
ValueWithRecordId(ValueT, byte[]) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId
 
ValueWithRecordId.StripIdsDoFn<T> - Class in org.apache.beam.sdk.values
DoFn to turn a ValueWithRecordId<T> back to the value T.
ValueWithRecordId.ValueWithRecordIdCoder<ValueT> - Class in org.apache.beam.sdk.values
A Coder for ValueWithRecordId, using a wrapped value Coder.
ValueWithRecordIdCoder(Coder<ValueT>) - Constructor for class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
VarIntCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes Integers using between 1 and 5 bytes.
VarLongCoder - Class in org.apache.beam.sdk.coders
A Coder that encodes Longs using between 1 and 10 bytes.
verifyDeterministic() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.AtomicCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.AvroCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.BitSetCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ByteCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.Coder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic(Coder<?>, String, Iterable<Coder<?>>) - Static method in class org.apache.beam.sdk.coders.Coder
Verifies all of the provided coders are deterministic.
verifyDeterministic(Coder<?>, String, Coder<?>...) - Static method in class org.apache.beam.sdk.coders.Coder
Verifies all of the provided coders are deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.CustomCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DelegateCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DoubleCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.DurationCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.InstantCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.KvCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
LengthPrefixCoder is deterministic if the nested Coder is.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.ListCoder
List sizes are always known, so ListCoder may be deterministic while the general IterableLikeCoder is not.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.MapCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.NullableCoder
NullableCoder is deterministic if the nested Coder is.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SerializableCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.SetCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarIntCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VarLongCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.coders.VoidCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
Throw Coder.NonDeterministicException if the coding is not deterministic.
verifyDeterministic() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
 
verifyPAssertsSucceeded(Pipeline, PipelineResult) - Static method in class org.apache.beam.sdk.testing.TestPipeline
Verifies that all PAsserts in the pipeline have been executed and were successful.
verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
 
verifyPath(String) - Method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
 
verifyPath(String) - Method in interface org.apache.beam.sdk.extensions.gcp.storage.PathValidator
Validate that a path is a valid path and that the path is accessible.
via(SimpleFunction<? super InputT, ? extends Iterable<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
For a SimpleFunction<InputT, ? extends Iterable<OutputT>> fn, returns a PTransform that applies fn to every element of the input PCollection<InputT> and outputs all of the elements to the output PCollection<OutputT>.
via(SerializableFunction<NewInputT, ? extends Iterable<OutputT>>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
For a SerializableFunction<InputT, ? extends Iterable<OutputT>> fn, returns a PTransform that applies fn to every element of the input PCollection<InputT> and outputs all of the elements to the output PCollection<OutputT>.
via(SimpleFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
For a SimpleFunction<InputT, OutputT> fn, returns a PTransform that takes an input PCollection<InputT> and returns a PCollection<OutputT> containing fn.apply(v) for every element v in the input.
via(SerializableFunction<NewInputT, OutputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
For a SerializableFunction<InputT, OutputT> fn and output type descriptor, returns a PTransform that takes an input PCollection<InputT> and returns a PCollection<OutputT> containing fn.apply(v) for every element v in the input.
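A minimal sketch of the SimpleFunction form of MapElements, assuming an existing PCollection<String> named words:

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<Integer> lengths = words.apply(
        MapElements.via(new SimpleFunction<String, Integer>() {
          @Override
          public Integer apply(String word) {
            return word.length(); // one output element per input element
          }
        }));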
View - Class in org.apache.beam.sdk.transforms
Transforms for creating PCollectionViews from PCollections (to read them as side inputs).
View.AsIterable<T> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsList<T> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsMap<K,V> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsMultimap<K,V> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.AsSingleton<T> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
View.CreatePCollectionView<ElemT,ViewT> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
ViewFn<PrimitiveViewT,ViewT> - Class in org.apache.beam.sdk.transforms
For internal use only; no backwards-compatibility guarantees.
ViewFn() - Constructor for class org.apache.beam.sdk.transforms.ViewFn
 
visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
 
visitPrimitiveTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
visitPrimitiveTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each primitive transform after all of its topological predecessors and inputs have been visited.
visitValue(PValue, TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
 
visitValue(PValue, TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
Called for each value after the transform that produced the value has been visited.
VoidCoder - Class in org.apache.beam.sdk.coders
A Coder for Void.

W

waitUntilFinish(Duration) - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
 
waitUntilFinish(Duration, MonitoringUtil.JobMessagesHandler) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
Waits until the pipeline finishes and returns the final status.
waitUntilFinish() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
Waits until the pipeline finishes and returns the final status.
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
Waits until the pipeline finishes and returns the final status.
waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
 
waitUntilFinish() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
waitUntilFinish(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
 
waitUntilFinish(Duration) - Method in interface org.apache.beam.sdk.PipelineResult
Waits until the pipeline finishes and returns the final status.
waitUntilFinish() - Method in interface org.apache.beam.sdk.PipelineResult
Waits until the pipeline finishes and returns the final status.
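A minimal sketch of blocking on a pipeline result, assuming an existing Pipeline p; the 30-minute timeout is arbitrary:

    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    PipelineResult result = p.run();
    // Blocks for up to 30 minutes; the returned state may be null if the pipeline
    // has not reached a terminal state before the timeout (runner-dependent).
    PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(30));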
WatermarkEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
 
WatermarkHoldState - Interface in org.apache.beam.sdk.state
For internal use only; no backwards-compatibility guarantees.
WatermarksListener(JavaStreamingContext) - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarksListener
 
watermarkStateInternal(TimestampCombiner) - Static method in class org.apache.beam.sdk.state.StateSpecs
For internal use only; no backwards-compatibility guarantees.
weeks(int, int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by weeks.
where(TypeParameter<X>, TypeDescriptor<X>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
Returns a new TypeDescriptor where type variables represented by typeParameter are substituted by typeDescriptor.
window() - Method in interface org.apache.beam.sdk.state.StateContext
Returns the window corresponding to the state.
window() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
Returns the window in which the timer is firing.
Window<T> - Class in org.apache.beam.sdk.transforms.windowing
Window logically divides up or groups the elements of a PCollection into finite windows according to a WindowFn.
Window() - Constructor for class org.apache.beam.sdk.transforms.windowing.Window
 
window() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
Returns the window of the current element prior to this WindowFn being called.
Window.Assign<T> - Class in org.apache.beam.sdk.transforms.windowing
A Primitive PTransform that assigns windows to elements based on a WindowFn.
Window.ClosingBehavior - Enum in org.apache.beam.sdk.transforms.windowing
Specifies the conditions under which a final pane will be created when a window is permanently closed.
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
 
windowCoder() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
Returns the Coder used for serializing the windows used by this windowFn.
WindowedContext(BoundedWindow, PaneInfo, int, int) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy.WindowedContext
 
WindowedContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.WindowedContext
 
windowedFilename(ResourceId, FileBasedSink.FilenamePolicy.WindowedContext, String) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
 
windowedFilename(ResourceId, FileBasedSink.FilenamePolicy.WindowedContext, String) - Method in class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
When a sink has requested windowed or triggered output, this method will be invoked to return the file resource to be created given the base output directory and a (possibly empty) extension from FileBasedSink configuration (e.g., FileBasedSink.CompressionType).
windowedWrites - Variable in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Whether windowed writes are being used.
WindowFn<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
The argument to the Window transform used to assign elements into windows and to determine how windows are merged.
WindowFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn
 
WindowFn.AssignContext - Class in org.apache.beam.sdk.transforms.windowing
WindowFn.MergeContext - Class in org.apache.beam.sdk.transforms.windowing
WindowFnTestUtils - Class in org.apache.beam.sdk.testing
A utility class for testing WindowFns.
WindowFnTestUtils() - Constructor for class org.apache.beam.sdk.testing.WindowFnTestUtils
 
WindowingStrategy<T,W extends BoundedWindow> - Class in org.apache.beam.sdk.values
A WindowingStrategy describes the windowing behavior for a specific collection of values.
WindowingStrategy.AccumulationMode - Enum in org.apache.beam.sdk.values
The accumulation modes that can be used with windowing.
WindowMappingFn<TargetWindowT extends BoundedWindow> - Class in org.apache.beam.sdk.transforms.windowing
Experimental! This will be ready for users eventually, but should be considered internal for now.
WindowMappingFn() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
Create a new WindowMappingFn with zero maximum lookback.
WindowMappingFn(Duration) - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
Create a new WindowMappingFn with the specified maximum lookback.
windows() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.MergeContext
Returns the current set of windows.
with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
Returns a CombineFns.ComposedCombineFn that can take additional GlobalCombineFns and apply them as a single combine function.
with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
Returns a CombineFns.ComposedCombineFnWithContext that can take additional GlobalCombineFns and apply them as a single combine function.
with(SimpleFunction<DataT, InputT>, Combine.CombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
Returns a CombineFns.ComposedCombineFn with an additional Combine.CombineFn.
with(SimpleFunction<DataT, InputT>, CombineWithContext.CombineFnWithContext<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
with(SimpleFunction<DataT, InputT>, CombineFnBase.GlobalCombineFn<InputT, ?, OutputT>, TupleTag<OutputT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Override the amount of lateness allowed for data elements in the output PCollection and downstream PCollections until explicitly set again.
withAllowedLateness(Duration, Window.ClosingBehavior) - Method in class org.apache.beam.sdk.transforms.windowing.Window
Override the amount of lateness allowed for data elements in the pipeline.
withAllowedLateness(Duration) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the allowed lateness set to allowedLateness.
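A minimal sketch of setting allowed lateness on a Window transform, assuming an existing PCollection<String> named events with meaningful event timestamps; the window size and lateness durations are placeholders:

    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    PCollection<String> windowed = events.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
            .triggering(AfterWatermark.pastEndOfWindow())
            .withAllowedLateness(Duration.standardMinutes(2)) // data up to 2 minutes late is still processed
            .discardingFiredPanes());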
withAllowedTimestampSkew(Duration) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
Deprecated.
This method permits elements to be emitted behind the watermark. These elements are considered late and, if behind the allowed lateness of a downstream PCollection, may be silently dropped. See https://issues.apache.org/jira/browse/BEAM-644 for details on a replacement.
withBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide a size for the scroll read.
withBatchSize(long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by the given options, and using any other specified customizations.
withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by the given options, and using any other specified customizations.
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write to the Cloud Bigtable instance indicated by the given options, and using any other specified customizations.
withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write to the Cloud Bigtable instance indicated by the given options, and using any other specified customizations.
withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets the bootstrap servers for the Kafka consumer.
withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Returns a new KafkaIO.Write transform with Kafka producer pointing to bootstrapServers.
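A minimal sketch of a Kafka sink built from these setters, assuming an existing PCollection<KV<String, String>> named results and the single-topic setter withTopic; the broker addresses and topic name are placeholders, and the serializer classes come from the Kafka client library:

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.StringSerializer;

    results.apply(KafkaIO.<String, String>write()
        .withBootstrapServers("broker-1:9092,broker-2:9092") // placeholder brokers
        .withTopic("beam-output")                            // placeholder topic
        .withKeySerializer(StringSerializer.class)
        .withValueSerializer(StringSerializer.class));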
withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withBucket(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets the XML file charset.
withCharset(Charset) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Sets the charset used to write the file.
withChunkSize(Long) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withClientProvider(KinesisClientProvider) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Allows specifying a custom KinesisClientProvider.
withClientProvider(String, String, Regions) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Specify credential details and region to be used to read from Kinesis.
withClosingBehavior(Window.ClosingBehavior) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
withCodec(CodecFactory) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Writes to Avro file(s) compressed using specified codec.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
Returns a Create.TimestampedValues PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
Returns a Create.Values PTransform like this one that uses the given Coder<T> to decode each of the objects into a value of type T.
withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
withCollection(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
withCompressionType(TextIO.CompressionType) - Method in class org.apache.beam.sdk.io.TextIO.Read
Returns a new transform for reading from text files that's like this one but reads from input sources using the specified compression type.
withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Returns a transform for reading TFRecord files that decompresses all input files using the specified compression type.
withCompressionType(TFRecordIO.CompressionType) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes to output files using the specified compression type.
withCompressionType(XmlIO.Read.CompressionType) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Decompresses all input files using the specified compression type.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
Returns a new HadoopInputFormatIO.Read that will read from the source using the options provided by the given configuration.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns a new HBaseIO.Read that will read from the HBase instance indicated by the given configuration.
withConfiguration(Configuration) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
Returns a new HBaseIO.Write that will write to the HBase instance indicated by the given Configuration, and using any other specified customizations.
withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide the Elasticsearch connection configuration object.
withConnectionConfiguration(ElasticsearchIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
Provide the Elasticsearch connection configuration object.
withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
Define the MQTT connection configuration used to connect to the MQTT broker.
withConnectionConfiguration(MqttIO.ConnectionConfiguration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
Define MQTT connection configuration used to connect to the MQTT broker.
withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Specify the JMS connection factory to connect to the JMS broker.
withConnectionFactory(ConnectionFactory) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS connection factory to connect to the JMS broker.
withConnectionProperties(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
Sets the connection properties passed to driver.connect(...).
withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A factory to create Kafka Consumer from consumer configuration.
withCreateDisposition(BigQueryIO.Write.CreateDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies whether the table should be created if it does not exist.
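A minimal sketch combining the create and write dispositions on a BigQuery sink, assuming an existing PCollection<TableRow> named rows; the table spec and schema fields are placeholders:

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;

    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("word").setType("STRING"),
        new TableFieldSchema().setName("count").setType("INTEGER")));

    rows.apply(BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.word_counts")                   // placeholder table spec
        .withSchema(schema)
        .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED) // create the table if missing
        .withWriteDisposition(WriteDisposition.WRITE_APPEND));     // append to existing data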
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
withDatabase(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withDataSourceConfiguration(JdbcIO.DataSourceConfiguration) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
withDecompression(CompressedSource.DecompressingChannelFactory) - Method in class org.apache.beam.sdk.io.CompressedSource
Return a CompressedSource that is like this one but will decompress its underlying file with the given CompressedSource.DecompressingChannelFactory.
withDefaultValue(T) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
Default value to return for windows with no value in them.
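A minimal sketch of a singleton side-input view with a default, assuming an existing PCollection<Long> named maybeEmpty that holds at most one element per window:

    import org.apache.beam.sdk.transforms.View;
    import org.apache.beam.sdk.values.PCollectionView;

    // Windows containing no element resolve to 0L when the view is read as a side input.
    PCollectionView<Long> view = maybeEmpty.apply(
        View.<Long>asSingleton().withDefaultValue(0L));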
withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
withEarlyFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
Creates a new Trigger like this one, except that it fires repeatedly whenever the given Trigger fires before the watermark has passed the end of the window.
withEndKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns new ByteKeyRange like this one, but with the specified end key.
withEpsilon(double) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Returns an ApproximateQuantilesCombineFn that's like this one except that it uses the specified epsilon value.
withExtensionsFrom(Iterable<Class<?>>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
Returns a ProtoCoder like this one, but with the extensions from the given classes registered.
withExtensionsFrom(Class<?>...) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
withFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but that uses an intermediate node to combine parts of the data to reduce load on the final global combine step.
withFilename(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withFilenamePolicy(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Configures the FileBasedSink.FilenamePolicy that will be used to name written files.
withFilenamePolicy(FileBasedSink.FilenamePolicy) - Method in class org.apache.beam.sdk.io.TextIO.Write
Configures the FileBasedSink.FilenamePolicy that will be used to name written files.
withFilter(Filter) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns a new HBaseIO.Read that will filter the rows read from HBase using the given row filter.
withFilter(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withFilter(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
withFn(DoFn<InputT, OutputT>) - Method in class org.apache.beam.runners.dataflow.util.DoFnInfo
 
withFooter(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
Adds a footer string to each file.
withFormatFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Formats the user's type into a TableRow to be written to BigQuery.
withGapDuration(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.Sessions
Creates a Sessions WindowFn with the specified gap duration.
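A minimal sketch of session windowing, assuming an existing PCollection<KV<String, String>> named clicks keyed by user id; the 10-minute gap is a placeholder:

    import org.apache.beam.sdk.transforms.windowing.Sessions;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    PCollection<KV<String, String>> sessioned = clicks.apply(
        Window.<KV<String, String>>into(Sessions.withGapDuration(Duration.standardMinutes(10))));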
withHeader(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
Adds a header string to each file.
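A minimal sketch of a text sink using a header, a footer, and a fixed shard count, assuming an existing PCollection<String> named lines; the output prefix is a placeholder:

    import org.apache.beam.sdk.io.TextIO;

    lines.apply(TextIO.write()
        .to("gs://my-bucket/output/results") // placeholder output prefix
        .withHeader("word,count")
        .withFooter("-- end --")
        .withNumShards(1));                  // a single output shard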
withHotKeyFanout(SerializableFunction<? super K, Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
If a single key has disproportionately many values, it may become a bottleneck, especially in streaming mode.
withHotKeyFanout(int) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Like Combine.PerKey.withHotKeyFanout(SerializableFunction), but returning the given constant value for every key.
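A minimal sketch of hot-key fanout on a per-key combine, assuming an existing PCollection<KV<String, Long>> named pairs in which a few keys carry most of the values; the fanout of 16 is a placeholder:

    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Pre-combines each hot key across 16 intermediate keys before the final per-key combine.
    PCollection<KV<String, Long>> sums = pairs.apply(
        Sum.<String>longsPerKey().withHotKeyFanout(16));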
withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub, adding each record's unique identifier to the published messages in an attribute with the specified name.
withJsonSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Similar to BigQueryIO.Write.withSchema(TableSchema) but takes in a JSON-serialized TableSchema.
withJsonSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer to interpret key bytes read from Kafka.
withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer for interpreting key bytes read from Kafka along with a Coder for helping the Beam runner materialize key objects at runtime if necessary.
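A minimal sketch of a Kafka source using these deserializer setters, assuming an existing Pipeline p and KafkaIO's withTopic and withoutMetadata setters; the broker address and topic name are placeholders, and the deserializer classes come from the Kafka client library:

    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.kafka.common.serialization.StringDeserializer;

    PCollection<KV<String, String>> records = p.apply(
        KafkaIO.<String, String>read()
            .withBootstrapServers("broker-1:9092")          // placeholder broker
            .withTopic("beam-input")                        // placeholder topic
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata());                            // yields KV pairs instead of KafkaRecords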
withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read only rows in the specified range.
withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns a new HBaseIO.Read that will read only rows in the specified range.
withKeyRange(byte[], byte[]) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns a new HBaseIO.Read that will read only rows in the specified range.
WithKeys<K,V> - Class in org.apache.beam.sdk.transforms
WithKeys<K, V> takes a PCollection<V>, and either a constant key of type K or a function from V to K, and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with either the constant key or a key computed from the value.
withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Sets a Serializer for serializing key (if any) to bytes.
withKeyTranslation(SimpleFunction<?, K>) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
Returns a new HadoopInputFormatIO.Read that will transform the keys read from the source using the given key translation function.
withKeyType(TypeDescriptor<K>) - Method in class org.apache.beam.sdk.transforms.WithKeys
Return a WithKeys that is like this one with the specified key type descriptor.
withLabel(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
Set the item label.
withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
 
withLateFirings(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
Creates a new Trigger like this one, except that it fires repeatedly whenever the given Trigger fires after the watermark has passed the end of the window.
withLinkUrl(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
Set the item link url.
withLiteralGqlQuery(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads the results of the specified GQL query.
withLiteralGqlQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from a Datastore Emulator running at the given localhost address.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore Emulator running locally on the specified host port.
withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
Use a custom Jackson ObjectMapper instead of the default one.
withMapper(ObjectMapper) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
Use a custom Jackson ObjectMapper instead of the default one.
withMaxBatchSize(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
Provide a maximum batch size in number of documents; see the Elasticsearch bulk API (https://www.elastic.co/guide/en/elasticsearch/reference/2.4/docs-bulk.html).
withMaxBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
Provide a maximum batch size in bytes; see the Elasticsearch bulk API (https://www.elastic.co/guide/en/elasticsearch/reference/2.4/docs-bulk.html).
withMaxInputSize(long) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
Returns an ApproximateQuantilesCombineFn that's like this one except that it uses the specified maxNumElements value.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the max number of records that the source will read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
withMaxNumRecords(int) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Specifies to read at most a given number of records.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
Define the max number of records received by the MqttIO.Read.
withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.Read.Unbounded
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies to stop generating elements after the given time.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the max read time that the source will read.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
Specifies to read records for at most the given duration.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
Define the max read time (duration) while the MqttIO.Read will receive messages.
withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.Read.Unbounded
Returns a new BoundedReadFromUnboundedSource that reads a bounded amount of data from the given UnboundedSource.
withMemoryMB(int) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Sets the size of the memory buffer in megabytes.
withMetadata(Map<String, Object>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Writes to Avro file(s) with the specified metadata.
WithMetricsSupport - Class in org.apache.beam.runners.spark.metrics
A decorator-like MetricRegistry that supports AggregatorMetric and SparkBeamMetric as Gauges.
withMinBundleSize(long) - Method in class org.apache.beam.sdk.io.AvroSource
Returns an AvroSource that's like this one but uses the supplied minimum bundle size.
withMinBundleSize(long) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets a parameter minBundleSize for the minimum bundle size of the source.
withMode(WindowingStrategy.AccumulationMode) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the accumulation mode set to mode.
withNamespace(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the given namespace.
withNamespace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
withNamespace(Class<?>) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
Set the item namespace from the given Class.
withNumQuerySplits(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads by splitting the given query into numQuerySplits.
withNumShards(int) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Configures the number of output shards produced overall (when using unwindowed writes) or per-window (when using windowed writes).
withNumShards(int) - Method in class org.apache.beam.sdk.io.TextIO.Write
Configures the number of output shards produced overall (when using unwindowed writes) or per-window (when using windowed writes).
withNumShards(int) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes to the provided number of shards.
withNumShards(int) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink using the specified number of shards.
withNumShards(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink using the ValueProvider specified number of shards.
withNumSplits(int) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
 
withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
Partitions the timestamp space into half-open intervals of the form [N * size + offset, (N + 1) * size + offset), where 0 is the epoch.
withOffset(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
Assigns timestamps into half-open intervals of the form [N * period + offset, N * period + offset + size).
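As a concrete illustration of the offset parameter, the following sketch (the element type MyEvent and input collection events are assumptions) produces hourly fixed windows that start 15 minutes past each hour:

  // Windows of the form [h:15, (h+1):15) for every hour h.
  events.apply(
      Window.<MyEvent>into(
          FixedWindows.of(Duration.standardHours(1))
              .withOffset(Duration.standardMinutes(15))));
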
withoutDefaults() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but that does not attempt to provide a default value in the case of empty input.
withoutMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Returns a PTransform for a PCollection of KV, dropping the Kafka metadata.
withOutputTags(TupleTag<OutputT>, TupleTagList) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new multi-output ParDo PTransform that's like this PTransform but with the specified output tags.
withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
withoutSharding() - Method in class org.apache.beam.sdk.io.AvroIO.Write
Forces a single file as output and empty shard name template.
withoutSharding() - Method in class org.apache.beam.sdk.io.TextIO.Write
Forces a single file as output and empty shard name template.
withoutSharding() - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Forces a single file as output.
withoutStrictParsing() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
During parsing of the arguments, we will skip over improperly formatted and unknown arguments.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Disable validation that the table exists or the query succeeds prior to pipeline submission.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Disables BigQuery table validation.
withoutValidation() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
Returns a transform for reading TFRecord files that has GCS path validation on pipeline creation disabled.
withParser(MongoDbGridFSIO.Parser<X>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withPassword(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch authentication is enabled, provide the password.
withPassword(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the password to connect to the JMS broker (authenticated).
withPassword(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Define the password to connect to the JMS broker (authenticated).
withPassword(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
withPreparedStatementSetter(JdbcIO.PreparedStatementSetter<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Sets a custom function to create Kafka producer.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
withQuery(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide a query used while reading from Elasticsearch.
withQuery(Query) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads the results of the specified query.
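Taken together, the DatastoreV1.Read setters are chained on DatastoreIO.v1().read(). A sketch, assuming a placeholder project ID and a query over a hypothetical "Task" kind:

  Query query =
      Query.newBuilder()
          .addKind(KindExpression.newBuilder().setName("Task"))
          .build();

  PCollection<Entity> tasks =
      p.apply(DatastoreIO.v1().read()
          .withProjectId("my-project")   // placeholder project ID
          .withQuery(query));
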
withQuery(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Specify the JMS queue destination name where to read messages from.
withQueue(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS queue destination name where to send messages to.
withRate(long, Duration) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies to generate at most a given number of elements per given period.
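The rate cap combines naturally with the read-time cap listed earlier; a small sketch that emits at most 10 elements per second and stops after five minutes:

  PCollection<Long> ticks =
      p.apply(GenerateSequence.from(0)
          .withRate(10, Duration.standardSeconds(1))
          .withMaxReadTime(Duration.standardMinutes(5)));
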
withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets a JAXB annotated class that can be populated using a record of the provided XML file.
withRecordClass(Class<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Writes objects of the given class mapped to XML elements using JAXB.
withRecordElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets name of the record element of the XML document.
withRepresentativeType(TypeDescriptor<IdT>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
Return a WithRepresentativeValues PTransform that is like this one, but with the specified output type descriptor.
withRepresentativeValueFn(SerializableFunction<T, IdT>) - Static method in class org.apache.beam.sdk.transforms.Distinct
Returns a Distinct<T, IdT> PTransform.
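A sketch of deduplicating by a representative key, assuming a hypothetical MyRecord type with a String-valued getId() accessor:

  PCollection<MyRecord> deduped =
      records.apply(
          Distinct.withRepresentativeValueFn((MyRecord r) -> r.getId())
              .withRepresentativeType(TypeDescriptor.of(String.class)));
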
withRetained(boolean) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
Whether or not the published message should be retained by the messaging engine.
withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets name of the root element of the XML document.
withRootElement(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
Sets the enclosing root element for the generated XML files.
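The XmlIO.Read setters above are typically chained as follows; the file path, element names, and the JAXB-annotated Record class are assumptions, and from(...) comes from the wider XmlIO.Read API:

  PCollection<Record> records =
      p.apply(XmlIO.<Record>read()
          .from("/path/to/input.xml")
          .withRootElement("records")
          .withRecordElement("record")
          .withRecordClass(Record.class));
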
withRowFilter(RowFilter) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will filter the rows read from Cloud Bigtable using the given row filter.
withRowMapper(JdbcIO.RowMapper<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
 
withRunnerDeterminedSharding() - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink with runner-determined sharding.
withScan(Scan) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns a new HBaseIO.Read that will filter the rows read from HBase using the given scan.
withSchema(String) - Method in class org.apache.beam.sdk.io.AvroSource
Returns an AvroSource that's like this one but reads files containing records that conform to the given schema.
withSchema(Schema) - Method in class org.apache.beam.sdk.io.AvroSource
Returns an AvroSource that's like this one but reads files containing records that conform to the given schema.
withSchema(Class<X>) - Method in class org.apache.beam.sdk.io.AvroSource
Returns an AvroSource that's like this one but reads files containing records of the type of the given class.
withSchema(TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Uses the specified schema for rows to be written.
withSchema(ValueProvider<TableSchema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
withSchemaFromView(PCollectionView<Map<String, String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows the schemas for each table to be computed within the pipeline itself.
withScrollKeepalive(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
Provide a scroll keepalive.
withShard(int) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
 
withSharding(PTransform<PCollection<T>, PCollectionView<Integer>>) - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that will write to the current FileBasedSink using the specified PTransform to compute the number of shards.
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Uses the given ShardNameTemplate for naming output files.
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
Uses the given ShardNameTemplate for naming output files.
withShardNameTemplate(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Uses the given shard name template.
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
 
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
Returns a PTransform identical to this, but with the specified side inputs to use in CombineWithContext.CombineFnWithContext.
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
Returns a new multi-output ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
Returns a new multi-output ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(PCollectionView<?>...) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new ParDo PTransform that's like this PTransform but with the specified additional side inputs.
withSideInputs(Iterable<? extends PCollectionView<?>>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
Returns a new ParDo PTransform that's like this PTransform but with the specified additional side inputs.
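A minimal side-input sketch, assuming an input PCollection<String> named words and a PCollection<Integer> named limits holding a single length limit:

  PCollectionView<Integer> maxLength = limits.apply(View.<Integer>asSingleton());

  PCollection<String> trimmed =
      words.apply(
          ParDo.of(new DoFn<String, String>() {
                @ProcessElement
                public void process(ProcessContext c) {
                  int max = c.sideInput(maxLength);   // read the side input
                  String w = c.element();
                  c.output(w.length() > max ? w.substring(0, max) : w);
                }
              })
              .withSideInputs(maxLength));            // register the view
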
withSingletonValues() - Method in class org.apache.beam.sdk.transforms.View.AsMap
Deprecated.
this method simply returns this AsMap unmodified
withSkew(Duration) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withStartingDay(int, int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
withStartingMonth(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
withStartingYear(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
withStartKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
Returns new ByteKeyRange like this one, but with the specified start key.
withStatement(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
withStatementPreparator(JdbcIO.StatementPreparator) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
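A sketch of a complete JdbcIO.Read configuration; the driver class, JDBC URL, credentials, and query are placeholder assumptions, and withCoder comes from the wider JdbcIO.Read API:

  PCollection<KV<Integer, String>> rows =
      p.apply(JdbcIO.<KV<Integer, String>>read()
          .withDataSourceConfiguration(
              JdbcIO.DataSourceConfiguration.create(
                      "org.postgresql.Driver", "jdbc:postgresql://localhost/mydb")
                  .withUsername("user")
                  .withPassword("secret"))
          .withQuery("select id, name from users")
          .withCoder(KvCoder.of(VarIntCoder.of(), StringUtf8Coder.of()))
          .withRowMapper(resultSet ->
              KV.of(resultSet.getInt("id"), resultSet.getString("name"))));
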
 
withSuffix(String) - Method in class org.apache.beam.sdk.io.AvroIO.Write
Configures the filename suffix for written files.
withSuffix(String) - Method in class org.apache.beam.sdk.io.TextIO.Write
Configures the filename suffix for written files.
withSuffix(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
Writes to the file(s) with the given filename suffix.
withTableDescription(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies the table description.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write to the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
Returns a new HBaseIO.Read that will read from the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
Returns a new HBaseIO.Write that will write to the specified table.
withTempLocation(String) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
Sets the path to a temporary location where the sorter writes intermediate files.
withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub and adds each record's timestamp to the published messages in an attribute with the specified name.
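A sketch of carrying event timestamps through a Pub/Sub attribute; the project/topic paths and the attribute name "ts" are placeholder assumptions:

  PCollection<String> messages =
      p.apply(PubsubIO.readStrings()
          .fromTopic("projects/my-project/topics/in")
          .withTimestampAttribute("ts"));

  messages.apply(PubsubIO.writeStrings()
      .to("projects/my-project/topics/out")
      .withTimestampAttribute("ts"));
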
withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.transforms.windowing.Window
(Experimental) Override the default TimestampCombiner, to control the output timestamp of values output from a GroupByKey operation.
withTimestampCombiner(TimestampCombiner) - Method in class org.apache.beam.sdk.values.WindowingStrategy
 
withTimestampFn(SerializableFunction<Long, Instant>) - Method in class org.apache.beam.sdk.io.GenerateSequence
Specifies the function to use to assign timestamps to the elements.
withTimestampFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A function to assign a timestamp to a record.
withTimestampFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A function to assign a timestamp to a record.
WithTimestamps<T> - Class in org.apache.beam.sdk.transforms
A PTransform for assigning timestamps to all the elements of a PCollection.
withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
 
withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
 
withTimeZone(DateTimeZone) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
 
withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Specify the JMS topic destination name where to receive messages from.
withTopic(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Specify the JMS topic destination name where to send messages to.
withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets the topic to read from.
withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Sets the Kafka topic to write to.
withTopicPartitions(List<TopicPartition>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a list of partitions to read from.
withTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a list of topics to read from.
withTrigger(Trigger) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the trigger set to wildcardTrigger.
withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
Returns a Create.TimestampedValues PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
withType(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.transforms.Create.Values
Returns a Create.Values PTransform like this one that uses the given TypeDescriptor<T> to determine the Coder to use to decode each of the objects into a value of type T.
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
 
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
Sets the connection URI used to connect to MongoDB.
withUri(String) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
withUsername(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
If Elasticsearch authentication is enabled, provide the username.
withUsername(String) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
 
withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
Define the username to connect to the JMS broker (authenticated).
withUsername(String) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
Define the username to connect to the JMS broker (authenticated).
withUsername(String) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
 
withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
After creation we will validate that PipelineOptions conforms to all the validation criteria from <T>.
withValidation() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory
After creation we will validate that <T> conforms to all the validation criteria.
withValidationEventHandler(ValidationEventHandler) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
Sets the ValidationEventHandler to use with JAXB.
withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
Sets a Kafka Deserializer for interpreting value bytes read from Kafka along with a Coder for helping the Beam runner materialize value objects at runtime if necessary.
withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
Sets a Serializer for serializing value to bytes.
withValueTranslation(SimpleFunction<?, V>) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
Returns a new HadoopInputFormatIO.Read that will transform the values read from the source using the given value translation function.
withWatermarkFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A function to calculate watermark after a record.
withWatermarkFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
A function to calculate watermark after a record.
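Pulling the KafkaIO.Read entries together, a typical configuration looks roughly like this; the broker address and topic name are assumptions, and withBootstrapServers/withKeyDeserializer come from the wider KafkaIO.Read API:

  PCollection<KV<Long, String>> records =
      p.apply(KafkaIO.<Long, String>read()
          .withBootstrapServers("broker-1:9092")
          .withTopic("events")
          .withKeyDeserializer(LongDeserializer.class)
          .withValueDeserializer(StringDeserializer.class)
          .withoutMetadata());   // drop Kafka metadata, keeping KV<Long, String>
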
withWindowedWrites() - Method in class org.apache.beam.sdk.io.AvroIO.Write
Preserves windowing of input elements and writes them to files based on the element's window.
withWindowedWrites() - Method in class org.apache.beam.sdk.io.TextIO.Write
 
withWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
Returns a new WriteFiles that preserves windowing on its input.
withWindowFn(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.values.WindowingStrategy
Returns a WindowingStrategy identical to this but with the window function set to wildcardWindowFn.
withWritableByteChannelFactory(FileBasedSink.WritableByteChannelFactory) - Method in class org.apache.beam.sdk.io.TextIO.Write
Returns a transform for writing to text files like this one but that has the given FileBasedSink.WritableByteChannelFactory to be used by the FileBasedSink during output.
withWriteDisposition(BigQueryIO.Write.WriteDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies what to do with existing data in the table, in case the table already exists.
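Combining these BigQueryIO.Write setters with writeTableRows() (listed later in this index), a write might be sketched as follows; the table spec and field names are placeholder assumptions:

  TableSchema schema = new TableSchema().setFields(Arrays.asList(
      new TableFieldSchema().setName("word").setType("STRING"),
      new TableFieldSchema().setName("count").setType("INTEGER")));

  rows.apply(BigQueryIO.writeTableRows()
      .to("my-project:my_dataset.my_table")
      .withSchema(schema)
      .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
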
WorkerHarnessContainerImageFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory
 
WorkerLogLevelOverrides() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
 
wrap(Throwable) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
wrap(String) - Method in class org.apache.beam.sdk.testing.PAssert.PAssertionSite
 
WritableCoder<T extends org.apache.hadoop.io.Writable> - Class in org.apache.beam.sdk.io.hadoop
A WritableCoder is a Coder for a Java class that implements Writable.
WritableCoder(Class<T>) - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder
 
WritableCoder.WritableCoderProviderRegistrar - Class in org.apache.beam.sdk.io.hadoop
A CoderProviderRegistrar which registers a CoderProvider which can handle writable types.
WritableCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
 
write(Class<T>) - Static method in class org.apache.beam.sdk.io.AvroIO
Writes a PCollection to an Avro file (or multiple Avro files matching a sharding pattern).
Write() - Constructor for class org.apache.beam.sdk.io.AvroIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
 
Write() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
 
write(T) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Called for each value in the bundle.
write() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection to a BigQuery table.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.Write.
write() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.Write builder.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.hbase.HBaseIO
Creates an uninitialized HBaseIO.Write.
write() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO
Write data to a JDBC datasource.
Write() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.jms.JmsIO
 
Write() - Constructor for class org.apache.beam.sdk.io.jms.JmsIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
Creates an uninitialized KafkaIO.Write PTransform.
Write() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
Write data to GridFS.
write(MongoDbGridFSIO.WriteFn<T>) - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO
 
Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
 
write(T, OutputStream) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.WriteFn
Output the object to the given OutputStream.
write() - Static method in class org.apache.beam.sdk.io.mongodb.MongoDbIO
Write data to MongoDB.
Write() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO
 
Write() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.TextIO
A PTransform that writes a PCollection to a text file (or multiple text files matching a sharding pattern), with each element of the input collection encoded into its own line.
Write() - Constructor for class org.apache.beam.sdk.io.TextIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.TFRecordIO
A PTransform that writes a PCollection to TFRecord file (or multiple TFRecord files matching a sharding pattern), with each element of the input collection encoded into its own record.
Write() - Constructor for class org.apache.beam.sdk.io.TFRecordIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.xml.XmlIO
A FileBasedSink that outputs records as XML-formatted elements.
Write() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO.Write
 
write(T) - Method in interface org.apache.beam.sdk.state.ValueState
Set the value.
WRITE_CODER - Static variable in class org.apache.beam.sdk.io.hbase.HBaseIO
 
writeAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
writeExternal(ObjectOutput) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.SerializableConfiguration
 
writeExternal(ObjectOutput) - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
 
WriteFiles<T> - Class in org.apache.beam.sdk.io
A PTransform that writes to a FileBasedSink.
writeFooter() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Writes footer at the end of output files.
writeGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.AvroIO
Writes Avro records of the specified schema.
writeGenericRecords(String) - Static method in class org.apache.beam.sdk.io.AvroIO
Writes Avro records of the specified schema.
writeHeader() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
Writes header at the beginning of output files.
writeMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes to a Google Cloud Pub/Sub stream.
WriteOperation(FileBasedSink<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Constructs a WriteOperation using the default strategy for generating a temporary directory from the base output filename.
WriteOperation(FileBasedSink<T>, ResourceId) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
Create a new WriteOperation.
writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
Writer(FileBasedSink.WriteOperation<T>, String) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.Writer
Construct a new FileBasedSink.Writer that will produce files of the given MIME type.
WriteResult - Class in org.apache.beam.sdk.io.gcp.bigquery
The result of a BigQueryIO.Write transform.
writeStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes UTF-8 encoded strings to a Google Cloud Pub/Sub stream.
writeTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection containing TableRows to a BigQuery table.
writeTo(OutputStream, int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
Writes length bytes starting at offset from the backing data store to the specified output stream.

X

XmlIO - Class in org.apache.beam.sdk.io.xml
Transforms for reading and writing XML files using JAXB mappers.
XmlIO() - Constructor for class org.apache.beam.sdk.io.xml.XmlIO
 
XmlIO.Read<T> - Class in org.apache.beam.sdk.io.xml
Implementation of XmlIO.read().
XmlIO.Read.CompressionType - Enum in org.apache.beam.sdk.io.xml
Strategy for determining the compression type of XML files being read.
XmlIO.Write<T> - Class in org.apache.beam.sdk.io.xml
Implementation of XmlIO.write().
XmlSource<T> - Class in org.apache.beam.sdk.io.xml
Implementation of XmlIO.read().

Y

years(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
Returns a WindowFn that windows elements into periods measured by years.

Z

zero(NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.AggAccumParam
 
ZERO - Static variable in class org.apache.beam.sdk.metrics.DistributionResult
 