- absolute(String, String...) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Construct a path from an absolute component path hierarchy.
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCaseExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCastExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
-
Assertion to make sure the input and output are supported in this expression.
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlInputRefExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlReinterpretExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUdfExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowEndExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowStartExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
-
Compare operation must have 2 operands.
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNotNullExpression
-
Only one operand is required.
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNullExpression
-
Only one operand is required.
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentDateExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimeExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimestampExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDateCeilExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDateFloorExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlExtractExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlLogicalExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlNotExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathUnaryExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPiExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandIntegerExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlConcatExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlOverlayExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlPositionExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlStringUnaryExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlSubstringExpression
-
- accept() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlTrimExpression
-
- AccumulatingCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
- accumulatingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new Window PTransform that uses the registered WindowFn and Triggering behavior, and that accumulates elements in a pane after they are triggered.
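Example (a minimal sketch of accumulating mode, assuming an existing PCollection<String> named input, joda-time Duration, and the usual windowing imports; the trigger choice is illustrative):
    // Re-window into 1-minute fixed windows; fired panes keep (accumulate)
    // previously emitted elements instead of discarding them.
    PCollection<String> accumulated = input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
            .triggering(AfterProcessingTime.pastFirstElementInPane()
                .plusDelayOf(Duration.standardSeconds(30)))
            .withAllowedLateness(Duration.standardMinutes(5))
            .accumulatingFiredPanes());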
- ACCUMULATOR_NAME - Static variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
-
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
-
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
-
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
-
- add(int, GlobalWatermarkHolder.SparkWatermarks) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
-
- add(InputT) - Method in interface org.apache.beam.sdk.state.GroupingState
-
Add a value to the buffer.
- add(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
-
For internal use only: no backwards compatibility guarantees.
- add(Long) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Adds a value to the heap, returning whether the value is (large enough
to be) in the heap.
- add(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item.
- addAccum(AccumT) - Method in interface org.apache.beam.sdk.state.CombiningState
-
Add an accumulator to this state cell.
- addAccumulator(NamedAggregators, NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.AggAccumParam
-
- addAll(Map<Integer, Queue<GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- addBoolean(Map<String, Object>, String, boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addCollectionToSingletonOutput(PCollection<?>, PCollectionView<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an output to this CollectionToSingleton Dataflow step, consuming the specified input PValue and producing the specified output PValue.
- addDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addDouble(Map<String, Object>, String, Double) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addElements(T, T...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with timestamp equal to the current watermark.
- addElements(TimestampedValue<T>, TimestampedValue<T>...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with the provided timestamps.
- addEncodingInput(Coder<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Sets the encoding for this Dataflow step.
- addIfAbsent(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Ensures a value is a member of the set, returning true if it was added and false otherwise.
- addIfNotDefault(DisplayData.ItemSpec<T>, T) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is different than the specified default.
- addIfNotNull(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is not null.
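Example (a hedged sketch of registering display items from populateDisplayData inside a PTransform or DoFn subclass; fooOption, barOption, and numShards are hypothetical fields of the enclosing class):
    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      builder
          .add(DisplayData.item("fooOption", fooOption))                  // always registered
          .addIfNotNull(DisplayData.item("barOption", barOption))         // skipped when null
          .addIfNotDefault(DisplayData.item("numShards", numShards), 0);  // skipped when equal to 0
    }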
- addInPlace(NamedAggregators, NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.AggAccumParam
-
- addInput(String, Boolean) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, Long) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, PInput) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name to this Dataflow step, coming from the specified input PValue.
- addInput(String, Map<String, Object>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a dictionary of strings to objects.
- addInput(String, List<? extends Map<String, Object>>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a list of objects.
- addInput(BeamAggregationTransforms.AggregationAccumulator, BeamRecord) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
-
- addInput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique, T) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
- addInput(InputT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Adds the given input value to this accumulator, modifying
this accumulator.
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
- addInput(double[], Double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- addInput(Combine.Holder<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- addInput(int[], Integer) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- addInput(long[], Long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Adds the given input value to the given accumulator, returning the
new accumulator value.
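Example (a hedged sketch of a custom CombineFn whose addInput folds each element into the accumulator; AverageFn and Accum are illustrative names, not SDK classes):
    public static class AverageFn extends Combine.CombineFn<Integer, AverageFn.Accum, Double> {
      // Accum is Serializable so a coder can be inferred for the accumulator.
      public static class Accum implements java.io.Serializable {
        long sum = 0;
        long count = 0;
      }
      @Override public Accum createAccumulator() { return new Accum(); }
      @Override public Accum addInput(Accum accum, Integer input) {
        accum.sum += input;
        accum.count++;
        return accum;  // the (possibly mutated) accumulator is returned
      }
      @Override public Accum mergeAccumulators(Iterable<Accum> accums) {
        Accum merged = new Accum();
        for (Accum a : accums) { merged.sum += a.sum; merged.count += a.count; }
        return merged;
      }
      @Override public Double extractOutput(Accum accum) {
        return accum.count == 0 ? 0.0 : ((double) accum.sum) / accum.count;
      }
    }
    // Usage: PCollection<Double> mean = ints.apply(Combine.globally(new AverageFn()));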
- addInput(List<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- addInput(Object[], DataT) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- addInput(Object[], DataT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- addInput(AccumT, InputT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Adds the given input value to the given accumulator, returning the
new accumulator value.
- addInput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>, T) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
- addList(Map<String, Object>, String, List<? extends Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addList(Map<String, Object>, String, T[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addLong(Map<String, Object>, String, long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addLongs(Map<String, Object>, String, long...) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addMessage(Message) - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
-
- addNameFilter(MetricNameFilter) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
- addNull(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addOutput(PCollection<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds a primitive output to this Dataflow step, producing the specified output PValue, including its Coder if a TypedPValue.
- addOverrideForClass(Class<?>, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Overrides the default log level for the passed in class.
- addOverrideForName(String, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Overrides the default log level for the passed in name.
- addOverrideForPackage(Package, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Overrides the default log level for the passed in package.
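Example (a hedged sketch of overriding Dataflow worker log levels; MyDoFn is a hypothetical user class, and the setter name setWorkerLogLevelOverrides is assumed from the corresponding getter/setter pair on DataflowWorkerLoggingOptions):
    DataflowWorkerLoggingOptions loggingOptions =
        options.as(DataflowWorkerLoggingOptions.class);
    loggingOptions.setWorkerLogLevelOverrides(
        new DataflowWorkerLoggingOptions.WorkerLogLevelOverrides()
            // Verbose logging only for one user class, quieter IO package.
            .addOverrideForClass(MyDoFn.class, DataflowWorkerLoggingOptions.Level.DEBUG)
            .addOverrideForPackage(Package.getPackage("org.apache.beam.sdk.io"),
                DataflowWorkerLoggingOptions.Level.WARN));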
- addProperties(Configuration, Properties) - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
-
Transfer the properties to the configuration object.
- addStep(PTransform<?, ?>, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Adds a step to the Dataflow workflow for the given transform, with the given Dataflow step
type.
- addStep(String) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
Add a step filter.
- addString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addStringList(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addTraceFor(AbstractGoogleClient, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
- addTraceFor(AbstractGoogleClientRequest<?>, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
- advance() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- advance(String) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
Advances the watermarks to the next-in-line watermarks.
- advance() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- advance() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- advance() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Advances the reader to the next valid record.
- advance() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Advances the reader to the next valid record.
- advanceBy(Duration) - Static method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
-
For internal use only: no backwards compatibility guarantees.
- advanceImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
- advanceImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Advances to the next record and returns true, or returns false if there is no next record.
- advanceNextBatchWatermarkToInfinity() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch to the end-of-time.
- advanceProcessingTime(Duration) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the processing time by the specified amount.
- advanceTo(Instant) - Static method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
-
For internal use only: no backwards compatibility guarantees.
- advanceWatermark() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Advances the watermark.
- advanceWatermarkForNextBatch(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch.
- advanceWatermarkTo(Instant) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark of this source to the specified instant.
- advanceWatermarkToInfinity() - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark to infinity, completing this TestStream.
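Example (a minimal sketch of building a TestStream of Strings with the builder methods above; p is an existing TestPipeline):
    TestStream<String> events = TestStream.create(StringUtf8Coder.of())
        .addElements("a", "b")                                      // stamped at the current watermark
        .advanceWatermarkTo(new Instant(0).plus(Duration.standardMinutes(1)))
        .addElements(TimestampedValue.of("late", new Instant(0)))   // explicitly timestamped element
        .advanceProcessingTime(Duration.standardMinutes(5))
        .advanceWatermarkToInfinity();
    PCollection<String> streamed = p.apply(events);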
- AfterAll - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that fires when all of its sub-triggers are ready.
- AfterEach - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that executes its sub-triggers in order.
- AfterFirst - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that fires once after at least one of its sub-triggers has fired.
- AfterPane - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that fires at some point after a specified number of input elements have arrived.
- AfterProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that fires at a specified point in processing time, relative to when input first arrives.
- AfterSynchronizedProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
FOR INTERNAL USE ONLY.
- afterTimeSinceNewOutput(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- afterTimeSinceNewOutput(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- afterTotalOf(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- afterTotalOf(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- AfterWatermark - Class in org.apache.beam.sdk.transforms.windowing
-
AfterWatermark triggers fire based on progress of the system watermark.
- AfterWatermark.AfterWatermarkEarlyAndLate - Class in org.apache.beam.sdk.transforms.windowing
-
- AfterWatermark.FromEndOfWindow - Class in org.apache.beam.sdk.transforms.windowing
-
A watermark trigger targeted relative to the end of the window.
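Example (a hedged sketch combining the triggers listed above; assumes an existing PCollection<String> named input and joda-time Duration):
    PCollection<String> triggered = input.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
            .triggering(AfterWatermark.pastEndOfWindow()
                .withEarlyFirings(AfterProcessingTime.pastFirstElementInPane()
                    .plusDelayOf(Duration.standardMinutes(1)))
                .withLateFirings(AfterPane.elementCountAtLeast(1)))
            .withAllowedLateness(Duration.standardHours(1))
            .discardingFiredPanes());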
- AggAccumParam - Class in org.apache.beam.runners.spark.aggregators
-
Aggregator accumulator param.
- AggAccumParam() - Constructor for class org.apache.beam.runners.spark.aggregators.AggAccumParam
-
- AggregationAccumulator() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAccumulator
-
- AggregationAccumulatorCoder(List<Coder>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAccumulatorCoder
-
- AggregationAdaptor(List<AggregateCall>, BeamRecordSqlType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
-
- AggregationGroupByKeyFn(int, ImmutableBitSet) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationGroupByKeyFn
-
- AggregatorMetric - Class in org.apache.beam.runners.spark.metrics
-
- AggregatorMetricSource - Class in org.apache.beam.runners.spark.metrics
-
- AggregatorMetricSource(String, NamedAggregators) - Constructor for class org.apache.beam.runners.spark.metrics.AggregatorMetricSource
-
- AggregatorsAccumulator - Class in org.apache.beam.runners.spark.aggregators
-
For resilience, Accumulators are required to be wrapped in a Singleton.
- AggregatorsAccumulator() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
-
- AggregatorsAccumulator.AccumulatorCheckpointingSparkListener - Class in org.apache.beam.runners.spark.aggregators
-
- align(Duration) - Method in interface org.apache.beam.sdk.state.Timer
-
- alignedTo(Duration, Instant) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns timestamps to the smallest multiple of period since the offset greater than the timestamp.
- alignedTo(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns the time to be the smallest multiple of period greater than the epoch boundary (aka new Instant(0)).
- alignTo(Duration, Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- alignTo(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- AlignTo() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
-
- ALL_CONTEXTS - Static variable in class org.apache.beam.sdk.testing.CoderProperties
-
All the contexts, for use in test cases.
- ALL_KEYS - Static variable in class org.apache.beam.sdk.io.range.ByteKeyRange
-
The range of all keys, with empty start and end keys.
- allMatches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- allMatches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- AllMatches(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.AllMatches
-
- allOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Whether this reader should allow dynamic splitting of the offset ranges.
- AlwaysPassMatcher() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
-
- AlwaysPassMatcherFactory() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
-
- alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Always retry all failures.
- AmqpIO - Class in org.apache.beam.sdk.io.amqp
-
AmqpIO supports AMQP 1.0 protocol using the Apache QPid Proton-J library.
- AmqpIO.Read - Class in org.apache.beam.sdk.io.amqp
-
A PTransform to read/receive messages using AMQP 1.0 protocol.
- AmqpIO.Write - Class in org.apache.beam.sdk.io.amqp
-
A PTransform to send messages using AMQP 1.0 protocol.
- AmqpMessageCoder - Class in org.apache.beam.sdk.io.amqp
-
A coder for AMQP messages.
- AmqpMessageCoder() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
-
- AmqpMessageCoderProviderRegistrar - Class in org.apache.beam.sdk.io.amqp
-
- AmqpMessageCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
-
- and(TupleTag<V>, List<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns a new CoGbkResult based on this, with the given tag and given data added to it.
- and(TupleTag<V>, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a new KeyedPCollectionTuple<K> that is the same as this, appended with the given PCollection.
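Example (a minimal sketch of a CoGroupByKey join built from two keyed PCollections, emails and phones, both assumed to be PCollection<KV<String, String>>):
    final TupleTag<String> emailsTag = new TupleTag<String>() {};
    final TupleTag<String> phonesTag = new TupleTag<String>() {};
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(emailsTag, emails)
            .and(phonesTag, phones)
            .apply(CoGroupByKey.<String>create());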
- and(PCollection.IsBounded) - Method in enum org.apache.beam.sdk.values.PCollection.IsBounded
-
Returns the composed IsBounded property.
- and(PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- and(Iterable<PCollection<T>>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- and(TupleTag<T>, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- and(TupleTag<?>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
- and(List<TupleTag<?>>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
- any(long) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Sample#any(long) takes a PCollection<T> and a limit, and produces a new PCollection<T> containing up to limit elements of the input PCollection.
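Example (a minimal sketch keeping at most 100 arbitrary elements of an existing PCollection<String> named input):
    PCollection<String> sampled = input.apply(Sample.<String>any(100));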
- ApexPipelineOptions - Interface in org.apache.beam.runners.apex
-
Options that configure the Apex pipeline.
- ApexRunner - Class in org.apache.beam.runners.apex
-
A PipelineRunner that translates the pipeline to an Apex DAG and executes it on an Apex cluster.
- ApexRunner(ApexPipelineOptions) - Constructor for class org.apache.beam.runners.apex.ApexRunner
-
- ApexRunner.CreateApexPCollectionView<ElemT,ViewT> - Class in org.apache.beam.runners.apex
-
- ApexRunnerRegistrar - Class in org.apache.beam.runners.apex
-
- ApexRunnerRegistrar.Options - Class in org.apache.beam.runners.apex
-
- ApexRunnerRegistrar.Runner - Class in org.apache.beam.runners.apex
-
- ApexRunnerResult - Class in org.apache.beam.runners.apex
-
Result of executing a Pipeline with Apex in embedded mode.
- ApexRunnerResult(DAG, Launcher.AppHandle) - Constructor for class org.apache.beam.runners.apex.ApexRunnerResult
-
- ApexYarnLauncher - Class in org.apache.beam.runners.apex
-
Proxy to launch the YARN application through the hadoop script to run in the
pre-configured environment (class path, configuration, native libraries etc.).
- ApexYarnLauncher() - Constructor for class org.apache.beam.runners.apex.ApexYarnLauncher
-
- ApexYarnLauncher.LaunchParams - Class in org.apache.beam.runners.apex
-
Launch parameters that will be serialized and passed to the child process.
- ApexYarnLauncher.ProcessWatcher - Class in org.apache.beam.runners.apex
-
Starts a command and waits for it to complete.
- applicableTo(PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
- ApplicationNameOptions - Interface in org.apache.beam.sdk.options
-
Options that allow setting the application name.
- apply(WindowFunction.Context<T>) - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.GearpumpWindowFn
-
- apply(InputT) - Method in interface org.apache.beam.sdk.coders.DelegateCoder.CodingFunction
-
- apply(BeamRecord) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationGroupByKeyFn
-
- apply(BeamRecord) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.WindowTimestampFn
-
- apply(BeamRecord) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.ExtractJoinFields
-
- apply(KV<BeamRecord, KV<BeamRecord, BeamRecord>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.JoinParts2WholeRow
-
- apply(BeamRecord) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
-
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
-
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
-
- apply(T) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert.MatcherCheckerFn
-
- apply(Statement, Description) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
- apply(double, double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Applies the binary operation to the two operands, returning the result.
- apply(V, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Applies the binary operation to the two operands, returning the result.
- apply(int, int) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Applies the binary operation to the two operands, returning the result.
- apply(long, long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Applies the binary operation to the two operands, returning the result.
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Applies this CombineFn to a collection of input values to produce a combined output value.
- apply(Iterable<? extends InputT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Applies this CombineFnWithContext to a collection of input values to produce a combined output value.
- apply(PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- apply(String, PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Applies the given PTransform to this input KeyedPCollectionTuple and returns its OutputT.
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.SerializableFunction
-
Returns the result of invoking this function on the given input.
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
-
- apply(PrimitiveViewT) - Method in class org.apache.beam.sdk.transforms.ViewFn
-
A function to adapt a primitive view type to a desired view type.
- apply(InputT, Instant) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.PollFn
-
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
Applies the given PTransform to this PBegin, using name to identify this specific application of the transform.
- apply(PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
- apply(String, PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
Applies the given PTransform to this input PCollection, using name to identify this specific application of the transform.
- apply(PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- apply(String, PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Applies the given PTransform to this input PCollectionList, using name to identify this specific application of the transform.
- apply(PTransform<PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- apply(String, PTransform<PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- apply(Iterable<WindowedValue<T>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- apply(Iterable<WindowedValue<T>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- apply(Iterable<WindowedValue<KV<K, V>>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
Input iterable must actually be Iterable<WindowedValue<KV<K, V>>>.
- apply(Iterable<WindowedValue<KV<K, V>>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- apply(Iterable<WindowedValue<T>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- applyTransform(InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- applyTransform(String, InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- ApproximateQuantiles - Class in org.apache.beam.sdk.transforms
-
PTransforms for getting an idea of a PCollection's data distribution using approximate N-tiles (e.g. quartiles, percentiles, etc.).
- ApproximateQuantiles.ApproximateQuantilesCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
-
The ApproximateQuantilesCombineFn combiner gives an idea of the distribution of a collection of values using approximate N-tiles.
- ApproximateUnique - Class in org.apache.beam.sdk.transforms
-
PTransforms for estimating the number of distinct elements in a PCollection, or the number of distinct values associated with each key in a PCollection of KVs.
- ApproximateUnique() - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique
-
- ApproximateUnique.ApproximateUniqueCombineFn<T> - Class in org.apache.beam.sdk.transforms
-
CombineFn that computes an estimate of the number of distinct values that were combined.
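Example (a hedged sketch estimating the distinct-element count of an existing PCollection<String> named input, using a small sample size of 16):
    PCollection<Long> distinctCount = input.apply(ApproximateUnique.<String>globally(16));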
- ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique - Class in org.apache.beam.sdk.transforms
-
A heap utility class to efficiently track the largest added elements.
- ApproximateUniqueCombineFn(long, Coder<T>) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
- array() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns the backing array.
- as(Class<T>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Transforms this object into an object of type <T> saving each property that has been manipulated.
- as(Class<T>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Creates and returns an object that implements <T>.
- as(Class<T>) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements <T> using the values configured on this builder during construction.
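Example (a hedged sketch of building options from command-line args; MyOptions is a hypothetical PipelineOptions sub-interface):
    public interface MyOptions extends PipelineOptions {
      @Description("Path of the file to read from")
      String getInputFile();
      void setInputFile(String value);
    }

    MyOptions options = PipelineOptionsFactory.fromArgs(args)
        .withValidation()
        .as(MyOptions.class);
    Pipeline p = Pipeline.create(options);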
- asCloudObject(Coder<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
-
- asInputStream(int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an InputStream wrapper which supplies the portion of this backing byte buffer starting at offset and up to length bytes.
- asIterable() - Static method in class org.apache.beam.sdk.transforms.View
-
- AsJsons<InputT> - Class in org.apache.beam.sdk.extensions.jackson
-
PTransform for serializing objects to JSON Strings.
- asList() - Static method in class org.apache.beam.sdk.transforms.View
-
- asMap() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
-
- asMap() - Static method in class org.apache.beam.sdk.transforms.View
-
- asMultimap() - Static method in class org.apache.beam.sdk.transforms.View
-
- asOutputReference(PValue, AppliedPTransform<?, ?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Encode a PValue reference as an output reference.
- asOutputStream() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an output stream which writes to the backing buffer from the current position.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub
API.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Returns the string representation of this topic as a path used in the Cloud Pub/Sub
API.
- ASSERTION_ERROR - Static variable in class org.apache.beam.runners.apex.ApexRunner
-
TODO: this isn't thread safe and may cause issues when tests run in parallel. Holds the most recent assertion error that was raised while processing elements.
- assertionError() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
-
- assertSourcesEqualReferenceSource(BoundedSource<T>, List<? extends BoundedSource<T>>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Given a reference Source and a list of Sources, assert that the union of the records read from the list of sources is equal to the records read from the reference source.
- assertSplitAtFractionBehavior(BoundedSource<T>, int, double, SourceTestUtils.ExpectedSplitOutcome, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
- assertSplitAtFractionExhaustive(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that for each possible start position, BoundedSource.BoundedReader#splitAtFraction at every interesting fraction (halfway between two fractions that differ by at least one item) can be called successfully and the results are consistent if a split succeeds.
- assertSplitAtFractionFails(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that the source's reader fails to splitAtFraction(fraction) after reading numItemsToReadBeforeSplit items.
- assertSplitAtFractionSucceedsAndConsistent(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Verifies some consistency properties of BoundedSource.BoundedReader#splitAtFraction on the given source.
- assertUnstartedReaderReadsSameAsItsSource(BoundedSource.BoundedReader<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Assert that a Reader returns a Source that, when read from, produces the same records as the reader.
- assign(BoundedWindow, Instant) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
- AssignContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
-
- assignedWindows(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
- assignedWindowsWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns true if this WindowFn always assigns an element to exactly one window.
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
Returns the single window to which elements with this timestamp belong.
- assignWindows(WindowFn<Object, GlobalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
- assignWindows(WindowFn<Object, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.InvalidWindows
-
- assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
- assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
- assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- assignWindows(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Given a timestamp and element, returns the set of windows into which it
should be placed.
- asSingleton() - Static method in class org.apache.beam.sdk.transforms.View
-
- asSingletonView() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a PTransform that produces a PCollectionView whose elements are the result of combining elements per-window in the input PCollection.
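Example (a hedged sketch: the global sum becomes a PCollectionView side input; numbers is an existing PCollection<Integer>):
    final PCollectionView<Integer> totalView =
        numbers.apply(Sum.integersGlobally().asSingletonView());
    PCollection<Double> fractions = numbers.apply(
        ParDo.of(new DoFn<Integer, Double>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            // Each element divided by the combined total read from the side input.
            c.output(((double) c.element()) / c.sideInput(totalView));
          }
        }).withSideInputs(totalView));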
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
- atMinimumTimestamp(V) - Static method in class org.apache.beam.sdk.values.TimestampedValue
-
- AtomicCoder<T> - Class in org.apache.beam.sdk.coders
-
A Coder that has no component Coders or other configuration.
- AtomicCoder() - Constructor for class org.apache.beam.sdk.coders.AtomicCoder
-
- AtomicLongFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
-
- attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
- attempted() - Method in interface org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all attempts of executing all parts of the pipeline.
- autoCastField(int, Object) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
- AvailableParallelismFactory() - Constructor for class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
-
- AvroCoder<T> - Class in org.apache.beam.sdk.coders
-
A Coder using Avro binary format.
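Example (a minimal sketch; MyRecord is a hypothetical Avro-compatible POJO, and records is an existing PCollection<MyRecord>):
    @DefaultCoder(AvroCoder.class)
    public class MyRecord {
      public String name;
      public long count;
    }
    // Or set the coder explicitly on a PCollection:
    records.setCoder(AvroCoder.of(MyRecord.class));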
- AvroCoder(Class<T>, Schema) - Constructor for class org.apache.beam.sdk.coders.AvroCoder
-
- AvroIO - Class in org.apache.beam.sdk.io
-
- AvroIO.Parse<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.ParseAll<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.Read<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.ReadAll<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.TypedWrite<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- AvroIO.Write<T> - Class in org.apache.beam.sdk.io
-
- AvroReader(AvroSource<T>) - Constructor for class org.apache.beam.sdk.io.AvroSource.AvroReader
-
Reads Avro records of type T from the specified source.
- AvroSource<T> - Class in org.apache.beam.sdk.io
-
Do not use in pipelines directly: most users should use AvroIO.Read.
- AvroSource.AvroReader<T> - Class in org.apache.beam.sdk.io
-
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
-
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
-
- AWSClientsProvider - Interface in org.apache.beam.sdk.io.kinesis
-
Provides instances of AWS clients.
- BACKLOG_UNKNOWN - Static variable in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Constant representing an unknown amount of backlog.
- backlogBytes() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source backlog in bytes.
- backlogBytesOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source split backlog in bytes.
- backlogElements() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source backlog in elements.
- backlogElementsOfSplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Gauge for source split backlog in elements.
- bag() - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Create a StateSpec for a BagState, optimized for adding values frequently and occasionally retrieving all the values that have been added.
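Example (a hedged sketch of a stateful DoFn that buffers values per key in a BagState; BufferFn is an illustrative name):
    public static class BufferFn extends DoFn<KV<String, String>, String> {
      @StateId("buffer")
      private final StateSpec<BagState<String>> bufferSpec =
          StateSpecs.bag(StringUtf8Coder.of());

      @ProcessElement
      public void processElement(ProcessContext c,
          @StateId("buffer") BagState<String> buffer) {
        buffer.add(c.element().getValue());   // GroupingState.add
        for (String value : buffer.read()) {  // occasionally read everything back
          c.output(value);
        }
      }
    }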
- bag(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
- BagState<T> - Interface in org.apache.beam.sdk.state
-
- BaseBeamTable - Class in org.apache.beam.sdk.extensions.sql.impl.schema
-
Each IO in Beam has one table schema, defined by extending BaseBeamTable.
- BaseBeamTable(BeamRecordSqlType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable
-
- BatchStatefulParDoOverrides - Class in org.apache.beam.runners.dataflow
-
PTransformOverrideFactories that expand to correctly implement stateful ParDo using window-unaware BatchViewOverrides.GroupByKeyAndSortValuesOnly to linearize processing per key.
- BatchStatefulParDoOverrides() - Constructor for class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides
-
- BatchStatefulParDoOverrides.BatchStatefulDoFn<K,V,OutputT> - Class in org.apache.beam.runners.dataflow
-
A key-preserving DoFn that explodes an iterable that has been grouped by key and window.
- BEAM_REL_DATATYPE_SYSTEM - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
-
- BeamAggregationRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
- BeamAggregationRel(RelOptCluster, RelTraitSet, RelNode, boolean, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>, WindowFn, Trigger, int, Duration) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
-
- BeamAggregationRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
Rule to detect the window/trigger settings.
- BeamAggregationRule(Class<? extends Aggregate>, Class<? extends Project>, RelBuilderFactory) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
-
- BeamAggregationRule(RelOptRuleOperand, String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
-
- BeamAggregationTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Collections of PTransform and DoFn used to perform the GROUP-BY operation.
- BeamAggregationTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms
-
- BeamAggregationTransforms.AggregationAccumulator - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
A class to hold varied accumulator objects.
- BeamAggregationTransforms.AggregationAccumulatorCoder - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
- BeamAggregationTransforms.AggregationAdaptor - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
An adaptor class to invoke Calcite UDAF instances in Beam CombineFn.
- BeamAggregationTransforms.AggregationGroupByKeyFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Extracts group-by fields.
- BeamAggregationTransforms.MergeAggregationRecord - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Merge KV to single record.
- BeamAggregationTransforms.WindowTimestampFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Assign event timestamp.
- BeamFilterRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Filter node.
- BeamFilterRel(RelOptCluster, RelTraitSet, RelNode, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamFilterRel
-
- BeamFilterRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
- BeamIntersectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace an Intersect node.
- BeamIntersectRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
-
- BeamIntersectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Intersect with BeamIntersectRel.
- BeamIOSinkRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a TableModify node.
- BeamIOSinkRel(RelOptCluster, RelTraitSet, RelOptTable, Prepare.CatalogReader, RelNode, Operation, List<String>, List<RexNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
-
- BeamIOSinkRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
- BeamIOSourceRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a TableScan node.
- BeamIOSourceRel(RelOptCluster, RelTraitSet, RelOptTable) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- BeamIOSourceRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
- BeamIOType - Enum in org.apache.beam.sdk.extensions.sql.impl.schema
-
Type of a source IO, determining whether it's a STREAMING process or a batch process.
- BeamJoinRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Join node.
- BeamJoinRel(RelOptCluster, RelTraitSet, RelNode, RelNode, RexNode, Set<CorrelationId>, JoinRelType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
- BeamJoinRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Join with BeamJoinRel.
- BeamJoinTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Collections of PTransform and DoFn used to perform the JOIN operation.
- BeamJoinTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
-
- BeamJoinTransforms.ExtractJoinFields - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
A SimpleFunction to extract join fields from the specified row.
- BeamJoinTransforms.JoinParts2WholeRow - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
A SimpleFunction to combine two rows into one.
- BeamJoinTransforms.SideInputJoinDoFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
A DoFn which implements the sideInput-JOIN.
- BeamKafkaCSVTable - Class in org.apache.beam.sdk.extensions.sql.impl.schema.kafka
-
A Kafka topic that saves records in CSV format.
- BeamKafkaCSVTable(BeamRecordSqlType, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaCSVTable
-
- BeamKafkaCSVTable(BeamRecordSqlType, String, List<String>, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaCSVTable
-
- BeamKafkaCSVTable.CsvRecorderDecoder - Class in org.apache.beam.sdk.extensions.sql.impl.schema.kafka
-
A PTransform to convert KV<byte[], byte[]> to BeamRecord.
- BeamKafkaCSVTable.CsvRecorderEncoder - Class in org.apache.beam.sdk.extensions.sql.impl.schema.kafka
-
A PTransform to convert BeamRecord to KV<byte[], byte[]>.
- BeamKafkaTable - Class in org.apache.beam.sdk.extensions.sql.impl.schema.kafka
-
BeamKafkaTable represents a Kafka topic, as source or target.
- BeamKafkaTable(BeamRecordSqlType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable
-
- BeamKafkaTable(BeamRecordSqlType, String, List<String>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable
-
- BeamLogicalConvention - Enum in org.apache.beam.sdk.extensions.sql.impl.rel
-
Conversion for Beam SQL.
- BeamMinusRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Minus node.
- BeamMinusRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
-
- BeamMinusRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Minus with BeamMinusRel.
- BeamPCollectionTable - Class in org.apache.beam.sdk.extensions.sql.impl.schema
-
BeamPCollectionTable converts a PCollection<BeamSqlRow> into a virtual table, so a downstream query can query it directly.
- BeamPCollectionTable(BeamRecordSqlType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
-
- BeamPCollectionTable(PCollection<BeamRecord>, BeamRecordSqlType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
-
- BeamProjectRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Project node.
- BeamProjectRel(RelOptCluster, RelTraitSet, RelNode, List<? extends RexNode>, RelDataType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamProjectRel
-
Supported projects: RexLiteral, RexInputRef, RexCall.
- BeamProjectRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
- BeamQueryPlanner - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
The core component to handle a SQL statement, from explaining the execution plan to generating a Beam pipeline.
- BeamQueryPlanner(SchemaPlus) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamQueryPlanner
-
- BeamRecord - Class in org.apache.beam.sdk.values
-
- BeamRecord(BeamRecordType, List<Object>) - Constructor for class org.apache.beam.sdk.values.BeamRecord
-
Creates a BeamRecord.
- BeamRecord(BeamRecordType, Object...) - Constructor for class org.apache.beam.sdk.values.BeamRecord
-
- BeamRecordCoder - Class in org.apache.beam.sdk.coders
-
- BeamRecordSqlType - Class in org.apache.beam.sdk.extensions.sql
-
- BeamRecordSqlType(List<String>, List<Coder>) - Constructor for class org.apache.beam.sdk.extensions.sql.BeamRecordSqlType
-
- BeamRecordType - Class in org.apache.beam.sdk.values
-
- BeamRecordType(List<String>, List<Coder>) - Constructor for class org.apache.beam.sdk.values.BeamRecordType
-
- BeamRelDataTypeSystem - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
Customized data type system in Beam.
- BeamRelDataTypeSystem() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
-
- BeamRelNode - Interface in org.apache.beam.sdk.extensions.sql.impl.rel
-
- BeamRuleSets - Class in org.apache.beam.sdk.extensions.sql.impl.planner
-
- BeamRuleSets() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
-
- BeamSetOperatorRelBase - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Delegate for Set operators: BeamUnionRel, BeamIntersectRel and BeamMinusRel.
- BeamSetOperatorRelBase(BeamRelNode, BeamSetOperatorRelBase.OpType, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
-
- BeamSetOperatorRelBase.OpType - Enum in org.apache.beam.sdk.extensions.sql.impl.rel
-
Set operator type.
- BeamSetOperatorsTransforms - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Collections of PTransform and DoFn used to perform Set operations.
- BeamSetOperatorsTransforms() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms
-
- BeamSetOperatorsTransforms.BeamSqlRow2KvFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Transform a BeamSqlRow to a KV<BeamSqlRow, BeamSqlRow>.
- BeamSetOperatorsTransforms.SetOperatorFilteringDoFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
Filter function used for Set operators.
- BeamSortRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Sort node.
- BeamSortRel(RelOptCluster, RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
-
- BeamSortRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Sort with BeamSortRel.
- BeamSparkRunnerRegistrator - Class in org.apache.beam.runners.spark.coders
-
Custom KryoRegistrators for Beam's Spark runner needs.
- BeamSparkRunnerRegistrator() - Constructor for class org.apache.beam.runners.spark.coders.BeamSparkRunnerRegistrator
-
- BeamSql - Class in org.apache.beam.sdk.extensions.sql
-
BeamSql is the DSL interface of BeamSQL.
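Example (a hedged sketch of running a SQL query over an existing PCollection<BeamRecord> named records; assumes BeamSql.simpleQuery(String) as the factory for BeamSql.SimpleQueryTransform, the single-table variant that registers its input under the table name PCOLLECTION; the field names are hypothetical):
    PCollection<BeamRecord> filtered = records.apply(
        BeamSql.simpleQuery("SELECT f_int, f_string FROM PCOLLECTION WHERE f_int > 0"));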
- BeamSql() - Constructor for class org.apache.beam.sdk.extensions.sql.BeamSql
-
- BeamSql.QueryTransform - Class in org.apache.beam.sdk.extensions.sql
-
A PTransform representing an execution plan for a SQL query.
- BeamSql.SimpleQueryTransform - Class in org.apache.beam.sdk.extensions.sql
-
A PTransform representing an execution plan for a SQL query referencing a single table.
- BeamSqlAbsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression for 'ABS' function.
- BeamSqlAbsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAbsExpression
-
- BeamSqlAcosExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression for 'ACOS' function.
- BeamSqlAcosExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAcosExpression
-
- BeamSqlAndExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
-
BeamSqlExpression for 'AND' operation.
- BeamSqlAndExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlAndExpression
-
- BeamSqlArithmeticExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
-
Base class for all arithmetic operators.
- BeamSqlArithmeticExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
-
- BeamSqlArithmeticExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
-
- BeamSqlAsinExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'ASIN' function.
- BeamSqlAsinExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAsinExpression
-
- BeamSqlAtan2Expression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
- BeamSqlAtan2Expression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtan2Expression
-
- BeamSqlAtanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'ATAN' function.
- BeamSqlAtanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtanExpression
-
- BeamSqlCaseExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
BeamSqlCaseExpression
represents CASE, NULLIF, COALESCE in SQL.
- BeamSqlCaseExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCaseExpression
-
- BeamSqlCastExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
Base class to support 'CAST' operations for all SqlTypeName.
- BeamSqlCastExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCastExpression
-
- BeamSqlCeilExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'CEIL' function.
- BeamSqlCeilExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCeilExpression
-
- BeamSqlCharLengthExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
'CHAR_LENGTH' operator.
- BeamSqlCharLengthExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlCharLengthExpression
-
- BeamSqlCli - Class in org.apache.beam.sdk.extensions.sql.impl
-
BeamSqlCli
provides methods to execute Beam SQL with an interactive client.
- BeamSqlCli() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.BeamSqlCli
-
- BeamSqlCompareExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
- BeamSqlCompareExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
-
- BeamSqlConcatExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
String concat operator.
- BeamSqlConcatExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlConcatExpression
-
- BeamSqlConcatExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlConcatExpression
-
- BeamSqlCosExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'COS' function.
- BeamSqlCosExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCosExpression
-
- BeamSqlCotExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'COT' function.
- BeamSqlCotExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCotExpression
-
- BeamSqlCurrentDateExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
-
BeamSqlExpression
for CURRENT_DATE and LOCALTIME.
- BeamSqlCurrentDateExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentDateExpression
-
- BeamSqlCurrentTimeExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
-
BeamSqlExpression
for LOCALTIME and CURRENT_TIME.
- BeamSqlCurrentTimeExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimeExpression
-
- BeamSqlCurrentTimestampExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
-
BeamSqlExpression
for LOCALTIMESTAMP and CURRENT_TIMESTAMP.
- BeamSqlCurrentTimestampExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimestampExpression
-
- BeamSqlDateCeilExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
-
BeamSqlExpression
for CEIL(date).
- BeamSqlDateCeilExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDateCeilExpression
-
- BeamSqlDateFloorExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
-
BeamSqlExpression
for FLOOR(date).
- BeamSqlDateFloorExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDateFloorExpression
-
- BeamSqlDegreesExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'DEGREES' function.
- BeamSqlDegreesExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlDegreesExpression
-
- BeamSqlDivideExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
-
'/' operator.
- BeamSqlDivideExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlDivideExpression
-
- BeamSqlEnv - Class in org.apache.beam.sdk.extensions.sql.impl
-
- BeamSqlEnv() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
- BeamSqlEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression for = operation.
- BeamSqlEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
-
- BeamSqlExpExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'EXP' function.
- BeamSqlExpExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlExpExpression
-
- BeamSqlExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
BeamSqlExpression is the Beam SQL equivalent of RexNode in Calcite.
- BeamSqlExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
-
- BeamSqlExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
-
- BeamSqlExpressionExecutor - Interface in org.apache.beam.sdk.extensions.sql.impl.interpreter
-
BeamSqlExpressionExecutor fills the gap between relational expressions in Calcite SQL and executable code.
- BeamSqlExtractExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date
-
BeamSqlExpression
for EXTRACT.
- BeamSqlExtractExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlExtractExpression
-
- BeamSqlFilterFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
- BeamSqlFilterFn(String, BeamSqlExpressionExecutor) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlFilterFn
-
- BeamSqlFloorExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'FLOOR' function.
- BeamSqlFloorExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlFloorExpression
-
- BeamSqlFnExecutor - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter
-
- BeamSqlFnExecutor(BeamRelNode) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlFnExecutor
-
- BeamSqlGreaterThanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression for > operation.
- BeamSqlGreaterThanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
-
- BeamSqlGreaterThanOrEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression for >= operation.
- BeamSqlGreaterThanOrEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
-
- BeamSqlInitCapExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
'INITCAP' operator.
- BeamSqlInitCapExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlInitCapExpression
-
- BeamSqlInputRefExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
A primitive operation for direct field extraction.
- BeamSqlInputRefExpression(SqlTypeName, int) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlInputRefExpression
-
- BeamSqlIsNotNullExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression
for 'IS NOT NULL' operation.
- BeamSqlIsNotNullExpression(BeamSqlExpression) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNotNullExpression
-
- BeamSqlIsNullExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression
for 'IS NULL' operation.
- BeamSqlIsNullExpression(BeamSqlExpression) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNullExpression
-
- BeamSqlLessThanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression for < operation.
- BeamSqlLessThanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
-
- BeamSqlLessThanOrEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression for <= operation.
- BeamSqlLessThanOrEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
-
- BeamSqlLnExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'LN' function.
- BeamSqlLnExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLnExpression
-
- BeamSqlLogExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'LOG10' function.
- BeamSqlLogExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLogExpression
-
- BeamSqlLogicalExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
-
BeamSqlExpression
for Logical operators.
- BeamSqlLogicalExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlLogicalExpression
-
- BeamSqlLowerExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
'LOWER' operator.
- BeamSqlLowerExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlLowerExpression
-
- BeamSqlMathBinaryExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
Base class for all binary functions such as
POWER, MOD, RAND_INTEGER, ATAN2, ROUND, TRUNCATE.
- BeamSqlMathBinaryExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
-
- BeamSqlMathUnaryExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
Base class for all unary functions such as
ABS, SQRT, LN, LOG10, EXP, CEIL, FLOOR, RAND, ACOS,
ASIN, ATAN, COS, COT, DEGREES, RADIANS, SIGN, SIN, TAN.
- BeamSqlMathUnaryExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathUnaryExpression
-
- BeamSqlMinusExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
-
'-' operator.
- BeamSqlMinusExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMinusExpression
-
- BeamSqlModExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
-
'%' operator.
- BeamSqlModExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlModExpression
-
- BeamSqlMultiplyExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
-
'*' operator.
- BeamSqlMultiplyExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMultiplyExpression
-
- BeamSqlNotEqualsExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison
-
BeamSqlExpression for <> operation.
- BeamSqlNotEqualsExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
-
- BeamSqlNotExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
-
BeamSqlExpression
for logical operator: NOT.
- BeamSqlNotExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlNotExpression
-
- BeamSqlOrExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical
-
BeamSqlExpression
for 'OR' operation.
- BeamSqlOrExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlOrExpression
-
- BeamSqlOutputToConsoleFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
A test DoFn to display output in the console.
- BeamSqlOutputToConsoleFn(String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlOutputToConsoleFn
-
- BeamSqlOverlayExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
'OVERLAY' operator.
- BeamSqlOverlayExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlOverlayExpression
-
- BeamSqlPiExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
Base class for the PI function.
- BeamSqlPiExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPiExpression
-
- BeamSqlPlusExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic
-
'+' operator.
- BeamSqlPlusExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlPlusExpression
-
- BeamSqlPositionExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
String position operator.
- BeamSqlPositionExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlPositionExpression
-
- BeamSqlPowerExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathBinaryExpression
for 'POWER' function.
- BeamSqlPowerExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPowerExpression
-
- BeamSqlPrimitive<T> - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
- BeamSqlProjectFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform
-
- BeamSqlProjectFn(String, BeamSqlExpressionExecutor, BeamRecordSqlType) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlProjectFn
-
- BeamSqlRadiansExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'RADIANS' function.
- BeamSqlRadiansExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRadiansExpression
-
- BeamSqlRandExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'RAND([seed])' function.
- BeamSqlRandExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandExpression
-
- BeamSqlRandIntegerExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'RAND_INTEGER([seed, ] numeric)'
function.
- BeamSqlRandIntegerExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandIntegerExpression
-
- BeamSqlRecordHelper - Class in org.apache.beam.sdk.extensions.sql
-
- BeamSqlRecordHelper() - Constructor for class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper
-
- BeamSqlRecordHelper.BooleanCoder - Class in org.apache.beam.sdk.extensions.sql
-
Coder for Java type Boolean.
- BeamSqlRecordHelper.DateCoder - Class in org.apache.beam.sdk.extensions.sql
-
Coder for Java type Date; it is stored as a Long.
- BeamSqlRecordHelper.DoubleCoder - Class in org.apache.beam.sdk.extensions.sql
-
Coder for Java type Double; it is stored as a BigDecimal.
- BeamSqlRecordHelper.FloatCoder - Class in org.apache.beam.sdk.extensions.sql
-
Coder for Java type Float; it is stored as a BigDecimal.
- BeamSqlRecordHelper.ShortCoder - Class in org.apache.beam.sdk.extensions.sql
-
Coder for Java type Short.
- BeamSqlRecordHelper.TimeCoder - Class in org.apache.beam.sdk.extensions.sql
-
Coder for Java type GregorianCalendar; it is stored as a Long.
- BeamSqlReinterpretExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
BeamSqlExpression
for REINTERPRET.
- BeamSqlReinterpretExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlReinterpretExpression
-
- BeamSqlRoundExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathBinaryExpression
for 'ROUND' function.
- BeamSqlRoundExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRoundExpression
-
- beamSqlRow2CsvLine(BeamRecord, CSVFormat) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
- BeamSqlRow2KvFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
-
- beamSqlRowType - Variable in class org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable
-
- beamSqlRowType - Variable in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOReader
-
- beamSqlRowType - Variable in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOWriter
-
- BeamSqlSignExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'SIGN' function.
- BeamSqlSignExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSignExpression
-
- BeamSqlSinExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'SIN' function.
- BeamSqlSinExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSinExpression
-
- BeamSqlStringUnaryExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
Base class for all string unary operators.
- BeamSqlStringUnaryExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlStringUnaryExpression
-
- BeamSqlSubstringExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
'SUBSTRING' operator.
- BeamSqlSubstringExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlSubstringExpression
-
- BeamSqlTable - Interface in org.apache.beam.sdk.extensions.sql.impl.schema
-
This interface defines a Beam SQL table.
- BeamSqlTanExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathUnaryExpression
for 'TAN' function.
- BeamSqlTanExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTanExpression
-
- BeamSqlTrimExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
Trim operator.
- BeamSqlTrimExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlTrimExpression
-
- BeamSqlTruncateExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math
-
BeamSqlMathBinaryExpression
for 'TRUNCATE' function.
- BeamSqlTruncateExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTruncateExpression
-
- BeamSqlUdf - Interface in org.apache.beam.sdk.extensions.sql
-
Interface to create a UDF in Beam SQL.
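A minimal sketch of a scalar UDF, assuming the convention that a BeamSqlUdf implementation exposes a public static eval method; the CubicInteger class is hypothetical, and the exact registration call on the query transform (a registerUdf-style method) should be checked against the BeamSql.QueryTransform API of the SDK version in use.
import org.apache.beam.sdk.extensions.sql.BeamSqlUdf;

/** Hypothetical scalar UDF: CUBIC(x) returns x * x * x. */
public class CubicInteger implements BeamSqlUdf {
  // Beam SQL looks up a public static method named eval on the UDF class.
  public static Integer eval(Integer input) {
    return input * input * input;
  }
}
Once registered under a name such as CUBIC, the function can be referenced directly in the SQL text passed to BeamSql.query or BeamSql.simpleQuery.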
- BeamSqlUdfExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
Invokes a user-defined function (UDF).
- BeamSqlUdfExpression(Method, List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUdfExpression
-
- BeamSqlUpperExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string
-
'UPPER' operator.
- BeamSqlUpperExpression(List<BeamSqlExpression>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlUpperExpression
-
- BeamSqlWindowEndExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
BeamSqlExpression for HOP_END, TUMBLE_END, and SESSION_END operations.
- BeamSqlWindowEndExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowEndExpression
-
- BeamSqlWindowExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
BeamSqlExpression for HOP, TUMBLE, and SESSION operations.
- BeamSqlWindowExpression(List<BeamSqlExpression>, SqlTypeName) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowExpression
-
- BeamSqlWindowStartExpression - Class in org.apache.beam.sdk.extensions.sql.impl.interpreter.operator
-
BeamSqlExpression for HOP_START, TUMBLE_START, and SESSION_START operations.
- BeamSqlWindowStartExpression() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowStartExpression
-
- BeamTableUtils - Class in org.apache.beam.sdk.extensions.sql.impl.schema
-
Utility methods for working with BeamTable.
- BeamTableUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
- BeamTextCSVTable - Class in org.apache.beam.sdk.extensions.sql.impl.schema.text
-
BeamTextCSVTable is a BeamTextTable which is formatted in CSV.
- BeamTextCSVTable(BeamRecordSqlType, String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTable
-
CSV table with DEFAULT
format.
- BeamTextCSVTable(BeamRecordSqlType, String, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTable
-
- BeamTextCSVTableIOReader - Class in org.apache.beam.sdk.extensions.sql.impl.schema.text
-
IOReader for BeamTextCSVTable
.
- BeamTextCSVTableIOReader(BeamRecordSqlType, String, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOReader
-
- BeamTextCSVTableIOWriter - Class in org.apache.beam.sdk.extensions.sql.impl.schema.text
-
IOWriter for BeamTextCSVTable
.
- BeamTextCSVTableIOWriter(BeamRecordSqlType, String, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOWriter
-
- BeamTextTable - Class in org.apache.beam.sdk.extensions.sql.impl.schema.text
-
BeamTextTable represents a text file or directory (backed by TextIO).
- BeamTextTable(BeamRecordSqlType, String) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextTable
-
- BeamUnionRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
- BeamUnionRel(RelOptCluster, RelTraitSet, List<RelNode>, boolean) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
-
- BeamUnionRel(RelInput) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
-
- BeamUnionRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
A ConverterRule to replace org.apache.calcite.rel.core.Union with BeamUnionRel.
- BeamValuesRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace a Values node.
- BeamValuesRel(RelOptCluster, RelDataType, ImmutableList<ImmutableList<RexLiteral>>, RelTraitSet) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
-
- BeamValuesRule - Class in org.apache.beam.sdk.extensions.sql.impl.rule
-
ConverterRule to replace Values with BeamValuesRel.
- begin() - Method in class org.apache.beam.sdk.Pipeline
-
Returns a PBegin owned by this Pipeline.
- beginningOnDay(int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- beginningOnDay(int, int) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- BigDecimalCoder - Class in org.apache.beam.sdk.coders
-
- bigdecimals() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- BigEndianIntegerCoder - Class in org.apache.beam.sdk.coders
-
- BigEndianLongCoder - Class in org.apache.beam.sdk.coders
-
- BigIntegerCoder - Class in org.apache.beam.sdk.coders
-
- bigintegers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
-
- BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
-
- BigQueryFileToTable - Class in org.apache.beam.runners.dataflow
-
An example that stresses the BigQuery sink at scale.
- BigQueryFileToTable() - Constructor for class org.apache.beam.runners.dataflow.BigQueryFileToTable
-
- BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A set of helper functions and classes used by BigQueryIO.
- BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
- BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
-
- BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
-
- BigQueryIO.TypedRead<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
- BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
- BigQueryIO.Write.CreateDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery create disposition strings.
- BigQueryIO.Write.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
Determines the method used to insert data in BigQuery.
- BigQueryIO.Write.WriteDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery write disposition strings.
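The create and write dispositions above are most often set on a BigQueryIO write. A minimal sketch, assuming the TableRow-based write API (BigQueryIO.writeTableRows); the project, dataset, table and field names are hypothetical.
import java.util.Arrays;
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.values.PCollection;

public class WriteCountsToBigQuery {
  static void writeCounts(PCollection<TableRow> rows) {
    // Schema for the destination table; the table is created on demand and rows are appended.
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("word").setType("STRING"),
        new TableFieldSchema().setName("count").setType("INTEGER")));

    rows.apply("WriteToBigQuery",
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.word_counts")
            .withSchema(schema)
            .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(WriteDisposition.WRITE_APPEND));
  }
}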
- BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Properties needed when using Google BigQuery with the Apache Beam SDK.
- BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Transforms
for reading from and writing to Google Cloud Bigtable.
- BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A PTransform that reads from Google Cloud Bigtable.
- BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A PTransform that writes to Google Cloud Bigtable.
- BinaryCombineDoubleFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- BinaryCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- BinaryCombineIntegerFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- BinaryCombineLongFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- bind(String, StateBinder) - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- bindBag(String, StateSpec<BagState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
-
- bindCombining(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
-
- bindCombiningWithContext(String, StateSpec<CombiningState<InputT, AccumT, OutputT>>, Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Method in interface org.apache.beam.sdk.state.StateBinder
-
- bindMap(String, StateSpec<MapState<KeyT, ValueT>>, Coder<KeyT>, Coder<ValueT>) - Method in interface org.apache.beam.sdk.state.StateBinder
-
- bindSet(String, StateSpec<SetState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
-
- bindValue(String, StateSpec<ValueState<T>>, Coder<T>) - Method in interface org.apache.beam.sdk.state.StateBinder
-
- bindWatermark(String, StateSpec<WatermarkHoldState>, TimestampCombiner) - Method in interface org.apache.beam.sdk.state.StateBinder
-
- BitSetCoder - Class in org.apache.beam.sdk.coders
-
Coder for BitSet
.
- Block() - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.Block
-
- BlockBasedReader(BlockBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
- BlockBasedSource<T> - Class in org.apache.beam.sdk.io
-
A BlockBasedSource is a FileBasedSource where a file consists of blocks of records.
- BlockBasedSource(String, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource
based on a file name or pattern.
- BlockBasedSource(String, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
- BlockBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
- BlockBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
- BlockBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource
for a single file.
- BlockBasedSource.Block<T> - Class in org.apache.beam.sdk.io
-
A Block
represents a block of records that can be read.
- BlockBasedSource.BlockBasedReader<T> - Class in org.apache.beam.sdk.io
-
- BooleanCoder - Class in org.apache.beam.sdk.coders
-
- BooleanCoder() - Constructor for class org.apache.beam.sdk.coders.BooleanCoder
-
- booleans() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- Bounded(SparkContext, BoundedSource<T>, SerializablePipelineOptions, String) - Constructor for class org.apache.beam.runners.spark.io.SourceRDD.Bounded
-
- BoundedReader() - Constructor for class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
- BoundedReadFromUnboundedSource<T> - Class in org.apache.beam.sdk.io
-
PTransform that reads a bounded amount of data from an UnboundedSource, specified as one or both of a maximum number of elements or a maximum period of time to read.
- BoundedSource<T> - Class in org.apache.beam.sdk.io
-
A Source that reads a finite amount of input and, because of that, supports some additional operations.
- BoundedSource() - Constructor for class org.apache.beam.sdk.io.BoundedSource
-
- BoundedSource.BoundedReader<T> - Class in org.apache.beam.sdk.io
-
A Reader that reads a bounded amount of input and supports some additional operations, such as progress estimation and dynamic work rebalancing.
- BoundedSourceWrapper<T> - Class in org.apache.beam.runners.gearpump.translators.io
-
A wrapper over BoundedSource for the Gearpump DataSource API.
- BoundedSourceWrapper(BoundedSource<T>, PipelineOptions) - Constructor for class org.apache.beam.runners.gearpump.translators.io.BoundedSourceWrapper
-
- BoundedWindow - Class in org.apache.beam.sdk.transforms.windowing
-
A BoundedWindow represents window information assigned to data elements.
- BoundedWindow() - Constructor for class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
- boundedWindowToGearpumpWindow(BoundedWindow) - Static method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils
-
- broadcast(JavaSparkContext) - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
-
- BufferedExternalSorter - Class in org.apache.beam.sdk.extensions.sorter
-
Sorter that uses in-memory sorting until the values no longer fit in memory, then falls back to external sorting.
- BufferedExternalSorter.Options - Class in org.apache.beam.sdk.extensions.sorter
-
Contains configuration for the sorter.
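A sketch of the usual pairing of this sorter with the SortValues transform from the same sorter extension; the key and value types are hypothetical, and the grouped input is assumed to come from an upstream GroupByKey.
import org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter;
import org.apache.beam.sdk.extensions.sorter.SortValues;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class SortValuesSketch {
  // For each primary key, sort the (secondary key, value) pairs by secondary key,
  // spilling to disk when the in-memory buffer fills up.
  static PCollection<KV<String, Iterable<KV<String, Integer>>>> sortPerKey(
      PCollection<KV<String, Iterable<KV<String, Integer>>>> grouped) {
    return grouped.apply(
        "SortValuesPerKey",
        SortValues.<String, String, Integer>create(BufferedExternalSorter.options()));
  }
}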
- build() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
-
- build() - Method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
-
- build() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
-
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
-
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
-
- build() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamFilterRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
-
Note that BeamIOSinkRel returns the input PCollection, which is the persisted PCollection.
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamProjectRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
A BeamRelNode is a recursive structure; the BeamQueryPlanner visits it with a DFS (depth-first search) algorithm.
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
-
- buildBeamPipeline(PCollectionTuple, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
-
- builder() - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
-
- Builder() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions.Builder
-
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.Builder
-
- builder() - Static method in class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions
-
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions.StandardCreateOptions.Builder
-
- builder() - Static method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
-
- Builder() - Constructor for class org.apache.beam.sdk.io.fs.MatchResult.Metadata.Builder
-
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
-
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
-
- builder() - Static method in class org.apache.beam.sdk.metrics.MetricsFilter
-
- Builder() - Constructor for class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
- buildIOReader(Pipeline) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
-
- buildIOReader(Pipeline) - Method in interface org.apache.beam.sdk.extensions.sql.impl.schema.BeamSqlTable
-
Creates a PCollection<BeamSqlRow> from the source.
- buildIOReader(Pipeline) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable
-
- buildIOReader(Pipeline) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTable
-
- buildIOWriter() - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
-
- buildIOWriter() - Method in interface org.apache.beam.sdk.extensions.sql.impl.schema.BeamSqlTable
-
Creates an IO.write() instance to write to the target.
- buildIOWriter() - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable
-
- buildIOWriter() - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTable
-
- buildOutputFilenames(Iterable<FileBasedSink.FileResult<DestinationT>>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
- buildTemporaryFilename(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Constructs a temporary file resource given the temporary directory and a filename.
- by(PredicateT) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that satisfy the given predicate.
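A minimal sketch of Filter.by with an inline SerializableFunction predicate; the input collection and step name are hypothetical.
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.apache.beam.sdk.values.PCollection;

public class FilterSketch {
  // Keep only the strictly positive numbers.
  static PCollection<Integer> keepPositive(PCollection<Integer> numbers) {
    return numbers.apply("KeepPositive",
        Filter.by(new SerializableFunction<Integer, Boolean>() {
          @Override
          public Boolean apply(Integer x) {
            return x > 0;
          }
        }));
  }
}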
- ByteArray - Class in org.apache.beam.runners.spark.util
-
Serializable byte array.
- ByteArray(byte[]) - Constructor for class org.apache.beam.runners.spark.util.ByteArray
-
- ByteArrayCoder - Class in org.apache.beam.sdk.coders
-
- ByteCoder - Class in org.apache.beam.sdk.coders
-
A ByteCoder encodes Byte values in 1 byte using Java serialization.
- ByteKey - Class in org.apache.beam.sdk.io.range
-
A class representing a key consisting of an array of bytes.
- ByteKeyRange - Class in org.apache.beam.sdk.io.range
-
A class representing a range of ByteKeys.
- ByteKeyRangeTracker - Class in org.apache.beam.sdk.io.range
-
- bytes() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- bytesRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of bytes read by a source.
- bytesReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of bytes read by a source split.
- ByteStringCoder - Class in org.apache.beam.sdk.extensions.protobuf
-
A Coder for ByteString objects based on their encoded Protocol Buffer form.
- bytesWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
-
Counter of bytes written to a sink.
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
-
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlDivideExpression
-
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMinusExpression
-
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlModExpression
-
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlMultiplyExpression
-
- calc(BigDecimal, BigDecimal) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlPlusExpression
-
- CalciteUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
Utility methods for Calcite related operations.
- CalciteUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAbsExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAcosExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAsinExpression
-
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtan2Expression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlAtanExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCeilExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCosExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlCotExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlDegreesExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlExpExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlFloorExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLnExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlLogExpression
-
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
-
The base method for implementation of math binary functions.
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathUnaryExpression
-
For the operands of other type SqlTypeName#NUMERIC_TYPES
.
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPowerExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRadiansExpression
-
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRoundExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSignExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlSinExpression
-
- calculate(BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTanExpression
-
- calculate(BeamSqlPrimitive, BeamSqlPrimitive) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlTruncateExpression
-
- CalendarWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A collection of WindowFns that windows values into calendar-based windows such as spans of days, months, or years.
- CalendarWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
- CalendarWindows.DaysWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows elements into periods measured by days.
- CalendarWindows.MonthsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows elements into periods measured by months.
- CalendarWindows.YearsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows elements into periods measured by years.
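A sketch of applying calendar windowing to a timestamped collection; the element type and step names are hypothetical.
import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;

public class CalendarWindowingSketch {
  // Window timestamped log lines into calendar days.
  static PCollection<String> daily(PCollection<String> logLines) {
    return logLines.apply("DailyWindows", Window.<String>into(CalendarWindows.days(1)));
  }

  // Window into calendar months that begin on the first day of the month.
  static PCollection<String> monthly(PCollection<String> logLines) {
    return logLines.apply("MonthlyWindows",
        Window.<String>into(CalendarWindows.months(1).beginningOnDay(1)));
  }
}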
- cancel() - Method in class org.apache.beam.runners.apex.ApexRunnerResult
-
- cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
- cancel() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- cancel() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
-
- cancel() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
-
- cancel() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
-
- cancel() - Method in class org.apache.beam.runners.gearpump.GearpumpPipelineResult
-
- cancel(JobApi.CancelJobRequest, StreamObserver<JobApi.CancelJobResponse>) - Method in class org.apache.beam.runners.reference.job.ReferenceRunnerJobService
-
- cancel() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
-
- cancel() - Method in interface org.apache.beam.sdk.PipelineResult
-
Cancels the pipeline execution.
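A sketch of cancelling a run via PipelineResult; the timeout value is arbitrary, and whether waitUntilFinish returns null on timeout depends on the runner.
import java.io.IOException;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.joda.time.Duration;

public class CancelSketch {
  static void runWithDeadline(Pipeline pipeline) throws IOException {
    PipelineResult result = pipeline.run();
    PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(10));
    if (state == null || !state.isTerminal()) {
      // Still running after the deadline: ask the runner to stop the job.
      result.cancel();
    }
  }
}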
- canConvertConvention(Convention) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
-
- CannotProvideCoderException - Exception in org.apache.beam.sdk.coders
-
- CannotProvideCoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(String, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(String, Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException.ReasonCode - Enum in org.apache.beam.sdk.coders
-
Indicates the reason that Coder inference failed.
- canStopPolling(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
- CassandraIO - Class in org.apache.beam.sdk.io.cassandra
-
An IO to read from Apache Cassandra.
- CassandraIO.Read<T> - Class in org.apache.beam.sdk.io.cassandra
-
- CassandraIO.Write<T> - Class in org.apache.beam.sdk.io.cassandra
-
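A sketch of reading mapped entities with CassandraIO, following the builder pattern suggested by its Javadoc; the hosts, keyspace, table and the Scientist entity class (in practice a DataStax-mapper-annotated POJO) are hypothetical.
import java.io.Serializable;
import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.SerializableCoder;
import org.apache.beam.sdk.io.cassandra.CassandraIO;
import org.apache.beam.sdk.values.PCollection;

public class CassandraReadSketch {
  /** Hypothetical entity class; real entities carry DataStax mapping annotations. */
  public static class Scientist implements Serializable {
    public String name;
  }

  static PCollection<Scientist> readScientists(Pipeline pipeline) {
    return pipeline.apply("ReadScientists",
        CassandraIO.<Scientist>read()
            .withHosts(Arrays.asList("cassandra-1", "cassandra-2"))
            .withPort(9042)
            .withKeyspace("beam")
            .withTable("scientist")
            .withEntity(Scientist.class)
            .withCoder(SerializableCoder.of(Scientist.class)));
  }
}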
- CassandraService<T> - Interface in org.apache.beam.sdk.io.cassandra
-
An interface for real or fake implementations of Cassandra.
- CassandraService.Writer<T> - Interface in org.apache.beam.sdk.io.cassandra
-
Writer for an entity.
- CassandraServiceImpl<T> - Class in org.apache.beam.sdk.io.cassandra
-
An implementation of the CassandraService that actually uses a Cassandra instance.
- CassandraServiceImpl() - Constructor for class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
-
- CassandraServiceImpl.TokenRange - Class in org.apache.beam.sdk.io.cassandra
-
Represents a token range in a Cassandra instance, wrapping the partition count, size and token range.
- CassandraServiceImpl.WriterImpl<T> - Class in org.apache.beam.sdk.io.cassandra
-
Writer storing an entity into an Apache Cassandra database.
- characters() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
- checkDone() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Called by the runner after DoFn.ProcessElement
returns.
- checkpoint() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
- checkpoint() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Signals that the current DoFn.ProcessElement call should terminate as soon as possible: after this method returns, the tracker MUST refuse all future claim calls, and RestrictionTracker.checkDone() MUST succeed.
- classesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
-
- classesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
-
- classNamesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
-
- classNamesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
-
- CLASSPATH_SCHEME - Static variable in class org.apache.beam.runners.apex.ApexRunner
-
- cleanup() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
- clear() - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
-
- clear() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
-
- clear() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- clear() - Method in interface org.apache.beam.sdk.state.State
-
Clear out the state location.
- clearCache() - Static method in class org.apache.beam.runners.spark.io.MicrobatchSource
-
- clearOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Clears the record of the elements output so far to the main output.
- clearOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
Clears the record of the elements output so far to the output with the given tag.
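A sketch of unit-testing a DoFn with DoFnTester; the ToUpperFn under test is hypothetical.
import java.util.List;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.DoFnTester;

public class DoFnTesterSketch {
  /** Hypothetical DoFn under test. */
  static class ToUpperFn extends DoFn<String, String> {
    @ProcessElement
    public void processElement(ProcessContext c) {
      c.output(c.element().toUpperCase());
    }
  }

  static void checkToUpper() throws Exception {
    DoFnTester<String, String> tester = DoFnTester.of(new ToUpperFn());
    List<String> outputs = tester.processBundle("a", "b");
    // outputs is expected to contain "A" and "B".
    tester.clearOutputElements();  // reset recorded outputs between bundles
  }
}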
- clone() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
-
- clone() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
-
- close() - Method in class org.apache.beam.runners.gearpump.translators.io.GearpumpSource
-
- close() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- close() - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionExecutor
-
- close() - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlFnExecutor
-
- close() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlFilterFn
-
- close() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSqlProjectFn
-
- close() - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl.WriterImpl
-
- close() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Closes the channel and returns the bundle result.
- close() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Closes any ReadableByteChannel
created for the current reader.
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
-
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
-
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
-
- close() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Closes the reader.
- close() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- CloudDebuggerOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options for controlling Cloud Debugger.
- CloudObject - Class in org.apache.beam.runners.dataflow.util
-
A representation of an arbitrary Java object to be instantiated by Dataflow
workers.
- cloudObjectClassName() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
- CloudObjects - Class in org.apache.beam.runners.dataflow.util
-
- CloudObjectTranslator<T> - Interface in org.apache.beam.runners.dataflow.util
-
A translator that takes an object and creates a CloudObject which can be converted back to the original object.
- CloudResourceManagerOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Properties needed when using Google CloudResourceManager with the Apache Beam SDK.
- CO_GBK_RESULT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- Coder<T> - Class in org.apache.beam.sdk.coders
-
A Coder<T> defines how to encode and decode values of type T into byte streams.
- Coder() - Constructor for class org.apache.beam.sdk.coders.Coder
-
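A minimal sketch of a hand-written coder that composes existing coders; the OrderId value type is hypothetical, and AtomicCoder is used here as a convenience base class.
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.apache.beam.sdk.coders.AtomicCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.coders.VarIntCoder;

/** Hypothetical value type: a region string plus a sequence number. */
class OrderId {
  final String region;
  final int sequence;
  OrderId(String region, int sequence) {
    this.region = region;
    this.sequence = sequence;
  }
}

/** Encodes an OrderId by delegating each field to an existing coder. */
class OrderIdCoder extends AtomicCoder<OrderId> {
  private static final StringUtf8Coder STRING_CODER = StringUtf8Coder.of();
  private static final VarIntCoder INT_CODER = VarIntCoder.of();

  @Override
  public void encode(OrderId value, OutputStream outStream) throws IOException {
    STRING_CODER.encode(value.region, outStream);
    INT_CODER.encode(value.sequence, outStream);
  }

  @Override
  public OrderId decode(InputStream inStream) throws IOException {
    return new OrderId(STRING_CODER.decode(inStream), INT_CODER.decode(inStream));
  }

  @Override
  public void verifyDeterministic() {
    // Both delegate coders are deterministic, so this coder is as well.
  }
}
Such a coder is typically attached to a PCollection with setCoder or supplied through a CoderProvider.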
- Coder.Context - Class in org.apache.beam.sdk.coders
-
- Coder.NonDeterministicException - Exception in org.apache.beam.sdk.coders
-
Exception thrown by Coder.verifyDeterministic() if the encoding is not deterministic, including details of why the encoding is not deterministic.
- CoderCloudObjectTranslatorRegistrar - Interface in org.apache.beam.runners.dataflow.util
-
Coder authors have the ability to automatically have their Coder registered with the Dataflow Runner by creating a ServiceLoader entry and a concrete implementation of this interface.
- coderConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T> and values of type T, the values are equal if and only if the encoded bytes are equal.
- coderConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, Coder.Context, and values of type T, the values are equal if and only if the encoded bytes are equal, in any Coder.Context.
- coderDecodeEncodeContentsEqual(Coder<CollectionT>, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>> and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
- coderDecodeEncodeContentsEqualInContext(Coder<CollectionT>, Coder.Context, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>> and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in the given Coder.Context.
- coderDecodeEncodeContentsInSameOrder(Coder<IterableT>, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>> and value of type Collection<T>, encoding followed by decoding yields an equal value of type Collection<T>, in any Coder.Context.
- coderDecodeEncodeContentsInSameOrderInContext(Coder<IterableT>, Coder.Context, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Iterable<T>> and value of type Iterable<T>, encoding followed by decoding yields an equal value of type Iterable<T>, in the given Coder.Context.
- coderDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T> and value of type T, encoding followed by decoding yields an equal value of type T, in any Coder.Context.
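As an illustration, a minimal JUnit-style sketch exercising these properties against the SDK's VarIntCoder (the test class name and values are illustrative):

    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.testing.CoderProperties;
    import org.junit.Test;

    public class VarIntCoderPropertiesTest {
      @Test
      public void roundTripAndDeterminism() throws Exception {
        // Fails if encoding followed by decoding does not yield an equal value.
        CoderProperties.coderDecodeEncodeEqual(VarIntCoder.of(), 42);
        // Fails if two equal values can encode to different byte sequences.
        CoderProperties.coderDeterministic(VarIntCoder.of(), 42, 42);
      }
    }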
- coderDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, Coder.Context, and value of type T, encoding followed by decoding yields an equal value of type T.
- coderDecodesBase64(Coder<T>, String, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDecodesBase64(Coder<T>, List<String>, List<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDecodesBase64ContentsEqual(Coder<IterableT>, String, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDecodesBase64ContentsEqual(Coder<IterableT>, List<String>, List<IterableT>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDeterministic(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T> and values of type T, if the values are equal then the encoded bytes are equal, in any Coder.Context.
- coderDeterministicInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>, Coder.Context, and values of type T, if the values are equal then the encoded bytes are equal.
- coderEncodesBase64(Coder<T>, T, String) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderEncodesBase64(Coder<T>, List<T>, List<String>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- CoderException - Exception in org.apache.beam.sdk.coders
-
An Exception thrown if there is a problem encoding or decoding a value.
- CoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CoderException
-
- CoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
-
- CoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
-
- coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.CoderProvider
-
Returns a Coder<T> to use for values of a particular type, given the Coders for each of the type's generic parameter types.
- coderFromCloudObject(CloudObject) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
-
- CoderHelpers - Class in org.apache.beam.runners.spark.coders
-
Serialization utility class.
- CoderProperties - Class in org.apache.beam.sdk.testing
-
Properties for use in Coder tests.
- CoderProperties() - Constructor for class org.apache.beam.sdk.testing.CoderProperties
-
- CoderProperties.TestElementByteSizeObserver - Class in org.apache.beam.sdk.testing
-
An ElementByteSizeObserver that records the observed element sizes for testing purposes.
- CoderProvider - Class in org.apache.beam.sdk.coders
-
- CoderProvider() - Constructor for class org.apache.beam.sdk.coders.CoderProvider
-
- CoderProviderRegistrar - Interface in org.apache.beam.sdk.coders
-
Coder creators have the ability to automatically have their coders registered with this SDK by creating a ServiceLoader entry and a concrete implementation of this interface.
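A sketch of such a registrar, assuming AutoService is used to generate the ServiceLoader entry; MyType and MyTypeCoder are hypothetical placeholders for a user-defined type and its coder:

    import com.google.auto.service.AutoService;
    import java.util.Collections;
    import java.util.List;
    import org.apache.beam.sdk.coders.CoderProvider;
    import org.apache.beam.sdk.coders.CoderProviderRegistrar;
    import org.apache.beam.sdk.coders.CoderProviders;
    import org.apache.beam.sdk.values.TypeDescriptor;

    @AutoService(CoderProviderRegistrar.class)  // emits the META-INF/services entry at compile time
    public class MyTypeCoderRegistrar implements CoderProviderRegistrar {
      @Override
      public List<CoderProvider> getCoderProviders() {
        // MyType and MyTypeCoder are hypothetical user-defined classes.
        return Collections.singletonList(
            CoderProviders.forCoder(TypeDescriptor.of(MyType.class), MyTypeCoder.of()));
      }
    }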
- CoderProviders - Class in org.apache.beam.sdk.coders
-
Static utility methods for creating and working with CoderProviders.
- CoderRegistry - Class in org.apache.beam.sdk.coders
-
- coderSerializable(Coder<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that the given Coder<T> can be correctly serialized and deserialized.
- CoGbkResult - Class in org.apache.beam.sdk.transforms.join
-
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>, int) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- CoGbkResult.CoGbkResultCoder - Class in org.apache.beam.sdk.transforms.join
-
- CoGbkResultSchema - Class in org.apache.beam.sdk.transforms.join
-
- CoGbkResultSchema(TupleTagList) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Builds a schema from a tuple of TupleTag<?>s.
- CoGroupByKey<K> - Class in org.apache.beam.sdk.transforms.join
-
- CollectionCoder<T> - Class in org.apache.beam.sdk.coders
-
- CollectionCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.CollectionCoder
-
- Combine - Class in org.apache.beam.sdk.transforms
-
PTransforms for combining PCollection elements globally and per-key.
- combine(Iterable<? extends Instant>) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
- combine(Instant...) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
- Combine.AccumulatingCombineFn<InputT,AccumT extends Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT>,OutputT> - Class in org.apache.beam.sdk.transforms
-
- Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
-
The type of mutable accumulator values used by this AccumulatingCombineFn.
- Combine.BinaryCombineDoubleFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on doubles.
- Combine.BinaryCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily expressed as binary operations.
- Combine.BinaryCombineIntegerFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on ints.
- Combine.BinaryCombineLongFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of Combine.CombineFn for implementing combiners that are more easily and efficiently expressed as binary operations on longs.
- Combine.CombineFn<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
-
A CombineFn<InputT, AccumT, OutputT> specifies how to combine a collection of input values of type InputT into a single output value of type OutputT.
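A sketch of a custom CombineFn that computes a mean, following the common accumulator pattern; the Accum class is illustrative and relies on the SDK's fallback SerializableCoder for the accumulator:

    import java.io.Serializable;
    import org.apache.beam.sdk.transforms.Combine;

    public class AverageFn extends Combine.CombineFn<Integer, AverageFn.Accum, Double> {
      public static class Accum implements Serializable {
        long sum = 0;
        long count = 0;
      }
      @Override public Accum createAccumulator() { return new Accum(); }
      @Override public Accum addInput(Accum acc, Integer input) {
        acc.sum += input; acc.count++; return acc;                 // fold one element in
      }
      @Override public Accum mergeAccumulators(Iterable<Accum> accs) {
        Accum merged = createAccumulator();
        for (Accum acc : accs) { merged.sum += acc.sum; merged.count += acc.count; }
        return merged;                                             // combine partial results
      }
      @Override public Double extractOutput(Accum acc) {
        return acc.count == 0 ? 0.0 : ((double) acc.sum) / acc.count;
      }
    }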
- Combine.Globally<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.Globally<InputT, OutputT> takes a PCollection<InputT> and returns a PCollection<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
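For example, a global combine of integers might look like the following sketch, assuming a Pipeline p is already constructed and the usual org.apache.beam.sdk.transforms imports are in place:

    // Sum every element in each window of the input into a single value.
    PCollection<Integer> numbers = p.apply(Create.of(1, 2, 3, 4));
    PCollection<Integer> total = numbers.apply(Combine.globally(Sum.ofIntegers()));  // 10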
- Combine.GloballyAsSingletonView<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.GloballyAsSingletonView<InputT, OutputT> takes a PCollection<InputT> and returns a PCollectionView<OutputT> whose elements are the result of combining all the elements in each window of the input PCollection, using a specified CombineFn<InputT, AccumT, OutputT>.
- Combine.GroupedValues<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
GroupedValues<K, InputT, OutputT> takes a PCollection<KV<K, Iterable<InputT>>>, such as the result of GroupByKey, applies a specified CombineFn<InputT, AccumT, OutputT> to each of the input KV<K, Iterable<InputT>> elements to produce a combined output KV<K, OutputT> element, and returns a PCollection<KV<K, OutputT>> containing all the combined output elements.
- Combine.Holder<V> - Class in org.apache.beam.sdk.transforms
-
Holds a single value of type V which may or may not be present.
- Combine.IterableCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
- Combine.PerKey<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
PerKey<K, InputT, OutputT> takes a PCollection<KV<K, InputT>>, groups it by key, applies a combining function to the InputT values associated with each key to produce a combined OutputT value, and returns a PCollection<KV<K, OutputT>> representing a map from each distinct key of the input PCollection to the corresponding combined value.
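A per-key sketch of the same idea, again assuming a Pipeline p and the standard transform imports:

    PCollection<KV<String, Integer>> scores =
        p.apply(Create.of(KV.of("a", 1), KV.of("a", 2), KV.of("b", 3)));
    // One combined value per distinct key: ("a", 3) and ("b", 3).
    PCollection<KV<String, Integer>> totals = scores.apply(Combine.perKey(Sum.ofIntegers()));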
- Combine.PerKeyWithHotKeyFanout<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
- Combine.SimpleCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
- CombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.CombineFn
-
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Count
-
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Latest
-
- combineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
- CombineFnBase - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFnBase() - Constructor for class org.apache.beam.sdk.transforms.CombineFnBase
-
- CombineFnBase.GlobalCombineFn<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFns - Class in org.apache.beam.sdk.transforms
-
Static utility methods that create combine function instances.
- CombineFns() - Constructor for class org.apache.beam.sdk.transforms.CombineFns
-
- CombineFns.CoCombineResult - Class in org.apache.beam.sdk.transforms
-
A tuple of outputs produced by composed combine functions.
- CombineFns.ComposeCombineFnBuilder - Class in org.apache.beam.sdk.transforms
-
- CombineFns.ComposedCombineFn<DataT> - Class in org.apache.beam.sdk.transforms
-
- CombineFns.ComposedCombineFnWithContext<DataT> - Class in org.apache.beam.sdk.transforms
-
- CombineFnTester - Class in org.apache.beam.sdk.testing
-
- CombineFnTester() - Constructor for class org.apache.beam.sdk.testing.CombineFnTester
-
- CombineFnWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
- CombineWithContext - Class in org.apache.beam.sdk.transforms
-
This class contains combine functions that have access to PipelineOptions and side inputs through CombineWithContext.Context.
- CombineWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext
-
- CombineWithContext.CombineFnWithContext<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
-
A combine function that has access to PipelineOptions and side inputs through CombineWithContext.Context.
- CombineWithContext.Context - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in CombineFnWithContext and KeyedCombineFnWithContext.
- CombineWithContext.RequiresContextInternal - Interface in org.apache.beam.sdk.transforms
-
An internal interface for signaling that a GloballyCombineFn or a PerKeyCombineFn needs to access CombineWithContext.Context.
- combining(Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
- combining(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- combining(Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to #combining(CombineFn), but with an accumulator coder explicitly supplied.
- combining(Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- combiningFromInputInternal(Coder<InputT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- CombiningState<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell defined by a Combine.CombineFn, accepting multiple input values, combining them as specified into accumulators, and producing a single output value.
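A sketch of declaring and reading a CombiningState cell inside a stateful DoFn; the int[] accumulator type matches Sum.ofIntegers() in current SDK versions, so treat the exact signature as an assumption:

    import org.apache.beam.sdk.state.CombiningState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;

    public class RunningSumFn extends DoFn<KV<String, Integer>, KV<String, Integer>> {
      @StateId("sum")
      private final StateSpec<CombiningState<Integer, int[], Integer>> sumSpec =
          StateSpecs.combining(Sum.ofIntegers());

      @ProcessElement
      public void process(ProcessContext c,
          @StateId("sum") CombiningState<Integer, int[], Integer> sum) {
        sum.add(c.element().getValue());                     // fold the input into the accumulator
        c.output(KV.of(c.element().getKey(), sum.read()));   // emit the combined value so far
      }
    }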
- commitManifest(ArtifactApi.CommitManifestRequest, StreamObserver<ArtifactApi.CommitManifestResponse>) - Method in class org.apache.beam.artifact.local.LocalFileSystemArtifactStagerService
-
- committed() - Method in interface org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all successfully completed parts of the pipeline.
- commonPrefixLength(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compute the length of the common prefix of the two provided sets of bytes.
- compact(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns an accumulator that represents the same logical value as the
input accumulator, but may have a more compact representation.
- compact(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- compact(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- compact(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- compact(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns an accumulator that represents the same logical value as the
input accumulator, but may have a more compact representation.
- compare(JobMessage, JobMessage) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
-
- compare(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
- compare(RandomAccessData, RandomAccessData, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compare the two sets of bytes starting at the given offset.
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
-
Compare between String values, mapping to SqlTypeName#VARCHAR.
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
-
Compare between Boolean values, mapping to SqlTypeName#BOOLEAN.
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
-
Compare between Number values, including SqlTypeName#BIGINT, SqlTypeName#DECIMAL, SqlTypeName#DOUBLE, SqlTypeName#FLOAT, SqlTypeName#INTEGER, SqlTypeName#SMALLINT and SqlTypeName#TINYINT.
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
-
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
-
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlEqualsExpression
-
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
-
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
-
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanExpression
-
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
-
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
-
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlGreaterThanOrEqualsExpression
-
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
-
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
-
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanExpression
-
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
-
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
-
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlLessThanOrEqualsExpression
-
- compare(CharSequence, CharSequence) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
-
- compare(Boolean, Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
-
- compare(Number, Number) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlNotEqualsExpression
-
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Largest
-
Deprecated.
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Natural
-
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Reversed
-
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Smallest
-
Deprecated.
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByKey
-
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByValue
-
- compareTo(ByteArray) - Method in class org.apache.beam.runners.spark.util.ByteArray
-
- compareTo(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKey
-
ByteKey implements Comparable<ByteKey> by comparing the arrays in lexicographic order.
- compareTo(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
- compileBeamPipeline(String, Pipeline, BeamSqlEnv) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamQueryPlanner
-
compileBeamPipeline translates a SQL statement to be executed as a Beam data flow, which is linked with the given pipeline.
- compilePipeline(String, BeamSqlEnv) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlCli
-
- compilePipeline(String, Pipeline, BeamSqlEnv) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlCli
-
- complete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Constructs a Watch.Growth.PollResult with the given outputs and declares that there will be no new outputs for the current input.
- complete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
- COMPONENT_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- compose() - Static method in class org.apache.beam.sdk.transforms.CombineFns
-
- ComposeCombineFnBuilder() - Constructor for class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
-
- CompositeSource - Class in org.apache.beam.runners.spark.metrics
-
Composite source made up of several MetricRegistry instances.
- CompositeSource(String, MetricRegistry...) - Constructor for class org.apache.beam.runners.spark.metrics.CompositeSource
-
- CompressedReader(CompressedSource<T>, FileBasedSource.FileBasedReader<T>) - Constructor for class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Create a CompressedReader from a CompressedSource and delegate reader.
- CompressedSource<T> - Class in org.apache.beam.sdk.io
-
A Source that reads from compressed files.
- CompressedSource.CompressedReader<T> - Class in org.apache.beam.sdk.io
-
- CompressedSource.CompressionMode - Enum in org.apache.beam.sdk.io
-
- CompressedSource.DecompressingChannelFactory - Interface in org.apache.beam.sdk.io
-
Factory interface for creating channels that decompress the content of an underlying channel.
- Compression - Enum in org.apache.beam.sdk.io
-
Various compression types for reading/writing files.
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
-
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
-
- ConfigurationLocator() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
-
- configure() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new builder for a Window transform for setting windowing parameters other than the windowing function.
- connect() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Connect to the Redis instance.
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ConnectionConfiguration
-
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
-
- connectToSpanner() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BooleanCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ByteCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.Coder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DurationCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.InstantCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.KvCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
LengthPrefixCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
NullableCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VoidCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- ConsoleIO - Class in org.apache.beam.runners.spark.io
-
Print to console.
- ConsoleIO.Write - Class in org.apache.beam.runners.spark.io
-
Write to the console.
- ConsoleIO.Write.Unbound<T> - Class in org.apache.beam.runners.spark.io
-
- constant(FileBasedSink.FilenamePolicy, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
- constant(FileBasedSink.FilenamePolicy) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
A specialization of #constant(FilenamePolicy, SerializableFunction) for the case where UserT and OutputT are the same type and the format function is the identity.
- constant(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
-
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.AvroIO
-
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
-
- contains(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Returns true if this set contains the specified element.
- contains(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window contains the given window.
- containsInAnyOrder(T...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
- containsInAnyOrder(Iterable<T>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
- containsInAnyOrder(T...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
- containsInAnyOrder(Iterable<T>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
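A minimal test sketch, assuming JUnit 4 and the direct runner are on the classpath:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class ContainsInAnyOrderTest {
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void elementsMatchRegardlessOfOrder() {
        PCollection<Integer> out = p.apply(Create.of(3, 1, 2));
        PAssert.that(out).containsInAnyOrder(1, 2, 3);   // order is not asserted
        p.run().waitUntilFinish();
      }
    }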
- containsKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns true if the specified ByteKey is contained within this range.
- Context(boolean) - Constructor for class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
-
- Context() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.Context
-
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
-
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Continuously watches for new files at the given interval until the given termination
condition is reached, where the input to the condition is the filepattern.
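For example, a continuously watching match might look like the following sketch, assuming a Pipeline p and the corresponding imports; the path and durations are illustrative:

    PCollection<MatchResult.Metadata> newFiles =
        p.apply(FileIO.match()
            .filepattern("/tmp/incoming/*.csv")                 // illustrative filepattern
            .continuously(
                Duration.standardSeconds(30),                   // poll interval
                Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));  // stop when quiet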
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamFilterRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSourceRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamProjectRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
-
- convertToArgs(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
- convertToBagSpecInternal(StateSpec<CombiningState<InputT, AccumT, OutputT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- convertToBeamRel(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamQueryPlanner
-
It parses and validates the input query, then converts it into a BeamRelNode tree.
- convertToFileResourceIfPossible(String) - Static method in class org.apache.beam.sdk.io.FileBasedSink
-
This is a helper function for converting a user-provided output filename prefix into a ResourceId for writing output files.
- copy() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns a copy of this RandomAccessData.
- copy(RelTraitSet, RelNode, boolean, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
-
- copy(RelTraitSet, RelNode, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamFilterRel
-
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
-
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
-
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
-
- copy(RelTraitSet, RelNode, List<RexNode>, RelDataType) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamProjectRel
-
- copy(RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
-
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
-
- copy(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Copies a List of file-like resources from one location to another.
- copy(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Copies a List of file-like resources from one location to another.
- copyFrom(ByteBuffer) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new ByteKey backed by a copy of the data remaining in the specified ByteBuffer.
- copyFrom(byte[]) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new ByteKey backed by a copy of the specified byte[].
- count() - Method in class org.apache.beam.sdk.metrics.DistributionResult
-
- Count - Class in org.apache.beam.sdk.transforms
-
- countAsserts(Pipeline) - Static method in class org.apache.beam.sdk.testing.PAssert
-
- Counter - Interface in org.apache.beam.sdk.metrics
-
A metric that reports a single long value and can be incremented or decremented.
- counter(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
- counter(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
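A sketch of a DoFn that increments such a counter; the namespace class and metric name are illustrative:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    public class CountingFn extends DoFn<String, String> {
      private final Counter seen = Metrics.counter(CountingFn.class, "elements-seen");

      @ProcessElement
      public void process(ProcessContext c) {
        seen.inc();              // aggregated across workers by summation
        c.output(c.element());
      }
    }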
- CounterMark(long, Instant) - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Creates a checkpoint mark reflecting the last emitted value.
- counters() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the counters that matched the filter.
- CountingSource - Class in org.apache.beam.sdk.io
-
- CountingSource.CounterMark - Class in org.apache.beam.sdk.io
-
The checkpoint for an unbounded CountingSource is simply the last value produced.
- CrashingRunner - Class in org.apache.beam.sdk.testing
-
- CrashingRunner() - Constructor for class org.apache.beam.sdk.testing.CrashingRunner
-
- create(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowClient
-
- create(PCollectionView<?>, Coder<T>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.flink.DefaultParallelismFactory
-
- create(boolean) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
-
- create() - Static method in class org.apache.beam.runners.reference.job.ReferenceRunnerJobService
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkPipelineOptions.TmpCheckpointDirFactory
-
- create() - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with default options.
- create(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with specified options.
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
-
- create(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.util.SideInputBroadcast
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
-
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
-
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.SortValues
-
Returns a SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> PTransform.
- create(List<String>, List<Integer>) - Static method in class org.apache.beam.sdk.extensions.sql.BeamRecordSqlType
-
- create(String[], String, String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration.
- create(WritableByteChannel) - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- create(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
-
- create(EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
- create(ResourceIdT, CreateOptions) - Method in class org.apache.beam.sdk.io.FileSystem
-
Returns a write channel for the given ResourceIdT.
- create(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
- create(ResourceId, CreateOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
-
- create(MatchResult.Status, List<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
-
- create(MatchResult.Status, IOException) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
-
- create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
Creates a new group.
- create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- create(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
-
- create(DataSource) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
- create(String, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
- create(String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Describe a connection configuration to the MQTT broker.
- create(String, String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Describe a connection configuration to the MQTT broker.
- create() - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
- create(String, int) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
- create(String) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
-
Creates a new Solr connection configuration.
- create(long, long, long, long) - Static method in class org.apache.beam.sdk.metrics.DistributionResult
-
- create(long, Instant) - Static method in class org.apache.beam.sdk.metrics.GaugeResult
-
- create(PipelineOptions) - Method in interface org.apache.beam.sdk.options.DefaultValueFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
-
- create() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements PipelineOptions using the values configured on this builder during construction.
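A typical construction sketch, parsing command-line arguments with validation before building the options:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class OptionsExample {
      public static void main(String[] args) {
        PipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().create();  // Builder.create()
        Pipeline p = Pipeline.create(options);
        p.run().waitUntilFinish();
      }
    }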
- create() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
- create() - Static method in class org.apache.beam.sdk.Pipeline
-
- create(PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
-
- create() - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
Creates and returns a new test pipeline.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
-
- create(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
- create(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns an approximate quantiles combiner with the given compareFn and desired number of quantiles.
- create(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
- create(int, ComparatorT, long, double) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Creates an approximate quantiles combiner with the given compareFn and desired number of quantiles.
- Create<T> - Class in org.apache.beam.sdk.transforms
-
Create<T> takes a collection of elements of type T known when the pipeline is constructed and returns a PCollection<T> containing the elements.
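A minimal sketch of building an in-memory PCollection with Create; the element values are illustrative:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    public class CreateExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();                     // default options
        PCollection<String> words = p.apply(Create.of("alpha", "beta", "gamma"));
        p.run().waitUntilFinish();
      }
    }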
- Create() - Constructor for class org.apache.beam.sdk.transforms.Create
-
- create() - Static method in class org.apache.beam.sdk.transforms.Distinct
-
Returns a Distinct<T> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns a GroupByKey<K, V> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
-
Returns a CoGroupByKey<K> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.Keys
-
Returns a Keys<K> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.KvSwap
-
Returns a KvSwap<K, V> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.Values
-
Returns a Values<V> PTransform.
- Create.OfValueProvider<T> - Class in org.apache.beam.sdk.transforms
-
- Create.TimestampedValues<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection whose elements have associated timestamps.
- Create.Values<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection from a set of in-memory objects.
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
- createContextual(DeserializationContext, BeanProperty) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
-
- CreateDataflowView<ElemT,ViewT> - Class in org.apache.beam.runners.dataflow
-
- createDecompressingChannel(ReadableByteChannel) - Method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- createDecompressingChannel(ReadableByteChannel) - Method in interface org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory
-
Given a channel, create a channel that decompresses the content read from the channel.
- createDefault() - Static method in class org.apache.beam.sdk.coders.CoderRegistry
-
Creates a CoderRegistry containing registrations for all standard coders that are part of the core Java Apache Beam SDK, plus any registrations provided by coder registrars.
- createForSubrangeOfFile(String, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
-
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
-
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource for the specified range in a single file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a CompressedSource for a subrange of a file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns a new FileBasedSource of the same type as the current FileBasedSource, backed by a given file and an offset range.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.xml.XmlSource
-
- CreateGearpumpPCollectionViewTranslator<ElemT,ViewT> - Class in org.apache.beam.runners.gearpump.translators
-
CreateGearpumpPCollectionView bridges the input stream to downstream transforms.
- CreateGearpumpPCollectionViewTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.CreateGearpumpPCollectionViewTranslator
-
- createInputFormatInstance() - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
-
Creates an instance of the InputFormat class.
- createJar(File, File) - Static method in class org.apache.beam.runners.apex.ApexYarnLauncher
-
Create a jar file from the given directory.
- createJob(Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Creates the Dataflow Job.
- CreateOptions - Class in org.apache.beam.sdk.io.fs
-
An abstract class that contains common configuration options for creating resources.
- CreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions
-
- CreateOptions.Builder<BuilderT extends CreateOptions.Builder<BuilderT>> - Class in org.apache.beam.sdk.io.fs
-
- CreateOptions.StandardCreateOptions - Class in org.apache.beam.sdk.io.fs
-
Standard configuration options with a builder.
- CreateOptions.StandardCreateOptions.Builder - Class in org.apache.beam.sdk.io.fs
-
- createPane(boolean, boolean, PaneInfo.Timing) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
- createPane(boolean, boolean, PaneInfo.Timing, long, long) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Factory method to create a PaneInfo with the specified parameters.
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- createProcessContext(ValueInSingleWindow<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- createReader(PipelineOptions) - Method in class org.apache.beam.runners.gearpump.translators.io.BoundedSourceWrapper
-
- createReader(PipelineOptions) - Method in class org.apache.beam.runners.gearpump.translators.io.GearpumpSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.runners.gearpump.translators.io.UnboundedSourceWrapper
-
- createReader(PipelineOptions, UnboundedSource.CheckpointMark) - Method in class org.apache.beam.runners.gearpump.translators.io.ValuesSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
- createReader(CassandraIO.CassandraSource<T>) - Method in interface org.apache.beam.sdk.io.cassandra.CassandraService
-
- createReader(CassandraIO.CassandraSource<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.HadoopInputFormatBoundedSource
-
- createReader(PipelineOptions, JmsCheckpointMark) - Method in class org.apache.beam.sdk.io.jms.JmsIO.UnboundedJmsSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
-
- createReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.sdk.io.UnboundedSource
-
- createRunner(ReadyCheckingSideInputReader) - Method in class org.apache.beam.runners.gearpump.translators.utils.DoFnRunnerFactory
-
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.AvroSource
-
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedReader.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a FileBasedReader to read a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns an instance of a FileBasedReader implementation for the current source, assuming the source represents a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlSource
-
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
- CreateStream<T> - Class in org.apache.beam.runners.spark.io
-
Create an input stream from a Queue.
- createStructuralValues(Coder<T>, List<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Testing utilities below depend on standard assertions and matchers to compare elements read by
sources.
- CreateTables<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Creates any tables needed before performing streaming writes to the tables.
- CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
-
- createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Returns a transform that creates a batch transaction.
- CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
- createWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
- createWriter(CassandraIO.Write<T>) - Method in interface org.apache.beam.sdk.io.cassandra.CassandraService
-
- createWriter(CassandraIO.Write<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
-
- createWriter() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
- CredentialFactory - Interface in org.apache.beam.sdk.extensions.gcp.auth
-
Construct an OAuth credential to be used by the SDK and the SDK workers.
- csvFormat - Variable in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOReader
-
- csvFormat - Variable in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOWriter
-
- csvLine2BeamSqlRow(CSVFormat, String, BeamRecordSqlType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
- CsvRecorderDecoder(BeamRecordSqlType, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaCSVTable.CsvRecorderDecoder
-
- CsvRecorderEncoder(BeamRecordSqlType, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaCSVTable.CsvRecorderEncoder
-
- CsvSink - Class in org.apache.beam.runners.spark.metrics.sink
-
- CsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
-
- ctxt - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
-
- current() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
-
- currentEventTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current event time.
- currentInputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentOutputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current processing time.
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
- currentRestriction() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Returns a restriction accurately describing the full range of work the current DoFn.ProcessElement
call will do, including already completed work.
- currentSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentSynchronizedProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current synchronized processing time, or null if unknown.
- CUSTOM_SOURCE_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- CustomCoder<T> - Class in org.apache.beam.sdk.coders
-
- CustomCoder() - Constructor for class org.apache.beam.sdk.coders.CustomCoder
-
- DataflowClient - Class in org.apache.beam.runners.dataflow
-
Wrapper around the generated Dataflow client to provide common functionality.
- DataflowClientFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
-
- DataflowJobAlreadyExistsException - Exception in org.apache.beam.runners.dataflow
-
An exception that is thrown if the unique job name constraint of the Dataflow
service is broken because an existing job with the same job name is currently active.
- DataflowJobAlreadyExistsException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException
-
Create a new
DataflowJobAlreadyExistsException
with the specified
DataflowPipelineJob
and message.
- DataflowJobAlreadyUpdatedException - Exception in org.apache.beam.runners.dataflow
-
An exception that is thrown if the existing job has already been updated within the Dataflow
service and is no longer able to be updated.
- DataflowJobAlreadyUpdatedException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException
-
Create a new DataflowJobAlreadyUpdatedException with the specified DataflowPipelineJob and message.
- DataflowJobException - Exception in org.apache.beam.runners.dataflow
-
- DataflowPipelineDebugOptions - Interface in org.apache.beam.runners.dataflow.options
-
Internal.
- DataflowPipelineDebugOptions.DataflowClientFactory - Class in org.apache.beam.runners.dataflow.options
-
Returns the default Dataflow client built from the passed in PipelineOptions.
- DataflowPipelineDebugOptions.StagerFactory - Class in org.apache.beam.runners.dataflow.options
-
- DataflowPipelineJob - Class in org.apache.beam.runners.dataflow
-
A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Constructs the job.
- DataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
-
- DataflowPipelineOptions.StagingLocationFactory - Class in org.apache.beam.runners.dataflow.options
-
- DataflowPipelineRegistrar - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineRegistrar.Options - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineRegistrar.Runner - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineTranslator - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineTranslator.JobSpecification - Class in org.apache.beam.runners.dataflow
-
The result of a job translation.
- DataflowPipelineWorkerPoolOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used to configure the Dataflow pipeline worker pool.
- DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType - Enum in org.apache.beam.runners.dataflow.options
-
Type of autoscaling algorithm to use.
- DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory - Class in org.apache.beam.runners.dataflow.options
-
Returns the default Docker container image that executes the Dataflow worker harness, residing in Google Container Registry.
- DataflowProfilingAgentConfiguration() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
-
- DataflowProfilingOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options for controlling profiling of pipeline execution.
- DataflowProfilingOptions.DataflowProfilingAgentConfiguration - Class in org.apache.beam.runners.dataflow.options
-
Configuration for the profiling agent.
- DataflowRunner - Class in org.apache.beam.runners.dataflow
-
A PipelineRunner that executes the operations in the pipeline by first translating them to the Dataflow representation using the DataflowPipelineTranslator and then submitting them to a Dataflow service for execution.
- DataflowRunner(DataflowPipelineOptions) - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner
-
- DataflowRunner.StreamingPCollectionViewWriterFn<T> - Class in org.apache.beam.runners.dataflow
-
- DataflowRunnerHooks - Class in org.apache.beam.runners.dataflow
-
An instance of this class can be passed to the DataflowRunner to add user defined hooks to be invoked at various times during pipeline execution.
- DataflowRunnerHooks() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunnerHooks
-
- DataflowRunnerInfo - Class in org.apache.beam.runners.dataflow
-
- DataflowServiceException - Exception in org.apache.beam.runners.dataflow
-
Signals there was an error retrieving information about a job from the Cloud Dataflow Service.
- DataflowTemplateJob - Class in org.apache.beam.runners.dataflow.util
-
- DataflowTemplateJob() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- DataflowTransport - Class in org.apache.beam.runners.dataflow.util
-
Helpers for cloud communication.
- DataflowTransport() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTransport
-
- DataflowWorkerHarnessOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used exclusively within the Dataflow worker harness.
- DataflowWorkerLoggingOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used to control logging configuration on the Dataflow worker.
- DataflowWorkerLoggingOptions.Level - Enum in org.apache.beam.runners.dataflow.options
-
The set of log levels that can be used on the Dataflow worker.
- DataflowWorkerLoggingOptions.WorkerLogLevelOverrides - Class in org.apache.beam.runners.dataflow.options
-
Defines a log level override for a specific class, package, or name.
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
- DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities from Cloud Datastore.
- DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that deletes Entities associated with the given Keys from Cloud Datastore.
- DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that reads the result rows of a Cloud Datastore query as Entity objects.
- DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
-
A PTransform that writes Entity objects to Cloud Datastore.
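For illustration, a hedged sketch of reading and then deleting Datastore entities via DatastoreIO.v1(); the pipeline p, project id, and GQL query are placeholder assumptions, not part of these entries.

    import com.google.datastore.v1.Entity;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read Task entities with a literal GQL query (placeholder project id and query).
    PCollection<Entity> tasks =
        p.apply(
            DatastoreIO.v1().read()
                .withProjectId("my-project")
                .withLiteralGqlQuery("SELECT * FROM Task"));
    // Delete the same entities from Cloud Datastore.
    tasks.apply(DatastoreIO.v1().deleteEntity().withProjectId("my-project"));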
- days(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
Returns a WindowFn that windows elements into periods measured by days.
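For example, a minimal sketch that windows an assumed PCollection<String> named events into one-day calendar windows:

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // Assign each element to the calendar day containing its timestamp.
    PCollection<String> daily =
        events.apply("DailyWindows", Window.<String>into(CalendarWindows.days(1)));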
- dec() - Method in interface org.apache.beam.sdk.metrics.Counter
-
- dec(long) - Method in interface org.apache.beam.sdk.metrics.Counter
-
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.AvroCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BeamRecordCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.Coder
-
Decodes a value of type T from the given input stream in the given context.
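A minimal round-trip sketch through a concrete coder (StringUtf8Coder is used only as an example; encode and decode may throw IOException):

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    StringUtf8Coder.of().encode("hello", bytes);   // serialize the value
    String decoded =
        StringUtf8Coder.of().decode(new ByteArrayInputStream(bytes.toByteArray())); // deserialize it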
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.BooleanCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.DateCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.DoubleCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.FloatCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.ShortCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.TimeCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAccumulatorCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- decodePane(byte) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.CollectionCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
-
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
-
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.SetCoder
-
Builds an instance of IterableT, this coder's associated Iterable-like subtype, from a list of decoded elements.
- deduceOutputType(SqlTypeName, SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
-
- Default - Annotation Type in org.apache.beam.sdk.options
-
Default represents a set of annotations that can be used to annotate getter properties on PipelineOptions with information representing the default value to be returned if no value is specified.
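A hedged sketch of a PipelineOptions interface annotated with Default; the interface and property names are illustrative:

    import org.apache.beam.sdk.options.Default;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;

    public interface MyOptions extends PipelineOptions {
      @Description("Number of output shards")
      @Default.Integer(3)               // used when --numShards is not supplied
      int getNumShards();
      void setNumShards(int value);
    }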
- Default.Boolean - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified boolean primitive value.
- Default.Byte - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified byte primitive value.
- Default.Character - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified char primitive value.
- Default.Class - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified Class value.
- Default.Double - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified double primitive value.
- Default.Enum - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified enum.
- Default.Float - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified float primitive value.
- Default.InstanceFactory - Annotation Type in org.apache.beam.sdk.options
-
- Default.Integer - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified int primitive value.
- Default.Long - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified long primitive value.
- Default.Short - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified short primitive value.
- Default.String - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified String value.
- DEFAULT_BYTE_ARRAY_CODER - Static variable in class org.apache.beam.sdk.io.TFRecordIO
-
The default coder, which returns each record of the input file as a byte array.
- DEFAULT_MAX_NUM_ELEMENTS - Static variable in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
The cost (in time and space) to compute quantiles to a given
accuracy is a function of the total number of elements in the
data set.
- DEFAULT_QUEUE_MAX_POLL_TIME - Static variable in class org.apache.beam.sdk.io.tika.TikaIO.Read
-
- DEFAULT_QUEUE_POLL_TIME - Static variable in class org.apache.beam.sdk.io.tika.TikaIO.Read
-
- DEFAULT_SCHEME - Static variable in class org.apache.beam.sdk.io.FileSystems
-
- DEFAULT_UNWINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
The default sharding name template.
- DefaultCoder - Annotation Type in org.apache.beam.sdk.coders
-
The DefaultCoder annotation specifies a Coder class to handle encoding and decoding instances of the annotated class.
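An illustrative sketch of DefaultCoder on a user type (the class itself is hypothetical):

    import org.apache.beam.sdk.coders.AvroCoder;
    import org.apache.beam.sdk.coders.DefaultCoder;

    // Elements of this type will be encoded with AvroCoder unless another coder is set explicitly.
    @DefaultCoder(AvroCoder.class)
    public class MyRecord {
      public String id;
      public long count;
    }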
- DefaultCoder.DefaultCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
-
- DefaultCoderCloudObjectTranslatorRegistrar - Class in org.apache.beam.runners.dataflow.util
-
- DefaultCoderCloudObjectTranslatorRegistrar() - Constructor for class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
-
- DefaultCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
-
- DefaultConcludeTransform() - Constructor for class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
-
- DefaultFilenamePolicy - Class in org.apache.beam.sdk.io
-
- DefaultFilenamePolicy.Params - Class in org.apache.beam.sdk.io
-
- DefaultFilenamePolicy.ParamsCoder - Class in org.apache.beam.sdk.io
-
- DefaultParallelismFactory - Class in org.apache.beam.runners.flink
-
- DefaultParallelismFactory() - Constructor for class org.apache.beam.runners.flink.DefaultParallelismFactory
-
- DefaultPipelineOptionsRegistrar - Class in org.apache.beam.sdk.options
-
- DefaultPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
-
- DefaultProjectFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
-
- Defaults() - Constructor for class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
-
- DefaultStopPipelineWatermarkFactory() - Constructor for class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
-
- DefaultTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
A trigger that is equivalent to Repeatedly.forever(AfterWatermark.pastEndOfWindow()).
- defaultValue() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the default value when there are no values added to the accumulator.
- defaultValue() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the default value when there are no values added to the accumulator.
- defaultValue() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
- defaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
Returns the default value of this transform, or null if there isn't one.
- DefaultValueFactory<T> - Interface in org.apache.beam.sdk.options
-
- delay(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- Delay() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
-
- delegate() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
-
- delegate(HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register display data from the specified component on behalf of the current component.
- DelegateCoder<T,IntermediateT> - Class in org.apache.beam.sdk.coders
-
A DelegateCoder<T, IntermediateT> wraps a Coder for IntermediateT and encodes/decodes values of type T by converting to/from IntermediateT and then encoding/decoding using the underlying Coder<IntermediateT>.
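A minimal sketch, assuming java.net.URI as the target type and StringUtf8Coder as the underlying coder:

    import java.net.URI;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.DelegateCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    // Encode URIs by converting them to Strings and delegating to StringUtf8Coder.
    Coder<URI> uriCoder =
        DelegateCoder.of(
            StringUtf8Coder.of(),
            new DelegateCoder.CodingFunction<URI, String>() {
              @Override
              public String apply(URI uri) {
                return uri.toString();
              }
            },
            new DelegateCoder.CodingFunction<String, URI>() {
              @Override
              public URI apply(String encoded) {
                return URI.create(encoded);
              }
            });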
- DelegateCoder(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.DelegateCoder
-
- DelegateCoder.CodingFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.coders
-
- delete(Collection<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Deletes a collection of resources.
- delete(Collection<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Deletes a collection of resources.
- deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
- deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
- deleteTimer(StateNamespace, String, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- deleteTimer(StateNamespace, String) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- deleteTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
-
Removes the timer set in this context for the timestamp and timeDomain.
- dependsOnlyOnEarliestTimestamp() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns true if the result of combining many output timestamps actually depends only on the earliest.
- dependsOnlyOnWindow() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns true if the result does not depend on what outputs were combined but only on the window they are in.
- describeMismatchSafely(PipelineResult, Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
-
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
-
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.RegexMatcher
-
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
-
- Description - Annotation Type in org.apache.beam.sdk.options
-
Descriptions are used to generate human-readable output when the --help command is specified.
- deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
-
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
-
- deserializeTimers(Collection<byte[]>, TimerInternals.TimerDataCoder) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- detect(String) - Static method in enum org.apache.beam.sdk.io.Compression
-
- detectClassPathResourcesToStage(ClassLoader) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Attempts to detect all the resources the class loader has access to.
- detectClassPathResourcesToStage(ClassLoader) - Static method in class org.apache.beam.runners.flink.FlinkRunner
-
Attempts to detect all the resources the class loader has access to.
- DirectOptions - Interface in org.apache.beam.runners.direct
-
- DirectOptions.AvailableParallelismFactory - Class in org.apache.beam.runners.direct
-
- DIRECTORY_CONTAINER - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
-
Shard is a file within a directory.
- DirectRegistrar - Class in org.apache.beam.runners.direct
-
- DirectRegistrar.Options - Class in org.apache.beam.runners.direct
-
- DirectRegistrar.Runner - Class in org.apache.beam.runners.direct
-
- DirectRunner - Class in org.apache.beam.runners.direct
-
- DirectRunner() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
-
- DirectRunner.DirectPipelineResult - Class in org.apache.beam.runners.direct
-
- DirectTestOptions - Interface in org.apache.beam.runners.direct
-
Internal-only options for tweaking the behavior of the DirectRunner in ways that users should never do.
- DISALLOW_COMBINER_LIFTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- discardingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new Window PTransform that uses the registered WindowFn and triggering behavior, and that discards elements in a pane after they are triggered.
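For illustration, a hedged windowing sketch combining a trigger with discarding mode; the window size, trigger count, and input name are arbitrary assumptions:

    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    PCollection<String> windowed =
        input.apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
                .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(100)))
                .withAllowedLateness(Duration.ZERO)
                .discardingFiredPanes()); // a pane's elements are emitted once and then discarded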
- dispatchBag(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchBag(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchDefault() - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchMap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchMap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchSet(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchSet(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchValue(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchValue(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- DISPLAY_DATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- DisplayData - Class in org.apache.beam.sdk.transforms.display
-
Static display data associated with a pipeline component.
- DisplayData.Builder - Interface in org.apache.beam.sdk.transforms.display
-
Utility to build up display data from a component and its included
subcomponents.
- DisplayData.Identifier - Class in org.apache.beam.sdk.transforms.display
-
Unique identifier for a display data item within a component.
- DisplayData.Item - Class in org.apache.beam.sdk.transforms.display
-
Items are the unit of display data.
- DisplayData.ItemSpec<T> - Class in org.apache.beam.sdk.transforms.display
-
- DisplayData.Path - Class in org.apache.beam.sdk.transforms.display
-
Structured path of registered display data within a component hierarchy.
- DisplayData.Type - Enum in org.apache.beam.sdk.transforms.display
-
Display data type.
- distance(long, long) - Static method in class org.apache.beam.sdk.io.cassandra.CassandraServiceImpl
-
Measures the distance between two tokens.
- Distinct<T> - Class in org.apache.beam.sdk.transforms
-
Distinct<T> takes a PCollection<T> and returns a PCollection<T> that has all distinct elements of the input.
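A minimal usage sketch, assuming an existing PCollection<String> named words:

    // Keep only one occurrence of each distinct word.
    PCollection<String> distinctWords = words.apply(Distinct.<String>create());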
- Distinct() - Constructor for class org.apache.beam.sdk.transforms.Distinct
-
- Distinct.WithRepresentativeValues<T,IdT> - Class in org.apache.beam.sdk.transforms
-
- Distribution - Interface in org.apache.beam.sdk.metrics
-
A metric that reports information about the distribution of reported values.
- distribution(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
- distribution(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
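A hedged sketch of reporting a distribution from inside a DoFn; the class and metric names are illustrative:

    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    static class RecordSizesFn extends DoFn<String, String> {
      private final Distribution lineLengths =
          Metrics.distribution(RecordSizesFn.class, "lineLengths");

      @ProcessElement
      public void processElement(ProcessContext c) {
        lineLengths.update(c.element().length()); // records min/max/sum/count of line lengths
        c.output(c.element());
      }
    }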
- DistributionResult - Class in org.apache.beam.sdk.metrics
-
- DistributionResult() - Constructor for class org.apache.beam.sdk.metrics.DistributionResult
-
- distributions() - Method in interface org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the distributions that matched the filter.
- doChecks(PAssert.PAssertionSite, ActualT, SerializableFunction<ActualT, Void>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
- DoFn<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
The argument to ParDo providing the code to use to process elements of the input PCollection.
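For example, a minimal DoFn used with ParDo; the input PCollection<String> named lines is assumed:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;

    static class ExtractWordsFn extends DoFn<String, String> {
      @ProcessElement
      public void processElement(ProcessContext c) {
        // Emit each whitespace-separated token of the input line.
        for (String word : c.element().split("\\s+")) {
          if (!word.isEmpty()) {
            c.output(word);
          }
        }
      }
    }

    PCollection<String> words = lines.apply(ParDo.of(new ExtractWordsFn()));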
- DoFn() - Constructor for class org.apache.beam.sdk.transforms.DoFn
-
- DoFn.BoundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.FinishBundle - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to finish processing a batch of elements.
- DoFn.FinishBundleContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.GetInitialRestriction - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that maps an element to an initial restriction for a splittable DoFn.
- DoFn.GetRestrictionCoder - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the coder to use for the restriction of a splittable DoFn.
- DoFn.NewTracker - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.OnTimer - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for registering a callback for a timer.
- DoFn.OnTimerContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.OutputReceiver<T> - Interface in org.apache.beam.sdk.transforms
-
Receives values of the given type.
- DoFn.ProcessContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.ProcessContinuation - Class in org.apache.beam.sdk.transforms
-
When used as a return value of DoFn.ProcessElement, indicates whether there is more work to be done for the current element.
- DoFn.ProcessElement - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use for processing elements.
- DoFn.Setup - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing bundles of elements.
- DoFn.SplitRestriction - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that splits the restriction of a splittable DoFn into multiple parts to be processed in parallel.
- DoFn.StartBundle - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing a batch of elements.
- DoFn.StartBundleContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.StateId - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing state cells.
- DoFn.Teardown - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to clean up this instance after processing bundles of
elements.
- DoFn.TimerId - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing timers.
- DoFn.UnboundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.WindowedContext - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in this DoFn where the context is in some window.
- DoFnFunction<InputT,OutputT> - Class in org.apache.beam.runners.gearpump.translators.functions
-
Gearpump FlatMapFunction wrapper over Beam DoFn.
- DoFnFunction(GearpumpPipelineOptions, DoFn<InputT, OutputT>, WindowingStrategy<?, ?>, Collection<PCollectionView<?>>, Map<String, PCollectionView<?>>, TupleTag<OutputT>, List<TupleTag<?>>) - Constructor for class org.apache.beam.runners.gearpump.translators.functions.DoFnFunction
-
- DoFnRunnerFactory<InputT,OutputT> - Class in org.apache.beam.runners.gearpump.translators.utils
-
A serializable SimpleDoFnRunner.
- DoFnRunnerFactory(GearpumpPipelineOptions, DoFn<InputT, OutputT>, Collection<PCollectionView<?>>, DoFnRunners.OutputManager, TupleTag<OutputT>, List<TupleTag<?>>, StepContext, WindowingStrategy<?, ?>) - Constructor for class org.apache.beam.runners.gearpump.translators.utils.DoFnRunnerFactory
-
- DoFnRunnerWithMetricsUpdate<InputT,OutputT> - Class in org.apache.beam.runners.flink.metrics
-
DoFnRunner decorator which registers MetricsContainerImpl.
- DoFnRunnerWithMetricsUpdate(String, DoFnRunner<InputT, OutputT>, RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
-
- DoFnTester<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
A harness for unit-testing a DoFn.
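A hedged unit-test sketch using DoFnTester with the ExtractWordsFn sketched earlier (processBundle may throw Exception):

    import java.util.List;
    import org.apache.beam.sdk.transforms.DoFnTester;

    DoFnTester<String, String> fnTester = DoFnTester.of(new ExtractWordsFn());
    List<String> outputs = fnTester.processBundle("hello world"); // expected: ["hello", "world"]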
- DoFnTester.CloningBehavior - Enum in org.apache.beam.sdk.transforms
-
When a DoFnTester should clone the DoFn under test and how it should manage the lifecycle of the DoFn.
- DoubleCoder - Class in org.apache.beam.sdk.coders
-
A DoubleCoder encodes Double values in 8 bytes using Java serialization.
- doubles() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the maximum of the input PCollection's elements, or Double.NEGATIVE_INFINITY if there are no elements.
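A usage sketch, assuming a PCollection<Double> named temperatures:

    // Single-element PCollection holding the global maximum.
    PCollection<Double> maxTemp = temperatures.apply(Max.doublesGlobally());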
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the minimum of the input PCollection's elements, or Double.POSITIVE_INFINITY if there are no elements.
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform that takes an input PCollection<Double> and returns a PCollection<Double> whose contents is the sum of the input PCollection's elements, or 0 if there are no elements.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the maximum of the values associated with that key in the input PCollection.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the minimum of the values associated with that key in the input PCollection.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform that takes an input PCollection<KV<K, Double>> and returns a PCollection<KV<K, Double>> that contains an output element mapping each distinct key in the input PCollection to the sum of the values associated with that key in the input PCollection.
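A usage sketch, assuming a PCollection<KV<String, Double>> named purchases keyed by user id:

    // One output element per distinct key, holding the per-key sum.
    PCollection<KV<String, Double>> totalPerUser = purchases.apply(Sum.<String>doublesPerKey());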
- DurationCoder - Class in org.apache.beam.sdk.coders
-
- DynamicAvroDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- DynamicAvroDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicAvroDestinations
-
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
- DynamicDestinations<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This class provides the most general way of specifying dynamic BigQuery table destinations.
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
- DynamicFileDestinations - Class in org.apache.beam.sdk.io
-
- DynamicFileDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicFileDestinations
-
- eitherOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- ElasticsearchIO - Class in org.apache.beam.sdk.io.elasticsearch
-
Transforms for reading and writing data from/to Elasticsearch.
- ElasticsearchIO.BoundedElasticsearchSource - Class in org.apache.beam.sdk.io.elasticsearch
-
- ElasticsearchIO.ConnectionConfiguration - Class in org.apache.beam.sdk.io.elasticsearch
-
A POJO describing a connection configuration to Elasticsearch.
- ElasticsearchIO.Read - Class in org.apache.beam.sdk.io.elasticsearch
-
- ElasticsearchIO.Write - Class in org.apache.beam.sdk.io.elasticsearch
-
- element() - Method in class org.apache.beam.sdk.transforms.DoFn.ProcessContext
-
Returns the input element to be processed.
- element() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
-
Returns the current element.
- elementCountAtLeast(int) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
Creates a trigger that fires when the pane contains at least countElems elements.
- ElementEvent() - Constructor for class org.apache.beam.sdk.testing.TestStream.ElementEvent
-
- elements() - Static method in class org.apache.beam.sdk.transforms.ToString
-
Transforms each element of the input PCollection to a String using the Object.toString() method.
- elementsRead() - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of elements read by a source.
- elementsReadBySplit(String) - Static method in class org.apache.beam.sdk.metrics.SourceMetrics
-
Counter of elements read by a source split.
- elementsWritten() - Static method in class org.apache.beam.sdk.metrics.SinkMetrics
-
Counter of elements written to a sink.
- EMPTY - Static variable in class org.apache.beam.sdk.io.range.ByteKey
-
An empty key.
- empty() - Static method in class org.apache.beam.sdk.metrics.GaugeResult
-
- empty() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question is empty.
- empty() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- empty(Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection.
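A minimal sketch of creating an empty PCollection with an explicit coder (the pipeline p is assumed):

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    // With no elements to inspect, the coder must be supplied explicitly.
    PCollection<String> none = p.apply(Create.empty(StringUtf8Coder.of()));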
- empty(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces an empty PCollection.
- empty() - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- empty(Pipeline) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns an empty KeyedPCollectionTuple<K> on the given pipeline.
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionList
-
- empty(Pipeline) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
- empty() - Static method in class org.apache.beam.sdk.values.TupleTagList
-
- emptyBatch() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Adds an empty batch.
- EmptyCheckpointMark - Class in org.apache.beam.runners.spark.io
-
Passing null values to Spark's Java API may cause problems because of Guava preconditions.
- EmptyListenersList() - Constructor for class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
-
- EmptyMatchTreatment - Enum in org.apache.beam.sdk.io.fs
-
- enableAbandonedNodeEnforcement(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
Enables the abandoned node detection.
- enableAutoRunIfMissing(boolean) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
If enabled, a pipeline.run() statement will be added automatically in case it is missing in the test.
- encode(RandomAccessData, OutputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- encode(RandomAccessData, OutputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.AvroCoder
-
- encode(BeamRecord, OutputStream) - Method in class org.apache.beam.sdk.coders.BeamRecordCoder
-
- encode(BigDecimal, OutputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- encode(BigDecimal, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
- encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
- encode(BigInteger, OutputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- encode(BigInteger, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- encode(BitSet, OutputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- encode(BitSet, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- encode(Boolean, OutputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
-
- encode(byte[], OutputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- encode(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- encode(Byte, OutputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.Coder
-
Encodes the given value of type T onto the given output stream.
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- encode(Double, OutputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
- encode(ReadableDuration, OutputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
-
- encode(Instant, OutputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
-
- encode(IterableT, OutputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
- encode(KV<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
-
- encode(KV<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
- encode(Map<K, V>, OutputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
-
- encode(Map<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
- encode(ShardedKey<KeyT>, OutputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- encode(String, OutputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- encode(String, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- encode(Integer, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- encode(Integer, OutputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
- encode(Long, OutputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
- encode(Void, OutputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
-
- encode(ByteString, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- encode(ByteString, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- encode(Boolean, OutputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.BooleanCoder
-
- encode(Date, OutputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.DateCoder
-
- encode(Double, OutputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.DoubleCoder
-
- encode(Float, OutputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.FloatCoder
-
- encode(Short, OutputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.ShortCoder
-
- encode(GregorianCalendar, OutputStream) - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlRecordHelper.TimeCoder
-
- encode(BeamAggregationTransforms.AggregationAccumulator, OutputStream) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAccumulatorCoder
-
- encode(Message, OutputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
-
- encode(DefaultFilenamePolicy.Params, OutputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
-
- encode(FileBasedSink.FileResult<DestinationT>, OutputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- encode(MatchResult.Metadata, OutputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
-
- encode(ResourceId, OutputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
-
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
-
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
-
- encode(TableRow, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- encode(TableRow, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
-
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
-
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
- encode(KafkaRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- encode(KafkaRecord<K, V>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- encode(FileIO.ReadableFile, OutputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
-
- encode(T, OutputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- encode(T, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- encode(CoGbkResult, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- encode(RawUnionValue, OutputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- encode(RawUnionValue, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- encode(GlobalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
-
- encode(IntervalWindow, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- encode(PaneInfo, OutputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
-
- encode(TimestampedValue<T>, OutputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- encode(ValueInSingleWindow<T>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- encode(ValueInSingleWindow<T>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- encode(ValueWithRecordId<ValueT>, OutputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- encode(ValueWithRecordId<ValueT>, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- encodeAndOwn(byte[], OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- ENCODING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- end() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns the end of this window, exclusive.
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.gearpump.translators.GearpumpPipelineTranslator
-
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkNativePipelineVisitor
-
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
-
- enterCompositeTransform(TransformHierarchy.Node) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
-
- enterCompositeTransform(TransformHierarchy.Node) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called for each composite transform after all topological predecessors have been visited but
before any of its component transforms.
- enterPipeline(Pipeline) - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
-
- enterPipeline(Pipeline) - Method in interface org.apache.beam.sdk.Pipeline.PipelineVisitor
-
Called before visiting any values or transforms, as many uses of a visitor require access to the Pipeline object itself.
- entries() - Method in interface org.apache.beam.sdk.state.MapState
-
Returns an Iterable over the key-value pairs contained in this map.
- ENVIRONMENT_VERSION_JOB_TYPE_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- ENVIRONMENT_VERSION_MAJOR_KEY - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- equal(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are equal to a given value.
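A usage sketch, assuming a PCollection<String> named words:

    // Retain only the elements equal to "foo".
    PCollection<String> onlyFoo = words.apply(Filter.equal("foo"));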
- equals(Object) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
- equals(Object) - Method in class org.apache.beam.runners.gearpump.translators.utils.TranslatorUtils.RawUnionValue
-
- equals(Object) - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
-
- equals(Object) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
-
- equals(Object) - Method in class org.apache.beam.runners.spark.util.ByteArray
-
- equals(Object) - Method in class org.apache.beam.sdk.coders.AtomicCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.coders.AvroCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.coders.StructuralByteArray
-
- equals(Object) - Method in class org.apache.beam.sdk.coders.StructuredCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
-
- equals(Object) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.extensions.sql.BeamRecordSqlType
-
- equals(Object) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.Params
-
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
- equals(Object) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- equals(Object) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- equals(Object) - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKey
-
- equals(Object) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
- equals(Object) - Method in class org.apache.beam.sdk.io.range.OffsetRange
-
- equals(Object) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- equals(Object) - Method in class org.apache.beam.sdk.testing.TestStream
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- equals(Object) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
- equals(Object) - Method in class org.apache.beam.sdk.values.BeamRecord
-
- equals(Object) - Method in class org.apache.beam.sdk.values.KV
-
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- equals(Object) - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- equals(Object) - Method in class org.apache.beam.sdk.values.ShardedKey
-
- equals(Object) - Method in class org.apache.beam.sdk.values.TimestampedValue
-
- equals(Object) - Method in class org.apache.beam.sdk.values.TupleTag
-
- equals(Object) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Two type descriptors are equal if and only if they represent the same type.
- equals(Object) - Method in class org.apache.beam.sdk.values.TypeParameter
-
- equals(Object) - Method in class org.apache.beam.sdk.values.ValueWithRecordId
-
- equals(Object) - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- estimateFractionForKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns the fraction of this range [startKey, endKey) that is in the interval [startKey, key).
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.arithmetic.BeamSqlArithmeticExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCaseExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlCastExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlInputRefExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlPrimitive
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlReinterpretExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlUdfExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowEndExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.BeamSqlWindowStartExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlCompareExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNotNullExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.comparison.BeamSqlIsNullExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentDateExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimeExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlCurrentTimestampExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDateCeilExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlDateFloorExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.date.BeamSqlExtractExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlAndExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlNotExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.logical.BeamSqlOrExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathBinaryExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlMathUnaryExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlPiExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.math.BeamSqlRandIntegerExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlCharLengthExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlConcatExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlInitCapExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlLowerExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlOverlayExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlPositionExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlSubstringExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlTrimExpression
-
- evaluate(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.operator.string.BeamSqlUpperExpression
-
- Evaluator(SparkPipelineTranslator, EvaluationContext) - Constructor for class org.apache.beam.runners.spark.SparkRunner.Evaluator
-
- ever() - Static method in class org.apache.beam.sdk.transforms.windowing.Never
-
Returns a trigger which never fires.
- every(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Returns a new SlidingWindows with the original size that assigns timestamps into half-open intervals of the form [N * period, N * period + size), where 0 is the epoch.
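A brief usage sketch (the size and period are illustrative, and input stands for an existing PCollection<String>):

    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;

    // Ten-minute windows, with a new window starting every minute.
    input.apply(Window.<String>into(
        SlidingWindows.of(Duration.standardMinutes(10)).every(Duration.standardMinutes(1))));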
- execute(BeamRecord, BoundedWindow) - Method in interface org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlExpressionExecutor
-
- execute(BeamRecord, BoundedWindow) - Method in class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlFnExecutor
-
- ExecutorServiceFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
-
- expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.apex.ApexRunner.CreateApexPCollectionView
-
- expand(PCollection<ElemT>) - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
-
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
-
- expand(PCollection<T>) - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
-
- expand(PBegin) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
- expand(PCollection<KV<PrimaryKeyT, Iterable<KV<SecondaryKeyT, ValueT>>>>) - Method in class org.apache.beam.sdk.extensions.sorter.SortValues
-
- expand(PCollectionTuple) - Method in class org.apache.beam.sdk.extensions.sql.BeamSql.QueryTransform
-
- expand(PCollection<BeamRecord>) - Method in class org.apache.beam.sdk.extensions.sql.BeamSql.SimpleQueryTransform
-
- expand(PCollection<KV<byte[], byte[]>>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaCSVTable.CsvRecorderDecoder
-
- expand(PCollection<BeamRecord>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaCSVTable.CsvRecorderEncoder
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOReader
-
- expand(PCollection<BeamRecord>) - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextCSVTableIOWriter
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Read
-
- expand(PCollection<Message>) - Method in class org.apache.beam.sdk.io.amqp.AmqpIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.AvroIO.ParseAll
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.AvroIO.Read
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.AvroIO.ReadAll
-
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.AvroIO.TypedWrite
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.AvroIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Read
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.cassandra.CassandraIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Read
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
-
- expand(PCollection<MatchResult.Metadata>) - Method in class org.apache.beam.sdk.io.FileIO.ReadMatches
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
- expand(PCollection<KV<DestinationT, TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
-
- expand(PCollection<KV<DestinationT, TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
-
- expand(PCollection<KV<TableDestination, TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
-
- expand() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
- expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
- expand(PCollection<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
- expand(PCollection<ReadOperation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
- expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
- expand(PCollection<MutationGroup>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.GenerateSequence
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO.Read
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
- expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Read
-
- expand(PCollection<HCatRecord>) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Read
-
- expand(PCollection<ParameterT>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.ReadAll
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Read
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.jms.JmsIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
-
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kinesis.KinesisIO.Read
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
- expand(PCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Read
-
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.mqtt.MqttIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Bounded
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.Read.Unbounded
-
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Read
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.ReadAll
-
- expand(PCollection<KV<String, String>>) - Method in class org.apache.beam.sdk.io.redis.RedisIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
-
- expand(PCollection<SolrInputDocument>) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.ReadAll
-
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.TextIO.ReadFiles
-
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.TextIO.TypedWrite
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.TextIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Write
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Read
-
- expand(PCollection<UserT>) - Method in class org.apache.beam.sdk.io.WriteFiles
-
- expand() - Method in class org.apache.beam.sdk.io.WriteFilesResult
-
- expand(PBegin) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
- expand(PCollection<FileIO.ReadableFile>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.ReadFiles
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Write
-
- expand(PCollection<SuccessOrFailure>) - Method in class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssert
-
- expand(PCollection<Iterable<T>>) - Method in class org.apache.beam.sdk.testing.PAssert.GroupThenAssertForSingleton
-
- expand(PBegin) - Method in class org.apache.beam.sdk.testing.PAssert.OneSideInputAssert
-
- expand(PBegin) - Method in class org.apache.beam.sdk.testing.TestStream
-
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
-
- expand(PCollection<? extends KV<K, ? extends Iterable<InputT>>>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
-
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.OfValueProvider
-
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.TimestampedValues
-
- expand(PBegin) - Method in class org.apache.beam.sdk.transforms.Create.Values
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Distinct.WithRepresentativeValues
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Filter
-
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.FlatMapElements
-
- expand(PCollection<? extends Iterable<T>>) - Method in class org.apache.beam.sdk.transforms.Flatten.Iterables
-
- expand(PCollectionList<T>) - Method in class org.apache.beam.sdk.transforms.Flatten.PCollections
-
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.GroupByKey
-
- expand(PCollection<KV<K, InputT>>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
- expand(KeyedPCollectionTuple<K>) - Method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
-
- expand() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Expands the component PCollections, stripping off any tag-specific information.
- expand(PCollection<? extends KV<K, ?>>) - Method in class org.apache.beam.sdk.transforms.Keys
-
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.KvSwap
-
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.MapElements
-
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
- expand(PCollection<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Partition
-
- expand(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
-
Override this method to specify how this PTransform should be expanded on the given InputT.
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.AllMatches
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Find
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindAll
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindKV
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindName
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.FindNameKV
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Matches
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesKV
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesName
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.MatchesNameKV
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceAll
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.ReplaceFirst
-
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.transforms.Regex.Split
-
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.Reshuffle
-
Deprecated.
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.Reshuffle.ViaRandomKey
-
Deprecated.
- expand(PCollection<? extends KV<?, V>>) - Method in class org.apache.beam.sdk.transforms.Values
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsIterable
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsList
-
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMap
-
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.View.AsMultimap
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
- expand(PCollection<ElemT>) - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
-
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
- expand(PCollection<V>) - Method in class org.apache.beam.sdk.transforms.WithKeys
-
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.transforms.WithTimestamps
-
- expand() - Method in class org.apache.beam.sdk.values.PBegin
-
- expand() - Method in class org.apache.beam.sdk.values.PCollection
-
- expand() - Method in class org.apache.beam.sdk.values.PCollectionList
-
- expand() - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- expand() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- expand() - Method in class org.apache.beam.sdk.values.PDone
-
- expand() - Method in interface org.apache.beam.sdk.values.PInput
-
- expand() - Method in interface org.apache.beam.sdk.values.POutput
-
- expand() - Method in interface org.apache.beam.sdk.values.PValue
-
- Experimental - Annotation Type in org.apache.beam.sdk.annotations
-
Signifies that a public API (public class, method or field) is subject to incompatible changes,
or even removal, in a future release.
- Experimental.Kind - Enum in org.apache.beam.sdk.annotations
-
An enumeration of various kinds of experimental APIs.
- explainQuery(String, BeamSqlEnv) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlCli
-
Returns a human-readable representation of the query execution plan.
- explainTerms(RelWriter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
-
- exps - Variable in class org.apache.beam.sdk.extensions.sql.impl.interpreter.BeamSqlFnExecutor
-
- extend(String) - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Extend the path by appending a sub-component path.
- extractFromTypeParameters(T, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Extracts a type from the actual type parameters of a parameterized class, subject to Java type
erasure.
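A minimal sketch of the idiom this supports (Holder, BarT, and the field fn are illustrative):

    import org.apache.beam.sdk.transforms.SerializableFunction;
    import org.apache.beam.sdk.values.TypeDescriptor;
    import org.apache.beam.sdk.values.TypeDescriptors;

    class Holder<BarT> {
      private SerializableFunction<BarT, String> fn;

      // Recover a TypeDescriptor for BarT from the declared type of fn, despite erasure.
      TypeDescriptor<BarT> barTypeDescriptor() {
        return TypeDescriptors.extractFromTypeParameters(
            fn,
            SerializableFunction.class,
            new TypeDescriptors.TypeVariableExtractor<SerializableFunction<BarT, String>, BarT>() {});
      }
    }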
- extractFromTypeParameters(TypeDescriptor<T>, Class<? super T>, TypeDescriptors.TypeVariableExtractor<T, V>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- ExtractJoinFields(boolean, List<<any>>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms.ExtractJoinFields
-
- extractOrderedList() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Returns the values in the heap, ordered largest to smallest.
- extractOutput(BeamAggregationTransforms.AggregationAccumulator) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamAggregationTransforms.AggregationAdaptor
-
- extractOutput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
- extractOutput() - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Returns the output value that is the result of combining all
the input values represented by this accumulator.
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
- extractOutput(double[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- extractOutput(Combine.Holder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- extractOutput(int[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- extractOutput(long[]) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- extractOutput(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the output value that is the result of combining all
the input values represented by the given accumulator.
- extractOutput(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- extractOutput(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- extractOutput(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- extractOutput(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns the output value that is the result of combining all
the input values represented by the given accumulator.
- extractOutput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
- failure(PAssert.PAssertionSite, Throwable) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
-
- FAILURE_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
-
- fewKeys() - Method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns whether it groups just a few keys.
- fieldTypes - Variable in class org.apache.beam.sdk.extensions.sql.BeamRecordSqlType
-
- FileBasedReader(FileBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Subclasses should not perform IO operations at the constructor.
- FileBasedSink<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
Abstract class for file-based output.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a FileBasedSink with the given temp directory, producing uncompressed files.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, FileBasedSink.WritableByteChannelFactory) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a FileBasedSink with the given temp directory and output channel type.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, Compression) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a FileBasedSink with the given temp directory and output channel type.
- FileBasedSink.CompressionType - Enum in org.apache.beam.sdk.io
-
- FileBasedSink.DynamicDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- FileBasedSink.FilenamePolicy - Class in org.apache.beam.sdk.io
-
A naming policy for output files.
- FileBasedSink.FileResult<DestinationT> - Class in org.apache.beam.sdk.io
-
Result of a single bundle write.
- FileBasedSink.FileResultCoder<DestinationT> - Class in org.apache.beam.sdk.io
-
- FileBasedSink.OutputFileHints - Interface in org.apache.beam.sdk.io
-
Provides hints about how to generate output files, such as a suggested filename suffix.
- FileBasedSink.WritableByteChannelFactory - Interface in org.apache.beam.sdk.io
-
- FileBasedSink.WriteOperation<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
Abstract operation that manages the process of writing to FileBasedSink.
- FileBasedSink.Writer<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- FileBasedSource<T> - Class in org.apache.beam.sdk.io
-
A common base class for all file-based Sources.
- FileBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a FileBasedSource based on a file or a file pattern specification, with the given strategy for treating filepatterns that do not match any files.
- FileBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
- FileBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a FileBasedSource
based on a single file.
- FileBasedSource.FileBasedReader<T> - Class in org.apache.beam.sdk.io
-
A reader that implements code common to readers of FileBasedSources.
- FileBasedSource.Mode - Enum in org.apache.beam.sdk.io
-
A given FileBasedSource
represents a file resource of one of these types.
- FileChecksumMatcher - Class in org.apache.beam.sdk.testing
-
Matcher to verify file checksum in E2E test.
- FileChecksumMatcher(String, String) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
-
Constructor that uses the default shard template.
- FileChecksumMatcher(String, String, Pattern) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
-
Constructor using a custom shard template.
- FileChecksumMatcher(String, ShardedFile) - Constructor for class org.apache.beam.sdk.testing.FileChecksumMatcher
-
Constructor using an entirely custom ShardedFile
implementation.
- FileIO - Class in org.apache.beam.sdk.io
-
Transforms for working with files.
- FileIO() - Constructor for class org.apache.beam.sdk.io.FileIO
-
- FileIO.Match - Class in org.apache.beam.sdk.io
-
- FileIO.MatchAll - Class in org.apache.beam.sdk.io
-
- FileIO.MatchConfiguration - Class in org.apache.beam.sdk.io
-
Describes configuration for matching filepatterns, such as EmptyMatchTreatment and continuous watching for matching files.
- FileIO.ReadableFile - Class in org.apache.beam.sdk.io
-
A utility class for accessing a potentially compressed file.
- FileIO.ReadMatches - Class in org.apache.beam.sdk.io
-
- FilenamePolicy() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
-
- filePattern - Variable in class org.apache.beam.sdk.extensions.sql.impl.schema.text.BeamTextTable
-
- filepattern(String) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
Matches the given filepattern.
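A brief usage sketch (pipeline and the filepattern are illustrative):

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.fs.MatchResult;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<MatchResult.Metadata> matches =
        pipeline.apply(FileIO.match().filepattern("/path/to/input-*.csv"));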
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
- FileResult(ResourceId, int, BoundedWindow, PaneInfo, DestinationT) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- FileResultCoder(Coder<BoundedWindow>, Coder<DestinationT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- FileSystem<ResourceIdT extends ResourceId> - Class in org.apache.beam.sdk.io
-
File system interface in Beam.
- FileSystem() - Constructor for class org.apache.beam.sdk.io.FileSystem
-
- FileSystemRegistrar - Interface in org.apache.beam.sdk.io
-
- FileSystems - Class in org.apache.beam.sdk.io
-
- FileSystems() - Constructor for class org.apache.beam.sdk.io.FileSystems
-
- Filter<T> - Class in org.apache.beam.sdk.transforms
-
PTransforms for filtering from a PCollection the elements satisfying a predicate, or satisfying an inequality with a given value based on the elements' natural ordering.
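A brief usage sketch (numbers stands for an existing PCollection<Integer>):

    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;

    // Keep only the even elements.
    PCollection<Integer> evens = numbers.apply(Filter.by((Integer n) -> n % 2 == 0));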
- finalize(Iterable<FileBasedSink.FileResult<DestinationT>>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Finalizes writing by copying temporary output files to their final location.
- finalizeCheckpoint() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
-
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.jms.JmsCheckpointMark
-
Acknowledge all outstanding messages.
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
-
- finalizeCheckpoint() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
-
Called by the system to signal that this checkpoint mark has been committed along with all the records which have been read from the UnboundedSource.UnboundedReader since the previous checkpoint was taken.
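A minimal sketch of implementing this contract for a hypothetical unbounded source; the offset bookkeeping and the acknowledgement step are placeholders:

    import java.io.IOException;
    import java.io.Serializable;
    import org.apache.beam.sdk.io.UnboundedSource;

    class MyCheckpointMark implements UnboundedSource.CheckpointMark, Serializable {
      private final long lastReadOffset; // position in the external system (hypothetical)

      MyCheckpointMark(long lastReadOffset) {
        this.lastReadOffset = lastReadOffset;
      }

      @Override
      public void finalizeCheckpoint() throws IOException {
        // Acknowledge everything read up to lastReadOffset; only after this call
        // may the external system stop redelivering those records.
      }
    }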
- find(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- Find(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Find
-
- findAll(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findAll(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- FindAll(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindAll
-
- findIndexOfField(String) - Method in class org.apache.beam.sdk.values.BeamRecordType
-
Find the index of a given field.
- findKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- FindKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindKV
-
- FindName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindName
-
- FindNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindNameKV
-
- findTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
- finishBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
-
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
-
- finishBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- FinishBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
-
- finishSpecifying() - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
-
After building, finalizes this PValue to make it ready for running.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.PValue
-
After building, finalizes this PValue to make it ready for being used as an input to a PTransform.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.WriteFilesResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PDone
-
Does nothing; there is nothing to finish specifying.
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.POutput
-
As part of applying the producing PTransform, finalizes this output to make it ready for being used as an input and for running.
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
-
- finishWrite() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
- fixDefaults() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
Fixes all the defaults so that equals can be used to check that two strategies are the same,
regardless of the state of "defaulted-ness".
- fixedSizeGlobally(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform that takes a PCollection<T>, selects sampleSize elements, uniformly at random, and returns a PCollection<Iterable<T>> containing the selected elements.
- fixedSizePerKey(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform that takes an input PCollection<KV<K, V>> and returns a PCollection<KV<K, Iterable<V>>> that contains an output element mapping each distinct key in the input PCollection to a sample of sampleSize values associated with that key in the input PCollection, taken uniformly at random.
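Brief usage sketches (lines and keyedValues stand for existing PCollections of the indicated types):

    import org.apache.beam.sdk.transforms.Sample;

    // 10 elements sampled uniformly from the whole PCollection<String>.
    lines.apply(Sample.<String>fixedSizeGlobally(10));

    // Up to 5 values sampled uniformly per key from a PCollection<KV<String, Integer>>.
    keyedValues.apply(Sample.<String, Integer>fixedSizePerKey(5));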
- FixedWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that windows values into fixed-size timestamp-based windows.
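A brief usage sketch (input stands for an existing PCollection<String>; the window size is illustrative):

    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;

    // Assign each element to a one-minute fixed window based on its timestamp.
    input.apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));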
- flatMap(List<TranslatorUtils.RawUnionValue>) - Method in class org.apache.beam.runners.gearpump.translators.functions.DoFnFunction
-
- flatMap(WindowedValue<T>) - Method in class org.apache.beam.runners.gearpump.translators.WindowAssignTranslator.AssignWindows
-
- FlatMapElements<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
PTransforms for mapping a simple function that returns iterables over the elements of a PCollection and merging the results.
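A brief usage sketch (lines stands for an existing PCollection<String>):

    import java.util.Arrays;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Split each line into words and flatten the results into a single PCollection<String>.
    lines.apply(FlatMapElements.into(TypeDescriptors.strings())
        .via((String line) -> Arrays.asList(line.split(" "))));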
- Flatten - Class in org.apache.beam.sdk.transforms
-
Flatten<T> takes multiple PCollection<T>s bundled into a PCollectionList<T> and returns a single PCollection<T> containing all the elements in all the input PCollections.
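A brief usage sketch (pc1 and pc2 stand for existing PCollection<String>s):

    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionList;

    // Merge the two collections into one.
    PCollection<String> merged =
        PCollectionList.of(pc1).and(pc2).apply(Flatten.pCollections());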
- Flatten() - Constructor for class org.apache.beam.sdk.transforms.Flatten
-
- Flatten.Iterables<T> - Class in org.apache.beam.sdk.transforms
-
FlattenIterables<T> takes a PCollection<Iterable<T>> and returns a PCollection<T> that contains all the elements from each iterable.
- Flatten.PCollections<T> - Class in org.apache.beam.sdk.transforms
-
- FlattenPCollectionsTranslator<T> - Class in org.apache.beam.runners.gearpump.translators
-
Flatten.PCollections is translated to a Gearpump merge function.
- FlattenPCollectionsTranslator() - Constructor for class org.apache.beam.runners.gearpump.translators.FlattenPCollectionsTranslator
-
- FlattenWithHeterogeneousCoders - Interface in org.apache.beam.sdk.testing
-
- FlinkDetachedRunnerResult - Class in org.apache.beam.runners.flink
-
Result of a detached execution of a
Pipeline
with Flink.
- FlinkMetricContainer - Class in org.apache.beam.runners.flink.metrics
-
Helper class for holding a MetricsContainerImpl
and forwarding Beam metrics to
Flink accumulators and metrics.
- FlinkMetricContainer(RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
-
- FlinkMetricContainer.FlinkDistributionGauge - Class in org.apache.beam.runners.flink.metrics
-
- FlinkMetricContainer.FlinkGauge - Class in org.apache.beam.runners.flink.metrics
-
- FlinkPipelineOptions - Interface in org.apache.beam.runners.flink
-
Options which can be used to configure a Flink PipelineRunner.
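A brief sketch of configuring a pipeline to use the Flink runner (args stands for the program's command-line arguments):

    import org.apache.beam.runners.flink.FlinkPipelineOptions;
    import org.apache.beam.runners.flink.FlinkRunner;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    FlinkPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(FlinkPipelineOptions.class);
    options.setRunner(FlinkRunner.class);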
- FlinkRunner - Class in org.apache.beam.runners.flink
-
A PipelineRunner that executes the operations in the pipeline by first translating them to a Flink Plan and then executing them either locally or on a Flink cluster, depending on the configuration.
- FlinkRunnerRegistrar - Class in org.apache.beam.runners.flink
-
AutoService registrar - will register FlinkRunner and FlinkPipelineOptions as possible pipeline runner services.
- FlinkRunnerRegistrar.Options - Class in org.apache.beam.runners.flink
-
Pipeline options registrar.
- FlinkRunnerRegistrar.Runner - Class in org.apache.beam.runners.flink
-
Pipeline runner registrar.
- FlinkRunnerResult - Class in org.apache.beam.runners.flink
-
Result of executing a
Pipeline
with Flink.
- FlinkTransformOverrides - Class in org.apache.beam.runners.flink
-
- FlinkTransformOverrides() - Constructor for class org.apache.beam.runners.flink.FlinkTransformOverrides
-
- floats() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- fold(KV<Instant, WindowedValue<KV<K, List<V>>>>, KV<Instant, WindowedValue<KV<K, V>>>) - Method in class org.apache.beam.runners.gearpump.translators.GroupByKeyTranslator.Merge
-
- forBoolean(Boolean) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forClass(Class<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
to be used for serializing an instance of
the supplied class for transport via the Dataflow API.
- forClassName(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject to be used for serializing data to be deserialized using the supplied class name, for transport via the Dataflow API.
- forCoder(TypeDescriptor<?>, Coder<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
-
Creates a CoderProvider that always returns the given coder for the specified type.
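A brief sketch of pinning a specific coder to a type (MyEvent and MyEventCoder are hypothetical placeholders):

    import org.apache.beam.sdk.coders.CoderProvider;
    import org.apache.beam.sdk.coders.CoderProviders;
    import org.apache.beam.sdk.values.TypeDescriptor;

    // Always hand out MyEventCoder when a coder for MyEvent is requested through this provider.
    CoderProvider provider =
        CoderProviders.forCoder(TypeDescriptor.of(MyEvent.class), MyEventCoder.of());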
- forever(Trigger) - Static method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
-
Create a composite trigger that repeatedly executes the trigger repeated, firing each time it fires and ignoring any indications to finish.
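A brief sketch of a composite trigger built this way (the element count is illustrative):

    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Trigger;

    // Fire after every 100 elements in the pane, indefinitely.
    Trigger trigger = Repeatedly.forever(AfterPane.elementCountAtLeast(100));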
- forFloat(Float) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forFloat(Double) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forInteger(Long) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forInteger(Integer) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forKnownType(Object) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value of a
well-known cloud object type.
- FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- formatRecord(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Convert an input record type into the output type.
- formatTimestamp(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
- forNewInput(Instant, InputT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the Watch transform to create a new independent termination state for a newly arrived InputT.
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
-
- forStreamFromSources(List<Integer>, Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Build the TimerInternals
according to the feeding streams.
- forString(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forTransformHierarchy(TransformHierarchy, PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
-
- from(Map<String, String>) - Static method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Expects a map keyed by logger Names with values representing Levels.
- from(String) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
-
Reads from the given filename or filepattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
-
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Read
-
Reads from the given filename or filepattern.
- from(String) - Method in class org.apache.beam.sdk.io.AvroIO.Read
-
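A brief usage sketch of the AvroIO read entry points above (pipeline, MyAvroRecord, and the filepattern are illustrative):

    import org.apache.beam.sdk.io.AvroIO;
    import org.apache.beam.sdk.values.PCollection;

    // MyAvroRecord stands in for an Avro-generated specific record class.
    PCollection<MyAvroRecord> records =
        pipeline.apply(AvroIO.read(MyAvroRecord.class).from("/path/to/records-*.avro"));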