- abort() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
-
De-registers the handler for all future requests for state for the registered process bundle
instruction id.
- abort(Executor) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- absolute(String, String...) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Construct a path from an absolute component path hierarchy.
- AbstractBeamCalcRel - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamRelNode to replace Project and Filter nodes.
- AbstractBeamCalcRel(RelOptCluster, RelTraitSet, RelNode, RexProgram) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
-
- AbstractGetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
-
- AbstractResult() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
-
- AbstractTranslationContext - Class in org.apache.beam.runners.spark.structuredstreaming.translation
-
Base class that gives a context for PTransform translation: keeping track of the datasets, the SparkSession, and the current transform being translated.
- AbstractTranslationContext(SparkStructuredStreamingPipelineOptions) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- accept(ByteString, Boolean) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Deprecated.
- accept(BeamFnApi.Elements) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver2
-
- accept(T) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundObserver
-
Deprecated.
- accept(ByteString) - Method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
-
- accept(T) - Method in interface org.apache.beam.sdk.fn.data.FnDataReceiver
-
- accept(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiConsumer
-
- accept(T) - Method in interface org.apache.beam.sdk.function.ThrowingConsumer
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierListContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifyComponentContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
-
- accept(ParseTreeVisitor<? extends T>) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.WildcardContext
-
- accept(SchemaZipFold.Context, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accepts two components; context.parent() is always ROW, MAP, ARRAY, or absent.
- accept(SchemaZipFold.Context, Optional<Schema.Field>, Optional<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accepts two fields; context.parent() is always ROW.
- accessPattern() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
-
- accumulate(T, T) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
Accumulate two results together.
- AccumulatingCombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
- accumulatingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new Window PTransform that uses the registered WindowFn and triggering behavior, and that accumulates elements in a pane after they are triggered.
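A minimal sketch of accumulatingFiredPanes() in a windowing strategy, assuming an existing PCollection<String> named input; the window size, early-firing delay, and allowed lateness are illustrative:

    import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Panes accumulate across early/on-time/late firings instead of being discarded.
    PCollection<String> windowed =
        input.apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
                .triggering(
                    AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(
                            AfterProcessingTime.pastFirstElementInPane()
                                .plusDelayOf(Duration.standardSeconds(30))))
                .withAllowedLateness(Duration.standardMinutes(5))
                .accumulatingFiredPanes());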
- ACCUMULATOR_NAME - Static variable in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
-
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
-
- AccumulatorCheckpointingSparkListener() - Constructor for class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
-
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the ack deadline, in seconds, for the given subscription.
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Acknowledge messages from the given subscription with the given ackIds.
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- acquireTaskAttemptIdLock(Configuration, int) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
- acquireTaskAttemptIdLock(Configuration, int) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
-
- acquireTaskIdLock(Configuration) - Method in interface org.apache.beam.sdk.io.hadoop.format.ExternalSynchronization
-
Creates a TaskID with a unique id within the given job.
- acquireTaskIdLock(Configuration) - Method in class org.apache.beam.sdk.io.hadoop.format.HDFSSynchronization
-
- ActionFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Factory class for creating instances that will handle each type of record within a change stream
query.
- ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
- ACTIVE_PARTITION_READ_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the active partition reads during the execution of the Connector.
- actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in interface org.apache.beam.sdk.schemas.ProjectionProducer
-
Actuate a projection pushdown.
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
-
- add(NamedAggregators) - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregatorsAccumulator
-
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
-
- add(NamedAggregators) - Method in class org.apache.beam.runners.spark.structuredstreaming.aggregators.NamedAggregatorsAccumulator
-
- add(MetricsContainerStepMap) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsContainerStepMapAccumulator
-
- add(String, Broadcast<?>, Coder<?>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.SideInputBroadcast
-
- add(int, GlobalWatermarkHolder.SparkWatermarks) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- add(T) - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
-
- add(T, long, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
-
- add(T, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.Sketch
-
- add(KV<byte[], byte[]>) - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
-
- add(Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
-
- add(Type, String, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
-
- add(List<ValueInSingleWindow<T>>, TableDataInsertAllResponse.InsertErrors, TableReference, FailsafeValueInSingleWindow<TableRow, TableRow>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
-
- add(InputT) - Method in interface org.apache.beam.sdk.state.GroupingState
-
Add a value to the buffer.
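A sketch of GroupingState.add(InputT) via a BagState in a stateful DoFn; the element types and state id are assumptions for illustration:

    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.state.BagState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class BufferingFn extends DoFn<KV<String, Integer>, Integer> {
      // BagState is a GroupingState<Integer, Iterable<Integer>>.
      @StateId("buffer")
      private final StateSpec<BagState<Integer>> bufferSpec = StateSpecs.bag(VarIntCoder.of());

      @ProcessElement
      public void process(
          @Element KV<String, Integer> element,
          @StateId("buffer") BagState<Integer> buffer) {
        buffer.add(element.getValue()); // add a value to the buffer for this key and window
        // buffer.read() later returns everything added so far.
      }
    }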
- add(Iterable<TimestampedValue<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
-
For internal use only: no backwards compatibility guarantees.
- add(long) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
-
Deprecated.
Adds a value to the heap, returning whether the value is (large enough to be) in the heap.
- add(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item.
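These Builder methods are typically called from PTransform.populateDisplayData; a sketch in which the keys, values, and the surrounding fields queryOverride and batchSize are hypothetical:

    import org.apache.beam.sdk.transforms.display.DisplayData;

    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      builder
          .add(DisplayData.item("topic", "projects/p/topics/t"))            // always registered
          .addIfNotNull(DisplayData.item("queryOverride", queryOverride))   // skipped when null
          .addIfNotDefault(DisplayData.item("batchSize", batchSize), 1000); // skipped when equal to the default
    }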
- addAccum(AccumT) - Method in interface org.apache.beam.sdk.state.CombiningState
-
Add an accumulator to this state cell.
- addAll(Map<Integer, Queue<GlobalWatermarkHolder.SparkWatermarks>>) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- addArray(Collection<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
-
- addArray(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
-
- addArrayField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addAttempted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
-
- addBoolean(Map<String, Object>, String, boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addBooleanField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addByteArrayField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addByteField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addCollectionToSingletonOutput(PCollection<?>, String, PCollectionView<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an output to this CollectionToSingleton Dataflow step, consuming the specified input PValue and producing the specified output PValue.
- addCommitted(T, BiFunction<T, T, T>) - Method in class org.apache.beam.sdk.metrics.MetricResult
-
- addDataSet(String, DataSet<T>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
-
- addDataStream(String, DataStream<T>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
-
- addDateTimeField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addDecimalField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addDouble(Map<String, Object>, String, Double) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addDoubleField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addElements(T, T...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with timestamp equal to the current watermark.
- addElements(TimestampedValue<T>, TimestampedValue<T>...) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Adds the specified elements to the source with the provided timestamps.
- addEncodingInput(Coder<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Sets the encoding for this Dataflow step.
- addErrorForCode(int, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
-
Adds a matcher to log the provided string if the error matches a particular status code.
- addErrorForCodeAndUrlContains(int, String, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.Builder
-
Adds a matcher to log the provided string if the error matches a particular status code and
the url contains a certain string.
- addExperiment(ExperimentalOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Adds experiment to options if not already present.
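For example (the experiment name is illustrative only):

    import org.apache.beam.sdk.options.ExperimentalOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    ExperimentalOptions options =
        PipelineOptionsFactory.fromArgs(args).as(ExperimentalOptions.class);
    // Appends the experiment only if it is not already listed in --experiments.
    ExperimentalOptions.addExperiment(options, "use_runner_v2");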
- addField(Schema.Field) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addFields(List<Schema.Field>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addFields(Schema.Field...) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- AddFields - Class in org.apache.beam.sdk.schemas.transforms
-
A transform to add new nullable fields to a PCollection's schema.
- AddFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.AddFields
-
- AddFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Inner PTransform for AddFields.
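A sketch of AddFields in use, assuming an existing PCollection<Row> named rows; the field names are illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.transforms.AddFields;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Adds nullable fields (populated with null) to every row's schema.
    PCollection<Row> extended =
        rows.apply(
            AddFields.<Row>create()
                .field("userId", Schema.FieldType.STRING)
                .field("details.clickCount", Schema.FieldType.INT64)); // nested field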
- addFloatField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- AddHarnessIdInterceptor - Class in org.apache.beam.sdk.fn.channel
-
A ClientInterceptor that attaches a provided SDK Harness ID to outgoing messages.
- addIfAbsent(T) - Method in interface org.apache.beam.sdk.state.SetState
-
Ensures a value is a member of the set, returning true if it was added and false otherwise.
- addIfNotDefault(DisplayData.ItemSpec<T>, T) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is different than the specified default.
- addIfNotNull(DisplayData.ItemSpec<?>) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register the given display item if the value is not null.
- addInput(String, Boolean) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, Long) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name and value to this Dataflow step.
- addInput(String, PInput) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input with the given name to this Dataflow step, coming from the specified input
PValue.
- addInput(String, Map<String, Object>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a dictionary of strings to objects.
- addInput(String, List<? extends Map<String, Object>>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds an input that is a list of objects.
- addInput(HyperLogLogPlus, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
- addInput(SketchFrequencies.Sketch<InputT>, InputT) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
- addInput(MergingDigest, Double) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
- addInput(long[], Boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
- addInput(CovarianceAccumulator, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
- addInput(VarianceAccumulator, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
- addInput(BeamBuiltinAggregations.BitXOr.Accum, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
- addInput(AccumT, InputT, Long, Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions.PositionAwareCombineFn
-
- addInput(List<T>, T) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
- addInput(String, byte[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
- addInput(String, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
- addInput(Long, Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
-
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- addInput(AccumT, InputT) - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Adds the given input value to the given accumulator, returning the new accumulator value.
- addInput(List<String>, String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
- addInput(ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique, T) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- addInput(InputT) - Method in interface org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn.Accumulator
-
Adds the given input value to this accumulator, modifying this accumulator.
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.AccumulatingCombineFn
-
- addInput(double[], Double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- addInput(Combine.Holder<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- addInput(int[], Integer) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- addInput(long[], Long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- addInput(AccumT, InputT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Adds the given input value to the given accumulator, returning the new accumulator value.
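The addInput contract is easiest to see in a small custom CombineFn; a sketch computing a mean (class and field names are illustrative):

    import java.io.Serializable;
    import org.apache.beam.sdk.transforms.Combine;

    class MeanFn extends Combine.CombineFn<Long, MeanFn.Accum, Double> {
      static class Accum implements Serializable {
        long sum;
        long count;
      }

      @Override
      public Accum createAccumulator() {
        return new Accum();
      }

      @Override
      public Accum addInput(Accum accum, Long input) {
        // Add the input value to the accumulator and return the new accumulator value.
        accum.sum += input;
        accum.count++;
        return accum;
      }

      @Override
      public Accum mergeAccumulators(Iterable<Accum> accums) {
        Accum merged = createAccumulator();
        for (Accum a : accums) {
          merged.sum += a.sum;
          merged.count += a.count;
        }
        return merged;
      }

      @Override
      public Double extractOutput(Accum accum) {
        return accum.count == 0 ? 0.0 : ((double) accum.sum) / accum.count;
      }
    }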
- addInput(List<V>, V) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- addInput(Object[], DataT) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- addInput(Object[], DataT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- addInput(AccumT, InputT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Adds the given input value to the given accumulator, returning the new accumulator value.
- addInput(Top.BoundedHeap<KV<Integer, T>, SerializableComparator<KV<Integer, T>>>, T) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
- addInt16Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addInt32Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addInt64Field(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addIterable(Iterable<T>) - Method in class org.apache.beam.sdk.values.Row.Builder
-
- addIterableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addLengthPrefixedCoder(String, RunnerApi.Components.Builder, boolean) - Static method in class org.apache.beam.runners.fnexecution.wire.LengthPrefixUnknownCoders
-
- addList(Map<String, Object>, String, List<? extends Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addList(Map<String, Object>, String, T[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addLogicalTypeField(String, Schema.LogicalType<InputT, BaseT>) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addLong(Map<String, Object>, String, long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addLongs(Map<String, Object>, String, long...) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addMapField(String, Schema.FieldType, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addMessageListener(Consumer<JobApi.JobMessage>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Listen for job messages with a Consumer.
- addMethodParameters(Method) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase.ParameterListBuilder
-
- addNameFilter(MetricNameFilter) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
- addNull(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addNullableField(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addOptions(Schema.Options) - Method in class org.apache.beam.sdk.schemas.Schema.Options.Builder
-
- addOutput(String, PCollection<?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.StepTranslationContext
-
Adds a primitive output to this Dataflow step with the given name as the local output name, producing the specified output PValue, including its Coder if a TypedPValue.
- addOutputColumnList(List<ResolvedNodes.ResolvedOutputColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
- addOverrideForClass(Class<?>, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated.
Overrides the default log level for the passed in class.
- addOverrideForClass(Class<?>, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in class.
- addOverrideForName(String, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated.
Overrides the default log level for the passed in name.
- addOverrideForName(String, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in name.
- addOverrideForPackage(Package, DataflowWorkerLoggingOptions.Level) - Method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated.
Overrides the default log level for the passed in package.
- addOverrideForPackage(Package, SdkHarnessOptions.LogLevel) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Overrides the default log level for the passed in package.
- addResolvedTable(TableResolution.SimpleTableWithPath) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
Store a table together with its full path for repeated resolutions.
- addRowField(String, Schema) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- addRows(Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
Add rows to the builder.
- addRows(String, Row...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
-
- addRows(Duration, Object...) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
-
Add rows to the builder.
- addRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Creates a runner-side wire coder for a port read/write for the given PCollection.
- addSchema(String, TableProvider) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Add a top-level schema backed by the table provider.
- addSdkWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components.Builder, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Creates an SDK-side wire coder for a port read/write for the given PCollection.
- addStateListener(Consumer<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Listen for job state changes with a Consumer.
- addStep(PTransform<?, ?>, String) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Adds a step to the Dataflow workflow for the given transform, with the given Dataflow step
type.
- addStep(String) - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
Add a step filter.
- addString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addStringField(String) - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
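The add*Field methods chain on Schema.Builder; for example (field names are illustrative):

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.builder()
            .addStringField("name")
            .addInt64Field("id")
            .addNullableField("score", Schema.FieldType.DOUBLE)
            .addArrayField("tags", Schema.FieldType.STRING)
            .build();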
- addStringList(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- addTraceFor(AbstractGoogleClient, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
- addTraceFor(AbstractGoogleClientRequest<?>, String) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
- addUdaf(String, Combine.CombineFn) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDAF function which can be used in GROUP-BY expression.
- addUdf(String, Class<?>, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
- addUdf(String, Class<? extends BeamSqlUdf>) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
- addUdf(String, SerializableFunction) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
Register a UDF function which can be used in SQL expression.
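These builder methods back SQL function registration; the analogous pipeline-level registration is exposed on SqlTransform, sketched here with a hypothetical UDF and the built-in Sum as a UDAF:

    import org.apache.beam.sdk.extensions.sql.BeamSqlUdf;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical UDF: usable as CUBIC(f_long) in the query below.
    public static class CubicFn implements BeamSqlUdf {
      public static Long eval(Long input) {
        return input * input * input;
      }
    }

    // rows is an assumed PCollection<Row> with a long field named f_long.
    PCollection<Row> result =
        rows.apply(
            SqlTransform.query("SELECT MY_SUM(CUBIC(f_long)) AS s FROM PCOLLECTION")
                .registerUdf("CUBIC", CubicFn.class)
                .registerUdaf("MY_SUM", Sum.ofLongs()));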
- addUuids() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Adds UUIDs to to-be-published messages, ensuring that uniqueness is maintained.
- AddUuidsTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A transform to add UUIDs to each message to be written to Pub/Sub Lite.
- AddUuidsTransform() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
-
- addValue(Object) - Method in class org.apache.beam.sdk.values.Row.Builder
-
- addValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
-
- addValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
-
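Row values are added positionally against a schema; for example (the schema is illustrative):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder().addStringField("name").addInt64Field("id").build();

    // Values are added in schema order; addValue/addValues/addArray all chain.
    Row row = Row.withSchema(schema).addValues("alice", 42L).build();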
- advance() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- advance() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- advance() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
For subscription mode only: Track progression of time according to the provided Clock.
- advance() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- advance() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Advances the reader to the next valid record.
- advance() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Advances the reader to the next valid record.
- advanceBy(Duration) - Static method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
-
For internal use only: no backwards compatibility guarantees.
- advanceImpl() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
- advanceImpl() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Advances to the next record and returns true, or returns false if there is no next record.
- advanceNextBatchWatermarkToInfinity() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch to the end-of-time.
- advanceProcessingTime(Duration) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the processing time by the specified amount.
- advanceTo(Instant) - Static method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
-
For internal use only: no backwards compatibility guarantees.
- advanceWatermark() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Advances the watermark.
- advanceWatermarkForNextBatch(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Advances the watermark in the next batch.
- advanceWatermarkTo(Instant) - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark of this source to the specified instant.
- advanceWatermarkToInfinity() - Method in class org.apache.beam.sdk.testing.TestStream.Builder
-
Advance the watermark to infinity, completing this TestStream.
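The TestStream.Builder methods above compose as follows; the coder, elements, and timestamps are illustrative:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.TestStream;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    TestStream<String> events =
        TestStream.create(StringUtf8Coder.of())
            .addElements("a", "b")                                     // stamped at the current watermark
            .advanceWatermarkTo(new Instant(0).plus(Duration.standardMinutes(1)))
            .advanceProcessingTime(Duration.standardMinutes(5))
            .addElements(TimestampedValue.of("late", new Instant(0)))  // explicit timestamp
            .advanceWatermarkToInfinity();                             // completes the TestStream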
- AdvancingPhaser - Class in org.apache.beam.sdk.fn.stream
-
A Phaser which never terminates.
- AdvancingPhaser(int) - Constructor for class org.apache.beam.sdk.fn.stream.AdvancingPhaser
-
- AfterAll - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that fires when all of its sub-triggers are ready.
- afterBundleCommit(Instant, DoFn.BundleFinalizer.Callback) - Method in interface org.apache.beam.sdk.transforms.DoFn.BundleFinalizer
-
The provided function will be called after the runner successfully commits the output of a
successful bundle.
- AfterEach - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that executes its sub-triggers in order.
- AfterFirst - Class in org.apache.beam.sdk.transforms.windowing
-
A composite Trigger that fires once after at least one of its sub-triggers has fired.
- afterIterations(int) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- AfterPane - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that fires at some point after a specified number of input elements have arrived.
- AfterProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
A Trigger that fires at a specified point in processing time, relative to when input first arrives.
- AfterSynchronizedProcessingTime - Class in org.apache.beam.sdk.transforms.windowing
-
FOR INTERNAL USE ONLY.
- afterTimeSinceNewOutput(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- afterTimeSinceNewOutput(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- afterTotalOf(ReadableDuration) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- afterTotalOf(SerializableFunction<InputT, ReadableDuration>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
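These termination conditions are typically handed to a watching transform such as FileIO.match().continuously; the pipeline, filepattern, and durations below are assumptions:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.fs.MatchResult;
    import org.apache.beam.sdk.transforms.Watch;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Poll for new files every 30 seconds; stop watching once no new files have
    // appeared for one hour. Conditions can be combined, e.g. with Watch.Growth.allOf(...).
    PCollection<MatchResult.Metadata> matches =
        pipeline.apply(
            FileIO.match()
                .filepattern("gs://my-bucket/input/*.csv")
                .continuously(
                    Duration.standardSeconds(30),
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));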
- AfterWatermark - Class in org.apache.beam.sdk.transforms.windowing
-
AfterWatermark triggers fire based on progress of the system watermark.
- AfterWatermark.AfterWatermarkEarlyAndLate - Class in org.apache.beam.sdk.transforms.windowing
-
- AfterWatermark.FromEndOfWindow - Class in org.apache.beam.sdk.transforms.windowing
-
A watermark trigger targeted relative to the end of the window.
- aggregate(Combine.CombineFn<InputT, ?, OutputT>) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- AggregateCombiner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements.
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
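A sketch of the schema-aware aggregation builders, assuming an existing PCollection<Row> named purchases with userId and cost fields; the output field names are illustrative:

    import org.apache.beam.sdk.schemas.transforms.Group;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Group by userId and aggregate the "cost" field twice under new output names.
    PCollection<Row> perUser =
        purchases.apply(
            Group.<Row>byFieldNames("userId")
                .aggregateField("cost", Sum.ofDoubles(), "totalCost")
                .aggregateField("cost", Max.ofDoubles(), "maxCost"));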
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- aggregateField(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateField(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- aggregateFieldBaseValue(String, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- aggregateFieldBaseValue(int, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(List<String>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFields(FieldAccessDescriptor, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
Build up an aggregation function over the input elements.
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.AggregateCombiner
-
Build up an aggregation function over the input elements by field id.
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, String) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- aggregateFieldsById(List<Integer>, Combine.CombineFn<CombineInputT, AccumT, CombineOutputT>, Schema.Field) - Method in class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- AggregateFn<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.extensions.sql.udf
-
An aggregate function that can be executed as part of a SQL query.
- AggregationCombineFnAdapter<T> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
- AggregationCombineFnAdapter() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
- AggregationQuery - Class in org.apache.beam.sdk.io.mongodb
-
Builds a MongoDB AggregateIterable object.
- AggregationQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.AggregationQuery
-
- AggregatorMetric - Class in org.apache.beam.runners.spark.metrics
-
- AggregatorMetric - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
-
- AggregatorMetricSource - Class in org.apache.beam.runners.spark.metrics
-
- AggregatorMetricSource(String, NamedAggregators) - Constructor for class org.apache.beam.runners.spark.metrics.AggregatorMetricSource
-
- AggregatorMetricSource - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
-
- AggregatorMetricSource(String, NamedAggregators) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.AggregatorMetricSource
-
- AggregatorsAccumulator - Class in org.apache.beam.runners.spark.aggregators
-
For resilience, Accumulators are required to be wrapped in a Singleton.
- AggregatorsAccumulator() - Constructor for class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
-
- AggregatorsAccumulator - Class in org.apache.beam.runners.spark.structuredstreaming.aggregators
-
For resilience, Accumulators are required to be wrapped in a Singleton.
- AggregatorsAccumulator() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator
-
- AggregatorsAccumulator.AccumulatorCheckpointingSparkListener - Class in org.apache.beam.runners.spark.aggregators
-
- algorithm(String) - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey.Builder
-
- align(Duration) - Method in interface org.apache.beam.sdk.state.Timer
-
- alignedTo(Duration, Instant) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns timestamps to the smallest multiple of period since the offset greater than the timestamp.
- alignedTo(Duration) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
Aligns the time to be the smallest multiple of period greater than the epoch boundary (aka new Instant(0)).
- alignTo(Duration, Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- alignTo(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- AlignTo() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
-
- ALL_CONTEXTS - Static variable in class org.apache.beam.sdk.testing.CoderProperties
-
All the contexts, for use in test cases.
- ALL_KEYS - Static variable in class org.apache.beam.sdk.io.range.ByteKeyRange
-
The range of all keys, with empty start and end keys.
- allLeavesDescriptor(Schema, SerializableFunction<List<String>, String>) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
- allMatches(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- allMatches(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- AllMatches(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.AllMatches
-
- allMetrics() - Method in class org.apache.beam.sdk.metrics.MetricResults
-
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
-
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Creates an instance of this server using an ephemeral address.
- allocateAddressAndCreate(List<BindableService>, Endpoints.ApiServiceDescriptor.Builder) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
-
- allocatePortAndCreateFor(ServiceT, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
- allocatePortAndCreateFor(List<? extends FnService>, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
- allOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- allOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- allOf(Watch.Growth.TerminationCondition<InputT, FirstStateT>, Watch.Growth.TerminationCondition<InputT, SecondStateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
- ALLOWS_SHARDABLE_STATE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
- allowsDynamicSplitting() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
Whether this reader should allow dynamic splitting of the offset ranges.
- AlwaysPassMatcher() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
-
- AlwaysPassMatcherFactory() - Constructor for class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
-
- alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Always retry all failures.
- AmqpIO - Class in org.apache.beam.sdk.io.amqp
-
AmqpIO supports AMQP 1.0 protocol using the Apache QPid Proton-J library.
- AmqpIO.Read - Class in org.apache.beam.sdk.io.amqp
-
A PTransform to read/receive messages using the AMQP 1.0 protocol.
- AmqpIO.Write - Class in org.apache.beam.sdk.io.amqp
-
A PTransform to send messages using the AMQP 1.0 protocol.
- AmqpMessageCoder - Class in org.apache.beam.sdk.io.amqp
-
A coder for AMQP messages.
- AmqpMessageCoder() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
-
- AmqpMessageCoderProviderRegistrar - Class in org.apache.beam.sdk.io.amqp
-
- AmqpMessageCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
-
- and(TupleTag<V>, List<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns a new CoGbkResult based on this, with the given tag and given data added to it.
- and(TupleTag<V>, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a new KeyedPCollectionTuple<K> that is the same as this, appended with the given PCollection.
- and(String, PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
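The and(...) overloads above build up the inputs of a CoGroupByKey; a sketch, assuming emails and phones are PCollection<KV<String, String>> keyed by user id:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    final TupleTag<String> emailsTag = new TupleTag<>();
    final TupleTag<String> phonesTag = new TupleTag<>();

    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(emailsTag, emails)
            .and(phonesTag, phones)
            .apply(CoGroupByKey.create());

    // Per key, the CoGbkResult's getAll(emailsTag) / getAll(phonesTag) return each input's values.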
- and(PCollection.IsBounded) - Method in enum org.apache.beam.sdk.values.PCollection.IsBounded
-
Returns the composed IsBounded property.
- and(PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- and(Iterable<PCollection<T>>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- and(String, PCollection<Row>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- and(TupleTag<T>, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- and(String, PCollection<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- and(TupleTag<?>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
- and(List<TupleTag<?>>) - Method in class org.apache.beam.sdk.values.TupleTagList
-
- annotateFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from ByteStrings of their contents.
- annotateFromBytesWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from key-value pairs of ByteStrings and VideoContext.
- annotateFromURI(List<Feature>, PCollectionView<Map<String, VideoContext>>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from GCS URIs.
- annotateFromUriWithContext(List<Feature>) - Static method in class org.apache.beam.sdk.extensions.ml.VideoIntelligence
-
Annotates videos from key-value pairs of GCS URI and VideoContext.
- annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
- annotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
- AnnotateImagesFromBytes(PCollectionView<Map<ByteString, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytes
-
- annotateImagesFromBytesWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their ByteString-encoded contents and ImageContext for each image.
- annotateImagesFromBytesWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their ByteString-encoded contents and ImageContext for each image.
- AnnotateImagesFromBytesWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromBytesWithContext
-
- annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from their GCS addresses.
- annotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from their GCS addresses.
- AnnotateImagesFromGcsUri(PCollectionView<Map<String, ImageContext>>, List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUri
-
- annotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their GCS addresses in Strings and ImageContext for each image.
- annotateImagesFromGcsUriWithContext(List<Feature>, long) - Static method in class org.apache.beam.sdk.extensions.ml.CloudVision
-
Creates a PTransform that annotates images from KVs of their GCS addresses in Strings and ImageContext for each image.
- AnnotateImagesFromGcsUriWithContext(List<Feature>, long, int) - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision.AnnotateImagesFromGcsUriWithContext
-
- AnnotateText - Class in org.apache.beam.sdk.extensions.ml
-
A PTransform using the Cloud AI Natural language processing capability.
- AnnotateText() - Constructor for class org.apache.beam.sdk.extensions.ml.AnnotateText
-
- AnnotateText.Builder - Class in org.apache.beam.sdk.extensions.ml
-
- AnnotateVideoFromBytes(PCollectionView<Map<ByteString, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytes
-
- AnnotateVideoFromBytesWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromBytesWithContext
-
- AnnotateVideoFromUri(PCollectionView<Map<String, VideoContext>>, List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromUri
-
- AnnotateVideoFromURIWithContext(List<Feature>) - Constructor for class org.apache.beam.sdk.extensions.ml.VideoIntelligence.AnnotateVideoFromURIWithContext
-
- any(long) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Sample#any(long)
takes a PCollection<T>
and a limit, and produces a new PCollection<T>
containing up to limit elements of the input PCollection
.
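A minimal sketch of Sample.any, assuming an existing PCollection<String> named lines:

    import org.apache.beam.sdk.transforms.Sample;
    import org.apache.beam.sdk.values.PCollection;

    // Keep at most 10 arbitrary (not necessarily uniformly sampled) elements of the input.
    PCollection<String> upToTen = lines.apply(Sample.any(10));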
- anyCombineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a
Combine.CombineFn
that computes a fixed-sized potentially non-uniform sample of its
inputs.
- anyOf(Iterable<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- anyOf(SerializableMatcher<T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- anything() - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- anyValueCombineFn() - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a
Combine.CombineFn
that computes a single and potentially non-uniform sample value of
its inputs.
- append(K, W, Iterator<V>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Appends the values to the bag user state for the given key and window.
- appendRows(long, ProtoRows) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Append rows to a Storage API write stream at the given offset.
- applicableTo(PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
- ApplicationNameOptions - Interface in org.apache.beam.sdk.options
-
Options that allow setting the application name.
- apply(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
-
- apply(KV<String, Long>) - Method in class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
-
- apply(InputT) - Method in interface org.apache.beam.sdk.coders.DelegateCoder.CodingFunction
-
- apply(Pipeline, String, RunnerApi.FunctionSpec, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionService.TransformProvider
-
- apply(Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamSetOperatorsTransforms.BeamSqlRow2KvFn
-
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- apply(T1, T2) - Method in interface org.apache.beam.sdk.function.ThrowingBiFunction
-
- apply(T1) - Method in interface org.apache.beam.sdk.function.ThrowingFunction
-
- apply(Schema) - Method in interface org.apache.beam.sdk.io.AvroSink.DatumWriterFactory
-
- apply(Schema, Schema) - Method in interface org.apache.beam.sdk.io.AvroSource.DatumReaderFactory
-
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
- apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
-
- apply(HealthcareIOError<T>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
-
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
-
- apply(PubsubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
-
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
-
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
- apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
-
- apply(SQLException) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
-
- apply(Void) - Method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
-
- apply(SQLException) - Method in interface org.apache.beam.sdk.io.jdbc.JdbcIO.RetryStrategy
-
- apply(String, Session) - Method in class org.apache.beam.sdk.io.jms.TextMessageMapper
-
- apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
-
- apply(MongoCollection<Document>) - Method in class org.apache.beam.sdk.io.mongodb.FindQuery
-
- apply(Void) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
-
- apply(FileIO.ReadableFile, OffsetRange, Exception) - Method in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource.ReadFileRangesFnExceptionHandler
-
- apply(Void) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
-
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
-
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.Pipeline
-
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
-
- apply(Schema, Schema) - Method in interface org.apache.beam.sdk.schemas.transforms.Cast.Validator
-
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
-
- apply(Schema, Schema) - Method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold
-
- apply(T) - Method in class org.apache.beam.sdk.testing.PAssert.MatcherCheckerFn
-
- apply(Statement, Description) - Method in class org.apache.beam.sdk.testing.TestPipeline
-
- apply(double, double) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Applies the binary operation to the two operands, returning the result.
- apply(V, V) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Applies the binary operation to the two operands, returning the result.
- apply(int, int) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Applies the binary operation to the two operands, returning the result.
- apply(long, long) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Applies the binary operation to the two operands, returning the result.
- apply(Iterable<? extends InputT>) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Applies this CombineFn
to a collection of input values to produce a combined output
value.
- apply(Iterable<? extends InputT>, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Applies this CombineFnWithContext
to a collection of input values to produce a
combined output value.
- apply(InputT, Contextful.Fn.Context) - Method in interface org.apache.beam.sdk.transforms.Contextful.Fn
-
Invokes the function on the given input with the given context.
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
- apply(PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- apply(String, PTransform<KeyedPCollectionTuple<K>, OutputT>) - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Applies the given
PTransform
to this input
KeyedPCollectionTuple
and returns
its
OutputT
.
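A KeyedPCollectionTuple is typically built up from tagged keyed inputs and then applied to CoGroupByKey. A minimal sketch, assuming two existing keyed collections named orders and names:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    // Assumes PCollection<KV<String, Integer>> orders and PCollection<KV<String, String>> names.
    TupleTag<Integer> ordersTag = new TupleTag<>();
    TupleTag<String> namesTag = new TupleTag<>();
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(ordersTag, orders)
            .and(namesTag, names)
            .apply("JoinByKey", CoGroupByKey.create());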
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.ProcessFunction
-
Returns the result of invoking this function on the given input.
- apply(InputT) - Method in interface org.apache.beam.sdk.transforms.SerializableFunction
-
Returns the result of invoking this function on the given input.
- apply(InputT) - Method in class org.apache.beam.sdk.transforms.SimpleFunction
-
- apply(PrimitiveViewT) - Method in class org.apache.beam.sdk.transforms.ViewFn
-
A function to adapt a primitive view type to a desired view type.
- apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionAsMapHandler
-
- apply(WithFailures.ExceptionElement<T>) - Method in class org.apache.beam.sdk.transforms.WithFailures.ThrowableHandler
-
- apply(PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
- apply(String, PTransform<? super PBegin, OutputT>) - Method in class org.apache.beam.sdk.values.PBegin
-
Applies the given
PTransform
to this
PBegin
, using
name
to identify
this specific application of the transform.
- apply(PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
- apply(String, PTransform<? super PCollection<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollection
-
Applies the given
PTransform
to this input
PCollection
, using
name
to
identify this specific application of the transform.
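Supplying a name to apply keeps transform names stable and readable in monitoring UIs and across pipeline updates. A minimal sketch, assuming an existing PCollection<String> named words:

    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // "CountWords" identifies this particular application of Count.perElement().
    PCollection<KV<String, Long>> wordCounts =
        words.apply("CountWords", Count.perElement());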
- apply(PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- apply(String, PTransform<PCollectionList<T>, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Applies the given
PTransform
to this input
PCollectionList
, using
name
to identify this specific application of the transform.
- apply(PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- apply(String, PTransform<? super PCollectionRowTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- apply(PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- apply(String, PTransform<? super PCollectionTuple, OutputT>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
-
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- apply(Materializations.MultimapView<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
-
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
-
- apply(Materializations.MultimapView<Void, KV<K, V>>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- apply(Materializations.MultimapView<K, V>) - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
-
- apply(Materializations.MultimapView<Void, T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- apply(Materializations.IterableView<T>) - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
- applySdkEnvironmentOverrides(RunnerApi.Pipeline, DataflowPipelineOptions) - Method in class org.apache.beam.runners.dataflow.DataflowRunner
-
- applyTransform(InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- applyTransform(String, InputT, PTransform<? super InputT, OutputT>) - Static method in class org.apache.beam.sdk.Pipeline
-
For internal use only; no backwards-compatibility guarantees.
- applyWindowing() - Method in class org.apache.beam.sdk.transforms.PeriodicImpulse
-
- ApproximateCountDistinct - Class in org.apache.beam.sdk.extensions.zetasketch
-
PTransform
s for estimating the number of distinct elements in a PCollection
, or
the number of distinct values associated with each key in a PCollection
of KV
s.
- ApproximateCountDistinct() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
-
- ApproximateCountDistinct.Globally<T> - Class in org.apache.beam.sdk.extensions.zetasketch
-
PTransform
for estimating the number of distinct elements in a PCollection
.
- ApproximateCountDistinct.Globally.Builder<T> - Class in org.apache.beam.sdk.extensions.zetasketch
-
- ApproximateCountDistinct.PerKey<K,V> - Class in org.apache.beam.sdk.extensions.zetasketch
-
- ApproximateCountDistinct.PerKey.Builder<K,V> - Class in org.apache.beam.sdk.extensions.zetasketch
-
- ApproximateDistinct - Class in org.apache.beam.sdk.extensions.sketching
-
PTransform
s for computing the approximate number of distinct elements in a stream.
- ApproximateDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
-
- ApproximateDistinct.ApproximateDistinctFn<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
- ApproximateDistinct.GloballyDistinct<InputT> - Class in org.apache.beam.sdk.extensions.sketching
-
- ApproximateDistinct.HyperLogLogPlusCoder - Class in org.apache.beam.sdk.extensions.sketching
-
Coder for HyperLogLogPlus
class.
- ApproximateDistinct.PerKeyDistinct<K,V> - Class in org.apache.beam.sdk.extensions.sketching
-
- ApproximateQuantiles - Class in org.apache.beam.sdk.transforms
-
PTransforms for getting an idea of a PCollection's data distribution using approximate N-tiles (e.g. quartiles or percentiles), either globally or per-key.
- ApproximateQuantiles.ApproximateQuantilesCombineFn<T,ComparatorT extends java.util.Comparator<T> & java.io.Serializable> - Class in org.apache.beam.sdk.transforms
-
The ApproximateQuantilesCombineFn
combiner gives an idea of the distribution of a
collection of values using approximate N
-tiles.
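A minimal sketch of ApproximateQuantiles, assuming an existing PCollection<Integer> named latencies:

    import java.util.List;
    import org.apache.beam.sdk.transforms.ApproximateQuantiles;
    import org.apache.beam.sdk.values.PCollection;

    // Emits a single 5-element list approximating the minimum, the three quartiles, and the maximum.
    PCollection<List<Integer>> quartiles =
        latencies.apply(ApproximateQuantiles.globally(5));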
- ApproximateUnique - Class in org.apache.beam.sdk.transforms
-
- ApproximateUnique() - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.
- ApproximateUnique.ApproximateUniqueCombineFn<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
CombineFn
that computes an estimate of the number of distinct values that were
combined.
- ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique - Class in org.apache.beam.sdk.transforms
-
Deprecated.
A heap utility class to efficiently track the largest added elements.
- ApproximateUnique.Globally<T> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
PTransform
for estimating the number of distinct elements in a PCollection
.
- ApproximateUnique.PerKey<K,V> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
PTransform
for estimating the number of distinct values associated with each key in a
PCollection
of KV
s.
- ApproximateUniqueCombineFn(long, Coder<T>) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- array() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns the backing array.
- array(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- array(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Create an array type for the given field type.
- array(Schema.FieldType, boolean) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
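A minimal sketch of building a Beam schema containing an array field:

    import org.apache.beam.sdk.schemas.Schema;

    // A schema with a repeated (array-of-string) field named "tags".
    Schema schema =
        Schema.builder()
            .addField("tags", Schema.FieldType.array(Schema.FieldType.STRING))
            .build();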
- ARRAY_AGG_FN - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
- ArrayAgg - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
-
- ArrayAgg() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg
-
- ArrayAgg.ArrayAggArray<T> - Class in org.apache.beam.sdk.extensions.sql.impl.udaf
-
- ArrayAggArray() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
- arrayContaining(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayContaining(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayContaining(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayContaining(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayContainingInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayContainingInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayContainingInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayContainingInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- ArrayCopyState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayCopyState
-
- arrayElementType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- ArrayNewState() - Constructor for class org.apache.beam.sdk.jmh.util.ByteStringOutputStreamBenchmark.NewVsCopy.ArrayNewState
-
- ArrayOfNestedStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle
-
- ArrayOfStringBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle
-
- arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- arrayQualifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
-
- ArrayQualifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
-
- ArrayQualifierListContext(FieldSpecifierNotationParser.QualifierListContext) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierListContext
-
- arrayWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- arrayWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- ArrowConversion - Class in org.apache.beam.sdk.extensions.arrow
-
Utilities to create
Iterable
s of Beam
Row
instances backed by Arrow record
batches.
- ArrowConversion.ArrowSchemaTranslator - Class in org.apache.beam.sdk.extensions.arrow
-
Converts Arrow schema to Beam row schema.
- ArrowConversion.RecordBatchRowIterator - Class in org.apache.beam.sdk.extensions.arrow
-
- arrowSchemaFromInput(InputStream) - Static method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion
-
- ArrowSchemaTranslator() - Constructor for class org.apache.beam.sdk.extensions.arrow.ArrowConversion.ArrowSchemaTranslator
-
- ArtifactDestination() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
-
- ArtifactRetrievalService - Class in org.apache.beam.runners.fnexecution.artifact
-
- ArtifactRetrievalService() - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- ArtifactRetrievalService(ArtifactResolver) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- ArtifactRetrievalService(int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- ArtifactRetrievalService(ArtifactResolver, int) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- ArtifactStagingService - Class in org.apache.beam.runners.fnexecution.artifact
-
- ArtifactStagingService(ArtifactStagingService.ArtifactDestinationProvider) - Constructor for class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
- ArtifactStagingService.ArtifactDestination - Class in org.apache.beam.runners.fnexecution.artifact
-
A pairing of a newly created artifact type and an output stream that will be readable at that type.
- ArtifactStagingService.ArtifactDestinationProvider - Interface in org.apache.beam.runners.fnexecution.artifact
-
Provides a concrete location to which artifacts can be staged on retrieval.
- as(Class<T>) - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Transforms this object into an object of type <T>
saving each property that has been
manipulated.
- as(Class<T>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Creates and returns an object that implements <T>
.
- as(Class<T>) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements <T>
using the values configured on this
builder during construction.
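A minimal sketch of creating options with PipelineOptionsFactory, assuming a String[] args from main; MyOptions is a hypothetical PipelineOptions sub-interface:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Default options, then options parsed and validated from command-line arguments.
    PipelineOptions defaults = PipelineOptionsFactory.create();
    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);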
- asCloudObject(Coder<?>, SdkComponents) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
-
- asInputStream(int, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an InputStream
wrapper which supplies the portion of this backing byte buffer
starting at offset
and up to length
bytes.
- asIterable() - Static method in class org.apache.beam.sdk.transforms.View
-
- AsJsons<InputT> - Class in org.apache.beam.sdk.extensions.jackson
-
PTransform
for serializing objects to JSON
Strings
.
- AsJsons.AsJsonsWithFailures<FailureT> - Class in org.apache.beam.sdk.extensions.jackson
-
A
PTransform
that adds exception handling to
AsJsons
.
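A minimal sketch of AsJsons, assuming an existing PCollection<MyPojo> named pojos, where MyPojo is a hypothetical Jackson-serializable class:

    import org.apache.beam.sdk.extensions.jackson.AsJsons;
    import org.apache.beam.sdk.values.PCollection;

    // Serialize each element to a JSON string using Jackson.
    PCollection<String> json = pojos.apply(AsJsons.of(MyPojo.class));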
- asList() - Static method in class org.apache.beam.sdk.transforms.View
-
- asMap() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
-
- asMap() - Static method in class org.apache.beam.sdk.transforms.View
-
- asMultimap() - Static method in class org.apache.beam.sdk.transforms.View
-
- asOutputReference(PValue, AppliedPTransform<?, ?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Encode a PValue reference as an output reference.
- asOutputStream() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns an output stream which writes to the backing buffer from the current position.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub
API.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
- asQueryable(QueryProvider, SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
-
- asResponseObserver() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
- assertionError() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
-
- assertSourcesEqualReferenceSource(BoundedSource<T>, List<? extends BoundedSource<T>>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Given a reference Source
and a list of Source
s, assert that the union of the
records read from the list of sources is equal to the records read from the reference source.
- assertSplitAtFractionBehavior(BoundedSource<T>, int, double, SourceTestUtils.ExpectedSplitOutcome, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
- assertSplitAtFractionExhaustive(BoundedSource<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that for each possible start position, BoundedSource.BoundedReader#splitAtFraction
at every interesting fraction (halfway between two
fractions that differ by at least one item) can be called successfully and the results are
consistent if a split succeeds.
- assertSplitAtFractionFails(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Asserts that the source
's reader fails to splitAtFraction(fraction)
after
reading numItemsToReadBeforeSplit
items.
- assertSplitAtFractionSucceedsAndConsistent(BoundedSource<T>, int, double, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Verifies some consistency properties of BoundedSource.BoundedReader#splitAtFraction
on
the given source.
- assertSubscriptionEventuallyCreated(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Block until a subscription is created for this test topic in the specified project.
- assertThatAllRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
- assertThatTopicEventuallyReceives(Matcher<PubsubMessage>...) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Repeatedly pull messages from
TestPubsub.subscriptionPath()
until receiving one for each matcher
(or timeout is reached), then assert that the received messages match the expectations.
- assertUnstartedReaderReadsSameAsItsSource(BoundedSource.BoundedReader<T>, PipelineOptions) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Assert that a Reader
returns a Source
that, when read from, produces the same
records as the reader.
- assign(BoundedWindow, Instant) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
- assignableTo(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if this Schema can be assigned to another Schema.
- assignableToIgnoreNullable(Schema) - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns true if this Schema can be assigned to another Schema, ignoring nullable.
- AssignContext() - Constructor for class org.apache.beam.sdk.transforms.windowing.WindowFn.AssignContext
-
- assignedWindows(WindowFn<T, W>, long) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
- assignedWindowsWithValue(WindowFn<T, W>, TimestampedValue<T>) - Static method in class org.apache.beam.sdk.testing.WindowFnTestUtils
-
- AssignShardFn(Integer) - Constructor for class org.apache.beam.sdk.transforms.Reshuffle.AssignShardFn
-
Deprecated.
- assignShardKey(DestinationT, UserT, int) - Method in interface org.apache.beam.sdk.io.ShardingFunction
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- assignsToOneWindow() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns true if this
WindowFn
always assigns an element to exactly one window.
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
- assignWindow(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
Returns the single window to which elements with this timestamp belong.
- AssignWindowP<T> - Class in org.apache.beam.runners.jet.processors
-
Jet Processor implementation for Beam's Windowing primitive.
- assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
- assignWindows(WindowFn<Object, GlobalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
- assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
- assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
- assignWindows(WindowFn<Object, IntervalWindow>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- assignWindows(Instant) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- assignWindows(WindowFn<T, W>.AssignContext) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Given a timestamp and element, returns the set of windows into which it should be placed.
- AssignWindowsFunction<T> - Class in org.apache.beam.runners.twister2.translators.functions
-
Assign Windows function.
- AssignWindowsFunction(WindowFn<T, BoundedWindow>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
-
- assignWindowsMapFunction(WindowFn<T, W>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.WindowingHelpers
-
- AssignWindowTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
-
Assign Window translator.
- AssignWindowTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.AssignWindowTranslatorBatch
-
- asSingleton() - Static method in class org.apache.beam.sdk.transforms.View
-
- asSingletonView() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns a
PTransform
that produces a
PCollectionView
whose elements are the
result of combining elements per-window in the input
PCollection
.
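A minimal sketch of asSingletonView used as a side input, assuming an existing PCollection<Integer> named numbers:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionView;

    // Combine globally per window and expose the result as a singleton side input.
    PCollectionView<Integer> totalView =
        numbers.apply(Sum.integersGlobally().asSingletonView());

    PCollection<Double> fractions =
        numbers.apply(
            ParDo.of(
                    new DoFn<Integer, Double>() {
                      @ProcessElement
                      public void process(ProcessContext c) {
                        int total = c.sideInput(totalView);
                        c.output(total == 0 ? 0.0 : (double) c.element() / total);
                      }
                    })
                .withSideInputs(totalView));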
- assumeSingleMessageSchema() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
-
- ASTERISK - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
-
- ASTERISK_RELUCTANT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.cep.Quantifier
-
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
- atMinimumTimestamp(V) - Static method in class org.apache.beam.sdk.values.TimestampedValue
-
- AtomicCoder<T> - Class in org.apache.beam.sdk.coders
-
A
Coder
that has no component
Coders
or other configuration.
- AtomicCoder() - Constructor for class org.apache.beam.sdk.coders.AtomicCoder
-
- AtomicLongFactory() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
-
- attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
- attachValues(List<Object>) - Method in class org.apache.beam.sdk.values.Row.Builder
-
- attachValues(Object...) - Method in class org.apache.beam.sdk.values.Row.Builder
-
- attempted(MetricKey, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
-
- ATTRIBUTE_ARRAY_ENTRY_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
- ATTRIBUTE_ARRAY_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
- ATTRIBUTE_MAP_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
- AttributeValueCoder - Class in org.apache.beam.sdk.io.aws.dynamodb
-
- AttributeValueCoder - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
- AttributeValueCoderProviderRegistrar - Class in org.apache.beam.sdk.io.aws.dynamodb
-
- AttributeValueCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoderProviderRegistrar
-
- AUTH_VALIDATION_GROUP - Static variable in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- AuthenticatedRetryInitializer(GoogleCredentials) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
-
- AUTO - Static variable in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- autoCastField(Schema.Field, Object) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
Attempt to cast an object to a specified Schema.Field.Type.
- autoLoadUserDefinedFunctions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.BeamSqlEnvBuilder
-
- AutoScaler - Interface in org.apache.beam.sdk.io.jms
-
Enables users to specify their own `JMS` backlog reporters enabling
JmsIO
to report
UnboundedSource.UnboundedReader#getTotalBacklogBytes()
.
- AutoValueSchema - Class in org.apache.beam.sdk.schemas
-
- AutoValueSchema() - Constructor for class org.apache.beam.sdk.schemas.AutoValueSchema
-
- AutoValueSchema.AbstractGetterTypeSupplier - Class in org.apache.beam.sdk.schemas
-
- AutoValueUtils - Class in org.apache.beam.sdk.schemas.utils
-
Utilities for managing AutoValue schemas.
- AutoValueUtils() - Constructor for class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
- AvailableParallelismFactory() - Constructor for class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
-
- AvroCoder<T> - Class in org.apache.beam.sdk.coders
-
A
Coder
using Avro binary format.
- AvroCoder(Class<T>, Schema) - Constructor for class org.apache.beam.sdk.coders.AvroCoder
-
- AvroCoder(Class<T>, Schema, boolean) - Constructor for class org.apache.beam.sdk.coders.AvroCoder
-
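A minimal sketch of setting AvroCoder on a collection, where MyRecord is a hypothetical Avro-compatible class:

    import org.apache.beam.sdk.coders.AvroCoder;
    import org.apache.beam.sdk.values.PCollection;

    // Assumes an existing PCollection<MyRecord> named records.
    records.setCoder(AvroCoder.of(MyRecord.class));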
- AvroCoder.JodaTimestampConversion - Class in org.apache.beam.sdk.coders
-
Conversion for DateTime.
- AvroConvertType(boolean) - Constructor for class org.apache.beam.sdk.schemas.utils.AvroUtils.AvroConvertType
-
- AvroGenericCoder - Class in org.apache.beam.sdk.coders
-
AvroCoder specialisation for GenericRecord.
- AvroIO - Class in org.apache.beam.sdk.io
-
- AvroIO.Parse<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.ParseAll<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.ParseFiles<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.Read<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.ReadAll<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.ReadFiles<T> - Class in org.apache.beam.sdk.io
-
- AvroIO.RecordFormatter<ElementT> - Interface in org.apache.beam.sdk.io
-
- AvroIO.Sink<ElementT> - Class in org.apache.beam.sdk.io
-
- AvroIO.TypedWrite<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- AvroIO.Write<T> - Class in org.apache.beam.sdk.io
-
- AvroPayloadSerializerProvider - Class in org.apache.beam.sdk.schemas.io.payloads
-
- AvroPayloadSerializerProvider() - Constructor for class org.apache.beam.sdk.schemas.io.payloads.AvroPayloadSerializerProvider
-
- AvroReader(AvroSource<T>) - Constructor for class org.apache.beam.sdk.io.AvroSource.AvroReader
-
Reads Avro records of type T
from the specified source.
- AvroRecordSchema - Class in org.apache.beam.sdk.schemas
-
- AvroRecordSchema() - Constructor for class org.apache.beam.sdk.schemas.AvroRecordSchema
-
- AvroSchemaIOProvider - Class in org.apache.beam.sdk.io
-
- AvroSchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.AvroSchemaIOProvider
-
- AvroSink<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- AvroSink.DatumWriterFactory<T> - Interface in org.apache.beam.sdk.io
-
- AvroSource<T> - Class in org.apache.beam.sdk.io
-
Do not use in pipelines directly: most users should use
AvroIO.Read
.
- AvroSource.AvroReader<T> - Class in org.apache.beam.sdk.io
-
- AvroSource.DatumReaderFactory<T> - Interface in org.apache.beam.sdk.io
-
- AvroTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.avro
-
- AvroTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
-
- AvroUtils - Class in org.apache.beam.sdk.schemas.utils
-
Utils to convert AVRO records to Beam rows.
- AvroUtils.AvroConvertType - Class in org.apache.beam.sdk.schemas.utils
-
- AvroUtils.AvroConvertValueForGetter - Class in org.apache.beam.sdk.schemas.utils
-
- AvroUtils.AvroConvertValueForSetter - Class in org.apache.beam.sdk.schemas.utils
-
- AvroUtils.FixedBytesField - Class in org.apache.beam.sdk.schemas.utils
-
Wrapper for fixed byte fields.
- AvroWriteRequest<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
- AvroWriteRequest(T, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
-
- awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Deprecated.
- awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver2
-
Uses the caller's thread to process all elements received until the end of the stream is received from the upstream producer for all specified endpoints.
- awaitCompletion() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
-
- awaitCompletion() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
-
Deprecated.
Block until the client has completed reading from the inbound stream.
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkPipelineResult
-
- awaitTermination(Duration) - Method in class org.apache.beam.runners.spark.SparkRunnerDebugger.DebugSparkPipelineResult
-
- AwsClientsProvider - Interface in org.apache.beam.sdk.io.aws.dynamodb
-
Provides instances of AWS clients.
- AwsClientsProvider - Interface in org.apache.beam.sdk.io.aws.sns
-
Provides instances of AWS clients.
- AWSClientsProvider - Interface in org.apache.beam.sdk.io.aws2.kinesis
-
- AWSClientsProvider - Interface in org.apache.beam.sdk.io.kinesis
-
Provides instances of AWS clients.
- AwsCoders - Class in org.apache.beam.sdk.io.aws.coders
-
Coder
s for common AWS SDK objects.
- AwsCoders - Class in org.apache.beam.sdk.io.aws2.coders
-
Coder
s for common AWS SDK objects.
- AwsModule - Class in org.apache.beam.sdk.io.aws.options
-
- AwsModule() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsModule
-
- AwsModule - Class in org.apache.beam.sdk.io.aws2.options
-
- AwsModule() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsModule
-
- AwsOptions - Interface in org.apache.beam.sdk.io.aws.options
-
Options used to configure Amazon Web Services specific options such as credentials and region.
- AwsOptions - Interface in org.apache.beam.sdk.io.aws2.options
-
Options used to configure Amazon Web Services specific options such as credentials and region.
- AwsOptions.AwsRegionFactory - Class in org.apache.beam.sdk.io.aws.options
-
Attempt to load default region.
- AwsOptions.AwsRegionFactory - Class in org.apache.beam.sdk.io.aws2.options
-
Attempt to load default region.
- AwsOptions.AwsUserCredentialsFactory - Class in org.apache.beam.sdk.io.aws.options
-
Attempts to load AWS credentials.
- AwsOptions.AwsUserCredentialsFactory - Class in org.apache.beam.sdk.io.aws2.options
-
- AwsOptions.ClientConfigurationFactory - Class in org.apache.beam.sdk.io.aws.options
-
Default AWS client configuration.
- AwsPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.aws.options
-
A registrar containing the default AWS options.
- AwsPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsPipelineOptionsRegistrar
-
- AwsPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.aws2.options
-
A registrar containing the default AWS options.
- AwsPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
-
- AwsRegionFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsRegionFactory
-
- AwsRegionFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
-
- awsResponseMetadata() - Static method in class org.apache.beam.sdk.io.aws2.coders.AwsCoders
-
- AwsSerializableUtils - Class in org.apache.beam.sdk.io.aws2.options
-
Utilities for working with AWS Serializables.
- AwsSerializableUtils() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
-
- AwsSerializableUtils - Class in org.apache.beam.sdk.io.kinesis.serde
-
Utilities for working with AWS Serializables.
- AwsSerializableUtils() - Constructor for class org.apache.beam.sdk.io.kinesis.serde.AwsSerializableUtils
-
- AwsUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsUserCredentialsFactory
-
- AwsUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
-
- AzureBlobStoreFileSystemRegistrar - Class in org.apache.beam.sdk.io.azure.blobstore
-
AutoService
registrar for the AzureBlobStoreFileSystem
.
- AzureBlobStoreFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
-
- AzureModule - Class in org.apache.beam.sdk.io.azure.options
-
- AzureModule() - Constructor for class org.apache.beam.sdk.io.azure.options.AzureModule
-
- AzurePipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.azure.options
-
A registrar containing the default Azure options.
- AzurePipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
-
- AzureUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.io.azure.options.BlobstoreOptions.AzureUserCredentialsFactory
-
- CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- CACHED_CREATORS - Static variable in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- CachedSideInputReader - Class in org.apache.beam.runners.spark.structuredstreaming.translation.utils
-
SideInputReader
that caches materialized views.
- CachedSideInputReader - Class in org.apache.beam.runners.spark.util
-
SideInputReader
that caches materialized views.
- CachingFactory<CreatedT> - Class in org.apache.beam.sdk.schemas
-
A wrapper around a
Factory
that assumes the schema parameter never changes.
- CachingFactory(Factory<CreatedT>) - Constructor for class org.apache.beam.sdk.schemas.CachingFactory
-
- CalciteConnectionWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
-
Abstract wrapper for CalciteConnection
to simplify extension.
- CalciteConnectionWrapper(CalciteConnection) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- CalciteFactoryWrapper - Class in org.apache.beam.sdk.extensions.sql.impl
-
Wrapper for CalciteFactory
.
- CalciteQueryPlanner - Class in org.apache.beam.sdk.extensions.sql.impl
-
The core component that handles a SQL statement end to end, from explaining the execution plan to generating a Beam pipeline.
- CalciteQueryPlanner(JdbcConnection, Collection<RuleSet>) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
Called by
BeamSqlEnv
.instantiatePlanner() reflectively.
- CalciteQueryPlanner.NonCumulativeCostImpl - Class in org.apache.beam.sdk.extensions.sql.impl
-
- CalciteUtils - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
Utility methods for Calcite related operations.
- CalciteUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- CalciteUtils.CharType - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
A LogicalType corresponding to CHAR.
- CalciteUtils.TimeWithLocalTzType - Class in org.apache.beam.sdk.extensions.sql.impl.utils
-
A LogicalType corresponding to TIME_WITH_LOCAL_TIME_ZONE.
- CalcRelSplitter - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
CalcRelSplitter operates on a Calc
with multiple RexCall
sub-expressions that
cannot all be implemented by a single concrete RelNode
.
- CalcRelSplitter(Calc, RelBuilder, CalcRelSplitter.RelType[]) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Constructs a CalcRelSplitter.
- CalcRelSplitter.RelType - Class in org.apache.beam.sdk.extensions.sql.impl.rel
-
Type of relational expression.
- CalendarWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A collection of
WindowFn
s that windows values into calendar-based windows such as spans
of days, months, or years.
- CalendarWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
- CalendarWindows.DaysWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn
that windows elements into periods measured by days.
- CalendarWindows.MonthsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn
that windows elements into periods measured by months.
- CalendarWindows.YearsWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn
that windows elements into periods measured by years.
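A minimal sketch of windowing into calendar days, assuming an existing timestamped PCollection<LogEvent> named events, where LogEvent is a hypothetical element type:

    import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    // Assign each element to the calendar day containing its timestamp.
    PCollection<LogEvent> daily =
        events.apply(Window.<LogEvent>into(CalendarWindows.days(1)));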
- call(Tuple2<ByteArray, byte[]>) - Method in class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
-
- call(Iterator<WindowedValue<InputT>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.DoFnFunction
-
- call(K, Iterator<WindowedValue<KV<K, InputT>>>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
-
- cancel() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
- cancel() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- cancel() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
-
- cancel() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
-
- cancel() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
-
- cancel() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
-
- cancel() - Method in class org.apache.beam.runners.jet.JetPipelineResult
-
- cancel(JobApi.CancelJobRequest, StreamObserver<JobApi.CancelJobResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- cancel() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Cancel the job.
- cancel() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
-
- cancel() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
-
- cancel() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
-
- cancel(Exception) - Method in class org.apache.beam.sdk.fn.CancellableQueue
-
- cancel() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Deprecated.
- cancel() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
-
- cancel() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
-
Deprecated.
Cancels the client, causing it to drop any future inbound data.
- cancel() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.BigQueryServerStream
-
Cancels the stream, releasing any client- and server-side resources.
- cancel() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
-
- cancel() - Method in interface org.apache.beam.sdk.PipelineResult
-
Cancels the pipeline execution.
- CancellableQueue<T> - Class in org.apache.beam.sdk.fn
-
A simplified ThreadSafe blocking queue that can be cancelled, freeing any blocked Threads and preventing future Threads from blocking.
- CancellableQueue(int) - Constructor for class org.apache.beam.sdk.fn.CancellableQueue
-
Creates a
ThreadSafe
blocking queue with a maximum capacity.
- cancelled() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that the pipeline has been cancelled.
- canConvertConvention(Convention) - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
-
- canImplement(LogicalCalc, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
- canImplement(RexFieldAccess) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
- canImplement(RexDynamicParam) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
- canImplement(RexLiteral) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
- canImplement(RexCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
- canImplement(RexNode, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
Returns whether this RelType
can implement a given expression.
- canImplement(RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter.RelType
-
Returns whether this tester's RelType
can implement a given program.
- CannotProvideCoderException - Exception in org.apache.beam.sdk.coders
-
- CannotProvideCoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(String, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(String, Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException(Throwable, CannotProvideCoderException.ReasonCode) - Constructor for exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- CannotProvideCoderException.ReasonCode - Enum in org.apache.beam.sdk.coders
-
Indicates the reason that
Coder
inference failed.
- canStopPolling(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
- CassandraIO - Class in org.apache.beam.sdk.io.cassandra
-
An IO to read and write from/to Apache Cassandra.
- CassandraIO.MutationType - Enum in org.apache.beam.sdk.io.cassandra
-
Specify the mutation type: either write or delete.
- CassandraIO.Read<T> - Class in org.apache.beam.sdk.io.cassandra
-
- CassandraIO.ReadAll<T> - Class in org.apache.beam.sdk.io.cassandra
-
- CassandraIO.Write<T> - Class in org.apache.beam.sdk.io.cassandra
-
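A minimal sketch of reading with CassandraIO, assuming a Pipeline named p and a hypothetical Cassandra-mapped entity class Person; the host, keyspace, and table names are placeholders:

    import java.util.Arrays;
    import org.apache.beam.sdk.coders.SerializableCoder;
    import org.apache.beam.sdk.io.cassandra.CassandraIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<Person> people =
        p.apply(
            CassandraIO.<Person>read()
                .withHosts(Arrays.asList("cassandra-host"))
                .withPort(9042)
                .withKeyspace("my_keyspace")
                .withTable("person")
                .withEntity(Person.class)
                .withCoder(SerializableCoder.of(Person.class)));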
- Cast<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Set of utilities for casting rows between schemas.
- Cast() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast
-
- Cast.CompatibilityError - Class in org.apache.beam.sdk.schemas.transforms
-
Describes compatibility errors during casting.
- Cast.Narrowing - Class in org.apache.beam.sdk.schemas.transforms
-
Narrowing changes type without guarantee to preserve data.
- Cast.Validator - Interface in org.apache.beam.sdk.schemas.transforms
-
Interface for statically validating casts.
- Cast.Widening - Class in org.apache.beam.sdk.schemas.transforms
-
Widening changes to type that can represent any possible value of the original type.
- CAST_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
- CastFunctionImpl - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
ZetaSQLCastFunctionImpl.
- CastFunctionImpl() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
-
- castNumber(Number, Schema.TypeName, Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
- castRow(Row, Schema, Schema) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
- castValue(Object, Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
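A minimal sketch of Cast.castRow widening an INT32 field to INT64:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.transforms.Cast;
    import org.apache.beam.sdk.values.Row;

    Schema inputSchema = Schema.builder().addInt32Field("count").build();
    Schema outputSchema = Schema.builder().addInt64Field("count").build();
    Row input = Row.withSchema(inputSchema).addValue(42).build();
    // The "count" value is widened from int to long.
    Row output = Cast.castRow(input, inputSchema, outputSchema);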
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
-
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
-
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
-
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
-
- catalogName() - Method in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
-
- CEPCall - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
A CEPCall
instance represents an operation (node) that contains an operator and a list of
operands.
- CEPFieldRef - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
A CEPFieldRef
instance represents a node that points to a specified field in a Row
.
- CEPKind - Enum in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPKind
corresponds to Calcite's SqlKind
.
- CEPLiteral - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPLiteral
represents a literal node.
- CEPMeasure - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The CEPMeasure
class represents the Measures clause and contains information about output
columns.
- CEPMeasure(Schema, String, CEPOperation) - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
-
- CEPOperation - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
CEPOperation
is the base class for the evaluation operations defined in the DEFINE
syntax of MATCH_RECOGNIZE
.
- CEPOperation() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperation
-
- CEPOperator - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The CEPOperator
records the operators (i.e.
- CEPPattern - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
Core pattern class that stores the definition of a single pattern.
- CEPUtils - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
Some utility methods for transforming Calcite's constructs into our own Beam constructs (for serialization purposes).
- CEPUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
- ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Responsible for making change stream queries for a given partition.
- ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Class to aggregate metrics related functionality.
- ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the following metrics enabled by default.
- ChangeStreamMetrics(Set<MetricName>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the given metrics enabled.
- changeStreamQuery(String, Timestamp, Timestamp, long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamDao
-
Performs a change stream query.
- ChangeStreamRecord - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a Spanner Change Stream Record.
- ChangeStreamRecordMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
- changeStreamRecordMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
-
Creates and returns a singleton instance of a mapper class capable of transforming a
Struct
into a
List
of
ChangeStreamRecord
subclasses.
- ChangeStreamRecordMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
- ChangeStreamRecordMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
- ChangeStreamResultSet - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Decorator class over a
ResultSet
that provides telemetry for the streamed records.
- ChangeStreamResultSetMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents telemetry metadata gathered during the consumption of a change stream query.
- ChangeStreamsConstants - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Single place for defining the constants used in the Spanner.readChangeStreams()
connector.
- ChangeStreamsConstants() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
- channelNames - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- CHAR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- CHAR_LENGTH - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
- CHAR_LENGTH_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
-
- characters() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- charLength(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
-
- CharType() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.CharType
-
- check(RelNode) - Method in interface org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall.JoinChecker
-
- checkConfiguration(ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Check if all necessary configuration is available to create clients.
- checkConfiguration(ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
-
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
-
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Checks if the restriction has been processed successfully.
- checkDone() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
-
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
-
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
- checkDone() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Checks whether the restriction has been fully processed.
- checkIfAnySubscriptionExists(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
- checksum() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
-
An optional checksum to identify the contents of a file.
- ChildPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A child partition represents a new partition that should be queried.
- ChildPartition(String, HashSet<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parents that it originated
from.
- ChildPartition(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parent that it originated
from.
- ChildPartitionsRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a ChildPartitionsRecord.
- ChildPartitionsRecord(Timestamp, String, List<ChildPartition>, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Constructs a child partitions record containing one or more child partitions.
- childPartitionsRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
- ChildPartitionsRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
- CivilTimeEncoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Encoder for TIME and DATETIME values, according to civil_time encoding.
- classesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
-
- classesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
-
- ClassLoaderFileSystem - Class in org.apache.beam.sdk.io
-
A read-only
FileSystem
implementation looking up resources using a ClassLoader.
- ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar - Class in org.apache.beam.sdk.io
-
- ClassLoaderFileSystem.ClassLoaderResourceId - Class in org.apache.beam.sdk.io
-
- ClassLoaderFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
-
- classNamesToTranslators() - Method in interface org.apache.beam.runners.dataflow.util.CoderCloudObjectTranslatorRegistrar
-
- classNamesToTranslators() - Method in class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
-
- ClassWithSchema() - Constructor for class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
-
- CleanTmpFilesFromGcsFn(ValueProvider<String>, String) - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read.CleanTmpFilesFromGcsFn
-
Creates an object that will remove temporary files from the stage.
- cleanup() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
- CleanUpReadChangeStreamDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
- CleanUpReadChangeStreamDoFn(DaoFactory) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
-
- clear(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Clears the bag user state for the given key and window.
- clear() - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
-
- clear() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
-
- clear() - Static method in class org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator
-
- clear() - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
-
- clear() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- clear() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
-
- clear() - Method in interface org.apache.beam.sdk.state.State
-
Clear out the state location.
- clear() - Method in interface org.apache.beam.sdk.state.Timer
-
Clears a timer.
- clearCache() - Static method in class org.apache.beam.runners.spark.io.MicrobatchSource
-
- clearOutputElements() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- clearOutputElements(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- clearRange(Instant, Instant) - Method in interface org.apache.beam.sdk.state.OrderedListState
-
Clear a timestamp-limited subrange of the list.
- clearState(ReduceFn<K, T, Iterable<T>, W>.Context) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
-
- clearWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- ClickHouseIO - Class in org.apache.beam.sdk.io.clickhouse
-
An IO to write to ClickHouse.
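A minimal write sketch, assuming a schema-aware PCollection<Row> named rows; the JDBC URL and table name are placeholders, not taken from the source:

    import org.apache.beam.sdk.io.clickhouse.ClickHouseIO;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // rows must carry a Beam schema compatible with the target ClickHouse table.
    rows.apply(
        ClickHouseIO.<Row>write(
            "jdbc:clickhouse://localhost:8123/default",  // placeholder JDBC URL
            "events"));                                  // placeholder table name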
- ClickHouseIO() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
- ClickHouseIO.Write<T> - Class in org.apache.beam.sdk.io.clickhouse
-
- ClickHouseWriter - Class in org.apache.beam.sdk.io.clickhouse
-
Writes Rows and field values using ClickHouseRowBinaryStream
.
- ClickHouseWriter() - Constructor for class org.apache.beam.sdk.io.clickhouse.ClickHouseWriter
-
- CLIENT_EXECUTION_TIMEOUT - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
-
- clientBuffered(ExecutorService) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create a buffering
OutboundObserverFactory
for client-side RPCs with the specified
ExecutorService
and the default buffer size.
- clientBuffered(ExecutorService, int) - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Create a buffering
OutboundObserverFactory
for client-side RPCs with the specified
ExecutorService
and buffer size.
- ClientBuilderFactory - Interface in org.apache.beam.sdk.io.aws2.common
-
- ClientBuilderFactory.DefaultClientBuilder - Class in org.apache.beam.sdk.io.aws2.common
-
- ClientConfiguration - Class in org.apache.beam.sdk.io.aws2.common
-
AWS client configuration.
- ClientConfiguration() - Constructor for class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
- ClientConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.common
-
- ClientConfigurationFactory() - Constructor for class org.apache.beam.sdk.io.aws.options.AwsOptions.ClientConfigurationFactory
-
- clientDirect() - Static method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
- Clock - Interface in org.apache.beam.runners.direct
-
Access to the current time.
- clone() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
-
- clone() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
-
- clonesOf(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
-
- close() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil
-
- close() - Method in class org.apache.beam.runners.flink.metrics.FileReporter
-
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- close() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
-
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
-
- close() - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.WrappedSdkHarnessClient
-
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
- close() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
-
- close() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Closes this bundle.
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Blocks until bundle processing is finished.
- close() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
- close() - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- close() - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
-
- close() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
-
- close() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
- close() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
-
- close() - Method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
-
- close() - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
-
- close() - Method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
-
- close() - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
- close() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
-
- close() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
-
- close() - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- close() - Method in class org.apache.beam.runners.portability.CloseableResource
-
Closes the underlying resource.
- close(T) - Method in interface org.apache.beam.runners.portability.CloseableResource.Closer
-
- close() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- close() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
-
- close() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
-
- close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
- close() - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
-
- close() - Method in class org.apache.beam.sdk.extensions.arrow.ArrowConversion.RecordBatchRowIterator
-
- close() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- close() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
-
Deprecated.
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer2
-
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver2
-
- close() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundObserver
-
Deprecated.
- close() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
-
- close() - Method in interface org.apache.beam.sdk.fn.server.FnService
-
- close() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
- close() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
-
- close() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Closes the channel and returns the bundle result.
- close() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Closes any ReadableByteChannel
created for the current reader.
- close() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Close the client object.
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Gracefully close the underlying netty channel.
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- close() - Method in interface org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedBacklogReaderFactory
-
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedBacklogReaderFactoryImpl
-
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
-
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
-
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
-
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
-
- close() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Closes the reader.
- close() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.ThriftWriter
-
- close() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- CloseableFnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
-
A receiver of streamed data that can be closed.
- CloseableResource<T> - Class in org.apache.beam.runners.portability
-
An AutoCloseable
that wraps a resource that needs to be cleaned up but does not implement
AutoCloseable
itself.
- CloseableResource.CloseException - Exception in org.apache.beam.runners.portability
-
An exception that wraps errors thrown while a resource is being closed.
- CloseableResource.Closer<T> - Interface in org.apache.beam.runners.portability
-
A function that knows how to clean up after a resource.
- CloseableThrowingConsumer<ExceptionT extends java.lang.Exception,T> - Interface in org.apache.beam.sdk.function
-
- closeTo(double, double) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- CloudDebuggerOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options for controlling Cloud Debugger.
- CloudObject - Class in org.apache.beam.runners.dataflow.util
-
A representation of an arbitrary Java object to be instantiated by Dataflow workers.
- cloudObjectClassName() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
- cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
-
- cloudObjectClassName() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
-
- CloudObjects - Class in org.apache.beam.runners.dataflow.util
-
- CloudObjectTranslator<T> - Interface in org.apache.beam.runners.dataflow.util
-
A translator that takes an object and creates a
CloudObject
which can be converted back
to the original object.
- CloudPubsubTransforms - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
A class providing transforms between Cloud Pub/Sub and Pub/Sub Lite message types.
- CloudResourceManagerOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Properties needed when using Google CloudResourceManager with the Apache Beam SDK.
- CloudVision - Class in org.apache.beam.sdk.extensions.ml
-
Factory class for implementations of AnnotateImages
.
- CloudVision() - Constructor for class org.apache.beam.sdk.extensions.ml.CloudVision
-
- CloudVision.AnnotateImagesFromBytes - Class in org.apache.beam.sdk.extensions.ml
-
- CloudVision.AnnotateImagesFromBytesWithContext - Class in org.apache.beam.sdk.extensions.ml
-
Accepts
KV
s of
ByteString
(encoded image contents) and
ImageContext
.
- CloudVision.AnnotateImagesFromGcsUri - Class in org.apache.beam.sdk.extensions.ml
-
Accepts
String
(image URI on GCS) with optional
DoFn.SideInput
with a
Map
of
ImageContext
to
the image.
- CloudVision.AnnotateImagesFromGcsUriWithContext - Class in org.apache.beam.sdk.extensions.ml
-
Accepts
KV
s of
String
(GCS URI to the image) and
ImageContext
.
- CO_GBK_RESULT_SCHEMA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- CodahaleCsvSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
- CodahaleCsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
-
Constructor for Spark 3.1.x and earlier.
- CodahaleCsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleCsvSink
-
Constructor for Spark 3.2.x and later.
- CodahaleGraphiteSink - Class in org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
- CodahaleGraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
-
Constructor for Spark 3.1.x and earlier.
- CodahaleGraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.sink.CodahaleGraphiteSink
-
Constructor for Spark 3.2.x and later.
- coder() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
-
- Coder<T> - Class in org.apache.beam.sdk.coders
-
A
Coder<T>
defines how to encode and decode values of type
T
into
byte streams.
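As a rough illustration of that contract (a sketch, not code from the Beam documentation), a concrete coder such as StringUtf8Coder round-trips a value through a byte stream:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import org.apache.beam.sdk.coders.StringUtf8Coder;

    class CoderRoundTrip {
      static String roundTrip(String value) throws IOException {
        // Encode the value into a byte stream, then decode it back.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        StringUtf8Coder.of().encode(value, out);
        return StringUtf8Coder.of().decode(new ByteArrayInputStream(out.toByteArray()));
      }
    }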
- Coder() - Constructor for class org.apache.beam.sdk.coders.Coder
-
- Coder() - Constructor for class org.apache.beam.sdk.io.range.OffsetRange.Coder
-
- Coder.Context - Class in org.apache.beam.sdk.coders
-
- Coder.NonDeterministicException - Exception in org.apache.beam.sdk.coders
-
Exception thrown by
Coder.verifyDeterministic()
if the encoding is not deterministic,
including details of why the encoding is not deterministic.
- CoderCloudObjectTranslatorRegistrar - Interface in org.apache.beam.runners.dataflow.util
-
Coder
authors have the ability to automatically have their
Coder
registered with
the Dataflow Runner by creating a
ServiceLoader
entry and a concrete implementation of
this interface.
- coderConsistentWithEquals(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>
and values of type T
, the values are equal
if and only if the encoded bytes are equal.
- coderConsistentWithEqualsInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>
, Coder.Context
, and values of type T
, the values are equal if and only if the encoded bytes are equal, in any Coder.Context
.
- coderDecodeEncodeContentsEqual(Coder<CollectionT>, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>>
, and value of type Collection<T>
, encoding followed by decoding yields an equal value of type Collection<T>
, in any Coder.Context
.
- coderDecodeEncodeContentsEqualInContext(Coder<CollectionT>, Coder.Context, CollectionT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>>
, and value of type Collection<T>
, encoding followed by decoding yields an equal value of type Collection<T>
, in the given Coder.Context
.
- coderDecodeEncodeContentsInSameOrder(Coder<IterableT>, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Collection<T>>
, and value of type Collection<T>
, encoding followed by decoding yields an equal value of type Collection<T>
, in any Coder.Context
.
- coderDecodeEncodeContentsInSameOrderInContext(Coder<IterableT>, Coder.Context, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<Iterable<T>>
, and value of type Iterable<T>
,
encoding followed by decoding yields an equal value of type Collection<T>
, in the given
Coder.Context
.
- coderDecodeEncodeEqual(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>
, and value of type T
, encoding followed by
decoding yields an equal value of type T
, in any Coder.Context
.
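For instance, a coder unit test might exercise this property as follows (a hypothetical JUnit sketch; the coder and value are placeholders):

    import org.apache.beam.sdk.coders.VarIntCoder;
    import org.apache.beam.sdk.testing.CoderProperties;
    import org.junit.Test;

    public class MyCoderTest {
      @Test
      public void encodeDecodeRoundTrips() throws Exception {
        // Verifies that decode(encode(42)) equals 42 in every Coder.Context.
        CoderProperties.coderDecodeEncodeEqual(VarIntCoder.of(), 42);
      }
    }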
- coderDecodeEncodeEqualInContext(Coder<T>, Coder.Context, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>
, Coder.Context
, and value of type T
, encoding followed by decoding yields an equal value of type T
.
- coderDecodeEncodeInContext(Coder<T>, Coder.Context, T, Matcher<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>
, Coder.Context
, and value of type T
, encoding followed by decoding yields a value of type T
and tests that the matcher
succeeds on the values.
- coderDecodesBase64(Coder<T>, String, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDecodesBase64(Coder<T>, List<String>, List<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDecodesBase64ContentsEqual(Coder<IterableT>, String, IterableT) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDecodesBase64ContentsEqual(Coder<IterableT>, List<String>, List<IterableT>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderDeterministic(Coder<T>, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>
, and values of type T
, if the values are
equal then the encoded bytes are equal, in any Coder.Context
.
- coderDeterministicInContext(Coder<T>, Coder.Context, T, T) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that for the given Coder<T>
, Coder.Context
, and values of type T
, if the values are equal then the encoded bytes are equal.
- coderEncodesBase64(Coder<T>, T, String) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- coderEncodesBase64(Coder<T>, List<T>, List<String>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
- CoderException - Exception in org.apache.beam.sdk.coders
-
An Exception
thrown if there is a problem encoding or decoding a value.
- CoderException(String) - Constructor for exception org.apache.beam.sdk.coders.CoderException
-
- CoderException(String, Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
-
- CoderException(Throwable) - Constructor for exception org.apache.beam.sdk.coders.CoderException
-
- coderFor(TypeDescriptor<T>, List<? extends Coder<?>>) - Method in class org.apache.beam.sdk.coders.CoderProvider
-
Returns a Coder<T>
to use for values of a particular type, given the Coders for each of
the type's generic parameter types.
- coderForFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
-
- coderFromCloudObject(CloudObject) - Static method in class org.apache.beam.runners.dataflow.util.CloudObjects
-
- CoderHelpers - Class in org.apache.beam.runners.spark.coders
-
Serialization utility class.
- CoderHelpers - Class in org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Serialization utility class.
- CoderHelpers.FromByteFunction<K,V> - Class in org.apache.beam.runners.spark.coders
-
A function for converting a byte array pair to a key-value pair.
- CoderProperties - Class in org.apache.beam.sdk.testing
-
Properties for use in
Coder
tests.
- CoderProperties() - Constructor for class org.apache.beam.sdk.testing.CoderProperties
-
- CoderProperties.TestElementByteSizeObserver - Class in org.apache.beam.sdk.testing
-
An ElementByteSizeObserver
that records the observed element sizes for testing
purposes.
- CoderProvider - Class in org.apache.beam.sdk.coders
-
- CoderProvider() - Constructor for class org.apache.beam.sdk.coders.CoderProvider
-
- CoderProviderRegistrar - Interface in org.apache.beam.sdk.coders
-
Coder
creators have the ability to automatically have their
coders
registered with this SDK by creating a
ServiceLoader
entry and a concrete implementation
of this interface.
- CoderProviders - Class in org.apache.beam.sdk.coders
-
Static utility methods for creating and working with
CoderProvider
s.
- CoderRegistry - Class in org.apache.beam.sdk.coders
-
- coderSerializable(Coder<T>) - Static method in class org.apache.beam.sdk.testing.CoderProperties
-
Verifies that the given Coder<T>
can be correctly serialized and deserialized.
- CoGbkResult - Class in org.apache.beam.sdk.transforms.join
-
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- CoGbkResult(CoGbkResultSchema, Iterable<RawUnionValue>, int, int) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- CoGbkResult.CoGbkResultCoder - Class in org.apache.beam.sdk.transforms.join
-
- CoGbkResultSchema - Class in org.apache.beam.sdk.transforms.join
-
- CoGbkResultSchema(TupleTagList) - Constructor for class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Builds a schema from a tuple of TupleTag<?>
s.
- CoGroup - Class in org.apache.beam.sdk.schemas.transforms
-
A transform that performs equijoins across multiple schema
PCollection
s.
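As a sketch of how such an equijoin might be expressed (the "users"/"orders" tags and the "userId" field are illustrative assumptions, not from the source):

    import org.apache.beam.sdk.schemas.transforms.CoGroup;
    import org.apache.beam.sdk.schemas.transforms.CoGroup.By;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> joinByUserId(PCollection<Row> users, PCollection<Row> orders) {
      // Both inputs are schema PCollections that share a "userId" field.
      return PCollectionTuple.of("users", users)
          .and("orders", orders)
          .apply(CoGroup.join(By.fieldNames("userId")));
    }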
- CoGroup() - Constructor for class org.apache.beam.sdk.schemas.transforms.CoGroup
-
- CoGroup.By - Class in org.apache.beam.sdk.schemas.transforms
-
Defines the set of fields to extract for the join key, as well as other per-input join options.
- CoGroup.ExpandCrossProduct - Class in org.apache.beam.sdk.schemas.transforms
-
A
PTransform
that calculates the cross-product join.
- CoGroup.Impl - Class in org.apache.beam.sdk.schemas.transforms
-
The implementing PTransform.
- CoGroup.Result - Class in org.apache.beam.sdk.schemas.transforms
-
- CoGroupByKey<K> - Class in org.apache.beam.sdk.transforms.join
-
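A minimal sketch of the classic (non-schema) co-group, assuming two keyed inputs and caller-supplied tags; names are illustrative:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    static PCollection<KV<String, CoGbkResult>> coGroup(
        PCollection<KV<String, String>> emails,
        PCollection<KV<String, String>> phones,
        TupleTag<String> emailTag,
        TupleTag<String> phoneTag) {
      // Groups both inputs by key; downstream code reads each side with result.getAll(tag).
      return KeyedPCollectionTuple.of(emailTag, emails)
          .and(phoneTag, phones)
          .apply(CoGroupByKey.create());
    }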
- COLLECTION_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- CollectionCoder<T> - Class in org.apache.beam.sdk.coders
-
- CollectionCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.CollectionCoder
-
- column(SqlParserPos, SqlIdentifier, SqlDataTypeSpec, SqlNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
-
Creates a column declaration.
- Column() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
-
- COLUMN_CREATED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition row was first created.
- COLUMN_END_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to end the change stream query of the partition.
- COLUMN_FINISHED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
- COLUMN_HEARTBEAT_MILLIS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the change stream query heartbeat interval in millis.
- COLUMN_PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for parent partition tokens.
- COLUMN_PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the partition token.
- COLUMN_RUNNING_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
- COLUMN_SCHEDULED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was scheduled by the
DetectNewPartitionsDoFn
SDF.
- COLUMN_START_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to start the change stream query of the partition.
- COLUMN_STATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the state that the partition is currently in.
- COLUMN_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the current watermark of the partition.
- columns() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
- COLUMNS_MAPPING - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
-
- columnType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
-
- ColumnType() - Constructor for class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- ColumnType - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Defines a column type from a Cloud Spanner table with the following information: column name,
column type, a flag indicating whether the column is a primary key, and the column position in the table.
- ColumnType(String, TypeCode, boolean, long) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
- Combine - Class in org.apache.beam.sdk.transforms
-
PTransform
s for combining PCollection
elements globally and per-key.
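By way of illustration (a sketch using the built-in Sum combiner; the input PCollections are assumed to exist):

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    static PCollection<Integer> globalSum(PCollection<Integer> numbers) {
      // One combined total per window of the input.
      return numbers.apply(Combine.globally(Sum.ofIntegers()));
    }

    static PCollection<KV<String, Integer>> sumPerKey(PCollection<KV<String, Integer>> keyed) {
      // One combined total per distinct key.
      return keyed.apply(Combine.perKey(Sum.ofIntegers()));
    }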
- combine(Iterable<? extends Instant>) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
- combine(Instant...) - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
- Combine.AccumulatingCombineFn<InputT,AccumT extends Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT>,OutputT> - Class in org.apache.beam.sdk.transforms
-
- Combine.AccumulatingCombineFn.Accumulator<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
-
The type of mutable accumulator values used by this AccumulatingCombineFn
.
- Combine.BinaryCombineDoubleFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn
for implementing combiners that are more easily and
efficiently expressed as binary operations on
double
s.
- Combine.BinaryCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn
for implementing combiners that are more easily
expressed as binary operations.
- Combine.BinaryCombineIntegerFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn
for implementing combiners that are more easily and
efficiently expressed as binary operations on
int
s.
- Combine.BinaryCombineLongFn - Class in org.apache.beam.sdk.transforms
-
An abstract subclass of
Combine.CombineFn
for implementing combiners that are more easily and
efficiently expressed as binary operations on
long
s.
- Combine.CombineFn<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
-
A CombineFn<InputT, AccumT, OutputT>
specifies how to combine a collection of input
values of type InputT
into a single output value of type OutputT
.
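A custom CombineFn supplies the accumulator lifecycle described above. The averaging example below is a common illustration, not code from the source:

    import org.apache.beam.sdk.transforms.Combine.CombineFn;

    // Computes the mean of a collection of Integers.
    class AverageFn extends CombineFn<Integer, AverageFn.Accum, Double> {
      static class Accum implements java.io.Serializable {
        long sum = 0;
        long count = 0;
      }

      @Override
      public Accum createAccumulator() {
        return new Accum();
      }

      @Override
      public Accum addInput(Accum acc, Integer input) {
        acc.sum += input;
        acc.count++;
        return acc;
      }

      @Override
      public Accum mergeAccumulators(Iterable<Accum> accumulators) {
        Accum merged = createAccumulator();
        for (Accum acc : accumulators) {
          merged.sum += acc.sum;
          merged.count += acc.count;
        }
        return merged;
      }

      @Override
      public Double extractOutput(Accum acc) {
        return acc.count == 0 ? 0.0 : ((double) acc.sum) / acc.count;
      }
    }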
- Combine.Globally<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.Globally<InputT, OutputT>
takes a
PCollection<InputT>
and returns a
PCollection<OutputT>
whose elements are the result of combining all the elements in
each window of the input
PCollection
, using a specified
CombineFn<InputT, AccumT, OutputT>
.
- Combine.GloballyAsSingletonView<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
Combine.GloballyAsSingletonView<InputT, OutputT>
takes a
PCollection<InputT>
and returns a
PCollectionView<OutputT>
whose elements are the result of combining all
the elements in each window of the input
PCollection
, using a specified
CombineFn<InputT, AccumT, OutputT>
.
- Combine.GroupedValues<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
GroupedValues<K, InputT, OutputT>
takes a
PCollection<KV<K, Iterable<InputT>>>
,
such as the result of
GroupByKey
, applies a specified
CombineFn<InputT, AccumT, OutputT>
to each of the input
KV<K, Iterable<InputT>>
elements to produce a combined output
KV<K, OutputT>
element, and returns a
PCollection<KV<K, OutputT>>
containing all the combined output elements.
- Combine.Holder<V> - Class in org.apache.beam.sdk.transforms
-
Holds a single value of type V
which may or may not be present.
- Combine.IterableCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
- Combine.PerKey<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
PerKey<K, InputT, OutputT>
takes a PCollection<KV<K, InputT>>
, groups it by
key, applies a combining function to the InputT
values associated with each key to
produce a combined OutputT
value, and returns a PCollection<KV<K, OutputT>>
representing a map from each distinct key of the input PCollection
to the corresponding
combined value.
- Combine.PerKeyWithHotKeyFanout<K,InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
- Combine.SimpleCombineFn<V> - Class in org.apache.beam.sdk.transforms
-
Deprecated.
- CombineFieldsByFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsByFields
-
- combineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf
-
- CombineFn() - Constructor for class org.apache.beam.sdk.transforms.Combine.CombineFn
-
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Count
-
- combineFn() - Static method in class org.apache.beam.sdk.transforms.Latest
-
- combineFn(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a
Combine.CombineFn
that computes a fixed-sized uniform sample of its inputs.
- CombineFnBase - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFnBase() - Constructor for class org.apache.beam.sdk.transforms.CombineFnBase
-
- CombineFnBase.GlobalCombineFn<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- CombineFns - Class in org.apache.beam.sdk.transforms
-
Static utility methods that create combine function instances.
- CombineFns() - Constructor for class org.apache.beam.sdk.transforms.CombineFns
-
- CombineFns.CoCombineResult - Class in org.apache.beam.sdk.transforms
-
A tuple of outputs produced by composed combine functions.
- CombineFns.ComposeCombineFnBuilder - Class in org.apache.beam.sdk.transforms
-
- CombineFns.ComposedCombineFn<DataT> - Class in org.apache.beam.sdk.transforms
-
- CombineFns.ComposedCombineFnWithContext<DataT> - Class in org.apache.beam.sdk.transforms
-
- CombineFnTester - Class in org.apache.beam.sdk.testing
-
- CombineFnTester() - Constructor for class org.apache.beam.sdk.testing.CombineFnTester
-
- CombineFnWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
- CombineWithContext - Class in org.apache.beam.sdk.transforms
-
This class contains combine functions that have access to PipelineOptions
and side inputs
through CombineWithContext.Context
.
- CombineWithContext() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext
-
- CombineWithContext.CombineFnWithContext<InputT,AccumT,OutputT> - Class in org.apache.beam.sdk.transforms
-
A combine function that has access to PipelineOptions
and side inputs through CombineWithContext.Context
.
- CombineWithContext.Context - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in CombineFnWithContext
and KeyedCombineFnWithContext
.
- CombineWithContext.RequiresContextInternal - Interface in org.apache.beam.sdk.transforms
-
An internal interface for signaling that a GloballyCombineFn
or a PerKeyCombineFn
needs to access CombineWithContext.Context
.
- combining(Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
- combining(CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- combining(Coder<AccumT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
Identical to combining(CombineFn), but with an accumulator coder explicitly supplied.
- combining(Coder<AccumT>, CombineWithContext.CombineFnWithContext<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- combiningFromInputInternal(Coder<InputT>, Combine.CombineFn<InputT, AccumT, OutputT>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- CombiningState<InputT,AccumT,OutputT> - Interface in org.apache.beam.sdk.state
-
A
ReadableState
cell defined by a
Combine.CombineFn
, accepting multiple input values,
combining them as specified into accumulators, and producing a single output value.
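In a stateful DoFn, such a cell is typically declared with StateSpecs.combining. The sketch below keeps a running per-key sum; the state id and element types are illustrative assumptions:

    import org.apache.beam.sdk.state.CombiningState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;

    class RunningSumFn extends DoFn<KV<String, Integer>, Integer> {
      // Sum.ofIntegers() accumulates into an int[]; the cell exposes add() and read().
      @StateId("sum")
      private final StateSpec<CombiningState<Integer, int[], Integer>> sumSpec =
          StateSpecs.combining(Sum.ofIntegers());

      @ProcessElement
      public void process(
          @Element KV<String, Integer> element,
          @StateId("sum") CombiningState<Integer, int[], Integer> sum,
          OutputReceiver<Integer> out) {
        sum.add(element.getValue());
        out.output(sum.read());
      }
    }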
- comment(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.Table.Builder
-
- commit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- commitOffsets() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Enable committing record offsets.
- commitOffsetsInFinalize() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Finalized offsets are committed to Kafka.
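A hedged sketch of wiring this into a read; the broker address, topic, and group id are placeholders, and a consumer group.id is required for offsets to be committed:

    import java.util.Collections;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    static KafkaIO.Read<Long, String> committingRead() {
      return KafkaIO.<Long, String>read()
          .withBootstrapServers("broker-1:9092")  // placeholder address
          .withTopic("events")                    // placeholder topic
          .withKeyDeserializer(LongDeserializer.class)
          .withValueDeserializer(StringDeserializer.class)
          // Kafka needs a consumer group.id to record committed offsets.
          .withConsumerConfigUpdates(
              Collections.<String, Object>singletonMap(
                  ConsumerConfig.GROUP_ID_CONFIG, "my-group"))  // placeholder group
          .commitOffsetsInFinalize();
    }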
- commitWriteStreams(String, Iterable<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Commit write streams of type PENDING.
- commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- commonPrefixLength(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compute the length of the common prefix of the two provided sets of bytes.
- compact(AccumT) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- compact(AccumT) - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns an accumulator that represents the same logical value as the input accumulator, but
may have a more compact representation.
- compact(List<V>) - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- compact(Object[]) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- compact(Object[], CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- compact(AccumT, CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns an accumulator that represents the same logical value as the input accumulator, but
may have a more compact representation.
- compare(JobMessage, JobMessage) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil.TimeStampComparator
-
- compare(RandomAccessData, RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
- compare(RandomAccessData, RandomAccessData, int) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.UnsignedLexicographicalComparator
-
Compare the two sets of bytes starting at the given offset.
- compare(Row, Row) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel.BeamSqlRowComparator
-
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Largest
-
Deprecated.
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Natural
-
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Reversed
-
- compare(T, T) - Method in class org.apache.beam.sdk.transforms.Top.Smallest
-
Deprecated.
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByKey
-
- compare(KV<K, V>, KV<K, V>) - Method in class org.apache.beam.sdk.values.KV.OrderByValue
-
- compareTo(ByteArray) - Method in class org.apache.beam.runners.spark.util.ByteArray
-
- compareTo(Path) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- compareTo(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- compareTo(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKey
-
ByteKey
implements
Comparable<ByteKey>
by comparing the arrays
in lexicographic order.
- compareTo(RedisCursor) - Method in class org.apache.beam.sdk.io.redis.RedisCursor
-
RedisCursor
implements
Comparable<RedisCursor>
by transforming
the cursors to an index of the Redis table.
- compareTo(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
- CompatibilityError() - Constructor for class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
-
- compile(List<CEPPattern>, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.nfa.NFA
-
- CompletableFutureInboundDataClient - Class in org.apache.beam.sdk.fn.data
-
- complete() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
-
- complete() - Method in class org.apache.beam.runners.jet.processors.ImpulseP
-
- complete() - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
-
- complete() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
-
- complete() - Method in class org.apache.beam.runners.jet.processors.ViewP
-
- complete() - Method in class org.apache.beam.runners.jet.processors.WindowGroupP
-
- complete() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Deprecated.
- complete() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
-
- complete() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
-
Deprecated.
Mark the client as completed.
- complete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Constructs a
Watch.Growth.PollResult
with the given outputs and declares that there will be no
new outputs for the current input.
- complete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
- completed() - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that the pipeline has successfully completed.
- COMPONENT_ENCODINGS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- compose() - Static method in class org.apache.beam.sdk.transforms.CombineFns
-
- compose(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
-
For a SerializableFunction<InputT, OutputT>
fn
, returns a PTransform
given by applying fn.apply(v)
to the input PCollection<InputT>
.
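For example (a sketch; the lambda and element types are illustrative):

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.PTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    static PTransform<PCollection<String>, PCollection<Integer>> toLengths() {
      // Wraps a lambda over PCollections into a reusable PTransform.
      return PTransform.compose(
          (PCollection<String> lines) ->
              lines.apply(MapElements.into(TypeDescriptors.integers()).via(String::length)));
    }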
- compose(String, SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.PTransform
-
- ComposeCombineFnBuilder() - Constructor for class org.apache.beam.sdk.transforms.CombineFns.ComposeCombineFnBuilder
-
- COMPOSITE_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- CompositeSource - Class in org.apache.beam.runners.spark.metrics
-
Composite source made up of several MetricRegistry
instances.
- CompositeSource(String, MetricRegistry...) - Constructor for class org.apache.beam.runners.spark.metrics.CompositeSource
-
- CompositeSource - Class in org.apache.beam.runners.spark.structuredstreaming.metrics
-
Composite source made up of several MetricRegistry
instances.
- CompositeSource(String, MetricRegistry...) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.metrics.CompositeSource
-
- CompressedReader(CompressedSource<T>, FileBasedSource.FileBasedReader<T>) - Constructor for class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Create a CompressedReader
from a CompressedSource
and delegate reader.
- CompressedSource<T> - Class in org.apache.beam.sdk.io
-
A Source that reads from compressed files.
- CompressedSource.CompressedReader<T> - Class in org.apache.beam.sdk.io
-
- CompressedSource.CompressionMode - Enum in org.apache.beam.sdk.io
-
- CompressedSource.DecompressingChannelFactory - Interface in org.apache.beam.sdk.io
-
Factory interface for creating channels that decompress the content of an underlying channel.
- Compression - Enum in org.apache.beam.sdk.io
-
Various compression types for reading/writing files.
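For instance, a compression can be forced when file extensions are not informative (the file pattern is a placeholder):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.Compression;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    static PCollection<String> readGzippedLogs(Pipeline pipeline) {
      // Forces GZIP decompression even if the file names carry no .gz extension.
      return pipeline.apply(
          TextIO.read()
              .from("gs://my-bucket/logs/*")  // placeholder file pattern
              .withCompression(Compression.GZIP));
    }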
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
-
- compute(Partition, TaskContext) - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
-
- compute(Iterator<WindowedValue<T>>, RecordCollector<WindowedValue<T>>) - Method in class org.apache.beam.runners.twister2.translators.functions.AssignWindowsFunction
-
- compute(Iterator<WindowedValue<InputT>>, RecordCollector<RawUnionValue>) - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
-
- compute(Iterator<RawUnionValue>, RecordCollector<WindowedValue<OutputT>>) - Method in class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
-
- computeIfAbsent(K, Function<? super K, ? extends V>) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred read-followed-by-write.
- computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
-
- computeSelfCost(RelOptPlanner, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- concat(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
-
- concat(String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
-
- concat(String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
-
- concat(String, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
-
- concat(String, String, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.StringFunctions
-
- CONCAT - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
- concat(Iterable<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
Concatenates the Iterable
s.
- concat(Iterator<T>...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
Concatenates the Iterator
s.
- CONCAT_FIELD_NAMES - Static variable in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
This policy keeps all levels of a name.
- CONCAT_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
-
- Concatenate() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
- concatFieldNames() - Method in class org.apache.beam.sdk.schemas.transforms.Select.Flattened
-
For nested fields, concatenate all the names separated by a _ character in the flattened
schema.
- concatIterators(Iterator<Iterator<T>>) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
- CONCRETE_CLASS - Static variable in class org.apache.beam.sdk.io.WriteFiles
-
For internal use by runners.
- config() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- config() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
-
- Configuration() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar.ReadBuilder.Configuration
-
- configuration - Variable in class org.apache.beam.runners.jobsubmission.JobServerDriver
-
- Configuration() - Constructor for class org.apache.beam.sdk.extensions.schemaio.expansion.ExternalSchemaIOTransformRegistrar.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.ReadDataBuilder.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.kinesis.KinesisTransformRegistrar.WriteBuilder.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.ReadBuilder.Configuration
-
- Configuration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
-
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
- ConfigurationLocator() - Constructor for class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
-
- configurationSchema() - Method in class org.apache.beam.sdk.io.AvroSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
-
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
-
- configure() - Static method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new builder for a
Window
transform for setting windowing parameters other
than the windowing function.
- ConfluentSchemaRegistryDeserializerProvider<T> - Class in org.apache.beam.sdk.io.kafka
-
- connect(String, Properties) - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
Configures Beam-specific options and opens a JDBC connection to Calcite.
- connect(TableProvider, PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
- connect() - Method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
Connect to the Redis instance.
- CONNECT_STRING_PREFIX - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
- connection() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- CONNECTION_MAX_IDLE_TIME - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
-
- CONNECTION_TIME_TO_LIVE - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
-
- CONNECTION_TIMEOUT - Static variable in class org.apache.beam.sdk.io.aws.options.AwsModule
-
- connectionAcquisitionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait when acquiring a connection from the pool before giving up and timing
out.
- connectionAcquisitionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait when acquiring a connection from the pool before giving up and timing out.
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ConnectionConfiguration
-
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
- ConnectionConfiguration() - Constructor for class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
-
- ConnectionManager - Class in org.apache.beam.sdk.io.cassandra
-
- ConnectionManager() - Constructor for class org.apache.beam.sdk.io.cassandra.ConnectionManager
-
- connectionMaxIdleTime(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Maximum milliseconds a connection should be allowed to remain open while idle.
- connectionMaxIdleTime() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Maximum milliseconds a connection should be allowed to remain open while idle.
- connectionTimeout(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Milliseconds to wait when initially establishing a connection before giving up and timing
out.
- connectionTimeout() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Milliseconds to wait when initially establishing a connection before giving up and timing out.
- connectionTimeToLive(Integer) - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration.Builder
-
Maximum milliseconds a connection should be allowed to remain open, regardless of usage
frequency.
- connectionTimeToLive() - Method in class org.apache.beam.sdk.io.aws2.common.HttpClientConfiguration
-
Maximum milliseconds a connection should be allowed to remain open, regardless of usage
frequency.
- ConnectorConfiguration() - Constructor for class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
- Connectors - Enum in org.apache.beam.io.debezium
-
Enumeration of Debezium connectors.
- consistentWithEquals() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- consistentWithEquals() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
Returns
true
if this
Coder
is injective with respect to
Object.equals(java.lang.Object)
.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
Returns
true
if this
Coder
is injective with respect to
Object.equals(java.lang.Object)
.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
Returns
true
if this
Coder
is injective with respect to
Object.equals(java.lang.Object)
.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
Returns
true
if this
Coder
is injective with respect to
Object.equals(java.lang.Object)
.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.BooleanCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ByteCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.Coder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
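As an illustration of how callers typically use this flag, the sketch below compares decoded values directly only when the coder reports consistency with equals, and otherwise falls back to Coder.structuralValue (an illustrative snippet, not taken from the Beam sources):

    Coder<String> coder = StringUtf8Coder.of();
    String a = "beam", b = "beam";
    boolean same;
    if (coder.consistentWithEquals()) {
      // Decoded values can be compared with equals() directly.
      same = a.equals(b);
    } else {
      // Otherwise compare structural values, which are always equals-comparable.
      same = coder.structuralValue(a).equals(coder.structuralValue(b));
    }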
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DequeCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.DurationCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.FloatCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.InstantCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.KvCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
LengthPrefixCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.ListCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.MapCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
NullableCoder is consistent with equals if the nested Coder is.
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
Returns true if this Coder is injective with respect to Object.equals(java.lang.Object).
- consistentWithEquals() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
-
- consistentWithEquals() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- ConsoleIO - Class in org.apache.beam.runners.spark.io
-
Print to console.
- ConsoleIO.Write - Class in org.apache.beam.runners.spark.io
-
Write to console.
- ConsoleIO.Write.Unbound<T> - Class in org.apache.beam.runners.spark.io
-
- constant(FileBasedSink.FilenamePolicy, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
- constant(FileBasedSink.FilenamePolicy) - Static method in class org.apache.beam.sdk.io.DynamicFileDestinations
-
A specialization of #constant(FilenamePolicy, SerializableFunction) for the case where UserT and OutputT are the same type and the format function is the identity.
- constant(OutT) - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
-
- CONSTANT_WINDOW_SIZE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>) - Static method in class org.apache.beam.sdk.io.AvroIO
-
- constantDestinations(FileBasedSink.FilenamePolicy, Schema, Map<String, Object>, CodecFactory, SerializableFunction<UserT, OutputT>, AvroSink.DatumWriterFactory<OutputT>) - Static method in class org.apache.beam.sdk.io.AvroIO
-
- Constants - Class in org.apache.beam.runners.spark.structuredstreaming
-
- Constants() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.Constants
-
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
-
- constructFilter(List<RexNode>) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Generate an IO implementation of BeamSqlTableFilter for predicate push-down.
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
-
- constructFilter(List<RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
-
- consumesProjection() - Method in interface org.apache.beam.sdk.schemas.ProjectionConsumer
-
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
-
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.CachedSideInputReader
-
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
-
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
-
- contains(PCollectionView<T>) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
-
- contains(Descriptors.Descriptor) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
-
- contains(T) - Method in interface org.apache.beam.sdk.state.SetState
-
- contains(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- contains(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- contains(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- contains(List<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- contains(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window contains the given window.
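For example (window boundaries are illustrative), a larger window contains a smaller one when the smaller window's start and end both fall inside it:

    IntervalWindow outer = new IntervalWindow(new Instant(0), new Instant(100));
    IntervalWindow inner = new IntervalWindow(new Instant(10), new Instant(20));
    boolean covered = outer.contains(inner);  // true: [10, 20) lies inside [0, 100)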
- containsInAnyOrder(T...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
- containsInAnyOrder(SerializableMatcher<? super T>...) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question matches the provided elements.
- containsInAnyOrder() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
- containsInAnyOrder(Iterable<T>) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Asserts that the iterable in question contains the provided elements.
- containsInAnyOrder(T...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
- containsInAnyOrder(Iterable<T>) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains the expected elements, in any order.
- containsInAnyOrder(SerializableMatcher<? super T>...) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
Checks that the Iterable contains elements that match the provided matchers, in any order.
- containsInAnyOrder() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- containsInAnyOrder(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- containsInAnyOrder(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- containsInAnyOrder(SerializableMatcher<? super T>...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- containsInAnyOrder(Collection<SerializableMatcher<? super T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
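A minimal test sketch using these PAssert assertions (pipeline options and element values are illustrative):

    PipelineOptions options = PipelineOptionsFactory.create();
    Pipeline p = Pipeline.create(options);
    PCollection<String> words = p.apply(Create.of("a", "b", "c"));
    // The order of the expected elements does not matter.
    PAssert.that(words).containsInAnyOrder("c", "a", "b");
    p.run().waitUntilFinish();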
- containsKey(ByteKey) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns true if the specified ByteKey is contained within this range.
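For instance (keys chosen arbitrarily):

    ByteKeyRange range = ByteKeyRange.of(ByteKey.of(0x00), ByteKey.of(0x80));
    boolean inRange = range.containsKey(ByteKey.of(0x40));  // true: 0x40 lies in [0x00, 0x80)
    boolean pastEnd = range.containsKey(ByteKey.of(0x90));  // false: beyond the end key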
- containsKey(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
-
- containsSeekableInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
This method returns whether any of the children of the relNode are Seekable.
- containsString(String) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- containsValue(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
-
- Context(boolean) - Constructor for class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
- Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
-
- Context() - Constructor for class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
-
- Context() - Constructor for class org.apache.beam.sdk.transforms.CombineWithContext.Context
-
- Context() - Constructor for class org.apache.beam.sdk.transforms.Contextful.Fn.Context
-
- Contextful<ClosureT> - Class in org.apache.beam.sdk.transforms
-
Pair of a bit of user code (a "closure") and the Requirements needed to run it.
- Contextful.Fn<InputT,OutputT> - Interface in org.apache.beam.sdk.transforms
-
A function from an input to an output that may additionally access Contextful.Fn.Context when computing the result.
- Contextful.Fn.Context - Class in org.apache.beam.sdk.transforms
-
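The common pattern is to wrap a closure together with its requirements, for example a side input, and hand it to a transform that accepts a Contextful. A sketch, where p is an existing Pipeline and the element and prefix values are made up for illustration:

    // Sketch: prefix each element using a side input accessed through Contextful.Fn.Context.
    PCollection<String> input = p.apply("Words", Create.of("alpha", "beta"));
    PCollectionView<String> prefixView =
        p.apply("Prefix", Create.of(">> ")).apply(View.asSingleton());
    PCollection<String> prefixed =
        input.apply(
            MapElements.into(TypeDescriptors.strings())
                .via(
                    Contextful.fn(
                        (String word, Contextful.Fn.Context c) -> c.sideInput(prefixView) + word,
                        Requirements.requiresSideInputs(prefixView))));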
- ContextualTextIO - Class in org.apache.beam.sdk.io.contextualtextio
-
PTransforms that read text files and collect contextual information of the elements in the input.
- ContextualTextIO.Read - Class in org.apache.beam.sdk.io.contextualtextio
-
- ContextualTextIO.ReadFiles - Class in org.apache.beam.sdk.io.contextualtextio
-
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
-
Like Match#continuously(Duration, TerminationCondition, boolean).
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchAll
-
Like Match#continuously(Duration, TerminationCondition).
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Continuously watches for new files at the given interval until the given termination
condition is reached, where the input to the condition is the filepattern.
- continuously(Duration, Watch.Growth.TerminationCondition<String, ?>) - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
Continuously watches for new files at the given interval until the given termination
condition is reached, where the input to the condition is the filepattern.
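A sketch of continuous matching (filepattern and durations are placeholders; p is an existing Pipeline):

    PCollection<MatchResult.Metadata> newFiles =
        p.apply(
            FileIO.match()
                .filepattern("/tmp/incoming/*.csv")
                // Poll every 30 seconds; stop after an hour with no new files.
                .continuously(
                    Duration.standardSeconds(30),
                    Watch.Growth.afterTimeSinceNewOutput(Duration.standardHours(1))));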
- control(StreamObserver<BeamFnApi.InstructionRequest>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
-
Called by gRPC for each incoming connection from an SDK harness; enqueues an available SDK harness client.
- ControlClientPool - Interface in org.apache.beam.runners.fnexecution.control
-
- ControlClientPool.Sink - Interface in org.apache.beam.runners.fnexecution.control
-
- ControlClientPool.Source - Interface in org.apache.beam.runners.fnexecution.control
-
- ConversionContext - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation
-
Conversion context; some rules need this data to convert the nodes.
- ConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMatchRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamTableFunctionScanRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamWindowRule
-
- convert() - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable.RowToDocument
-
- convert(ResolvedNodes.ResolvedQueryStmt, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
-
- convert(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRule
-
- Convert - Class in org.apache.beam.sdk.schemas.transforms
-
A set of utilities for converting between different objects supporting schemas.
- Convert() - Constructor for class org.apache.beam.sdk.schemas.transforms.Convert
-
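A typical round trip between a schema-aware user type and Rows; the Purchase type is hypothetical and assumed to carry a registered Beam schema (e.g. via @DefaultSchema):

    // Assumes purchases is a PCollection<Purchase> with a known schema.
    PCollection<Row> rows = purchases.apply(Convert.toRows());
    PCollection<Purchase> back = rows.apply(Convert.fromRows(Purchase.class));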
- convert(TypeDescriptor) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertArray(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertAvroFieldStrict(Object, Schema, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
Strict conversion from AVRO to Beam, strict because it doesn't do widening or narrowing during
conversion.
- convertAvroFormat(Schema.FieldType, Object, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Tries to convert an Avro decoded value to a Beam field value based on the target type of the
Beam field.
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertByteBuffer(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertCharSequence(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertCollection(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertDateTime(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.AvroUtils.AvroConvertType
-
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.AvroUtils.AvroConvertValueForGetter
-
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.AvroUtils.AvroConvertValueForSetter
-
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertDefault(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- ConvertedSchemaInformation(SchemaCoder<T>, Schema.FieldType) - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
-
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertEnum(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertGenericRecordToTableRow(GenericRecord, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
- ConvertHelpers - Class in org.apache.beam.sdk.schemas.utils
-
Helper functions for converting between equivalent schema types.
- ConvertHelpers() - Constructor for class org.apache.beam.sdk.schemas.utils.ConvertHelpers
-
- ConvertHelpers.ConvertedSchemaInformation<T> - Class in org.apache.beam.sdk.schemas.utils
-
Return value after converting a schema.
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertIterable(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertList(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertMap(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertNumbers(TableRow) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
-
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- convertPrimitive(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversion
-
- convertRelNodeToRexRangeRef(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
- convertRelOptCost(RelOptCost) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- convertResolvedLiteral(ResolvedNodes.ResolvedLiteral) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Convert a resolved literal to a RexNode.
- convertRexNodeFromResolvedExpr(ResolvedNodes.ResolvedExpr, List<ResolvedColumn>, List<RelDataTypeField>, Map<String, RexNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Create a RexNode for a corresponding resolved expression node.
- convertRexNodeFromResolvedExpr(ResolvedNodes.ResolvedExpr) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Create a RexNode for a corresponding resolved expression.
- convertRootQuery(ConversionContext, ResolvedNodes.ResolvedQueryStmt) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter
-
- convertTableValuedFunction(RelNode, TableValuedFunction, List<ResolvedNodes.ResolvedFunctionArgument>, List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Convert a TableValuedFunction in ZetaSQL to a RexCall in Calcite.
- convertToBagSpecInternal(StateSpec<CombiningState<InputT, AccumT, OutputT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
Parses and validates the input query, then converts it into a BeamRelNode tree.
- convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner
-
Parses and validates the input query, then converts it into a BeamRelNode tree.
- convertToBeamRel(String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- convertToBeamRel(String, Map<String, Value>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- convertToBeamRel(String, List<Value>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- convertToBeamRel(String, QueryPlanner.QueryParameters) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- convertToFileResourceIfPossible(String) - Static method in class org.apache.beam.sdk.io.FileBasedSink
-
A helper function for converting a user-provided output filename prefix into a ResourceId for writing output files.
- convertToMapSpecInternal(StateSpec<SetState<KeyT>>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
For internal use only; no backwards-compatibility guarantees.
- ConvertType(boolean) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertType
-
- ConvertValueForGetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- ConvertValueForSetter(StackManipulation) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- copy() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns a copy of this RandomAccessData.
- copy() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregatorsAccumulator
-
- copy() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
-
- copy() - Method in class org.apache.beam.runners.spark.structuredstreaming.aggregators.NamedAggregatorsAccumulator
-
- copy() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsContainerStepMapAccumulator
-
- copy(Iterable<String>, Iterable<String>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
- copy(RelTraitSet, RelNode, ImmutableBitSet, List<ImmutableBitSet>, List<AggregateCall>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamAggregationRel
-
- copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel
-
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCoGBKJoinRel
-
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
-
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIntersectRel
-
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
-
- copy(RelNode, RelDataType, RexNode, boolean, boolean, Map<String, RexNode>, Map<String, RexNode>, RexNode, Map<String, ? extends SortedSet<String>>, boolean, ImmutableBitSet, RelCollation, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
-
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMatchRel
-
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamMinusRel
-
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputJoinRel
-
- copy(RelTraitSet, RexNode, RelNode, RelNode, JoinRelType, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSideInputLookupJoinRel
-
- copy(RelTraitSet, RelNode, RelCollation, RexNode, RexNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
-
- copy(RelTraitSet, List<RelNode>, RexNode, Type, RelDataType, Set<RelColumnMapping>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamTableFunctionScanRel
-
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUncollectRel
-
- copy(RelTraitSet, List<RelNode>, boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnionRel
-
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
-
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
-
- copy(RelTraitSet, RelNode, List<RexLiteral>, RelDataType, List<Window.Group>) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamWindowRel
-
- copy(RelTraitSet, RelNode, RexProgram) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRel
-
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRel
-
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
-
- copy(RelTraitSet, List<RelNode>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
- copy(RelTraitSet, RelNode) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
- copy(List<ClassLoaderFileSystem.ClassLoaderResourceId>, List<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
-
- copy(List<ResourceIdT>, List<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Copies a List of file-like resources from one location to another.
- copy(List<ResourceId>, List<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Copies a List of file-like resources from one location to another.
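A sketch of copying a single file through the FileSystems API (paths are placeholders; copy throws IOException):

    ResourceId src = FileSystems.matchNewResource("gs://my-bucket/in/part-0.txt", false);
    ResourceId dst = FileSystems.matchNewResource("gs://my-bucket/out/part-0.txt", false);
    FileSystems.copy(
        Collections.singletonList(src),
        Collections.singletonList(dst),
        MoveOptions.StandardMoveOptions.IGNORE_MISSING_FILES);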
- copyFrom(ByteBuffer) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new ByteKey backed by a copy of the data remaining in the specified ByteBuffer.
- copyFrom(byte[]) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new ByteKey backed by a copy of the specified byte[].
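Because the key is backed by a copy, later mutation of the source array does not affect it:

    byte[] raw = new byte[] {0x01, 0x02, 0x03};
    ByteKey key = ByteKey.copyFrom(raw);
    raw[0] = 0x7f;  // the ByteKey still represents [0x01, 0x02, 0x03]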
- copyFrom(FieldSpecifierNotationParser.DotExpressionComponentContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
-
- copyFrom(FieldSpecifierNotationParser.QualifierListContext) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
-
- copyResourcesFromJar(JarFile) - Method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarCreator
-
- coreName() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
-
- coreUrl() - Method in class org.apache.beam.sdk.io.solr.SolrIO.ReplicaInfo
-
- cosh(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinTrigonometricFunctions
-
COSH(X)
- Count - Class in org.apache.beam.sdk.transforms
-
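Count provides per-element and global counting transforms; a minimal sketch, assuming words is an existing PCollection<String>:

    PCollection<KV<String, Long>> perWord = words.apply(Count.perElement());
    PCollection<Long> totalElements = words.apply(Count.globally());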
- countAsserts(Pipeline) - Static method in class org.apache.beam.sdk.testing.PAssert
-
- COUNTER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTable
-
- Counter - Interface in org.apache.beam.sdk.metrics
-
A metric that reports a single long value and can be incremented or decremented.
- counter(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
- counter(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can be incremented and decremented, and is aggregated by taking the sum.
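A typical use inside a DoFn; the class and metric names are hypothetical:

    // Sketch of a DoFn that counts empty lines while passing non-empty lines through.
    class MyDoFn extends DoFn<String, String> {
      private final Counter emptyLines = Metrics.counter(MyDoFn.class, "emptyLines");

      @ProcessElement
      public void processElement(@Element String line, OutputReceiver<String> out) {
        if (line.isEmpty()) {
          emptyLines.inc();  // aggregated by summing across all workers
        } else {
          out.output(line);
        }
      }
    }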
- CounterImpl - Class in org.apache.beam.runners.jet.metrics
-
- CounterMark(long, Instant) - Constructor for class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Creates a checkpoint mark reflecting the last emitted value.
- CountIf - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Returns the count of TRUE values for expression.
- COUNTIF - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
- CountIf.CountIfFn - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
- CountIfFn() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
- CountingSource - Class in org.apache.beam.sdk.io
-
- CountingSource.CounterMark - Class in org.apache.beam.sdk.io
-
The checkpoint for an unbounded CountingSource is simply the last value produced.
- CountWords() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.CountWords
-
- CovarianceFn<T extends java.lang.Number> - Class in org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Combine.CombineFn for Covariance on Number types.
- CrashingRunner - Class in org.apache.beam.sdk.testing
-
- CrashingRunner() - Constructor for class org.apache.beam.sdk.testing.CrashingRunner
-
- create() - Static method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Creates a ConnectorConfiguration.
- create(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowClient
-
- create(PCollectionView<?>, Coder<T>) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
-
- create() - Static method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory
-
- create(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobInvoker
-
- create(boolean) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
-
- create(String, ByteString, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
-
- create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
-
- create(JobInfo) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
-
- create(JobInfo, Map<String, EnvironmentFactory.Provider>) - Static method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
-
- create() - Static method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
-
- create(String) - Method in interface org.apache.beam.runners.fnexecution.control.OutputReceiverFactory
-
- create(ReferenceCountingExecutableStageContextFactory.Creator, SerializableFunction<Object, Boolean>) - Static method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
-
- create(EnvironmentFactory, GrpcFnServer<GrpcDataService>, GrpcFnServer<GrpcStateService>, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- create(String, String) - Method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
-
- create(PipelineOptions, ExecutorService, OutboundObserverFactory) - Static method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
-
- create(PipelineOptions, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<FnApiControlClientPoolService>, ControlClientPool.Source) - Static method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
-
- create(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator) - Static method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
-
- create(ProcessManager, RunnerApi.Environment, String, InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
-
- create(ProcessManager, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool.Source, IdGenerator, PipelineOptions) - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
-
- create() - Static method in class org.apache.beam.runners.fnexecution.environment.ProcessManager
-
- create(String, String, String, Struct) - Static method in class org.apache.beam.runners.fnexecution.provisioning.JobInfo
-
- create(ProvisionApi.ProvisionInfo, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
-
- create() - Static method in class org.apache.beam.runners.fnexecution.state.GrpcStateService
-
- create(Endpoints.ApiServiceDescriptor, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
- create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
Creates an InMemoryJobService.
- create(GrpcFnServer<ArtifactStagingService>, Function<String, String>, ThrowingConsumer<Exception, String>, JobInvoker, int) - Static method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
Creates an InMemoryJobService.
- create() - Method in interface org.apache.beam.runners.jobsubmission.JobServerDriver.JobInvokerFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkCommonPipelineOptions.TmpCheckpointDirFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.SparkContextOptions.EmptyListenersList
-
- create(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobInvoker
-
- create() - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with default options.
- create(SparkPipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with specified options.
- create() - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with default options.
- create(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with specified options.
- create(PipelineOptions) - Method in class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
-
- create(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.util.SideInputBroadcast
-
- create(ExpansionService, String, int) - Static method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
Creates an ExpansionServer for the provided ExpansionService, running on an arbitrary port.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.expansion.service.ExpansionServiceOptions.JavaClassLookupAllowListFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.EnableStreamingEngineFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.ExecutorServiceFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.options.GcsOptions.PathValidatorFactory
-
- create(GcsPath, String) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
- create(GcsPath, String, Integer) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
- create(GcsPath, GcsUtil.CreateOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Creates an object in GCS and prepares for uploading its contents.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
-
- create(PipelineOptions, Storage, HttpRequestInitializer, ExecutorService, Credentials, Integer) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
-
Returns an instance of GcsUtil based on the given parameters.
- create(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
-
- create(IOException) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
-
- create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
- create(Coder<InputT>) - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
- create(double) - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
-
- create(ExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
-
- create(BufferedExternalSorter.Options) - Static method in class org.apache.beam.sdk.extensions.sorter.SortValues
-
Returns a SortValues<PrimaryKeyT, SecondaryKeyT, ValueT> PTransform.
- create(double, double, double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
- create(double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
It creates an instance with rate=0 and window=rowCount for bounded sources.
- create(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates a Function from the given method.
- create(Method, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates an org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.schema.Function from the given method.
- create(List<String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Full table name with path.
- create(List<String>, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table name plus the path up to but not including table name.
- create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
-
- create(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAnalyticFunctions
-
- create(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- create(Method, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.UserFunctionDefinitions.JavaScalarFunction
-
- create(Class<?>, String, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
-
Creates a Function from the given class.
- create(Method, String, String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
-
Creates a Function from the given method.
- create(RelTraitSet, RelNode, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
Creates an Uncollect.
- create(String) - Static method in class org.apache.beam.sdk.fn.channel.AddHarnessIdInterceptor
-
- create() - Static method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
-
- create(String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DataEndpoint
-
- create(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
-
- create(String, String, Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.sdk.fn.data.TimerEndpoint
-
- create(List<? extends FnService>, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
- create(ServiceT, Endpoints.ApiServiceDescriptor, ServerFactory) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Create a GrpcFnServer for the provided FnService which will run at the endpoint specified in the Endpoints.ApiServiceDescriptor.
- create(ServiceT, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
- create() - Static method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
-
- create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.InProcessServerFactory
-
- create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Creates an instance of this server at the address specified by the given service descriptor and
bound to multiple services.
- create(List<BindableService>, Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.server.ServerFactory.InetSocketAddressServerFactory
-
- create(StreamObserver<ReqT>, Runnable) - Static method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
-
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.RetryConfiguration
-
Deprecated.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsRegionFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.AwsUserCredentialsFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.AwsOptions.ClientConfigurationFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.options.S3Options.S3UploadBufferSizeBytesFactory
-
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.aws.sns.SnsIO.RetryConfiguration
-
Deprecated.
- create(BuilderT, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
Configure a client builder BuilderT using the global defaults in AwsOptions.
- create(BuilderT, ClientConfiguration, AwsOptions) - Method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
- create(BuilderT, ClientConfiguration, AwsOptions) - Method in class org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory.DefaultClientBuilder
-
- create(AwsCredentialsProvider, Region, URI) - Static method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
- create() - Static method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkParameters
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsRegionFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.AwsOptions.AwsUserCredentialsFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.options.S3Options.SSECustomerKeyFactory
-
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.aws2.sns.SnsIO.RetryConfiguration
-
- create(String, String, String, long, long) - Static method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.options.BlobstoreOptions.AzureUserCredentialsFactory
-
- create(ClassLoaderFileSystem.ClassLoaderResourceId, CreateOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
-
- create(String[], String, String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration.
- create(String[], String) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration with no default type.
- create(String[]) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Creates a new Elasticsearch connection configuration with no default index nor type.
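For example, pointing the connector at a single node with an explicit index and type (host, index, and type values are placeholders):

    ElasticsearchIO.ConnectionConfiguration connection =
        ElasticsearchIO.ConnectionConfiguration.create(
            new String[] {"http://localhost:9200"}, "my-index", "_doc");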
- create() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.RetryConfiguration
-
Creates a RetryConfiguration for ElasticsearchIO with the provided maxAttempts and maxDuration, using exponential-backoff-based retries.
- create(WritableByteChannel) - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- create(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileBasedSink.WritableByteChannelFactory
-
- create(EmptyMatchTreatment) - Static method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
- create(ResourceIdT, CreateOptions) - Method in class org.apache.beam.sdk.io.FileSystem
-
Returns a write channel for the given ResourceIdT.
- create(ResourceId, String) - Static method in class org.apache.beam.sdk.io.FileSystems
-
- create(ResourceId, CreateOptions) - Static method in class org.apache.beam.sdk.io.FileSystems
-
- create(MatchResult.Status, List<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
-
- create(MatchResult.Status, IOException) - Static method in class org.apache.beam.sdk.io.fs.MatchResult
-
- create(ValueProvider<TableReference>, DataFormat, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
-
- create(ValueProvider<TableReference>, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
-
- create(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
Creates an instance of this rule.
- create(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
-
- create(Schema, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
-
Create a PTransform instance.
- create(String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
-
Create a PTransform instance.
- create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
- create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Creates an instance of this rule.
- create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
Creates a new group.
- create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- create(BatchTransactionId) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
-
- create(String, String, String, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
-
- create(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Create the schema adapter.
- create(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions.ConfigurationLocator
-
- create(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
-
- create(DataSource) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
- create(String, String) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
- create(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
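A sketch using the driver-class/URL overload (driver, URL, and credentials are placeholders):

    JdbcIO.DataSourceConfiguration dataSource =
        JdbcIO.DataSourceConfiguration.create(
                "org.postgresql.Driver", "jdbc:postgresql://localhost:5432/mydb")
            .withUsername("user")
            .withPassword("secret");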
- create(int, Duration, Duration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.RetryConfiguration
-
- create() - Static method in class org.apache.beam.sdk.io.jdbc.JdbcWriteResult
-
- create() - Static method in class org.apache.beam.sdk.io.kinesis.WatermarkParameters
-
- create() - Static method in class org.apache.beam.sdk.io.mongodb.AggregationQuery
-
- create() - Static method in class org.apache.beam.sdk.io.mongodb.FindQuery
-
- create() - Static method in class org.apache.beam.sdk.io.mongodb.UpdateConfiguration
-
- create(String, String) - Static method in class org.apache.beam.sdk.io.mqtt.MqttIO.ConnectionConfiguration
-
Describe a connection configuration to the MQTT broker.
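For example (broker URI and topic are placeholders):

    MqttIO.ConnectionConfiguration mqtt =
        MqttIO.ConnectionConfiguration.create("tcp://localhost:1883", "sensor/readings");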
- create() - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
-
- create(String, String, String) - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
-
- create() - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
- create(String, int) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
- create(ValueProvider<String>, ValueProvider<Integer>) - Static method in class org.apache.beam.sdk.io.redis.RedisConnectionConfiguration
-
- create() - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- create(DataSource) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- create(String) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.ConnectionConfiguration
-
Creates a new Solr connection configuration.
- create(int, Duration) - Static method in class org.apache.beam.sdk.io.solr.SolrIO.RetryConfiguration
-
- create() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent.Builder
-
- create() - Method in class org.apache.beam.sdk.io.splunk.SplunkWriteError.Builder
-
- create(long, long, long, long) - Static method in class org.apache.beam.sdk.metrics.DistributionResult
-
- create(long, Instant) - Static method in class org.apache.beam.sdk.metrics.GaugeResult
-
- create(String, MetricName) - Static method in class org.apache.beam.sdk.metrics.MetricKey
-
- create(Iterable<MetricResult<Long>>, Iterable<MetricResult<DistributionResult>>, Iterable<MetricResult<GaugeResult>>) - Static method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
- create(MetricKey, Boolean, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
-
- create(MetricKey, T, T) - Static method in class org.apache.beam.sdk.metrics.MetricResult
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.metrics.MetricsOptions.NoOpMetricsSink
-
- create(PipelineOptions) - Method in interface org.apache.beam.sdk.options.DefaultValueFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.AtomicLongFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.JobNameFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.PipelineOptions.UserAgentFactory
-
- create() - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Creates and returns an object that implements PipelineOptions using the values configured on this builder during construction.
- create() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
-
- create() - Static method in class org.apache.beam.sdk.Pipeline
-
- create(PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
-
- create() - Static method in class org.apache.beam.sdk.PipelineRunner
-
- create(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.CachingFactory
-
- create(Class<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.Factory
-
- create() - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
- create(Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type over a set of String->Integer values.
- create(List<String>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type from a fixed set of String values; integer values will be
automatically chosen.
- create(String...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
Create an enumeration type from a fixed set of String values; integer values will be
automatically chosen.
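Illustrative usage, a minimal sketch (the type and value names are assumptions); when only names are supplied, integer values are assigned in declaration order:

    import org.apache.beam.sdk.schemas.logicaltypes.EnumerationType;

    // Names only; integer values are chosen automatically in declaration order.
    EnumerationType colorType = EnumerationType.create("RED", "GREEN", "BLUE");
    // Look up an enumeration value by name.
    EnumerationType.Value red = colorType.valueOf("RED");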
- create(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- create(List<Schema.Field>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- create(List<Schema.Field>, Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- create(Object...) - Method in interface org.apache.beam.sdk.schemas.SchemaUserTypeCreator
-
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.AddFields
-
- create(List<String>, String) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.CompatibilityError
-
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Filter
-
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Group.CombineFieldsGlobally
-
Returns a transform that does a global combine using an aggregation built up by calls to
aggregateField and aggregateFields.
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.RenameFields
-
Create an instance of this transform.
- create() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
- create(Class, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
-
- create(List<String>, Optional<Schema.TypeName>) - Static method in class org.apache.beam.sdk.schemas.utils.SchemaZipFold.Context
-
- create() - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
Creates and returns a new test pipeline.
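Illustrative usage, a minimal sketch of the usual JUnit pattern (test class, transform, and expected values are examples):

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class MyTransformTest {
      // TestPipeline is usually exposed as a JUnit rule; transient avoids accidental serialization.
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void createsElements() {
        PCollection<String> out = p.apply(Create.of("a", "b"));
        PAssert.that(out).containsInAnyOrder("a", "b");
        p.run().waitUntilFinish();
      }
    }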
- create(PipelineOptions) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcherFactory
-
- create(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
- create(Schema) - Static method in class org.apache.beam.sdk.testing.TestStream
-
- create(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
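Illustrative usage, a minimal sketch of the builder started by TestStream.create(Coder) (element values and timings are examples):

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.testing.TestStream;
    import org.joda.time.Duration;

    // A deterministic, unbounded test input: two elements, a processing-time advance,
    // then the watermark is advanced to infinity to drain the stream.
    TestStream<String> events =
        TestStream.create(StringUtf8Coder.of())
            .addElements("open", "click")
            .advanceProcessingTime(Duration.standardMinutes(1))
            .advanceWatermarkToInfinity();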
- create(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Returns an approximate quantiles combiner with the given compareFn and desired number of quantiles.
- create(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
- create(int, ComparatorT, long, double) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
Creates an approximate quantiles combiner with the given compareFn and desired number of quantiles.
- Create<T> - Class in org.apache.beam.sdk.transforms
-
Create<T> takes a collection of elements of type T known when the pipeline is constructed and returns a PCollection<T> containing the elements.
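Illustrative usage, a minimal sketch (the pipeline variable and element values are assumptions):

    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    // Builds a PCollection from in-memory elements known at pipeline construction time.
    PCollection<String> letters = pipeline.apply(Create.of("a", "b", "c"));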
- Create() - Constructor for class org.apache.beam.sdk.transforms.Create
-
- create() - Static method in class org.apache.beam.sdk.transforms.Distinct
-
Returns a Distinct<T> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
Create an instance.
- create() - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns a GroupByKey<K, V> PTransform.
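Illustrative usage, a minimal sketch assuming an input PCollection<KV<String, Integer>> named scores:

    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Groups all values that share a key into a single KV<key, Iterable<value>>.
    PCollection<KV<String, Iterable<Integer>>> grouped =
        scores.apply(GroupByKey.<String, Integer>create());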
- create(long, long, SerializableFunction<InputT, Long>, Duration) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
-
- create() - Static method in class org.apache.beam.sdk.transforms.Impulse
-
- create() - Static method in class org.apache.beam.sdk.transforms.join.CoGroupByKey
-
Returns a CoGroupByKey<K> PTransform.
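Illustrative usage, a minimal sketch assuming two keyed inputs, emails: PCollection<KV<String, String>> and orders: PCollection<KV<String, Integer>>:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    // Tags identify each input within the co-grouped result.
    TupleTag<String> emailTag = new TupleTag<>();
    TupleTag<Integer> orderTag = new TupleTag<>();
    PCollection<KV<String, CoGbkResult>> joined =
        KeyedPCollectionTuple.of(emailTag, emails)
            .and(orderTag, orders)
            .apply(CoGroupByKey.create());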
- create(JsonToRow.JsonToRowWithErrFn) - Static method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
-
- create() - Static method in class org.apache.beam.sdk.transforms.Keys
-
Returns a Keys<K> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.KvSwap
-
Returns a KvSwap<K, V> PTransform.
- create() - Static method in class org.apache.beam.sdk.transforms.PeriodicImpulse
-
- create() - Static method in class org.apache.beam.sdk.transforms.PeriodicSequence
-
- create() - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
- create(PipelineOptions) - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.EmptyListDefault
-
- create() - Static method in class org.apache.beam.sdk.transforms.Values
-
Returns a Values<V> PTransform.
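Illustrative usage of Keys, Values, and KvSwap together, a minimal sketch assuming kvs is a PCollection<KV<String, Integer>>:

    import org.apache.beam.sdk.transforms.Keys;
    import org.apache.beam.sdk.transforms.KvSwap;
    import org.apache.beam.sdk.transforms.Values;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> keys = kvs.apply(Keys.<String>create());        // just the keys
    PCollection<Integer> values = kvs.apply(Values.<Integer>create());  // just the values
    PCollection<KV<Integer, String>> swapped =
        kvs.apply(KvSwap.<String, Integer>create());                    // keys and values swapped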
- create(T) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
-
- create(Coder<T>, Coder<MetaT>) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
-
- Create.OfValueProvider<T> - Class in org.apache.beam.sdk.transforms
-
- Create.TimestampedValues<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection whose elements have associated timestamps.
- Create.Values<T> - Class in org.apache.beam.sdk.transforms
-
A PTransform that creates a PCollection from a set of in-memory objects.
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.ApproximateDistinctFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations.BitXOr
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.ArrayAgg.ArrayAggArray
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggByte
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.impl.udaf.StringAgg.StringAggString
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.Sum
-
- createAccumulator() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- createAccumulator() - Method in interface org.apache.beam.sdk.extensions.sql.udf.AggregateFn
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Concatenate
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- createAccumulator(CombineWithContext.Context) - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
Returns a new, mutable accumulator value, representing the accumulation of zero input values.
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
- createAccumulator() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
- createAll(Class<?>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Creates a Function for each method in a given class.
- createArrayOf(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createArtifactServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
-
- createBatchExecutionEnvironment(FlinkPipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
-
If the submitted job is a batch processing job, this method creates the appropriate Flink ExecutionEnvironment based on the user-specified options.
- createBitXOr(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamBuiltinAggregations
-
- createBlob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createBoundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- createBucket(String, Bucket) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Creates a Bucket under the specified project in Cloud Storage or propagates an exception.
- createBuilder(S3Options) - Method in interface org.apache.beam.sdk.io.aws.options.S3ClientBuilderFactory
-
- createBuilder(S3Options) - Method in class org.apache.beam.sdk.io.aws.s3.DefaultS3ClientBuilderFactory
-
- createBuilder(S3Options) - Method in interface org.apache.beam.sdk.io.aws2.options.S3ClientBuilderFactory
-
- createBuilder(S3Options) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
-
- createBuilder(BlobstoreOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
-
- createBuilder(BlobstoreOptions) - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreClientBuilderFactory
-
- createCatalogItems() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
-
- createClassLoader(List<String>) - Method in class org.apache.beam.sdk.extensions.sql.impl.JavaUdfLoader
-
- createClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createCombineFn(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
- createCombineFnAnalyticsFunctions(AggregateCall, Schema.Field, String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
- createConstantCombineFn() - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.AggregationCombineFnAdapter
-
- createConstructorCreator(Class<T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- createConstructorCreator(Class<T>, Constructor<T>, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- createContextual(DeserializationContext, BeanProperty) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
-
- createDataCatalogClient(DataCatalogPipelineOptions) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- CreateDataflowView<ElemT,ViewT> - Class in org.apache.beam.runners.dataflow
-
- createDataset(String, String, String, String, Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
- createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- createDecompressingChannel(ReadableByteChannel) - Method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
- createDecompressingChannel(ReadableByteChannel) - Method in interface org.apache.beam.sdk.io.CompressedSource.DecompressingChannelFactory
-
Given a channel, create a channel that decompresses the content read from the channel.
- createDefault() - Static method in class org.apache.beam.sdk.coders.CoderRegistry
-
Creates a CoderRegistry containing registrations for all standard coders part of the core Java Apache Beam SDK and also any registrations provided by coder registrars.
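Illustrative usage, a minimal sketch (the looked-up type is an example):

    import org.apache.beam.sdk.coders.CannotProvideCoderException;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.CoderRegistry;

    CoderRegistry registry = CoderRegistry.createDefault();
    try {
      // Resolve the default coder registered for a Java type.
      Coder<String> coder = registry.getCoder(String.class);
    } catch (CannotProvideCoderException e) {
      // No coder could be inferred for the requested type.
    }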
- createDefault() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannel relying on the ManagedChannelBuilder to choose the channel type.
- createDefault() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
- createDefault() - Static method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
-
- createDefault() - Static method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
- createDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- createDicomStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- createDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- createDicomStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- CreateDisposition - Enum in org.apache.beam.sdk.io.snowflake.enums
-
Enum containing all supported dispositions for a table.
- createDynamoDB() - Method in interface org.apache.beam.sdk.io.aws.dynamodb.AwsClientsProvider
-
- createDynamoDB() - Method in class org.apache.beam.sdk.io.aws.dynamodb.BasicDynamoDBProvider
-
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory
-
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory
-
- createEnvironment(RunnerApi.Environment, String) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory
-
Creates an active RunnerApi.Environment and returns a handle to it.
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory
-
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory
-
- createEnvironment(RunnerApi.Environment, String) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
-
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
-
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
-
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
-
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
-
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.Provider
-
- createEnvironmentFactory(GrpcFnServer<FnApiControlClientPoolService>, GrpcFnServer<GrpcLoggingService>, GrpcFnServer<ArtifactRetrievalService>, GrpcFnServer<StaticGrpcProvisionService>, ControlClientPool, IdGenerator) - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory.Provider
-
- createEpoll() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannelFactory backed by an EpollDomainSocketChannel if the address is a DomainSocketAddress.
- createEpollDomainSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create an EpollDomainSocket.
- createEpollSocket() - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
Create an EpollSocket.
- createFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
- createFactoryForCreateSubscription() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- createFactoryForPublish(PubsubClient.TopicPath, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing publishers.
- createFactoryForPull(Clock, PubsubClient.SubscriptionPath, int, Iterable<PubsubClient.IncomingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing subscribers.
- createFactoryForPullAndPublish(PubsubClient.SubscriptionPath, PubsubClient.TopicPath, Clock, int, Iterable<PubsubClient.IncomingMessage>, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Returns a factory for a test that is expected to both publish and pull messages over the course
of the test.
- createFhirStore(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- createFhirStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- createFhirStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- createFhirStore(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- createForSubrangeOfFile(String, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
-
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.AvroSource
-
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedSource for the specified range in a single file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a CompressedSource for a subrange of a file.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns a new FileBasedSource of the same type as the current FileBasedSource backed by a given file and an offset range.
- createForSubrangeOfFile(MatchResult.Metadata, long, long) - Method in class org.apache.beam.sdk.io.xml.XmlSource
-
- createFrom(String) - Static method in class org.apache.beam.sdk.fn.channel.SocketAddressFactory
-
Parse a SocketAddress from the given string.
- createGetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- createGetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
-
- createGetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
-
- createHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 message.
- createHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- createHL7v2Store(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 store.
- createHL7v2Store(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- createImplementor(Method) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
- createInProcess() - Static method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
Creates a ManagedChannel using an in-process channel.
- createInput(Pipeline, Map<String, PCollection<?>>) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionService.TransformProvider
-
- createInputFormatInstance() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
-
Creates an instance of the InputFormat class.
- createIterator() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
-
- createJob(Job) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Creates the Dataflow Job.
- createJobInvocation(String, String, ListeningExecutorService, RunnerApi.Pipeline, FlinkPipelineOptions, PortablePipelineRunner) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
-
- createJobServerFactory(JobServerDriver.ServerConfiguration) - Static method in class org.apache.beam.runners.jobsubmission.JobServerDriver
-
- createJobService() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
-
- createKinesisProducer(KinesisProducerConfiguration) - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
-
- createMetadata(MetaT) - Static method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
-
- createNClob() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createNewDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset.
- createNewDataset(String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset with defaultTableExpirationMs.
- createNewTable(String, String, Table) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
- CreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
- CreateOptions - Class in org.apache.beam.sdk.io.fs
-
An abstract class that contains common configuration options for creating resources.
- CreateOptions() - Constructor for class org.apache.beam.sdk.io.fs.CreateOptions
-
- CreateOptions.Builder<BuilderT extends CreateOptions.Builder<BuilderT>> - Class in org.apache.beam.sdk.io.fs
-
- CreateOptions.StandardCreateOptions - Class in org.apache.beam.sdk.io.fs
-
Standard configuration options with a builder.
- CreateOptions.StandardCreateOptions.Builder - Class in org.apache.beam.sdk.io.fs
-
- createOutputMap(Iterable<String>) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Creates a mapping from PCollection id to output tag integer.
- createPane(boolean, boolean, PaneInfo.Timing) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
- createPane(boolean, boolean, PaneInfo.Timing, long, long) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Factory method to create a PaneInfo with the specified parameters.
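Illustrative usage, a minimal sketch (the chosen flags and timing are examples):

    import org.apache.beam.sdk.transforms.windowing.PaneInfo;
    import org.apache.beam.sdk.transforms.windowing.PaneInfo.Timing;

    // A pane that is both the first and last firing for its window, produced on time.
    PaneInfo pane = PaneInfo.createPane(true, true, Timing.ON_TIME);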
- createPartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Creates the metadata table in the given instance and database configuration, using the table name specified in the constructor.
- createPipeline() - Method in class org.apache.beam.sdk.expansion.service.ExpansionService
-
- createPipelineOptions(Map<String, String>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
-
- createPlanner(JdbcConnection, Collection<RuleSet>) - Method in interface org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.Factory
-
- createPrepareContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>) - Static method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- createPrimitiveOutputInternal(Pipeline, WindowingStrategy<?, ?>, PCollection.IsBounded, Coder<T>, TupleTag<?>) - Static method in class org.apache.beam.sdk.values.PCollection
-
For internal use only; no backwards-compatibility guarantees.
- createProcessContext(ValueInSingleWindow<InputT>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- createProperties() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- createPushDownRel(RelDataType, List<String>, BeamSqlTableFilter) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- createQuery(Expression, Class<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createQuery(Expression, Type) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createQuery(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
-
- createQueryUsingStandardSql(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
-
- createRandomSubscription(PubsubClient.ProjectPath, PubsubClient.TopicPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create a random subscription for the given topic.
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
-
- createReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
-
- createReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.sdk.io.UnboundedSource
-
- createReadSession(CreateReadSessionRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Create a new read session against an existing table.
- createSessionToken(String) - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
-
- createSetter(FieldValueTypeInformation, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- createSetterConversions(StackManipulation) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
-
- createSetterConversions(StackManipulation) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
-
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.AvroSource
-
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.BlockBasedSource
-
Creates a BlockBasedReader.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a FileBasedReader to read a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Creates and returns an instance of a FileBasedReader implementation for the current source assuming the source represents a single file.
- createSingleFileReader(PipelineOptions) - Method in class org.apache.beam.sdk.io.xml.XmlSource
-
- createSnsPublisher() - Method in interface org.apache.beam.sdk.io.aws.sns.AwsClientsProvider
-
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- createSourceForSubrange(long, long) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
- createSQLXML() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createStateBackend(FlinkPipelineOptions) - Method in interface org.apache.beam.runners.flink.FlinkStateBackendFactory
-
- createStatement() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createStatement(int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createStatement(int, int, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- createStaticCreator(Class<T>, Method, Schema, List<FieldValueTypeInformation>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- CreateStream<T> - Class in org.apache.beam.runners.spark.io
-
Create an input stream from a Queue.
- createStreamExecutionEnvironment(FlinkPipelineOptions, List<String>, String) - Static method in class org.apache.beam.runners.flink.FlinkExecutionEnvironments
-
If the submitted job is a stream processing job, this method creates the appropriate Flink StreamExecutionEnvironment based on the user-specified options.
- createStringAggOperator(ResolvedNodes.ResolvedFunctionCallBase) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
- createStruct(String, Object[]) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- createStructuralValues(Coder<T>, List<T>) - Static method in class org.apache.beam.sdk.testing.SourceTestUtils
-
Testing utilities below depend on standard assertions and matchers to compare elements read by
sources.
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create the given subscription to the given topic.
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
-
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
-
- createTable(Table) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Creates a table.
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
-
- createTable(Table) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
-
- createTable(Table) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Creates the specified table if it does not exist.
- createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- CreateTableDestinations<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Creates any tables needed before performing writes to the tables.
- CreateTableDestinations(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableDestinations
-
- CreateTableDestinations(BigQueryIO.Write.CreateDisposition, BigQueryServices, DynamicDestinations<?, DestinationT>, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableDestinations
-
- CreateTableHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
-
- CreateTableHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers
-
- CreateTables<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Creates any tables needed before performing streaming writes to the tables.
- CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
-
The list of tables created so far, so we don't try the creation each time.
- createTest(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
-
- createTimestampPolicy(TopicPartition, Optional<Instant>) - Method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
Creates a TimestampPolicy for a partition.
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create the given topic.
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Returns a transform that creates a batch transaction.
- CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translation context.
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator
-
- createTranslationContext(JobInfo, FlinkPipelineOptions, String, List<String>) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator
-
Creates a streaming translation context.
- createTranslator() - Static method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator
-
Creates a batch translator.
- createTypeConversion(boolean) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
-
- createTypeConversion(boolean) - Method in interface org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TypeConversionsFactory
-
- createUnboundedTableStatistics(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- createUrl(String, int) - Method in interface org.apache.beam.sdk.fn.server.ServerFactory.UrlFactory
-
- createValue(String, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
- createValue(int, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
- createValue(EnumerationType.Value, T) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Create a OneOfType.Value specifying which field to set and the value to set.
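Illustrative usage combining OneOfType.create and createValue, a minimal sketch (field names are examples):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.schemas.logicaltypes.OneOfType;

    // A union of two alternatives; exactly one field is set per value.
    OneOfType stringOrInt =
        OneOfType.create(
            Schema.Field.of("asString", FieldType.STRING),
            Schema.Field.of("asInt", FieldType.INT32));
    OneOfType.Value v = stringOrInt.createValue("asString", "hello");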
- createWatermarkPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory
-
- createWatermarkPolicy() - Method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory
-
- createWithPortSupplier(Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
- createWithUrlFactory(ServerFactory.UrlFactory) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
- createWithUrlFactoryAndPortSupplier(ServerFactory.UrlFactory, Supplier<Integer>) - Static method in class org.apache.beam.sdk.fn.server.ServerFactory
-
- createWriteOperation() - Method in class org.apache.beam.sdk.io.AvroSink
-
- createWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
- createWriter() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
- createWriteStream(String, WriteStream.Type) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create a Write Stream for use with the Storage Write API.
- createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- createZetaSqlFunction(String, SqlTypeName) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
Create a dummy SqlFunction of type OTHER_FUNCTION from given function name and return type.
- CredentialFactory - Interface in org.apache.beam.sdk.extensions.gcp.auth
-
Construct an OAuth credential to be used by the SDK and the SDK workers.
- credentialsProvider(AwsCredentialsProvider) - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration.Builder
-
- credentialsProvider() - Method in class org.apache.beam.sdk.io.aws2.common.ClientConfiguration
-
- CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
-
- CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.snowflake.crosslanguage
-
Parameters abstract class to expose the transforms to an external SDK.
- CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- crossProductJoin() - Method in class org.apache.beam.sdk.schemas.transforms.CoGroup.Impl
-
Expand the join into individual rows, similar to SQL joins.
- csvLines2BeamRows(CSVFormat, String, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamTableUtils
-
Decodes zero or more CSV records from the given string, according to the specified CSVFormat, and converts them to Rows with the specified Schema.
- CsvSink - Class in org.apache.beam.runners.spark.metrics.sink
-
- CsvSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
-
Constructor for Spark 3.1.x and earlier.
- CsvSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.CsvSink
-
Constructor for Spark 3.2.x and later.
- CsvToRow(Schema, CSVFormat) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
-
- ctxt - Variable in class org.apache.beam.runners.spark.SparkRunner.Evaluator
-
- current() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
-
- current() - Method in interface org.apache.beam.runners.spark.structuredstreaming.aggregators.NamedAggregators.State
-
- currentEventTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current event time.
- currentInputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentOutputWatermarkTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current processing time.
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
-
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
-
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
-
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
- currentRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Returns a restriction accurately describing the full range of work the current DoFn.ProcessElement
call will do, including already completed work.
- currentSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- currentSynchronizedProcessingTime() - Method in interface org.apache.beam.sdk.state.Timers
-
Returns the current synchronized processing time or null if unknown.
- currentWatermark - Variable in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
-
- currentWatermark() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
-
Return estimated output watermark.
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
-
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
-
- currentWatermark() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
-
- custom() - Static method in class org.apache.beam.sdk.io.thrift.ThriftSchema
-
Builds a schema provider that maps any thrift type to a Beam schema, allowing for custom thrift
typedef entries (which cannot be resolved using the available metadata) to be manually
registered with their corresponding beam types.
- CUSTOM_SOURCE_FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- CustomCoder<T> - Class in org.apache.beam.sdk.coders
-
- CustomCoder() - Constructor for class org.apache.beam.sdk.coders.CustomCoder
-
- Customer - Class in org.apache.beam.sdk.extensions.sql.example.model
-
Describes a customer.
- Customer(int, String, String) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
-
- Customer() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Customer
-
- CustomHttpErrors - Class in org.apache.beam.sdk.extensions.gcp.util
-
An optional component to use with the RetryHttpRequestInitializer in order to provide custom errors for failing http calls.
- CustomHttpErrors.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
-
A Builder which allows building immutable CustomHttpErrors object.
- CustomHttpErrors.MatcherAndError - Class in org.apache.beam.sdk.extensions.gcp.util
-
A simple Tuple class for creating a list of HttpResponseMatcher and HttpResponseCustomError to
print for the responses.
- CustomTableResolver - Interface in org.apache.beam.sdk.extensions.sql.meta
-
Interface that table providers can implement if they require custom table name resolution.
- CustomTimestampPolicyWithLimitedDelay<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A policy for custom record timestamps where timestamps within a partition are expected to be
roughly monotonically increasing with a cap on out of order event delays (say 1 minute).
- CustomTimestampPolicyWithLimitedDelay(SerializableFunction<KafkaRecord<K, V>, Instant>, Duration, Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
-
A policy for custom record timestamps where timestamps are expected to be roughly monotonically increasing with out of order event delays less than maxDelay.
- DAGBuilder - Class in org.apache.beam.runners.jet
-
Utility class for wiring up Jet DAGs based on Beam pipelines.
- DAGBuilder.WiringListener - Interface in org.apache.beam.runners.jet
-
Listener that can be registered with a DAGBuilder in order to be notified when edges are being registered.
- DaoFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Factory class to create data access objects to perform change stream queries and access the
metadata tables.
- DaoFactory(SpannerConfig, String, SpannerConfig, String, Options.RpcPriority, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Constructs a DaoFactory with the configuration to be used for the underlying instances.
- data(StreamObserver<BeamFnApi.Elements>) - Method in class org.apache.beam.runners.fnexecution.data.GrpcDataService
-
- data(String, String) - Static method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
-
- DATA_BUFFER_SIZE_LIMIT - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
- DATA_BUFFER_TIME_LIMIT_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
- DATA_RECORD_COMMITTED_TO_EMITTED_0MS_TO_1000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [0, 1000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_1000MS_TO_3000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [1000, 3000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_3000MS_TO_INF_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies equal or above 3000ms during the execution of the Connector.
- DATA_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of data records identified during the execution of the Connector.
- database() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
-
- DataCatalogPipelineOptions - Interface in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Pipeline options for Data Catalog table provider.
- DataCatalogPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
- DataCatalogPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
-
- DataCatalogTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Uses DataCatalog to get the source type and schema for a table.
- DataChangeRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A data change record encodes modifications to Cloud Spanner rows.
- DataChangeRecord(String, Timestamp, String, boolean, String, String, List<ColumnType>, List<Mod>, ModType, ValueCaptureType, long, long, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Constructs a data change record for a given partition, at a given timestamp, for a given
transaction.
- dataChangeRecordAction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing DataChangeRecords.
- DataChangeRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
- DataChangeRecordAction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
-
- DataEndpoint<T> - Class in org.apache.beam.sdk.fn.data
-
- DataEndpoint() - Constructor for class org.apache.beam.sdk.fn.data.DataEndpoint
-
- DataflowClient - Class in org.apache.beam.runners.dataflow
-
Wrapper around the generated Dataflow client to provide common functionality.
- DataflowClientFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory
-
- DataflowJobAlreadyExistsException - Exception in org.apache.beam.runners.dataflow
-
An exception that is thrown if the unique job name constraint of the Dataflow service is broken
because an existing job with the same job name is currently active.
- DataflowJobAlreadyExistsException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException
-
Create a new DataflowJobAlreadyExistsException with the specified DataflowPipelineJob and message.
- DataflowJobAlreadyUpdatedException - Exception in org.apache.beam.runners.dataflow
-
An exception that is thrown if the existing job has already been updated within the Dataflow
service and is no longer able to be updated.
- DataflowJobAlreadyUpdatedException(DataflowPipelineJob, String) - Constructor for exception org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException
-
Create a new DataflowJobAlreadyUpdatedException with the specified DataflowPipelineJob and message.
- DataflowJobException - Exception in org.apache.beam.runners.dataflow
-
- DataflowPipelineDebugOptions - Interface in org.apache.beam.runners.dataflow.options
-
Internal.
- DataflowPipelineDebugOptions.DataflowClientFactory - Class in org.apache.beam.runners.dataflow.options
-
Returns the default Dataflow client built from the passed in PipelineOptions.
- DataflowPipelineDebugOptions.StagerFactory - Class in org.apache.beam.runners.dataflow.options
-
- DataflowPipelineJob - Class in org.apache.beam.runners.dataflow
-
A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>, RunnerApi.Pipeline) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Constructs the job.
- DataflowPipelineJob(DataflowClient, String, DataflowPipelineOptions, Map<AppliedPTransform<?, ?, ?>, String>) - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Constructs the job.
- DataflowPipelineOptions - Interface in org.apache.beam.runners.dataflow.options
-
- DataflowPipelineOptions.FlexResourceSchedulingGoal - Enum in org.apache.beam.runners.dataflow.options
-
Set of available Flexible Resource Scheduling goals.
- DataflowPipelineOptions.StagingLocationFactory - Class in org.apache.beam.runners.dataflow.options
-
- DataflowPipelineRegistrar - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineRegistrar.Options - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineRegistrar.Runner - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineTranslator - Class in org.apache.beam.runners.dataflow
-
- DataflowPipelineTranslator.JobSpecification - Class in org.apache.beam.runners.dataflow
-
The result of a job translation.
- DataflowPipelineWorkerPoolOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used to configure the Dataflow pipeline worker pool.
- DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType - Enum in org.apache.beam.runners.dataflow.options
-
Type of autoscaling algorithm to use.
- DataflowProfilingAgentConfiguration() - Constructor for class org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
-
- DataflowProfilingOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options for controlling profiling of pipeline execution.
- DataflowProfilingOptions.DataflowProfilingAgentConfiguration - Class in org.apache.beam.runners.dataflow.options
-
Configuration for the profiling agent.
- DataflowRunner - Class in org.apache.beam.runners.dataflow
-
A PipelineRunner that executes the operations in the pipeline by first translating them to the Dataflow representation using the DataflowPipelineTranslator and then submitting them to a Dataflow service for execution.
- DataflowRunner(DataflowPipelineOptions) - Constructor for class org.apache.beam.runners.dataflow.DataflowRunner
-
- DataflowRunner.StreamingPCollectionViewWriterFn<T> - Class in org.apache.beam.runners.dataflow
-
- DataflowRunnerHooks - Class in org.apache.beam.runners.dataflow
-
An instance of this class can be passed to the DataflowRunner to add user defined hooks to be invoked at various times during pipeline execution.
- DataflowRunnerHooks() - Constructor for class org.apache.beam.runners.dataflow.DataflowRunnerHooks
-
- DataflowRunnerInfo - Class in org.apache.beam.runners.dataflow
-
- DataflowServiceException - Exception in org.apache.beam.runners.dataflow
-
Signals there was an error retrieving information about a job from the Cloud Dataflow Service.
- DataflowTemplateJob - Class in org.apache.beam.runners.dataflow.util
-
- DataflowTemplateJob() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- DataflowTransport - Class in org.apache.beam.runners.dataflow.util
-
Helpers for cloud communication.
- DataflowTransport() - Constructor for class org.apache.beam.runners.dataflow.util.DataflowTransport
-
- DataflowWorkerHarnessOptions - Interface in org.apache.beam.runners.dataflow.options
-
Options that are used exclusively within the Dataflow worker harness.
- DataflowWorkerLoggingOptions - Interface in org.apache.beam.runners.dataflow.options
-
- DataflowWorkerLoggingOptions.Level - Enum in org.apache.beam.runners.dataflow.options
-
Deprecated.
The set of log levels that can be used on the Dataflow worker.
- DataflowWorkerLoggingOptions.WorkerLogLevelOverrides - Class in org.apache.beam.runners.dataflow.options
-
Deprecated.
Defines a log level override for a specific class, package, or name.
- DataframeTransform - Class in org.apache.beam.sdk.extensions.python.transforms
-
Wrapper for invoking external Python DataframeTransform.
- dataSchema - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
-
- dataset - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
-
- dataSets - Variable in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- DatasetSourceBatch - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch
-
Spark DataSourceV2 API was removed in Spark3.
- DatasetSourceBatch() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.DatasetSourceBatch
-
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.influxdb.InfluxDbIO.DataSourceConfiguration
-
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceConfiguration
-
- DataSourceConfiguration() - Constructor for class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
-
A
PTransform
that reads the result rows of a Cloud Datastore query as
Entity
objects.
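A minimal sketch of reading entities with DatastoreIO.v1().read(); the project id and the "Task" kind are hypothetical.

```java
import com.google.datastore.v1.Entity;
import com.google.datastore.v1.Query;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class DatastoreReadSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Query every entity of the (hypothetical) "Task" kind.
    Query.Builder query = Query.newBuilder();
    query.addKindBuilder().setName("Task");

    PCollection<Entity> tasks =
        pipeline.apply(
            DatastoreIO.v1().read()
                .withProjectId("my-project") // placeholder project id
                .withQuery(query.build()));

    pipeline.run();
  }
}
```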
- DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DataStoreV1SchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.datastore
-
- DataStoreV1SchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
- DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
An abstraction to create schema-aware IOs.
- DataStoreV1TableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.datastore
-
- DataStoreV1TableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
-
- DataStreamDecoder(Coder<T>, PrefetchableIterator<ByteString>) - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
-
- DataStreams - Class in org.apache.beam.sdk.fn.stream
-
- DataStreams() - Constructor for class org.apache.beam.sdk.fn.stream.DataStreams
-
- DataStreams.DataStreamDecoder<T> - Class in org.apache.beam.sdk.fn.stream
-
- DataStreams.ElementDelimitedOutputStream - Class in org.apache.beam.sdk.fn.stream
-
- DataStreams.OutputChunkConsumer<T> - Interface in org.apache.beam.sdk.fn.stream
-
- DATE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- date(Integer, Integer, Integer) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
-
- date(DateTime) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
-
- date(DateTime, String) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
-
- DATE - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- Date - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A date without a time-zone.
- Date() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.Date
-
- DATE - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to ZetaSQL/CalciteSQL DATE type.
- DATE_FIELD_NAME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
-
- DATE_METHOD - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.BeamBuiltinMethods
-
- DATE_OP - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators
-
- DATE_TYPES - Static variable in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- DateFunctions - Class in org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
DateFunctions.
- DateFunctions() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.DateFunctions
-
- DateIncrementAllFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.DateIncrementAllFn
-
- DATETIME - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- DateTime - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A datetime without a time-zone.
- DateTime() - Constructor for class org.apache.beam.sdk.schemas.logicaltypes.DateTime
-
- DATETIME - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SqlTypes
-
Beam LogicalType corresponding to ZetaSQL DATETIME type.
- DATETIME - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of datetime fields.
- DATETIME_SCHEMA - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
-
- DateTimeBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle
-
- DateTimeUtils - Class in org.apache.beam.sdk.extensions.sql.zetasql
-
DateTimeUtils.
- DateTimeUtils() - Constructor for class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
-
- days(int) - Static method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows
-
Returns a
WindowFn
that windows elements into periods measured by days.
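For illustration, a minimal sketch that windows a timestamped PCollection into one-day calendar windows; the input collection is assumed to be produced elsewhere.

```java
import org.apache.beam.sdk.transforms.windowing.CalendarWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;

public class DailyWindowsSketch {
  // Window timestamped elements into calendar days.
  public static PCollection<String> intoDailyWindows(PCollection<String> events) {
    return events.apply("DailyWindows", Window.into(CalendarWindows.days(1)));
  }
}
```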
- DDL_EXECUTOR - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
-
DDL executor.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
The tag for the deadletter output of FHIR resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
-
The tag for the deadletter output of FHIR Resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
The tag for the deadletter output of FHIR Resources from a GetPatientEverything request.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
-
The tag for the deadletter output of HL7v2 Messages.
- DeadLetteredTransform<InputT,OutputT> - Class in org.apache.beam.sdk.schemas.io
-
- DeadLetteredTransform(SimpleFunction<InputT, OutputT>, String) - Constructor for class org.apache.beam.sdk.schemas.io.DeadLetteredTransform
-
- DebeziumIO - Class in org.apache.beam.io.debezium
-
Utility class which exposes an implementation
DebeziumIO.read()
and a Debezium configuration.
- DebeziumIO.ConnectorConfiguration - Class in org.apache.beam.io.debezium
-
A POJO describing a Debezium configuration.
- DebeziumIO.Read<T> - Class in org.apache.beam.io.debezium
-
- DebeziumSDFDatabaseHistory() - Constructor for class org.apache.beam.io.debezium.KafkaSourceConsumerFn.DebeziumSDFDatabaseHistory
-
- DebeziumTransformRegistrar - Class in org.apache.beam.io.debezium
-
- DebeziumTransformRegistrar() - Constructor for class org.apache.beam.io.debezium.DebeziumTransformRegistrar
-
- DebeziumTransformRegistrar.ReadBuilder - Class in org.apache.beam.io.debezium
-
- DebeziumTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.io.debezium
-
- dec() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
-
- dec(long) - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
-
- dec() - Method in interface org.apache.beam.sdk.metrics.Counter
-
- dec(long) - Method in interface org.apache.beam.sdk.metrics.Counter
-
- dec() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
-
- dec(long) - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
-
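A minimal sketch (hypothetical DoFn and metric name) showing inc()/dec() on a Counter obtained from Metrics.counter.

```java
import org.apache.beam.sdk.metrics.Counter;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.transforms.DoFn;

public class InFlightFn extends DoFn<String, String> {
  private final Counter inFlight = Metrics.counter(InFlightFn.class, "in_flight");

  @ProcessElement
  public void processElement(@Element String element, OutputReceiver<String> out) {
    inFlight.inc();       // track work entering this step
    out.output(element);
    inFlight.dec();       // and work leaving it
  }
}
```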
- decActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- DECIMAL - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- DECIMAL - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of decimal fields.
- decode(InputStream) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- decode(InputStream) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
- decode(InputStream) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.MultiOutputCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.AvroCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.BitSetCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.BooleanCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.Coder
-
Decodes a value of type T
from the given input stream in the given context.
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.DurationCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.FloatCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.InstantCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.KvCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.KvCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.MapCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.MapCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SnappyCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.coders.VoidCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.aws2.sqs.MessageCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.aws2.sqs.SendMessageRequestCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.ReadableFileCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.thrift.ThriftCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- decode(InputStream) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- decodeFromChunkBoundaryToChunkBoundary() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
-
Skips any remaining bytes in the current ByteString, moves to the next ByteString in the underlying iterator, and decodes elements until reaching the next boundary.
- decodePacked32TimeSeconds(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeSeconds
as a
LocalTime
with seconds precision.
- decodePacked32TimeSecondsAsJavaTime(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeSeconds
as a LocalTime
with seconds precision.
- decodePacked64DatetimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldDatetimeMicros
as a
LocalDateTime
with microseconds precision.
- decodePacked64DatetimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldDatetimeMicros
as a LocalDateTime
with microseconds
precision.
- decodePacked64DatetimeSeconds(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldDatetimeSeconds
as a
LocalDateTime
with seconds precision.
- decodePacked64DatetimeSecondsAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldDatetimeSeconds
as a
LocalDateTime
with seconds precision.
- decodePacked64TimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeMicros
as a
LocalTime
with microseconds precision.
- decodePacked64TimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeMicros
as a LocalTime
with microseconds
precision.
- decodePacked64TimeNanos(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeNanos
as a
LocalTime
with nanoseconds precision.
- decodePacked64TimeNanosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes bitFieldTimeNanos
as a LocalTime
with nanoseconds precision.
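The decode methods above pair with encode methods on CivilTimeEncoder; the sketch below assumes an encodePacked64TimeMicros(LocalTime) counterpart exists and round-trips an arbitrary Joda LocalTime through the packed micros representation.

```java
import org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder;
import org.joda.time.LocalTime;

public class CivilTimeSketch {
  public static void main(String[] args) {
    LocalTime time = new LocalTime(13, 45, 30, 123); // arbitrary example time
    // Assumed encode counterpart of decodePacked64TimeMicros.
    long packed = CivilTimeEncoder.encodePacked64TimeMicros(time);
    LocalTime decoded = CivilTimeEncoder.decodePacked64TimeMicros(packed);
    System.out.println(decoded); // expected to print 13:45:30.123
  }
}
```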
- decodePane(byte) - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
- decodeQueryResult(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
-
- decodeTimerDataTimerId(String) - Static method in class org.apache.beam.runners.fnexecution.control.TimerReceiverFactory
-
Decodes a string into the transform and timer family ids.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.CollectionCoder
-
Builds an instance of IterableT
, this coder's associated Iterable
-like subtype,
from a list of decoded elements.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.DequeCoder
-
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableCoder
-
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Builds an instance of IterableT
, this coder's associated Iterable
-like subtype,
from a list of decoded elements.
- decodeToIterable(List<T>, long, InputStream) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
Builds an instance of IterableT
, this coder's associated Iterable
-like subtype,
from a list of decoded elements with the InputStream
at the position where this coder
detected the end of the stream.
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.ListCoder
-
- decodeToIterable(List<T>) - Method in class org.apache.beam.sdk.coders.SetCoder
-
Builds an instance of IterableT
, this coder's associated Iterable
-like subtype,
from a list of decoded elements.
- decodeWindowedValue(byte[], Coder) - Static method in class org.apache.beam.runners.jet.Utils
-
- DecodingFnDataReceiver<T> - Class in org.apache.beam.sdk.fn.data
-
A receiver of encoded data, decoding it and passing it onto a downstream consumer.
- DecodingFnDataReceiver(Coder<T>, FnDataReceiver<T>) - Constructor for class org.apache.beam.sdk.fn.data.DecodingFnDataReceiver
-
- decrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
-
Returns an
IdGenerators
that will provide successive decrementing longs.
- deduplicate(UuidDeduplicationOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Removes duplicates from the output of a read PTransform.
- deduplicate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
-
- Deduplicate - Class in org.apache.beam.sdk.transforms
-
A set of
PTransform
s which deduplicate input records over a time domain and threshold.
- Deduplicate.KeyedValues<K,V> - Class in org.apache.beam.sdk.transforms
-
Deduplicates keyed values using the key over a specified time domain and threshold.
- Deduplicate.Values<T> - Class in org.apache.beam.sdk.transforms
-
Deduplicates values over a specified time domain and threshold.
- Deduplicate.WithRepresentativeValues<T,IdT> - Class in org.apache.beam.sdk.transforms
-
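A minimal sketch (hypothetical input values) applying Deduplicate.values() with an explicit duration.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.Deduplicate;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

public class DeduplicateSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    PCollection<String> events = pipeline.apply(Create.of("a", "b", "a", "c"));

    // Keep only the first occurrence of each value seen within the duration.
    PCollection<String> distinct =
        events.apply(Deduplicate.<String>values().withDuration(Duration.standardMinutes(10)));

    pipeline.run();
  }
}
```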
- deepEquals(Object, Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
-
- deepHashCode(Object, Schema.FieldType) - Static method in class org.apache.beam.sdk.values.Row.Equals
-
- DEF - Static variable in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
-
- DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
-
- DEFAULT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
-
- Default() - Constructor for class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
-
- Default - Annotation Type in org.apache.beam.sdk.options
-
Default
represents a set of annotations that can be used to annotate getter properties on
PipelineOptions
with information representing the default value to be returned if no
value is specified.
- Default.Boolean - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified boolean primitive value.
- Default.Byte - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified byte primitive value.
- Default.Character - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified char primitive value.
- Default.Class - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified Class
value.
- Default.Double - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified double primitive value.
- Default.Enum - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified enum.
- Default.Float - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified float primitive value.
- Default.InstanceFactory - Annotation Type in org.apache.beam.sdk.options
-
- Default.Integer - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified int primitive value.
- Default.Long - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified long primitive value.
- Default.Short - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified short primitive value.
- Default.String - Annotation Type in org.apache.beam.sdk.options
-
This represents that the default of the option is the specified String
value.
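A minimal sketch (hypothetical options interface) showing how these annotations supply defaults for PipelineOptions getters.

```java
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;

public interface MyOptions extends PipelineOptions {
  @Description("Number of worker threads to use.")
  @Default.Integer(4)
  int getWorkerThreads();
  void setWorkerThreads(int value);

  @Description("Output path prefix.")
  @Default.String("/tmp/output")
  String getOutputPrefix();
  void setOutputPrefix(String value);
}
```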
- DEFAULT_ATTRIBUTE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
-
- DEFAULT_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
- DEFAULT_BUFFER_LIMIT_TIME_MS - Static variable in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
- DEFAULT_BUFFER_SIZE - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- DEFAULT_BYTE_ARRAY_CODER - Static variable in class org.apache.beam.sdk.io.TFRecordIO
-
The default coder, which returns each record of the input file as a byte array.
- DEFAULT_CALC - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- DEFAULT_CHANGE_STREAM_NAME - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default change stream name for a change stream query is the empty String.
- DEFAULT_CONTEXT - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
-
- DEFAULT_DEDUPLICATE_DURATION - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
-
- DEFAULT_DURATION - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
-
The default duration is 10 minutes.
- DEFAULT_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
- DEFAULT_INCLUSIVE_START_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default start timestamp for a change stream query is Timestamp.MIN_VALUE.
- DEFAULT_INITIAL_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
- DEFAULT_MASTER_URL - Static variable in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
-
- DEFAULT_MAX_CUMULATIVE_BACKOFF - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
- DEFAULT_MAX_INSERT_BLOCK_SIZE - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
- DEFAULT_MAX_INVOCATION_HISTORY - Static variable in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
The default maximum number of completed invocations to keep.
- DEFAULT_MAX_NUM_ELEMENTS - Static variable in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
The cost (in time and space) to compute quantiles to a given accuracy is a function of the
total number of elements in the data set.
- DEFAULT_MAX_RETRIES - Static variable in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
- DEFAULT_OUTBOUND_BUFFER_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.fn.stream.DataStreams
-
- DEFAULT_PARALLELISM - Static variable in class org.apache.beam.runners.spark.structuredstreaming.Constants
-
- DEFAULT_PRECISION - Static variable in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
- DEFAULT_RPC_PRIORITY - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
- DEFAULT_SCHEME - Static variable in class org.apache.beam.sdk.io.FileSystems
-
- DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
-
- DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.transforms.Deduplicate
-
- DEFAULT_UNWINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
The default sharding name template.
- DEFAULT_UPLOAD_BUFFER_SIZE_BYTES - Static variable in class org.apache.beam.sdk.io.aws2.options.S3Options.S3UploadBufferSizeBytesFactory
-
- DEFAULT_USES_RESHUFFLE - Static variable in class org.apache.beam.sdk.io.ReadAllViaFileBasedSource
-
- DEFAULT_UUID_EXTRACTOR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
-
- DEFAULT_WINDOWED_SHARD_TEMPLATE - Static variable in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
The default windowed sharding name template used when writing windowed files.
- DefaultAutoscaler - Class in org.apache.beam.sdk.io.jms
-
- DefaultAutoscaler() - Constructor for class org.apache.beam.sdk.io.jms.DefaultAutoscaler
-
- DefaultBlobstoreClientBuilderFactory - Class in org.apache.beam.sdk.io.azure.blobstore
-
Construct BlobServiceClientBuilder with given values of Azure client properties.
- DefaultBlobstoreClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.azure.blobstore.DefaultBlobstoreClientBuilderFactory
-
- DefaultCoder - Annotation Type in org.apache.beam.sdk.coders
-
The
DefaultCoder
annotation specifies a
Coder
class to handle encoding and
decoding instances of the annotated class.
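A minimal sketch (hypothetical value class) annotated so that SerializableCoder is used by default for its PCollections.

```java
import java.io.Serializable;
import org.apache.beam.sdk.coders.DefaultCoder;
import org.apache.beam.sdk.coders.SerializableCoder;

// PCollections of MyRecord will default to SerializableCoder unless overridden.
@DefaultCoder(SerializableCoder.class)
public class MyRecord implements Serializable {
  public String id;
  public long value;
}
```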
- DefaultCoder.DefaultCoderProviderRegistrar - Class in org.apache.beam.sdk.coders
-
- DefaultCoderCloudObjectTranslatorRegistrar - Class in org.apache.beam.runners.dataflow.util
-
- DefaultCoderCloudObjectTranslatorRegistrar() - Constructor for class org.apache.beam.runners.dataflow.util.DefaultCoderCloudObjectTranslatorRegistrar
-
- DefaultCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
-
- DefaultConcludeTransform() - Constructor for class org.apache.beam.sdk.testing.PAssert.DefaultConcludeTransform
-
- defaultConfig(JdbcConnection, Collection<RuleSet>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
- DefaultExecutableStageContext - Class in org.apache.beam.runners.fnexecution.control
-
- defaultFactory() - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
- DefaultFilenamePolicy - Class in org.apache.beam.sdk.io
-
- DefaultFilenamePolicy.Params - Class in org.apache.beam.sdk.io
-
- DefaultFilenamePolicy.ParamsCoder - Class in org.apache.beam.sdk.io
-
- DefaultGcpRegionFactory - Class in org.apache.beam.runners.dataflow.options
-
Factory for a default value for Google Cloud region according to
https://cloud.google.com/compute/docs/gcloud-compute/#default-properties.
- DefaultGcpRegionFactory() - Constructor for class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
-
- DefaultJobBundleFactory - Class in org.apache.beam.runners.fnexecution.control
-
- DefaultJobBundleFactory.ServerInfo - Class in org.apache.beam.runners.fnexecution.control
-
A container for EnvironmentFactory and its corresponding Grpc servers.
- DefaultJobBundleFactory.WrappedSdkHarnessClient - Class in org.apache.beam.runners.fnexecution.control
-
Holder for an
SdkHarnessClient
along with its associated state and data servers.
- DefaultJobServerConfigFactory() - Constructor for class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.DefaultJobServerConfigFactory
-
- DefaultMaxCacheMemoryUsageMb() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
-
- DefaultMaxCacheMemoryUsageMbFactory() - Constructor for class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMbFactory
-
- defaultNaming(String, String) - Static method in class org.apache.beam.sdk.io.FileIO.Write
-
- defaultNaming(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.FileIO.Write
-
Defines a default
FileIO.Write.FileNaming
which will use the prefix and suffix supplied to create
a name based on the window, pane, number of shards, shard index, and compression.
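A minimal sketch writing a text file set with FileIO and the default naming strategy; the output directory and the prefix/suffix are placeholders.

```java
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.values.PCollection;

public class FileNamingSketch {
  public static void write(PCollection<String> lines) {
    lines.apply(
        FileIO.<String>write()
            .via(TextIO.sink())
            .to("/tmp/output") // placeholder output directory
            .withNaming(FileIO.Write.defaultNaming("events", ".txt")));
  }
}
```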
- defaultOptions() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Factory method to return a new instance of
RpcQosOptions
with all default values.
- DefaultPipelineOptionsRegistrar - Class in org.apache.beam.sdk.options
-
- DefaultPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
-
- DefaultProjectFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.DefaultProjectFactory
-
- defaultPublishResponse() - Static method in class org.apache.beam.sdk.io.aws2.sns.PublishResponseCoders
-
Returns a new SNS
PublishResponse
coder which by default serializes only the SNS
messageId.
- defaultPublishResult() - Static method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoders
-
Returns a new PublishResult coder which by default serializes only the messageId.
- DefaultRateLimiter(BackOff, BackOff) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- DefaultRateLimiter(Duration, Duration, Duration) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- DefaultRateLimiter(BackOff, BackOff) - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- DefaultRateLimiter(Duration, Duration, Duration) - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- DefaultRetryStrategy() - Constructor for class org.apache.beam.sdk.io.jdbc.JdbcIO.DefaultRetryStrategy
-
- defaults() - Static method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- Defaults() - Constructor for class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
-
- DefaultS3ClientBuilderFactory - Class in org.apache.beam.sdk.io.aws.s3
-
Construct AmazonS3ClientBuilder with default values of S3 client properties like path style
access, accelerated mode, etc.
- DefaultS3ClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws.s3.DefaultS3ClientBuilderFactory
-
- DefaultS3ClientBuilderFactory - Class in org.apache.beam.sdk.io.aws2.s3
-
Construct S3ClientBuilder with default values of S3 client properties like path style access,
accelerated mode, etc.
- DefaultS3ClientBuilderFactory() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
-
- DefaultS3FileSystemSchemeRegistrar - Class in org.apache.beam.sdk.io.aws.s3
-
Registers the "s3" uri schema to be handled by S3FileSystem
.
- DefaultS3FileSystemSchemeRegistrar() - Constructor for class org.apache.beam.sdk.io.aws.s3.DefaultS3FileSystemSchemeRegistrar
-
- DefaultS3FileSystemSchemeRegistrar - Class in org.apache.beam.sdk.io.aws2.s3
-
Registers the "s3" uri schema to be handled by S3FileSystem
.
- DefaultS3FileSystemSchemeRegistrar() - Constructor for class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
-
- DefaultSchema - Annotation Type in org.apache.beam.sdk.schemas.annotations
-
- DefaultSchema.DefaultSchemaProvider - Class in org.apache.beam.sdk.schemas.annotations
-
- DefaultSchema.DefaultSchemaProviderRegistrar - Class in org.apache.beam.sdk.schemas.annotations
-
Registrar for default schemas.
- DefaultSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
- DefaultSchemaProviderRegistrar() - Constructor for class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
-
- DefaultStopPipelineWatermarkFactory() - Constructor for class org.apache.beam.runners.spark.TestSparkPipelineOptions.DefaultStopPipelineWatermarkFactory
-
- DefaultTableFilter - Class in org.apache.beam.sdk.extensions.sql.meta
-
- DefaultTableFilter(List<RexNode>) - Constructor for class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
-
- DefaultTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
A trigger that is equivalent to Repeatedly.forever(AfterWatermark.pastEndOfWindow()).
- defaultType() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
-
- DefaultTypeConversionsFactory() - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.DefaultTypeConversionsFactory
-
- defaultValue() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- defaultValue() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
-
- defaultValue() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns the default value when there are no values added to the accumulator.
- defaultValue() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the default value when there are no values added to the accumulator.
- defaultValue() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.CombineFnWithContext
-
- defaultValue() - Method in class org.apache.beam.sdk.transforms.View.AsSingleton
-
Returns the default value of this transform, or null if there isn't one.
- DefaultValueFactory<T> - Interface in org.apache.beam.sdk.options
-
- deidentify(String, String, DeidentifyConfig) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- Deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
-
- deidentify(DoFn<String, String>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
-
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- DeidentifyFn(ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
-
- delay(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform
-
For internal use only; no backwards-compatibility guarantees.
- Delay() - Constructor for class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
-
- DelayIntervalRateLimiter() - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
-
- DelayIntervalRateLimiter(Supplier<Duration>) - Constructor for class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
-
- DelayIntervalRateLimiter() - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
-
- DelayIntervalRateLimiter(Supplier<Duration>) - Constructor for class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
-
- delegate() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
-
- delegate(HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register display data from the specified component on behalf of the current component.
- delegateBasedUponType(EnumMap<BeamFnApi.StateKey.TypeCase, StateRequestHandler>) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
Returns a StateRequestHandler which delegates to the supplied handler depending on the BeamFnApi.StateRequest's type.
- DelegateCoder<T,IntermediateT> - Class in org.apache.beam.sdk.coders
-
A DelegateCoder<T, IntermediateT> wraps a Coder for IntermediateT and encodes/decodes values of type T by converting to/from IntermediateT and then encoding/decoding using the underlying Coder<IntermediateT>.
- DelegateCoder(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Constructor for class org.apache.beam.sdk.coders.DelegateCoder
-
- DelegateCoder.CodingFunction<InputT,OutputT> - Interface in org.apache.beam.sdk.coders
-
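A minimal sketch building a DelegateCoder for java.net.URI by converting to and from String and delegating to StringUtf8Coder; the conversion functions are illustrative.

```java
import java.net.URI;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.DelegateCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;

public class UriCoderSketch {
  public static Coder<URI> uriCoder() {
    return DelegateCoder.of(
        StringUtf8Coder.of(),
        URI::toString, // T -> IntermediateT
        URI::create);  // IntermediateT -> T
  }
}
```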
- DelegatingCounter - Class in org.apache.beam.sdk.metrics
-
Implementation of
Counter
that delegates to the instance for the current context.
- DelegatingCounter(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
-
- DelegatingCounter(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingCounter
-
- DelegatingDistribution - Class in org.apache.beam.sdk.metrics
-
Implementation of
Distribution
that delegates to the instance for the current context.
- DelegatingDistribution(MetricName) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
-
- DelegatingDistribution(MetricName, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingDistribution
-
- DelegatingHistogram - Class in org.apache.beam.sdk.metrics
-
Implementation of
Histogram
that delegates to the instance for the current context.
- DelegatingHistogram(MetricName, HistogramData.BucketType, boolean) - Constructor for class org.apache.beam.sdk.metrics.DelegatingHistogram
-
- delete() - Static method in class org.apache.beam.sdk.io.cassandra.CassandraIO
-
- delete(Collection<ClassLoaderFileSystem.ClassLoaderResourceId>) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
-
- delete(Collection<ResourceIdT>) - Method in class org.apache.beam.sdk.io.FileSystem
-
Deletes a collection of resources.
- delete(Collection<ResourceId>, MoveOptions...) - Static method in class org.apache.beam.sdk.io.FileSystems
-
Deletes a collection of resources.
- DELETE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
-
- deleteAsync(T) - Method in interface org.apache.beam.sdk.io.cassandra.Mapper
-
This method is called for each delete event.
- DeleteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
-
- deleteDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the dataset specified by the datasetId value.
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- deleteDicomStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- deleteDicomStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
- deleteFhirStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- deleteFhirStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- deleteHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 message.
- deleteHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- deleteHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 store.
- deleteHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
- deletePartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Drops the metadata table.
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Deletes the given subscription.
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- deleteTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the table specified by tableId from the dataset.
- deleteTable(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
- deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- deleteTimer(StateNamespace, String, String, TimeDomain) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- deleteTimer(TimerInternals.TimerData) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- deleteTimer(StateNamespace, String, String) - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- deleteTimer(Instant, TimeDomain) - Method in interface org.apache.beam.sdk.state.Timers
-
Removes the timer set in this context for the timestamp and timeDomain.
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- delimitElement() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.ElementDelimitedOutputStream
-
- dependencies(Row, PipelineOptions) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
List the dependencies needed for this transform.
- dependencies(Row, PipelineOptions) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
- dependsOnlyOnEarliestTimestamp() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns true
if the result of combination of many output timestamps actually depends
only on the earliest.
- dependsOnlyOnWindow() - Method in enum org.apache.beam.sdk.transforms.windowing.TimestampCombiner
-
Returns true
if the result does not depend on what outputs were combined but only the
window they are in.
- DequeCoder<T> - Class in org.apache.beam.sdk.coders
-
- DequeCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.DequeCoder
-
- deregister() - Method in interface org.apache.beam.runners.fnexecution.state.StateDelegator.Registration
-
De-registers the handler for all future requests for state for the registered process bundle
instruction id.
- deriveIterableValueCoder(WindowedValue.FullWindowedValueCoder) - Static method in class org.apache.beam.runners.jet.Utils
-
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel
-
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRel
-
- deriveRowType() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
- deriveUncollectRowType(RelNode, boolean) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.ZetaSqlUnnest
-
Returns the row type returned by applying the 'UNNEST' operation to a relational expression.
- describe(Set<Class<? extends PipelineOptions>>) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Outputs the set of options available to be set for the passed in
PipelineOptions
interfaces.
- describeMismatchSafely(BigqueryMatcher.TableAndQuery, Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
-
- describeMismatchSafely(ShardedFile, Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
-
- describeMismatchSafely(T, Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
-
- describePipelineOptions(JobApi.DescribePipelineOptionsRequest, StreamObserver<JobApi.DescribePipelineOptionsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- describeTo(Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
-
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.FileChecksumMatcher
-
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.JsonMatcher
-
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.RegexMatcher
-
- describeTo(Description) - Method in class org.apache.beam.sdk.testing.TestPipelineOptions.AlwaysPassMatcher
-
- Description - Annotation Type in org.apache.beam.sdk.options
-
Descriptions are used to generate human readable output when the --help
command is
specified.
- deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
-
- deserialize(String) - Static method in class org.apache.beam.sdk.io.kinesis.serde.AwsSerializableUtils
-
- deserialize(JsonParser, DeserializationContext) - Method in class org.apache.beam.sdk.options.ValueProvider.Deserializer
-
- deserialize(byte[]) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
-
- deserializeAwsCredentialsProvider(String) - Static method in class org.apache.beam.sdk.io.aws2.options.AwsSerializableUtils
-
- DeserializeBytesIntoPubsubMessagePayloadOnly() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
-
- DeserializerProvider<T> - Interface in org.apache.beam.sdk.io.kafka
-
- deserializeTimers(Collection<byte[]>, TimerInternals.TimerDataCoderV2) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- detect(String) - Static method in enum org.apache.beam.sdk.io.Compression
-
- detectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a single instance of an action class capable of detecting and scheduling
new partitions to be queried.
- DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is responsible for scheduling partitions.
- DetectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
-
Constructs an action class for detecting / scheduling new partitions.
- DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A SplittableDoFn (SDF) that is responsible for scheduling partitions to be queried.
- DetectNewPartitionsDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
This class needs a
DaoFactory
to build DAOs to access the partition metadata tables.
- DetectNewPartitionsRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
- DetectNewPartitionsRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
-
- detectTranslationMode(Pipeline, StreamingOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
Visit the pipeline to determine the translation mode (batch/streaming) and update options
accordingly.
- DicomIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The DicomIO connectors allows Beam pipelines to make calls to the Dicom API of the Google Cloud
Healthcare API (https://cloud.google.com/healthcare/docs/how-tos#dicom-guide).
- DicomIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
-
- DicomIO.ReadStudyMetadata - Class in org.apache.beam.sdk.io.gcp.healthcare
-
This class makes a call to the retrieve metadata endpoint
(https://cloud.google.com/healthcare/docs/how-tos/dicomweb#retrieving_metadata).
- DicomIO.ReadStudyMetadata.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
- dicomStorePath - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
-
- DicomWebPath() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
-
- DirectOptions - Interface in org.apache.beam.runners.direct
-
- DirectOptions.AvailableParallelismFactory - Class in org.apache.beam.runners.direct
-
- DIRECTORY_CONTAINER - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
-
Shard is a file within a directory.
- DirectRegistrar - Class in org.apache.beam.runners.direct
-
- DirectRegistrar.Options - Class in org.apache.beam.runners.direct
-
- DirectRegistrar.Runner - Class in org.apache.beam.runners.direct
-
- DirectRunner - Class in org.apache.beam.runners.direct
-
- DirectRunner() - Constructor for class org.apache.beam.sdk.options.PipelineOptions.DirectRunner
-
- DirectRunner.DirectPipelineResult - Class in org.apache.beam.runners.direct
-
- DirectStreamObserver<T> - Class in org.apache.beam.sdk.fn.stream
-
A StreamObserver
which uses synchronization on the underlying CallStreamObserver
to provide thread safety.
- DirectStreamObserver(Phaser, CallStreamObserver<T>) - Constructor for class org.apache.beam.sdk.fn.stream.DirectStreamObserver
-
- DirectTestOptions - Interface in org.apache.beam.runners.direct
-
Internal-only options for tweaking the behavior of the
DirectRunner
in ways that users
should never do.
- DISALLOW_COMBINER_LIFTING - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- discard() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator
-
- discardingFiredPanes() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
Returns a new Window
PTransform
that uses the registered WindowFn and
Triggering behavior, and that discards elements in a pane after they are triggered.
- dispatchBag(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchBag(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchCombining(Combine.CombineFn<?, ?, ?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchDefault() - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchMap(Coder<?>, Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchMap(Coder<?>, Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchOrderedList(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchSet(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchSet(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- dispatchValue(Coder<?>) - Method in interface org.apache.beam.sdk.state.StateSpec.Cases
-
- dispatchValue(Coder<?>) - Method in class org.apache.beam.sdk.state.StateSpec.Cases.WithDefault
-
- DISPLAY_DATA - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- DisplayData - Class in org.apache.beam.sdk.transforms.display
-
Static display data associated with a pipeline component.
- DisplayData.Builder - Interface in org.apache.beam.sdk.transforms.display
-
Utility to build up display data from a component and its included subcomponents.
- DisplayData.Identifier - Class in org.apache.beam.sdk.transforms.display
-
Unique identifier for a display data item within a component.
- DisplayData.Item - Class in org.apache.beam.sdk.transforms.display
-
Items
are the unit of display data.
- DisplayData.ItemSpec<T> - Class in org.apache.beam.sdk.transforms.display
-
- DisplayData.Path - Class in org.apache.beam.sdk.transforms.display
-
Structured path of registered display data within a component hierarchy.
- DisplayData.Type - Enum in org.apache.beam.sdk.transforms.display
-
Display data type.
- Distinct<T> - Class in org.apache.beam.sdk.transforms
-
Distinct<T>
takes a PCollection<T>
and returns a PCollection<T>
that has
all distinct elements of the input.
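A minimal usage sketch, assuming a hypothetical PCollection<String> named words:

      PCollection<String> uniqueWords = words.apply(Distinct.<String>create());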
- Distinct() - Constructor for class org.apache.beam.sdk.transforms.Distinct
-
- Distinct.WithRepresentativeValues<T,IdT> - Class in org.apache.beam.sdk.transforms
-
- Distribution - Interface in org.apache.beam.sdk.metrics
-
A metric that reports information about the distribution of reported values.
- distribution(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
- distribution(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that records various statistics about the distribution of reported values.
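A minimal sketch inside a hypothetical DoFn; the namespace class and metric name are illustrative:

      static class MeasureLineLengthsFn extends DoFn<String, String> {
        private final Distribution lineLengths =
            Metrics.distribution(MeasureLineLengthsFn.class, "lineLengths");

        @ProcessElement
        public void process(@Element String line, OutputReceiver<String> out) {
          lineLengths.update(line.length()); // records min/max/sum/count of observed lengths
          out.output(line);
        }
      }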
- DistributionImpl - Class in org.apache.beam.runners.jet.metrics
-
- DistributionImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.DistributionImpl
-
- DistributionResult - Class in org.apache.beam.sdk.metrics
-
- DistributionResult() - Constructor for class org.apache.beam.sdk.metrics.DistributionResult
-
- divideBy(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- DLP_PAYLOAD_LIMIT_BYTES - Static variable in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- DLPDeidentifyText - Class in org.apache.beam.sdk.extensions.ml
-
A
PTransform
connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and
deidentifying text according to provided settings.
- DLPDeidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- DLPDeidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
-
- DLPInspectText - Class in org.apache.beam.sdk.extensions.ml
-
A
PTransform
connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and
inspecting text for identifying data according to provided settings.
- DLPInspectText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- DLPInspectText.Builder - Class in org.apache.beam.sdk.extensions.ml
-
- DLPReidentifyText - Class in org.apache.beam.sdk.extensions.ml
-
A
PTransform
connecting to Cloud DLP (https://cloud.google.com/dlp/docs/libraries) and
re-identifying text according to provided settings.
- DLPReidentifyText() - Constructor for class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- DLPReidentifyText.Builder - Class in org.apache.beam.sdk.extensions.ml
-
- DlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
- DlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
-
- doChecks(PAssert.PAssertionSite, ActualT, SerializableFunction<ActualT, Void>) - Static method in class org.apache.beam.sdk.testing.PAssert
-
- DockerEnvironmentFactory - Class in org.apache.beam.runners.fnexecution.environment
-
- DockerEnvironmentFactory.Provider - Class in org.apache.beam.runners.fnexecution.environment
-
Provider for DockerEnvironmentFactory.
- docToBulk() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO
-
- DocToBulk() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocToBulk
-
- Document() - Constructor for class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
- DoFn<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
The argument to
ParDo
providing the code to use to process elements of the input
PCollection
.
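A minimal sketch of a DoFn applied with ParDo; the class and variable names are illustrative:

      static class ExtractWordsFn extends DoFn<String, String> {
        @ProcessElement
        public void processElement(@Element String line, OutputReceiver<String> out) {
          for (String word : line.split("\\s+")) {
            if (!word.isEmpty()) {
              out.output(word); // emit one output element per word
            }
          }
        }
      }

      PCollection<String> words = lines.apply(ParDo.of(new ExtractWordsFn()));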
- DoFn() - Constructor for class org.apache.beam.sdk.transforms.DoFn
-
- DoFn.AlwaysFetched - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for declaring that a state parameter is always fetched.
- DoFn.BoundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.BundleFinalizer - Interface in org.apache.beam.sdk.transforms
-
A parameter that is accessible during
@StartBundle
,
@ProcessElement
and
@FinishBundle
that allows the caller
to register a callback that will be invoked after the bundle has been successfully completed
and the runner has committed the output.
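A minimal sketch of registering a finalization callback inside a hypothetical @ProcessElement method; ackMessage and the expiry duration are assumptions:

      @ProcessElement
      public void process(@Element String message, DoFn.BundleFinalizer finalizer) {
        finalizer.afterBundleCommit(
            Instant.now().plus(Duration.standardMinutes(5)), // how long the callback stays valid
            () -> ackMessage(message));                      // ackMessage is a hypothetical helper
      }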
- DoFn.BundleFinalizer.Callback - Interface in org.apache.beam.sdk.transforms
-
An instance of a function that will be invoked after bundle finalization.
- DoFn.Element - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.FieldAccess - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for specifying specific fields that are accessed in a Schema PCollection.
- DoFn.FinishBundle - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to finish processing a batch of elements.
- DoFn.FinishBundleContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.GetInitialRestriction - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that maps an element to an initial restriction for a
splittable DoFn
.
- DoFn.GetInitialWatermarkEstimatorState - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that maps an element and restriction to initial watermark estimator
state for a
splittable DoFn
.
- DoFn.GetRestrictionCoder - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the coder to use for the restriction of a
splittable DoFn
.
- DoFn.GetSize - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the corresponding size for an element and restriction
pair.
- DoFn.GetWatermarkEstimatorStateCoder - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that returns the coder to use for the watermark estimator state of a
splittable DoFn
.
- DoFn.Key - Annotation Type in org.apache.beam.sdk.transforms
-
Parameter annotation for dereferencing input element key in
KV
pair.
- DoFn.MultiOutputReceiver - Interface in org.apache.beam.sdk.transforms
-
Receives tagged output for a multi-output function.
- DoFn.NewTracker - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.NewWatermarkEstimator - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.OnTimer - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for registering a callback for a timer.
- DoFn.OnTimerContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.OnTimerFamily - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for registering a callback for a timerFamily.
- DoFn.OnWindowExpiration - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use for performing actions on window expiration.
- DoFn.OnWindowExpirationContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.OutputReceiver<T> - Interface in org.apache.beam.sdk.transforms
-
Receives values of the given type.
- DoFn.ProcessContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.ProcessContinuation - Class in org.apache.beam.sdk.transforms
-
When used as a return value of
DoFn.ProcessElement
, indicates whether there is more work to
be done for the current element.
- DoFn.ProcessElement - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use for processing elements.
- DoFn.RequiresStableInput - Annotation Type in org.apache.beam.sdk.transforms
-
Experimental - no backwards compatibility guarantees.
- DoFn.RequiresTimeSortedInput - Annotation Type in org.apache.beam.sdk.transforms
-
Experimental - no backwards compatibility guarantees.
- DoFn.Restriction - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.Setup - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing bundles of elements.
- DoFn.SideInput - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.SplitRestriction - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that splits restriction of a
splittable DoFn
into multiple parts to
be processed in parallel.
- DoFn.StartBundle - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to prepare an instance for processing a batch of elements.
- DoFn.StartBundleContext - Class in org.apache.beam.sdk.transforms
-
- DoFn.StateId - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing state cells.
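A minimal sketch of a stateful DoFn (applied to a keyed PCollection); the state id and element types are illustrative:

      static class CountPerKeyFn extends DoFn<KV<String, String>, KV<String, Integer>> {
        @StateId("count")
        private final StateSpec<ValueState<Integer>> countSpec = StateSpecs.value(VarIntCoder.of());

        @ProcessElement
        public void process(
            @Element KV<String, String> element,
            @StateId("count") ValueState<Integer> count,
            OutputReceiver<KV<String, Integer>> out) {
          Integer current = count.read();
          int updated = (current == null ? 0 : current) + 1; // per-key running count
          count.write(updated);
          out.output(KV.of(element.getKey(), updated));
        }
      }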
- DoFn.Teardown - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method to use to clean up this instance before it is discarded.
- DoFn.TimerFamily - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.TimerId - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for declaring and dereferencing timers.
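A minimal sketch of an event-time timer in a stateful DoFn; the timer id and duration are illustrative:

      static class ExpiryFn extends DoFn<KV<String, String>, String> {
        @TimerId("expiry")
        private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

        @ProcessElement
        public void process(ProcessContext c, @TimerId("expiry") Timer expiry) {
          // (Re)arm the timer ten minutes past the current element's timestamp.
          expiry.offset(Duration.standardMinutes(10)).setRelative();
        }

        @OnTimer("expiry")
        public void onExpiry(OnTimerContext c) {
          c.output("expired at " + c.timestamp());
        }
      }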
- DoFn.Timestamp - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.TruncateRestriction - Annotation Type in org.apache.beam.sdk.transforms
-
Annotation for the method that truncates the restriction of a
splittable DoFn
into a bounded one.
- DoFn.UnboundedPerElement - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.WatermarkEstimatorState - Annotation Type in org.apache.beam.sdk.transforms
-
- DoFn.WindowedContext - Class in org.apache.beam.sdk.transforms
-
Information accessible to all methods in this
DoFn
where the context is in some window.
- DoFnFunction<InputT,OutputT> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch
-
Encapsulates a
DoFn
inside a Spark
MapPartitionsFunction
.
- DoFnFunction(MetricsContainerStepMapAccumulator, String, DoFn<InputT, OutputT>, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, SerializablePipelineOptions, List<TupleTag<?>>, TupleTag<OutputT>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, SideInputBroadcast, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.DoFnFunction
-
- DoFnFunction<OutputT,InputT> - Class in org.apache.beam.runners.twister2.translators.functions
-
DoFn function.
- DoFnFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
-
- DoFnFunction(Twister2TranslationContext, DoFn<InputT, OutputT>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, List<TupleTag<?>>, WindowingStrategy<?, ?>, Map<PCollectionView<?>, WindowingStrategy<?, ?>>, TupleTag<OutputT>, DoFnSchemaInformation, Map<TupleTag<?>, Integer>, Map<String, PCollectionView<?>>) - Constructor for class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
-
- DoFnOutputReceivers - Class in org.apache.beam.sdk.transforms
-
- DoFnOutputReceivers() - Constructor for class org.apache.beam.sdk.transforms.DoFnOutputReceivers
-
- DoFnRunnerWithMetricsUpdate<InputT,OutputT> - Class in org.apache.beam.runners.flink.metrics
-
DoFnRunner
decorator which registers MetricsContainerImpl
.
- DoFnRunnerWithMetricsUpdate(String, DoFnRunner<InputT, OutputT>, FlinkMetricContainer) - Constructor for class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
-
- DoFnSchemaInformation - Class in org.apache.beam.sdk.transforms
-
Represents information about how a DoFn extracts schemas.
- DoFnSchemaInformation() - Constructor for class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
- DoFnSchemaInformation.Builder - Class in org.apache.beam.sdk.transforms
-
The builder object.
- DoFnTester<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
- DoFnTester.CloningBehavior - Enum in org.apache.beam.sdk.transforms
-
- done() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
-
- done(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
-
- dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- dotExpression() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
-
- dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- dotExpressionComponent() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
-
- dotExpressionComponent(int) - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
-
- DotExpressionComponentContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
-
- DotExpressionComponentContext() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
-
- DotExpressionContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
-
- DOUBLE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- DOUBLE - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of double fields.
- DOUBLE_NAN_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
-
- DOUBLE_NEGATIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
-
- DOUBLE_POSITIVE_INF_WRAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
-
- DoubleCoder - Class in org.apache.beam.sdk.coders
-
A
DoubleCoder
encodes
Double
values in 8 bytes using a big-endian encoding.
- doubles() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform
that takes an input PCollection<Double>
and returns a
PCollection<Double>
whose contents is the maximum of the input PCollection
's
elements, or Double.NEGATIVE_INFINITY
if there are no elements.
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform
that takes an input PCollection<Double>
and returns a
PCollection<Double>
whose contents is the minimum of the input PCollection
's
elements, or Double.POSITIVE_INFINITY
if there are no elements.
- doublesGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform
that takes an input PCollection<Double>
and returns a
PCollection<Double>
whose contents is the sum of the input PCollection
's
elements, or 0
if there are no elements.
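A minimal sketch, assuming a hypothetical PCollection<Double> named latencies:

      PCollection<Double> maxLatency = latencies.apply(Max.doublesGlobally());
      PCollection<Double> minLatency = latencies.apply(Min.doublesGlobally());
      PCollection<Double> totalLatency = latencies.apply(Sum.doublesGlobally());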
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform
that takes an input PCollection<KV<K, Double>>
and returns
a PCollection<KV<K, Double>>
that contains an output element mapping each distinct key
in the input PCollection
to the maximum of the values associated with that key in the
input PCollection
.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform
that takes an input PCollection<KV<K, Double>>
and returns
a PCollection<KV<K, Double>>
that contains an output element mapping each distinct key
in the input PCollection
to the minimum of the values associated with that key in the
input PCollection
.
- doublesPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform
that takes an input PCollection<KV<K, Double>>
and returns
a PCollection<KV<K, Double>>
that contains an output element mapping each distinct key
in the input PCollection
to the sum of the values associated with that key in the input
PCollection
.
- doubleToByteArray(double) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
-
- drive() - Method in interface org.apache.beam.runners.local.ExecutionDriver
-
- DriverConfiguration() - Constructor for class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
-
- dropExpiredTimers(SparkTimerInternals, WindowingStrategy<?, W>) - Static method in class org.apache.beam.runners.spark.util.TimerUtils
-
- DropFields - Class in org.apache.beam.sdk.schemas.transforms
-
A transform to drop fields from a schema.
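A minimal sketch, assuming a hypothetical schema-aware PCollection named purchases with an internalId field:

      PCollection<Row> trimmed = purchases.apply(DropFields.fields("internalId"));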
- DropFields() - Constructor for class org.apache.beam.sdk.schemas.transforms.DropFields
-
- DropFields.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation class for DropFields.
- dropTable(SqlParserPos, boolean, SqlIdentifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes
-
Creates a DROP TABLE.
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
-
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
-
- dropTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Drops a table.
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
-
- dropTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
-
- dryRunQuery(String, JobConfigurationQuery, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Dry runs the query in the given project.
- dryRunQuery(String, JobConfigurationQuery, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
-
- DurationCoder - Class in org.apache.beam.sdk.coders
-
- DurationConvert() - Constructor for class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.DurationConvert
-
- durationMilliSec - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
-
- DynamicAvroDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- DynamicAvroDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicAvroDestinations
-
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
- DynamicDestinations<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This class provides the most general way of specifying dynamic BigQuery table destinations.
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
- DynamicFileDestinations - Class in org.apache.beam.sdk.io
-
- DynamicFileDestinations() - Constructor for class org.apache.beam.sdk.io.DynamicFileDestinations
-
- DynamicProtoCoder - Class in org.apache.beam.sdk.extensions.protobuf
-
A
Coder
using Google Protocol Buffers binary format.
- DynamoDbClientProvider - Interface in org.apache.beam.sdk.io.aws2.dynamodb
-
- DynamoDBIO - Class in org.apache.beam.sdk.io.aws.dynamodb
-
- DynamoDBIO() - Constructor for class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO
-
Deprecated.
- DynamoDBIO - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
IO to read from and write to
DynamoDB tables.
- DynamoDBIO() - Constructor for class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO
-
- DynamoDBIO.Read<T> - Class in org.apache.beam.sdk.io.aws.dynamodb
-
Deprecated.
Read data from DynamoDB and return ScanResult.
- DynamoDBIO.Read<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
- DynamoDBIO.RetryConfiguration - Class in org.apache.beam.sdk.io.aws.dynamodb
-
Deprecated.
A POJO encapsulating a configuration for retry behavior when issuing requests to DynamoDB.
- DynamoDBIO.RetryConfiguration - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
- DynamoDBIO.RetryConfiguration.Builder - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
Deprecated.
- DynamoDBIO.Write<T> - Class in org.apache.beam.sdk.io.aws.dynamodb
-
Deprecated.
Write a PCollection of data into DynamoDB.
- DynamoDBIO.Write<T> - Class in org.apache.beam.sdk.io.aws2.dynamodb
-
Write a PCollection of data into DynamoDB.
- factory - Variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteFactoryWrapper
-
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner
-
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser
-
Parser factory.
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- Factory() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel.Factory
-
- FACTORY - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Factory for creating Pubsub clients using gRPC transport.
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Factory for creating Pubsub clients using Json transport.
- Factory<T> - Interface in org.apache.beam.sdk.schemas
-
A Factory interface for schema-related objects for a specific Java type.
- fail(Throwable) - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Deprecated.
- fail(Throwable) - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
-
- fail(Throwable) - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
-
Deprecated.
Mark the client as completed with an exception.
- failed(Exception) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that a failure has occurred.
- failed(Error) - Method in interface org.apache.beam.runners.local.PipelineMessageReceiver
-
Report that a failure has occurred.
- FAILED - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
-
The tag for the failed writes to the HL7v2 store.
- FAILED_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for the failed writes to FHIR store.
- FAILED_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
The TupleTag used for bundles that failed to be executed for any reason.
- FAILED_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for the files that failed to import to the FHIR store.
- FAILED_WRITES - Static variable in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
-
- FailedRunningPipelineResults - Class in org.apache.beam.runners.jet
-
Alternative implementation of
PipelineResult
used to avoid throwing Exceptions in certain
situations.
- FailedRunningPipelineResults(RuntimeException) - Constructor for class org.apache.beam.runners.jet.FailedRunningPipelineResults
-
- FailedWritesException(List<FirestoreV1.WriteFailure>) - Constructor for exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
-
- failOnInsert(Map<TableRow, List<TableDataInsertAllResponse.InsertErrors>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
Cause a given
TableRow
object to fail when it's inserted.
- FailsafeValueInSingleWindow<T,ErrorT> - Class in org.apache.beam.sdk.values
-
An immutable tuple of value, timestamp, window, and pane.
- FailsafeValueInSingleWindow() - Constructor for class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
- FailsafeValueInSingleWindow.Coder<T,ErrorT> - Class in org.apache.beam.sdk.values
-
- failure(String, String, Metadata, Throwable) - Static method in class org.apache.beam.sdk.io.tika.ParseResult
-
- Failure - Class in org.apache.beam.sdk.schemas.io
-
A generic failure of an SQL transform.
- Failure() - Constructor for class org.apache.beam.sdk.schemas.io.Failure
-
- failure(PAssert.PAssertionSite, Throwable) - Static method in class org.apache.beam.sdk.testing.SuccessOrFailure
-
- Failure.Builder - Class in org.apache.beam.sdk.schemas.io
-
- FAILURE_COUNTER - Static variable in class org.apache.beam.sdk.testing.PAssert
-
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAICreateCatalogItem
-
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportCatalogItems
-
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIImportUserEvents
-
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIPredict
-
- FAILURE_TAG - Static variable in class org.apache.beam.sdk.extensions.ml.RecommendationAIWriteUserEvent
-
- failures() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
- failuresTo(List<PCollection<FailureElementT>>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
Adds the failure collection to the passed list and returns just the output collection.
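A minimal sketch following the WithFailures pattern; the element types and the failing lambda are illustrative:

      List<PCollection<String>> failures = new ArrayList<>();
      PCollection<Integer> results =
          words
              .apply(
                  MapElements.into(TypeDescriptors.integers())
                      .via((String word) -> 100 / word.length()) // may throw ArithmeticException
                      .exceptionsInto(TypeDescriptors.strings())
                      .exceptionsVia(ee -> ee.exception().getMessage()))
              .failuresTo(failures); // returns just the output collection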
- FakeBigQueryServerStream(List<T>) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
-
- FakeBigQueryServices - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's query service.
- FakeBigQueryServices() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
-
- FakeBigQueryServices.FakeBigQueryServerStream<T> - Class in org.apache.beam.sdk.io.gcp.testing
-
An implementation of BigQueryServerStream
which takes a List
as the Iterable
to simulate a server stream.
- FakeDatasetService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake dataset service that can be serialized, for use in testReadFromTable.
- FakeDatasetService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- FakeJobService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's job service.
- FakeJobService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
-
- FakeJobService(int) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
-
- features() - Method in class org.apache.beam.sdk.extensions.ml.AnnotateText
-
- fetchDataflowJobId() - Static method in class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
-
- FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
-
Instantiates a new Fetch HL7v2 message DoFn.
- fewKeys() - Method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns whether it groups just few keys.
- FhirBundleParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- FhirBundleParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
- FhirBundleResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
-
- FhirBundleResponse() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
- FhirIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
- FhirIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
- FhirIO.Deidentify - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Deidentify FHIR resources from a FHIR store to a destination FHIR store.
- FhirIO.Deidentify.DeidentifyFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules a deidentify operation and monitors the status.
- FhirIO.ExecuteBundles - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Execute bundles.
- FhirIO.ExecuteBundlesResult - Class in org.apache.beam.sdk.io.gcp.healthcare
-
ExecuteBundlesResult contains both successfully executed bundles and information to help debug
failed executions (e.g., metadata and error messages).
- FhirIO.Export - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Export FHIR resources from a FHIR store to newline-delimited JSON files on GCS or BigQuery.
- FhirIO.Export.ExportResourcesFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules an export operation and monitors the status.
- FhirIO.Import - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Writes each bundle of elements to a new-line delimited JSON file on GCS and issues a
fhirStores.import Request for that file.
- FhirIO.Import.ContentStructure - Enum in org.apache.beam.sdk.io.gcp.healthcare
-
The enum Content structure.
- FhirIO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Read.
- FhirIO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result.
- FhirIO.Search<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Search.
- FhirIO.Search.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
- FhirIO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Write.
- FhirIO.Write.AbstractResult - Class in org.apache.beam.sdk.io.gcp.healthcare
-
- FhirIO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Result.
- FhirIO.Write.WriteMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
-
The enum Write method.
- FhirIOPatientEverything - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type FhirIOPatientEverything for querying a FHIR Patient resource's compartment.
- FhirIOPatientEverything() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
- FhirIOPatientEverything.PatientEverythingParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
PatientEverythingParameter defines required attributes for a FHIR GetPatientEverything request
in
FhirIOPatientEverything
.
- FhirIOPatientEverything.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
- FhirResourcePagesIterator(HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod, HealthcareApiClient, String, String, String, Map<String, Object>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
-
- FhirSearchParameter<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirSearchParameter represents the query parameters for a FHIR search request, used as a
parameter for
FhirIO.Search
.
- FhirSearchParameterCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
- fhirStoresImport(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
Import method for batch writing resources.
- fhirStoresImport(String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
- fhirStoresImport(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
- field(Row, int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfNestedStringBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ArrayOfStringBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.ByteBufferBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.BytesBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.DateTimeBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfIntBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.MapOfNestedIntBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedBytesBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.NestedIntBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBuilderBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
-
- field - Variable in class org.apache.beam.sdk.jmh.schemas.RowBundles.StringBundle.Field
-
- Field() - Constructor for class org.apache.beam.sdk.schemas.Schema.Field
-
- field(String, Schema.FieldType) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
-
Add a new field of the specified type.
- field(String, Schema.FieldType, Object) - Method in class org.apache.beam.sdk.schemas.transforms.AddFields.Inner
-
Add a new field of the specified type.
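A minimal sketch, assuming a hypothetical schema-aware input named rows; the field names and defaults are illustrative:

      PCollection<Row> augmented =
          rows.apply(
              AddFields.<Row>create()
                  .field("region", Schema.FieldType.STRING)         // added as nullable, value null
                  .field("retryCount", Schema.FieldType.INT32, 0)); // added with default value 0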
- fieldAccess(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
- FieldAccessDescriptor - Class in org.apache.beam.sdk.schemas
-
Used inside of a
DoFn
to describe which fields in a schema
type need to be accessed for processing.
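A minimal sketch of declaring field access inside a DoFn; the access id and field names are illustrative:

      static class UserFieldsFn extends DoFn<Row, Row> {
        @FieldAccess("userFields")
        final FieldAccessDescriptor userFields =
            FieldAccessDescriptor.withFieldNames("userId", "country");

        @ProcessElement
        public void process(@FieldAccess("userFields") Row row, OutputReceiver<Row> out) {
          out.output(row); // the runner may optimize to read only the declared fields
        }
      }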
- FieldAccessDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
- fieldAccessDescriptor(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field access descriptor.
- FieldAccessDescriptor.FieldDescriptor - Class in org.apache.beam.sdk.schemas
-
Description of a single field.
- FieldAccessDescriptor.FieldDescriptor.Builder - Class in org.apache.beam.sdk.schemas
-
Builder class.
- FieldAccessDescriptor.FieldDescriptor.ListQualifier - Enum in org.apache.beam.sdk.schemas
-
Qualifier for a list selector.
- FieldAccessDescriptor.FieldDescriptor.MapQualifier - Enum in org.apache.beam.sdk.schemas
-
Qualifier for a map selector.
- FieldAccessDescriptor.FieldDescriptor.Qualifier - Class in org.apache.beam.sdk.schemas
-
OneOf union for a collection selector.
- FieldAccessDescriptor.FieldDescriptor.Qualifier.Kind - Enum in org.apache.beam.sdk.schemas
-
The kind of qualifier.
- FieldAccessDescriptorParser - Class in org.apache.beam.sdk.schemas.parser
-
Parser for textual field-access selector.
- FieldAccessDescriptorParser() - Constructor for class org.apache.beam.sdk.schemas.parser.FieldAccessDescriptorParser
-
- FieldDescriptor() - Constructor for class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
-
- fieldFromType(TypeDescriptor, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.StaticSchemaInference
-
Map a Java field type to a Beam Schema FieldType.
- fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field ids.
- fieldIds(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of top-level field ids from the row.
- fieldIdsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the field ids accessed.
- fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.CoGroup.By
-
Join by the following field names.
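A minimal sketch of a schema join; the tuple tags, input collections, and key field are illustrative:

      PCollection<Row> joined =
          PCollectionTuple.of("purchases", purchases)
              .and("users", users)
              .apply(CoGroup.join(CoGroup.By.fieldNames("userId")));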
- fieldNames(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Select a set of top-level field names from the row.
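A minimal sketch, assuming a hypothetical schema-aware PCollection named purchases with userId and cost fields:

      PCollection<Row> slim = purchases.apply(Select.fieldNames("userId", "cost"));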
- fieldNamesAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
Return the field names accessed.
- fields(String...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
-
- fields(Integer...) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
-
- fields(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.DropFields
-
- Fields() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Fields
-
- fields() - Static method in class org.apache.beam.sdk.state.StateKeySpec
-
- FieldsEqual() - Constructor for class org.apache.beam.sdk.schemas.transforms.Join.FieldsEqual
-
- fieldSpecifier() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- FieldSpecifierContext(ParserRuleContext, int) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
-
- FieldSpecifierNotationBaseListener - Class in org.apache.beam.sdk.schemas.parser.generated
-
This class provides an empty implementation of
FieldSpecifierNotationListener
,
which can be extended to create a listener which only needs to handle a subset
of the available methods.
- FieldSpecifierNotationBaseListener() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseListener
-
- FieldSpecifierNotationBaseVisitor<T> - Class in org.apache.beam.sdk.schemas.parser.generated
-
This class provides an empty implementation of
FieldSpecifierNotationVisitor
,
which can be extended to create a visitor which only needs to handle a subset
of the available methods.
- FieldSpecifierNotationBaseVisitor() - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationBaseVisitor
-
- FieldSpecifierNotationLexer - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationLexer(CharStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- FieldSpecifierNotationListener - Interface in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser(TokenStream) - Constructor for class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- FieldSpecifierNotationParser.ArrayQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.ArrayQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.DotExpressionComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.DotExpressionContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.FieldSpecifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.MapQualifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.MapQualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.QualifiedComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.QualifierListContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.QualifyComponentContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.SimpleIdentifierContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationParser.WildcardContext - Class in org.apache.beam.sdk.schemas.parser.generated
-
- FieldSpecifierNotationVisitor<T> - Interface in org.apache.beam.sdk.schemas.parser.generated
-
- FieldType() - Constructor for class org.apache.beam.sdk.schemas.Schema.FieldType
-
- FieldTypeDescriptors - Class in org.apache.beam.sdk.schemas
-
Utilities for converting between
Schema
field types and
TypeDescriptor
s that
define Java objects which can represent these field types.
- FieldTypeDescriptors() - Constructor for class org.apache.beam.sdk.schemas.FieldTypeDescriptors
-
- fieldTypeForJavaType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.FieldTypeDescriptors
-
- fieldUpdate(String, String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
-
- FieldValueGetter<ObjectT,ValueT> - Interface in org.apache.beam.sdk.schemas
-
For internal use only; no backwards-compatibility guarantees.
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
-
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
-
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
-
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AvroRecordSchema
-
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Implementing class should override to return FieldValueGetters.
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
-
- fieldValueGetters(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
-
- FieldValueSetter<ObjectT,ValueT> - Interface in org.apache.beam.sdk.schemas
-
For internal use only; no backwards-compatibility guarantees.
- FieldValueTypeInformation - Class in org.apache.beam.sdk.schemas
-
Represents type information for a Java type that will be used to infer a Schema type.
- FieldValueTypeInformation() - Constructor for class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- FieldValueTypeInformation.Builder - Class in org.apache.beam.sdk.schemas
-
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
-
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.io.thrift.ThriftSchema
-
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema
-
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.AvroRecordSchema
-
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
Implementing class should override to return a list of type-informations.
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema
-
- fieldValueTypeInformations(Class<?>, Schema) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema
-
- FieldValueTypeSupplier - Interface in org.apache.beam.sdk.schemas.utils
-
Supplies FieldValueTypeInformation for the fields of a Java type when inferring a schema.
- FILE_ARTIFACT_URN - Static variable in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- FileBasedReader(FileBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
Subclasses should not perform IO operations in the constructor.
- FileBasedSink<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
Abstract class for file-based output.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a
FileBasedSink
with the given temp directory, producing uncompressed files.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, FileBasedSink.WritableByteChannelFactory) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a
FileBasedSink
with the given temp directory and output channel type.
- FileBasedSink(ValueProvider<ResourceId>, FileBasedSink.DynamicDestinations<?, DestinationT, OutputT>, Compression) - Constructor for class org.apache.beam.sdk.io.FileBasedSink
-
Construct a
FileBasedSink
with the given temp directory and output channel type.
- FileBasedSink.CompressionType - Enum in org.apache.beam.sdk.io
-
- FileBasedSink.DynamicDestinations<UserT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- FileBasedSink.FilenamePolicy - Class in org.apache.beam.sdk.io
-
A naming policy for output files.
- FileBasedSink.FileResult<DestinationT> - Class in org.apache.beam.sdk.io
-
Result of a single bundle write.
- FileBasedSink.FileResultCoder<DestinationT> - Class in org.apache.beam.sdk.io
-
- FileBasedSink.OutputFileHints - Interface in org.apache.beam.sdk.io
-
Provides hints about how to generate output files, such as a suggested filename suffix (e.g.
- FileBasedSink.WritableByteChannelFactory - Interface in org.apache.beam.sdk.io
-
- FileBasedSink.WriteOperation<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
Abstract operation that manages the process of writing to
FileBasedSink
.
- FileBasedSink.Writer<DestinationT,OutputT> - Class in org.apache.beam.sdk.io
-
- FileBasedSource<T> - Class in org.apache.beam.sdk.io
-
A common base class for all file-based
Source
s.
- FileBasedSource(ValueProvider<String>, EmptyMatchTreatment, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a FileBasedSource
based on a file or a file pattern specification, with the given
strategy for treating filepatterns that do not match any files.
- FileBasedSource(ValueProvider<String>, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
- FileBasedSource(MatchResult.Metadata, long, long, long) - Constructor for class org.apache.beam.sdk.io.FileBasedSource
-
Create a FileBasedSource
based on a single file.
- FileBasedSource.FileBasedReader<T> - Class in org.apache.beam.sdk.io
-
A
reader
that implements code common to readers of
FileBasedSource
s.
- FileBasedSource.Mode - Enum in org.apache.beam.sdk.io
-
A given FileBasedSource
represents a file resource of one of these types.
- FileChecksumMatcher - Class in org.apache.beam.sdk.testing
-
Matcher to verify checksum of the contents of a ShardedFile in an E2E test.
- fileContentsHaveChecksum(String) - Static method in class org.apache.beam.sdk.testing.FileChecksumMatcher
-
- FileIO - Class in org.apache.beam.sdk.io
-
General-purpose transforms for working with files: listing files (matching), reading and writing.
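A minimal sketch of matching and reading files; the filepattern is illustrative:

      PCollection<FileIO.ReadableFile> files =
          p.apply(FileIO.match().filepattern("gs://my-bucket/logs/*.gz"))
              .apply(FileIO.readMatches().withCompression(Compression.GZIP));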
- FileIO() - Constructor for class org.apache.beam.sdk.io.FileIO
-
- FileIO.Match - Class in org.apache.beam.sdk.io
-
- FileIO.MatchAll - Class in org.apache.beam.sdk.io
-
- FileIO.MatchConfiguration - Class in org.apache.beam.sdk.io
-
Describes configuration for matching filepatterns, such as
EmptyMatchTreatment
and
continuous watching for matching files.
- FileIO.ReadableFile - Class in org.apache.beam.sdk.io
-
A utility class for accessing a potentially compressed file.
- FileIO.ReadMatches - Class in org.apache.beam.sdk.io
-
- FileIO.ReadMatches.DirectoryTreatment - Enum in org.apache.beam.sdk.io
-
Enum to control how directories are handled.
- FileIO.Sink<ElementT> - Interface in org.apache.beam.sdk.io
-
- FileIO.Write<DestinationT,UserT> - Class in org.apache.beam.sdk.io
-
- FileIO.Write.FileNaming - Interface in org.apache.beam.sdk.io
-
A policy for generating names for shard files.
- FilenamePolicy() - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FilenamePolicy
-
- filepattern(String) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
Matches the given filepattern.
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.FileIO.Match
-
- filepattern(String) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
-
Matches the given filepattern.
- filepattern(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.tika.TikaIO.Parse
-
- FileReporter - Class in org.apache.beam.runners.flink.metrics
-
Flink metrics reporter
for writing
metrics to a file specified via the "metrics.reporter.file.path" config key (assuming an alias of
"file" for this reporter in the "metrics.reporters" setting).
- FileReporter() - Constructor for class org.apache.beam.runners.flink.metrics.FileReporter
-
- FileResult(ResourceId, int, BoundedWindow, PaneInfo, DestinationT) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- FileResultCoder(Coder<BoundedWindow>, Coder<DestinationT>) - Constructor for class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- fileSize(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the file size from GCS or throws FileNotFoundException
if the resource does not
exist.
- FileStagingOptions - Interface in org.apache.beam.sdk.options
-
File staging related options.
- FileSystem<ResourceIdT extends ResourceId> - Class in org.apache.beam.sdk.io
-
File system interface in Beam.
- FileSystem() - Constructor for class org.apache.beam.sdk.io.FileSystem
-
- FileSystemRegistrar - Interface in org.apache.beam.sdk.io
-
- FileSystems - Class in org.apache.beam.sdk.io
-
- FileSystems() - Constructor for class org.apache.beam.sdk.io.FileSystems
-
- FileSystemUtils - Class in org.apache.beam.sdk.io
-
- FileSystemUtils() - Constructor for class org.apache.beam.sdk.io.FileSystemUtils
-
- FillGaps<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
-
Fill gaps in timeseries.
- FillGaps() - Constructor for class org.apache.beam.sdk.extensions.timeseries.FillGaps
-
- FillGaps.FillGapsDoFn<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
-
- FillGaps.InterpolateData<ValueT> - Class in org.apache.beam.sdk.extensions.timeseries
-
Argument to withInterpolateFunction function.
- Filter - Class in org.apache.beam.sdk.schemas.transforms
-
A
PTransform
for filtering a collection of schema types.
- Filter() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter
-
- Filter<T> - Class in org.apache.beam.sdk.transforms
-
PTransform
s for filtering from a PCollection
the elements satisfying a predicate,
or satisfying an inequality with a given value based on the elements' natural ordering.
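A minimal sketch of org.apache.beam.sdk.transforms.Filter; the input collections are illustrative:

      PCollection<Integer> positives = numbers.apply(Filter.greaterThan(0));
      PCollection<String> longWords = words.apply(Filter.by((String w) -> w.length() > 5));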
- Filter.Inner<T> - Class in org.apache.beam.sdk.schemas.transforms
-
Implementation of the filter.
- filterCharacters(String) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
-
- finalizeAllOutstandingBundles() - Method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers.InMemoryFinalizer
-
All finalization requests will be sent without waiting for the responses.
- finalizeCheckpoint() - Method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
-
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
-
- finalizeCheckpoint() - Method in interface org.apache.beam.sdk.io.UnboundedSource.CheckpointMark
-
Called by the system to signal that this checkpoint mark has been committed along with all
the records which have been read from the
UnboundedSource.UnboundedReader
since the previous
checkpoint was taken.
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.UnboundedSource.CheckpointMark.NoopCheckpointMark
-
- finalizeDestination(DestinationT, BoundedWindow, Integer, Collection<FileBasedSink.FileResult<DestinationT>>) - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
- finalizeWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Finalize a write stream.
- finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- find(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(String, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(Pattern, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- find(Pattern, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- Find(Pattern, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.Find
-
- findAll(String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findAll(Pattern) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- FindAll(Pattern) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindAll
-
- findAvailablePort() - Static method in class org.apache.beam.sdk.extensions.python.PythonService
-
- findDateTimePattern(String) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
-
- findDateTimePattern(String, ImmutableMap<Enum, DateTimeFormatter>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
-
- findKV(String, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findKV(Pattern, int, int) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findKV(String, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- findKV(Pattern, String, String) - Static method in class org.apache.beam.sdk.transforms.Regex
-
- FindKV(Pattern, int, int) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindKV
-
- FindName(Pattern, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindName
-
- FindNameKV(Pattern, String, String) - Constructor for class org.apache.beam.sdk.transforms.Regex.FindNameKV
-
- FindQuery - Class in org.apache.beam.sdk.io.mongodb
-
Builds a MongoDB FindQuery object.
- FindQuery() - Constructor for class org.apache.beam.sdk.io.mongodb.FindQuery
-
- finishBundle() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
-
- finishBundle(DoFn<Iterable<KV<TableDestination, WriteTables.Result>>, Iterable<KV<TableDestination, WriteTables.Result>>>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
-
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
-
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
-
- finishBundle() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- finishBundle(DoFn<T, KV<Long, PCollectionViews.ValueOrMetadata<T, OffsetRange>>>.FinishBundleContext) - Method in class org.apache.beam.sdk.transforms.View.ToListViewDoFn
-
- FinishBundleContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
-
- finishSpecifying() - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
-
After building, finalizes this
PValue
to make it ready for running.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.PValue
-
After building, finalizes this
PValue
to make it ready for being used as an input to a
PTransform
.
- finishSpecifying(PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.WriteFilesResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollection
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionList
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PDone
-
Does nothing; there is nothing to finish specifying.
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in interface org.apache.beam.sdk.values.POutput
-
As part of applying the producing
PTransform
, finalizes this output to make it ready
for being used as an input and for running.
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.values.PValueBase
-
- finishWrite() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
- fireEligibleTimers(InMemoryTimerInternals, Map<KV<String, String>, FnDataReceiver<Timer>>, Object) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Fires all timers which are ready to be fired.
- FirestoreIO - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreOptions - Interface in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1 - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.BatchGetDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.BatchGetDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.BatchWriteWithDeadLetterQueue - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.BatchWriteWithDeadLetterQueue.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.BatchWriteWithSummary - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.BatchWriteWithSummary.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.FailedWritesException - Exception in org.apache.beam.sdk.io.gcp.firestore
-
Exception that is thrown if one or more Write
s are unsuccessful
with a non-retryable status code.
- FirestoreV1.ListCollectionIds - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.ListCollectionIds.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.ListDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.ListDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.PartitionQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.PartitionQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.Read - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for read operations.
- FirestoreV1.RunQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.RunQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
- FirestoreV1.Write - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for write operations.
- FirestoreV1.WriteFailure - Class in org.apache.beam.sdk.io.gcp.firestore
-
Failure details for an attempted Write
.
- FirestoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.firestore
-
Summary object produced when a number of writes are successfully written to Firestore in a
single BatchWrite.
- fireTimestamp() - Method in class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
-
Returns the firing timestamp of the current timer.
- first - Variable in class org.apache.beam.sdk.transforms.PeriodicSequence.SequenceDefinition
-
- fixDefaults() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
Fixes all the defaults so that equals can be used to check that two strategies are the same,
regardless of the state of "defaulted-ness".
- FIXED_WINDOW_TVF - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.TVFStreamingUtils
-
- FixedBytes - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A LogicalType representing a fixed-size byte array.
- fixedSizeGlobally(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform
that takes a PCollection<T>
, selects sampleSize
elements, uniformly at random, and returns a PCollection<Iterable<T>>
containing the
selected elements.
- fixedSizePerKey(int) - Static method in class org.apache.beam.sdk.transforms.Sample
-
Returns a PTransform
that takes an input PCollection<KV<K, V>>
and returns a
PCollection<KV<K, Iterable<V>>>
that contains an output element mapping each distinct
key in the input PCollection
to a sample of sampleSize
values associated with
that key in the input PCollection
, taken uniformly at random.
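A minimal usage sketch of both sampling transforms, assuming hypothetical pre-existing inputs lines (PCollection<String>) and clicksPerUser (PCollection<KV<String, Long>>):
    import org.apache.beam.sdk.transforms.Sample;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Up to 10 elements from the whole collection, chosen uniformly at random.
    PCollection<Iterable<String>> sampledLines =
        lines.apply(Sample.<String>fixedSizeGlobally(10));
    // Up to 3 values per key, chosen uniformly at random.
    PCollection<KV<String, Iterable<Long>>> sampledPerUser =
        clicksPerUser.apply(Sample.<String, Long>fixedSizePerKey(3));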
- fixedString(int) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- fixedStringSize() - Method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- FixedWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A
WindowFn
that windows values into fixed-size timestamp-based windows.
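A minimal usage sketch of fixed one-minute windows, assuming a hypothetical unbounded PCollection<String> named events:
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Assign each element to the one-minute window containing its timestamp.
    PCollection<String> windowed =
        events.apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));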
- flatMap(KV<K, Iterable<WindowedValue<V>>>, RecordCollector<WindowedValue<KV<K, Iterable<V>>>>) - Method in class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
-
- FlatMapElements<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
PTransform
s for mapping a simple function that returns iterables over the elements of a
PCollection
and merging the results.
- FlatMapElements.FlatMapWithFailures<InputT,OutputT,FailureT> - Class in org.apache.beam.sdk.transforms
-
- Flatten - Class in org.apache.beam.sdk.transforms
-
Flatten<T>
takes multiple PCollection<T>
s bundled into a PCollectionList<T>
and returns a single PCollection<T>
containing all the elements in
all the input PCollection
s.
- Flatten() - Constructor for class org.apache.beam.sdk.transforms.Flatten
-
- Flatten.Iterables<T> - Class in org.apache.beam.sdk.transforms
-
FlattenIterables<T>
takes a PCollection<Iterable<T>>
and returns a PCollection<T>
that contains all the elements from each iterable.
- Flatten.PCollections<T> - Class in org.apache.beam.sdk.transforms
-
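A minimal usage sketch covering the Flatten.PCollections and Flatten.Iterables entries above, assuming hypothetical inputs first and second (each a PCollection<String>) and nested (a PCollection<Iterable<String>>):
    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionList;

    // Merge several PCollections of the same type into one.
    PCollection<String> merged =
        PCollectionList.of(first).and(second).apply(Flatten.<String>pCollections());

    // Flatten a PCollection of Iterables into a PCollection of their elements.
    PCollection<String> flattened = nested.apply(Flatten.<String>iterables());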
- Flattened() - Constructor for class org.apache.beam.sdk.schemas.transforms.Select.Flattened
-
- flattenedSchema() - Static method in class org.apache.beam.sdk.schemas.transforms.Select
-
Selects every leaf-level field.
- FlattenP - Class in org.apache.beam.runners.jet.processors
-
Jet Processor
implementation for Beam's Flatten primitive.
- FlattenP.Supplier - Class in org.apache.beam.runners.jet.processors
-
Jet
Processor
supplier that will provide instances of
FlattenP
.
- flattenRel(RelStructuredTypeFlattener) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
-
- FlattenTranslatorBatch<T> - Class in org.apache.beam.runners.twister2.translators.batch
-
Flatten translator.
- FlattenTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.FlattenTranslatorBatch
-
- FlattenWithHeterogeneousCoders - Interface in org.apache.beam.sdk.testing
-
- FlinkBatchPortablePipelineTranslator - Class in org.apache.beam.runners.flink
-
A translator that translates bounded portable pipelines into executable Flink pipelines.
- FlinkBatchPortablePipelineTranslator.BatchTranslationContext - Class in org.apache.beam.runners.flink
-
Batch translation context.
- FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
-
Predicate to determine whether a URN is a Flink native transform.
- FlinkDetachedRunnerResult - Class in org.apache.beam.runners.flink
-
Result of a detached execution of a
Pipeline
with Flink.
- FlinkExecutionEnvironments - Class in org.apache.beam.runners.flink
-
Utilities for Flink execution environments.
- FlinkExecutionEnvironments() - Constructor for class org.apache.beam.runners.flink.FlinkExecutionEnvironments
-
- FlinkJobInvoker - Class in org.apache.beam.runners.flink
-
- FlinkJobInvoker(FlinkJobServerDriver.FlinkServerConfiguration) - Constructor for class org.apache.beam.runners.flink.FlinkJobInvoker
-
- FlinkJobServerDriver - Class in org.apache.beam.runners.flink
-
Driver program that starts a job server for the Flink runner.
- FlinkJobServerDriver.FlinkServerConfiguration - Class in org.apache.beam.runners.flink
-
Flink runner-specific Configuration for the jobServer.
- FlinkMetricContainer - Class in org.apache.beam.runners.flink.metrics
-
Helper class for holding a MetricsContainerImpl
and forwarding Beam metrics to Flink
accumulators and metrics.
- FlinkMetricContainer(RuntimeContext) - Constructor for class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
-
- FlinkMetricContainer.FlinkDistributionGauge - Class in org.apache.beam.runners.flink.metrics
-
- FlinkMetricContainer.FlinkGauge - Class in org.apache.beam.runners.flink.metrics
-
- FlinkMiniClusterEntryPoint - Class in org.apache.beam.runners.flink
-
Entry point for starting an embedded Flink cluster.
- FlinkMiniClusterEntryPoint() - Constructor for class org.apache.beam.runners.flink.FlinkMiniClusterEntryPoint
-
- FlinkPipelineOptions - Interface in org.apache.beam.runners.flink
-
Options which can be used to configure the Flink Runner.
- FlinkPipelineRunner - Class in org.apache.beam.runners.flink
-
- FlinkPipelineRunner(FlinkPipelineOptions, String, List<String>) - Constructor for class org.apache.beam.runners.flink.FlinkPipelineRunner
-
- FlinkPortableClientEntryPoint - Class in org.apache.beam.runners.flink
-
Flink job entry point to launch a Beam pipeline by executing an external SDK driver program.
- FlinkPortableClientEntryPoint(String) - Constructor for class org.apache.beam.runners.flink.FlinkPortableClientEntryPoint
-
- FlinkPortablePipelineTranslator<T extends FlinkPortablePipelineTranslator.TranslationContext> - Interface in org.apache.beam.runners.flink
-
Interface for portable Flink translators.
- FlinkPortablePipelineTranslator.Executor - Interface in org.apache.beam.runners.flink
-
A handle used to execute a translated pipeline.
- FlinkPortablePipelineTranslator.TranslationContext - Interface in org.apache.beam.runners.flink
-
The context used for pipeline translation.
- FlinkPortableRunnerResult - Class in org.apache.beam.runners.flink
-
Result of executing a portable
Pipeline
with Flink.
- FlinkRunner - Class in org.apache.beam.runners.flink
-
A
PipelineRunner
that executes the operations in the pipeline by first translating them
to a Flink Plan and then executing them either locally or on a Flink cluster, depending on the
configuration.
- FlinkRunner(FlinkPipelineOptions) - Constructor for class org.apache.beam.runners.flink.FlinkRunner
-
- FlinkRunnerRegistrar - Class in org.apache.beam.runners.flink
-
AutoService registrar - will register FlinkRunner and FlinkOptions as possible pipeline runner
services.
- FlinkRunnerRegistrar.Options - Class in org.apache.beam.runners.flink
-
Pipeline options registrar.
- FlinkRunnerRegistrar.Runner - Class in org.apache.beam.runners.flink
-
Pipeline runner registrar.
- FlinkRunnerResult - Class in org.apache.beam.runners.flink
-
Result of executing a
Pipeline
with Flink.
- FlinkServerConfiguration() - Constructor for class org.apache.beam.runners.flink.FlinkJobServerDriver.FlinkServerConfiguration
-
- FlinkStateBackendFactory - Interface in org.apache.beam.runners.flink
-
Constructs a StateBackend to use from Flink pipeline options.
- FlinkStreamingPortablePipelineTranslator - Class in org.apache.beam.runners.flink
-
Translate an unbounded portable pipeline representation into a Flink pipeline representation.
- FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform - Class in org.apache.beam.runners.flink
-
Predicate to determine whether a URN is a Flink native transform.
- FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext - Class in org.apache.beam.runners.flink
-
Streaming translation context.
- FLOAT - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- FLOAT - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of float fields.
- FLOAT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- FLOAT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- FloatCoder - Class in org.apache.beam.sdk.coders
-
A
FloatCoder
encodes
Float
values in 4 bytes using Java serialization.
- floats() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- floatToByteArray(float) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
-
- flush(boolean) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
-
- flush() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver2
-
- flush() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataOutboundObserver
-
Deprecated.
- flush() - Method in interface org.apache.beam.sdk.fn.data.CloseableFnDataReceiver
-
- flush() - Method in class org.apache.beam.sdk.io.AvroIO.Sink
-
- flush() - Method in interface org.apache.beam.sdk.io.FileIO.Sink
-
Flushes the buffered state (if any) before the channel is closed.
- flush(String, long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Flush a given stream up to the given offset.
- flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- flush() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
-
- flush() - Method in class org.apache.beam.sdk.io.TextIO.Sink
-
- flush() - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
-
- flush() - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
-
- flush() - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
-
- fn(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
- fn(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
- fn(Contextful.Fn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
- FnApiControlClient - Class in org.apache.beam.runners.fnexecution.control
-
A client for the control plane of an SDK harness, which can issue requests to it over the Fn API.
- FnApiControlClientPoolService - Class in org.apache.beam.runners.fnexecution.control
-
A Fn API control service which adds incoming SDK harness connections to a sink.
- FnDataReceiver<T> - Interface in org.apache.beam.sdk.fn.data
-
A receiver of streamed data.
- FnDataService - Interface in org.apache.beam.runners.fnexecution.data
-
The
FnDataService
is able to forward inbound elements to a consumer and is also a
consumer of outbound elements.
- FnService - Interface in org.apache.beam.sdk.fn.server
-
An interface sharing common behavior with services used during execution of user Fns.
- forBagUserStateHandlerFactory(ProcessBundleDescriptors.ExecutableProcessBundleDescriptor, StateRequestHandlers.BagUserStateHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
- forBatch(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
-
- forBoolean(Boolean) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forBytes() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder
for a
HllCount.Init
combining
PTransform
that
computes bytes-type HLL++ sketches.
- forClass(Class<?>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
to be used for serializing an instance of the supplied class
for transport via the Dataflow API.
- forClassName(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
to be used for serializing data to be deserialized using the
supplied class name, for transport via the Dataflow API.
- forCoder(TypeDescriptor<?>, Coder<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
-
Creates a
CoderProvider
that always returns the given coder for the specified type.
- forConsumer(LogicalEndpoint, FnDataReceiver<ByteString>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Deprecated.
- forConsumers(List<DataEndpoint<?>>, List<TimerEndpoint<?>>) - Static method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver2
-
Creates a receiver that is able to consume elements multiplexing on to the provided set of
endpoints.
- forDescriptor(ProtoDomain, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Create a new ProtoDynamicMessageSchema from a
ProtoDomain
and for a message.
- forDescriptor(ProtoDomain, Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
Create a new ProtoDynamicMessageSchema from a
ProtoDomain
and for a descriptor.
- forDescriptor(Endpoints.ApiServiceDescriptor) - Method in class org.apache.beam.sdk.fn.channel.ManagedChannelFactory
-
- forEncoding(ByteString) - Static method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
-
- forever(Trigger) - Static method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
-
Create a composite trigger that repeatedly executes the trigger repeated
, firing each
time it fires and ignoring any indications to finish.
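A minimal usage sketch of a repeatedly firing trigger, assuming a hypothetical unbounded PCollection<String> named events:
    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Fire a pane every time at least 100 elements arrive in a window, forever.
    PCollection<String> triggered =
        events.apply(
            Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
                .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(100)))
                .withAllowedLateness(Duration.ZERO)
                .discardingFiredPanes());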
- forField(Field, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- forFloat(Float) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forFloat(Double) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forGetter(Method, int) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- forHandler(RunnerApi.Environment, InstructionRequestHandler) - Static method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
- forInteger(Long) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forInteger(Integer) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forIntegers() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder
for a
HllCount.Init
combining
PTransform
that
computes integer-type HLL++ sketches.
- forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
-
- forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
-
- forIterableSideInput(String, String, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
-
- forKnownType(Object) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value of a well-known cloud object
type.
- forLongs() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder
for a
HllCount.Init
combining
PTransform
that
computes long-type HLL++ sketches.
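A minimal usage sketch, assuming a hypothetical PCollection<Long> named ids; HllCount.Init builds the HLL++ sketches and HllCount.Extract (listed elsewhere in this index) reads the cardinality estimate back out:
    import org.apache.beam.sdk.extensions.zetasketch.HllCount;
    import org.apache.beam.sdk.values.PCollection;

    // Build a single HLL++ sketch over the whole collection.
    PCollection<byte[]> sketch = ids.apply(HllCount.Init.forLongs().globally());
    // Extract the approximate distinct count from the sketch.
    PCollection<Long> approxDistinct = sketch.apply(HllCount.Extract.globally());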
- FORMAT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- FormatAsTextFn() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.FormatAsTextFn
-
- formatRecord(ElementT, Schema) - Method in interface org.apache.beam.sdk.io.AvroIO.RecordFormatter
-
Deprecated.
- formatRecord(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Convert an input record type into the output type.
- formatTimestamp(Instant) - Static method in class org.apache.beam.sdk.transforms.windowing.BoundedWindow
-
- formatTimestampWithTimeZone(DateTime) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.DateTimeUtils
-
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.SideInputHandlerFactory
-
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
-
- forMultimapSideInput(String, String, KvCoder<K, V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
-
- forNewInput(Instant, InputT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the
Watch
transform to create a new independent termination state for a
newly arrived
InputT
.
- forOneOf(String, boolean, Map<String, FieldValueTypeInformation>) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- forOrdinal(int) - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
-
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
-
- forRegistry(MetricRegistry) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
-
- forRequestObserver(String, StreamObserver<BeamFnApi.InstructionRequest>, ConcurrentMap<String, BeamFnApi.ProcessBundleDescriptor>) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
- forService(InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironmentFactory
-
- forSetter(Method) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- forSetter(Method, String) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- forSideInputHandlerFactory(Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, StateRequestHandlers.SideInputHandlerFactory) - Static method in class org.apache.beam.runners.fnexecution.state.StateRequestHandlers
-
- forSqlType(Schema.TypeName) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.BigDecimalConverter
-
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory
-
- forStage(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.JobBundleFactory
-
- forStage(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.SingleEnvironmentInstanceJobBundleFactory
-
Deprecated.
- forStage(ExecutableStage, BatchSideInputHandlerFactory.SideInputGetter) - Static method in class org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory
-
Creates a new state handler for the given stage.
- forStage(ExecutableStage, Map<RunnerApi.ExecutableStagePayload.SideInputId, PCollectionView<?>>, SideInputHandler) - Static method in class org.apache.beam.runners.fnexecution.translation.StreamingSideInputHandlerFactory
-
Creates a new state handler for the given stage.
- forStreamFromSources(List<Integer>, Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Build the TimerInternals
according to the feeding streams.
- forStreaming(PCollectionView<ViewT>) - Static method in class org.apache.beam.runners.dataflow.CreateDataflowView
-
- forString(String) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
representing the given value.
- forStrings() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init
-
Returns a
HllCount.Init.Builder
for a
HllCount.Init
combining
PTransform
that
computes string-type HLL++ sketches.
- forThrowable(Throwable) - Static method in class org.apache.beam.sdk.values.EncodableThrowable
-
Wraps throwable
and returns the result.
- forTransformHierarchy(TransformHierarchy, PipelineOptions) - Static method in class org.apache.beam.sdk.Pipeline
-
- forTypeName(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
-
- forUserState(String, String, Coder<K>, Coder<V>, Coder<W>) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandlerFactory
-
- ForwardingClientResponseObserver<ReqT,RespT> - Class in org.apache.beam.sdk.fn.stream
-
A ClientResponseObserver
which delegates all StreamObserver
calls.
- forWriter(LogWriter) - Static method in class org.apache.beam.runners.fnexecution.logging.GrpcLoggingService
-
- freeze() - Method in class org.apache.beam.runners.jet.metrics.JetMetricResults
-
- from(Map<String, String>) - Static method in class org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
-
Deprecated.
Expects a map keyed by logger Name
s with values representing Level
s.
- from(String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Instantiates a cross-language wrapper for a Python transform with a given transform name.
- from(String, String) - Static method in class org.apache.beam.sdk.extensions.python.PythonExternalTransform
-
Instantiates a cross-language wrapper for a Python transform with a given transform name.
- from(ExecutorService) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
-
- from(Supplier<ExecutorService>) - Static method in class org.apache.beam.sdk.fn.test.TestExecutors
-
- from(String) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
-
Reads from the given filename or filepattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Parse
-
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.AvroIO.Read
-
Reads from the given filename or filepattern.
- from(String) - Method in class org.apache.beam.sdk.io.AvroIO.Read
-
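A minimal usage sketch of reading Avro files as generic records, assuming a hypothetical Pipeline p, an org.apache.avro.Schema named avroSchema, and a hypothetical file pattern:
    import org.apache.avro.generic.GenericRecord;
    import org.apache.beam.sdk.io.AvroIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read every Avro file matching the glob into GenericRecords.
    PCollection<GenericRecord> records =
        p.apply(AvroIO.readGenericRecords(avroSchema).from("gs://my-bucket/path/*.avro"));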
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.AvroSchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that
resides there, and some IO-specific configuration object.
- from(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.AvroSource
-
Reads from the given file name or pattern ("glob").
- from(MatchResult.Metadata) - Static method in class org.apache.beam.sdk.io.AvroSource
-
- from(String) - Static method in class org.apache.beam.sdk.io.AvroSource
-
- from(FileBasedSource<T>) - Static method in class org.apache.beam.sdk.io.CompressedSource
-
Creates a CompressedSource
from an underlying FileBasedSource
.
- from(String) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
Reads text from the file(s) with the given filename or filename pattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
- from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads a BigQuery table specified as "[project_id]:[dataset_id].[table_id]"
or "[dataset_id].[table_id]"
for tables within the current project.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
- from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
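A minimal usage sketch of reading a BigQuery table, assuming a hypothetical Pipeline p and a hypothetical table name:
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read the whole table as TableRow objects.
    PCollection<TableRow> rows =
        p.apply(BigQueryIO.readTableRows().from("my-project:my_dataset.my_table"));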
- from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data
that resides there, and some IO-specific configuration object.
- from(BigQuerySchemaTransformReadConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
- from(BigQuerySchemaTransformWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that
resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that
resides there, and some IO-specific configuration object.
- from(Struct) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.PartitionMetadataMapper
-
- from(long) - Static method in class org.apache.beam.sdk.io.GenerateSequence
-
Specifies the minimum number to generate (inclusive).
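A minimal usage sketch, assuming a hypothetical Pipeline p:
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.values.PCollection;

    // Produces the bounded range [0, 100): from() is inclusive, to() is exclusive.
    PCollection<Long> numbers = p.apply(GenerateSequence.from(0).to(100));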
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that
resides there, and some IO-specific configuration object.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
-
- from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Parse
-
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
Reads from the given filename or filepattern.
- from(String) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Read
-
- from(BoundedSource<T>) - Method in class org.apache.beam.sdk.io.Read.Builder
-
Returns a new Read.Bounded
PTransform
reading from the given BoundedSource
.
- from(UnboundedSource<T, ?>) - Method in class org.apache.beam.sdk.io.Read.Builder
-
Returns a new Read.Unbounded
PTransform
reading from the given UnboundedSource
.
- from(BoundedSource<T>) - Static method in class org.apache.beam.sdk.io.Read
-
Returns a new Read.Bounded
PTransform
reading from the given BoundedSource
.
- from(UnboundedSource<T, ?>) - Static method in class org.apache.beam.sdk.io.Read
-
- from(String) - Method in class org.apache.beam.sdk.io.solr.SolrIO.Read
-
Provide name of collection while reading from Solr.
- from(String) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
Reads text from the file(s) with the given filename or filename pattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TextIO.Read
-
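A minimal usage sketch of reading text files, assuming a hypothetical Pipeline p and a hypothetical file pattern:
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.values.PCollection;

    // One output element per line of input text.
    PCollection<String> lines = p.apply(TextIO.read().from("gs://my-bucket/input/*.txt"));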
- from(String) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
Returns a transform for reading TFRecord files that reads from the file(s) with the given
filename or filename pattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
- from(String) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Read
-
Reads a single XML file or a set of XML files defined by a Java "glob" file pattern.
- from(Map<String, String>) - Static method in class org.apache.beam.sdk.options.SdkHarnessOptions.SdkHarnessLogLevelOverrides
-
Expects a map keyed by logger Name
s with values representing LogLevel
s.
- from(String, Row, Schema) - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that
resides there, and some IO-specific configuration object.
- from(Row) - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Produce a SchemaTransform from some transform-specific configuration object.
- from(ConfigT) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
Produce a SchemaTransform from ConfigT.
- from(Row) - Method in class org.apache.beam.sdk.schemas.transforms.TypedSchemaTransformProvider
-
- from(HasDisplayData) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
- from(double, double) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
A representation for the amount of known completed and remaining work.
- fromArgs(String...) - Method in class org.apache.beam.sdk.options.PipelineOptionsFactory.Builder
-
Sets the command line arguments to parse when constructing the
PipelineOptions
.
- fromArgs(String...) - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
Sets the command line arguments to parse when constructing the
PipelineOptions
.
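A minimal usage sketch of the usual entry-point pattern, assuming a hypothetical main(String[] args):
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Parse command-line arguments into PipelineOptions and build the pipeline.
    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline p = Pipeline.create(options);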
- fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables
-
- fromArray(T...) - Static method in class org.apache.beam.sdk.fn.stream.PrefetchableIterators
-
- fromAvroType(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils.FixedBytesField
-
- fromBeamCoder(Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderHelpers
-
Wrap a Beam coder into a Spark Encoder using Catalyst Expression Encoders (which use Java code
generation).
- fromBeamFieldType(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils.FixedBytesField
-
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.CoderHelpers
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], Coder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArray(byte[], WindowedValue.WindowedValueCoder<T>) - Static method in class org.apache.beam.runners.twister2.utils.TranslationUtils
-
Utility method for deserializing a byte array using the specified coder.
- fromByteArrays(Collection<byte[]>, Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
Utility method for deserializing an Iterable of byte arrays using the specified coder.
- fromByteFunction(Coder<T>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a byte array to an object.
- FromByteFunction(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.runners.spark.coders.CoderHelpers.FromByteFunction
-
- fromByteFunctionIterable(Coder<K>, Coder<V>) - Static method in class org.apache.beam.runners.spark.coders.CoderHelpers
-
A function wrapper for converting a byte array pair to a key-value pair, where values are
Iterable
.
- fromCanonical(Compression) - Static method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- fromCloudDuration(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a Dataflow API duration string into a
Duration
.
- fromCloudObject(CloudObject) - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
Converts back into the original object from a provided
CloudObject
.
- fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
-
Convert from a cloud object.
- fromCloudObject(CloudObject) - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
-
Convert from a cloud object.
- fromCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Transform messages publishable using PubsubIO to their equivalent Pub/Sub Lite publishable
message.
- fromCloudTime(String) - Static method in class org.apache.beam.runners.dataflow.util.TimeUtil
-
Converts a time value received via the Dataflow API into the corresponding
Instant
.
- fromComponents(String, String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from bucket and object components.
- fromConfig(FlinkJobServerDriver.FlinkServerConfiguration) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
-
- fromConfig(FlinkJobServerDriver.FlinkServerConfiguration, JobServerDriver.JobInvokerFactory) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
-
- fromConfig(SparkJobServerDriver.SparkServerConfiguration) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
-
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
-
Note that the BeamFnApi.ProcessBundleDescriptor
is constructed by:
Adding gRPC read and write nodes and wiring them to the specified data endpoint.
- fromExecutableStage(String, ExecutableStage, Endpoints.ApiServiceDescriptor) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
-
- fromFile(String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
-
- fromFile(String, OutputStream) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
-
- fromFile(File, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
-
- fromHex(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BuiltinStringFunctions
-
- fromIr(Ir, SbeSchema.IrOptions) - Static method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
-
Creates a new
SbeSchema
from the given intermediate representation.
- fromIr(Ir) - Static method in class org.apache.beam.sdk.extensions.sbe.SerializableIr
-
Creates a new instance from ir
.
- fromJsonString(String, Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
- fromLong(Long, Schema, LogicalType) - Method in class org.apache.beam.sdk.coders.AvroCoder.JodaTimestampConversion
-
- fromMap(Map<String, String>) - Static method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
-
Returns a new configuration instance using provided flags.
- fromModel(Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
From model Message
to an HL7v2 message.
- fromName(String) - Static method in enum org.apache.beam.io.debezium.Connectors
-
Returns a connector class corresponding to the given connector name.
- fromObject(StorageObject) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- fromOptions(DataflowPipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Constructs a translator from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Construct a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.TestDataflowRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.dataflow.util.GcsStager
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.direct.DirectRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.FlinkRunner
-
Construct a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.flink.TestFlinkRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.jet.JetRunner
-
- fromOptions(PipelineOptions, Function<ClientConfig, JetInstance>) - Static method in class org.apache.beam.runners.jet.JetRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.PortableRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestPortableRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.portability.testing.TestUniversalRunner
-
Constructs a runner from the provided options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Creates and returns a new SparkRunner with specified options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.SparkRunnerDebugger
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Creates and returns a new SparkStructuredStreamingRunner with specified options.
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.spark.TestSparkRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2Runner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.runners.twister2.Twister2TestRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.GcsPathValidator
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.extensions.gcp.storage.NoopPathValidator
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.s3.DefaultS3FileSystemSchemeRegistrar
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemRegistrar
-
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.aws.s3.S3FileSystemSchemeRegistrar
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.DefaultS3FileSystemSchemeRegistrar
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemRegistrar
-
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.aws2.s3.S3FileSystemSchemeRegistrar
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.azure.blobstore.AzureBlobStoreFileSystemRegistrar
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderFileSystemRegistrar
-
- fromOptions(PipelineOptions) - Method in interface org.apache.beam.sdk.io.FileSystemRegistrar
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Creates an instance of this rule using provided options.
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemRegistrar
-
- fromOptions(PipelineOptions) - Method in class org.apache.beam.sdk.io.LocalFileSystemRegistrar
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.PipelineRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.CrashingRunner
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.testing.TestPipeline
-
- fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHints
-
- fromParams(String[]) - Static method in class org.apache.beam.runners.flink.FlinkJobServerDriver
-
- fromParams(String[]) - Static method in class org.apache.beam.runners.spark.SparkJobServerDriver
-
- fromParams(DefaultFilenamePolicy.Params) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Creates a class representing a Pub/Sub subscription from the specified subscription path.
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
- fromPath(Path, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
-
- fromProcessFunctionWithOutputType(ProcessFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.InferableFunction
-
- fromProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
-
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
-
- fromPTransform(RunnerApi.PTransform) - Static method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
-
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads results received after executing the given query.
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- fromQuery(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
A query to be executed in Snowflake.
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
- fromRawEvents(Coder<T>, List<TestStream.Event<T>>) - Static method in class org.apache.beam.sdk.testing.TestStream
-
For internal use only.
- fromResourceName(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a OnePlatform resource name in string form.
- fromRow(Row) - Static method in class org.apache.beam.sdk.values.Row
-
Creates a row builder based on the specified row.
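A minimal usage sketch, assuming a hypothetical existing Row named original with a String field "status"; fromRow copies all field values and lets individual fields be overridden:
    import org.apache.beam.sdk.values.Row;

    // Copy 'original' and override a single field.
    Row updated = Row.fromRow(original).withFieldValue("status", "DONE").build();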
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProvider
-
Given a type, returns a function that converts from a
Row
object to that type.
- fromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
- fromRowFunction(TypeDescriptor<T>) - Method in interface org.apache.beam.sdk.schemas.SchemaProvider
-
Given a type, returns a function that converts from a
Row
object to that type.
- fromRows(Class<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
- fromRows(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Convert
-
- fromS3Options(S3Options) - Static method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
- fromS3Options(S3Options) - Static method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
- fromSerializableFunctionWithOutputType(SerializableFunction<InputT, OutputT>, TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.SimpleFunction
-
- fromSpec(Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Constructs a CloudObject
by copying the supplied serialized object spec, which must
represent an SDK object serialized for transport via the Dataflow API.
- fromSpec(HCatalogIO.Read) - Static method in class org.apache.beam.sdk.io.hcatalog.HCatToRow
-
- fromStandardParameters(ValueProvider<ResourceId>, String, String, boolean) - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy
-
- fromStaticMethods(Class<?>, Class<?>) - Static method in class org.apache.beam.sdk.coders.CoderProviders
-
Creates a
CoderProvider
from a class's
static <T> Coder<T> of(TypeDescriptor<T>,
List<Coder<?>>
) method.
- fromString(String, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
-
- fromString(ValueProvider<String>, boolean) - Static method in class org.apache.beam.sdk.io.LocalResources
-
- fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Reads from the given subscription.
- fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
- fromSupplier(SerializableMatchers.SerializableSupplier<Matcher<T>>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- fromTable(String) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
A table name to be read in Snowflake.
- fromTable(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.Read
-
- fromTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
- fromTableSchema(TableSchema, BigQueryUtils.SchemaConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
- fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
- fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
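A minimal usage sketch of the two Pub/Sub read entry points, assuming a hypothetical Pipeline p and hypothetical resource names:
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read from an existing subscription.
    PCollection<String> fromSub =
        p.apply(PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-subscription"));

    // Or read from a topic, letting the runner create a temporary subscription.
    PCollection<String> fromTopic =
        p.apply(PubsubIO.readStrings().fromTopic("projects/my-project/topics/my-topic"));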
- fromUri(URI) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a URI.
- fromUri(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Creates a GcsPath from a URI in string form.
- FullNameTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
Base class for table providers that look up table metadata using full table names, instead of
querying it by parts of the name separately.
- FullNameTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
-
- fullOuterJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Full Outer Join of two collections of KV elements.
- fullOuterJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>, V1, V2) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Full Outer Join of two collections of KV elements.
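A minimal usage sketch of the join-library variant, assuming hypothetical inputs left (PCollection<KV<String, Long>>) and right (PCollection<KV<String, String>>); the last two arguments are the stand-in values used when one side has no match for a key:
    import org.apache.beam.sdk.extensions.joinlibrary.Join;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Each output pairs the left and right values (or the stand-ins) per key.
    PCollection<KV<String, KV<Long, String>>> joined =
        Join.fullOuterJoin(left, right, -1L, "<missing>");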
- fullOuterJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform a full outer join.
- fullPublishResponse() - Static method in class org.apache.beam.sdk.io.aws2.sns.PublishResponseCoders
-
- fullPublishResponseWithoutHeaders() - Static method in class org.apache.beam.sdk.io.aws2.sns.PublishResponseCoders
-
- fullPublishResult() - Static method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoders
-
Returns a new PublishResult coder which serializes the sdkResponseMetadata and sdkHttpMetadata,
including the HTTP response headers.
- fullPublishResultWithoutHeaders() - Static method in class org.apache.beam.sdk.io.aws.sns.PublishResultCoders
-
Returns a new PublishResult coder which serializes the sdkResponseMetadata and sdkHttpMetadata,
but does not include the HTTP response headers.
- fullUpdate(String, String) - Static method in class org.apache.beam.sdk.io.mongodb.UpdateField
-
Sets the limit of documents to find.
- fullyExpand(Map<TupleTag<?>, PValue>) - Static method in class org.apache.beam.sdk.values.PValues
-
- functionGroup - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ZetaSqlScalarFunctionImpl
-
ZetaSQL function group identifier.
- Gauge - Interface in org.apache.beam.sdk.metrics
-
A metric that reports the latest value out of reported values.
- gauge(String, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported
value.
- gauge(Class<?>, String) - Static method in class org.apache.beam.sdk.metrics.Metrics
-
Create a metric that can have its new value set, and is aggregated by taking the last reported
value.
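A minimal usage sketch of reporting a gauge from inside a DoFn; the DoFn and metric names here are hypothetical:
    import org.apache.beam.sdk.metrics.Gauge;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class RecordLatestValueFn extends DoFn<Long, Long> {
      private final Gauge latestValue = Metrics.gauge(RecordLatestValueFn.class, "latestValue");

      @ProcessElement
      public void processElement(@Element Long value, OutputReceiver<Long> out) {
        latestValue.set(value); // the gauge keeps only the most recently reported value
        out.output(value);
      }
    }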
- GaugeImpl - Class in org.apache.beam.runners.jet.metrics
-
- GaugeImpl(MetricName) - Constructor for class org.apache.beam.runners.jet.metrics.GaugeImpl
-
- GaugeResult - Class in org.apache.beam.sdk.metrics
-
The result of a
Gauge
metric.
- GaugeResult() - Constructor for class org.apache.beam.sdk.metrics.GaugeResult
-
- GaugeResult.EmptyGaugeResult - Class in org.apache.beam.sdk.metrics
-
- GceMetadataUtil - Class in org.apache.beam.sdk.extensions.gcp.util
-
- GceMetadataUtil() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GceMetadataUtil
-
- GcpCredentialFactory - Class in org.apache.beam.sdk.extensions.gcp.auth
-
Construct an oauth credential to be used by the SDK and the SDK workers.
- GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
-
A registrar containing the default GCP options.
- GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
-
- GcpOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Options used to configure Google Cloud Platform specific options such as the project and
credentials.
- GcpOptions.DefaultProjectFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Attempts to infer the default project based upon the environment this application is executing
within.
- GcpOptions.EnableStreamingEngineFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
EnableStreamingEngine defaults to false unless one of the two experiments is set.
- GcpOptions.GcpTempLocationFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
- GcpOptions.GcpUserCredentialsFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Attempts to load the GCP credentials.
- GcpPipelineOptionsRegistrar - Class in org.apache.beam.sdk.extensions.gcp.options
-
A registrar containing the default GCP options.
- GcpPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
-
- GcpTempLocationFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpTempLocationFactory
-
- GcpUserCredentialsFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GcpOptions.GcpUserCredentialsFactory
-
- GCS_URI - Static variable in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Pattern that is used to parse a GCS URL.
- GcsCreateOptions - Class in org.apache.beam.sdk.extensions.gcp.storage
-
An abstract class that contains common configuration options for creating resources.
- GcsCreateOptions() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
-
- GcsCreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.storage
-
- GcsFileSystemRegistrar - Class in org.apache.beam.sdk.extensions.gcp.storage
-
AutoService
registrar for the GcsFileSystem
.
- GcsFileSystemRegistrar() - Constructor for class org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystemRegistrar
-
- GcsOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
Options used to configure Google Cloud Storage.
- GcsOptions.ExecutorServiceFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
Returns the default ExecutorService
to use within the Apache Beam SDK.
- GcsOptions.PathValidatorFactory - Class in org.apache.beam.sdk.extensions.gcp.options
-
- GcsPath - Class in org.apache.beam.sdk.extensions.gcp.util.gcsfs
-
Implements the Java NIO Path
API for Google Cloud Storage paths.
- GcsPath(FileSystem, String, String) - Constructor for class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Constructs a GcsPath.
- GcsPathValidator - Class in org.apache.beam.sdk.extensions.gcp.storage
-
- GcsResourceId - Class in org.apache.beam.sdk.extensions.gcp.storage
-
ResourceId
implementation for Google Cloud Storage.
- GcsStager - Class in org.apache.beam.runners.dataflow.util
-
Utility class for staging files to GCS.
- gcsUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsCreateOptions
-
The buffer size (in bytes) to use when uploading files to GCS.
- GcsUtil - Class in org.apache.beam.sdk.extensions.gcp.util
-
Provides operations on GCS.
- GcsUtil.CreateOptions - Class in org.apache.beam.sdk.extensions.gcp.util
-
- GcsUtil.CreateOptions.Builder - Class in org.apache.beam.sdk.extensions.gcp.util
-
- GcsUtil.GcsUtilFactory - Class in org.apache.beam.sdk.extensions.gcp.util
-
- GcsUtil.StorageObjectOrIOException - Class in org.apache.beam.sdk.extensions.gcp.util
-
- GcsUtilFactory() - Constructor for class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.GcsUtilFactory
-
- generate(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
-
- generatePartitionMetadataTableName(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.NameGenerator
-
Generates a unique name for the partition metadata table in the form "CDC_Partitions_Metadata_<databaseId>_<uuid>".
- GenerateSequence - Class in org.apache.beam.sdk.io
-
A
PTransform
that produces longs starting from the given value, either up to the given limit, until
Long.MAX_VALUE
is reached, or until the given time elapses.
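A minimal sketch showing the two common forms of GenerateSequence, a bounded range and an unbounded rate-limited stream; the counts and rate are illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class GenerateSequenceExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // Bounded: the longs 0..99.
        PCollection<Long> bounded = p.apply("Bounded", GenerateSequence.from(0).to(100));
        // Unbounded: emits roughly 5 elements per second until the pipeline is cancelled.
        PCollection<Long> unbounded =
            p.apply("Unbounded", GenerateSequence.from(0).withRate(5, Duration.standardSeconds(1)));
        p.run().waitUntilFinish();
      }
    }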
- GenerateSequence() - Constructor for class org.apache.beam.sdk.io.GenerateSequence
-
- GenerateSequence.External - Class in org.apache.beam.sdk.io
-
Exposes GenerateSequence as an external transform for cross-language usage.
- GenerateSequence.External.ExternalConfiguration - Class in org.apache.beam.sdk.io
-
Parameters class to expose the transform to an external SDK.
- GenerateSequenceTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider.seqgen
-
Sequence generator table provider.
- GenerateSequenceTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
-
- GenericDlq - Class in org.apache.beam.sdk.schemas.io
-
Helper to generate a DLQ transform to write PCollection to an external system.
- GenericDlqProvider - Interface in org.apache.beam.sdk.schemas.io
-
A Provider for generic DLQ transforms that handle deserialization failures.
- get(JobInfo) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext.Factory
-
- get(JobInfo) - Method in class org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory
-
- get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.BagUserStateHandler
-
Returns an Iterable
of values representing the bag user state for the given key and
window.
- get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.IterableSideInputHandler
-
Returns an Iterable
of values representing the side input for the given window.
- get(W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns an Iterable
of keys representing the side input for the given window.
- get(K, W) - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandlers.MultimapSideInputHandler
-
Returns an Iterable
of values representing the side input for the given key and
window.
- get() - Method in class org.apache.beam.runners.portability.CloseableResource
-
Gets the underlying resource.
- get() - Static method in class org.apache.beam.runners.spark.io.EmptyCheckpointMark
-
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
-
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.CachedSideInputReader
-
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
-
- get(Long) - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
-
- get() - Method in class org.apache.beam.runners.twister2.translators.functions.Twister2SinkFunction
-
- get(PCollectionView<T>, BoundedWindow) - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
-
- get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedList
-
- get(Object) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedMap
-
- get(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamCalcRel.WrappedRow
-
- get() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ThroughputEstimator
-
Returns the estimated throughput for now.
- get() - Method in class org.apache.beam.sdk.io.hadoop.SerializableConfiguration
-
- get() - Method in interface org.apache.beam.sdk.options.ValueProvider
-
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
-
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
-
- get() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
-
- get(Class<?>) - Method in class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
-
- get(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
-
- get(Class<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
-
- get(Class<?>) - Method in class org.apache.beam.sdk.schemas.JavaBeanSchema.SetterTypeSupplier
-
- get(Class<?>) - Method in class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
-
- get(Object) - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
-
- get(Class<?>) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
-
Return all the FieldValueTypeInformations.
- get(Class<?>, Schema) - Method in interface org.apache.beam.sdk.schemas.utils.FieldValueTypeSupplier
-
Return all the FieldValueTypeInformations.
- get(K) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred lookup, using null values if the item is not found.
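A minimal sketch of a stateful DoFn using MapState, where get(key) is a deferred lookup materialized by read(); the state id "counts" and the counting logic are illustrative, and the runner must support MapState.

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.coders.VarLongCoder;
    import org.apache.beam.sdk.state.MapState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;

    class CountPerAttributeFn extends DoFn<KV<String, String>, KV<String, Long>> {
      @StateId("counts")
      private final StateSpec<MapState<String, Long>> countsSpec =
          StateSpecs.map(StringUtf8Coder.of(), VarLongCoder.of());

      @ProcessElement
      public void processElement(
          ProcessContext c, @StateId("counts") MapState<String, Long> counts) {
        String attribute = c.element().getValue();
        // get() returns a deferred ReadableState; read() yields null if the key is absent.
        Long current = counts.get(attribute).read();
        long updated = (current == null ? 0L : current) + 1;
        counts.put(attribute, updated);
        c.output(KV.of(attribute, updated));
      }
    }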
- get(String) - Method in interface org.apache.beam.sdk.state.TimerMap
-
- get() - Method in interface org.apache.beam.sdk.testing.SerializableMatchers.SerializableSupplier
-
- get(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.CombineFns.CoCombineResult
-
Returns the value represented by the given
TupleTag
.
- get(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
-
- get() - Method in interface org.apache.beam.sdk.transforms.Materializations.IterableView
-
Returns an iterable for all values.
- get() - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
-
Returns an iterable of all keys.
- get(K) - Method in interface org.apache.beam.sdk.transforms.Materializations.MultimapView
-
Returns an iterable of all the values for the specified key.
- get(int) - Method in class org.apache.beam.sdk.values.PCollectionList
-
Returns the
PCollection
at the given index (origin zero).
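A minimal sketch of building a PCollectionList and reading a member back by index; the element values are illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionList;

    public class PCollectionListExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<String> first = p.apply("First", Create.of("a", "b"));
        PCollection<String> second = p.apply("Second", Create.of("c"));
        PCollectionList<String> list = PCollectionList.of(first).and(second);
        PCollection<String> firstAgain = list.get(0); // origin-zero index
        PCollection<String> all = list.apply(Flatten.pCollections());
        p.run().waitUntilFinish();
      }
    }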
- get(String) - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- get(TupleTag<T>) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- get(String) - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- get() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
-
- get(int) - Method in class org.apache.beam.sdk.values.TupleTagList
-
Returns the
TupleTag
at the given index (origin zero).
- getAccessKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getAccountName() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getAccum() - Method in interface org.apache.beam.sdk.state.CombiningState
-
Read the merged accumulator for this state cell.
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
- getAccumTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.CountMinSketchFn
-
- getAccumulatorCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
- getAccumulatorCoder(CoderRegistry, Coder<Boolean>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CountIf.CountIfFn
-
- getAccumulatorCoder(CoderRegistry, Coder<Row>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.CovarianceFn
-
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.extensions.sql.impl.transform.agg.VarianceFn
-
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn
-
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.ApproximateUnique.ApproximateUniqueCombineFn
-
Deprecated.
- getAccumulatorCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- getAccumulatorCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- getAccumulatorCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- getAccumulatorCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- getAccumulatorCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the Coder
to use for accumulator AccumT
values, or null if it is not
able to be inferred.
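A minimal sketch of overriding getAccumulatorCoder on a custom Combine.CombineFn (a GlobalCombineFn subclass) whose accumulator type cannot be inferred; the averaging logic and class names are illustrative.

    import java.io.Serializable;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.CoderRegistry;
    import org.apache.beam.sdk.coders.SerializableCoder;
    import org.apache.beam.sdk.transforms.Combine;

    public class MeanFn extends Combine.CombineFn<Long, MeanFn.Accum, Double> {
      public static class Accum implements Serializable {
        long sum;
        long count;
      }

      @Override
      public Accum createAccumulator() { return new Accum(); }

      @Override
      public Accum addInput(Accum acc, Long input) {
        acc.sum += input;
        acc.count++;
        return acc;
      }

      @Override
      public Accum mergeAccumulators(Iterable<Accum> accs) {
        Accum merged = new Accum();
        for (Accum a : accs) {
          merged.sum += a.sum;
          merged.count += a.count;
        }
        return merged;
      }

      @Override
      public Double extractOutput(Accum acc) {
        return acc.count == 0 ? 0.0 : ((double) acc.sum) / acc.count;
      }

      @Override
      public Coder<Accum> getAccumulatorCoder(CoderRegistry registry, Coder<Long> inputCoder) {
        // Tell the runner how to encode Accum instead of relying on coder inference.
        return SerializableCoder.of(Accum.class);
      }
    }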
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFn
-
- getAccumulatorCoder(CoderRegistry, Coder<DataT>) - Method in class org.apache.beam.sdk.transforms.CombineFns.ComposedCombineFnWithContext
-
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
- getAccumulatorCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
- getAdditionalInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
-
- getAdditionalInputs() - Method in class org.apache.beam.sdk.io.WriteFiles
-
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
- getAdditionalInputs() - Method in class org.apache.beam.sdk.transforms.PTransform
-
- getAdditionalOutputTags() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
- getAddresses() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getAlgorithm() - Method in enum org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType
-
Returns the string representation of this type.
- getAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
-
- getAll() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Retrieve all HL7v2 Messages from a PCollection of message IDs (such as from a PubSub notification
subscription).
- getAll(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns the values from the table represented by the given TupleTag<V>
as an Iterable<V>
(which may be empty if there are no results).
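A minimal sketch of a CoGroupByKey join followed by getAll(tag) on the resulting CoGbkResult; the tags, keys, and values are illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    public class CoGbkExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<KV<String, String>> emails = p.apply("Emails", Create.of(KV.of("alice", "a@x")));
        PCollection<KV<String, String>> phones = p.apply("Phones", Create.of(KV.of("alice", "555")));
        final TupleTag<String> emailTag = new TupleTag<>();
        final TupleTag<String> phoneTag = new TupleTag<>();

        PCollection<KV<String, CoGbkResult>> joined =
            KeyedPCollectionTuple.of(emailTag, emails)
                .and(phoneTag, phones)
                .apply(CoGroupByKey.create());

        joined.apply(ParDo.of(new DoFn<KV<String, CoGbkResult>, String>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            CoGbkResult result = c.element().getValue();
            // getAll(tag) may be empty if the key had no values under that tag.
            Iterable<String> userEmails = result.getAll(emailTag);
            Iterable<String> userPhones = result.getAll(phoneTag);
            c.output(c.element().getKey() + ": " + userEmails + " / " + userPhones);
          }
        }));
        p.run().waitUntilFinish();
      }
    }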
- getAll(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- getAll() - Method in class org.apache.beam.sdk.values.PCollectionList
-
- getAll() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- getAll() - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- getAll() - Method in class org.apache.beam.sdk.values.TupleTagList
-
- getAllFields() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
If true, all fields are being accessed.
- getAllIds(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getAllJobs() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
-
- getAllMetadata() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getAllowedLateness() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.DoFn
-
- getAllowedTimestampSkew() - Method in class org.apache.beam.sdk.transforms.WithTimestamps
-
- getAllowNonRestoredState() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getAllPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
- getAllRows(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getAllWorkerStatuses(long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Get all the statuses from all connected SDK harnesses within the specified timeout.
- getAlpha() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
-
- getAnnotatedConstructor(Class) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
- getAnnotatedCreateMethod(Class) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
- getApiKey() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getApiPrefix() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
Generates the API endpoint prefix based on the set values.
- getApiRootUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The root URL for the Dataflow API.
- getApiServiceDescriptor() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
Get an
Endpoints.ApiServiceDescriptor
describing the endpoint this
GrpcFnServer
is bound
to.
- getAppId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getApplicationName() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
-
- getAppliedFn(CoderRegistry, Coder<? extends KV<K, ? extends Iterable<InputT>>>, WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
- getApplyMethod(ScalarFn) - Static method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFnReflector
-
- getAppName() - Method in interface org.apache.beam.sdk.options.ApplicationNameOptions
-
Name of application, for display purposes.
- getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getApproximateArrivalTimestamp() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
-
- getArgument() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
-
- getArgument() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
An optional argument to configure the type.
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
-
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
-
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
-
- getArgumentType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
-
- getArgumentType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
-
- getArgumentType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
A schema type representing how to interpret the argument.
- getArgumentTypes(Method) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a list of argument types for the given method, which must be a part of the class.
- getArray(String) - Method in class org.apache.beam.sdk.values.Row
-
Get an array value by field name, IllegalStateException
is thrown if schema doesn't
match.
- getArray(int) - Method in class org.apache.beam.sdk.values.Row
-
Get an array value by field index, IllegalStateException
is thrown if schema doesn't
match.
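A minimal sketch of building a Row with an array field and reading typed values back out with the Row getters; the schema and values are illustrative.

    import java.util.Arrays;
    import java.util.Collection;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    public class RowGetterExample {
      public static void main(String[] args) {
        Schema schema =
            Schema.builder()
                .addStringField("name")
                .addArrayField("scores", Schema.FieldType.INT32)
                .build();
        Row row =
            Row.withSchema(schema)
                .withFieldValue("name", "alice")
                .withFieldValue("scores", Arrays.asList(90, 80))
                .build();
        String name = row.getString("name");
        // Throws IllegalStateException if the field is not an array of the expected type.
        Collection<Integer> scores = row.getArray("scores");
        System.out.println(name + " -> " + scores);
      }
    }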
- getArtifact(ArtifactApi.GetArtifactRequest, StreamObserver<ArtifactApi.GetArtifactResponse>) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- getArtifact(RunnerApi.ArtifactInformation) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactRetrievalService
-
- getArtifactPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
-
- getArtifactStagingPath() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
-
- getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- getATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- getAttempted() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all attempts of executing all parts of the pipeline.
- getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the given attribute value.
- getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the full map of attributes.
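A minimal sketch of a DoFn reading Pub/Sub message attributes; the attribute key "eventType" is illustrative, and the upstream read is assumed to preserve attributes (e.g. PubsubIO.readMessagesWithAttributes()).

    import java.util.Map;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
    import org.apache.beam.sdk.transforms.DoFn;

    class ExtractEventTypeFn extends DoFn<PubsubMessage, String> {
      @ProcessElement
      public void processElement(ProcessContext c) {
        PubsubMessage message = c.element();
        // Single attribute lookup; may return null if the attribute is absent.
        String eventType = message.getAttribute("eventType");
        // Full attribute map, useful for logging or routing.
        Map<String, String> attributes = message.getAttributeMap();
        c.output(eventType == null ? "unknown" : eventType + " (" + attributes.size() + " attrs)");
      }
    }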
- getAuthenticator() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getAuthenticator() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getAuthToken(String, String) - Method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverConfiguration
-
Certain embedded scenarios allow for having no authentication at all.
- getAutoCommit() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getAutoscalingAlgorithm() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
[Experimental] The autoscaling algorithm to use for the workerpool.
- getAutoWatermarkInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getAvroBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
- getAwsCredentialsProvider() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
-
The credential instance that should be used to authenticate against AWS services.
- getAwsCredentialsProvider() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
- getAwsRegion() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
-
AWS region used by the AWS client.
- getAwsRegion() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
Region used to configure AWS service clients.
- getAwsServiceEndpoint() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
-
The AWS service endpoint used by the AWS client.
- getAzureConnectionString() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getAzureCredentialsProvider() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
The credential instance that should be used to authenticate against Azure services.
- getBacklogCheckTime() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
The time at which the latest offset for the partition was fetched in order to calculate backlog.
- getBagUserStateSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to user state input id to
bag user
states
that are used during execution.
- getBaseAutoValueClass(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
-
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
-
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
-
- getBaseType() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
-
- getBaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
-
- getBaseType() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
- getBaseValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(String) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValue(int) - Method in class org.apache.beam.sdk.values.Row
-
Returns the base type for this field.
- getBaseValues() - Method in class org.apache.beam.sdk.values.Row
-
Return a list of data values.
- getBatchClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
-
- getBatchDuration() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
- getBatches() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Get the underlying queue representing the mock stream of micro-batches.
- getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Returns user supplied parameters for batching.
- getBatchingParams() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.WithShardedKey
-
Returns user supplied parameters for batching.
- getBatchInitialCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial size of a batch; used in the absence of the QoS system having significant data to
determine a better batch size.
- getBatchIntervalMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getBatchMaxBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of bytes to include in a batch.
- getBatchMaxCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of writes to include in a batch.
- getBatchService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
-
- getBatchService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
-
- getBatchSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
-
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getBatchSizeBytes() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
-
- getBatchTargetLatency() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Target latency for batch requests.
- getBeamRelInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
-
- getBeamSqlTable() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- getBeamSqlUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
-
- getBearerToken() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getBigQueryProject() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
- getBlobServiceEndpoint() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
The Azure Blobstore service endpoint used by the Blob service client.
- getBlobstoreClientFactoryClass() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getBody() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Message body.
- getBody() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getBoolean(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getBoolean(Map<String, Object>, String, Boolean) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getBoolean() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getBoolean(String) - Method in class org.apache.beam.sdk.values.Row
-
- getBoolean(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Boolean
value by field index, ClassCastException
is thrown if schema
doesn't match.
- getBootstrapServers() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
-
- getBoundednessOfRelNode(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
This method returns the Boundedness of a RelNode.
- getBqStreamingApiLoggingFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getBroadcastSizeEstimate() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
-
- getBroadcastValue(String) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.SideInputBroadcast
-
- getBucket() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the bucket name associated with this GCS path, or an empty string if this is a relative
path component.
- getBucketKeyEnabled() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getBucketKeyEnabled() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
Whether to use an S3 Bucket Key for object encryption with server-side encryption using AWS KMS
(SSE-KMS) or not.
- getBucketKeyEnabled() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getBucketKeyEnabled() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Whether to use an S3 Bucket Key for object encryption with server-side encryption using AWS KMS
(SSE-KMS) or not.
- getBufferSize() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
-
- getBuilder(S3Options) - Static method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
- getBuilderCreator(Class<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
Try to find an accessible builder class for creating an AutoValue class.
- getBuiltinMethods() - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.BeamBuiltinFunctionProvider
-
- getBulkDirective() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
- getBulkEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getBulkIO() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
-
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
Get a new
bundle
for processing the data in an executable stage.
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
- getBundle(OutputReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
- getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
Get a new
bundle
for processing the data in an executable stage.
- getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
- getBundle(OutputReceiverFactory, TimerReceiverFactory, StateRequestHandler, BundleProgressHandler, BundleFinalizationHandler, BundleCheckpointHandler) - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
- getBundle() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
FHIR R4 bundle resource object as a string.
- getBundleSize() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getByte() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getByte(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTE
value by field name,
IllegalStateException
is thrown if
schema doesn't match.
- getByte(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTE
value by field index,
ClassCastException
is thrown if
schema doesn't match.
- getBytes(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getBytes(Map<String, Object>, String, byte[]) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getBytes() - Method in class org.apache.beam.sdk.io.range.ByteKey
-
Returns a newly-allocated
byte[]
representing this
ByteKey
.
- getBytes(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a
Schema.TypeName.BYTES
value by field name,
IllegalStateException
is thrown if
schema doesn't match.
- getBytes(int) - Method in class org.apache.beam.sdk.values.Row
-
- getBytesPerOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns approximately how many bytes of data correspond to a single offset in this source.
- getCacheTokens() - Method in interface org.apache.beam.runners.fnexecution.state.StateRequestHandler
-
Retrieves a list of valid cache tokens.
- getCaseEnumType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- getCaseType() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the enumeration that specified which OneOf field is set.
- getCatalog() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getCause() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
-
- getCEPFieldRefFromParKeys(ImmutableBitSet) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Transform the partition columns into serializable CEPFieldRef.
- getCepKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
-
- getCEPPatternFromPattern(Schema, RexNode, Map<String, RexNode>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Construct a list of CEPPattern
s from a RexNode
.
- getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for querying a partition change stream.
- getChannelFactory() - Method in class org.apache.beam.sdk.io.CompressedSource
-
- getChannelNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- getCheckpointDir() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
-
- getCheckpointDurationMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getCheckpointingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getCheckpointingInterval() - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
-
- getCheckpointingMode() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getCheckpointMark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- getCheckpointMark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
- getCheckpointMarkCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
-
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.UnboundedSource
-
Returns a
Coder
for encoding and decoding the checkpoints for this source.
- getCheckpointTimeoutMillis() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getChildPartitions() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
List of child partitions yielded within this record.
- getChildRels(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
-
- getClasses() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of
TypeDescriptor
s, one for each superclass (including this class).
- getClassName() - Method in class org.apache.beam.runners.dataflow.util.CloudObject
-
Gets the name of the Java class that this CloudObject represents.
- getClazz() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
-
- getClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
- getClientBuilderFactory() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
- getClientConfiguration() - Method in interface org.apache.beam.sdk.io.aws.options.AwsOptions
-
The client configuration instance that should be used to configure AWS service clients.
- getClientInfo(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getClientInfo() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getCloningBehavior() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- getClosingBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getClosure() - Method in class org.apache.beam.sdk.transforms.Contextful
-
Returns the closure.
- getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.aws.dynamodb.AwsClientsProvider
-
- getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.aws.sns.AwsClientsProvider
-
- getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.AWSClientsProvider
-
Deprecated.
- getCloudWatchClient() - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
-
- getClusterId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
- getClusterName() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
-
- getClusterType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
-
Returns the type code of the column.
- getCodec(DestinationT) - Method in class org.apache.beam.sdk.io.DynamicAvroDestinations
-
Return an AVRO codec for a given destination.
- getCodeJarPathname() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
-
- getCoder() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
-
- getCoder() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
-
- getCoder(String) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.SideInputBroadcast
-
- getCoder(Class<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Returns the
Coder
to use for values of the given class.
- getCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
Returns the
Coder
to use for values of the given type.
- getCoder(TypeDescriptor<OutputT>, TypeDescriptor<InputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
- getCoder(Class<? extends T>, Class<T>, Map<Type, ? extends Coder<?>>, TypeVariable<?>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
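A minimal sketch of looking up coders from a pipeline's CoderRegistry, by raw class and by TypeDescriptor; the KV type used here is illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.CannotProvideCoderException;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.CoderRegistry;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptor;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class CoderLookupExample {
      public static void main(String[] args) throws CannotProvideCoderException {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        CoderRegistry registry = p.getCoderRegistry();
        // By raw class.
        Coder<String> stringCoder = registry.getCoder(String.class);
        // By TypeDescriptor, which keeps generic parameters.
        TypeDescriptor<KV<String, Long>> kvType =
            TypeDescriptors.kvs(TypeDescriptors.strings(), TypeDescriptors.longs());
        Coder<KV<String, Long>> kvCoder = registry.getCoder(kvType);
        System.out.println(stringCoder + " / " + kvCoder);
      }
    }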
- getCoder() - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
Returns the coder used to encode/decode the intermediate values produced/consumed by the coding
functions of this DelegateCoder
.
- getCoder() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
-
- getCoder() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
-
- getCoder(CoderRegistry) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- getCoder(CoderRegistry) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
-
- getCoder() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
- getCoder() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the
Coder
used by this
PCollection
to encode and decode the values
stored in it.
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.AtomicCoder
-
If this is a
Coder
for a parameterized type, returns the list of
Coder
s being
used for each of the parameters in the same order they appear within the parameterized type's
type signature.
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.Coder
-
If this is a
Coder
for a parameterized type, returns the list of
Coder
s being
used for each of the parameters in the same order they appear within the parameterized type's
type signature.
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.CustomCoder
-
If this is a
Coder
for a parameterized type, returns the list of
Coder
s being
used for each of the parameters in the same order they appear within the parameterized type's
type signature.
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.KvCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.MapCoder
-
If this is a
Coder
for a parameterized type, returns the list of
Coder
s being
used for each of the parameters in the same order they appear within the parameterized type's
type signature.
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.ShardedKeyCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.SnappyCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
If this is a
Coder
for a parameterized type, returns the list of
Coder
s being
used for each of the parameters in the same order they appear within the parameterized type's
type signature.
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadataCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- getCoderArguments() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- getCoderInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
- getCoderInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.coders.SerializableCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
-
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
- getCoderProviders() - Method in interface org.apache.beam.sdk.coders.CoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.DefaultCoder.DefaultCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.coders.SerializableCoder.SerializableCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtobufCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.amqp.AmqpMessageCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.aws.sns.SnsCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.aws2.sns.SnsCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.aws2.sqs.MessageCoderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.aws2.sqs.SendMessageRequestCoderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hadoop.WritableCoder.WritableCoderProviderRegistrar
-
- getCoderProviders() - Method in class org.apache.beam.sdk.io.hbase.HBaseCoderProviderRegistrar
-
- getCoderRegistry() - Method in class org.apache.beam.sdk.Pipeline
-
- getCoGbkResultSchema() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- getCohorts() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.CalcRelSplitter
-
Returns a list of sets of expressions that should be on the same level.
- getCollations() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- getCollection() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
-
Returns the underlying PCollection of this TaggedKeyedPCollection.
- getCollectionElementType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- getColumnDelimiter() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- getColumns() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
-
- getCombineFn() - Method in interface org.apache.beam.runners.spark.aggregators.NamedAggregators.State
-
- getCombineFn() - Method in interface org.apache.beam.runners.spark.structuredstreaming.aggregators.NamedAggregators.State
-
- getCombineFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
-
- getCombineFn() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
-
- getComment() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
- getCommitDeadline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getCommitRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getCommitted() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all successfully completed parts of the pipeline.
- getCommittedOrNull() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the value of this metric across all successfully completed parts of the pipeline, or null if the runner does not support committed metrics.
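A minimal sketch of querying metric results after a run and reading both attempted and committed values; the namespace "myNamespace" and counter name "elements" are illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class MetricQueryExample {
      static void printCounters(Pipeline p) {
        PipelineResult result = p.run();
        result.waitUntilFinish();
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named("myNamespace", "elements"))
                    .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          // getAttempted() covers all attempts; committed values only cover
          // successful work and may be unsupported by some runners.
          System.out.println(counter.getName() + " attempted=" + counter.getAttempted());
          Long committed = counter.getCommittedOrNull();
          if (committed != null) {
            System.out.println(counter.getName() + " committed=" + committed);
          }
        }
      }
    }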
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the commit timestamp of the read / write transaction.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getComponents() - Method in class org.apache.beam.sdk.coders.AtomicCoder
-
Returns the list of
Coders
that are components of this
Coder
.
- getComponents() - Method in class org.apache.beam.sdk.coders.StructuredCoder
-
Returns the list of
Coders
that are components of this
Coder
.
- getComponents() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- getComponents() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Path
-
Hierarchy list of component paths making up the full path, starting with the top-level child
component path.
- getComponents() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- getComponents() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
-
- getComponents() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- getComponents() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- getComponentType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the component type if this type is an array type, otherwise returns null
.
- getCompression() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
- getCompression() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
- getComputeNumShards() - Method in class org.apache.beam.sdk.io.WriteFiles
-
- getConfig() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
-
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
-
- getConfiguration() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
- getConfiguration() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
-
- getConfigurationMap() - Method in class org.apache.beam.io.debezium.DebeziumIO.ConnectorConfiguration
-
Configuration Map Getter.
- getConnection(InfluxDbIO.DataSourceConfiguration, boolean) - Static method in class org.apache.beam.sdk.io.influxdb.InfluxDbIO
-
- getConnector() - Method in enum org.apache.beam.io.debezium.Connectors
-
Class connector to debezium.
- getConnectStringPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
- getConnectTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getConstructorCreator(Class<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.AutoValueUtils
-
Try to find an accessible constructor for creating an AutoValue class.
- getConstructorCreator(Class, Constructor, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- getConstructorCreator(Class, Constructor, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- getContainerImageBaseRepository() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for constructing the container image path.
- getContent() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the extracted text.
- getContentEncoding() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getContentType() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
The content type for the created file, e.g. "text/plain".
- getContentType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getContext() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
-
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
-
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
-
Deprecated.
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
Return a trigger to use after a
GroupByKey
to preserve the intention of this trigger.
- getContinuationTrigger(List<Trigger>) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
- getContinuationTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
-
- getConversionOptions(JSONObject) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
-
- getConvertedSchemaInformation(Schema, TypeDescriptor<T>, SchemaRegistry) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
-
Get the coder used for converting from an inputSchema to a given type.
- getConvertedType() - Method in class org.apache.beam.sdk.coders.AvroCoder.JodaTimestampConversion
-
- getConvertPrimitive(Schema.FieldType, TypeDescriptor<?>, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.ConvertHelpers
-
Returns a function to convert a Row into a primitive type.
- getCorrelationId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
-
- getCount() - Method in class org.apache.beam.sdk.metrics.DistributionResult
-
- getCount() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
-
- getCountEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getCounter(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
-
- getCounter(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the
Counter
that should be used for implementing the given
metricName
in
this container.
- getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
-
- getCounters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
-
- getCounters() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the counters that matched the filter.
- getCountryOfResidence() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
-
- getCpu() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- getCpuRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- getCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
-
Specifies whether the table should be created if it does not exist.
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
-
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
- getCreateFromSnapshot() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
If set, the snapshot from which the job should be created.
- getCreateTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets the create time.
- getCreator(Class<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
Get an object creator for an AVRO-generated SpecificRecord.
- getCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.auth.CredentialFactory
-
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.GcpCredentialFactory
-
Returns a default GCP Credentials instance, or null when it fails.
- getCredential() - Method in class org.apache.beam.sdk.extensions.gcp.auth.NoopCredentialFactory
-
- getCredentialFactoryClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The class of the credential factory that should be created and used to create credentials.
- getCsvFormat() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider.CsvToRow
-
- getCurrent() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- getCurrent() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
- getCurrent() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Gets the current record from the delegate reader.
- getCurrent() - Method in class org.apache.beam.sdk.io.Source.Reader
-
- getCurrentBlock() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
-
- getCurrentBlock() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
-
- getCurrentBlockOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the largest offset such that starting to read from that offset includes the current
block.
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
-
- getCurrentBlockSize() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns the size of the current block in bytes as it is represented in the underlying file,
if possible.
- getCurrentContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
- getCurrentDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
-
- getCurrentDirectory() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
-
- getCurrentDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns the ResourceId that represents the current directory of this ResourceId.
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
- getCurrentOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- getCurrentParent() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Gets the parent composite transform to the current transform, if one exists.
- getCurrentRecord() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
-
Returns the current record.
- getCurrentRecordId() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a unique identifier for the current record.
- getCurrentRelativeTime() - Method in interface org.apache.beam.sdk.state.Timer
-
- getCurrentRowAsStruct() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the record at the current pointer as a Struct.
- getCurrentSchemaPlus() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
-
Calcite-created SchemaPlus wrapper for the current schema.
- getCurrentSource() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- getCurrentSource() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
-
- getCurrentSource() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns a Source describing the same input that this Reader currently reads (including items already read).
- getCurrentSource() - Method in class org.apache.beam.sdk.io.FileBasedSource.FileBasedReader
-
- getCurrentSource() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- getCurrentSource() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns a Source describing the same input that this Reader currently reads (including items already read).
- getCurrentSource() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
- getCurrentTimestamp() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
By default, returns the minimum possible timestamp.
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.Source.Reader
-
Returns the timestamp associated with the current data item.
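For illustration, a minimal sketch of the Reader contract these methods belong to, as a runner or test harness might drive a BoundedSource; the source and options arguments are assumed to exist:

    import org.apache.beam.sdk.io.BoundedSource;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.joda.time.Instant;

    static <T> void drainSource(BoundedSource<T> source, PipelineOptions options) throws Exception {
      try (BoundedSource.BoundedReader<T> reader = source.createReader(options)) {
        for (boolean more = reader.start(); more; more = reader.advance()) {
          T element = reader.getCurrent();                  // the current record
          Instant timestamp = reader.getCurrentTimestamp(); // its associated timestamp
          // ... hand element/timestamp to downstream processing ...
        }
      }
    }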
- getCurrentTransform() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
- getCurrentTransform() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getCurrentTransform() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getCursor() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
-
- getCustomerId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
-
- getCustomerProvidedKey() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getCustomError(HttpRequestWrapper, HttpResponseWrapper) - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors
-
- getCustomError() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
-
- getDanglingDataSets() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
-
- getData() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets the data.
- getData() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getDataAsBytes() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getDataAsBytes() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the Snowflake database.
- getDatabase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getDatabase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getDatabaseAdminClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
-
- getDatabaseClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
-
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getDataCatalogEndpoint() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
-
DataCatalog endpoint.
- getDataCoder() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
-
- getDataflowClient() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
An instance of the Dataflow client.
- getDataflowEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Dataflow endpoint to use.
- getDataflowJobFile() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The path to write the translated Dataflow job specification out to at job submission time.
- getDataflowKmsKey() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
GCP Cloud KMS key for Dataflow pipelines and buckets created by GcpTempLocationFactory.
- getDataflowOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
- getDataflowRunnerInfo() - Static method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
- getDataflowServiceOptions() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Service options are set by the user and configure the service.
- getDataflowWorkerJar() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
- getDataResource() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
-
- getDataSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The expected schema of the Pub/Sub message.
- getDataset(PValue) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified Dataset resource by dataset ID.
- getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getDataSetOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
-
- getDatasetService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
- getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
-
- getDataSource() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getDataSourceConfiguration() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getDataSourceProviderFn() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets a DataSource provider function for connection credentials.
- getDataStreamOrThrow(String) - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
-
- getDataType() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
-
- getDateTime() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getDateTime(String) - Method in class org.apache.beam.sdk.values.Row
-
- getDateTime(int) - Method in class org.apache.beam.sdk.values.Row
-
- getDatumWriterFactory(DestinationT) - Method in class org.apache.beam.sdk.io.DynamicAvroDestinations
-
- getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- getDayOfMonth() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- getDbSize() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
-
- getDeadLetterQueue() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The Pub/Sub topic path to write failures.
- getDebuggee() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
-
The Cloud Debugger debuggee to associate with.
- getDecimal() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getDecimal(String) - Method in class org.apache.beam.sdk.values.Row
-
- getDecimal(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a BigDecimal value by field index; ClassCastException is thrown if the schema doesn't match.
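For illustration, a small sketch of the Row accessors above; the schema and field names are made up for the example:

    import java.math.BigDecimal;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder()
            .addStringField("item")
            .addDecimalField("price")
            .addDoubleField("weight")
            .build();

    Row row =
        Row.withSchema(schema)
            .addValues("widget", new BigDecimal("9.99"), 1.5)
            .build();

    String item = row.getString("item");   // access by field name
    BigDecimal price = row.getDecimal(1);  // access by field index
    Double weight = row.getDouble("weight");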
- getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
-
- getDef() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
-
- getDefault() - Static method in class org.apache.beam.runners.fnexecution.logging.Slf4jLogWriter
-
- getDefaultCoder(TypeDescriptor<?>, CoderRegistry) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
Returns the default coder for a given type descriptor.
- getDefaultDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns the default destination.
- getDefaultEnvironmentConfig() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getDefaultEnvironmentType() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getDefaultJobName() - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
-
- getDefaultOutputCoder(CoderRegistry, Coder) - Method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.TDigestQuantilesFn
-
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getDefaultOutputCoder() - Method in class org.apache.beam.sdk.io.Source
-
- getDefaultOutputCoder(CoderRegistry, Coder<Double>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
- getDefaultOutputCoder(CoderRegistry, Coder<V>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
- getDefaultOutputCoder(CoderRegistry, Coder<Integer>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
- getDefaultOutputCoder(CoderRegistry, Coder<Long>) - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
- getDefaultOutputCoder(CoderRegistry, Coder<InputT>) - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the Coder to use by default for output OutputT values, or null if it is not able to be inferred.
- getDefaultOutputCoder() - Method in class org.apache.beam.sdk.transforms.PTransform
-
- getDefaultOutputCoder(InputT) - Method in class org.apache.beam.sdk.transforms.PTransform
-
- getDefaultOutputCoder(InputT, PCollection<T>) - Method in class org.apache.beam.sdk.transforms.PTransform
-
- getDefaultOutputCoder(CoderRegistry, Coder<T>) - Method in class org.apache.beam.sdk.transforms.Sample.FixedSizedSampleFn
-
- getDefaultPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
-
- getDefaultSdkHarnessLogLevel() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls the default log level of all loggers without a log level override.
- getDefaultTimezone() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- getDefaultValue() - Method in interface org.apache.beam.sdk.values.PCollectionViews.HasDefaultValue
-
- getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
Returns the default value that was specified.
- getDefaultValue() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
Returns the default value that was specified.
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.PartitioningWindowFn
-
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Return a WindowMappingFn that returns the earliest window that contains the end of the main-input window.
- getDefaultWindowMappingFn() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns the default WindowMappingFn to use to map main input windows to side input windows.
- getDefaultWorkerLogLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
This option controls the default log level of all loggers without a log level override.
- getDeidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getDeidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getDelay() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.Delay
-
- getDelimiters() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
- getDeliveryMode() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getDependencies(RunnerApi.FunctionSpec, PipelineOptions) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionService.TransformProvider
-
- getDependencies(ConfigT, PipelineOptions) - Method in interface org.apache.beam.sdk.transforms.ExternalTransformBuilder
-
List the dependencies needed for this transform.
- getDescription() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field's description.
- getDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
-
- getDescriptorFromSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
-
Given a Beam Schema, returns a protocol-buffer Descriptor that can be used to write data using
the BigQuery Storage API.
- getDescriptorFromTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
-
Given a BigQuery TableSchema, returns a protocol-buffer Descriptor that can be used to write
data using the BigQuery Storage API.
- getDeserializer(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- getDeserializer(Map<String, ?>, boolean) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
-
- getDestination() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
Staged target for this file.
- getDestination(String, String) - Method in interface org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestinationProvider
-
- getDestination(UserT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns an object that represents at a high level the destination being written to.
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- getDestination() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Return the user destination object for this writer.
- getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns an object that represents at a high level which table is being written to.
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Returns the coder for DestinationT.
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the coder for DestinationT.
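For illustration, a hedged sketch of a BigQuery DynamicDestinations implementation that routes each element to a per-key table and supplies its destination coder; the project, dataset, table naming, and schema are illustrative assumptions, not part of the API entries above:

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.coders.Coder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations;
    import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.ValueInSingleWindow;

    // Routes each KV<String, Long> element to a per-key table (illustrative names throughout).
    class PerKeyDestinations extends DynamicDestinations<KV<String, Long>, String> {
      @Override
      public String getDestination(ValueInSingleWindow<KV<String, Long>> element) {
        return element.getValue().getKey();
      }

      @Override
      public TableDestination getTable(String key) {
        return new TableDestination("my-project:my_dataset.events_" + key, "Events for " + key);
      }

      @Override
      public TableSchema getSchema(String key) {
        return new TableSchema()
            .setFields(
                Arrays.asList(
                    new TableFieldSchema().setName("key").setType("STRING"),
                    new TableFieldSchema().setName("count").setType("INT64")));
      }

      @Override
      public Coder<String> getDestinationCoder() {
        return StringUtf8Coder.of(); // coder for the DestinationT values above
      }
    }

Such an implementation would typically be handed to BigQueryIO.write().to(new PerKeyDestinations()).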
- getDestinationFile(boolean, FileBasedSink.DynamicDestinations<?, DestinationT, ?>, int, FileBasedSink.OutputFileHints) - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- getDictionary(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getDictionary(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getDir() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
-
- getDirectoryTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
- getDisableMetrics() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getDiskSizeGb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Remote worker disk size, in gigabytes, or 0 to use the default size.
- getDistribution(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
-
- getDistribution() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- getDistribution(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Distribution that should be used for implementing the given metricName in this container.
- getDistributions() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the distributions that matched the filter.
- getDlqTransform(String) - Static method in class org.apache.beam.sdk.schemas.io.GenericDlq
-
- getDocToBulk() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Write
-
- getDocumentCount() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbIO.Read
-
- getDoFnRunner(PipelineOptions, DoFn<InputT, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<InputT>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.ParDoP
-
- getDoFnRunner(PipelineOptions, DoFn<KV<?, ?>, OutputT>, SideInputReader, AbstractParDoP.JetOutputManager, TupleTag<OutputT>, List<TupleTag<?>>, Coder<KV<?, ?>>, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, DoFnSchemaInformation, Map<String, PCollectionView<?>>) - Method in class org.apache.beam.runners.jet.processors.StatefulParDoP
-
- getDoFnSchemaInformation(DoFn<?, ?>, PCollection<?>) - Static method in class org.apache.beam.sdk.transforms.ParDo
-
Extract information on how the DoFn uses schemas.
- getDouble() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getDouble(String) - Method in class org.apache.beam.sdk.values.Row
-
- getDouble(int) - Method in class org.apache.beam.sdk.values.Row
-
- getDumpHeapOnOOM() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
If true, save a heap dump before killing a thread or process which is GC thrashing
or out of memory.
- getDynamicDestinations() - Method in class org.apache.beam.sdk.io.AvroSink
-
- getDynamicDestinations() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
- getDynamoDbClient() - Method in class org.apache.beam.sdk.io.aws2.dynamodb.BasicDynamoDbClientProvider
-
Deprecated.
- getDynamoDbClient() - Method in interface org.apache.beam.sdk.io.aws2.dynamodb.DynamoDbClientProvider
-
Deprecated.
- getEarliestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets the earliest HL7v2 send time.
- getEarliestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- getEarlyTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
-
- getElemCoder() - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
- getElement() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
-
- getElementByteSize() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
-
- getElementCoders() - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
- getElementConverters() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
The schema of the @Element parameter.
- getElementCount() - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
The number of elements after which this trigger may fire.
- getElements() - Method in class org.apache.beam.sdk.testing.TestStream.ElementEvent
-
- getElements() - Method in class org.apache.beam.sdk.transforms.Create.Values
-
- getElementType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a container type, returns the element type.
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
- getEmptyMatchTreatment() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
- getEmulatorHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
A host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
- getEmulatorHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getEnableCloudDebugger() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
-
Whether to enable the Cloud Debugger snapshot agent for the current job.
- getEnableSparkMetricSinks() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
-
- getEncodedElementByteSize(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
- getEncodedElementByteSize(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
-
- getEncodedElementByteSize(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- getEncodedElementByteSize(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.Coder
-
Returns the size in bytes of the encoded value using this coder.
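For illustration, a minimal sketch of a custom coder that overrides this method so element sizes can be observed without re-encoding; the fixed-width Long encoding is an illustrative choice:

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.beam.sdk.coders.CustomCoder;

    // Illustrative coder: big-endian fixed-width encoding of Long values.
    class FixedLongCoder extends CustomCoder<Long> {
      @Override
      public void encode(Long value, OutputStream outStream) throws IOException {
        new DataOutputStream(outStream).writeLong(value);
      }

      @Override
      public Long decode(InputStream inStream) throws IOException {
        return new DataInputStream(inStream).readLong();
      }

      @Override
      protected long getEncodedElementByteSize(Long value) {
        return 8; // fixed width, so there is no need to encode just to measure size
      }
    }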
- getEncodedElementByteSize(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
-
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
- getEncodedElementByteSize(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
Overridden to short-circuit the default StructuredCoder behavior of encoding and counting the bytes.
- getEncodedElementByteSize(String) - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
Returns the size in bytes of the encoded value using this coder.
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- getEncodedElementByteSize(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
- getEncodedElementByteSize(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
- getEncodedElementByteSize(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
-
- getEncodedElementByteSize(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- getEncodedElementByteSize(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
-
- getEncodedElementByteSize(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
-
- getEncodedElementByteSize(BigQueryInsertError) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
-
- getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- getEncodedElementByteSize(OffsetRange) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.AvroCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ByteCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.Coder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.CollectionCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DelegateCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DequeCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.DurationCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.FloatCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.InstantCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.IterableCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.KvCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.ListCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.MapCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SetCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.coders.VoidCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- getEncodedWindow() - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow
-
- getEncodingPositions() - Method in class org.apache.beam.sdk.schemas.Schema
-
Gets the encoding positions for this schema.
- getEnd() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
-
- getEnd() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
-
- getEndKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
- getEndOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the specified ending offset of the source.
- getEndpoint() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
Endpoint used to configure AWS service clients.
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The end time for querying this given partition.
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
-
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
-
- getEnvironment() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
Return the environment that the remote handles.
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
-
- getEnvironment() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
-
- getEnvironment() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getEnvironmentCacheMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getEnvironmentExpirationMillis() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getEnvironmentId() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getEnvironmentOption(PortablePipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
Return the value for the specified environment option or empty string if not present.
- getEnvironmentOptions() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getEquivalentFieldType(TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
Returns Beam equivalent of ClickHouse column type.
- getEquivalentSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
Returns Beam equivalent of ClickHouse schema.
- getError() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
-
- getError() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the parse error, if the file was parsed unsuccessfully.
- getError() - Method in class org.apache.beam.sdk.schemas.io.Failure
-
Information about the cause of the failure.
- getErrorAsString() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
-
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
-
- getErrorRowSchema(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
-
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.BoundedSource
-
An estimate of the total size (in bytes) of the data that would be read from this source.
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
-
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
-
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
-
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
-
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
- getEvents() - Method in class org.apache.beam.sdk.testing.TestStream
-
- getEx() - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
-
- getExecutables() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
-
- getExecutableStageIntermediateId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
- getExecuteStreamingSqlRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
-
- getExecutionEnvironment() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
-
- getExecutionModeForBatch() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getExecutionRetryDelay() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getExecutorService() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The ExecutorService instance to use to create threads; it can be overridden to specify an ExecutorService that is compatible with the user's environment.
- getExpansionPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
-
- getExpectedAssertions() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
-
- getExpectFileToNotExist() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
If true, the created file is expected to not exist.
- getExperiments() - Method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
- getExperimentValue(PipelineOptions, String) - Static method in interface org.apache.beam.sdk.options.ExperimentalOptions
-
Return the value for the specified experiment or null if not present.
- getExpiration() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
-
Required hash value (128-bit integer) to determine explicitly the shard a record is assigned
to based on the hash key range of each shard.
- getExplicitHashKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Optional hash value (128-bit integer) to determine explicitly the shard a record is assigned to
based on the hash key range of each shard.
- getExplicitHashKey(byte[]) - Method in interface org.apache.beam.sdk.io.kinesis.KinesisPartitioner
-
- getExpression(SchemaPlus, String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getExtendedSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getExtensionHosts() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- getExtensionRegistry() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- getExternalSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the external sorter type.
- getFactory(AwsOptions) - Static method in interface org.apache.beam.sdk.io.aws2.common.ClientBuilderFactory
-
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.AvroUtils.AvroConvertValueForGetter
-
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.AvroUtils.AvroConvertValueForSetter
-
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForGetter
-
- getFactory() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.ConvertValueForSetter
-
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
-
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed bodies with errors.
- getFailedBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets failed FhirBundleResponse wrapped inside HealthcareIOError.
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
-
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed file imports with errors.
- getFailedInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
-
- getFailedMessages() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
-
- getFailedMutations() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
-
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
-
- getFailedSearches() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets failed searches.
- getFailedStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- getFailedToParseLines() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
- getFailOnCheckpointingErrors() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getFailsafeValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the failsafe value of this FailsafeValueInSingleWindow.
- getFanout() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
-
- getFasterCopy() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getFhirBundleParameter() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- getFhirStore() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
- getField() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
-
- getField() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- getField(int) - Method in class org.apache.beam.sdk.schemas.Schema
-
Return a field by index.
- getField(String) - Method in class org.apache.beam.sdk.schemas.Schema
-
- getFieldAccessDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFnSchemaInformation
-
Effective FieldAccessDescriptor applied by DoFn.
- getFieldCount() - Method in class org.apache.beam.sdk.schemas.Schema
-
Return the count of fields.
- getFieldCount() - Method in class org.apache.beam.sdk.values.Row
-
Return the size of data fields.
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithGetters
-
- getFieldCount() - Method in class org.apache.beam.sdk.values.RowWithStorage
-
- getFieldId() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
-
- getFieldName() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
-
- getFieldNames() - Method in class org.apache.beam.sdk.schemas.Schema
-
Return the list of all field names.
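For illustration, a short sketch of the Schema field-lookup methods above; the schema and field names are illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.Field;

    Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();

    int fieldCount = schema.getFieldCount();    // 2
    Field first = schema.getField(0);           // the "name" field, by index
    Field age = schema.getField("age");         // lookup by name
    System.out.println(schema.getFieldNames()); // [name, age]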
- getFieldOptionById(int) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
-
- getFieldRef(CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
A function that finds a pattern reference recursively.
- getFieldRename() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
-
- getFields() - Method in class org.apache.beam.sdk.schemas.Schema
-
- getFields(Class<?>) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
- getFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
- getFieldType(Schema, CEPOperation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
- getFieldType(OneOfType.Value) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- getFieldTypes(Class<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
Get field types for an AVRO-generated SpecificRecord or a POJO.
- getFieldTypes(Class<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- getFieldTypes(Class<?>, Schema, FieldValueTypeSupplier) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- getFileDescriptor(String) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDomain
-
- getFileLocation() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the absolute path to the input file.
- getFilename() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
-
- getFileName() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- getFilename() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
-
- getFilename(BoundedWindow, PaneInfo, int, int, Compression) - Method in interface org.apache.beam.sdk.io.FileIO.Write.FileNaming
-
Generates the filename.
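For illustration, a hedged sketch of a custom FileIO.Write.FileNaming supplied as a lambda; the name pattern is an illustrative assumption:

    import org.apache.beam.sdk.io.Compression;
    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
    import org.apache.beam.sdk.transforms.windowing.PaneInfo;

    // Illustrative naming scheme: one file per shard, window end encoded in the name.
    FileIO.Write.FileNaming naming =
        (BoundedWindow window, PaneInfo pane, int numShards, int shardIndex, Compression compression) ->
            String.format(
                "output-%s-%05d-of-%05d%s",
                window.maxTimestamp(), shardIndex, numShards, compression.getSuggestedSuffix());

Such a naming function would typically be attached to a write with FileIO.Write#withNaming.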
- getFilename() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns the name of the file or directory denoted by this ResourceId.
- getFilenamePolicy(DestinationT) - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
- getFileOrPatternSpec() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- getFileOrPatternSpecProvider() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- getFilePattern() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
-
- getFilePattern() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
- getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the list of names of staged files.
- getFilesList() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Getter for the list of staged files which will be loaded to Snowflake.
- getFilesToStage() - Method in interface org.apache.beam.sdk.options.FileStagingOptions
-
List of local files to make available to workers.
- getFileSystem() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- getFinishBundleBeforeCheckpointing() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getFinishedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector finished processing this partition.
- getFlatJsonRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
Loads rows from BigQuery into Rows with the given Schema.
- getFlexRSGoal() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
This option controls Flexible Resource Scheduling mode.
- getFlinkConfDir() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getFlinkMaster() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
The URL of the Flink JobManager on which to execute pipelines.
- getFloat() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getFloat(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.FLOAT value by field name; IllegalStateException is thrown if the schema doesn't match.
- getFloat(int) - Method in class org.apache.beam.sdk.values.Row
-
- getFn() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
-
- getFn() - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
-
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
- getFn() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
- getFn() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
- getFnApiDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for dev SDK FnAPI container image.
- getFnApiEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the FnAPI environment's major version number.
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The expected format of the Pub/Sub message.
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
Returns a value in [0, 1] representing approximately what fraction of the current source this reader has read so far, or null if such an estimate is not available.
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
- getFractionConsumed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
- getFractionConsumed() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
- getFractionOfBlockConsumed() - Method in class org.apache.beam.sdk.io.BlockBasedSource.Block
-
Returns the fraction of the block already consumed, if possible, as a value in [0, 1].
- getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ThroughputEstimator
-
Returns the estimated throughput for a specified time.
- getFrom() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range start timestamp (inclusive).
- getFrom() - Method in class org.apache.beam.sdk.io.range.OffsetRange
-
- getFromRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
- getFromRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the fromRow conversion function.
- getFromRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts a Row object to the specified type.
- getFromRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts a Row object to the specified type.
- getFromRowFunction(Class<T>) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
- getFromRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema's fromRowFunction.
- getFullName(PTransform<?, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the full name of the transform currently being translated.
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
-
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
-
- getFunctionNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getFunctions(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getGapDuration() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
- getGauge(MetricName) - Method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
-
- getGauge(MetricName) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Gauge that should be used for implementing the given metricName in this container.
- getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
-
- getGauges(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
-
- getGauges() - Method in class org.apache.beam.sdk.metrics.MetricQueryResults
-
Return the metric results for the gauges that matched the filter.
- getGcloudCancelCommand(DataflowPipelineOptions, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
- getGcpCredential() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The credential instance that should be used to authenticate against GCP services.
- getGcpTempLocation() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
A GCS path for storing temporary files in GCP.
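For illustration, a minimal sketch of populating these GCP options when constructing a pipeline; the project id and bucket path are placeholders, normally supplied on the command line:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public static void main(String[] args) {
      GcpOptions options =
          PipelineOptionsFactory.fromArgs(args).withValidation().as(GcpOptions.class);
      // Placeholder values; normally supplied via --project and --gcpTempLocation flags.
      options.setProject("my-project");
      options.setGcpTempLocation("gs://my-bucket/temp");
      Pipeline pipeline = Pipeline.create(options);
      // ... build and run the pipeline ...
    }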
- getGcsEndpoint() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
GCS endpoint to use.
- getGcsPerformanceMetrics() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
If true, reports metrics of certain operations, such as batch copies.
- getGcsUploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The buffer size (in bytes) to use when uploading files to GCS.
- getGcsUtil() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The GcsUtil instance that should be used to communicate with Google Cloud Storage.
- getGCThrashingPercentagePerPeriod() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The GC thrashing threshold percentage.
- getGenericRecordToRowFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
- getGetters(Class<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
Get generated getters for an AVRO-generated SpecificRecord or a POJO.
- getGetters(Class<?>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- getGetters(Class<?>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- getGetters() - Method in class org.apache.beam.sdk.values.RowWithGetters
-
- getGetterTarget() - Method in class org.apache.beam.sdk.values.RowWithGetters
-
- getGoogleApiTrace() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions
-
This option enables tracing of API calls to Google services used within the Apache Beam SDK.
- getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- getGrammarFileName() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- getGroupingTableMaxSizeMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in MB) of each grouping table used to pre-combine elements.
- getHasError() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
- getHashCode() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
-
- getHdfsConfiguration() - Method in interface org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions
-
- getHeaderAccessor() - Static method in class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
-
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- getHeaderColumns() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getHeaders() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- getHeaders() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getHeartbeatMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The number of milliseconds the stream must be idle before a heartbeat record is emitted in the
change stream query.
- getHighWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
-
- getHintMaxNumWorkers() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
A hint to the QoS system for the intended max number of workers for a pipeline.
- getHistogram(MetricName, HistogramData.BucketType) - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the Histogram that should be used for implementing the given metricName in this container.
- getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
-
- getHistograms(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
-
- getHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Fetches an HL7v2 message by its name from an HL7v2 store.
- getHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 message.
- getHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets HL7v2 store.
- getHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 store.
- getHoldability() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getHost() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
-
- getHost() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
- getHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getHttpClient() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getHttpClientConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
- getHttpPipeline() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getHTTPWriteTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getId() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get an id used to represent this bundle.
- getId() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Returns an id used to represent this bundle.
- getId() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
- getId() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
-
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
-
- getId() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Order
-
- getId() - Method in interface org.apache.beam.sdk.fn.IdGenerator
-
- getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
-
- getId() - Method in class org.apache.beam.sdk.values.TupleTag
-
Returns the id of this TupleTag.
- getId() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
-
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message
attributes, specifies the name of the attribute containing the unique identifier.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the id attribute.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the id attribute.
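For context, the id attribute is normally configured on the user-facing PubsubIO transform rather than on these internal classes; a small sketch follows (the subscription path and attribute name are made up):

    // org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
    PCollection<String> messages =
        pipeline.apply(
            "ReadPubsub",
            PubsubIO.readStrings()
                .fromSubscription("projects/my-project/subscriptions/my-sub")
                // Deduplicate on a publisher-supplied unique id carried as a message attribute.
                .withIdAttribute("unique_message_id"));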
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
-
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
-
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
-
- getIdentifier() - Method in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Date
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.DateTime
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PassThroughLogicalType
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.Time
-
- getIdentifier() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
-
- getIdentifier() - Method in interface org.apache.beam.sdk.schemas.Schema.LogicalType
-
The unique identifier for this type.
- getImpersonateServiceAccount() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
All API requests will be made as the given service account or target service account in an
impersonation delegation chain instead of the currently selected account.
- getImplementor() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
- getImplementor(boolean) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
-
- getImplementor() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
-
- getInboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
-
Deprecated.
- getInboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer2
-
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getIncompatibleGlobalWindowErrorMessage() - Method in interface org.apache.beam.sdk.transforms.CombineFnBase.GlobalCombineFn
-
Returns the error message for unsupported default values in Combine.globally().
- getIncompatibleGlobalWindowErrorMessage() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
-
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
-
- getIndex() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexInputRef
-
- getIndex() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- getIndex(TupleTag<?>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the index for the given tuple tag if the tag is present in this schema, or -1 if it isn't.
- getIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
The zero-based index of this trigger firing that produced this pane.
- getIndexes() - Method in class org.apache.beam.sdk.extensions.sql.impl.utils.SerializableRexFieldAccess
-
- getInferMaps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
-
Controls whether to use the map or row FieldType for a TableSchema field that appears to
represent a map (an array of structs containing only key and value fields).
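A sketch of how this option might be supplied when converting a BigQuery TableSchema to a Beam Schema; it assumes SchemaConversionOptions exposes a builder with setInferMaps and that BigQueryUtils.fromTableSchema has an overload accepting these options:

    // org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils (overload with options assumed)
    BigQueryUtils.SchemaConversionOptions options =
        BigQueryUtils.SchemaConversionOptions.builder()
            .setInferMaps(true) // treat repeated key/value structs as MAP fields
            .build();
    Schema beamSchema = BigQueryUtils.fromTableSchema(tableSchema, options);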
- getInflightWaitSeconds() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
If the previous call to appendRows blocked due to flow control, returns how long the call
blocked for.
- getIngestManager() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Getter for the ingest manager, which provides the API for loading data in streaming mode and
retrieving reports about loaded data.
- getInitialBackoff() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial backoff duration to be used before retrying a request for the first time.
- getInitialRestriction(Map<String, String>) - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
-
- getInitialRestriction(PulsarSourceDescriptor) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
-
- getInitialRestriction(InputT) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
-
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
-
- getInitialWatermarkEstimatorState(Instant) - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
-
- getInput(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
- getInput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getInput(PTransform<T, ?>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getInput(RelNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
-
- getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2BatchTranslationContext
-
- getInputDataSet(PValue) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getInputDoc() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
- getInputFile() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
-
- getinputFormatClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getinputFormatKeyClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getinputFormatValueClass() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getInputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
- getInputReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the
remote environment.
- getInputReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
Get a map of PCollection ids to receivers which consume input elements, forwarding them to the
remote environment.
- getInputs(PTransform<InputT, ?>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the input of the transform currently being translated.
- getInputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getInputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getInputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getInputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
-
- getInputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getInputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a TypeDescriptor capturing what is known statically about the input type of this
CombineFn instance's most-derived class.
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Returns a TypeDescriptor capturing what is known statically about the input type of this
DoFn instance's most-derived class.
- getInputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
- getInputValueCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the values of the input to this transform.
- getInsertBundleParallelism() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getInsertCount() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.GloballyAsSingletonView
-
- getInsertErrors() - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
-
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- getInspectConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- getInspectTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getInstance() - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
-
- getInstance() - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
-
- getInstance() - Static method in class org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator
-
- getInstance() - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
-
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getInstructionId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
-
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
- getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessEnvironment
-
- getInstructionRequestHandler() - Method in interface org.apache.beam.runners.fnexecution.environment.RemoteEnvironment
-
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.RemoteEnvironment.SimpleRemoteEnvironment
-
- getInstructionRequestHandler() - Method in class org.apache.beam.runners.fnexecution.environment.StaticRemoteEnvironment
-
- getInt(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getInt(Map<String, Object>, String, Integer) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getInt16() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getInt16(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT16 value by field name; IllegalStateException is thrown if the schema
doesn't match.
- getInt16(int) - Method in class org.apache.beam.sdk.values.Row
-
- getInt32() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getInt32(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT32 value by field name; IllegalStateException is thrown if the schema
doesn't match.
- getInt32(int) - Method in class org.apache.beam.sdk.values.Row
-
- getInt64() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getInt64(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.INT64 value by field name; IllegalStateException is thrown if the schema
doesn't match.
- getInt64(int) - Method in class org.apache.beam.sdk.values.Row
-
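A minimal sketch of the typed Row getters documented above; the field names and values are arbitrary:

    // org.apache.beam.sdk.schemas.Schema, org.apache.beam.sdk.values.Row
    Schema schema =
        Schema.builder()
            .addInt16Field("rank")
            .addInt32Field("userId")
            .addInt64Field("views")
            .build();
    Row row = Row.withSchema(schema).addValues((short) 3, 42, 1_000L).build();
    short rank = row.getInt16("rank");  // access by field name
    int userId = row.getInt32(1);       // access by field index
    long views = row.getInt64("views"); // IllegalStateException if the field type doesn't match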
- getInterface() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
-
- getInterfaces() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of TypeDescriptors, one for each interface implemented by this class.
- getIo() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- getIr() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
-
- getIrOptions() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
-
- getIsLocalChannelProvider() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getIterable(String) - Method in class org.apache.beam.sdk.values.Row
-
Get an iterable value by field name; IllegalStateException is thrown if the schema doesn't match.
- getIterable(int) - Method in class org.apache.beam.sdk.values.Row
-
Get an iterable value by field index; IllegalStateException is thrown if the schema doesn't match.
- getIterableComponentType(TypeDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
For an array T[] or a subclass of Iterable, return a TypeDescriptor describing T.
- getJarPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
Optional Beam filesystem path to the jar containing the bytecode for this function.
- getJavaClass(RelDataType) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamJavaTypeFactory
-
- getJavaClassLookupAllowlist() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
-
- getJavaClassLookupAllowlistFile() - Method in interface org.apache.beam.sdk.expansion.service.ExpansionServiceOptions
-
- getJAXBClass() - Method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
- getJdkAddOpenModules() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Open modules needed for reflection that access JDK internals with Java 9+
- getJdkAddOpenModules() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Open modules needed for reflection that access JDK internals with Java 9+
- getJetDefaultParallelism() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
-
- getJetLocalMode() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
-
- getJetProcessorsCooperative() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
-
- getJetServers() - Method in interface org.apache.beam.runners.jet.JetPipelineOptions
-
- getJfrRecordingDurationSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
- getJmsCorrelationID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsDeliveryMode() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsDestination() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsExpiration() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsMessageID() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsPriority() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsRedelivered() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsReplyTo() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsTimestamp() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJmsType() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getJob(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
Gets the Dataflow Job with the given jobId.
- getJob() - Method in exception org.apache.beam.runners.dataflow.DataflowJobException
-
Returns the failed job.
- getJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
-
- getJob(JobReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
- getJob(JobReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
-
- getJobEndpoint() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getJobFileZip() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getJobId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the id of this job.
- getJobId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the Dataflow job.
- getJobId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
-
- getJobInfo() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
-
- getJobInfo() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
-
- getJobMessages(String, long) - Method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
Return job messages sorted in ascending order by timestamp.
- getJobMetrics(String) - Method in class org.apache.beam.runners.dataflow.DataflowClient
-
- getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- getJobMetrics(JobApi.GetJobMetricsRequest, StreamObserver<JobApi.GetJobMetricsResponse>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
-
- getJobMonitoringPageURL(String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
- getJobMonitoringPageURL(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.MonitoringUtil
-
- getJobName() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
- getJobs(JobApi.GetJobsRequest, StreamObserver<JobApi.GetJobsResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- getJobServerConfig() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
-
- getJobServerDriver() - Method in interface org.apache.beam.runners.portability.testing.TestPortablePipelineOptions
-
- getJobServerTimeout() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getJobServerUrl() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver
-
- getJobService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
- getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
-
- getJobType() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getJoinColumns(boolean, List<Pair<RexNode, RexNode>>, int, Schema) - Static method in class org.apache.beam.sdk.extensions.sql.impl.transform.BeamJoinTransforms
-
- getJsonClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
- getJsonFactory() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
-
- getJsonFactory() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- getJsonTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
- getJsonToRowWithErrFn() - Method in class org.apache.beam.sdk.transforms.JsonToRow.JsonToRowWithErrFn.ParseWithError
-
- getKey() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the key that was output in the most recent GroupByKey in the execution of this bundle.
- getKey() - Method in class org.apache.beam.runners.local.StructuralKey
-
- getKey() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
-
- getKey() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
- getKey() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
-
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The key for the display item.
- getKey() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The key for the display item.
- getKey() - Method in class org.apache.beam.sdk.values.KV
-
Returns the key of this KV.
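For reference, a trivial sketch of constructing a KV and reading its key and value:

    // org.apache.beam.sdk.values.KV
    KV<String, Long> pair = KV.of("user-1", 42L);
    String key = pair.getKey();   // "user-1"
    Long value = pair.getValue(); // 42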
- getKey() - Method in class org.apache.beam.sdk.values.ShardedKey
-
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
-
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
-
- getKeyCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
- getKeyCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getKeyCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the keys of the input to this transform, which is also used as the Coder of
the keys of the output of this transform.
- getKeyCoder() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- getKeyedCollections() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- getKeyedResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources with input SearchParameter key.
- getKeyRange() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
Returns the range of keys that will be read from the table.
- getKeys() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- getKeySet() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- getKeysJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The primary keys of this specific modification.
- getKeystorePassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getKeystorePath() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getKeyTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getKeyTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getKind() - Method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
-
- getKind() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
-
- getKind() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
-
Return the display name for this factory.
- getKind() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
-
- getKindString() - Method in class org.apache.beam.sdk.io.BoundedReadFromUnboundedSource
-
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Bounded
-
- getKindString() - Method in class org.apache.beam.sdk.io.Read.Unbounded
-
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
- getKindString() - Method in class org.apache.beam.sdk.transforms.Combine.PerKeyWithHotKeyFanout
-
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
- getKindString() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
- getKindString() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns the name to use by default for this PTransform (not including the names of any enclosing
PTransforms).
- getKindString() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
- getKindString() - Method in class org.apache.beam.sdk.values.PValueBase
-
Returns a String capturing the kind of this PValueBase.
- getKinesisClient() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.AWSClientsProvider
-
Deprecated.
- getKinesisClient() - Method in interface org.apache.beam.sdk.io.kinesis.AWSClientsProvider
-
- getKV() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the optional label for an item.
- getLabel() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional label for an item.
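Labels and link URLs are usually attached when a transform registers its display data; a short sketch with made-up keys and values:

    // org.apache.beam.sdk.transforms.display.DisplayData, inside a PTransform or DoFn
    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
      super.populateDisplayData(builder);
      builder.add(
          DisplayData.item("tableName", "events")
              .withLabel("Table Name")
              .withLinkUrl("https://example.com/tables/events"));
    }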
- getLabels() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Labels that will be applied to the billing records for this job.
- getLabels() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets labels.
- getLanguageOptions() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- getLastEmitted() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Returns the last value emitted by the reader.
- getLastFieldId() - Method in class org.apache.beam.sdk.schemas.Schema.Builder
-
- getLastWatermarkedBatchTime() - Static method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- getLatencyTrackingInterval() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getLatestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- getLatestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- getLateTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
-
- getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
-
- getLeaves() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getLegacyDevContainerVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the version/tag for the legacy SDK FnAPI container image.
- getLegacyEnvironmentMajorVersion() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
Provides the legacy environment's major version number.
- getLength() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
-
- getLength() - Method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
- getLimitCountOfSortRel() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
-
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the optional link URL for an item.
- getLinkUrl() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional link URL for an item.
- getList() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
-
- getListeners() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
-
- getListOfMaps(Map<String, Object>, String, List<Map<String, Object>>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- getLoadBalanceBundles() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- getLocalJobServicePortFile() - Method in interface org.apache.beam.runners.portability.testing.TestUniversalRunner.Options
-
A file containing the job service port, since Gradle needs to know this filename statically
to provide it in Beam testing options.
- getLocalValue() - Method in class org.apache.beam.runners.flink.metrics.MetricsAccumulator
-
- getLocation() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
- getLogicalType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getLogicalType(Class<LogicalTypeT>) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
Helper function for retrieving the concrete logical type subclass.
- getLogicalTypeName() - Method in class org.apache.beam.sdk.coders.AvroCoder.JodaTimestampConversion
-
- getLogicalTypeValue(String, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
- getLogicalTypeValue(int, Class<T>) - Method in class org.apache.beam.sdk.values.Row
-
Returns the Logical Type input type for this field.
- getLoginTimeout() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getLoginTimeout() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getLong(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getLong(Map<String, Object>, String, Long) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getLowWatermark() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
-
- getMainOutputTag() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
-
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- getMainOutputTag() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
- getMainTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
The main trigger, which will continue firing until the "until" trigger fires.
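An OrFinallyTrigger is normally built via Trigger.orFinally; a minimal sketch of the composition described here:

    // org.apache.beam.sdk.transforms.windowing
    Trigger trigger =
        Repeatedly.forever(AfterPane.elementCountAtLeast(100)) // main trigger: keeps firing
            .orFinally(AfterWatermark.pastEndOfWindow());      // "until" trigger: fires once and stops the main trigger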
- getMap() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
-
- getMap(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a MAP value by field name; IllegalStateException is thrown if the schema doesn't match.
- getMap(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a MAP value by field index; IllegalStateException is thrown if the schema doesn't match.
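A small sketch of reading a MAP field from a Row; the schema and values are illustrative:

    // org.apache.beam.sdk.schemas.Schema.FieldType, org.apache.beam.sdk.values.Row, java.util.Map
    Schema schema =
        Schema.builder().addMapField("wordCounts", FieldType.STRING, FieldType.INT64).build();
    Row row =
        Row.withSchema(schema).addValue(java.util.Map.of("beam", 3L, "spark", 1L)).build();
    Map<String, Long> counts = row.getMap("wordCounts"); // null if the field value is null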
- getMapKeyType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a map type, returns the key type.
- getMapKeyType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
-
- getMapping() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
-
- getMapType(TypeDescriptor, int) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
- getMapValueType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
If the field is a map type, returns the value type.
- getMapValueType() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getMatcher() - Method in class org.apache.beam.sdk.extensions.gcp.util.CustomHttpErrors.MatcherAndError
-
- getMatchUpdatedFiles() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
- getMaterialization() - Method in class org.apache.beam.sdk.transforms.ViewFn
-
Gets the materialization of this ViewFn.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
-
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
-
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
-
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
-
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- getMaterialization() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
- getMax() - Method in class org.apache.beam.sdk.metrics.DistributionResult
-
- getMaxAttempts() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of times a request will be attempted for a complete successful result.
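These QoS knobs are set through the RpcQosOptions builder; a sketch, assuming builder methods that mirror the getters listed here (withMaxAttempts, withInitialBackoff, withHintMaxNumWorkers):

    // org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions; Duration here is assumed to be org.joda.time.Duration
    RpcQosOptions rpcQosOptions =
        RpcQosOptions.newBuilder()
            .withMaxAttempts(5)
            .withInitialBackoff(Duration.standardSeconds(5))
            .withHintMaxNumWorkers(100)
            .build();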
- getMaxBufferingDuration() - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
-
- getMaxBufferingDurationMilliSec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getMaxBundlesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Maximum number of bundles outstanding from windmill before the worker stops requesting.
- getMaxBundleSize() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getMaxBundleTimeMills() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getMaxBytesFromWindmillOutstanding() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Maximum number of bytes outstanding from windmill before the worker stops requesting.
- getMaxCacheMemoryUsage(PipelineOptions) - Method in class org.apache.beam.sdk.options.SdkHarnessOptions.DefaultMaxCacheMemoryUsageMb
-
- getMaxCacheMemoryUsage(PipelineOptions) - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions.MaxCacheMemoryUsageMb
-
- getMaxCacheMemoryUsageMb() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in MB) for the process wide cache within the SDK harness.
- getMaxCacheMemoryUsageMbClass() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
An instance of this class will be used to specify the maximum amount of memory to allocate to a
cache within an SDK harness instance.
- getMaxCacheMemoryUsagePercent() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
Size (in % [0 - 100]) for the process wide cache within the SDK harness.
- getMaxConditionCost() - Method in interface org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
-
The maximum cost (as a ratio of CPU time) allowed for evaluating conditional snapshots.
- getMaxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- getMaxEndOffset(PipelineOptions) - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the actual ending offset of the current source.
- getMaxInvocationHistory() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
-
- getMaxNumericPrecision() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
-
- getMaxNumericScale() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
-
- getMaxNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
The maximum number of workers to use for the workerpool.
- getMaxParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getMaxPrecision(SqlTypeName) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
-
- getMaxRecordsPerBatch() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getMaxStreamingBatchSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getMaxStreamingRowsToBatch() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getMD5() - Method in class org.apache.beam.sdk.io.aws2.s3.SSECustomerKey
-
- getMean() - Method in class org.apache.beam.sdk.metrics.DistributionResult
-
- getMean() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
-
- getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the configured size of the memory buffer.
- getMemoryMB() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the configured size of the memory buffer.
- getMessage() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
-
- getMessage() - Method in exception org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
-
- getMessageBacklog() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
Current backlog in messages (latest offset of the partition - last processed record offset).
- getMessageConverter(DestinationT, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
-
- getMessageId() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
SQS message id.
- getMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the messageId of the message populated by Cloud Pub/Sub.
- getMessageId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getMessageRecord() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
-
- getMessages() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
-
- getMessageStream(JobApi.JobMessagesRequest, StreamObserver<JobApi.JobMessagesResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- getMessageType() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- getMessageType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets message type.
- getMetaData() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getMetadata(DestinationT) - Method in class org.apache.beam.sdk.io.DynamicAvroDestinations
-
Return AVRO file metadata for a given destination.
- getMetadata() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
Returns the MatchResult.Metadata of the file.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
String representing the metadata of the Bundle to be written.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the gathered metadata for the change stream query so far.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The connector execution metadata for this record.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
-
- getMetadata() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns the extracted metadata.
- getMetadata(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getMetadata() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
-
- getMetadataCoder() - Method in class org.apache.beam.sdk.io.ReadableFileCoder
-
- getMetadataQuery() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
-
- getMetadataString(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getMetadataTable() - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
-
Returns the name of the metadata table.
- getMetaStore() - Method in class org.apache.beam.sdk.extensions.sql.BeamSqlCli
-
- getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
-
- getMeters(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
-
- getMethod() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- getMethods(Class) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
Returns the list of non-private, non-protected, non-static methods in the class, caching the
results.
- getMethodsMap(Class) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
- getMetrics() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
- getMetricsContainer(String) - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer
-
- getMetricsGraphiteHost() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
-
- getMetricsGraphitePort() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
-
- getMetricsHttpSinkUrl() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
-
- getMetricsMapName(long) - Static method in class org.apache.beam.runners.jet.metrics.JetMetricsContainer
-
- getMetricsPushPeriod() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
-
- getMetricsSink() - Method in interface org.apache.beam.sdk.metrics.MetricsOptions
-
- getMimeType() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- getMimeType() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
-
Returns the MIME type that should be used for the files that will hold the output data.
- getMin() - Method in class org.apache.beam.sdk.metrics.DistributionResult
-
- getMinBundleSize() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the minimum bundle size that should be used when splitting the source into sub-sources.
- getMinCpuPlatform() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies a minimum CPU platform for VM instances.
- getMinimumTimestamp() - Method in interface org.apache.beam.runners.local.Bundle
-
Return the minimum timestamp among elements in this bundle.
- getMinPauseBetweenCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getMinReadTimeMillis() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getMode() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
- getMode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
-
- getMode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
-
- getMode() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getModeNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- getModifiableCollection() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
-
- getMods() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The modifications within this record.
- getModType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of operation that caused the modifications within this record.
- getMonitoringInfos() - Method in interface org.apache.beam.sdk.metrics.MetricsContainer
-
Return the cumulative values for any metrics in this container as MonitoringInfos.
- getMonthOfYear() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- getMutableOutput(TupleTag<T>) - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- getName() - Method in enum org.apache.beam.io.debezium.Connectors
-
The name of this connector class.
- getName(int) - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- getName() - Method in class org.apache.beam.sdk.extensions.sql.example.model.Customer
-
- getName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
-
- getName() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
-
- getName() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
- getName() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets name.
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
-
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
-
- getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The name of the column.
- getName() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
-
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
-
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingDistribution
-
- getName() - Method in class org.apache.beam.sdk.metrics.DelegatingHistogram
-
- getName() - Method in interface org.apache.beam.sdk.metrics.Metric
-
- getName() - Method in class org.apache.beam.sdk.metrics.MetricName
-
The name of this metric.
- getName() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
- getName() - Method in class org.apache.beam.sdk.metrics.MetricResult
-
Return the name of the metric.
- getName() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the field name.
- getName() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns the field name.
- getName() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns the transform name.
- getName() - Method in class org.apache.beam.sdk.values.PCollection
-
- getName() - Method in interface org.apache.beam.sdk.values.PValue
-
Returns the name of this PValue.
- getName() - Method in class org.apache.beam.sdk.values.PValueBase
-
- getNameCount() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- getNameOverride(String, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
- getNameOverride() - Method in class org.apache.beam.sdk.transforms.Top.TopCombineFn
-
- getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricName
-
The namespace associated with this metric.
- getNamespace() - Method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
-
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The namespace for the display item.
- getNamespace() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The namespace for the display item.
- getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
- getNeedsMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
- getNestedFieldsAccessed() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor
-
- getNetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
- getNetworkTimeout() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getNewBigqueryClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
- getNewValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The new column values after the modification was applied.
- getNextOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
-
- getNextWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
-
- getNodeStats(RelNode) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelMetadataQuery
-
- getNodeStats() - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata
-
- getNodeStats(RelNode, RelMetadataQuery) - Method in interface org.apache.beam.sdk.extensions.sql.impl.planner.NodeStatsMetadata.Handler
-
- getNodeStats(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.RelMdNodeStats
-
- getNodeStats(RelNode, BeamRelMetadataQuery) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils
-
- getNonCumulativeCost(RelNode, RelMetadataQuery) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner.NonCumulativeCostImpl
-
- getNonSpeculativeIndex() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
The zero-based index of this trigger firing among non-speculative panes.
- getNonWildcardPrefix(String) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns the prefix portion of the glob that doesn't contain wildcards.
- getNotSupported() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTableFilter
-
Identify parts of a predicate that are not supported by the IO push-down capabilities, to be
preserved in a Calc following BeamIOSourceRel.
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.DefaultTableFilter
-
Since predicate push-down is assumed not to be supported by default, returns an unchanged list of
filters to be preserved.
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
-
- getNotSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
-
- getNullable() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getNullableValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Workaround for autovalue code generation, which does not allow type variables to be
instantiated with nullable actual parameters.
- getNullFirst() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
-
- getNullParams() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
-
- getNum() - Method in class org.apache.beam.runners.spark.io.ConsoleIO.Write.Unbound
-
- getNumber() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Optionally returns the field index.
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
-
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- getNumber() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- getNumberOfExecutionRetries() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getNumberOfPartitionsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of partitions for the given transaction.
- getNumberOfRecordsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of data change records for the given transaction.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total number of records read from the change stream so far.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The number of records read in the partition change stream query before reading this record.
- getNumberOfWorkerHarnessThreads() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Number of threads to use on the Dataflow worker harness.
- getNumberOverride(int, T) - Static method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
-
- getNumConcurrentCheckpoints() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getNumEntities(PipelineOptions, String, String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns the number of entities available for reading.
- getNumExtractJobCalls() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
-
- getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- getNumRows(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Returns the number of rows for a given table.
- getNumSampledBytesPerFile() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
- getNumShardsProvider() - Method in class org.apache.beam.sdk.io.WriteFiles
-
- getNumStorageWriteApiStreamAppendClients() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getNumStorageWriteApiStreams() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getNumStreamingKeys() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getNumWorkers() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Number of workers to use when executing the Dataflow job.
- getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
-
- getOAuthToken() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getOauthToken() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getOauthToken() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getObject(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getObject(Map<String, Object>, String, Map<String, Object>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getObject() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the object name associated with this GCS path, or an empty string if no object is
specified.
- getObject(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
- getObjectReuse() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getObjects(List<GcsPath>) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
- getObservedTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
-
- getOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- getOffset() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
-
- getOldValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The old column values before the modification was applied.
- getOnCreateMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
-
- getOneOfSchema() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
Returns the schema of the underlying Row that is used to represent the union.
- getOneOfTypes() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
- getOnly(TupleTag<V>) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
If there is a singleton value for the given tag, returns it.
- getOnly(String) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- getOnly(TupleTag<V>, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
If there is a singleton value for the given tag, returns it.
- getOnly(String, V) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
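A minimal usage sketch (not part of the Javadoc) of CoGbkResult.getOnly after a CoGroupByKey; the tags, the input collections emails and phones (both assumed to be PCollection&lt;KV&lt;String, String&gt;&gt;), and the "unknown" default are hypothetical. getOnly returns the single value for the tag, falls back to the default when the tag has no value, and fails if more than one value is present.

```java
// Classes come from org.apache.beam.sdk.transforms, .transforms.join and .values.
TupleTag<String> emailsTag = new TupleTag<>();
TupleTag<String> phonesTag = new TupleTag<>();

// Join the two keyed collections on their keys.
PCollection<KV<String, CoGbkResult>> joined =
    KeyedPCollectionTuple.of(emailsTag, emails)
        .and(phonesTag, phones)
        .apply(CoGroupByKey.create());

PCollection<String> contacts =
    joined.apply(
        MapElements.into(TypeDescriptors.strings())
            .via((KV<String, CoGbkResult> kv) -> {
              CoGbkResult result = kv.getValue();
              // Singleton value per tag, or the supplied default when absent.
              String email = result.getOnly(emailsTag, "unknown");
              String phone = result.getOnly(phonesTag, "unknown");
              return kv.getKey() + "," + email + "," + phone;
            }));
```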
- getOnSuccessMatcher() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
-
- getOnTimeBehavior() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getOperand0() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
-
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
-
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
-
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateExternalTable
-
- getOperandList() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
-
- getOperands() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
-
- getOperation() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
-
- getOperationMode() - Method in class org.apache.beam.runners.twister2.BeamBatchTSetEnvironment
-
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
-
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCheckConstraint
-
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlColumnDeclaration
-
- getOperator() - Method in class org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateFunction
-
- getOptionNames() - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
- getOptions() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getOptions() - Method in class org.apache.beam.sdk.Pipeline
-
- getOptions() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
- getOptions() - Method in class org.apache.beam.sdk.schemas.Schema
-
- getOptions() - Method in class org.apache.beam.sdk.testing.TestPipeline
-
- getOptionsId() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Provides a process-wide unique ID for this PipelineOptions object, assigned at graph construction time.
- getOrCreate(SpannerConfig) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
-
- getOrCreateReader(PipelineOptions, CheckpointMarkT) - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
-
- getOrCreateSession(SparkStructuredStreamingPipelineOptions) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory
-
- getOrDefault(K, V) - Method in interface org.apache.beam.sdk.state.MapState
-
A deferred lookup.
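A minimal sketch (not from the Javadoc) of the deferred lookup inside a stateful DoFn; the state id "counts", the element types, and the counting logic are assumptions for illustration. The lookup itself only happens when read() is called on the returned ReadableState.

```java
// Sketch only: classes come from org.apache.beam.sdk.state, .transforms, .coders and .values.
class CountValuesFn extends DoFn<KV<String, String>, KV<String, Long>> {
  @StateId("counts")
  private final StateSpec<MapState<String, Long>> countsSpec =
      StateSpecs.map(StringUtf8Coder.of(), VarLongCoder.of());

  @ProcessElement
  public void process(
      @Element KV<String, String> element,
      @StateId("counts") MapState<String, Long> counts,
      OutputReceiver<KV<String, Long>> out) {
    // getOrDefault is deferred: the actual lookup happens on read().
    long current = counts.getOrDefault(element.getValue(), 0L).read();
    counts.put(element.getValue(), current + 1);
    out.output(KV.of(element.getValue(), current + 1));
  }
}
```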
- getOrdinalPosition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The position of the column in the table.
- getOutboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer
-
Deprecated.
- getOutboundObserver() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer2
-
- getOutName(int) - Method in class org.apache.beam.sdk.values.TupleTag
-
If this TupleTag is tagging output outputIndex of a PTransform, returns the name that should be used by default for the output.
- getOutput(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
- getOutput() - Method in interface org.apache.beam.runners.spark.structuredstreaming.examples.WordCount.WordCountOptions
-
- getOutput() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getOutput(PTransform<?, T>) - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
-
- getOutputCoder() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource
-
- getOutputCoder(SerializableFunction<InputT, OutputT>, Coder<InputT>) - Method in class org.apache.beam.sdk.coders.CoderRegistry
-
- getOutputCoder() - Method in class org.apache.beam.sdk.io.AvroSource
-
- getOutputCoder() - Method in class org.apache.beam.sdk.io.CompressedSource
-
Returns the delegate source's output coder.
- getOutputCoder() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.BoundedElasticsearchSource
-
- getOutputCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.HadoopInputFormatBoundedSource
-
- getOutputCoder() - Method in class org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.Read.BoundedGridFSSource
-
- getOutputCoder() - Method in class org.apache.beam.sdk.io.Source
-
Returns the Coder to use for the data read from this source.
- getOutputCoder() - Method in class org.apache.beam.sdk.io.xml.XmlSource
-
- getOutputCoders() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getOutputCoders() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getOutputExecutablePath() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getOutputFile() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
- getOutputFilePrefix() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
-
Output file prefix.
- getOutputId(PipelineNode.PTransformNode) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
- getOutputKvCoder(Coder<KV<K, V>>) - Static method in class org.apache.beam.sdk.transforms.GroupByKey
-
Returns the Coder of the output of this transform.
- getOutputs(PTransform<?, OutputT>) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the output of the transform currently being translated.
- getOutputs() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getOutputs() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getOutputSchema(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
- getOutputStrategyInternal(WindowingStrategy<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
- getOutputStream() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
-
- getOutputTVariable() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
-
- getOutputType() - Method in class org.apache.beam.sdk.extensions.sql.TypedCombineFnDelegate
-
- getOutputType() - Method in class org.apache.beam.sdk.transforms.Combine.CombineFn
-
Returns a TypeDescriptor capturing what is known statically about the output type of this CombineFn instance's most-derived class.
- getOutputTypeDescriptor() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
-
- getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.DoFn
-
Returns a TypeDescriptor capturing what is known statically about the output type of this DoFn instance's most-derived class.
- getOutputTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.InferableFunction
-
- getOverloadRatio() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The target ratio between requests sent and successful requests.
- getOverrideWindmillBinary() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Custom windmill_main binary to use with the streaming runner.
- getPane() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the pane of this FailsafeValueInSingleWindow in its window.
- getPane() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the pane of this ValueInSingleWindow in its window.
- getPaneInfo() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- getParallelism() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getParallelism() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
-
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.impl.UdfImplReflectiveFunctionBase
-
Returns the parameters of this function.
- getParameters() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.impl.CastFunctionImpl
-
- getParent() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
Returns the parent path, or null if this path does not have a parent.
- getParents() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
-
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
The unique partition identifiers of the parent partitions from which this child partition originated.
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The unique partition identifiers of the parent partitions from which this child partition originated.
- getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
Get the memoized Parser, possibly initializing it lazily.
- getParser() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
Get the memoized Parser, possibly initializing it lazily.
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches the partition metadata row data for the given partition token.
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Fetches the partition metadata row data for the given partition token.
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
-
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- getPartitionCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The end time for the partition change stream query that produced this record.
- getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
-
- getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner.ExplicitPartitioner
-
- getPartitionKey(T) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.KinesisPartitioner
-
Determines which shard in the stream the record is assigned to.
- getPartitionKey() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getPartitionKey(byte[]) - Method in interface org.apache.beam.sdk.io.kinesis.KinesisPartitioner
-
- getPartitionKey() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getPartitionMetadataAdminDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for admin operations over the partition metadata
table.
- getPartitionMetadataDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for accessing the partition metadata table.
- getPartitionRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the connector started processing this partition.
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Bounded
-
- getPartitions() - Method in class org.apache.beam.runners.spark.io.SourceRDD.Unbounded
-
- getPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
-
- getPartitionScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was scheduled to be queried.
- getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The start time for the partition change stream query that produced this record.
- getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
-
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The partition token that produced this change stream record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique identifier of the partition that generated this record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Unique partition identifier, which can be used to perform a change stream query.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
-
- getPassword() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getPassword() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getPassword() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getPassword() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getPath() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table path up to the leaf table name.
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
-
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
-
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
-
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
-
- getPath() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
The path for the display item within a component hierarchy.
- getPathValidator() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The path validator instance that should be used to validate paths.
- getPathValidatorClass() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcsOptions
-
The class of the validator that should be created and used to validate paths.
- getPatientCompartments() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets the patient compartment responses for GetPatientEverything requests.
- getPatientEverything() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Get the patient compartment for a FHIR Patient using the GetPatientEverything/$everything API.
- getPatientEverything(String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Issues a FHIR GetPatientEverything ($everything) request and returns the HTTP body.
- getPatientEverything(String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- getPatternCondition() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
-
- getPatternVar() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
-
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
-
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the main PubSub message.
- getPayload() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getPayload() - Method in class org.apache.beam.sdk.schemas.io.Failure
-
Bytes containing the payload that has failed.
- getPayload() - Method in class org.apache.beam.sdk.schemas.logicaltypes.UnknownLogicalType
-
- getPCollection() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the PCollection that the elements of this bundle belong to.
- getPCollection() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
For internal use only.
- getPCollection() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- getPCollectionInputs() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
- getPCollectionInputs() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
- getPerDestinationOutputFilenames() - Method in class org.apache.beam.sdk.io.WriteFilesResult
-
Returns a PCollection of all output filenames generated by this WriteFiles, organized by user destination type.
- getPerElementConsumers(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
-
- getPerElementInputs(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
-
- getPeriod() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
Amount of time between generated windows.
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- getPeriod() - Method in class org.apache.beam.sdk.transforms.windowing.TimestampTransform.AlignTo
-
- getPipeline(JobApi.GetJobPipelineRequest, StreamObserver<JobApi.GetJobPipelineResponse>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- getPipeline() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's pipeline.
- getPipeline() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisIO.Write.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
-
- getPipeline() - Method in class org.apache.beam.sdk.io.jms.WriteJmsResult
-
- getPipeline() - Method in class org.apache.beam.sdk.io.WriteFilesResult
-
- getPipeline() - Method in class org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
-
- getPipeline() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- getPipeline() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
- getPipeline() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
- getPipeline() - Method in class org.apache.beam.sdk.values.PBegin
-
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionList
-
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- getPipeline() - Method in class org.apache.beam.sdk.values.PCollectionTuple
-
- getPipeline() - Method in class org.apache.beam.sdk.values.PDone
-
- getPipeline() - Method in interface org.apache.beam.sdk.values.PInput
-
- getPipeline() - Method in interface org.apache.beam.sdk.values.POutput
-
- getPipeline() - Method in class org.apache.beam.sdk.values.PValueBase
-
- getPipelineFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
-
- getPipelineOptions() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
-
- getPipelineOptions() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Returns the configured pipeline options.
- getPipelineOptions() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Options
-
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.BatchTranslationContext
-
- getPipelineOptions() - Method in interface org.apache.beam.runners.flink.FlinkPortablePipelineTranslator.TranslationContext
-
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunner
-
For testing.
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
-
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.StreamingTranslationContext
-
- getPipelineOptions() - Method in class org.apache.beam.runners.flink.TestFlinkRunner
-
- getPipelineOptions() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Options
-
- getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestPortablePipelineOptions.TestPortablePipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.OptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
-
- getPipelineOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Options
-
- getPipelineOptions() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Options
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.gcp.options.GcpPipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.JdbcConnection
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSinkRel
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- getPipelineOptions() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
Perform a DFS (depth-first search) to find the PipelineOptions config.
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamValuesRel
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws.options.AwsPipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.aws2.options.AwsPipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.azure.options.AzurePipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.DefaultPipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
-
- getPipelineOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptionsRegistrar
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.options.RemoteEnvironmentOptions.Options
-
- getPipelineOptions() - Method in interface org.apache.beam.sdk.state.StateContext
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.CombineWithContext.Context
-
Returns the PipelineOptions specified with the PipelineRunner invoking this KeyedCombineFn.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
-
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.StartBundleContext
-
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Returns the PipelineOptions specified with the PipelineRunner invoking this DoFn.
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.DoFnTester
-
- getPipelineOptions() - Method in class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.Options
-
- getPipelineOptionsFromClasspath(String) - Static method in class org.apache.beam.runners.jobsubmission.PortablePipelineJarUtils
-
- getPipelinePolicy() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the Runner API pipeline proto if available.
- getPipelineProto() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
-
- getPipelineRunners() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
-
- getPipelineRunners() - Method in class org.apache.beam.runners.direct.DirectRegistrar.Runner
-
- getPipelineRunners() - Method in class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner
-
- getPipelineRunners() - Method in class org.apache.beam.runners.jet.JetRunnerRegistrar.Runner
-
- getPipelineRunners() - Method in class org.apache.beam.runners.portability.PortableRunnerRegistrar
-
- getPipelineRunners() - Method in class org.apache.beam.runners.portability.testing.TestUniversalRunner.RunnerRegistrar
-
- getPipelineRunners() - Method in class org.apache.beam.runners.spark.SparkRunnerRegistrar.Runner
-
- getPipelineRunners() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Runner
-
- getPipelineRunners() - Method in class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Runner
-
- getPipelineUrl() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
The URL of the staged portable pipeline.
- getPlanner() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
-
- getPlannerName() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
-
- getPort() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
-
- getPort() - Method in class org.apache.beam.sdk.expansion.service.ExpansionServer
-
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortRead
-
- getPort() - Method in class org.apache.beam.sdk.fn.data.RemoteGrpcPortWrite
-
- getPortNumber() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getPortNumber() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getPositionForFractionConsumed(double) - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Returns a position P such that the range [start, P) represents approximately the given fraction of the range [start, end).
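For illustration (offsets and fraction assumed), a short sketch of the fraction-to-position arithmetic; the tracker may round the result internally.

```java
// org.apache.beam.sdk.io.range.OffsetRangeTracker over offsets [0, 100).
OffsetRangeTracker tracker = new OffsetRangeTracker(0L, 100L);
long position = tracker.getPositionForFractionConsumed(0.25); // roughly 25: [0, 25) is ~25% of [0, 100)
```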
- getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
-
- getPrecision() - Method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.PerKey
-
- getPrecision() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
-
- getPrefix() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
First element in the path.
- getPrefixedEndpoint(String) - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getPreviousWindow() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
-
- getPrimary() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns the primary restriction.
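A minimal sketch (not from the Javadoc), assuming an OffsetRange restriction split at offset 50.

```java
// org.apache.beam.sdk.transforms.splittabledofn.SplitResult paired with
// org.apache.beam.sdk.io.range.OffsetRange restrictions; the offsets are illustrative.
SplitResult<OffsetRange> split =
    SplitResult.of(new OffsetRange(0, 50), new OffsetRange(50, 100));
OffsetRange primary = split.getPrimary();   // [0, 50): work kept by the current call
OffsetRange residual = split.getResidual(); // [50, 100): work handed back to the runner
```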
- getPriority() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getPrivateKeyPassphrase() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getPrivateKeyPassphrase() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getPrivateKeyPath() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getPrivateKeyPath() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getProcessBundleDescriptor(String) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
- getProcessBundleDescriptor(BeamFnApi.GetProcessBundleDescriptorRequest, StreamObserver<BeamFnApi.ProcessBundleDescriptor>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
-
- getProcessBundleDescriptor() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
- getProcessBundleDescriptor() - Method in interface org.apache.beam.runners.fnexecution.control.StageBundleFactory
-
- getProcessingTimeAdvance() - Method in class org.apache.beam.sdk.testing.TestStream.ProcessingTimeEvent
-
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides a SdkHarnessClient.BundleProcessor that is capable of processing bundles not containing timers or state accesses such as: side inputs, user state, and remote references.
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
- getProcessor(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, StateDelegator, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient
-
Provides a SdkHarnessClient.BundleProcessor that is capable of processing bundles containing timers and state accesses such as: side inputs, user state, and remote references.
- getProcessWideContainer() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
- getProduced(ExecutableT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
-
- getProducer(PValue) - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
Get the AppliedPTransform that produced the provided PValue.
- getProducer(CollectionT) - Method in interface org.apache.beam.runners.direct.ExecutableGraph
-
- getProfilingAgentConfiguration() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
-
- getProgress(PartitionRestriction, PartitionPosition) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionProgressChecker
-
- getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
-
- getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Returns the progress made within the restriction so far.
- getProgress() - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.ReadFiles.BlockTracker
-
- getProgress() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
-
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
-
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
-
- getProgress() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
- getProgress() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.HasProgress
-
A representation for the amount of known completed and known remaining work.
- getProject() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
- getProject() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
Project id to use when launching jobs.
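A minimal sketch (not from the Javadoc) of setting and reading the project id on GcpOptions; "my-gcp-project" is a placeholder value.

```java
// org.apache.beam.sdk.options.PipelineOptionsFactory and
// org.apache.beam.sdk.extensions.gcp.options.GcpOptions.
GcpOptions options = PipelineOptionsFactory.create().as(GcpOptions.class);
options.setProject("my-gcp-project"); // placeholder project id
String project = options.getProject();
```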
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the project path.
- getProjectId() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the project this job exists in.
- getProjectId() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPDeidentifyText
-
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPInspectText
-
- getProjectId() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
-
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
-
- getProperties() - Method in class org.apache.beam.runners.dataflow.DataflowRunnerInfo
-
- getProperties() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getProperties() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
- getProperties() - Method in class org.apache.beam.sdk.io.jms.JmsRecord
-
- getProtoBytesToRowFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
-
- getProtoClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
Used by the ProtoPayloadSerializerProvider when serializing from a Pub/Sub message.
- getProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkContextOptions
-
- getProviderRuntimeValues() - Method in interface org.apache.beam.sdk.testing.TestPipeline.TestValueProviderOptions
-
- getProvisionInfo(ProvisionApi.GetProvisionInfoRequest, StreamObserver<ProvisionApi.GetProvisionInfoResponse>) - Method in class org.apache.beam.runners.fnexecution.provisioning.StaticGrpcProvisionService
-
- getProxyConfiguration() - Method in interface org.apache.beam.sdk.io.aws2.options.AwsOptions
-
ProxyConfiguration used to configure AWS service clients.
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
-
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
-
- getPTransformForInput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
-
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaCSVTable
-
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
-
- getPTransformForOutput() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.PayloadSerializerKafkaTable
-
- getPTransformId() - Method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
-
- getPublishTimestamp() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
-
- getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
-
Root URL for use with the Google Cloud Pub/Sub API.
- getQualifiers() - Method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor
-
- getQuantifier() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
-
- getQueries() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
Configures the BigQuery read job with the SQL query.
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
-
- getQuery() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getQuery() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets a query that can be the source for reading.
- getQuery() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getQueryLocation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
BigQuery geographic location where the query job will be executed.
- getQueryName() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
- getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the change stream query that produced this record started.
- getQueryString() - Method in interface org.apache.beam.sdk.extensions.sql.example.BeamSqlDataCatalogExample.DCExamplePipelineOptions
-
SQL Query.
- getQuotationMark() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Gets the character that will surround String values in staged CSV files.
- getRamMegaBytes() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getRange() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
Returns the current range.
- getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- getRate() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
- getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory
-
- getRateLimitPolicy() - Method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory
-
- getRaw(ObjectT) - Method in interface org.apache.beam.sdk.schemas.FieldValueGetter
-
Returns the raw value of the getter before any further transformations.
- getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getRawPrivateKey() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getRawPrivateKey() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getRawType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the raw class type.
- getRawType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the Class underlying the Type represented by this TypeDescriptor.
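A minimal sketch, using the usual anonymous-subclass idiom to capture a generic type.

```java
// org.apache.beam.sdk.values.TypeDescriptor; java.util.List.
TypeDescriptor<List<String>> descriptor = new TypeDescriptor<List<String>>() {};
Class<? super List<String>> rawType = descriptor.getRawType(); // java.util.List
```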
- getReaderCacheTimeoutSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The amount of time before UnboundedReaders are considered idle and closed during streaming
execution.
- getReadOperation() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
-
- getReadResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets resources.
- getReadTime() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getReadTime() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
- getReadTime() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getReadTimePercentage() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getReason() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- getReasons() - Method in exception org.apache.beam.sdk.coders.Coder.NonDeterministicException
-
- getReceiptHandle() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
SQS receipt handle.
- getReceiver() - Method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
-
- getReceiver() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
-
- getReceiver() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
-
- getRecord() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
-
- getRecord() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
-
- getRecordJfrOnGcThrashing() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
If true, save a JFR profile when GC thrashing is first detected.
- getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
- getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record was fully read.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Indicates the order in which a record was put into the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Indicates the order in which this record was put into the change stream in the scope of a
partition, commit timestamp and transaction tuple.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record finished being streamed.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record finished streaming.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record first started to be streamed.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record started to be streamed.
- getRecordTimestamp() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecord
-
The timestamp associated with the record.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The Cloud Spanner timestamp at which this record occurred.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Returns the timestamp at which this partition started being valid in Cloud Spanner.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getRecordType() - Method in class org.apache.beam.sdk.coders.SerializableCoder
-
- getReferentialConstraints() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- getRegexFromPattern(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPUtils
-
Recursively construct a regular expression from a RexNode.
- getRegion() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
Get the region this job exists in.
- getRegion() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
The Google Compute Engine region for creating Dataflow jobs.
- getRegionFromEnvironment() - Static method in class org.apache.beam.runners.dataflow.options.DefaultGcpRegionFactory
-
- getRegisteredOptions() - Static method in class org.apache.beam.sdk.options.PipelineOptionsFactory
-
- getReidentifyConfig() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getReidentifyTemplateName() - Method in class org.apache.beam.sdk.extensions.ml.DLPReidentifyText
-
- getReIterableGroupByKeyResult() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getRelList() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
-
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
-
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamJavaUdfCalcRule
-
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRule
-
- getRelTypes() - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcSplittingRule
-
- getRemoteInputDestinations() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
- getRemoteOutputCoders() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
- getRepeatedTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
-
- getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
- getReplacedByJob() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollection<OutputT>, ParDo.SingleOutput<InputT, OutputT>>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory
-
- getReplacementTransform(AppliedPTransform<PCollection<? extends InputT>, PCollectionTuple, PTransform<PCollection<? extends InputT>, PCollectionTuple>>) - Method in class org.apache.beam.runners.direct.ParDoMultiOverrideFactory
-
- getReplyTo() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getReportCheckpointDuration() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getRequestTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Timestamp the message was received at (in epoch millis).
- getRequirements() - Method in class org.apache.beam.sdk.transforms.Contextful
-
Returns the requirements needed to run the closure.
- getResidual() - Method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns the residual restriction.
- getResourceHints() - Method in class org.apache.beam.sdk.transforms.PTransform
-
Returns resource hints set on the transform.
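A minimal sketch (not from the Javadoc) of attaching hints to a transform and reading them back; the transform, the hint value, and the availability of setResourceHints on PTransform are assumptions for illustration.

```java
// org.apache.beam.sdk.transforms.MapElements and
// org.apache.beam.sdk.transforms.resourcehints.ResourceHints.
MapElements<String, String> toUpper =
    MapElements.into(TypeDescriptors.strings()).via((String s) -> s.toUpperCase());
toUpper.setResourceHints(ResourceHints.create().withMinRam("4GiB")); // assumed setter and value
ResourceHints hints = toUpper.getResourceHints();
```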
- getResourceHints() - Method in interface org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions
-
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets resources.
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources.
- getResourceType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
- getResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
HTTP response from the FHIR store after attempting to write the bundle.
- getResponseItemJson() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
- getRestrictionCoder() - Method in class org.apache.beam.io.debezium.KafkaSourceConsumerFn
-
- getRestrictionCoder() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
-
- getRestrictionCoder() - Method in class org.apache.beam.sdk.transforms.Watch.WatchGrowthFn
-
- getResult() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the result of the transaction execution.
- getResults() - Method in class org.apache.beam.sdk.transforms.JsonToRow.ParseResult
-
- getRetainDockerContainers() - Method in interface org.apache.beam.sdk.options.ManualDockerEnvironmentOptions
-
- getRetainExternalizedCheckpointsOnCancellation() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getRetryableCodes() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
- getReturnType(RelDataTypeFactory, SqlOperatorBinding) - Method in class org.apache.beam.sdk.extensions.sql.impl.ScalarFunctionImpl
-
- getReturnType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.UdafImpl
-
- getRole() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getRole() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getRole() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getRoot() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- getRootCause() - Method in exception org.apache.beam.sdk.coders.CannotProvideCoderException
-
- getRootSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getRootTransforms() - Method in interface org.apache.beam.runners.direct.ExecutableGraph
-
- getRoutingKey() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
-
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
-
- getRow(String) - Method in class org.apache.beam.sdk.values.Row
-
Get a Schema.TypeName.ROW value by field name; an IllegalStateException is thrown if the schema doesn't match.
- getRow(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a Row value by field index; an IllegalStateException is thrown if the schema doesn't match.
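A minimal sketch with a hypothetical two-field schema, reading the nested Row both by name and by index.

```java
// org.apache.beam.sdk.schemas.Schema and org.apache.beam.sdk.values.Row.
Schema addressSchema = Schema.builder().addStringField("city").build();
Schema personSchema =
    Schema.builder().addStringField("name").addRowField("address", addressSchema).build();

Row address = Row.withSchema(addressSchema).addValue("Seattle").build();
Row person = Row.withSchema(personSchema).addValues("Alice", address).build();

Row byName = person.getRow("address"); // lookup by field name
Row byIndex = person.getRow(1);        // lookup by 0-based field index
```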
- getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- getRowCount() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
- getRowReceiver(TupleTag<T>) - Method in interface org.apache.beam.sdk.transforms.DoFn.MultiOutputReceiver
-
- getRows() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- getRows() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider.TableWithRows
-
- getRowSchema() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getRowSelector(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
- getRowSelectorOptimized(Schema, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.utils.SelectHelpers
-
- getRowsWritten() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
-
The number of rows written in this batch.
- getRowToAvroBytesFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
- getRowToGenericRecordFunction(Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
- getRowToProtoBytesFn(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoMessageSchema
-
- getRowType(RelDataTypeFactory) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
-
- getRowType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of the primary keys and modified columns within this record.
- getRpcPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
- getRule() - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.JoinRelOptRuleCall
-
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.ArrayQualifierContext
-
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionComponentContext
-
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.DotExpressionContext
-
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.FieldSpecifierContext
-
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.MapQualifierContext
-
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
-
- getRuleIndex() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifierListContext
-
- getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- getRuleNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- getRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRuleSets
-
- getRunner() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
The pipeline runner that will be used to execute the pipeline.
- getRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector started processing this partition.
- getS3ClientBuilder() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
Builder used to create the AmazonS3Client.
- getS3ClientBuilder() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Builder used to create the S3Client.
- getS3ClientFactoryClass() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getS3ClientFactoryClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getS3StorageClass() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getS3StorageClass() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
The AWS S3 storage class used for creating S3 objects.
- getS3StorageClass() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getS3StorageClass() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
The AWS S3 storage class used for creating S3 objects.
- getS3ThreadPoolSize() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getS3ThreadPoolSize() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
Thread pool size, limiting the max concurrent S3 operations.
- getS3ThreadPoolSize() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getS3ThreadPoolSize() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Thread pool size, limiting the max concurrent S3 operations.
- getS3UploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getS3UploadBufferSizeBytes() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
Size of S3 upload chunks.
- getS3UploadBufferSizeBytes() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getS3UploadBufferSizeBytes() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Size of S3 upload chunks.
- getSamplePeriod() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The length of time sampled request data will be retained.
- getSamplePeriodBucketSize() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
- getSamplingStrategy() - Method in class org.apache.beam.sdk.io.TextRowCountEstimator
-
- getSasToken() - Method in interface org.apache.beam.sdk.io.azure.options.BlobstoreOptions
-
- getSaveHeapDumpsToGcsPath() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
CAUTION: This option implies dumpHeapOnOOM, and has similar caveats.
- getSavepointPath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getSaveProfilesToGcs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
-
- getSbeFields() - Method in class org.apache.beam.sdk.extensions.sbe.SbeSchema
-
- getScale() - Method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
-
- getScan() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
- getScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was scheduled to be queried.
- getSchema() - Method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns the schema used by this coder.
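For illustration only (not part of the index), a minimal sketch of inspecting the Avro schema a coder derives; the MyEvent POJO is a hypothetical example type:

    import org.apache.avro.Schema;
    import org.apache.beam.sdk.coders.AvroCoder;

    public class AvroCoderSchemaExample {
      // Hypothetical element type; any class usable with Avro reflection works.
      public static class MyEvent {
        public String id;
        public long count;
      }

      public static void main(String[] args) {
        // AvroCoder.of derives an Avro schema for MyEvent via reflection;
        // getSchema() exposes that derived schema for inspection or reuse.
        AvroCoder<MyEvent> coder = AvroCoder.of(MyEvent.class);
        Schema avroSchema = coder.getSchema();
        System.out.println(avroSchema.toString(true));
      }
    }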
- getSchema() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getSchema() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Get the schema info of the table.
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
-
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.SchemaBaseBeamTable
-
- getSchema() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
- getSchema() - Static method in class org.apache.beam.sdk.io.contextualtextio.RecordWithMetadata
-
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.DynamicAvroDestinations
-
Return an AVRO schema for a given destination.
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
-
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the table schema for the destination.
- getSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
-
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns the schema of a Snowflake table.
- getSchema() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getSchema() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getSchema() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the schema associated with this type.
- getSchema(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve a Schema for a given Class type.
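An illustrative sketch (the Purchase POJO annotated with @DefaultSchema is hypothetical) of asking the default registry for the Schema it infers for a class:

    import org.apache.beam.sdk.schemas.JavaFieldSchema;
    import org.apache.beam.sdk.schemas.NoSuchSchemaException;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.SchemaRegistry;
    import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

    public class SchemaRegistryExample {
      // Hypothetical POJO; @DefaultSchema lets the registry infer its Schema.
      @DefaultSchema(JavaFieldSchema.class)
      public static class Purchase {
        public String userId;
        public double amount;
      }

      public static void main(String[] args) throws NoSuchSchemaException {
        SchemaRegistry registry = SchemaRegistry.createDefault();
        // Look up the Beam Schema inferred for the Purchase class.
        Schema schema = registry.getSchema(Purchase.class);
        System.out.println(schema);
      }
    }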
- getSchema(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
- getSchema(Class<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
- getSchema() - Method in class org.apache.beam.sdk.schemas.utils.ReflectUtils.ClassWithSchema
-
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- getSchema() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- getSchema() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema.
- getSchema() - Method in class org.apache.beam.sdk.values.Row.Builder
-
Return the schema for the row being built.
- getSchema() - Method in class org.apache.beam.sdk.values.Row.FieldValueBuilder
-
- getSchema() - Method in class org.apache.beam.sdk.values.Row
-
Return the Schema which describes the fields.
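A small illustrative sketch (the two-field schema is hypothetical) of building a Row and reading back the Schema that describes its fields:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        // Hypothetical two-field schema.
        Schema schema =
            Schema.builder().addStringField("name").addInt32Field("age").build();

        Row row = Row.withSchema(schema).addValues("alice", 42).build();

        // getSchema() returns the Schema describing the row's fields.
        System.out.println(row.getSchema().getFieldNames());
      }
    }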
- getSchemaCoder(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
- getSchemaCoder(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
- getSchemaHash() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
-
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
-
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
-
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
-
- getSchemaIOProvider() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
-
- getSchemaProviders() - Method in class org.apache.beam.sdk.schemas.annotations.DefaultSchema.DefaultSchemaProviderRegistrar
-
- getSchemaProviders() - Method in interface org.apache.beam.sdk.schemas.SchemaProviderRegistrar
-
- getSchemaRegistry() - Method in class org.apache.beam.sdk.Pipeline
-
- getSchematizedData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets schematized data.
- getSchemaUpdateRetries() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getScheme() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
-
- getScheme() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
The uri scheme used by resources on this filesystem.
- getScheme() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
The uri scheme used by resources on this filesystem.
- getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
-
- getScheme() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
-
- getScheme() - Method in class org.apache.beam.sdk.io.FileSystem
-
Get the URI scheme which defines the namespace of the FileSystem.
- getScheme() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Get the scheme which defines the namespace of the ResourceId.
- getSdkComponents() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
- getSdkContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Container image used to configure SDK execution environment on worker.
- getSdkHarnessContainerImageOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Overrides for SDK harness container images.
- getSdkHarnessLogLevelOverrides() - Method in interface org.apache.beam.sdk.options.SdkHarnessOptions
-
This option controls the log levels for specifically named loggers.
- getSdkWorkerId() - Method in interface org.apache.beam.sdk.fn.server.HeaderAccessor
-
This method should be called from the request method.
- getSdkWorkerParallelism() - Method in interface org.apache.beam.sdk.options.PortablePipelineOptions
-
- getSearchEndPoint() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getSemiPersistDir() - Method in interface org.apache.beam.sdk.options.RemoteEnvironmentOptions
-
- getSendFacility() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send facility.
- getSendTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send time.
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getSerializableFunctionUdfs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
-
- getSerializableOptions() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- getSerializedATN() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
-
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
-
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.schemas.io.payloads.AvroPayloadSerializerProvider
-
- getSerializer(Schema, Map<String, Object>) - Method in class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
-
- getSerializer(Schema, Map<String, Object>) - Method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializerProvider
-
Get a PayloadSerializer.
- getSerializer(String, Schema, Map<String, Object>) - Static method in class org.apache.beam.sdk.schemas.io.payloads.PayloadSerializers
-
- getServer() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.Provider
-
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.Provider
-
- getServerFactory() - Method in interface org.apache.beam.runners.fnexecution.environment.EnvironmentFactory.Provider
-
- getServerFactory() - Method in class org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.Provider
-
- getServerName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getServerName() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getServerName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getServerTransactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique transaction id in which the modifications occurred.
- getService() - Method in class org.apache.beam.sdk.fn.server.GrpcFnServer
-
- getServiceAccount() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Run the job as a specific service account, instead of the default GCE robot.
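An illustrative sketch of setting this option programmatically; the project and service account names are placeholders, and the option can equally be passed as a command-line flag:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ServiceAccountExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Placeholder identity; replaces the default GCE robot account.
        options.setServiceAccount("my-worker-sa@my-project.iam.gserviceaccount.com");
        System.out.println(options.getServiceAccount());
      }
    }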
- getSetFieldCreator(Class<T>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- getSetters(Class<?>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- getSetters(Class<?>, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- getSha256() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
The SHA-256 hash of the source file.
- getShard() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- getShardId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getShardId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getShardingFunction() - Method in class org.apache.beam.sdk.io.WriteFiles
-
- getShardNumber() - Method in class org.apache.beam.sdk.values.ShardedKey
-
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Return the optional short value for an item, or null if none is provided.
- getShortValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The optional short value for an item, or null if none is provided.
- getShutdownSourcesAfterIdleMs() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getSideInput(String) - Method in interface org.apache.beam.runners.fnexecution.translation.BatchSideInputHandlerFactory.SideInputGetter
-
- getSideInputDataSet(PCollectionView<?>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getSideInputDataSets() - Method in class org.apache.beam.runners.twister2.Twister2TranslationContext
-
- getSideInputKeys() - Method in class org.apache.beam.runners.twister2.translators.functions.DoFnFunction
-
Get the tag ids of all the keys.
- getSideInputs() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.ParDoSingle
-
- getSideInputs(ExecutableStage) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors
-
- getSideInputs() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
-
- getSideInputs() - Method in class org.apache.beam.sdk.io.FileBasedSink.DynamicDestinations
-
Override to specify that this object needs access to one or more side inputs.
- getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Specifies that this object needs access to one or more side inputs.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.GroupedValues
-
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Combine.PerKey
-
Returns the side inputs used by this Combine operation.
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.MultiOutput
-
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.ParDo.SingleOutput
-
- getSideInputs() - Method in class org.apache.beam.sdk.transforms.Requirements
-
The side inputs that this Contextful needs access to.
- getSideInputSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to side input id to side inputs that are used during execution.
- getSideInputWindow(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.WindowMappingFn
-
Returns the window of the side input corresponding to the given window of the main input.
- getSingleFileMetadata() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Returns the information about the single file that this source is reading from.
- getSingleWorkerStatus(String, long, TimeUnit) - Method in class org.apache.beam.runners.fnexecution.status.BeamWorkerStatusGrpcService
-
Get the latest SDK worker status from the client's corresponding SDK harness.
- getSink() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
-
Sink for control clients.
- getSink() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
-
- getSink() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
Returns the FileBasedSink for this write operation.
- getSink() - Method in class org.apache.beam.sdk.io.WriteFiles
-
- getSize() - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
Size of the generated windows.
- getSize(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
- getSize(PulsarSourceDescriptor, OffsetRange) - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
-
- getSize() - Method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
-
- getSize() - Method in class org.apache.beam.sdk.schemas.utils.AvroUtils.FixedBytesField
-
Get the size.
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
- getSize() - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- getSketchFromByteBuffer(ByteBuffer) - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount
-
Converts the passed-in sketch from ByteBuffer to byte[], mapping null ByteBuffers (representing empty sketches) to empty byte[]s.
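An illustrative sketch of the conversion; the buffer contents below are placeholder bytes rather than a real serialized sketch:

    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;
    import org.apache.beam.sdk.extensions.zetasketch.HllCount;

    public class SketchBytesExample {
      public static void main(String[] args) {
        // A ByteBuffer holding serialized sketch bytes (placeholder content here),
        // e.g. as read from a BigQuery HLL_COUNT result column.
        ByteBuffer buffer =
            ByteBuffer.wrap("sketch-bytes".getBytes(StandardCharsets.UTF_8));
        byte[] sketch = HllCount.getSketchFromByteBuffer(buffer);

        // A null buffer (an empty sketch) maps to an empty byte[].
        byte[] empty = HllCount.getSketchFromByteBuffer(null);
        System.out.println(sketch.length + " " + empty.length);
      }
    }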
- getSkipKeyClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getSkipValueClone() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getSnowPipe() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getSnsAsyncClient() - Method in interface org.apache.beam.sdk.io.aws2.sns.SnsAsyncClientProvider
-
- getSnsClient() - Method in interface org.apache.beam.sdk.io.aws2.sns.SnsClientProvider
-
Deprecated.
- getSocketTimeout() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getSorterType() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the sorter type.
- getSource() - Method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
The file to stage.
- getSource() - Method in interface org.apache.beam.runners.fnexecution.control.ControlClientPool
-
Source of control clients.
- getSource() - Method in class org.apache.beam.runners.fnexecution.control.MapControlClientPool
-
- getSource() - Method in class org.apache.beam.sdk.io.contextualtextio.ContextualTextIO.Read
-
- getSource() - Method in class org.apache.beam.sdk.io.Read.Bounded
-
Returns the BoundedSource used to create this Read PTransform.
- getSource() - Method in class org.apache.beam.sdk.io.Read.Unbounded
-
Returns the UnboundedSource used to create this Read PTransform.
- getSource() - Method in class org.apache.beam.sdk.io.TextIO.Read
-
- getSource() - Method in class org.apache.beam.sdk.io.TFRecordIO.Read
-
- getSparkMaster() - Method in interface org.apache.beam.runners.spark.SparkCommonPipelineOptions
-
- getSparkSession() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.AbstractTranslationContext
-
- getSplit() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.SerializableSplit
-
- getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the size of the backlog of unread data in the underlying data source represented by
this split of this source.
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- getSplitPointsConsumed() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
- getSplitPointsProcessed() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Returns the total number of split points that have been processed.
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.AvroSource.AvroReader
-
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.BoundedSource.BoundedReader
-
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
- getSplitPointsRemaining() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- getSqsClient() - Method in interface org.apache.beam.sdk.io.aws2.sqs.SqsClientProvider
-
Deprecated.
- getSSEAlgorithm() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getSSEAlgorithm() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
Algorithm for SSE-S3 encryption, e.g.
- getSSEAlgorithm() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getSSEAlgorithm() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
Algorithm for SSE-S3 encryption, e.g.
- getSSEAwsKeyManagementParams() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getSSEAwsKeyManagementParams() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
KMS key id for SSE-KMS encryption, e.g.
- getSSECustomerKey() - Method in interface org.apache.beam.sdk.io.aws.options.S3Options
-
- getSSECustomerKey() - Method in class org.apache.beam.sdk.io.aws.s3.S3FileSystemConfiguration
-
SSE key for SSE-C encryption, e.g.
- getSSECustomerKey() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getSSECustomerKey() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
SSE key for SSE-C encryption, e.g.
- getSSEKMSKeyId() - Method in interface org.apache.beam.sdk.io.aws2.options.S3Options
-
- getSSEKMSKeyId() - Method in class org.apache.beam.sdk.io.aws2.s3.S3FileSystemConfiguration
-
KMS key id for SSE-KMS encryption, e.g.
- getSsl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getStableUniqueNames() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
Whether to check for stable unique names on each transform.
- getStackTrace() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
-
- getStageBundleFactory(ExecutableStage) - Method in class org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext
-
- getStageBundleFactory(ExecutableStage) - Method in interface org.apache.beam.runners.fnexecution.control.ExecutableStageContext
-
- getStagedArtifacts(String) - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
Returns the rewritten artifacts associated with this job, keyed by environment.
- getStager() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The resource stager instance that should be used to stage resources.
- getStagerClass() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The class responsible for staging resources to be accessible by workers during job execution.
- getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns the directory where files are staged.
- getStagingBucketDir() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeStreamingServiceConfig
-
Returns the bucket name and directory where files were staged and are waiting to be loaded.
- getStagingBucketName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getStagingBucketName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getStagingLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
GCS path for staging local files, e.g.
- getStart() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
-
- getStart() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
-
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
-
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- getStartDate() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- getStartKey() - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
- getStartOffset() - Method in class org.apache.beam.sdk.io.OffsetBasedSource
-
Returns the starting offset of the source.
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
- getStartPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
- getStartPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the starting position of the current range, inclusive.
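An illustrative sketch using the concrete OffsetRangeTracker: the start position is inclusive, while the stop position (see getStopPosition further below) is exclusive:

    import org.apache.beam.sdk.io.range.OffsetRangeTracker;

    public class OffsetRangeTrackerExample {
      public static void main(String[] args) {
        // Track the offset range [0, 100): start inclusive, stop exclusive.
        OffsetRangeTracker tracker = new OffsetRangeTracker(0, 100);
        System.out.println(tracker.getStartPosition()); // 0
        System.out.println(tracker.getStopPosition());  // 100

        // Claim the first record at a split point before reading it.
        boolean claimed = tracker.tryReturnRecordAt(true, 0);
        System.out.println(claimed); // true
      }
    }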
- getStartTime() - Method in class org.apache.beam.sdk.io.CountingSource.CounterMark
-
Returns the time the reader was started.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
The partition_start_time of the child partition token.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The start time at which this partition started existing in Cloud Spanner.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
-
- getState() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
- getState() - Method in class org.apache.beam.runners.dataflow.util.DataflowTemplateJob
-
- getState() - Method in class org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
-
- getState() - Method in class org.apache.beam.runners.flink.FlinkDetachedRunnerResult
-
- getState() - Method in class org.apache.beam.runners.flink.FlinkRunnerResult
-
- getState() - Method in class org.apache.beam.runners.jet.FailedRunningPipelineResults
-
- getState() - Method in class org.apache.beam.runners.jet.JetPipelineResult
-
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- getState() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's current state.
- getState(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.portability.testing.TestJobService
-
- getState() - Method in class org.apache.beam.runners.spark.SparkPipelineResult
-
- getState() - Method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult
-
- getState() - Method in class org.apache.beam.runners.twister2.Twister2PipelineResult
-
- getState() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The state that the current partition is in.
- getState() - Method in interface org.apache.beam.sdk.PipelineResult
-
Retrieves the current state of the pipeline execution.
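An illustrative sketch of polling the state after running a trivial pipeline (runner and options come from the command line):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class PipelineStateExample {
      public static void main(String[] args) {
        Pipeline pipeline =
            Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        pipeline.apply(Create.of(1, 2, 3));

        PipelineResult result = pipeline.run();
        result.waitUntilFinish();
        // After waitUntilFinish the state is typically DONE (or FAILED/CANCELLED).
        System.out.println(result.getState());
      }
    }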
- getState() - Method in interface org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimator
-
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.Manual
-
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
-
- getState() - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.WallTime
-
- getStateBackend() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getStateBackendFactory() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getStateBackendStoragePath() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- getStateCoder() - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
- getStateEvent() - Method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
Retrieve the job's current state.
- getStateStream(JobApi.GetJobStateRequest, StreamObserver<JobApi.JobStateEvent>) - Method in class org.apache.beam.runners.jobsubmission.InMemoryJobService
-
- getStaticCreator(Class, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.JavaBeanUtils
-
- getStaticCreator(Class, Method, Schema, FieldValueTypeSupplier, ByteBuddyUtils.TypeConversionsFactory) - Static method in class org.apache.beam.sdk.schemas.utils.POJOUtils
-
- getStatistic() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
-
- getStatus() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
-
- getStatusCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
-
- getStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator.JobSpecification
-
Returns the mapping of AppliedPTransforms to the internal step name for that AppliedPTransform.
- getStoppedMode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
-
- getStopPipelineWatermark() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
-
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
- getStopPosition() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
- getStopPosition() - Method in interface org.apache.beam.sdk.io.range.RangeTracker
-
Returns the ending position of the current range, exclusive.
- getStorageApiAppendThresholdBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getStorageApiAppendThresholdRecordCount() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getStorageClient(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
- getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
-
- getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getStorageIntegrationName() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns the Snowflake integration used in the COPY statement.
- getStorageIntegrationName() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getStorageLevel() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getStorageWriteApiTriggeringFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getStreamAppendClient(String, Descriptors.Descriptor) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create an append client for a given Storage API write stream.
- getStreamAppendClient(String, Descriptors.Descriptor) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getStreamingService() - Method in interface org.apache.beam.sdk.io.snowflake.services.SnowflakeServices
-
- getStreamingService() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeServicesImpl
-
- getStreamingTimeoutMs() - Method in interface org.apache.beam.runners.spark.SparkPortableStreamingPipelineOptions
-
- getStreamName() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getStreamName() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getString(Map<String, Object>, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getString(Map<String, Object>, String, String) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getString() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getString(String) - Method in class org.apache.beam.sdk.values.Row
-
- getString(int) - Method in class org.apache.beam.sdk.values.Row
-
Get a String value by field index; a ClassCastException is thrown if the schema doesn't match.
- getStrings(Map<String, Object>, String, List<String>) - Static method in class org.apache.beam.runners.dataflow.util.Structs
-
- getSubnetwork() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
- getSubProvider(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
-
- getSubProvider(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Returns a sub-provider, e.g.
- getSubProviders() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Returns all sub-providers, e.g.
- getSubSchema(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getSubSchemaNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The subscription from which to read Pub/Sub messages.
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the subscription being read from.
- getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
- getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getSubSequenceNumber() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
-
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets successful bodies from Write.
- getSuccessfulBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets successful FhirBundleResponse from execute bundles operation.
- getSuccessfulInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a PCollection containing the TableRows that were written to BQ via the streaming insert API.
- getSuccessfulTableLoads() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
- getSuggestedFilenameSuffix() - Method in enum org.apache.beam.sdk.io.FileBasedSink.CompressionType
-
Deprecated.
- getSuggestedFilenameSuffix() - Method in interface org.apache.beam.sdk.io.FileBasedSink.OutputFileHints
-
- getSuggestedSuffix() - Method in enum org.apache.beam.sdk.io.Compression
-
- getSum() - Method in class org.apache.beam.sdk.metrics.DistributionResult
-
- getSum() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
-
- getSumAndReset() - Method in class org.apache.beam.sdk.testing.CoderProperties.TestElementByteSizeObserver
-
- getSupertype(Class<? super T>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns the generic form of a supertype.
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryFilter
-
- getSupported() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableFilter
-
- getSupportedClass() - Method in interface org.apache.beam.runners.dataflow.util.CloudObjectTranslator
-
- getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.RowCoderCloudObjectTranslator
-
- getSupportedClass() - Method in class org.apache.beam.runners.dataflow.util.SchemaCoderCloudObjectTranslator
-
- getSynchronizedProcessingOutputWatermark() - Method in interface org.apache.beam.runners.local.Bundle
-
Returns the processing time output watermark at the time the producing Executable
committed this bundle.
- getSynchronizedProcessingTime() - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
-
- getTable(StructType, Transform[], Map<String, String>) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.DatasetSourceBatch
-
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getTable(String) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- getTable(String) - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Get a specific table from this provider if it is present, or null if it is not present.
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
-
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or null if reading from a query instead.
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Returns the table reference, or null.
- getTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified Table resource by table ID.
- getTable(TableReference, List<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
- getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
- getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
-
- getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getTable() - Method in class org.apache.beam.sdk.io.kudu.TableAndRecord
-
- getTable() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getTable() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
Returns the table used as the source for reading or the destination for writing.
- getTable() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- getTableByFullName(TableName) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.FullNameTableProvider
-
- getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns the table being read from.
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Read
-
- getTableId() - Method in class org.apache.beam.sdk.io.hbase.HBaseIO.Write
-
- getTableImpl(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- getTableName() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Table name, the last element of the fully-specified table name with path.
- getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The name of the table in which the modifications within this record occurred.
- getTableNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getTablePath(Table) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
Returns a full table path (excluding top-level schema) for a given ZetaSQL Table.
- getTableProvider() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or null if reading from a query instead.
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
- getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
- getTableResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
-
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
-
- getTables() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Get all tables from this provider.
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
-
- getTables() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
-
- getTableSchema(String, String) - Static method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO
-
- getTableSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
-
- getTableSchema(String, String) - Method in class org.apache.beam.sdk.io.hcatalog.HCatalogBeamSchema
-
Gets the table schema, or absent optional if the table doesn't exist in the database.
- getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
-
- getTableSchema() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
Specifies a table for a BigQuery read job.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
-
Writes to the given table specification.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return the tablespec in [project:].dataset.tableid format.
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.BaseBeamTable
-
- getTableStatistics(PipelineOptions) - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Estimates the number of rows or the rate for unbounded Tables.
- getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
-
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
-
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
-
- getTableStatistics(PipelineOptions, SchemaIO) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
-
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
-
- getTableStatistics(PipelineOptions) - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.avro.AvroTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.datastore.DataStoreV1TableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.KafkaTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.parquet.ParquetTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite.PubsubLiteTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.ReadOnlyTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.SchemaIOTableProviderWrapper
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.seqgen.GenerateSequenceTableProvider
-
- getTableType() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider
-
Gets the table type this provider handles.
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTableProvider
-
- getTableType() - Method in class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
-
- getTableUrn() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Return the tablespec in projects/[project]/datasets/[dataset]/tables/[table] format.
- getTag(int) - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the tuple tag at the given index.
- getTag() - Method in class org.apache.beam.sdk.values.TaggedPValue
-
Returns the local tag associated with the PValue.
- getTagInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
- getTagInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- getTargetDataset() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
-
- getTargetParallelism() - Method in interface org.apache.beam.runners.direct.DirectOptions
-
- getTargetTable(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
-
- getTargetTableId(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
-
- getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getTempDirectory() - Method in class org.apache.beam.sdk.io.FileBasedSink.WriteOperation
-
- getTempDirectoryProvider() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
- getTempFilename() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- getTemplateLocation() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Where the runner should generate a template file.
- getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter.Options
-
Returns the configured temporary location.
- getTempLocation() - Method in class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
Returns the configured temporary location.
- getTempLocation() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A pipeline level default location for storing temporary files.
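An illustrative sketch; the gs:// path is a placeholder, and in practice this option is usually supplied as --tempLocation on the command line:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class TempLocationExample {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        // Fall back to a placeholder temp location if none was passed in.
        if (options.getTempLocation() == null) {
          options.setTempLocation("gs://my-bucket/tmp");
        }
        System.out.println(options.getTempLocation());
      }
    }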
- getTempRoot() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
-
- GetterBasedSchemaProvider - Class in org.apache.beam.sdk.schemas
-
- GetterBasedSchemaProvider() - Constructor for class org.apache.beam.sdk.schemas.GetterBasedSchemaProvider
-
- GetterBasedSchemaProviderBenchmark - Class in org.apache.beam.sdk.jmh.schemas
-
- GetterBasedSchemaProviderBenchmark() - Constructor for class org.apache.beam.sdk.jmh.schemas.GetterBasedSchemaProviderBenchmark
-
- GetterTypeSupplier() - Constructor for class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
-
- getTestMode() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
-
Set to true to run the job in test mode.
- getTestTimeoutSeconds() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
-
- getThriftClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
Used by the ThriftPayloadSerializerProvider when serializing from a Pub/Sub message.
- getThriftProtocolFactoryClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
Used by the ThriftPayloadSerializerProvider when serializing from a Pub/Sub message.
- getThrottleDuration() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The amount of time an attempt will be throttled if deemed necessary based on previous success
rate.
- getTimeDomain() - Method in interface org.apache.beam.sdk.state.TimerSpec
-
- getTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
- getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
-
- getTimerFamilyId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
-
- getTimerReceivers() - Method in interface org.apache.beam.runners.fnexecution.control.RemoteBundle
-
Get a map of (transform id, timer id) to receivers which consume timers, forwarding them to the remote environment.
- getTimerReceivers() - Method in class org.apache.beam.runners.fnexecution.control.SdkHarnessClient.BundleProcessor.ActiveBundle
-
- getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.WithMetricsSupport
-
- getTimers() - Method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
- getTimers(MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.WithMetricsSupport
-
- getTimerSpec() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.TimerSpec
-
- getTimerSpecs() - Method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
Get a mapping from PTransform id to timer id to timer specs that are used during execution.
- getTimes() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
- getTimeStamp() - Method in class org.apache.beam.sdk.io.aws2.sqs.SqsMessage
-
Timestamp the message was sent at (in epoch millis).
- getTimestamp() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.Document
-
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
-
- getTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
-
Returns timestamp for element being published to Kafka.
- getTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- getTimestamp() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
-
- getTimestamp() - Method in class org.apache.beam.sdk.metrics.GaugeResult
-
- getTimestamp() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the timestamp of this FailsafeValueInSingleWindow.
- getTimestamp() - Method in class org.apache.beam.sdk.values.TimestampedValue
-
- getTimestamp() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the timestamp of this ValueInSingleWindow.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message
attributes, specifies the name of the attribute that contains the timestamp.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the timestamp attribute.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the timestamp attribute.
- getTimestampCombiner() - Method in interface org.apache.beam.sdk.state.WatermarkHoldState
-
Return the TimestampCombiner which will be used to determine a watermark hold time given an element timestamp, and to combine watermarks from windows which are about to be merged.
- getTimestampCombiner() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
-
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns record timestamp (aka event time).
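An illustrative sketch (broker address and topic are placeholders) of supplying a policy whose getTimestampForRecord uses the Kafka record timestamp as event time, via CustomTimestampPolicyWithLimitedDelay:

    import org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    public class KafkaTimestampPolicyExample {
      public static KafkaIO.Read<String, String> read() {
        return KafkaIO.<String, String>read()
            .withBootstrapServers("broker:9092") // placeholder broker
            .withTopic("events")                 // placeholder topic
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            // getTimestampForRecord will use the Kafka record timestamp as
            // event time, with the watermark lagging at most 1 minute behind.
            .withTimestampPolicyFactory(
                (topicPartition, previousWatermark) ->
                    new CustomTimestampPolicyWithLimitedDelay<>(
                        record -> new Instant(record.getTimestamp()),
                        Duration.standardMinutes(1),
                        previousWatermark));
      }
    }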
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
-
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
-
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
-
- getTimestampTransforms() - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
The transforms applied to the arrival time of an element to determine when this trigger allows
output.
- getTimestampType() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
-
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- getTimeZone() - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- getTiming() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return the timing of this pane.
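An illustrative sketch of reading the pane timing inside a DoFn; with early or late firings configured, the timing distinguishes EARLY, ON_TIME, and LATE panes:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.windowing.PaneInfo;

    public class PaneTimingFn extends DoFn<Long, String> {
      @ProcessElement
      public void processElement(ProcessContext c) {
        // PaneInfo.Timing is EARLY, ON_TIME, LATE, or UNKNOWN for the firing pane.
        PaneInfo.Timing timing = c.pane().getTiming();
        c.output(c.element() + " fired " + timing);
      }
    }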
- getTo() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range end timestamp (exclusive).
- getTo() - Method in class org.apache.beam.sdk.io.range.OffsetRange
-
- getToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Unique partition identifier, which can be used to perform a change stream query.
- getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
Deprecated.
- getTokenNames() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
Deprecated.
- getToKvs() - Method in class org.apache.beam.sdk.schemas.transforms.Group.ByFields
-
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The topic from which to read Pub/Sub messages.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the topic being written to.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the topic being read from.
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
-
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
-
- getTopic() - Method in class org.apache.beam.sdk.io.pulsar.PulsarMessage
-
- getTopicPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
-
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
- getTopics() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
-
- getToRowFunction() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoDynamicMessageSchema
-
- getToRowFunction() - Method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Returns the toRow conversion function.
- getToRowFunction(Class<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts an object of the specified type to a Row object.
- getToRowFunction(TypeDescriptor<T>) - Method in class org.apache.beam.sdk.schemas.SchemaRegistry
-
Retrieve the function that converts an object of the specified type to a Row object.
- getToRowFunction(Class<T>, Schema) - Static method in class org.apache.beam.sdk.schemas.utils.AvroUtils
-
- getToRowFunction() - Method in class org.apache.beam.sdk.values.PCollection
-
Returns the attached schema's toRowFunction.
- getTotalBacklogBytes() - Method in interface org.apache.beam.sdk.io.jms.AutoScaler
-
Returns the size of the backlog of unread data in the underlying data source represented by all
splits of this source.
- getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.jms.DefaultAutoscaler
-
- getTotalBacklogBytes() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns the size of the backlog of unread data in the underlying data source represented by
all splits of this source.
- getTotalStreamDuration() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total stream duration of change stream records so far.
- getTotalStreamTimeMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The total streaming time (in millis) for this record.
- getTraitDef() - Method in enum org.apache.beam.sdk.extensions.sql.impl.rel.BeamLogicalConvention
-
- getTransactionIsolation() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getTransform(RunnerApi.FunctionSpec) - Method in interface org.apache.beam.sdk.expansion.service.ExpansionService.TransformProvider
-
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.DataEndpoint
-
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
-
- getTransformId() - Method in class org.apache.beam.sdk.fn.data.TimerEndpoint
-
- getTransformingMap(Map<K1, V1>, Function<K1, K2>, Function<V1, V2>) - Static method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils
-
- getTransformNameMapping() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}.
- getTransformPayloadTranslators() - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.Registrar
-
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
-
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
-
- getTransformStepNames() - Method in class org.apache.beam.runners.dataflow.DataflowPipelineJob
-
- getTransformTranslator(Class<TransformT>) - Method in class org.apache.beam.runners.dataflow.DataflowPipelineTranslator
-
Returns the TransformTranslator to use for instances of the specified PTransform class, or null if none registered.
- getTransformTranslator(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.PipelineTranslatorBatch
-
Returns a translator for the given node, if it is possible, otherwise null.
- getTransformTranslator(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
- getTransformTranslator(TransformHierarchy.Node) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.streaming.PipelineTranslatorStreaming
-
Returns a translator for the given node, if it is possible, otherwise null.
- getTranslationContext() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator
-
- getTranslator() - Method in class org.apache.beam.runners.dataflow.DataflowRunner
-
Returns the DataflowPipelineTranslator associated with this object.
- getTransport() - Static method in class org.apache.beam.sdk.extensions.gcp.util.Transport
-
- getTrigger() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getTruncatedRestriction() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
-
- getTruncateTimestamps() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog.DataCatalogPipelineOptions
-
Whether to truncate timestamps in tables described by Data Catalog.
- getTruncateTimestamps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
-
- getTSetEnvironment() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getTSetGraph() - Method in class org.apache.beam.runners.twister2.Twister2PipelineExecutionEnvironment
-
- getTupleTag() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.TaggedKeyedPCollection
-
Returns the TupleTag of this TaggedKeyedPCollection.
- getTupleTagId(PValue) - Static method in class org.apache.beam.runners.jet.Utils
-
- getTupleTagList() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
Returns the TupleTagList tuple associated with this schema.
- getTwister2Home() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getType() - Method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns the type this coder encodes/decodes.
- getType(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getType() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPMeasure
-
- getType() - Method in class org.apache.beam.sdk.extensions.sql.meta.Table
-
type of the table.
- getType() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The type of the column.
- getType() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getType() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns the field type.
- getType() - Method in class org.apache.beam.sdk.schemas.Schema.Field
-
- getType(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the type of an option.
- getType() - Method in interface org.apache.beam.sdk.testing.TestStream.Event
-
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
- getType() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
- getType() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.ViewFn
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollection
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.ListViewFn2
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MapViewFn2
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.MultimapViewFn2
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn
-
Deprecated.
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.PCollectionViews.SingletonViewFn2
-
- getTypeDescriptor() - Method in class org.apache.beam.sdk.values.TupleTag
-
Returns a TypeDescriptor capturing what is known statically about the type of this TupleTag instance's most-derived class.
- getTypeFactory() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getTypeMap() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getTypeName() - Method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- getTypeName() - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- getTypeNames() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- getTypeParameter(String) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a TypeVariable for the named type parameter.
- getTypePayload() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
-
- getTypes() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns a set of TypeDescriptors, one for each superclass as well as each interface implemented by this class.
- getTypeUrn() - Method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService.ArtifactDestination
-
- getUdaf(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
-
- getUdafImpl() - Method in class org.apache.beam.sdk.extensions.sql.impl.LazyAggregateCombineFn
-
- getUdafs() - Method in interface org.apache.beam.sdk.extensions.sql.meta.provider.UdfUdafProvider
-
- getUnboundedReaderMaxElements() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max elements read from an UnboundedReader before checkpointing.
- getUnboundedReaderMaxReadTimeSec() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max amount of time an UnboundedReader is consumed before checkpointing.
- getUnboundedReaderMaxWaitForElementsMs() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The max amount of time waiting for elements when reading from UnboundedReader.
- getUnderlyingDoFn() - Method in class org.apache.beam.runners.dataflow.BatchStatefulParDoOverrides.BatchStatefulDoFn
-
- getUnfinishedEndpoints() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver2
-
Get all unfinished data and timers endpoints represented as [transform_id]:data and
[transform_id]:timers:[timer_family_id].
- getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
- getUnionCoder() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- getUnionTag() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
-
- getUniqueId() - Method in class org.apache.beam.sdk.io.aws2.kinesis.KinesisRecord
-
- getUniqueId() - Method in class org.apache.beam.sdk.io.kinesis.KinesisRecord
-
- getUntilTrigger() - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
The trigger that signals termination of this trigger.
- getUploadBufferSizeBytes() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.CreateOptions
-
If non-null, the upload buffer size to be used.
- getUrl() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getUrl() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getUrn(PrimitiveParDoSingleFactory.ParDoSingle<?, ?>) - Method in class org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory.PayloadTranslator
-
- getUrn() - Method in interface org.apache.beam.sdk.transforms.Materialization
-
- getUseActiveSparkSession() - Method in interface org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions
-
- getUsePublicIps() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies whether worker pools should be started with public IP addresses.
- getUserAgent() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
A user agent string as per RFC2616, describing the pipeline to external services.
- getUserId() - Method in class org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage
-
- getUsername() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- getUsername() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getUsername() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getUsername() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getUsesProvidedSparkContext() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- getUseStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- getUseStorageWriteApi() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getUseStorageWriteApiAtLeastOnce() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
-
- getUsingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
-
- getUUID() - Method in class org.apache.beam.sdk.schemas.Schema
-
Get this schema's UUID.
- getValue() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer.FlinkDistributionGauge
-
- getValue() - Method in class org.apache.beam.runners.flink.metrics.FlinkMetricContainer.FlinkGauge
-
- getValue(String, Class<T>) - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregators
-
- getValue(String, MetricFilter) - Method in class org.apache.beam.runners.spark.metrics.AggregatorMetric
-
- getValue(String, Class<T>) - Method in class org.apache.beam.runners.spark.structuredstreaming.aggregators.NamedAggregators
-
- getValue(String, MetricFilter) - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.AggregatorMetric
-
- getValue() - Method in class org.apache.beam.runners.spark.util.ByteArray
-
- getValue() - Method in class org.apache.beam.runners.spark.util.SideInputBroadcast
-
- getValue() - Method in class org.apache.beam.sdk.coders.StructuralByteArray
-
- getValue() - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
-
- getValue() - Method in class org.apache.beam.sdk.io.range.ByteKey
-
Returns a read-only ByteBuffer representing this ByteKey.
- getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult.EmptyGaugeResult
-
- getValue() - Method in class org.apache.beam.sdk.metrics.GaugeResult
-
- getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType.Value
-
Return the integer enum value.
- getValue(Class<T>) - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the current value of the OneOf as the destination type.
- getValue() - Method in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType.Value
-
Returns the current value of the OneOf.
- getValue(String) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValue(String, Class<T>) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
Retrieve the value of the display item.
- getValue() - Method in class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
The value of the display item.
- getValue() - Method in class org.apache.beam.sdk.transforms.join.RawUnionValue
-
- getValue() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the value of this FailsafeValueInSingleWindow.
- getValue() - Method in class org.apache.beam.sdk.values.KV
-
Returns the value of this KV.
- getValue(int) - Method in class org.apache.beam.sdk.values.Row
-
Get value by field index; a ClassCastException is thrown if the schema doesn't match.
- getValue(String) - Method in class org.apache.beam.sdk.values.Row
-
Get value by field name; a ClassCastException is thrown if the type doesn't match.
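The following sketch is an editor's illustration, not part of the generated index; the schema, field names, and values are made up, and the Beam Java SDK is assumed to be on the classpath.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    class RowAccessExample {
      static void readFields() {
        // Illustrative schema and row (names and values invented for this sketch).
        Schema schema = Schema.builder().addStringField("name").addInt32Field("age").build();
        Row row = Row.withSchema(schema).addValues("Ada", 36).build();

        // Access by field name or by field index; the result is cast at the call site,
        // so a mismatched type surfaces as a ClassCastException at runtime.
        String name = row.getValue("name");
        Integer age = row.getValue(1);
        System.out.println(name + " is " + age);
      }
    }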
- getValue(int) - Method in class org.apache.beam.sdk.values.RowWithGetters
-
- getValue(int) - Method in class org.apache.beam.sdk.values.RowWithStorage
-
- getValue() - Method in class org.apache.beam.sdk.values.TaggedPValue
-
- getValue() - Method in class org.apache.beam.sdk.values.TimestampedValue
-
- getValue() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the value of this ValueInSingleWindow.
- getValue() - Method in class org.apache.beam.sdk.values.ValueWithRecordId
-
- getValueCaptureType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The capture type of the change stream that generated this record.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.KvCoder
-
- getValueCoder() - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
Gets the value coder that will be prefixed by the length.
- getValueCoder() - Method in class org.apache.beam.sdk.coders.MapCoder
-
- getValueCoder() - Method in class org.apache.beam.sdk.coders.NullableCoder
-
- getValueCoder() - Method in class org.apache.beam.sdk.coders.SortedMapCoder
-
- getValueCoder() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getValueCoder() - Method in class org.apache.beam.sdk.testing.TestStream
-
- getValueCoder() - Method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- getValueCoder() - Method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- getValueOrDefault(String, T) - Method in class org.apache.beam.sdk.schemas.Schema.Options
-
Get the value of an option.
- getValues() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
- getValues() - Method in class org.apache.beam.sdk.values.Row
-
Return the list of raw unmodified data values to enable 0-copy code.
- getValues() - Method in class org.apache.beam.sdk.values.RowWithGetters
-
Return the list of raw unmodified data values to enable 0-copy code.
- getValues() - Method in class org.apache.beam.sdk.values.RowWithStorage
-
- getValuesMap() - Method in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
- getValueTranslationFunction() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getValueTypeDescriptor() - Method in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO.Read
-
- getVerifyRowValues() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
-
- getView() - Method in class org.apache.beam.runners.dataflow.CreateDataflowView
-
- getView() - Method in class org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
-
- getView() - Method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
-
- getViewFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
- getViewFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- getVocabulary() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.CrossLanguageConfiguration
-
- getWarehouse() - Method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceConfiguration
-
- getWarehouse() - Method in interface org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions
-
- getWarnings() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- getWatchInterval() - Method in class org.apache.beam.sdk.io.FileIO.MatchConfiguration
-
- getWatermark() - Method in class org.apache.beam.runners.spark.io.MicrobatchSource.Reader
-
- getWatermark() - Method in interface org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.aws2.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time for which all records with a timestamp less than it have been processed.
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
-
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns watermark for the partition.
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
-
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
-
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
-
- getWatermark() - Method in interface org.apache.beam.sdk.io.kinesis.WatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.ArrivalTimeWatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.CustomWatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.kinesis.WatermarkPolicyFactory.ProcessingTimeWatermarkPolicy
-
- getWatermark() - Method in class org.apache.beam.sdk.io.UnboundedSource.UnboundedReader
-
Returns a timestamp before or at the timestamps of all future elements read by this reader.
- getWatermark() - Method in class org.apache.beam.sdk.testing.TestStream.WatermarkEvent
-
- getWatermarkAndState() - Method in interface org.apache.beam.sdk.fn.splittabledofn.WatermarkEstimators.WatermarkAndStateObserver
-
- getWatermarkMillis() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterAll
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterEach
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.AfterWatermarkEarlyAndLate
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.AfterWatermark.FromEndOfWindow
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Never.NeverTrigger
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.OrFinallyTrigger
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Repeatedly
-
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.ReshuffleTrigger
-
Deprecated.
- getWatermarkThatGuaranteesFiring(BoundedWindow) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
For internal use only; no backwards-compatibility guarantees.
- getWeigher(Coder<InputT>) - Method in class org.apache.beam.sdk.transforms.GroupIntoBatches.BatchingParams
-
- getWindmillServiceEndpoint() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
Custom windmill service endpoint.
- getWindmillServicePort() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
- getWindow() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
This method returns the number of tuples in each window.
- getWindow() - Method in class org.apache.beam.sdk.io.FileBasedSink.FileResult
-
- getWindow() - Method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
Returns the window of this FailsafeValueInSingleWindow.
- getWindow() - Method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
Returns the window of this ValueInSingleWindow.
- getWindowCoder() - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
-
- getWindowedValueCoder(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
- getWindowedWrites() - Method in class org.apache.beam.sdk.io.WriteFiles
-
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window.Assign
-
- getWindowFn() - Method in class org.apache.beam.sdk.transforms.windowing.Window
-
- getWindowFn() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- getWindowingStrategy(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
- getWindowingStrategy() - Method in class org.apache.beam.sdk.values.PCollection
-
- getWindowingStrategyInternal() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
- getWindowingStrategyInternal() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- getWindowMappingFn() - Method in interface org.apache.beam.sdk.values.PCollectionView
-
For internal use only.
- getWindowMappingFn() - Method in class org.apache.beam.sdk.values.PCollectionViews.SimplePCollectionView
-
- getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
- getWindowTypeDescriptor() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns a TypeDescriptor capturing what is known statically about the window type of this WindowFn instance's most-derived class.
- getWorkCompleted() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
The known amount of completed work.
- getWorkerCacheMb() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions
-
The size of the worker's in-memory cache, in megabytes.
- getWorkerCPUs() - Method in interface org.apache.beam.runners.twister2.Twister2PipelineOptions
-
- getWorkerDiskType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Specifies what type of persistent disk is used.
- getWorkerHarnessContainerImage() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
- getWorkerId() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the worker running this pipeline.
- getWorkerId() - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
- getWorkerLogLevelOverrides() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
This option controls the log levels for specifically named loggers.
- getWorkerMachineType() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions
-
Machine type to create Dataflow worker VMs as.
- getWorkerPool() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
-
The identity of the worker pool of this worker.
- getWorkerRegion() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones)
in which worker processing should occur, e.g.
- getWorkerSystemErrMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
Controls the log level given to messages printed to System.err.
- getWorkerSystemOutMessageLevel() - Method in interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions
-
Deprecated.
Controls the log level given to messages printed to System.out.
- getWorkerZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in
which worker processing should occur, e.g.
- getWorkRemaining() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.Progress
-
The known amount of work remaining.
- getWritableByteChannelFactory() - Method in class org.apache.beam.sdk.io.FileBasedSink
-
- getWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
-
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
-
Specifies what to do with existing data in the table, in case the table already exists.
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.crosslanguage.WriteBuilder.Configuration
-
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.snowflake.services.SnowflakeBatchServiceConfig
-
- getWriteFailures() - Method in exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
-
- getWriteOperation() - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
Return the WriteOperation that this Writer belongs to.
- getWriteResult() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
-
- getZetaSqlDefaultTimezone() - Method in interface org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions
-
- getZetaSqlRuleSets() - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- getZetaSqlRuleSets(Collection<RelOptRule>) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner
-
- getZone() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
- global(Map<Integer, GlobalWatermarkHolder.SparkWatermarks>) - Static method in class org.apache.beam.runners.spark.stateful.SparkTimerInternals
-
Build a global TimerInternals for all feeding streams.
- Global() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group.Global
-
- globalDefault() - Static method in class org.apache.beam.sdk.values.WindowingStrategy
-
Return a fully specified, default windowing strategy.
- GlobalDigest() - Constructor for class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles.GlobalDigest
-
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct
-
Computes the approximate number of distinct elements in the input PCollection<InputT> and returns a PCollection<Long>.
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.SketchFrequencies
-
Create the PTransform that will build a Count-min sketch for keeping track of the frequency of the elements in the whole stream.
- globally() - Static method in class org.apache.beam.sdk.extensions.sketching.TDigestQuantiles
-
Computes the stream in order to build a T-Digest structure (MergingDigest) for keeping track of the stream distribution and returns a PCollection<MergingDigest>.
- globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct
-
- Globally() - Constructor for class org.apache.beam.sdk.extensions.zetasketch.ApproximateCountDistinct.Globally
-
- globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Extract
-
Returns a PTransform that takes an input PCollection<byte[]> of HLL++ sketches and returns a PCollection<Long> of the estimated count of distinct elements extracted from each sketch.
- globally() - Method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.Init.Builder
-
Returns a Combine.Globally PTransform that takes an input PCollection<InputT> and returns a PCollection<byte[]> which consists of the HLL++ sketch computed from the elements in the input PCollection.
- globally() - Static method in class org.apache.beam.sdk.extensions.zetasketch.HllCount.MergePartial
-
Returns a Combine.Globally PTransform that takes an input PCollection<byte[]> of HLL++ sketches and returns a PCollection<byte[]> of a new sketch merged from the input sketches.
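An editor's hedged sketch (not part of the generated index) of how the Init, MergePartial, and Extract transforms above fit together; the class and method names HllCountExample/estimateDistinct are invented, and Beam's zetasketch extension is assumed to be on the classpath.

    import org.apache.beam.sdk.extensions.zetasketch.HllCount;
    import org.apache.beam.sdk.values.PCollection;

    class HllCountExample {
      // Builds HLL++ sketches from integers, merges them, and extracts the estimated
      // number of distinct elements. MergePartial is usually applied when sketches
      // from several sources need to be combined into one.
      static PCollection<Long> estimateDistinct(PCollection<Integer> input) {
        PCollection<byte[]> sketches = input.apply(HllCount.Init.forIntegers().globally());
        PCollection<byte[]> merged = sketches.apply(HllCount.MergePartial.globally());
        return merged.apply(HllCount.Extract.globally());
      }
    }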
- globally() - Static method in class org.apache.beam.sdk.schemas.transforms.Group
-
Returns a transform that groups all elements in the input PCollection.
- globally(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
Returns a PTransform that takes a PCollection<T> and returns a PCollection<List<T>> whose single value is a List of the approximate N-tiles of the elements of the input PCollection.
- globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateQuantiles
-
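A minimal, hypothetical usage sketch of ApproximateQuantiles.globally (editor's addition; the wrapper class name and quantile count are illustrative):

    import java.util.List;
    import org.apache.beam.sdk.transforms.ApproximateQuantiles;
    import org.apache.beam.sdk.values.PCollection;

    class QuantilesExample {
      // Approximate 5-tiles (min, quartiles, max) of the input,
      // using Integer's natural ordering.
      static PCollection<List<Integer>> quartiles(PCollection<Integer> input) {
        return input.apply(ApproximateQuantiles.<Integer>globally(5));
      }
    }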
- globally(int) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.
Returns a PTransform that takes a PCollection<T> and returns a PCollection<Long> containing a single value that is an estimate of the number of distinct elements in the input PCollection.
- globally(double) - Static method in class org.apache.beam.sdk.transforms.ApproximateUnique
-
Deprecated.
- Globally(int) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- Globally(double) - Constructor for class org.apache.beam.sdk.transforms.ApproximateUnique.Globally
-
Deprecated.
- globally(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a Combine.Globally PTransform that uses the given SerializableFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
- globally(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a Combine.Globally PTransform that uses the given SerializableBiFunction to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
- globally(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a Combine.Globally PTransform that uses the given GloballyCombineFn to combine all the elements in each window of the input PCollection into a single value in the output PCollection.
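A short, editor-provided sketch of Combine.globally; the wrapper class name is invented, and Sum.ofIntegers() is used here as one convenient GlobalCombineFn:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;

    class CombineGloballyExample {
      // Combines all integers in each window into a single sum. A lambda such as
      // (Integer a, Integer b) -> a + b could be passed to the SerializableBiFunction
      // overload instead of a prebuilt CombineFn.
      static PCollection<Integer> sumAll(PCollection<Integer> input) {
        return input.apply(Combine.globally(Sum.ofIntegers()));
      }
    }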
- globally() - Static method in class org.apache.beam.sdk.transforms.Count
-
- globally() - Static method in class org.apache.beam.sdk.transforms.Latest
-
Returns a PTransform that takes as input a PCollection<T> and returns a PCollection<T> whose contents is the latest element according to its event time, or null if there are no elements.
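An illustrative sketch (editor's addition; the wrapper class name is invented) of Latest.globally:

    import org.apache.beam.sdk.transforms.Latest;
    import org.apache.beam.sdk.values.PCollection;

    class LatestExample {
      // Keeps only the element with the latest event-time timestamp in each window.
      static PCollection<String> latestOnly(PCollection<String> input) {
        return input.apply(Latest.<String>globally());
      }
    }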
- globally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum according to the natural ordering of T of the input PCollection's elements, or null if there are no elements.
- globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the maximum of the input PCollection's elements, or null if there are no elements.
- globally() - Static method in class org.apache.beam.sdk.transforms.Mean
-
Returns a PTransform that takes an input PCollection<NumT> and returns a PCollection<Double> whose contents is the mean of the input PCollection's elements, or 0 if there are no elements.
- globally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum according to the natural ordering of T of the input PCollection's elements, or null if there are no elements.
- globally(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> whose contents is the minimum of the input PCollection's elements, or null if there are no elements.
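A combined, editor-provided sketch of the Max, Mean, and Min global combines listed above (class and step names are illustrative):

    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Mean;
    import org.apache.beam.sdk.transforms.Min;
    import org.apache.beam.sdk.values.PCollection;

    class MinMaxMeanExample {
      // Natural-ordering extremes and the arithmetic mean of an integer collection,
      // each computed per window as a single output value.
      static void summarize(PCollection<Integer> input) {
        PCollection<Integer> max = input.apply("Max", Max.<Integer>globally());
        PCollection<Integer> min = input.apply("Min", Min.<Integer>globally());
        PCollection<Double> mean = input.apply("Mean", Mean.<Integer>globally());
      }
    }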
- GloballyDistinct() - Constructor for class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.GloballyDistinct
-
- GlobalSketch() - Constructor for class org.apache.beam.sdk.extensions.sketching.SketchFrequencies.GlobalSketch
-
- GlobalWatermarkHolder - Class in org.apache.beam.runners.spark.util
-
A store to hold the global watermarks for a micro-batch.
- GlobalWatermarkHolder() - Constructor for class org.apache.beam.runners.spark.util.GlobalWatermarkHolder
-
- GlobalWatermarkHolder.SparkWatermarks - Class in org.apache.beam.runners.spark.util
-
- GlobalWatermarkHolder.WatermarkAdvancingStreamingListener - Class in org.apache.beam.runners.spark.util
-
Advances the watermarks (WMs) on the onBatchCompleted event.
- GlobalWindow - Class in org.apache.beam.sdk.transforms.windowing
-
The default window into which all data is placed (via GlobalWindows).
- GlobalWindow.Coder - Class in org.apache.beam.sdk.transforms.windowing
-
- GlobalWindows - Class in org.apache.beam.sdk.transforms.windowing
-
A WindowFn that assigns all data to the same window.
- GlobalWindows() - Constructor for class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
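A brief, editor-provided sketch of re-windowing into the single global window with GlobalWindows (the wrapper class name is invented):

    import org.apache.beam.sdk.transforms.windowing.GlobalWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;

    class GlobalWindowExample {
      // Re-windows the input so every element lands in the single global window.
      static PCollection<String> toGlobalWindow(PCollection<String> input) {
        return input.apply(Window.<String>into(new GlobalWindows()));
      }
    }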
- GoogleApiDebugOptions - Interface in org.apache.beam.sdk.extensions.gcp.options
-
These options configure debug settings for Google API clients created within the Apache Beam SDK.
- GoogleApiDebugOptions.GoogleApiTracer - Class in org.apache.beam.sdk.extensions.gcp.options
-
- GoogleApiTracer() - Constructor for class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
- GraphiteSink - Class in org.apache.beam.runners.spark.metrics.sink
-
- GraphiteSink(Properties, MetricRegistry, SecurityManager) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
-
Constructor for Spark 3.1.x and earlier.
- GraphiteSink(Properties, MetricRegistry) - Constructor for class org.apache.beam.runners.spark.metrics.sink.GraphiteSink
-
Constructor for Spark 3.2.x and later.
- greaterThan(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- greaterThan(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- greaterThan(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than a given value, based on the elements' natural ordering.
- greaterThanEq(T) - Static method in class org.apache.beam.sdk.transforms.Filter
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<T> with elements that are greater than or equal to a given value, based on the elements' natural ordering.
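An editor-added sketch of Filter.greaterThan and Filter.greaterThanEq; the class name and threshold are illustrative:

    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;

    class FilterExample {
      // Keeps elements strictly greater than 10, based on Integer's natural ordering.
      static PCollection<Integer> strictlyAbove(PCollection<Integer> input) {
        return input.apply(Filter.greaterThan(10));
      }

      // Keeps elements greater than or equal to 10.
      static PCollection<Integer> atLeast(PCollection<Integer> input) {
        return input.apply(Filter.greaterThanEq(10));
      }
    }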
- greaterThanOrEqualTo(T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- greaterThanOrEqualTo(Coder<T>, T) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- Group - Class in org.apache.beam.sdk.schemas.transforms
-
- Group() - Constructor for class org.apache.beam.sdk.schemas.transforms.Group
-
- Group.AggregateCombiner<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform that does a combine using an aggregation built up by calls to aggregateField and aggregateFields.
- Group.ByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform that groups schema elements based on the given fields.
- Group.CombineFieldsByFields<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform that does a per-key combine using an aggregation built up by calls to aggregateField and aggregateFields.
- Group.CombineFieldsGlobally<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform that does a global combine using an aggregation built up by calls to aggregateField and aggregateFields.
- Group.CombineGlobally<InputT,OutputT> - Class in org.apache.beam.sdk.schemas.transforms
-
- Group.Global<InputT> - Class in org.apache.beam.sdk.schemas.transforms
-
A PTransform for doing global aggregations on schema PCollections.
- GroupAlsoByWindowViaOutputBufferFn<K,InputT,W extends BoundedWindow> - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
A FlatMap function that groups by windows in batch mode using ReduceFnRunner.
- GroupAlsoByWindowViaOutputBufferFn(WindowingStrategy<?, W>, StateInternalsFactory<K>, SystemReduceFn<K, InputT, Iterable<InputT>, Iterable<InputT>, W>, SerializablePipelineOptions) - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.GroupAlsoByWindowViaOutputBufferFn
-
- GroupByKey<K,V> - Class in org.apache.beam.sdk.transforms
-
GroupByKey<K, V> takes a PCollection<KV<K, V>>, groups the values by key and windows, and returns a PCollection<KV<K, Iterable<V>>> representing a map from each distinct key and window of the input PCollection to an Iterable over all the values associated with that key in the input per window.
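A small end-to-end sketch of GroupByKey (editor's addition; the pipeline, keys, and values are invented, and a runner such as the DirectRunner is assumed to be on the classpath):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class GroupByKeyExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<KV<String, Integer>> scores =
            p.apply(Create.of(Arrays.asList(
                KV.of("alice", 3), KV.of("alice", 5), KV.of("bob", 7))));

        // All values for each key (within each window) are collected into one Iterable.
        PCollection<KV<String, Iterable<Integer>>> grouped =
            scores.apply(GroupByKey.<String, Integer>create());

        p.run().waitUntilFinish();
      }
    }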
- groupByKeyAndWindow(JavaDStream<WindowedValue<KV<K, InputT>>>, Coder<K>, Coder<WindowedValue<InputT>>, WindowingStrategy<?, W>, SerializablePipelineOptions, List<Integer>, String) - Static method in class org.apache.beam.runners.spark.stateful.SparkGroupAlsoByWindowViaWindowSet
-
- GroupByKeyTranslatorBatch<K,V> - Class in org.apache.beam.runners.twister2.translators.batch
-
GroupByKey translator.
- GroupByKeyTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.GroupByKeyTranslatorBatch
-
- GroupByWindowFunction<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.twister2.translators.functions
-
GroupBy window function.
- GroupByWindowFunction() - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
-
- GroupByWindowFunction(WindowingStrategy<?, W>, SystemReduceFn<K, V, Iterable<V>, Iterable<V>, W>, PipelineOptions) - Constructor for class org.apache.beam.runners.twister2.translators.functions.GroupByWindowFunction
-
- grouped() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
- groupedValues(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given SerializableFunction to combine all the values associated with a key, ignoring the key.
- groupedValues(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given SerializableFunction to combine all the values associated with a key, ignoring the key.
- groupedValues(CombineFnBase.GlobalCombineFn<? super InputT, ?, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Combine
-
Returns a Combine.GroupedValues PTransform that takes a PCollection of KVs where a key maps to an Iterable of values, e.g., the result of a GroupByKey, then uses the given CombineFn to combine all the values associated with a key, ignoring the key.
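A hedged sketch (editor's addition; the wrapper class name is invented) of the usual GroupByKey followed by Combine.groupedValues pattern:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    class GroupedValuesExample {
      // Groups the values per key, then sums each key's Iterable of values;
      // the keys themselves pass through unchanged.
      static PCollection<KV<String, Integer>> perKeySums(PCollection<KV<String, Integer>> input) {
        PCollection<KV<String, Iterable<Integer>>> grouped =
            input.apply(GroupByKey.<String, Integer>create());
        return grouped.apply(Combine.<String, Integer, Integer>groupedValues(Sum.ofIntegers()));
      }
    }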
- GroupingState<InputT,OutputT> - Interface in org.apache.beam.sdk.state
-
A ReadableState cell that combines multiple input values and outputs a single value of a different type.
- GroupIntoBatches<K,InputT> - Class in org.apache.beam.sdk.transforms
-
A PTransform that batches inputs to a desired batch size.
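An editor-added sketch of GroupIntoBatches; the batch size of 100 and the wrapper class name are illustrative:

    import org.apache.beam.sdk.transforms.GroupIntoBatches;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    class GroupIntoBatchesExample {
      // Emits per-key batches of up to 100 elements each.
      static PCollection<KV<String, Iterable<String>>> inBatches(
          PCollection<KV<String, String>> input) {
        return input.apply(GroupIntoBatches.<String, String>ofSize(100));
      }
    }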
- GroupIntoBatches.BatchingParams<InputT> - Class in org.apache.beam.sdk.transforms
-
Wrapper class for batching parameters supplied by users.
- GroupIntoBatches.WithShardedKey - Class in org.apache.beam.sdk.transforms
-
- GroupIntoBatchesOverride - Class in org.apache.beam.runners.dataflow
-
- GroupIntoBatchesOverride() - Constructor for class org.apache.beam.runners.dataflow.GroupIntoBatchesOverride
-
- GrowableOffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
-
- GrowableOffsetRangeTracker(long, GrowableOffsetRangeTracker.RangeEndEstimator) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
-
- GrowableOffsetRangeTracker.RangeEndEstimator - Interface in org.apache.beam.sdk.transforms.splittabledofn
-
Provides the estimated end offset of the range.
- Growth() - Constructor for class org.apache.beam.sdk.transforms.Watch.Growth
-
- growthOf(Watch.Growth.PollFn<InputT, OutputT>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function.
- growthOf(Watch.Growth.PollFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function.
- growthOf(Contextful<Watch.Growth.PollFn<InputT, OutputT>>, SerializableFunction<OutputT, KeyT>) - Static method in class org.apache.beam.sdk.transforms.Watch
-
Watches the growth of the given poll function, using the given "key function" to deduplicate
outputs.
- GrpcContextHeaderAccessorProvider - Class in org.apache.beam.sdk.fn.server
-
A HeaderAccessorProvider which intercepts the header in a GRPC request and exposes the relevant fields.
- GrpcContextHeaderAccessorProvider() - Constructor for class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
-
- GrpcDataService - Class in org.apache.beam.runners.fnexecution.data
-
- GrpcDataService() - Constructor for class org.apache.beam.runners.fnexecution.data.GrpcDataService
-
- GrpcFnServer<ServiceT extends FnService> - Class in org.apache.beam.sdk.fn.server
-
A gRPC Server which manages a single FnService.
- GrpcLoggingService - Class in org.apache.beam.runners.fnexecution.logging
-
An implementation of the Beam Fn Logging Service over gRPC.
- GrpcStateService - Class in org.apache.beam.runners.fnexecution.state
-
An implementation of the Beam Fn State service.
- id() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
-
- id - Variable in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
-
- identifier() - Method in class org.apache.beam.sdk.extensions.protobuf.ProtoPayloadSerializerProvider
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.Fixed32
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.Fixed64
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SFixed32
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SFixed64
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SInt32
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.SInt64
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.UInt32
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.protobuf.ProtoSchemaLogicalTypes.UInt64
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.LocalMktDate
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimeOnly
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.TZTimestamp
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint16
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint32
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint64
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.Uint8
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCDateOnly
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimeOnly
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sbe.SbeLogicalTypes.UTCTimestamp
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.CharType
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.TimeWithLocalTzType
-
- identifier() - Method in class org.apache.beam.sdk.io.AvroSchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
-
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
- identifier() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
-
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
-
- identifier() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.thrift.ThriftPayloadSerializerProvider
-
- identifier() - Method in class org.apache.beam.sdk.schemas.io.payloads.AvroPayloadSerializerProvider
-
- identifier() - Method in class org.apache.beam.sdk.schemas.io.payloads.JsonPayloadSerializerProvider
-
- identifier() - Method in interface org.apache.beam.sdk.schemas.io.Providers.Identifyable
-
Returns an id that uniquely represents this among others implementing its derived interface.
- identifier() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
Returns an id that uniquely represents this IO.
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.EnumerationType
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.MicrosInstant
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosDuration
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.NanosInstant
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.OneOfType
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.PythonCallable
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.SchemaLogicalType
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.logicaltypes.UuidLogicalType
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationLexer
-
- IDENTIFIER - Static variable in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser
-
- IDENTIFIER() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.QualifiedComponentContext
-
- IDENTIFIER() - Method in class org.apache.beam.sdk.schemas.parser.generated.FieldSpecifierNotationParser.SimpleIdentifierContext
-
- identifier() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns an id that uniquely represents this transform.
- Identifier() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
-
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineDoubleFn
-
Returns the identity element of this operation, i.e.
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns the value that should be used for the combine of the empty set.
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineIntegerFn
-
Returns the identity element of this operation, i.e.
- identity() - Method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineLongFn
-
Returns the identity element of this operation, i.e.
- identity() - Static method in class org.apache.beam.sdk.transforms.SerializableFunctions
-
- IDENTITY_ELEMENT - Static variable in class org.apache.beam.sdk.metrics.DistributionResult
-
The IDENTITY_ELEMENT is used to start accumulating distributions.
- IdGenerator - Interface in org.apache.beam.sdk.fn
-
A generator of unique IDs.
- IdGenerators - Class in org.apache.beam.sdk.fn
-
- IdGenerators() - Constructor for class org.apache.beam.sdk.fn.IdGenerators
-
- ignored() - Static method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
-
Returns a handler that ignores metrics.
- ignoreInput(Watch.Growth.TerminationCondition<?, StateT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth
-
Wraps a given input-independent Watch.Growth.TerminationCondition as an equivalent condition with a given input type, passing null to the original condition as input.
- ignoreInsertIds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Setting this option to true disables insertId based data deduplication offered by BigQuery.
- ignoreUnknownValues() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Accept rows that contain values that do not match the schema.
- immediate(T) - Static method in class org.apache.beam.sdk.state.ReadableStates
-
A ReadableState constructed from a constant value, hence immediately available.
- immutableNames() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
-
- immutableNamesBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
- immutableSteps() - Method in class org.apache.beam.sdk.metrics.MetricsFilter
-
- immutableStepsBuilder() - Method in class org.apache.beam.sdk.metrics.MetricsFilter.Builder
-
- implement(EnumerableRelImplementor, EnumerableRel.Prefer) - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamEnumerableConverter
-
- implementor() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamSqlUnparseContext
-
- importCatalogItems() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
-
- importFhirResource(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
- importFhirResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- importResources(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Import resources.
- importResources(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Import resources.
- importUserEvents() - Static method in class org.apache.beam.sdk.extensions.ml.RecommendationAIIO
-
- Impulse - Class in org.apache.beam.sdk.transforms
-
For internal use only; no backwards-compatibility guarantees.
- IMPULSE_ELEMENT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- ImpulseP - Class in org.apache.beam.runners.jet.processors
-
Jet Processor implementation for Beam's Impulse primitive.
- ImpulseSource - Class in org.apache.beam.runners.twister2.translators.functions
-
A SourceFunc which executes the impulse transform contract.
- ImpulseSource() - Constructor for class org.apache.beam.runners.twister2.translators.functions.ImpulseSource
-
- ImpulseTranslatorBatch - Class in org.apache.beam.runners.spark.structuredstreaming.translation.batch
-
- ImpulseTranslatorBatch() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.translation.batch.ImpulseTranslatorBatch
-
- ImpulseTranslatorBatch - Class in org.apache.beam.runners.twister2.translators.batch
-
Impulse translator.
- ImpulseTranslatorBatch() - Constructor for class org.apache.beam.runners.twister2.translators.batch.ImpulseTranslatorBatch
-
- in(Pipeline, PCollection<FhirBundleResponse>, PCollection<HealthcareIOError<FhirBundleParameter>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Entry point for the ExecuteBundlesResult, storing the successful and failed bundles and their
metadata.
- in(Pipeline) - Static method in class org.apache.beam.sdk.values.PBegin
-
- in(Pipeline) - Static method in class org.apache.beam.sdk.values.PDone
-
- IN_ARRAY_OPERATOR - Static variable in class org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BeamBigQuerySqlDialect
-
- InboundDataClient - Interface in org.apache.beam.sdk.fn.data
-
- inc() - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
-
- inc(long) - Method in class org.apache.beam.runners.jet.metrics.CounterImpl
-
- inc() - Method in interface org.apache.beam.sdk.metrics.Counter
-
Increment the counter.
- inc(long) - Method in interface org.apache.beam.sdk.metrics.Counter
-
Increment the counter by the given amount.
- inc() - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
-
Increment the counter.
- inc(long) - Method in class org.apache.beam.sdk.metrics.DelegatingCounter
-
Increment the counter by the given amount.
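A minimal sketch of incrementing a user metric from a DoFn; the namespace class and counter names are illustrative:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    class CountingFn extends DoFn<String, String> {
      private final Counter elements = Metrics.counter(CountingFn.class, "elements");
      private final Counter bytes = Metrics.counter(CountingFn.class, "bytes");

      @ProcessElement
      public void processElement(@Element String element, OutputReceiver<String> out) {
        elements.inc();               // increment by 1
        bytes.inc(element.length());  // increment by a given amount
        out.output(element);
      }
    }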
- incActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- incDataRecordCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- incHeartbeatRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- include(String, HasDisplayData) - Method in interface org.apache.beam.sdk.transforms.display.DisplayData.Builder
-
Register display data from the specified subcomponent at the given path.
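A hedged sketch of nesting a subcomponent's display data under a path via include(String, HasDisplayData); the component names and item values below are hypothetical:

    import org.apache.beam.sdk.transforms.display.DisplayData;
    import org.apache.beam.sdk.transforms.display.HasDisplayData;

    class Formatter implements HasDisplayData {
      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        builder.add(DisplayData.item("format", "csv"));
      }
    }

    class MySink implements HasDisplayData {
      private final Formatter formatter = new Formatter();

      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        builder.add(DisplayData.item("path", "gs://bucket/output"));  // hypothetical value
        builder.include("formatter", formatter);  // nests the Formatter's items under "formatter"
      }
    }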
- inCombinedNonLatePanes(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only
run on the provided window across all panes that were not produced by the arrival of late
data.
- inCombinedNonLatePanes(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- IncompatibleWindowException - Exception in org.apache.beam.sdk.transforms.windowing
-
- IncompatibleWindowException(WindowFn<?, ?>, String) - Constructor for exception org.apache.beam.sdk.transforms.windowing.IncompatibleWindowException
-
- incomplete(List<TimestampedValue<OutputT>>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
Constructs a
Watch.Growth.PollResult
with the given outputs and declares that new outputs might
appear for the current input.
- incomplete(Instant, List<OutputT>) - Static method in class org.apache.beam.sdk.transforms.Watch.Growth.PollResult
-
- incPartitionRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- incPartitionRecordMergeCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- incPartitionRecordSplitCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- incQueryCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
- increment() - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData
-
Returns a RandomAccessData that is the smallest value of the same length that is strictly greater
than this.
- increment(Long) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IncrementFn
-
- incrementAll(Date) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.DateIncrementAllFn
-
- IncrementFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IncrementFn
-
- incrementingLongs() - Static method in class org.apache.beam.sdk.fn.IdGenerators
-
Returns an
IdGenerator
which provides successive incrementing longs.
- index() - Method in class org.apache.beam.sdk.io.splunk.SplunkEvent
-
- INDEX_OF_MAX - Static variable in class org.apache.beam.sdk.io.ShardNameTemplate
-
Shard name containing the index and max.
- indexOf(String) - Method in class org.apache.beam.sdk.schemas.Schema
-
Find the index of a given field.
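A small sketch of looking up a field's position in a Schema; the field names are illustrative:

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.builder()
            .addStringField("name")
            .addInt64Field("count")
            .build();

    int countIndex = schema.indexOf("count");  // 1: "count" is the second field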
- indexOfProjectionColumnRef(long, List<ResolvedColumn>) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter
-
Return an index of the projection column reference.
- inEarlyGlobalWindowPanes() - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
- inEarlyGlobalWindowPanes() - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only
run on the provided window across all panes that were produced by the arrival of early data.
- inEarlyPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- inEarlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to
only run on the provided window, running the checker only on early panes for each key.
- InferableFunction<InputT,OutputT> - Class in org.apache.beam.sdk.transforms
-
- InferableFunction() - Constructor for class org.apache.beam.sdk.transforms.InferableFunction
-
- InferableFunction(ProcessFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.transforms.InferableFunction
-
- inferSchema(CaseInsensitiveStringMap) - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.DatasetSourceBatch
-
- inferType(Object) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
- inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only
run on the provided window, running the checker only on the final pane for each key.
- inFinalPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- inFinalPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to
only run on the provided window, running the checker only on the final pane for each key.
- InfluxDbIO - Class in org.apache.beam.sdk.io.influxdb
-
IO to read and write from InfluxDB.
- InfluxDbIO.DataSourceConfiguration - Class in org.apache.beam.sdk.io.influxdb
-
A POJO describing a DataSourceConfiguration such as URL, userName and password.
- InfluxDbIO.Read - Class in org.apache.beam.sdk.io.influxdb
-
A
PTransform
to read metrics or data related to a query from InfluxDB.
- InfluxDbIO.Write - Class in org.apache.beam.sdk.io.influxdb
-
- ingestHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Ingests an HL7v2 message and returns the ingest message response.
- ingestHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
- ingestMessages(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Write with Messages.Ingest method.
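A hedged sketch of writing HL7v2 messages with the ingest method; the store path is a placeholder and `messages` is assumed to be a PCollection<HL7v2Message> built earlier in the pipeline:

    import org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO;

    // Placeholder HL7v2 store path.
    messages.apply("IngestHL7v2Messages",
        HL7v2IO.ingestMessages(
            "projects/my-project/locations/us-central1/datasets/my-dataset/hl7V2Stores/my-store"));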
- INHERIT_IO_FILE - Static variable in class org.apache.beam.runners.fnexecution.environment.ProcessManager
-
A symbolic file to indicate that we want to inherit I/O of parent process.
- init(Processor.Context) - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
-
- init(Processor.Context) - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
-
- init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator
-
Init aggregators accumulator if it has not been initialized.
- init(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator
-
Init metrics accumulator if it has not been initialized.
- init(JavaSparkContext) - Static method in class org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator
-
Init aggregators accumulator if it has not been initialized.
- init(JavaSparkContext) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator
-
Init metrics accumulator if it has not been initialized.
- initAccumulators(SparkPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.SparkRunner
-
Init Metrics/Aggregators accumulators.
- initAccumulators(SparkStructuredStreamingPipelineOptions, JavaSparkContext) - Static method in class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner
-
Init Metrics/Aggregators accumulators.
- initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
-
- initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
-
- initialBackoff() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.auth.NullCredentialInitializer
-
- initialize(AbstractGoogleClientRequest<?>) - Method in class org.apache.beam.sdk.extensions.gcp.options.GoogleApiDebugOptions.GoogleApiTracer
-
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.util.LatencyRecordingHttpRequestInitializer
-
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer
-
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
-
- InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A DoFn responsible for initializing the change stream Connector.
- InitializeDoFn(DaoFactory, MapperFactory, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
-
- InitialPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Utility class to determine initial partition constants and methods.
- InitialPartition() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
-
- initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
- initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
The restriction for a partition will be defined from the start and end timestamp to query the
partition for.
- initialSystemTimeAt(Instant) - Method in class org.apache.beam.runners.spark.io.CreateStream
-
Set the initial synchronized processing time.
- initPulsarClients() - Method in class org.apache.beam.sdk.io.pulsar.ReadFromPulsarDoFn
-
- InjectPackageStrategy(Class<?>) - Constructor for class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.InjectPackageStrategy
-
- inLatePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
with the assertion restricted to only run on the
provided window across all panes that were produced by the arrival of late data.
- inLatePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- inLatePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
with the assertion restricted to only run on the
provided window, running the checker only on late panes for each key.
- inMemory(TableProvider...) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
This method creates
BeamSqlEnv
using empty
PipelineOptions.
- InMemoryBagUserStateFactory<K,V,W extends BoundedWindow> - Class in org.apache.beam.runners.fnexecution.state
-
Holds user state in memory.
- InMemoryBagUserStateFactory() - Constructor for class org.apache.beam.runners.fnexecution.state.InMemoryBagUserStateFactory
-
- inMemoryFinalizer(InstructionRequestHandler) - Static method in class org.apache.beam.runners.fnexecution.control.BundleFinalizationHandlers
-
A bundle finalizer that stores all bundle finalization requests in memory.
- InMemoryJobService - Class in org.apache.beam.runners.jobsubmission
-
An InMemoryJobService that prepares and runs jobs on behalf of a client using a
JobInvoker
.
- InMemoryMetaStore - Class in org.apache.beam.sdk.extensions.sql.meta.store
-
A
MetaStore
which stores the meta info in memory.
- InMemoryMetaStore() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.store.InMemoryMetaStore
-
- InMemoryMetaTableProvider - Class in org.apache.beam.sdk.extensions.sql.meta.provider
-
An InMemoryMetaTableProvider
is an abstract TableProvider
for in-memory types.
- InMemoryMetaTableProvider() - Constructor for class org.apache.beam.sdk.extensions.sql.meta.provider.InMemoryMetaTableProvider
-
- inNamespace(String) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
- inNamespace(Class<?>) - Static method in class org.apache.beam.sdk.metrics.MetricNameFilter
-
- Inner() - Constructor for class org.apache.beam.sdk.schemas.transforms.Filter.Inner
-
- innerBroadcastJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform an inner join, broadcasting the right side.
- innerJoin(PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Inner join of two collections of KV elements.
- innerJoin(String, PCollection<KV<K, V1>>, PCollection<KV<K, V2>>) - Static method in class org.apache.beam.sdk.extensions.joinlibrary.Join
-
Inner join of two collections of KV elements.
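A minimal sketch of the join-library variant; `orders` and `customers` are assumed keyed PCollections built earlier in the pipeline:

    import org.apache.beam.sdk.extensions.joinlibrary.Join;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Assuming:
    //   PCollection<KV<String, Integer>> orders    (customerId -> order amount)
    //   PCollection<KV<String, String>>  customers (customerId -> customer name)
    PCollection<KV<String, KV<Integer, String>>> joined =
        Join.innerJoin("JoinOrdersToCustomers", orders, customers);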
- innerJoin(PCollection<RhsT>) - Static method in class org.apache.beam.sdk.schemas.transforms.Join
-
Perform an inner join.
- inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only
run on the provided window.
- inOnlyPane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- inOnlyPane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to
only run on the provided window.
- inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only
run on the provided window.
- inOnTimePane(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- inOnTimePane(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to
only run on the provided window, running the checker only on the on-time pane for each key.
- inOrder(Trigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
-
Returns an AfterEach
Trigger
with the given subtriggers.
- inOrder(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterEach
-
Returns an AfterEach
Trigger
with the given subtriggers.
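A hedged sketch of using the composite trigger inside a windowing configuration; the window size, element count, and element type are illustrative:

    import org.apache.beam.sdk.transforms.windowing.AfterEach;
    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    // Assuming `events` is a PCollection<String> built earlier in the pipeline.
    events.apply(
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(10)))
            .triggering(AfterEach.inOrder(
                AfterPane.elementCountAtLeast(100),  // first, fire once 100 elements arrive
                AfterWatermark.pastEndOfWindow()))   // then, fire when the watermark passes the window end
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes());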
- InProcessServerFactory - Class in org.apache.beam.sdk.fn.server
-
A
ServerFactory
which creates
servers
with the
InProcessServerBuilder
.
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
- inputCollectionNames() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns the input collection names of this transform.
- inputOf(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Returns a type descriptor for the input of the given
ProcessFunction
, subject to Java
type erasure: may contain unresolved type variables if the type was erased.
- inputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- inputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- INPUTS - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Inserts the partition metadata.
- insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Inserts the partition metadata.
- INSERT_OR_UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
-
- INSERT_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
-
- insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Inserts
TableRows
with the specified insertIds if not null.
- insertAll(TableReference, List<TableRow>, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- InsertBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertBuilder
-
- insertDataToTable(String, String, String, List<Map<String, Object>>) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Inserts rows to a table using a BigQuery streaming write.
- insertDeduplicate() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
- insertDistributedSync() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
- InsertOrUpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertOrUpdateBuilder
-
- insertQuorum() - Method in class org.apache.beam.sdk.io.clickhouse.ClickHouseIO.Write
-
- InsertRetryPolicy - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A retry policy for streaming BigQuery inserts.
- InsertRetryPolicy() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
- InsertRetryPolicy.Context - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Contains information about a failed insert.
- insertRows(Schema, Row...) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamJavaTypeFactory
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelDataTypeSystem
-
- instance() - Static method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamRelMetadataQuery
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamEnumerableConverterRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIntersectRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOSinkRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinAssociateRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMatchRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamMinusRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputLookupJoinRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSortRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamTableFunctionScanRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUncollectRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnionRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamValuesRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamWindowRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.impl.rule.LogicalCalcMergeRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamJavaUdfCalcRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcSplittingRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUncollectRule
-
- INSTANCE - Static variable in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRule
-
- INSTANCE - Static variable in exception org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver2.CloseException
-
- INSTANCE - Static variable in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
-
- INSTANCE - Static variable in class org.apache.beam.sdk.schemas.AutoValueSchema.AbstractGetterTypeSupplier
-
- INSTANCE - Static variable in class org.apache.beam.sdk.schemas.JavaBeanSchema.GetterTypeSupplier
-
- INSTANCE - Static variable in class org.apache.beam.sdk.schemas.JavaFieldSchema.JavaFieldTypeSupplier
-
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow.Coder
-
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.GlobalWindow
-
- INSTANCE - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
-
- instanceId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
-
- InstantCoder - Class in org.apache.beam.sdk.coders
-
A
Coder
for joda
Instant
that encodes it as a big endian
Long
shifted
such that lexicographic ordering of the bytes corresponds to chronological order.
- InstantDeserializer - Class in org.apache.beam.sdk.io.kafka.serialization
-
- InstantDeserializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
-
- instantiateCoder(String, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.translation.PipelineTranslatorUtils
-
Creates a coder for a given PCollection id from the Proto definition.
- instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
-
Instantiates a healthcare client.
- instantiateRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Instantiates a runner-side wire coder for the given PCollection.
- instantiateRunnerWireCoder(PipelineNode.PCollectionNode, RunnerApi.Components, RunnerApi.ExecutableStagePayload.WireCoderSetting) - Static method in class org.apache.beam.runners.fnexecution.wire.WireCoders
-
Instantiates a runner-side wire coder for the given PCollection.
- InstantSerializer - Class in org.apache.beam.sdk.io.kafka.serialization
-
- InstantSerializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
-
- InstructionRequestHandler - Interface in org.apache.beam.runners.fnexecution.control
-
Interface for any function that can handle a Fn API BeamFnApi.InstructionRequest
.
- INT16 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- INT16 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of int16 fields.
- INT32 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- INT32 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of int32 fields.
- INT64 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- INT64 - Static variable in class org.apache.beam.sdk.schemas.Schema.FieldType
-
The type of int64 fields.
- INT8 - Static variable in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- IntBundle() - Constructor for class org.apache.beam.sdk.jmh.schemas.RowBundles.IntBundle
-
- INTEGER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- integers() - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform
that takes an input PCollection<Integer>
and returns a
PCollection<Integer>
whose contents is the maximum of the input PCollection
's
elements, or Integer.MIN_VALUE
if there are no elements.
- integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform
that takes an input PCollection<Integer>
and returns a
PCollection<Integer>
whose contents is a single value that is the minimum of the input
PCollection
's elements, or Integer.MAX_VALUE
if there are no elements.
- integersGlobally() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform
that takes an input PCollection<Integer>
and returns a
PCollection<Integer>
whose contents is the sum of the input PCollection
's
elements, or 0
if there are no elements.
- integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Max
-
Returns a PTransform
that takes an input PCollection<KV<K, Integer>>
and
returns a PCollection<KV<K, Integer>>
that contains an output element mapping each
distinct key in the input PCollection
to the maximum of the values associated with that
key in the input PCollection
.
- integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Min
-
Returns a PTransform
that takes an input PCollection<KV<K, Integer>>
and
returns a PCollection<KV<K, Integer>>
that contains an output element mapping each
distinct key in the input PCollection
to the minimum of the values associated with that
key in the input PCollection
.
- integersPerKey() - Static method in class org.apache.beam.sdk.transforms.Sum
-
Returns a PTransform
that takes an input PCollection<KV<K, Integer>>
and
returns a PCollection<KV<K, Integer>>
that contains an output element mapping each
distinct key in the input PCollection
to the sum of the values associated with that key
in the input PCollection
.
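A small sketch contrasting the global and per-key variants; `values` and `keyed` are assumed inputs built earlier in the pipeline:

    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Assuming:
    //   PCollection<Integer>             values
    //   PCollection<KV<String, Integer>> keyed
    PCollection<Integer> total = values.apply(Sum.integersGlobally());
    PCollection<Integer> maximum = values.apply(Max.integersGlobally());
    PCollection<KV<String, Integer>> sumPerKey = keyed.apply(Sum.integersPerKey());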
- interceptor() - Static method in class org.apache.beam.sdk.fn.server.GrpcContextHeaderAccessorProvider
-
- interceptResponse(HttpResponse) - Method in class org.apache.beam.sdk.extensions.gcp.util.UploadIdResponseInterceptor
-
- Internal - Annotation Type in org.apache.beam.sdk.annotations
-
Signifies that a publicly accessible API (public class, method or field) is intended for internal
use only and not for public consumption.
- InterpolateData() - Constructor for class org.apache.beam.sdk.extensions.timeseries.FillGaps.InterpolateData
-
- interpolateKey(double) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns a
ByteKey
key
such that
[startKey, key)
represents
approximately the specified fraction of the range
[startKey, endKey)
.
- intersectAll(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform
that follows SET ALL semantics to compute the
intersection with the provided PCollection<T>
.
- intersectAll() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform
that follows SET ALL semantics: it takes a PCollectionList<PCollection<T>>
and returns a PCollection<T>
containing the
intersection of all the collections, computed in order over the collections in the PCollectionList<T>
.
- intersectDistinct(PCollection<T>) - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a new PTransform
that follows SET DISTINCT semantics to compute the
intersection with the provided PCollection<T>
.
- intersectDistinct() - Static method in class org.apache.beam.sdk.transforms.Sets
-
Returns a PTransform
that takes a PCollectionList<PCollection<T>>
and returns a
PCollection<T>
containing the intersection of collections done in order for all
collections in PCollectionList<T>
.
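A minimal sketch of the binary form; `left` and `right` are assumed PCollection<String> inputs built earlier in the pipeline:

    import org.apache.beam.sdk.transforms.Sets;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<String> common = left.apply(Sets.intersectDistinct(right));  // distinct elements present in both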
- intersects(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window intersects the given window.
- IntervalWindow - Class in org.apache.beam.sdk.transforms.windowing
-
- IntervalWindow(Instant, Instant) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Creates a new IntervalWindow that represents the half-open time interval [start, end).
- IntervalWindow(Instant, ReadableDuration) - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
- IntervalWindow.IntervalWindowCoder - Class in org.apache.beam.sdk.transforms.windowing
-
- IntervalWindowCoder() - Constructor for class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.FlatMapElements
-
- into(TypeDescriptor<OutputT>) - Static method in class org.apache.beam.sdk.transforms.MapElements
-
- into(TypeDescriptor<K2>) - Static method in class org.apache.beam.sdk.transforms.MapKeys
-
- into(TypeDescriptor<V2>) - Static method in class org.apache.beam.sdk.transforms.MapValues
-
- into(WindowFn<? super T, ?>) - Static method in class org.apache.beam.sdk.transforms.windowing.Window
-
Creates a
Window
PTransform
that uses the given
WindowFn
to window the
data.
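A short sketch of the two into(...) styles above: MapElements.into(...) fixes the output type before supplying the function, and Window.into(...) applies a WindowFn; the element types and window size are illustrative:

    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;
    import org.joda.time.Duration;

    // Assuming `words` is a PCollection<String> built earlier in the pipeline.
    PCollection<Integer> lengths =
        words.apply(MapElements.into(TypeDescriptors.integers())
            .via((String word) -> word.length()));

    PCollection<Integer> windowedLengths =
        lengths.apply(Window.<Integer>into(FixedWindows.of(Duration.standardMinutes(1))));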
- InTransactionContext(String, TransactionContext) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Constructs a context to execute a user defined function transactionally.
- InvalidConfigurationException - Exception in org.apache.beam.sdk.schemas.io
-
Exception thrown when the configuration for a
SchemaIO
is invalid.
- InvalidConfigurationException(String) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidConfigurationException
-
- InvalidConfigurationException(String, Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidConfigurationException
-
- InvalidConfigurationException(Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidConfigurationException
-
- InvalidLocationException - Exception in org.apache.beam.sdk.schemas.io
-
Exception thrown when the configuration for a
SchemaIO
is invalid.
- InvalidLocationException(String) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidLocationException
-
- InvalidLocationException(String, Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidLocationException
-
- InvalidLocationException(Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidLocationException
-
- InvalidSchemaException - Exception in org.apache.beam.sdk.schemas.io
-
Exception thrown when the schema for a
SchemaIO
is invalid.
- InvalidSchemaException(String) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidSchemaException
-
- InvalidSchemaException(String, Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidSchemaException
-
- InvalidSchemaException(Throwable) - Constructor for exception org.apache.beam.sdk.schemas.io.InvalidSchemaException
-
- InvalidTableException - Exception in org.apache.beam.sdk.extensions.sql.meta.provider
-
Exception thrown when the request for a table is invalid, such as invalid metadata.
- InvalidTableException(String) - Constructor for exception org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
-
- InvalidTableException(String, Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
-
- InvalidTableException(Throwable) - Constructor for exception org.apache.beam.sdk.extensions.sql.meta.provider.InvalidTableException
-
- invokeAdvance(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
-
- invokeStart(ReaderT) - Method in class org.apache.beam.runners.flink.metrics.ReaderInvocationUtil
-
- invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.flink.FlinkJobInvoker
-
- invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.jobsubmission.JobInvoker
-
Start running a job, abstracting its state as a
JobInvocation
instance.
- invokeWithExecutor(RunnerApi.Pipeline, Struct, String, ListeningExecutorService) - Method in class org.apache.beam.runners.spark.SparkJobInvoker
-
- inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.IterableAssert
-
Creates a new
PAssert.IterableAssert
like this one, but with the assertion restricted to only
run on the provided window.
- inWindow(BoundedWindow) - Method in class org.apache.beam.sdk.testing.PAssert.PCollectionContentsAssert
-
- inWindow(BoundedWindow) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Creates a new
PAssert.SingletonAssert
like this one, but with the assertion restricted to
only run on the provided window.
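A hedged sketch of a window-scoped assertion in a test; the window bounds and expected values are illustrative:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    // Assuming `windowedCounts` is a windowed PCollection<Long> produced by the pipeline under test.
    IntervalWindow firstWindow = new IntervalWindow(new Instant(0), Duration.standardMinutes(1));
    PAssert.that(windowedCounts)
        .inWindow(firstWindow)
        .containsInAnyOrder(3L, 5L);  // checked only against elements in that window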
- ioException() - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil.StorageObjectOrIOException
-
Returns the IOException
.
- ir() - Method in class org.apache.beam.sdk.extensions.sbe.SerializableIr
-
Returns the underlying
Ir
.
- IrOptions() - Constructor for class org.apache.beam.sdk.extensions.sbe.SbeSchema.IrOptions
-
- IS_MERGING_WINDOW_FN - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- IS_PAIR_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- IS_STREAM_LIKE - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- IS_WRAPPER - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- isAbsolute() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- isAccessible() - Method in interface org.apache.beam.sdk.options.ValueProvider
-
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
-
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.RuntimeValueProvider
-
- isAccessible() - Method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
-
- isAliveOrThrow() - Method in class org.apache.beam.runners.fnexecution.environment.ProcessManager.RunningProcess
-
Checks if the underlying process is still running.
- isAllowedLatenessSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- isAlreadyMerged() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- isArray() - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns true if this type is known to be an array type.
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.BlockBasedSource.BlockBasedReader
-
Returns true if the reader is at a split point.
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.CompressedSource.CompressedReader
-
Returns true only for the first record; compressed sources cannot be split.
- isAtSplitPoint() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- isAutoBalanceWriteFilesShardingEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
- isBlockOnRun() - Method in interface org.apache.beam.runners.direct.DirectOptions
-
- isBlockOnRun() - Method in interface org.apache.beam.sdk.testing.TestPipelineOptions
-
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamIOSourceRel
-
- isBounded() - Method in interface org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode
-
Whether the collection of rows represented by this relational expression is bounded (known to
be finite) or unbounded (may or may not be finite).
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable
-
- isBounded() - Method in interface org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable
-
Whether this table is bounded (known to be finite) or unbounded (may or may not be finite).
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.bigtable.BigtableTable
-
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog.HCatalogTable
-
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable
-
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.mongodb.MongoDbTable
-
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
-
- isBounded() - Method in class org.apache.beam.sdk.extensions.sql.meta.provider.text.TextTable
-
- isBounded() - Method in class org.apache.beam.sdk.io.AvroSchemaIOProvider
-
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Indicates whether the PCollections produced by this transform will contain a bounded or
unbounded number of elements.
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
-
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
- isBounded() - Method in class org.apache.beam.sdk.io.jdbc.JdbcSchemaIOProvider
-
- isBounded() - Method in interface org.apache.beam.sdk.schemas.io.SchemaIOProvider
-
- isBounded() - Method in class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
-
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
-
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.GrowableOffsetRangeTracker
-
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
- isBounded() - Method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker
-
Return the boundedness of the current restriction.
- isBounded() - Method in class org.apache.beam.sdk.values.PCollection
-
- isBoundedCollection(Collection<PCollection<?>>) - Method in class org.apache.beam.runners.spark.SparkRunner.Evaluator
-
- isCacheDisabled() - Method in interface org.apache.beam.runners.spark.SparkPipelineOptions
-
- isCleanArtifactsPerJob() - Method in class org.apache.beam.runners.jobsubmission.JobServerDriver.ServerConfiguration
-
- isClosed() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- isCollectionType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterPane
-
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.AfterProcessingTime
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.DaysWindows
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.MonthsWindows
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.CalendarWindows.YearsWindows
-
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.GlobalWindows
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.Sessions
-
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
- isCompatible(Trigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
For internal use only; no backwards-compatibility guarantees.
- isCompatible(WindowFn<?, ?>) - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
- isCompositeType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isCompound() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Whether it's a compound table name (with multiple path components).
- isCompressed(String) - Static method in enum org.apache.beam.sdk.io.CompressedSource.CompressionMode
-
Deprecated.
Returns whether the file's extension matches one of the known compression formats.
- isCompressed(String) - Method in enum org.apache.beam.sdk.io.Compression
-
- isCooperative() - Method in class org.apache.beam.runners.jet.processors.BoundedSourceP
-
- isCooperative() - Method in class org.apache.beam.runners.jet.processors.UnboundedSourceP
-
- isDateTimeType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
Returns true if the type is any of the various date time types.
- isDateType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isDdl(String) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv
-
- isDecimal(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
Checks if type is decimal.
- isDirectory() - Method in class org.apache.beam.sdk.extensions.gcp.storage.GcsResourceId
-
- isDirectory() - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem.ClassLoaderResourceId
-
- isDirectory() - Method in interface org.apache.beam.sdk.io.fs.ResourceId
-
Returns
true
if this
ResourceId
represents a directory, false otherwise.
- isDisjoint(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow
-
Returns whether this window is disjoint from the given window.
- isDone() - Method in class org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver
-
Deprecated.
- isDone() - Method in class org.apache.beam.sdk.fn.data.CompletableFutureInboundDataClient
-
- isDone() - Method in interface org.apache.beam.sdk.fn.data.InboundDataClient
-
Deprecated.
Returns true if the client is done, either via completing successfully or by being cancelled.
- isDone() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- isDone() - Method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
- isDone() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
- isEmpty() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SparkSideInputReader
-
- isEmpty() - Method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.CachedSideInputReader
-
- isEmpty() - Method in class org.apache.beam.runners.spark.util.CachedSideInputReader
-
- isEmpty() - Method in class org.apache.beam.runners.spark.util.SparkSideInputReader
-
- isEmpty(StateAccessor<K>) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
-
- isEmpty() - Method in class org.apache.beam.runners.twister2.utils.Twister2SideInputReader
-
- isEmpty() - Method in class org.apache.beam.sdk.io.range.ByteKey
-
Returns
true
if the
byte[]
backing this
ByteKey
is of length 0.
- isEmpty() - Method in class org.apache.beam.sdk.schemas.utils.ByteBuddyUtils.TransformingMap
-
- isEmpty() - Method in interface org.apache.beam.sdk.state.GroupingState
-
- isEmpty() - Method in interface org.apache.beam.sdk.state.MapState
-
- isEmpty() - Method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
- isEmpty() - Method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- isEmpty() - Method in class org.apache.beam.sdk.transforms.Requirements
-
Whether this is an empty set of requirements.
- isEnableStreamingEngine() - Method in interface org.apache.beam.sdk.extensions.gcp.options.GcpOptions
-
- isEncodingPositionsOverridden() - Method in class org.apache.beam.sdk.schemas.Schema
-
Returns whether encoding positions have been explicitly overridden.
- isEnforceEncodability() - Method in interface org.apache.beam.runners.direct.DirectOptions
-
- isEnforceImmutability() - Method in interface org.apache.beam.runners.direct.DirectOptions
-
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
- isEqualTo(T) - Method in interface org.apache.beam.sdk.testing.PAssert.SingletonAssert
-
Asserts that the value in question is equal to the provided value, according to Object.equals(java.lang.Object)
.
- isEqWithEpsilon(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- isExternalizedCheckpointsEnabled() - Method in interface org.apache.beam.runners.flink.FlinkPipelineOptions
-
Enables or disables externalized checkpoints.
- isFirst() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return true if this is the first pane produced for the associated window.
- IsFlinkNativeTransform() - Constructor for class org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.IsFlinkNativeTransform
-
- IsFlinkNativeTransform() - Constructor for class org.apache.beam.runners.flink.FlinkStreamingPortablePipelineTranslator.IsFlinkNativeTransform
-
- isForceStreaming() - Method in interface org.apache.beam.runners.spark.TestSparkPipelineOptions
-
- isForceWatermarkSync() - Method in class org.apache.beam.runners.spark.io.CreateStream
-
- isGetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
- isHotKeyLoggingEnabled() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
If enabled then the literal key will be logged to Cloud Logging if a hot key is detected.
- isIn(Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- isIn(Coder<T>, Collection<T>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- isIn(T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- isIn(Coder<T>, T[]) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- isInboundEdgeOfVertex(Edge, String, String, String) - Method in interface org.apache.beam.runners.jet.DAGBuilder.WiringListener
-
- isInboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
-
- IsInf - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
IS_INF(X)
- IsInf() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
-
- isInf(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
-
- isInf(Float) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsInf
-
- isInfinite() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- isInitialPartition(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
-
Verifies if the given partition token is the initial partition.
- isInputSortRelAndLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.AbstractBeamCalcRel
-
- isInsertDefault() - Method in class org.apache.beam.sdk.transforms.Combine.Globally
-
Returns whether or not this transformation applies a default value.
- isIntegral(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
Checks if type is integral.
- isJoinLegal(Join) - Static method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
This method checks if a join is legal and can be converted into Beam SQL.
- isKey(ImmutableBitSet) - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- isLast() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Return true if this is the last pane that will be produced in the associated window.
- isLastRecordInTransactionInPartition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Indicates whether this record is the last emitted for the given transaction in the given
partition.
- isLe(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- isLimitOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamSortRel
-
- isLogicalType(String) - Method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- isLogicalType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isLt(RelOptCost) - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.BeamCostModel
-
- isMapType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isMetadata() - Method in class org.apache.beam.sdk.values.PCollectionViews.ValueOrMetadata
-
- isMetricsSupported() - Static method in class org.apache.beam.sdk.metrics.MetricsEnvironment
-
Indicates whether metrics reporting is supported.
- isModeSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- isMutable() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteSchema
-
- IsNan - Class in org.apache.beam.sdk.extensions.sql.impl.udf
-
IS_NAN(X)
- IsNan() - Constructor for class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
-
- isNan(Float) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
-
- isNan(Double) - Method in class org.apache.beam.sdk.extensions.sql.impl.udf.IsNan
-
- isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.NonMergingWindowFn
-
- isNonMerging() - Method in class org.apache.beam.sdk.transforms.windowing.WindowFn
-
Returns true if this WindowFn
never needs to merge any windows.
- isNull(String) - Method in class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IsNullFn
-
- isNullable() - Method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
-
- isNullable() - Method in class org.apache.beam.sdk.schemas.FieldValueTypeInformation
-
Returns whether the field is nullable.
- IsNullFn() - Constructor for class org.apache.beam.sdk.extensions.sql.provider.UdfTestProvider.IsNullFn
-
- isNumericType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isOneOf(T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- isOneOf(Coder<T>, T...) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- isOutboundEdgeOfVertex(Edge, String, String, String) - Method in interface org.apache.beam.runners.jet.DAGBuilder.WiringListener
-
- isOutboundEdgeOfVertex(Edge, String, String, String) - Method in class org.apache.beam.runners.jet.processors.FlattenP.Supplier
-
- isPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
True if the column is part of the primary key, false otherwise.
- isPrimitiveType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isReadOnly() - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- isReadSeekEfficient() - Method in class org.apache.beam.sdk.io.fs.MatchResult.Metadata
-
- isReady() - Method in class org.apache.beam.sdk.fn.stream.DataStreams.DataStreamDecoder
-
- isReady() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterator
-
Returns true
if and only if Iterator.hasNext()
and Iterator.next()
will not require an
expensive operation.
- isRegisterByteSizeObserverCheap(RandomAccessData) - Method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
- isRegisterByteSizeObserverCheap(BigDecimal) - Method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
- isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
- isRegisterByteSizeObserverCheap(Short) - Method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
- isRegisterByteSizeObserverCheap(BigInteger) - Method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- isRegisterByteSizeObserverCheap(Boolean) - Method in class org.apache.beam.sdk.coders.BooleanCoder
-
- isRegisterByteSizeObserverCheap(byte[]) - Method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- isRegisterByteSizeObserverCheap(Byte) - Method in class org.apache.beam.sdk.coders.ByteCoder
-
- isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.Coder
-
- isRegisterByteSizeObserverCheap(Double) - Method in class org.apache.beam.sdk.coders.DoubleCoder
-
- isRegisterByteSizeObserverCheap(ReadableDuration) - Method in class org.apache.beam.sdk.coders.DurationCoder
-
- isRegisterByteSizeObserverCheap(Float) - Method in class org.apache.beam.sdk.coders.FloatCoder
-
- isRegisterByteSizeObserverCheap(Instant) - Method in class org.apache.beam.sdk.coders.InstantCoder
-
- isRegisterByteSizeObserverCheap(IterableT) - Method in class org.apache.beam.sdk.coders.IterableLikeCoder
-
- isRegisterByteSizeObserverCheap(KV<K, V>) - Method in class org.apache.beam.sdk.coders.KvCoder
-
Returns whether both keyCoder and valueCoder are considered not expensive.
- isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
LengthPrefixCoder
is cheap if valueCoder
is cheap.
- isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.NullableCoder
-
NullableCoder
is cheap if valueCoder
is cheap.
- isRegisterByteSizeObserverCheap(T) - Method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
-
- isRegisterByteSizeObserverCheap(Integer) - Method in class org.apache.beam.sdk.coders.VarIntCoder
-
- isRegisterByteSizeObserverCheap(Long) - Method in class org.apache.beam.sdk.coders.VarLongCoder
-
- isRegisterByteSizeObserverCheap(Void) - Method in class org.apache.beam.sdk.coders.VoidCoder
-
- isRegisterByteSizeObserverCheap(ByteString) - Method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- isRegisterByteSizeObserverCheap(HyperLogLogPlus) - Method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
-
- isRegisterByteSizeObserverCheap(EncodedBoundedWindow) - Method in class org.apache.beam.sdk.fn.windowing.EncodedBoundedWindow.Coder
-
- isRegisterByteSizeObserverCheap(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- isRegisterByteSizeObserverCheap(ProducerRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
-
- isRegisterByteSizeObserverCheap(OffsetRange) - Method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
-
- isRegisterByteSizeObserverCheap(RawUnionValue) - Method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
Since this coder uses elementCoders.get(index) and coders that are known to run in constant
time, we defer the return value to that coder.
- isRegisterByteSizeObserverCheap(IntervalWindow) - Method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- isRunnerDeterminedSharding() - Method in interface org.apache.beam.runners.direct.DirectTestOptions
-
- isSdfTimer(String) - Static method in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
-
A helper function to check whether the given timer is the one set for rescheduling BeamFnApi.DelayedBundleApplication.
- isSetter(Method) - Static method in class org.apache.beam.sdk.schemas.utils.ReflectUtils
-
- isShouldReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Whether additional diagnostic metrics should be reported for a Transform.
- isSideInputLookupJoin() - Method in class org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel
-
- isSimple() - Method in class org.apache.beam.sdk.extensions.sql.impl.TableName
-
Whether it's a simple name, with a single name component.
- isSplittable() - Method in class org.apache.beam.sdk.io.CompressedSource
-
Determines whether a single file represented by this source is splittable.
- isSplittable() - Method in class org.apache.beam.sdk.io.FileBasedSource
-
Determines whether a file represented by this source can be split into bundles.
- isStart() - Method in class org.apache.beam.sdk.io.redis.RedisCursor
-
- isStarted() - Method in class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- isStarted() - Method in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
- isStreaming() - Method in interface org.apache.beam.sdk.options.StreamingOptions
-
Set to true if running a streaming pipeline.
- isStreamingEngine() - Method in interface org.apache.beam.runners.dataflow.TransformTranslator.TranslationContext
-
- isStringType(Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils
-
- isStringType() - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isSubtypeOf(Schema.TypeName) - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
- isSubtypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Return true if this type is a subtype of the given type.
- isSuccess() - Method in class org.apache.beam.sdk.io.tika.ParseResult
-
Returns whether this file was parsed successfully.
- isSuccess() - Method in class org.apache.beam.sdk.testing.SuccessOrFailure
-
- isSupertypeOf(Schema.TypeName) - Method in enum org.apache.beam.sdk.schemas.Schema.TypeName
-
Whether this is a supertype of the other type.
- isSupertypeOf(TypeDescriptor<?>) - Method in class org.apache.beam.sdk.values.TypeDescriptor
-
Returns true if this type is assignable from the given type.
- isSupported() - Method in enum org.apache.beam.sdk.extensions.sql.meta.ProjectSupport
-
- isTableEmpty(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Returns true if the table is empty.
- isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
- isTableResolved(Table) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
True if the table was resolved using the Calcite schema.
- isTerminal() - Method in enum org.apache.beam.runners.local.ExecutionDriver.DriverState
-
- isTerminal() - Method in enum org.apache.beam.sdk.PipelineResult.State
-
- isTerminated(JobApi.JobState.Enum) - Static method in class org.apache.beam.runners.jobsubmission.JobInvocation
-
- isTimer() - Method in class org.apache.beam.sdk.fn.data.LogicalEndpoint
-
- isTimestampCombinerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- isTriggerSpecified() - Method in class org.apache.beam.sdk.values.WindowingStrategy
-
- isTrustSelfSignedCerts() - Method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.ConnectionConfiguration
-
- isUnknown() - Method in class org.apache.beam.sdk.extensions.sql.impl.BeamTableStatistics
-
- isUnknown() - Method in class org.apache.beam.sdk.extensions.sql.impl.planner.NodeStats
-
Returns true if any of the values for rowCount, rate, or window is infinite.
- isUnknown() - Method in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
Returns true if there is no timing information for the current PaneInfo.
- isUpdate() - Method in interface org.apache.beam.runners.dataflow.options.DataflowPipelineOptions
-
Whether to update the currently running pipeline with the same name as this one.
- isValid(int) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- isWholeStream - Variable in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
Whether the encoded or decoded value fills the remainder of the output or input (resp.)
record/stream contents.
- isWildcard(GcsPath) - Static method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Returns true if the given spec contains a wildcard.
- isWrapperFor(Class<?>) - Method in class org.apache.beam.sdk.extensions.sql.impl.CalciteConnectionWrapper
-
- isWrapping() - Method in class org.apache.beam.sdk.io.cassandra.RingRange
-
- isZero() - Method in class org.apache.beam.runners.spark.aggregators.NamedAggregatorsAccumulator
-
- isZero() - Method in class org.apache.beam.runners.spark.metrics.MetricsContainerStepMapAccumulator
-
- isZero() - Method in class org.apache.beam.runners.spark.structuredstreaming.aggregators.NamedAggregatorsAccumulator
-
- isZero() - Method in class org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsContainerStepMapAccumulator
-
- item(String, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and string value.
- item(String, ValueProvider<?>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
- item(String, Integer) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and integer value.
- item(String, Long) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and integer value.
- item(String, Float) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and floating point value.
- item(String, Double) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and floating point value.
- item(String, Boolean) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and boolean value.
- item(String, Instant) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and timestamp value.
- item(String, Duration) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and duration value.
- item(String, Class<T>) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key and class value.
- item(String, DisplayData.Type, T) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData
-
Create a display item for the specified key, type, and value.
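A minimal sketch of how these item(...) factories are typically used, assuming a hypothetical DoFn that registers its configuration as display data (the key names and values are illustrative):

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.display.DisplayData;
    import org.joda.time.Duration;

    class LoggingFn extends DoFn<String, String> {   // hypothetical DoFn
      @ProcessElement
      public void processElement(@Element String word, OutputReceiver<String> out) {
        out.output(word);
      }

      @Override
      public void populateDisplayData(DisplayData.Builder builder) {
        super.populateDisplayData(builder);
        builder
            .add(DisplayData.item("tableName", "events"))                         // String value
            .add(DisplayData.item("pollInterval", Duration.standardSeconds(30))); // Duration value
      }
    }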
- Item() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.Item
-
- items() - Method in class org.apache.beam.sdk.io.aws.dynamodb.DynamoDBIO.Read
-
Deprecated.
- items() - Method in class org.apache.beam.sdk.io.aws2.dynamodb.DynamoDBIO.Read
-
- items() - Method in class org.apache.beam.sdk.transforms.display.DisplayData
-
- ItemSpec() - Constructor for class org.apache.beam.sdk.transforms.display.DisplayData.ItemSpec
-
- iterable(Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- iterable() - Static method in class org.apache.beam.sdk.transforms.Materializations
-
For internal use only; no backwards-compatibility guarantees.
- ITERABLE_MATERIALIZATION_URN - Static variable in class org.apache.beam.sdk.transforms.Materializations
-
The URN for a Materialization where the primitive view type is an iterable of fully specified windowed values.
- IterableCoder<T> - Class in org.apache.beam.sdk.coders
-
- IterableCoder(Coder<T>) - Constructor for class org.apache.beam.sdk.coders.IterableCoder
-
- IterableLikeCoder<T,IterableT extends java.lang.Iterable<T>> - Class in org.apache.beam.sdk.coders
-
An abstract base class with functionality for assembling a Coder for a class that implements Iterable.
- IterableLikeCoder(Coder<T>, String) - Constructor for class org.apache.beam.sdk.coders.IterableLikeCoder
-
- iterables() - Static method in class org.apache.beam.sdk.transforms.Flatten
-
Returns a PTransform that takes a PCollection<Iterable<T>> and returns a PCollection<T> containing all the elements from all the Iterables.
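A minimal usage sketch (not part of the generated index; the input values and coder are chosen for illustration):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.IterableCoder;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    PCollection<Iterable<String>> nested =
        p.apply(
            Create.<Iterable<String>>of(Arrays.asList("a", "b"), Arrays.asList("c"))
                .withCoder(IterableCoder.of(StringUtf8Coder.of())));
    PCollection<String> flat = nested.apply(Flatten.iterables());   // yields "a", "b", "c"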
- iterables() - Static method in class org.apache.beam.sdk.transforms.ToString
-
Transforms each item in the iterable of the input PCollection to a String using the Object.toString() method followed by a "," until the last element in the iterable.
- iterables(String) - Static method in class org.apache.beam.sdk.transforms.ToString
-
Transforms each item in the iterable of the input PCollection to a String using the Object.toString() method followed by the specified delimiter until the last element in the iterable.
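A minimal sketch, using a GroupByKey result as the iterable input (the keys, values, and delimiter here are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.transforms.ToString;
    import org.apache.beam.sdk.transforms.Values;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    PCollection<String> lines =
        p.apply(Create.of(KV.of("k", "a"), KV.of("k", "b")))
            .apply(GroupByKey.create())        // KV<String, Iterable<String>>
            .apply(Values.create())            // Iterable<String>
            .apply(ToString.iterables(" | ")); // e.g. "a | b"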
- iterables(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- iterableView(PCollection<T>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
Returns a PCollectionView<Iterable<T>> capable of processing elements windowed using the provided WindowingStrategy.
- IterableViewFn(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn
-
Deprecated.
- IterableViewFn2(PCollectionViews.TypeDescriptorSupplier<T>) - Constructor for class org.apache.beam.sdk.values.PCollectionViews.IterableViewFn2
-
- iterableViewUsingVoidKey(TupleTag<Materializations.MultimapView<Void, T>>, PCollection<KV<Void, T>>, PCollectionViews.TypeDescriptorSupplier<T>, WindowingStrategy<?, W>) - Static method in class org.apache.beam.sdk.values.PCollectionViews
-
- iterableWithSize(int) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- iterableWithSize(SerializableMatcher<? super Integer>) - Static method in class org.apache.beam.sdk.testing.SerializableMatchers
-
- iterator() - Method in class org.apache.beam.sdk.extensions.gcp.util.gcsfs.GcsPath
-
- iterator() - Method in interface org.apache.beam.sdk.fn.stream.PrefetchableIterable
-
- iterator() - Method in class org.apache.beam.sdk.fn.stream.PrefetchableIterables.Default
-
- iterator() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
-
- iterator() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
- iterator() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
-
- OBJECT_MAPPER - Static variable in class org.apache.beam.sdk.extensions.sql.impl.JdbcDriver
-
- OBJECT_TYPE_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- ObjectPool<KeyT,ObjectT> - Class in org.apache.beam.sdk.io.aws2.common
-
Reference counting object pool to easily share & destroy objects.
- ObjectPool(Function<KeyT, ObjectT>) - Constructor for class org.apache.beam.sdk.io.aws2.common.ObjectPool
-
- ObjectPool(Function<KeyT, ObjectT>, ThrowingConsumer<Exception, ObjectT>) - Constructor for class org.apache.beam.sdk.io.aws2.common.ObjectPool
-
- ObjectPool.ClientPool<ClientT extends SdkClient> - Class in org.apache.beam.sdk.io.aws2.common
-
Client pool to easily share AWS clients per configuration.
- observe(RestrictionTracker<RestrictionT, PositionT>, RestrictionTrackers.ClaimObserver<PositionT>) - Static method in class org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers
-
- observeTimestamp(Instant) - Method in interface org.apache.beam.sdk.transforms.splittabledofn.TimestampObservingWatermarkEstimator
-
Update watermark estimate with latest output timestamp.
- observeTimestamp(Instant) - Method in class org.apache.beam.sdk.transforms.splittabledofn.WatermarkEstimators.MonotonicallyIncreasing
-
- of(String, String, String) - Static method in class org.apache.beam.runners.dataflow.util.PackageUtil.StagedFile
-
- of() - Static method in class org.apache.beam.runners.dataflow.util.RandomAccessData.RandomAccessDataCoder
-
- of(BeamFnApi.ProcessBundleDescriptor, List<RemoteInputDestination>, Map<String, Coder>, Map<String, Map<String, ProcessBundleDescriptors.SideInputSpec>>, Map<String, Map<String, ProcessBundleDescriptors.BagUserStateSpec>>, Map<String, Map<String, ProcessBundleDescriptors.TimerSpec>>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.ExecutableProcessBundleDescriptor
-
- of(String, String, RunnerApi.FunctionSpec, Coder<T>, Coder<W>) - Static method in class org.apache.beam.runners.fnexecution.control.ProcessBundleDescriptors.SideInputSpec
-
- of(Coder<T>, FnDataReceiver<T>) - Static method in class org.apache.beam.runners.fnexecution.control.RemoteOutputReceiver
-
- of(Coder<T>, String) - Static method in class org.apache.beam.runners.fnexecution.data.RemoteInputDestination
-
- of() - Static method in class org.apache.beam.runners.fnexecution.wire.ByteStringCoder
-
- of(K, Coder<K>) - Static method in class org.apache.beam.runners.local.StructuralKey
-
Create a new Structural Key of the provided key that can be encoded by the provided coder.
- of(T, CloseableResource.Closer<T>) - Static method in class org.apache.beam.runners.portability.CloseableResource
-
- of(JobApi.MetricResults) - Static method in class org.apache.beam.runners.portability.PortableMetrics
-
- of(Coder<T>, Duration, boolean) - Static method in class org.apache.beam.runners.spark.io.CreateStream
-
Creates a new Spark based stream intended for test purposes.
- of(Coder<T>, Duration) - Static method in class org.apache.beam.runners.spark.io.CreateStream
-
Creates a new Spark based stream without forced watermark sync, intended for test purposes.
- of(NamedAggregators) - Static method in class org.apache.beam.runners.spark.metrics.AggregatorMetric
-
- of(NamedAggregators) - Static method in class org.apache.beam.runners.spark.structuredstreaming.metrics.AggregatorMetric
-
- of(Coder<TupleTag>, Map<TupleTag<?>, Coder<?>>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.helpers.MultiOutputCoder
-
- of(SideInputReader) - Static method in class org.apache.beam.runners.spark.structuredstreaming.translation.utils.CachedSideInputReader
-
Create a new cached SideInputReader.
- of(SideInputReader) - Static method in class org.apache.beam.runners.spark.util.CachedSideInputReader
-
Create a new cached SideInputReader.
- of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns an AvroCoder
instance for the provided element type.
- of(TypeDescriptor<T>, boolean) - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding.
- of(Class<T>) - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns an AvroCoder
instance for the provided element class.
- of(Schema) - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns an AvroGenericCoder
instance for the Avro schema.
- of(Class<T>, boolean) - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns an AvroCoder instance for the given class, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding.
- of(Class<T>, Schema) - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns an AvroCoder instance for the provided element type using the provided Avro schema.
- of(Class<T>, Schema, boolean) - Static method in class org.apache.beam.sdk.coders.AvroCoder
-
Returns an AvroCoder instance for the given class and schema, respecting whether to use Avro's Reflect* or Specific* suite for encoding and decoding.
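A minimal sketch of the class-based factory, assuming a hypothetical reflection-friendly POJO:

    import org.apache.beam.sdk.coders.AvroCoder;

    class SensorReading {          // hypothetical POJO; Avro reflection derives the schema
      String sensorId;
      double value;
    }

    AvroCoder<SensorReading> coder = AvroCoder.of(SensorReading.class);
    // readings.setCoder(AvroCoder.of(SensorReading.class));  // attach explicitly if needed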
- of(Schema) - Static method in class org.apache.beam.sdk.coders.AvroGenericCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.BigDecimalCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianIntegerCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianLongCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.BigEndianShortCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.BigIntegerCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.BitSetCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.BooleanCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.ByteArrayCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.ByteCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.CollectionCoder
-
- of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
-
- of(Coder<IntermediateT>, DelegateCoder.CodingFunction<T, IntermediateT>, DelegateCoder.CodingFunction<IntermediateT, T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.DelegateCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.DequeCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.DoubleCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.DurationCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.FloatCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.InstantCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.IterableCoder
-
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.KvCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.LengthPrefixCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.ListCoder
-
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.MapCoder
-
Produces a MapCoder with the given keyCoder and valueCoder.
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.NullableCoder
-
- of(Schema) - Static method in class org.apache.beam.sdk.coders.RowCoder
-
- of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
-
- of(Class<T>) - Static method in class org.apache.beam.sdk.coders.SerializableCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SetCoder
-
Produces a SetCoder with the given elementCoder.
- of(Coder<KeyT>) - Static method in class org.apache.beam.sdk.coders.ShardedKeyCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.SnappyCoder
-
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.coders.SortedMapCoder
-
Produces a SortedMapCoder with the given keyCoder and valueCoder.
- of(Class<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- of(Class<T>, TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.coders.StringDelegateCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.StringUtf8Coder
-
- of() - Static method in class org.apache.beam.sdk.coders.TextualIntegerCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.coders.TimestampPrefixingWindowCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.VarIntCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.VarLongCoder
-
- of() - Static method in class org.apache.beam.sdk.coders.VoidCoder
-
- of(Class<? extends InputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.AsJsons
-
- of(Class<? extends OutputT>) - Static method in class org.apache.beam.sdk.extensions.jackson.ParseJsons
-
- of() - Static method in class org.apache.beam.sdk.extensions.protobuf.ByteStringCoder
-
- of(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
- of(ProtoDomain, Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
- of(ProtoDomain, String) - Static method in class org.apache.beam.sdk.extensions.protobuf.DynamicProtoCoder
-
- of(Class<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- of(TypeDescriptor<T>) - Static method in class org.apache.beam.sdk.extensions.protobuf.ProtoCoder
-
- of(String) - Static method in class org.apache.beam.sdk.extensions.python.transforms.DataframeTransform
-
Instantiates a multi-language wrapper for a Python DataframeTransform with a given lambda
function.
- of(String, Schema.FieldType, Schema.FieldType) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Instantiates a multi-language wrapper for a Python RunInference with a given model loader.
- of(String, Schema) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Instantiates a multi-language wrapper for a Python RunInference with a given model loader.
- of() - Static method in class org.apache.beam.sdk.extensions.sketching.ApproximateDistinct.HyperLogLogPlusCoder
-
- of(BeamSqlTable) - Static method in class org.apache.beam.sdk.extensions.sql.impl.BeamCalciteTable
-
- of(RexCall) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPCall
-
- of(RexPatternFieldRef) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPFieldRef
-
- of(RexLiteral) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(Byte) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(Short) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(Integer) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(Long) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(BigDecimal) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(Float) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(Double) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(ReadableDateTime) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(Boolean) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(String) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPLiteral
-
- of(RexNode) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperation
-
- of(SqlOperator) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPOperator
-
- of(Schema, String, RexCall, Quantifier) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.CEPPattern
-
- of(RelFieldCollation) - Static method in class org.apache.beam.sdk.extensions.sql.impl.cep.OrderKey
-
- of(Duration, Duration) - Static method in class org.apache.beam.sdk.extensions.sql.impl.TVFSlidingWindowFn
-
- of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
Convenient way to build a mocked bounded table.
- of(Schema) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestBoundedTable
-
Build a mocked bounded table with the specified type.
- of(Object...) - Static method in class org.apache.beam.sdk.extensions.sql.meta.provider.test.TestUnboundedTable
-
Convenient way to build a mocked unbounded table.
- of(FrameworkConfig, ExpressionConverter, RelOptCluster, QueryTrait) - Static method in class org.apache.beam.sdk.extensions.sql.zetasql.translation.ConversionContext
-
- of(Duration, String...) - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
-
Construct the transform for the given duration and key fields.
- of(Duration, FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.extensions.timeseries.FillGaps
-
Construct the transform for the given duration and key fields.
- of() - Static method in class org.apache.beam.sdk.io.aws.dynamodb.AttributeValueCoder
-
- of() - Static method in class org.apache.beam.sdk.io.aws2.dynamodb.AttributeValueCoder
-
- of(BigInteger, BigInteger) - Static method in class org.apache.beam.sdk.io.cassandra.RingRange
-
- of(String, TableSchema.ColumnType) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
-
- of(String, TableSchema.ColumnType, TableSchema.DefaultType, Object) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.Column
-
- of(TableSchema.TypeName) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema.ColumnType
-
- of(TableSchema.Column...) - Static method in class org.apache.beam.sdk.io.clickhouse.TableSchema
-
- of() - Static method in class org.apache.beam.sdk.io.DefaultFilenamePolicy.ParamsCoder
-
- of() - Static method in class org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO.DocumentCoder
-
- of(Coder<BoundedWindow>, Coder<DestinationT>) - Static method in class org.apache.beam.sdk.io.FileBasedSink.FileResultCoder
-
- of() - Static method in class org.apache.beam.sdk.io.fs.MetadataCoder
-
- of() - Static method in class org.apache.beam.sdk.io.fs.MetadataCoderV2
-
- of() - Static method in class org.apache.beam.sdk.io.fs.ResourceIdCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
-
- of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
- of(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
- of(FhirBundleParameter, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
- of(String, String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
Creates a FhirSearchParameter of type T.
- of(String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
Creates a FhirSearchParameter of type T, without a key.
- of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
-
- of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
-
- of(Class<HL7v2Message>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
-
- of(PubsubMessage, long, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
- of(PubsubMessage, long, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
-
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
-
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
-
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
-
- of(ByteString) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
-
- of(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Constructs a timestamp range.
- of(Class<T>) - Static method in class org.apache.beam.sdk.io.hadoop.WritableCoder
-
Returns a WritableCoder
instance for the provided element class.
- of(JdbcIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.DataSourceProviderFromDataSourceConfiguration
-
- of(JdbcIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.jdbc.JdbcIO.PoolableDataSourceProvider
-
- of(String, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- of(String, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- of(String, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- of(String, int, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- of(String, int, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- of(String, int, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
-
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
-
- of(TopicPartition, Long, Instant, Long, Instant, List<String>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
-
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
-
- of(Neo4jIO.DriverConfiguration) - Static method in class org.apache.beam.sdk.io.neo4j.Neo4jIO.DriverProviderFromDriverConfiguration
-
- of() - Static method in class org.apache.beam.sdk.io.pulsar.PulsarMessageCoder
-
- of(String, Long, Long, MessageId, String, String) - Static method in class org.apache.beam.sdk.io.pulsar.PulsarSourceDescriptor
-
- of(int...) - Static method in class org.apache.beam.sdk.io.range.ByteKey
-
Creates a new ByteKey backed by a copy of the specified int[].
- of(ByteKey, ByteKey) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Creates a new ByteKeyRange with the given start and end keys.
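A minimal sketch of building a range from two keys (the byte values are arbitrary):

    import org.apache.beam.sdk.io.range.ByteKey;
    import org.apache.beam.sdk.io.range.ByteKeyRange;

    ByteKey start = ByteKey.of(0x00);
    ByteKey end = ByteKey.of(0xff, 0xff);
    ByteKeyRange range = ByteKeyRange.of(start, end);   // [start, end)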
- of(ByteKeyRange) - Static method in class org.apache.beam.sdk.io.range.ByteKeyRangeTracker
-
- of() - Static method in class org.apache.beam.sdk.io.range.OffsetRange.Coder
-
- of(Coder<MatchResult.Metadata>) - Static method in class org.apache.beam.sdk.io.ReadableFileCoder
-
- of() - Static method in class org.apache.beam.sdk.io.ReadableFileCoder
-
- of(String, long, boolean) - Static method in class org.apache.beam.sdk.io.redis.RedisCursor
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDate
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeDateTime
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTime
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestamp
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampLTZ
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampNTZ
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.datetime.SnowflakeTimestampTZ
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.geospatial.SnowflakeGeography
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.logical.SnowflakeBoolean
-
- of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDecimal
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeDouble
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeFloat
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeInteger
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
-
- of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumber
-
- of(int, int) - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeNumeric
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.numeric.SnowflakeReal
-
- of(String, SnowflakeDataType) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
-
- of(String, SnowflakeDataType, boolean) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn
-
- of(SnowflakeColumn...) - Static method in class org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeArray
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeObject
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.structured.SnowflakeVariant
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
-
- of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeBinary
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeChar
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
-
- of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeString
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
-
- of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeText
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarBinary
-
- of() - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
-
- of(long) - Static method in class org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar
-
- of(SnowflakeIO.DataSourceConfiguration) - Static method in class org.apache.beam.sdk.io.snowflake.SnowflakeIO.DataSourceProviderFromDataSourceConfiguration
-
- of(Class<T>, TProtocolFactory) - Static method in class org.apache.beam.sdk.io.thrift.ThriftCoder
-
Returns a ThriftCoder instance for the provided clazz and protocolFactory.
- of(Class<T>) - Static method in class org.apache.beam.sdk.io.xml.JAXBCoder
-
Create a coder for a given type of JAXB annotated objects.
- of(ValueProvider<X>, SerializableFunction<X, T>) - Static method in class org.apache.beam.sdk.options.ValueProvider.NestedValueProvider
-
- of(T) - Static method in class org.apache.beam.sdk.options.ValueProvider.StaticValueProvider
-
- of(FieldAccessDescriptor.FieldDescriptor.ListQualifier) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
-
- of(FieldAccessDescriptor.FieldDescriptor.MapQualifier) - Static method in class org.apache.beam.sdk.schemas.FieldAccessDescriptor.FieldDescriptor.Qualifier
-
- of(SerializableFunction<Row, byte[]>, SerializableFunction<byte[], Row>) - Static method in interface org.apache.beam.sdk.schemas.io.payloads.PayloadSerializer
-
- of(int) - Static method in class org.apache.beam.sdk.schemas.logicaltypes.FixedBytes
-
- of(String, Schema.FieldType) - Static method in class org.apache.beam.sdk.schemas.Schema.Field
-
Returns a field with the given name and type.
- of(Schema.TypeName) - Static method in class org.apache.beam.sdk.schemas.Schema.FieldType
-
- of(Schema.Field...) - Static method in class org.apache.beam.sdk.schemas.Schema
-
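A minimal sketch combining Field.of and Schema.of (the field names and types are illustrative):

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.of(
            Schema.Field.of("id", Schema.FieldType.INT64),
            Schema.Field.of("name", Schema.FieldType.STRING));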
- of(Schema, TypeDescriptor<T>, SerializableFunction<T, Row>, SerializableFunction<Row, T>) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
-
- of(Schema) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
-
- of() - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.Narrowing
-
- of(Schema, Cast.Validator) - Static method in class org.apache.beam.sdk.schemas.transforms.Cast
-
- of() - Static method in class org.apache.beam.sdk.schemas.transforms.Cast.Widening
-
- of(FieldAccessDescriptor) - Static method in class org.apache.beam.sdk.schemas.transforms.WithKeys
-
- of(Coder<T>) - Static method in class org.apache.beam.sdk.testing.TestStream.TestStreamCoder
-
- of(SerializableBiFunction<V, V, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.BinaryCombineFn
-
Returns a CombineFn that uses the given SerializableBiFunction to combine values.
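A minimal sketch, assuming an integer input PCollection named nums (not defined in the index):

    import org.apache.beam.sdk.function.SerializableBiFunction;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.values.PCollection;

    // nums: PCollection<Integer>, assumed to exist.
    SerializableBiFunction<Integer, Integer, Integer> maxFn = (a, b) -> a >= b ? a : b;
    PCollection<Integer> max =
        nums.apply(Combine.globally(Combine.BinaryCombineFn.of(maxFn)));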
- of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns a CombineFn that uses the given SerializableFunction to combine values.
- of(SerializableFunction<Iterable<V>, V>, int) - Static method in class org.apache.beam.sdk.transforms.Combine.IterableCombineFn
-
Returns a CombineFn that uses the given SerializableFunction to combine values, attempting to buffer at least bufferSize values between invocations.
- of(SerializableFunction<Iterable<V>, V>) - Static method in class org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
-
Deprecated.
Returns a CombineFn that uses the given SerializableFunction to combine values.
- of(ClosureT, Requirements) - Static method in class org.apache.beam.sdk.transforms.Contextful
-
Constructs a pair of the given closure and its requirements.
- of(Iterable<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces a PCollection containing elements of the provided Iterable.
- of(T, T...) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces a PCollection containing the specified elements.
- of(Map<K, V>) - Static method in class org.apache.beam.sdk.transforms.Create
-
Returns a new Create.Values transform that produces a PCollection of KVs corresponding to the keys and values of the specified Map.
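A minimal sketch of the varargs and Iterable forms (element values are illustrative):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    PCollection<String> words = p.apply("Words", Create.of("hello", "beam"));     // specified elements
    PCollection<Integer> nums = p.apply("Nums", Create.of(Arrays.asList(1, 2, 3))); // elements of an Iterable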
- of(DisplayData.Path, Class<?>, String) - Static method in class org.apache.beam.sdk.transforms.display.DisplayData.Identifier
-
- of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.DoFnTester
-
- of(CoGbkResultSchema, UnionCoder) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult.CoGbkResultCoder
-
- of(TupleTag<V>, List<V>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResult
-
Returns a new CoGbkResult that contains just the given tag and given data.
- of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.transforms.join.CoGbkResultSchema
-
- of(TupleTag<InputT>, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
Returns a new KeyedPCollectionTuple<K>
with the given tag and initial PCollection.
- of(String, PCollection<KV<K, InputT>>) - Static method in class org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple
-
- of(List<Coder<?>>) - Static method in class org.apache.beam.sdk.transforms.join.UnionCoder
-
Builds a union coder with the given list of element coders.
- of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
- of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Max
-
- of() - Static method in class org.apache.beam.sdk.transforms.Mean
-
A Combine.CombineFn that computes the arithmetic mean (a.k.a. average).
- of(T, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
- of(ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Min
-
- of(DoFn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.transforms.ParDo
-
- of(int, Partition.PartitionWithSideInputsFn<? super T>, Requirements) - Static method in class org.apache.beam.sdk.transforms.Partition
-
Returns a new Partition PTransform that divides its input PCollection into the given number of partitions, using the given partitioning function.
- of(int, Partition.PartitionFn<? super T>) - Static method in class org.apache.beam.sdk.transforms.Partition
-
Returns a new Partition PTransform that divides its input PCollection into the given number of partitions, using the given partitioning function.
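A minimal sketch, assuming an integer input PCollection named nums (not defined in the index):

    import org.apache.beam.sdk.transforms.Partition;
    import org.apache.beam.sdk.values.PCollectionList;

    // nums: PCollection<Integer>, assumed to exist.
    Partition.PartitionFn<Integer> byMod = (elem, numPartitions) -> elem % numPartitions;
    PCollectionList<Integer> parts = nums.apply(Partition.of(3, byMod));
    // parts.get(0), parts.get(1), parts.get(2) hold the three partitions.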
- of() - Static method in class org.apache.beam.sdk.transforms.Reshuffle
-
Deprecated.
- of(ByteKeyRange) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.ByteKeyRangeTracker
-
- of(RestrictionT) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker.TruncateResult
-
- of(RestrictionT, RestrictionT) - Static method in class org.apache.beam.sdk.transforms.splittabledofn.SplitResult
-
Returns a SplitResult for the specified primary and residual restrictions.
- of() - Static method in class org.apache.beam.sdk.transforms.ToJson
-
- of(int, ComparatorT) - Static method in class org.apache.beam.sdk.transforms.Top
-
Returns a PTransform that takes an input PCollection<T> and returns a PCollection<List<T>> with a single element containing the largest count elements of the input PCollection<T>, in decreasing order, sorted using the given Comparator<T>.
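A minimal sketch, assuming an integer input PCollection named nums; the comparator must be Serializable, so a small named class is used here:

    import java.io.Serializable;
    import java.util.Comparator;
    import java.util.List;
    import org.apache.beam.sdk.transforms.Top;
    import org.apache.beam.sdk.values.PCollection;

    class ByValue implements Comparator<Integer>, Serializable {
      @Override
      public int compare(Integer a, Integer b) {
        return Integer.compare(a, b);
      }
    }

    // nums: PCollection<Integer>, assumed to exist.
    PCollection<List<Integer>> top3 = nums.apply(Top.of(3, new ByValue()));  // 3 largest values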
- of(PCollectionView<ViewT>) - Static method in class org.apache.beam.sdk.transforms.View.CreatePCollectionView
-
- of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
-
Returns an AfterAll Trigger with the given subtriggers.
- of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterAll
-
Returns an AfterAll Trigger with the given subtriggers.
- of(Trigger.OnceTrigger...) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
-
Returns an AfterFirst Trigger with the given subtriggers.
- of(List<Trigger>) - Static method in class org.apache.beam.sdk.transforms.windowing.AfterFirst
-
Returns an AfterFirst Trigger with the given subtriggers.
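A minimal sketch of composing such composite triggers inside a windowing configuration (the subtriggers and durations below are illustrative, not prescribed by the index):

    import org.apache.beam.sdk.transforms.windowing.AfterFirst;
    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    Window<String> windowing =
        Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))
            .triggering(
                AfterFirst.of(                       // fires when either subtrigger fires
                    AfterPane.elementCountAtLeast(100),
                    AfterProcessingTime.pastFirstElementInPane()
                        .plusDelayOf(Duration.standardSeconds(30))))
            .withAllowedLateness(Duration.ZERO)
            .discardingFiredPanes();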
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.DefaultTrigger
-
Returns the default trigger.
- of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.FixedWindows
-
Partitions the timestamp space into half-open intervals of the form [N * size, (N + 1) * size),
where 0 is the epoch.
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.IntervalWindow.IntervalWindowCoder
-
- of() - Static method in class org.apache.beam.sdk.transforms.windowing.PaneInfo.PaneInfoCoder
-
- of(Duration) - Static method in class org.apache.beam.sdk.transforms.windowing.SlidingWindows
-
Assigns timestamps into half-open intervals of the form [N * period, N * period + size), where
0 is the epoch.
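A minimal sketch of both window fns, assuming an input PCollection named events (the durations are illustrative):

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    // events: PCollection<String>, assumed to exist.
    events.apply("Tumbling", Window.into(FixedWindows.of(Duration.standardMinutes(5))));
    events.apply(
        "Sliding",
        Window.into(
            SlidingWindows.of(Duration.standardMinutes(10))   // 10-minute windows...
                .every(Duration.standardMinutes(1))));        // ...starting every minute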
- of(T, Exception) - Static method in class org.apache.beam.sdk.transforms.WithFailures.ExceptionElement
-
- of(OutputT, PCollection<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
- of(PCollection<OutputElementT>, PCollection<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
- of(PCollectionTuple, TupleTag<OutputElementT>, TupleTag<FailureElementT>) - Static method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
- of(SerializableFunction<V, K>) - Static method in class org.apache.beam.sdk.transforms.WithKeys
-
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with a key computed from the value by invoking the given SerializableFunction.
- of(K) - Static method in class org.apache.beam.sdk.transforms.WithKeys
-
Returns a PTransform that takes a PCollection<V> and returns a PCollection<KV<K, V>>, where each of the values in the input PCollection has been paired with the given key.
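A minimal sketch, assuming a string input PCollection named words; when a lambda is used, withKeyType helps coder inference:

    import org.apache.beam.sdk.transforms.WithKeys;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // words: PCollection<String>, assumed to exist.
    PCollection<KV<Integer, String>> byLength =
        words.apply(
            WithKeys.of((String s) -> s.length())
                .withKeyType(TypeDescriptors.integers()));

    PCollection<KV<String, String>> constantKey = words.apply(WithKeys.of("all"));  // fixed key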
- of(SerializableFunction<T, Instant>) - Static method in class org.apache.beam.sdk.transforms.WithTimestamps
-
- of(Coder<T>, Coder<ErrorT>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow.Coder
-
- of(T, Instant, BoundedWindow, PaneInfo, ErrorT) - Static method in class org.apache.beam.sdk.values.FailsafeValueInSingleWindow
-
- of(K, V) - Static method in class org.apache.beam.sdk.values.KV
-
Returns a KV with the given key and value.
- of(PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionList
-
- of(Iterable<PCollection<T>>) - Static method in class org.apache.beam.sdk.values.PCollectionList
-
- of(String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- of(String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- of(String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>, String, PCollection<Row>) - Static method in class org.apache.beam.sdk.values.PCollectionRowTuple
-
- of(TupleTag<T>, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
- of(String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
- of(String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
- of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
- of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
- of(String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>, String, PCollection<T>) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
- of(K, int) - Static method in class org.apache.beam.sdk.values.ShardedKey
-
- of(TupleTag<?>, PCollection<?>) - Static method in class org.apache.beam.sdk.values.TaggedPValue
-
- of(V, Instant) - Static method in class org.apache.beam.sdk.values.TimestampedValue
-
Returns a new TimestampedValue
with the given value and timestamp.
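A minimal sketch (the value and timestamp are illustrative):

    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Instant;

    TimestampedValue<String> tv = TimestampedValue.of("click", Instant.now());
    String value = tv.getValue();          // "click"
    Instant timestamp = tv.getTimestamp();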
- of(Coder<T>) - Static method in class org.apache.beam.sdk.values.TimestampedValue.TimestampedValueCoder
-
- of(TupleTag<?>) - Static method in class org.apache.beam.sdk.values.TupleTagList
-
- of(List<TupleTag<?>>) - Static method in class org.apache.beam.sdk.values.TupleTagList
-
- of(Class<T>) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
-
- of(Type) - Static method in class org.apache.beam.sdk.values.TypeDescriptor
-
- of(Coder<T>, Coder<? extends BoundedWindow>) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow.Coder
-
- of(T, Instant, BoundedWindow, PaneInfo) - Static method in class org.apache.beam.sdk.values.ValueInSingleWindow
-
- of(Coder<ValueT>) - Static method in class org.apache.beam.sdk.values.ValueWithRecordId.ValueWithRecordIdCoder
-
- of(WindowFn<T, W>) - Static method in class org.apache.beam.sdk.values.WindowingStrategy
-
- ofByteSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Aim to create batches each with the specified byte size.
- ofByteSize(long, SerializableFunction<InputT, Long>) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Aim to create batches each with the specified byte size.
- ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Max
-
- ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Min
-
- ofDoubles() - Static method in class org.apache.beam.sdk.transforms.Sum
-
- ofExpandedValue(PCollection<?>) - Static method in class org.apache.beam.sdk.values.TaggedPValue
-
- offer(ArtifactRetrievalService, ArtifactStagingServiceGrpc.ArtifactStagingServiceStub, String) - Static method in class org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService
-
Lazily stages artifacts by letting an ArtifactStagingService resolve and request artifacts.
- offerCoders(Coder[]) - Method in interface org.apache.beam.sdk.state.StateSpec
-
For internal use only; no backwards-compatibility guarantees.
- offeringClientsToPool(ControlClientPool.Sink, HeaderAccessor) - Static method in class org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService
-
- ofFirstElement() - Static method in class org.apache.beam.sdk.transforms.windowing.AfterSynchronizedProcessingTime
-
- offset(Duration) - Method in interface org.apache.beam.sdk.state.Timer
-
- OFFSET_INFINITY - Static variable in class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Offset corresponding to infinity.
- OffsetBasedReader(OffsetBasedSource<T>) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader
-
- OffsetBasedSource<T> - Class in org.apache.beam.sdk.io
-
A BoundedSource that uses offsets to define starting and ending positions.
- OffsetBasedSource(long, long, long) - Constructor for class org.apache.beam.sdk.io.OffsetBasedSource
-
- OffsetBasedSource.OffsetBasedReader<T> - Class in org.apache.beam.sdk.io
-
- OffsetByteRangeCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
- OffsetByteRangeCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
-
- OffsetRange - Class in org.apache.beam.sdk.io.range
-
A restriction represented by a range of integers [from, to).
- OffsetRange(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRange
-
- OffsetRange.Coder - Class in org.apache.beam.sdk.io.range
-
- OffsetRangeTracker - Class in org.apache.beam.sdk.io.range
-
- OffsetRangeTracker(long, long) - Constructor for class org.apache.beam.sdk.io.range.OffsetRangeTracker
-
Creates an OffsetRangeTracker
for the specified range.
- OffsetRangeTracker - Class in org.apache.beam.sdk.transforms.splittabledofn
-
- OffsetRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker
-
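A minimal sketch pairing the OffsetRange restriction with its splittable-DoFn tracker (the bounds are illustrative):

    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;

    OffsetRange restriction = new OffsetRange(0, 100);               // offsets [0, 100)
    OffsetRangeTracker tracker = new OffsetRangeTracker(restriction);
    boolean claimed = tracker.tryClaim(0L);                          // claim offset 0 before processing it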
- ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Max
-
- ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Min
-
- ofIntegers() - Static method in class org.apache.beam.sdk.transforms.Sum
-
- ofKVs(String, Schema.FieldType, Schema.FieldType, Coder<KeyT>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
Similar to RunInference#of(String, FieldType, FieldType) but the input is a PCollection of KVs.
- ofKVs(String, Schema, Coder<KeyT>) - Static method in class org.apache.beam.sdk.extensions.python.transforms.RunInference
-
- ofLongs() - Static method in class org.apache.beam.sdk.transforms.Max
-
- ofLongs() - Static method in class org.apache.beam.sdk.transforms.Min
-
- ofLongs() - Static method in class org.apache.beam.sdk.transforms.Sum
-
- ofNamed(Map<String, ?>) - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
-
- ofNone() - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
-
- ofPatientEverything(HealthcareApiClient, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
-
Instantiates a new GetPatientEverything FHIR resource pages iterator.
- ofPositional(List) - Static method in class org.apache.beam.sdk.extensions.sql.impl.QueryPlanner.QueryParameters
-
- ofPrimitiveOutputsInternal(Pipeline, TupleTagList, Map<TupleTag<?>, Coder<?>>, WindowingStrategy<?, ?>, PCollection.IsBounded) - Static method in class org.apache.beam.sdk.values.PCollectionTuple
-
For internal use only; no backwards-compatibility guarantees.
- ofProvider(ValueProvider<T>, Coder<T>) - Static method in class org.apache.beam.sdk.transforms.Create
-
- ofSearch(HealthcareApiClient, String, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
-
Instantiates a new search FHIR resource pages iterator.
- ofSize(long) - Static method in class org.apache.beam.sdk.transforms.GroupIntoBatches
-
Aim to create batches each with the specified element count.
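A minimal sketch of both batching modes, assuming a keyed input PCollection<KV<String, String>> named keyed (the sizes are illustrative):

    import org.apache.beam.sdk.transforms.GroupIntoBatches;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // keyed: PCollection<KV<String, String>>, assumed to exist.
    PCollection<KV<String, Iterable<String>>> byCount =
        keyed.apply("BatchBySize", GroupIntoBatches.<String, String>ofSize(100));            // up to 100 elements per batch
    PCollection<KV<String, Iterable<String>>> byBytes =
        keyed.apply("BatchByBytes", GroupIntoBatches.<String, String>ofByteSize(1_000_000L)); // roughly 1 MB per batch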
- on(Join.FieldsEqual.Impl) - Method in class org.apache.beam.sdk.schemas.transforms.Join.Impl
-
Join the PCollections using the provided predicate.
- on(PCollection<?>...) - Static method in class org.apache.beam.sdk.transforms.Wait
-
Waits on the given signal collections.
- on(List<PCollection<?>>) - Static method in class org.apache.beam.sdk.transforms.Wait
-
Waits on the given signal collections.
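A minimal sketch (the collections below are placeholders for a real main input and a write result used as the signal):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Wait;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    PCollection<String> main = p.apply("Main", Create.of("a", "b"));
    PCollection<String> signal = p.apply("Signal", Create.of("done"));
    // Elements of 'main' are emitted only after the corresponding window of 'signal' finishes.
    PCollection<String> gated = main.apply(Wait.on(signal));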
- ON_TIME_AND_ONLY_FIRING - Static variable in class org.apache.beam.sdk.transforms.windowing.PaneInfo
-
PaneInfo
to use when there will be exactly one firing and it is on time.
- onAdvance(int, int) - Method in class org.apache.beam.sdk.fn.stream.AdvancingPhaser
-
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
-
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.metrics.MetricsAccumulator.AccumulatorCheckpointingSparkListener
-
- onBatchCompleted(JavaStreamingListenerBatchCompleted) - Method in class org.apache.beam.runners.spark.util.GlobalWatermarkHolder.WatermarkAdvancingStreamingListener
-
- onBundleSuccess() - Method in interface org.apache.beam.sdk.transforms.DoFn.BundleFinalizer.Callback
-
- OnceTrigger(List<Trigger>) - Constructor for class org.apache.beam.sdk.transforms.windowing.Trigger.OnceTrigger
-
- onCheckpoint(BeamFnApi.ProcessBundleResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleCheckpointHandler
-
- onCheckpoint(BeamFnApi.ProcessBundleResponse) - Method in class org.apache.beam.runners.fnexecution.control.BundleCheckpointHandlers.StateAndTimerBundleCheckpointHandler
-
- onClaimed(PositionT) - Method in interface org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers.ClaimObserver
-
- onClaimFailed(PositionT) - Method in interface org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers.ClaimObserver
-
- onClose(Consumer<FnApiControlClient>) - Method in class org.apache.beam.runners.fnexecution.control.FnApiControlClient
-
- onCompleted(BeamFnApi.ProcessBundleResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
-
Handles the bundle's completion report.
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
-
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
-
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
-
- onCompleted() - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
-
- OneOfType - Class in org.apache.beam.sdk.schemas.logicaltypes
-
A logical type representing a union of fields.
- OneOfType.Value - Class in org.apache.beam.sdk.schemas.logicaltypes
-
Represents a single OneOf value.
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
-
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
-
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
-
- onError(Throwable) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
-
- onGcTimer(Instant, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, ValueState<SortedMap<Instant, Long>>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregateProjectMergeRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamAggregationRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamBasicAggregationRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcMergeRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCalcSplittingRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamCoGBKJoinRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamIOPushDownRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinAssociateRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamJoinPushThroughJoinRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamSideInputJoinRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.BeamUnnestRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.impl.rule.LogicalCalcMergeRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.BeamZetaSqlCalcMergeRule
-
- onMatch(RelOptRuleCall) - Method in class org.apache.beam.sdk.extensions.sql.zetasql.unnest.BeamZetaSqlUnnestRule
-
- onMerge(ReduceFn<K, T, Iterable<T>, W>.OnMergeContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
-
- onNext(T) - Method in class org.apache.beam.sdk.fn.stream.BufferingStreamObserver
-
- onNext(T) - Method in class org.apache.beam.sdk.fn.stream.DirectStreamObserver
-
- onNext(ReqT) - Method in class org.apache.beam.sdk.fn.stream.ForwardingClientResponseObserver
-
- onNext(V) - Method in class org.apache.beam.sdk.fn.stream.SynchronizedStreamObserver
-
- onPollComplete(StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
Called by the
Watch
transform to compute a new termination state after every poll
completion.
- onProgress(BeamFnApi.ProcessBundleProgressResponse) - Method in interface org.apache.beam.runners.fnexecution.control.BundleProgressHandler
-
Handles a progress report from the bundle while it is executing.
- onSeenNewOutput(Instant, StateT) - Method in interface org.apache.beam.sdk.transforms.Watch.Growth.TerminationCondition
-
- onStartup() - Method in interface org.apache.beam.sdk.harness.JvmInitializer
-
Implement onStartup to run some custom initialization immediately after the JVM is launched for
pipeline execution.
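A hedged sketch of a custom initializer; it assumes registration through AutoService so the worker harness can discover the implementation via ServiceLoader, and the class name and property are illustrative:

    import com.google.auto.service.AutoService;
    import org.apache.beam.sdk.harness.JvmInitializer;

    @AutoService(JvmInitializer.class)
    public class ExampleJvmInitializer implements JvmInitializer {
      @Override
      public void onStartup() {
        // Runs once, immediately after the JVM starts and before pipeline execution.
        System.setProperty("example.flag", "enabled"); // illustrative property
      }
    }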
- onSuccess(List<KinesisRecord>) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicy
-
Called after Kinesis records are successfully retrieved.
- onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
-
- onSuccess(List<KinesisRecord>) - Method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicy
-
Called after Kinesis records are successfully retrieved.
- onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- onSuccess(List<KinesisRecord>) - Method in class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DelayIntervalRateLimiter
-
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
-
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
-
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
-
- onThrottle(KinesisClientThrottledException) - Method in interface org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicy
-
Called after the Kinesis client is throttled.
- onThrottle(KinesisClientThrottledException) - Method in class org.apache.beam.sdk.io.aws2.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- onThrottle(KinesisClientThrottledException) - Method in interface org.apache.beam.sdk.io.kinesis.RateLimitPolicy
-
Called after the Kinesis client is throttled.
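A hedged sketch of a custom org.apache.beam.sdk.io.kinesis.RateLimitPolicy that backs off for a fixed interval whenever the client reports throttling; the class name and delay are illustrative:

    import java.util.List;
    import org.apache.beam.sdk.io.kinesis.KinesisClientThrottledException;
    import org.apache.beam.sdk.io.kinesis.KinesisRecord;
    import org.apache.beam.sdk.io.kinesis.RateLimitPolicy;

    public class FixedBackoffRateLimitPolicy implements RateLimitPolicy {
      @Override
      public void onSuccess(List<KinesisRecord> records) {
        // No delay after a successful fetch.
      }

      @Override
      public void onThrottle(KinesisClientThrottledException e) {
        try {
          Thread.sleep(1000L); // back off for one second after a throttled call
        } catch (InterruptedException ie) {
          Thread.currentThread().interrupt();
        }
      }
    }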
- onThrottle(KinesisClientThrottledException) - Method in class org.apache.beam.sdk.io.kinesis.RateLimitPolicyFactory.DefaultRateLimiter
-
- onTimer(String, String, KeyT, BoundedWindow, Instant, Instant, TimeDomain) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
-
- onTimer(String, Instant, TimerMap, TimerMap, ValueState<SortedMap<Instant, TimestampedValue<ValueT>>>, ValueState<SortedMap<Instant, Long>>, DoFn.OutputReceiver<ValueT>) - Method in class org.apache.beam.sdk.extensions.timeseries.FillGaps.FillGapsDoFn
-
- OnTimerContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnTimerContext
-
- onTrigger(ReduceFn<K, T, Iterable<T>, W>.OnTriggerContext) - Method in class org.apache.beam.runners.twister2.translators.functions.internal.SystemReduceFnBuffering
-
- onWindowExpiration(BoundedWindow, Instant, KeyT) - Method in class org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate
-
- OnWindowExpirationContext() - Constructor for class org.apache.beam.sdk.transforms.DoFn.OnWindowExpirationContext
-
- open(MetricConfig) - Method in class org.apache.beam.runners.flink.metrics.FileReporter
-
- open(GcsPath) - Method in class org.apache.beam.sdk.extensions.gcp.util.GcsUtil
-
Opens an object in GCS.
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.AvroIO.Sink
-
- open(ClassLoaderFileSystem.ClassLoaderResourceId) - Method in class org.apache.beam.sdk.io.ClassLoaderFileSystem
-
- open(String) - Method in class org.apache.beam.sdk.io.FileBasedSink.Writer
-
- open() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
- open(WritableByteChannel) - Method in interface org.apache.beam.sdk.io.FileIO.Sink
-
Initializes writing to the given channel.
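A hedged sketch of a simple FileIO.Sink implementation that writes one UTF-8 line per element; the class name is illustrative:

    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.io.PrintWriter;
    import java.nio.channels.Channels;
    import java.nio.channels.WritableByteChannel;
    import java.nio.charset.StandardCharsets;
    import org.apache.beam.sdk.io.FileIO;

    public class LineSink implements FileIO.Sink<String> {
      private transient PrintWriter writer;

      @Override
      public void open(WritableByteChannel channel) throws IOException {
        // Called once per output file, before any write(...) call.
        writer =
            new PrintWriter(
                new OutputStreamWriter(Channels.newOutputStream(channel), StandardCharsets.UTF_8));
      }

      @Override
      public void write(String element) throws IOException {
        writer.println(element);
      }

      @Override
      public void flush() throws IOException {
        writer.flush();
      }
    }

Such a sink would typically be wired into a pipeline via FileIO.&lt;String&gt;write().via(new LineSink()).to(outputPrefix).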
- open(ResourceIdT) - Method in class org.apache.beam.sdk.io.FileSystem
-
Returns a read channel for the given ResourceIdT
.
- open(ResourceId) - Static method in class org.apache.beam.sdk.io.FileSystems
-
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.parquet.ParquetIO.Sink
-
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TextIO.Sink
-
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.TFRecordIO.Sink
-
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.thrift.ThriftIO.Sink
-
- open(WritableByteChannel) - Method in class org.apache.beam.sdk.io.xml.XmlIO.Sink
-
- openSeekable() - Method in class org.apache.beam.sdk.io.FileIO.ReadableFile
-
- optimizedWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables new codepaths that are expected to use fewer resources while writing to
BigQuery.
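A hedged sketch of enabling it on a write; the table spec and the tableSchema variable are placeholders:

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder table spec
            .withSchema(tableSchema)              // placeholder TableSchema
            .optimizedWrites());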
- Options() - Constructor for class org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Options
-
- Options() - Constructor for class org.apache.beam.runners.direct.DirectRegistrar.Options
-
- Options() - Constructor for class org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options
-
- Options() - Constructor for class org.apache.beam.runners.jet.JetRunnerRegistrar.Options
-
- options() - Method in class org.apache.beam.runners.jobsubmission.JobPreparation
-
- Options() - Constructor for class org.apache.beam.runners.spark.SparkRunnerRegistrar.Options
-
- Options() - Constructor for class org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunnerRegistrar.Options
-
- Options() - Constructor for class org.apache.beam.runners.twister2.Twister2RunnerRegistrar.Options
-
- options() - Static method in class org.apache.beam.sdk.extensions.sorter.BufferedExternalSorter
-
- options - Variable in class org.apache.beam.sdk.extensions.sorter.ExternalSorter
-
- Options() - Constructor for class org.apache.beam.sdk.extensions.sorter.ExternalSorter.Options
-
- Options() - Constructor for class org.apache.beam.sdk.options.ManualDockerEnvironmentOptions.Options
-
- Options() - Constructor for class org.apache.beam.sdk.options.RemoteEnvironmentOptions.Options
-
- Options() - Constructor for class org.apache.beam.sdk.transforms.resourcehints.ResourceHintsOptions.Options
-
- OptionsRegistrar() - Constructor for class org.apache.beam.runners.portability.testing.TestUniversalRunner.OptionsRegistrar
-
- Order - Class in org.apache.beam.sdk.extensions.sql.example.model
-
Describes an order.
- Order(int, int) - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
-
- Order() - Constructor for class org.apache.beam.sdk.extensions.sql.example.model.Order
-
- OrderByKey() - Constructor for class org.apache.beam.sdk.values.KV.OrderByKey
-
- OrderByValue() - Constructor for class org.apache.beam.sdk.values.KV.OrderByValue
-
- orderedList(Coder<T>) - Static method in class org.apache.beam.sdk.state.StateSpecs
-
- OrderedListState<T> - Interface in org.apache.beam.sdk.state
-
A
ReadableState
cell containing a list of values sorted by timestamp.
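A hedged sketch of declaring and filling ordered-list state inside a stateful DoFn (runner support for this state kind varies); the state id and types are illustrative:

    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.state.OrderedListState;
    import org.apache.beam.sdk.state.StateSpec;
    import org.apache.beam.sdk.state.StateSpecs;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TimestampedValue;
    import org.joda.time.Instant;

    class BufferByTimestampFn extends DoFn<KV<String, String>, String> {
      @StateId("buffer")
      private final StateSpec<OrderedListState<String>> bufferSpec =
          StateSpecs.orderedList(StringUtf8Coder.of());

      @ProcessElement
      public void process(
          @Element KV<String, String> element,
          @Timestamp Instant timestamp,
          @StateId("buffer") OrderedListState<String> buffer) {
        // Values are kept sorted by their timestamps.
        buffer.add(TimestampedValue.of(element.getValue(), timestamp));
      }
    }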
- OrderKey - Class in org.apache.beam.sdk.extensions.sql.impl.cep
-
The OrderKey
class stores the information to sort a column.
- orFinally(Trigger.OnceTrigger) - Method in class org.apache.beam.sdk.transforms.windowing.Trigger
-
Specify an ending condition for this trigger.
- OrFinallyTrigger - Class in org.apache.beam.sdk.transforms.windowing
-
A
Trigger
that executes according to its main trigger until its "finally" trigger fires.
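A minimal sketch of composing orFinally with a repeated trigger inside a windowing strategy; the element count and durations are illustrative:

    import org.apache.beam.sdk.transforms.windowing.AfterPane;
    import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Repeatedly;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    // Fire after every 100 elements, but stop once the watermark passes the end of the window.
    Window.<String>into(FixedWindows.of(Duration.standardMinutes(5)))
        .triggering(
            Repeatedly.forever(AfterPane.elementCountAtLeast(100))
                .orFinally(AfterWatermark.pastEndOfWindow()))
        .withAllowedLateness(Duration.ZERO)
        .discardingFiredPanes();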
- org.apache.beam.io.debezium - package org.apache.beam.io.debezium
-
Transforms for reading from DebeziumIO.
- org.apache.beam.runners.dataflow - package org.apache.beam.runners.dataflow
-
Provides a Beam runner that executes pipelines on the Google Cloud Dataflow service.
- org.apache.beam.runners.dataflow.options - package org.apache.beam.runners.dataflow.options
-
- org.apache.beam.runners.dataflow.util - package org.apache.beam.runners.dataflow.util
-
Provides miscellaneous internal utilities used by the Google Cloud Dataflow runner.
- org.apache.beam.runners.direct - package org.apache.beam.runners.direct
-
- org.apache.beam.runners.flink - package org.apache.beam.runners.flink
-
Internal implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.flink.metrics - package org.apache.beam.runners.flink.metrics
-
Internal metrics implementation of the Beam runner for Apache Flink.
- org.apache.beam.runners.fnexecution.artifact - package org.apache.beam.runners.fnexecution.artifact
-
Pipeline execution-time artifact-management services, including abstract implementations of the
Artifact Retrieval Service.
- org.apache.beam.runners.fnexecution.control - package org.apache.beam.runners.fnexecution.control
-
Utilities for a Beam runner to interact with the Fn API Control Service
via Java abstractions.
- org.apache.beam.runners.fnexecution.data - package org.apache.beam.runners.fnexecution.data
-
Utilities for a Beam runner to interact with the Fn API Data Service
via Java abstractions.
- org.apache.beam.runners.fnexecution.environment - package org.apache.beam.runners.fnexecution.environment
-
Classes used to instantiate and manage SDK harness environments.
- org.apache.beam.runners.fnexecution.environment.testing - package org.apache.beam.runners.fnexecution.environment.testing
-
Test utilities for the environment management package.
- org.apache.beam.runners.fnexecution.logging - package org.apache.beam.runners.fnexecution.logging
-
Classes used to log informational messages over the Beam Fn Logging Service
.
- org.apache.beam.runners.fnexecution.provisioning - package org.apache.beam.runners.fnexecution.provisioning
-
Provision api services.
- org.apache.beam.runners.fnexecution.state - package org.apache.beam.runners.fnexecution.state
-
State API services.
- org.apache.beam.runners.fnexecution.status - package org.apache.beam.runners.fnexecution.status
-
Worker Status API services.
- org.apache.beam.runners.fnexecution.translation - package org.apache.beam.runners.fnexecution.translation
-
Shared utilities for a Beam runner to translate portable pipelines.
- org.apache.beam.runners.fnexecution.wire - package org.apache.beam.runners.fnexecution.wire
-
Wire coders for communications between runner and SDK harness.
- org.apache.beam.runners.jet - package org.apache.beam.runners.jet
-
Implementation of the Beam runner for Hazelcast Jet.
- org.apache.beam.runners.jet.metrics - package org.apache.beam.runners.jet.metrics
-
Helper classes for implementing metrics in the Hazelcast Jet based runner.
- org.apache.beam.runners.jet.processors - package org.apache.beam.runners.jet.processors
-
Individual DAG node processors used by the Beam runner for Hazelcast Jet.
- org.apache.beam.runners.jobsubmission - package org.apache.beam.runners.jobsubmission
-
Job management services for use in Beam runners.
- org.apache.beam.runners.local - package org.apache.beam.runners.local
-
Utilities useful when executing a pipeline on a single machine.
- org.apache.beam.runners.portability - package org.apache.beam.runners.portability
-
Support for executing a pipeline locally over the Beam fn API.
- org.apache.beam.runners.portability.testing - package org.apache.beam.runners.portability.testing
-
Testing utilities for the reference runner.
- org.apache.beam.runners.spark - package org.apache.beam.runners.spark
-
Internal implementation of the Beam runner for Apache Spark.
- org.apache.beam.runners.spark.aggregators - package org.apache.beam.runners.spark.aggregators
-
Provides internal utilities for implementing Beam aggregators using Spark accumulators.
- org.apache.beam.runners.spark.coders - package org.apache.beam.runners.spark.coders
-
Beam coders and coder-related utilities for running on Apache Spark.
- org.apache.beam.runners.spark.io - package org.apache.beam.runners.spark.io
-
Spark-specific transforms for I/O.
- org.apache.beam.runners.spark.metrics - package org.apache.beam.runners.spark.metrics
-
Provides internal utilities for implementing Beam metrics using Spark accumulators.
- org.apache.beam.runners.spark.metrics.sink - package org.apache.beam.runners.spark.metrics.sink
-
Spark sinks that support Beam metrics and aggregators.
- org.apache.beam.runners.spark.stateful - package org.apache.beam.runners.spark.stateful
-
Spark-specific stateful operators.
- org.apache.beam.runners.spark.structuredstreaming - package org.apache.beam.runners.spark.structuredstreaming
-
Internal implementation of the Beam runner for Apache Spark.
- org.apache.beam.runners.spark.structuredstreaming.aggregators - package org.apache.beam.runners.spark.structuredstreaming.aggregators
-
Provides internal utilities for implementing Beam aggregators using Spark accumulators.
- org.apache.beam.runners.spark.structuredstreaming.examples - package org.apache.beam.runners.spark.structuredstreaming.examples
-
- org.apache.beam.runners.spark.structuredstreaming.metrics - package org.apache.beam.runners.spark.structuredstreaming.metrics
-
Provides internal utilities for implementing Beam metrics using Spark accumulators.
- org.apache.beam.runners.spark.structuredstreaming.metrics.sink - package org.apache.beam.runners.spark.structuredstreaming.metrics.sink
-
Spark sinks that support Beam metrics and aggregators.
- org.apache.beam.runners.spark.structuredstreaming.translation - package org.apache.beam.runners.spark.structuredstreaming.translation
-
Internal translators for running Beam pipelines on Spark.
- org.apache.beam.runners.spark.structuredstreaming.translation.batch - package org.apache.beam.runners.spark.structuredstreaming.translation.batch
-
Internal utilities to translate Beam pipelines to Spark batching.
- org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions - package org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions
-
Internal implementation of the Beam runner for Apache Spark.
- org.apache.beam.runners.spark.structuredstreaming.translation.helpers - package org.apache.beam.runners.spark.structuredstreaming.translation.helpers
-
Internal helpers to translate Beam pipelines to Spark streaming.
- org.apache.beam.runners.spark.structuredstreaming.translation.streaming - package org.apache.beam.runners.spark.structuredstreaming.translation.streaming
-
Internal utilities to translate Beam pipelines to Spark streaming.
- org.apache.beam.runners.spark.structuredstreaming.translation.utils - package org.apache.beam.runners.spark.structuredstreaming.translation.utils
-
Internal utils to translate Beam pipelines to Spark streaming.
- org.apache.beam.runners.spark.util - package org.apache.beam.runners.spark.util
-
Internal utilities to translate Beam pipelines to Spark.
- org.apache.beam.runners.twister2 - package org.apache.beam.runners.twister2
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translation.wrappers - package org.apache.beam.runners.twister2.translation.wrappers
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators - package org.apache.beam.runners.twister2.translators
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.batch - package org.apache.beam.runners.twister2.translators.batch
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.functions - package org.apache.beam.runners.twister2.translators.functions
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.functions.internal - package org.apache.beam.runners.twister2.translators.functions.internal
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.translators.streaming - package org.apache.beam.runners.twister2.translators.streaming
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.runners.twister2.utils - package org.apache.beam.runners.twister2.utils
-
Internal implementation of the Beam runner for Twister2.
- org.apache.beam.sdk - package org.apache.beam.sdk
-
Provides a simple, powerful model for building both batch and streaming parallel data processing
Pipeline
s.
- org.apache.beam.sdk.annotations - package org.apache.beam.sdk.annotations
-
Defines annotations used across the SDK.
- org.apache.beam.sdk.coders - package org.apache.beam.sdk.coders
-
Defines
Coders
to specify how data is encoded to and
decoded from byte strings.
- org.apache.beam.sdk.expansion - package org.apache.beam.sdk.expansion
-
Contains classes needed to expose transforms to other SDKs.
- org.apache.beam.sdk.expansion.service - package org.apache.beam.sdk.expansion.service
-
Classes used to expand cross-language transforms.
- org.apache.beam.sdk.extensions.arrow - package org.apache.beam.sdk.extensions.arrow
-
Extensions for using Apache Arrow with Beam.
- org.apache.beam.sdk.extensions.gcp.auth - package org.apache.beam.sdk.extensions.gcp.auth
-
Defines classes related to interacting with Credentials
for pipeline
creation and execution containing Google Cloud Platform components.
- org.apache.beam.sdk.extensions.gcp.options - package org.apache.beam.sdk.extensions.gcp.options
-
Defines
PipelineOptions
for configuring pipeline execution
for Google Cloud Platform components.
- org.apache.beam.sdk.extensions.gcp.storage - package org.apache.beam.sdk.extensions.gcp.storage
-
Defines IO connectors for Google Cloud Storage.
- org.apache.beam.sdk.extensions.gcp.util - package org.apache.beam.sdk.extensions.gcp.util
-
Defines Google Cloud Platform component utilities that can be used by Beam runners.
- org.apache.beam.sdk.extensions.gcp.util.gcsfs - package org.apache.beam.sdk.extensions.gcp.util.gcsfs
-
Defines utilities used to interact with Google Cloud Storage.
- org.apache.beam.sdk.extensions.jackson - package org.apache.beam.sdk.extensions.jackson
-
Utilities for parsing and creating JSON serialized objects.
- org.apache.beam.sdk.extensions.joinlibrary - package org.apache.beam.sdk.extensions.joinlibrary
-
Utilities for performing SQL-style joins of keyed
PCollections
.
- org.apache.beam.sdk.extensions.ml - package org.apache.beam.sdk.extensions.ml
-
Provides DoFns for integration with Google Cloud AI Video Intelligence service.
- org.apache.beam.sdk.extensions.protobuf - package org.apache.beam.sdk.extensions.protobuf
-
Defines a
Coder
for Protocol Buffers messages,
ProtoCoder
.
- org.apache.beam.sdk.extensions.python - package org.apache.beam.sdk.extensions.python
-
Extensions for invoking Python transforms from the Beam Java SDK.
- org.apache.beam.sdk.extensions.python.transforms - package org.apache.beam.sdk.extensions.python.transforms
-
Extensions for invoking Python transforms from the Beam Java SDK.
- org.apache.beam.sdk.extensions.sbe - package org.apache.beam.sdk.extensions.sbe
-
Extension for working with SBE messages in Beam.
- org.apache.beam.sdk.extensions.schemaio.expansion - package org.apache.beam.sdk.extensions.schemaio.expansion
-
External Transform Registration for SchemaIOs.
- org.apache.beam.sdk.extensions.sketching - package org.apache.beam.sdk.extensions.sketching
-
Utilities for computing statistical indicators using probabilistic sketches.
- org.apache.beam.sdk.extensions.sorter - package org.apache.beam.sdk.extensions.sorter
-
Utility for performing local sort of potentially large sets of values.
- org.apache.beam.sdk.extensions.sql - package org.apache.beam.sdk.extensions.sql
-
BeamSQL provides a new interface to run a SQL statement with Beam.
- org.apache.beam.sdk.extensions.sql.example - package org.apache.beam.sdk.extensions.sql.example
-
Example of how to use the Data Catalog table provider.
- org.apache.beam.sdk.extensions.sql.example.model - package org.apache.beam.sdk.extensions.sql.example.model
-
Java classes used for modeling the examples.
- org.apache.beam.sdk.extensions.sql.expansion - package org.apache.beam.sdk.extensions.sql.expansion
-
External Transform Registration for Beam SQL.
- org.apache.beam.sdk.extensions.sql.impl - package org.apache.beam.sdk.extensions.sql.impl
-
Implementation classes of BeamSql.
- org.apache.beam.sdk.extensions.sql.impl.cep - package org.apache.beam.sdk.extensions.sql.impl.cep
-
Utilities for Complex Event Processing (CEP).
- org.apache.beam.sdk.extensions.sql.impl.nfa - package org.apache.beam.sdk.extensions.sql.impl.nfa
-
Package of Non-deterministic Finite Automata (NFA
) for MATCH_RECOGNIZE.
- org.apache.beam.sdk.extensions.sql.impl.parser - package org.apache.beam.sdk.extensions.sql.impl.parser
-
Beam SQL parsing additions to Calcite SQL.
- org.apache.beam.sdk.extensions.sql.impl.planner - package org.apache.beam.sdk.extensions.sql.impl.planner
-
BeamQueryPlanner
is the main interface.
- org.apache.beam.sdk.extensions.sql.impl.rel - package org.apache.beam.sdk.extensions.sql.impl.rel
-
BeamSQL-specific nodes, to replace RelNode
.
- org.apache.beam.sdk.extensions.sql.impl.rule - package org.apache.beam.sdk.extensions.sql.impl.rule
-
- org.apache.beam.sdk.extensions.sql.impl.schema - package org.apache.beam.sdk.extensions.sql.impl.schema
-
Defines table schemas to map to Beam IO components.
- org.apache.beam.sdk.extensions.sql.impl.transform - package org.apache.beam.sdk.extensions.sql.impl.transform
-
- org.apache.beam.sdk.extensions.sql.impl.transform.agg - package org.apache.beam.sdk.extensions.sql.impl.transform.agg
-
Implementation of standard SQL aggregation functions.
- org.apache.beam.sdk.extensions.sql.impl.udaf - package org.apache.beam.sdk.extensions.sql.impl.udaf
-
UDAF classes.
- org.apache.beam.sdk.extensions.sql.impl.udf - package org.apache.beam.sdk.extensions.sql.impl.udf
-
UDF classes.
- org.apache.beam.sdk.extensions.sql.impl.utils - package org.apache.beam.sdk.extensions.sql.impl.utils
-
Utility classes.
- org.apache.beam.sdk.extensions.sql.meta - package org.apache.beam.sdk.extensions.sql.meta
-
Metadata related classes.
- org.apache.beam.sdk.extensions.sql.meta.provider - package org.apache.beam.sdk.extensions.sql.meta.provider
-
Table providers.
- org.apache.beam.sdk.extensions.sql.meta.provider.avro - package org.apache.beam.sdk.extensions.sql.meta.provider.avro
-
Table schema for AvroIO.
- org.apache.beam.sdk.extensions.sql.meta.provider.bigquery - package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery
-
Table schema for BigQuery.
- org.apache.beam.sdk.extensions.sql.meta.provider.bigtable - package org.apache.beam.sdk.extensions.sql.meta.provider.bigtable
-
Table schema for BigTable.
- org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog - package org.apache.beam.sdk.extensions.sql.meta.provider.datacatalog
-
Table schema for Google Cloud Data Catalog.
- org.apache.beam.sdk.extensions.sql.meta.provider.datastore - package org.apache.beam.sdk.extensions.sql.meta.provider.datastore
-
Table schema for DataStore.
- org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog - package org.apache.beam.sdk.extensions.sql.meta.provider.hcatalog
-
Table schema for HCatalog.
- org.apache.beam.sdk.extensions.sql.meta.provider.kafka - package org.apache.beam.sdk.extensions.sql.meta.provider.kafka
-
Table schema for KafkaIO.
- org.apache.beam.sdk.extensions.sql.meta.provider.mongodb - package org.apache.beam.sdk.extensions.sql.meta.provider.mongodb
-
Table schema for MongoDb.
- org.apache.beam.sdk.extensions.sql.meta.provider.parquet - package org.apache.beam.sdk.extensions.sql.meta.provider.parquet
-
Table schema for ParquetIO.
- org.apache.beam.sdk.extensions.sql.meta.provider.pubsub - package org.apache.beam.sdk.extensions.sql.meta.provider.pubsub
-
- org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite - package org.apache.beam.sdk.extensions.sql.meta.provider.pubsublite
-
Provides abstractions for schema-aware IOs.
- org.apache.beam.sdk.extensions.sql.meta.provider.seqgen - package org.apache.beam.sdk.extensions.sql.meta.provider.seqgen
-
Table schema for streaming sequence generator.
- org.apache.beam.sdk.extensions.sql.meta.provider.test - package org.apache.beam.sdk.extensions.sql.meta.provider.test
-
Table schema for in-memory test data.
- org.apache.beam.sdk.extensions.sql.meta.provider.text - package org.apache.beam.sdk.extensions.sql.meta.provider.text
-
Table schema for text files.
- org.apache.beam.sdk.extensions.sql.meta.store - package org.apache.beam.sdk.extensions.sql.meta.store
-
Meta stores.
- org.apache.beam.sdk.extensions.sql.provider - package org.apache.beam.sdk.extensions.sql.provider
-
Package containing UDF providers for testing.
- org.apache.beam.sdk.extensions.sql.udf - package org.apache.beam.sdk.extensions.sql.udf
-
Provides interfaces for defining user-defined functions in Beam SQL.
- org.apache.beam.sdk.extensions.sql.zetasql - package org.apache.beam.sdk.extensions.sql.zetasql
-
ZetaSQL Dialect package.
- org.apache.beam.sdk.extensions.sql.zetasql.translation - package org.apache.beam.sdk.extensions.sql.zetasql.translation
-
Conversion logic between ZetaSQL resolved query nodes and Calcite rel nodes.
- org.apache.beam.sdk.extensions.sql.zetasql.translation.impl - package org.apache.beam.sdk.extensions.sql.zetasql.translation.impl
-
Java implementation of ZetaSQL functions.
- org.apache.beam.sdk.extensions.sql.zetasql.unnest - package org.apache.beam.sdk.extensions.sql.zetasql.unnest
-
Temporary solution to support ZetaSQL UNNEST.
- org.apache.beam.sdk.extensions.timeseries - package org.apache.beam.sdk.extensions.timeseries
-
Utilities for operating on timeseries data.
- org.apache.beam.sdk.extensions.zetasketch - package org.apache.beam.sdk.extensions.zetasketch
-
PTransform
s to compute statistical sketches on data streams based on the
ZetaSketch implementation.
- org.apache.beam.sdk.fn - package org.apache.beam.sdk.fn
-
The top level package for the Fn Execution Java libraries.
- org.apache.beam.sdk.fn.channel - package org.apache.beam.sdk.fn.channel
-
gRPC channel management.
- org.apache.beam.sdk.fn.data - package org.apache.beam.sdk.fn.data
-
Classes to interact with the portability framework data plane.
- org.apache.beam.sdk.fn.server - package org.apache.beam.sdk.fn.server
-
gRPC server factory.
- org.apache.beam.sdk.fn.splittabledofn - package org.apache.beam.sdk.fn.splittabledofn
-
- org.apache.beam.sdk.fn.stream - package org.apache.beam.sdk.fn.stream
-
gRPC stream management.
- org.apache.beam.sdk.fn.test - package org.apache.beam.sdk.fn.test
-
Utilities for testing use of this package.
- org.apache.beam.sdk.fn.windowing - package org.apache.beam.sdk.fn.windowing
-
Common utilities related to windowing during execution of a pipeline.
- org.apache.beam.sdk.function - package org.apache.beam.sdk.function
-
Java 8 functional interface extensions.
- org.apache.beam.sdk.harness - package org.apache.beam.sdk.harness
-
Utilities for configuring worker environment.
- org.apache.beam.sdk.io - package org.apache.beam.sdk.io
-
Defines transforms for reading and writing common storage formats, including
AvroIO
, and
TextIO
.
- org.apache.beam.sdk.io.amqp - package org.apache.beam.sdk.io.amqp
-
Transforms for reading and writing using AMQP 1.0 protocol.
- org.apache.beam.sdk.io.aws.coders - package org.apache.beam.sdk.io.aws.coders
-
Defines common coders for Amazon Web Services.
- org.apache.beam.sdk.io.aws.dynamodb - package org.apache.beam.sdk.io.aws.dynamodb
-
Defines IO connectors for Amazon Web Services DynamoDB.
- org.apache.beam.sdk.io.aws.options - package org.apache.beam.sdk.io.aws.options
-
Defines
PipelineOptions
for configuring pipeline execution
for Amazon Web Services components.
- org.apache.beam.sdk.io.aws.s3 - package org.apache.beam.sdk.io.aws.s3
-
Defines IO connectors for Amazon Web Services S3.
- org.apache.beam.sdk.io.aws.sns - package org.apache.beam.sdk.io.aws.sns
-
Defines IO connectors for Amazon Web Services SNS.
- org.apache.beam.sdk.io.aws.sqs - package org.apache.beam.sdk.io.aws.sqs
-
Defines IO connectors for Amazon Web Services SQS.
- org.apache.beam.sdk.io.aws2.coders - package org.apache.beam.sdk.io.aws2.coders
-
Defines common coders for Amazon Web Services.
- org.apache.beam.sdk.io.aws2.common - package org.apache.beam.sdk.io.aws2.common
-
Common code for AWS sources and sinks such as retry configuration.
- org.apache.beam.sdk.io.aws2.dynamodb - package org.apache.beam.sdk.io.aws2.dynamodb
-
Defines IO connectors for Amazon Web Services DynamoDB.
- org.apache.beam.sdk.io.aws2.kinesis - package org.apache.beam.sdk.io.aws2.kinesis
-
Transforms for reading from Amazon Kinesis.
- org.apache.beam.sdk.io.aws2.options - package org.apache.beam.sdk.io.aws2.options
-
Defines
PipelineOptions
for configuring pipeline execution
for Amazon Web Services components.
- org.apache.beam.sdk.io.aws2.s3 - package org.apache.beam.sdk.io.aws2.s3
-
Defines IO connectors for Amazon Web Services S3.
- org.apache.beam.sdk.io.aws2.sns - package org.apache.beam.sdk.io.aws2.sns
-
Defines IO connectors for Amazon Web Services SNS.
- org.apache.beam.sdk.io.aws2.sqs - package org.apache.beam.sdk.io.aws2.sqs
-
Defines IO connectors for Amazon Web Services SQS.
- org.apache.beam.sdk.io.azure.blobstore - package org.apache.beam.sdk.io.azure.blobstore
-
Defines IO connectors for Azure Blob Storage.
- org.apache.beam.sdk.io.azure.options - package org.apache.beam.sdk.io.azure.options
-
Defines IO connectors for Microsoft Azure Blobstore.
- org.apache.beam.sdk.io.cassandra - package org.apache.beam.sdk.io.cassandra
-
Transforms for reading and writing from/to Apache Cassandra.
- org.apache.beam.sdk.io.clickhouse - package org.apache.beam.sdk.io.clickhouse
-
Transform for writing to ClickHouse.
- org.apache.beam.sdk.io.contextualtextio - package org.apache.beam.sdk.io.contextualtextio
-
Transforms for reading from files with contextual information.
- org.apache.beam.sdk.io.elasticsearch - package org.apache.beam.sdk.io.elasticsearch
-
Transforms for reading and writing from Elasticsearch.
- org.apache.beam.sdk.io.fs - package org.apache.beam.sdk.io.fs
-
Apache Beam FileSystem interfaces and their default implementations.
- org.apache.beam.sdk.io.gcp.bigquery - package org.apache.beam.sdk.io.gcp.bigquery
-
Defines transforms for reading and writing from Google BigQuery.
- org.apache.beam.sdk.io.gcp.bigtable - package org.apache.beam.sdk.io.gcp.bigtable
-
Defines transforms for reading and writing from Google Cloud Bigtable.
- org.apache.beam.sdk.io.gcp.common - package org.apache.beam.sdk.io.gcp.common
-
Defines common Google Cloud Platform IO support classes.
- org.apache.beam.sdk.io.gcp.datastore - package org.apache.beam.sdk.io.gcp.datastore
-
Provides an API for reading from and writing to
Google Cloud Datastore over different
versions of the Cloud Datastore Client libraries.
- org.apache.beam.sdk.io.gcp.firestore - package org.apache.beam.sdk.io.gcp.firestore
-
- org.apache.beam.sdk.io.gcp.healthcare - package org.apache.beam.sdk.io.gcp.healthcare
-
Provides an API for reading from and writing to the Google Cloud Healthcare API.
- org.apache.beam.sdk.io.gcp.pubsub - package org.apache.beam.sdk.io.gcp.pubsub
-
- org.apache.beam.sdk.io.gcp.pubsublite - package org.apache.beam.sdk.io.gcp.pubsublite
-
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
- org.apache.beam.sdk.io.gcp.pubsublite.internal - package org.apache.beam.sdk.io.gcp.pubsublite.internal
-
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
- org.apache.beam.sdk.io.gcp.spanner - package org.apache.beam.sdk.io.gcp.spanner
-
- org.apache.beam.sdk.io.gcp.spanner.changestreams - package org.apache.beam.sdk.io.gcp.spanner.changestreams
-
- org.apache.beam.sdk.io.gcp.spanner.changestreams.action - package org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Action processors for each of the types of Change Stream records received.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.dao - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Database Access Objects for querying change streams and modifying the Connector's metadata
tables.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
DoFn and SDF definitions to process Google Cloud Spanner Change Streams.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder - package org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
-
User model for the Spanner change stream API.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper - package org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
Mapping related functionality, such as from
ResultSet
s to Change
Stream models.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.model - package org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
User models for the Spanner change stream API.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction - package org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
Custom restriction tracker related classes.
- org.apache.beam.sdk.io.gcp.testing - package org.apache.beam.sdk.io.gcp.testing
-
Defines utilities for unit testing Google Cloud Platform components of Apache Beam pipelines.
- org.apache.beam.sdk.io.hadoop - package org.apache.beam.sdk.io.hadoop
-
Classes shared by Hadoop based IOs.
- org.apache.beam.sdk.io.hadoop.format - package org.apache.beam.sdk.io.hadoop.format
-
Defines transforms for writing to Data sinks that implement
HadoopFormatIO
.
- org.apache.beam.sdk.io.hbase - package org.apache.beam.sdk.io.hbase
-
Transforms for reading and writing from/to Apache HBase.
- org.apache.beam.sdk.io.hcatalog - package org.apache.beam.sdk.io.hcatalog
-
Transforms for reading and writing using HCatalog.
- org.apache.beam.sdk.io.hdfs - package org.apache.beam.sdk.io.hdfs
-
- org.apache.beam.sdk.io.influxdb - package org.apache.beam.sdk.io.influxdb
-
Transforms for reading and writing from/to InfluxDB.
- org.apache.beam.sdk.io.jdbc - package org.apache.beam.sdk.io.jdbc
-
Transforms for reading and writing from JDBC.
- org.apache.beam.sdk.io.jms - package org.apache.beam.sdk.io.jms
-
Transforms for reading and writing from JMS (Java Messaging Service).
- org.apache.beam.sdk.io.kafka - package org.apache.beam.sdk.io.kafka
-
Transforms for reading and writing from Apache Kafka.
- org.apache.beam.sdk.io.kafka.serialization - package org.apache.beam.sdk.io.kafka.serialization
-
Kafka serializers and deserializers.
- org.apache.beam.sdk.io.kinesis - package org.apache.beam.sdk.io.kinesis
-
Transforms for reading and writing from Amazon Kinesis.
- org.apache.beam.sdk.io.kinesis.serde - package org.apache.beam.sdk.io.kinesis.serde
-
Defines serializers / deserializers for AWS.
- org.apache.beam.sdk.io.kudu - package org.apache.beam.sdk.io.kudu
-
Transforms for reading and writing from/to Apache Kudu.
- org.apache.beam.sdk.io.mongodb - package org.apache.beam.sdk.io.mongodb
-
Transforms for reading and writing from MongoDB.
- org.apache.beam.sdk.io.mqtt - package org.apache.beam.sdk.io.mqtt
-
Transforms for reading and writing from MQTT.
- org.apache.beam.sdk.io.neo4j - package org.apache.beam.sdk.io.neo4j
-
Transforms for reading from and writing to Neo4j.
- org.apache.beam.sdk.io.parquet - package org.apache.beam.sdk.io.parquet
-
Transforms for reading and writing from Parquet.
- org.apache.beam.sdk.io.pulsar - package org.apache.beam.sdk.io.pulsar
-
Transforms for reading and writing from Apache Pulsar.
- org.apache.beam.sdk.io.rabbitmq - package org.apache.beam.sdk.io.rabbitmq
-
Transforms for reading and writing from RabbitMQ.
- org.apache.beam.sdk.io.range - package org.apache.beam.sdk.io.range
-
Provides thread-safe helpers for implementing dynamic work rebalancing in position-based bounded
sources.
- org.apache.beam.sdk.io.redis - package org.apache.beam.sdk.io.redis
-
Transforms for reading and writing from Redis.
- org.apache.beam.sdk.io.snowflake - package org.apache.beam.sdk.io.snowflake
-
Snowflake IO transforms.
- org.apache.beam.sdk.io.snowflake.crosslanguage - package org.apache.beam.sdk.io.snowflake.crosslanguage
-
Cross-language for SnowflakeIO.
- org.apache.beam.sdk.io.snowflake.data - package org.apache.beam.sdk.io.snowflake.data
-
Snowflake IO data types.
- org.apache.beam.sdk.io.snowflake.data.datetime - package org.apache.beam.sdk.io.snowflake.data.datetime
-
Snowflake IO date/time types.
- org.apache.beam.sdk.io.snowflake.data.geospatial - package org.apache.beam.sdk.io.snowflake.data.geospatial
-
Snowflake IO geospatial types.
- org.apache.beam.sdk.io.snowflake.data.logical - package org.apache.beam.sdk.io.snowflake.data.logical
-
Snowflake IO logical types.
- org.apache.beam.sdk.io.snowflake.data.numeric - package org.apache.beam.sdk.io.snowflake.data.numeric
-
Snowflake IO numeric types.
- org.apache.beam.sdk.io.snowflake.data.structured - package org.apache.beam.sdk.io.snowflake.data.structured
-
Snowflake IO structured types.
- org.apache.beam.sdk.io.snowflake.data.text - package org.apache.beam.sdk.io.snowflake.data.text
-
Snowflake IO text types.
- org.apache.beam.sdk.io.snowflake.enums - package org.apache.beam.sdk.io.snowflake.enums
-
Snowflake IO data types.
- org.apache.beam.sdk.io.snowflake.services - package org.apache.beam.sdk.io.snowflake.services
-
Snowflake IO services and POJOs.
- org.apache.beam.sdk.io.solr - package org.apache.beam.sdk.io.solr
-
Transforms for reading and writing from/to Solr.
- org.apache.beam.sdk.io.splunk - package org.apache.beam.sdk.io.splunk
-
Transforms for writing events to Splunk's HTTP Event Collector (HEC).
- org.apache.beam.sdk.io.thrift - package org.apache.beam.sdk.io.thrift
-
Transforms for reading and writing to Thrift files.
- org.apache.beam.sdk.io.tika - package org.apache.beam.sdk.io.tika
-
Transform for reading and parsing files with Apache Tika.
- org.apache.beam.sdk.io.xml - package org.apache.beam.sdk.io.xml
-
Transforms for reading and writing XML files.
- org.apache.beam.sdk.jmh.schemas - package org.apache.beam.sdk.jmh.schemas
-
Benchmarks for schemas.
- org.apache.beam.sdk.jmh.util - package org.apache.beam.sdk.jmh.util
-
Benchmarks for core SDK utility classes.
- org.apache.beam.sdk.metrics - package org.apache.beam.sdk.metrics
-
Metrics allow exporting information about the execution of a pipeline.
- org.apache.beam.sdk.options - package org.apache.beam.sdk.options
-
- org.apache.beam.sdk.schemas - package org.apache.beam.sdk.schemas
-
Defines
Schema
and other classes for representing schema'd
data in a
Pipeline
.
- org.apache.beam.sdk.schemas.annotations - package org.apache.beam.sdk.schemas.annotations
-
Defines
Schema
and other classes for representing schema'd
data in a
Pipeline
.
- org.apache.beam.sdk.schemas.io - package org.apache.beam.sdk.schemas.io
-
Provides abstractions for schema-aware IOs.
- org.apache.beam.sdk.schemas.io.payloads - package org.apache.beam.sdk.schemas.io.payloads
-
Provides abstractions for schema-aware IOs.
- org.apache.beam.sdk.schemas.logicaltypes - package org.apache.beam.sdk.schemas.logicaltypes
-
A set of common LogicalTypes for use with schemas.
- org.apache.beam.sdk.schemas.parser - package org.apache.beam.sdk.schemas.parser
-
Defines utilities for dealing with schemas.
- org.apache.beam.sdk.schemas.parser.generated - package org.apache.beam.sdk.schemas.parser.generated
-
Defines utilities for dealing with schemas.
- org.apache.beam.sdk.schemas.transforms - package org.apache.beam.sdk.schemas.transforms
-
Defines transforms that work on PCollections with schemas.
- org.apache.beam.sdk.schemas.utils - package org.apache.beam.sdk.schemas.utils
-
Defines utilities for dealing with schemas.
- org.apache.beam.sdk.state - package org.apache.beam.sdk.state
-
Classes and interfaces for interacting with state.
- org.apache.beam.sdk.testing - package org.apache.beam.sdk.testing
-
Defines utilities for unit testing Apache Beam pipelines.
- org.apache.beam.sdk.transforms - package org.apache.beam.sdk.transforms
-
Defines
PTransform
s for transforming data in a pipeline.
- org.apache.beam.sdk.transforms.display - package org.apache.beam.sdk.transforms.display
-
- org.apache.beam.sdk.transforms.join - package org.apache.beam.sdk.transforms.join
-
Defines the
CoGroupByKey
transform for joining
multiple PCollections.
- org.apache.beam.sdk.transforms.resourcehints - package org.apache.beam.sdk.transforms.resourcehints
-
- org.apache.beam.sdk.transforms.splittabledofn - package org.apache.beam.sdk.transforms.splittabledofn
-
- org.apache.beam.sdk.transforms.windowing - package org.apache.beam.sdk.transforms.windowing
-
Defines the
Window
transform for dividing the
elements in a PCollection into windows, and the
Trigger
for controlling when those elements are output.
- org.apache.beam.sdk.values - package org.apache.beam.sdk.values
-
- out() - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
-
- out(int) - Static method in class org.apache.beam.runners.spark.io.ConsoleIO.Write
-
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
The tag for the main output of FHIR resources.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
-
The tag for the main output of FHIR Resources from a search.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
The tag for the main output of FHIR Resources from a GetPatientEverything request.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
-
The tag for the main output of HL7v2 Messages.
- outbound(DataStreams.OutputChunkConsumer<ByteString>) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
-
Converts a single element delimited OutputStream
into multiple ByteStrings
.
- outbound(DataStreams.OutputChunkConsumer<ByteString>, int) - Static method in class org.apache.beam.sdk.fn.stream.DataStreams
-
Converts a single element delimited OutputStream
into multiple ByteStrings
using the specified maximum chunk size.
- OutboundObserverFactory - Class in org.apache.beam.sdk.fn.stream
-
Creates factories which determine an underlying StreamObserver
implementation to use to interact with fn execution APIs.
- OutboundObserverFactory() - Constructor for class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
- OutboundObserverFactory.BasicFactory<ReqT,RespT> - Interface in org.apache.beam.sdk.fn.stream
-
Creates an outbound observer for the given inbound observer.
- outboundObserverFor(StreamObserver<ReqT>) - Method in interface org.apache.beam.sdk.fn.stream.OutboundObserverFactory.BasicFactory
-
- outboundObserverFor(OutboundObserverFactory.BasicFactory<ReqT, RespT>, StreamObserver<ReqT>) - Method in class org.apache.beam.sdk.fn.stream.OutboundObserverFactory
-
Creates an outbound observer for the given inbound observer by potentially inserting hooks into
the inbound and outbound observers.
- OUTER - Static variable in class org.apache.beam.sdk.coders.Coder.Context
-
Deprecated.
The outer context: the value being encoded or decoded takes up the remainder of the
record/stream contents.
- OutgoingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
- OUTPUT - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- output(T) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
-
Output the object.
- output(T, Instant) - Method in interface org.apache.beam.sdk.io.mongodb.MongoDbGridFSIO.ParserCallback
-
Output the object using the specified timestamp.
- output(OutputT, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
-
Adds the given element to the main output PCollection
at the given timestamp in the
given window.
- output(TupleTag<T>, T, Instant, BoundedWindow) - Method in class org.apache.beam.sdk.transforms.DoFn.FinishBundleContext
-
Adds the given element to the output PCollection
with the given tag at the given
timestamp in the given window.
- output(T) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
-
- output(OutputT) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the main output PCollection
.
- output(TupleTag<T>, T) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the output PCollection
with the given tag.
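A hedged sketch of emitting to a main and an additional output through OutputReceivers; the tags and the routing condition are illustrative:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.TupleTag;
    import org.apache.beam.sdk.values.TupleTagList;

    final TupleTag<String> validTag = new TupleTag<String>() {};
    final TupleTag<String> invalidTag = new TupleTag<String>() {};

    PCollectionTuple results =
        input.apply(
            ParDo.of(
                    new DoFn<String, String>() {
                      @ProcessElement
                      public void process(@Element String line, MultiOutputReceiver out) {
                        if (line.isEmpty()) {
                          out.get(invalidTag).output(line); // additional output, selected by tag
                        } else {
                          out.get(validTag).output(line); // main output
                        }
                      }
                    })
                .withOutputTags(validTag, TupleTagList.of(invalidTag)));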
- output() - Method in class org.apache.beam.sdk.transforms.WithFailures.Result
-
- OUTPUT_DIR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
- OUTPUT_FORMAT_CLASS_ATTR - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
- OUTPUT_INFO - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- OUTPUT_KEY_CLASS - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
- OUTPUT_NAME - Static variable in class org.apache.beam.runners.dataflow.util.PropertyNames
-
- OUTPUT_VALUE_CLASS - Static variable in class org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO
-
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
- outputCollectionNames() - Method in interface org.apache.beam.sdk.schemas.transforms.SchemaTransformProvider
-
Returns the output collection names of this transform.
- outputColumnMap - Variable in class org.apache.beam.sdk.extensions.sql.zetasql.QueryTrait
-
- outputOf(ProcessFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
Returns a type descriptor for the output of the given
ProcessFunction
, subject to Java
type erasure: may contain unresolved type variables if the type was erased.
- outputOf(SerializableFunction<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- outputOf(Contextful.Fn<InputT, OutputT>) - Static method in class org.apache.beam.sdk.values.TypeDescriptors
-
- OutputRangeTracker(OffsetRange) - Constructor for class org.apache.beam.sdk.transforms.PeriodicSequence.OutputRangeTracker
-
- OutputReceiverFactory - Interface in org.apache.beam.runners.fnexecution.control
-
A factory that can create output receivers during an executable stage.
- OutputReference - Class in org.apache.beam.runners.dataflow.util
-
A representation used by
Step
s to reference the
output of other
Step
s.
- OutputReference(String, String) - Constructor for class org.apache.beam.runners.dataflow.util.OutputReference
-
- outputRuntimeOptions() - Method in interface org.apache.beam.sdk.options.PipelineOptions
-
- outputSchema() - Method in class org.apache.beam.sdk.schemas.transforms.Cast
-
- outputSchemaCoder - Variable in class org.apache.beam.sdk.schemas.utils.ConvertHelpers.ConvertedSchemaInformation
-
- OutputTagFilter<OutputT,InputT> - Class in org.apache.beam.runners.twister2.translators.functions
-
Output tag filter.
- OutputTagFilter() - Constructor for class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
-
- OutputTagFilter(int) - Constructor for class org.apache.beam.runners.twister2.translators.functions.OutputTagFilter
-
- outputWithTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.transforms.DoFn.OutputReceiver
-
- outputWithTimestamp(OutputT, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the main output PCollection
, with the given timestamp.
- outputWithTimestamp(TupleTag<T>, T, Instant) - Method in class org.apache.beam.sdk.transforms.DoFn.WindowedContext
-
Adds the given element to the specified output PCollection
, with the given timestamp.
- overlaps(ByteKeyRange) - Method in class org.apache.beam.sdk.io.range.ByteKeyRange
-
Returns
true
if the specified
ByteKeyRange
overlaps this range.
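A minimal sketch; the byte values are illustrative:

    import org.apache.beam.sdk.io.range.ByteKey;
    import org.apache.beam.sdk.io.range.ByteKeyRange;

    ByteKeyRange first = ByteKeyRange.of(ByteKey.of(0x00), ByteKey.of(0x80));
    ByteKeyRange second = ByteKeyRange.of(ByteKey.of(0x40), ByteKey.of(0xc0));
    boolean overlapping = first.overlaps(second); // true: the ranges share [0x40, 0x80)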
- overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.coders.RowCoder
-
Override encoding positions for the given schema.
- overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.coders.RowCoderGenerator
-
- overrideEncodingPositions(UUID, Map<String, Integer>) - Static method in class org.apache.beam.sdk.schemas.SchemaCoder
-
Override encoding positions for the given schema.