| Interface | Description |
|---|---|
| org.apache.beam.sdk.io.AvroIO.RecordFormatter | Users can achieve the same by performing this formatting in a ParDo before using AvroIO.write(Class); see the ParDo sketch after this table. |
| org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions | This interface will no longer be the source of truth for worker logging configuration once jobs are executed using a dedicated SDK harness instead of user code being co-located alongside Dataflow worker code. Please set the option on this interface and also the corresponding option within SdkHarnessOptions to ensure forward compatibility; see the logging sketch after this table. |
| org.apache.beam.sdk.testing.StreamingIT | Tests that use unbounded PCollections should be placed in the category UsesUnboundedPCollections. Beyond that, it is up to the runner and test configuration to decide whether to run in streaming mode; see the test-category sketch after this table. |
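A minimal sketch of the AvroIO.RecordFormatter replacement: the element-to-record conversion moves into an ordinary ParDo ahead of AvroIO.write(Class). The MyEvent and MyRecord types, their fields, and the output path are hypothetical placeholders, not part of the Beam API.

```java
import org.apache.beam.sdk.coders.AvroCoder;
import org.apache.beam.sdk.io.AvroIO;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;

public class AvroWriteWithoutRecordFormatter {
  // Hypothetical user type and hypothetical Avro record type.
  public static class MyEvent {
    public String id;
    public String payload;
  }

  public static class MyRecord {
    public String id;
    public String payload;
  }

  public static void write(PCollection<MyEvent> events) {
    // Format each element up front in a ParDo instead of passing a
    // RecordFormatter into AvroIO.
    PCollection<MyRecord> records =
        events
            .apply(
                ParDo.of(
                    new DoFn<MyEvent, MyRecord>() {
                      @ProcessElement
                      public void processElement(
                          @Element MyEvent event, OutputReceiver<MyRecord> out) {
                        MyRecord record = new MyRecord();
                        record.id = event.id;
                        record.payload = event.payload;
                        out.output(record);
                      }
                    }))
            .setCoder(AvroCoder.of(MyRecord.class));

    // Then write with the class-based AvroIO.write; the output path is illustrative.
    records.apply(AvroIO.write(MyRecord.class).to("/tmp/records"));
  }
}
```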
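A minimal logging sketch for the DataflowWorkerLoggingOptions note, setting the worker log level in both places. It assumes the standard PipelineOptionsFactory flow; the WARN level is an illustrative choice, not anything mandated by the deprecation.

```java
import org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.SdkHarnessOptions;

public class WorkerLoggingMigration {
  public static void main(String[] args) {
    // Legacy option: still honored while user code runs co-located with the
    // Dataflow worker.
    DataflowWorkerLoggingOptions legacy =
        PipelineOptionsFactory.fromArgs(args).as(DataflowWorkerLoggingOptions.class);
    legacy.setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level.WARN);

    // Forward-compatible option: used once the job runs on a dedicated SDK
    // harness, so set the corresponding value here as well.
    SdkHarnessOptions harness = legacy.as(SdkHarnessOptions.class);
    harness.setDefaultSdkHarnessLogLevel(SdkHarnessOptions.LogLevel.WARN);
  }
}
```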
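A minimal test-category sketch of the replacement for StreamingIT: the test is tagged with UsesUnboundedPCollections (combined here with ValidatesRunner as an assumption about how the suite is filtered), and the runner and test configuration decide whether it runs in streaming mode. The pipeline body is only a placeholder.

```java
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.testing.UsesUnboundedPCollections;
import org.apache.beam.sdk.testing.ValidatesRunner;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.junit.Rule;
import org.junit.Test;
import org.junit.experimental.categories.Category;

public class UnboundedCategoryTest {
  @Rule public final transient TestPipeline pipeline = TestPipeline.create();

  // The category annotation replaces the deprecated StreamingIT marker; the
  // test configuration chooses batch or streaming execution.
  @Test
  @Category({ValidatesRunner.class, UsesUnboundedPCollections.class})
  public void testWithUnboundedInput() {
    // Placeholder body; a real test in this category would exercise an
    // unbounded PCollection.
    PCollection<Integer> output = pipeline.apply(Create.of(1, 2, 3));
    PAssert.that(output).containsInAnyOrder(1, 2, 3);
    pipeline.run().waitUntilFinish();
  }
}
```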
| Enum | Description |
|---|---|
| org.apache.beam.sdk.io.CompressedSource.CompressionMode | Use Compression instead; see the Compression sketch after this table. |
| org.apache.beam.sdk.transforms.DoFnTester.CloningBehavior | Use TestPipeline with the DirectRunner; see the testing sketch after this table. |
| org.apache.beam.sdk.io.FileBasedSink.CompressionType | Use Compression instead. |
| org.apache.beam.sdk.io.TextIO.CompressionType | Use Compression instead. |
| org.apache.beam.sdk.io.TFRecordIO.CompressionType | Use Compression instead. |
| org.apache.beam.sdk.io.xml.XmlIO.Read.CompressionType | Use Compression instead. |
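A minimal Compression sketch of the unified Compression enum that replaces the per-IO CompressionType enums, shown here with TextIO; the file paths are hypothetical.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.Compression;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class CompressionExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read gzip-compressed text using the unified Compression enum instead of
    // the deprecated per-IO CompressionType enums.
    PCollection<String> lines =
        p.apply(
            TextIO.read()
                .from("/tmp/input/*.gz") // hypothetical path
                .withCompression(Compression.GZIP));

    // Write the output compressed the same way.
    lines.apply(
        TextIO.write()
            .to("/tmp/output/lines") // hypothetical path
            .withCompression(Compression.GZIP));

    p.run().waitUntilFinish();
  }
}
```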
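A minimal testing sketch of exercising a DoFn through TestPipeline instead of DoFnTester, assuming the DirectRunner is on the test classpath; UppercaseFn is a hypothetical DoFn under test.

```java
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.junit.Rule;
import org.junit.Test;

public class UppercaseFnTest {
  // Hypothetical DoFn under test.
  static class UppercaseFn extends DoFn<String, String> {
    @ProcessElement
    public void processElement(@Element String word, OutputReceiver<String> out) {
      out.output(word.toUpperCase());
    }
  }

  // TestPipeline runs the DoFn inside a real (direct-runner) pipeline,
  // replacing DoFnTester and its CloningBehavior settings.
  @Rule public final transient TestPipeline pipeline = TestPipeline.create();

  @Test
  public void testUppercaseFn() {
    PCollection<String> output =
        pipeline.apply(Create.of("a", "b")).apply(ParDo.of(new UppercaseFn()));
    PAssert.that(output).containsInAnyOrder("A", "B");
    pipeline.run().waitUntilFinish();
  }
}
```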
| Constructor | Description |
|---|---|
| org.apache.beam.runners.fnexecution.data.GrpcDataService() | This constructor exists only to support Dataflow migration. |
| Enum Constant | Description |
|---|---|
| org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType.BASIC | |