java.lang.Object
org.apache.beam.sdk.transforms.PTransform<PCollection<KV<org.apache.beam.sdk.util.ShardedKey<DestinationT>,Iterable<StorageApiWritePayload>>>,PCollectionTuple>
org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords<DestinationT,ElementT>
- All Implemented Interfaces:
Serializable, HasDisplayData
public class StorageApiWritesShardedRecords<DestinationT extends @NonNull Object,ElementT>
extends PTransform<PCollection<KV<org.apache.beam.sdk.util.ShardedKey<DestinationT>,Iterable<StorageApiWritePayload>>>,PCollectionTuple>
A transform to write sharded records to BigQuery using the Storage API (Streaming).
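This class is an internal step of the BigQuery sink rather than a transform users construct directly. As a minimal sketch (not taken from this Javadoc), the sharded Storage API write path is typically reached through the public `BigQueryIO` entry point with the Storage Write API method and a triggering frequency; the table name and numeric values below are illustrative assumptions.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;
import com.google.api.services.bigquery.model.TableRow;

public class ShardedStorageWriteSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Hypothetical unbounded source of rows; a real pipeline would read from
    // Pub/Sub, Kafka, etc.
    PCollection<TableRow> rows = /* ... unbounded row source ... */ null;

    rows.apply(
        "WriteToBigQuery",
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // hypothetical table
            .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)
            // CREATE_NEVER avoids needing a schema in this sketch.
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
            // Triggered streaming writes group payloads by sharded destination
            // key before appending them via the Storage API.
            .withTriggeringFrequency(Duration.standardSeconds(5))
            .withNumStorageWriteApiStreams(4));
    // p.run(); // omitted: requires a real source and GCP credentials
  }
}
```

Under this configuration the sink shards records per destination and hands each `KV<ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>` bundle to this transform.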
Field Summary
Fields inherited from class org.apache.beam.sdk.transforms.PTransform
annotations, displayData, name, resourceHints
Constructor Summary
Constructors
StorageApiWritesShardedRecords(org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinations<ElementT, DestinationT> dynamicDestinations, BigQueryIO.Write.CreateDisposition createDisposition, String kmsKey, BigQueryServices bqServices, Coder<DestinationT> destinationCoder, Coder<BigQueryStorageApiInsertError> failedRowsCoder, Coder<TableRow> successfulRowsCoder, TupleTag<BigQueryStorageApiInsertError> failedRowsTag, @Nullable TupleTag<TableRow> successfulRowsTag, Predicate<String> successfulRowsPredicate, boolean autoUpdateSchema, boolean ignoreUnknownValues, com.google.cloud.bigquery.storage.v1.AppendRowsRequest.MissingValueInterpretation defaultMissingValueInterpretation, @Nullable Map<String, String> bigLakeConfiguration)
Method Summary
Methods
PCollectionTuple expand(PCollection<KV<org.apache.beam.sdk.util.ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>> input)
Override this method to specify how this PTransform should be expanded on the given InputT.
Methods inherited from class org.apache.beam.sdk.transforms.PTransform
addAnnotation, compose, compose, getAdditionalInputs, getAnnotations, getDefaultOutputCoder, getDefaultOutputCoder, getDefaultOutputCoder, getKindString, getName, getResourceHints, populateDisplayData, setDisplayData, setResourceHints, toString, validate, validate
Constructor Details
StorageApiWritesShardedRecords
public StorageApiWritesShardedRecords(org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinations<ElementT, DestinationT> dynamicDestinations, BigQueryIO.Write.CreateDisposition createDisposition, String kmsKey, BigQueryServices bqServices, Coder<DestinationT> destinationCoder, Coder<BigQueryStorageApiInsertError> failedRowsCoder, Coder<TableRow> successfulRowsCoder, TupleTag<BigQueryStorageApiInsertError> failedRowsTag, @Nullable TupleTag<TableRow> successfulRowsTag, Predicate<String> successfulRowsPredicate, boolean autoUpdateSchema, boolean ignoreUnknownValues, com.google.cloud.bigquery.storage.v1.AppendRowsRequest.MissingValueInterpretation defaultMissingValueInterpretation, @Nullable Map<String, String> bigLakeConfiguration)
Method Details
expand
public PCollectionTuple expand(PCollection<KV<org.apache.beam.sdk.util.ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>> input)
Description copied from class: PTransform
Override this method to specify how this PTransform should be expanded on the given InputT.
NOTE: This method should not be called directly. Instead, the PTransform should be applied to the InputT using the apply method.
Composite transforms, which are defined in terms of other transforms, should return the output of one of the composed transforms. Non-composite transforms, which do not apply any transforms internally, should return a new unbound output and register evaluators (via backend-specific registration methods).
- Specified by:
expand in class PTransform<PCollection<KV<org.apache.beam.sdk.util.ShardedKey<DestinationT extends @NonNull Object>, Iterable<StorageApiWritePayload>>>, PCollectionTuple>
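To make the apply-versus-expand distinction concrete, the fragment below sketches how this transform would be invoked in a pipeline; `shardedPayloads`, `writeTransform`, and `failedRowsTag` are assumed to be in scope and are not defined in this Javadoc.

```java
// User code applies the transform; the SDK calls expand() internally.
PCollectionTuple result =
    shardedPayloads.apply("StorageApiWriteShardedRecords", writeTransform);

// The returned tuple exposes outputs by tag, e.g. rows that failed to append.
PCollection<BigQueryStorageApiInsertError> failedRows = result.get(failedRowsTag);
```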