public abstract class BeamKafkaTable extends BaseBeamTable implements java.io.Serializable
BeamKafkaTable represents a Kafka topic, as source or target. It needs to be extended to convert between BeamRecord and KV<byte[], byte[]>, via getPTransformForInput() and getPTransformForOutput().

Field Summary

Fields inherited from class BaseBeamTable: beamRecordSqlType
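To make a concrete table, a subclass implements getPTransformForInput() and getPTransformForOutput(). Below is a minimal sketch of such a hypothetical subclass that treats every Kafka record value as one UTF-8 string column; the org.apache.beam.sdk.extensions.sql import paths, the BeamRecord(recordType, values...) constructor, and the getString(int) accessor are assumptions that may need adjusting for your Beam version.

```java
// Hypothetical single-column BeamKafkaTable subclass; see assumptions in the lead-in above.
import java.nio.charset.StandardCharsets;
import java.util.List;

import org.apache.beam.sdk.extensions.sql.BeamRecordSqlType;                 // assumed path
import org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable;  // assumed path
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.BeamRecord;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class StringValueKafkaTable extends BeamKafkaTable {

  public StringValueKafkaTable(
      BeamRecordSqlType rowType, String bootstrapServers, List<String> topics) {
    super(rowType, bootstrapServers, topics);
  }

  // Decodes each Kafka key/value pair into a one-column BeamRecord.
  @Override
  public PTransform<PCollection<KV<byte[], byte[]>>, PCollection<BeamRecord>>
      getPTransformForInput() {
    return new PTransform<PCollection<KV<byte[], byte[]>>, PCollection<BeamRecord>>() {
      @Override
      public PCollection<BeamRecord> expand(PCollection<KV<byte[], byte[]>> input) {
        // Depending on the Beam version, a coder for BeamRecord may also need to be set
        // on the resulting PCollection.
        return input.apply(
            MapElements.via(
                new SimpleFunction<KV<byte[], byte[]>, BeamRecord>() {
                  @Override
                  public BeamRecord apply(KV<byte[], byte[]> kv) {
                    String value = new String(kv.getValue(), StandardCharsets.UTF_8);
                    // Assumes the declared row type has exactly one VARCHAR field.
                    return new BeamRecord(getRowType(), value);
                  }
                }));
      }
    };
  }

  // Encodes each BeamRecord back into a Kafka value (the key is left empty).
  @Override
  public PTransform<PCollection<BeamRecord>, PCollection<KV<byte[], byte[]>>>
      getPTransformForOutput() {
    return new PTransform<PCollection<BeamRecord>, PCollection<KV<byte[], byte[]>>>() {
      @Override
      public PCollection<KV<byte[], byte[]>> expand(PCollection<BeamRecord> input) {
        return input.apply(
            MapElements.via(
                new SimpleFunction<BeamRecord, KV<byte[], byte[]>>() {
                  @Override
                  public KV<byte[], byte[]> apply(BeamRecord record) {
                    byte[] value = record.getString(0).getBytes(StandardCharsets.UTF_8);
                    return KV.of(new byte[0], value);
                  }
                }));
      }
    };
  }
}
```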
Constructor Summary

Modifier | Constructor and Description |
---|---|
protected | BeamKafkaTable(BeamRecordSqlType beamSqlRowType) |
public | BeamKafkaTable(BeamRecordSqlType beamSqlRowType, java.util.List<org.apache.kafka.common.TopicPartition> topicPartitions, java.lang.String bootstrapServers) |
public | BeamKafkaTable(BeamRecordSqlType beamSqlRowType, java.lang.String bootstrapServers, java.util.List<java.lang.String> topics) |
Method Summary

Modifier and Type | Method and Description |
---|---|
PCollection<BeamRecord> | buildIOReader(Pipeline pipeline): create a PCollection<BeamRecord> from source. |
PTransform<? super PCollection<BeamRecord>,PDone> | buildIOWriter(): create an IO.write() instance to write to target. |
java.lang.String | getBootstrapServers() |
abstract PTransform<PCollection<KV<byte[],byte[]>>,PCollection<BeamRecord>> | getPTransformForInput() |
abstract PTransform<PCollection<BeamRecord>,PCollection<KV<byte[],byte[]>>> | getPTransformForOutput() |
BeamIOType | getSourceType(): in Beam SQL, there's no difference between a batch query and a streaming query. |
java.util.List<java.lang.String> | getTopics() |
BeamKafkaTable | updateConsumerProperties(java.util.Map<java.lang.String,java.lang.Object> configUpdates) |
Methods inherited from class BaseBeamTable: getRowType
Constructor Detail

protected BeamKafkaTable(BeamRecordSqlType beamSqlRowType)

public BeamKafkaTable(BeamRecordSqlType beamSqlRowType, java.lang.String bootstrapServers, java.util.List<java.lang.String> topics)

public BeamKafkaTable(BeamRecordSqlType beamSqlRowType, java.util.List<org.apache.kafka.common.TopicPartition> topicPartitions, java.lang.String bootstrapServers)
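For illustration, a hypothetical construction of such a table against a local broker; the BeamRecordSqlType.create(fieldNames, java.sql.Types codes) factory style and the StringValueKafkaTable subclass are assumptions carried over from the sketch above.

```java
// Usage sketch; BeamRecordSqlType.create(...) and StringValueKafkaTable are assumptions.
import java.sql.Types;
import java.util.Arrays;

import org.apache.beam.sdk.extensions.sql.BeamRecordSqlType;  // assumed path

public class KafkaTableSetup {
  public static void main(String[] args) {
    // Row type with a single VARCHAR column named "message".
    BeamRecordSqlType rowType =
        BeamRecordSqlType.create(Arrays.asList("message"), Arrays.asList(Types.VARCHAR));

    // Read/write the "orders" topic on a local broker.
    StringValueKafkaTable table =
        new StringValueKafkaTable(rowType, "localhost:9092", Arrays.asList("orders"));
  }
}
```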
Method Detail

public BeamKafkaTable updateConsumerProperties(java.util.Map<java.lang.String,java.lang.Object> configUpdates)
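The consumer configuration used by the underlying Kafka source can be adjusted before the table is read. A small sketch, assuming a concrete table instance such as the hypothetical one above and using a standard Kafka consumer property:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable;  // assumed path

public class ConsumerConfigExample {
  // Start reading from the earliest available offsets instead of the latest.
  static BeamKafkaTable startFromEarliest(BeamKafkaTable table) {
    Map<String, Object> configUpdates = new HashMap<>();
    configUpdates.put("auto.offset.reset", "earliest");  // standard Kafka consumer property
    // The BeamKafkaTable return type allows chaining further configuration calls.
    return table.updateConsumerProperties(configUpdates);
  }
}
```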
public BeamIOType getSourceType()

Description copied from interface: BeamSqlTable
In Beam SQL, there's no difference between a batch query and a streaming query. BeamIOType is used to validate the sources.
Specified by: getSourceType in interface BeamSqlTable
public abstract PTransform<PCollection<KV<byte[],byte[]>>,PCollection<BeamRecord>> getPTransformForInput()
public abstract PTransform<PCollection<BeamRecord>,PCollection<KV<byte[],byte[]>>> getPTransformForOutput()
public PCollection<BeamRecord> buildIOReader(Pipeline pipeline)

Description copied from interface: BeamSqlTable
Create a PCollection<BeamRecord> from source.
Specified by: buildIOReader in interface BeamSqlTable
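In normal use Beam SQL calls buildIOReader when the table is referenced in a query, but it can also be invoked directly. A minimal sketch, assuming a concrete table instance such as the hypothetical one above:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable;  // assumed path
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.BeamRecord;
import org.apache.beam.sdk.values.PCollection;

public class KafkaTableReadExample {
  static PCollection<BeamRecord> readRows(BeamKafkaTable table) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.create());
    // Attaches the Kafka source for the configured topics and applies the table's
    // getPTransformForInput() conversion, yielding rows ready for SQL processing.
    return table.buildIOReader(pipeline);
  }
}
```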
public PTransform<? super PCollection<BeamRecord>,PDone> buildIOWriter()

Description copied from interface: BeamSqlTable
Create an IO.write() instance to write to target.
Specified by: buildIOWriter in interface BeamSqlTable
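Correspondingly, rows can be written back to the topic by applying the transform returned by buildIOWriter(). A short sketch, assuming `rows` matches the table's declared row type:

```java
import org.apache.beam.sdk.extensions.sql.impl.schema.kafka.BeamKafkaTable;  // assumed path
import org.apache.beam.sdk.values.BeamRecord;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PDone;

public class KafkaTableWriteExample {
  static PDone writeRows(PCollection<BeamRecord> rows, BeamKafkaTable table) {
    // The returned transform is expected to run the table's getPTransformForOutput()
    // conversion and hand the resulting byte pairs to the Kafka sink.
    return rows.apply(table.buildIOWriter());
  }
}
```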
public java.lang.String getBootstrapServers()
public java.util.List<java.lang.String> getTopics()