public abstract class BeamKafkaTable extends SchemaBaseBeamTable

BeamKafkaTable represents a Kafka topic, as a source or a target. It needs to be extended to convert between BeamSqlRow and KV<byte[], byte[]>.

| Modifier and Type | Field and Description |
|---|---|
| protected int | numberOfRecordsForRate |

Fields inherited from class SchemaBaseBeamTable: schema

| Modifier | Constructor and Description |
|---|---|
| protected | BeamKafkaTable(Schema beamSchema) |
|  | BeamKafkaTable(Schema beamSchema, java.util.List<TopicPartition> topicPartitions, java.lang.String bootstrapServers) |
|  | BeamKafkaTable(Schema beamSchema, java.lang.String bootstrapServers, java.util.List<java.lang.String> topics) |
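
Since the class is abstract, these constructors are reached through super(...) calls from a concrete subclass. A minimal sketch with a hypothetical ExampleKafkaTable; the two constructors below simply mirror the parameter order of the topic-based and partition-based public constructors listed above.

```java
import java.util.List;

import org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.kafka.common.TopicPartition;

// Hypothetical subclass, shown only to illustrate the constructor choices.
public abstract class ExampleKafkaTable extends BeamKafkaTable {

  // Subscribe to whole topics on the given brokers.
  public ExampleKafkaTable(Schema schema, String bootstrapServers, List<String> topics) {
    super(schema, bootstrapServers, topics);
  }

  // Pin the read to explicit partitions instead of whole topics.
  public ExampleKafkaTable(Schema schema, List<TopicPartition> partitions, String bootstrapServers) {
    super(schema, partitions, bootstrapServers);
  }
}
```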
| Modifier and Type | Method and Description | 
|---|---|
| PCollection<Row> | buildIOReader(PBegin begin): Create a PCollection<Row> from the source. |
| POutput | buildIOWriter(PCollection<Row> input): Create an IO.write() instance to write to the target. |
| protected KafkaIO.Read<byte[],byte[]> | createKafkaRead() |
| java.lang.String | getBootstrapServers() |
| protected abstract PTransform<PCollection<KafkaRecord<byte[],byte[]>>,PCollection<Row>> | getPTransformForInput() |
| protected abstract PTransform<PCollection<Row>,PCollection<ProducerRecord<byte[],byte[]>>> | getPTransformForOutput() |
| BeamTableStatistics | getTableStatistics(PipelineOptions options): Estimates the number of rows or the rate for unbounded Tables. |
| java.util.List<java.lang.String> | getTopics() |
| PCollection.IsBounded | isBounded(): Whether this table is bounded (known to be finite) or unbounded (may or may not be finite). |
| BeamKafkaTable | updateConsumerProperties(java.util.Map<java.lang.String,java.lang.Object> configUpdates) |

Methods inherited from class SchemaBaseBeamTable: getSchema

Methods inherited from class BaseBeamTable: buildIOReader, constructFilter, supportsProjects
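
The two abstract methods are where a concrete table supplies the conversion between raw Kafka byte[] payloads and Beam Rows. Below is a minimal sketch of a hypothetical StringKafkaTable that exposes each record value as a single string column; the schema, field name, placeholder key, and coder choice are illustrative assumptions, not the behavior of any built-in table.

```java
import java.nio.charset.StandardCharsets;
import java.util.List;

import org.apache.beam.sdk.coders.ByteArrayCoder;
import org.apache.beam.sdk.extensions.sql.meta.provider.kafka.BeamKafkaTable;
import org.apache.beam.sdk.io.kafka.KafkaRecord;
import org.apache.beam.sdk.io.kafka.ProducerRecordCoder;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;
import org.apache.kafka.clients.producer.ProducerRecord;

// Hypothetical table: one string column named "line" holding each record's value.
public class StringKafkaTable extends BeamKafkaTable {

  private static final Schema SCHEMA = Schema.builder().addStringField("line").build();

  private final String outputTopic;

  public StringKafkaTable(String bootstrapServers, List<String> topics) {
    super(SCHEMA, bootstrapServers, topics);
    this.outputTopic = topics.get(0);
  }

  @Override
  protected PTransform<PCollection<KafkaRecord<byte[], byte[]>>, PCollection<Row>> getPTransformForInput() {
    return new BytesToRow();
  }

  @Override
  protected PTransform<PCollection<Row>, PCollection<ProducerRecord<byte[], byte[]>>> getPTransformForOutput() {
    return new RowToBytes(outputTopic);
  }

  // Decodes each Kafka record value as UTF-8 into the single "line" column.
  private static class BytesToRow
      extends PTransform<PCollection<KafkaRecord<byte[], byte[]>>, PCollection<Row>> {
    @Override
    public PCollection<Row> expand(PCollection<KafkaRecord<byte[], byte[]>> input) {
      return input
          .apply("BytesToRow", MapElements.via(
              new SimpleFunction<KafkaRecord<byte[], byte[]>, Row>() {
                @Override
                public Row apply(KafkaRecord<byte[], byte[]> record) {
                  String line = new String(record.getKV().getValue(), StandardCharsets.UTF_8);
                  return Row.withSchema(SCHEMA).addValue(line).build();
                }
              }))
          .setRowSchema(SCHEMA);
    }
  }

  // Encodes the "line" column back to UTF-8 bytes in a ProducerRecord for the target topic.
  private static class RowToBytes
      extends PTransform<PCollection<Row>, PCollection<ProducerRecord<byte[], byte[]>>> {
    private final String topic;

    RowToBytes(String topic) {
      this.topic = topic;
    }

    @Override
    public PCollection<ProducerRecord<byte[], byte[]>> expand(PCollection<Row> input) {
      return input
          .apply("RowToBytes", MapElements.via(
              new SimpleFunction<Row, ProducerRecord<byte[], byte[]>>() {
                @Override
                public ProducerRecord<byte[], byte[]> apply(Row row) {
                  byte[] value = row.getString("line").getBytes(StandardCharsets.UTF_8);
                  return new ProducerRecord<>(topic, new byte[0], value);
                }
              }))
          .setCoder(ProducerRecordCoder.of(ByteArrayCoder.of(), ByteArrayCoder.of()));
    }
  }
}
```

The converters are written as static nested PTransforms so that the serialized functions capture only the target topic name rather than the whole table instance.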
protected BeamKafkaTable(Schema beamSchema)

public BeamKafkaTable(Schema beamSchema, java.lang.String bootstrapServers, java.util.List<java.lang.String> topics)

public BeamKafkaTable(Schema beamSchema, java.util.List<TopicPartition> topicPartitions, java.lang.String bootstrapServers)

public BeamKafkaTable updateConsumerProperties(java.util.Map<java.lang.String,java.lang.Object> configUpdates)
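
updateConsumerProperties applies additional Kafka consumer configuration to the table and returns a BeamKafkaTable. A short, hedged fragment, assuming kafkaTable already refers to an instance of some concrete subclass; auto.offset.reset is a standard Kafka consumer property used purely as an example.

```java
// Assumes kafkaTable refers to a concrete BeamKafkaTable subclass instance.
java.util.Map<java.lang.String, java.lang.Object> configUpdates = new java.util.HashMap<>();
configUpdates.put("auto.offset.reset", "earliest"); // start from the earliest available offset
kafkaTable = kafkaTable.updateConsumerProperties(configUpdates);
```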
public PCollection.IsBounded isBounded()
Whether this table is bounded (known to be finite) or unbounded (may or may not be finite).
Specified by: isBounded in interface BeamSqlTable

protected abstract PTransform<PCollection<KafkaRecord<byte[],byte[]>>,PCollection<Row>> getPTransformForInput()

protected abstract PTransform<PCollection<Row>,PCollection<ProducerRecord<byte[],byte[]>>> getPTransformForOutput()

public PCollection<Row> buildIOReader(PBegin begin)
Create a PCollection<Row> from the source.
Specified by: buildIOReader in interface BeamSqlTable

protected KafkaIO.Read<byte[],byte[]> createKafkaRead()

public POutput buildIOWriter(PCollection<Row> input)
Create an IO.write() instance to write to the target.
Specified by: buildIOWriter in interface BeamSqlTable

public java.lang.String getBootstrapServers()

public java.util.List<java.lang.String> getTopics()

public BeamTableStatistics getTableStatistics(PipelineOptions options)
Estimates the number of rows or the rate for unbounded Tables.
Specified by: getTableStatistics in interface BeamSqlTable
Overrides: getTableStatistics in class BaseBeamTable
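
buildIOReader and buildIOWriter are normally invoked by the Beam SQL planner, but they can also be called directly when assembling a pipeline by hand. A sketch under the assumption that a concrete subclass such as the hypothetical StringKafkaTable sketched earlier exists, and that the broker address and topic names are placeholders.

```java
import java.util.Collections;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class KafkaTableReadWrite {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Placeholder broker address and topic names; StringKafkaTable is the
    // hypothetical subclass sketched above.
    StringKafkaTable source =
        new StringKafkaTable("localhost:9092", Collections.singletonList("in_topic"));
    StringKafkaTable sink =
        new StringKafkaTable("localhost:9092", Collections.singletonList("out_topic"));

    // buildIOReader produces a PCollection<Row> from the Kafka source
    // (see createKafkaRead() and getPTransformForInput()).
    PCollection<Row> rows = source.buildIOReader(pipeline.begin());

    // buildIOWriter sends the rows back to Kafka via getPTransformForOutput().
    sink.buildIOWriter(rows);

    pipeline.run();
  }
}
```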