public abstract static class SnowflakeIO.Write<T> extends PTransform<PCollection<T>,PDone>
Implementation of SnowflakeIO.write().
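For orientation before the method-by-method reference, here is a minimal sketch of a batch write. The table, bucket, integration, and credential values are hypothetical placeholders, and the mapper simply emits one single-column CSV line per element:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.snowflake.SnowflakeIO;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

public class SnowflakeWriteExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    // Placeholder connection details: substitute your own account values.
    SnowflakeIO.DataSourceConfiguration dc =
        SnowflakeIO.DataSourceConfiguration.create()
            .withUsernamePasswordAuth("username", "password")
            .withServerName("account.snowflakecomputing.com")
            .withDatabase("MY_DATABASE")
            .withSchema("PUBLIC");

    PCollection<String> items = p.apply(Create.of("a", "b", "c"));

    items.apply(
        SnowflakeIO.<String>write()
            .withDataSourceConfiguration(dc)
            .to("MY_TABLE")                               // placeholder table name
            .withStagingBucketName("gs://my-bucket/")     // placeholder GCS bucket
            .withStorageIntegrationName("my_integration") // placeholder integration
            .withUserDataMapper(
                (SnowflakeIO.UserDataMapper<String>) row -> new String[] {row}));

    p.run().waitUntilFinish();
  }
}
```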
| Constructor and Description |
|---|
| Write() |
| Modifier and Type | Method and Description |
|---|---|
| PDone | expand(PCollection<T> input) - Override this method to specify how this PTransform should be expanded on the given InputT. |
| protected PTransform | streamToTable(SnowflakeService snowflakeService, ValueProvider<java.lang.String> stagingBucketDir) |
| SnowflakeIO.Write<T> | to(java.lang.String table) - The name of the table to be written to in Snowflake. |
| SnowflakeIO.Write<T> | to(ValueProvider<java.lang.String> table) |
| SnowflakeIO.Write<T> | withCreateDisposition(CreateDisposition createDisposition) - A disposition to be used during table preparation. |
| SnowflakeIO.Write<T> | withDataSourceConfiguration(SnowflakeIO.DataSourceConfiguration config) - Sets information about the Snowflake server. |
| SnowflakeIO.Write<T> | withDataSourceProviderFn(SerializableFunction<java.lang.Void,javax.sql.DataSource> dataSourceProviderFn) - Sets a function that will provide a SnowflakeIO.DataSourceConfiguration at runtime. |
| SnowflakeIO.Write<T> | withDebugMode(StreamingLogLevel debugLevel) - Sets whether to log verbose info (or only errors) about loaded files while streaming. |
| SnowflakeIO.Write<T> | withFileNameTemplate(java.lang.String fileNameTemplate) - A template name for the files saved to GCP. |
| SnowflakeIO.Write<T> | withFlushRowLimit(java.lang.Integer rowsCount) - Sets the limit on the number of rows saved to each staged file before it is loaded to Snowflake. |
| SnowflakeIO.Write<T> | withFlushTimeLimit(Duration triggeringFrequency) - Sets how often staged files are created and then ingested by Snowflake during streaming. |
| SnowflakeIO.Write<T> | withQueryTransformation(java.lang.String query) - A query to be executed in Snowflake. |
| SnowflakeIO.Write<T> | withQueryTransformation(ValueProvider<java.lang.String> query) |
| SnowflakeIO.Write<T> | withQuotationMark(java.lang.String quotationMark) - Sets the Snowflake-specific quotation mark placed around strings. |
| SnowflakeIO.Write<T> | withShardsNumber(java.lang.Integer shardsNumber) - The number of shards created per window. |
| SnowflakeIO.Write<T> | withSnowflakeService(SnowflakeService snowflakeService) - The SnowflakeService implementation to be used. |
| SnowflakeIO.Write<T> | withSnowPipe(java.lang.String snowPipe) - Sets the name of a SnowPipe, which can be created in the Snowflake dashboard or CLI. |
| SnowflakeIO.Write<T> | withSnowPipe(ValueProvider<java.lang.String> snowPipe) - Same as withSnowPipe(String), but with a ValueProvider. |
| SnowflakeIO.Write<T> | withStagingBucketName(java.lang.String stagingBucketName) - Name of the cloud bucket (currently GCS) to use as a temporary location for CSVs during the COPY statement. |
| SnowflakeIO.Write<T> | withStagingBucketName(ValueProvider<java.lang.String> stagingBucketName) |
| SnowflakeIO.Write<T> | withStorageIntegrationName(java.lang.String integrationName) - Name of the storage integration in Snowflake to be used. |
| SnowflakeIO.Write<T> | withStorageIntegrationName(ValueProvider<java.lang.String> integrationName) |
| SnowflakeIO.Write<T> | withTableSchema(SnowflakeTableSchema tableSchema) - The table schema to be used when creating the table. |
| SnowflakeIO.Write<T> | withUserDataMapper(SnowflakeIO.UserDataMapper userDataMapper) - A user-defined function mapping user data into CSV lines. |
| SnowflakeIO.Write<T> | withWriteDisposition(WriteDisposition writeDisposition) - A disposition to be used during the write-to-table phase. |
Methods inherited from class PTransform: compose, compose, getAdditionalInputs, getDefaultOutputCoder, getDefaultOutputCoder, getDefaultOutputCoder, getKindString, getName, populateDisplayData, toString, validate
public SnowflakeIO.Write<T> withDataSourceConfiguration(SnowflakeIO.DataSourceConfiguration config)

Sets information about the Snowflake server.

Parameters:
    config - an instance of SnowflakeIO.DataSourceConfiguration.
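For reference, a fuller configuration sketch than the one at the top of this page; all values are placeholders, and withRole and withWarehouse are optional builder steps:

```java
// Placeholder values throughout: substitute your own account details.
SnowflakeIO.DataSourceConfiguration dc =
    SnowflakeIO.DataSourceConfiguration.create()
        .withUsernamePasswordAuth("username", "password")
        .withServerName("account.snowflakecomputing.com")
        .withDatabase("MY_DATABASE")
        .withRole("MY_ROLE")
        .withWarehouse("MY_WAREHOUSE")
        .withSchema("PUBLIC");
```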
public SnowflakeIO.Write<T> withDataSourceProviderFn(SerializableFunction<java.lang.Void,javax.sql.DataSource> dataSourceProviderFn)

Sets a function that will provide a SnowflakeIO.DataSourceConfiguration at runtime.

Parameters:
    dataSourceProviderFn - a SerializableFunction.
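A minimal sketch of such a provider function, assuming the Snowflake JDBC driver's SnowflakeBasicDataSource is on the classpath; the URL and credentials are placeholders:

```java
import javax.sql.DataSource;
import net.snowflake.client.jdbc.SnowflakeBasicDataSource;
import org.apache.beam.sdk.transforms.SerializableFunction;

// A serializable provider that is invoked at runtime on the workers.
SerializableFunction<Void, DataSource> dataSourceProviderFn =
    (Void unused) -> {
      SnowflakeBasicDataSource ds = new SnowflakeBasicDataSource();
      ds.setUrl("jdbc:snowflake://account.snowflakecomputing.com"); // placeholder URL
      ds.setUser("username");     // placeholder
      ds.setPassword("password"); // placeholder
      return ds;
    };
```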
public SnowflakeIO.Write<T> to(java.lang.String table)

The name of the table to be written to in Snowflake.

Parameters:
    table - String with the name of the table.

public SnowflakeIO.Write<T> to(ValueProvider<java.lang.String> table)
public SnowflakeIO.Write<T> withStagingBucketName(java.lang.String stagingBucketName)

Name of the cloud bucket (currently GCS) to use as a temporary location for CSVs during the COPY statement.

Parameters:
    stagingBucketName - String with the name of the bucket.

public SnowflakeIO.Write<T> withStagingBucketName(ValueProvider<java.lang.String> stagingBucketName)
public SnowflakeIO.Write<T> withStorageIntegrationName(java.lang.String integrationName)

Name of the storage integration in Snowflake to be used.

Parameters:
    integrationName - String with the name of the storage integration.

public SnowflakeIO.Write<T> withStorageIntegrationName(ValueProvider<java.lang.String> integrationName)
public SnowflakeIO.Write<T> withQueryTransformation(java.lang.String query)

A query to be executed in Snowflake.

Parameters:
    query - String with the query.

public SnowflakeIO.Write<T> withQueryTransformation(ValueProvider<java.lang.String> query)
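As an illustration, the query is applied to the staged data as it is copied into the target table. The query string below is a hypothetical placeholder (in Snowflake COPY transformations, $1 refers to the first column of a staged file), and dc and items are assumed to be defined as in the earlier batch example:

```java
// Hypothetical transformation query; adjust to your staged-file columns.
String query = "SELECT t.$1 FROM t";

items.apply(
    SnowflakeIO.<String>write()
        .withDataSourceConfiguration(dc)
        .to("MY_TABLE")
        .withStagingBucketName("gs://my-bucket/")
        .withStorageIntegrationName("my_integration")
        .withUserDataMapper(
            (SnowflakeIO.UserDataMapper<String>) row -> new String[] {row})
        .withQueryTransformation(query));
```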
public SnowflakeIO.Write<T> withFileNameTemplate(java.lang.String fileNameTemplate)

A template name for the files saved to GCP.

Parameters:
    fileNameTemplate - String with the template name for files.
public SnowflakeIO.Write<T> withUserDataMapper(SnowflakeIO.UserDataMapper userDataMapper)

A user-defined function mapping user data into CSV lines.

Parameters:
    userDataMapper - an instance of SnowflakeIO.UserDataMapper.
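A minimal sketch of a mapper that turns each element into one CSV line, here for a PCollection<Long>, following the pattern used in the Beam Snowflake documentation:

```java
// Maps each Long element to an array of column values forming one CSV line.
public static SnowflakeIO.UserDataMapper<Long> getCsvMapper() {
  return (SnowflakeIO.UserDataMapper<Long>)
      recordLine -> new String[] {recordLine.toString()};
}
```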
public SnowflakeIO.Write<T> withFlushTimeLimit(Duration triggeringFrequency)

Sets how often staged files are created and then how often they are ingested by Snowflake during streaming.

Parameters:
    triggeringFrequency - the triggering frequency as a Duration.
public SnowflakeIO.Write<T> withSnowPipe(java.lang.String snowPipe)

Sets the name of a SnowPipe, which can be created in the Snowflake dashboard or CLI:

    CREATE PIPE snowPipeName AS COPY INTO your_table FROM @yourstage;

The stage in the COPY statement should point to a cloud integration with a valid bucket URL, e.g. for GCS:

    CREATE STAGE yourstage
    URL = 'gcs://yourbucket/path/'
    STORAGE_INTEGRATION = your_integration;

    CREATE STORAGE INTEGRATION your_integration
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = GCS
    ENABLED = TRUE
    STORAGE_ALLOWED_LOCATIONS = ('gcs://yourbucket/path/');

Parameters:
    snowPipe - the name of the SnowPipe created in the Snowflake dashboard.
public SnowflakeIO.Write<T> withSnowPipe(ValueProvider<java.lang.String> snowPipe)

Same as withSnowPipe(String), but with a ValueProvider.

Parameters:
    snowPipe - the name of the SnowPipe created in the Snowflake dashboard.
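For templated pipelines, the ValueProvider overloads let such values be supplied at runtime through pipeline options. A sketch with a hypothetical options interface:

```java
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.ValueProvider;

// Hypothetical options interface exposing the SnowPipe name at runtime.
public interface SnowflakeWriteOptions extends PipelineOptions {
  ValueProvider<String> getSnowPipe();

  void setSnowPipe(ValueProvider<String> value);
}

// Later, when building the transform:
// .withSnowPipe(options.getSnowPipe())
```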
public SnowflakeIO.Write<T> withShardsNumber(java.lang.Integer shardsNumber)

The number of shards created per window.

Parameters:
    shardsNumber - the number of shards; 1 by default.
public SnowflakeIO.Write<T> withFlushRowLimit(java.lang.Integer rowsCount)

Sets the limit on the number of rows saved to each staged file before it is loaded to Snowflake. Works together with withFlushTimeLimit(Duration triggeringFrequency).

Parameters:
    rowsCount - the number of rows staged in one file for loading. Default: 10000.
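Putting the streaming options together, a sketch of a streaming write via SnowPipe might look like the following. The pipe, bucket, and integration names are placeholders; data is assumed to be an unbounded PCollection<Long>, with dc and getCsvMapper() defined earlier on this page:

```java
import org.joda.time.Duration;

data.apply(
    SnowflakeIO.<Long>write()
        .withDataSourceConfiguration(dc)
        .withSnowPipe("MY_SNOW_PIPE")                     // placeholder pipe name
        .withStagingBucketName("gs://my-bucket/")         // placeholder bucket
        .withStorageIntegrationName("my_integration")     // placeholder integration
        .withUserDataMapper(getCsvMapper())
        .withFlushTimeLimit(Duration.standardSeconds(30)) // stage a file at least every 30s
        .withFlushRowLimit(10000)                         // ...or after 10,000 rows
        .withShardsNumber(1));
```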
public SnowflakeIO.Write<T> withWriteDisposition(WriteDisposition writeDisposition)

A disposition to be used during the write-to-table phase.

Parameters:
    writeDisposition - an instance of WriteDisposition.
public SnowflakeIO.Write<T> withCreateDisposition(CreateDisposition createDisposition)

A disposition to be used during table preparation.

Parameters:
    createDisposition - an instance of CreateDisposition.
public SnowflakeIO.Write<T> withTableSchema(SnowflakeTableSchema tableSchema)

The table schema to be used when creating the table.

Parameters:
    tableSchema - an instance of SnowflakeTableSchema.
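As an illustration, and assuming the schema helper classes from the Beam Snowflake module (SnowflakeTableSchema, SnowflakeColumn, and type classes such as SnowflakeVarchar; check the org.apache.beam.sdk.io.snowflake.data package in your Beam version for exact names), a schema for table creation might be sketched as:

```java
import org.apache.beam.sdk.io.snowflake.data.SnowflakeColumn;
import org.apache.beam.sdk.io.snowflake.data.SnowflakeTableSchema;
import org.apache.beam.sdk.io.snowflake.data.text.SnowflakeVarchar;

// Hypothetical two-column schema: column names are placeholders.
SnowflakeTableSchema tableSchema =
    new SnowflakeTableSchema(
        SnowflakeColumn.of("id", new SnowflakeVarchar()),
        new SnowflakeColumn("name", new SnowflakeVarchar(), true)); // nullable column
```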
public SnowflakeIO.Write<T> withSnowflakeService(SnowflakeService snowflakeService)

The SnowflakeService implementation to be used.

Parameters:
    snowflakeService - an instance of SnowflakeService.
public SnowflakeIO.Write<T> withQuotationMark(java.lang.String quotationMark)

Sets the Snowflake-specific quotation mark placed around strings.

Parameters:
    quotationMark - a single quote ('), a double quote ("), or an empty string for no quotation. The default is a single quote (').
public SnowflakeIO.Write<T> withDebugMode(StreamingLogLevel debugLevel)

Sets whether to log verbose info (or only errors) about loaded files while streaming.

Parameters:
    debugLevel - the error or info debug level from the StreamingLogLevel enum.
public PDone expand(PCollection<T> input)

Override this method to specify how this PTransform should be expanded on the given InputT.

NOTE: This method should not be called directly. Instead, the PTransform should be applied to the InputT using the apply method.

Composite transforms, which are defined in terms of other transforms, should return the output of one of the composed transforms. Non-composite transforms, which do not apply any transforms internally, should return a new unbound output and register evaluators (via backend-specific registration methods).

Specified by:
    expand in class PTransform<PCollection<T>,PDone>
protected PTransform streamToTable(SnowflakeService snowflakeService, ValueProvider<java.lang.String> stagingBucketDir)