public interface DataflowPipelineOptions extends PipelineOptions, GcpOptions, ApplicationNameOptions, DataflowPipelineDebugOptions, DataflowPipelineWorkerPoolOptions, BigQueryOptions, GcsOptions, StreamingOptions, CloudDebuggerOptions, DataflowWorkerLoggingOptions, DataflowProfilingOptions, PubsubOptions
| Modifier and Type | Interface and Description |
|---|---|
| static class | DataflowPipelineOptions.FlexResourceSchedulingGoal: Set of available Flexible Resource Scheduling goals. |
| static class | DataflowPipelineOptions.StagingLocationFactory: Returns a default staging location under GcpOptions.getGcpTempLocation(). |
Nested classes/interfaces inherited from interface DataflowPipelineDebugOptions:
DataflowPipelineDebugOptions.DataflowClientFactory, DataflowPipelineDebugOptions.StagerFactory

Nested classes/interfaces inherited from interface DataflowPipelineWorkerPoolOptions:
DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType

Nested classes/interfaces inherited from interface GcsOptions:
GcsOptions.ExecutorServiceFactory, GcsOptions.PathValidatorFactory

Nested classes/interfaces inherited from interface DataflowWorkerLoggingOptions:
DataflowWorkerLoggingOptions.Level, DataflowWorkerLoggingOptions.WorkerLogLevelOverrides

Nested classes/interfaces inherited from interface DataflowProfilingOptions:
DataflowProfilingOptions.DataflowProfilingAgentConfiguration

Nested classes/interfaces inherited from interface GcpOptions:
GcpOptions.DefaultProjectFactory, GcpOptions.EnableStreamingEngineFactory, GcpOptions.GcpTempLocationFactory, GcpOptions.GcpUserCredentialsFactory

Nested classes/interfaces inherited from interface GoogleApiDebugOptions:
GoogleApiDebugOptions.GoogleApiTracer
Fields inherited from super-interfaces:
STATE_CACHE_SIZE, STATE_SAMPLING_PERIOD_MILLIS, STREAMING_ENGINE_EXPERIMENT, WINDMILL_SERVICE_EXPERIMENT
| Modifier and Type | Method and Description |
|---|---|
| java.lang.String | getCreateFromSnapshot(): If set, the snapshot from which the job should be created. |
| java.util.List<java.lang.String> | getDataflowServiceOptions(): Service options are set by the user and configure the service. |
| java.lang.String | getDataflowWorkerJar() |
| DataflowPipelineOptions.FlexResourceSchedulingGoal | getFlexRSGoal(): This option controls Flexible Resource Scheduling mode. |
| java.util.List<java.lang.String> | getJdkAddOpenModules(): Open modules needed for reflection that accesses JDK internals with Java 9+. |
| java.util.Map<java.lang.String,java.lang.String> | getLabels(): Labels that will be applied to the billing records for this job. |
| java.lang.String | getPipelineUrl(): The URL of the staged portable pipeline. |
| java.lang.String | getProject(): Project id to use when launching jobs. |
| java.lang.String | getRegion(): The Google Compute Engine region for creating Dataflow jobs. |
| java.lang.String | getServiceAccount(): Run the job as a specific service account, instead of the default GCE robot. |
| java.lang.String | getStagingLocation(): GCS path for staging local files. |
| java.lang.String | getTemplateLocation(): Where the runner should generate a template file. |
| boolean | isHotKeyLoggingEnabled(): If enabled, the literal key will be logged to Cloud Logging when a hot key is detected. |
| boolean | isUpdate(): Whether to update the currently running pipeline with the same name as this one. |
| void | setCreateFromSnapshot(java.lang.String value) |
| void | setDataflowServiceOptions(java.util.List<java.lang.String> options) |
| void | setDataflowWorkerJar(java.lang.String dataflowWorkerJar) |
| void | setFlexRSGoal(DataflowPipelineOptions.FlexResourceSchedulingGoal goal) |
| void | setHotKeyLoggingEnabled(boolean value) |
| void | setJdkAddOpenModules(java.util.List<java.lang.String> options) |
| void | setLabels(java.util.Map<java.lang.String,java.lang.String> labels) |
| void | setPipelineUrl(java.lang.String urlString) |
| void | setProject(java.lang.String value) |
| void | setRegion(java.lang.String region) |
| void | setServiceAccount(java.lang.String value) |
| void | setStagingLocation(java.lang.String value) |
| void | setTemplateLocation(java.lang.String value) |
| void | setUpdate(boolean value) |
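As a hedged sketch of how these options are typically obtained and populated (the project, region, and bucket names below are placeholders, and PipelineOptionsFactory is Beam's standard entry point for constructing options):

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsExample {
  // Build Dataflow options from command-line args, then fill in required fields.
  static DataflowPipelineOptions build(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    // Placeholder values; substitute your own project, region, and bucket.
    options.setProject("my-gcp-project");
    options.setRegion("us-central1");
    options.setStagingLocation("gs://my-bucket/staging");
    return options;
  }

  public static void main(String[] args) {
    System.out.println(build(args).getProject());
  }
}
```

Any flag passed on the command line (e.g. --region) is parsed by fromArgs and overrides nothing set afterward in code; the explicit setters above take precedence because they run last.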
Methods inherited from interface DataflowPipelineDebugOptions:
getApiRootUrl, getDataflowClient, getDataflowEndpoint, getDataflowJobFile, getDumpHeapOnOOM, getGCThrashingPercentagePerPeriod, getJfrRecordingDurationSec, getMaxBundlesFromWindmillOutstanding, getMaxBytesFromWindmillOutstanding, getNumberOfWorkerHarnessThreads, getOverrideWindmillBinary, getReaderCacheTimeoutSec, getRecordJfrOnGcThrashing, getSaveHeapDumpsToGcsPath, getSdkHarnessContainerImageOverrides, getStager, getStagerClass, getTransformNameMapping, getUnboundedReaderMaxElements, getUnboundedReaderMaxReadTimeSec, getUnboundedReaderMaxWaitForElementsMs, getWindmillServiceEndpoint, getWindmillServicePort, getWorkerCacheMb, setApiRootUrl, setDataflowClient, setDataflowEndpoint, setDataflowJobFile, setDumpHeapOnOOM, setGCThrashingPercentagePerPeriod, setJfrRecordingDurationSec, setMaxBundlesFromWindmillOutstanding, setMaxBytesFromWindmillOutstanding, setNumberOfWorkerHarnessThreads, setOverrideWindmillBinary, setReaderCacheTimeoutSec, setRecordJfrOnGcThrashing, setSaveHeapDumpsToGcsPath, setSdkHarnessContainerImageOverrides, setStager, setStagerClass, setTransformNameMapping, setUnboundedReaderMaxElements, setUnboundedReaderMaxReadTimeSec, setUnboundedReaderMaxWaitForElementsMs, setWindmillServiceEndpoint, setWindmillServicePort, setWorkerCacheMb

Methods inherited from interface ExperimentalOptions:
addExperiment, getExperiments, getExperimentValue, hasExperiment, setExperiments

Methods inherited from interface DataflowPipelineWorkerPoolOptions:
getAutoscalingAlgorithm, getDiskSizeGb, getMaxNumWorkers, getMinCpuPlatform, getNetwork, getNumWorkers, getSdkContainerImage, getSubnetwork, getUsePublicIps, getWorkerDiskType, getWorkerHarnessContainerImage, getWorkerMachineType, setAutoscalingAlgorithm, setDiskSizeGb, setMaxNumWorkers, setMinCpuPlatform, setNetwork, setNumWorkers, setSdkContainerImage, setSubnetwork, setUsePublicIps, setWorkerDiskType, setWorkerHarnessContainerImage, setWorkerMachineType

Methods inherited from interface FileStagingOptions:
getFilesToStage, setFilesToStage

Methods inherited from interface BigQueryOptions:
getBigQueryProject, getBqStreamingApiLoggingFrequencySec, getHTTPWriteTimeout, getInsertBundleParallelism, getMaxBufferingDurationMilliSec, getMaxStreamingBatchSize, getMaxStreamingRowsToBatch, getNumStorageWriteApiStreamAppendClients, getNumStorageWriteApiStreams, getNumStreamingKeys, getSchemaUpdateRetries, getStorageApiAppendThresholdBytes, getStorageApiAppendThresholdRecordCount, getStorageWriteApiTriggeringFrequencySec, getTempDatasetId, getUseStorageWriteApi, getUseStorageWriteApiAtLeastOnce, setBigQueryProject, setBqStreamingApiLoggingFrequencySec, setHTTPWriteTimeout, setInsertBundleParallelism, setMaxBufferingDurationMilliSec, setMaxStreamingBatchSize, setMaxStreamingRowsToBatch, setNumStorageWriteApiStreamAppendClients, setNumStorageWriteApiStreams, setNumStreamingKeys, setSchemaUpdateRetries, setStorageApiAppendThresholdBytes, setStorageApiAppendThresholdRecordCount, setStorageWriteApiTriggeringFrequencySec, setTempDatasetId, setUseStorageWriteApi, setUseStorageWriteApiAtLeastOnce

Methods inherited from interface GcsOptions:
getExecutorService, getGcsEndpoint, getGcsPerformanceMetrics, getGcsUploadBufferSizeBytes, getGcsUtil, getPathValidator, getPathValidatorClass, setExecutorService, setGcsEndpoint, setGcsPerformanceMetrics, setGcsUploadBufferSizeBytes, setGcsUtil, setPathValidator, setPathValidatorClass

Methods inherited from interface CloudDebuggerOptions:
getDebuggee, getEnableCloudDebugger, getMaxConditionCost, setDebuggee, setEnableCloudDebugger, setMaxConditionCost

Methods inherited from interface DataflowWorkerLoggingOptions:
getDefaultWorkerLogLevel, getWorkerLogLevelOverrides, getWorkerSystemErrMessageLevel, getWorkerSystemOutMessageLevel, setDefaultWorkerLogLevel, setWorkerLogLevelOverrides, setWorkerSystemErrMessageLevel, setWorkerSystemOutMessageLevel

Methods inherited from interface DataflowProfilingOptions:
getProfilingAgentConfiguration, getSaveProfilesToGcs, setProfilingAgentConfiguration, setSaveProfilesToGcs

Methods inherited from interface PubsubOptions:
getPubsubRootUrl, setPubsubRootUrl, targetForRootUrl

Methods inherited from interface GcpOptions:
getCredentialFactoryClass, getDataflowKmsKey, getGcpCredential, getGcpTempLocation, getImpersonateServiceAccount, getWorkerRegion, getWorkerZone, getZone, isEnableStreamingEngine, setCredentialFactoryClass, setDataflowKmsKey, setEnableStreamingEngine, setGcpCredential, setGcpTempLocation, setImpersonateServiceAccount, setWorkerRegion, setWorkerZone, setZone

Methods inherited from interface GoogleApiDebugOptions:
getGoogleApiTrace, setGoogleApiTrace
@Validation.Required @Default.InstanceFactory(value=GcpOptions.DefaultProjectFactory.class) java.lang.String getProject()
Description copied from interface: GcpOptions
Specified by: getProject in interface GcpOptions
void setProject(java.lang.String value)
Specified by: setProject in interface GcpOptions
@Default.InstanceFactory(value=DataflowPipelineOptions.StagingLocationFactory.class) java.lang.String getStagingLocation()
GCS path for staging local files. Must be a valid Cloud Storage URL, beginning with the prefix "gs://". If getStagingLocation() is not set, it defaults to GcpOptions.getGcpTempLocation(), which must itself be a valid GCS path.
void setStagingLocation(java.lang.String value)
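In practice these locations are usually supplied as pipeline arguments on the command line rather than via the setters; a hedged sketch, where the project, region, and bucket names are placeholders:

```
--project=my-gcp-project
--region=us-central1
--stagingLocation=gs://my-bucket/staging
--tempLocation=gs://my-bucket/temp
```

If --stagingLocation is omitted, the StagingLocationFactory falls back to the GCP temp location as described above.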
boolean isUpdate()
void setUpdate(boolean value)
java.lang.String getCreateFromSnapshot()
void setCreateFromSnapshot(java.lang.String value)
java.lang.String getTemplateLocation()
void setTemplateLocation(java.lang.String value)
java.util.List<java.lang.String> getDataflowServiceOptions()
void setDataflowServiceOptions(java.util.List<java.lang.String> options)
@Hidden @Experimental java.lang.String getServiceAccount()
void setServiceAccount(java.lang.String value)
@Default.InstanceFactory(value=DefaultGcpRegionFactory.class) java.lang.String getRegion()
void setRegion(java.lang.String region)
java.util.Map<java.lang.String,java.lang.String> getLabels()
void setLabels(java.util.Map<java.lang.String,java.lang.String> labels)
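A minimal sketch of attaching labels, which the summary above says are applied to the job's billing records; the keys and values here are illustrative placeholders:

```java
import java.util.Map;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class LabelsExample {
  // Build options with billing-record labels attached.
  static DataflowPipelineOptions build() {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
    // Placeholder labels; use your own team/environment taxonomy.
    options.setLabels(Map.of("team", "data-eng", "environment", "dev"));
    return options;
  }
}
```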
java.lang.String getPipelineUrl()
void setPipelineUrl(java.lang.String urlString)
java.lang.String getDataflowWorkerJar()
void setDataflowWorkerJar(java.lang.String dataflowWorkerJar)
@Default.Enum(value="UNSPECIFIED") DataflowPipelineOptions.FlexResourceSchedulingGoal getFlexRSGoal()
void setFlexRSGoal(DataflowPipelineOptions.FlexResourceSchedulingGoal goal)
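A hedged sketch of opting a job into Flexible Resource Scheduling: the goal defaults to UNSPECIFIED per the @Default.Enum annotation above, and COST_OPTIMIZED is one of the values of DataflowPipelineOptions.FlexResourceSchedulingGoal:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.FlexResourceSchedulingGoal;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class FlexRsExample {
  // Build options that ask the service to schedule resources for cost.
  static DataflowPipelineOptions build() {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
    options.setFlexRSGoal(FlexResourceSchedulingGoal.COST_OPTIMIZED);
    return options;
  }
}
```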
boolean isHotKeyLoggingEnabled()
void setHotKeyLoggingEnabled(boolean value)
java.util.List<java.lang.String> getJdkAddOpenModules()
With JDK 16+, JDK internals are strongly encapsulated, and an InaccessibleObjectException can be thrown if a tool or library uses reflection that accesses JDK internals. If you see these errors in your worker logs, you can pass in modules to open using the format module/package=target-module(,target-module)* to allow access for the library, e.g. java.base/java.lang=jamm.
You may see warnings that jamm, a library used to size objects more accurately, is unable to make a private field accessible. To resolve the warning, open the specified module/package to jamm.
void setJdkAddOpenModules(java.util.List<java.lang.String> options)
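The jamm case described above can be sketched with the setter; the module string java.base/java.lang=jamm is taken from the description's own example:

```java
import java.util.Arrays;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OpenModulesExample {
  // Open java.base/java.lang to jamm so its reflective field access works on JDK 16+.
  static DataflowPipelineOptions build() {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
    options.setJdkAddOpenModules(Arrays.asList("java.base/java.lang=jamm"));
    return options;
  }
}
```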