@Hidden public interface DataflowPipelineDebugOptions extends ExperimentalOptions, PipelineOptions
Modifier and Type | Class and Description
---|---
static class | DataflowPipelineDebugOptions.DataflowClientFactory: Returns the default Dataflow client built from the passed in PipelineOptions.
static class | DataflowPipelineDebugOptions.StagerFactory: Creates a Stager object using the class specified in getStagerClass().
Nested classes/interfaces inherited from interface PipelineOptions: PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, PipelineOptions.DirectRunner, PipelineOptions.JobNameFactory, PipelineOptions.NoOpMetricsSink, PipelineOptions.UserAgentFactory
Modifier and Type | Method and Description
---|---
java.lang.String | getApiRootUrl(): The root URL for the Dataflow API.
com.google.api.services.dataflow.Dataflow | getDataflowClient(): An instance of the Dataflow client.
java.lang.String | getDataflowEndpoint(): Dataflow endpoint to use.
java.lang.String | getDataflowJobFile(): The path to write the translated Dataflow job specification out to at job submission time.
boolean | getDumpHeapOnOOM(): If true, save a heap dump before killing a thread or process which is GC thrashing or out of memory.
int | getNumberOfWorkerHarnessThreads(): Number of threads to use on the Dataflow worker harness.
java.lang.String | getOverrideWindmillBinary(): Custom windmill_main binary to use with the streaming runner.
java.lang.String | getSaveHeapDumpsToGcsPath(): CAUTION: This option implies dumpHeapOnOOM, and has similar caveats.
Stager | getStager(): The resource stager instance that should be used to stage resources.
java.lang.Class<? extends Stager> | getStagerClass(): The class responsible for staging resources to be accessible by workers during job execution.
java.util.Map<java.lang.String,java.lang.String> | getTransformNameMapping(): Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}.
java.lang.String | getWindmillServiceEndpoint(): Custom windmill service endpoint.
int | getWindmillServicePort()
void | setApiRootUrl(java.lang.String value)
void | setDataflowClient(com.google.api.services.dataflow.Dataflow value)
void | setDataflowEndpoint(java.lang.String value)
void | setDataflowJobFile(java.lang.String value)
void | setDumpHeapOnOOM(boolean dumpHeapBeforeExit)
void | setNumberOfWorkerHarnessThreads(int value)
void | setOverrideWindmillBinary(java.lang.String value)
void | setSaveHeapDumpsToGcsPath(java.lang.String gcsPath)
void | setStager(Stager stager)
void | setStagerClass(java.lang.Class<? extends Stager> stagerClass)
void | setTransformNameMapping(java.util.Map<java.lang.String,java.lang.String> value)
void | setWindmillServiceEndpoint(java.lang.String value)
void | setWindmillServicePort(int value)
Methods inherited from interface ExperimentalOptions: getExperiments, hasExperiment, setExperiments
Methods inherited from interface PipelineOptions: as, getJobName, getMetricsHttpSinkUrl, getMetricsPushPeriod, getMetricsSink, getOptionsId, getRunner, getStableUniqueNames, getTempLocation, getUserAgent, outputRuntimeOptions, setJobName, setMetricsHttpSinkUrl, setMetricsPushPeriod, setMetricsSink, setOptionsId, setRunner, setStableUniqueNames, setTempLocation, setUserAgent
Methods inherited from interface HasDisplayData: populateDisplayData
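Since DataflowPipelineDebugOptions is a PipelineOptions sub-interface, an instance is normally obtained through PipelineOptionsFactory rather than implemented directly. A minimal usage sketch, assuming the Beam SDK and the Dataflow runner artifact are on the classpath (class name and flags shown are illustrative):

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class DebugOptionsExample {
  public static void main(String[] args) {
    // Parse command-line flags (e.g. --dumpHeapOnOOM=true --numberOfWorkerHarnessThreads=4)
    // and view the resulting options through this debug interface.
    DataflowPipelineDebugOptions debugOptions =
        PipelineOptionsFactory.fromArgs(args)
            .withValidation()
            .as(DataflowPipelineDebugOptions.class);

    // Defaults declared on the interface are applied when the getter is called,
    // e.g. getApiRootUrl() falls back to "https://dataflow.googleapis.com/".
    System.out.println("Dataflow API root: " + debugOptions.getApiRootUrl());
  }
}
```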
@Default.String(value="https://dataflow.googleapis.com/") java.lang.String getApiRootUrl()
dataflowEndpoint
can override this value
if it contains an absolute URL, otherwise apiRootUrl
will be combined with
dataflowEndpoint
to generate the full URL to communicate with the Dataflow API.void setApiRootUrl(java.lang.String value)
@Default.String(value="") java.lang.String getDataflowEndpoint()
Defaults to the current version of the Google Cloud Dataflow API, at the time the current SDK version was released.
If the string contains "://", then this is treated as a URL,
otherwise getApiRootUrl()
is used as the root
URL.
void setDataflowEndpoint(java.lang.String value)
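A short sketch of how the two URL options above combine; the helper class and the example host are illustrative only:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

final class EndpointSketch {
  static void configure(DataflowPipelineDebugOptions options) {
    // With an empty or relative endpoint, apiRootUrl is used as the base URL.
    options.setApiRootUrl("https://dataflow.googleapis.com/");
    options.setDataflowEndpoint("");

    // An endpoint containing "://" is treated as an absolute URL and
    // overrides apiRootUrl entirely (example.com is a placeholder host).
    options.setDataflowEndpoint("https://dataflow.example.com/");
  }
}
```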
java.lang.String getDataflowJobFile()
void setDataflowJobFile(java.lang.String value)
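For example, the translated job specification can be captured for inspection at submission time; the file path below is illustrative:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

final class JobFileSketch {
  static void captureJobSpec(DataflowPipelineDebugOptions options) {
    // Write the translated Dataflow job specification to a local file,
    // handy for diffing what is actually sent to the service.
    options.setDataflowJobFile("/tmp/dataflow-job-spec.json");
  }
}
```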
@Default.Class(value=GcsStager.class) java.lang.Class<? extends Stager> getStagerClass()
void setStagerClass(java.lang.Class<? extends Stager> stagerClass)
@Default.InstanceFactory(value=DataflowPipelineDebugOptions.StagerFactory.class) Stager getStager()
void setStager(Stager stager)
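As a sketch, the staging mechanism can be pinned explicitly. GcsStager is the documented default; its package path below (org.apache.beam.runners.dataflow.util) is assumed and may differ between SDK versions:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;
import org.apache.beam.runners.dataflow.util.GcsStager;

final class StagerSketch {
  static void configureStaging(DataflowPipelineDebugOptions options) {
    // Pin the stager implementation; the default StagerFactory instantiates
    // this class when getStager() is first called.
    options.setStagerClass(GcsStager.class);
  }
}
```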
@Default.InstanceFactory(value=DataflowPipelineDebugOptions.DataflowClientFactory.class) com.google.api.services.dataflow.Dataflow getDataflowClient()
void setDataflowClient(com.google.api.services.dataflow.Dataflow value)
java.util.Map<java.lang.String,java.lang.String> getTransformNameMapping()
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}. To mark a transform as deleted, set newName to the empty string.
void setTransformNameMapping(java.util.Map<java.lang.String,java.lang.String> value)
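A sketch of supplying the mapping in code when updating a job; the transform names are placeholders:

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

final class TransformRenameSketch {
  static void applyRenames(DataflowPipelineDebugOptions options) {
    Map<String, String> mapping = new HashMap<>();
    mapping.put("OldParDoName", "NewParDoName"); // renamed transform
    mapping.put("ObsoleteTransform", "");        // empty string marks it as deleted
    options.setTransformNameMapping(mapping);
  }
}
```

On the command line the same mapping would be passed as a JSON flag, e.g. --transformNameMapping={"oldName":"newName"}.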
java.lang.String getOverrideWindmillBinary()
void setOverrideWindmillBinary(java.lang.String value)
java.lang.String getWindmillServiceEndpoint()
void setWindmillServiceEndpoint(java.lang.String value)
@Default.Integer(value=443) int getWindmillServicePort()
void setWindmillServicePort(int value)
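These streaming knobs can also be set programmatically; the binary path and endpoint host below are placeholders:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

final class WindmillSketch {
  static void configureWindmill(DataflowPipelineDebugOptions options) {
    // Replace the windmill_main binary used by the streaming runner (path is illustrative).
    options.setOverrideWindmillBinary("gs://my-bucket/bin/windmill_main");

    // Point the streaming runner at a custom windmill service (host is illustrative);
    // the port defaults to 443 when left unset.
    options.setWindmillServiceEndpoint("windmill.example.com");
    options.setWindmillServicePort(443);
  }
}
```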
int getNumberOfWorkerHarnessThreads()
void setNumberOfWorkerHarnessThreads(int value)
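A short sketch of capping harness parallelism in code; the thread count shown is arbitrary:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

final class HarnessThreadsSketch {
  static void capThreads(DataflowPipelineDebugOptions options) {
    // Limit the number of worker harness threads on each Dataflow worker (value is illustrative).
    options.setNumberOfWorkerHarnessThreads(4);
  }
}
```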
boolean getDumpHeapOnOOM()
If true, save a heap dump before killing a thread or process which is GC thrashing or out of memory. CAUTION: Heap dumps can be of comparable size to the default boot disk. Consider increasing the boot disk size before setting this flag to true.
void setDumpHeapOnOOM(boolean dumpHeapBeforeExit)
@Experimental java.lang.String getSaveHeapDumpsToGcsPath()
void setSaveHeapDumpsToGcsPath(java.lang.String gcsPath)
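A sketch combining the two heap-dump options; the bucket path is a placeholder, and as noted above enabling dumps may require a larger boot disk:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

final class HeapDumpSketch {
  static void enableHeapDumps(DataflowPipelineDebugOptions options) {
    // Save a heap dump before a GC-thrashing or out-of-memory thread/process is killed.
    options.setDumpHeapOnOOM(true);

    // Upload the dumps to GCS; setting this path implies dumpHeapOnOOM as documented.
    options.setSaveHeapDumpsToGcsPath("gs://my-bucket/heap-dumps");
  }
}
```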