@Hidden public interface DataflowPipelineDebugOptions extends PipelineOptions
Modifier and Type | Class and Description |
---|---|
static class | DataflowPipelineDebugOptions.DataflowClientFactory: Returns the default Dataflow client built from the passed in PipelineOptions. |
static class | DataflowPipelineDebugOptions.StagerFactory: Creates a Stager object using the class specified in getStagerClass(). |
Nested classes/interfaces inherited from interface PipelineOptions: PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, PipelineOptions.DirectRunner, PipelineOptions.JobNameFactory
Modifier and Type | Method and Description |
---|---|
java.lang.String | getApiRootUrl(): The root URL for the Dataflow API. |
com.google.api.services.dataflow.Dataflow | getDataflowClient(): An instance of the Dataflow client. |
java.lang.String | getDataflowEndpoint(): Dataflow endpoint to use. |
java.lang.String | getDataflowJobFile(): The path to write the translated Dataflow job specification out to at job submission time. |
boolean | getDumpHeapOnOOM(): If true, save a heap dump before killing a thread or process which is GC thrashing or out of memory. |
java.util.List<java.lang.String> | getExperiments(): The list of backend experiments to enable. |
int | getNumberOfWorkerHarnessThreads(): Number of threads to use on the Dataflow worker harness. |
java.lang.String | getOverrideWindmillBinary(): Custom windmill_main binary to use with the streaming runner. |
Stager | getStager(): The resource stager instance that should be used to stage resources. |
java.lang.Class<? extends Stager> | getStagerClass(): The class responsible for staging resources to be accessible by workers during job execution. |
java.util.Map<java.lang.String,java.lang.String> | getTransformNameMapping(): Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}. |
java.lang.String | getWindmillServiceEndpoint(): Custom windmill service endpoint. |
int | getWindmillServicePort() |
void | setApiRootUrl(java.lang.String value) |
void | setDataflowClient(com.google.api.services.dataflow.Dataflow value) |
void | setDataflowEndpoint(java.lang.String value) |
void | setDataflowJobFile(java.lang.String value) |
void | setDumpHeapOnOOM(boolean dumpHeapBeforeExit) |
void | setExperiments(java.util.List<java.lang.String> value) |
void | setNumberOfWorkerHarnessThreads(int value) |
void | setOverrideWindmillBinary(java.lang.String value) |
void | setStager(Stager stager) |
void | setStagerClass(java.lang.Class<? extends Stager> stagerClass) |
void | setTransformNameMapping(java.util.Map<java.lang.String,java.lang.String> value) |
void | setWindmillServiceEndpoint(java.lang.String value) |
void | setWindmillServicePort(int value) |
Methods inherited from interface PipelineOptions: as, getJobName, getOptionsId, getRunner, getStableUniqueNames, getTempLocation, outputRuntimeOptions, setJobName, setOptionsId, setRunner, setStableUniqueNames, setTempLocation
Methods inherited from interface HasDisplayData: populateDisplayData
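All of the properties above are bean-style options that can be set from command-line flags or programmatically. The following is a minimal sketch, assuming the Apache Beam package layout for the Dataflow runner (org.apache.beam.runners.dataflow.options and org.apache.beam.sdk.options); any PipelineOptions instance can be reinterpreted as this interface via as().

```java
// A minimal sketch, assuming the Apache Beam package layout for the Dataflow runner.
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class DebugOptionsExample {
  public static void main(String[] args) {
    // Parse flags such as --dataflowJobFile=/tmp/job-spec.json (path is a placeholder).
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();

    // PipelineOptions are proxy-backed, so the same instance can be viewed
    // through any registered options interface at any time.
    DataflowPipelineDebugOptions debugOptions = options.as(DataflowPipelineDebugOptions.class);

    debugOptions.setDumpHeapOnOOM(true);
    System.out.println("API root URL: " + debugOptions.getApiRootUrl());
  }
}
```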
@Experimental @Nullable java.util.List<java.lang.String> getExperiments()
Dataflow provides a number of experimental features that can be enabled with this flag.
Please sync with the Dataflow team before enabling any experiments.
void setExperiments(@Nullable java.util.List<java.lang.String> value)
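For illustration only, a sketch of enabling experiments programmatically; the experiment names below are placeholders, not flags documented on this page.

```java
import java.util.Arrays;
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

// Sketch: enable backend experiments. The names are placeholders; sync with
// the Dataflow team before enabling any experiment, as advised above.
final class ExperimentsExample {
  static void enableExperiments(DataflowPipelineDebugOptions debugOptions) {
    debugOptions.setExperiments(Arrays.asList("example_experiment_a", "example_experiment_b"));
  }
}
```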
@Default.String(value="https://dataflow.googleapis.com/") java.lang.String getApiRootUrl()
dataflowEndpoint
can override this value
if it contains an absolute URL, otherwise apiRootUrl
will be combined with
dataflowEndpoint
to generate the full URL to communicate with the Dataflow API.void setApiRootUrl(java.lang.String value)
@Default.String(value="") java.lang.String getDataflowEndpoint()
Defaults to the current version of the Google Cloud Dataflow API, at the time the current SDK version was released.
If the string contains "://", then this is treated as a URL; otherwise getApiRootUrl() is used as the root URL.
void setDataflowEndpoint(java.lang.String value)
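To make the override rule concrete, a small sketch (the URLs are illustrative placeholders, using the Beam package name assumed above):

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

// Sketch of the two endpoint forms described above.
final class EndpointExample {
  static void configure(DataflowPipelineDebugOptions debugOptions) {
    // A relative endpoint is combined with apiRootUrl to form the full API URL.
    debugOptions.setApiRootUrl("https://dataflow.googleapis.com/");
    debugOptions.setDataflowEndpoint("some/relative/path/");

    // An absolute endpoint (it contains "://") overrides apiRootUrl entirely.
    debugOptions.setDataflowEndpoint("https://dataflow.example.internal/");
  }
}
```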
java.lang.String getDataflowJobFile()
void setDataflowJobFile(java.lang.String value)
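A hedged sketch of using dataflowJobFile to capture the translated job specification at submission time; the output path is a placeholder.

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

// Sketch: ask the runner to write the translated Dataflow job specification
// to a local file when the job is submitted. The path is a placeholder.
final class JobFileExample {
  static void captureJobSpec(DataflowPipelineDebugOptions debugOptions) {
    debugOptions.setDataflowJobFile("/tmp/dataflow-job-spec.json");
  }
}
```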
@Default.Class(value=GcsStager.class) java.lang.Class<? extends Stager> getStagerClass()
void setStagerClass(java.lang.Class<? extends Stager> stagerClass)
@Default.InstanceFactory(value=DataflowPipelineDebugOptions.StagerFactory.class) Stager getStager()
void setStager(Stager stager)
@Default.InstanceFactory(value=DataflowPipelineDebugOptions.DataflowClientFactory.class) com.google.api.services.dataflow.Dataflow getDataflowClient()
void setDataflowClient(com.google.api.services.dataflow.Dataflow value)
java.util.Map<java.lang.String,java.lang.String> getTransformNameMapping()
Mapping of old PTransform names to new ones, specified as JSON {"oldName":"newName",...}. To mark a transform as deleted, make newName the empty string.
void setTransformNameMapping(java.util.Map<java.lang.String,java.lang.String> value)
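A sketch of building the mapping in code when updating a job; the transform names are placeholders. On the command line, the same mapping would be passed as the JSON form shown above.

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

// Sketch: rename one transform and mark another as deleted for a job update.
// Transform names are placeholders.
final class TransformNameMappingExample {
  static void configureUpdate(DataflowPipelineDebugOptions debugOptions) {
    Map<String, String> mapping = new HashMap<>();
    mapping.put("OldCountWords", "NewCountWords"); // renamed transform
    mapping.put("ObsoleteTransform", "");          // empty string marks it as deleted
    debugOptions.setTransformNameMapping(mapping);
  }
}
```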
java.lang.String getOverrideWindmillBinary()
void setOverrideWindmillBinary(java.lang.String value)
java.lang.String getWindmillServiceEndpoint()
void setWindmillServiceEndpoint(java.lang.String value)
@Default.Integer(value=443) int getWindmillServicePort()
void setWindmillServicePort(int value)
int getNumberOfWorkerHarnessThreads()
void setNumberOfWorkerHarnessThreads(int value)
boolean getDumpHeapOnOOM()
CAUTION: Heap dumps can be of comparable size to the default boot disk. Consider increasing the boot disk size before setting this flag to true.
void setDumpHeapOnOOM(boolean dumpHeapBeforeExit)
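A minimal sketch of opting in to heap dumps; keep the boot-disk caution above in mind (the disk size itself is configured elsewhere, for example through the worker options, not on this interface).

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions;

// Sketch: save a heap dump before a GC-thrashing or out-of-memory worker
// thread or process is killed.
final class HeapDumpExample {
  static void enableHeapDumps(DataflowPipelineDebugOptions debugOptions) {
    debugOptions.setDumpHeapOnOOM(true);
  }
}
```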