public interface SparkContextOptions extends SparkPipelineOptions
A PipelineOptions to work with properties related to JavaSparkContext.
This can only be used programmatically (as opposed to passing command line arguments), since the properties here are context-aware and should not be propagated to workers.
Separating this from SparkPipelineOptions is needed so that the context-aware properties, which link to Spark dependencies, won't be scanned by PipelineOptions' reflective instantiation. Note that SparkContextOptions is not registered with SparkRunnerRegistrar.
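Since these options can only be set in code, the typical pattern is to build them with PipelineOptionsFactory and hand an already running JavaSparkContext to the Spark runner. The following is a minimal sketch, assuming the Beam Java SDK and the Spark runner are on the classpath; the class name, master URL, and app name are illustrative:

```java
import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.beam.runners.spark.SparkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ProvidedContextExample {
  public static void main(String[] args) {
    // The JavaSparkContext the pipeline should reuse (created here for illustration;
    // in practice it may come from the hosting application).
    JavaSparkContext jsc =
        new JavaSparkContext(new SparkConf().setMaster("local[2]").setAppName("provided-context"));

    // SparkContextOptions is configured programmatically, not parsed from command-line args.
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    options.setRunner(SparkRunner.class);
    options.setUsesProvidedSparkContext(true);
    options.setProvidedSparkContext(jsc);

    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms, then: pipeline.run().waitUntilFinish();
  }
}
```

With setUsesProvidedSparkContext(true), the runner reuses the supplied context instead of creating its own, which is also why these context-aware properties should not be propagated to workers.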
Modifier and Type | Interface and Description
---|---
static class | SparkContextOptions.EmptyListenersList: Returns an empty list, to avoid handling null.
Nested classes/interfaces inherited from interface SparkPipelineOptions: SparkPipelineOptions.TmpCheckpointDirFactory

Nested classes/interfaces inherited from interface PipelineOptions: PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, PipelineOptions.DirectRunner, PipelineOptions.JobNameFactory, PipelineOptions.UserAgentFactory
Modifier and Type | Method and Description
---|---
java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> | getListeners()
org.apache.spark.api.java.JavaSparkContext | getProvidedSparkContext()
void | setListeners(java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> listeners)
void | setProvidedSparkContext(org.apache.spark.api.java.JavaSparkContext jsc)
Methods inherited from interface SparkPipelineOptions: getBatchIntervalMillis, getCheckpointDir, getCheckpointDurationMillis, getEnableSparkMetricSinks, getMaxRecordsPerBatch, getMinReadTimeMillis, getReadTimePercentage, getSparkMaster, getStorageLevel, getUsesProvidedSparkContext, setBatchIntervalMillis, setCheckpointDir, setCheckpointDurationMillis, setEnableSparkMetricSinks, setMaxRecordsPerBatch, setMinReadTimeMillis, setReadTimePercentage, setSparkMaster, setStorageLevel, setUsesProvidedSparkContext

Methods inherited from interface StreamingOptions: isStreaming, setStreaming

Methods inherited from interface ApplicationNameOptions: getAppName, setAppName

Methods inherited from interface PipelineOptions: as, getJobName, getOptionsId, getRunner, getStableUniqueNames, getTempLocation, getUserAgent, outputRuntimeOptions, setJobName, setOptionsId, setRunner, setStableUniqueNames, setTempLocation, setUserAgent

Methods inherited from interface HasDisplayData: populateDisplayData
org.apache.spark.api.java.JavaSparkContext getProvidedSparkContext()
void setProvidedSparkContext(org.apache.spark.api.java.JavaSparkContext jsc)
@Default.InstanceFactory(value=SparkContextOptions.EmptyListenersList.class)
java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> getListeners()
void setListeners(java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> listeners)
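getListeners() defaults to the empty list supplied by SparkContextOptions.EmptyListenersList, so streaming listeners are opt-in. A hedged sketch of registering one programmatically follows; the class name, the helper method, and the choice of the onBatchCompleted callback are illustrative, not part of this interface:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.spark.streaming.api.java.JavaStreamingListener;
import org.apache.spark.streaming.api.java.JavaStreamingListenerBatchCompleted;

public class ListenerRegistrationExample {

  // Hypothetical helper: attaches a single streaming listener to the given options.
  static void registerBatchListener(SparkContextOptions options) {
    // Start from the current value; the default is an empty list, never null.
    List<JavaStreamingListener> listeners = new ArrayList<>(options.getListeners());
    listeners.add(
        new JavaStreamingListener() {
          @Override
          public void onBatchCompleted(JavaStreamingListenerBatchCompleted batchCompleted) {
            // React to completed micro-batches, e.g. log progress or export metrics.
          }
        });
    options.setListeners(listeners);
  }
}
```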