public interface SparkContextOptions extends SparkPipelineOptions
A custom PipelineOptions to work with properties related to JavaSparkContext.

This can only be used programmatically (as opposed to passing command-line arguments), since the properties here are context-aware and should not be propagated to workers.

Separating this from SparkPipelineOptions is needed so the context-aware properties, which link to Spark dependencies, won't be scanned during PipelineOptions reflective instantiation. Note that SparkContextOptions is not registered with SparkRunnerRegistrar.
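Because these options can only be set in code, the typical use is handing Beam a JavaSparkContext that the host application already owns. The following is a minimal sketch, assuming Beam's SparkRunner and a context created by the application; the class name, local master URL, and omitted transforms are illustrative, not part of this interface:

```java
import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.beam.runners.spark.SparkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ProvidedSparkContextExample {
  public static void main(String[] args) {
    // The host application owns the JavaSparkContext; the runner reuses it
    // instead of creating one from command-line options.
    JavaSparkContext jsc = new JavaSparkContext(
        new SparkConf().setAppName("beam-on-provided-context").setMaster("local[*]"));

    // SparkContextOptions is configured in code, never parsed from args,
    // because the context-aware properties must not reach the workers.
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    options.setRunner(SparkRunner.class);
    options.setUsesProvidedSparkContext(true);
    options.setProvidedSparkContext(jsc);

    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms here ...
    pipeline.run().waitUntilFinish();

    jsc.stop();
  }
}
```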
Nested Class Summary

Modifier and Type | Interface and Description |
---|---|
static class | SparkContextOptions.EmptyListenersList: Returns an empty list, to avoid handling null. |
Nested classes/interfaces inherited from interface org.apache.beam.runners.spark.SparkCommonPipelineOptions: SparkCommonPipelineOptions.TmpCheckpointDirFactory

Nested classes/interfaces inherited from interface org.apache.beam.sdk.options.PipelineOptions: PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, PipelineOptions.DirectRunner, PipelineOptions.JobNameFactory, PipelineOptions.UserAgentFactory

Fields inherited from interface org.apache.beam.runners.spark.SparkCommonPipelineOptions: DEFAULT_MASTER_URL
Method Summary

Modifier and Type | Method and Description |
---|---|
java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> | getListeners() |
org.apache.spark.api.java.JavaSparkContext | getProvidedSparkContext() |
void | setListeners(java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> listeners) |
void | setProvidedSparkContext(org.apache.spark.api.java.JavaSparkContext jsc) |
Methods inherited from interface org.apache.beam.runners.spark.SparkPipelineOptions: getBatchIntervalMillis, getBundleSize, getCheckpointDurationMillis, getMaxRecordsPerBatch, getMinReadTimeMillis, getReadTimePercentage, getStorageLevel, getUsesProvidedSparkContext, isCacheDisabled, isLocalSparkMaster, prepareFilesToStage, setBatchIntervalMillis, setBundleSize, setCacheDisabled, setCheckpointDurationMillis, setMaxRecordsPerBatch, setMinReadTimeMillis, setReadTimePercentage, setStorageLevel, setUsesProvidedSparkContext

Methods inherited from interface org.apache.beam.runners.spark.SparkCommonPipelineOptions: getCheckpointDir, getEnableSparkMetricSinks, getFilesToStage, getSparkMaster, setCheckpointDir, setEnableSparkMetricSinks, setFilesToStage, setSparkMaster

Methods inherited from interface org.apache.beam.sdk.options.StreamingOptions: isStreaming, setStreaming

Methods inherited from interface org.apache.beam.sdk.options.ApplicationNameOptions: getAppName, setAppName

Methods inherited from interface org.apache.beam.sdk.options.PipelineOptions: as, getJobName, getOptionsId, getRunner, getStableUniqueNames, getTempLocation, getUserAgent, outputRuntimeOptions, setJobName, setOptionsId, setRunner, setStableUniqueNames, setTempLocation, setUserAgent

Methods inherited from interface org.apache.beam.sdk.transforms.display.HasDisplayData: populateDisplayData
Method Detail

org.apache.spark.api.java.JavaSparkContext getProvidedSparkContext()

void setProvidedSparkContext(org.apache.spark.api.java.JavaSparkContext jsc)

@Default.InstanceFactory(value=SparkContextOptions.EmptyListenersList.class)
java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> getListeners()

void setListeners(java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> listeners)
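The @Default.InstanceFactory annotation means getListeners() falls back to SparkContextOptions.EmptyListenersList rather than returning null, so setListeners(...) is only needed when streaming callbacks are actually wanted. A minimal sketch of registering one, assuming Spark's JavaStreamingListener no-op base class; the anonymous listener and its log message are illustrative:

```java
import java.util.Collections;

import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.spark.streaming.api.java.JavaStreamingListener;
import org.apache.spark.streaming.api.java.JavaStreamingListenerBatchCompleted;

public class ListenerRegistrationExample {
  public static void main(String[] args) {
    // JavaStreamingListener is a no-op base class; override only the callbacks of interest.
    JavaStreamingListener batchLogger = new JavaStreamingListener() {
      @Override
      public void onBatchCompleted(JavaStreamingListenerBatchCompleted batchCompleted) {
        System.out.println("Spark Streaming batch completed");
      }
    };

    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    // Without this call, getListeners() returns the empty default list.
    options.setListeners(Collections.singletonList(batchLogger));
  }
}
```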