public interface SparkContextOptions extends SparkPipelineOptions
A PipelineOptions to work with properties related to JavaSparkContext.
This can only be used programmatically (as opposed to passing command line arguments), since the properties here are context-aware and should not be propagated to workers.
Separating this from SparkPipelineOptions is needed so the context-aware properties, which link to Spark dependencies, won't be scanned by PipelineOptions reflective instantiation. Note that SparkContextOptions is not registered with SparkRunnerRegistrar.
Note: It's recommended to use org.apache.beam.runners.spark.translation.SparkContextFactory#setProvidedSparkContext(JavaSparkContext) instead of setProvidedSparkContext(JavaSparkContext) for testing. When using TestPipeline, any JavaSparkContext provided via SparkContextOptions is dropped.
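For illustration, here is a minimal sketch (not part of this Javadoc) of wiring a provided JavaSparkContext into a pipeline programmatically. The class name ProvidedSparkContextExample and the local-mode SparkConf are assumptions; the option setters come from this interface and the interfaces it extends.

```java
import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.beam.runners.spark.SparkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ProvidedSparkContextExample {
  public static void main(String[] args) {
    // The JavaSparkContext the pipeline should reuse instead of creating its own.
    // (Local master and app name are illustrative assumptions.)
    JavaSparkContext jsc =
        new JavaSparkContext(new SparkConf().setAppName("beam-app").setMaster("local[2]"));

    // Context-aware properties are set in code, not passed as command-line arguments.
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    options.setRunner(SparkRunner.class);
    options.setUsesProvidedSparkContext(true);
    options.setProvidedSparkContext(jsc);

    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms, then pipeline.run().waitUntilFinish();
  }
}
```

For tests, the note above recommends registering the context through SparkContextFactory#setProvidedSparkContext rather than this setter.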
Modifier and Type | Interface and Description |
---|---|
static class | SparkContextOptions.EmptyListenersList — Returns an empty list, to avoid handling null. |
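Below is a minimal sketch of such a default-value factory, under the assumption that EmptyListenersList follows Beam's standard DefaultValueFactory pattern; the name EmptyListenersListSketch is illustrative and the runner's actual implementation may differ.

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.beam.sdk.options.DefaultValueFactory;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.spark.streaming.api.java.JavaStreamingListener;

/** Default value factory returning an empty listener list, so callers never have to handle null. */
class EmptyListenersListSketch implements DefaultValueFactory<List<JavaStreamingListener>> {
  @Override
  public List<JavaStreamingListener> create(PipelineOptions options) {
    return new ArrayList<>();
  }
}
```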
Nested classes/interfaces inherited from SparkCommonPipelineOptions: SparkCommonPipelineOptions.StorageLevelFactory, SparkCommonPipelineOptions.TmpCheckpointDirFactory

Nested classes/interfaces inherited from PipelineOptions: PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, PipelineOptions.DirectRunner, PipelineOptions.JobNameFactory, PipelineOptions.UserAgentFactory

Fields inherited from SparkCommonPipelineOptions: DEFAULT_MASTER_URL
Modifier and Type | Method and Description |
---|---|
java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> | getListeners() |
org.apache.spark.api.java.JavaSparkContext | getProvidedSparkContext() |
void | setListeners(java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> listeners) |
void | setProvidedSparkContext(org.apache.spark.api.java.JavaSparkContext jsc) |
Methods inherited from SparkPipelineOptions: getBatchIntervalMillis, getBundleSize, getCheckpointDurationMillis, getMaxRecordsPerBatch, getMinReadTimeMillis, getReadTimePercentage, getUsesProvidedSparkContext, isCacheDisabled, setBatchIntervalMillis, setBundleSize, setCacheDisabled, setCheckpointDurationMillis, setMaxRecordsPerBatch, setMinReadTimeMillis, setReadTimePercentage, setUsesProvidedSparkContext

Methods inherited from SparkCommonPipelineOptions: getCheckpointDir, getEnableSparkMetricSinks, getSparkMaster, getStorageLevel, prepareFilesToStage, setCheckpointDir, setEnableSparkMetricSinks, setSparkMaster, setStorageLevel

Methods inherited from StreamingOptions: isStreaming, setStreaming

Methods inherited from ApplicationNameOptions: getAppName, setAppName

Methods inherited from PipelineOptions: as, getJobName, getOptionsId, getRunner, getStableUniqueNames, getTempLocation, getUserAgent, outputRuntimeOptions, revision, setJobName, setOptionsId, setRunner, setStableUniqueNames, setTempLocation, setUserAgent

Methods inherited from HasDisplayData: populateDisplayData

Methods inherited from FileStagingOptions: getFilesToStage, setFilesToStage
org.apache.spark.api.java.JavaSparkContext getProvidedSparkContext()
void setProvidedSparkContext(org.apache.spark.api.java.JavaSparkContext jsc)
@Default.InstanceFactory(value=SparkContextOptions.EmptyListenersList.class)
java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> getListeners()
void setListeners(java.util.List<org.apache.spark.streaming.api.java.JavaStreamingListener> listeners)
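A hedged sketch of how these accessors might be used together; the anonymous listener and the surrounding ListenersExample class are illustrative assumptions, not part of this page.

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.spark.streaming.api.java.JavaStreamingListener;

public class ListenersExample {
  public static void main(String[] args) {
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);

    // getListeners() defaults to EmptyListenersList, so the value is never null;
    // streaming listeners are registered explicitly via setListeners(...).
    List<JavaStreamingListener> listeners = new ArrayList<>();
    listeners.add(new JavaStreamingListener() {
      // Override the callbacks of interest, e.g. batch-completed notifications.
    });
    options.setListeners(listeners);
  }
}
```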