Class TestDStream<T>

java.lang.Object
  org.apache.spark.streaming.dstream.DStream<WindowedValue<T>>
    org.apache.spark.streaming.dstream.InputDStream<WindowedValue<T>>
      org.apache.beam.runners.spark.translation.streaming.TestDStream<T>
All Implemented Interfaces:
  Serializable, org.apache.spark.internal.Logging, scala.Serializable
public class TestDStream<T>
extends org.apache.spark.streaming.dstream.InputDStream<WindowedValue<T>>
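The constructor documented below wraps a Beam TestStream as a Spark InputDStream of WindowedValue<T>. A minimal construction sketch follows, assuming TestStream here is org.apache.beam.sdk.testing.TestStream; the coder, element values, master URL, and batch interval are illustrative choices, not part of this API:

  import org.apache.beam.runners.spark.translation.streaming.TestDStream;
  import org.apache.beam.sdk.coders.StringUtf8Coder;
  import org.apache.beam.sdk.testing.TestStream;
  import org.apache.spark.SparkConf;
  import org.apache.spark.streaming.Durations;
  import org.apache.spark.streaming.StreamingContext;

  // Build a TestStream of Strings: two elements, then advance the watermark to the end of time.
  TestStream<String> events =
      TestStream.create(StringUtf8Coder.of())
          .addElements("a", "b")
          .advanceWatermarkToInfinity();

  // Local StreamingContext with a 1-second batch interval (illustrative values).
  SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("test-dstream-sketch");
  StreamingContext ssc = new StreamingContext(conf, Durations.seconds(1));

  // Wrap the TestStream as an InputDStream of WindowedValue<String>.
  TestDStream<String> stream = new TestDStream<>(events, ssc);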
Nested Class Summary
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging
org.apache.spark.internal.Logging.SparkShellLoggingFilter
Constructor Summary

TestDStream(TestStream<T> test, org.apache.spark.streaming.StreamingContext ssc)
Method Summary

scala.Option<org.apache.spark.rdd.RDD<WindowedValue<T>>> compute(org.apache.spark.streaming.Time validTime)
void start()
void stop()
Methods inherited from class org.apache.spark.streaming.dstream.InputDStream
baseScope, dependencies, id, isTimeValid, lastValidTime, lastValidTime_$eq, name, rateController, slideDuration
Methods inherited from class org.apache.spark.streaming.dstream.DStream
cache, checkpoint, checkpointData, checkpointDuration, checkpointDuration_$eq, clearCheckpointData, clearMetadata, context, count, countByValue, countByValue$default$1, countByValue$default$2, countByValueAndWindow, countByValueAndWindow$default$3, countByValueAndWindow$default$4, countByWindow, createRDDWithLocalProperties, creationSite, filter, flatMap, foreachRDD, foreachRDD, generatedRDDs, generatedRDDs_$eq, generateJob, getOrCompute, glom, graph, graph_$eq, initialize, initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isInitialized, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, map, mapPartitions, mapPartitions$default$2, mustCheckpoint, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, parentRememberDuration, persist, persist, print, print, reduce, reduceByWindow, reduceByWindow, register, remember, rememberDuration, rememberDuration_$eq, repartition, restoreCheckpointData, saveAsObjectFiles, saveAsObjectFiles$default$2, saveAsTextFiles, saveAsTextFiles$default$2, setContext, setGraph, slice, slice, ssc, ssc_$eq, storageLevel, storageLevel_$eq, toPairDStreamFunctions, toPairDStreamFunctions$default$4, transform, transform, transformWith, transformWith, union, updateCheckpointData, validateAtStart, window, window, zeroTime, zeroTime_$eq
Constructor Details

TestDStream
public TestDStream(TestStream<T> test, org.apache.spark.streaming.StreamingContext ssc)
Method Details

compute
public scala.Option<org.apache.spark.rdd.RDD<WindowedValue<T>>> compute(org.apache.spark.streaming.Time validTime)
Specified by:
  compute in class org.apache.spark.streaming.dstream.DStream<WindowedValue<T>>
start
public void start()
Specified by:
  start in class org.apache.spark.streaming.dstream.InputDStream<WindowedValue<T>>
-
stop
public void stop()- Specified by:
stop
in classorg.apache.spark.streaming.dstream.InputDStream<WindowedValue<T>>
-
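start() and stop() are lifecycle callbacks invoked by Spark Streaming itself, and compute(validTime) is called once per batch interval after the context is started. A hedged sketch of driving that lifecycle, continuing the construction example above; print(), start(), awaitTerminationOrTimeout(), and stop() are standard Spark Streaming / DStream calls rather than anything specific to TestDStream, and driving a TestDStream this way outside the Spark runner's own tests is an assumption:

  // Continuing the sketch above: attach an output operation so the stream is materialized,
  // then let the StreamingContext drive start(), compute(validTime) per batch, and stop().
  stream.print();                        // inherited from DStream: print a few elements per batch
  ssc.start();                           // Spark calls start() on each registered input stream
  ssc.awaitTerminationOrTimeout(5000);   // run for up to 5 seconds (illustrative)
  ssc.stop(true, true);                  // stop gracefully; Spark calls stop() on input streams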