class JavaStreamingSuiteBase extends JavaSuiteBase with StreamingSuiteCommon
This is the base class for Spark Streaming test suites. It provides basic functionality to run a user-defined set of inputs through user-defined stream operations and to verify that the output matches the expected output. This implementation is designed to work with JUnit for Java users.
Note: this always uses the manual clock to control Spark Streaming's batches.
Supertypes (by inheritance):
- JavaStreamingSuiteBase
- StreamingSuiteCommon
- Logging
- JavaSuiteBase
- SharedJavaSparkContext
- SparkContextProvider
- AnyRef
- Any
Instance Constructors
- new JavaStreamingSuiteBase()
Value Members
- final def !=(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def ##(): Int
  Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- def actuallyWait: Boolean
  Definition Classes: StreamingSuiteCommon
- def appID: String
  Definition Classes: SparkContextProvider
- def appName: String
  Definition Classes: SparkContextProvider
- final def asInstanceOf[T0]: T0
  Definition Classes: Any
- def batchDuration: Duration
  Definition Classes: StreamingSuiteCommon
- def beforeAllTestCasesHook(): Unit
  Hooks for setup code that needs to be executed/torn down in order with SparkContexts.
  Attributes: protected[testing]
  Definition Classes: SharedJavaSparkContext
- lazy val checkpointDir: String
  Definition Classes: StreamingSuiteCommon
- def clone(): AnyRef
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native() @IntrinsicCandidate()
- def compareArrays[U](i1: Array[U], i2: Array[U], sorted: Boolean = false)(implicit arg0: ClassTag[U]): Unit
  Utility wrapper around assertArrayEquals that resolves the types.
  Definition Classes: JavaSuiteBase
- def conf: SparkConf
  Definition Classes: JavaStreamingSuiteBase → StreamingSuiteCommon → SparkContextProvider
- def copyAndSort[U](array: Array[U])(implicit arg0: ClassTag[U]): Array[U]
  Definition Classes: JavaSuiteBase
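As a rough illustration of what a copy-and-sort helper like `copyAndSort` does before an order-insensitive `assertArrayEquals`-style check, here is a plain-Java sketch with no Spark or JUnit dependency. `CopyAndSortSketch` is a hypothetical stand-in, not the library source; the real Scala method uses a `ClassTag` rather than the `Comparable` bound used here for plain-Java sorting.

```java
import java.util.Arrays;

public class CopyAndSortSketch {
    // Hypothetical stand-in for JavaSuiteBase.copyAndSort: returns a sorted
    // copy of the input, leaving the original array untouched, so the caller
    // can compare two arrays without caring about element order.
    public static <U extends Comparable<U>> U[] copyAndSort(U[] array) {
        U[] copy = Arrays.copyOf(array, array.length); // defensive copy
        Arrays.sort(copy);                              // natural ordering
        return copy;
    }

    public static void main(String[] args) {
        Integer[] input = {3, 1, 2};
        Integer[] sorted = copyAndSort(input);
        System.out.println(Arrays.toString(input));  // prints [3, 1, 2] - original unchanged
        System.out.println(Arrays.toString(sorted)); // prints [1, 2, 3]
    }
}
```

Sorting copies of both arrays and then comparing element-wise is one common way an order-insensitive comparison such as `compareArrays` can be implemented.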
- final def eq(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- val eventuallyTimeout: Timeout
  Definition Classes: StreamingSuiteCommon
- def framework: String
  Definition Classes: StreamingSuiteCommon
- final def getClass(): Class[_]
  Definition Classes: AnyRef → Any
  Annotations: @native() @IntrinsicCandidate()
- def hashCode(): Int
  Definition Classes: AnyRef → Any
  Annotations: @native() @IntrinsicCandidate()
- def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
  Attributes: protected
  Definition Classes: Logging
- def initializeLogIfNecessary(isInterpreter: Boolean): Unit
  Attributes: protected
  Definition Classes: Logging
- final def isInstanceOf[T0]: Boolean
  Definition Classes: Any
- def isTraceEnabled(): Boolean
  Attributes: protected
  Definition Classes: Logging
- def jsc(): JavaSparkContext
  Definition Classes: SharedJavaSparkContext
- def log: Logger
  Attributes: protected
  Definition Classes: Logging
- def logDebug(msg: ⇒ String, throwable: Throwable): Unit
  Attributes: protected
  Definition Classes: Logging
- def logDebug(msg: ⇒ String): Unit
  Attributes: protected
  Definition Classes: Logging
- def logError(msg: ⇒ String, throwable: Throwable): Unit
  Attributes: protected
  Definition Classes: Logging
- def logError(msg: ⇒ String): Unit
  Attributes: protected
  Definition Classes: Logging
- def logInfo(msg: ⇒ String, throwable: Throwable): Unit
  Attributes: protected
  Definition Classes: Logging
- def logInfo(msg: ⇒ String): Unit
  Attributes: protected
  Definition Classes: Logging
- def logName: String
  Attributes: protected
  Definition Classes: Logging
- def logTrace(msg: ⇒ String, throwable: Throwable): Unit
  Attributes: protected
  Definition Classes: Logging
- def logTrace(msg: ⇒ String): Unit
  Attributes: protected
  Definition Classes: Logging
- def logWarning(msg: ⇒ String, throwable: Throwable): Unit
  Attributes: protected
  Definition Classes: Logging
- def logWarning(msg: ⇒ String): Unit
  Attributes: protected
  Definition Classes: Logging
- def master: String
  Definition Classes: StreamingSuiteCommon
- def maxWaitTimeMillis: Int
  Definition Classes: StreamingSuiteCommon
- final def ne(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- final def notify(): Unit
  Definition Classes: AnyRef
  Annotations: @native() @IntrinsicCandidate()
- final def notifyAll(): Unit
  Definition Classes: AnyRef
  Annotations: @native() @IntrinsicCandidate()
- def numInputPartitions: Int
  Definition Classes: StreamingSuiteCommon
- def runBefore(): Unit
  Definition Classes: SharedJavaSparkContext
  Annotations: @Before()
- def sc(): SparkContext
  Definition Classes: SharedJavaSparkContext → SparkContextProvider
- def setup(sc: SparkContext): Unit
  Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory. This _should_ be called by the context provider automatically.
  Definition Classes: SparkContextProvider
- final def synchronized[T0](arg0: ⇒ T0): T0
  Definition Classes: AnyRef
- def testOperation[U, V, W](input1: List[List[U]], input2: List[List[V]], operation: Function2[JavaDStream[U], JavaDStream[V], JavaDStream[W]], expectedOutput: List[List[W]], ordered: Boolean): Unit
  Tests a binary DStream operation with two lists of inputs; the number of batches run equals the number of input batches. The two input lists should be the same size. Each input micro-batch is a list of values, or null to simulate an empty batch.
  - input1: First sequence of input collections
  - input2: Second sequence of input collections
  - operation: Binary DStream operation to be applied to the two inputs
  - expectedOutput: Sequence of expected output collections
  - ordered: Whether output values are compared with the expected values within the same output batch as ordered or unordered collections. Unordered comparison may not work well for doubles.
- def testOperation[U, V, W](input1: List[List[U]], input2: List[List[V]], operation: Function2[JavaDStream[U], JavaDStream[V], JavaDStream[W]], expectedOutput: List[List[W]]): Unit
  Tests a binary DStream operation with two lists of inputs; the number of batches run equals the number of input batches. The two input lists should be the same size. Each input micro-batch is a list of values, or null to simulate an empty batch.
  - input1: First sequence of input collections
  - input2: Second sequence of input collections
  - operation: Binary DStream operation to be applied to the two inputs
  - expectedOutput: Sequence of expected output collections
- def testOperation[U, V](input: List[List[U]], operation: Function[JavaDStream[U], JavaDStream[V]], expectedOutput: List[List[V]], ordered: Boolean): Unit
  Tests a unary DStream operation with a list of inputs; the number of batches run equals the number of input batches. Each input micro-batch is a list of values, or null to simulate an empty batch.
  - input: Sequence of input collections
  - operation: Unary DStream operation to be applied to the input
  - expectedOutput: Sequence of expected output collections
  - ordered: Whether output values are compared with the expected values within the same output batch as ordered or unordered collections. Unordered comparison may not work well for doubles.
- def testOperation[U, V](input: List[List[U]], operation: Function[JavaDStream[U], JavaDStream[V]], expectedOutput: List[List[V]]): Unit
  Tests a unary DStream operation with a list of inputs; the number of batches run equals the number of input batches. Each input micro-batch is a list of values, or null to simulate an empty batch.
  - input: Sequence of input collections
  - operation: Unary DStream operation to be applied to the input
  - expectedOutput: Sequence of expected output collections
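The per-batch contract described above can be sketched in plain Java without any Spark dependency. `runBatches` below is a hypothetical model, not the library implementation: it applies an operation to one batch at a time (standing in for a DStream transformation), runs exactly as many batches as there are input batches, and treats a null input batch as an empty one.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.Function;

public class TestOperationSketch {
    // Hypothetical model of the testOperation batch semantics: one output
    // batch per input batch, with null input batches treated as empty.
    public static <U, V> List<List<V>> runBatches(List<List<U>> input,
                                                  Function<List<U>, List<V>> operation) {
        List<List<V>> output = new ArrayList<>();
        for (List<U> batch : input) {
            List<U> b = (batch == null) ? Collections.emptyList() : batch;
            output.add(operation.apply(b));
        }
        return output;
    }

    public static void main(String[] args) {
        List<List<Integer>> input = new ArrayList<>();
        input.add(List.of(1, 2));
        input.add(null); // simulates an empty micro-batch
        Function<List<Integer>, List<Integer>> doubled =
            batch -> batch.stream().map(x -> x * 2).toList();
        System.out.println(runBatches(input, doubled)); // prints [[2, 4], []]
    }
}
```

In the real suite the operation acts on whole JavaDStreams and the batches are driven by the manual clock; this sketch only illustrates how the input lists map to output batches.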
- def toString(): String
  Definition Classes: AnyRef → Any
- def useManualClock: Boolean
  Definition Classes: StreamingSuiteCommon
- def verifyOutput[V](output: Seq[Seq[V]], expectedOutput: Seq[Seq[V]], ordered: Boolean)(implicit arg0: ClassTag[V]): Unit
  Verifies that the output values produced by running a DStream operation are the same as the expected output values, comparing each output collection either as a list (order matters) or as a set (order does not matter).
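The ordered-versus-unordered comparison that `verifyOutput` performs per batch can be modeled in plain Java. `batchesMatch` below is a hypothetical sketch, not the library code: when `ordered` is true it compares the batches as equal lists, and when false it ignores order within a batch by sorting copies first (a multiset-style comparison rather than literal sets).

```java
import java.util.ArrayList;
import java.util.List;

public class VerifyOutputSketch {
    // Hypothetical model of verifyOutput's per-batch comparison:
    // ordered  -> batches must match as lists, element by element;
    // !ordered -> element order within a batch is ignored.
    public static <V extends Comparable<V>> boolean batchesMatch(List<List<V>> output,
                                                                 List<List<V>> expected,
                                                                 boolean ordered) {
        if (output.size() != expected.size()) return false;
        for (int i = 0; i < output.size(); i++) {
            List<V> a = new ArrayList<>(output.get(i));
            List<V> b = new ArrayList<>(expected.get(i));
            if (!ordered) {   // ignore order within the batch
                a.sort(null); // natural ordering
                b.sort(null);
            }
            if (!a.equals(b)) return false;
        }
        return true;
    }
}
```

This is also why the docs caution against unordered comparison of doubles: order-insensitive matching still relies on exact equality of the values being paired up.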
- final def wait(arg0: Long, arg1: Int): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native()
- final def wait(): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )

Deprecated Value Members
- def finalize(): Unit
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated
  Deprecated