trait StreamingSuiteBase extends BeforeAndAfterAll with Logging with StreamingSuiteCommon with SharedSparkContext
This is the base trait for Spark Streaming test suites. It provides basic functionality to run a user-defined set of inputs through user-defined stream operations and verify the output.
- Self Type
- StreamingSuiteBase with Suite
- By Inheritance
- StreamingSuiteBase
- SharedSparkContext
- StreamingSuiteCommon
- SparkContextProvider
- Logging
- Logging
- BeforeAndAfterAll
- SuiteMixin
- AnyRef
- Any
Abstract Value Members
-
abstract
def
expectedTestCount(filter: Filter): Int
- Definition Classes
- SuiteMixin
-
abstract
def
nestedSuites: IndexedSeq[Suite]
- Definition Classes
- SuiteMixin
-
abstract
def
rerunner: Option[String]
- Definition Classes
- SuiteMixin
-
abstract
def
runNestedSuites(args: Args): Status
- Attributes
- protected
- Definition Classes
- SuiteMixin
-
abstract
def
runTest(testName: String, args: Args): Status
- Attributes
- protected
- Definition Classes
- SuiteMixin
-
abstract
def
runTests(testName: Option[String], args: Args): Status
- Attributes
- protected
- Definition Classes
- SuiteMixin
-
abstract
def
suiteId: String
- Definition Classes
- SuiteMixin
-
abstract
def
suiteName: String
- Definition Classes
- SuiteMixin
-
abstract
def
tags: Map[String, Set[String]]
- Definition Classes
- SuiteMixin
-
abstract
def
testDataFor(testName: String, theConfigMap: ConfigMap): TestData
- Definition Classes
- SuiteMixin
-
abstract
def
testNames: Set[String]
- Definition Classes
- SuiteMixin
-
abstract
val
styleName: String
- Definition Classes
- SuiteMixin
- Annotations
- @deprecated
- Deprecated
(Since version 3.1.0) The styleName lifecycle method has been deprecated and will be removed in a future version of ScalaTest with no replacement.
Concrete Value Members
-
final
def
!=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
##(): Int
- Definition Classes
- AnyRef → Any
-
final
def
==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
actuallyWait: Boolean
- Definition Classes
- StreamingSuiteCommon
-
def
afterAll(): Unit
- Definition Classes
- StreamingSuiteBase → SharedSparkContext → BeforeAndAfterAll
-
def
appID: String
- Definition Classes
- SparkContextProvider
-
def
appName: String
- Definition Classes
- SparkContextProvider
-
final
def
asInstanceOf[T0]: T0
- Definition Classes
- Any
-
def
batchDuration: Duration
- Definition Classes
- StreamingSuiteCommon
-
def
beforeAll(): Unit
- Definition Classes
- StreamingSuiteBase → SharedSparkContext → BeforeAndAfterAll
-
lazy val
checkpointDir: String
- Definition Classes
- StreamingSuiteCommon
-
def
clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native() @IntrinsicCandidate()
-
def
conf: SparkConf
- Definition Classes
- StreamingSuiteCommon → SparkContextProvider
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
def
equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
val
eventuallyTimeout: Timeout
- Definition Classes
- StreamingSuiteCommon
-
def
framework: String
- Definition Classes
- StreamingSuiteCommon
-
final
def
getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @IntrinsicCandidate()
-
def
hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @IntrinsicCandidate()
-
def
initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
- Attributes
- protected
- Definition Classes
- Logging
-
def
initializeLogIfNecessary(isInterpreter: Boolean): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
val
invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected: Boolean
- Definition Classes
- BeforeAndAfterAll
-
final
def
isInstanceOf[T0]: Boolean
- Definition Classes
- Any
-
def
isTraceEnabled(): Boolean
- Attributes
- protected
- Definition Classes
- Logging
-
def
log: Logger
- Attributes
- protected
- Definition Classes
- Logging
-
def
logDebug(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logDebug(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logError(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logError(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logInfo(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logInfo(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logName: String
- Attributes
- protected
- Definition Classes
- Logging
-
def
logTrace(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logTrace(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logWarning(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logWarning(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
master: String
- Definition Classes
- StreamingSuiteCommon
-
def
maxWaitTimeMillis: Int
- Definition Classes
- StreamingSuiteCommon
-
final
def
ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
final
def
notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @IntrinsicCandidate()
-
final
def
notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @IntrinsicCandidate()
-
def
numInputPartitions: Int
- Definition Classes
- StreamingSuiteCommon
-
implicit
def
reuseContextIfPossible: Boolean
- Attributes
- protected
- Definition Classes
- SharedSparkContext
-
def
run(testName: Option[String], args: Args): Status
- Definition Classes
- BeforeAndAfterAll → SuiteMixin
-
def
sc: SparkContext
- Definition Classes
- SharedSparkContext → SparkContextProvider
-
def
setup(sc: SparkContext): Unit
Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory.
This _should_ be called by the context provider automatically.
- Definition Classes
- SparkContextProvider
-
final
def
synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
-
def
testOperation[U, V, W](input1: Seq[Seq[U]], input2: Seq[Seq[V]], operation: (DStream[U], DStream[V]) ⇒ DStream[W], expectedOutput: Seq[Seq[W]], ordered: Boolean)(implicit arg0: ClassTag[U], arg1: ClassTag[V], arg2: ClassTag[W], equality: Equality[W]): Unit
Test a binary DStream operation with two lists of inputs; the number of batches run equals the number of input values. The two input lists should be the same size. Each input micro-batch is a list of values, or null to simulate an empty batch.
- input1
First sequence of input collections
- input2
Second sequence of input collections
- operation
Binary DStream operation to be applied to the 2 inputs
- expectedOutput
Sequence of expected output collections
- ordered
Whether to compare output values with expected output values within the same output batch as ordered lists or as unordered sets. Comparing doubles may not work well when unordered.
-
def
testOperation[U, V](input: Seq[Seq[U]], operation: (DStream[U]) ⇒ DStream[V], expectedOutput: Seq[Seq[V]], ordered: Boolean)(implicit arg0: ClassTag[U], arg1: ClassTag[V], equality: Equality[V]): Unit
Test a unary DStream operation with a list of inputs; the number of batches run equals the number of input values. Each input micro-batch is a list of values, or null to simulate an empty batch.
- input
Sequence of input collections
- operation
Unary DStream operation to be applied to the input
- expectedOutput
Sequence of expected output collections
- ordered
Whether to compare output values with expected output values within the same output batch as ordered lists or as unordered sets. Comparing doubles may not work well when unordered.
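A minimal sketch of the unary overload, assuming a hypothetical ScalaTest suite (names and data are illustrative):

```scala
import org.apache.spark.streaming.dstream.DStream
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical word-count suite; each inner List is one micro-batch of input.
class CountSuite extends AnyFunSuite with StreamingSuiteBase {
  test("count words per batch") {
    val input = List(List("hi", "hi", "bye"), List("hi"))
    val expected = List(List(("hi", 2), ("bye", 1)), List(("hi", 1)))
    val count = (lines: DStream[String]) =>
      lines.map(w => (w, 1)).reduceByKey(_ + _)
    testOperation(input, count, expected, ordered = false)
  }
}
```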
- def testOperation[U, V, W](input1: Seq[Seq[U]], input2: Seq[Seq[V]], operation: (DStream[U], DStream[V]) ⇒ DStream[W], expectedOutput: Seq[Seq[W]])(implicit arg0: ClassTag[U], arg1: ClassTag[V], arg2: ClassTag[W], equality: Equality[W]): Unit
- def testOperation[U, V](input: Seq[Seq[U]], operation: (DStream[U]) ⇒ DStream[V], expectedOutput: Seq[Seq[V]])(implicit arg0: ClassTag[U], arg1: ClassTag[V], equality: Equality[V]): Unit
-
def
testOperationWithRDD[U, V, W](input1: Seq[Seq[U]], input2: Seq[V], operation: (DStream[U], RDD[V]) ⇒ DStream[W], expectedOutput: Seq[Seq[W]], ordered: Boolean)(implicit arg0: ClassTag[U], arg1: ClassTag[V], arg2: ClassTag[W], equality: Equality[W]): Unit
Test a binary DStream and RDD operation with two lists of inputs; the number of batches run equals the number of input values corresponding to the DStream. Each input micro-batch is a list of values, or null to simulate an empty batch.
- input1
Sequence of input collections corresponding to the DStream
- input2
Sequence of input values corresponding to the RDD
- operation
Binary DStream and RDD operation to be applied to the 2 inputs
- expectedOutput
Sequence of expected output collections
- ordered
Whether to compare output values with expected output values within the same output batch as ordered lists or as unordered sets. Comparing doubles may not work well when unordered.
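A minimal sketch of `testOperationWithRDD`, joining a stream against a static RDD via `transform` (the suite class and data are illustrative assumptions):

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.DStream
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical suite: the RDD input is a single static collection, reused for every batch.
class JoinSuite extends AnyFunSuite with StreamingSuiteBase {
  test("join stream with static RDD") {
    val streamInput = List(List(("a", 1)), List(("b", 2)))
    val rddInput = List(("a", "alpha"), ("b", "beta"))
    val expected = List(List(("a", (1, "alpha"))), List(("b", (2, "beta"))))
    val joinOp = (s: DStream[(String, Int)], r: RDD[(String, String)]) =>
      s.transform(_.join(r))
    testOperationWithRDD(streamInput, rddInput, joinOp, expected, ordered = false)
  }
}
```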
-
def
toString(): String
- Definition Classes
- AnyRef → Any
-
def
useManualClock: Boolean
- Definition Classes
- StreamingSuiteCommon
-
def
verifyOutput[V](output: Seq[Seq[V]], expectedOutput: Seq[Seq[V]], ordered: Boolean)(implicit arg0: ClassTag[V], equality: Equality[V]): Unit
Verify that the output values produced by running a DStream operation match the expected output values, comparing the output collections either as lists (order matters) or as sets (order does not matter).
- ordered
Whether to compare output values with expected output values within the same output batch as ordered lists or as unordered sets. Comparing doubles may not work well when unordered.
-
final
def
wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
-
final
def
wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
Deprecated Value Members
-
def
finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] ) @Deprecated
- Deprecated