trait DatasetSuiteBase extends DataFrameSuiteBase with DatasetSuiteBaseLike
- Self Type
- DatasetSuiteBase with Suite
- Linear Supertypes
- DatasetSuiteBase
- DatasetSuiteBaseLike
- DataFrameSuiteBase
- DataFrameSuiteBaseLike
- Serializable
- Serializable
- SharedSparkContext
- SparkContextProvider
- BeforeAndAfterAll
- SuiteMixin
- TestSuite
- TestSuiteLike
- AnyRef
- Any
Abstract Value Members
-
abstract
def
expectedTestCount(filter: Filter): Int
- Definition Classes
- SuiteMixin
-
abstract
def
nestedSuites: IndexedSeq[Suite]
- Definition Classes
- SuiteMixin
-
abstract
def
rerunner: Option[String]
- Definition Classes
- SuiteMixin
-
abstract
def
runNestedSuites(args: Args): Status
- Attributes
- protected
- Definition Classes
- SuiteMixin
-
abstract
def
runTest(testName: String, args: Args): Status
- Attributes
- protected
- Definition Classes
- SuiteMixin
-
abstract
def
runTests(testName: Option[String], args: Args): Status
- Attributes
- protected
- Definition Classes
- SuiteMixin
-
abstract
def
suiteId: String
- Definition Classes
- SuiteMixin
-
abstract
def
suiteName: String
- Definition Classes
- SuiteMixin
-
abstract
def
tags: Map[String, Set[String]]
- Definition Classes
- SuiteMixin
-
abstract
def
testDataFor(testName: String, theConfigMap: ConfigMap): TestData
- Definition Classes
- SuiteMixin
-
abstract
def
testNames: Set[String]
- Definition Classes
- SuiteMixin
-
abstract
val
styleName: String
- Definition Classes
- SuiteMixin
- Annotations
- @deprecated
- Deprecated
(Since version 3.1.0) The styleName lifecycle method has been deprecated and will be removed in a future version of ScalaTest with no replacement.
Concrete Value Members
-
final
def
!=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
##(): Int
- Definition Classes
- AnyRef → Any
-
final
def
==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
afterAll(): Unit
- Definition Classes
- DataFrameSuiteBase → SharedSparkContext → BeforeAndAfterAll
-
def
appID: String
- Definition Classes
- SparkContextProvider
-
def
appName: String
- Definition Classes
- SparkContextProvider
-
def
approxEquals(r1: Row, r2: Row, tol: Double, tolTimestamp: Duration): Boolean
- Definition Classes
- DataFrameSuiteBaseLike
-
def
approxEquals(r1: Row, r2: Row, tolTimestamp: Duration): Boolean
- Definition Classes
- DataFrameSuiteBaseLike
-
def
approxEquals(r1: Row, r2: Row, tol: Double): Boolean
- Definition Classes
- DataFrameSuiteBaseLike
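The numeric overloads of approxEquals treat two inexact values as equal when they differ by at most tol. A minimal plain-Scala sketch of that semantics (illustrative only, not the library's implementation; the object and method names here are hypothetical, and Rows are simplified to sequences of doubles):

```scala
object ApproxEquals {
  // Two doubles are "approximately equal" when their absolute difference
  // is within tol; two NaNs are treated as equal, as in Row comparison.
  def approxEqual(a: Double, b: Double, tol: Double): Boolean =
    if (a.isNaN && b.isNaN) true
    else math.abs(a - b) <= tol

  // Field-by-field comparison over two "rows" of doubles.
  def rowsApproxEqual(r1: Seq[Double], r2: Seq[Double], tol: Double): Boolean =
    r1.length == r2.length &&
      r1.zip(r2).forall { case (a, b) => approxEqual(a, b, tol) }
}
```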
-
final
def
asInstanceOf[T0]: T0
- Definition Classes
- Any
-
def
assert[U](message: String, expected: U, actual: U)(implicit CT: ClassTag[U]): Unit
- Definition Classes
- TestSuite → TestSuiteLike
-
def
assert[U](expected: U, actual: U)(implicit CT: ClassTag[U]): Unit
- Definition Classes
- TestSuite → TestSuiteLike
-
def
assertDataFrameApproximateEquals(expected: DataFrame, result: DataFrame, tol: Double, tolTimestamp: Duration, customShow: (DataFrame) ⇒ Unit = _.show()): Unit
Checks that two DataFrames are equal, verifying that the schemas are the same. When comparing inexact fields, uses tol and tolTimestamp.
- tol
max acceptable numeric tolerance, should be less than 1.
- tolTimestamp
max acceptable timestamp tolerance.
- customShow
unit function to customize the show method when DataFrames are not equal, e.g. df.show(false) or df.toJSON.show(false).
- Definition Classes
- DataFrameSuiteBaseLike
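For timestamp fields, tolTimestamp bounds the allowed difference between the two values as a Duration. A plain-Scala sketch of that check, assuming java.time types (the object name is illustrative, not part of the library):

```scala
import java.time.{Duration, Instant}

object TimestampTolerance {
  // Two timestamps are considered equal when the absolute difference
  // between them does not exceed tolTimestamp.
  def within(t1: Instant, t2: Instant, tolTimestamp: Duration): Boolean =
    Duration.between(t1, t2).abs.compareTo(tolTimestamp) <= 0
}
```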
-
def
assertDataFrameDataEquals(expected: DataFrame, result: DataFrame): Unit
Checks that two DataFrames are equal without regard to row order, by finding elements in one DataFrame that are not in the other. The resulting DataFrame should be empty, implying the two DataFrames have the same elements. Does not compare the schema.
- Definition Classes
- DataFrameSuiteBaseLike
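The order-insensitive comparison amounts to checking that the symmetric difference of the two row sets is empty, while still respecting duplicate rows. A plain-Scala sketch of that idea over in-memory sequences (illustrative names, not the library's code):

```scala
object UnorderedEquals {
  // Build a multiset (value -> count) so duplicate rows are respected.
  private def counts[A](xs: Seq[A]): Map[A, Int] =
    xs.groupBy(identity).map { case (k, v) => k -> v.size }

  // Equal multisets means no element of one side is missing from the
  // other -- the same idea as diffing two DataFrames and expecting an
  // empty result.
  def sameElements[A](left: Seq[A], right: Seq[A]): Boolean =
    counts(left) == counts(right)
}
```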
-
def
assertDataFrameEquals(expected: DataFrame, result: DataFrame, customShow: (DataFrame) ⇒ Unit = _.show()): Unit
Checks that two DataFrames are equal: compares the schemas and, if they match, checks that the rows are equal.
- customShow
unit function to customize the show method when DataFrames are not equal, e.g. df.show(false) or df.toJSON.show(false).
- Definition Classes
- DataFrameSuiteBaseLike
-
def
assertDataFrameNoOrderEquals(expected: DataFrame, result: DataFrame): Unit
Checks that two DataFrames are equal without regard to row order, by finding elements in one DataFrame that are not in the other. The resulting DataFrame should be empty, implying the two DataFrames have the same elements. Also verifies that the schemas are identical.
- Definition Classes
- DataFrameSuiteBaseLike
-
def
assertDatasetApproximateEquals[U](expected: Dataset[U], result: Dataset[U], tol: Double, tolTimestamp: Duration, customShow: (DataFrame) ⇒ Unit = _.show())(implicit UCT: ClassTag[U]): Unit
Checks that two Datasets are equal; the Datasets should have the same type. When comparing inexact fields, uses tol and tolTimestamp.
- tol
max acceptable numeric tolerance, in the range (0, 1).
- tolTimestamp
max acceptable timestamp tolerance.
- customShow
unit function to customize the show method when DataFrames are not equal, e.g. df.show(false) or df.toJSON.show(false).
- Definition Classes
- DatasetSuiteBaseLike
-
def
assertDatasetEquals[U](expected: Dataset[U], result: Dataset[U])(implicit UCT: ClassTag[U]): Unit
Checks that two Datasets are equal; the Datasets should have the same type. The comparison can be customized by overriding the equals method of the element class.
- Definition Classes
- DatasetSuiteBaseLike
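Because the element-wise comparison uses equals, overriding equals on the element type is the customization point the doc refers to. A plain-Scala sketch of that pattern (the Price class and datasetsEqual helper are illustrative, standing in for a Dataset's element type and the element-wise check):

```scala
object DatasetEqualsSketch {
  // Element type whose equality ignores sub-unit differences in cents.
  // Demo only: overriding equals without a matching hashCode would
  // break hashing-based operations in real code.
  final case class Price(cents: Double) {
    override def equals(other: Any): Boolean = other match {
      case Price(c) => math.abs(c - cents) < 1.0
      case _        => false
    }
  }

  // Stand-in for an element-wise Dataset comparison using equals.
  def datasetsEqual[A](expected: Seq[A], result: Seq[A]): Boolean =
    expected.length == result.length &&
      expected.zip(result).forall { case (a, b) => a == b }
}
```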
-
def
assertEmpty[U](arr: Array[U])(implicit CT: ClassTag[U]): Unit
- Definition Classes
- TestSuite → TestSuiteLike
-
def
assertSchemasEqual(expected: StructType, result: StructType): Unit
Compares two schemas for equality, ignoring autoGeneratedAlias magic.
- Definition Classes
- DataFrameSuiteBaseLike
-
def
assertSmallDataFrameDataEquals(expected: DataFrame, result: DataFrame): Unit
Checks that two DataFrames are equal without regard to row order, by finding elements in one DataFrame that are not in the other. Similar to assertDataFrameDataEquals, but for small DataFrames that can be collected in memory for the comparison.
- Definition Classes
- DataFrameSuiteBaseLike
-
def
assertTrue(expected: Boolean): Unit
- Definition Classes
- TestSuite → TestSuiteLike
-
def
beforeAll(): Unit
- Definition Classes
- DataFrameSuiteBase → SharedSparkContext → BeforeAndAfterAll
-
def
builder(): Builder
Constructs a configuration for Hive or Iceberg, with the metastore located in a temporary directory.
- Definition Classes
- DataFrameSuiteBaseLike
-
def
clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native() @IntrinsicCandidate()
-
def
conf: SparkConf
- Definition Classes
- SparkContextProvider
-
def
enableHiveSupport: Boolean
- Attributes
- protected
- Definition Classes
- DataFrameSuiteBaseLike
-
def
enableIcebergSupport: Boolean
- Attributes
- protected
- Definition Classes
- DataFrameSuiteBaseLike
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
def
equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
fail(message: String): Unit
- Definition Classes
- TestSuite → TestSuiteLike
-
final
def
getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @IntrinsicCandidate()
-
def
hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @IntrinsicCandidate()
-
val
icebergWarehouse: String
- Definition Classes
- DataFrameSuiteBaseLike
-
implicit
def
impSqlContext: SQLContext
- Attributes
- protected
- Definition Classes
- DataFrameSuiteBaseLike
-
val
invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected: Boolean
- Definition Classes
- BeforeAndAfterAll
-
final
def
isInstanceOf[T0]: Boolean
- Definition Classes
- Any
-
lazy val
localMetastorePath: String
- Definition Classes
- DataFrameSuiteBaseLike
-
lazy val
localWarehousePath: String
- Definition Classes
- DataFrameSuiteBaseLike
-
val
maxUnequalRowsToShow: Int
- Definition Classes
- DataFrameSuiteBaseLike
-
final
def
ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
final
def
notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @IntrinsicCandidate()
-
final
def
notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @IntrinsicCandidate()
-
implicit
def
reuseContextIfPossible: Boolean
- Attributes
- protected
- Definition Classes
- SharedSparkContext
-
def
run(testName: Option[String], args: Args): Status
- Definition Classes
- BeforeAndAfterAll → SuiteMixin
-
def
sc: SparkContext
- Definition Classes
- SharedSparkContext → SparkContextProvider
-
def
setup(sc: SparkContext): Unit
Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory.
This _should_ be called by the context provider automatically.
- Definition Classes
- SparkContextProvider
-
lazy val
spark: SparkSession
- Definition Classes
- DataFrameSuiteBaseLike
- Annotations
- @transient()
-
def
sqlBeforeAllTestCases(): Unit
- Definition Classes
- DataFrameSuiteBaseLike
-
lazy val
sqlContext: SQLContext
- Definition Classes
- DataFrameSuiteBaseLike
- Annotations
- @transient()
-
final
def
synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
-
lazy val
tempDir: File
- Definition Classes
- DataFrameSuiteBaseLike
-
def
toString(): String
- Definition Classes
- AnyRef → Any
-
final
def
wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
-
final
def
wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
def
withSQLConf(pairs: (String, String)*)(f: ⇒ Unit): Unit
Sets all SQL configurations specified in pairs, calls f, and then restores all SQL configurations. Taken from Spark's SQLHelper.
- Attributes
- protected
- Definition Classes
- DataFrameSuiteBase
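The save/set/restore shape of withSQLConf can be sketched in plain Scala, with a mutable map standing in for the SQLConf store (the WithConfSketch object and its conf map are illustrative, not the library's implementation):

```scala
import scala.collection.mutable

object WithConfSketch {
  // A mutable map standing in for the SQLConf key/value store.
  val conf: mutable.Map[String, String] =
    mutable.Map("spark.sql.shuffle.partitions" -> "200")

  // Save the current values of the affected keys, apply the overrides,
  // run f, then restore -- even if f throws.
  def withConf(pairs: (String, String)*)(f: => Unit): Unit = {
    val saved = pairs.map { case (k, _) => k -> conf.get(k) }
    pairs.foreach { case (k, v) => conf(k) = v }
    try f
    finally saved.foreach {
      case (k, Some(v)) => conf(k) = v
      case (k, None)    => conf.remove(k)
    }
  }
}
```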
Deprecated Value Members
-
def
assertDataFrameApproximateEquals(expected: DataFrame, result: DataFrame, tol: Double): Unit
Checks that two DataFrames are equal, verifying that the schemas are the same. When comparing inexact fields, uses tol.
- tol
max acceptable tolerance for numeric fields (in the range (0, 1)) and timestamps (in milliseconds).
- Definition Classes
- DataFrameSuiteBaseLike
- Annotations
- @deprecated
- Deprecated
(Since version 1.5.0) Use assertDataFrameApproximateEquals with timestamp tolerance
-
def
assertDatasetApproximateEquals[U](expected: Dataset[U], result: Dataset[U], tol: Double)(implicit UCT: ClassTag[U]): Unit
Checks that two Datasets are equal; the Datasets should have the same type. When comparing inexact fields, uses tol.
- tol
max acceptable tolerance for numeric fields (in the range (0, 1)) and timestamps (in milliseconds).
- Definition Classes
- DatasetSuiteBaseLike
- Annotations
- @deprecated
- Deprecated
(Since version 1.5.0) Use assertDatasetApproximateEquals with timestamp tolerance
-
def
finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] ) @Deprecated
- Deprecated