trait DataFrameSuiteBaseLike extends SparkContextProvider with TestSuiteLike with Serializable
Linear Supertypes
- Serializable
- TestSuiteLike
- SparkContextProvider
- AnyRef
- Any
Abstract Value Members

- abstract def assert[U](message: String, expected: U, actual: U)(implicit CT: ClassTag[U]): Unit
  Definition Classes: TestSuiteLike
- abstract def assert[U](expected: U, actual: U)(implicit CT: ClassTag[U]): Unit
  Definition Classes: TestSuiteLike
- abstract def assertEmpty[U](arr: Array[U])(implicit CT: ClassTag[U]): Unit
  Definition Classes: TestSuiteLike
- abstract def assertTrue(expected: Boolean): Unit
  Definition Classes: TestSuiteLike
- abstract def fail(message: String): Unit
  Definition Classes: TestSuiteLike
- abstract def sc: SparkContext
  Definition Classes: SparkContextProvider
Concrete Value Members

- final def !=(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def ##(): Int
  Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- def appID: String
  Definition Classes: SparkContextProvider
- def approxEquals(r1: Row, r2: Row, tol: Double): Boolean
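approxEquals compares two Rows field by field, treating inexact (floating-point) fields as equal when they differ by no more than tol. A minimal sketch, assuming this trait is mixed into a suite so that assertTrue and approxEquals are in scope (the row values are illustrative):

```scala
import org.apache.spark.sql.Row

// Two rows that differ only in a floating-point field, by 1e-7.
val r1 = Row("a", 1.0)
val r2 = Row("a", 1.0000001)

assertTrue(approxEquals(r1, r2, 1e-6))  // difference within tol
assertTrue(!approxEquals(r1, r2, 1e-9)) // difference exceeds tol
```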
- final def asInstanceOf[T0]: T0
  Definition Classes: Any
- def assertDataFrameApproximateEquals(expected: DataFrame, result: DataFrame, tol: Double): Unit
  Compares two DataFrames for equality, checking that the schemas are the same. When comparing inexact fields, uses tol.
  tol: the maximum acceptable tolerance; should be less than 1.
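A minimal sketch of an approximate comparison, assuming a suite that mixes in this trait and uses its spark session (column names and values are illustrative):

```scala
import spark.implicits._

val expected = Seq(("a", 1.0000), ("b", 2.0000)).toDF("key", "value")
val result   = Seq(("a", 1.0001), ("b", 2.0001)).toDF("key", "value")

// Passes: each inexact field differs by less than tol.
assertDataFrameApproximateEquals(expected, result, tol = 0.001)
```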
- def assertDataFrameDataEquals(expected: DataFrame, result: DataFrame): Unit
  Compares two DataFrames for equality without regard to row order, by finding the elements in one DataFrame that are not in the other. The resulting DataFrame should be empty, implying that the two DataFrames contain the same elements. Does not compare the schemas.
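A minimal sketch of an order-insensitive comparison, assuming a suite that mixes in this trait (the data is illustrative):

```scala
import spark.implicits._

val expected = Seq(1, 2, 3).toDF("n")
val result   = Seq(3, 1, 2).toDF("n") // same rows, different order

// Passes: row order is ignored and the schemas are not compared.
assertDataFrameDataEquals(expected, result)
```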
- def assertDataFrameEquals(expected: DataFrame, result: DataFrame): Unit
  Compares two DataFrames for equality: checks the schemas and, if they match, checks that the rows are equal.
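A minimal sketch of an exact comparison, assuming a suite that mixes in this trait (the data is illustrative):

```scala
import spark.implicits._

val expected = Seq(("a", 1), ("b", 2)).toDF("key", "count")
val result   = Seq(("a", 1), ("b", 2)).toDF("key", "count")

// Passes only when the schemas match and the rows are equal, in order.
assertDataFrameEquals(expected, result)
```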
- def assertDataFrameNoOrderEquals(expected: DataFrame, result: DataFrame): Unit
  Compares two DataFrames for equality without regard to row order, by finding the elements in one DataFrame that are not in the other. The resulting DataFrame should be empty, implying that the two DataFrames contain the same elements. Also verifies that the schemas are identical.
- def assertSmallDataFrameDataEquals(expected: DataFrame, result: DataFrame): Unit
  Compares two DataFrames for equality without regard to row order, by finding the elements in one DataFrame that are not in the other. Similar to assertDataFrameDataEquals, but intended for small DataFrames that can be collected into memory for the comparison.
- def builder(): Builder
  Constructs a configuration for Hive or Iceberg, where the metastore is located in a temp directory.
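In practice, a test suite typically mixes in a concrete implementation of this trait rather than calling builder() directly. A hedged sketch, assuming spark-testing-base's DataFrameSuiteBase trait and a ScalaTest AnyFunSuite base (exact names depend on the library and ScalaTest versions):

```scala
import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.funsuite.AnyFunSuite

class ExampleSuite extends AnyFunSuite with DataFrameSuiteBase {

  test("DataFrames with identical schema and rows are equal") {
    import spark.implicits._
    val df = Seq(("a", 1), ("b", 2)).toDF("key", "count")
    assertDataFrameEquals(df, df)
  }
}
```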
- def clone(): AnyRef
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native() @HotSpotIntrinsicCandidate()
- def conf: SparkConf
  Definition Classes: SparkContextProvider
- def enableHiveSupport: Boolean
  Attributes: protected
- def enableIcebergSupport: Boolean
  Attributes: protected
- final def eq(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def getClass(): Class[_]
  Definition Classes: AnyRef → Any
  Annotations: @native() @HotSpotIntrinsicCandidate()
- def hashCode(): Int
  Definition Classes: AnyRef → Any
  Annotations: @native() @HotSpotIntrinsicCandidate()
- val icebergWarehouse: String
- implicit def impSqlContext: SQLContext
  Attributes: protected
- final def isInstanceOf[T0]: Boolean
  Definition Classes: Any
- lazy val localMetastorePath: String
- lazy val localWarehousePath: String
- val maxUnequalRowsToShow: Int
- final def ne(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- final def notify(): Unit
  Definition Classes: AnyRef
  Annotations: @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
  Definition Classes: AnyRef
  Annotations: @native() @HotSpotIntrinsicCandidate()
- def setup(sc: SparkContext): Unit
  Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory. This _should_ be called by the context provider automatically.
  Definition Classes: SparkContextProvider
- lazy val spark: SparkSession
  Annotations: @transient()
- def sqlBeforeAllTestCases(): Unit
- lazy val sqlContext: SQLContext
  Annotations: @transient()
- final def synchronized[T0](arg0: ⇒ T0): T0
  Definition Classes: AnyRef
- lazy val tempDir: File
- def toString(): String
  Definition Classes: AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native()
- final def wait(): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )
Deprecated Value Members

- def finalize(): Unit
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated
  Deprecated