
com.holdenkarau.spark.testing

JavaDatasetSuiteBase

class JavaDatasetSuiteBase extends JavaDataFrameSuiteBase with DatasetSuiteBaseLike with Serializable

Inherited
  1. JavaDatasetSuiteBase
  2. DatasetSuiteBaseLike
  3. JavaDataFrameSuiteBase
  4. JavaTestSuite
  5. DataFrameSuiteBaseLike
  6. Serializable
  7. Serializable
  8. TestSuiteLike
  9. SharedJavaSparkContext
  10. SparkContextProvider
  11. AnyRef
  12. Any

Instance Constructors

  1. new JavaDatasetSuiteBase()

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def appID: String
    Definition Classes
    SparkContextProvider
  5. def approxEquals(r1: Row, r2: Row, tol: Double): Boolean
    Definition Classes
    DataFrameSuiteBaseLike
  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def assert[U](message: String, expected: U, actual: U)(implicit CT: ClassTag[U]): Unit
    Definition Classes
    JavaTestSuite → TestSuiteLike
  8. def assert[U](expected: U, actual: U)(implicit CT: ClassTag[U]): Unit
    Definition Classes
    JavaTestSuite → TestSuiteLike
  9. def assertDataFrameApproximateEquals(expected: DataFrame, result: DataFrame, tol: Double): Unit

    Checks that two DataFrames are equal and that their schemas are the same. Inexact (floating-point) fields are compared using tol.

    tol

    maximum acceptable tolerance; should be less than 1.

    Definition Classes
    DataFrameSuiteBaseLike
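The tolerance semantics described above can be sketched in plain Java. This is an illustrative mirror of the documented behavior, not the library's internal implementation; the class and method names are hypothetical:

```java
// Sketch of tolerance-based comparison for inexact fields, assuming exact
// fields must match exactly while floating-point fields may differ by at
// most tol (as assertDataFrameApproximateEquals documents).
public class ApproxEqualsSketch {
    static boolean approxEquals(double a, double b, double tol) {
        // Treat NaN as equal to NaN so missing values do not spuriously fail.
        if (Double.isNaN(a) && Double.isNaN(b)) return true;
        // Otherwise the absolute difference must not exceed the tolerance.
        return Math.abs(a - b) <= tol;
    }
}
```

With `tol = 0.001`, values such as `1.0` and `1.0000001` compare equal, while `1.0` and `1.1` do not.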
  10. def assertDataFrameDataEquals(expected: DataFrame, result: DataFrame): Unit

    Checks that two DataFrames are equal without regard to row order, by finding the elements of each DataFrame that are not in the other. The resulting difference should be empty, implying that the two DataFrames contain the same elements. Does not compare the schemas.

    Definition Classes
    DataFrameSuiteBaseLike
  11. def assertDataFrameEquals(expected: DataFrame, result: DataFrame): Unit

    Checks that two DataFrames are equal: first verifies that the schemas match, and if they do, verifies that the rows are equal.

    Definition Classes
    DataFrameSuiteBaseLike
  12. def assertDataFrameNoOrderEquals(expected: DataFrame, result: DataFrame): Unit

    Checks that two DataFrames are equal without regard to row order, by finding the elements of each DataFrame that are not in the other. The resulting difference should be empty, implying that the two DataFrames contain the same elements. Also verifies that the schemas are identical.

    Definition Classes
    DataFrameSuiteBaseLike
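The order-insensitive comparison described above amounts to multiset equality: every element must occur the same number of times on both sides. A plain-Java sketch of that idea (the `sameElements` helper is hypothetical, not the library's internals):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class UnorderedEqualsSketch {
    // Count each element's occurrences: +1 for the expected side, -1 for
    // the result side. All counts end at zero exactly when neither side
    // has an element the other lacks.
    static <T> boolean sameElements(List<T> expected, List<T> result) {
        Map<T, Integer> counts = new HashMap<>();
        for (T row : expected) counts.merge(row, 1, Integer::sum);
        for (T row : result) counts.merge(row, -1, Integer::sum);
        return counts.values().stream().allMatch(c -> c == 0);
    }
}
```

The real methods perform this comparison distributed over the cluster rather than on collected lists; this sketch only captures the equality semantics.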
  13. def assertDatasetApproximateEquals[U](expected: Dataset[U], result: Dataset[U], tol: Double): Unit

    Checks that two Datasets are equal; the Datasets should have the same element type. Inexact (floating-point) fields are compared using tol.

    tol

    maximum acceptable tolerance; should be less than 1.

  14. def assertDatasetApproximateEquals[U](expected: Dataset[U], result: Dataset[U], tol: Double)(implicit UCT: ClassTag[U]): Unit

    Checks that two Datasets are equal; the Datasets should have the same element type. Inexact (floating-point) fields are compared using tol.

    tol

    maximum acceptable tolerance; should be less than 1.

    Definition Classes
    DatasetSuiteBaseLike
  15. def assertDatasetEquals[U](expected: Dataset[U], result: Dataset[U]): Unit

    Checks that two Datasets are equal; the Datasets should have the same element type. The comparison can be customized by overriding the equals method of the element class.

  16. def assertDatasetEquals[U](expected: Dataset[U], result: Dataset[U])(implicit UCT: ClassTag[U]): Unit

    Checks that two Datasets are equal; the Datasets should have the same element type. The comparison can be customized by overriding the equals method of the element class.

    Definition Classes
    DatasetSuiteBaseLike
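Since assertDatasetEquals compares elements with their equals method, a domain-specific notion of equality can be supplied by overriding equals on the element class. A hypothetical element type as an illustration (Coordinate is not part of the library):

```java
import java.util.Objects;

// Element class whose equals treats coordinates as equal when they match
// after rounding to 4 decimal places; assertDatasetEquals would then
// consider two Datasets of Coordinate equal under this relaxed equality.
public class Coordinate {
    final double lat;
    final double lon;

    Coordinate(double lat, double lon) { this.lat = lat; this.lon = lon; }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof Coordinate)) return false;
        Coordinate c = (Coordinate) o;
        return Math.round(lat * 1e4) == Math.round(c.lat * 1e4)
            && Math.round(lon * 1e4) == Math.round(c.lon * 1e4);
    }

    @Override
    public int hashCode() {
        // Keep hashCode consistent with equals by hashing the rounded values.
        return Objects.hash(Math.round(lat * 1e4), Math.round(lon * 1e4));
    }
}
```

Whenever equals is overridden this way, hashCode should be overridden to match, as above.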
  17. def assertEmpty[U](arr: Array[U])(implicit CT: ClassTag[U]): Unit
    Definition Classes
    JavaTestSuite → TestSuiteLike
  18. def assertSmallDataFrameDataEquals(expected: DataFrame, result: DataFrame): Unit

    Checks that two DataFrames are equal without regard to row order, by finding the elements of each DataFrame that are not in the other. Similar to assertDataFrameDataEquals, but for small DataFrames that can be collected into memory for the comparison.

    Definition Classes
    DataFrameSuiteBaseLike
  19. def assertTrue(expected: Boolean): Unit
    Definition Classes
    JavaTestSuite → TestSuiteLike
  20. def beforeAllTestCasesHook(): Unit

    Hooks for setup code that needs to be executed/torn down in order with SparkContexts.

    Definition Classes
    JavaDataFrameSuiteBase → SharedJavaSparkContext
  21. def builder(): Builder

    Constructs a configuration for Hive or Iceberg, where the metastore is located in a temporary directory.

    Definition Classes
    DataFrameSuiteBaseLike
  22. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  23. def conf: SparkConf
    Definition Classes
    SparkContextProvider
  24. def enableHiveSupport: Boolean
    Attributes
    protected
    Definition Classes
    DataFrameSuiteBaseLike
  25. def enableIcebergSupport: Boolean
    Attributes
    protected
    Definition Classes
    DataFrameSuiteBaseLike
  26. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  27. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  28. def fail(message: String): Unit
    Definition Classes
    JavaTestSuite → TestSuiteLike
  29. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  30. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  31. val icebergWarehouse: String
    Definition Classes
    DataFrameSuiteBaseLike
  32. implicit def impSqlContext: SQLContext
    Attributes
    protected
    Definition Classes
    DataFrameSuiteBaseLike
  33. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  34. def jsc(): JavaSparkContext
    Definition Classes
    SharedJavaSparkContext
  35. lazy val localMetastorePath: String
    Definition Classes
    DataFrameSuiteBaseLike
  36. lazy val localWarehousePath: String
    Definition Classes
    DataFrameSuiteBaseLike
  37. val maxUnequalRowsToShow: Int
    Definition Classes
    DataFrameSuiteBaseLike
  38. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  39. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  40. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  41. def runBefore(): Unit
    Definition Classes
    SharedJavaSparkContext
    Annotations
    @Before()
  42. def sc(): SparkContext
  43. def setup(sc: SparkContext): Unit

    Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory.

    This _should_ be called by the context provider automatically.

    Definition Classes
    SparkContextProvider
  44. lazy val spark: SparkSession
    Definition Classes
    DataFrameSuiteBaseLike
    Annotations
    @transient()
  45. def sqlBeforeAllTestCases(): Unit
    Definition Classes
    DataFrameSuiteBaseLike
  46. lazy val sqlContext: SQLContext
    Definition Classes
    DataFrameSuiteBaseLike
    Annotations
    @transient()
  47. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  48. lazy val tempDir: File
    Definition Classes
    DataFrameSuiteBaseLike
  49. def toString(): String
    Definition Classes
    AnyRef → Any
  50. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  51. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  52. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
