public class DeepSparkContext
extends org.apache.spark.api.java.JavaSparkContext
| Constructor and Description |
|---|
| `DeepSparkContext(org.apache.spark.SparkContext sc)` Overridden superclass constructor. |
| `DeepSparkContext(String master, String appName)` Overridden superclass constructor. |
| `DeepSparkContext(String master, String appName, String sparkHome, String jarFile)` Overridden superclass constructor. |
| `DeepSparkContext(String master, String appName, String sparkHome, String[] jars)` Overridden superclass constructor. |
| `DeepSparkContext(String master, String appName, String sparkHome, String[] jars, Map<String,String> environment)` Overridden superclass constructor. |
| Modifier and Type | Method and Description |
|---|---|
| `<T extends IDeepType> CassandraRDD<T>` | `cassandraEntityRDD(IDeepJobConfig<T> config)` Builds a new entity-based CassandraEntityRDD. |
| `CassandraRDD<Cells>` | `cassandraGenericRDD(IDeepJobConfig<Cells> config)` Builds a new generic (cell-based) CassandraGenericRDD. |
| `<T> CassandraJavaRDD<T>` | `cassandraJavaRDD(IDeepJobConfig<T> config)` Builds a new CassandraJavaRDD. |
| `org.apache.spark.api.java.JavaDoubleRDD` | `union(org.apache.spark.api.java.JavaDoubleRDD... arg0)` |
| `<K,V> org.apache.spark.api.java.JavaPairRDD<K,V>` | `union(org.apache.spark.api.java.JavaPairRDD<K,V>... arg0)` |
| `<T> org.apache.spark.api.java.JavaRDD<T>` | `union(org.apache.spark.api.java.JavaRDD<T>... arg0)` |
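The `union` overloads merge several RDDs of the same element type into one. A minimal sketch, assuming a local-mode context (the master URL and sample data are illustrative):

```java
import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;

// Import of DeepSparkContext omitted; its package depends on the Deep distribution.
public class UnionExample {
    public static void main(String[] args) {
        DeepSparkContext deepContext = new DeepSparkContext("local", "unionExample");
        JavaRDD<Integer> first = deepContext.parallelize(Arrays.asList(1, 2, 3));
        JavaRDD<Integer> second = deepContext.parallelize(Arrays.asList(4, 5, 6));
        // Varargs overload documented above: union(JavaRDD<T>... arg0)
        JavaRDD<Integer> merged = deepContext.union(first, second);
        System.out.println(merged.count()); // six elements after the union
        deepContext.stop();
    }
}
```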
Methods inherited from class org.apache.spark.api.java.JavaSparkContext:
accumulable, accumulator, accumulator, accumulator, addFile, addJar, broadcast, cancelAllJobs, cancelJobGroup, checkpointFile, clearCallSite, clearFiles, clearJars, clearJobGroup, doubleAccumulator, env, fromSparkContext, getCheckpointDir, getConf, getLocalProperty, getSparkHome, hadoopConfiguration, hadoopFile, hadoopFile, hadoopRDD, hadoopRDD, intAccumulator, jarOfClass, jarOfObject, newAPIHadoopFile, newAPIHadoopRDD, objectFile, objectFile, parallelize, parallelize, parallelizeDoubles, parallelizeDoubles, parallelizePairs, parallelizePairs, sc, sequenceFile, sequenceFile, setCallSite, setCheckpointDir, setJobGroup, setLocalProperty, stop, textFile, textFile, toSparkContext, union, union, union

`public DeepSparkContext(org.apache.spark.SparkContext sc)`
Overridden superclass constructor.
Parameters:
sc - an already created Spark context.

`public DeepSparkContext(String master, String appName)`
Overridden superclass constructor.
Parameters:
master - the URL of the master node.
appName - the name of the application.

`public DeepSparkContext(String master, String appName, String sparkHome, String jarFile)`
Overridden superclass constructor.
Parameters:
master - the URL of the master node.
appName - the name of the application.
sparkHome - the Spark home folder.
jarFile - the jar file to serialize and send to all the cluster nodes.

`public DeepSparkContext(String master, String appName, String sparkHome, String[] jars)`
Overridden superclass constructor.
Parameters:
master - the URL of the master node.
appName - the name of the application.
sparkHome - the Spark home folder.
jars - the jar file(s) to serialize and send to all the cluster nodes.

`public DeepSparkContext(String master, String appName, String sparkHome, String[] jars, Map<String,String> environment)`
Overridden superclass constructor.
Parameters:
master - the URL of the master node.
appName - the name of the application.
sparkHome - the Spark home folder.
jars - the jar file(s) to serialize and send to all the cluster nodes.
environment - a map of environment variables.

`public <T> CassandraJavaRDD<T> cassandraJavaRDD(IDeepJobConfig<T> config)`
Builds a new CassandraJavaRDD.
Parameters:
config - the deep configuration object to use to create the new RDD.

`public <T extends IDeepType> CassandraRDD<T> cassandraEntityRDD(IDeepJobConfig<T> config)`
Parameters:
config - the deep configuration object to use to create the new RDD.

`public CassandraRDD<Cells> cassandraGenericRDD(IDeepJobConfig<Cells> config)`
Parameters:
config - the deep configuration object to use to create the new RDD.

`public <T> org.apache.spark.api.java.JavaRDD<T> union(org.apache.spark.api.java.JavaRDD<T>... arg0)`

`public org.apache.spark.api.java.JavaDoubleRDD union(org.apache.spark.api.java.JavaDoubleRDD... arg0)`

`public <K,V> org.apache.spark.api.java.JavaPairRDD<K,V> union(org.apache.spark.api.java.JavaPairRDD<K,V>... arg0)`
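A hedged sketch of the three Cassandra factory methods: each takes an `IDeepJobConfig` describing the source table and returns an RDD over it. `MyEntity` and the helper method are illustrative, and imports for the Deep types are omitted because their packages depend on the Deep version in use.

```java
// Imports for CassandraRDD, CassandraJavaRDD, Cells, IDeepType and
// IDeepJobConfig omitted; their packages depend on the Deep version.
public class CassandraRddExamples {

    // MyEntity stands in for any user class implementing IDeepType.
    public static void createRdds(DeepSparkContext deepContext,
                                  IDeepJobConfig<MyEntity> entityConfig,
                                  IDeepJobConfig<Cells> cellConfig) {
        // Entity-based: each Cassandra row is mapped onto a MyEntity object.
        CassandraRDD<MyEntity> entityRdd = deepContext.cassandraEntityRDD(entityConfig);

        // Java-friendly entity RDD built from the same kind of configuration.
        CassandraJavaRDD<MyEntity> javaRdd = deepContext.cassandraJavaRDD(entityConfig);

        // Generic, cell-based: rows are exposed as Cells collections,
        // useful when no entity class models the table.
        CassandraRDD<Cells> cellRdd = deepContext.cassandraGenericRDD(cellConfig);
    }
}
```

Choose the entity-based form when a mapped class exists for the table, and the cell-based form for schema-flexible access.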
Copyright © 2014. All rights reserved.