Spark Shell
Jonathan Janetzki edited this page Jul 26, 2017
This page explains how to use the Spark Shell by taking the example of counting elements in a Cassandra table.
Start spark-shell
Start the shell with your assembly JAR on the classpath, then import your package. For example:
spark-shell --jars jars/DataLakeImport-assembly-1.0.jar
import DataLake._
From now on, your case classes are available in the shell.
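The imported package is expected to contain a case class whose fields match the columns of the Cassandra table. A minimal sketch of what such a class might look like (the field names and types here are hypothetical; the actual Subject class is defined inside the DataLakeImport JAR):

```scala
import java.util.UUID

// Hypothetical sketch of a case class mirroring a Cassandra table.
// The real Subject class ships in DataLakeImport-assembly-1.0.jar.
case class Subject(
  id: UUID,                              // partition key column
  name: Option[String],                  // nullable text column
  properties: Map[String, List[String]]  // map column
)
```

The Spark Cassandra Connector maps table columns to the case class fields by name, so the field names must line up with the column names.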
Import the Spark Cassandra Connector so that cassandraTable becomes available on the SparkContext, then load the table as a typed RDD and count its rows:
import com.datastax.spark.connector._
val subjects = sc.cassandraTable[Subject]("datalake", "subject")
subjects.count()
res0: Long = 859390
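Beyond counting, the same RDD supports the usual Spark transformations such as filter and map. A hedged sketch of the pattern, shown here on a local collection with a hypothetical Subject class so it runs without a cluster (on the real Cassandra-backed RDD the calls look the same):

```scala
// Hypothetical local stand-in for the Subject rows, used only to
// illustrate the filter/count pattern from the shell session above.
case class Subject(id: Int, name: Option[String])

val subjects = List(
  Subject(1, Some("Alice")),
  Subject(2, None),
  Subject(3, Some("Bob"))
)

// Keep only rows that have a name, then count them.
val named = subjects.filter(_.name.isDefined)
println(named.size) // → 2
```

In the Spark Shell you would call `subjects.filter(...).count()` on the RDD instead; the filter runs distributed across the executors rather than locally.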