From d35e2041e4ee7b32bde4cd9c4940c36c1753b27c Mon Sep 17 00:00:00 2001
From: Ignacio
Date: Mon, 26 Sep 2016 17:45:58 +0200
Subject: [PATCH] Changes proposed to make it work on my machine

I needed to download the Spark 1.6.1 release because this project is built
against Scala 2.10, while the Spark 2.0 release uses Scala 2.11.

I also deleted '--jars': the assembly jar is the application we want to
execute, not an extra jar. With '--jars' present, /bin/spark-submit tried to
interpret '--profile' and '--parent' as its own arguments, which it does not
have. Maybe this was only my case, but I am noting it here because some
classmates have run into similar problems.
---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 98b0d53..152e7cd 100644
--- a/README.md
+++ b/README.md
@@ -23,7 +23,7 @@ sudo apt-get install python-dev libncurses-dev python-pip
 sudo pip install ipython[all]==3.2.1
 ```
 
-3. Download and install Java and Spark on your machine. The easiest way to install Spark is to download its "pre-built for CDH 4" from the [here](http://spark.apache.org/downloads.html).
+3. Download and install Java and Spark on your machine. The easiest way to install Spark is to download the "pre-built for CDH 4" package from [here](http://spark.apache.org/downloads.html). Make sure it is version 1.6.1.
 
 4. Set the environment variables.
 ```bash
@@ -83,7 +83,7 @@ c.KernelManager.kernel_cmd = [spark_home+"/bin/spark-submit",
                   "--master", master,
                   "--class", "org.tribbloid.ispark.Main",
                   "--executor-memory", "2G",
-                  "--jars", "/ispark-core-assembly-0.2.0-SNAPSHOT.jar",
+                  "/ispark-core-assembly-0.2.0-SNAPSHOT.jar",
                   "--profile", "{connection_file}",
                   "--parent"]
 ```
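The reason the fix works: spark-submit treats the first non-option token on its command line as the application jar, and forwards everything after that token to the application's main class untouched. With `"--jars"` in front, the assembly jar became the *value* of that option, so `--profile` and `--parent` were left for spark-submit itself to parse. A minimal sketch of this partitioning behavior (not Spark's actual code; `split_args` and its assumption that each launcher option takes one value are simplifications for illustration):

```python
def split_args(argv):
    """Mimic spark-submit-style argument partitioning.

    Launcher options (each assumed here to take one value) come first;
    the first non-option token is the application jar; everything after
    it is forwarded to the application unmodified.
    """
    launcher_opts, i = [], 0
    while i < len(argv) and argv[i].startswith("--"):
        launcher_opts += argv[i:i + 2]  # option and its value
        i += 2
    app_jar = argv[i]          # first non-option token
    app_args = argv[i + 1:]    # forwarded to the app's main class
    return launcher_opts, app_jar, app_args

opts, jar, app = split_args([
    "--master", "local[*]",
    "--class", "org.tribbloid.ispark.Main",
    "--executor-memory", "2G",
    "/ispark-core-assembly-0.2.0-SNAPSHOT.jar",
    "--profile", "connection.json",
    "--parent",
])
# jar is the assembly; "--profile"/"--parent" end up in app, where the
# kernel (not spark-submit) will parse them.
```

This is why the patched `kernel_cmd` places the jar before `--profile` and `--parent`: they are kernel arguments, not spark-submit arguments.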