Hello. I followed the "A Standalone App in Java" part of the tutorial:
https://spark.apache.org/docs/0.8.1/quick-start.html
The Spark standalone cluster looks like it is running without a problem:
http://i.stack.imgur.com/7bFv8.png
I have built a fat jar for running this Java app on the cluster.

The things I tried before the Maven build, and the errors I got, are:
    String path = "/home/ubuntu/spark-0.9.1/SimpleApp/target/simple-project-1.0-allinone.jar";
    ...
    .set(path)
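For context, in the Spark 0.9.x Java API the jar is normally attached through `SparkConf.setJars` (or the `JavaSparkContext` constructor) rather than a bare `.set(path)`. A minimal sketch of what I believe the intended wiring looks like — the master URL is a placeholder from my setup, not something the tutorial specifies:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SimpleApp {
    public static void main(String[] args) {
        // Path to the fat jar built by "mvn package"
        String path = "/home/ubuntu/spark-0.9.1/SimpleApp/target/simple-project-1.0-allinone.jar";

        SparkConf conf = new SparkConf()
            .setMaster("spark://master:7077")   // placeholder master URL
            .setAppName("Simple App")
            .setJars(new String[] { path });    // ship the fat jar to the executors

        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... job logic would go here ...
        sc.stop();
    }
}
```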
    $ mvn package

    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.0.2:compile (default-compile) on project simple-project: Compi
In my situation each slave has 8 GB of memory, and I want to use as much of it as I can:

    .set("spark.executor.memory", "?g")

How can I determine the amount of memory I should set? It fails when I set it to 8 GB.
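My working theory (an assumption on my part, not something the docs state for my exact setup) is that the worker cannot offer the node's full physical memory to executors, because the OS and the worker daemon itself need headroom — which would explain why requesting the full 8 GB fails. A tiny sketch of that arithmetic; the helper name and the 1 GB headroom figure are made up for illustration:

```java
public class ExecutorMemory {
    // Illustrative heuristic (assumed, not a Spark rule): reserve some
    // headroom for the OS and the worker JVM, give the rest to executors.
    static int suggestExecutorMemoryGb(int totalGb, int headroomGb) {
        // Never suggest less than 1 GB.
        return Math.max(1, totalGb - headroomGb);
    }

    public static void main(String[] args) {
        int totalGb = 8;     // physical memory per slave
        int headroomGb = 1;  // reserved for OS + worker daemon (assumed)
        System.out.println(suggestExecutorMemoryGb(totalGb, headroomGb) + "g");
        // prints "7g"
    }
}
```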