Hello,

I downloaded and built Spark 1.0.1 using sbt/sbt assembly. Once built, I attempted to go through a couple of examples. I could run Spark interactively through the Scala shell, and the example sc.parallelize(1 to 1000).count() correctly returned 1000.

I then attempted to run the example using ./bin/run-example SparkPi 10. The first thing I get is:

    Warning: Local jar C:\cygdrive\Users\myUser\Desktop\spark-1.0.1\examples\target\scala-2.10\spark-examples-1.0.1-hadoop1.0.4.jar does not exist, skipping.

which is followed by:

    Exception in thread "main" java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi

The first thing I did was check that I had spark-examples-1.0.1-hadoop1.0.4.jar, and I did.

After failing to run this example, I tried the Python API with ./bin/pyspark, which never failed but also never started the shell. Finally, I attempted to run the Python example and received ImportError: No module named pyspark. Again, I double-checked that pyspark was there, and it was.
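For reference, here is roughly the sequence of commands I ran from the Spark root directory. I'm assuming ./bin/spark-shell is the right way to describe how I started the interactive shell, and I've left out the exact Python example script since the failure happens on the pyspark import either way:

    # Build the assembly
    sbt/sbt assembly

    # Interactive Scala shell -- this works
    ./bin/spark-shell
    scala> sc.parallelize(1 to 1000).count()   // returns 1000 as expected

    # Run the SparkPi example -- produces the warning and ClassNotFoundException above
    ./bin/run-example SparkPi 10

    # PySpark shell -- never errors out, but never shows a prompt either
    ./bin/pyspark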
Any help would be appreciated. I am running this on a Windows 7 machine through Cygwin.

-Colin Taylor