Hi folks,
I have just upgraded to Spark 1.1.0 and tried some examples like:
./run-example SparkPageRank pagerank_data.txt 5
It turns out that Spark keeps trying to connect to my name node and read the
file from HDFS rather than the local FS:
Client: Retrying connect to server: Node1/192.168
Hi,
How is your native library set up? Perhaps it is not thread-safe?
Thanks,
Max
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/EC2-JNI-crashes-JVM-with-multi-core-instances-tp13463p13470.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Rstudio should be fine.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/IDE-for-sparkR-tp4764p4772.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Just figured it out. I need to add a "file://" prefix to the URI. I guess it
was not needed in previous Hadoop versions.
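For anyone hitting the same thing: without a scheme in the path, newer Hadoop versions resolve it against the configured default filesystem (fs.defaultFS), which is why Spark goes to the name node. A minimal Python sketch of that resolution logic, just for illustration (the "hdfs://Node1:9000" default FS here is a made-up example value, not your actual config):

```python
from urllib.parse import urlparse

def resolve(path, default_fs="hdfs://Node1:9000"):
    # A path with an explicit scheme (file://, hdfs://) is used as-is;
    # a bare path falls back to the configured default filesystem.
    if urlparse(path).scheme:
        return path
    return default_fs.rstrip("/") + "/" + path.lstrip("/")

print(resolve("pagerank_data.txt"))            # resolved against HDFS
print(resolve("file:///tmp/pagerank_data.txt"))  # stays on the local FS
```

So passing "file:///path/to/pagerank_data.txt" to the example forces the local FS regardless of what fs.defaultFS points to.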
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-0-8-examples-in-local-mode-tp2892p2897.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi folks,
I have just upgraded to Spark 0.8.1 and tried some examples like:
./run-example org.apache.spark.examples.SparkHdfsLR local lr_data.txt 3
It turns out that Spark keeps trying to read the file from HDFS rather than
the local FS:
Client: Retrying connect to server: Node1/192.168.0.101:90