Perhaps you need to set this in your spark-defaults.conf so that it's
already set when your slave/worker processes start.
-Joe
On 1/25/15, 6:50 PM, "ilaxes" wrote:
>Hi,
>
>I have a similar problem. I want to see the detailed logs of Completed
>Applications, so I've set in my program:
>set("spar
So you’ve got a point A and you want the sum of distances between it and all
other points? Or am I misunderstanding you?
// target point; could also be sent to all workers as a Broadcast variable
val tarPt = (10, 20)
val pts = Seq((2, 2), (3, 3), (2, 3), (10, 2))
val rdd = sc.parallelize(pts)
// sum of Euclidean distances from every point to the target
rdd.map(pt =>
  Math.sqrt(Math.pow(pt._1 - tarPt._1, 2) + Math.pow(pt._2 - tarPt._2, 2))
).sum()
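As the comment notes, the target point could be shipped as a Broadcast
variable instead of being captured in the closure; a minimal sketch of that
variant (overkill for a single pair of Ints, but useful when the target
structure is large):

    // broadcast the target point once; workers read it via .value
    val tarPtB = sc.broadcast((10, 20))
    rdd.map(pt =>
      Math.sqrt(Math.pow(pt._1 - tarPtB.value._1, 2) +
                Math.pow(pt._2 - tarPtB.value._2, 2))
    ).sum()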
I’ve set up a Spark cluster over the last few weeks and everything is
working, but I cannot run spark-shell interactively against the cluster from
a remote host:
* Deploy .jar to cluster from remote (laptop) via spark-submit and have it
  run – Check
* Run .jar on spark-shell locally – Check
* Run spark-shell against the cluster from the remote host (laptop) – Fail
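For what it's worth, the usual way to point the shell at a standalone
cluster is something like (host and port are placeholders):

    spark-shell --master spark://master-host:7077

A common cause of the remote failure is that in standalone mode the
executors must be able to connect back to the driver, so a firewalled or
NATed laptop can fail here even when spark-submit of a jar succeeds.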