Discussions about how CDH packages Spark aside, you should be using
the spark-class script (assuming you're still in 0.9) instead of
executing Java directly. That will make sure that the environment
needed to run Spark apps is set up correctly.
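As a sketch of the difference, assuming a CDH parcel layout (the Worker class and master URL below are just illustrative):

```shell
# Instead of invoking the JVM by hand, e.g.:
#   java -cp spark-assembly.jar org.apache.spark.deploy.worker.Worker spark://master:7077
# launch through spark-class, which sources spark-env.sh and assembles the
# classpath itself (the parcel path is an assumption):
/opt/cloudera/parcels/CDH/lib/spark/bin/spark-class \
  org.apache.spark.deploy.worker.Worker spark://master:7077
```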
CDH 5.1 ships with Spark 1.0.0, so it has spark-submit.
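On 1.0.0 the usual entry point is spark-submit; a minimal sketch (the application class, jar name, and master below are hypothetical placeholders):

```shell
# Submit an application jar; spark-submit sets up the Spark environment and
# classpath the same way spark-class does for daemons.
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  myapp.jar
```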
That's how it's supposed to work, right? You don't deploy an assembly
.jar for this reason. You get things like Hadoop from the cluster at
runtime. At least this was the gist of what Matei described last
month. This is not some issue with CDH.
On Wed, Jul 23, 2014 at 8:28 AM, Debasish Das wrote:
> Is there any documentation from Cloudera on how to run Spark apps on a
> CDH Manager-deployed Spark?
Asking the Cloudera community would be a good idea.
http://community.cloudera.com/
In the end, only Cloudera can quickly fix issues with CDH...
Bertrand Dechoux
On Wed, Jul 23, 2014 at 9:28 AM,
I found the issue...
If you build the assembly jar from the Spark git repository, then
org.apache.hadoop.io.Writable.class is packaged with it.
If you use the assembly jar that ships with CDH in
/opt/cloudera/parcels/CDH/lib/spark/assembly/lib/spark-assembly_2.10-0.9.0-cdh5.0.2-hadoop2.3.0-cdh5.0.2.jar,
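One way to check whether a given assembly jar packages that class is to list its contents; a small sketch (contains_class is a hypothetical helper, and unzip is used so no JDK is needed on the path):

```shell
# contains_class JAR CLASSFILE: succeed if CLASSFILE appears in JAR's listing.
contains_class() {
  unzip -l "$1" | grep -q "$2"
}

# Example against the CDH parcel jar quoted above:
# contains_class /opt/cloudera/parcels/CDH/lib/spark/assembly/lib/spark-assembly_2.10-0.9.0-cdh5.0.2-hadoop2.3.0-cdh5.0.2.jar \
#     'org/apache/hadoop/io/Writable.class'
```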
If you need to run Spark apps through Hue, see if Ooyala's job server helps:
http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/