Re: CDH 5.0 and Spark 0.9.0

2014-05-05 Thread Paul Schooss
On Thu, May 1, 2014 at 12:44 AM, Paul Schooss wrote: > Hello, > So I was unable to run the following commands from the spark shell with CDH 5.0 and spark 0.9.0, see below. > Once I removed the property

CDH 5.0 and Spark 0.9.0

2014-04-30 Thread Paul Schooss
Hello, So I was unable to run the following commands from the spark shell with CDH 5.0 and spark 0.9.0, see below. Once I removed the property io.compression.codec.lzo.class (set to com.hadoop.compression.lzo.LzoCodec, true) from the core-site.xml on the node, the spark commands worked. Is there a spec
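For reference, a Hadoop property with that name would normally live in core-site.xml as an XML block. The sketch below is reconstructed from the name and value quoted above; the final element is an assumption about what the trailing "true" refers to.

    <!-- Sketch of the property as it would typically appear in core-site.xml.
         Reconstructed from the message above; <final>true</final> is an assumption. -->
    <property>
      <name>io.compression.codec.lzo.class</name>
      <value>com.hadoop.compression.lzo.LzoCodec</value>
      <final>true</final>
    </property>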

Re: JMX with Spark

2014-04-25 Thread Paul Schooss
Hello Folks, Sorry for the delay, these emails got missed due to the volume. Here is my metrics.conf:

    root@jobs-ab-hdn4:~# cat /opt/klout/spark/conf/metrics.conf
    # syntax: [instance].sink|source.[name].[options]=[value]
    # This file configures Spark's internal metrics system. The metrics syste
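For context, wiring Spark's metrics system up to JMX in metrics.conf is normally a single entry in the [instance].sink.[name].[options] syntax shown above. A minimal sketch; the JmxSink class ships with Spark and is not quoted from this message:

    # Minimal metrics.conf sketch: enable the JMX sink for all instances
    # (master, worker, driver, executor).
    *.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink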

JMX with Spark

2014-04-15 Thread Paul Schooss
Has anyone got this working? I have enabled the properties for it in the metrics.conf file and ensured that it is placed under spark's home directory. Any ideas why I don't see spark beans?

Re: Can't run a simple spark application with 0.9.1

2014-04-15 Thread Paul Schooss
I am a dork, please disregard this issue. I did not have the slaves correctly configured. This error is very misleading. On Tue, Apr 15, 2014 at 11:21 AM, Paul Schooss wrote: > Hello, > Currently I deployed 0.9.1 spark using a new way of starting up spark > exec start-stop
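For anyone hitting the same misleading error: the slaves configuration referred to here is the conf/slaves file read by the standalone start scripts, which simply lists one worker host per line. A minimal sketch with placeholder hostnames:

    # conf/slaves -- one worker hostname per line (hostnames are placeholders)
    worker1.example.com
    worker2.example.com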

Can't run a simple spark application with 0.9.1

2014-04-15 Thread Paul Schooss
Hello, Currently I deployed 0.9.1 spark using a new way of starting up spark:

    exec start-stop-daemon --start --pidfile /var/run/spark.pid --make-pidfile \
      --chuid ${SPARK_USER}:${SPARK_GROUP} --chdir ${SPARK_HOME} \
      --exec /usr/bin/java -- -cp ${CLASSPATH} -Dcom.sun.management.jmxremote.authentica
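The command is cut off above; a fuller sketch of what such an invocation typically looks like follows. The remaining JMX flags and the main class are filled in by assumption from standard JVM options and the standalone master class, not from the original message:

    # Sketch only: the JMX options, port, and main class below are assumptions;
    # adjust to the actual deployment.
    exec start-stop-daemon --start --pidfile /var/run/spark.pid --make-pidfile \
      --chuid ${SPARK_USER}:${SPARK_GROUP} --chdir ${SPARK_HOME} \
      --exec /usr/bin/java -- -cp ${CLASSPATH} \
      -Dcom.sun.management.jmxremote.authenticate=false \
      -Dcom.sun.management.jmxremote.ssl=false \
      -Dcom.sun.management.jmxremote.port=10111 \
      org.apache.spark.deploy.master.Master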

Re: Spark 0.9.1 - How to run bin/spark-class with my own hadoop jar files?

2014-03-25 Thread Paul Schooss
Andrew, I ran into the same problem and eventually settled on just running the jars directly with java. Since we use sbt to build our jars, we had all the dependencies built into the jar itself, so there was no need for random classpaths. On Tue, Mar 25, 2014 at 1:47 PM, Andrew Lee wrote: > Hi All, > I'm
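For readers wanting to reproduce that setup: a jar with its dependencies built in is usually produced with the sbt-assembly plugin. A minimal sketch follows; the plugin version, project name, and dependency coordinates are illustrative assumptions for the Spark 0.9 / sbt 0.13 era, not the poster's actual build:

    // project/plugins.sbt -- assumed sbt-assembly setup (version is illustrative)
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

    // build.sbt -- fat-jar build sketch
    import AssemblyKeys._

    assemblySettings

    name := "my-spark-app"   // hypothetical project name

    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"   // or % "provided" if Spark is already on the launch classpath

Running sbt assembly then yields a single jar that can be launched directly with java and a plain classpath. (Assembling spark-core itself into the jar may additionally need a merge strategy for duplicate META-INF entries; that detail is omitted here.)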

NoClassFound Errors with Streaming Twitter

2014-03-13 Thread Paul Schooss
Hello Folks, We have a strange issue going on with a spark standalone cluster in which a simple test application is having a hard time using external classes. Here are the details. The application is located here: https://github.com/prantik/spark-example We use classes such as spark's streaming
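For orientation, the external streaming-twitter classes involved are typically used as below. A minimal Scala sketch against the 0.9-era API; the master URL, app name, batch interval, and filter terms are made up for illustration and are not taken from the linked application:

    // Sketch: driver program using the spark-streaming-twitter external module (Spark 0.9-era API).
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.twitter.TwitterUtils

    object TwitterExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("spark://master:7077")   // placeholder master URL
          .setAppName("TwitterExample")       // hypothetical app name
        val ssc = new StreamingContext(conf, Seconds(10))   // 10-second batches, illustrative
        // None = read twitter4j OAuth credentials from system properties; filter term is illustrative
        val tweets = TwitterUtils.createStream(ssc, None, Seq("spark"))
        tweets.map(_.getText).print()
        ssc.start()
        ssc.awaitTermination()
      }
    }

NoClassDefFound errors with this module usually come down to the spark-streaming-twitter and twitter4j jars not reaching the executors' classpath.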

Re: Applications for Spark on HDFS

2014-03-11 Thread Paul Schooss
-Sandy > On Tue, Mar 11, 2014 at 3:09 PM, Paul Schooss wrote: > Hello Folks, > I was wondering if anyone had experience placing application jars for Spark onto HDFS. Currently I am distributing the jars manually and would love to source

Applications for Spark on HDFS

2014-03-11 Thread Paul Schooss
Hello Folks, I was wondering if anyone had experience placing application jars for Spark onto HDFS. Currently I am distributing the jars manually and would love to source the jar via HDFS a la distributed caching with MR. Any ideas? Regards, Paul
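One way this can be done, for reference: the driver can be pointed at a jar stored on HDFS and Spark will ship it to the executors, which is roughly the analogue of MR's distributed cache. A minimal Scala sketch against the 0.9-era API; the master URL, app name, and HDFS path are placeholders:

    // Sketch: referencing an application jar stored on HDFS (all paths/URLs are hypothetical).
    import org.apache.spark.{SparkConf, SparkContext}

    object HdfsJarExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("spark://master:7077")                        // placeholder master URL
          .setAppName("hdfs-jar-example")                          // hypothetical app name
          .setJars(Seq("hdfs://namenode:8020/apps/my-app.jar"))    // hypothetical jar location on HDFS
        val sc = new SparkContext(conf)
        // Equivalent after the context is created:
        // sc.addJar("hdfs://namenode:8020/apps/my-app.jar")
        println(sc.parallelize(1 to 10).count())
        sc.stop()
      }
    }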