On Thu, May 1, 2014 at 12:44 AM, Paul Schooss wrote:
Hello,
So I was unable to run the following commands from the spark shell with CDH
5.0 and Spark 0.9.0; see below.
Once I removed the property
  <property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
    <final>true</final>
  </property>
from the core-site.xml on the node, the Spark commands worked. Is there a
spec
Hello Folks,
Sorry for the delay, these emails got missed due to the volume.
Here is my metrics.conf
root@jobs-ab-hdn4:~# cat /opt/klout/spark/conf/metrics.conf
# syntax: [instance].sink|source.[name].[options]=[value]
# This file configures Spark's internal metrics system. The metrics syste
Has anyone got this working? I have enabled the properties for it in the
metrics.conf file and ensured that it is placed under Spark's home
directory. Any ideas why I don't see Spark beans?
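For reference, getting JMX beans to appear requires a JMX sink entry in metrics.conf; a minimal sketch using Spark's bundled JmxSink (the wildcard instance is an assumption about this setup, not taken from the thread):

```properties
# Enable the JMX sink for every instance (master, worker, driver, executor)
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
```

Spark only picks the file up if it is found via its conf directory, or wherever the spark.metrics.conf property points.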
I am a dork, please disregard this issue. I did not have the slaves
correctly configured. This error is very misleading.
On Tue, Apr 15, 2014 at 11:21 AM, Paul Schooss wrote:
Hello,
Currently I deployed Spark 0.9.1 using a new way of starting up Spark:

exec start-stop-daemon --start --pidfile /var/run/spark.pid \
  --make-pidfile --chuid ${SPARK_USER}:${SPARK_GROUP} --chdir ${SPARK_HOME} \
  --exec /usr/bin/java -- -cp ${CLASSPATH} \
  -Dcom.sun.management.jmxremote.authentica
Andrew,
I ran into the same problem and eventually settled on just running the jars
directly with java. Since we use sbt to build our jars, all the
dependencies are built into the jar itself, so there is no need for extra
classpath entries.
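Since the thread turns on building such fat jars with sbt, here is a minimal sketch using the sbt-assembly plugin (the plugin version shown is an assumption, not taken from the thread):

```scala
// project/plugins.sbt -- pulls in the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
```

Running `sbt assembly` then produces a single jar with all dependencies merged in, which can be launched directly with `java -cp` as described above.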
On Tue, Mar 25, 2014 at 1:47 PM, Andrew Lee wrote:
> Hi All,
>
> I'm
Hello Folks,
We have a strange issue going on with a Spark standalone cluster in which a
simple test application is having a hard time using external classes. Here
are the details.
The application is located here:
https://github.com/prantik/spark-example
We use classes such as Spark's streaming
-Sandy

On Tue, Mar 11, 2014 at 3:09 PM, Paul Schooss wrote:
Hello Folks,
I was wondering if anyone had experience placing application jars for Spark
onto HDFS. Currently I am distributing the jars manually and would love
to source the jar via HDFS, a la distributed caching with MR. Any ideas?
Regards,
Paul
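One way to approximate distributed-cache behavior: put the jar on HDFS and hand the hdfs:// URL to SparkContext.addJar, which makes executors fetch it from there. A minimal sketch; the master URL, namenode address, and jar path are all made-up examples, not from the thread:

```scala
import org.apache.spark.SparkContext

// Hypothetical standalone-cluster context
val sc = new SparkContext("spark://master:7077", "jar-from-hdfs-example")

// Instead of copying the jar to every node, reference it on HDFS;
// executors download it from this URL (example path)
sc.addJar("hdfs://namenode:8020/user/spark/jars/app.jar")
```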