Dumping Metrics on HDFS

2014-06-07 Thread Rahul Singhal
ns to navigate around this limitation would be helpful. Thanks, Rahul Singhal

How to create a RPM package

2014-04-04 Thread Rahul Singhal
erably through Maven) to create an RPM package. Is there a script (which is probably used for Spark releases) that I can get my hands on? Or should I write one on my own? P.S. I don't want to use the "alien" software to convert a Debian package to an RPM. Thanks, Rahul Singhal

Re: Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Rahul Singhal
rk.apache.org/docs/latest/running-on-yarn.html, for an sbt build, you could try: SPARK_HADOOP_VERSION=2.3.0 SPARK_YARN=true sbt/sbt assembly Thanks, Rahul Singhal From: Erik Freed <erikjfr...@codecision.com> Reply-To: user@spark.apache.org
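The sbt command above is the one quoted in the thread; for completeness, a rough Maven equivalent for that Spark era is sketched below. The Maven flags are an assumption based on the Spark build documentation of the time, not from this thread, so verify them against the docs for your version:

```shell
# sbt build against Hadoop 2.3.0 with YARN support (command from the thread)
SPARK_HADOOP_VERSION=2.3.0 SPARK_YARN=true sbt/sbt assembly

# Assumed Maven equivalent for Spark 0.9/1.0-era builds; check your
# version's "Building Spark" docs before relying on these flags.
mvn -Pyarn -Dhadoop.version=2.3.0 -DskipTests clean package
```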

Re: How to create a RPM package

2014-04-04 Thread Rahul Singhal
rt to make-distribution.sh. This script produces a tarball which I can then use to create an RPM package. Thanks, Rahul Singhal From: Christophe Préaud <christophe.pre...@kelkoo.com> Reply-To: user@spark.apache.org
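A minimal sketch of the tarball-to-RPM flow described above. The make-distribution.sh flags vary across Spark versions, and the tarball name, staging paths, and spark.spec file are illustrative assumptions, not taken from the thread:

```shell
# 1. Build a binary distribution tarball (flags depend on Spark version)
./make-distribution.sh --hadoop 2.3.0 --with-yarn --tgz

# 2. Stage the tarball where rpmbuild expects sources (path assumed)
cp spark-*.tgz ~/rpmbuild/SOURCES/

# 3. Build the RPM from a spec file you maintain (spark.spec is hypothetical)
rpmbuild -ba spark.spec
```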

Re: How to create a RPM package

2014-04-06 Thread Rahul Singhal
earn how the list of files does not need to be maintained in the spec file (the spec file that Christophe attached was using an explicit list). Thanks for the explanation and your help. Thanks, Rahul Singhal On 05/04/14 9:48 PM, "Will Benton" wrote: >Hi Rahul, > >As Christ
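The point about not maintaining a file list can be illustrated with a spec-file fragment; the install prefix here is a hypothetical example, not taken from Christophe's attachment:

```
# %files with a directory entry: rpmbuild packages everything installed
# under the prefix, so individual files need not be enumerated in the spec.
%files
%defattr(-,root,root,-)
/opt/spark/
```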

Re: Null Pointer Exception in Spark Application with Yarn Client Mode

2014-04-07 Thread Rahul Singhal
t in a bug fix. FYI, this issue seems to be resolved in master branch (1.0). Reference: https://groups.google.com/forum/#!topic/shark-users/8qFsy9JSt4E http://hadoop.apache.org/docs/r2.3.0/hadoop-yarn/hadoop-yarn-common/yarn-default.xml Thanks, Rahul Singhal From: Sai Prasanna

Re: the spark configuage

2014-04-30 Thread Rahul Singhal
Hi, Just in case you already have the 64-bit version, the following works for me on Spark 0.9.1: SPARK_LIBRARY_PATH=/opt/hadoop/lib/native/ ./bin/spark-shell (where my libhadoop.so is present in /opt/hadoop/lib/native/) Thanks, Rahul Singhal From: Akhil Das <ak...@sigmoidanalytics.com>
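The same setting can be made persistent instead of being passed inline; conf/spark-env.sh is the standard place for environment variables in Spark 0.9, and the library path below is the one from this message:

```shell
# conf/spark-env.sh fragment: point Spark at the directory holding
# libhadoop.so so the native Hadoop library is loaded instead of the
# pure-Java fallback.
export SPARK_LIBRARY_PATH=/opt/hadoop/lib/native/
```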