Any suggestions to navigate around this limitation would be helpful.
Thanks,
Rahul Singhal
(preferably through Maven) to create an RPM package. Is there a
script (which is probably used for Spark releases) that I can get my hands on?
Or should I write one on my own?
P.S. I don't want to use the "alien" tool to convert a Debian package to an
RPM.
Thanks,
Rahul Singhal
As per spark.apache.org/docs/latest/running-on-yarn.html, for an sbt
build you could try:
SPARK_HADOOP_VERSION=2.3.0 SPARK_YARN=true sbt/sbt assembly
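Since Maven was the stated preference earlier in the thread, the Maven equivalent of the above, with flags as in the Spark 1.0-era build docs (verify against your version's documentation), would be something like:

mvn -Pyarn -Dhadoop.version=2.3.0 -DskipTests clean package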
Thanks,
Rahul Singhal
From: Erik Freed <erikjfr...@codecision.com>
Reply-To: user@spark.apache.org
For now, I will resort to make-distribution.sh. This script
produces a tarball which I can then use to create an RPM package.
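A minimal sketch of that flow (the make-distribution.sh flags are those I recall from Spark 0.9.x, and spark.spec is a hypothetical spec file you would write yourself):

# build the binary distribution tarball
./make-distribution.sh --hadoop 2.3.0 --with-yarn --tgz
# stage it where rpmbuild expects sources, then build from the spec file
mkdir -p ~/rpmbuild/SOURCES
cp spark-*bin*.t*gz ~/rpmbuild/SOURCES/   # exact file name varies by version
rpmbuild -bb spark.spec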
Thanks,
Rahul Singhal
From: Christophe Préaud <christophe.pre...@kelkoo.com>
Reply-To: user@spark.apache.org
I also wanted to learn how the list of
files does not need to be maintained in the spec file (the spec file that
Christophe attached used an explicit list).
Thanks for the explanation and your help.
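For anyone curious, the usual trick is a directory entry in the %files section: naming a directory makes RPM claim it and everything inside it, so no per-file list has to be maintained. A minimal sketch (the /opt/spark install prefix is just an example):

# %install scriptlet: copy the unpacked distribution under one prefix
%install
mkdir -p %{buildroot}/opt/spark
cp -r . %{buildroot}/opt/spark
# %files: the directory entry claims the whole tree, no explicit list needed
%files
/opt/spark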
Thanks,
Rahul Singhal
On 05/04/14 9:48 PM, "Will Benton" wrote:
>Hi Rahul,
>
>As Christophe …
This resulted in a bug fix. FYI, this issue seems
to be resolved in the master branch (1.0).
Reference:
https://groups.google.com/forum/#!topic/shark-users/8qFsy9JSt4E
http://hadoop.apache.org/docs/r2.3.0/hadoop-yarn/hadoop-yarn-common/yarn-default.xml
Thanks,
Rahul Singhal
From: Sai Prasanna
Hi,
Just in case you already have the 64-bit version, the following works for me on
Spark 0.9.1:
SPARK_LIBRARY_PATH=/opt/hadoop/lib/native/ ./bin/spark-shell
(where my libhadoop.so is present in /opt/hadoop/lib/native/)
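One quick way to confirm the library really is the 64-bit build (path taken from the example above):

file /opt/hadoop/lib/native/libhadoop.so
# expected output includes: ELF 64-bit LSB shared object, x86-64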
Thanks,
Rahul Singhal
From: Akhil Das <ak...@sigmoidanalytics.com>