Hi Robert,
I am using the following Maven command to build Spark 1.0 for Hadoop 2 +
HBase 0.96.2:
mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0 -DskipTests clean
package
Regards,
siyuan
On Sun, Jun 29, 2014 at 3:20 PM, Robert James
wrote:
> Although Spark's home page offers binaries f
Hi Stephen,
I am using Spark 1.0 + HBase 0.96.2. This is what I did:
1) Rebuild Spark using: mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0
-DskipTests clean package
2) In spark-env.sh, set SPARK_CLASSPATH =
/path-to/hbase-protocol-0.96.2-hadoop2.jar
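For reference, the relevant line in conf/spark-env.sh might look like the following sketch (the jar path here is only a placeholder example; use the actual location of the HBase protocol jar on your machine):

```shell
# Sketch of a conf/spark-env.sh entry -- the path below is illustrative,
# not a verified install location
export SPARK_CLASSPATH=/opt/hbase/lib/hbase-protocol-0.96.2-hadoop2.jar
```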
Hopefully this helps.
Siyuan
On Sat, Jun
Hey Haoming,
Actually, akka.loggers has already been set to
"akka.event.slf4j.Slf4jLogger". You can check
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/AkkaUtils.scala
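For context, the equivalent setting in a plain Akka configuration file would be a one-line entry like the HOCON sketch below (shown only to illustrate what AkkaUtils configures programmatically; this is not a file Spark itself ships):

```
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
}
```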
Regards,
SY
On Fri, Jun 27, 2014 at 3:55 PM, Haoming Zhang
wrote:
> Hi Siyuan,
>
> Can
Hi all,
I can start a spark streaming app in "Client" mode on a Pseudo-standalone
cluster on my local machine.
However, when I tried to start it in "Cluster" mode, it always got the
following exception on the driver:
Exception in thread "main" akka.ConfigurationException: Could not
start logger d
Hi Cheney
Which mode are you running, YARN or standalone?
I got the same exception when I ran spark on YARN.
On Tue, May 6, 2014 at 10:06 PM, Cheney Sun wrote:
> Hi Nan,
>
> In the worker's log, I see the following exception thrown when trying to
> launch an executor. (The SPARK_HOME is wrongly specif