Re: Job Error: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkDriver@130.1.10.108:23600/)

2015-12-25 Thread donhoff_h
constrains me. -- Original Message -- From: "Saisai Shao"; Date: 2015-12-25 4:43; To: "donhoff_h"<165612...@qq.com>; Cc: "user"; Subject: Re: Job Error: Actor not found for: ActorSelection[Anchor(akka.

Re: Job Error: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkDriver@130.1.10.108:23600/)

2015-12-25 Thread donhoff_h
virtual machine which has 2 cores and 4G memory and with yarn-client mode. -- Original Message -- From: "Saisai Shao"; Date: 2015-12-25 4:15; To: "donhoff_h"<165612...@qq.com>; Cc: "user"; Subject:

Job Error: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkDriver@130.1.10.108:23600/)

2015-12-24 Thread donhoff_h
Hi, folks. I wrote some spark jobs and these jobs ran successfully when I ran them one by one. But if I ran them concurrently, for example 12 jobs running in parallel, I met the following error. Could anybody tell me what causes this? How to solve it? Many Thanks! Exception in thread "main"

How to register CompactBuffer in Kryo

2015-08-28 Thread donhoff_h
Hi, all. I wrote a spark program which uses Kryo serialization. When I count an RDD whose type is RDD[(String,String)], it reported an Exception like the following: * Class is not registered: org.apache.spark.util.collection.CompactBuffer[] * Note: To register this class use: kryo.register(
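
The exception above names the array class CompactBuffer[], which suggests registering both the class and its array form. A minimal sketch (CompactBuffer is private[spark], so it cannot be referenced as classOf[...] from user code; Class.forName is a common workaround):

```scala
import org.apache.spark.SparkConf

// Sketch: register Spark's internal CompactBuffer (and its array class,
// which is what the "CompactBuffer[]" message refers to) with Kryo.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .registerKryoClasses(Array(
    Class.forName("org.apache.spark.util.collection.CompactBuffer"),
    // JVM name of the array type CompactBuffer[]
    Class.forName("[Lorg.apache.spark.util.collection.CompactBuffer;")
  ))
```

Because the class is internal to Spark, its name may change between versions; the Class.forName strings above should match the class named in the exception message.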

Re: Re: Re: How to use spark to access HBase with Security enabled

2015-05-23 Thread donhoff_h
er contain the principal I set up with the kinit command? Many Thanks! -- Original Message -- From: "yuzhihong"; Date: 2015-05-22 7:25; To: "donhoff_h"<165612...@qq.com>; Cc: "Bill Q"; "user"; Subject:

Re: Re: How to use spark to access HBase with Security enabled

2015-05-22 Thread donhoff_h
tring) } finally { tbl.close() conn.close() es.shutdown() } val rdd = sc.parallelize(Array(1,2,3,4,5,6,7,8,9,10)) val v = rdd.sum() println("Value="+v) sc.stop() } } -- Original Message -- From: "yuzhihong"; Date: 2015

Re: How to use spark to access HBase with Security enabled

2015-05-21 Thread donhoff_h
any Thanks! -- Original Message -- From: "yuzhihong"; Date: 2015-05-22 5:29; To: "donhoff_h"<165612...@qq.com>; Cc: "Bill Q"; "user"; Subject: Re: How to use spark to access HBase with Security enabled Are the worker nodes colocated with HBase region servers

Re: How to use spark to access HBase with Security enabled

2015-05-21 Thread donhoff_h
urity.auth.login.config=/home/spark/spark-hbase.jaas -Djava.security.krb5.conf=/etc/krb5.conf" /home/spark/myApps/TestHBase.jar -- Original Message -- From: "Bill Q"; Date: 2015-05-20 10:13; To: "donhoff_h"<165612...@qq.com>; Cc: "yuzhihong"; "user

How to set HBaseConfiguration in Spark

2015-05-20 Thread donhoff_h
Hi, all. I wrote a program to get an HBaseConfiguration object in Spark. But after I printed the content of this hbase-conf object, I found the values were wrong. For example, the property "hbase.zookeeper.quorum" should be "bgdt01.dev.hrb,bgdt02.dev.hrb,bgdt03.hrb", but the printed value is "localhost"
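
Seeing "localhost" here usually means HBaseConfiguration.create() could not find hbase-site.xml on the classpath and fell back to defaults. A hedged sketch of the two usual fixes (the quorum hosts are the ones quoted in the post; everything else is illustrative):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration

// HBaseConfiguration.create() reads hbase-default.xml and hbase-site.xml
// from the classpath; if hbase-site.xml is missing, properties fall back
// to defaults such as hbase.zookeeper.quorum = "localhost".
val hbaseConf = HBaseConfiguration.create()

// Fix 1: put hbase-site.xml on the driver/executor classpath
// (e.g. via spark-submit --files or extraClassPath settings).
// Fix 2: set the properties explicitly in code:
hbaseConf.set("hbase.zookeeper.quorum",
  "bgdt01.dev.hrb,bgdt02.dev.hrb,bgdt03.hrb")

println(hbaseConf.get("hbase.zookeeper.quorum"))
```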

Re: How to use spark to access HBase with Security enabled

2015-05-19 Thread donhoff_h
a secured HBase in a spark program which uses the API "newAPIHadoopRDD" to get information from HBase? Many Thanks! -- Original Message -- From: "yuzhihong"; Date: 2015-05-19 9:54; To: "donhoff_h"<165612...@qq.co

Re: How to use spark to access HBase with Security enabled

2015-05-19 Thread donhoff_h
solve this problem. Can anybody help me to figure it out? Many Thanks! -- Original Message -- From: "yuzhihong"; Date: 2015-05-19 7:55; To: "donhoff_h"<165612...@qq.com>; Cc: "user"; Subject: Re: How to use

How to use spark to access HBase with Security enabled

2015-05-19 Thread donhoff_h
Hi, experts. I ran the "HBaseTest" program, which is an example from the Apache Spark source code, to learn how to use spark to access HBase. But I met the following exception: Exception in thread "main" org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptio
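
For a Kerberos-secured HBase, the driver must authenticate before newAPIHadoopRDD is created. A minimal sketch under stated assumptions: the principal, keytab path, and table name below are placeholders, not details from this thread, and on YARN the executors additionally need the credentials shipped to them (e.g. via spark-submit --principal/--keytab on later Spark versions).

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.{SparkConf, SparkContext}

// Log in from a keytab on the driver before touching HBase.
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set(TableInputFormat.INPUT_TABLE, "test_table") // hypothetical table
UserGroupInformation.setConfiguration(hbaseConf)
UserGroupInformation.loginUserFromKeytab(
  "spark@EXAMPLE.COM",            // placeholder principal
  "/home/spark/spark.keytab")     // placeholder keytab path

val sc = new SparkContext(new SparkConf().setAppName("SecureHBaseTest"))
val rdd = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat],
  classOf[ImmutableBytesWritable], classOf[Result])
println(rdd.count())
sc.stop()
```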

Re: Does NullWritable can not be used in Spark?

2015-05-10 Thread donhoff_h
From: "yuzhihong"; Send time: Sunday, May 10, 2015 10:44 PM; To: "donhoff_h"<165612...@qq.com>; Cc: "user"; Subject: Re: Does NullWritable can not be used in Spark? Looking at ./core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala :

Does NullWritable can not be used in Spark?

2015-05-09 Thread donhoff_h
Hi, experts. I wrote a spark program to write a sequence file. I found that if I used NullWritable as the key class of the SequenceFile, the program reported exceptions. But if I used BytesWritable or Text as the key class, the program did not report the exceptions. Does spark not support
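
One pattern that is generally expected to work: NullWritable has no public constructor, so the singleton NullWritable.get() must be used for the key, and the key/value classes supplied explicitly when reading back. A hedged sketch (paths and data are illustrative, not from the thread):

```scala
import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setAppName("NullWritableTest").setMaster("local[2]"))

// Write a sequence file whose key class is NullWritable.
// Always obtain the key via NullWritable.get(), never via a constructor.
val rdd = sc.parallelize(Seq("a", "b", "c"))
  .map(v => (NullWritable.get(), new Text(v)))
rdd.saveAsSequenceFile("/tmp/nullwritable-test") // illustrative output path

// Read it back, naming the key/value classes explicitly.
val back = sc.sequenceFile("/tmp/nullwritable-test",
  classOf[NullWritable], classOf[Text]).map(_._2.toString)
println(back.collect().sorted.mkString(","))
sc.stop()
```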

Meet Exception when learning Broadcast Variables

2015-04-21 Thread donhoff_h
Hi, experts. I wrote a very small program to learn how to use Broadcast Variables, but met an exception. The program and the exception are listed as follows. Could anyone help me to solve this problem? Thanks! **My Program is as follows** object TestBroadcast02 {
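
The original TestBroadcast02 source is truncated here, so a minimal broadcast-variable sketch is given instead. A frequent cause of exceptions in such programs is capturing the SparkContext (or another non-serializable object) inside the closure; only the broadcast handle should be captured, and it is read via .value on the executors. The data below is illustrative.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object TestBroadcast {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("TestBroadcast").setMaster("local[2]"))
    // Ship a small read-only lookup table to every executor once.
    val lookup = sc.broadcast(Map("a" -> 1, "b" -> 2))
    val total = sc.parallelize(Seq("a", "b", "a"))
      .map(k => lookup.value.getOrElse(k, 0)) // read via .value in the closure
      .reduce(_ + _)
    println(total)
    sc.stop()
  }
}
```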

Meet weird exception when studying RDD caching

2015-04-20 Thread donhoff_h
Hi, I am studying the RDD caching function and wrote a small program to verify it. I run the program in a Spark 1.3.0 environment on a Yarn cluster. But I meet a weird exception. It isn't always generated in the log; only sometimes do I see this exception. And it does not affect the output of
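
The verification program itself is not preserved in this snippet; a minimal caching sketch of the kind being described might look like the following (entirely illustrative). The first action materializes the cache; subsequent actions read from it, which can be confirmed in the web UI's Storage tab.

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setAppName("CacheTest").setMaster("local[2]"))

// cache() marks the RDD for in-memory storage; nothing happens until an action runs.
val data = sc.parallelize(1 to 1000000).map(_ * 2).cache()

println(data.count()) // first action: computes the RDD and populates the cache
println(data.count()) // second action: served from the cached partitions
sc.stop()
```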

Can not get executor's Log from Spark's History Server

2015-04-07 Thread donhoff_h
Hi, Experts. I run my Spark cluster on Yarn. I used to get executors' logs from Spark's History Server. But after I started my Hadoop JobHistory server and configured aggregation of Hadoop job logs to an HDFS directory, I found that I could not get spark's executors' logs any more. Is

Serialization Problem in Spark Program

2015-03-25 Thread donhoff_h
Hi, experts. I wrote a very simple spark program to test the Kryo serialization function. The code is as follows: object TestKryoSerialization { def main(args: Array[String]) { val conf = new SparkConf() conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer") c
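
The program is cut off above; a hedged sketch of how such a Kryo test is commonly completed (the Person case class is illustrative): enable Kryo, optionally turn on registrationRequired so unregistered classes fail fast, and register the application's own classes plus their array forms.

```scala
import org.apache.spark.{SparkConf, SparkContext}

case class Person(name: String, age: Int) // illustrative user class

val conf = new SparkConf()
  .setAppName("TestKryoSerialization")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Fail fast on any class that was not registered (useful while testing).
  .set("spark.kryo.registrationRequired", "true")
  .registerKryoClasses(Array(classOf[Person], classOf[Array[Person]]))
val sc = new SparkContext(conf)
```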