constrains me.
------------------ Original Message ------------------
From: "Saisai Shao";;
Send time: Friday, December 25, 2015 4:43
To: "donhoff_h"<165612...@qq.com>;
Cc: "user";
Subject: Re: Job Error: Actor not found for: ActorSelection[Anchor(akka.
virtual machine which has 2 cores and 4 GB of memory, running in yarn-client mode.
------------------ Original Message ------------------
From: "Saisai Shao";;
Send time: Friday, December 25, 2015 4:15
To: "donhoff_h"<165612...@qq.com>;
Cc: "user";
Subject: Re: Job Error: Actor not found for: ActorSelection[Anchor(akka.
Hi, folks
I wrote some spark jobs and they ran successfully when I ran them
one by one. But if I ran them concurrently, for example with 12 jobs running in
parallel, I got the following error. Could anybody tell me what causes this and
how to solve it? Many Thanks!
Exception in thread "main"
Hi, all
I wrote a spark program which uses Kryo serialization. When I count an RDD
of type RDD[(String,String)], it reports an exception like the following:
* Class is not registered: org.apache.spark.util.collection.CompactBuffer[]
* Note: To register this class use:
kryo.register(
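For this particular error, which typically appears when spark.kryo.registrationRequired is enabled, a workaround often suggested on the list is to register CompactBuffer and its array form by name, since the class is private[spark] and cannot be referenced with classOf in user code. A minimal sketch, assuming Spark 1.2+ for registerKryoClasses:

import org.apache.spark.SparkConf

// Sketch: register Spark's internal CompactBuffer with Kryo by name.
// "[L...;" is the JVM name of the array class named in the error message.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.registerKryoClasses(Array(
  Class.forName("org.apache.spark.util.collection.CompactBuffer"),
  Class.forName("[Lorg.apache.spark.util.collection.CompactBuffer;")
))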
er contain the principal I set up with the kinit
command?
Many Thanks!
------------------ Original Message ------------------
From: "yuzhihong";;
Send time: Friday, May 22, 2015 7:25
To: "donhoff_h"<165612...@qq.com>;
Cc: "Bill Q"; "user";
Subject: Re: How to use spark to access HBase with Security enabled
tring)
    } finally {
      tbl.close()   // close the HBase table
      conn.close()  // close the HBase connection
      es.shutdown() // shut down the executor service (declared in the truncated part above)
    }
    val rdd = sc.parallelize(Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 10))
    val v = rdd.sum()
    println("Value=" + v)
    sc.stop()
  }
}
------------------ Original Message ------------------
From: "yuzhihong";;
Send time: 2015
Many Thanks!
------------------ Original Message ------------------
From: "yuzhihong";;
Send time: Friday, May 22, 2015 5:29
To: "donhoff_h"<165612...@qq.com>;
Cc: "Bill Q"; "user";
Subject: Re: How to use spark to access HBase with Security enabled
Are the worker nodes colocated with HBase region servers?
-Djava.security.auth.login.config=/home/spark/spark-hbase.jaas
-Djava.security.krb5.conf=/etc/krb5.conf" /home/spark/myApps/TestHBase.jar
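For reference, the JAAS file named on that command line usually contains a Client login section pointing at a keytab; a hedged sketch of what /home/spark/spark-hbase.jaas might look like (the keytab path and principal below are placeholders, not taken from this thread):

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/home/spark/spark.keytab"
  principal="spark@EXAMPLE.COM";
};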
------------------ Original Message ------------------
From: "Bill Q";;
Send time: Wednesday, May 20, 2015 10:13
To: "donhoff_h"<165612...@qq.com>;
Cc: "yuzhihong"; "user";
Hi, all
I wrote a program to get an HBaseConfiguration object in Spark. But after I
printed the content of this hbase-conf object, I found the values were wrong.
For example, the property "hbase.zookeeper.quorum" should be
"bgdt01.dev.hrb,bgdt02.dev.hrb,bgdt03.hrb", but the printed value is
"localhost".
a secured HBase in a spark
program which uses the API "newAPIHadoopRDD" to get information from HBase?
Many Thanks!
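As background, HBaseConfiguration.create() reads hbase-site.xml from the classpath and silently falls back to defaults such as hbase.zookeeper.quorum=localhost when the file is missing, which matches the symptom above. A minimal sketch of the newAPIHadoopRDD pattern (the table name is hypothetical):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object HBaseReadSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("HBaseReadSketch"))
    // Picks up hbase-site.xml from the classpath; without it the quorum
    // defaults to localhost.
    val hbaseConf = HBaseConfiguration.create()
    // The quorum can also be set explicitly as a fallback:
    // hbaseConf.set("hbase.zookeeper.quorum", "bgdt01.dev.hrb,bgdt02.dev.hrb,bgdt03.hrb")
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "test_table") // hypothetical name
    val rdd = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat],
      classOf[ImmutableBytesWritable], classOf[Result])
    println("rows = " + rdd.count())
    sc.stop()
  }
}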
------------------ Original Message ------------------
From: "yuzhihong";;
Send time: Tuesday, May 19, 2015 9:54
To: "donhoff_h"<165612...@qq.com>;
solve this problem. Can anybody help me to
figure it out? Many Thanks!
------------------ Original Message ------------------
From: "yuzhihong";;
Send time: Tuesday, May 19, 2015 7:55
To: "donhoff_h"<165612...@qq.com>;
Cc: "user";
Subject: Re: How to use spark to access HBase with Security enabled
Hi, experts.
I ran the "HBaseTest" program which is an example from the Apache Spark source
code to learn how to use spark to access HBase. But I met the following
exception:
Exception in thread "main"
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
attempts=36, exceptio
From: "yuzhihong";;
Send time: Sunday, May 10, 2015 10:44 PM
To: "donhoff_h"<165612...@qq.com>;
Cc: "user";
Subject: Re: Does NullWritable can not be used in Spark?
Looking at
./core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala :
Hi, experts.
I wrote a spark program to write a sequence file. I found that if I used
NullWritable as the key class of the SequenceFile, the program reported
exceptions; but if I used BytesWritable or Text as the key class, the
program did not.
Does spark not support
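For comparison, a minimal sketch of the variant the author reports working (Text keys and values; the output path is hypothetical):

import org.apache.hadoop.io.Text
import org.apache.spark.{SparkConf, SparkContext}

object SequenceFileSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("SequenceFileSketch"))
    val data = sc.parallelize(Seq("a" -> "1", "b" -> "2"))
    // Writable key/value classes are what the sequence file format expects.
    data.map { case (k, v) => (new Text(k), new Text(v)) }
      .saveAsSequenceFile("/tmp/seqfile-demo")
    // The NullWritable-keyed variant is the one the thread reports failing:
    // data.map { case (_, v) => (NullWritable.get(), new Text(v)) }
    //   .saveAsSequenceFile("/tmp/seqfile-null")
    sc.stop()
  }
}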
Hi, experts.
I wrote a very small program to learn how to use Broadcast Variables, but met
an exception. The program and the exception are listed below. Could
anyone help me to solve this problem? Thanks!
**My program is as follows**
object TestBroadcast02 {
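The listing is cut off above; a minimal, self-contained broadcast-variable sketch in the same spirit (all names are hypothetical, not the author's code):

import org.apache.spark.{SparkConf, SparkContext}

object TestBroadcastSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("TestBroadcastSketch"))
    // Ship a small lookup table to every executor once, instead of
    // serializing it into each task's closure.
    val lookup = sc.broadcast(Map(1 -> "one", 2 -> "two", 3 -> "three"))
    val described = sc.parallelize(Seq(1, 2, 3, 4))
      .map(n => lookup.value.getOrElse(n, "unknown"))
    described.collect().foreach(println)
    sc.stop()
  }
}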
Hi,
I am studying the RDD caching function and wrote a small program to verify it.
I ran the program in a Spark 1.3.0 environment on a YARN cluster. But I hit a
weird exception. It isn't always present in the log; only sometimes do I see
this exception, and it does not affect the output of
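For context, a minimal caching-verification sketch of the kind described (the input path is hypothetical):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object CacheSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("CacheSketch"))
    val rdd = sc.textFile("/tmp/input.txt").map(_.length) // hypothetical path
    rdd.persist(StorageLevel.MEMORY_ONLY) // cache() is shorthand for this level
    println(rdd.count()) // first action computes and caches the partitions
    println(rdd.count()) // second action should read from the cache
    sc.stop()
  }
}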
Hi, Experts
I run my Spark cluster on YARN. I used to get executors' logs from Spark's
history server. But after I started my Hadoop job history server and
configured log aggregation to collect Hadoop job logs into an HDFS directory,
I found that I could not get Spark executors' logs any more. Is
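For reference, once YARN log aggregation is enabled, aggregated executor logs are normally fetched through the YARN CLI rather than the Spark history server; for example (the application id is a placeholder):

yarn logs -applicationId application_1431849000000_0001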
Hi, experts
I wrote a very simple spark program to test the Kryo serialization function.
The code is as follows:
object TestKryoSerialization {
  def main(args: Array[String]) {
    val conf = new SparkConf()
    conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    c
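The listing is truncated above; a complete minimal sketch of such a test (the registered class and RDD contents are hypothetical):

import org.apache.spark.{SparkConf, SparkContext}

object TestKryoSerializationSketch {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("TestKryoSerializationSketch")
    // Switch shuffle/cache serialization from Java serialization to Kryo.
    conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    // Registering classes up front keeps Kryo from writing full class names.
    conf.registerKryoClasses(Array(classOf[Array[String]]))
    val sc = new SparkContext(conf)
    // reduceByKey forces a shuffle, which exercises the Kryo serializer.
    val rdd = sc.parallelize(Seq("a", "bb", "ccc")).map(s => (s, s.length))
    println(rdd.reduceByKey(_ + _).collect().mkString(","))
    sc.stop()
  }
}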