Hi all,
I got a strange error:
bin/spark-shell --deploy-mode client
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
22/03/21 13:51:39 WARN util.Utils: spark.executor.instances less than
spark.dynamicAllocation.
I am seeing this issue when running Spark 3.0.2 on YARN.
Has a resolution been found for this? (I recently upgraded from Spark 2.x
on YARN.)
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
Hi, does anyone know how to fix the error below?
java.lang.NoSuchMethodError:
org.apache.spark.sql.catalyst.catalog.CatalogTable.copy(Lorg/apache/spark/sql/catalyst/TableIdentifier;Lorg/apache/spark/sql/catalyst/catalog/CatalogTableType;Lorg/apache/spark/sql/catalyst/catalog/CatalogStorageFormat;Lorg/apa
Could you check the Scala version of your Kafka?
Best Regards,
Shixiong Zhu
2015-12-18 2:31 GMT-08:00 Christos Mantas :
Thank you, Luciano, Shixiong.
I thought the "_2.11" part referred to the Kafka version - an
unfortunate coincidence.
Indeed
spark-submit --jars spark-streaming-kafka-assembly_2.10-1.5.2.jar
my_kafka_streaming_wordcount.py
OR
spark-submit --packages
org.apache.spark:spark-stream
Unless you built your own Spark distribution with Scala 2.11, you want to
use the 2.10 dependency:
--packages org.apache.spark:spark-streaming-kafka_2.10:1.5.2
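To make the moving parts explicit, here is a small sketch that assembles the coordinate from the Scala binary version and the Spark version; both variables are stand-ins you would adjust to your own build, and the script name is the one from this thread:

```shell
# Sketch: state the Scala suffix and Spark version once, then build the
# --packages coordinate from them, so the two cannot silently disagree.
SCALA_BIN="2.10"       # Scala binary version of your Spark build (assumption)
SPARK_VERSION="1.5.2"  # must match the Spark you run against

PKG="org.apache.spark:spark-streaming-kafka_${SCALA_BIN}:${SPARK_VERSION}"

# Printed rather than executed, so the command can be inspected first.
echo "spark-submit --packages ${PKG} my_kafka_streaming_wordcount.py"
```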
On Thu, Dec 17, 2015 at 10:10 AM, Christos Mantas wrote:
What's the Scala version of your Spark? Is it 2.10?
Best Regards,
Shixiong Zhu
2015-12-17 10:10 GMT-08:00 Christos Mantas :
Hello,
I am trying to set up a simple example with Spark Streaming (Python) and
Kafka on a single machine deployment.
My Kafka broker/server is also on the same machine (localhost:1281) and
I am using Spark Version: spark-1.5.2-bin-hadoop2.6
Python code
...
ssc = StreamingContext(sc, 1)
Would rebuilding Spark help?
From: Fengdong Yu
Sent: Monday, December 7, 2015 10:31 PM
To: Sunil Tripathy
Cc: user@spark.apache.org
Subject: Re: NoSuchMethodError:
com.fasterxml.jackson.databind.ObjectMapper.enable
Can you try like this in your sbt:

val spark_version = "1.5.2"
val excludeServletApi = ExclusionRule(organization = "javax.servlet", artifact = "servlet-api")
val excludeEclipseJetty = ExclusionRule(organization = "org.eclipse.jetty")
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark
I am getting the following exception when I use spark-submit to submit a spark
streaming job.
Exception in thread "main" java.lang.NoSuchMethodError:
com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
Also, make sure your Scala version is 2.11 for your build.
> On Nov 16, 2015, at 3:43 PM, Fengdong Yu wrote:
Ignore my inputs; I think HiveSpark.java is where your main method is located.
Can you paste the whole pom.xml and your code?
> On Nov 16, 2015, at 3:39 PM, Fengdong Yu wrote:
The code looks good. Can you check the ‘import’s in your code? Because it
calls ‘honeywell.test’.
> On Nov 16, 2015, at 3:02 PM, Yogesh Vyas wrote:
I am trying to just read a JSON file in SQLContext and print the
dataframe as follows:
SparkConf conf = new SparkConf().setMaster("local").setAppName("AppName");
JavaSparkContext sc = new JavaSparkContext(conf);
SQLContext sqlContext = new SQLContext(sc);
DataFrame df = sqlC
What’s your SQL?
> On Nov 16, 2015, at 3:02 PM, Yogesh Vyas wrote:
Hi,
While I am trying to read a JSON file using SQLContext, I get the
following error:
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.sql.SQLContext.<init>(Lorg/apache/spark/api/java/JavaSparkContext;)V
at com.honeywell.test.testhive.HiveSpark.main(HiveSpark.java:15)
For some reason you have two different versions of the Spark jars on your
classpath.
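One quick way to spot such duplicates (a sketch, not from the original thread; `JAR_DIR` is a placeholder for wherever your application's jars end up) is to strip the version suffix from each jar file name and list artifacts that occur more than once:

```shell
# Sketch: strip the trailing "-<version>.jar" from every jar name and
# report artifacts that appear more than once, i.e. in two versions.
JAR_DIR="${JAR_DIR:-.}"   # assumption: point this at your lib/ directory

ls "$JAR_DIR" \
  | sed -E 's/-[0-9][0-9A-Za-z.-]*\.jar$//' \
  | sort \
  | uniq -d
```

For example, `spark-core_2.10-1.0.0.jar` sitting next to `spark-core_2.10-1.1.0.jar` would show up as a single `spark-core_2.10` line.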
Thanks
Best Regards
On Tue, Aug 4, 2015 at 12:37 PM, Deepesh Maheshwari <
deepesh.maheshwar...@gmail.com> wrote:
Hi,
I am trying to read data from Kafka and process it using Spark.
I have attached my source code and error log.
For integrating Kafka,
I have added this dependency in pom.xml:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.3.0</version>
</dependency>
org.apache.spark
Has anyone met the same problem as me?
2015-06-12 23:40 GMT+08:00 Tao Li :
Hi all:
I compiled the new Spark 1.4.0 version today. But when I run the WordCount
demo, it throws java.lang.NoSuchMethodError:
com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer.
I found the default "fasterxml.jackson.version" is 2.4.4. It
I am trying to process events from a Flume avro sink, but I keep getting this
same error. I am just running it locally using Flume's avro-client, with the
following commands to start the job and client. It seems like it should be
a configuration problem since it's a NoSuchMethodError, but
…"match":{"schemaName":"SuperControllerRequest.json"}}}"""
val searched = sqlCtx.esRDD(esResource, query)  // <--- PROBLEM HERE
println(searched.schema)

I can assemble this with sbt assembly, after much work in getting SBT to
work. However, at RUN TIME, I have the following output, which complains my
sqlCtx.esRDD() has a
NoSuchMethodError org.apache.spark.sql.ca
val sparkStreamingFromKafka = "org.apache.spark" % "spark-streaming-kafka_2.10"
% "0.9.1" excludeAll(
-----Original Message-----
From: Sean Owen [mailto:so...@cloudera.com]
Sent: January-22-15 11:39 AM
To: Adrian Mocanu
Cc: u...@spark.incubator.apache.org
Subject: Re: Exceptio
NoSuchMethodError almost always means that you have compiled some code
against one version of a library but are running against another. I
wonder if you are including different versions of Spark in your
project, or running against a cluster on an older version?
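The comparison is mechanical enough to script. A minimal sketch, with both version strings as made-up stand-ins (in practice one would come from your build file and the other from the cluster's `spark-submit --version` output):

```shell
# Sketch: warn when the version compiled against and the version on the
# cluster disagree on major.minor, which is where binary breaks happen.
COMPILE_VERSION="1.1.0"   # placeholder: version in your build file
CLUSTER_VERSION="1.0.0"   # placeholder: version the cluster runs

if [ "${COMPILE_VERSION%.*}" != "${CLUSTER_VERSION%.*}" ]; then
  echo "version skew: compiled against ${COMPILE_VERSION}, running on ${CLUSTER_VERSION}" >&2
fi
```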
On Thu, Jan 22, 2015 at 3:57 PM
Hi
I get this exception when I run a Spark test case on my local machine:
An exception or error caused a run to abort:
org.apache.spark.streaming.StreamingContext$.toPairDStreamFunctions(Lorg/apache/spark/streaming/dstream/DStream;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Orderi
Good luck. Let me know if I can assist you further.
Regards
-Pankaj
Linkedin
https://www.linkedin.com/profile/view?id=171566646
Skype
pankaj.narang
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration
Exception in thread "main" java.lang.NoSuchMethodError:
com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
at akka.stream.StreamSubscriptionTimeoutSettings$.apply(FlowMaterializer.scala:256)

I think there is a version mismatch in the jars you use at runtime.
If you need more help, add me on Skype: pankaj.narang
---Pankaj
o avoid adding all these jars with
>>>> --jars?
>>>>
>>>> *My build.sbt file*
>>>>
>>>> name := "Simple Project"
>>>>
>>>> version := "1.0"
>>>>
>>>> scalaVersion := &qu
andra-driver-core" %
>>> "2.1.3"
>>>
>>> libraryDependencies += "com.datastax.spark" %%
>>> "spark-cassandra-connector" %
>>> "1.1.0" withSources() withJavadoc()
>>>
>>> libraryDependencies += "org.apa
>> "cassandra-thrift" %
>> "2.0.5"
>>
>> libraryDependencies += "joda-time" % "joda-time" % "2.6"
>>
>>
>>
>> libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.8"
% "2.4.0"
>
> libraryDependencies += "ch.qos.logback"% "logback-classic" % "1.1.2"
>
> libraryDependencies += "org.mockito" % "mockito-all" % "1.10.17"
>
> libraryDe
g.mockito" % "mockito-all" % "1.10.17"
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.3"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.5"
libraryDependenci
<dependency>
  <groupId>com.msiops.footing</groupId>
  <artifactId>footing-tuple</artifactId>
  <version>0.2</version>
</dependency>
<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-core</artifactId>
  <version>2.1.3</version>
</dependency>
Thanks and Regards,
Md. Aiman Sarosh.
Accenture Services Pvt. Ltd.
Mob #: (+91) - 9836112841.
From: Gerard Maas
Sent: Tuesday, December 9, 2014 4:39
> The code, the error-log and POM.xml dependencies are listed below:
> Please help me find the rea
Hi,
I am intending to save the streaming data from Kafka into Cassandra, using
Spark Streaming.
But there seems to be a problem with the line
javaFunctions(data).writerBuilder("testkeyspace", "test_table",
mapToRow(TestTable.class)).saveToCassandra();
I am getting NoSuchMethod
Thanks Helena. I think I will wait for the new release and try it.
Again thanks,
/Shahab
On Tue, Nov 11, 2014 at 3:41 PM, Helena Edelson wrote:
Hi,
It looks like you are building from master
(spark-cassandra-connector-assembly-1.2.0).
- Append this to your com.google.guava declaration: % "provided"
- Be sure your version of the connector dependency is the same as the assembly
build. For instance, if you are using 1.1.0-beta1, build your
Hi,
I have a Spark application which uses the Cassandra connector
"spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar" to load
data from Cassandra into Spark.
Everything works fine in local mode, when I run it in my IDE. But when I
submit the application to be executed in a standalone Spark server,
wing order -
a) cassandra-driver-core-2.1.0.jar
b) cassandra-thrift-2.1.0.jar
c) libthrift-0.9.0.jar
d) spark-cassandra-connector_2.10-1.1.0-alpha3.jar
It resolved our issue.
Sasi
"--jars" option for spark-submit. However, we are stuck with
> NoSuchMethodError: cassandra.thrift.ITransportFactory.openTransport()
>
> Find enclosed image for the complete error.
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n17338/Error.png>
>
> We included follow
led it. Developed small sample code for
> connecting to cassandra using
>
> https://github.com/datastax/spark-cassandra-connector/blob/b1.0/doc/0_quick_start.md
> link. During "spark-submit", we faced some JARs related issue and we
> resolved them using "--jars"
For posterity's sake, I solved this. The problem was that the Cloudera cluster
I was submitting to was running Spark 1.0, and I was compiling against the
latest 1.1 release. Downgrading my compile to 1.0 got me past this.
On Tue, Oct 14, 2014 at 6:08 PM, Michael Campbell <
michael.campb...@gmail.com> wro
How did you resolve it?
On Tue, Jul 15, 2014 at 3:50 AM, SK wrote:
> The problem is resolved. Thanks.
Hey all, I'm trying a very basic Spark SQL job and, apologies, as I'm new to
a lot of this, but I'm getting this failure:
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.sql.SchemaRDD.take(I)[Lorg/apache/spark/sql/catalyst/expressions/Row;
I've tried a variety of uber-jar c
ed
by this...
Regards,
Jari
The problem is resolved. Thanks.
Have you upgraded the cluster where you are running this to 1.0.1 as
well? A NoSuchMethodError almost always means that the class files
available at runtime are different from those that were there when you
compiled your program.
On Mon, Jul 14, 2014 at 7:06 PM, SK wrote:
> Hi,
>
> I
r=> r.trim != "")
val data = sqlc.jsonRDD(jrdd)
data.printSchema()
}
}
ars to be converting the incoming Map[String, Integer] to a
> Map[String, Integer]. I'm not seeing the purpose of it... help? (I'm a
> bit of a scala newbie.)
>
Try deleting the .ivy2 directory in your home and then doing a sbt clean
assembly; I guess that would solve this issue.
Thanks
Best Regards
On Thu, Jun 26, 2014 at 3:10 AM, Robert James
wrote:
In case anyone else is having this problem: deleting all of ivy's cache,
then doing a sbt clean, then recompiling everything, repackaging, and
reassembling seems to have solved the problem. (From the sbt docs, it
seems that having to delete ivy's cache means a bug in sbt.)
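As a script, that recipe looks roughly like this (a sketch; `IVY_HOME` is parameterized only so it can be pointed at a scratch directory first, and the sbt step is left as a comment since it must run inside your project):

```shell
# Sketch: wipe ivy's dependency cache, then rebuild from scratch.
IVY_HOME="${IVY_HOME:-$HOME/.ivy2}"   # ivy's default cache location

rm -rf "${IVY_HOME}/cache"

# Then, from your project directory:
#   sbt clean assembly
```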
On 6/25/14, Robert James
Thanks Paul. I'm unable to follow the discussion on SPARK-2075. But
how would you recommend I test or follow up on that? Is there a
workaround?
On 6/25/14, Paul Brown wrote:
Hi, Robert --
I wonder if this is an instance of SPARK-2075:
https://issues.apache.org/jira/browse/SPARK-2075
-- Paul
—
p...@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
On Wed, Jun 25, 2014 at 6:28 AM, Robert James
wrote:
On 6/24/14, Peng Cheng wrote:
> I got 'NoSuchFieldError', which is of the same type. It's definitely a
> dependency jar conflict. The Spark driver will load its own jars, which in
> recent versions get many dependencies that are 1-2 years old. And if your
> newer version dependency is in the same packag
happens try re-importing the project)
My app works fine under Spark 0.9. I just tried upgrading to Spark
1.0, by downloading the Spark distro to a dir, changing the sbt file,
and running sbt assembly, but now I get NoSuchMethodErrors when trying
to use spark-submit.
I copied in the SimpleApp example from
http://spark.apache.org/docs/
…the kafka stream, only checks the interface.
Currently it is possible to use Kafka to Spark Streaming pipeline from Java
only with the default String message decoders, which makes this tool almost
useless (unless you are a great JSON fan).
On Wed, May 14, 2014 at 5:46 AM, wxhsdp wrote:
>>> Hi, DB
I've added breeze jars to workers using sc.addJar().
> breeze jars include:
> breeze-natives_2.10-0.7.jar
> breeze-macros_2.10-0.3.jar
> breeze-macros_2.10-0.3.1.jar
> breeze_2.10-0.8-SNAPSHOT.jar
> breeze_2.10-0.7.jar
>
> almost all t
Finally I fixed it. The previous failure was caused by a lack of some jars.
I passed the classpath in local mode to the workers by using "show
compile:dependencyClasspath",
and it works!
Still NoSuchMethodError:
breeze.linalg.DenseMatrix
From the executor stderr, you can see the executor successfully fetches
these jars. What's wrong with my method? Thank you!
14/05/14 20:36:02 INFO Executor: Fetching
http://192.168.0.106:42883/jars/breeze-natives_2.10-0.7.jar with timestamp
140007095
rn to standalone mode, sbt "run spark://127.0.0.1:7077
> ...",
> >> error occurs
> >>
> >> 14/05/04 18:56:29 WARN scheduler.TaskSetManager: Loss was due to
> >> java.lang.NoSuchMethodError
> >> java.lang.NoSuchMethodError:
> >>
>
dError:
>>
>> breeze.linalg.DenseMatrix$.implOpMulMatrix_DMD_DMD_eq_DMD()Lbreeze/linalg/operators/DenseMatrixMultiplyStuff$implOpMulMatrix_DMD_DMD_eq_DMD$;
>>
In my opinion, everything needed is packaged into the jar file, isn't it?
And has anyone used breeze before? Is it good for matrix operations?
> scalaVersion := "2.10.4"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
>
> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>
> is there something that needs to be modified?
I fixed it.
I made my sbt project depend on
spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
and it works.
I am seeing the following exception from a very basic test project when it
runs on spark local.
java.lang.NoSuchMethodError:
org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;
The project is built with Java 1.6, Scala 2.10.3 and Spark 0.9.1.
Please remove me from the mailing list.
-----Original Message-----
From: Deepak Nulu [mailto:deepakn...@gmail.com]
Sent: March 7, 2014 7:45
To: u...@spark.incubator.apache.org
Subject: Re: NoSuchMethodError - Akka - Props
I see the same error. I am trying a standalone example integrated into a Play
Framework v2.2.2
does have such a constructor.
>
> What is going wrong? Can someone help solve this mystery and help with my
> misery? Basically stuck for last 2 days - as I am a Java Guy and would like
> to develop downstream code in Java
ine
now.
-deepak