Got it to work... thanks a lot for the help! I started a new cluster where
Spark has YARN as a dependency. I ran the script with local[2] and it
worked (the same script did not work with Spark in standalone mode).

A follow-up question, since I have seen this question posted around the
internet: I thought I was already running in local mode, as
http://spark.apache.org/docs/1.1.1/submitting-applications.html says that
if I don't include "--deploy-mode cluster" then it will run in local mode?
I tried both of the scripts above and they gave the same result as the
script I was running before.
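For reference, these are the two flavors of invocation I am comparing
(the class name and jar path here are placeholders, not my actual app):

    # local mode: driver and executors run in one JVM with 2 worker threads
    ./bin/spark-submit --class SimpleApp --master local[2] /path/to/app.jar

    # standalone cluster: executors run on the cluster; the driver stays on
    # this machine (client deploy mode) unless --deploy-mode cluster is given
    ./bin/spark-submit --class SimpleApp --master spark://10.0.1.230:7077 /path/to/app.jar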
Also, quoting the relevant parts of the earlier thread for context:
> Make sure you verify the following:
>
> - https://github.com/apache/spark/blob/master/examples/scala-2.10/src/main/java/org/apache/spark/examples/streaming/JavaKafkaWordCount.java
>
> pom.xml
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n20...>
>
> spark-submit --class SimpleApp --master spark://10.0.1.230:7077 --jars
> $(echo /home/ec2-user/sparkApps/SimpleApp/lib/*.jar | tr ' ' ',')
> /home/ec2-use...
>
> code: KafkaJavaConsumer.txt
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n20879/KafkaJavaConsumer.txt>
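In case it is useful to anyone following along: the consumer I am running
is essentially the same pattern as the JavaKafkaWordCount example linked
above. A minimal sketch of that pattern; the class name, topic, group id,
and ZooKeeper address are placeholders, not my real settings:

    import java.util.HashMap;
    import java.util.Map;

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka.KafkaUtils;

    public class SimpleKafkaApp {
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("SimpleKafkaApp");
        // 2-second batches
        JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(2000));

        // one receiver thread reading the "test" topic (placeholder)
        Map<String, Integer> topics = new HashMap<String, Integer>();
        topics.put("test", 1);

        // receiver-based Kafka stream; connects through ZooKeeper (placeholder address)
        JavaPairReceiverInputDStream<String, String> messages =
            KafkaUtils.createStream(jssc, "localhost:2181", "simple-app-group", topics);

        // print a few (key, message) pairs from each batch
        messages.print();

        jssc.start();
        jssc.awaitTermination();
      }
    }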
> 14/12/29 05:58:05 INFO BlockGenerator: Started block pushing thread
> 14/12/29 05:58:05 INFO ReceiverSupervisorImpl: Stopping receiver with
> message: Error starting receiver 0...
> 14/12/29 05:58:05 INFO ReceiverSupervisorImpl: Deregistering receiver 0
> ^C14/12/29 05:58:05 ERROR ReceiverTracker: Deregistered receiver for stream
> 0: Error starting receiver 0 - java.lang.NoClassDefFoundError
>         at ...ies.scala:26)
>         at kafka.consumer.ConsumerConfig.<init>(ConsumerConfig.scala:94)
>         at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaI...)
>         at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
>         at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
>         at ...(SparkContext.scala:1121)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
>         at org.apache.spark.scheduler.Task.run(Task.scala:54)
>         at org.apache.sp...
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
>         at java.lang.Thread.run(Thread.java:722)
> Caused by: java.lang.ClassNotFoundException: scala.reflect.ClassManifest
>         at ...URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>         at java.lang.ClassLoa...
> ...them by adding dependencies on the pom file, but have not found such a
> solution to this error.
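Adding a note for the archives, since this thread may turn up in searches:
"ClassNotFoundException: scala.reflect.ClassManifest" usually means a Scala
version mismatch, e.g. a Kafka jar built against Scala 2.9.x running on a
Spark build that uses Scala 2.10, where that class no longer exists. If
that is the cause, the usual pom-side fix is to depend on the Kafka
integration artifact that matches the Spark build instead of a bare kafka
jar; assuming Spark 1.1.1 on Scala 2.10, something like:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka_2.10</artifactId>
      <!-- version assumed to match the Spark build in use -->
      <version>1.1.1</version>
    </dependency>

This pulls in a kafka_2.10 client, so the Scala versions line up.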
> On the Kafka side of things, I am simply typing in messages as soon as I
> start the Java app on another console. Is this okay?
>
> I have not set up an advertised host on the kafka side as I was able to
> still receive messages from other consoles by setting up a consumer to
> listen t...
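For clarity, "typing in messages" above means the stock console scripts
that ship with Kafka, along these lines; the broker/ZooKeeper addresses and
topic name are placeholders for my actual setup:

    # producer: each line typed on stdin becomes one message (0.8-era flags)
    bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

    # consumer on another console, to sanity-check delivery
    bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test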
http://apache-spark-user-list.1001560.n3.nabble.com/Setting-up-Simple-Kafka-Consumer-via-Spark-Java-app-tp20879.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.