Hi all,
I'm trying to run the example org.apache.spark.examples.streaming.KafkaWordCount, and I
get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
    at org.apache.spark.examples.streaming.KafkaWordCount$.main(KafkaWordCount.scala:57)
    at org.apache
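(A NoClassDefFoundError for KafkaUtils$ usually means the spark-streaming-kafka artifact is not on the runtime classpath; it is not bundled in the core Spark assembly. A common fix is to pull it in via --packages when submitting. The sketch below is a guess at your setup: the artifact coordinates, Scala version, jar name, and Kafka arguments are assumptions you would need to match to your actual build.)

```shell
# Hypothetical invocation: adjust the artifact version to your Spark build
# (e.g. spark-streaming-kafka_2.11 for Spark 1.x, spark-streaming-kafka-0-8_2.11
# for Spark 2.x) and replace the jar path and Kafka arguments with your own.
spark-submit \
  --packages org.apache.spark:spark-streaming-kafka_2.11:1.6.2 \
  --class org.apache.spark.examples.streaming.KafkaWordCount \
  /path/to/spark-examples.jar \
  zk-host:2181 my-consumer-group topic1 1
```

Alternatively, running the example through bin/run-example uses the examples assembly, which already includes the Kafka dependency when Spark was built with the Kafka profile.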
Should be back and building now:
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/job/spark-master-maven-snapshots/1595/console
I see a 2.1.0-SNAPSHOT in
https://repository.apache.org/content/groups/snapshots/org/apache/spark/spark-core_2.11/,
so it looks like everything should be working.
On 23 Jul 2016, at 23:59, Mark Hamstra <m...@clearstorydata.com> wrote:
Sure, signalling well ahead of time is good, as is getting better performance
from Java 8; but does either of those interests really require dropping Java 7
support sooner rather than later?
Now, to retroactively cop
BTW - "signalling ahead of time" is called deprecating, not dropping
support...
(personally I only use JDK 8 / Scala 2.11 so I'm for it)
Ofir Manor
Co-Founder & CTO | Equalum
Mobile: +972-54-7801286 | Email: ofir.ma...@equalum.io
On Sun, Jul 24, 2016 at 1:50 AM, Koert Kuipers wrote:
> i care
I had favored this for 2.0 even, so favor it sooner than later.
The general shape of the argument is:
- supporting Java 7 is starting to pinch a little because of the extra
builds and the inevitable gap between what the PR builder (Java 7) tests
and what the later Java 8 test runs show
- requiring Java 8