How are you submitting the application? Use a standard build tool like
Maven or sbt to build your project; it will download all the dependency
jars. Then, when you submit your application (if you are using
spark-submit, use the --jars option), add the jars whose classes are
causing the ClassNotFoundException.
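For example, assuming a Spark 1.3.x / Scala 2.10 build (the class name,
master URL, and application jar below are only illustrative; substitute
whatever your own build produces):

./bin/spark-submit --class com.example.StreamingApp \
  --master spark://your-master:7077 \
  --jars spark-streaming-kafka_2.10-1.3.1.jar,kafka_2.10-0.8.1.1.jar \
  your-app.jar

Note that --jars takes a comma-separated list. Alternatively, build a
single assembly (uber) jar with sbt-assembly or the maven-shade-plugin
so the extra classes travel with the application.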
Hi All,
I configured a Kafka cluster on a single node, and I have a streaming
application which reads data from a Kafka topic using KafkaUtils. When I
execute the code in local mode from the IDE, the application runs fine.
But when I submit the same application to the Spark cluster in standalone
mode, I end up with a ClassNotFoundException.
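For context, a minimal sketch of the kind of consumer described above,
using the receiver-based KafkaUtils.createStream API from the
spark-streaming-kafka module (the topic name, consumer group, and
ZooKeeper address are placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaConsumerApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaConsumerApp")
    val ssc = new StreamingContext(conf, Seconds(10))
    // Receiver-based stream: ZooKeeper quorum, consumer group,
    // and a Map of topic -> number of receiver threads
    val messages = KafkaUtils.createStream(
      ssc, "localhost:2181", "my-group", Map("my-topic" -> 1))
    // Each element is a (key, value) pair; keep only the message values
    messages.map(_._2).count().print()
    ssc.start()
    ssc.awaitTermination()
  }
}

The spark-streaming-kafka classes are not part of the standard Spark
assembly, which is exactly why a job like this can run in local mode from
the IDE (where the build tool puts them on the classpath) and then fail
on a cluster unless they are shipped with --jars or bundled into an
assembly jar.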
+1 (non-binding)
I ran a simple test of the ALS (implicit) and LR (SGD and L-BFGS)
algorithms. Looks like there is no problem.
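For the record, a rough sketch of the kind of smoke test described above
(the data paths are placeholders and the parameter values are arbitrary):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.{LogisticRegressionWithLBFGS, LogisticRegressionWithSGD}
import org.apache.spark.mllib.recommendation.{ALS, Rating}
import org.apache.spark.mllib.util.MLUtils

object RcSmokeTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("RcSmokeTest"))

    // ALS with implicit feedback: parse "user,product,rating" lines
    val ratings = sc.textFile("data/ratings.csv").map { line =>
      val Array(u, p, r) = line.split(',')
      Rating(u.toInt, p.toInt, r.toDouble)
    }
    val alsModel = ALS.trainImplicit(ratings, 10, 10) // rank = 10, iterations = 10

    // Logistic regression, both the SGD and the L-BFGS variants
    val points = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")
    val sgdModel = LogisticRegressionWithSGD.train(points, 100) // 100 iterations
    val lbfgsModel = new LogisticRegressionWithLBFGS().run(points)

    println(s"ALS rank: ${alsModel.rank}, " +
      s"LR weights: ${sgdModel.weights.size} / ${lbfgsModel.weights.size}")
    sc.stop()
  }
}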
-- Original --
From: "Patrick Wendell";;
Date: Sun, Apr 5, 2015 08:09 AM
To: "dev@spark.apache.org";
Subject: [VOTE] Release Apache Spark 1.3.1
+1 (non-binding, of course)
1. Compiled on OS X 10.10 (Yosemite), OK. Total time: 15:04 min
mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
-Dhadoop.version=2.6.0 -Phive -DskipTests -Dscala-2.11
2. Tested pyspark and MLlib - ran jobs and compared results with 1.3.0;
pyspark works well.
+1
Tested some DataFrame functions locally on Mac OS X.
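A sketch of that kind of local DataFrame check, using the 1.3 SQLContext
API (the column names and values are arbitrary):

import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc) // sc: an existing SparkContext
import sqlContext.implicits._

// Build a small DataFrame from a local collection and exercise
// a few DataFrame functions
val df = sc.parallelize(Seq((1, "a"), (2, "b"), (3, "b"))).toDF("id", "label")
df.filter(df("id") > 1).groupBy("label").count().show()
df.select(df("id") * 2).show()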
On Sat, Apr 4, 2015 at 5:09 PM, Patrick Wendell wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 1.3.1!
>
> The tag to be voted on is v1.3.1-rc1 (commit 0dcb5d9f):
>
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
Please vote on releasing the following candidate as Apache Spark version 1.3.1!
The tag to be voted on is v1.3.1-rc1 (commit 0dcb5d9f):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
The list of fixes present in this release can be found at
Thanks Cheng. Yes, the problem is that the setup needed to run Spark inside
IntelliJ changes very frequently. It is unfortunately not simply a one-time
investment to get IntelliJ debugging working properly: the required steps
are a moving target, shifting roughly every one to two months.
Doing remote debugging is probably the more dependable approach.
I've found that, in general, it's a pain to build and run Spark inside
IntelliJ IDEA. I guess most people resort to this approach so that they can
leverage the integrated debugger to debug and/or learn Spark internals. A
more convenient way I've been using recently is the remote debugging
feature.
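For anyone who hasn't tried it, a typical setup (the port number is
arbitrary, and the class and jar names are placeholders) is to pass the
standard JDWP agent flags to the JVM running your job, e.g. through
SPARK_SUBMIT_OPTS:

export SPARK_SUBMIT_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"
./bin/spark-submit --class com.example.YourApp your-app.jar

Then create a Remote run configuration in IntelliJ pointing at
localhost:5005 and attach. With suspend=y the JVM waits for the debugger
to connect before running main(), so breakpoints set in the Spark sources
are hit without ever building Spark inside the IDE.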