My guess is that Jia wants to run C++ on top of Spark. If that's the case, I'm
afraid this is not possible. Spark has support for Java, Python, Scala and R.
The best way to achieve this is to run your application in C++ and then use the
data created by that application to do the manipulation within Spark.
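As a rough illustration, here is a minimal Spark (Scala) sketch that consumes a file
written by an external C++ program; the path and plain-text format are hypothetical,
just an assumption for the example:

import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: the C++ application is assumed to have written its results
// to a plain-text file at /data/cpp_output.txt (hypothetical path and format).
object ReadCppOutput {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ReadCppOutput")
    val sc = new SparkContext(conf)

    // Load the C++ program's output as an RDD of lines
    val lines = sc.textFile("/data/cpp_output.txt")

    // Example manipulation inside Spark: count the non-empty lines
    val nonEmpty = lines.filter(_.trim.nonEmpty).count()
    println(s"Non-empty lines produced by the C++ application: $nonEmpty")

    sc.stop()
  }
}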
I tried to build Spark according to the build directions, and it failed due to
the following error:
[Link preview: Building Spark - Spark 1.5.1 Documentation]
Can someone please provide insight into why I get an access-denied error when I
do the build according to the documentation?
Ted said I have to provide credentials, but there's nothing mentioned about
that in the build documentation.
On Thursday, October 15, 2015 8:39 PM, Annabel Melongo wrote:
I was trying to build a cloned version of Spark on my local machine using the
command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package

However, I got the error:

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-shade-plugin:2.4.1:shade (default) on
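For reference, the Spark 1.5 build documentation recommends increasing Maven's
memory before building; whether that is actually the cause of the shade-plugin
failure above is only a guess. A minimal sketch of the documented invocation,
assuming a bash shell:

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package

Alternatively, the bundled build/mvn script sets these memory options
automatically.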