Which Maven dependency do I need for that, then?
http://www.cloudera.com/content/cloudera/en/documentation/core/v5-2-x/topics/cdh_vd_cdh5_maven_repo.html
On 02.06.2015 at 16:04, Yana Kadiyska wrote:
Can you run it using spark-submit? What is happening is that you are
running a plain Java program -- you've wrapped spark-core in your fat
jar, but at runtime you likely need the whole Spark system in order to
run your application. I would mark spark-core as provided (so you
don't wrap it in your fat jar) and run via spark-submit. If you
insist on running via "java" for whatever reason, look at the runtime
classpath that spark-submit sets up and make sure you include all of
those jars when you run your app.
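Roughly, the change I'm suggesting would look like this (a sketch only,
not tested against your project; the --master value below is just a
placeholder for whatever your environment actually uses):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.1.0-cdh5.2.5</version>
      <!-- provided: spark-submit supplies Spark at runtime,
           so it stays out of the fat jar -->
      <scope>provided</scope>
    </dependency>

    # launch through spark-submit instead of plain java
    # (--master local[*] is a placeholder; point it at your real cluster)
    spark-submit \
      --class mgm.tp.bigdata.ma_spark.SparkMain \
      --master local[*] \
      ma-spark-0.0.1-SNAPSHOT-jar-with-dependencies.jar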
On Tue, Jun 2, 2015 at 9:57 AM, Pa Rö <paul.roewer1...@googlemail.com> wrote:
Okay, but how can I compile my app so that it runs without
-Dconfig.file=alt_reference1.conf?
2015-06-02 15:43 GMT+02:00 Yana Kadiyska <yana.kadiy...@gmail.com>:
This looks like your app is not finding your Typesafe config.
The config usually has to be placed in a particular folder under
your app to be picked up correctly. If it's in a non-standard
location, you can pass -Dconfig.file=alt_reference1.conf to
java to tell it where to look. If this is a config that belongs
to Spark and not to your app, I'd recommend running your jar via
spark-submit (that should work) and dumping out the
classpath/variables that spark-submit sets up...
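For example, assuming alt_reference1.conf sits in the directory you
launch from (adjust the path to wherever your file actually lives):

    # path to the config file is illustrative
    java -Dconfig.file=alt_reference1.conf \
      -jar ma-spark-0.0.1-SNAPSHOT-jar-with-dependencies.jar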
On Tue, Jun 2, 2015 at 6:58 AM, Pa Rö <paul.roewer1...@googlemail.com> wrote:
Hello community,
I have built a jar file from my Spark app with Maven (mvn
clean compile assembly:single) and the following POM file:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>mgm.tp.bigdata</groupId>
  <artifactId>ma-spark</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>ma-spark</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.1.0-cdh5.2.5</version>
    </dependency>
    <dependency>
      <groupId>mgm.tp.bigdata</groupId>
      <artifactId>ma-commons</artifactId>
      <version>0.0.1-SNAPSHOT</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <archive>
            <manifest>
              <mainClass>mgm.tp.bigdata.ma_spark.SparkMain</mainClass>
            </manifest>
          </archive>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
If I run my app with java -jar
ma-spark-0.0.1-SNAPSHOT-jar-with-dependencies.jar in the
terminal, I get the following error message:
proewer@proewer-VirtualBox:~/Schreibtisch$ java -jar ma-spark-0.0.1-SNAPSHOT-jar-with-dependencies.jar
2015-Jun-02 12:53:36,348 [main] org.apache.spark.util.Utils WARN - Your hostname, proewer-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
2015-Jun-02 12:53:36,350 [main] org.apache.spark.util.Utils WARN - Set SPARK_LOCAL_IP if you need to bind to another address
2015-Jun-02 12:53:36,401 [main] org.apache.spark.SecurityManager INFO - Changing view acls to: proewer
2015-Jun-02 12:53:36,402 [main] org.apache.spark.SecurityManager INFO - Changing modify acls to: proewer
2015-Jun-02 12:53:36,403 [main] org.apache.spark.SecurityManager INFO - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(proewer); users with modify permissions: Set(proewer)
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
    at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
    at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:136)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
    at mgm.tp.bigdata.ma_spark.SparkMain.main(SparkMain.java:38)
What am I doing wrong?

Best regards,
Paul