Thanks for the advice. In my case it turned out to be two issues:
- I needed to use Java rather than Scala to launch the process, putting the core
Scala libs on the class path.
- I needed a merge strategy of Concat for reference.conf files in my build.sbt
(see the sketches below).
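
To be concrete, the launch change just swaps scala for java and puts the Scala
runtime jar on the class path explicitly (the scala-library path below is a
placeholder for wherever it lives on your machine):

java -cp /path/to/scala-library.jar:my-fat-jar-1.0.0.jar com.foo.MyMainClass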
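
The reference.conf change is the usual sbt-assembly merge-strategy tweak; a
minimal sketch, assuming the sbt-assembly plugin (setting names differ a little
between plugin versions):

// build.sbt -- concatenate reference.conf files rather than keeping just one,
// so Akka's default config (including the actor-ref provider settings)
// survives the assembly merge
assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat
  case other =>
    val defaultStrategy = (assemblyMergeStrategy in assembly).value
    defaultStrategy(other)
}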
Regards,
Mike
> On 23 Oct 2015, at 01:00, Ted Yu wrote:
RemoteActorRefProvider is in akka-remote_2.10-2.3.11.jar:

jar tvf ~/.m2/repository/com/typesafe/akka/akka-remote_2.10/2.3.11/akka-remote_2.10-2.3.11.jar | grep RemoteActorRefProvi
  1761 Fri May 08 16:13:02 PDT 2015 akka/remote/RemoteActorRefProvider$$anonfun$5.class
  1416 Fri May 08 16:13:02 PDT
Hi,
I have a Spark driver process that I have built into a single ‘fat jar’. This
runs fine, in Cygwin, on my development machine. I can run:

scala -cp my-fat-jar-1.0.0.jar com.foo.MyMainClass

and this works fine: it submits Spark jobs, they process, all good.
However, on Linux (all Jars Spar