Re: Problem with version compatibility

2015-06-25 Thread jimfcarroll
Yana and Sean, thanks for the feedback. I can get it to work a number of ways; I'm just wondering if there's a preferred means. One last question: is there a reason the deployed Spark install doesn't contain the same version of several classes as the Maven dependency? Is this intentional? Thanks…

Re: Problem with version compatibility

2015-06-25 Thread Yana Kadiyska
Jim, I do something similar to you. I mark all dependencies as provided and then make sure to drop the same version of spark-assembly into my war as I have on the executors. I don't remember if dropping it into server/lib works; I think I ran into an issue with that. Would love to know the "best practices"…
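
A minimal sketch of the 'provided' scoping Yana describes, assuming the spark-core 1.4.0 / Scala 2.10 artifact discussed elsewhere in this thread:

    <!-- Compile against Spark, but rely on the cluster's installed
         spark-assembly at runtime instead of bundling these classes. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.4.0</version>
      <scope>provided</scope>
    </dependency>

With this scope, Maven uses the jar for compilation but leaves it out of the packaged war, so the only Spark classes at runtime are the ones already on the executors.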

Re: Problem with version compatibility

2015-06-25 Thread Sean Owen
Try putting that same Mesos assembly on the classpath of your client, then, to emulate what spark-submit does. I don't think you merely want to put it on the classpath; you also want to make sure nothing else from Spark is coming from your app. In 1.4 there is the 'launcher' API, which makes programmatic access…
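
A rough sketch of the launcher API Sean mentions (org.apache.spark.launcher.SparkLauncher, new in 1.4); the paths, master URL, and class name below are hypothetical:

    import org.apache.spark.launcher.SparkLauncher

    // Launches the driver as a child process, the way spark-submit would,
    // using the local Spark install rather than classes bundled in the app.
    val spark = new SparkLauncher()
      .setSparkHome("/opt/spark-1.4.0")          // assumed install location
      .setAppResource("/path/to/my-driver.jar")  // hypothetical app jar
      .setMainClass("com.example.MyDriver")      // hypothetical main class
      .setMaster("mesos://host:5050")            // hypothetical Mesos master
      .launch()                                  // returns a java.lang.Process
    spark.waitFor()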

Re: Problem with version compatibility

2015-06-25 Thread jimfcarroll
Ah. I've avoided using spark-submit primarily because our use of Spark is as part of an analytics library that's meant to be embedded in other applications with their own lifecycle management. One of those applications is a REST app running in Tomcat, which will make the use of spark-submit difficult…

Re: Problem with version compatibility

2015-06-25 Thread Sean Owen
Yes, spark-submit adds all this for you. You don't bring Spark classes into your app. On Thu, Jun 25, 2015, 4:01 PM jimfcarroll wrote: > Hi Sean, > > I'm packaging Spark with my (standalone) driver app using Maven. Any > assemblies that are used on the Mesos workers through extending the > classpath…
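
For reference, a typical spark-submit invocation of the kind Sean is describing; the paths, class name, and master URL here are placeholders:

    # spark-submit puts the installed Spark assembly on the driver's
    # classpath and distributes the app jar to the cluster for you.
    $SPARK_HOME/bin/spark-submit \
      --master mesos://host:5050 \
      --class com.example.MyDriver \
      /path/to/my-driver.jar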

Re: Problem with version compatibility

2015-06-25 Thread jimfcarroll
Hi Sean, I'm packaging Spark with my (standalone) driver app using Maven. Any assemblies that are used on the Mesos workers, through extending the classpath or providing the jars in the driver (via the SparkConf), aren't packaged with Spark (it seems obvious that would be a mistake). I need, for example…
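
A minimal sketch of the embedded-driver setup Jim describes, assuming a Mesos master URL and an application assembly jar; both names are hypothetical:

    import org.apache.spark.{SparkConf, SparkContext}

    // The Spark classes themselves come from the distribution installed on
    // the Mesos slaves; setJars only ships the application's own code.
    val conf = new SparkConf()
      .setAppName("embedded-analytics")
      .setMaster("mesos://host:5050")
      .setJars(Seq("/path/to/my-analytics-assembly.jar"))
    val sc = new SparkContext(conf)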

Re: Problem with version compatibility

2015-06-25 Thread Sean Owen
-dev +user That all sounds fine, except: are you packaging Spark classes with your app? That's the bit I'm wondering about. You would mark it as a 'provided' dependency in Maven. On Thu, Jun 25, 2015 at 5:12 AM, jimfcarroll wrote: > Hi Sean, > > I'm running a Mesos cluster. My driver app is built…
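
One way to check Sean's question for yourself: the following lists every Spark artifact on the app's classpath along with its scope, so you can confirm they are all marked provided (the includes filter takes groupId[:artifactId] patterns):

    mvn dependency:tree -Dincludes=org.apache.spark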

Re: Problem with version compatibility

2015-06-24 Thread jimfcarroll
Hi Sean, I'm running a Mesos cluster. My driver app is built using Maven against the 1.4.0 Maven dependency. The Mesos slave machines have the Spark distribution installed from the distribution link. I have a hard time understanding how this isn't a standard app deployment, but maybe I'm missing something…

Re: Problem with version compatibility

2015-06-24 Thread Sean Owen
They are even different classes. Your problem isn't class-not-found, though. You're also really comparing different builds. You should not be including Spark code in your app. On Wed, Jun 24, 2015, 9:48 PM jimfcarroll wrote: > These jars are simply incompatible. You can see this by looking at that…

Re: Problem with version compatibility

2015-06-24 Thread jimfcarroll
These jars are simply incompatible. You can see this by looking at that class in both the Maven repo for 1.4.0 here: http://central.maven.org/maven2/org/apache/spark/spark-core_2.10/1.4.0/spark-core_2.10-1.4.0.jar as well as the spark-assembly jar inside the .tgz file you can get from the official…
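
A quick way to confirm that kind of mismatch; the class path below is a placeholder (the actual class name is cut off above), and the assembly jar name assumes the Hadoop 2.6 build of the 1.4.0 distribution:

    # Extract the same .class file from each jar and compare checksums;
    # different digests mean the two builds really do ship different bytecode.
    unzip -p spark-core_2.10-1.4.0.jar org/apache/spark/SomeClass.class | md5sum
    unzip -p spark-assembly-1.4.0-hadoop2.6.0.jar org/apache/spark/SomeClass.class | md5sum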