Re: Spark Java example using external Jars

2014-03-24 Thread dmpour23
Hello, has anyone got any ideas? I am not quite sure whether my problem is an exact fit for Spark, since in this section of my program I am not really doing a reduce job, just a group-by and a partition. Would calling pipe on the partitioned JavaRDD do the trick? Are there any examples using …
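
[Editor's note: a minimal sketch of the group-by/partition-then-pipe idea, assuming the Spark 1.x Java API; the input format, partition count, and external script path are invented for illustration.]

import org.apache.spark.HashPartitioner;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.PairFunction;
import scala.Tuple2;

public class GroupPartitionPipe {
  public static void main(String[] args) {
    JavaSparkContext ctx = new JavaSparkContext("local[2]", "GroupPartitionPipe");

    // Hypothetical input: tab-separated lines whose first field is the key.
    JavaRDD<String> lines = ctx.textFile("input.txt");

    JavaPairRDD<String, String> keyed = lines.mapToPair(
        new PairFunction<String, String, String>() {
          public Tuple2<String, String> call(String line) {
            return new Tuple2<String, String>(line.split("\t")[0], line);
          }
        });

    // Hash-partition so all records with the same key land in one partition.
    JavaPairRDD<String, String> partitioned = keyed.partitionBy(new HashPartitioner(4));

    // pipe() runs the external command once per partition, writing the
    // partition's elements to its stdin and returning its stdout as an RDD.
    JavaRDD<String> piped = partitioned.values().pipe("/path/to/external-script.sh");

    piped.saveAsTextFile("output");
    ctx.stop();
  }
}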

Re: Spark Java example using external Jars

2014-03-20 Thread dmpour23
Thanks for the example. However, my main problem is that what I would like to do is: create a Spark app that will sort and partition the initial file (k) times based on a key. JavaSparkContext ctx = new JavaSparkContext("spark://dmpour:7077", "BasicFileSplit", System.getenv("SPARK_HOME"), J…
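
[Editor's note: not the poster's actual code, but one hedged reading of it: sortByKey with an explicit partition count yields k sorted, range-partitioned pieces, and jarOfClass ships the application jar to the cluster. The paths and the value of k are placeholders.]

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.PairFunction;
import scala.Tuple2;

public class BasicFileSplit {
  public static void main(String[] args) {
    // Ship this application's jar to the workers along with the job.
    JavaSparkContext ctx = new JavaSparkContext(
        "spark://dmpour:7077",
        "BasicFileSplit",
        System.getenv("SPARK_HOME"),
        JavaSparkContext.jarOfClass(BasicFileSplit.class));

    int k = 4; // hypothetical number of output partitions

    JavaRDD<String> lines = ctx.textFile("hdfs:///input/data.txt");

    // Key each line by its (hypothetical) tab-separated first field.
    JavaPairRDD<String, String> keyed = lines.mapToPair(
        new PairFunction<String, String, String>() {
          public Tuple2<String, String> call(String line) {
            return new Tuple2<String, String>(line.split("\t")[0], line);
          }
        });

    // Sort by key into k range partitions; each partition comes out sorted
    // and can then be written or piped independently.
    JavaPairRDD<String, String> sorted = keyed.sortByKey(true, k);

    sorted.values().saveAsTextFile("hdfs:///output/split");
    ctx.stop();
  }
}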

Re: Spark Java example using external Jars

2014-03-13 Thread Adam Novak
Have a look at my project: . I use the SBT Native Packager, which dumps my jar and all its dependency jars into one directory. Then I have my code find the jar it's running from, and loop through that …
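
[Editor's note: a rough sketch of that approach, assuming the dependency jars sit next to the application jar as the SBT Native Packager lays them out; the master URL and app name are placeholders.]

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.api.java.JavaSparkContext;

public class JarDiscovery {
  public static void main(String[] args) {
    // Locate the jar this class was loaded from; with the SBT Native
    // Packager layout, its directory also holds every dependency jar.
    File myJar = new File(JarDiscovery.class.getProtectionDomain()
        .getCodeSource().getLocation().getPath());
    File libDir = myJar.getParentFile();

    // Collect every jar in that directory so Spark ships them to the workers.
    List<String> jars = new ArrayList<String>();
    for (File f : libDir.listFiles()) {
      if (f.getName().endsWith(".jar")) {
        jars.add(f.getAbsolutePath());
      }
    }

    JavaSparkContext ctx = new JavaSparkContext(
        "spark://master:7077",
        "JarDiscoveryExample",
        System.getenv("SPARK_HOME"),
        jars.toArray(new String[jars.size()]));

    // ... build and run RDD operations here ...
    ctx.stop();
  }
}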