Re: Implementing a custom Spark shell

2014-03-06 Thread Sampo Niskanen
Hi, I've tried to enable debug logging, but can't figure out what might be going wrong. Can anyone assist in deciphering the log? The log of the startup and run attempts is at http://pastebin.com/XyeY92VF This uses SparkILoop, DEBUG-level logging and the settings.debug.value = true option. Line 323:
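For reference, a minimal sketch of the kind of embedding being attempted here, assuming the Spark 0.9-era `org.apache.spark.repl.SparkILoop` API with `spark-repl` and `scala-compiler` on the classpath (the no-arg constructor and the `process(settings)` entry point are assumptions that may differ between versions):

```scala
import scala.tools.nsc.Settings
import org.apache.spark.repl.SparkILoop

object CustomShell {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true   // make the REPL use the JVM classpath
    settings.debug.value = true       // verbose compiler output, as in the log above
    val loop = new SparkILoop
    loop.process(settings)            // opens the interactive prompt
  }
}
```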

Re: Implementing a custom Spark shell

2014-02-28 Thread Prashant Sharma
You can enable debug logging for the repl; thankfully it uses Spark's logging framework. The trouble must be with the wrappers. Prashant Sharma On Fri, Feb 28, 2014 at 12:29 PM, Sampo Niskanen wrote: > Hi, > > Thanks for the pointers. I did get my code working within the normal > spark-shell. However, sin
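Since the repl logs through Spark's log4j setup, enabling DEBUG for it could look like the following in `conf/log4j.properties` (the logger name is an assumption based on the `org.apache.spark.repl` package):

```
# Keep everything else at INFO, but turn the REPL classes up to DEBUG
log4j.rootCategory=INFO, console
log4j.logger.org.apache.spark.repl=DEBUG
```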

Re: Implementing a custom Spark shell

2014-02-27 Thread Sampo Niskanen
Hi, Thanks for the pointers. I did get my code working within the normal spark-shell. However, since I'm building a separate analysis service which pulls in the Spark libraries using SBT, I'd much rather have the custom shell incorporated in that, instead of having to use the default downloadable
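Pulling the shell machinery into such an SBT project would mean depending on the `spark-repl` artifact alongside `spark-core`; a sketch for a 0.9-era build (the exact version string and Scala version are assumptions):

```scala
// build.sbt
scalaVersion := "2.10.3"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.0-incubating",
  "org.apache.spark" %% "spark-repl" % "0.9.0-incubating"
)
```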

Re: Implementing a custom Spark shell

2014-02-26 Thread Matei Zaharia
In Spark 0.9 and master, you can pass the -i argument to spark-shell to load a script containing commands before opening the prompt. This is also a feature of the Scala shell as a whole (try scala -help for details). Also, once you’re in the shell, you can use :load file.scala to execute the code
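As a concrete example of the two approaches above (the file name and variable definitions are illustrative; `sc` is already bound by spark-shell):

```shell
# init.scala might contain definitions such as:
#   val events = sc.textFile("hdfs:///data/events")
#   val users  = sc.textFile("hdfs:///data/users")

# Run the script before the prompt opens:
./bin/spark-shell -i init.scala

# Or, from inside an already-running shell:
#   scala> :load init.scala
```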

Implementing a custom Spark shell

2014-02-25 Thread Sampo Niskanen
Hi, I'd like to create a custom version of the Spark shell, which has automatically defined some other variables / RDDs (in addition to 'sc') specific to our application. Is this possible? I took a look at the code that the spark-shell invokes, and it seems quite complex. Can this be reused fro