Yeah, unfortunately that will be up to them to fix, though it wouldn't hurt to
send them a JIRA mentioning this.
Matei
> On Nov 25, 2014, at 2:58 PM, Corey Nolet wrote:
>
> I was wiring up my job in the shell while i was learning Spark/Scala. I'm
> getting more comfortable with them both now
I was wiring up my job in the shell while i was learning Spark/Scala. I'm
getting more comfortable with them both now so I've been mostly testing
through Intellij with mock data as inputs.
I think the problem lies more with Hadoop than Spark, as the Job object seems
to check its state and throw an exception.
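Roughly, the failure mode looks like this (a minimal sketch using the new
mapreduce API Job; the rest of the job setup is omitted):

    import org.apache.hadoop.mapreduce.Job

    // A freshly constructed Job is still in the DEFINE state. The Scala/Spark
    // shell prints the result of every statement, which calls Job.toString(),
    // and Job.toString() calls ensureState(JobState.RUNNING), so the shell
    // line fails with:
    //   java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
    val job = Job.getInstance()   // fine in compiled code, fails when the REPL prints it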
How are you creating the object in your Scala shell? Maybe you can write a
function that directly returns the RDD, without assigning the object to a
temporary variable.
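For example, something along these lines should work in spark-shell (a sketch
only; the path and key/value classes are placeholders, and sc is the
SparkContext the shell predefines):

    import org.apache.hadoop.fs.Path
    import org.apache.hadoop.io.{LongWritable, Text}
    import org.apache.hadoop.mapreduce.Job
    import org.apache.hadoop.mapreduce.lib.input.{FileInputFormat, TextInputFormat}

    // The Job only lives inside the function, so the REPL never binds it to a
    // shell variable and never calls toString() on it; only the returned RDD
    // is printed.
    def loadLines(path: String) = {
      val job = Job.getInstance(sc.hadoopConfiguration)
      FileInputFormat.addInputPath(job, new Path(path))
      sc.newAPIHadoopRDD(job.getConfiguration,
        classOf[TextInputFormat], classOf[LongWritable], classOf[Text])
    }

    val rdd = loadLines("hdfs:///path/to/input")   // placeholder path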
Matei
> On Nov 5, 2014, at 2:54 PM, Corey Nolet wrote:
>
> The closer I look @ the stack trace in the Scala shell, it appear
Hi,
I'm trying to build a custom input format for CSV files. If you can share a
little more about what you read as input and what you have implemented, I'll
try to replicate the same thing. If I find something interesting at my end,
I'll let you know.
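For context, this is roughly the kind of thing I mean by a custom input format
(only a sketch; the class name is made up, and a real CSV format would also
need its own RecordReader to handle quoted or multi-line fields):

    import org.apache.hadoop.fs.Path
    import org.apache.hadoop.io.{LongWritable, Text}
    import org.apache.hadoop.mapreduce.JobContext
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

    // Reuse TextInputFormat's line-based RecordReader, but mark CSV files as
    // not splittable so each file (header line included) is read by one task.
    class CsvInputFormat extends TextInputFormat {
      override def isSplitable(context: JobContext, file: Path): Boolean = false
    }

    // From Spark it would be used like this (sc is the SparkContext):
    // val csv = sc.newAPIHadoopFile("hdfs:///data/example.csv",
    //   classOf[CsvInputFormat], classOf[LongWritable], classOf[Text])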
Thanks,
Harihar
The closer I look at the stack trace in the Scala shell, the more it appears
that the call to toString() is what causes the construction of the Job object
to fail. Is there a way to suppress this output, since it appears to be
hindering my ability to new up this object?
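What I am hoping for is something along these lines (a sketch only; :silent is
the standard Scala REPL toggle for automatic result printing, and spark-shell
is the Scala REPL underneath):

    scala> :silent
    // With result printing switched off, creating the Job no longer triggers
    // Job.toString(), so its state check never runs:
    scala> val job = org.apache.hadoop.mapreduce.Job.getInstance(sc.hadoopConfiguration)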
On Wed, Nov 5, 2014 at 5:49 PM, Corey Nolet wrote: