Has this changed now? Can a new RDD be implemented in Java?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/is-there-any-easier-way-to-define-a-custom-RDD-in-Java-tp6917p23027.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---
> Otherwise, maybe you can dump your data source out as text
> and load it from there. Without more detail on what your data source is,
> it'll be hard for anyone to help.
>
> On Mon, May 25, 2015 at 5:00 PM, swaranga
> wrote:
>
>> Hello,
>>
>> I have a custom data source and I want to load the data into Spark to
>> perform some computations.
Hello,
I have a custom data source and I want to load the data into Spark to
perform some computations. For this I see that I might need to implement a
new RDD for my data source.
I am a complete Scala noob and I am hoping that I can implement the RDD in
Java only. I looked around the internet an
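To answer the question above: yes, a custom RDD can be written in pure Java by extending the Scala `RDD` class, since Scala classes are ordinary JVM classes. Below is an untested, hypothetical sketch under these assumptions: the class names `AlienRDD` and `AlienPartition` are made up for illustration, `spark-core` is on the classpath, and the fixed in-memory records stand in for whatever the real custom data source returns.

```java
// Hypothetical sketch of a custom RDD in plain Java (names AlienRDD and
// AlienPartition are invented; assumes spark-core on the classpath).
import org.apache.spark.Dependency;
import org.apache.spark.Partition;
import org.apache.spark.SparkContext;
import org.apache.spark.TaskContext;
import org.apache.spark.rdd.RDD;
import scala.collection.Iterator;
import scala.collection.JavaConverters;
import scala.reflect.ClassTag$;

import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class AlienRDD extends RDD<String> {

    // A Partition is just a serializable marker carrying an index.
    static class AlienPartition implements Partition {
        private final int idx;
        AlienPartition(int idx) { this.idx = idx; }
        @Override public int index() { return idx; }
    }

    public AlienRDD(SparkContext sc) {
        // No parent RDDs, so the dependency list is empty; the ClassTag
        // tells Scala the element type at runtime.
        super(sc,
              JavaConverters.asScalaBuffer(
                  Collections.<Dependency<?>>emptyList()).toSeq(),
              ClassTag$.MODULE$.apply(String.class));
    }

    @Override
    public Partition[] getPartitions() {
        // Decide here how the data source splits into partitions.
        return new Partition[] { new AlienPartition(0), new AlienPartition(1) };
    }

    @Override
    public Iterator<String> compute(Partition split, TaskContext context) {
        // Read the records for one partition from the custom data source;
        // a fixed list stands in for the real read.
        List<String> records = Arrays.asList(
            "record-from-partition-" + split.index());
        return JavaConverters.asScalaIterator(records.iterator());
    }
}
```

Once constructed, the RDD can be bridged back to the Java-friendly API with `new AlienRDD(sc).toJavaRDD()` and used like any other `JavaRDD<String>`. The awkward parts are exactly the Scala-isms: the `ClassTag`, the `scala.collection.Seq` of dependencies, and returning a Scala `Iterator` from `compute`.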
Experts,
This is an academic question. Since Spark runs on the JVM, how is it able to
do things like offloading RDDs from memory to disk when the data cannot fit
into memory? How are the calculations performed? Does it use the methods
available in the java.lang.Runtime class to get free/available memory?
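The `java.lang.Runtime` methods the question mentions are real and easy to probe. The sketch below shows what they report and how a cache could use them as a crude spill trigger; note this is an illustration of the JVM API only, not Spark's actual mechanism (to my understanding, Spark tracks the estimated size of cached blocks itself, via its MemoryManager and `SizeEstimator`, rather than relying solely on `Runtime`).

```java
// Minimal sketch: querying JVM heap state via java.lang.Runtime.
// The 0.75 spill threshold is an invented illustration, not Spark's policy.
public class HeapProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();      // upper bound the heap may grow to (-Xmx)
        long total = rt.totalMemory();  // heap currently reserved from the OS
        long free = rt.freeMemory();    // unused space within the reserved heap
        long used = total - free;       // approximate bytes in use (live + garbage)

        System.out.println("max=" + max + " total=" + total + " used=" + used);

        // A cache could decide to start spilling to disk when usage
        // crosses some fraction of the maximum heap:
        boolean shouldSpill = used > 0.75 * max;
        System.out.println("shouldSpill=" + shouldSpill);
    }
}
```

One caveat with this approach: `freeMemory()` counts garbage that has not yet been collected as "used", so the numbers are conservative estimates rather than exact live-object sizes.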