Hi all,
I'd like to use an octree data structure to simplify several
computations on a large data set. I've been wondering whether Spark has
any built-in options for such structures (the only thing I could find is
the DecisionTree), especially ones that make use of RDDs.
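As far as I know Spark has no built-in spatial index (DecisionTree is a classifier, not a spatial structure), so one option is to roll a small octree and build one per partition with mapPartitions(). The sketch below is plain Java so it runs without a cluster; the class and field names (Octree, CAPACITY, etc.) are illustrative, not any Spark API.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal point-region octree: a cube that stores up to CAPACITY points
// and splits into 8 child octants when it overflows.
class Octree {
    static final int CAPACITY = 4;          // max points before splitting
    final double cx, cy, cz, half;          // cube center and half-width
    final List<double[]> points = new ArrayList<>();
    Octree[] children;                      // the 8 octants, once split

    Octree(double cx, double cy, double cz, double half) {
        this.cx = cx; this.cy = cy; this.cz = cz; this.half = half;
    }

    boolean contains(double[] p) {
        return Math.abs(p[0] - cx) <= half
            && Math.abs(p[1] - cy) <= half
            && Math.abs(p[2] - cz) <= half;
    }

    boolean insert(double[] p) {
        if (!contains(p)) return false;
        if (children == null && points.size() < CAPACITY) {
            points.add(p);
            return true;
        }
        if (children == null) split();
        for (Octree c : children) if (c.insert(p)) return true;
        return false;
    }

    private void split() {
        children = new Octree[8];
        double h = half / 2;
        for (int i = 0; i < 8; i++) {
            double ox = ((i & 1) == 0 ? -h : h);
            double oy = ((i & 2) == 0 ? -h : h);
            double oz = ((i & 4) == 0 ? -h : h);
            children[i] = new Octree(cx + ox, cy + oy, cz + oz, h);
        }
        for (double[] p : points)           // push stored points down
            for (Octree c : children) if (c.insert(p)) break;
        points.clear();
    }

    int size() {
        int n = points.size();
        if (children != null) for (Octree c : children) n += c.size();
        return n;
    }
}
```

In Spark you would construct one of these per partition (the tree itself stays local to each executor), e.g. inside rdd.mapPartitions(), and merge or query the per-partition trees afterwards.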
I've also been exploring t
> I think someone mentioned before that this is a good use case for
> having a "tail" method on RDDs too, to skip the header for subsequent
> processing. But you can ignore it with a filter, or logic in your map
> method.
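Without a tail() method, the usual Spark idiom for the quoted suggestion is mapPartitionsWithIndex, dropping only the first element of partition 0. The sketch below shows the same iterator logic on plain Java lists so it runs without a cluster; the helper names are mine, not Spark's.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

class SkipHeader {
    // Analogue of the function passed to
    // rdd.mapPartitionsWithIndex((idx, it) -> { ... }, false):
    // discard the first line of the first partition only.
    static Iterator<String> dropHeader(int partitionIndex, Iterator<String> it) {
        if (partitionIndex == 0 && it.hasNext()) it.next();
        return it;
    }

    // Apply dropHeader across a list of "partitions".
    static List<String> apply(List<List<String>> partitions) {
        List<String> out = new ArrayList<>();
        for (int i = 0; i < partitions.size(); i++)
            dropHeader(i, partitions.get(i).iterator()).forEachRemaining(out::add);
        return out;
    }
}
```

The filter() alternative mentioned above also works (e.g. keep every line that does not equal the header), but it scans all partitions instead of touching just the first element.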
>
> On Wed, Jul 16, 2014 at 11:01 AM, Silvina Caíno
Hi everyone!
I'm really new to Spark and I'm trying to figure out the proper way to
do the following:
1.- Read a file header (a single line)
2.- Build with it a configuration object
3.- Use that object in a function that will be called by map()
I thought about using filter() after
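One way to sketch steps 1-3: read the header on the driver, build the configuration object there, and let the lambda passed to map() capture it (Spark serializes the closure, so the object must be Serializable). The Config class and the "sep=" header format below are made up for illustration; plain Java streams stand in for the RDD.

```java
import java.util.List;
import java.util.stream.Collectors;

class HeaderConfigDemo {
    // Hypothetical configuration object built from the header line.
    static class Config implements java.io.Serializable {
        final String separator;
        Config(String headerLine) {
            // e.g. a header "sep=;" selects the field separator
            this.separator = headerLine.startsWith("sep=")
                    ? headerLine.substring(4) : ",";
        }
    }

    static List<String[]> parse(List<String> lines) {
        Config conf = new Config(lines.get(0)); // 1-2: header -> config object
        return lines.stream()
                .skip(1)                        // drop the header itself
                .map(l -> l.split(conf.separator)) // 3: config used inside map()
                .collect(Collectors.toList());
    }
}
```

With an RDD the shape is the same: take the first line (e.g. rdd.first()), build Config on the driver, then reference it inside the function given to map().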
Right, the compile error is a casting issue: it tells me I cannot assign
a JavaPairRDD with one set of type parameters to a JavaPairRDD with
different ones. It happens in the mapToPair() method.
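This is the kind of generics mismatch mapToPair() produces: the type parameters of the Tuple2 returned by the PairFunction must match those of the JavaPairRDD being assigned to, or javac reports an incompatible-types error that reads like a failed cast. A plain-Java illustration, with Map.Entry standing in for Tuple2 so it compiles without Spark:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

class PairTypeDemo {
    // Analogue of a PairFunction<String, Integer, String>: the declared
    // key/value types here must match the pair type at the assignment site.
    static Map.Entry<Integer, String> toPair(String line) {
        String[] f = line.split(",");
        return new SimpleEntry<>(Integer.parseInt(f[0]), f[1]);
    }
}
// Map.Entry<String, String> e = PairTypeDemo.toPair("1,a"); // incompatible types
```

In Spark terms: check that the two type arguments in the PairFunction (and its Tuple2) line up exactly with the JavaPairRDD<K, V> on the left-hand side of the assignment.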
On 9 July 2014 19:52, Sean Owen wrote:
> You forgot the compile error!
>
>
> On Wed, Jul 9, 2014 at 6:14 PM, Silvina Caíno Lores wrote:
Hi everyone,
I am new to Spark and I'm having trouble getting my code to compile. I
have the feeling I might be misunderstanding the functions, so I would
be very glad to get some insight into what could be wrong.
The problematic code is the following:
JavaRDD bodies = lines.map(l -> {Body b = new Bo
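The snippet is cut off, but it appears to map each text line to a Body object. A guess at the intended shape, with an assumed Body class and an assumed "x y z" line format (plain Java lists stand in for the RDD): the lambda must return a Body so the result is a JavaRDD<Body>, and Body must be Serializable to be shipped to executors.

```java
import java.util.List;
import java.util.stream.Collectors;

class BodyDemo {
    // Hypothetical Body built from one line of input ("x y z").
    static class Body implements java.io.Serializable {
        final double x, y, z;
        Body(String line) {
            String[] f = line.trim().split("\\s+");
            x = Double.parseDouble(f[0]);
            y = Double.parseDouble(f[1]);
            z = Double.parseDouble(f[2]);
        }
    }

    // Stand-in for: JavaRDD<Body> bodies = lines.map(l -> new Body(l));
    static List<Body> map(List<String> lines) {
        return lines.stream().map(Body::new).collect(Collectors.toList());
    }
}
```

Note the raw `JavaRDD bodies` in the quoted line: without the `<Body>` type argument the compiler infers raw/Object types downstream, which is a common source of exactly the kind of assignment errors discussed above.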