Monoids are useful in aggregations. Also, try to avoid anonymous functions:
factoring named functions out of the Spark code allows the functions to be
reused (possibly between Spark and Spark Streaming).
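A minimal sketch of that advice (the names `Monoid` and `CountMonoid` are illustrative, not from the thread): package the associative combining logic as a named object rather than an inline anonymous function, so both a batch job and a streaming job can call the same code.

```scala
// A monoid packages an associative combine plus an identity element,
// which is exactly the shape reduce/fold-style aggregations need.
trait Monoid[A] {
  def empty: A
  def combine(x: A, y: A): A
}

// A named, reusable combiner instead of an anonymous (_ + _) inline.
// In real Spark, both rdd.reduce(CountMonoid.combine _) in a batch job
// and dstream.reduce(CountMonoid.combine _) in a streaming job can share it.
object CountMonoid extends Monoid[Long] {
  def empty: Long = 0L
  def combine(x: Long, y: Long): Long = x + y
}
```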
On Thu, Feb 19, 2015 at 6:56 AM, Jean-Pascal Billaud
wrote:
Thanks Arush. I will check that out.
On Wed, Feb 18, 2015 at 11:06 AM, Arush Kharbanda <
ar...@sigmoidanalytics.com> wrote:
I find monoids pretty useful in this respect, basically separating out the
logic in a monoid and then applying the logic to either a stream or a
batch. A list of such practices could be really useful.
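One way to sketch that separation (a hypothetical example using plain Scala collections so it runs without Spark; `AvgMonoid`, `batchAvg`, and `streamAvg` are illustrative names): the averaging logic lives entirely in the monoid, and the batch and streaming paths just fold with it. Associativity of `combine` guarantees that folding one big batch and merging per-micro-batch partials give the same answer.

```scala
// Average expressed as a monoid over (sum, count) pairs: the logic lives
// here, independent of whether the data arrives as one batch or many chunks.
object AvgMonoid {
  type Acc = (Double, Long) // (running sum, element count)
  val empty: Acc = (0.0, 0L)
  def combine(a: Acc, b: Acc): Acc = (a._1 + b._1, a._2 + b._2)
  def inject(x: Double): Acc = (x, 1L)
  def result(a: Acc): Double = if (a._2 == 0L) 0.0 else a._1 / a._2
}

// "Batch" path: fold everything at once.
def batchAvg(xs: Seq[Double]): Double =
  AvgMonoid.result(
    xs.map(AvgMonoid.inject).foldLeft(AvgMonoid.empty)(AvgMonoid.combine))

// "Streaming" path: fold each micro-batch, then merge the partial results.
def streamAvg(chunks: Seq[Seq[Double]]): Double =
  AvgMonoid.result(
    chunks
      .map(c => c.map(AvgMonoid.inject).foldLeft(AvgMonoid.empty)(AvgMonoid.combine))
      .foldLeft(AvgMonoid.empty)(AvgMonoid.combine))
```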
On Thu, Feb 19, 2015 at 12:26 AM, Jean-Pascal Billaud
wrote:
Hey,
It seems pretty clear that one of the strengths of Spark is being able to
share your code between your batch and streaming layers. Though, given that
Spark Streaming uses DStreams, each being a sequence of RDDs, while batch
Spark operates on a single RDD, there might be some complexity associated with it.
Of course since DSt
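One common way to bridge that gap (a sketch with plain `Seq` standing in for RDDs so it runs without Spark; `wordCounts` is an illustrative name): write the core transformation as a named function, then apply it directly in the batch job, and per micro-batch in the streaming job. In real Spark, a function written as `RDD => RDD` can be passed to `DStream.transform`, since each DStream micro-batch is itself an RDD.

```scala
// Core logic written once. In real Spark this would take and return RDDs,
// e.g. def wordCounts(lines: RDD[String]): RDD[(String, Int)].
def wordCounts(lines: Seq[String]): Map[String, Int] =
  lines
    .flatMap(_.split("\\s+"))
    .filter(_.nonEmpty)
    .groupBy(identity)
    .map { case (w, ws) => (w, ws.size) }

// Batch job: apply directly to the dataset.
// Streaming job (real Spark): dstream.transform(rdd => wordCounts(rdd))
// reuses the exact same function on every micro-batch.
```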