Just to add to that, DStream.transform lets you apply an arbitrary
RDD-to-RDD function to each batch. Inside that function you can do
iterative RDD operations as well.
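A rough sketch of what that can look like is below. The socket source, host/port, batch interval, halving step, and the convergence threshold are all illustrative assumptions, not anything from this thread; the point is only that the function passed to transform can loop over RDD transformations on the driver before returning a result. (A second sketch after the quoted thread illustrates the diamond-shaped lineage Sean mentions.)

import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}

object IterativeTransformSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("IterativeTransformSketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Hypothetical source: one numeric value per line over a socket.
    val values = ssc.socketTextStream("localhost", 9999).map(_.toDouble)

    // transform exposes each batch as a plain RDD, so an arbitrary
    // RDD-to-RDD function can run here, including a driver-side loop.
    val result = values.transform { (rdd: RDD[Double]) =>
      var current = rdd
      var iterations = 0
      var done = current.isEmpty()
      // Iterate until the values fall below a (made-up) threshold or a
      // maximum number of passes is reached. Each pass builds a new RDD,
      // so the lineage stays a DAG: iteration, not a cycle.
      while (!done && iterations < 10) {
        current = current.map(_ / 2.0).cache()
        done = current.max() < 1.0
        iterations += 1
      }
      current
    }

    result.print()
    ssc.start()
    ssc.awaitTermination()
  }
}

Each pass through the while loop just appends more transformations to the lineage, so Spark still sees an acyclic graph; the "cycle" lives only in the driver's control flow, and you decide when to stop.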


On Thu, Apr 2, 2015 at 6:27 AM, Sean Owen <so...@cloudera.com> wrote:

> You can have diamonds but not cycles in the dependency graph.
>
> But what you are describing really sounds like simple iteration, since
> presumably you mean that the state of each element in the 'cycle'
> changes each time, and so isn't really the same element each time, and
> eventually you decide to stop. That is quite possible.
>
>
> On Thu, Apr 2, 2015 at 12:53 PM, anshu shukla <anshushuk...@gmail.com>
> wrote:
> > I didn't find any documentation regarding support for cycles in a Spark
> > topology, although Storm supports this via manual configuration in the
> > acker function logic (setting it to a particular count). By cycles I
> > don't mean infinite loops.
> >
> > Can anybody please help me with that.
> >
> > --
> > Thanks & Regards,
> > Anshu Shukla
>
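To illustrate the "diamonds but not cycles" point from Sean's reply above, here is a minimal sketch of a diamond-shaped lineage: one parent RDD feeds two branches that are joined back together. All names and the sample data are made up for illustration.

import org.apache.spark.{SparkConf, SparkContext}

object DiamondLineageSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("DiamondLineageSketch").setMaster("local[2]"))

    val parent = sc.parallelize(1 to 100).cache()

    // Two branches fan out from the same parent...
    val evens = parent.filter(_ % 2 == 0).map(n => (n, "even"))
    val bigs  = parent.filter(_ > 50).map(n => (n, "big"))

    // ...and fan back in. The dependency graph is diamond-shaped but still
    // acyclic: no RDD ever depends on one of its own descendants.
    val joined = evens.join(bigs)

    joined.collect().foreach(println)
    sc.stop()
  }
}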
