Just a note: the foreach package has solved this by providing a
"nesting" operator, which effectively converts multiple nested foreach
loops into one big one:
http://cran.r-project.org/web/packages/foreach/vignettes/nested.pdf
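For concreteness, a minimal sketch of that nesting operator (%:%): the two
loops below are merged into a single stream of (sample, chromosome) tasks, so
the backend sees one flat set of jobs. The toy samples/chroms vectors are
placeholders.

library(foreach)
samples <- c("s1", "s2")
chroms  <- c("chr1", "chr2", "chr3")
# %:% merges the outer and inner loops into one job stream;
# .combine = rbind stacks the per-task one-row data.frames
res <- foreach(s = samples, .combine = rbind) %:%
  foreach(chr = chroms, .combine = rbind) %do% {
    data.frame(sample = s, chrom = chr)
  }
res  # 6 rows, one per (sample, chromosome) pair

With a parallel backend registered, %do% would become %dopar%.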
On Thu 14 Nov 2013 09:24:29 AM PST, Michael Lawrence wrote:
I like the general idea of having iterators; was just checking out the
itertools package after not having looked at it for a while. I could see
having a BiocIterators package, and a bpiterate(iterator, FUN, ...,
BPPARAM). My suggestion was simpler though. Right now, bpmapply runs a
single job per i
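To make the iterator idea concrete, here is a hypothetical sketch of the kind
of object a bpiterate(iterator, FUN, ..., BPPARAM) could consume: a closure
that returns the next chunk of work on each call and NULL when exhausted.
make_chunk_iter and the NULL-termination convention are illustrative
assumptions, not an existing BiocParallel API.

# Hypothetical chunk iterator: each call returns the next 'chunk_size'
# elements of 'x', and NULL once everything has been handed out.
make_chunk_iter <- function(x, chunk_size = 2L) {
  i <- 0L
  function() {
    if (i >= length(x)) return(NULL)
    chunk <- x[seq(i + 1L, min(i + chunk_size, length(x)))]
    i <<- i + length(chunk)
    chunk
  }
}
it <- make_chunk_iter(1:5)
it()  # 1 2
it()  # 3 4
it()  # 5
it()  # NULL -> iteration finished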
We use a design iterator in BatchExperiments::makeDesign for a Cartesian
product. I found an old version of designIterator (cf.
<https://github.com/tudo-r/BatchExperiments/blob/master/R/designs.R>)
without the optional data.frame input, which is easier to read:
<https://gist.github.com/mllg/7469844>.
Something could go into BatchJobs, but it would be nice to have abstract
support for it at the level of BiocParallel.
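As a rough illustration (not the BatchExperiments implementation linked
above), a Cartesian-product iterator can hand out one design point per call
without materialising the full expand.grid; design_iter and its argument
names are made up for this sketch.

# Yields successive rows of the implicit expand.grid(a, b),
# with 'a' varying fastest, one list at a time.
design_iter <- function(a, b) {
  i <- 0L
  n <- length(a) * length(b)
  function() {
    if (i >= n) return(NULL)
    i <<- i + 1L
    list(a = a[((i - 1L) %% length(a)) + 1L],
         b = b[((i - 1L) %/% length(a)) + 1L])
  }
}
it <- design_iter(c("s1", "s2"), c("chr1", "chr2", "chr3"))
it()  # $a "s1", $b "chr1"
it()  # $a "s2", $b "chr1"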
On Thu, Nov 14, 2013 at 6:32 AM, Vincent Carey wrote:
The Streamer package has DAGTeam/DAGParam components that I believe are
relevant. An abstraction of the reduction plan for a parallelized task would
seem to have a natural home in BatchJobs.
On Thu, Nov 14, 2013 at 8:15 AM, Michael Lawrence wrote:
Hi guys,
We often need to iterate over the Cartesian product of two dimensions, like
sample X chromosome. This is preferable to nested iteration, which is
complicated. I've been using expand.grid and bpmapply for this, but it seems
like this could be made easier. Like bpmapply could gain a CARTESIAN
argument.
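For reference, a minimal sketch of the expand.grid + bpmapply pattern being
described; the sample/chromosome values and the do_one() worker are
placeholders, and SerialParam() just keeps the example self-contained.

library(BiocParallel)
grid <- expand.grid(sample = c("s1", "s2"),
                    chrom  = c("chr1", "chr2"),
                    stringsAsFactors = FALSE)
# Stand-in for the real per-(sample, chromosome) computation
do_one <- function(sample, chrom) paste(sample, chrom, sep = ":")
res <- bpmapply(do_one, grid$sample, grid$chrom,
                BPPARAM = SerialParam())
res  # one result per row of the Cartesian-product grid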