I've been watching this for a while, and I think a combination of a
message bus (RabbitMQ?) and Cro would provide most of what you'd need for
a backbone.
The fact that Raku has Supplies and Channels built in makes this feel like
a problem that's easy enough to solve.
This is probably me com
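For what it's worth, the fan-out behaviour that Raku's Supply gives you is
easy to picture with a toy sketch. This is an illustrative in-process model
only (the class and method names here are mine, not Cro's or RabbitMQ's API):

```python
class Supply:
    """Toy broadcast channel, loosely modelled on Raku's Supply:
    every value emitted is delivered to every subscriber's callback."""

    def __init__(self):
        self._taps = []

    def tap(self, callback):
        # Register a subscriber callback.
        self._taps.append(callback)

    def emit(self, value):
        # Fan the value out to all current subscribers.
        for callback in self._taps:
            callback(value)


# Usage: two subscribers both see every emitted message.
seen_a, seen_b = [], []
bus = Supply()
bus.tap(seen_a.append)
bus.tap(seen_b.append)
bus.emit("job.start")
bus.emit("job.done")
print(seen_a, seen_b)
```

A real backbone would put RabbitMQ (or similar) between processes and use
Cro at the edges; the broadcast semantics stay the same.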
William, I didn't use SparkR; I use R primarily for plotting.
Spark's basic API is quite simple: it does the distributed computing of
map, filter, group, reduce, etc., which are all covered by Perl's map,
sort, and grep functions, IMO.
For instance, this common statistics operation on Spark:
>>> fruit.take(5)
[(
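The truncated snippet above looks like PySpark's `take(5)` returning the
first few (fruit, count) pairs from an RDD. To illustrate the point that
Spark's core verbs map onto ordinary map/grep/sort-style builtins, here is
a plain-Python sketch of the same kind of pipeline (no Spark involved; the
fruit data is made up):

```python
from collections import defaultdict

# Made-up input: one record per fruit observed (stand-in for an RDD).
records = ["apple", "pear", "apple", "kiwi", "apple", "pear", "fig", "plum"]

# Spark-style pipeline with plain builtins:
#   map   -> pair each fruit with a count of 1
#   group -> accumulate counts per key (a reduceByKey analogue)
#   sort  -> order by count, descending
pairs = map(lambda fruit: (fruit, 1), records)

counts = defaultdict(int)
for fruit, n in pairs:
    counts[fruit] += n

top = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

# take(5) analogue: just slice the first five results.
print(top[:5])
```

Spark's value, of course, is running the same shape of pipeline across a
cluster rather than one process.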
Hi Piper!
Have you used SparkR (R on Spark)?
https://spark.apache.org/docs/latest/sparkr.html
I'm encouraged by the data-type mapping between R and Spark. It
suggests to me that with a reasonable Spark API, mapping data types
between Raku and Spark should be straightforward:
https://spark.apach
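From memory, the SparkR docs carry a table of R-to-Spark SQL type mappings,
and a Raku binding could maintain an analogous table. The entries below are
paraphrased from memory and should be checked against the current docs
before relying on them:

```python
# Sketch of an R -> Spark SQL type mapping of the kind the SparkR docs
# describe; a Raku binding could keep a similar lookup table.
# Entries are paraphrased from memory -- verify against the docs.
R_TO_SPARK = {
    "integer": "integer",
    "numeric": "double",
    "character": "string",
    "logical": "boolean",
    "Date": "date",
    "POSIXct": "timestamp",
}

def spark_type(r_type: str) -> str:
    """Look up the Spark SQL type for an R type, or raise for unknowns."""
    try:
        return R_TO_SPARK[r_type]
    except KeyError:
        raise ValueError(f"no known Spark mapping for R type {r_type!r}")

print(spark_type("numeric"))
```

A Raku version would presumably map Int, Num, Str, Bool, Date, and DateTime
the same way.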