Here is a nice discussion about fexprs:
http://lambda-the-ultimate.org/node/3640
On Tuesday, 26 April 2016 23:04:09 UTC+2, tbc++ wrote:
>
> Congratulations! You've discovered Fexprs! An ancient technology from a
> by-gone age: https://en.wikipedia.org/wiki/Fexpr
Wow ;-) now I know that I'm talking about fexprs ;-) So my proposal is to
bring them back to life. Let the new community consume them in a new manner
and judge their usefulness.
On Tuesday, 26 April 2016 23:04:09 UTC+2, tbc++ wrote:
>
> Congratulations! You've discovered Fexprs! An ancient technology from a
> by-gone age: https://en.wikipedia.org/wiki/Fexpr
Congratulations! You've discovered Fexprs! An ancient technology from a
by-gone age: https://en.wikipedia.org/wiki/Fexpr
On Tue, Apr 26, 2016 at 2:23 PM, Olek wrote:
> Yes, delay and force do the job. Now it would be nice to hide the delay
> declaration in argument destructuring, as already proposed:
Yes, delay and force do the job. Now it would be nice to hide the delay
declaration in argument destructuring, as already proposed:

(den mycrazyif [statement ~onsuccess ~onfailure] ; onsuccess and onfailure become delay objects
  (if statement ; statement itself is evaluated eagerly at the mycrazyif call
    (force onsuccess)
    (force onfailure)))
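For what it's worth, that call-site wrapping can already be hidden with an
ordinary macro today. A minimal sketch in plain Clojure (the names
mycrazyif* and mycrazyif are just illustrations, not a real API):

;; A plain function that receives already-delayed branches and
;; forces only the one it needs:
(defn mycrazyif* [statement onsuccess onfailure]
  (if statement
    (force onsuccess)
    (force onfailure)))

;; A macro that hides the delay wrapping at each call site:
(defmacro mycrazyif [statement onsuccess onfailure]
  `(mycrazyif* ~statement (delay ~onsuccess) (delay ~onfailure)))

;; Only the taken branch is ever evaluated:
(mycrazyif (pos? 1)
           (println "taken")
           (println "never evaluated"))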
I'm not sure I fully understand your proposal, but when I really need lazy
evaluation (which is pretty rare) I reach for `delay` and `force`.
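For reference, a quick illustration of that `delay`/`force` behaviour (d is
just a throwaway name); the delayed body runs at most once and its result
is cached:

;; delay builds a cached thunk; force realizes it at most once.
(def d (delay (do (println "computing...") 42)))
(force d) ; prints "computing...", then returns 42
(force d) ; returns the cached 42, prints nothing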
On Tuesday, 26 April 2016 16:41:08 UTC+1, Olek wrote:
>
> Hi!
>
> In short:
>
> I have noticed that in most cases I use macros only for lazy arguments
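The macro pattern the quoted post alludes to looks roughly like this (a
hypothetical unless, purely for illustration): a macro receives its
arguments as unevaluated forms, so the expansion decides which of them ever
run:

;; The else branch here is evaluated only when the test is true:
(defmacro unless [test then else]
  `(if ~test ~else ~then))

(unless false
  (println "evaluated")
  (println "never evaluated"))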
And Happy April Fools' everyone. Nicely done Pradip.
On Tuesday, April 1, 2014 3:58:34 PM UTC-4, Jozef Wagner wrote:
>
> Reducers should IMO be given more attention, support, and importance, and
> I'm actually experimenting with the hypothesis that reducers can replace
> lazy seqs in most cases (excluding simple stuff, e.g. destructuring).
Reducers should IMO be given more attention, support, and importance, and
I'm actually experimenting with the hypothesis that reducers can replace
lazy seqs in most cases (excluding simple stuff, e.g. destructuring).
Imagine a core API where functions like map, filter, rseq, range etc. are
working
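A rough sketch of the difference being described, assuming today's
clojure.core.reducers (the million-element pipeline is an arbitrary
example):

(require '[clojure.core.reducers :as r])

;; Lazy-seq version: every step allocates an intermediate lazy seq.
(reduce + (map inc (filter even? (range 1000000))))

;; Reducers version: the transformations fuse into a single reduction,
;; and r/fold can additionally parallelize it over a vector.
(r/fold + (r/map inc (r/filter even? (vec (range 1000000)))))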
The question is "replace them with what"? I remember, not so fondly, the
days of using IEnumerable in C#: there was no immutability and no caching,
so if you created a massive chain of data and iterated over it twice, you
would have to execute the entire chain of functions twice. With Lazy
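Presumably the contrast being drawn: Clojure's lazy seqs cache each element
once it is realized, so walking the chain twice runs the functions only
once. A small demonstration (xs is just a throwaway name):

;; Each element is computed once and then cached by the lazy seq:
(def xs (map (fn [x] (println "computing" x) x) (range 3)))
(doall xs)    ; prints "computing 0" .. "computing 2"
(reduce + xs) ; prints nothing; the cached elements are reused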
"Back of the envelope" meaning that you thought about the implementation
and are estimating, or that you have measurements?
Either way, I agree that there are definitely use cases where non-lazy
processing can give performance improvements. Probably even relatively
common use cases.
Cases where lazy se