Hi all,

some time ago my friend and I had a discussion about optimizations performed
by GHC. We wrote a bunch of nested list comprehensions that selected objects
based on their properties (e.g. finding products with a price higher than 50).
Then we rewrote our queries in a more efficient form by using tree-based maps
as indices over the data. My friend knows how to turn nested list
comprehensions into queries that use maps and claims this could be automated.
He argued that it would be fine if the compiler did such an optimization
automatically: since we can guarantee that both versions have exactly the
same semantics, and if they have the same semantics, we are free to use the
faster version. From a functional point of view this is a good point, but I
nevertheless objected, claiming that if the compiler performed such a
high-level optimization, replacing the underlying data structure with a
different one and turning one algorithm into a completely different one, the
programmer would no longer be able to reason about the space behaviour of the
program. I concluded that such a transformation should not be built into the
compiler but instead offered as an EDSL.
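
To make the comparison concrete, here is a minimal sketch of the two query
styles we were discussing. The Product type, the field names and the
threshold of 50 are made up for illustration; the point is only the shape of
the rewrite, linear scan versus tree-based index:

    import qualified Data.Map as Map
    import Data.Map (Map)

    data Product = Product { name :: String, price :: Int }
      deriving Show

    products :: [Product]
    products =
      [ Product "keyboard" 45
      , Product "monitor" 120
      , Product "mouse" 60
      ]

    -- Naive version: a linear scan expressed as a list comprehension.
    expensiveScan :: [Product]
    expensiveScan = [ p | p <- products, price p > 50 ]

    -- Indexed version: build a tree-based map keyed on price once, then
    -- answer the same query by splitting the index at the threshold.
    priceIndex :: Map Int [Product]
    priceIndex = Map.fromListWith (++) [ (price p, [p]) | p <- products ]

    -- Map.split 50 returns the submaps with keys < 50 and keys > 50,
    -- so the second component is exactly the "price higher than 50" set
    -- (though possibly in a different order than the scan produces).
    expensiveIndexed :: [Product]
    expensiveIndexed = concat (Map.elems above)
      where (_, above) = Map.split 50 priceIndex

Both functions return the same set of products, which is the sense in which
the two versions agree semantically; what changes is the intermediate data
structure the program builds, and hence its space behaviour.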

I would like to hear your opinions on this. How far can a compiler go in
transforming code? I don't want to concentrate on whether such an
optimization is technically possible to implement. What I'm interested in is
whether this kind of high-level optimization would be acceptable at all.

Janek

