>> Yeah, and this is where two models of computation have been conflated,
>> creating magical effects, confusing everybody. I challenge you to get
>> down to the machine code in scheme and formally describe how it's
>> doing both.
>
> Which two models of computation are you talking about? And what magical
> effects?
Well, I draw a line between all computation involving predicates (like
the lambda calculus) and computation using digital logic (like C). These
realms of computation are so different that mixing them is akin to mixing
the complex numbers with the reals. Yet hardly anyone points it out (I've
concluded that hardly anyone has ever noticed -- the Church-Turing thesis
has lulled the whole field into a shortcut in thinking which doesn't
actually pan out in practice).

> AFAIK there is no magic in computer science, although every sufficiently
> advanced ...

Ha! That's very good. I'm glad you caught the spirit of my rant. "Any
sufficiently advanced compiler can be substituted with magic to the
neophyte without a change in output." A mini Liskov substitution.

--
MarkJ
Tacoma, Washington
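
P.S. To make the contrast I'm ranting about concrete, here's a rough
sketch in Python (my choice of language, nothing from the thread) of the
same boolean AND done both ways: once in the predicate/lambda-calculus
style, where a truth value *is* a function that selects one of two
arguments, and once in the digital-logic style that C compiles down to,
where a truth value is just bits in a machine word. Treat it as an
illustration of the distinction, not a formal argument.

    # Predicate (lambda-calculus) style: Church booleans.  A truth value
    # is a selector function; AND is built from nothing but application.
    TRUE  = lambda a: lambda b: a
    FALSE = lambda a: lambda b: b
    AND   = lambda p: lambda q: p(q)(p)   # if p is true, pick q; else keep p

    # Digital-logic style: a truth value is a bit pattern and AND is a
    # single machine-word operation, the model C lives in.
    def and_bits(x, y):
        return x & y

    # Same "and", two very different models of computation underneath.
    assert AND(TRUE)(FALSE) is FALSE
    assert and_bits(1, 0) == 0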