On 4/3/14 12:14 PM, Marko Rauhamaa wrote:
Mark H Harris <harrismh...@gmail.com>:

So, python(3)'s use of unicode is exciting, not only as a step forward
for the python interpreter, but also as a leadership step forward in
computer science around the world.

Big words. I don't think computer science has experienced major steps
forward since the 1930's: combinatory logic, the Turing machine, the
Entscheidungsproblem, the halting problem,...

The one major latter-day addition is complexity theory (1960's).

Hi Marko, computer science covers everything from linked lists to virtual reality, from CPU pipelining to flash memory, from punched-tape I/O to plasma displays to LED-backlit flat panels. Computer science also includes theory, and most of what you mention actually had its beginnings in mathematics, not computer science. And yet, most of what you mention as fundamental to computer science is only the beginning.

Turing's a-machines, together with (and in parallel to) Alonzo Church's lambda calculus (two distinct approaches to computability), delivered a negative answer to the Entscheidungsproblem; so much so that one might even think artificial intelligence impossible. Alan Turing proved (before computers ever existed) that no a-machine can determine, in general, whether another a-machine configuration will loop or halt. So what? Do we cease to work towards artificial intelligence? Do you believe that the AI work at MIT (using Lisp) was not a step forward for artificial intelligence, or for computer science?
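To make that diagonal argument concrete (a sketch of my own, not anything from the thread): suppose someone handed us a Python function halts(func, arg) that decides halting. Turing's proof shows it cannot exist:

    def halts(func, arg):
        """Hypothetical oracle: return True iff func(arg) eventually halts."""
        ...  # Turing proved no such implementation can exist

    def contrary(func):
        # Ask the oracle about ourselves, then do the opposite.
        if halts(func, func):
            while True:      # oracle said "halts", so loop forever
                pass
        return None          # oracle said "loops", so halt immediately

    # contrary(contrary) halts exactly when halts() says it doesn't --
    # a contradiction, so no general halts() can ever be written.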

Did not David Hilbert get a kick in the pants? You might have thought that mathematics at the IAS would have folded its tents and blown away after Kurt Gödel proved (largely as a consequence of self-reference) that any axiomatic system strong enough to express arithmetic is inconsistent if complete, and assuredly incomplete if consistent! There are true statements which cannot be proven! Oh, crap. There must be systems of computation for which there is no proof, yet which function nonetheless. Does this impact computer science today; does this impact AI studies today?
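Self-reference is easy to taste in code. Here is a quine, a program whose output is its own source text, the same trick of a statement talking about itself that Gödel's proof leans on (my illustration only, not Gödel's actual construction):

    # A quine: running this prints exactly these two lines.
    s = 's = %r\nprint(s %% s)'
    print(s % s)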

We as human beings have only just begun. The human mind is a quantum computer. Can a bit be 1 and 0 at the same time?? A qubit most certainly can; superposition and entanglement are computational realities that we have only just begun to think about, let alone comprehend or code for (whatever we might mean by that).
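For what it's worth, the "1 and 0 at once" part is easy to simulate classically, even if it cannot be exploited classically. A minimal numpy sketch of one qubit in equal superposition (my example, assuming nothing beyond numpy):

    import numpy as np

    # State (|0> + |1>)/sqrt(2): amplitudes for "0" and "1" simultaneously.
    state = np.array([1, 1], dtype=complex) / np.sqrt(2)

    # Born rule: measurement probabilities are the squared magnitudes.
    p0, p1 = np.abs(state) ** 2
    print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")   # 0.50 each

    # Each measurement collapses the superposition to a definite bit.
    rng = np.random.default_rng()
    print(rng.choice([0, 1], size=16, p=[p0, p1]))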

Mathematicians hate this, but computers are the way forward for mathematics. Computer proofs are increasing; we are discovering that important proofs (the Four Color Theorem, for one) require computers and computational algorithms. We don't throw our straightedges and compasses away, nor do we toss out our BIC pens and paper. Algorithms are what is needed where the mathematics is too complicated to check by hand. Computer science is moving understanding forward with algorithms.
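A toy version of what a computer proof does (my illustration; real ones like the Four Color Theorem grind through vastly larger case lists): exhaustively verify Goldbach's conjecture, that every even number above 2 is a sum of two primes, over a small range:

    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

    def goldbach_witness(n):
        # Return a pair of primes summing to n, or None if none exists.
        for p in range(2, n // 2 + 1):
            if is_prime(p) and is_prime(n - p):
                return p, n - p
        return None

    for n in range(4, 1001, 2):
        assert goldbach_witness(n) is not None, f"counterexample: {n}"
    print("Goldbach verified for all even n in [4, 1000]")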

Beyond all of that is communication. That is where Unicode comes in. Computer science is going to handle the problem of universal translation. Great strides have been taken towards this already; more are sure to come.
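And that is exactly where Python 3 helps: its str type is a sequence of Unicode code points, so every script travels through the same machinery. A quick demonstration using only the standard library:

    import unicodedata

    # One string type for every script; each code point carries metadata.
    for word in ["hello", "שלום", "こんにちは", "Καλημέρα"]:
        ch = word[0]
        print(f"{word!r}: first char U+{ord(ch):04X} {unicodedata.name(ch)}")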

peace

marcus
