Nick writes:

"My basic New Thought (new to me, I mean) was, why talk about biology when we 
can talk about computer programming, given the wonders that simple algorithms 
(eg, cellular automata) can generate."

It's true that it's all much more coherent.   But the algorithms are simple, and the 
machines that execute them can't (yet) reproduce or repair themselves.   They 
are at most shallow 3D fixed-purpose devices, not complex evolving nanomachines 
like cells.   Most computer programs are built around the so-called von Neumann 
architecture, which separates the program from the machine that executes it, 
and this architecture has favored serial, step-by-step programs over highly 
distributed and scalable signaling.   Papers like the one Roger shared are 
interesting to me because the latent 'discovered' structure might suggest new 
(synthetic biology) programming models, which could either be used directly to 
perform different tasks (eat up CO2, clean up toxic waste, produce novel 
medications, perform large distributed calculations) or inspire new designs 
for more conventional (e.g. silicon) systems.   It's a fishing expedition to find 
fixed-function machines that already exist in nature and can be adapted to do 
what we want.
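
As an aside, to make the point about simple algorithms concrete: below is a 
minimal sketch of an elementary cellular automaton (Wolfram's Rule 110), the 
kind of thing Nick mentions.  A one-line local update rule applied in parallel 
across a row of cells produces surprisingly complex patterns.  This is just an 
illustration in Python, not anything from the paper Roger shared; the width, 
step count, and starting row are arbitrary choices.

# Elementary cellular automaton, Rule 110: each cell's next state depends
# only on its three-cell neighborhood.  Illustrative sketch only.
RULE = 110
WIDTH, STEPS = 64, 32

row = [0] * WIDTH
row[WIDTH // 2] = 1          # start with a single live cell in the middle

for _ in range(STEPS):
    print(''.join('#' if c else '.' for c in row))
    # The neighborhood (left, center, right) is read as a 3-bit number and
    # used to index into the bits of RULE (wrap-around at the edges).
    row = [(RULE >> (4 * row[(i - 1) % WIDTH]
                     + 2 * row[i]
                     + row[(i + 1) % WIDTH])) & 1
           for i in range(WIDTH)]

Running it prints rows of '#' and '.' whose structure never settles into a 
simple repeating pattern, which is the "wonder" Nick is pointing at.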

Marcus
 

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
archives back to 2003: http://friam.471366.n2.nabble.com/
FRIAM-COMIC http://friam-comic.blogspot.com/ by Dr. Strangelove
