On Sun, Oct 13, 2019 at 3:18 PM <[email protected]> wrote:
> @Matt I had thought about that a few years ago. Once the first AGI is
> made, it will be faster, e.g. by 50x using light; never sleep (2x faster);
> never get tired or unfocused (faster!); and have more memory, attention,
> sensors, data -- the list goes on and on, actually. 1 AGI will seem like 10
> brains, [maybe] 1,000. It will find ways to improve neural discovery and
> research.

HA! Check out the ratio of components to interconnections on the Cray-3.

Ever since I was introduced to the ILLIAC IV vs. Seymour's CDC Cyber wars at the University of Illinois PLATO project, it has been apparent that, except for Seymour, computer "architects" are like drunks continually looking for their keys under the lamppost. They keep coming up with normative "laws" they claim are "good" for programmers to follow. The classic one was "good locality of reference": the moment I saw it in the CDC Cyber 180 architecture document, I knew CDC was doomed. No wonder Seymour bolted.

He managed to hold it together pretty well until the spooks finally got to him with the Cray-4, gave up on minimizing globally shared memory latency, and let his company be taken over by the "architects". I predicted he'd die -- although I suspected it would be murder. I even predicted it would be a jeeping "accident", although I thought it would be in the mountains rather than in the city.

You want to see an architecture more in line with Seymour's heyday?

http://jimbowery.blogspot.com/2013/04/a-circuit-minimizing-multicore-shared.html

That's more like what you need to do a brain than anything out there right now that I'm aware of.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Td4a5dff7d017676c-M6a155798cbfe92522d7b1047
Delivery options: https://agi.topicbox.com/groups/agi/subscription
