However, having 500GB of data and a good extractor is still constrained by speed, i.e. how many "brains" are working on different questions at once. Better extraction requires more time and more memory to run in. So yes, size and speed still define how much the system can do in a given amount of time. It might spend most of its time generating a convincing lure, yet have no good answer when it needs to act fast, before you notice it.
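The tradeoff above, better extraction costing more time, shows up even in ordinary lossless compressors. As a rough sketch (zlib's compression levels standing in for "extractor quality"), higher levels spend more CPU time to squeeze the same data into fewer bytes:

```python
import time
import zlib

# Repetitive sample data; a stand-in for a redundant dataset.
data = b"the quick brown fox jumps over the lazy dog " * 20000

# Compare a fast/weak "extractor" (level 1) with a slow/strong one (level 9).
for level in (1, 6, 9):
    t0 = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - t0
    print(f"level {level}: {len(compressed)} bytes in {elapsed:.4f}s")
```

The stronger setting never produces a larger output on data like this; it just pays for the smaller result with more compute, which is the same size/speed tradeoff being argued here.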
Still, a nanobot cell can grow very fast, and a computer may attain deep knowledge and have answers after 5 days if it has a good enough extractor. There's room to play, IOW.

It's interesting to note that improving either the dataset size or the extractor alone improves lossless compression (the Hutter Prize evaluation for AGI). A better extractor gets "free" data out of the dataset, so it amounts to the same thing: more data = more accuracy.

Yet 10000000PBs of data still wouldn't lead us to the cure for cancer, so maybe a really good extractor is powerful not just at getting a lot of data out of 1MB (999999999999999PBs), but at doing things that simple extractors can't do to find a cancer cure. So although the time to run is the same, it can do other things. It can get smarter answers that are worth more.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T1f1af8ac2c36937b-M2eeb832ba41db5ce7da04a87
Delivery options: https://agi.topicbox.com/groups/agi/subscription
