James, I know compression is AGI, but what is your AGI project or research idea 
to spend the money on? To compress it further? What would it teach us that 
Alexander Rhatushnyak (the long-time record holder) hasn't already taught us? 
We already know context mixing was needed: multiple predictive models, trained 
on lots of data, combined into one prediction. It's just a bag of contexts, and 
even the models are put in a bag. The next bit in the sequence is either 
correlated with its context or simply frequent, and therefore predictable. What 
more did Rhatushnyak discover when he beat the record four times? What improved 
the compression? We get the pre-stage of it: an entropy coder turns symbols 
into short bit codes (if your file has 10 unique words, you can store each in a 
few bits instead of 40), whether by Huffman coding or, as the winning programs 
actually do, by arithmetic coding driven by the model's predicted 
probabilities. But after that it's just pattern prediction.
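To make the "bag of contexts" idea concrete, here is a toy sketch of context mixing for next-bit prediction. It is illustrative only and nothing like the tuned models in the winning Hutter Prize entries: each model keeps smoothed 0/1 counts keyed by the last N bits, and the mixer here is a plain average, where real mixers weight the models adaptively (e.g. logistic mixing). All class and variable names are my own invention.

```python
# Toy context mixing for next-bit prediction (illustrative sketch only).
from collections import defaultdict

class ContextModel:
    def __init__(self, order):
        self.order = order                          # previous bits forming the context
        self.counts = defaultdict(lambda: [1, 1])   # Laplace-smoothed [count0, count1]

    def predict(self, history):
        c0, c1 = self.counts[tuple(history[-self.order:])]
        return c1 / (c0 + c1)                       # P(next bit = 1 | context)

    def update(self, history, bit):
        self.counts[tuple(history[-self.order:])][bit] += 1

def mix(models, history):
    # Plain average of the models' probabilities; real mixers learn the weights.
    preds = [m.predict(history) for m in models]
    return sum(preds) / len(preds)

# Feed a repetitive bit stream; the mixed prediction quickly learns the pattern.
models = [ContextModel(o) for o in (1, 2, 4)]
stream = [0, 1, 1] * 40                             # period-3 pattern
history, hits = [], 0
for bit in stream:
    p1 = mix(models, history)
    hits += int((p1 >= 0.5) == bit)                 # count correct hard decisions
    for m in models:
        m.update(history, bit)
    history.append(bit)
print(f"accuracy on periodic stream: {hits / len(stream):.2f}")
```

The higher-order models nail the periodic pattern while the order-1 model stays uncertain; averaging them still predicts well, which is the whole point of mixing.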
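And the "10 unique words in a few bits" pre-stage can be sketched as a minimal Huffman coder: frequent words get short prefix-free bit codes instead of their raw 40-bit spellings. The function and sample text below are my own; this is the textbook heap construction, not any prize entry's code.

```python
# Minimal Huffman coding sketch: short prefix-free bit codes for frequent words.
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a prefix-free code table from a {symbol: count} mapping."""
    # Heap entries: (weight, unique tiebreaker, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate case: a single symbol
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        w1, i1, t1 = heapq.heappop(heap)     # two lightest subtrees
        w2, i2, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, min(i1, i2), merged))
    return heap[0][2]

text = "the quick brown fox jumps over the lazy dog the fox".split()
codes = huffman_codes(Counter(text))
for word, code in sorted(codes.items(), key=lambda kv: len(kv[1])):
    print(word, code)                        # frequent words get shorter codes
```

The resulting codes are prefix-free, so the compressed bitstream decodes unambiguously; that is exactly the property arithmetic coding generalizes by assigning fractional bits per symbol.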

reference: http://prize.hutter1.net/
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T26a5f8008aa0b4f8-M5d8afd7cc742baf8a1c27026