Alexander Ratushnyak wasn't required to publish source code for his winning
Hutter Prize entry at the time he submitted it, but I have seen some of his
other code, much of it based on my work, so I have some insight.

Optimizing data compression code is a highly experimental process, somewhat
like evolution: you make seemingly random changes and keep the ones that
work. You end up with an incomprehensible mess with no simple theory behind
it. If the author can't explain his own code, the source certainly won't
help anyone else. It's like trying to understand DNA.

I am pretty sure the code uses dictionary encoding, with related words
grouped together, followed by PAQ-style context mixing. But there is a lot
of leeway in the details.
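To make the two stages concrete, here is a minimal sketch, not Ratushnyak's
actual code: a hypothetical dictionary preprocessor that replaces known words
with short codes, then PAQ-style context mixing, where simple bit predictors
of different context orders are combined by a logistic mixer whose weights
are trained online. All names and parameters here are illustrative
assumptions, not the real entry's design.

```python
import math

# --- Stage 1: dictionary preprocessing (hypothetical word list) ---
def dict_encode(text, dictionary):
    """Replace dictionary words with 1-byte codes. Assigning related
    words nearby code values lets later context models exploit them."""
    out = []
    for word in text.split(' '):
        if word in dictionary:
            out.append(chr(0x80 + dictionary[word]))  # short code
        else:
            out.append(word)
    return ' '.join(out)

# --- Stage 2: PAQ-style context mixing over a bit stream ---
class CounterModel:
    """Predicts the next bit from 0/1 counts in an order-n context."""
    def __init__(self, order):
        self.order = order
        self.counts = {}  # context tuple -> (zeros, ones)

    def predict(self, history):
        ctx = tuple(history[-self.order:]) if self.order else ()
        n0, n1 = self.counts.get(ctx, (0, 0))
        return (n1 + 1) / (n0 + n1 + 2)  # Laplace-smoothed P(bit = 1)

    def update(self, history, bit):
        ctx = tuple(history[-self.order:]) if self.order else ()
        n0, n1 = self.counts.get(ctx, (0, 0))
        self.counts[ctx] = (n0 + (bit == 0), n1 + (bit == 1))

class LogisticMixer:
    """Mixes model probabilities in the logistic domain, as PAQ does."""
    def __init__(self, n, rate=0.1):
        self.w = [0.3] * n
        self.rate = rate

    @staticmethod
    def stretch(p):
        return math.log(p / (1 - p))

    @staticmethod
    def squash(x):
        return 1 / (1 + math.exp(-x))

    def mix(self, probs):
        self.st = [self.stretch(p) for p in probs]
        return self.squash(sum(w * s for w, s in zip(self.w, self.st)))

    def update(self, p_mixed, bit):
        err = bit - p_mixed  # gradient of coding loss w.r.t. weights
        for i, s in enumerate(self.st):
            self.w[i] += self.rate * err * s

def compress_cost(bits, models, mixer):
    """Total code length in bits if `bits` were arithmetic-coded
    using the mixed predictions."""
    history, total = [], 0.0
    for bit in bits:
        p = mixer.mix([m.predict(history) for m in models])
        total += -math.log2(p if bit else 1 - p)
        mixer.update(p, bit)
        for m in models:
            m.update(history, bit)
        history.append(bit)
    return total
```

On a predictable stream, say 400 alternating bits fed to an order-0 and an
order-2 model, the mixer quickly learns to trust the order-2 model and the
total code length falls well below 400 bits. The real entry mixes far more
models over byte, word, and sparse contexts, but the principle is the same.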


On Sat, Nov 16, 2019, 4:01 PM <[email protected]> wrote:

> James, I know compression is AGI but what is your AGI project or research
> idea to spend money on? To compress it further? What will it teach us
> further that Alexander Rhatushnyak (current long-time winner) didn't
> already teach us? We already know context mixing was needed, multiple
> predictive models combined that were trained on lots of data. It's just a
> bag of context - even the models are put in a bag. The Next Bit in the
> sequence is either related to and/or frequent and therefore predictable.
> What more did Alexander Rhatushnyak discover when he beat the record 4
> times??? What improved the compression??? We get the pre-stage of it, you
> use Huffman Coding to make bitstrings into number codes (if your file has
> 10 unique words, you can store each as a few bits instead of 40 bits). But
> after that it's just pattern prediction.
>
> reference: http://prize.hutter1.net/
> *Artificial General Intelligence List <https://agi.topicbox.com/latest>*
> / AGI / see discussions <https://agi.topicbox.com/groups/agi> +
> participants <https://agi.topicbox.com/groups/agi/members> + delivery
> options <https://agi.topicbox.com/groups/agi/subscription> Permalink
> <https://agi.topicbox.com/groups/agi/T26a5f8008aa0b4f8-M5d8afd7cc742baf8a1c27026>
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T26a5f8008aa0b4f8-M47abe3a5bf8c49b25ce87df0
Delivery options: https://agi.topicbox.com/groups/agi/subscription
