The Hutter Prize targets automated language modeling. Ockham's Guillotine targets social modeling, whether automated or not. The difference lies in the kinds of data sets and, secondarily, in the Hutter Prize's requirement that the compressor be published along with the self-extracting archive.
My opinion is that Ockham's Guillotine is also vastly more urgent than the Hutter Prize. See the Atlantic's issue on preventing a looming violent conflict within America: https://www.theatlantic.com/press-releases/archive/2019/11/atlantics-december-2019-issue/601795/

Matt has pointed out, perhaps correctly, that just as the corpus for the Hutter Prize is not large enough, neither is Ockham's Guillotine's dataset. This is easily remedied, although I'd be interested to see a citation for the calculation of what the minimum size must be.

On Sat, Nov 16, 2019 at 3:01 PM <[email protected]> wrote:

> James, I know compression is AGI, but what is your AGI project or research idea to spend money on? To compress it further? What will it teach us that Alexander Rhatushnyak (the current long-time winner) didn't already teach us? We already know context mixing was needed: multiple predictive models, trained on lots of data, combined. It's just a bag of contexts - even the models are put in a bag. The next bit in the sequence is either related and/or frequent, and therefore predictable.
>
> What more did Alexander Rhatushnyak discover when he beat the record 4 times? What improved the compression? We get the pre-stage of it: you use Huffman coding to turn symbols into bit codes (if your file has 10 unique words, you can store each as a few bits instead of 40 bits). But after that it's just pattern prediction.
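Since the quoted message summarizes the Huffman-coding pre-stage, here is a minimal sketch of that idea. The `huffman_codes` helper and the sample sentence are illustrative only, not taken from any prize entry:

```python
import heapq
from collections import Counter

def huffman_codes(tokens):
    """Build a Huffman code: frequent tokens get short bitstrings,
    rare tokens longer ones - the effect described above, where a
    file with 10 unique words stores each as a few bits rather than
    a fixed-width word encoding."""
    freq = Counter(tokens)
    # Heap entries: (frequency, unique tie-breaker, {token: code-so-far}).
    heap = [(f, i, {tok: ""}) for i, (tok, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    if len(heap) == 1:  # degenerate case: a single unique token
        return {tok: "0" for tok in heap[0][2]}
    while len(heap) > 1:
        # Merge the two least-frequent subtrees, prefixing 0/1.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {t: "0" + c for t, c in left.items()}
        merged.update({t: "1" + c for t, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

words = "the cat sat on the mat the cat".split()
codes = huffman_codes(words)
encoded = "".join(codes[w] for w in words)
```

With 5 unique words, a fixed-width code would need 3 bits per word; here the frequent words ("the", "cat") get the shortest codes, so the encoded stream comes out shorter. Everything past this stage - predicting the next symbol from context - is where the actual modeling work happens, as the quoted message says.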
> reference: http://prize.hutter1.net/

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T26a5f8008aa0b4f8-M43bc663634f90f340d2df1bc
Delivery options: https://agi.topicbox.com/groups/agi/subscription
