On Tuesday, November 19, 2019, at 11:42 AM, Matt Mahoney wrote:
> The best compressors are very complex. They use hundreds or thousands of 
> independent context models, adaptively combine their bit predictions, and 
> encode the prediction error. The decompressor uses an exact copy of the 
> model trained on previous output to reconstruct the original data.
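A minimal sketch of what the quote describes, assuming a toy setup: a few bit-level context models whose predictions are mixed (here by plain averaging, not the adaptive mixing real compressors use), with the ideal arithmetic-coder cost `-log2(p)` standing in for an actual coder. The model orders and Laplace smoothing are my own illustrative choices, not PAQ's:

```python
import math

class BitModel:
    """One context model: count 0s/1s per context, predict P(next bit = 1)."""
    def __init__(self):
        self.counts = {}                       # context -> [n0, n1]

    def predict(self, ctx):
        n0, n1 = self.counts.get(ctx, (0, 0))
        return (n1 + 1) / (n0 + n1 + 2)        # Laplace-smoothed estimate

    def update(self, ctx, bit):
        c = self.counts.setdefault(ctx, [0, 0])
        c[bit] += 1

def compressed_size_bits(bits, orders=(0, 1, 2)):
    """Ideal arithmetic-coded size in bits when several bit-context models'
    predictions are mixed (here: plain averaging) and every model is updated
    after each bit. A decompressor would run identical models on its own
    output, so it always reproduces the same predictions."""
    models = [BitModel() for _ in orders]
    history, total = [], 0.0
    for bit in bits:
        ctxs = [tuple(history[-k:]) if k else () for k in orders]
        p1 = sum(m.predict(c) for m, c in zip(models, ctxs)) / len(models)
        p = p1 if bit else 1.0 - p1
        total += -math.log2(p)                 # cost an arithmetic coder would charge
        for m, c in zip(models, ctxs):
            m.update(c, bit)
        history.append(bit)
    return total
```

A constant or strongly patterned bit stream costs far less than one bit per input bit, which is the whole game.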

That doesn't sound very complex :) You literally just told us what it does: *it 
combines many models into one model & adaptively predicts the next bit.*

Can you add "_details_" to that? :)

*My understanding* is that it first uses Huffman coding to strip out the 
*totally useless* up-scaling that ignorant humans subjected the data to. Then 
it combines many randomly initialized web heterarchies like w2v/seq2seq that 
encode word-part/word/phrase meanings, and combines many randomly initialized 
models of the text for entailment purposes, which use modern Transformer/BERT 
attention to propose the next word-part/word/phrase candidates, weighted by 
frequency, based on the prior words and on related-meaning words from the 
heterarchy. When it predicts the next bit, it basically knows what 
word-parts/words/phrases surround it (yes, bidirectional BERT tech), including 
related meanings, and based on frequency and those scores it decides which 
candidate's range to code.
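On the "combines many models" part specifically: context-mixing compressors in the PAQ family combine per-model predictions with a weighted sum in the logistic (log-odds) domain and adapt the weights online. A toy sketch of that idea — the learning rate and update rule here are simplified assumptions, not the exact PAQ update:

```python
import math

def stretch(p):
    """Map a probability to the logistic (log-odds) domain."""
    return math.log(p / (1.0 - p))

def squash(x):
    """Inverse of stretch: map log-odds back to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def mix(probs, weights):
    """Combine per-model P(bit=1) values by a weighted sum in the
    stretched domain -- logistic mixing."""
    return squash(sum(w * stretch(p) for w, p in zip(weights, probs)))

def update_weights(weights, probs, bit, lr=0.02):
    """One online gradient step: models whose stretched prediction pointed
    toward the actual bit gain weight, the rest lose weight."""
    err = bit - mix(probs, weights)
    return [w + lr * err * stretch(p) for w, p in zip(weights, probs)]
```

If one model keeps predicting the observed bit and another keeps missing, the mixer quickly learns to trust the first and distrust the second.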

_Why does it work?_ Because the patterns are in words, word parts, and phrases, 
as 7zip recognizes, and each has a frequency. *When the next bit or bits* are 
predicted, it knows which candidates could be placed next (or which 
already-added one to refine), and it *looks at the codes around it* for 
context: it sees multi-bit codes nearby like boy/girl or ism/ing, knows the 
frequency of those codes and of what entails them, and pays attention to 
related meanings around it to add to the score. Repeat recursively, 
bidirectionally.
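The frequency-of-codes intuition can be made concrete with a tiny order-k character table, PPM-style: look up the last few symbols, rank what has followed them before. (This is only an illustration — 7zip's LZMA actually finds repeated strings with dictionary matching, not a table like this.)

```python
from collections import Counter, defaultdict

def build_contexts(text, order=2):
    """Frequency table: the last `order` characters -> Counter of next char."""
    table = defaultdict(Counter)
    for i in range(order, len(text)):
        table[text[i - order:i]][text[i]] += 1
    return table

def candidates(table, ctx):
    """Next-symbol candidates for a context, ranked by observed frequency."""
    return [sym for sym, _ in table[ctx].most_common()]
```

For example, after building the table on "the cat and the dog and the bat", the context "th" has only ever been followed by "e", so "e" tops its candidate list.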
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T26a5f8008aa0b4f8-M19d0644b85eb91249ae7e16a
Delivery options: https://agi.topicbox.com/groups/agi/subscription
