Text compression handles exactly this case. A model assigns a probability distribution over all possible next words, and an arithmetic coder encodes the actual word in -log2(p) bits, where p is the probability the model assigned to it. So if the correct word was the model's second choice, encoding it is cheaper than if it was the third choice, because it was assigned more probability. Semantic models improve compression and are used in the best compressors.
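A quick sketch of that cost, with made-up numbers (the 80/20 split is from the example below; the 5% is an illustrative third-choice probability, not from the original post):

```python
import math

def code_length_bits(p_true: float) -> float:
    """Bits an arithmetic coder needs to encode the actual word,
    given the probability the model assigned to it."""
    return -math.log2(p_true)

# Two hypothetical predictions when the true next word is 'dog':
model_a = {"cat": 0.80, "dog": 0.20}                 # 'dog' is the 2nd choice
model_b = {"cat": 0.80, "boat": 0.15, "dog": 0.05}   # 'dog' is the 3rd choice

print(code_length_bits(model_a["dog"]))  # ~2.32 bits
print(code_length_bits(model_b["dog"]))  # ~4.32 bits
```

Exact-match accuracy scores both predictions as equally wrong (neither top choice is 'dog'), but the compression cost distinguishes them: the model that put more mass near the truth pays fewer bits.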
On Thu, Sep 17, 2020, 3:51 PM <[email protected]> wrote:
> We have been evaluating text prediction wrong. If the word to predict is
> 'dog' and we predict cat 80% likely, dog 20% likely, that costs us
> accuracy! But our prediction "cat 80% likely, dog 20% likely" really is
> better than "boat 80% likely, dog 20% likely", because cat is more similar
> to dog than boat is; we can't only look at the prediction for dog. But we
> can't evaluate like that until we find a good algorithm that is first good
> at exact-match accuracy (I predict cat, and nothing else), then at
> probabilities (I predict cat 80%, dog 20%), then at similars, and so on,
> eventually making the test the same as the algorithm. We can store
> arithmetic codes better by making cat and dog share the same space.
>
> ???
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T5031dc71532ea161-M4bd7b35bc2034c0c9f44839d
