Jim,

If you look at how lossless compression works, e.g. lossless text
compression, it is mostly based on predictive probability models ...
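To make that concrete, here is a minimal Python sketch (function names are mine, not standard) of the link between prediction and compression: an ideal entropy coder spends about -log2 p(symbol) bits per symbol, so a model that predicts the text well compresses it well.

```python
import math

def ideal_code_length(text, predict):
    """Total bits an ideal entropy coder would spend, given a
    predictive model: -log2 p(actual next symbol | context)."""
    bits = 0.0
    for i, ch in enumerate(text):
        bits += -math.log2(predict(text[:i], ch))
    return bits

# Uniform model over a 2-symbol alphabet: 1 bit per symbol, no learning.
def uniform(context, ch):
    return 0.5

# Adaptive order-0 model (Laplace-smoothed counts): learns that 'a'
# dominates this text and assigns it probability approaching 1.
def adaptive(context, ch):
    return (context.count(ch) + 1) / (len(context) + 2)

text = "a" * 99 + "b"
print(ideal_code_length(text, uniform))   # 100.0 bits
print(ideal_code_length(text, adaptive))  # ~13.3 bits on this skewed text
```

The better the probability model, the shorter the code; real compressors (PPM, CM, NN-based) differ mainly in how the `predict` function is built.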

If you have an opaque predictive model of a body of text, e.g. a deep
NN, then it's hard to manipulate the internals of the model ...

OTOH if you have a predictive model that is explicitly represented as
(say) a probabilistic logic program, then it's easier to manipulate
the internals of the model...
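For instance (a toy sketch; the rules and numbers are made up, with an explicit probability table standing in for a probabilistic logic program):

```python
# Toy "transparent" predictive model: explicit conditional-probability
# rules, here P(next word | previous word).
model = {
    "the": {"cat": 0.7, "dog": 0.3},
    "cat": {"sat": 0.9, "ran": 0.1},
}

# Because the internals are explicit, we can inspect a rule...
assert model["the"]["cat"] == 0.7

# ...and manipulate it directly, e.g. shift belief toward "dog"
# and renormalize, with no retraining and no opaque weight surgery:
model["the"]["dog"] = 0.6
total = sum(model["the"].values())
model["the"] = {w: p / total for w, p in model["the"].items()}
assert abs(sum(model["the"].values()) - 1.0) < 1e-9
```

In a deep NN the corresponding "rule" is smeared across millions of weights, so there is no single entry to read off or edit.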

So I think actually "operating on compressed versions of data" is
roughly equivalent to "producing highly accurate probabilistic models
that have transparent internal semantics"

Which is important for AGI for a lot of reasons

-- Ben
On Sat, Oct 6, 2018 at 5:05 AM Jim Bromer via AGI <[email protected]> wrote:
> 
> A good goal for a next generation compression system is to allow
> functional transformations to operate on some compressed data without
> needing to decompress it first. (I forgot what this is called but
> there is a Wikipedia entry on something similar in cryptography.)
> This is how multiplication works by the way.
> 
> If a 'dynamic compression' were performed in stages using 'components'
> which had certain abstract attributes that could be used in
> computations that were done in multiple passes, then it might be
> possible to postpone a complete analysis or computation until the data
> was presented in a more abstract format (relative to the given
> problem). The goal is to find a way to make each pass effective but
> substantially less complicated. The idea is that the data 'components'
> (the data produced by a previous pass) might have certain abstract
> properties that were general, and subsequent passes might then operate
> on narrower classes. (This is how many algorithms work now that I
> think about it, but they are not described and defined using the
> concept of compression abstractions as a fundamental principle.)
> Jim Bromer
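A small Python sketch of the goal in the quoted paragraphs (function names are mine): a transformation applied directly to compressed run-length data, with no decompression step, plus the multiplication-via-logarithms reading of Jim's last remark.

```python
import math
from itertools import groupby

def rle(data):
    """Run-length encode: 'aaab' -> [('a', 3), ('b', 1)]."""
    return [(ch, len(list(g))) for ch, g in groupby(data)]

def count_in_rle(runs, ch):
    """Count occurrences of ch directly on the compressed form,
    without decompressing -- one term per run, not per symbol."""
    return sum(n for c, n in runs if c == ch)

data = "aaaabbbaaaaaacc"
runs = rle(data)
assert count_in_rle(runs, "a") == data.count("a")

# Multiplication, read the same way: log is a "compressed"
# representation under which multiplication becomes addition,
# so we can multiply without ever leaving log space.
x, y = 12.0, 34.0
assert math.isclose(math.exp(math.log(x) + math.log(y)), x * y)
```

The cryptographic analogue Jim alludes to works the same way: the operation is carried out on the transformed representation, and only the final result is ever decoded.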



-- 
Ben Goertzel, PhD
http://goertzel.org

"The dewdrop world / Is the dewdrop world / And yet, and yet …" --
Kobayashi Issa

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T55454c75265cabe2-M41141e5fa13b30106edc749f
Delivery options: https://agi.topicbox.com/groups/agi/subscription