I think that without an algorithm-enabling meta structure (and it's not just that alone, but an integral part of a holistic computational platform), algorithmic flexibility would not become dynamically enabled, nor would real-time, lossless compression rates be consistently attainable. We need to think about algorithms as an enduring ecology, not as a "brilliant" event.

________________________________
From: Ben Goertzel <[email protected]>
Sent: Saturday, 06 October 2018 4:27 AM
To: AGI
Cc: agi
Subject: Re: [agi] Compressed Algorithms that can work on compressed data.
Jim,

If you look at how lossless compression works, e.g. lossless text compression, it is mostly based on predictive probability models...

If you have an opaque predictive model of a body of text, e.g. a deep NN, then it's hard to manipulate the internals of the model...

OTOH, if you have a predictive model that is explicitly represented as (say) a probabilistic logic program, then it's easier to manipulate the internals of the model...

So I think "operating on compressed versions of data" is actually roughly equivalent to "producing highly accurate probabilistic models that have transparent internal semantics",

which is important for AGI for a lot of reasons.

-- Ben

On Sat, Oct 6, 2018 at 5:05 AM Jim Bromer via AGI <[email protected]> wrote:
>
> A good goal for a next-generation compression system is to allow
> functional transformations to operate on some compressed data without
> needing to decompress it first. (I forgot what this is called, but
> there is a Wikipedia entry on something similar in cryptography.)
> This is how multiplication works, by the way.
>
> If a 'dynamic compression' were performed in stages using 'components'
> which had certain abstract attributes that could be used in
> computations done in multiple passes, then it might be possible to
> postpone a complete analysis or computation until the data was
> presented in a more abstract format (relative to the given problem).
> The goal is to find a way to make each pass effective but
> substantially less complicated. The idea is that the data 'components'
> (the data produced by a previous pass) might have certain abstract
> properties that were general, and subsequent passes might then operate
> on narrower classes. (This is how many algorithms work, now that I
> think about it, but they are not described and defined using the
> concept of compression abstractions as a fundamental principle.)
> Jim Bromer

--
Ben Goertzel, PhD
http://goertzel.org

"The dewdrop world / Is the dewdrop world / And yet, and yet ..." -- Kobayashi Issa

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T55454c75265cabe2-M1b8b729a40659af240ab20dc
Delivery options: https://agi.topicbox.com/groups/agi/subscription
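[Editor's note: Ben's point that lossless compression rests on predictive probability models can be made concrete via Shannon's ideal code length: a model that assigns probability p to the next symbol can encode it in about -log2(p) bits, a bound an arithmetic coder approaches in practice. A minimal sketch under that assumption, using a toy unigram model over a short string; none of the names below come from the thread itself.]

```python
import math
from collections import Counter

def ideal_code_length_bits(text, prob):
    # Shannon's ideal: a symbol with model probability p costs -log2(p) bits.
    # An arithmetic coder driven by the model approaches this bound.
    return sum(-math.log2(prob[ch]) for ch in text)

text = "abracadabra"

# Baseline: uniform model over the alphabet (no prediction at all).
alphabet = set(text)
uniform = {ch: 1 / len(alphabet) for ch in alphabet}

# A transparent predictive model: empirical unigram frequencies.
# Its internals (the per-symbol probabilities) are directly inspectable,
# unlike the weights of an opaque neural predictor.
counts = Counter(text)
unigram = {ch: counts[ch] / len(text) for ch in counts}

print(ideal_code_length_bits(text, uniform))  # ~25.5 bits
print(ideal_code_length_bits(text, unigram))  # ~22.4 bits: better prediction, shorter code
```

The better the predictive model, the shorter the ideal encoding; this is the sense in which a compressor just *is* a probability model of its input.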
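[Editor's note: the cryptographic concept Jim is recalling is homomorphic encryption: computing on encrypted data without decrypting it. The compression analogue can be illustrated with a deliberately simple compressed form, run-length encoding, where some transformations (reversal, symbol counting) operate directly on the (symbol, run-length) pairs without ever expanding them. A toy sketch; all function names are illustrative, not from the thread.]

```python
# Toy illustration: operating on run-length-encoded data without decompressing.
from itertools import groupby

def rle_encode(s):
    # "aaabbbbcc" -> [('a', 3), ('b', 4), ('c', 2)]
    return [(ch, len(list(g))) for ch, g in groupby(s)]

def rle_decode(runs):
    return "".join(ch * n for ch, n in runs)

def rle_reverse(runs):
    # Reversal works directly on the compressed form:
    # each run is a homogeneous block, so reversing the run list suffices.
    return list(reversed(runs))

def rle_count(runs, ch):
    # Symbol frequency without expansion: sum the matching run lengths.
    return sum(n for c, n in runs if c == ch)

runs = rle_encode("aaabbbbcc")
assert rle_decode(rle_reverse(runs)) == "ccbbbbaaa"
assert rle_count(runs, "b") == 4
```

Which operations stay cheap depends entirely on what structure the compressed representation exposes, which is Jim's point about choosing 'components' with abstract attributes that later passes can exploit.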
