On Wed, Apr 30, 2025 at 12:42 PM James Bowery <jabow...@gmail.com> wrote:
> What I came up with was the widely cited "Causal inference using the 
> algorithmic Markov condition" by Janzing and Schölkopf.  That paper argues 
> for a method, based on computable approximation of Kolmogorov complexity, to 
> select from among different directed acyclic graphs that model a standard 
> dataset.  This is, of course, reminiscent of the, in my opinion, profoundly 
> misleading "MDL principle" as set forth by Rissanen, in which the descriptive 
> codes were not Turing complete.
> At least their pedantry addresses some of Matt's and does so even
> without the requirement of directed cyclic graphs to model datasets.

I think I understand. We say that X causes Y if we can describe Y as a
function of X. If the simplest joint description of X and Y has the
form Y = f(X), then algorithmic information is picking out the causal
direction. For example, given

X Y
- -
1 1
2 2
3 2

I can write Y as a function of X, but not X as a function of Y,
because y = 2 corresponds to two different values of x. Thus, the DAG
X -> Y is more plausible than Y -> X.
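The functional-dependence check in the example can be sketched in a few
lines of Python. This is my illustration of the idea, not code from the
paper; `is_function_of` is a hypothetical helper name.

```python
def is_function_of(xs, ys):
    """True if each distinct x maps to exactly one y in the data,
    i.e. the pairs are consistent with some deterministic y = f(x)."""
    mapping = {}
    for x, y in zip(xs, ys):
        if x in mapping and mapping[x] != y:
            return False  # same x, two different y values: not a function
        mapping[x] = y
    return True

X = [1, 2, 3]
Y = [1, 2, 2]
print(is_function_of(X, Y))  # True: Y is a function of X
print(is_function_of(Y, X))  # False: y = 2 maps to both x = 2 and x = 3
```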

To make this practical, the paper postulates a noise term, as Y =
f(X, N). In the X -> Y direction, N can carry zero bits on this data,
since f is deterministic; in the Y -> X direction it cannot, because
y = 2 must be disambiguated between x = 2 and x = 3. Thus, less
algorithmic information is needed to encode the X -> Y direction.
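One way to make "N can be 0" concrete is to count the noise bits the N
term must supply per row: for each observed input value, log2 of the
number of distinct outputs seen with it. This is my own back-of-envelope
sketch, assuming a uniform code for the noise, not the paper's method:

```python
import math
from collections import defaultdict

def noise_bits(xs, ys):
    """Total bits of noise N needed so that y_i = f(x_i, n_i) for some
    deterministic f: log2(#distinct y seen with x_i), summed over rows."""
    groups = defaultdict(set)
    for x, y in zip(xs, ys):
        groups[x].add(y)
    return sum(math.log2(len(groups[x])) for x in xs)

X = [1, 2, 3]
Y = [1, 2, 2]
print(noise_bits(X, Y))  # 0.0: Y is a deterministic function of X
print(noise_bits(Y, X))  # 2.0: given y = 2, one bit per row picks x in {2, 3}
```

The direction needing fewer noise bits (here X -> Y) is the one with the
shorter two-part description, matching the DAG the argument prefers.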

-- 
-- Matt Mahoney, mattmahone...@gmail.com

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T0f47884dae19d52d-M1f5eca5720c58732f2ec5179
Delivery options: https://agi.topicbox.com/groups/agi/subscription