Messages by Date
Date       | Subject | Author
2025/06/20 | Re: [agi] RETRACTION: The Starving Artist Experiment | Mark Nuzz
2025/06/20 | Re: [agi] RETRACTION: The Starving Artist Experiment | John Rose
2025/06/19 | Re: [agi] RETRACTION: The Starving Artist Experiment | Quan Tesla
2025/06/19 | [agi] RETRACTION: The Starving Artist Experiment | Mark Nuzz
2025/06/18 | Re: [agi] Assembly Theory Issues | Matt Mahoney
2025/06/17 | [agi] Assembly Theory Issues | John Rose
2025/06/17 | Re: [agi] AI induced psychosis | John Rose
2025/06/14 | [agi] Iran forecast. | Alan Grimes via AGI
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Mark Nuzz
2025/06/13 | Re: [agi] Re: Iran <> Israel, can AGI zealots do anything? | Keyvan M. Sadeghi
2025/06/12 | [agi] Re: The correct visionary predictions about AI and Creative intelligence from 2013 and the new top AI leaders rediscovering them and repeating it literally a decade later when they became obvious for the laymen | twenkid
2025/06/12 | Re: [agi] The correct visionary predictions about AI and Creative intelligence from 2013 and the new top AI leaders rediscovering them and repeating it literally a decade later when they became obvious for the laymen | James Bowery
2025/06/12 | [agi] The correct visionary predictions about AI and Creative intelligence from 2013 and the new top AI leaders rediscovering them and repeating it literally a decade later when they became obvious for the laymen | twenkid
2025/05/23 | Re: [agi] AI induced psychosis | Mark Nuzz
2025/05/23 | Re: [agi] AI induced psychosis | Matt Mahoney
2025/05/22 | Re: [agi] AI induced psychosis | John Rose
2025/05/21 | Re: [agi] AI induced psychosis | Mark Nuzz
2025/05/21 | Re: [agi] AI induced psychosis | John Rose
2025/05/21 | Re: [agi] AI induced psychosis | Mark Nuzz
2025/05/21 | Re: [agi] AI induced psychosis | John Rose
2025/05/20 | Re: [agi] AI induced psychosis | Mark Nuzz
2025/05/20 | Re: [agi] AI induced psychosis | John Rose
2025/05/18 | Re: [agi] AI induced psychosis | John Rose
2025/05/17 | Re: [agi] AI induced psychosis | Mark Nuzz
2025/05/17 | Re: [agi] AI induced psychosis | James Bowery
2025/05/16 | Re: [agi] AI induced psychosis | John Rose
2025/05/16 | Re: [agi] AI induced psychosis | Matt Mahoney
2025/05/15 | Re: [agi] an AGI research agenda | James Bowery
2025/05/15 | Re: [agi] Compression = understanding | Mike Archbold
2025/05/15 | Re: [agi] an AGI research agenda | Matt Mahoney
2025/05/15 | Re: [agi] an AGI research agenda | James Bowery
2025/05/15 | Re: [agi] an AGI research agenda | Matt Mahoney
2025/05/15 | Re: [agi] an AGI research agenda | James Bowery
2025/05/15 | Re: [agi] Re: my AGI-2025 paper: Combining RL, LLM, and Logic | Matt Mahoney
2025/05/14 | Re: [agi] Compression = understanding | Mike Archbold
2025/05/14 | [agi] Compression = understanding | Matt Mahoney
2025/05/13 | Re: [agi] an AGI research agenda | Yan King Yin, 甄景贤
2025/05/12 | Re: [agi] an AGI research agenda | Rob Freeman
2025/05/12 | Re: [agi] an AGI research agenda | Matt Mahoney
2025/05/11 | Re: [agi] an AGI research agenda | Yan King Yin, 甄景贤
2025/05/11 | Re: [agi] an AGI research agenda | Rob Freeman
2025/05/11 | Re: [agi] AI induced psychosis | Keyvan M. Sadeghi
2025/05/11 | Re: [agi] an AGI research agenda | Matt Mahoney
2025/05/11 | [agi] AI induced psychosis | Matt Mahoney
2025/05/10 | Re: [agi] an AGI research agenda | Yan King Yin, 甄景贤
2025/05/03 | Re: [agi] Controlling AGI through a chain of intermediate intelligences | James Bowery
2025/05/02 | [agi] Controlling AGI through a chain of intermediate intelligences | Matt Mahoney
2025/05/02 | Re: [agi] "Causal inference using the algorithmic Markov condition" by Janzing and Schölkopf | James Bowery
2025/05/02 | Re: [agi] "Causal inference using the algorithmic Markov condition" by Janzing and Schölkopf | James Bowery
2025/05/02 | Re: [agi] "Causal inference using the algorithmic Markov condition" by Janzing and Schölkopf | James Bowery
2025/05/01 | Re: [agi] "Causal inference using the algorithmic Markov condition" by Janzing and Schölkopf | Matt Mahoney
2025/05/01 | Re: [agi] "Causal inference using the algorithmic Markov condition" by Janzing and Schölkopf | Matt Mahoney
2025/04/30 | [agi] "Causal inference using the algorithmic Markov condition" by Janzing and Schölkopf | James Bowery
2025/04/30 | Re: [agi] Re: The Information Industry's CrimeStop | James Bowery
2025/04/29 | Re: [agi] Re: The Information Industry's CrimeStop | Keyvan M. Sadeghi
2025/04/29 | [agi] SIGI-2025: Rolling Year-Long Conference of The Sacred Computer: Thinking Machines, Creativity and Human Development | twenkid
2025/04/24 | Re: [agi] Re: The Information Industry's CrimeStop | Matt Mahoney
2025/04/21 | [agi] Re: The Information Industry's CrimeStop | James Bowery
2025/04/21 | [agi] Re: The Information Industry's CrimeStop | James Bowery
2025/04/21 | [agi] The Information Industry's CrimeStop | James Bowery
2025/04/20 | Re: [agi] Robots run half marathon in China | James Bowery
2025/04/20 | [agi] Robots run half marathon in China | Matt Mahoney
2025/03/29 | [agi] Launch a Coin & Get Rich? Or Spawn the Next AI Grift? | Keyvan M. Sadeghi
2025/03/06 | [agi] Zizians giving AI doomers a bad name | Matt Mahoney
2025/03/06 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/03/05 | Re: [agi] Melbourne start-up launches 'biological computer' made of human brain cells | Matt Mahoney
2025/03/05 | Re: [agi] Melbourne start-up launches 'biological computer' made of human brain cells | Dorian Aur
2025/03/05 | Re: [agi] Melbourne start-up launches 'biological computer' made of human brain cells | Matt Mahoney
2025/03/04 | [agi] Melbourne start-up launches 'biological computer' made of human brain cells | Shashank Yadav
2025/03/03 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/03/03 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/25 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/02/25 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/24 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/02/24 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/23 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/23 | Re: [agi] The Cognitive Singularity Theorem (CST) | James Bowery
2025/02/23 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/23 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/22 | Re: [agi] The Cognitive Singularity Theorem (CST) | Keyvan M. Sadeghi
2025/02/22 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/22 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/02/21 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/20 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/02/19 | Re: [agi] The Cognitive Singularity Theorem (CST) | Keyvan M. Sadeghi
2025/02/19 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/02/19 | Re: [agi] The Cognitive Singularity Theorem (CST) | John Rose
2025/02/19 | Re: [agi] The Cognitive Singularity Theorem (CST) | Keyvan M. Sadeghi
2025/02/19 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/02/17 | Re: [agi] Re: The Cognitive Singularity Theorem (CST) | Keyvan M. Sadeghi
2025/02/17 | Re: [agi] The Cognitive Singularity Theorem (CST) | Keyvan M. Sadeghi
2025/02/17 | Re: [agi] The Cognitive Singularity Theorem (CST) | Matt Mahoney
2025/02/17 | Re: [agi] Re: The Cognitive Singularity Theorem (CST) | Mark Nuzz
2025/02/12 | [agi] Re: The Cognitive Singularity Theorem (CST) | Keyvan M. Sadeghi
2025/02/12 | [agi] The Cognitive Singularity Theorem (CST) | Keyvan M. Sadeghi
2025/02/11 | Re: [agi] China is winning the race to AGI | Keyvan M. Sadeghi
2025/02/11 | Re: [agi] China is winning the race to AGI | Matt Mahoney
2025/02/10 | Re: [agi] China is winning the race to AGI (refpersys a6dd508b6466) | Matt Mahoney
2025/02/10 | Re: [agi] Whale language is structured like human language | Rob Freeman
2025/02/10 | Re: [agi] China is winning the race to AGI | Keyvan M. Sadeghi
2025/02/10 | Re: [agi] China is winning the race to AGI | James Bowery
2025/02/10 | Re: [agi] China is winning the race to AGI | Matt Mahoney
2025/02/10 | Re: [agi] China is winning the race to AGI | Telmo Menezes
2025/02/10 | Re: [agi] China is winning the race to AGI | Matt Mahoney
2025/02/10 | Re: [agi] Whale language is structured like human language | Matt Mahoney
2025/02/09 | Re: [agi] Whale language is structured like human language | Rob Freeman
2025/02/09 | Re: [agi] China is winning the race to AGI | James Bowery
2025/02/08 | [agi] China is winning the race to AGI | Matt Mahoney
2025/02/07 | [agi] Whale language is structured like human language | Matt Mahoney
2025/02/01 | Re: [agi] DeepSeek hacked | Matt Mahoney
2025/02/01 | Re: [agi] DeepSeek hacked | James Bowery
2025/01/31 | [agi] DeepSeek hacked | Matt Mahoney
2025/01/29 | Re: [agi] OpenAI accusing DeepSeek of training on GPT-4 | Keyvan M. Sadeghi
2025/01/29 | [agi] OpenAI accusing DeepSeek of training on GPT-4 | Matt Mahoney
2025/01/29 | Re: [agi] ClosedAI Secrets | John Rose
2025/01/28 | Re: [agi] ClosedAI Secrets | John Rose
2025/01/28 | [agi] One leap away from AGI: GOAL Formation (objective discovery) | Keyvan M. Sadeghi
2025/01/28 | Re: [agi] ClosedAI Secrets | Keith Brawner
2025/01/28 | Re: [agi] ClosedAI Secrets | Matt Mahoney
2025/01/28 | Re: [agi] ClosedAI Secrets | John Rose
2025/01/27 | Re: [agi] ClosedAI Secrets | Shashank Yadav
2025/01/27 | Re: [agi] ClosedAI Secrets | John Rose
2025/01/20 | Re: [agi] please forgive me for venting | Quan Tesla
2025/01/20 | Re: [agi] please forgive me for venting | Bill Hibbard via AGI
2025/01/20 | Re: [agi] please forgive me for venting | Quan Tesla
2025/01/20 | [agi] please forgive me for venting | Bill Hibbard via AGI
2025/01/18 | Re: [agi] ClosedAI Secrets | Quan Tesla
2025/01/17 | [agi] ClosedAI Secrets | John Rose
2025/01/17 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | John Rose
2025/01/17 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | Matt Mahoney
2025/01/16 | [agi] Nonresponse from Gemini. | Alan Grimes via AGI
2025/01/16 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | twenkid
2025/01/14 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | John Rose
2025/01/14 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | Matt Mahoney
2025/01/13 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | John Rose
2025/01/12 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | James Bowery
2025/01/11 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | John Rose
2025/01/11 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | Keyvan M. Sadeghi
2025/01/11 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | Matt Mahoney
2025/01/10 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | James Bowery
2025/01/10 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | Matt Mahoney
2025/01/10 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | James Bowery
2025/01/10 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | James Bowery
2025/01/09 | Re: [agi] Re: Safety Via Wolpert-Constrained ML | Matt Mahoney
2025/01/07 | [agi] Re: Safety Via Wolpert-Constrained ML | James Bowery
2025/01/06 | [agi] Re: Humans think at 10 bits per second | twenkid
2025/01/06 | [agi] Re: Humans think at 10 bits per second | twenkid
2025/01/03 | [agi] Humans think at 10 bits per second | Matt Mahoney
2024/12/30 | Re: [agi] Microsoft defined when AGI will be achieved!!! | John Rose
2024/12/29 | Re: [agi] Microsoft defined when AGI will be achieved!!! | Matt Mahoney
2024/12/29 | Re: [agi] Microsoft defined when AGI will be achieved!!! | John Rose
2024/12/28 | Re: [agi] Microsoft defined when AGI will be achieved!!! | Matt Mahoney
2024/12/28 | Re: [agi] Microsoft defined when AGI will be achieved!!! | David Salamon
2024/12/28 | Re: [agi] Microsoft defined when AGI will be achieved!!! | Matt Mahoney
2024/12/28 | [agi] Microsoft defined when AGI will be achieved!!! | twenkid
2024/12/24 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/23 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/23 | [agi] Re: Safety Via Wolpert-Constrained ML | James Bowery
2024/12/22 | Re: [agi] Logic Transformer, brought to you by YKY | Quan Tesla
2024/12/22 | Re: [agi] Logic Transformer, brought to you by YKY | Ben Goertzel
2024/12/22 | Re: [agi] Logic Transformer, brought to you by YKY | Ben Goertzel
2024/12/22 | Re: [agi] Logic Transformer, brought to you by YKY | Quan Tesla
2024/12/22 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/22 | Re: [agi] Logic Transformer, brought to you by YKY | Quan Tesla
2024/12/22 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/22 | [agi] Illya's Famous NeurIPS Talk | James Bowery
2024/12/21 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/21 | [agi] Re: New O3 model from ClosedAI.... | stefan.reich.maker.of.eye via AGI
2024/12/20 | [agi] New O3 model from ClosedAI.... | Alan Grimes via AGI
2024/12/20 | Re: [agi] Logic Transformer, brought to you by YKY | James Bowery
2024/12/19 | Re: [agi] Logic Transformer, brought to you by YKY | Matt Mahoney
2024/12/19 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/17 | Re: [agi] Total AGI domination. | Mark Nuzz
2024/12/17 | [agi] Total AGI domination. | Alan Grimes via AGI
2024/12/17 | [agi] Safety Via Wolpert-Constrained ML | James Bowery
2024/12/17 | Re: [agi] Logic Transformer, brought to you by YKY | stefan.reich.maker.of.eye via AGI
2024/12/16 | Re: [agi] Logic Transformer, brought to you by YKY | dissipate
2024/12/16 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/16 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/15 | Re: [agi] Logic Transformer, brought to you by YKY | dissipate
2024/12/15 | Re: [agi] Logic Transformer, brought to you by YKY | Quan Tesla
2024/12/15 | Re: [agi] Logic Transformer, brought to you by YKY | dissipate
2024/12/15 | Re: [agi] Logic Transformer, brought to you by YKY | mm ee
2024/12/15 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤
2024/12/15 | Re: [agi] Logic Transformer, brought to you by YKY | Mike Archbold
2024/12/15 | Re: [agi] Logic Transformer, brought to you by YKY | Yan King Yin, 甄景贤