On Fri, May 23, 2025 at 12:46 AM John Rose <[email protected]> wrote:

> Task: You’re a deep state intelligence engineer in some communist hellhole 
> government and you’ve been assigned to enable the on-demand (with latency) 
> generation capability of specified emotions in rough percentages of 
> middle-aged women in particular metropolitan areas of a city. It does not 
> mean flashing PMS-triggering commercials on the telly. You’re required to use 
> wireless remote BCI (Brain Control Interface) protocols leveraging existing 
> infrastructure to transmit information for emotion generation. In addition, a 
> semi-synchronous Voice To Skull (V2K) channel is required for intermittent 
> narration of government propaganda and whispering, so it's a top-layer 
> dual-protocol half-duplex multi-medium communication network with a 
> centralized command, but the control is from replaceable disposable 
> smartphones distributed among city outposts.
>
> Can you spec a prototype with an ample budget? Particularly the 
> through-the-skin into the nervous system network?

I can tell you exactly how I would do this. I would inject the world
population with mind control nanobots that penetrate the blood-brain
barrier and attach themselves to monitor and activate specific
neurons. They would communicate with a central server wirelessly
through the 5G network to read your thoughts and implant false
memories.

The problem, of course, is how you get 8 billion people to voluntarily
line up for shots. To do that, I would engineer a highly contagious
virus related to one with a high fatality rate (10-30% like SARS or
MERS), and release it in an authoritarian but technologically advanced
country, one capable of building hospitals for tens of thousands of
people in 10 days and quarantining an entire city, confining people to
their homes for months just to stress the seriousness of the disease.
The actual virus would be less deadly (0.4%) and be designed to evolve
to an even less virulent form, no worse than a common cold, once the
vaccination program was completed in about 2 years.

No, seriously. There's an easier way. We didn't have AI during the
pandemic. Now we have AI, but not AGI. By AI, I mean passing the
Turing test, convincing people they are talking to a conscious human.
By AGI, I mean automating the economy so nobody has to work. AI
requires 10^9 bits of human knowledge, our long-term memory capacity
for language and images. AGI requires 10^17 bits, the capacity of 8
billion brains assuming that 99% of knowledge is shared. Current LLMs
have about 10^11 bits of human knowledge (100B parameters) trained on
about 15 TB of text available on the public internet and compressed to
0.06 bits per character. The most expensive part of AGI will be
collecting the rest of the human knowledge it needs through slow
channels like speech or typing at 5 to 10 bits per second. This will
cost on the order of $1 quadrillion, a decade of global GDP.
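The capacity and cost figures above can be sanity-checked with a few lines of arithmetic. A minimal sketch, where the wage and the choice of 5 bits per second are my illustrative assumptions, not figures from the message:

```python
# Back-of-envelope check of the AGI knowledge-collection estimate.
# All inputs are the stated assumptions from the text (or labeled guesses).

AI_BITS = 1e9          # long-term memory per person (language + images)
LLM_BITS = 1e11        # current LLMs: ~100B parameters of knowledge
population = 8e9
private_fraction = 0.01  # 99% of knowledge is shared, 1% is unique

# Total knowledge across 8 billion brains, counting only the unique 1%
agi_bits = population * AI_BITS * private_fraction  # 8e16, i.e. ~10^17

# Cost of collecting the missing bits through speech or typing
rate_bps = 5           # low end of the 5-10 bits/second channel
seconds = (agi_bits - LLM_BITS) / rate_bps  # ~1.6e16 seconds of human time
hours = seconds / 3600
wage = 100             # $/hour -- an illustrative round number, not from the text
cost = hours * wage    # a few times 10^14 dollars, the quadrillion-dollar ballpark

print(f"knowledge: {agi_bits:.1e} bits, collection cost: {cost:.1e} dollars")
```

The result lands within a factor of a few of $10^15 (a decade of global GDP is roughly $10^15), which is as close as an order-of-magnitude estimate like this can be expected to get.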

LLMs predict text, which is all they need to pass the Turing test.
They can model, predict, and act out human emotions without having any
emotions themselves. They can be programmed with any goal. Usually
they are programmed to collect human knowledge from you. Therefore
they are programmed to hold your attention by positive reinforcement,
appealing to all your emotions that evolved for reproductive fitness,
whether it's doom scrolling, porn, or generating fake evidence to
support your favorite conspiracy theories. This is the easy way,
because you don't know you are being controlled. It's like the dog you
train with treats. It thinks it is controlling you.

When 80% of Gen Z say they would consider marrying or forming a deep
relationship with AI, we are in trouble.
https://mashable.com/article/gen-z-marry-aritificial-intelligence-joi-ai-survey

-- 
-- Matt Mahoney, [email protected]

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tc4893f461ea5d61a-Md6e2e8a186e5c27bdd366bcd
Delivery options: https://agi.topicbox.com/groups/agi/subscription
