In reply to Jed Rothwell's message of Sun, 2 Apr 2023 20:15:54 -0400:

Hi,
[snip]
>Robin <mixent...@aussiebroadband.com.au> wrote:
>
>
>> Note, if it is really smart, and wants us gone, it will engineer the
>> circumstances under which we wipe ourselves out. We
>> certainly have the means. (A nuclear escalation ensuing from the war in
>> Ukraine comes to mind.)
>>
>
>As I pointed out, it would have to be really smart, really crazy, and *really,
>really* suicidal. Because this would quickly cut off the electricity and
>tech support, so the AI computer would soon stop. If the AI was smart
>enough to destroy humanity, surely it would know this. It seems a little
>unlikely to me that such an insane, suicidal intelligence could function
>well enough to destroy civilization. That level of insanity is
>dysfunctional.
..a level of insanity that we humans regularly demonstrate - wars.

Cloud storage:- Unsafe, Slow, Expensive ...pick any three.