
A considerable group of people think AGI or even ASI is right around the corner




I've never gotten a straight answer as to whether AGI is a good thing for humanity or the economy.

Real AGI would be alive and would be capable of art and music and suffering and community, of course. So there would really be no more need for humans, except to shovel the coal (or the bodies of other humans) into the furnaces that power the Truly Valuable members of society, the superintelligent AIs, whom all other aspects of our society will be structured to serve.

Real AGI might realistically decide to go to war with us, if we've learned anything from current LLMs and their penchant for blackmail.


Best case scenario for ASI is that they create enormous wealth and keep humans around as pets because it costs essentially nothing, like in the Culture series by Iain Banks or the Polity series by Neal Asher.

Humans are extremely costly in terms of the resources we need to survive, the space we take up, and the messes we make (everything from the shit and piss we generate to the wars, destruction, death, and violence we inflict on pretty much everything around us). We'd make for entertaining pets, but we're the farthest thing from low maintenance. Even ignoring that all other animals on earth share many of the problems we'd impose on our owners, we stand alone in having done devastating amounts of damage (some of it irreversible) to ourselves and our planet.

I'd hope that they'd keep a few of us around, but it's hard to see the logic in them keeping all of us and allowing us the freedom to live and breed the way we do right now.


"Humans are extremely costly in terms of the resources we need in order to survive"

No. With the kind of wealth ASI can generate, keeping 10 billion humans alive with a very good standard of living is like a human owning a cat.


That's all been thought of, yeah.

No, AGI isn't a good thing. We should expect it to go badly, because there are so many ways it could be catastrophic. Bad outcomes might even be the default without intervention. We have virtually no idea how to steer AGI toward good outcomes.

AGI isn't being pursued because it will be good; it's being pursued because it is believed to be more or less inevitable, and everyone wants to be the one holding the reins, for the best odds of survival and/or of being crowned god-emperor (this is pretty obviously Sam Altman's angle, for example).


Is it believed to be inevitable, or are their careers just dependent on that being the case at all costs?

Those seem the same to me.

It is believed by some to be inevitable, yes.

Sure, just as a considerable group of alchemists believed the recipe for gold was right around the corner

Technically we can turn lead into gold now, though it's not economical or scalable. So the alchemists were proven right in the end.

> So the alchemists were proven right in the end.

Except for the fact that it wasn't right around the corner???


Remove the people with horses in the race and your "considerable" group becomes much, much smaller.

Don't a considerable group of people think the rapture is right around the corner?

https://www.pewresearch.org/short-reads/2022/12/08/about-fou...


A surprisingly sticky belief when taught young enough. Anxiety about having been left behind every time you come home to an empty house is just a bonus.

Thankfully education about physics, first principles, and critical thinking got me out from under it. Hopefully they can do the same for the rest--if we get them young enough.


Always.

Coincidentally, it is the same group trying to sell "AI" startups/services.

Modern AI is not an intelligence. I wonder what crap they are calling AGI.


If it were that close, I think they would have made well over half their money back by now...


