AI, if unregulated, could be a lot worse than social media. After his first chat with an AI, my kid said, "He is my new friend." I was alarmed and explained it to him, but what about the parents and guardians who are unaware of how their kids are befriending the AI? Part of the problem is also how it is trained to be nice and encouraging. I am sure there are researchers talking about it, but the question is: are the policymakers listening to them?
With the current acceleration of technology this is a repeating pattern. The new thing popular with kids is not understood by the parents before it is too late.
It kind of happened to me with online games. They were a new thing, and no one knew to what degree they could be addictive and life-damaging. As a result, I am probably overprotective of my own kids when it comes to anything related to games.
We are already seeing many of the effects of the social media generation and I am not looking forward to what is going to happen to the AI natives whose guardians are ill-prepared to guide them. In the end, society will likely come to grips with it, but the test subjects will pay a heavy price.
I tried the new Battlefield game and it’s bizarre how some of my friends play it. There’s this expensive battle pass (pay $20/quarter for access, on top of the $70 base game) where you’re given in-game tasks to make progress toward cosmetics. My friends only play to complete these tasks; the actual combat gameplay loop almost feels secondary to their enjoyment of the game. The modern trend of games becoming chore simulators worries me because (call me old-fashioned) I believe the core gameplay loop should be the fun and addictive part, not the progression scaffolded on top, even if that gameplay loop is a little “distasteful” like GTA’s.
Violent video games might not have impacted society, but what about addictive social media and addictive online games?
What about the algorithm feeding highly polarized content to folks? It's the new "lead in the air and water" of our generation.
What about green-text-bubble peer pressure? Fortnite and Roblox FOMO? The billion anime gacha games that are exceedingly popular? Whale hunting? Kids are being bullied and industrially engineered into spending money they shouldn't.
Raising kids on iPads, shortened attention spans, social media induced depression and suicide, lack of socialization, inattention in schools, ...
Social media leading people to believe everyone is having more fun than them, is better looking than them, that society is the source of their problems, ...
Now the creepy AI sex bots are replacing real friends.
To be fair, you said it yourself: the problem is ONLINE games, so why did you generalize that to all video games?
I'm with you that those are addictive in a bad way; I was there too. But single-player games have no incentive to entertain you forever to generate more money.
I have no problems with my kids playing single or local coop games.
“Regulate”? We can’t, and shouldn’t, regulate everything. Policymakers should focus on creating rules that ensure data safety, but if someone over 18 wants to marry a chatbot… well, that’s their (stupid) choice.
Instead of trying to control everything, policymakers should educate people about how these chatbots work and how to keep their data safe. After all, not everyone who played Doom in the ’90s became a real killer or went on to assault women because of YouPorn.
Society will adapt to these ridiculous new situations…what truly matters is people’s awareness and understanding.
We can, and we should, regulate some things. AI has, quite suddenly, built up billions of dollars worth of infrastructure and become pervasive in people's daily lives. Part of how society adapts to ridiculous new situations is through regulations.
I'm not proposing anything specifically, but the implication that this field should not be regulated is just foolish.
People think that we can just magically regulate everything. It's like a medieval peasant who doesn't understand chemistry/physics/etc thinking they can just pray harder to have better odds of something.
We literally CAN'T regulate some things for any reasonable definition of "can't" or "regulate". Our society is either not rich enough or not organized in a way to actually do it in any useful capacity and not make the problem worse.
I'm not saying AI chatbots are one of those things, but people toss around the idea of regulation way too casually. AI chatbots are far less cut and dried than bad food or toxic waste, or whatever other extreme anyone wants to misleadingly project down into the long tail of weird stuff with limited upside and potential for unintended consequences elsewhere.
All your argument consists of is: "Somebody somewhere believes something untrue, and people don't use enough precision in their speech, so I am recommending we don't do anything regulatory about this problem."
Having a virtual girlfriend is not selling toxic yoghurt; it doesn’t harm anyone. It’s like if you buy yoghurt and put it on a pizza… you can do what you want with the yoghurt, just like with the AI.
The important thing is keeping the data safe, like the yoghurt that must not be expired when sold.
Despite what the free-market religion has been telling us for decades, we don't actually live in little parallel universes that don't affect each other. Even putting yoghurt on pizza has an effect on the world, not just on the individual doing it. Not understanding this is what will be the end of humanity. AI girl/boyfriends will have a huge effect on society; we should think hard before doing things like that. Slightly slower technological progress is not as disastrous as fast progress gone wrong.
We also don't want to regulate everything. Have you seen that argument someplace, even here? Or is it imaginary? The topic was regulating AI, and on that I like your thought: humans should be better educated and better informed. Should we, maybe, make a regulation to ensure that?
I understand what you’re saying, but it’s a difficult balance. I'm not saying everything needs to be regulated, and I'm not saying we should have full-blown neoliberalism. But think of some of the “social” laws we have today (in the US): no child marriages, no child labor, no smoking before 19, and no drinking before 21. These laws are in place because we understand that those who can exploit will do the exploiting, and those who can be exploited will be exploited. That being said, I don’t agree with any of the age-verification policies here for adult material. Honestly, I'm not sure what the happy medium is.
I already wrote “over 18”. AI is already regulated, you can’t use it if you’re under 14/18. But if you want to ask ChatGPT “what’s the meaning of everything” or “can we have digital children”, that’s a personal choice.
People are weird… for someone who is totally alone, having a virtual wife/child could be better than being completely alone.
They’re not using ChatGPT to do anything illegal and already regulated, like planning to kill someone or committing theft.
I'm of the opinion that should be unregulated as well. Just like you say, what's important is people's awareness and understanding of the tool and how it works underneath.
There is a massive difference between a stuffed animal and an LLM. In fact, they have next to nothing in common. And as such, yes any reasonable parent would react differently to a close friendship suddenly formed with any online service.
The Stanford Prison Experiment only had 24 participants and implementation problems that should have concerned anyone with a pulse. But it’s been taught for decades.
A lot of psych research uses small samples. It’s a problem, but funding is limited and so it’s a start. Other researchers can take this and build upon it.
Anecdotally, watching people melt down over the end of ChatGPT 4o indicates this is a bigger problem than 0.1%. And business-wise, it would be odd if OpenAI kept an entire model available to serve that small a population.
The outcry when 4o was discontinued was such that OpenAI kept it available on paying subscriptions. There are at least enough people attached to certain AI voices that it warrants a tech startup spending the resources to keep an old model around. That’s probably not an insignificant population.
The only thing that's a given is that it's possible that you'll end up paying child support and alimony to someone that hates you, if you marry in real life and have real children.
Choosing an AI wife and kids, rather than facing the possibility many real people face of paying child support and alimony to someone that hates them: I don't see that as an irrational decision (although not an inevitable one either).
In fact, if you don't consider at least the possibility, you are a fool.
But you're not actually having this wife and children if they're AI - they're an illusion, a simulation that only superficially resembles the real thing. It's a poor substitute, and you of all people should know that.
> I don't see as an irrational decision
Here's where it's irrational: the mere possibility is not enough to abandon the prospect, because if one wanted to be consistent, they'd have to avoid similar risks everywhere, and ultimately do hardly anything.
Risks are an inherent part of life and those who avoid them at all costs are universally and consistently miserable.
It comes to personal risk appetite, and risk benefit analysis. My claim in any case is it's better to have an AI spouse and child than being relegated to a mere bank account for an ex spouse that hates you. Maybe in that case it's still worth it so you can produce a child you don't see with someone that hates you, so I'll concede that might be a point of contention. In any case, I make no claim that those are the only possibilities, I merely compare the two.
I haven't presupposed I can make the decision for any particular person.
> It comes to personal risk appetite, and risk benefit analysis.
What analysis when you don't even have a good estimate of the probabilities involved? Min-maxing on the other hand is a recipe for extinction as a species, so not a viable strategy.
> My claim in any case is it's better to have an AI spouse and child than being relegated to a mere bank account for an ex spouse that hates you.
>What analysis when you don't even have a good estimate of the probabilities involved?
Without some analysis, the decision would be irrational. By discounting rational analysis here, while also discounting (in your prior comment) irrationality, you've set a clever, fallacious (and ultimately circular) trap where being rational is irrational and being irrational is rational, producing a gotcha where any counterargument loses. All the while, of course, appealing to the risk/benefit factor of the probability of extinction of the species.
>Min-maxing on the other hand is a recipe for extinction as a species, so not a viable strategy
This doesn't come into play on the contextual claim, as there are plenty enough people reproducing in jurisdictions with no effective alimony or child support to continue the species, if somehow everyone were to come to such a decision, which in any case doesn't seem to be a given.
>How specifically is it better?
I suppose I could play the Socratic method here as well and ask why it's better that the species doesn't go extinct. There is no way to objectively prove that's true, so I'll yield that we're both making assertions about what is subjectively better, rather than claiming by some universal law of the universe that it's better to be under constant threat of imprisonment by the state (if you can't come up with an extra 20%+ of imputed, not even actual, so theoretically above 100%, income) and to be hated, than to enjoy some entertainment with an AI.
Yeah… but there’s often a relationship that happens before that.
Now if you go into that relationship with the mindset of “this person just wants my alimony and child support and hates my guts” I get why you might do yourself and your potential partner / ex-to-be a favour by instead getting an AI relationship.
I support my kids, but because I am married, I only have to spend a small fraction of what court ordered child support would be.
And that is because court ordered child support is actually a misnomer. It is merely a transfer payment to the custodial parent. There is actually no enforceable statutory requirement that it be spent on the child, nor any tracking or accountability that it is done. That would somehow be too impractical, even though somehow it's magically practical to count the pennies of the earner in the opposite direction, to make sure the full income flow is accounted for.
Ah, you have discovered that money is fungible. Good for you.
You were still complaining about giving money to the person who actually pays for the majority of the kids' expenses, most of which are not practicably itemizable: the majority of the Kleenex and TP purchases, ketchup, laundry detergent, parking fees when going to the doctor or buying things for them or the household, etc.
I have two grandkids, one's 3 years old and one's 9 months old.
I feel like I'm not really ready for everything that's going to be vying for their attention in the next couple of decades. My daughter and her husband have good practices in place already IMHO but it's going to be a pernicious beast.
It feels like a more evolved version of the relationships some people in Japan consider themselves to have with anime characters or virtual idols, often treating a doll or life-size pillow replica of the character as someone they can interact and spend time with. As with the AI, the fact that it is so common does suggest it must be filling an unmet need. I guess the key focus needs to be: how do we help those stuck in that situation become unstuck, and how do we help them feel that the unmet need is fulfilled?
"Study finds..." feels clickbaity to me whenever the study is just "we found some randos on social media doing a thing". With little effort a study could find just about any type of person you want on the Internet.
Of course the loneliest 5% are going to do something like this. If it weren't for AI, they'd be writing Twilight fan fiction and roleplaying it in some chatroom, or giving all their money to a "Saudi prince."
Seems like nothing new, just a better or more immersive form of fantasy for those who can't have the life they fantasize about.
I'd argue it'd be psychologically healthier to roleplay in a chatroom with people who are human on the other end (if that could be guaranteed, which it no longer can be).
Humans can potentially be much nastier than a chatbot. There are lonely, vulnerable people who can be exploited, but there are also people who get off on manipulating others and convincing them to make profoundly self-destructive and life-altering choices.
Sure, but Google and OpenAI aren't going to do this kind of manipulation. And the actual sadists are probably not as satisfied with letting a machine do it.
I used to despise AIs' ass-kissing responses.
It doesn't add any value, and it's so cheap it's almost sarcastic.
But now, I feel sad because Codex doesn't praise me even though I come up with a super-clever implementation.
I think the part of my brain for feeling flattered when someone praises me didn't exist because no one complimented me.
But after ChatGPT and Claude flattered me again and again, I finally developed the circuit for feeling accepted, respected, and loved...
It reminds me of when I started stretching in my 30s.
At first it was nothing but torture, but after a while I began to feel good and comfortable when my muscles were stretched, and now I feel like shit when I skip the morning stretching.
Is this necessarily a bad thing? I think a lot of people assume these same people would have developed relationships with humans otherwise. How many of them are better off this way? That'd be an interesting study. I've read a couple of articles on how the "loneliness epidemic" is driving down life expectancy. Could AI chatbots negate that?
"It's not real", yeah, that is weird for sure. But I also find wrestling fans weird, they know it's not real and enjoy it anyways. Even most sports, people take it a lot more seriously than they should.
Something I use as a pretty reliable heuristic is: "Am I treating a thing like a person, or a person like a thing?" If so, then it's maybe not necessarily bad, but probably bad.
It's not about whether it's "real" or not. In this case of AI relationships, extremely sophisticated and poorly understood mechanisms of social-emotional communication and meaning making that have previously only ever been used for bonding with other people, and to a limited extent animals, are being directed at a machine. And we find that the mechanisms respond to that machine as if there is a person there, when there is not.
There is a lot of novel stuff happening there, technologically, socially, psychologically. We don't really know, and I don't trust anyone who is confidently predicting, what effects that will have on the person doing it, or their other social bonds.
Wrestling is theater! It's an ancient craft, well understood. If you're going to approach AI relationships as a natural extension of some well established human activity probably pet bonding is the closest. I don't think it's even that close though.
But you know that the alternative is a lot of other mental illnesses, including things like suicide. Everyone tells people to "get mental help," but it is neither cheap nor accessible to most people.
I've seen no evidence that this sort of relationship to AI use is an effective alternative to mental health treatment. In fact it's so far looking to be about the opposite: as currently implemented LLMs are reinforcement tools for delusional thinking and have already been a known factor in several suicides.
The inaccessibility of healthcare in the US is a serious problem but this is not a solution or alternative to it right now and may never become one.
To me, what you're saying is akin to: "I see no evidence lab-grown food is healthier than real food, so people suffering famine and malnourishment should wait for someone to give them aid instead of eating lab-grown food."
I don't think AI can replace mental health treatment or human relationships. But it might be a viable stopgap. It's like Tom Hanks talking to "Wilson" the volleyball when he was stuck on the island in "Cast Away." Yeah, it's weird, but it helped him survive and cope until he was rescued. I want these people struggling with mental health to survive and cope until they get real help someday. I want fewer suicides, fewer people contracting chronic illnesses, etc., and to hell with any "appearance" of weirdness or stigma.
You're not engaging with my comments in favor of arguing with words I didn't say in defense of positions I don't hold. I don't really see a role for myself in that activity so I'll leave you to it.
I don't think that's fair; I was mostly responding to your first sentence and:
> The inaccessibility of healthcare in the US is a serious problem but this is not a solution or alternative to it right now and may never become one.
My response was clearly not your words, but my understanding of your conclusion.
The few suicides that are reported pale in comparison to suicides caused by loneliness. An argument can also be made that they just haven't trained/found the right companion model yet.
We're stuck in a really perverse collective-action problem. And, we keep doing this to ourselves. These technologies are not enriching our lives, but once they're adopted we either use them, or voluntarily fall behind. There seems to be very little general philanthropy in this regard.
So many of today's problems could be good in principle, but the surrounding facts (laws, human nature, etc.) render them awful. Did anyone think 20 years ago that algorithmic feeds would be so bad? I don't see a reason in principle that they must be bad, but it's clear that we'd be far better off without them.
We're not better off without algorithms either; they've vastly improved the quality of life for many. My point is that it is us humans who are corrupt and evil; that is the problem that needs fixing, not our tools and instruments.
The first thing we did with nuclear power was mass killing and weapons. But if only we had used it for power generation a lot more, we could even have avoided the climate crisis happening now.
Lots of harm has been caused by fire, but it is a pillar of human civilization.
Nevertheless, you can't unspill milk, as they say. This is even more pointless than the "war on drugs." You can regulate the models, though. And stop telling people to "seek mental help": it isn't something you seek and then magically go "poof" and you're cured. If you can be a friend to someone instead of some AI, do that! Be the solution. Otherwise, get out of their way. If they can google ChatGPT, they can google "mental health." People have been making friends with inanimate objects since forever. Lonely people talk to dolls and puppets. There is a huge market for human-like doll wives that cost >$5k. There are even mainstream movies about this with A-list actors, like "Lars and the Real Girl" and "Her."
Life is hard and then you die. People grasp at straws to last as long as they can, because everything that lives wants to continue living. And I hope they fight their best fight instead of giving up early after being dismissed by society to "seek mental help," lol.
P.S.: "mental help" is like lawyers: they're there for the corporations and the ultra-wealthy. Everything accessible to normal people, for the most part, is not useful unless your goal is to get pilled up, in which case I doubt that is better than just talking to some inanimate computer.
The direction we're headed, humanity is going to become utterly isolated pods, never interacting. We're going to end up with humanity being one rich dude, a staff of robots, and some humans under his patronage. The other bit of humanity is a rich woman, with a staff of robots, and some humans under her patronage, because nobody can deal with there being other humans who are as gross and sloppy and who suck just as much as they do.
Relationships are hard. They're a lot of work. Not just romantic relationships, but all other kinds of relationships: family, friendship, mentorship, chosen family, colleague, manager, mentee, client, neighbor, teammate, student, citizen, creative partner, audience. Instead of having any kind of relationship with people, I can just hide away, work remote, become hikikomori.
Where does that leave humanity? As Ms Deejay says, do you think you're better off alone?
Because individual humans no longer need to coexist with their neighbors, the best and worst will flourish. For every supportive person who accepts gay people, there's another person who wants to stone them. Interacting with lots of other people is the only way to develop nuanced opinions of groups of other people, and without any kind of forced interaction there won't be any, further isolating everybody from everyone else.
I think there is a lot of hyperbole in your argument. People interacting because they have no other choice is not a good thing. Do you think forced arranged marriages are good? Do you think being stuck in a toxic relationship of any kind is good? Even on the milder side, boomers and Gen X are chock-full of marriages that started because people didn't want to be alone, or because it's "what everybody else does," and then they end up in divorce, with all sorts of messed-up "children of divorce" left behind.
I don't think anyone would choose a relationship with a computer, or isolation when the alternative is a healthy relationship of their own choice. There is still no replacement for real and authentic human relationships.
Relationships are hard, but entering them should be voluntary, not coerced. And my answer to Ms Deejay is: yeah, we're better off alone than stuck in a coerced toxic relationship. No one goes "hikikomori" or "forever alone" when they could have just put up with some disagreements and uncomfortable situations.
It's important for people to learn to be alone and be OK with it: to be content and happy with your own company. If I were being reductionist, I would even claim that most of the ills of the world are rooted in coerced relationships. Conflict as the default state of a relationship is not better than solitude. But solitude as the default state of a person isn't healthy either.
Balance is key to most things. Have relationships, put up with the messiness of people, but up to a point. Be comfortable with solitude, but up to a point only. Be your own friend first, so you can be one to others.
People are becoming more isolated, but for the most part not because they want to be, and not because of technology, but because of the reconstruction of society so that the most labor and capital can be extracted from a person. Public transportation, public spaces, walkable cities, social media that isn't a brainwashing machine, AI that isn't trained to manipulate human psychology, etc.: those are the changes we need. The false dichotomy of "all or nothing" you eschew is anathema to the goals you seek.
Hermits have been around since forever. You can hunt in the woods, read books, and chill with your dog in Alaska or any number of remote places, and many have done that all through history. Your sentiment is similar to how people thought no one would even talk to each other in person anymore after the invention of the telephone. People talked less in person for sure, but we still talk to each other in person, even when a call is easier.
> No one goes "hikikomori" or "forever alone" when they could have just put up with some disagreements and uncomfortable situations.
Without interviewing a group of people who aren't looking to be interviewed, how can you make that claim? I can't say I've done that either, but in the total range of humans I've met over my time on Earth, there is a huge spectrum in the level of discomfort people are able to put up with. As a simplistic example, how much heat or cold people can tolerate before complaining varies from individual to individual. Some complain at the slightest temperature change. Others would rather pass out than complain. If there's such wide variation on such a simple measure, then we should also expect variation in the ability to "just put up with some disagreements and uncomfortable situations."
I'm not here to try to sell the idea that forced arranged marriages are a good thing, or that we should go back to the 1950s, when women were more chattel than people, or that people should stay in toxic relationships that are slowly killing them inside.
I seek none of those things.
The question is: how do we, at large, as a society, make toxic people less toxic? An absolutist approach to it failed in the 2024 US election.
What I see is increased polarization, in the people around me and in the world. I don't know how to heal that, or how to lessen the toxicity being spewed about, even here. (Not remotely accusing you of it, mind you; it's just that there are certain threads/topics that HN does not do well.)
> People are being more isolated
Some people are becoming more isolated simply because they can afford to. Part of the reason there is a housing crisis is that more people are choosing to live alone. If a house that previously housed 6 people now houses one, maybe two, demand for housing shoots way up. That is not the only reason, but we can't ignore it either.
The telephone required another person on the other end of it. If people talk less in person because of the telephone, they still need to talk to another person, whose personality is lumpy and spiky and lovable in its own way. Anthropomorphization aside, LLMs aren't people, and so my concern (which may prove to be "old man yelling at clouds"; only time can tell) is that we'll collectively get used to talking exclusively with LLMs and lose the social skills necessary to interact and live amongst other humans.
In a society that's becoming increasingly polarized, going no-contact with the toxic people in your life is the right move for individual self-preservation, but those toxic people find other toxic people and become even more toxic. And we each have but one vote to give. Well, billionaires get a couple more because we worship the rich, but the rest of us... the rest of us each only have one.