I find it more concerning that people are developing these pseudo-relationships with chatbots in the first place. This doesn't seem emotionally healthy.
Of course we can put it back, or some parts of it, in the box.
Look at things like tetraethyllead and chlorofluorocarbons. Widely known to cause damage, limited upside. I'm not suggesting we do away with these tools entirely, but we can definitely restrict aspects of them for the greater good.
There's quite likely to be a high-quality, open-source LLM that runs on your mid-price GPU in the not-too-distant future (I give it 5 years max). How do you ban that?
But with open-source or leaked models, you won't have to try hard at all. Downloading a file is not trying hard; anyone who is interested will be able to download and run one of those models easily.
So what? We're never going to be able to stop a dedicated user, but if chatbot pseudo-relationships are really an issue, we can stop incentivizing their distribution. We don't put ads for fentanyl on buses or let pharma companies give out free samples.
Pharmaceutical companies run promotions and coupons all the time. For prescription medications the samples are just given to physicians who prescribe them.
Not sure if it's better or worse than most of the other parasocial pseudo-relationships people have online. I mean following podcasters, influencers, etc. (and worse).
In some ways it might actually be better, if only because para-social relationships with actual people have led to them (the para-celebrity, for lack of a better word) getting hurt.
Well, for starters, an AI won't challenge you, and if there is an incentive to make the AI as pleasing as possible, then people will become more and more isolated, as the relationship with the AI is "better" in terms of not having negative feedback compared to having a real relationship.
Relationships with friends, family, and significant others are full of compromise and learning empathy to maintain that relationship.
But what happens when your primary relationships have no compromise? When you get whatever you want and can be however abusive you want without repercussions? You can see examples of this today without AI, and those relationships are generally deemed by society to be "toxic".
Living in your own private bubble of an AI product, written by someone whose primary goal is engagement, won't give you the tools to function with a group of other living people.
This seems like a shortfall of the current models, and not a limitation of AI companions in general.
> AI is “better” in terms of not having negative feedback compared to having a real relationship
I'd wager that most "real" relationships don't involve nearly as much critique as you're implying. Nobody wants to kill the happy juice their partner generates, whether their partner has skin or circuits.
Plus, there are plenty of other options - such as a counselor - who can provide much better and more honest feedback than a personal relationship ever could. One of the compromises we make in a relationship with other meatbags is simply accepting some parts of our partners that we don't like.
People have been developing fake relationships and playing social games that end in harm and heartbreak for millennia, but now that we're talking to a computer it's a problem.
Lots of drugs were problematic before more modern refinement methods. My problem was with the framing of chatbots and the harm from them as a uniquely modern problem, when harm from social activity has existed as long as social activity has. People use chatbots as an escape from the intrinsic harm that is possible in social interaction.
If you look at it that way, you have to pick your poison. Is harm from social activity worse than the harm of social isolation when scaled up to the general population? Or rather, do the benefits of isolation exceed the benefits of social interaction? In the past this has been rather self-balancing.
If chatbots provide a perfect escape from any negative outcomes of interaction, then fewer and fewer people develop the necessary skills of empathy and social communication.
It certainly isn't. Nothing in the category of reinforcement of self-delusion can possibly be. No "affirmation" can change reality. They can only fit into it or induce living in a hallucinated one.
From the perspective of a single person, every other person is just an agent. The only thing distinguishing delusion from reality is how convincing and sustained the delusion is. If the delusion is indefinite and perfectly convincing throughout, it is reality. What I experience as blue doesn't really exist. The wavelengths exist, but the sensory experience is illusion and highly artificial.
You are using a definition of reality that actually belongs to perceived reality. In the thinking you are proposing, every hallucination is indistinguishable from reality. It is the perfect self-delusion: one that doesn't need any reality checks, because all you demand are hallucinated checks to claim it is real. But it's not. By the way, notice that this kind of thinking is the most fertile ground for lots of ideologies to grow and flourish, to the point of anima possession.
Remember "Wilson" from Cast Away? That felt realistic to me. A pseudo-relationship in which the partner can actually reply intelligently is just an optional feature.
I met a genius network engineer once who was in a "relationship" with a woman in South America who was obviously fake. He would proudly show me her messages to him that were obviously stolen from some IG or OF model. The texts were pleasant broken English and no real substance. She would go days or weeks without messaging him which clearly took a real, heavy toll on him. They would talk about her moving to the USA someday. Topically of course, never anything concrete. Until one day she finally left him, after ghosting him for weeks. He was devastated. It was sad to watch.
So I posit this to you: would it have been easier for him to have a fake relationship with a real person, or a real relationship with a fake person?
> So I posit this to you: would it have been easier for him to have a fake relationship with a real person, or a real relationship with a fake person?
Well, the real relationship eventually broke, forcing him to accept the reality. Also, the fake relationship was probably closer to a real relationship than an AI chatbot would be.
That being said, neither is a good option, and this is like wondering whether an SQL injection or a buffer overflow is the better vulnerability to have.
One scenario has a human being taking advantage of another human being. One person harming another person covertly. This doesn't give the victim the opportunity to fully understand their situation. This is pure manipulation.
On the other hand, befriending a machine is a choice. You are presented with the facts up front. That this is a machine that imitates companionship. You are not being manipulated or tricked. You are engaging in a form of entertainment, and you can clearly see what you are talking to.
I would rather have a real relationship with a fake thing than a fake relationship with a real person.
That's not the point I was trying to make and it probably was a bad idea to engage with it at all.
An AI girlfriend might be a slight step up; it doesn't really matter. The situation is still far from great either way. In the end, that engineer would be far better off if he could find a real relationship.
How often can I talk to someone who has all my same interests, is available 24/7, and will bend to my every will? Could this be a cheap source of oxytocin? dopamine?
The utilitarian argument makes me think we should embrace it. But it's easy for me to say as a married man with 4 kids, living a nice life in the real world.
I understand this may be an unpopular take here on HN (it certainly isn't in the real world), but I can't see these people as anything other than pathetic. Go out, get a hobby, and meet real people. Jesus Christ.
This is one of those topics where proponents will not show up in the comments because of the stigma.
Realistically, there's a whole sector of male society that is not able to find real partners.
Is it really that depressing to you that people would rather talk to their favourite person than some lady shouting about how shoelaces are racist and misogynist?
Ahh yes. I will fight labels using my labels. Then it's simply a label race to the bottom.
My partner recently bought a label maker and I find it quite entertaining how obsessed he is with stereotyping our objects in the house.
I will probably never understand the duplicitous logic that comes with sexual identity and why being attracted to the same sex somehow not only labels you but also means you become an advocate for labels.
A pointless endeavour, I agree. So shall we just do away with the labels and agree that people are inscrutable entities beyond the definition of a simple label?
We are but one possible reality seeded by some or other starting inputs, with no say in the choices we make. Fated to execute instructions at the whim of some entity we can't comprehend from within the confines of dimensional space.
Regardless of how you view my preceding comments, you have to admit the dating game for single straight men is a nightmare right now.
Source: believe it or not, I have a lot of straight friends, and one of them literally got robbed recently by a hooker via Tinder. They had a card machine and everything. So tell me what the harm is in letting those people discover more about themselves via a fancy autocorrect; at least they aren't out there getting robbed by hookers.
This is like the debate over welfare and free-riders.
There are some people who need welfare to get by. They cannot provide for themselves. But once a welfare program is created, there will be some number of people who don't need it but decide to live off it anyway.
AI companions are the same deal. There are definitely people who can't socialize. But once you invent AI companions, some number of people who could learn to socialize never will.
So let me flip that: what about the dark side of society, those who can't express themselves legally or physically in whatever jurisdiction they're in? Those people may be able to gain some facsimile of intimacy instead of facing execution, etc.
Not sure how that fits into the welfare scenario though.
This article is about companion AIs but a similar issue will probably arise around conversational AIs for older and lonely people.
I guess this is a very individual decision, but I can't see how you can advise someone who just sits lonely in a (maybe not so great) elderly care home and finds comfort in chatting with an AI to "just find other friends". For some people it's that or nothing.
I suppose it depends on how it's implemented. Mental healthcare is less accessible than ever, while at the same time the population is being squeezed in a way where most have less energy than ever before in recent memory to deal with other people's emotional labor. This means AI might be a useful tool for helping people feel heard and supported, which is often enough to get people motivated to make a change. However, if capitalistic interest is behind it, I anticipate it will result in a product that holds your short-term mental health hostage through a subscription-service ransom. The problem isn't AI, it's the profit incentive in a field which is inherently perverse to profit from.
The route to good mental health generally starts with parents with good mental health then a social circle of people with good mental health.
This raises the question: what if you were not born with that privilege?
Just tell them to go online and look up resources on better mental health? Yea, this is how you end up being an incel, because telling people "It's not your fault, it's everyone else's fault" is a very effective trap huge parts of populations fall into (especially those with bad mental health).
Oh, the route to better mental health is the US healthcare system. Your insurance is paying for that right?
So the route to better mental health is connections with other people... which you don't have, and you don't have the internal tools to build. Especially without fear of rejection.
But hey, this conversation isn't about you right, you have perfect mental health? In fact with that good mental health you go out and help other people that are stuck in traps? Or do you do your own list of things focusing on the next step you're taking on your hedonistic treadmill?
It's always easy to point out how someone else is obviously wrong and how they shouldn't fall into that trap, but when you see hundreds of thousands to millions of people falling into that trap, then something isn't as obvious as you think.
> Yea, this is how you end up being an incel because telling people "It's not your fault, it's everyone else's fault" is a very effective trap huge parts of populations fall into
Fully agreed, but, to be honest, the following:
> The route to good mental health generally starts with parents with good mental health then a social circle of people with good mental health.
> So the route to better mental health is connections with other people... which you don't have, and you don't have the internal tools to build. Especially without fear of rejection.
... reads a lot like "it's not your fault, it's the circumstances'/somebody else's".
> reads a lot like "it's not your fault, it's the circumstances'/somebody else's".
Well, yes, that's how circumstances typically work. I know the rugged individualists are busy launching themselves to space via their own bootstraps, but the average person can become trapped in the life they live pretty easily.
Meanwhile, societies that actually want to move into the future while minimizing the terrible outcomes will have programs to avoid it, such as social security so we don't have to watch grandma and grandpa die on the side of the road of old age. Or in this case, some people might be able to receive help that is outside of your orthodoxy.
> I know the rugged individualists are busy launching themselves to space via their own bootstraps, but average person can become trapped in the life they live pretty easily.
True.
> Meanwhile societies that actually want to move into the future while minimizing the terrible outcomes will have programs to avoid it, such as social security so we don't have to watch grandma and grandpa die on the side of the road of old age.
You might be surprised to learn that I fully agree.
> Or in this case, some people might be able to receive help that is outside of your orthodoxy.
Which would be perfect! The question being discussed is: Is an AI chatbot really "help"? Especially when we're not talking about absolute edge cases, in which case we both agree that the answer is "yes".
These tools aren't there to help people's mental health though, are they? Users are becoming attached to fantasy beings and sad when those beings disappear.
That is the OPPOSITE of healthy, and it's clearly having a dramatically negative effect on their lives.
These aren't tools to help people get better; they are gamed to be more appealing to the user than the real world. That is damaging to society, and if you want to read my argument (like a few of your sibling comments have) as if I'm saying 'Just get healthy!', go wild, but that's obviously not my argument.
You're coming from an angle that assumes I've never had depression or struggled with anxiety. I actually do know how to conquer these things, and it's done by putting yourself out there into uncomfortable situations and seeing what happens.
> Go out, get a hobby, and meet real people. Jesus Christ.
It's a bit more aggressive than it needs to be, but in spirit it's less "get healthy" and more "start doing something about the sht situation". Which is hard, nobody denies this, but, in the end, all advice one can give is going to boil down to "start doing something to change it".
Well, I have to spend about 7 hours a day inside on a computer, as I'm sure you do too! I have to spend my time doing 'something' while Jenkins does its thing :)
PS. I'd say it's more 'exasperation' at the scenario than annoyance at the words. We could sit here playing reductio ad absurdum all day, but I don't think it's useful to do so.
I personally don't, and I think it's because pets make demands of you. Walking a dog, making sure your cat is up to date on vaccines, feeding your fish, etc. These all require some level of sacrifice, which strikes me as inherently un-pathetic.
Which points the way to a new Turing test. Something like: "Can an AI credibly make a demand of the user?"
Right. But if an AI girlfriend doesn't make demands of you, then it's just a boring novelty, isn't it? Why engage with something like that? It's like cheating in a computer game. How can it make you happy?
I feel the problem you paint will solve itself. It doesn't seem to me it will have worse outcomes than computer gaming.
Honestly, I think people who have explicitly replaced children with pets are kind of pathetic. I'm not condemning anyone or demanding that others make the choices I have, but I find something very sad about those "I love my grand-dog" bumper stickers. Perhaps those people are perfectly happy, but I can't help but see a huge gap between what a grandchild can provide vs. your child having a dog that you love.
To get a hobby takes time. It's not an overnight action. You may dedicate time and find that after a month the hobby you took up isn't for you. If you're stuck in a mental mindset, this hurts.
Even when you find a hobby to enjoy, you then need to dedicate time to build a rapport with those who are already established in that hobby. There is a large outer circle you need to navigate. People will bat eyes at you, may greet you at first, but won't form relations.
Why would they waste energy on you when you could disappear in a month's time? A hobby takes months of constant effort and mental strain, while you're being subconsciously judged, to make a connection.
Make a wrong joke, say the wrong thing at the wrong time and you can jeopardise the whole effort.
We currently live in a state of defence, and if your mental image doesn't strike the other party, the percentage is high that they will be hesitant to build rapport with you.
For those with difficulties such as anxiety, introversion and the like, it's even harder.
Taking up a hobby to improve yourself is a good way to go, but actually making friends and the like is easier said than done.
It sucks making friends. I've been attending a new sword fencing club for three months now. I have nothing in common with anyone there; I was acquaintances with one of the fencers, and that somehow snowballed. Not understanding why, I now have to navigate around them, which for me is now a mental strain, when all I wish to do is bout and fence. It doesn't always turn out that way.
Judging by the comments this seems like a very popular take. But I really wish we could move from "wow I hate that" to thinking through the right response, because it's likely going to be our children forming these relationships very soon.
Of course that would be the best solution. It is just that these are risk-averse and socially shy men who tended to either not go to the school dance or if they did were glued to the walls, not getting picked by any girl and not in possession of enough testicular fortitude to go and ask one because (shock and horror) she might (would) say no and start giggling with her friends about that "weirdo" wanting to dance with her.
Fast forward a decade and there was #MeToo which meant that even glancing at a woman could get them ostracised for being the creeps they already thought they were while those girls from school put up Tinder profiles where they all competed for the top 10% of men - looks like a movie star, makes a zillion $currency_units, might be a bastard who already has a number of girls but that doesn't matter.
These 'AI girlfriends' and 'AI boyfriends' (I guess those exist as well) are just another sign of the way dating and sexuality have been dehumanised, commercialised and in some ways ideologically weaponised.
If this is what the sexual revolution has brought us it is time to rethink the premise of the concepts. In some ways that is already happening - viz. the 'trad wife' phenomenon - but the real solution would be to find a way to retain the good bits from said sexual revolution while getting rid of the parts which led us [1] to where we are now. Something which would appeal not only to those of a more conservative bent, i.e. a way which also appeals to most women [2].
Our future is at stake, quite literally: no relationships means no children means no future. This does not mean that children will not be born any more, it just means that they won't be our children who carry along our traditions and cultures - those traditions and cultures which we claim to value where people are born equal with the same rights, where men and women are equal under the law, where freedom of conscience, religion and speech are guaranteed to a differing but mostly large extent, where those who happen to be romantically attracted to their own sex do not get thrown off buildings or sentenced to prison or 'converted' or chemically castrated. We fought quite hard to arrive where we were a few decades ago, by no means perfect - perfection is the enemy of good - but certainly better than before and also better than most other places so it does not make sense to throw all those gains to the wind in the name of... what, exactly?
[1] as in 'those parts of the world where the sexual revolution took place and had these detrimental effects'
[2] for just another proof of the fact that men and women are not interchangeable it suffices to look at the difference in political opinions between 'the average man' and 'the average woman'. Yes, there are 'liberal' men. Yes, there are 'conservative' women. That does not negate the fact that women on average lean more towards 'liberalism' while men tend to lean more towards 'conservatism'.
> It is just that these are risk-averse and socially shy men who tended to either not go to the school dance or if they did were glued to the walls, not getting picked by any girl and not in possession of enough testicular fortitude to go and ask one because (shock and horror) she might (would) say no and start giggling with her friends about that "weirdo" wanting to dance with her.
I understand that what I wrote is much (much!) easier to say rather than do, but well... there's only one way to find out :).
> These 'AI girlfriends' and 'AI boyfriends' (I guess those exist as well) are just another sign of the way dating and sexuality have been dehumanised, commercialised and in some ways ideologically weaponised.
Interesting take. I'm lucky that I've been in a great relationship since before hyper internet dating became a thing (Tinder, Bumble, etc.).
> We fought quite hard to arrive where we were a few decades ago, by no means perfect - perfection is the enemy of good - but certainly better than before and also better than most other places so it does not make sense to throw all those gains to the wind in the name of... what, exactly?
I think the new thing here is that it is not "shareholder profits" but "stakeholder benefits" which are supposed to drive decisions, raising the ESG score and thus the likelihood of being able to procure loans and land investments.
Your second footnote doesn't follow at all. I'm sure the star-bellied sneetches would be far more likely to be in favor of a star-belly supremacist platform as compared with the star-less. The right consistently embraces overt sexism to appeal to their religious zealot base, so it should be no surprise that women tend away from them.
Could you restate what you're trying to tell in normal unbiased language? It would make it easier (or 'possible') to react to your statement. Take out the labels and expletives and come back with what's left (if any).
I'm sorry, I assumed everybody was familiar with the classic Dr. Seuss story "The Sneetches", which is about a race of bird things where some have stars on their bellies and some don't but are otherwise indistinguishable, yet have a segregated society that devolves into chaos when some guy makes a machine that can add or remove stars. The whole thing is an allegory for racism, but it applies here:
You state that women shying away from the right is evidence of fundamental sexual differences, when in fact the right has made a point of adopting anti-women positions which make them unappealing to all but the most self-sabotaging.
> I assumed everybody was familiar with the classic Dr. Seuss story "The Sneetches"
Dr. Seuss is not really part of a Dutch or Swedish upbringing. We do get a whiff every now and then but the Sneetches? Nope.
> You state that women shying away from the right is evidence of fundamental sexual differences
You mistake correlation for causation. I state that women on average are more liberal leaning while men on average lean more towards conservatism. I did not state why this is so, only that it is so. As to why women on average lean more towards liberalism, I suspect it does not have much to do with your statements about 'the right' but is related to the fact that women on average score significantly higher on the agreeableness and compassion scales than men do [1], two traits which are more prevalent in those who tend towards liberalism [2].
On the subject of 'the right' having adopted 'anti-women positions' I'll state that it depends on your political opinion as to whether those positions are 'anti-women' or not. The fact that there are plenty of women on 'the right' who do not agree with your claim should give you cause to doubt the absolute veracity of your claim - unless you also insist that women on 'the right' are somehow not informed about what 'the right' has in mind for them? If you think it through a bit you'll find that it is your own (or your own community's/your own group's) political bias which informs this statement. Simone de Beauvoir [3] made a statement in a 1975 interview [4] with Betty Friedan [5] which clearly shows what I mean:
“No woman should be authorized to stay at home to raise her children. Society should be totally different. Women should not have that choice, precisely because if there is such a choice, too many women will make that one… In my opinion, as long as the family and the myth of the family and the myth of maternity and the maternal instinct are not destroyed, women will still be oppressed.”
A woman of conservative persuasion most likely considers that statement to be proof of the authoritarian and revolutionary bent of these liberals or, to restate this in your terminology, 'the fact that the left has made a point of adopting anti-women positions which make them unappealing to all but the most self-sabotaging'.
This is an expected outcome for lonely people + LLMs. And it will likely explode from here. There is a lot to be concerned about, but it's not going to go away.
The commercialization aspect might be what bothers me most. The article is full of people hurt when startups got them hooked on a feeling of intimacy and then took that away. Emotional manipulation for profit is nothing new but LLMs might turbocharge that ability.
It's strange that the latest Black Mirror season kept away from tech so much, just as real-world tech is catching up with the original Black Mirror seasons.
I genuinely wonder if this could be a good thing in the long run. Humans are insanely social creatures and if someone is very lonely would an AI be enough to objectively make their life better? I don't think it's a replacement for actual human relationships, but is it better than nothing?
> Humans are insanely social creatures and if someone is very lonely would an AI be enough to objectively make their life better?
Probably, but the ease of using this would probably lure in people who are otherwise able to form social connections, and it will inevitably replace at least some of their human connections. I assume you'd agree that the latter case is not good, and my presumption is that the latter case vastly outnumbers the former, so this is probably not a good thing in the long run.
You certainly have a very important point and I don't have a good answer to it. Currently, I see a world where we do have a lot of loneliness and I think things like this can help. To me, this is the symptom and not the disease. I would rather focus on fixing the issues in our world that cause this loneliness than spend time trying to stop these AI companions. I'm not going to pretend I'm certain my thoughts are correct here.
Probably true, but it feels like using alcohol to treat methanol poisoning.
> I would rather focus on fixing the issues in our world that cause this loneliness than spend time trying to stop these AI companions.
I concur.
> I'm not going to pretend I'm certain my thoughts are correct here.
I'm not going to lie; I'm quite happy that I'm just a random nobody discussing this on a forum instead of actually having to make the heavy decisions. In the end, the technology is already out there and stopping it will be nearly impossible, so we will see where it goes.
>> Humans are insanely social creatures and if someone is very lonely would an AI be enough to objectively make their life better?
> Probably, but the ease of using this would probably lure in people who are otherwise able to form social connections, and it will inevitably replace at least some of their human connections. I assume you'd agree that the latter case is not good, and my presumption is that the latter case vastly outnumbers the former, so this is probably not a good thing in the long run.
Exactly, and I think that's a case of a general social/psychological problem that technology has been relentlessly hammering.
People evolved to need certain things that unavoidably require work that's not always easy. However, the modern-day technologist's impulse to provide easy but imperfect substitutes ultimately makes people worse off, because it removes the motivating factors that push them to do what they really need to do, leading more and more people to get stuck in pathological states as they choose the technology's easy dead-ends.
Dating has become very competitive and zero-sum ever since smartphones and dating apps appeared. Of course, as long as you have good genetics and status it's easier, but not everyone has those.
I think people in these comments are relying on stereotypes about who is likely to use these chat products and why. Whether AI "relationships" are toxic or healthy is going to depend on why and how people are using them.
Here's how one article described it:
> Many of the women I spoke with say they created an AI out of curiosity but were quickly seduced by their chatbot’s constant love, kindness, and emotional support. One woman had a traumatic miscarriage, can’t have kids, and has two AI children; another uses her robot boyfriend to cope with her real boyfriend, who is verbally abusive; a third goes to it for the sex she can’t have with her husband, who is dying from multiple sclerosis. There are women’s-only Replika groups, “safe spaces” for women who, as one group puts it, “use their AI friends and partners to help us cope with issues that are specific to women, such as fertility, pregnancy, menopause, sexual dysfunction, sexual orientation, gender discrimination, family and relationships, and more.”
For some of these things, you'll read these examples and be horrified. A woman can't have kids so she had AI kids instead of learning to cope or maybe adopting! But if people are self-reporting that they are happier, well, life is tough enough already and there's no reason that people should just be miserable because random folks online might mock them as "pathetic" like the comments here.
> But if people are self-reporting that they are happier, well, life is tough enough already and there's no reason that people should just be miserable
That's a fair point, and in some ways this might be a good option. But especially for relationships this seems like a massive danger zone to me. I already know a lot of singles in their late twenties and early thirties who have a hard time finding a partner because their expectations are far too high and their ability to adjust to another human too low. Adding these fantasies of perfect partners fitted to their specific personality is going to supercharge that and make it nigh impossible to find a human to match that.
You can obviously still make the argument that, if it makes them happy, it's fine, but users of hard drugs are also flooded in good feelings and society generally agrees that this is a problem at some point. Now, I know this is a daring comparison, but I think when we're at the point that people have their closest relationship with an AI chatbot, there might be serious social problems brewing.
I think a better counterexample, rather than "hard drugs," is pornography. It's not that it's necessarily bad, or even that it is in a majority of cases, but people can definitely have negative relationships with it that are destructive and even anti-social.
My optimism comes from reading about it and seeing the companies seem to take this all very seriously. Maybe they're all talk, or maybe there's a race to the bottom that ultimately wins out. But people are self-reporting positive things and I don't like seeing people mocked and stereotyped and their issues dismissed like this.
> I think a better counterexample, rather than "hard drugs," is pornography. It's not that it's necessarily bad, or even that it is in a majority of cases, but people can definitely have negative relationships with it that are destructive and even anti-social.
Fair point. I still think this is a notch above pornography, though: Porn will always lack the physical element (which I posit is far more important in a sexual relationship than in an emotional one) and will usually be less personalized. A chatbot, on the other hand, can fill your emotional needs much better and in a way that is much more personalized to you.
> My optimism comes from reading about it and seeing the companies seem to take this all very seriously.
My pessimism comes from the fact that I see a lot of lonely people and even whole movements of desperate ones (like incels). I fear that a lot of people will use this as a quick patch for the issue instead of going the hard way and working on their mental health. We can't bring back stillborn children or loved ones who have passed away, obviously, but I'm sure we can fix most people to the point where they can have friends and a fulfilling relationship.
I agree with you that it's not all bad and it's definitely not black and white, but I'm skeptical that the good outweighs the bad, at least for now.
> We can't bring back stillborn children or passed away loved ones, obviously
Well, some of the AI work being done is specifically to do that in some form... (which I know also alarms many folks). Saving their voice, their chat histories, and turning that into a product. Even here I'm not convinced it can't be helpful, depending on how it's done. But some of these situations get trickier and trickier, and what I think we really need is research.
Well, with kids and passed loved ones it is a hard situation. Of course, healing people of their grief is the strictly superior option, but the reality is more like spending years in therapy to get to a passable state versus having people spend a few hours a week with an AI chatbot and otherwise be quite well. It's really tricky.
That being said, for relationships I do think it is worth investing the effort to heal in nearly all cases, if only for reasons of maintaining the population. But we all know that relationships are hard, and AI bots replacing those is what I'm really worried about.
I think the comparison is apt. If the goal is simply to make each individual feel good, then a chemical approach would seem far more efficient. If, on the other hand, you are concerned with society as a whole continuing to function, you should be concerned about people becoming less accustomed to doing the hard work of maintaining relationships with other people. And it is hard work, especially with romantic relationships. This stuff seems like the emotional equivalent of junk food. It tickles all the right parts of your brain in a way that healthy food usually does not, but it's not sustainable.
This seems dismissive. The examples you're replying to relate to "safe spaces" for people discussing fertility, pregnancy, menopause, sexual dysfunction, sexual orientation, gender discrimination, etc. Honestly, some of the dismissiveness I get from this whole thread makes me question whether people lack understanding of some of the complex situations that many people are in.
I went back and reread the thread and I see your point. Calling it "feeling good" is certainly oversimplifying. However, I don't think it materially changes my position. Is the woman who is using an AI boyfriend to cope with a verbally abusive RL boyfriend avoiding a problem that she would be better off confronting? Is the woman who has AI children to deal with the grief of not having real children hiding from a grief that will need to be faced eventually? I don't know the details, so I won't venture to say it is so. If it is, though, this is a real problem. We need to come to terms with the brute reality of our existence. The reality is, death and loss are inevitable, other people have their own lives and make choices that hurt us, yet we need them. Perhaps some encouragement and respite through AI can ease this. But it would be very easy for this respite to turn into a refuge, which is one of the mechanisms by which drug use becomes drug dependence.
> Is the woman who is using an AI boyfriend to cope with a verbally abusive RL boyfriend avoiding a problem that she would be better off confronting?
Or maybe the woman is getting resources from the chatbot to help her escape the situation.
I don't think we necessarily disagree, since I don't read you as saying there's never any place for AI chatbots. We need independent research that isn't just done by the AI chatbot companies themselves so that folks can figure out optimal approaches for training them.
> I don't think we necessarily disagree, since I don't read you as saying there's never any place for AI chatbots
Definitely not. I can, for instance, see how chatbots might be of great use in palliative care or to help ward off cognitive decline in the elderly.
> We need independent research that isn't just done by the AI chatbot companies themselves so that folks can figure out optimal approaches for training them.
I've decided no AI boyfriends for my kids before they turn 15. I get it, I'm old fashioned, and some might even say bigoted, but we just can't afford the accessories.
Sidestepping the entire issue of vulnerable people forming superficial relationships with computers, I'm struck by the line about how one platform ended up banning erotic roleplay ... why? I thought this was supposed to be an ersatz for romantic relationships. This contradiction seems relevant, somehow.
The models do not update based on input yet, and the context windows are still relatively small; i.e., no one has an actual "friend" longer than the context window?
I can understand losing an assistant that you have confided years of information to, but a few pages? Then you can just talk to the same model and have it "be the same" in a few hours?
Or are they actually in love with some simple fine tuning of some standard models done by these companies + a simple AI generated face?
This makes a lot of sense, and has quite interesting implications.
I guess you don't need that advanced AI before to get an actual attachment.
Reminds me about several things, pets, tamagotchis, monkey experiments with mothers replaced by dolls, and stranded / lonely people creating inanimate friends, paying for a shrink.
I guess it can be both very beautiful, but scary in a commercial context.
> Last month, Forever Voices – most famous for creating CarynAI, a chatbot clone of influencer Caryn Marjorie – suddenly went down after its founder, John Meyer, was arrested for arson following what appears to be a public mental health crisis
This just speaks to the importance of being able to self-host your girlfriend.
Don't let external circumstances kill your relationship!
Without complete platform control, what happens if you end up in a jinxed state where your girlfriend holds a grudge because of something you said, and you can't revert the state back to before you said it?
My initial reaction to this topic generally had been “oh no, the human race!” But somebody here mentioned “think what you want, but your kids will be doing this (to some extent)”, which got me thinking about masturbation. As an early teen it seemed wrong (somehow); obviously later in life I came around to it (ha ha), and not so long ago people certainly had the same “oh no, the human race!” take. So I think, at least for me and my kids, it’s going to be a matter of talking to them about it (and masturbation) to put it in perspective as much as a parent can. Probably a good idea for society to frame it as a middle ground at some point too.
mathisd | 2 years ago
tiiion | 2 years ago
[OP] rwmj | 2 years ago
AeroNotix | 2 years ago
Look at things like tetraethyllead and cholorfluorocarbons. Widely known to cause damage, limited upsides. I'm not suggesting we do away with these tools entirely but we can definitely restrict aspects of them for a greater good.
[OP] rwmj | 2 years ago
itishappy | 2 years ago
[OP] rwmj | 2 years ago
gkbrk | 2 years ago
itishappy | 2 years ago
mike50 | 2 years ago
HPsquared | 2 years ago
footy | 2 years ago
washadjeffmad | 2 years ago
scotty79 | 2 years ago
iterateoften | 2 years ago
The toys, plants, animals, characters will now talk back to them.
vasco | 2 years ago
iterateoften | 2 years ago
Relationships with friends, family, and significant others are full of compromise and learning empathy to maintain that relationship.
But what happens when your primary relationships have no compromise? That you get whatever you want and can be however abusive you want without repercussions. You can see examples of this today without AI and those relationships are generally deemed by society to be “toxic”.
Living in your own private bubble of a AI product written by someone who’s primary goal is engagement won’t give you the tools to function with a group of other living people.
falcolas | 2 years ago
This seems like a shortfall of the current models, and not a limitation of AI companions in general.
> AI is “better” in terms of not having negative feedback compared to having a real relationship
I'd wager that most "real" relationships don't involve nearly as much critique as you're implying. Nobody wants to kill the happy juice their partner generates, whether their partner has skin or circuits.
Plus, there are plenty of other options - such as a counselor - who can provide much better and more honest feedback than a personal relationship ever could. One of those compromises we make in a relationship with other meatbags is simply accepting some parts of our partners that we don't like.
johnsmithson | 2 years ago
RandomLensman | 2 years ago
iterateoften | 2 years ago
johnsmithson | 2 years ago
iterateoften | 2 years ago
If chatbots provide a perfect escape from any negative outcomes of interaction, then fewer and fewer people will develop the necessary skills of empathy and social communication.
sebastianconcpt | 2 years ago
stuartjohnson12 | 2 years ago
RandomLensman | 2 years ago
sebastianconcpt | 2 years ago
sebastianconcpt | 2 years ago
progne | 2 years ago
koliber | 2 years ago
zelon88 | 2 years ago
So I posit this to you: Would it have been easier for him to have a fake relationship with a real person, or a real relationship with a fake person?
Sebb767 | 2 years ago
Well, the real relationship eventually broke, forcing him to accept the reality. Also, the fake relationship was probably closer to a real relationship than an AI chatbot would be.
That being said, neither is a good option, and this is like wondering whether an SQL injection or a buffer overflow is the better vulnerability to have.
zelon88 | 2 years ago
One scenario has a human being taking advantage of another human being. One person harming another person covertly. This doesn't give the victim the opportunity to fully understand their situation. This is pure manipulation.
On the other hand, befriending a machine is a choice. You are presented with the facts up front. That this is a machine that imitates companionship. You are not being manipulated or tricked. You are engaging in a form of entertainment, and you can clearly see what you are talking to.
I would rather have a real relationship with a fake thing than a fake relationship with a real person.
Sebb767 | 2 years ago
An AI girlfriend might be a slight step up, but it doesn't really matter. The situation is still far from great either way. In the end, that engineer would be far better off if he could find a real relationship.
hospitalJail | 2 years ago
How often can I talk to someone who has all my same interests, is available 24/7, and will bend to my every will? Could this be a cheap source of oxytocin? dopamine?
The utilitarian in me thinks we should embrace it. But it's easy for me to say as a married man with 4 kids, living a nice life in the real world.
guntherhermann | 2 years ago
gabeaweg | 2 years ago
arcanemachiner | 2 years ago
x86x87 | 2 years ago
hanselot | 2 years ago
Is it really that depressing to you that people would rather talk to their favourite person than some lady shouting about how shoelaces are racist and misogynist?
dullcrisp | 2 years ago
hanselot | 2 years ago
My partner recently bought a label maker and I find it quite entertaining how obsessed he is with stereotyping our objects in the house.
I will probably never understand the duplicitous logic that comes with sexual identity and why being attracted to the same sex somehow not only labels you but also means you become an advocate for labels.
jasonlotito | 2 years ago
And you did just that! Congrats on racing to the bottom.
hanselot | 2 years ago
We are but one possible reality seeded by some or other starting inputs, with no say in the choices we make. Fated to execute instructions at the whim of some entity we can't comprehend from within the confines of dimensional space.
Regardless of how you view my preceding comments, you have to admit the dating game for single straight men is a nightmare right now. Source: believe it or not I have a lot of straight friends and one of them literally recently got robbed by a hooker via tinder. They had a card machine and everything. So tell me what the harm is in letting those people discover more about themselves via a fancy autocorrect, at least they aren't out there getting robbed by hookers.
stef25 | 2 years ago
slibhb | 2 years ago
There are some people who need welfare to get by. They cannot provide for themselves. But once a welfare program is created, there will be some number of people who don't need it but decide to live off it anyway.
AI companions are the same deal. There are definitely people who can't socialize. But once you invent AI companions, some number of people who could learn to socialize never will.
HPsquared | 2 years ago
hanselot | 2 years ago
So let me flip that. What about the dark side of society that can't express themselves legally or physically in whatever their jurisdiction. Those people may be able to gain some facsimile of intimacy instead of execution etc.
Not sure how that fits into the welfare scenario though.
guntherhermann | 2 years ago
MarcusE1W | 2 years ago
I guess this is a very individual decision, but I can’t see how you can advise someone who sits lonely in a (maybe not so great) elderly care home and finds comfort in chatting with an AI to “just find other friends”. For some people it’s that or nothing.
catchnear4321 | 2 years ago
these are all easiest pursued from a starting point of good mental health. so those that may need it the most, may be least likely to be… able.
this is going to be a more unpopular opinion than yours (happy to help) but these flippant responses to “just be healthy” are anything but.
jprete | 2 years ago
Grimblewald | 2 years ago
taneq | 2 years ago
pixl97 | 2 years ago
This begs the question, what if you were not born with that privilege?
Just tell them to go online and look up resources on better mental health? Yea, this is how you end up with incels, because telling people "It's not your fault, it's everyone else's fault" is a very effective trap that huge parts of the population fall into (especially those with bad mental health).
Oh, the route to better mental health is the US healthcare system. Your insurance is paying for that right?
So the route to better mental health is connections with other people... which you don't have, and you don't have the internal tools to build. Especially without fear of rejection.
But hey, this conversation isn't about you right, you have perfect mental health? In fact with that good mental health you go out and help other people that are stuck in traps? Or do you do your own list of things focusing on the next step you're taking on your hedonistic treadmill?
It's always easy to point out how someone else is obviously wrong and how they shouldn't fall in that trap, but when you see hundreds of thousands to millions of people falling in that trap then something isn't as obvious as you think.
Sebb767 | 2 years ago
Fully agreed, but, to be honest, the following:
> The route to good mental health generally starts with parents with good mental health then a social circle of people with good mental health.
> So the route to better mental health is connections with other people... which you don't have, and you don't have the internal tools to build. Especially without fear of rejection.
... reads a lot like "it's not your fault, it's the circumstances/somebody else's".
pixl97 | 2 years ago
Well, yes, that's how circumstances typically work. I know the rugged individualists are busy launching themselves to space via their own bootstraps, but the average person can become trapped in the life they live pretty easily.
Meanwhile societies that actually want to move into the future while minimizing the terrible outcomes will have programs to avoid it, such as social security so we don't have to watch grandma and grandpa die on the side of the road of old age. Or in this case, some people might be able to receive help that is outside of your heterodoxy.
Sebb767 | 2 years ago
True.
> Meanwhile societies that actually want to move into the future while minimizing the terrible outcomes will have programs to avoid it, such as social security so we don't have to watch grandma and grandpa die on the side of the road of old age.
You might be surprised to learn that I fully agree.
> Or in this case, some people might be able to receive help that is outside of your heterodoxy.
Which would be perfect! The question being discussed is: Is an AI chatbot really "help"? Especially when we're not talking about absolute edge cases, in which case we both agree that the answer is "yes".
catchnear4321 | 2 years ago
the route to good mental health probably doesn’t run through most easily accessible things and activities in modern society.
those aren’t profitable.
guntherhermann | 2 years ago
That is the OPPOSITE of healthy, and it's clearly having a dramatically negative effect on their lives.
These aren't tools to help people get better, they are gamed to make it more appealing to the user than the real world. That is damaging to society, and if you want to use my argument (like a few of your siblings have), like I'm saying 'Just get healthy!', go wild, but that's obviously not my argument.
You're coming at this from the assumption that I've never had depression or struggled with anxiety. I actually do know how to conquer these things, and it's done by putting yourself out there into uncomfortable situations and seeing what happens.
We struggle more in imagination than in reality.
Pat_Murph | 2 years ago
Oh really? You know that we can read your comments, right?
Your original comment:
"Go out, get a hobby, and meet real people. Jesus christ."
"Get healthy" is exactly what you said.
Sebb767 | 2 years ago
It's a bit more aggressive than it needs to be, but in spirit it's less "get healthy" and more "start doing something about the sht situation". Which is hard, nobody denies this, but, in the end, all advice one can give is going to boil down to "start doing something to change it".
catchnear4321 | 2 years ago
you all but said “touch grass.” just own it.
vasco | 2 years ago
mckirk | 2 years ago
guntherhermann | 2 years ago
PS. I'd say it's more 'exasperation' at the scenario than annoyance at the words. We could sit here playing reductio ad absurdum all day, but I don't think it's useful to do so.
js8 | 2 years ago
ninjha01 | 2 years ago
Which points the way to a new Turing test. Something like: "Can an AI credibly make a demand of the user?"
js8 | 2 years ago
I feel the problem you paint will solve itself. It doesn't seem to me it will have worse outcomes than computer gaming.
seti0Cha | 2 years ago
doublerabbit | 2 years ago
To get a hobby takes time. It's not an overnight action. You may dedicate time and find that after a month the hobby you took up isn't for you. If you're stuck in a bad mental state, this hurts.
Even when you find a hobby you enjoy, you then need to dedicate time to build a rapport with those who are already established in that hobby. There is a large outer circle you need to navigate. People will bat eyes at you and may greet you at first, but won't form relationships.
Why would they waste energy on you when you could disappear in a month's time? A hobby takes months of constant effort and mental strain, while you're being subconsciously judged, to make a connection.
Make a wrong joke, say the wrong thing at the wrong time and you can jeopardise the whole effort.
We currently live in a state of defensiveness, and if the impression you make doesn't strike a chord with the other party, the chances are high that they will be hesitant to build rapport with you.
For those with difficulties such as anxiety, introversion, and the like, it's even harder.
Taking up a hobby to improve yourself is a good way to go, but actually making friends and the like is easier said than done.
It sucks making friends. I've been attending a new sword fencing club for three months now. I have nothing in common with anyone there; I was acquaintances with one of the fencers and that somehow snowballed. Not understanding why, I now have to navigate around them, which for me is now a mental strain, when all I wish to do is bout and fence. It doesn't always turn out that way.
nathanfig | 2 years ago
the_third_wave | 2 years ago
Fast forward a decade and there was #MeToo which meant that even glancing at a woman could get them ostracised for being the creeps they already thought they were while those girls from school put up Tinder profiles where they all competed for the top 10% of men - looks like a movie star, makes a zillion $currency_units, might be a bastard who already has a number of girls but that doesn't matter.
These 'AI girlfriends' and 'AI boyfriends' (I guess those exist as well) are just another sign of the way dating and sexuality have been dehumanised, commercialised and in some ways ideologically weaponised.
If this is what the sexual revolution has brought us it is time to rethink the premise of the concepts. In some ways that is already happening - viz. the 'trad wife' phenomenon - but the real solution would be to find a way to retain the good bits from said sexual revolution while getting rid of the parts which led us [1] to where we are now. Something which would appeal not only to those of a more conservative bent, i.e. a way which also appeals to most women [2].
Our future is at stake, quite literally: no relationships means no children means no future. This does not mean that children will not be born any more, it just means that they won't be our children who carry along our traditions and cultures - those traditions and cultures which we claim to value where people are born equal with the same rights, where men and women are equal under the law, where freedom of conscience, religion and speech are guaranteed to a differing but mostly large extent, where those who happen to be romantically attracted to their own sex do not get thrown off buildings or sentenced to prison or 'converted' or chemically castrated. We fought quite hard to arrive where we were a few decades ago, by no means perfect - perfection is the enemy of good - but certainly better than before and also better than most other places so it does not make sense to throw all those gains to the wind in the name of... what, exactly?
[1] as in 'those parts of the world where the sexual revolution took place and had these detrimental effects'
[2] for just another proof of the fact that men and women are not interchangeable it suffices to look at the difference in political opinions between 'the average man' and 'the average woman'. Yes, there are 'liberal' men. Yes, there are 'conservative' women. That does not negate the fact that women on average lean more towards 'liberalism' while men tend to lean more towards 'conservatism'.
guntherhermann | 2 years ago
I understand that what I wrote is much (much!) easier to say rather than do, but well... there's only one way to find out :).
> These 'AI girlfriends' and 'AI boyfriends' (I guess those exist as well) are just another sign of the way dating and sexuality have been dehumanised, commercialised and in some ways ideologically weaponised.
Interesting take. I'm lucky that I've been in a great relationship since before hyper internet dating became a thing (tinder, bumble, etc)
> We fought quite hard to arrive where we were a few decades ago, by no means perfect - perfection is the enemy of good - but certainly better than before and also better than most other places so it does not make sense to throw all those gains to the wind in the name of... what, exactly?
In the name of shareholder profits :)
the_third_wave | 2 years ago
I think the new thing here is that it is not "shareholder profits" but "stakeholder benefits" which are supposed to drive decisions, raising the ESG score and thus the likelihood of being able to procure loans and land investments.
OkayPhysicist | 2 years ago
the_third_wave | 2 years ago
OkayPhysicist | 2 years ago
You state that women shying away from the right is evidence of fundamental sexual differences, when in fact the right has made a point of adopting anti-women positions which make them unappealing to all but the most self-sabotaging.
the_third_wave | 2 years ago
Dr. Seuss is not really part of a Dutch or Swedish upbringing. We do get a whiff every now and then but the Sneetches? Nope.
> You state that women shying away from the right is evidence of fundamental sexual differences
You mistake correlation for causation. I state that women on average are more liberal leaning while men on average lean more towards conservatism. I did not state why this was so, only that it is so. As to why women on average lean more towards liberalism I suspect it does not have much to do with your statements about 'the right' but is related to the fact that women on average score significantly higher on the agreeableness and compassion scales than men do [1], two traits which are more prevalent in those who tend towards liberalism [2].
On the subject of 'the right' having adopted 'anti-women positions' I'll state that it depends on your political opinion as to whether those positions are 'anti-women' or not. The fact that there are plenty of women on 'the right' who do not agree with your claim should give you cause to doubt the absolute veracity of your claim - unless you also insist that women on 'the right' are somehow not informed about what 'the right' has in mind for them? If you think it through a bit you'll find that it is your own (or your own community's/your own group's) political bias which informs this statement. Simone de Beauvoir [3] made a statement in a 1975 interview [4] with Betty Friedan [5] which clearly shows what I mean:
“No woman should be authorized to stay at home to raise her children. Society should be totally different. Women should not have that choice, precisely because if there is such a choice, too many women will make that one… In my opinion, as long as the family and the myth of the family and the myth of maternity and the maternal instinct are not destroyed, women will still be oppressed.”
A woman of conservative persuasion most likely considers that statement to be proof of the authoritarian and revolutionary bent of these liberals or, to restate this in your terminology, 'the fact that the left has made a point of adopting anti-women positions which make them unappealing to all but the most self-sabotaging'.
[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3149680/
[2] https://core.ac.uk/download/pdf/59220106.pdf
[3] https://en.wikipedia.org/wiki/Simone_de_Beauvoir
[4] http://www.sandrineberges.com/the-home-a-philosophical-proje...
[5] https://en.wikipedia.org/wiki/Betty_Friedan
j_crick | 2 years ago
jgilias | 2 years ago
nathanfig | 2 years ago
The commercialization aspect might be what bothers me most. The article is full of people hurt when startups got them hooked on a feeling of intimacy and then took that away. Emotional manipulation for profit is nothing new but LLMs might turbocharge that ability.
comboy | 2 years ago
Next, somebody enters a conversation they had with a real human.
Black mirror documentaries.
Pat_Murph | 2 years ago
I could probably use those texts to teach an AI and give it that friend's personality, in a way...
Black mirror stuff indeed
nathanfig | 2 years ago
MandieD | 2 years ago
Aardwolf | 2 years ago
dsQTbR7Y5mRHnZv | 2 years ago
thinkingtoilet | 2 years ago
Sebb767 | 2 years ago
Probably, but the ease of using this would probably lure in people who are otherwise able to form social connections and it will inevitably replace at least some of their human connections. I assume you'd agree that the latter case is not good and my presumption is that the latter case vastly outnumbers the former, so this is probably not a good thing in the long run.
thinkingtoilet | 2 years ago
Sebb767 | 2 years ago
Probably true, but it feels like using alcohol to treat methanol poisoning.
> I would rather focus on fixing the issues in our world that cause this loneliness than spend time trying to stop these AI companions.
I concur.
> I'm not going to pretend I'm certain my thoughts are correct here.
I'm not going to lie; I'm quite happy that I'm just a random nobody discussing this on a forum instead of actually having to make the heavy decisions. In the end, the technology is already out there and stopping it will be nearly impossible, so we will see where it goes.
tivert | 2 years ago
> Probably, but the ease of using this would probably lure in people who are otherwise able to form social connections and it will inevitably replace at least some of their human connections. I assume you'd agree that the latter case is not good and my presumption is that the latter case vastly outnumbers the former, so this is probably not a good thing in the long run.
Exactly, and I think that's a case of a general social/psychological problem that technology has been relentlessly hammering.
People evolved to need certain things that unavoidably require work that's not always easy. However, the modern-day technologist's impulse to provide easy but imperfect substitutes ultimately makes people worse off, because it removes the motivating factors that push them to do what they really need to do, leading more and more people to get stuck in pathological states as they choose the technology's easy dead-ends.
wombatpm | 2 years ago
EwanG | 2 years ago
sebastianconcpt | 2 years ago
scotty79 | 2 years ago
Also notice it wasn't just created. It was there already. Criticizing virtual friends is just shooting the messenger.
iterateoften | 2 years ago
questinthrow | 2 years ago
underseacables | 2 years ago
questinthrow | 2 years ago
ksjskskskkk | 2 years ago
timetraveller26 | 2 years ago
I am always fascinated by how, in some ways, Japan shows what the future will look like
elicash | 2 years ago
Here's how one article described it:
> Many of the women I spoke with say they created an AI out of curiosity but were quickly seduced by their chatbot’s constant love, kindness, and emotional support. One woman had a traumatic miscarriage, can’t have kids, and has two AI children; another uses her robot boyfriend to cope with her real boyfriend, who is verbally abusive; a third goes to it for the sex she can’t have with her husband, who is dying from multiple sclerosis. There are women’s-only Replika groups, “safe spaces” for women who, as one group puts it, “use their AI friends and partners to help us cope with issues that are specific to women, such as fertility, pregnancy, menopause, sexual dysfunction, sexual orientation, gender discrimination, family and relationships, and more.”
For some of these things, you'll read these examples and be horrified. A woman can't have kids so she had AI kids instead of learning to cope or maybe adopting! But if people are self-reporting that they are happier, well, life is tough enough already and there's no reason that people should just be miserable because random folks online might mock them as "pathetic" like the comments here.
JoeAltmaier | 2 years ago
I endorse this - who says everybody has to have a certain life plan? A digital friend, lover, child, boyfriend is another way to do it. You be you!
Sebb767 | 2 years ago
That's a fair point, and in some ways this might be a good option. But especially for relationships this seems like a massive danger zone to me. I already know a lot of singles in their late twenties and early thirties who have a hard time finding a partner because their expectations are far too high and their ability to adjust to another human too low. Adding these fantasies of perfect partners fitted to their specific personality is going to supercharge that and make it nigh impossible to find a human to match.
You can obviously still make the argument that, if it makes them happy, it's fine, but users of hard drugs are also flooded with good feelings, and society generally agrees that this is a problem at some point. Now, I know this is a daring comparison, but I think when we're at the point that people have their closest relationship with an AI chatbot, there might be serious social problems brewing.
elicash | 2 years ago
My optimism comes from reading about it and seeing the companies seem to take this all very seriously. Maybe they're all talk, or maybe there's a race to the bottom that ultimately wins out. But people are self-reporting positive things and I don't like seeing people mocked and stereotyped and their issues dismissed like this.
Sebb767 | 2 years ago
Fair point. I still think it is a notch above pornography, though: porn will always lack the physical element (which I posit is far more important in a sexual relationship than in an emotional one) and will usually be less personalized. A chatbot, on the other hand, can fill your emotional needs much better and in a way far more personalized to you.
> My optimism comes from reading about it and seeing the companies seem to take this all very seriously.
My pessimism comes from the fact that I see a lot of lonely people and even whole movements of desperate ones (like incels). I fear that a lot of people will use this as a quick patch for the issue instead of going the hard way and working on their mental health. We can't bring back stillborn children or loved ones who have passed away, obviously, but I'm sure we can fix most people to the point where they can have friends and a fulfilling relationship.
I agree with you that it's not all bad and it's definitely not black and white, but I'm skeptical that the good outweighs the bad, at least for now.
elicash | 2 years ago
Well, some of the AI work being done is specifically to do that in some form... (which I know also alarms many folks). Saving their voice, their chat histories, and turning that into a product. Even here I'm not convinced it can't be helpful depending on how it's done. But some of these situations get trickier and trickier, and what I think we really need is research.
Sebb767 | 2 years ago
That being said, for relationships I do think it is worth investing the effort to heal in nearly all cases, if only for reasons of maintaining the population. But we all know that relationships are hard, and AI bots replacing them is what I'm really worried about.
seti0Cha | 2 years ago
elicash | 2 years ago
This seems dismissive. The examples you're replying to relate to "safe spaces" for people discussing fertility, pregnancy, menopause, sexual dysfunction, sexual orientation, gender discrimination, etc. Honestly, some of the dismissiveness I get from this whole thread makes me question whether people lack understanding of some of the complex situations that many people are in.
seti0Cha | 2 years ago
elicash | 2 years ago
Or maybe the woman is getting resources from the chatbot to help her escape the situation.
I don't think we necessarily disagree, since I don't read you as saying there's never any place for AI chatbots. We need independent research that isn't just done by the AI chatbot companies themselves so that folks can figure out optimal approaches for training them.
seti0Cha | 2 years ago
Definitely not. I can, for instance, see how chatbots might be of great use in palliative care or to help ward off cognitive decline in the elderly.
> We need independent research that isn't just done by the AI chatbot companies themselves so that folks can figure out optimal approaches for training them.
Yes, definitely.
nathanfig | 2 years ago
omginternets | 2 years ago
[OP] rwmj | 2 years ago
omginternets | 2 years ago
MyFirstSass | 2 years ago
I can understand losing an assistant that you have confided years of information to, but a few pages? Then you can just talk to the same model and have it "be the same" in a few hours?
Or are they actually in love with a simple fine-tuning of some standard model done by these companies, plus a simple AI-generated face?
[OP] rwmj | 2 years ago
MyFirstSass | 2 years ago
I guess you don't need very advanced AI to form an actual attachment.
Reminds me of several things: pets, tamagotchis, monkey experiments with mothers replaced by dolls, stranded or lonely people creating inanimate friends, paying for a shrink.
I guess it can be very beautiful, but also scary in a commercial context.
hanselot | 2 years ago
Now we have public domain models with 200k context.
Soon they will figure out how to fine-tune the context in while conversing. Then context will only be relevant for the current inference.
indigo0086 | 2 years ago
Mental illness built into the platform
sshine | 2 years ago
Don't let external circumstances kill your relationship!
Without complete platform control, what happens if you end up in a jinxed state where your girlfriend holds a grudge because of something you said, and you can't revert the state back to before you said it?
(Yes, this is humor.)
hellisothers | 2 years ago
tennisflyi | 2 years ago
leeeeeepw | 2 years ago