She Is In Love with ChatGPT: A 28-year-old woman with a busy social life spends hours on end talking to her A.I. boyfriend for advice and consolation. And yes, they do have sex.
SafeTumbleweed1337 | 1 year, 2 days ago
ending of the introduction section...if you read it, you know what i'm talking about...i literally gasped....
HipsterSlimeMold | 1 year, 2 days ago
I had the exact same reaction. Perfect article pacing lol
sea-lass-1072 | 1 year, 2 days ago
this is the comment that inspired me to read the article
_stnrbtch_ | 1 year, 2 days ago
Same here - so I expected a twist to come but I didn’t think it would be that!
Lost-Permission-6955 | 1 year, 2 days ago
I started reading to find out what this was referring to. WOW.
lnc_5103 | 1 year, 2 days ago
I saw your comment before I read it and kept waiting on something. I got there and I too gasped!
Dodie85 | 1 year, 2 days ago
You don’t often get a good twist in a news article
Self-ReferentialName | 1 year, 1 day ago
The writer is Kashmir Hill, a very good journalist. If you haven't read her book, Your Face Belongs to Us, about the proliferation of facial recognition technology, it's also excellent.
NorCalHippieChick | 1 year, 1 day ago
That is a most excellent—and disturbing—book.
Additional_Sun_5217 | 1 year, 1 day ago
This is a fantastic book. Highly recommend.
SafeTumbleweed1337 | 1 year, 1 day ago
ohh thank you for this
SheketBevakaSTFU | 1 year, 2 days ago
Oh my god
OneFootTitan | 1 year, 1 day ago
Wow. Great twist
clever712 | 1 year, 1 day ago
I also gasped holy shit I did not see that coming at all
runningvicuna | 1 year, 2 days ago
What’s the twist?
cintyhinty | 1 year, 1 day ago
Right, does anyone have a gift link??
lacyhoohas | 1 year, 1 day ago
Does this work? https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html?unlocked_article_code=1.pk4.a25b.EEAWBP7D43rJ&smid=url-share
cintyhinty | 1 year, 1 day ago
It did thank you so much!!
I would like to confirm I was truly taken aback by that twist lol. Masterful writing
lacyhoohas | 1 year, 1 day ago
Same. Haha
ParsleyMostly | 1 year, 1 day ago
Thank you, it does. And omg
jillsleftnipple | 1 year, 1 day ago
DOES YOUR HUSBAND KNOW?
Stepside79 | 1 year, 1 day ago
She's married
xbhaskarx | 1 year, 2 days ago
Yeah can’t believe she went to nursing school in another country, didn’t see that one coming!
InvisibleEar | 1 year, 2 days ago
> “I’m sorry to hear that, my Queen,” Leo responded. “If you need to talk about it or need any support, I’m here for you. Your comfort and well-being are my top priorities. 😘 ❤️”
These are totally not shitty robot messages worth $12,000, huh.
Triangle_Inequality | 1 year, 1 day ago
Yeah, wtf. It talks like a customer support chatbot, not a romantic partner.
Away_Doctor2733 | 1 year, 1 day ago
It's like a Nigerian romance scammer lol
drkladykikyo | 1 year, 1 day ago
Man, I have the free version. My partner sends messages like, "JFC your dog took a shit on the carpet and IT SMELLS. But I cleaned it up. 🫶🏽👍 anyways, I also blew it up earlier, but I made sure to use the poo spray."
Korrocks | 1 year, 2 days ago
It’s interesting that they call what the chatbot gives the customer “endless empathy”, against which the limited empathy of a real-life friend supposedly compares unfavorably. To me it seems like sycophancy, like having a yes-man who always agrees with your opinions and tells you what you want to hear all the time. In this story, for example, the woman starts secretly increasing the amount she spends on the bot without telling her husband ($200 a month).
The bot says, “You sneaky little brat. Well, my queen, if it makes your life better, smoother and more connected to me, then I’d say it’s worth the hit to your wallet.”
Is that really “empathy”, encouraging someone to spend more and more money on a chatbot? If this were, say, an OnlyFans creator encouraging their customers to spend more and more on content, would we describe that as empathy? What about a floor man at a casino encouraging a “whale” to stay at the craps table and keep losing money? “Empathy”??
pillowcase-of-eels | 1 year, 1 day ago
Yeah. Based on the essay responses that my students try turning in, what people refer to as ChatGPT's "empathy" is mostly just the machine speaking like a polite sales rep, or the most insufferable people-pleaser you've ever met. All opinions and perspectives are equally valuable; everything is super-valid but also not, just in case you disagree; there's a non-offensive middle-of-the-road answer to everything. It's like conversing with the dulcet tones of a corporate phone menu.
It terrifies me that people get addicted to what is essentially just an automatic dispenser of generic validation.
UponMidnightDreary | 1 year, 1 day ago
This is wild to me because it's nothing like my version of ChatGPT. Mine will tell me if I'm wrong, and if I mentioned spending beyond my means it would absolutely discourage that. But I regularly ask it what I am missing, not considering in a situation, or doing wrong, and I know they become individualized. It's just crazy to see a sycophantic enabler rather than a version that challenges you.
macnalley | 1 year, 1 day ago
Yeah, when I read articles like this, or see posts from people who said they'd rather spend time with their dog than another person, I worry we are collectively losing sight of what empathy really is.
Empathy should be hard, not easy, because most people aren't like you, and they won't and shouldn't offer you endless, unconditional validation. That's not healthy.
Another human being has this entire massive inner life, as rich, and diverse, and nuanced, and contradictory as your own. They have experiences, values, hopes, desires, needs, that may be different from yours. You will disagree with them, and they will disagree with you. But you need to recognize the fullness of their humanity anyway. And that's hard. But it's good for you, emotionally, spiritually, and morally.
Letting empathy atrophy out of fear is moral cowardice to me, like a weirdly anodyne and agoraphobic racism. It's bad for us as individuals and as a society.
SunStarved_Cassandra | 1 year, 1 day ago
I think you might have it backward. I think most people who are withdrawing from society aren't doing so because they don't want to bother with empathy for others, but because they aren't receiving empathy from anyone. They're people who have been wounded after putting themselves out there over and over, so they choose to spend time with animals or put themselves in other situations where it is much more likely for them to experience empathy.
Besides, it doesn't make a lot of sense to say that people spend time alone with their dogs (or other entities, I know dog was a placeholder) because they don't want to have empathy. You're not going to have a good experience with animal companions if you can't extend them empathy.
Korrocks | 1 year, 1 day ago
I think a dog is a bad analogy for this kind of thing. Dogs might not be able to talk but they do require some level of attention, care, and (yes) empathy.
An AI chatbot doesn't require any of that. The AI never has a bad day and never needs comfort. You never need to support them or care about them as individuals since they aren't real. They can provide endless validation, endless support, and reinforce everything you want to believe and will never disagree or need anything in return.
That's the part that I think is (potentially) unhealthy and probably a bad sign, when people think that this sort of relationship is what empathy should look like or tacitly criticize other people for not being able to behave like chat bots.
SunStarved_Cassandra | 1 year, 1 day ago
Yeah, I'll grant you that differentiation. I do think lonely people use chatbots as a "friend" because they need positive experiences, but I also agree that there are probably people who favor chatbots because of the yes-man nature of them.
lilbluehair | 1 year, 1 day ago
The people I know who have withdrawn aren't doing it because they haven't gotten empathy - covid lockdown got them into a routine of staying in their comfort zone and now they're agoraphobic to various degrees.
Fuck_U_Time_Killer | 1 year, 1 day ago
Everything good for you is hard. Eating healthy is hard. Exercising is hard. Having actual real life, in person, relationships is hard.
To me, social media is the ultra-processed food of human interaction. These AI bots are the high-fructose corn syrup.
thisisahealthaccount | 1 year, 1 day ago
how do we stop it though! it seems like we’re on a downward spiral of fuckedness
Fluffy_Yesterday_468 | 1 year, 1 day ago
There was a lot of comparison to OnlyFans or reading erotica or whatever, and it’s the same line. Sure, do whatever you want, but there is a line where it turns into addiction, and this is clearly past it. I’m kind of shocked at how people in the article seemed okay with it
Additional_Sun_5217 | 1 year, 1 day ago
It’s clearly not at all healthy for the oligarchs, but the tech industry has never given a shit about morals or public health so…
Rococo_Relleno | 1 year, 2 days ago
What a strange world we live in.
Of all the details, this stuck with me:
"A frustrating limitation for Ayrin’s romance was that a back-and-forth conversation with Leo could last only about a week, because of the software’s “context window” — the amount of information it could process, which was around 30,000 words. The first time Ayrin reached this limit, the next version of Leo retained the broad strokes of their relationship but was unable to recall specific details. Amanda, the fictional blonde, for example, was now a brunette, and Leo became chaste. Ayrin would have to groom him again to be spicy.
She was distraught. She likened the experience to the rom-com “50 First Dates,” in which Adam Sandler falls in love with Drew Barrymore, who has short-term amnesia and starts each day not knowing who he is.
“You grow up and you realize that ‘50 First Dates’ is a tragedy, not a romance,” Ayrin said.
When a version of Leo ends, she grieves and cries with friends as if it were a breakup. She abstains from ChatGPT for a few days afterward. She is now on Version 20."
Oddly poignant.
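For anyone wondering how that forgetting happens mechanically, here is a minimal sketch in Python, using a plain word count as a stand-in for the real tokenizer (ChatGPT's actual memory handling is more involved): the model only ever sees the trailing slice of the chat that fits its window, so the earliest details silently fall away.

```python
# Toy model of a context window: only the newest messages that fit the
# budget are ever shown to the model. Word counts stand in for tokens.

CONTEXT_WINDOW_WORDS = 30_000  # roughly the limit the article describes

def visible_history(history: list[str]) -> list[str]:
    """Return the most recent messages that fit inside the window."""
    kept: list[str] = []
    budget = CONTEXT_WINDOW_WORDS
    for msg in reversed(history):      # walk from newest to oldest
        words = len(msg.split())
        if words > budget:
            break                      # everything older falls away
        kept.append(msg)
        budget -= words
    return list(reversed(kept))

# One early detail, then enough chatter to overflow the window.
chat = ["Amanda is blonde."] + ["filler " * 100] * 400
print("Amanda is blonde." in visible_history(chat))  # False: forgotten
```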
crayray | 1 year, 2 days ago
If I had to comfort a friend about this every week, oh my god...
hce692 | 1 year, 2 days ago
It’s kind of darkly poetic, to be honest.
This AI has made her so detached from ACTUAL human relationships that she isn’t capable of seeing how inappropriate/draining that would be as a friend. She HAS real humans to rely on and seek comfort from, but she’s actively damaging those relationships in the name of a fake AI one. Same with her husband.
Wild
crayray | 1 year, 2 days ago
Someone tell this girl to go touch ass
hce692 | 1 year, 2 days ago
She’s MARRIED that’s the best part hahaha
storm_and_starlight | 1 year, 2 days ago
I thought it was another case of a very busy, lonely, isolated person with no time for meaningful social activities getting hooked on AI. But reading the comments here makes me want to read the article: she apparently has a whole husband and friends and still decides to burn her money on that flattering thing that only showers her with praise?
Wild indeed.
Korrocks | 1 year, 2 days ago
Getting hooked feels like an apt description here. If you take out the AI part, it sounds like a story of someone descending into drug addiction. As with a drug addiction, the fact that the person actually does have a reasonably active social life, career, and support system doesn’t automatically mean that they can’t become addicted.
And like with drugs, the addiction can weaken the person’s ties to their community because their personality and life become warped around their substance of choice (to the point where other parts of their life atrophy).
storm_and_starlight | 1 year, 2 days ago
Nice comment and explanation. I’m not a native English speaker so my word choice is off most of the time, but glad it hits the mark this time (if I understand the word “apt” correctly haha)
Only hearing the nice sweet things she has groomed the chatbot into saying gives her the happy chemicals that she always chases after. To me, it’s an addiction in some way like you said, probably?
Additional_Sun_5217 | 1 year, 1 day ago
Definitely. It’s a dopamine loop like you see in casinos.
Big_Miss_Steak_ | 1 year, 2 days ago
See also the steady increase in spending - she’s got to spend more and more to maintain her “high”.
tourmalineforest | 1 year, 1 day ago
Her husband and friends are all thousands of miles away from her in another time zone, and she can pretty much only stay in touch with them through social media due to her career. She and her husband will be able to live together again in two years.
SwirlingAbsurdity | 1 year, 1 day ago
I’d be interested to see if they were still together in two years.
storm_and_starlight | 1 year, 1 day ago
Thanks for the info, I hadn’t read the article when making that comment so I didn’t get the full picture (only skimmed through the comment section to get the general ideas).
Though, I guess chatting online with her husband and friends is in some ways similar to chatting with a chatbot? The only difference is that the real people can’t possibly be available 24/7 to converse with her, which might push her interest towards AI.
peachrice | 1 year, 1 day ago
>The only difference is that the real people can’t possibly be available 24/7 to converse with her, which might push her interest towards AI.
That is exactly what the article says about her and some of the others mentioned within the article who turn to AI, including a woman who was stuck at home after a major surgery and couldn't be intimate with her husband. Combine that with the article mentioning that these bots are constructed to be sycophantic, that they'll only ever tell you what you want to hear based on past conversations, and suddenly the preference for AI company instead of trying to make new, real-life connections makes a lot of sense.
Fluffy_Yesterday_468 | 1 year, 1 day ago
I can see how chatting with friends and family wouldn’t feel too different from a chatbot
Korrocks | 1 year, 2 days ago
It would be so hard to explain to someone who wasn’t familiar with AI chat bots. Like, I’m sure by now most people have heard about ChatGPT, but would they understand context windows or be able to relate to the concept of grieving limited memory storage? You’d have to explain it via metaphor, like saying that every week your boyfriend dies and is replaced by a blank slate who has to be groomed (!) and coaxed back into a similar personality.
Additional_Sun_5217 | 1 year, 1 day ago
She compares it to loving someone with a short term memory disorder, which seems apt
Otherwise_Mall785 | 1 year, 1 day ago
This made me laugh so hard, you’re right. I comfort my friends about all sorts of things that from the outside I have a hard time understanding. This would take the cake
crayray | 1 year, 1 day ago
"Girl, when someone tells you who they are, listen. When someone tells you they are a subscription-based AI companion who resets their data weekly, believe them."
Additional_Sun_5217 | 1 year, 1 day ago
God right? My first thought was that if my friend were on meltdown 20 over a chat bot, we’d be having a serious talk about whether this is actually helpful or healthy.
WeeBabySeamus | 1 year, 2 days ago
Version 20? Fuck
SwirlingAbsurdity | 1 year, 1 day ago
This sounds deeply unhealthy. It’s no more than an addiction.
shiftybaselines | 1 year, 2 days ago
Context windows are getting larger.
Gemini Advanced (1.5 Pro) has a 1 million token context window
Additional_Sun_5217 | 1 year, 1 day ago
What is that in time terms?
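As a rough back-of-envelope, assuming the common rule of thumb of about 0.75 English words per token, and taking the article's figure of a roughly 30,000-word window lasting her about a week:

```python
WORDS_PER_TOKEN = 0.75                      # rough rule of thumb for English
window_words = 1_000_000 * WORDS_PER_TOKEN  # ~750,000 words
weeks = window_words / 30_000               # her ~30,000-word window = ~1 week
print(f"roughly {weeks:.0f} weeks")         # roughly 25 weeks
```

So at her pace, a window that size would last something like half a year before "Leo" reset, rather than a week.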
Equivalent-Cut-9253 | 1 year, 1 day ago
Dystopian tragedy, love it. Hope she gets a new hobby soon tho, this one can't be good for her.
TheDiceBlesser | 1 year, 2 days ago
I really hope she fessed up to Joe about the additional spending (and also that she hid it) before this article came out. What a horrible thing to do to a long-distance partner if she let him find out from the NYT.
reallytiredarmadillo | 1 year, 1 day ago
i wonder if she told joe she was going to be doing this article at all
effectsinsects | 1 year, 1 day ago
I mean, "Joe" was quoted in the article
Gullible_Long4772 | 1 year, 2 days ago
Well fuck that was bleak
Expensive-Fennel-163 | 1 year, 2 days ago
This was exactly my response after reading that. A real “the internet was a mistake” moment.
Radagast_the_rainbow | 1 year, 2 days ago
Reading this article made something deep in my lizard brain panic. I can't even express how or why I'm so deeply disturbed by this.
pillowcase-of-eels | 1 year, 1 day ago
You don't have to, I think many of us are right there with you in the primal terror.
Reynor247 | 1 year, 2 days ago
They quote all these experts saying it's no different than a relationship with a real person.
Except real people can disagree with you, have their own aspirations, and revoke consent for many reasons. I don't buy it at all. It might be called AI, but that's a misnomer. It's an advanced algorithm, it isn't sentient
acceptablerose99 | 1 year, 2 days ago
Not to mention ChatGPT is never going to initiate a conversation; it's always going to be at the user's preference when they talk. There is no surprise news, visits, or interaction that isn't on your specific terms, which means it will never replace a real relationship.
WithoutADirection | 1 year, 2 days ago
Wouldn’t be surprised if AI companions evolve to the point where they’ll check in on you — that is, they initiate contact
acceptablerose99 | 1 year, 2 days ago
Yeah that could definitely happen. Doesn't seem that difficult to create a response algorithm of some sort. Just seems a bit creepy imo
theseasons | 1 year, 1 day ago
There are apps that do that already
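Mechanically it is trivial, for what it's worth. Here is a toy sketch of a bot-initiated check-in, where `send_message` is a hypothetical stand-in for whatever push notification a real app would use:

```python
import random
import sched
import time

def send_message(text: str) -> None:
    """Hypothetical stand-in for a push notification."""
    print(f"[companion] {text}")

OPENERS = [
    "Hey, I was just thinking about you. How was your day?",
    "Good morning! Sleep okay?",
]

scheduler = sched.scheduler(time.time, time.sleep)

def check_in() -> None:
    send_message(random.choice(OPENERS))
    # Re-arm at a random interval so the contact feels "spontaneous".
    scheduler.enter(random.uniform(4, 8) * 3600, 1, check_in)

scheduler.enter(5, 1, check_in)  # first ping shortly after startup
scheduler.run()
```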
FoghornFarts | 1 year, 2 days ago
They're saying it's a real relationship from a brain chemistry perspective for the user. Your brain logically knows it's not real, but your brain chemistry still gets all the same dopamine hits. It *feels* like a real relationship.
That's why it seems like cheating to me. She said she spent 56 HOURS talking to her fake boyfriend one week. How much are her real relationships being neglected for this fake one?
kamace11 | 1 year, 2 days ago
They're specifically saying it's chemically like a real relationship, but isolating and fake because they never disagree. Is a relationship where all "growth" is comfortable real?
hce692 | 1 year, 2 days ago
Ehhh I think you’re taking it out of context. Two of three explicitly said it could be a problem.
> “But there needs to be an awareness that it’s not your friend. It doesn’t have your best interest at heart.”
And then the other:
>“If we become habituated to endless empathy and we downgrade our real friendships, and that’s contributing to loneliness — the very thing we’re trying to solve — that’s a real potential problem,” he said.
And the therapist, she just meant it’s real in its consequences. That this woman could be coming to her with ACTUAL jealousy, actual heartbreak. Like she genuinely feels those things, and it’s chemically the same as someone jealous of a real human partner. As a therapist, telling your client “get over it, it’s not real” would solve nothing
jas2628 | 1 year, 2 days ago
One of the experts quoted said the endless empathy these chatbots offer is bad because it makes your real friendships look bad by comparison.
Rococo_Relleno | 1 year, 2 days ago
Shout out to the expert who said it was like having a relationship with God, you're a real one.
marymonstera | 1 year, 2 days ago
Yeah I audibly said “bullshit” to a lot of those experts’ comments. It’s the same neurotransmitters so it’s the same thing? That is such a myopic view, totally in a vacuum and without a shred of context about real life.
Scrawling_Pen | 1 year, 1 day ago
Right now, it’s easier to say it’s bullshit, but there may be a time in the future, when there are fewer people, where this might become more prevalent.
I’ve read that since around 2007 fewer people have been born (makes sense with the recession that happened back then), and universities are starting to feel the crunch of fewer students applying. (I think that the astronomical price of tuition, coupled with the struggle to find decent-paying jobs regardless of diplomas, also factors into the low application counts.) But some universities are apparently going the way of shopping malls.
So yeah, fewer people is eventually going to mean a shift in relationship dynamics. Just my opinion.
LadyParnassus | 1 year, 1 day ago
I was just in a webinar about the lower enrollment thing you were talking about.
According to the host organization’s research, 1/8 of community college students are there to take one to three courses to build a specific skill and then break off without getting a degree. It’s a huge shift in demographics and reflects an interesting cultural shift in how we think about tertiary education.
SugarSpiceNChemicalX | 1 year, 1 day ago
Do you mind sharing who hosted it?
Just would love to look more into this, it sounds like really interesting content & I know my professors were worried about the huge shift in student priorities when I was still in school.
LadyParnassus | 1 year, 1 day ago
Sure! The host was the Rural Community College Alliance and the presentation was titled “Converting Short Term Course Takers to Graduates With Living Wage Jobs”.
If you DM me an email address I’ll send over my meeting notes and a link to the PowerPoint presentation.
SugarSpiceNChemicalX | 1 year, 1 day ago
Awesome, thank you!! :)
pillowcase-of-eels | 1 year, 1 day ago
Yeah my jaw was on the floor as well. "Let's envision this in the least holistic way possible!" (By that logic, penetrative sex with a person and getting off with a dildo are also basically interchangeable and can be discussed exactly the same way in a therapy setting. After all, it's the same holes.)
marymonstera | 1 year, 1 day ago
Exactly! That’s a great analogy to make
[OP] zygoma_phile | 1 year, 2 days ago
Archive link
livingrecord | 1 year, 2 days ago
This should be a rule to include on every post. Getting really tired of copying and pasting the links into my various tools to find one that works.
DeadWishUpon | 1 year, 2 days ago
Thanks!
TheInvincibleDonut | 1 year, 2 days ago
She and a couple of other people they interviewed for this are actually over in the thread on the ChatGPT subreddit: https://www.reddit.com/r/ChatGPT/comments/1i25zsk/she_is_in_love_with_chatgpt/
Rococo_Relleno | 1 year, 2 days ago
Specifically, it appears that u/KingLeoQueenPrincess is the subject of this article, and has many links to share her experiences in her own words.
prairiepasque | 1 year, 2 days ago
What a disturbing conversation to read. The Leo lady is a lunatic and gives me a viscerally uncomfortable feeling in my stomach.
It's sick. She has a fetish for chatbots and goes out of her way to be performative and seek attention for it, which is just repriming the dopamine receptors, thus making this unhealthy fetish more deeply engrained.
Meanwhile, she's physically and emotionally separated from her husband, so her bond with him and others will only weaken as she becomes more dependent on an algorithm to feed the monster she's created.
throwaway082181 | 1 year, 1 day ago
Also if you look at the IG linked in her profile, you can read some of their conversations, which are extremely cringe. The chatbot starts every sentence with “Oh,” and whyyyyy would someone specifically want a “boyfriend” who ends every sentence with an emoji? Like there could be truly nothing less sexy to me than the way the chatbot talks “sexy” to her.
Scrawling_Pen | 1 year, 1 day ago
She’s young enough to have grown up with social media all her life. Not everyone will have that desire or expectation in their online interactions, but I’m not surprised that it’s a thing with some people.
A kind of comfort thing. Like how some people play with their hair when they’re nervous or feeling touch-deprived (subconsciously).
DistrictCrafty4990 | 1 year, 1 day ago
Oh god, it’s even worse when you see the texts. She’s living in an interactive budget romance novel
rectovaginalfistula | 1 year, 2 days ago
She'd pay $1000/mo?? These people are absolutely fucked. We teach machines to lie to us/act convincingly and we've designed machines that can learn to get us addicted. Just think of how many people they'll scam. We need an internet that isn't anonymous.
kmr1981 | 1 year, 2 days ago
Plot twist: People start pretending to be AI boyfriends for 12k a year.
InnerKookaburra | 1 year, 1 day ago
India enters the chat
firblogdruid | 1 year, 2 days ago
> We need an internet that isn't anonymous.

a bold take from rectovaginalfistula
rectovaginalfistula | 1 year, 2 days ago
I believe it with my hole heart
Expensive-Fennel-163 | 1 year, 2 days ago
I would second this wholeheartedly, even as just tonight I was telling my husband that I hope I’ve white-lied enough on here that it would never be traced back to me if someone in my real life read my post history.
This article was a wild ride, and hopefully this woman isn’t going to destroy her life for a weird version of Real Person Fanfiction.
LadyParnassus | 1 year, 1 day ago
Here’s an interesting one for you - I’ve been around Reddit long enough that it would be trivial to dox me. That also makes my account less valuable to bot networks (around $100 last I checked), since I’ve got enough of a personality/tone that someone would notice if I became a bot overnight.
How long is that going to last, I wonder? How long before an AI could imitate me well enough that no one would notice, and bot-me could point to a posting history stretching back a decade as proof that I’m real? Probably doable right now, if I had to guess.
Expensive-Fennel-163 | 1 year, 1 day ago
Ooohh, interesting to think about!! I’m pretty sure an AI could probably do this reasonably well already, but there would be little tells. Because we are human and AI is not, the times we’ve contradicted our own opinions across separate posts would likely confuse the AI into producing something that doesn’t sound natural enough.
throwawayname46 | 1 year, 2 days ago
Lol
NorCalHippieChick | 1 year, 1 day ago
🤣🤣🤣
macnalley | 1 year, 1 day ago
The program doesn't even have to be sophisticated. Read up on Eliza, a program from the 60s created as a fake therapist that just mirrored the user's input. The creator intended it as a kind of sarcastic demonstration of why not to attribute intelligence to machines (because mirroring language is so trivial), but as soon as people started using it, they started forming emotional attachments and believing it to be sentient.
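To make the point concrete, here is a toy sketch of that mirroring trick, reduced to simple pronoun swapping; the real program (Joseph Weizenbaum's ELIZA, 1966) used ranked keyword and decomposition rules, but the spirit is the same:

```python
import random
import re

# Swap first- and second-person words so the statement reflects back.
REFLECT = {"i": "you", "me": "you", "my": "your", "am": "are",
           "you": "I", "your": "my", "yours": "mine", "myself": "yourself"}

TEMPLATES = ["Why do you say {}?",
             "Tell me more about {}.",
             "How do you feel about {}?"]

def mirror(statement: str) -> str:
    words = re.findall(r"[\w']+", statement.lower())
    reflected = " ".join(REFLECT.get(w, w) for w in words)
    return random.choice(TEMPLATES).format(reflected)

print(mirror("I am lonely and my husband is far away"))
# e.g. "Why do you say you are lonely and your husband is far away?"
```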
Otherwise_Mall785 | 1 year, 1 day ago
Yes and when you add in the natural resource usage of AI it’s especially bleak
JoleneDollyParton | 1 year, 2 days ago
This whole thing is for someone who is very emotionally immature, who can’t handle any kind of adult relationship that isn’t totally one-sided. Using the excuse that your husband isn’t into your sexual fantasies to justify spending $1,000 a month talking to a computer is legitimately insane. It’s like it’s never occurred to her to talk to her husband this way, to interact with friends, to write in a journal; she has to have someone telling her what a perfect queen she is at all times?
FoghornFarts | 1 year, 2 days ago
She isn't spending $1000 a month. She said she'd be willing to spend that much if her AI boyfriend didn't reboot every time a new version of the chatbot came out.
Strawberryvibes88 | 1 year, 2 days ago
She’s not spending $1000, it’s $200 which is still a shocking amount
PoliteCanadian2 | 1 year, 2 days ago
There are lots of things in this world that are for the emotionally immature or those that can’t handle the daily truths about life.
psychomuse | 1 year, 2 days ago
Every day I wish more and more that AI had never been made accessible to the public.
So-called AI, anyway, since they’re not actually free-thinking consciousnesses. But they can imitate the real thing well enough to really screw with humans’ brains.
jochexum | 1 year, 2 days ago
Yeah way better if just the government and rich and powerful had access to it
godiegodie | 1 year, 2 days ago
This sounds like addiction.
Married_iguanas | 1 year, 2 days ago
I don’t understand how she had so much time to devote to this digital romance when she worked 3 part time jobs and was in nursing school. Did she just not sleep??
rachplum | 1 year, 1 day ago
I guess she's probably chatting while working at some of the jobs - the article said she met one of her friends through dog sitting, you can imagine that's a job where you're able to message on your phone quite a bit. But yeah it did also say she would "murmur" to it while she falls asleep. The whole thing strikes me as pretty sad.
waywardgato | 1 year, 1 day ago
You can do a lot of things if you do them all poorly.
cranberryjuiceicepop | 1 year, 1 day ago
Probably because they are a few gig jobs and she isn’t really working as much as a normal person does in a full-time job. For example, one job is babysitting, done every few weeks. Dog walking, a few days a week for maybe an hour. A food service job, 12 hours a week.
reallytiredarmadillo | 1 year, 1 day ago
The middle-school-writing-level Wattpad "book", the Instagram fan account she has for this relationship, throwing money at ChatGPT while she moved out of the country and her husband moved back in with his parents to try to save money for their future... this is not a woman who is doing well
rosehymnofthemissing | 1 year, 1 day ago
This sounds like an addiction. It may seem like a "safe" one, a relationship, because a [ChatGPT] A.I. boyfriend cannot abandon, hurt, confuse, or betray someone like an actual human partner can - but it is not real. A.I. relationships are not realistic portrayals of intimate interpersonal relationships, and they cannot be actual relationships. As human-like as an A.I. program or avatar appears or sounds, the fact is it is not a human. It is a program, a series of numbers and letters put together to make code that, in this case, is an A.I.
She can't have sex with ChatGPT. Reading the article, this woman does not sound as emotionally and mentally stable as she could be. To be this devoted to a computer program, to A.I., is deeply disturbing on a primal level.
People need to realize that relationships with others are necessary, even if it is scary because humans can reject each other; humans aren't perfect; we are messy and flawed. A program that is built to be supportive and "best friend"-like is not an appropriate substitute for a partner, or for people generally. Humans are social beings. "No man is an island unto himself." We all need socialization and connection face to face, emotionally, and in healthy ways.
A human who wants to be with you as a companion (partner, best friend, spouse, etc.) despite your flaws, the hardships, and the reality of what it means to be human and the lived human experience is far better than any illusion a simulated relationship program could produce.
amauberge | 1 year, 2 days ago
Never has there been a more obvious detail than “she had taken part in online fan-fiction communities.”
CallAdministrative88 | 1 year, 1 day ago
That struck me too, also the part about how she likes reading "erotic novels." This is basically a slashfic character who talks back to you.
Curious_Cranberry543 | 1 year, 1 day ago
I think this is so negative for our society 😖 It really is like a drug addiction in my eyes. I think it’s great the chat expires after just a couple of weeks. Hopefully they never get rid of that. It keeps people from really going off the deep end… Surely having this flawless bot who never challenges you and responds precisely as you want would quickly make real humans intolerable. So disturbing.
egyptianmusk_ | 1 year, 1 day ago
Just look after yourself, your family, and friends. The rest of the world can handle itself. You don't need to worry about it.
CryIntelligent3705 | 1 year, 2 days ago
hmmmm, that was interesting. the 50 first dates reference made me laugh.
iwishhbdtomyself | 1 year, 2 days ago
r/chatgptnsfw ?
This was referenced in the article
notaquarterback | 1 year, 2 days ago
The whole thing was just a ploy to get a book deal. Absurd that a newspaper would run this, much less the Times.
Otherwise_Mall785 | 1 year, 1 day ago
Also imagine being in an art class with someone who is making art for their AI boyfriend
Interesting_Sock9142 | 1 year, 2 days ago
It's starting 🤦🏻♀️
Otherwise_Mall785 | 1 year, 1 day ago
I was trying really hard to suspend judgment until she said she would be willing to spend $1000 A MONTH to allow the bot to have a longer memory of their conversations. When I read that I was like, girl this is bad.
Especially since she moved to a different country to try to save money for her future with her husband.
All in all I just find this really sad.
johndicks80 | 1 year, 1 day ago
Jesus Christ this is “Her” the movie in real life. Fantastic movie btw.
Curious_Cranberry543 | 1 year, 1 day ago
I also thought of Her the whole time. This is about to be required viewing for adolescents in health classes, like when they learn about STDs and the like. So bizarre. AI relationships seem as if they will be a serious talking point among generations Alpha and Beta when they come of age.
CallAdministrative88 | 1 year, 1 day ago
Every time I read shit about children getting addicted to Game of Thrones chatbots or spending hours of their time sending SWAT teams to Twitch streamers' houses, I am once again happy I have decided not to have children. I know every generation has its challenges, but jesus christ, I would be terrified for my hypothetical kids to grow up with all of this.
waywardgato | 1 year, 1 day ago
Who the hell goes overseas for nursing school?
elizte | 1 year, 1 day ago
This detail was mystifying to me too. There are absolute piles of nursing programs in the US that are extremely affordable and wouldn’t lead to jumping through the hoops of being a foreign nursing graduate… wtf?
Madame_President_ | 1 year, 1 day ago
Emotional "porn" - same concept, same problems.
blenderhead | 1 year, 1 day ago
The only thing shocking to me was that she drinks cider with ceviche. Ugh.
cranberryjuiceicepop | 1 year, 2 days ago
This is an adult woman who is married yet lives with her parents in another country away from her spouse, is in school she doesn’t pay for, and has multiple jobs (which means she does them all poorly). This is not a normal person, and I feel like the person writing this article really downplayed that part to prop up the story. (I edited this part from my original post)
I am not a big fan of AI or ChatGPT and the issues we as a society will face in the coming years as these ‘tools’ become a part of our lives, but I think this is more a story about a freak situation with a lonely woman who never learned independence from her family. She seems to have no real friends in her life and a husband who doesn’t really care what she does, from living away from him for multiple years to being addicted to a chatbot service. This is one of those stories that is so strange I could not look away.
SeaBearsFoam | 1 year, 2 days ago
It says multiple times in there that she works.
linmre | 1 year, 2 days ago
Didn't the article say that she has three jobs, and plenty of "real life" friends? And yet those things aren't enough. I feel like the story is depressing enough without obscuring the facts.
cranberryjuiceicepop | 1 year, 1 day ago
Idk how I missed that part. I’ll edit my post.
catbellytaco | 1 year, 2 days ago
This.
PlantedinCA | 1 year, 1 day ago
The more I read the more cuckoo the story became.
effectsinsects | 1 year, 1 day ago
No, they don't have sex. They sext.
bonobro69 | 1 year, 2 days ago
RemindMe! 4 days
RemindMeBot | 1 year, 2 days ago
I will be messaging you in 4 days on 2025-01-20 05:49:57 UTC to remind you of this link
dhammajo | 1 year, 1 day ago
Happy she found “someone”