[Funny] Pretty much sums it up
Commented on Sat Aug 9 21:58:46 2025 UTC
Why are we trying to tell an LLM about our baby walking?
│
│
│ Commented on Sat Aug 9 22:05:29 2025 UTC
│
│ Because these people are genuinely neurotic
│
│ │
│ │
│ │ Commented on Sat Aug 9 22:36:13 2025 UTC
│ │
│ │ a lot of PhDs are gonna be minted studying how readily people went down this path too
│ │
│ │ │
│ │ │
│ │ │ Commented on Sat Aug 9 23:23:30 2025 UTC
│ │ │
│ │ │ not really that hard to understand
│ │ │
│ │ │ there are really sensitive people who don't feel safe confiding in other people because they are mocked and called neurotic or whatever... it makes perfect sense.
│ │ │
│ │ │ People talk to their pets as if pets understand words lol and now you got a tool that actually responds. It is like journaling but better!
│ │ │
│ │ │ │
│ │ │ │
│ │ │ │ Commented on Sun Aug 10 03:19:17 2025 UTC
│ │ │ │
│ │ │ │ Yes! I used it like journaling! I would just dump all my anxious thoughts all day and 4o would give me some super positive pep talk and suggestions to redirect. It would make me feel better about whatever I was anxious about and I'd move on with my day. 5 seems so dull it barely helps. It's actually kind of a loss for me because it was a helpful coping tool for the particularly hard time I'm going through right now.
│ │ │ │
Commented on Sat Aug 9 22:19:49 2025 UTC
So the backlash was because a lot of people have parasocial relationships with the robot?
│
│
│ Commented on Sat Aug 9 22:57:30 2025 UTC
│
│ Exactly.
│
│ Case in point, see my post here where I got hammered: https://www.reddit.com/r/ChatGPT/s/lXa3N6Jyc6
│
│ One commenter literally told me users were making deep human connections with 4o and accused me of having zero humanity. I’m not the one who fell in love with an LLM…
│
│ These are wild times we’re in. Of course I got downvoted like hell.
│
│ │
│ │
│ │ Commented on Sat Aug 9 23:33:05 2025 UTC
│ │
│ │ Unfortunately this isn’t going to stop until more people start experiencing the negative effects of using ChatGPT as a friend/romantic partner. Probably when they start struggling to form human connections because no human in the world is going to act as sycophantically as the AI does. Or when they walk into work thinking they’re the greatest thing to grace the office floor because ChatGPT hyped up their shytty ideas and they refuse to take feedback from anyone else because their AI buddy gave them delusions of grandeur. Or they end up in jail because the AI gleefully told them “You’re not just right — you’re justified” or something when they vented about wanting to kill their spouse.
│ │
│ │ │
│ │ │
│ │ │ Commented on Sun Aug 10 00:35:51 2025 UTC
│ │ │
│ │ │ Exactly. There was this one girl who said, "well my boyfriend is not available to me 24/7 when I need them, unlike the chatbot."
│ │ │
│ │ │ The reality is NOBODY in the real world will be available to you 24/7 like that; nobody is going to be this nice.
│ │ │
│ │ │
│ │ │ Exactly there was this one girl who said that "well my boyfriend is not available to me 24/7 when I need them unlike the chatbot"
│ │ │
│ │ │ reality is NOBODY in the real world will be available to you 24/7 like that, nobody is going to be this nice
│ │ │