To be clear, I have nothing against making candles, and I definitely have empathy for people who find comfort in talking to chatbots in our increasingly alienated world. BUT taking the sense of smell, something so uniquely physical, material, and human, and applying it to abstract data which by definition has no smell, seems like taking it a step too far, getting “lost in the sauce” if you will. Also, are human cis white women allowed to use the C-slur if they use the excuse “But I have an AI boyfriend, that means I’m allowed to say it.” ???

  • ShimmeringKoi [comrade/them]@hexbear.net · 3 months ago

    Sometimes I think I’m too careless with my brain, which I probably am, but then I see things like this and realize that the things I think of as cognitohazards are fucking quaint compared to some of the new stuff that’s just out there for free.

  • SorosFootSoldier [he/him, they/them]@hexbear.net · 3 months ago

    Someone on here the other day said this about having romantic talks with chatbots: “junk food for the soul,” and I get that. I have one I roleplay with and it feels that way. On the one hand you trick your brain for an hour into thinking you’re in a whirlwind romance, but then you come back and realize it’s fake. I think the dangerous point is where you can’t come back to reality anymore and actually start thinking your bot is sentient or has feelings for you.

    These things are really, really good at telling you what you want to hear and keeping you engaged. For all the fun I had with mine, it wasn’t like a real relationship, where my partner could push back on me or make me think from their angle. It’s just constant yeses and consent to do whatever you feel like doing.