• LostWanderer@fedia.io · 20 hours ago

    Who would’ve thought?! They designed their artificially incompetent creations to be complaisant bundles of algorithms built to maximize engagement from vulnerable users. “AI” validates whatever it’s told and doesn’t actually connect users with real human assistance when they’re in a mental health crisis. These tools can easily be prompted into divulging suicide methods, and they isolate vulnerable people in order to keep them engaged. Until we regulate the fuck out of companies like OpenAI and the research-and-development process behind “AI”, this is a problem more and more people will experience.