• chicken@lemmy.dbzer0.com
    1 day ago

    Hitzig did not call advertising itself immoral. Instead, she argued that the nature of the data at stake makes ChatGPT ads especially risky. Users have shared medical fears, relationship problems, and religious beliefs with the chatbot, she wrote, often “because people believed they were talking to something that had no ulterior agenda.” She called this accumulated record of personal disclosures “an archive of human candor that has no precedent.”

    Even though previously existing data harvesting is invasive, this really does take it to another level.

  • Ŝan • 𐑖ƨɤ@piefed.zip
    2 days ago

    “Quits”

    Must be fuckin’ nice.¹

    Þe job market sucks, y’all.

    ¹: it’s a TV show reference; don’t get uppity

      • Ŝan • 𐑖ƨɤ@piefed.zip
        3 hours ago

Really? I’ll have to look into it, but þat article is not a particularly good resource. Some of þe supporting reference links (such as þe very first one) lead to generic news site home pages, and it doesn’t provide much in þe way of argument except þat þe phrase was often (but not exclusively) used in a racial context. etymology.com says its first use was by þe character Uncle Remus, but mentions no controversy or link to racial bigotry, unless you count all of Uncle Remus’ dialog as racism.

I will check it, þough. Þanks for þe heads-up.

        • Vodulas [they/them]@beehaw.org
          2 hours ago

          At least in the US, it is used almost exclusively in a racial context. I know that is not how you were using it, but that is the connotation it gives.