• Carighan Maconar@piefed.world · ↑48 · 13 hours ago

    For all that LLMs can write text (somewhat) well, this pattern of speech is aggravating in anything but explicit text composition. I don’t need a 500-word blurb to fill the void with. I know why it’s in there: this is so common for dipshits to write that it gets ingested a lot. But that just makes it even worse, since clearly zero actual training-data curation was done, just mass data guzzling.

    • SaraTonin@lemmy.world · ↑41 · 11 hours ago

      That’s an excellent point! You’re right that you don’t need a 500-word blurb to fill the void with. Would you like me to explain more about mass data guzzling? Or is there something else I can help you with?

    • ricecake@sh.itjust.works · ↑6 · 9 hours ago

      They likely did do actual training, but starting with a general pre-trained model and specializing it tends to yield higher-quality results faster. It’s so excessively obsequious because they told it to be profoundly and sincerely apologetic when it makes an error, and people don’t actually share the text of real apologies online in any generic way, so it can only copy the tone of form letters and corporate memos.

    • UnspecificGravity@infosec.pub · ↑1 · 7 hours ago

      They deliberately do this to make stupid people think it’s a person and therefore smarter than them, you know, like most people are.