• Alex@lemmy.ml · 5 hours ago

    If you have ever read the “thought” process of some of the reasoning models, you can catch them going into loops of circular reasoning, just slowly burning tokens. I’m not even sure this isn’t by design.

    • Ech@lemmy.ca · 1 hour ago

      It’s like the text predictor on your phone. If you just keep hitting the next suggested word, you’ll usually end up in a loop at some point. Same thing here, though admittedly much more advanced.
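
      Roughly what that looks like as code, as a toy sketch (the suggestion table below is made up, not taken from any real keyboard):

          # Toy "model" (hypothetical data): for each word, the single most
          # likely next word, like a phone keyboard's top suggestion.
          most_likely_next = {
              "I": "am",
              "am": "going",
              "going": "to",
              "to": "the",
              "the": "store",
              "store": "and",
              "and": "the",  # "the" -> "store" -> "and" -> "the": a cycle
          }

          word = "I"
          output = [word]
          for _ in range(12):  # keep tapping the middle suggestion
              word = most_likely_next[word]
              output.append(word)

          print(" ".join(output))
          # I am going to the store and the store and the store and the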

    • FishFace@piefed.social · 6 hours ago

      LLMs work by picking the next word* that is the most likely candidate given the model’s training and the current context. Sometimes the model gets into a state where its view of the context doesn’t change when a word is picked, so the next pick is the same word. Then the same thing happens again, and around we go. There are fail-safe mechanisms that try to prevent this, but they don’t work perfectly.

      *Token
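
      A toy sketch of the idea (the vocabulary and scores are invented, and the “repetition penalty” below is a deliberately crude stand-in for the real fail-safes, e.g. frequency penalties or sampling temperature):

          VOCAB = ["cats", "or", "dogs", "."]

          def toy_scores(context):
              # Pretend model: no matter the context, "or" scores highest.
              # This mimics the state where emitting a token doesn't change
              # the model's view of the context enough to change the ranking.
              return {"cats": 1.0, "or": 3.0, "dogs": 2.0, ".": 0.5}

          def pick_next(context, penalty=1.0):
              scores = toy_scores(context)
              for tok in context:  # crude fail-safe: penalize reused tokens
                  scores[tok] /= penalty
              return max(scores, key=scores.get)

          def generate(steps, penalty):
              context = ["cats"]
              for _ in range(steps):
                  context.append(pick_next(context, penalty))
              return " ".join(context)

          print(generate(6, penalty=1.0))  # cats or or or or or or
          print(generate(6, penalty=2.0))  # cats or dogs or dogs or cats

      With no penalty the ranking never changes, so greedy selection emits “or” forever; the penalty makes the scores depend on what has already been generated, which breaks the cycle.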

      • ideonek@piefed.social · 5 hours ago

        That was the answer I was looking for. So it’s similar to the “seahorse emoji” case, but this time at some point he just glitched: the most likely next word for the sentence was “or”, and after adding that “or” the next one was also “or”, and after adding that one, also “or”. And after the 11th one he may just as well keep going, since that’s the same context as with 10.

        Thanks!

          • ideonek@piefed.social · edited · 4 minutes ago

            Chill, dude. It’s a grammatical/translation error, not an ideological declaration. It’s an especially common mistake if your native language has grammatical gender. Everything has a gender in mine. “Spoon” is a “she”, for example, but I’m not proposing to one anytime soon. Not all hills are worth nitpicking on.

            • atomicbocks@sh.itjust.works · 5 minutes ago

              This one is. People need to stop anthropomorphizing AI. It’s a piece of software.

              I am chill. You shouldn’t assume emotion from text.

    • Arghblarg@lemmy.ca · 5 hours ago

      The LLM showed its true nature: a probabilistic bullshit generator caught in a strange attractor of some sort within its own matrix of lies.

    • palordrolap@fedia.io · 4 hours ago

      Unmentioned by other comments: The LLM is trying to follow the rule of three because sentences with an “A, B and/or C” structure tend to sound more punchy, knowledgeable and authoritative.

      Yes, I did do that on purpose.