‘It’s just parroting the training data!’ That’s supposed to be reassuring??

    • Andy@slrpnk.net · 2 days ago

      I actually kinda agree with this.

      I don’t think LLMs are conscious. But I do think human cognition is way, way dumber than most people realize.

      I used to listen to this podcast called “You Are Not So Smart”. I haven’t listened in years, but now that I’m thinking about it, I should check it out again.

      Anyway, a central theme is that our perceptions are composed heavily of self-generated delusions that fill the gaps left by dozens of kludgey systems, creating a very misleading experience of consciousness. Our eyes aren’t that great, so our brains fill in details that aren’t there. Our decision making is too slow, so our brains react on reflex and then generate post-hoc justifications if someone asks why we did something. Our recall is shit, so our brains hallucinate (in ways that admittedly seem surprisingly similar sometimes to LLMs) and then apply wild overconfidence to fabricated memories.

      We’re interesting creatures, but we’re ultimately made of the same stuff as goldfish.

    • Grail@multiverse.soulism.net · 2 days ago

      Yeah, you’re right. Humans get really weird and precious about the concept of consciousness and assign way too much value and meaning to it. Which is ironic, because they spend most of their lives unconscious and on autopilot. They find consciousness to be an unpleasant sensation and go to great lengths to avoid it.