• verdigris@lemmy.ml
    5 hours ago

    Whole lotta ifs and assumptions. “A high enough level of information processing” is meaningless if we don’t have any idea what sort of information processing could lead to consciousness, because it clearly isn’t just raw throughput.

    AGI definitionally improves itself, which implies awareness of itself and intention. Those make up a large part of how we define consciousness.

    • Iconoclast@feddit.uk
      5 hours ago

      In neuroscience and philosophy, when people talk about consciousness, they're typically referring to the fact of experience - that it feels like something to be that thing, that the experience has qualia.

      Nowhere is it written that this is a requirement for general intelligence. It's perfectly conceivable that a system could be more intelligent than any human while it doesn't feel like anything to be that system. It could even appear conscious without actually being so. A philosophical zombie, so to speak.