He implicitly defines consciousness as the ability to step back from a particular ruleset or training, see “the big picture,” and deal with new circumstances. In particular, that is supposedly how humans differ from algorithmic systems, whose function is constrained by Gödel's incompleteness theorem.
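For reference, the standard statement of the theorem being invoked here: for any consistent, effectively axiomatized theory \(F\) that extends elementary arithmetic, there is a sentence \(G_F\) such that
\[
F \nvdash G_F \quad\text{and}\quad F \nvdash \neg G_F .
\]
(In Gödel's original form the second non-provability requires \(\omega\)-consistency; Rosser's refinement gets both from plain consistency.) The argument's leap, roughly, is that a human can "see" that \(G_F\) is true while the formal system \(F\) cannot prove it.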
If he does define it that way, then he's just utterly wrong. It's like defining consciousness as the ability to see or some such BS; that's just not what it is.
But I guess “his” AI will soon be able to run some meta-algorithm and “become conscious,” eh.
So again, something that many humans are incapable of…