Yeah see I don’t agree with that base premise, that it’s as simple as information processing. I think sentience - and, therefore, intelligence - is a more holistic process that requires many more tightly-coupled external feedback loops and an embedding of the processes in a way that makes the processing analogous to the world as modelled. But who can say, eh?
It’s not obvious to me that sentience has to come along for the ride. It’s perfectly conceivable that there’s nothing it’s like to be a superintelligent AGI system. What I’ve been talking about this whole time is intelligence — not sentience, or what I’d call consciousness.