There is a final twist to this story. Predictive models are good not only for figuring out the causes of sensory signals, they also allow the brain to control or regulate these causes, by changing sensory data to conform to existing predictions (this is sometimes called ‘active inference’). When it comes to the self, especially its deeply embodied aspects, effective regulation is arguably more important than accurate perception. As long as our heartbeat, blood pressure and other physiological quantities remain within viable bounds, it might not matter if we lack detailed perceptual representations. This might have something to do with the distinctive character of experiences of ‘being a body’, in comparison with experiences of objects in the world – or of the body as an object.
— Read on aeon.co/essays/the-hard-problem-of-consciousness-is-a-distraction-from-the-real-one
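The contrast in the excerpt above can be made concrete with a toy sketch. This is purely my own illustration, not Seth's actual model: the "heart rate" variable, set point, and update rules are all assumptions for demonstration. The point is that prediction error can be reduced in two directions: by updating the prediction to fit the signal (perception), or by acting so the signal fits the prediction (regulation, i.e. active inference).

```python
# Toy illustration (not Seth's model): prediction error can be minimized
# either by changing the prediction or by changing the sensed signal.

def perceive(prediction, signal, learning_rate=0.1):
    """Perceptual inference: nudge the prediction toward the signal."""
    error = signal - prediction
    return prediction + learning_rate * error

def act(prediction, signal, gain=0.5):
    """Active inference: act on the world so the signal moves toward the prediction."""
    error = prediction - signal
    return signal + gain * error

prediction = 60.0   # assumed set point, e.g. resting heart rate in bpm
signal = 90.0       # perturbed signal, e.g. after exertion

# Perception alone: the prediction drifts toward the perturbed signal.
p = prediction
for _ in range(20):
    p = perceive(p, signal)

# Regulation: acting drives the signal back toward the prediction.
s = signal
for _ in range(20):
    s = act(prediction, s)

print(round(p, 1), round(s, 1))  # prediction drifted up; signal pulled back to ~60
```

Under regulation the physiological quantity returns to its viable set point without the prediction ever changing, which is one way to read the claim that effective regulation can matter more than accurate perception.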
Anil Seth’s research fascinates me.
I’ve been thinking quite a bit about the nature of consciousness, and meditating to understand my own experience better, which has led me to some really interesting ideas.
To me, the idea that the brain hallucinates what we perceive as “reality” is one of the most fun ideas in this space.
I wish to develop my understanding of what “hallucination” even is, and I hope, perhaps, one day, to have an idea of my own to contribute to this domain.