A Bestiary of Consciousnesses
Consciousness accesses only a minuscule fraction of the brain’s overall processing load, raising the question of how it is ‘positioned.’ The Positioning Problem is the question of how our intuitive sense of ‘sufficient centrality’ compares to the actual informatic topography of conscious experience. What follows is a taxonomy of various possibilities, which I call a ‘bestiary’ simply to remind the reader that it is prescientific, and so speculative in the extreme.
Hard Strategic: Evolutionary exigency seems to suggest that consciousness has a function. Likewise, manifest intuition seems to suggest 1) that consciousness is ‘central,’ that it drives action and cognition; and 2) that it is ‘sufficient,’ that it generally accesses everything required to drive action and cognition. Though few subscribe to it today, hard strategic consciousness is the dominant position of the tradition, philosophical or otherwise. It takes the apparent sufficient centrality of consciousness to action and cognition as largely accurate, assuming more than asserting the reality of original intentionality.
Truncated Strategic: Consciousness is strategic, but faces problematic constraints on informatic access, or ‘information horizons’; it is ‘truncated,’ admitting the possibility of problematic mischaracterizations and structural distortions. Intuitive sufficiency may be illusory, but experience remains synoptic, accessing enough to warrant attributing cognitive and practical efficacy to consciousness. Intentionality is real and phenomenology is possible, only complicated by the need for remedial rationalizations.
Tangled Strategic: Consciousness is strategic, but disorganized or ‘tangled,’ admitting the possibility of problematic gaps between manifest intuition and neural actuality. (Perhaps volition simply accompanies or follows behaviour; perhaps certainty has no systematic relation to reason; perhaps motivational explanation is largely theoretical and confabulatory; perhaps peripheral visual information is assumptive; and so on.) Intuitive centrality is problematic, but obtains in this or that crucial respect, obviating the need to ‘eliminate’ some manifest staple, such as epistemic normativity, etc. Intentionality is real and phenomenology is possible, only complicated by the need for remedial rationalizations.
Coarse Strategic: Consciousness is strategic, but low-resolution or ‘coarse,’ admitting the possibility of problematic oversimplifications and structural distortions. As with truncation, the deceptiveness of intuitive sufficiency is acknowledged, but experience is still considered synoptic, accessing enough to warrant attributing cognitive and practical efficacy to consciousness. Intentionality is real and phenomenology is possible, only complicated by the need for remedial rationalizations.
Coarse, Truncated, and Tangled Strategic (Soft Strategic): Consciousness is strategic, but coarse, truncated, and tangled, thus increasing the need for substantial remedial rationalization, while admitting the conservation of some manifest staple, typically epistemological (given philosophers’ fondness for their claims).
Inadvertent: In addition to the above, evolutionary exigency also seems to suggest that consciousness is largely ad hoc. Inadvertent Consciousness assumes that the actual function of consciousness massively contradicts our manifest intuitions, that given the drastic nature of its coarseness, truncation, and tangling it is neither central nor sufficient. Intentionality is not real and phenomenology, as traditionally understood, is not possible, simply because we do not possess the consciousness we think we do.
For me, questions of intentionality, agency, and free will have been settled. To me it is clear that we don’t have any of these flavors of behavioral motivation. If this is the situation, it is also the case that we are not autonomous selves, since autonomy requires some sort of self-direction. What then directs our behavior?
What we do have that serves the functions of these absentee volitional forces is what I refer to as individuality. Individuality is simply a label I apply to the accumulation in memory of personal experiences. It also includes our unique genetic inheritance. My contention is that all of this individuality is inherently included in neural computations. In this way our responses to stimuli are individual (uniquely personal) and grounded in personal history.
Individuality does not have intentions. Nor does it exercise free will in driving our behaviors. It is essentially grist in the mill of the brain’s choice-making machinery. It contains all of the information, habits, ideas, biases, and preferences that we normally attribute to our “selves”. The result is that our brains, having no intentions to do so, arrive at precisely the same responses (choices) that our intentional selves would arrive at if they actually existed.
I should have tied the above to the topic of consciousness. I see consciousness as providing ongoing feedback to the neural computational processes so as to allow real-time updates to the responses these computations produce. I agree that consciousness is tangled, truncated, coarse, and unintentional. Intuitive sufficiency is a useful fiction.
Phenomenology is also a useful fiction. It serves survival just as intuitive sufficiency does.
I waffle constantly between Tangled/Coarse Strategic and Inadvertent. The latter possibility is so disorienting, unintuitive, nonsensical, and aggravating that it’s probably (unfortunately) correct.
If “inadvertent” is correct, we could never know it. All such judgements would necessarily be ad hoc and unreliable. I like to think of this taxonomy more as an EVOLUTION of consciousness: from mostly unreliable to central, from inadvertent and tangled to hard strategic. Where are “we” on this list? Not quite hard strategic, I think. Just an educated guess. Could we ever answer this question, in principle? It’s an imponderable.
Welcome, Matt! We’re actually discussing this very issue on a different thread. Why could we never know it? What, in principle, protects other intentional phenomena from the fate of the will, which science seems to be revealing as more inadvertent every day?
The distinction this taxonomy asks you to make is between what your brain is doing and what you think you are doing. Given that what you think you’re doing depends on what little information attentional awareness (whatever the hell it turns out to be) can glean from your ‘gut brain’ (to steal a phrase from Eric), isn’t it entirely possible that the evolutionary functions of consciousness can increase in efficiency, while your ability to cognize these functions decreases?
I would love to hear what you think about this little piece of craziness: http://independent.academia.edu/scottbakker/Papers/1560923/The_Last_Magic_Show_A_Blind_Brain_Theory_of_the_Appearance_of_Consciousness