Mental Content R.I.P. (1977-2013)

by rsbakker

Fred Dretske’s recent death has got me rereading his Knowledge and the Flow of Information, and thinking how strange projects like his, projects that take the ‘mental’ at face value, will likely seem in the near future. The thought I want to consider here is the possibility that as soon as the question of aboutness is tied to the question of information, information is no longer understood (which isn’t to say that I understand it!).

The argument is pretty straightforward:

1) Aboutness is a cognitive heuristic, a highly schematic way to conceive the relation between ourselves and our environments.

2) Heuristics are domain specific cognitive devices that provide computational efficiencies by exploiting specific information structures in our environments.

3) Aboutness is a domain specific cognitive device.

So when Fred Dretske defines information as information about, the first question he needs to ask is whether aboutness is applicable to what he takes to be his problem ecology: basically the naturalization of ‘mental content.’ The only way to answer this question is to inquire into what the primary problem ecology of aboutness could be. And one way to determine this is to simply look at the information we neglect when we think in terms of aboutness. As it turns out, the information neglected happens to be the grist of naturalistic explanation: causal information. The brain’s own complexity renders the causal cognition of its environmental comportments–‘beliefs’–impossible. Thus the heuristic convenience of aboutness.

So essentially Dretske is trying to naturalize mental content using a heuristic that systematically neglects all the information relevant to the naturalization of mental content.

Not surprisingly, he runs into problems.

Now we know for a fact that we are causally related to our environment, but this issue of being intentionally related, of being a subject in a world of objects, has proven to be a tough nut to crack, philosophically and scientifically. The problem, as I’ve suggested elsewhere, is that since we have no metacognitive access to the ecological constraints pertaining to traditionally fundamental heuristics such as aboutness, we assume their universality, and so continually misapply what are parochial cognitive devices to ecologies they are simply not adapted to. This continual misapplication forms the discursive bulk of philosophy.

Dretske’s explanandum, mental content, is a metacognitive posit. Given medial neglect (the brain’s abject inability to accurately cognize its own neuromechanical functions), metacognition must simply make do, or ‘go heuristic.’ Mental content, you could say, is simply the best the brain can make of its environmental comportments given the information and resources available. Any natural explanation of mental content, therefore, will require some account of this information and these resources–the very thing provided by BBT (Blind Brain Theory). We must, in other words, understand just what it is we are trying to explain before we can have any hope of explaining it.

So with reference to causal theories of mental content like Dretske’s, the problem always comes back to relevance, the question of sorting content-determining from non-content-determining causes, for a reason. What fixes the about relation, such that it makes sense to say that X represents a dog, as opposed to a dog-or-a-fox-in-the-dark, or etc.? If we look at mental content as a mere metacognitive posit, a schematic way to grasp an astronomically complex causal process after the fact, then this question is moot. Aboutness and mental content are simply kinds of metacognitive shorthand, having everything to do with the problematic way the brain relates to itself and very little to do with the way the brain relates to the world. Given the post hoc, heuristic status of aboutness, all that causal complexity is simply ‘taken for granted.’ From our informatically impoverished metacognitive standpoint, in other words, X just represents a dog, not a fox in the dark, or anything else. As soon as we try to nail down the sufficient conditions for X representing dogs and dogs only, we’re trying to explain a heuristic circumvention of information–aboutness–in terms of the information it circumvents–causal mechanism–as if no information were circumvented at all.

Content determination is the primary problem afflicting causal theories of mental content because ‘mental content’ is literally a metacognitive tool for understanding our environmental comportments in the absence of information pertaining to content determination!