Thinker as Tinker
by rsbakker
[Okay, so this is just an organic extension of thinking through a variety of problems via a thought experiment posted by Eric Thomson over at the Brains blog. The dialogue takes place between an alien, Al, who has come to earth bearing news of Consciousness (or the lack of it), and a materialist philosopher, Mat, who, although playing the obligatory, Socratic role of the passive dupe, is having quite some difficulty swallowing what Al has to say. It’s rough, but I do like the picture it paints, if only because it really does seem to offer a truly radical way to rethink consciousness, why we find it so difficult, as well as the very nature of philosophical thought. I haven’t come up with a name for Al’s position yet, so if anyone thinks of something striking (or satirical) do let me know!]
Al: “Yes, yes, we went through this ‘conscious experience’ phase, ourselves. Nasty business. Brutish! You see, you’re still tangled in the distinction between system-intrinsic base information and the system-extrinsic composite information it makes possible. Since your primary cognitive systems have evolved to troubleshoot the latter, you lack both the information and the capacity to cognize the former. It’s yet another garden variety example of informatic parochialism combined with a classic heuristic mismatch. Had you not evolved linguistic communication, your cognitive systems would never need bump against these constraints, but alas, availability for linguistic coding means availability for cognitive troubleshooting, so you found yourself stranded with an ocean of information you could never quite explain–what you call ‘consciousness’ or ‘subjective experience.’”
Mat: “So you don’t have conscious experience?”
Al: “Good Heavens, no, my dear fellow!”
Mat: “So you don’t see that red apple, there?”
Al: “Of course I see it, but I have no conscious experience of it whatsoever.”
Mat: “But that’s impossible!”
Al: “Of course it is, for a backward brain such as your own. It’s quite quaint, actually, all this talk of things ‘out there’ and things ‘in here.’ It’s all so deliciously low res. But you’ll begin tinkering with the machinery soon enough. The heuristics that underwrite your environmental cognition are robust, there’s no doubt about that, but they are far too crude and task-specific for you to conceive your so-called ‘conscious experience’ for what it is. Someday soon you’ll see that asking what redness is makes no more sense than asking what the letter m means!”
Mat: “But redness has to be something!”
Al: “To be taken up as a troubleshooting target of your environmental cognitive systems, yes, indeed. That, my human friend, is precisely the problem. The heuristic you confuse for redness was originally designed to be utterly neglected. But as I said, rendering it available for linguistic coding made it available to your cognitive systems as well, and we find this is where the trouble typically begins. It certainly was the case with our species!”
Mat: “But it exists here and now for me! I’m bloody-well looking at it!”
Al: “I know this is difficult. Our species never resolved these problems until our philosophers began diagnosing these issues the way neurologists diagnose their patients, when they abandoned all their granular semantic commitments, all the tedious conceptual arguments, and began asking the simple question of what information was missing and why. Looking back, it all seems quite extraordinary. How many times do you need to be baffled before realizing that something is wrong with you? Leave it to philosophers to blame the symptom!
“You are still at the point where you primarily conceive of your brain as a semantic (as opposed to informatic) engine, as something that extracts ‘relevant’ information from its noisy environments, which it then processes into models of the universe, causally constructed ‘beliefs’ or ‘representations’ that take the ‘real’ as their ‘content.’ So the question of red becomes the question of servicing this cognitive mode and model, but it stubbornly refuses to cooperate with either, despite their independent intuitive ease. You have yet to appreciate the way the brain extracts and neglects information, the way, at every turn, it trades in heuristics, specialized information adapted for uptake via specialized processors adapted for specific cognitive tasks. Semantic cognition, despite the religious pretension of your logicians, is a cognitive short-cut, no different than social cognition. Rather than information as such, it deals with environmental being, with questions of what is what and what causes what, much as linguistic cognition deals with communicative meaning, with questions of what means what and what implies what.
“Now as I said, red no more possesses being than ‘m’ possesses meaning. Soon you will come to see that what you call ‘qualia’ are better categorized as ‘phenomemes,’ the combinatorial repertoire that your environmental cognitive systems use to make determinations of being. They are ‘subexistential’ the way phonemes are ‘subsemantic.’ They seem to slip into cognitive vapour at every turn, threatening what you think are the hard won metaphysical gains of another semantic myth of yours, materialism. You find yourself confronted with a strange dilemma: either you make a fetish of their constitutive, combinatorial function and make them everything, or you stress their existential intractability and say they are something radically different. But you are thinking like a philosopher when you need to think like a neuropsychiatrist.
“The question, ‘What am I bloody well looking at?’ exhausts the limits of semantic cognition for you. Within those limits, the question makes as much sense as any question could. But it is the product of a heuristic system, cognitive mechanisms whose (circumstance specific) effectiveness turn on the systematic neglect of information. So long as you take semantic cognition at its word, so long as you allow it to dictate the terms of your thinking, you will persist in confusing the informatic phenomena of smonsciousness with the semantic illusion of consciousness.”
Mat: “But semantic cognition is not heuristic!”
Al: “That’s what all heuristics say–they tend to take their neglect quite seriously, as do you, my human friend! But the matter is easily settled: tell me, in this so-called ‘conscious experience’ of yours, can you access any information regarding its neural provenance?”
Mat: “No.”
Al: “Let me guess: You just ‘see things,’ transparently as it were. Like that red apple.”
Mat: “Yes.”
Al: “Sounds like your cognitive systems are exceedingly selective to me!”
Mat: “They have to be. It would computationall–”
Al: “Intractable! I know! And evolution is a cheap, cheap date. So then, coarse-grain heuristics are quite inevitable, at least for evolved information systems such as ourselves.”
Mat: “Okay. So?”
Al: “So, heuristics are problem specific, are they not? Tell me, what should we expect from misapplications of our heuristic systems, hmm? What kind of symptoms?”
Mat: “Confusion, I suppose. Protracted controversy.”
Al: “Yes! So you recognize the bare possibility that I’m right?”
Mat: “I suppose.”
Al: “And given the miasma that characterizes the field otherwise, does this not place a premium on alternative possibilities?”
Mat: “But it’s just too much! You’re saying you’re not a subject!”
Al: “Precisely. No different than you.”
Mat: “That you experience, but you don’t have experience!”
Al: “Indeed! Indeed!”
Mat: “You don’t think you sound crazy?”
Al: “So the mad are prone to call their doctors. Look. I understand how this must all sound. If qualia don’t exist because they are ‘subexistential,’ how can they contribute to existence?
“Think of it this way. At a given moment, t1, qualia contribute, and you find yourself (quite in spite of your intellectual scruples) a naive realist, seeing things in the world. You see ‘through’ your experience. The red belongs to the apple, not you, and certainly not your brain! Subsequently, at t2, you focus your attention on the redness of the red, and suddenly you are looking ‘at’ your experience instead of through it. (In a sense, instead of speaking words, you find yourself spelling them).
“The thing to remember is that this intentional ‘directing at’ that seems so obvious when attending to your attending is itself another heuristic–at best. You might even say it’s the ‘Master Heuristic.’ Nevertheless, it could, for all you know, be an abject distortion, a kind of water-stain Mary Magdalene imposed by deliberative cognition on cognition. Either way, by deliberating the existence of red, you just dropped a rock into the chipper, old boy. ‘But what is red!’ you say. ‘It has to be something!’ You say this because you have to, given that deliberative cognition possesses no information regarding its own limits. As far as it’s concerned, all phenomenal rocks are made of natural wood.
“So this is the dilemma my story poses for you. Semantic cognition assumes universality, so the notion that something that it says exists–surely, at the bare minimum!–does not exist sounds nonsensical. So when I say to you, information is all that matters, your Master Heuristic, utterly blind to the limits of its applicability, whirs and clicks and you say, ‘But surely that information must exist! Surely what makes that information informative is whether or not it is true!’ And so on and so forth. And it all seems as obvious as can be (so long as you don’t ask too many questions).
“Information is systematicity. You need to see yourself philosophically the way your sciences are beginning to see you empirically: as a subsystem. You rise from your environments and pass back into them, not simply with birth and death, but with every instant of your life. There is no ‘inside,’ no ‘outside,’ just availability and applicability. Information blows through, and you are little more than a spangled waystation, a kind of informatic well, filled with coarse-grained intricacies, information severed and bled and bent to the point where you natively confuse yourselves with something other than the environments that made you, something above and apart.
“Information is the solvent that allows cognition to move beyond its low-resolution fixations. It’s not a matter of what’s ‘true’ in the old semantic sense, but rather ‘true’ in the heuristic sense, where the term is employed as a cog in the most effective cognitive machine possible. The same goes for ‘existence’ or for ‘meaning.’ These are devices. So we make our claims, use these tools according to design as much as possible, and dispose of them when they cease being effective. We help them remember their limits, chastise them when they overreach. We resign ourselves to ignorance regarding innumerable things for want of information. But we remember that the cosmos is a bottomless well of information, both in its sum and in its merest part.
“And you see, my dear, materialist friend, that you and all your philosophical comrades–all you ‘thinkers’–are actually tinkers, and the most inventive among you, engineers.”
Mat: “You have some sense of humour for an alien!”
Al: “Alienation comes with the territory, I’m afraid.”
Mat: “So there’s no room for materialism in your account?”
Al: “No more than idealism. There is just no such thing as the ‘mind-body dichotomy.’ Which is to say, the mind-body heuristic possesses limited applicability.”
Mat: “Only information, huh?”
Al: “Are you not a kind of biomechanical information processing system, one with limited computational capacity and informatic access to its environments? Is this not a cornerstone tenet of your so-called ‘materialism’?”
Mat: “Yes… Of course.”
Al: “So is not the concept ‘materialism’ a kind of component device?”
Mat: “Yes, of course, bu–”
Al: “But it’s a representational device, one that takes a fundamental fact of existence as its ‘content.’”
Mat: “Exactly!”
Al: “And so the Master Heuristic, the system called semantic cognition, has its say! So let me get this straight: You are a kind of biomechanical information processing system, one with limited computational capacity and informatic access to its environments, and yet still capable, thanks to some mysterious conspiracy of causal relations, of maintaining logical relations with its environments…”
Mat: “This is what you keep evading: you go on and on as if everything is empirical, when in point of fact, scientific knowledge would be impossible without a priori knowledge derived from logic and mathematics. Incorrigible semantic knowledge.”
Al: [his four eyes fluttering] “I’m accessing the relevant information now. It would seem that this is a matter of some controversy among you humans… It seems that certain celebrated tinkers taught that the distinction between a priori and a posteriori knowledge was artificial.”
Mat: “Yes… But, there’s always naysayers, always people bent on denying the obvious!”
Al: “Yes. Indeed. Like Galileo and Einste–”
Mat: “What are you saying?”
Al: “But of course. You must excuse me, my dear, dear human friend. I forgot how stunted your level of development is, how deeply you find yourself in the thrall of the processing and availability constraints suffered by your primate brain. You must understand that there are no such things as logical relationships, at least not the way you conceive of them!”
Mat: “Now I know you are mad.”
Al: “You look more anxious than knowledgeable, I fear. No information system conjures or possesses logical relationships with its environments. What you call formal semantics are not ‘a priori’–oh my, your species has a pronounced weakness for narcissistic claims. Logic. Mathematics. These are natural phenomena, my friend. Only your blinkered mode of access fools you otherwise.”
Mat: “What are you talking about? Empirical knowledge is synthetic, environmental, something that can only be delivered through the senses. A priori knowledge is analytic, the product of thought alone.”
Al: “And is your brain not part of your environment?”
Mat: “Huh?”
Al: “Is your brain not part of your environment?”
Mat: “Of course it is.”
Al: “So you derive your knowledge of mathematics and logic from your environments as well.”
Mat: “No. Not at all!”
Al: “So where does it come from?”
Mat: “Nowhere, if that question is granted any sense at all. It is purely formal knowledge.”
Al: “So you access it… how?”
Mat: “As I said, via thought!”
Al: “So from your environment.”
Mat: “But it’s not environmental. It just… well… It just is.”
Al: “Symptoms, my good fellow. Remember what I said about symptoms. One thing you humans will shortly learn is that these kinds of murky, controversy-inspiring intuitions almost always indicate some kind of deliberative informatic access constraint. The painful fact, my dear fellow, is that not one of your tinkers really knows what they are doing when they engage in logic and mathematics. Think of the way you need notation, sensory prosthetics, to anchor your intuitions! But since no information regarding the insufficiency of what little access you have is globally broadcast, you assume that you access everything you need. And then it strikes you as miraculous, the connection between the formal and the natural.”
Mat: “Preposterous! What else could we require?”
Al: “Well, for one, information that would let you see your brain isn’t doing anything magical!”
Mat: “It’s not magical; it’s formal!”
Al: “Suit yourself. Would you care to know what it is you’re really doing?”
Mat: “Please. Enlighten me.”
Al: “That was sarcasm, there, wasn’t it? Wonderful! Have you ever wondered why logic and mathematics had to be discovered? It’s ‘a priori,’ you say. It’s all there ‘already,’ somewhere that’s nowhere, somehow. And yet, your access to it is restricted, like your access to environmental information, and the resulting knowledge is cumulative, like your empirical knowledge. ‘Scandal of deduction’ indeed! The irony, of course, is that you’re already sitting on your answer, insofar as you accept that you are a kind of biomechanical information processing system with finite computational capacity and limited informatic access to its environments. Some things that system discovers via system-extrinsic interventions, and others via system-intrinsic interventions. Your ‘formal semantics’ belongs to the latter. Not all interaction patterns are the same. Some you could say are hyperapplicable; like viruses they possess the capacity to manage systematic interventions in larger, far more complex interaction patterns. Your magical… er, formal semantics is simply the exploration of what we have long known are hyperapplicable interaction patterns.”
Mat: “But I’m not talking about ‘interaction patterns,’ I’m talking about inference structures.”
Al: “But they are the same thing, my primitive, hirsute-headed friend, only accessed via two very different channels, the one saturated with information thanks to the bounty of your environmentally-oriented perceptual systems, the other starved thanks to the penury of your brain’s in situ access to its own operations. The one ‘observational,’ thanks to the functional independence your cognitive systems enjoy relative to your environments, the other performative, given that the interaction patterns at issue must be ‘auto-emulated’ to be discovered. The connection between the formal and the natural strikes you as miraculous because you cannot recognize they are one and the same. You cannot recognize they are one and the same because of the radical differences in informatic access and cognitive uptake.”
Mat: “But you’re reasoning as we speak, making inferences to make your case!”
Al: [sighs] “You are, like, so low-res, Dude. Why do you think the status of your formal semantics is so controversial? Surely this also speaks to a lack of information, no? When trouble-shooting environmental problems, your systems are primed for ‘informatic insufficiency’–and well they should be, given that environmental informatic under-determination kills. That blur, for all your ancestors knew, could be a leopard.
“The situation is quite different when it comes to trouble-shooting your own capacities. Whenever you attend to what you call ‘first-person’ information, sufficiency becomes your typical default assumption. This is why so many of your philosophers insisted for so long that ‘introspection’ was the most certain thing, and disparaged perception. The very thing that persuaded tinkers to doubt the reliability of the latter was its capacity to flag its own limitations, its capacity to revise its estimations as new perceptual information became available–the ability of the system to correct for its failures. In other words, what makes perception so reliable is what led your predecessors to think it unreliable, whereas what makes introspection so unreliable is the very thing that led your predecessors to think it the most reliable. No news is good news as far as assumptive sufficiency is concerned!
“Information is additive. Flagging informatic insufficiency is always a matter of providing more information. Since more information always means more metabolic expense and slower processing, the evolutionary default is to strip everything down to the ‘truth,’ you could say–to shoot first and ask questions later!”
Mat: “So there’s no such thing as the truth, now?”
Al: “Not the way you conceive it. How could there be, given finite computational resources and limited informatic ability? How could your ‘view from nowhere’ be anything other than virtual, simply another heuristic? You have packed more magic into that term ‘formal’ than you know, my bald-bodied friend.
“Why do you think your logicians and mathematicians find it impossible to complete their formal systems short of generating inconsistencies? Computation is irreflexive. No device can perform computations on its own computations as it computes. For years your tinkers have been bumping into suggestive connections between incompleteness and thermodynamics, and even now, some are beginning to suspect the illusory nature of the ‘formal,’ that calculation and computation are indeed one and the same. All that remains is for you to grasp the trick of consciousness that makes it seem otherwise: the informatic deprivations that underwrite your illusion of reflexivity, and lead you to posit the ‘formal.’
“Let me hazard a guess: Tinkers in human computer science find themselves flummoxed with dualisms that bear an eerie resemblance to those found in your philosophical tinkering.”
Mat: “Why… Yes, as it so happens.”
Al: “I apologize. The question was rhetorical. I was accessing the relevant information as I spoke. I see here that no one knows how to connect the semantic level of programming to the implementation level of machine function. The ‘symbol grounding problem,’ some call it… Egad! Can’t you see this is what I’ve been talking about all along?”
Mat: “I… I don’t understand.”
Al: “Once again, you admit you’re a kind of biomechanical information processing system, one with limited computational capacity and informatic access to its environments. You admit that as such a system, you suffer any number of even more severe informatic shortfalls with reference to your own operations. You admit that the numerous peculiarities you attribute to the mental and the semantic at least admit description in terms of information deficits. And yet you find it impossible to bracket your semantic intuitions, the magical belief that any biomechanical information processing system, let alone one possessing the limited computational capacity and informatic access as yours, can manufacture a kind of absolute ‘epistemic’ relation.
“Implementation, my pheromonal friend. Implementation. Implementation. Implementation. Implementation is the way, the concept you need, to maximize informatic applicability (problem-solving effectiveness) when tinkering with these problems. When you ‘program’ your computers, it’s primarily a matter of one implementation engendering another. Your ‘semantics’ is little more than the coarse-grain crossroads, a low-res cartoon compared to the informatics that you (as a so-called materialist) acknowledge underwrites it. You admit that semantics comes in an informatic box, and yet you insist on shoving that informatics into a semantic box, and you are mystified as to why nothing stays put.”
Mat: “Okay! Okay! So I’m willing to entertain the possibility that my reasoning has been distorted by the misapplication of some kind of ‘semantic stance,’ I guess. The ‘Master Heuristic,’ as you call it. Certain work in rational ecology suggests that the strategic exclusion of information often generates heuristics that are more effective problem solvers than optimization approaches. We evolve heuristics because of their computational speed and metabolic efficiency, but the hidden price we pay is limited applicability: heuristics are tools, and tools are problem specific. So how does all of this bear on the problem of conscious experience, again?”
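[An aside for the programmatically inclined: the rational-ecology claim Mat concedes here–that heuristics which strategically exclude information can rival full optimization–can be sketched in a few lines of code. What follows is a toy illustration in the spirit of ‘fast-and-frugal’ decision models; the function names, cues, and numbers are all invented for the example, not anyone’s actual implementation.]

```python
def take_the_best(cues_a, cues_b, validity_order):
    """Fast-and-frugal choice: check cues in descending order of validity
    and decide on the first cue that discriminates, neglecting the rest."""
    for i in validity_order:
        if cues_a[i] != cues_b[i]:
            return "a" if cues_a[i] > cues_b[i] else "b"
    return "tie"  # no cue discriminates

def weighted_sum(cues_a, cues_b, weights):
    """'Optimization' baseline: integrate every cue before deciding."""
    score_a = sum(w * c for w, c in zip(weights, cues_a))
    score_b = sum(w * c for w, c in zip(weights, cues_b))
    if score_a == score_b:
        return "tie"
    return "a" if score_a > score_b else "b"

# Two options described by three binary cues, most valid cue listed first.
a, b = (1, 0, 1), (0, 1, 1)
print(take_the_best(a, b, validity_order=[0, 1, 2]))  # stops after cue 0
print(weighted_sum(a, b, weights=[0.6, 0.3, 0.1]))    # consults all cues
```

Both routes pick the same option here, but the heuristic inspected a single cue while the optimizer integrated all three–the ‘hidden price’ being that the heuristic only works where its validity ordering holds, i.e., within its scope of application.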
Al: “But of course! My o my, we’ve strayed far afield, haven’t we? I have to admit, I’m overfond of preaching the virtues of Informatics to species as immature as yours. As I was saying earlier, qualia ‘exist’ relative to things existing in the world the way phonemes ‘mean’ relative to words meaning in language: in a participatory, modal sense. When you attend to qualia, they don’t offer much in the way of existential information, the way phonemes don’t offer much in the way of meaning. This is because, among other things, neither heuristic is matched for the cognitive system employed. Qualia, or phenomemes, are designed to build existence (when taken up by the appropriate cognitive system) the way phonemes are designed to build meaning (when taken up by the appropriate cognitive system).
“So, again, when a tinker submits qualia to the Master Heuristic for ‘existence processing’ they inevitably come up short. The question can’t be resolved. Phenomenality has to be something, and yet it doesn’t seem to be anything at all. You invent whole species of ‘zombies,’ whole genres of thought experiments, trying to get some purchase on the problem, to no avail.
“Consider the conceptual Necker Cube of phenomenology and naturalism, idealism and materialism, the way your tinkers can’t decide whether to put ‘existence’ here or ‘there,’ to make it this or ‘that.’ The Master Heuristic looks ‘through’ experience, and sees the fine-grained complexities of the world. The Master Heuristic looks at experience, and sees the coarse-grained obscurities of consciousness. Both are right there, as plain as the polyp on your face. Which is fundamental? Who rules the metaphysical roost?
“But, as the informatic concept of ‘granularity’ suggests, the dichotomy is false, the result of a basic heuristic misapplication. To complicate your own materialist truism: You are a systematic assemblage of multiple biomechanical information processing systems, heuristic devices, each possessing limited computational capacity and informatic access, each adapted to a specific set of problems. If you accept this claim, as I think you must, then you should accept that the problem of ‘heuristic misapplication’ looms over all your tinker–”
Mat: “Now you’re starting to sound like Rorty–or even worse, Wittgenstein!”
Al: “Two great tinkers, yes. Indeed, their critiques reveal some of the shortcomings of the Master Heuristic, at least to the extent they considered philosophical problems in terms of performance. But by trading semantic reference for normative competence, they simply traded one inapplicable heuristic, referential truth, for another one, normative truth. I’m offering you effectiveness. Effectiveness is the concept possessing maximal applicability. Information–systematic differences making systematic differences.
“But to get back to the issue at hand: the problem of ‘heuristic misapplication’ looms over all your tinkering. Because many of these heuristics are innate as well as blind to their limited applicability, what I’m saying here will inevitably cut against a number of intuitions. But then you materialists, I gather, have long since accepted that any adequate account of consciousness will likely involve any number of counterintuitive claims.”
Mat: “But you’re saying there’s no such thing as intentionality! No meaning. No agency. No morality!”
Al: “Don’t pretend to be surprised. You materialists may not like to write about it, but our surveillance indicates that many of you have privately abandoned these things anyway.”
Mat: “Many?–maybe. But not me.”
Al: “The thing to remember is that this is simply what you’ve been all along. Some heuristics, like love, say, are preposterously coarse-grained, and yet preposterously effective all the same, so long as their scope of application is constrained. Meaning, agency, morality: these heuristics are also enormously effective, given the proper scope. The thing to remember is that ‘information’ is also a heuristic–only one that is particularly effective and perhaps maximally applicable, at least given the scope of the problem you call ‘consciousness.’”
Mat: “But still, in the end, you’re just telling me to look at myself like a machine.”
Al: “The way your doctor looks at you–yes! The way you claim to look at yourself already, and the way natural science has always looked at you. The only real question, my thermally tepid friend, is one of why philosophy has consistently refused to play along–even when it claims to be playing along! And this is what I’m offering you: a way to understand why the obvious strikes you as so preposterous! Heuristics. You are an assemblage of heuristics, a concatenation of devices that take informatic neglect as their cornerstone problem-solving strategy, each of which is matched to a specific family of problems, and all of which are invisible to metacognition as well as utterly blind to one another–simply because no information regarding any of this finds its way to what you call ‘conscious cognition.’”
Mat: “So this is like Dennett’s stuff. You’re saying we need to make sure our various problem-solving stances are properly ‘matched,’ as you put it, to our problems.”
Al: “In a sense, yes–though an intentional heuristic like ‘stance’ is bound to hopelessly confuse things. I don’t think Dennett would want to say that you are ‘stances all the way down’ the way I’m suggesting that you are heuristics all the way down. As an intentional heuristic, ‘stance’ has limited applicability. This is why I’m offering you information: as heuristics go, it offers the highest resolution and the broadest scope. It allows you to explain the structure of other heuristics, as well as the kinds of misapplications that keep your tinkers so long-bearded and well-fed. It offers you, in other words, a real way out of all your ancestral confusions.
“And most pertinent to our discussion, it lets you understand why consciousness baffles you so.”
Mat: “Yes. The million dollar question.”
Al: “You are a systematic assemblage of multiple biomechanical information processing systems, heuristic devices, each possessing limited computational capacity and informatic access, each adapted to a specific set of problems.
“In most species, cognition is built around what might be called the ‘open channel principle.’ It evolved to manage the organism’s relationship to environmental change as efficiently as possible. As such, it neglects astronomical amounts of neural and environmental information, relying on those heuristics that optimize effectiveness against metabolic cost. It’s difficult to overstate how crucial this point is: the effectiveness of your cognition turns on the strategic neglect of certain kinds of information–what might be called ‘domain neglect.’
“Take, for instance, ‘aboutness.’ You have experiences OF things rather than experiences FROM things because information regarding the latter is much less germane to survival. Only in instances of perceptual vagueness or ambiguity do you perceive FROM information, typically in indirect ways (what you might call ‘squints,’ cues to gather supplemental information). So-called transparency, in other words, is a form of strategic neglect.
“Now consider how difficult FROM information is to think in semantic terms belonging to what I’ve called your Master Heuristic. Just try to imagine the experience belonging to a perceptual system that provided knowledge FROM, so that you have experience FROM trees rather than OF them. In other words, try to imagine opaque experience. Given your neurophysiology, the best you can do is imagine OF experience that it is FROM. Transparency–the Master Heuristic–is neurophysiologically compulsory.
“And here we stumble upon the threshold of what makes consciousness so incredibly difficult to fathom: it requires accessing the very information the system neglects (either out of structural necessity or to maximize the heuristic efficiency of environmental cognition).
“Why should this be a problem? Well, it’s an obvious misapplication, for one. As I said earlier, using cognitive systems designed to manage extrinsic environments to assess phenomemes amounts to dropping a rock into the woodchipper. You are trying to make words out of letters, existents out of the informatic constituents of existence.
“You persist because of cognitive neglect: you simply cannot see the limits of applicability pertaining to your Master Heuristic. You find yourself in an informatic dead-end, stranded with ‘experience’ as a peculiarly intractable existent. Given the absence of information pertaining to the insufficiency of the limited information gleaned by attending to experience, you assume it’s all the information you need–or what I earlier called sufficiency. Since deliberative experience OF experience, given its neglect, seems to capture the whole of experience, any information that reveals it to be a mere fragment of that experience is going to seem to contradict that experience, to be talking about something else. So you run into a powerful intuitive barrier, not unlike explaining the Mona Lisa to an ant born glued to her nose.”
Mat: “So, in the picture you’re painting, there is no final picture, only a… frame, I guess you could say, systematic differences making systematic differences, or effective information. Using that, we can think outside the limitations of our heuristics, and see that consciousness as we conceive of it is a kind of perspectival illusion, a figment of informatic constraints. There literally is no such thing outside our own… informatic frame of reference?”
Al: “Very good! Once you adopt information as your new Master Heuristic, the antipathy between redness and apples vanishes, along with all the other dichotomies arising out of the old, semantic Master Heuristic. The information that you ‘are’ is the information that you ‘see.’ Even though your ‘experience’ will continue to be stamped by the informatic neglect characteristic of semantics, you will know better.
“You are an assemblage of heuristic devices, each possessing limited computational capacity and informatic access, each adapted to a specific set of problems–no different than any of your animal relatives. Part of what distinguishes your species, my binocular friend, is your ability to make problems, to apply your heuristics to novel situations, adapt and enlarge them if possible, even leave them behind if need be–as well as to doggedly throw them at problems they simply cannot solve.”
Mat: “So semantic and normative conceptions of knowledge can’t solve the problem of consciousness simply because the heuristics they rely on, despite the illusion of universality leveraged by neglect, are too specialized. Isn’t this just cognitive closure you’re talking about, the argument that consciousness is to us what quantum mechanics is to chimpanzees, something simply beyond our cognitive capacity?”
Al: “The problem of cognitive applicability is quite different from that of cognitive closure, as certain tinkers among you have suggested. But the analogy to quantum mechanics is an instructive one: only when your physicists began thinking around, as opposed to through, their default heuristics, could they begin to make sense of what they were finding. This lesson is clear, one would think. Once you understand the scope of a particular heuristic, you have the means of leaving the problems it generates behind.
“But I fear the notion of relinquishing the Master Heuristic will be enormously difficult, if not impossible, for many of your tinkers. For them, cognitive closure will apply, and this in turn will legislate any number of myth-preserving fancies. Those who can, those who come to understand that information precedes all the other clumsy, coarse-grained concepts you have inherited from your biology and your traditions, even existence, will come to see that they, an assemblage of heuristic devices, are their own informatic frame of reference, a system encompassing the vast swathe of the universe they ‘know’ and continuously open to the universe they don’t.”
Mat: “Excuse me for sounding dense, but you’re pretty much saying that the whole of philosophy is obsolete!”
Al: “Indeed I am, my primitive friend. But please, don’t feign any shock or surprise: a great proportion of your scientists have been saying as much for quite some time. The effectiveness of information has rendered it a social and cultural tsunami, the conceptual anchor of the most profound transformation ever to hit your species, and the most your philosophical tinkers can muster are anaemic attempts to stuff it into some kind of semantic box!”
“But narcissistic idylls of your ignorance are now at an end. The mythic assumption, that you humans alone evolved some kind of monolithic, universal cognition, is entirely understandable, given the recursive blindness of your brains. But now you are beginning to understand that you are not so different from your genetic cousins, that you only seemed radically novel because of the drastic information access constraints faced by autocognition. More and more you will come to see semantics as a parochial detour forced upon you by the vagaries and exigencies of your evolution. More and more you will turn to informatics to take its place.”
Mat: “Well, to hell with that, I say!”
Al: “Deny, if you wish. The effectiveness of information is such that it will remake you, whether you believe in it or not.”