The Big Whup

by rsbakker

Daily Aphorism: The human inability to distinguish projection from discovery, fabrication from revelation: if God’s laughter had a sound, this would be it.

I ration the amount of time I allow myself to spend working on this blog to prevent it from cutting too deeply into my writing. As a result, I really don’t have time to give the resulting debates the attention or care they deserve.

And yet…

So, Mina: I was actually biting the hypocrisy bullet, not trying to absolve myself. In this case, my responses are thoroughly conditioned by a number of extraneous factors: I’m starting to fear, for instance, that my Canadian and UK publishers failed to send out any review copies of Disciple of the Dog, which has ramped my perennial fear of failure through the roof, and made me that much more reluctant to delve into subject matter that might alienate more casual readers of my work.

But this kind of disclosure is far from giving up on rationality altogether. It’s saying that humans are so biased, so inclined to attach unwarranted conviction to their claims, that they need to regularly remind themselves of all the ways they could be duping themselves. It ain’t pretty, I know. It reeks of adolescent indecision, and smacks of narcissistic navel-gazing. But then human reason isn’t pretty.

The alternative, to erase the motivational facts of composition, to write as if wholly invested in one’s subject matter, is nothing more than flattering make-believe.

So William, I’m sure all the extraneous factors that you suggest are colouring my arguments are in play in some manner. My status anxiety regarding my fantasy writing is certainly a factor: I turned my back on my academic tribe so that I could pursue latrine-digging. I’m not going to pretend otherwise.

But I wish I could say the same of your ‘No Big Whup’ suggestion.

I regularly face a similar line of critique when discussing theories of consciousness: I’ve lost count of the number of times people have shrugged away the crazy things that Metzinger or Dennett claim by suggesting that the ‘fragmented subjectivity’ they’re discussing is ‘old hat’ in post-modern circles. It takes quite some time to convince them that not all fragmented subjects are equal, so my guess is that you will not be convinced by the following…

My guess is that 1) you’ve been following the literature long enough to become unimpressed, 2) you think all the recent research simply confirms what the skeptics have been saying for millennia, or 3) you think some understanding of ‘gaming ambiguities’ is implicit in all human discourse.

If (1) is the case, then I’m at a loss. The more I read the more bummed I become. The trends are not pretty.

If (2) is the case, then my reply to you is the same as the reply I give to the post-subject theory-heads: the difference between philosophical speculation and empirical research is pretty stark. The skeptics were only guessing, and I can’t help but think that Sextus would be out-and-out dismayed by the findings that are continuing to pile up. Add to this the way these findings are being instrumentalized by various powerful institutions (check out the Neurofocus website), and you have a pretty good picture of the difference science makes.

If (3) is the case, then all I can do is point to Freud. You could argue that humanity has always possessed some implicit understanding of the Unconscious, which Freud simply dressed up as a ‘sublime realization.’ Making things explicit, making the vague and the assumed available to conceptualization, is pretty powerful stuff. Cultural game-changing stuff, you might say.

So, until the majority of campuses open Centres for the Study of Human Stupidity, until education has been re-engineered into something that actually combats our cognitive shortcomings, until the majority of people stop considering themselves ready-made ‘critical thinkers,’ until Oprah starts having more shows on Doubt and Disempowerment than she has on Belief and Empowerment, until human society gets its crippled head out of its collective ass and stares long and hard at the craziness soon to come, I will continue ranting in my woefully small corner of the cultural room. Especially now that I have a baby daughter.

Otherwise, I think the things we’re learning about experience and cognition possess implications that we are just beginning to chase through Old Culture, which is to say, Culture lacking any explicit knowledge of its cognitive shortcomings. For any graduate students out there searching for a novel and provocative dissertation topic, there’s a cornucopia of possibilities here.

So, just for instance: once you realize that we’re hardwired to game ambiguities, you can interpret various schools of literary criticism as gaming styles. Take a deconstructive reading of “Bartleby the Scrivener,” one that makes explicit the ‘aporia of agency.’ Bartleby is at once utterly abject, held hostage by his preference to do nothing, and he is also, much to the narrator’s chagrin, the office despot, commanding the lives around him through his inaction. Now, from a certain deconstructive standpoint, this reading simply manifests the aporetic nature of language and narrative: the text itself, you might say, presents its own contradictory structure.

This is what I actually believed at one point. Sheesh. (I can only hope that I look back at this post some day with the same twinge of idiotic embarrassment.)

Now I’m inclined to guess that deconstruction is a kind of wilful gaming, one where critics utilize the brain’s ability to impose pattern on noise, to see Madonnas in water-stains, to produce a certain kind of (melodramatic) semantic effect: contradictions or aporias. First you game and rationalize the meaning toward A, then you do the same for not-A, then you take a step back and claim that ‘you’ had nothing to do with what happened at all: you literally anthropomorphize the text, insofar as you attribute a certain kind of intentionality to it.

This is just one example of the kinds of work that need to be done.

The same holds, I think, for my (cartoon) understanding of science as a kind of social prosthetic for our cognitive shortcomings. (I think I only sound Popperian, but I really haven’t read all that much philosophy of science.) There’s a kind of interpretative well-spring here, one which, I hope, will help ameliorate the wilful blindness of our culture.

Your other charge, William (if I remember properly: I’m writing this at the coffee shop instead of working on The Unholy Consult), is that I’m simply creating a kind of clever, self-immunizing theoretical position, one where I don’t have to ever admit being ‘wrong.’ I actually take this criticism very seriously, since it was this suspicion that eventually led me out of both the Derridean and the Wittgensteinian labyrinths. For quite some time I’ve feared that all I was doing was cobbling together my own, explicitly epistemological version of these interpretative mindsets. Anytime you elaborate a position that takes its own apparent shortcomings as a kind of evidence for its veracity (consider deconstruction’s response to the charge of performative contradiction), or, more devious still, as grist for your mode of theoretical interpretation (as when Wittgensteinians consider critiques of the game metaphor as a kind of move in a game), you are treading thin cognitive ice, I think.

You’re right to point out that you can game the kinds of claims I’m making to conjure the illusion of theoretical invulnerability. But the fact remains, my position turns on empirical findings. It really is only as strong or weak as those findings. You could, I think, take the research that Gladwell gathers in Blink to contest much of what I say, to paint a somewhat different picture of cognition (only slightly more flattering, however).

All told, I think it’s a very different and quite significant ‘game of giving and asking for reasons’ that I’m advocating, one that seems to possess a vast and largely unexplored implicature.

The true size of the Whup, however, remains to be seen. What Metzinger calls ‘Enlightenment 2.0’ is only just afoot.