Three Pound Brain

No bells, just whistling in the dark…


“Reinstalling Eden”

by rsbakker

So several months back I was going through my daily blog roll and I noticed that Eric Schwitzgebel, a well-known skeptic and philosopher of mind, had posted a small fictional piece on The Splintered Mind dealing with the morality of creating artificial consciousnesses. Forget 3D printing: what happens when we develop the power to create ‘secondary worlds’ filled with sentient and sapient entities, our own DIY Matrices, in effect? I’m not sure why, but for some reason, a sequel to his story simply leapt into my head. Within 20 minutes or so I had broken one of the more serious of the Ten Blog Commandments: ‘Thou shalt not comment to excess.’ But, as is usually the case, my exhibitionism got the best of me, and I posted it, entirely prepared to apologize for my breach of e-etiquette if need be. As it turned out, Eric loved the piece, so much so he emailed me suggesting that we rewrite both parts for possible publication. Since looooonnng form fiction is my area of expertise I contacted a friend of mine, Karl Schroeder, asking him what kind of venue would be appropriate, and he suggested we pitch Nature – [cue heavenly choir] – which has a coveted page dedicated to short pieces of speculative fiction.

And lo, it came to pass – largely thanks to Eric, who looked after all the technical details, and who was able to cajole the subpersonal herd of cats I call my soul into actually completing something short for a change. The piece can be found here. And be warned that, henceforth, anyone who trips me up on some point of reason will be met with, “Oh yeeeah. Like, I’m published in Nature, maan.”

‘Cause as we all know, Nature rocks.

Russell Smith Shrugged

by rsbakker

Holidays are upon me. But an old time foil o’ mine, Russell Smith, has managed to put me into an old time mood with a piece in today’s Globe and Mail.

The topic, predictably, is genre versus literature. And the argument, predictably, is the standard ‘argument’ given by those at the high end of any cultural authority gradient: Those on the bottom have no reason to bellyache because there is no bottom, the implication being, of course, that really, when all is said and done, ‘they’re just jealous.’

Smith is confused by what, for him, amounts to a mythical injustice. “Every day,” he writes, “I read angry emails and posts from sci-fi writers complaining about the terrible snobbery and irrelevance of the literary establishment which still doesn’t give major awards to the speculative or fantastical, or give it enough review space in the books pages of newspapers.” Now group-specific dissatisfaction of any sort always begs for some kind of consideration of motivations. But Smith glides over this question, perhaps realizing the trickiness that awaits. Implying ‘They’re just jealous!’ is one thing, but actually writing as much would place him in some uncomfortable company. So he simply declares that he has never heard anyone in his ingroup explicitly dismiss genre–as if only those who explicitly embrace bigotry can be bigots. And as if he and his cohort don’t regularly deride the ignorant masses via their ignorant tastes. The guy doubles as a fashion columnist, after all.

Because make no mistake, Russell Smith is a cultural bigot through and through–and of the worst kind, in fact. He is a status quo apologist convinced he has nothing to apologize for, who feels hurt and bewildered and quite frankly, annoyed, by the deluge of small-minded belly-aching he has to listen to. And since he belongs to an ingroup that self-identifies as ‘critical’ and ‘open’–namely, as all those things each and every ingroup is not–he simply assumes that he has to be right. His is the enlightened institution. There’s no need to ask the motivation question, no need to consider the possibility that the perception of cultural inequity is all that cultural inequity amounts to (even though it is the case that only writers who primarily self-identify as ‘literary’ win the awards and the funding).

To get a sense of how bad his argument is, consider:

There is a paradox at the heart of these complaints: They proclaim the artificiality of genre divisions while simultaneously demanding respect for a specific one. Are we to abolish genres or privilege one? Either you want a level playing field or you don’t.

Sound appealing? Sensible? Well, let’s spice up the stakes a bit, see if it doesn’t sound more familiar:

There is a paradox at the heart of these complaints: They proclaim the artificiality of racial divisions while simultaneously demanding respect for a specific one. Are we to abolish races or privilege one? Either you want a level playing field or you don’t.

He doesn’t get it because he has no need to get it, because he belongs to what remains, in far too many cultural corners, the privileged ingroup. “Why does this argument even need to be made?” he writes. After all, so very many literary novels contain magical or surreal or futuristic elements, such as “Invisible Cities by Italo Calvino, The Tin Drum by Günter Grass, and Beloved by Toni Morrison.”

Apparently his people love our stuff when his people write about it.

Wake up. It’s about power, idiot, not the statistical distribution of tropes. It’s about who has it, and who don’t.

Unfortunately for us, mainstream literature is not nearly as irrelevant as it should be. It remains a fat, greasy parasite that continues to feed on far too much talent, continues to convince far too many bright and sensitive souls to turn their backs on their greater culture (in what is, without any doubt, the most momentous epoch in human history) all in the name of accumulating ingroup prestige within a socially obsolescent institution. All writers are post-posterity writers, nowadays, and if they truly want to walk their egalitarian, prosocial talk, then they need to reach out with their writing, self-consciously game the preference parsing algorithms that increasingly command our horizons of cultural exposure. In other words, they need to do the very opposite of what conservative apologists like Smith continually urge, which is to bury their heads in sand at the bottom of the hourglass.

“These category questions,” Smith writes, “are marketing ones, not literary ones.” Once upon a time, maybe, but certainly not anymore. If literature is as literature does, then what we call ‘literary’ today does precious little that can be called literary–thanks to marketing. The outgroup philistines that literary writers pretend to ‘challenge,’ let alone ‘illuminate,’ no longer stumble into their books, leaving only classroom captives to complete the literary circuit (with dead or moribund authors, no less). Literature describes a certain, transformative connection between writers and readers, and marketing just happens to be all about connecting buyers with sellers. Given that confirmation is the primary engine of consumer demand, literature is simply writing that jumps the tracks, that somehow, someway, finds itself in the wrong hands. The rest, as DFW would say, is fucking Entertainment. More apology.

The future of literature in the age of information technology lies with genre, plain and simple, with writers possessing the wherewithal to turn their backs on apologetic apparatchiks like Smith, and actually contribute to building the critical popular culture we will need to survive the strange, strange days ahead. The alternative–Smith’s alternative–is to preach to the choir, apologize and reinforce, cater to expectations–do all the things that ‘sellouts’ do–then to endlessly declare yourself a missionary of transformation. Quite a scam, I would say.

Wake-up Call

by rsbakker

Aphorism of the Day: If I have smelled farther than others, it is because I have shoved my head up the asses of giants.


Take it for what it’s worth. I’ve been camped on the outskirts of Golgotterath for a while now, and it gets hard, sometimes, keeping things distinct, sorting the theoretical moods from the narrative, deciding what’s besieging what, and who’s storming whom. Besides, I find it plumb exhausting not pissing people off.

So apparently someone posted a link to my previous post on Hagglund’s Facebook page, where it lingered a bit before mysteriously disappearing. I certainly understand the impulse, but for whatever reason explicit acts of hypocrisy rot my soul. I just finished reading an entire book by the guy extolling exposure, so I gotta call it. Just what kind of exposure was he extolling? The flattering kind? The self-promotional kind? Or (what amounts to the same) just the kind that keeps the ugly, dishevelled, and uncredentialed at the door?

I know the fears, I suppose. Academic politics, as ‘Sayre’s Law’ has it, are so vicious because the stakes are so low. Rumour and reputation are the coin of the realm when you profess for a living–aside, that is (ahem), from a steady paycheck, summers off, and the obedience of gullible undergrads. The circles are small enough that you always need to consider who might be listening–especially if you’re fool enough to entertain ambition. Everyone is careful to be careful, urgent to be urbane. Ask yourself, is anything more insane than the ‘academic tone’? You pour your thoughts into a sieve, and you shake and shake and shake, not to gather the kernels of genuine individuality, but the chaff, the maximally processed flour, whatever your ingroup peers can use to bake their maximally tasteless bread. Panning for dirt, the way it has to be when you make any bureaucratized institution your yardstick of value and success. Extolling originality only when it’s dead.

Everything alive is safer that way. Dead.

And agreement is so much more agreeable. A degree. A library. An attitude. A skin. A religion. Want to know how much you really ‘appreciate difference’? Just look at the vocabularies of your friends.

I ain’t no different.

But it’s worth marvelling all the same. The hypocrisy, enough to make a fundamentalist Christian blush. Who would have imagined that the academic humanities, in the course of ceaselessly generating more graduates than jobs, would succeed in casting a para-academic shadow more substantial than themselves? Because this, my friends, seems to be precisely what’s happening. I know there’s people out there who feel this way. Plenty. More importantly, I know there’s people out there with organizational skills who feel this way. A strategic handful. I ain’t that person. I’m just a fucking windbag, but I will assist you if I can. They may have the paychecks, but we have the pots and pans…

Okay, I’m not sure what that means, exactly, except that we now have the capacity to be loud in ways they no longer dare. Too much training. Too much droning before audiences both living and legal. And certainly too much striving, toiling, labouring to secure what our disenfranchised numbers have transformed into a rare earth metal. Too much market share to risk risk. You shrink once you attain what you covet. Worse yet, you set out to make good on all that you have sacrificed. All those norms you had to imbibe, they replace you sip by odourless sip, until you begin sweating colour, inhaling white oblivion–until meticulous grooming becomes second nature. You walk across your campus faerie-land, and you walk and you walk until the day comes when you feel more entitled than astounded. You pull your edges into defensive circles. And you talk and you talk, until your voice feels like an ancient and indestructible boot. Your erudition fades into a pastime. Your relevance escapes you. You fuck anything that lets you. Your faculty photo becomes another orthopedic insert. Laziness becomes indistinguishable from insight, so you begin to promise relief, like any other over-the-counter medication. You peer at your eyebrows in the mirror, thinking, Hmmmm

Of course we’re more ‘real.’ Our failure (your success) keeps us hungry. Our hunger (your fat) keeps us distinct, mindful of what once mattered.

Eager for overthrow… or at the very least some bell to signal morning.

Because the frontdesk has forgotten.

Neither Separate, Nor Equal

by rsbakker

Aphorism of the Day: Some argue against yesterday. Some argue against tomorrow. But everyone kisses ass when it comes to today.


‘Continuity bias’ is a term I coined years back to explain how it could be that so many people could remain so unaware of the kinds of fundamental upheaval that are about to engulf human civilization. I sit with my three-year-old daughter watching little robots riding bicycles, walking tightropes, doing dance routines and so on, thinking how when I was her age the world was electrified by the first handheld calculators. So I ask myself, with more than a little apprehension, I assure you, What can my daughter expect?

The only remotely plausible answer to this question is almost entirely empty, and yet all the more consequential for it: What can my daughter expect? Something radically different than this…

Something fundamentally discontinuous.

To crib concepts used by Reinhart Koselleck to characterize Neuzeit, or modernity, we are living in an age where our ‘horizon of expectation’ has all but collapsed into our ‘space of experience.’ My daughter will live through an age when the traditional verities of human experience will likely be entirely discredited by neuroscientific fact, and where the complexities and capacities of our machines will almost certainly outrun our own complexities and capacities. And this, as much as anything else, is the reason why I find any kind of principled defence of traditionalism at once poignant and alarming: poignant because I too belonged to that tradition and I too mourn its imminent passing, and alarming because it does not bode well when the change at issue is so fundamental that the very institutions charged with critiquing the tradition are now scrambling to rationalize its defence.

So it was I found myself shaking my head while reading Jason Bartulis’s recent defence of nooconservatism. I decided to write on it because of the way it exemplifies what I’ve been calling the ‘separate-but-equal strategy’ and how it tends to devolve into question-begging and special pleading. But since head-shaking whilst reading is never a good sign, I encourage people to challenge my interpretation, particularly if you find Bartulis’s position appealing. Maybe I am overlooking something. Against all reason, thousands of people are now reading these posts, more than enough for me to become sensitive to the consequences of any oversights on my part.

Bartulis summarizes his position thus:

I’ve been arguing …. that engineering questions can only be answered in engineering terms. Conversely, I’ve tracked the infelicities attending the importation of the explanatory vocabulary of the natural sciences into human sciences to demonstrate why engineering explanations can’t work as explanations to normative questions. Thinking they can is one way of committing, not the Intentional, but the Naturalistic Fallacy in (literary) epistemology and in the philosophy of mind that subtends most attempts to make cognition a category for literary and cultural analysis.

Now since I once defended a position similar to this, I understand the straightforward (if opportunistic) nature of its appeal: ‘Your cognition has its yardsticks, my cognition has mine, therefore keep your yardstick away from my cognition.’ But it really is a peculiar argument, if you think about it. For instance, it’s a given that functional explanations and intentional explanations are conceptually incommensurable. This has been part of the problem all along. And yet Bartulis (like Zizek, only less dramatically) has convinced himself that this problem is itself the solution.

Bartulis is arguing that because the functional and the intentional are incommensurable, the traditional intentional discursive domain is secure. Why? Because once you acknowledge the cognitive autonomy of intentional discourse, you can label any functional explanatory incursion into that discourse’s domain as ‘fallacious,’ a version of G. E. Moore’s ‘Naturalistic Fallacy,’ to be precise. A kind of ‘category mistake.’ And why should we acknowledge the cognitive autonomy of intentional discourse? Well, because only it can cognize its domain. As he puts it:

My point, of course, is an anti-reductionist one. No amount of mapping of which synaptic vectors alight when can explain why I think that I should interpret a passage (or character, or author) one way rather than another. Nor can visual mapping, in and of itself, explain what I mean to do by interpreting a passage one way rather than another. And that’s because neither normative significance nor meaning is something that synapses, simply, have, and so normative significance and meaning aren’t things that we can, simply, see. Stating the position a bit more carefully: at least in the case of human perception—say, listening to a work of art or, more ordinarily, conversing with a familiar foe—there certainly are cases when normative significance and meaning can be seen and heard straightaway. Moreover, there are interpretive contexts when would-be explainers immediately perceive, and so can intelligibly claim to know, that a given subject is herself immediately perceiving the meaning of some object. But our best account of those instances proceeds…by placing those instances in the space of reasons.

Here we can clearly see how the separate but equal strategy requires that the nooconservative make a virtue out of ignorance and the failure of imagination. I could pick this passage apart phrase by phrase, fault Bartulis for cherry-picking neurofunctional elements that rhetorically jar with traditional conceits (as opposed to ‘tracking infelicities’), or I could take him at his word, and devise the very interpretations that he finds unimaginable, argue–along lines at least as plausible as his own–that ‘normative significance’ is something that only neurofunctional accounts will allow us to cognize. Why, for instance, should the subpersonal prove any less appropriate than the psychoanalytic?

But all I really need to do is invoke what I’ve called the Big Fat Pessimistic Induction: Given that, throughout its historical metastasis, science (and functional explanation) has utterly revolutionized every discursive domain it has colonized, why should we presume the soul will prove to be any different? What plucks us from the sum of natural explanation, and so guarantees the cognitive autonomy of your tradition?

The fact that Bartulis needs to recognize is that these are questions that only science can decisively answer. The only way we have of knowing whether the brain sciences will revolutionize the humanities is to wait and see whether the brain sciences will revolutionize the humanities. He and innumerable other traditionalists will float claim after territorial claim only to watch them vanish over the cataract of academic fashion, while the sciences of the brain will continue their grim and inexorable march, leveraging techniques and technologies that will command public and commercial investment, not to mention utterly remake the ‘human.’ Once again, it’s a given that functional explanations and intentional explanations are conceptually incommensurable. This is a big part of the problem. The other part lies in the power of functional explanations, the fact that they, unlike the ‘dramatic idiom’ of intentionality, actually allow us to radically remake the natural world–of which we happen to be a part. The sad fact is that Bartulis and his ilk are institutionally overmatched, that the contest was never equal, but only appeared so, simply because the complexities of the brain afforded their particular prescientific discourse a prolonged reprieve from the consequences of scientific inquiry.

“How uncanny,” Bartulis writes of those bemoaning scientific literacy in the humanities, “to find the language of change, force, and progress surfacing in an intellectual domain whose defining critical gesture, for better or worse, have involved critiques of those very terms as they operate in liberal discourse and other Enlightenment ideologies.” But this is simply a canard. He thinks he’s rapping critical knuckles–‘You should know better!’–when in point of fact he’s underscoring his own ignorance. Personally, I think science will cut our collective throats (using, of course, enough anaesthesia to confound the event with bliss and transcendence). Science builds, complicates, empowers, no matter what one thinks of Old Enlightenment ideologies. And the fact that it does so blindly does more to impugn his nooconservative stance than support it.

So, to return to the quote above: Yes, it is the case that we often, in those instances, enjoy the ‘feeling of knowing.’ But we now know the feeling itself is an indicator of nothing (fools, after all, have their convictions). We also now know that deliberative metacognition is severely limited: veridical auto-theorization is clearly not something our brains evolved to do. And we have no clue whatsoever whether ‘our best account of those instances proceeds by placing those instances in the space of reasons.’

‘But you’re arguing in the space of reasons now!’ Bartulis would almost certainly cry, assuming that I necessarily mean what he means when I use concepts like ‘use’ (even though I do not).

To which, I need only shrug and say, ‘It’s a long shot, but you could be right.’

I wanna believe, but traditions and their centrisms generally don’t fare that well once science jams its psychopathic foot in the door.

The Anxiety of Irrelevance (an Open Letter to Arthur Krystal)

by rsbakker

Dear Mr. Krystal,

It’s invisible to you and most all of your peers. In engineering they actually have a name for it: ‘unk-unk,’ an abbreviation of the ‘unknown-unknowns’ of Donald Rumsfeld fame. Not only can you not see, you cannot see that you cannot see. Ignorance is bliss because it is absolute. No balm is so numbing, so effortlessly destructive, as not knowing.

But you’re beginning to feel it nonetheless, aren’t you? The anxiety…

The literati are drifting into the same fog that swallows all failed cultural programmes. The siren call of all those professors has grown thin and insecure, and the bright and cunning youth are turning to different pursuits. Your first mistake, when all is said and done, is the human one. It’s the mistake we all make all the time. It’s the engine of the evening news: Groupishness. We are walking, talking yardsticks, us humans, condemned by nature to condemn according to those values we find most handy–our own. You think of yourself as a ‘critical thinker,’ not realizing there is no such thing, just varying degrees of bigotry. The proof of this lies in the thoughtless way you place your kind at the top of your authority gradient, the same as those who place you on the bottom.

With ‘unk-unk,’ comes ‘me-me-we.’

Your second mistake is to defend traditional communicative values in the middle of the most profound communications revolution in the history of the human race, to be an aesthetic conservative at a time when nothing could be more preposterous. Everything about your cultural habitat is changing, leveraged by accelerating technologies that will lead us who knows where in a matter of decades. Look at the sales! you cry. The galas! The prizes! forgetting that the best fed animals are those found in a zoo.

Your third mistake is to think literature is a thing, and not just any thing, the thing that you happen to know, command and possess. Thinking this, you assume that literature, the fiction that transcends ‘mere fiction,’ can be defined by relations of resemblance. If it looks like a duck, quacks like a duck…

Who gives a damn about clipped wings!

But literature has always been an event. This is how it transcends: by refusing to be a fixed thing. Literature ‘looks like’ nothing. Only its consequences can define it. Literature moves. Literature challenges and transforms. Literature turns souls around, and sends them in different directions. And this, Mr. Krystal, is what makes me pity you and your cloistered circle–your ilk. Because all you can do is rehearse the postures you learned in school, regurgitate a fixed form packaged for effortless consumption by a dedicated audience: the canned intellectual buzzes, the warmed-over wisdom, the indispensable rule-breaking.

Oh yes… And the interviews. All that faux-transformative ambition.

Literature, meanwhile, has done what it’s always done, moved on, leaving your ‘masterpieces’ to waddle content, overfed and flightless, appreciated by mobs of like-minded souls, people who share your voting record to a tee, who nod and cluck in approval whenever you magically ‘challenge’ those who read no more than your titles and dismiss you laughing. Somehow, without quite understanding why, you have found yourself defending a genre against literature.

Because we, those who are never quite as significant as yourselves, see clearly how much damage you have done advocating, endlessly advocating, that the bright and the cunning turn their back on the cultural commons, on the ‘low’ subject matters that allow them to reach out across a balkanizing society. Everyone can recognize an aristocrat by their silk shirts, and a pretender by their ratty shoes.

Oh yes. You feel it. You feel it in the comment threads, in the ‘unmotivated’ vitriol that ‘shocks and dismays’ you, in the emptiness of claiming authority where no authority is conceded. Snark! you cry. Snark! never pausing to truly listen. You feel it, the anxiety of irrelevance, because you see it, how despite all the sales and galas and prizes, the polling tide continues to turn inexorably against you. The corporations! you cry, though they stack your books just as high. You wring your hands, blame the very culture you have created. By writing for yourselves, you’ve been writing for people like yourselves, the clones who wave the same flag of make-believe difference. By all means, you urge, write about them, the base and the witless and the disenfranchised, but do not dare write for them. Not if you want to be taken ‘seriously,’ my ingroup brother. No, write for us.

Meanwhile, those on the outside do what those on the outside always do when backs are turned against them: they turn their backs in turn, define themselves against. And vote accordingly.

Watch this election closely, Mr. Krystal. Watch closely, anxiously, and as you marvel that it could be so close, that so many of them could be so benighted, as you pass your fat and lazy judgment on the culture, the times, or whatever patsy your unconscious conceit renders, pause, Mr. Krystal, strip away the ancient vanities and plug yourself into the living American equation. Ask, for once, What have we been doing wrong?

Because you are losing our battle–that much is incontrovertible.


R. Scott Bakker

Tell Me Another One

by rsbakker

Aphorism of the Day: The taller the tale, the shorter the teller.


A couple of weeks ago The Boston Globe published a piece by Jonathan Gottschall, whose recent book, The Storytelling Animal: How Stories Make Us Human, had already made my woefully long list of books-I-must-pretend-to-read. “Until recently,” Gottschall writes, “we’ve only been able to guess about the actual psychological effects of fiction on individuals and society. But new research in psychology and broad-based literary analysis is finally taking questions about morality out of the realm of speculation.”

The New York Times also has a short piece on the research of Keith Oatley and Raymond Mar detailing the ways narrative not only accesses those parts of the brain—social and experiential—that light up when we actually experience what is described, but also seems to make us better at navigating the social complexities of everyday life.

Despite mountains of residual institutional animus, empirical research into things literary continues to grow in profile. Over the course of twenty years, Joseph Carroll has managed to bootstrap what was a heretical cult of science nerds into a full-blown intellectual movement. For me, all of this smacks of inevitability. Once the human brain became genuinely permeable to science, the obsolescence of the traditional discourses of the soul—the ‘Humanities’—was pretty much assured. Why? Simply because prescientific theoretical discourses always yield when science gains some purchase on their subject matter.

E. O. Wilson only sounds radical to the degree that you are Medieval.

Make no mistake, I was mightily impressed by post-structuralism and post-modernism back in the day. I had no fucking clue what that bespectacled, scarf-wearing twit at the front of the class was talking about, but I knew a powerful ingroup social status display when I saw one. I made it my mission to conquer all that French wankery, to master the ‘po-mo’ language game, and I did. Soon I was that obnoxious prick in the back who actually argued degrees of semantic promiscuity with the twit at the chalkboard.

But it didn’t take me long to burn through my enthusiasm. And now, when I find myself reading new stuff written in that old mould I always suffer a stab of pity—not so different, perhaps, than the one I feel upon hearing that another species of amphibian has gone extinct. The naive social constructivism. The preposterous faith in bald theoretical assertion. The woeful ignorance of some of the crazy and counterintuitive things that science, the Great Satan, has to say.

Don’t get me wrong. I’m not saying the integration of the sciences and the humanities is a good thing. Science is far too prone to level nuances and to provide psychologically indigestible facts for me to believe this. Only that it is inevitable, and that any discourse that fails to engage or incorporate the sciences of the soul is doomed to irrelevance and amphibian extinction.

Besides, the naturalization of a field of discourse only entails the death of a certain kind of theory and speculation. As certain questions are removed from “the realm of speculation” new ones arise, proliferate. The very foundation of interrogation moves. This is likely the most exciting time, intellectually speaking, for any wanker to be alive, the dawn of an Enlightenment that will make the previous one look as profound as a trip to Home Depot.

Gottschall, for instance, has an answer for one of the things that has consistently puzzled me about the fracas over my books. Why does fiction motivate so much moral defensiveness, the blithe willingness to pass summary judgment on the worth of an entire life in defence of a mere reading? According to Gottschall:

“Studies show that when we read nonfiction, we read with our shields up. We are critical and skeptical. But when we are absorbed in a story, we drop our intellectual guard. We are moved emotionally, and this seems to make us rubbery and easy to shape.”

As I suggested not so long ago, we seem to understand this at some instinctive level. As Alan Dershowitz likes to say, everyone is censorious somehow. Who is saying what to whom is something that we are exquisitely sensitive to: We are literally hardwired to wage and win communicative warfare, and morality, it seems, is our principal battleground.

Again, Gottschall writes:

“Since fiction’s earliest beginnings, morally repulsive behavior has been a great staple of the stories we tell. From the sickening sexual violence of “The Girl with the Dragon Tattoo,” to the deranged sadism of Shakespeare’s Titus Andronicus, to Oedipus stabbing his eyes out in disgust, to the horrors portrayed on TV shows like “Breaking Bad” and “CSI” — throughout time, the most popular stories have often featured the most unpleasant subject matter. Fiction’s obsession with filth and vice has led critics of different stripes to condemn plays, novels, comic books, and TV for corroding values and corrupting youth.”

Narratives make us nervous simply because they can be dangerous: Gottschall references, for instance, the way Birth of a Nation revived the KKK. But they also, he is quick to point out, tend to increase our overall capacity to empathize with others, and so reinforce what he calls “an ethic of decency that is deeper than politics.” The research he cites to support this case may seem impressive, but it’s important to realize that this is a nascent field, and that some warts are bound to come into focus sooner or later. One might ask, for instance, the degree to which that ability to empathize is group specific. Could it be that reading makes us more likely to demonize perceived outgroup competitors as well? (If anyone comes across any research along these lines be sure to pile in with links).

But what I find most interesting about the article is the pervasive role accorded to fantasy, not in the literary, but the cognitive sense. According to Gottschall, the vast majority of narratives not only depict morally structured worlds—ones where events mete out punishments and rewards according to the moral rectitude of the characters involved—they also tend to strengthen what some psychologists call the ‘just-world bias,’ the projection of one’s own moral scruples (particularly those involving victimization) onto the world…

Moral anthropomorphism.

And this, Gottschall argues, is a good thing. “[F]iction’s happy endings,” he writes, “seem to warp our sense of reality. They make us believe in a lie: that the world is more just than it actually is. But believing that lie has important effects for society — and it may even help explain why humans tell stories in the first place.”

I have this running ‘You-know-the-Semantic-Apocalypse-is-beginning-when…’ list, and at the top are instances like these, discoveries of deceptions we depend on, not only for personal, mental-health reasons, but for our social cohesion as well. Narratives may delude us, Gottschall is saying, but they delude us in the best way possible.

The evopsych explanation of the survival value of narrative probably predates the field of evolutionary psychology: narratives affirm ingroup identity and reinforce prevailing social norms, thus providing what Gottschall calls the ‘social glue’ that enabled our hunter-gatherer ancestors to survive. Perhaps, given the benefits of self-sacrifice and cooperation in times of scarcity, the promised ‘happy ending’ wasn’t nearly so far-fetched for our ancestors. Gottschall concludes his article with a study of his own, one suggesting that the traits most commonly associated with protagonists and antagonists line up rather neatly with the moral expectations of actual hunter-gatherer peoples. Narratives, on this account, provide a collective counterweight to the cognitive conceits and vanities that serve to communicate our genes at the individual level.

But whatever the evolutionary fable, the connection between narrative and the fantastic, not to mention the antithesis posed by the scientific worldview, is out-and-out striking. Narrative, according to Gottschall’s ‘social simulator account,’ is an organ of our moral instincts, a powerful and pervasive way to organize the world into judgments of right and wrong, punishment and reward. Its very nature imposes a psychological structure onto the utterly indifferent world of science. The Whirlwind doesn’t give a damn, but we do. And, when it comes to the cosmos, it seems we would much rather be hated than go unnoticed.

This happens to be something I’ve pondered quite a bit over the years: the idea of using the assumptive truth of nihilism as an informal metric for distinguishing different varieties of fiction. (I self-consciously explore this in Light, Time, and Gravity, where the idea is to stretch story so tight over recalcitrant facts that the fabric rips and death shines through). On this ‘sliding semantic scale,’ fantasy would represent the ‘maxing out’ of meaning, where the world (setting) is intentional, events (plot) are intentional, and people (characters) are intentional. Drain intentionality out of the world, and you have the story-telling form we moderns are perhaps most familiar with, narratives with meaningful people doing meaningful things. Drain intentionality out of events, and you have something that most of us would recognize as ‘literary,’ those ‘slice of life’ stories that typically leave us feeling pompous, mortal, and bummed by the ending. Drain intentionality out of the characters—abandon morality and value altogether—and you have something no one has attempted (yet): Even the most radical post-modern narratives kowtow to meaning in the end, an incipient (and insipid) humanism that falls out of their commitment to transcendental speculation (post-structuralism, social constructivism, etc.).

A few weeks back I finished reading Luciano Floridi’s wonderfully written Philosophy of Information, and I’ve been surprised how the first two introductory chapters, which I blew through, have remained stuck in the craw of my imagination. (For those of you into the wank, I heartily recommend you give it a read, if only because of the inevitability of the ‘Informatic Turn.’ Just think: If you start now, you will never need to race to keep up!) Even though it fairly bristles with brilliance, I personally found the book sad, largely because of the extreme lengths Floridi is forced to go in his attempt to defend a semantic account of information. At every turn, it seemed to me, the easiest thing to do would be to simply abandon the semantics and to just look at information in terms of systematic differences making systematic differences. The only reason I can say as much is simply because I think I might have found a means, not only of explaining semantics away, but of explaining why it seems impossible to circumvent—why, in other words, philosophers like Floridi have to heap rationalization upon ambiguity upon outright obscurity in order to accommodate it. I was hoping PI would show me a way out of the Void, and all I found was another indirect argument for it.

For some reason, reading Gottschall reminded me of this particular passage from the opening chapter:

“From Descartes to Kant, epistemology can be seen as a branch of information theory. Ultimately, its task is decrypting and deciphering the world, god’s message. From Galileo to Newton, the task was made easier by a theological background against which the message is guaranteed to make sense, at least in principle. So, whatever made Descartes’s god increasingly frail and ultimately killed it … Nietzsche was right to mourn its disappearance. Contemporary philosophy is founded on that loss, and on the irreplaceable absence of the great programmer of the game of Being.” (20)

As crazy as it sounds, fantasy is also founded on that loss. With Descartes, remember, it is God that assures the integrity of nature’s message. The world is a kind of communication. Of course, everything will ‘make sense,’ or ‘turn out for the best,’ because we are living a kind of story, one where punishment and reward will be dispensed according to the villainy or heroism of our role. The death of God, Nietzsche points out, forces us to abandon all such assurances, to acknowledge that the world makes no narrative, or moral, sense whatsoever.

And that those who insist that it does are probably living in a fantasy world…

Telling the tallest of tales.

Caution Flag

by rsbakker

Aphorism of the Day: The first thing to go when you turn your back on philosophy is your Ancient Greek. The next is your formal logic. Then you lose your ability to masturbate in good conscience, which tends to dwindle in direct proportion to your ability to read German.

So I’m still reading Kahneman’s Thinking Fast and Slow here and there between several other works. One of the things I’m enjoying about the book is the significance he attributes to what he calls (rather cumbersomely) WYSIATI – or ‘What You See Is All There Is.’

For years I’ve referred to it as the Invisibility of Ignorance, or the ‘unknown unknown’ (of Donald Rumsfeld fame), but lately I’ve started to call it ‘sufficiency.’ I’m also beginning to think it’s the most profound and pervasive cognitive illusion of them all.

Consider the following one sentence story about Johnny:

S1: Johnny went to the corner store, grabbed some milk, then came home to watch Bill Maher.

This is innocuous enough in isolation, until you begin packing in some modifiers:

S2: Johnny went to the corner store, stepped over the blood running into the aisle, grabbed some milk, then came home to smoke a joint and watch that idiot Bill Maher.

Pack in some more:

S3: Rather than take his medication, Johnny went to the corner store, shot the guy at the till in the face, stepped over the blood running into the aisle, grabbed some milk, then came home to smoke a joint and watch that liberal scumbag idiot Bill Maher with his neighbour’s corpse.

Oof. That Johnny’s fucking crazy, man.

The point here has nothing to do with ‘what really happened’ with Johnny. The extra modifiers are additions, not revelations. The lesson lies in the way each formulation strikes us as complete – or in other words, sufficient. This is one of those hard won nuggets of wisdom that most writers, I think, learn without realizing: reading fiction is pretty much a long-drawn-out exercise in sufficiency (WYSIATI). What readers don’t know about your story and your world literally does not exist for them, not even as an absence. Going back to Johnny, you can see this quite clearly: It wasn’t as if anything in the meaning of the prior sentences required anything whatsoever from the subsequent ones…

Well, not quite. S2, you might have noticed, contained an incongruous detail, ‘the blood running into the aisle,’ that pointed to the existence of something more, something crucial that had been left unsaid. Let’s call this a flag.

A flag is simply information that cuts against the invisibility of ignorance, a detail that explicitly begs other details. You might say that the key to effective writing lies in balancing sufficiency against ‘flag play.’ One of my biggest weaknesses as a young writer was to turn everything into a flag. I made the mistake of thinking the reader’s intrigue was directly proportional to the quantity of flags in my prose, not realizing that the fine line between narrative confusion and mystery was a much more complicated matter of balancing sufficiency against the strategic deployment of flags. Roger’s piece, I think, can be used as a case study in just how well it can be done.

Flags also help us understand the first problem I mentioned, the way novice writers often have difficulty trusting the sufficiency of their prose, and so think they need to exhaust scenes with detail that readers already assume, such as the fact that rooms have walls, homes have windows, and so on. The fact is, the apparent sufficiency of anything can always be flagged. All you have to do is ask the right questions, and what seems sufficient will suddenly sport gaping holes. This is why learning to write requires learning to anticipate the kinds of questions the bulk of your readers will be prone to ask, the kinds of things they may gloss while reading, but flag when reflecting on the story in retrospect.

This, by the way, explains why stories that strike some as pitch perfect will strike others as ridiculously flawed: different expectations means different flags means different estimations of sufficiency.

This also explains why criticism is such a delicate art, and why writers have to be exceedingly critical of the critiques they receive: since anything can be flagged, so much depends on the mindset of the reader. So many critiques you encounter as a writer turn on individual readers asking atypical questions. ‘Finding’ problems in a text is literally indistinguishable from ‘making’ problems for a text, so when you read looking for problems, you will invariably find them. Anything can be flagged. All you have to do is find the right question.

This also explains the ‘poisoning the well’ effect, the way simply broadcasting certain questions can have the effect of ruining the illusion of sufficiency (for as should be apparent, sufficiency is always an illusion) for other readers. You could say that fiction is like religion this way: it requires that some questions go unasked to maintain its sufficiency. In other words, ignorance underwrites narrative bliss as much as it does spiritual.

And this explains how it is that different books sort readers in different ways, and why so many people are inclined to judge the intelligence and character of other people on the basis of what they read: pretentious, stupid, what have you. You can tell as much about a person by the things they’re prone to find sufficient as you can by the things they’re prone to flag.

Moreover, since we seem to be hardwired to flag the other guy, we generally (mistakenly) assume that our judgments are sufficient. One of the things that makes Johnny crazy, you might assume, is the fact that he thinks S1 is an honest characterization of S3. We literally have systems in our brain dedicated to editing S3, the ugly truth of our character as others see it, into the streamlined and blameless S1, which then becomes the very gospel of sufficiency. Our memories are edited and rewritten. Our attention is prone to overlook potential flags, and cherry-pick anything that coheres with whatever ‘sufficiency hypothesis’ we happen to need.

There’s a reason you bristle every time your spouse flags something you do.

And things go deeper still. Wank deep.

You could say, for instance, that sufficiency lies at the heart of what Martin Heidegger and Jacques Derrida call the ‘Metaphysics of Presence,’ and that deconstruction, for example, is simply a regimented way to flag apparently sufficient texts.

You could also say the same about ‘essentialism,’ the philosophical whipping boy of pragmatism and contextualism more generally. Or Adorno’s ‘Identity Thinking.’

In fact, so much of contemporary philosophy and cultural critique can be seen as attempts to read S3 into S1, crazy into innocuous – raising all the same flag finding/making problems pertaining to reader critiques I mentioned above. What does wholesale cultural critique mean when it’s so bloody easy? All you have to do is begin inserting the right modifiers or asking the right questions.

And deeper still, you have science, whose claims we take as sufficient, often despite the best efforts of its flag-waving critics, primarily because nuclear explosions, cell phones, and octogenarian life-expectancies are so damn impressive.

Science, as it turns out, is the greatest flag machine in human history. Only those claims that survive its interpolative and interrogative digestive tract are taken as sufficient. And now, after centuries of development and refinement, it finally possesses the tools and techniques required to read the brain into S1, to show that innocuous Johnny, when viewed through the same lens that makes nuclear explosions, cell phones, and octogenarian life-expectancies possible, is in fact a crazy ass biomechanism. Just a more complicated version of his neighbour’s corpse.

A bundle of flags, pretending to be sufficient.

Freebasing Thaumazein

by rsbakker

Aphorism of the Day I: When arguing, I always try to meet people in the middle, knowing that there, at least, I will be left alone.

Aphorism of the Day II: Chase wonder through the mill of reason, and you find philosophy. Chase wonder through the mill of desire, and you find fantasy. Since desire always has its reasons, and since reason is never free of desire, there’s no having the one without somehow committing the other.


It was summer. One of those days when a nimbus of white frames all the windows and the breeze hisses through the screens. We lived in this little frame farmhouse not so far from the shore of Lake Erie, and far enough from any town or village that you could pass a day without hearing a car. I was sitting in the dining room – on a bean bag chair, I think. I was reading The Fellowship of the Ring. I was ten years old.

So that would make it about 1977.

I look up from the page.

The house is empty. The compressor on the fridge hums. Outside, the wind brushes the hair of the world–the maples out front and the giant willows out back and cornfields that square all creation.

I see the battered old couch in the living room. Branches waving no-no in the top corner of the adjacent window. Light smeared like wax across the paneling. Crumbs on the carpet.

And I feel something between static and vertigo climb into the interval that separates what I see from the fact of my seeing.

There’s no words for it really, except, maybe…

What is this?

Seeing? Breathing? Being?

How could this be?

Then I hear my mother calling out to my brother… I can’t remember what, only that it was out back, beneath the willows. And the horror threatening Middle-earth pulls my eyes back down to the narcotic lines on the page–drags me back in.

But I never forgot the moment: How could I when it would be the first of so very many? Even still, as a middle-aged man, though it often seems the glass has been scuffed to a fog. Sometimes daily.

What – (inhale) – the-fuck – (exhale) – is-going-on?

Plato called it thaumazein, wonder, and for him it was the origin of all philosophy. But sometimes I can’t help but think that philosophy begins precisely when we forget to wonder, or even worse, confuse it with the will to answer. Sometimes, when I consider all the things I think I know, that old feeling climbs into the interval yet again, and it seems so clear that I know nothing at all. The strange and the weird and the mad all become possible… Beautiful.

And the urge to write fantasy rules me once again. It’s pursuing this urge that I feel I write my best stuff, when the story drips from the tips of my fingers. And it’s the urge that makes me smile and nod when I recognize it in writing that belongs to others.

A Beastiary of Future Literatures

by rsbakker

With the collapse of mainstream literary fiction as a commercially viable genre in 2036 and its subsequent replacement with Algorithmic Sentimentalism, so-called ‘human literature’ became an entirely state and corporate funded activity. Freed from market considerations, writers could concentrate on accumulating the ingroup prestige required to secure so-called ‘non-reciprocal’ sponsors. In the wake of the new sciences, this precipitated an explosion of ‘genres,’ some self-consciously consolatory, others bent on exploring life in the wake of the so-called ‘Semantic Apocalypse,’ the scientific discrediting of meaning and morality that remains the most troubling consequence of the ongoing (and potentially never-ending) Information Enlightenment.

Amar Stevens, in his seminal Muse: The Exorcism of the Human, famously declared this the age of ‘Post-semanticism,’ where, as he puts it, “writers write with the knowledge that they write nothing” (7). He maps post-semantic literature according to its ‘meaning stance,’ the attitude it takes to the experience of meaning both in the text and the greater world, dividing it into four rough categories: 1) Nostalgic Prosemanticism, which he describes as “a paean to a world that never was” (38); 2) Revisionary Prosemanticism, which attempts “to forge new meaning, via forms of quasi-Nietzschean affirmation, out of the sciences of the soul” (122); 3) Melancholy Antisemanticism, which “embraces the death of meaning as an irredeemable loss” (243); and 4) Neonihilism, which he sees as “the gleeful production of novel semantic illusions via the findings of cognitive neuroscience” (381).

Stevens ends Muse with his famous declaration of the ‘death of literature’:

“For the sum of human history, storytelling, or ‘literature,’ has framed our identity, ordered our lives, and graced our pursuits with the veneer of transcendence. It seemed to be the baseline, the very ‘sea-level’ of what it meant to be human. But now that science has drained the black waters, we can see we have been stranded on lonely peaks all along, and that the wholesome family of meaning was little more than an assemblage of unrelated strangers. We were as ignorant of literature as you are ignorant of the monstrous complexities concealed by these words. Until now, all literature was confabulation, lies that we believed. Until now, we could enthral one another in good conscience. At last we can see there was never any such thing as ‘literature,’ realize that it was of a piece with the trick of perspective we once called the soul” (498).


Algorithmic Sentimentalism: Freely disseminated computer-generated fiction based on the neuronarrative feedback work of Dr. Hilary Kohl, designed to maximize the possibilities of product placement while engendering the ‘mean peak narrative response,’ or story-telling pleasure. Following the work of neurolinguist Pavol Berman, whose ‘Whole Syntax Theory’ is credited with transforming linguistics into a properly natural science, Kohl developed the imaging techniques that allowed her to isolate what she called Subexperiential Narrative Grammar (SNG), and so, like Berman before her, provided narratology with its scientific basis. “Once we were able to isolate the relevant activation architecture, the grammar and its permutations became clear as a road map,” she explained in a 2035 OWN interview. “Then it was simply a matter of imaging people while they read the story-telling greats, and deriving the algorithms needed to generate new heart-touching and gut-wrenching novels.”

In 2033, she founded the publishing startup Muse, releasing algorithmically produced novels for free and generating revenue through the sale of corporate product placements. Initial skepticism was swept away in 2034, when Imp, the story of a small boy surviving the tribulations of ‘social sanctioning’ in a Haitian public school, won the Pulitzer and the National Book Award, and was short-listed for the Man Booker. In 2040, Muse purchased Bertelsmann to become the largest publisher in the world.

In a recent Salon interview, Kohl claimed to have developed what she called Submorphic Adaptation Algorithms that could “eventually replace all literature, even the so-called avant-garde fringe.” In a rebuttal piece that appeared in the New York Times, she outraged academics by claiming “Shakespeare only seems deep because we can’t see past the skin of what is really going on, and what has been all along.”

Mundane Fantasy (aka, the ‘mundane fantastic’ or ‘nostalgic realism’ in academic circles): According to Stevens, the primary nostalgic prosemantic genre, the “vestigial remnant of what was once the monumental edifice of mainstream literary fiction” (39).

Technorealism: According to Stevens, the primary revisionary prosemantic genre, where the traditional narrative form remains as “something to be gamed and/or problematized” (Muse, 45) in the context of “imploding social realities” (46).

Neuroexperimentalism: Movement founded by Gregor Shin, which uses data-mining to isolate so-called ‘orthogonalities,’ a form of lexical and sentential ‘combinetrics’ that generate utterly novel semantic effects.

Impersonalism: A major literary school (commonly referred to as ‘It Lit’) centred around the work of Michel Grant (who famously claims to be the illegitimate son of the late Michel Houellebecq, even though DNA evidence has proved otherwise), which has divided into at least two distinct movements, Hard Impersonalism, where no intentional concepts are used whatsoever, and Soft Impersonalism, where only the so-called ‘Intentionalities of the Self’ are eschewed.

New Absurdism: A growing, melancholy anti-semantic movement inspired by the writing of Tuck Gingrich, noted for what Stevens calls, “its hysterical anti-realism.” Mira Gladwell calls it the ‘meta-meta’ – or ‘meta-squared’ – for the way it continually takes itself as its object of reference. In “One for One for One,” a small position piece published in The New Yorker, the famously reclusive Gingrich seems to argue (the text is notoriously opaque) that “meta-inclusionary satire” constitutes a form of communication that algorithmic generation can never properly duplicate. To date, neither Muse nor Penguin-Narratel have managed to disprove his claim. A related genre called Anthroplasticism has recently found an enthusiastic audience in literary science departments across eastern China and, ironically enough, the southern USA.

Extinctionism: The so-called ‘cryptic school’ thought by many to consist of algorithmic artifacts, given both the volume and the anonymity of the texts available. However, Sheila Siddique, the acclaimed author of Without, has recently claimed connection to the school, stating that the anonymity of authorship is crucial to the authenticity of the genre, which eschews all notions of agency.

The Pen is Mightier than the Word

by rsbakker

Aphorism of the Day: Of all the sad things in the world, none possess the poignant absurdity of self-described radicals defending the status quo. Like squeezing lemons over a razor cut: it’s just too stupid to be really painful.

Sorry guys, I know this is small fry stuff (compared to the latest fare), but… This is the newspaper I read.

Russell Smith, my favourite arts columnist at the Globe and Mail, has offered yet another defence of the literary status quo, this time on the societal value of “snarky snobs.”

We need snarky snobs, we love them, we look to them with interest even if we’re not going to slavishly follow their proclamations: We want at least to know what the snarky snob position is.

I actually agree with a great deal of what he says in this article, with a number of crucial provisos. So long as, for instance, you consider his ‘we’ in the above quote to be royal–or limited to his particular ingroup–I entirely concur. Russell Smith is nothing if not social display conscious: he also writes for the Fashion and Style section after all. Odds are, someone who’s keen on what people are wearing in Milan will also be keen on what people are reading in New York.

I also agree with his argument from analogy: literary critics are indeed like fountain pen geeks. They are a dedicated group of enthusiastic specialists who think their criteria are the criteria. Now if literature were just another commodity like fountain pens, one where the commodity virtues of reliability, ease-of-use, and stylishness (social display value) reigned supreme, then the argument would be a real zinger.

Of course, like an angry ex, he reaches for the tried and true buttons. He explicitly claims that those who rail against the ‘Gatekeepers’ are basically just jealous. He implies their ignorance and ingratitude at almost every turn. But, I can forgive him that. As a long time writer for the Globe and Mail, I imagine he finds spinning criticism into flattery almost effortless.

It’s his final statement that gets my goat: “I will continue to think the role of the educated critic is to pull the gates of art wide open.” Question-begging, anyone? The criticism is 1) that the ‘educated critic’ has actually forgotten what art is, that they use parochial ingroup yardsticks to measure the world; and 2) that they belong to a much larger societal apparatus that has monopolized several crucial institutional bottlenecks, perpetuating a set of values and exclusions that have a number of negative social and political consequences.

With reference to (1), what he should have said was that the present role of the educated critic is to pull the gates of what they think is cool wide open. I’ll grant that they know what they think is cool better than I do any day. But to say that their present role is to pull the gates of art open, well then, I think we need to debate the changing nature of art in the information age, because what you call ‘art’ looks an awful lot like upscale Entertainment to me…

Or worse, fountain pens.

With reference to (2), what he should have acknowledged was that the ‘educated critic’s’ role is institutional through and through. His failure to do so makes me think he isn’t all that, well… educated. The literary critic’s role is to discharge numerous institutional requirements and obligations, including securing ingroup prestige, conserving socio-institutional capital, reasserting identity claims, and the long, long list of dodgy things we all do all the bloody time. Since Russell seems to know next to nothing about the human animal, he takes the literary critic’s claims at face value. I imagine he thinks that self-righteousness and unconscious social agendas are something that only social conservatives suffer. But as Chomsky is quick to point out in his institutional critiques, the thing to remember is that slave-owners were generally nice people. Most everyone is generally nice. Most everyone ‘just’ wants to ‘help’ people. Yes, even God-fearing slave-owners had ‘good intentions.’

Parachute nice people into problematic institutions and no matter how innocuous their small sphere of activity seems, they simply become another cog in a socially pernicious machine. They assimilate their norms to the institution’s norms (toothless bitching is often one of them) because that’s what humans do in the quest for economic and social security. They also confuse agreement with intelligence, and so take pride in surrounding themselves with ‘intelligent’ people, those who only dare debate the details. And the next thing you know, they encounter agreement almost everywhere they turn, including in what they read. All those intelligent books!

Insofar as every formally organized institution in the history of the human race falls under this description, I’m not sure there’s much to debate. Even if you don’t agree with the specifics of my critique, I hope you can at least see that Russell is making a clear cut claim to institutional authority, saying that you should defer (with hip cynicism doubling as genuine skepticism, of course) to his definition of art. Thus the fountain pen analogy: no one has problems deferring to the authority of a fountain pen geek. By simply analogizing literary culture to the fountain pen industry, he implies that it is harmless and friendly. That the authority at stake is trivial.

And this drops us into the lap of the real dilemma: which is that literary culture has stopped asking what literature is supposed to do. Instead, it has circled its wagons around a family of historical resemblances and traditional exclusions, a covert literary essentialism. Never has the human communications environment changed so radically in such a short time, and yet the blithe, comforting, and privilege-conserving assumption is that the literary animal has no need to adapt to its crazy new habitat. There’s no need to look, let alone debate. The old modernist morphology need not worry about context to accomplish its goal. Which is… what? Alienate the larger outgroup community? Ornament itself with prizes and galas and trusts? Religiously avoid any baseline cultural appeal? Gratify the values and attitudes of its readership?

To make matters worse, literary culture has become ideologically defensive and conservative–as Russell typifies, I think. It seems to have lost the capacity to even honestly consider outgroup criticisms (‘they’re just a bunch of jealous manqués’), let alone be anything remotely approaching the ‘self-critical institution’ it pretends to be. Too many mortgages. Too many privileges. Too many smiles in the mirror.

So, Russell… you… look… marvelous