Ptolemaic Consciousness
by rsbakker
Aphorism of the Day: Fictions. I’m fine with that. Really, our only point of difference is our estimation of the threat. Since conscious experience accesses no information about its neurofunctional role, it always seems the only game in town. Our experience of something as robust as logical reasoning could find itself anywhere in the neural digestive tract and it would still feel like the mouth, like it comes first. This could be our version of the Ptolemaic perspectival trap. Consciousness has so little access to the ways it’s conditioned, it has to seem like the centre of a universe. The False Unconditioned. (In reply to tickli, 2012/06/13)
The comment boards have been eerily quiet of late, but the number of views has remained consistently strong, given the absence of any real posting. I wanted to leave Light, Time, and Gravity up for a while to solicit as much feedback as I could, but so far only a few hardy souls have weighed in with their opinions. My inclination is to think this is a bad sign, that far and away most of those reading the book think it sucks. If so, sound off! Otherwise, I urge people to link the book far and wide, especially on literary forums or blogs or what have you.
I need quorums! I especially need to know what kinds of defenses target readers will be inclined to resort to. I want LTG to be thoroughly weaponized, to be labyrinthine with traps. I want this to be an example of a new kind of literature, one that is self-consciously viral, that treats itself as a machine bent on parasitically compromising other machines…
And a lot of you guys are fucking scary smart. You should be nailing my balls to the wall!
So I wanted to follow my exhortation with a question: I know that hundreds, at least, have read the draft, so why the silence? Is it merely the prospect of touching my balls?
Otherwise, I wanted to get back to business as usual on TPB: more wank and self-loathing.
So I’ve been thinking a lot about the specifics of the trends I keep talking about in cognitive psychological and neuroscientific research. One of the advantages of pondering all these things as an amateur, I think, is the freedom it affords from the various institutional demands placed on professionals. I can write on and read about any damned thing I please. This means that I can maintain a certain wary distance, and so hopefully avoid the myopia that belongs to devoting oneself to knowledge of one thing down to the very bottom. The problem, of course, is that I’m bound to sound like a superficial amateur to anyone with professional expertise—as I should!
Because a superficial amateur is precisely what I am.
But, I’ve been having some success with my social and psychological guesses of late. So perhaps the time has come to hazard one more.
A while back I posted on what I thought was one of the most significant things to fall out of my peculiar take on consciousness: sufficiency. The idea is that consciousness is generally not privy to information pertaining to the limits of the information it receives, so that it almost always seems entirely ‘full’—sufficient—as a result. This is why, for example, ignorance is so crucial to certainty, and why belief-systems bent on orthodoxy and solidarity are so intent on policing that ignorance—why religions are so partial to having their own schools. The information consciousness does receive regarding these limits, I suggested, comes in the form of ‘flags.’
The problem is that these flags almost never pertain to the experience itself: we rarely experience the insufficiency of experience.
The trend I want to talk about is one that I’ve yet to encounter in the literature, but which is plainly visible throughout the discussions we’ve been having here, if you know what to look for. So regarding volition or the ‘feeling of willing,’ famed psychologist Daniel Wegner claims that far from initiating action (his troubling research shows quite clearly that it’s something we attach to actions after the fact), conscious will is “the somatic marker of personal authorship, an emotion that authenticates the action’s owner as the self” (The Illusion of Conscious Will, 327). Our gut brain initiates an action which our conscious brain subsequently perceives as something it authored. Since it possesses no information regarding its post hoc nature, it takes itself to be sufficient, and the action to be something it ‘authored’ beforehand, even when, as Wegner’s experiments show, it was not responsible for the action at all.
Regarding moral reasoning, we have Jonathan Haidt and his analogy of the elephant and the rider:
the rider acts as spokesman for the elephant, even though it doesn’t necessarily know what the elephant is really thinking. The rider is skilled at fabricating post hoc explanations for whatever the elephant has just done, and it is good at finding reasons for justifying whatever it is the elephant wants to do next. Once human beings developed language and began to use it to gossip about each other, it became extremely valuable for elephants to carry around on their backs a full-time public relations firm. (The Righteous Mind, 46)
Once again we are told that something we thought sufficient and originary, our capacity for moral reason, is in fact post hoc and secondary. Our judgments come first and our rationalizations come after. This accords quite well with Hugo Mercier and Dan Sperber’s Argumentative Theory of Reasoning, which takes the growing mountain of research regarding human ‘dysrationalia’ as evidence that human reasoning has more to do with post hoc social signalling than with anything properly epistemic. You don’t so much have reasons to believe as you have beliefs to rationalize for public consumption.
Then you have Michael Gazzaniga’s research on split-brain patients, which demonstrated the propensity of the left brain to confabulate explanatory narratives for actions engendered by the right:
Experiments on split-brain patients reveal how readily the left brain interpreter can make up stories and beliefs. In one experiment, for example, when the word walk was presented only to the right side of a patient’s brain, he got up and started walking. When he was asked why he did this, the left brain (where language is stored and where the word walk was not presented) quickly created a reason for the action: “I wanted to go get a Coke.” (The Ethical Brain)
The frightening upshot of this (and other research) is that all of our behavioural justifications are (possibly) post hoc, that we merely confabulate our ‘motives’ after the fact, using the same information available to outside observers along with the (confabulated) pretence of control.
And if this wasn’t bad enough, you have a growing body of evidence that raises wholesale questions about the sufficiency of experience, at least as it appears to attentional awareness. As Eric Schwitzgebel, following a whirlwind tour of the disastrous experimental track record of subjective reports of experience, writes:
Descartes thought, or is often portrayed thinking, that we know our own experience first and most directly and then infer from it to the external world. If that is right—if our judgments about the outside world, to be trustworthy, must be grounded in sound judgments about our experiences—then our epistemic situation is dire indeed. However, I see no reason to accept any such introspective foundationalism. Indeed, I suspect that the opposite is nearer the truth: Our judgements about the world tend to drive our judgments about our experiences. Properly so, since the former are the more secure. (Perplexities of Consciousness, 137)
Again, we find the selfsame lesson: that consciousness as it appears to attentional awareness lacks anything approaching the information required to make ‘sound judgments.’ Chronic, wholesale insufficiency. In all the above examples, what makes the findings so peculiar is the discovery that the conscious brain, lacking any real access to the gut brain, looks outside to generate interpretations and justifications regarding itself. And why not, when it has spent millions of years second-guessing its fellow brains? Why not use this history of adaptation, not to mention exteroceptive sensory systems hundreds of millions of years in the making, to come to grips with itself?
Look outside. Guess what’s within. Believe with absolute certainty.
So my guess amounts to this: that consciousness is perhaps best thought of as a kind of social interface—a neural version of a Facebook page—something the brain primarily evolved in response to the ever-increasing social complication of its environment. It is a jury-rigged add-on, bent on transmitting only the information needed to successfully reproduce and raise children to reproductive age. Since all the brains in a neural collective are equally blind to themselves, none of them has any way of ‘fact-checking,’ so the social consequences of interoceptive deceptions are nil. In fact, given the astronomical complexity of the brain, accurate self-tracking would be prohibitively expensive, both metabolically and otherwise. So our brains, quite simply, began telling each other lies, linguistically hacking one another to secure what they needed, individually and collectively. The informatic and evolutionary inevitability of sufficiency did the rest, stranding us with a hallucinatory soul.
What we call experience could very well be a by-product of this game of giving and asking for lies. Human consciousness is Ptolemaic consciousness, trapped by its parochial perspective into thinking it stands at the origin of all motion, that it is the centre of the intentional universe, for the simple want of information its evolution could not afford.
For my own part, I haven’t commented on LTG because I know that I am not “scary smart.” I have an inkling of what the book could mean (cf. my comment about the book as horror movie for unwary literary-types), but almost nothing to contribute in the way of making the book better. I want for it the same thing that it sounds like you want, for it to effectively be quicksand for any literary-types fool enough to wade in (that is the best part, that they will know it is quicksand and STILL wade in thinking they’ll find a way out).
But, the book isn’t for me. Don’t get me wrong, I loved it and expect I’ll be rereading it at least a few times over the coming years, but I’m one of the people that an earlier comment of mine said will survive your book (“those too dumb to know they are in any danger”). I don’t *really* think I’m stupid, but I know that I just don’t have the psychological, philosophical, literary, or theoretical background (or language) to help you hone LTG. I won’t be getting that way anytime soon either; the next few years of my life are already planned around math and computer science. In short, I’m merely a “fan boy” looking forward to the carnage, but still just an audience member.
Any comments of mine would be along the lines of “I didn’t quite get this” or “my eyes glazed over a little bit here” and I don’t want that stuff being taken into consideration. I want the really smart guys, the ones with a better grasp of the horror show you wrote, to help with that. I hope that they realize that by commenting and engaging they are helping put the finishing touches on Mjolnir.
You demur too much! But even if you were right, ‘misses’ are every bit as important as ‘hits’ when it comes to assessing the effectiveness of your writing vis-à-vis some audience. I want to minimize the ‘eye-glazing’ as much as possible.
“My inclination is to think this is a bad sign.”
Speaking for myself, I hate reading on a computer screen, and reading a book is a sort of activity I do at a different time and in a different room. So reading the book in this blog form is something I don’t like much and end up delaying.
If it was out and published I would have already ordered it and put it on top of my queue.
That said, I’m behind. There’s still a lot of material to cover and I’ll have to go back and read some comments that seemed interesting.
So I’d say you’re overflowing with material and it’s hard to keep up 😉 Couple of months ago every new post was a kind of follow-up to the same discussion, but now it’s really HARD to keep track of everything and participate.
And here I thought I was slowing down! I think I was just expecting the same comment deluge from earlier to be carried over into LTG… I find it interesting how the way claims are packaged can have such a drastic impact on the way people respond to them.
I really liked the idea of the book and sections on it but felt the “wank sections” needed to be pared down.
Absolutely loved the reflections on Canadian identity though. That’s pretty original stuff that would get widely quoted if the book ever got popular IMO.
Do you recall which sections in particular irritated you, Stephan?
Not irritation, that doesn’t describe it at all. More a sense that momentum was being lost in sections that tended to run a bit too long a bit too frequently. One of the ways the wank worked best was when it felt like a cold, jarring slap to the face right as you got invested in the narrative. And light and fast works better for a slap than heavy and slow.
For example 107 works pretty well and so does 111, but the sections between get somewhat bogged down as the explanation gets denser. It throws off the pace and you lose some of the narrative flow right where we’ve been speeding up for the conclusion.
Wank sections made me drop the book from my immediate reading cycle, but the actual story was really interesting. Actually, even the wank was good until things veered more toward the “men are ever deceived” angle.
The ‘consciousness is just an illusion’ thing seems a bit masturbatory at this stage, no?
There’s something a bit odd about all this “literary quicksand” talk, like a need for disapproval akin to high school Sci cranking up Tool’s “Hooker with a Penis” just daring Dad/Mom to challenge my use of the sound system.
Starring in a porno isn’t a good revenge strategy against the literati’s lack of attention. ;-P
Agh…. I knew I shouldn’t have checked back at the comments. But I did, so here I am.
I don’t think the ‘wank’ needs to be pared down unless there are redundancies (or tautologies? I dunno what I am saying). From what I can gather from RSB’s previous comments, a LOT of it has already been scrapped. If anything, some of it might need to be (for lack of a better word) re-jiggered into somewhat more digestible chunks so as not to lose the casual readers (like myself) 3/4 of the way down a paragraph of philosophy or neuroscience jargon.
The story, as entertaining as it is, is merely a delivery vehicle. A non-genre tale of Canadian blah blah blah to get past casual dismissal of the literati. If they can’t bin it as merely genre, then they have to give it a more “critical” look and then BAM! they’re lost. As it creeps up on them that every tool they think they have for critically evaluating something is just a deeper delusion, that their entire arsenal is unsalvageable, that even the quiet assurance that we all feel (“but, it’s ME(!), I *really do* know the truth even if I can’t find the words to express it right now”) is delusional bullshit, what will they do with it then?
For guys like me, I can just go back to my math homework and chalk LTG up to a good read that I can perhaps eventually use to fuck with the English tutors I work with. For guys that actually take this stuff seriously, whose identities are wrapped up in things like “the life of the mind” (or whatever) and their ability to rack-and-stack literature, I hope they wade into this book, get their pants pulled down, and shuffle away feeling like it was their first night in prison.
I don’t see LTG as courting the disapproval of the cultural elite, I see it as showing them that the emperor has no clothes.
One thing I do intend on doing is dialling back on the neuro-jargon as much as possible. In my last reread, bumping into it was almost like stumbling on a corpse while hiking through the woods.
Only just found your blog a couple of weeks ago and have been enjoying the posts on consciousness, free will, etc., as it’s a pet layperson’s interest of mine. I’m not into literary fiction so I’ll pass on making any comments there.
I agree that on the whole, there are more post-hoc explanations occurring in awareness of one’s behaviour than most people would generally be comfortable with. However, I’d say that we’re not just Skinner automatons reacting to stimuli with a 1 second delay post-hoc explanatory algorithm tacked on.
We also have a general hypothesis-creating “program” which can be used to create theories of self based on self-observation/monitoring. Also, we have some form of interrupt call available that can then be utilised if the automaton-like portions of us begin to engage in a behaviour that higher level functions are not interested in.
The self-monitoring, hypothesis-creating, and interrupt-call portions of us can actually be trained, with the end result of reducing reactive automaton behaviour. I think you pretty much cover this in Prince of Nothing: specifically Kellhus (I haven’t read it for some years so I could be wrong, if so apologies).
I’m sure many have pointed out similarities between this and Buddhist theories of self (or rather non-self) and their practices. There’s a bunch of interesting philosopher types at the speculative non-buddhism blog who are currently engaged in some very entertaining and intellectually challenging bare-knuckled boxing around this stuff. The main dude’s (Glenn Wallis) approach is based on Laruelle’s concept of non-philosophy. They tend to drop lots of French theorists everywhere. If you don’t get there, you might find Thomas Metzinger interesting too.
Anyway, I haven’t read all the comments and all your blog entries so no doubt others have covered similar ground.
Thomas is an old bud of mine, and I know Susan Blackmore has written extensively on the Buddhist connection as well, but as I regularly scold lit-crit and continental types, the picture developing is not one that affirms their well-worn dogmas of fragmented subjectivity; insofar as it paints a picture where everything is broken, the whole of their conceptual discourse stands teetering. Like Buddhism, continental philosophy has numerous transcendental commitments which turn on some kind of affirmation of intentionality – the very thing dangling by a neural thread, IMHO.
This is anything but a ‘Skinner-box’ problem, I assure you. ‘Determinism’ is the least of our worries.
I find Laruelle very interesting, and am more than a little relieved to see that he’s helping people to forget Badiou. I have a spoof on non-philosophy called Rhapsophy in the Speculative Fragments section if you’re interested.
“For guys that actually take this stuff seriously, whose identities are wrapped up in things like “the life of the mind” (or whatever) and their ability to rack-and-stack literature, I hope they wade into this book, get their pants pulled down, and shuffle away feeling like it was their first night in prison.”
Why does it matter? Apart from their in-groups, does anyone care? I know a paramedic who is into the Diablo tie-in novels. I don’t even know how many lives he’s saved.
Another of my friends loves “Chicken Soup for The Soul” type books, she’s a principal for an inner city school. I don’t know how many kids she’s had an impact on over the years.
Parole office friend spends her free time watching trashy horror movies. Oncologist cousin who likes fantasy novels mixed in with Xbox fantasy games.
And so on.
People living real lives don’t care about the Scarf Wars. Just seems like a waste of time IMO, because at the end of the day it’s no different to me than the people who’d come into the comic shop and rank people on whether they like Watchmen more than the latest issue of X-men. Today I was enchanted by Virginia Woolf’s Orlando and the latest Batman comic, and what mattered was the pleasure I got out of reading both.
Better to just write what you love, and try to speak with a genuine voice about actual things in the world. If the story is only a cover for some secret anti-literati weapon, hopefully there’ll be a non-weaponized version for the rest of us?
Why is speaking with a genuine voice about actual things in the world important? Let’s not forget where exactly highbrow literature’s “write what you know” mantra of authenticity and realism has led us. Certainly not to the epic fantasy of Diablo tie-in novels, I can tell you that much.
But since this is the place for wank, I’ll go all in. Why is health, education, and the rule of law important? Ultimately, I think it comes down to an enculturated metaphysical belief in the sanctity, or at least the significance, of human life. It’s an arbitrary belief (since plenty of people to greater and lesser extents don’t feel this way) that only seems so indisputable as to be self-evident based on the strength of our convictions. Of course paramedics, teachers, and parole officers are important. How could they not be? Or if you’re a dyed-in-the-wool nihilist, of course they’re unimportant. How could you be so naive as to think they are?
Why does anything matter? It matters because people believe it matters. And people don’t always hold the same beliefs.
Will this be the fate of this book? Almost certainly, given that it is the fate of almost every book that seeks to shake things up.
Does this mean no one should bother writing anything critical? Does this mean that they should just ‘write what they love’? That’s the bullshit myth in a nutshell! Writing for yourself is writing for people like yourself is condemning yourself to be an apologist is simply being yet one more entertainer. The voices we’re most inclined to think ‘genuine’ are the ones that most agree with our assumptions. The things in the world we’re most inclined to think ‘actual’ are the things we have already decided exist.
Don’t you see that this is the ‘literary trick,’ what actually separates books that warm our necks from books that make us scowl and grope for trite dismissals? You don’t like the way I’m breaking the rules with this – that much is obvious, Sci. But surely you have better reasons than these blanket dismissals. Should not authors try to be critical? What separates the right way from the wrong way of doing this?
As for the scarf wars, as soon as governments and universities and endless mainstream cultural institutions begin throwing reams of capital (monetary and cultural) at the Watchmen to the exclusion of X-men then I’ll be inclined to agree with your analogy.
“So my guess amounts to this: that consciousness is perhaps best thought of as a kind of social interface—a neural version of a Facebook page—something the brain primarily evolved in response to the ever-increasing social complication of its environment.”
Bruce Hood would probably agree with this. In The Self Illusion he makes the case that the self evolved as a strategy to increase interdependence and thereby provide the survival advantage of strength in numbers. After all without a self there is no one to give a shit what anybody else thinks.
I think what you are asking for and what could potentially result from this book are highly compatible.
I’m not a writer, but if I were then I could see being somewhat offended, perhaps even a bit mystified, by the literary community’s reluctance to acknowledge genre fiction as works of art on the same level as “literary” fiction. Other writers (and I guess English dept. academics) would be my peer group, my professional colleagues, and their opinion of my work WOULD matter.
If the whole lot of them, after years of conditioning and indoctrination in a system that had grown increasingly esoteric and convoluted over the past “x” number of centuries, were convinced that the sky is purple, but I as a writer had found a way to convince them–in their own language–that the sky was actually blue, then wouldn’t that be worth the effort? If, as a result, writers could write whatever they wanted, whatever they loved, and not have to resign themselves to the likely consequence that many of their peers would dismiss their work out-of-hand, I think there would be even MORE fantastic things for all of us to read.
It’s good that you and your friends are above caring what others think about what they read, but writing isn’t their profession. If something were amiss about how they were judged as paramedics, or principals, or parole officers, then I suspect they would care a great deal about setting things right.
Hopefully LTG will convince some that the sky is actually blue, and writers can create the art they love without dealing with automatic judging from their peers, and we can all have a bunch of cool stories to read.
“If something were amiss about how they were judged as paramedics, or principals, or parole officers, then I suspect they would care a great deal about setting things right.”
But those things…well, they actually matter.
And people can already write cool things. Even if the literati are sneering, Grant Morrison just got his MBE membership card (http://www.comicsbeat.com/2012/06/16/grant-morrison-mbe/) 😛
Heck, people shit on Anne Carson and Mary Oliver for writing simplistic/sentimental poetry – just goes to show haters gonna hate.
If a person, given the opportunity, has refused to learn some cursory statistics, logic and economics, then that might be a valid reason to judge them on the grounds that their willful ignorance is inimical to the democratic process.
What books they read for fun, exempting something like child pornography or Mein Kampf, is a worthless criterion. And it seems to me everyone not fighting the Scarf-Wars recognizes this.
I think someone fighting the “Scarf-Wars” rummages around in exactly the same mud as everybody else, which might be one of the points LTG implies.
But people want to get out of the mud to feel reassured that they are not as dirty as all the other people. They are always trying – most of the time not consciously so – to upvalue their picture of themselves. (I used ‘they’ here, but don’t get me wrong. I’m not immune to this. Nobody is.)
“But those things…well, they actually matter.”
I suspect you will be unable to justify this based on anything but feelings, and feelings have no objectivity.
I think the refutation is rather simple actually:
“If we, citizens, do not support our artists, then we sacrifice our imagination on the altar of crude reality and we end up believing in nothing and having worthless dreams.”
-Yann Martel, Life of Pi
Martel is such a superficial wanker. Isn’t he really saying: “If YOU, citizens, do not support ME (and my narrow, self-serving conception of art), then YOU sacrifice your worth and become meaningless.” This is precisely the kind of ingroup self-glorification that incites outgroup populations to make a virtue out of being philistine, as well as convince new generations of cultural producers to turn their backs on the cultural commons, and to buy into a privileged social identity that simply fuels the cultural short-circuit. As soon as artists set themselves apart from the people that make them possible, they are no longer artists, but ingroup status seekers. As soon as you stop writing for others, you become part of the problem.
I think you might be too harsh on Martel. It’s hard to think of Life of Pi as a novel not meant for public consumption. It’s very much a feel good book from what I’ve read so far.
Philosophy is something I quite enjoy. I put it in the same place as reading fiction or listening to music. It’s entertainment. For me, nothing else matters and it couldn’t be otherwise. I mean that quite literally.
Read the wank…don’t really like that word and just decided not to use it anymore, at least not with LTG….soo….read the philosophical sections of book 1. My aim with this read was finding things I don’t agree with or that weren’t clear enough for me. This is what I found:
–
48
“The truly bizarre thing, however, is that language seems to agree, at least in terms of its logical structure. As a rule, it preserves truth-value across claims only so long as you refrain from conditioning them with operators like “believes that” and so on. As soon as you introduce these–‘propositional attitudes,’ they call them–language becomes something quite unruly from a logical point of view. It really does seem that linguistic reference, to function logically, must always be aimed outward, away from the performances that make it possible.
The act of referring, in other words, has to vanish in the course of the reference.
We must speak as gods to communicate as men.”
I agree with the conclusion, that we must speak as gods, probably because it makes one feel good to be more god-like. But why doesn’t language preserve truth-value across claims with operators like “believes that”?
Maybe I read this in a wrong way?
‘X believes that p’
Ergo: It is true that p.
This is obviously not a sound argument, but did you mean this? I don’t think this would make language become “unruly from a logical point of view”.
This also made me think of epistemological modal logic.
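(If I’m remembering my modal logic right, and this is just my guess at where the unruliness comes in: knowledge is usually given the truth axiom T, Kp → p, while belief is not, so Bp → p fails in the standard doxastic systems. Which would make my ‘Ergo’ above exactly the inference the logic refuses to license.)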
78
“Experience is a product of the brain. Of course, paradoxes abound here, conundrums so profound you might as well say we have no fucking clue what’s going on. Just for instance: if the world is all in our heads, and our heads only make sense as things in the world, doesn’t that mean our head is also in our head? And if we can never get outside of our heads, how do we know we have a ‘head’ at all?”
Here you make a fast step from ‘experience of the world is all in our head’ to ‘the world is all in our head’. The latter is something only an idealist would say.
…(1) the world is all in our heads,…
(2) our heads only make sense as things in the world,
Ergo: our head is also in our head
The first thing is that I don’t agree with (1), which makes this argument not very convincing for my part.
Second, there is a quite subtle error in this: In (1) you use ‘head[s]’ to mean ‘mind’, ‘soul’, whatever, unless you meant that the world is literally in a physical object, that is, our head. In (2) you use ‘head[s]’ as a physical object.
The argument falls apart when you substitute (1) with ‘the experience of the world is all in our head’. Then you could conclude: ‘The experience of our head is in our head’, which is perfectly fine and not paradoxical at all.
83
“Life is a paradox because we are impossible. We are impossible because we hold ourselves up by our own hair. We hold ourselves up by our own hair because what makes us possible does not exist for us. What makes us possible does not exist for us because of the information horizon of the thalamocortical system.”
‘We are impossible’ is used in a misleading way. I think the only thing that creates a paradox here is the word ‘impossible’. Fact is, we aren’t impossible. “We” just don’t exist in the way we thought we would, that is, at all. The step from non-existence to impossibility of existence is, again, too rushed.
So my version of this: “We hold the pictures of ourselves up by our own hair because what makes us exist in this way does not exist for us. What makes us exist in this way does not exist for us because of the information horizon of the thalamocortical system.”
–
Book 1, I found, worked really well, not only the philosophy sections. Don’t know if you would agree, but apart from those minor points above, I would leave book 1 as it is unless you want a drastic rewrite like you suggested before.
Dietl
This kind of detailed feedback is invaluable. I really don’t expect many people to agree with much of Dylan’s thinking – I actually have tried hard to restrict him to an older phase of my own thought on many issues (only embracing the pessimism) – I don’t agree with him either! It’s the issue of clarity I’m most interested in.
I think the problem you have with 78 is the same as 83: the way Dylan surreptitiously ‘swaps representational frames’ in his constructions. He would argue against your substitution of (1) by simply pointing out that the intentionality implicit in ‘experience of’ is the very thing targeted. You can only defuse his paradox by begging the question. Really, all he’s doing is giving a vernacular reformulation of the old ‘veil of representations’ problem – or what Floridi calls the one-dimensionality of experience.
(1) Our representation of the world exists within our heads.
(2) Our heads are another representation of the world.
/ Our representation of the world exists within our representation of the world.
Regarding 48, he’s over-generalizing the way descriptive claims seem to naturally assume the ‘view from nowhere’ for dramatic purposes. But there’s far more to natural languages than description, and as you point out, far more to description as well.
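If it helps, the textbook illustration of that unruliness runs like this (my gloss, not anything Dylan spells out): outside an attitude operator you can swap co-referring terms without disturbing truth-value, but not inside one:
(1) Lois believes that Superman can fly.
(2) Superman is Clark Kent.
/ Lois believes that Clark Kent can fly.
Both premises can be true while the conclusion is false, which I take to be the substitution failure he’s gesturing at when he says reference has to aim outward to behave logically.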
PS: Don’t know when I’ll have time for book 2. I expect this to be a bit harder.
Scott, don’t forget that it’s only been three weeks since you started posting the book. Apart from that, this isn’t easy stuff. People need time to think this through and maybe read it more often to effectively criticize it. I’m a pretty slow reader.
Yeah, it’s just dawning on me that I’m being impatient… But thanks for the halftime update, though!
Alright, well, I guess you got me.
If I’m picking up what you are putting down, then it seems to me that you are saying that professional writers shouldn’t care what their peers think about their work because what they do doesn’t really matter. Or, it doesn’t matter in the same way the paramedics, principals, and parole officers matter.
So since you don’t see why this war would seem important–from their own perspective–for a writer to engage in, because their work doesn’t really matter (not *really* matter, like important professions), they shouldn’t see it as worth fighting either. Instead, they should just write about stuff that is less important to them but more in line with what is really important, right? Not some silly scarf-war nonsense.
Nobody is talking about the democratic process being compromised by the idiocy of the unwashed masses. This is about writers battling it out for a redefinition of terms that are important to them. I’m not so sure that I see the importance of that fight either, but that is pretty clearly what a large chunk of LTG is about. I’m not going to tell the guy, “Hey, yeah I kinda get what you are trying to do here but let me tell you something: it really doesn’t matter that much. So… could you cut out all that wonky stuff. Yeah… that’d be great. Mmmmkay?”
Even though I don’t fully “get” what RSB is saying with LTG, I respect that the dude considers it his fight and is trying to do something about it. What I do “get” of it seems far from stale. Whether I think the fight is important or not doesn’t really matter.
It sounds like, for you, you made the right choice by switching to something else to read.
Well, I’m saying the judgement of craft doesn’t matter if that judgement in no way offers a means of improving the work.
It doesn’t matter if the work is important or not. I’m slogging through Kafka’s shorts (the complete collection at 5 pages minimum a day for 100 days) and I’m hard pressed to say any of it matters at the moment.
But it would be strange for Kafka to worry about people dismissing his work for its lack of coherence, when as an artist that wasn’t what he was going for.
OTOH, one good argument mentioned in Annalisa De Lippo’s treatment of comics (in the intro to her PhD work on Alan Moore) is that one cannot make a claim to “literature” without simultaneously making an actual effort to produce quality material.
A lot of genre works are, quite honestly, not comparable to “literature”. Speculative fiction includes the great Salman Rushdie, but it also includes a lot of works that lack that level of craft. This doesn’t make them bad books, in the same way that a JavaScript file is not a “bad program” compared to some AI project written in LISP.
If the humanities thrive on false dichotomies and ego-fellation via reading lists, then we as a society really need to evaluate the value of a college education given how many people are going into debt over it. This is a much more serious problem than this silly dick wagging contest. As my Math+Econ double major friend put it, it’s an issue where even at top universities one can get away with “Physics for Poets” but STEM majors can’t get away with Dr. Seuss to fulfill their English core req.
Creating disparate sets of hyperfocused graduates either lacking in cursory knowledge of statistics, logic, and economics on the one hand or history, culture, and composition on the other is hardly in anybody’s best interests. STEM majors leaving their programs barely able to communicate in writing because they went in thinking all English core reqs were as relevant to them as Dr Seuss is one of the most pressing problems that makes English department heads sweat.
I think some drift has happened here, so I’d like to try to get back to what I thought we were discussing.
My understanding, Sciborg, is that your position is that the wonky parts of LTG are largely unnecessary because they address an obscure war that is largely invisible outside of the literary community and English dept. bickering. A fight that hardly anyone, outside of the participants, cares about or feels is important, and therefore the participants themselves should just get over it and write whatever they feel like.
So where does that leave the writer? Are they somehow exempt from the competitive spirit, desire to excel, and sensitivity to the judgement of their peers that affect other people? Should the writers of genre cede the high ground because you think their battle doesn’t really matter? Perhaps, as you feel, writing literature isn’t as important as the meaningful work your friends perform. Does that mean we ask the writer to stop taking himself and his profession so seriously because *we* don’t see what the big deal is? Doesn’t that seem a bit condescending?
I don’t think they (the writers of genre) are asking that any formulaic sword & sorcery pulp (or thriller, or horror, or whatever) be automatically considered on the same level as “literary” fiction. I think what they want is that when something truly beautiful, or amazing, or otherwise great gets written, it isn’t reflexively relegated to an inferior status by their peers simply because it is genre.
You don’t think that matters, I get it. I’m not sure it really does to me either, but to the guy that wrote LTG it is clearly a big deal. I still have the impression that what you are really saying–after all the talk about what we did in Uni or whatever is done–is something to the effect of: I can clearly see that this subject is important to you, but I think it is silly and pointless and your book would be better if you just cut most of that stuff out.
I disagree. I may not be convinced of the importance of the “Scarf-war,” but it does make for fascinating reading. Also, the fight-spectator in me would really enjoy seeing some wounded literati egos once the fine tuning of this work is complete.
Now, I have spent much more time on these comments than I would like to and my shadowy Over-Andrew is starting to demand explanations, so I will have to cede to you any last-word points you make if you bother to reply.
For everyone else, keep up with the insightful comments. I’m very, very much looking forward to when this Death Star is fully operational.
Well, my reading is one reading but I think a “Death Star” work against the literati reads more like a cry for attention than a genuine critique. It’s like dissing the hot girl, supposedly because she’s X and Y, but really because she’s just not into you.
It really starts to detract from the actual narrative, which appears to me to be a valuable examination of class and Canada. That it’s done to bruise the egos of people who are just not very important, well, that seems to weigh down the narrative for no good reason at all.
“Perhaps, as you feel, writing literature isn’t as important as the meaningful work your friends perform.”
This isn’t what I said. My point is creating art should not be an exercise in appeasement. You definitely shouldn’t simultaneously dismiss and plead for accolades. I suppose that’s what makes me taste vomit in the back of my throat, that it’s not “Academia is Bad!” but rather “Academia isn’t paying attention to me!”
Again, one person’s reading of LTG, and I didn’t get all that far before dropping it.
Anywho, check out the book Tiepolo Pink by Calasso -> If Tiepolo had been solely interested in wowing his critics with his use of color and light, he likely wouldn’t have created the ghoulish, black and white Scherzi.
Not to mention Alan Moore’s work seems to be making its way into academia, and he did this by becoming delightfully, bug-fucking insane. ;-P
You’re starting to sound more and more like Fennel! Dylan’s point is that the arbitrary thing, the preposterous thing, is thinking that the ‘valuable examination of class in Canada’ is somehow transcendent, that the literary writer/reader, with his or her swarm of bigotries and chauvinisms, by virtue of bildungsroman bootstrapping and intellectual enlightenment, somehow stands outside this circuit of bigotries and chauvinisms, rather than being responsible for them. This is the thing he consistently refuses to do: to let this book become another book about ‘those poor people.’
But this is the way it always works when you have criticism running the wrong way (which is to say, up) on the authority gradient: the first, most telling and at the same time most effective dismissal is: “They’re just jealous.”
What Dylan is saying is pretty straightforward: “You failed your community.” The question, I suppose, is whether there’s any way for him to say this without running afoul of the TJJ response.
I appreciate that craft is necessary for literature, but you seem to be suggesting that it’s sufficient.
Otherwise, you have the bad habit of letting your rhetoric carry your argumentative load. I understand you want LTG to be ‘wrong-headed’ in some way, but aside from throwaway dismissals I have no idea what your argument is. “Silly dick-wagging contest,” for example, is something that Dylan would love. He would say, “Show me something that isn’t?” My guess is that you could lay out a bunch of ‘genuine voices’ about ‘actual things in the world’ and he would laugh even harder, say, “So you’re just another Christian after all!”
He would say, “Sounds like the only way you can handle me is warning others against grabbing hold!”
Put the book in a pdf format. Absolutely hated reading it like this. Even though it seemed interesting I quit.
My worry is that if I do, then there will be no real incentive for people to buy the thing when it does come out. Crass, I know… But I gotta feed the chil’en.
I don’t know what to say.
My productivity at work was annihilated by you posting LTG. I’m not qualified to try and tear it a new one from the point of view of literary merit or how well it challenges the establishment of the Canadian/American literati. The writing is clearly hostile, but I think you intended it that way. Dylan’s a guy ostensibly within the establishment, writing FOR the establishment, trying to challenge the establishment, but ultimately succumbing to the very ‘tropes’ that define literary fiction. The intro parodies how one of these scarf-wearing motherfuckers might react to the whole thing. The passage with alternating perspectives was a particularly interesting way to play on the literati’s obsession with non-standard forms.
The quality of the writing was, as usual, top notch.
Is that enough ego validation Bakky? Another thing you can do is just wait until you release TUC; it outsells your expectations, it builds your confidence, and then you do it.
On the other hand, if you want it torn to shreds, why don’t you just send that Docx guy a copy?
Like, I get it. The best artists are always nervous about their product etc. But like, it’s good. Plus, you already got paid for it, so if it gets ignored, fawk ’em.
“You see, I really am free to think whatever the hell I want, so long as I continue telling rip-roaring yarns.”
R.S. Bakker, ca. 2008
Fawk. Owich.
The check, as they say, is in the mortgage. Something like this will never make real money – but as an experiment in literary narrowcasting I want it to crack as many heads as possible…
If only to let their heads spill out so that I can crack those numbskulls in turn.
I’m sure you’ll agree.
For myself, the posting of the book has just come at a super busy time for me. I prefer to read an entire section in one go, so I’m struggling to find time to sit down with my smartphone and have at it. As I mentioned previously, a lot of the heavier wank goes well over my head, especially when some of the heavier jargon is used. My experience in academia ended with my bachelor’s degree in engineering, so while I understand in principle what you are going for, it doesn’t resonate with me as much as the story sections have so far. I’ll post further once I actually manage to finish.
If you don’t mind, litg, keep track of the sections that cross your eyes the most.
Aw man, now you want me to TAKE NOTES?? Sure, not a problem, I’ll start keeping track.
Follow-up question, would you prefer comments posted here and/or back in the manuscript posts, or is it easier for you to keep up with comments if I post in whatever the most recent TPB entry is?
Well I’ll let other cool cats talk about LTG because I only now heard about it and will have it for cognitive supper soon enough.
As for the ideas presented in your post, I would hazard that the process is not one way or the other, but a constant give and take. The way we are shaped in our early childhood – the neural patterns that define our reactions and ultimately how we define ourselves – is governed by our genes, which in turn shape our perception of the outside world as babies. That perception then changes us again internally, and then we change the world by perceiving it only through our own measures.
Genes shape world and give it a fixed pattern to mold. World shapes pattern. Pattern shapes world. The dance continues.
But all on a fixed basis of potential. Like intelligence: you can change, but within fixed limits.
In other words, each person is a whirling storm of little bits of information – constantly giving and taking, exchanging with other people, places and events. But we can only trade according to our basic structure – our brain and its patterns, and so can only accept and give bits after they have been molded to fit our perception.
It is one of the scariest things for some, liberating to others. Some would rather be shaped by the world, be based in what they consider objective reality. In truth, the outside world may only be the reality most of us agree on, the very basic things, up is up, down is down, cold is bad. The rest may just be us.
This won’t get read, but it helped me to write it 🙂
If you ever write a novel, Noah, start with that last line. My head hurts thinking it through!
Intuition and the way it plugs into received dogma has to be the crux of the problem. Central to that is ‘personhood,’ a conceptual system that your reply invokes at quite a few turns. This is the system that’s threatened with extinction. Perhaps something not unlike what you describe can be salvaged, where we have to settle for a reduced, more ephemeral and tangled notion, but the evidence does seem to be trending otherwise, sketching a picture where there’s no give or take because there really was nothing being given or taken all along.
“STEM majors leaving their programs barely able to communicate in writing because they went in thinking all English core reqs were as relevant to them as Dr Seuss is one of the most pressing problems that makes English department heads sweat.”
I’ve barely ever met anyone like this, and as part of the uni’s Math Club I think I had more than enough opportunity. I also had a stint in the theology department, where I ended up inducted into the theological honor society (you just have to not fail 6 classes haha) so I got a good picture of the other side.
This idea of the mumbling STEM major I think has more to do with caricatures than realities. I’ve met English majors who can write amazing things but can’t talk to the people around them, or are just flat out nuts (which is admittedly endearing when they are genuine creators rather than puffed up consumers).
But, on the chance there’s some pertinent data, the problem is utility and equivalence. If you can’t communicate, you don’t rise through the ranks. Managers of divisions have to be both good communicators AND programmers or risk managers or engineers, etc.
I worked under a Dante scholar for my junior year. I also took a Visual Communications class about graphical presentation, film, and advertising. The latter would be far more applicable to the job-seeker. Critical reading and writing, especially as it pertains directly to reports and analysis, should be part of the core.
The preservation of canon can largely be relegated to electives. Morally, I find it unseemly that people are asked to go into debt for an education that seems more interested in shoring up questionably useful coursework than working to provide a genuine return on investment.
Finally, I recall a friend of mine, a comic book artist, getting into a bit of an argument with two sneering literati. He finally got out of his chair and asked, “Do you want to take this outside?” -> They declined. Now was he thuggish, or did he cut through their delusional egos by reasserting some realpolitik?
I think neither, rather it shows that the size of one’s dick always depends on the ruler.
Scott, feel free to use that last sentence as a TUC epigraph…though I think you actually said something similar in an earlier discussion. 😛
-Sciborg 2.0, June 21 2012 (dating these for the inevitable collected edition)
As a STEM major myself, I agree with your statement regarding caricatures vs. reality. Caricatures exist in real life, of course. There are always extremes. I’ve met the stereotypical engineer incapable of communicating with anyone; in fact I’ve met more than one. But the most introverted, withdrawn person I know was an English major and now teaches it in high school (how, I honestly don’t know). And you’re correct that it’s the effective communicators that rise through the ranks at workplaces, regardless of their technical aptitude (and often in spite of a lack of it).
I suspect the caricature of the STEM major exists largely as a final defense as one realizes that majoring in basket weaving may not have been the best career move. Sadly I suspect these ego-fellated brats end up sullying the more legitimate concerns of the Occupy Movement, as without any economics training they come off as naive idiots.
Strangely enough, I was friends with some incredibly gifted English majors, and the ones capable of genuinely original work seemed pretty humble to me. Hell, the one who was showered by academic accolades organized our house’s weekly South Park viewing. She actually left that world to become a JAG lawyer. Another who was invited to varied poetry conferences for her work was the one obsessed with Project Runway and America’s Next Top Model.
Good points. I didn’t mean to imply any condescension towards English majors, merely to say that socially awkward people exist in all walks of life. And as for idiots, there are more than enough of those to spread around. My English major friend I referenced is quite intelligent, just supremely awkward socially.
I can’t help but see the picture very differently, since I’ve tutored my share of Science, Engineering, and Econ students struggling to make their way through their Arts and English reqs (including 4th years who’ve put them off for as long as possible, and in one case I’m not particularly proud of, a grad student). But then again my sample size is as small and inconclusive as yours. What I have heard in addition, though, are accounts of acrimonious intra-department wrangling at my local university over desperately needed initiatives to increase outside faculty enrollment and performance in English courses. A STEM graduate who fails to overcome personal difficulties with reading and writing is not an occasion for the English department to point fingers and say “ha ha I told you so,” at least not here. On the contrary, it’s a sign of failure in the education process that makes everybody involved look bad. If you can’t communicate well you might be unemployable or get stuck with shitty paying gruntwork despite your expensive degree, but post-secondary education isn’t some pull yourself up by the bootstraps fantasy of perfectly self reliant autodidacts who rise and fall based purely on their own merits. Educators bear responsibility too.
And on that note, tuition costs in general are way too high in North America, especially in the States. I don’t see how getting slapped with massive debt right out of school somehow becomes okay just because you graduated in a field that will let you work it off over years rather than decades.
But since you seem keen on dismissing higher education in the humanities wholesale on the grounds of utility and what really matters, then how far should we take that utilitarian pragmatism? Just how useful is string theory versus basket weaving? At least physical baskets have day to day utility, right? Plays and novels, even literary ones, improve the quality of life for countless enthusiasts. (It’s a lot harder to be a self satisfied scarf wearing hipster when you have nothing highbrow to be smug about.) Should all STEM research be gauged on the same grounds of improving quality of life as well? In the course of doing away with pointless artsy fartsy courses, should we also stop teaching introductory matrix algebra? After all, impressionable freshmen might be led to believe they have a future in the highly competitive yet only selectively useful and often not very well paying fields of particle physics or advanced mathematics. Isn’t shunting them all into Introduction to Accounting much more sensible?
Utilitarian thinking can go a long way in the sciences. Deep sea marine biology? That sounds pretty flaky. If you’re not down there to search for oil I don’t really see what the point might be. Gamma ray astronomy? Are you looking for ET or something? Stick to launching satellites. All that deep space gazing is totally useless. Nanoengineering? Until you build something that can have an earnings figure attached to it, stop wasting time and money carving tiny engravings of the flying spaghetti monster everywhere.
Just how practical should we be?
Now I’m not going to play the relativist game of suggesting that since measurements of utility can be all over the place for various disciplines in both the sciences and the humanities, this means the two are somehow equal. The cultural theory literati’s continuing slide into echo chamber isolation and irrelevance is very real, very problematic (in terms of “looming crisis” and not just “theoretical problematic”), and nowhere close to directly translatable to the sciences. And in those terms of crisis, I see this book as a specialized text addressing that crisis to experts in the field via language and methods that they themselves value and thus cannot easily dismiss. Much of it is not meant for general consumption, any more than the mathematics research papers you might be familiar with are meant to be read by Dante experts or random people in your neighbourhood. Scott might be deliberately cranking up the “challenge” of this book to show the absurdities of modern literature’s mantra of “challenge your reader,” but you still need some of the specialized training he and others in his field possess to see just what kind of challenge is being debunked via stretching to the breaking point. You’re not one of the people who refuses to confront the abysmal failures of the system, so again this is not meant for you.
I’ve said it once and I’ll say it again, plenty of people think this is important, so to those people (even if it’s only to those people), this is important. This book is a narrowband work of genre aimed at a narrow audience for a particular purpose. To ask him to write something with wider appeal would be to ask him to write something else entirely.
Anyway, I think I’ve put enough absurd generalities into Scott’s mouth for the time being. Have at it.
Again, I’m talking about equivalences – the problem is that top tier universities offer an out via STEM-lite classes for humanities majors. The idea is that the lite classes are all that’s necessary if you aren’t a STEM-major, so why can’t that go both ways? But this is actually only a lead-in to my actual desire, which is major surgery on what constitutes the core of a college education.
There’s no reason to saddle students with a bloated core of non-applicable skills. Those students wishing to major in X or Y are free to do so, but the hoops of “general knowledge” need to be reconsidered as they represent, IMO, an opportunity cost. I’ve already seen people get good jobs with associate degrees in specialized things like .NET programming or graphic design. Colleges may end up being seen as an unnecessary burden if they don’t abandon the somewhat vampiric practices of demanding students pay for things that don’t give a good chance of ROI so that varied university positions can be funded.
I’m fine with colleges having to trim abstract research, though recall the fact that number theory, once lauded for being divorced from material concerns, is now the foundation of modern cryptography. I doubt STEM departments are all that bloated given the value they add back into society.
As for the students you tutor, I don’t doubt there’s some small sample of STEM majors whose writing abilities are too poor to make them easily employable. That will be the case in any field. They should definitely have various courses to train them, and a way to make that affordable is to trim down expectations of “gentleman’s knowledge” from the core.
I’ve actually noted elsewhere that “science”, like foreign language or trigonometry, isn’t something that is necessary to most. This isn’t to say all science should be eliminated, but I do believe high schools would benefit from shaving off some of the physics/biology/etc. they put kids through and replacing these with varied courses dealing with economics, statistics, logic, critical writing, visual communication…even giving students more electives would likely make education have greater impact and make students care more. It would likely help some students be more focused once they got to college. Imagine having a good backbone in art and then going into programming games or marketing!
I actually do think accounting could be a useful part of the college core, far more so than two semesters of foreign language for example. Another benefit of a trimmed core is allowing people to boost their resumes with another minor that can aid in employment.
As for unemployable particle physicists, one only has to look at the degrees of Goldman Sachs employees to see that STEM majors have a home on Wall Street regardless of their field of study, so I’m not sure where your idea of starving experts on particle physics and advanced mathematics is coming from. The ones I know who leave academia leave for six figure starting salaries.
If someone chooses to major in a field where probability of employment is low, that is a perfectly fine choice so long as they understand what they are doing.
Just how practical should we be?
Definitely practical enough so that a college degree has a high probability of giving people a return on investment.
Ideally practical enough so that citizens are at least partially able to grasp the economic and legal arguments that would enlighten political discourse.
-Sciborg 2.0 June 22, 2012
College for the already financially well off, vocational school for everyone else? Retool mathematics majors towards economics so we can maximize graduate employability on Wall Street? Fund number theory and other abstract research only if they already demonstrate commercial applications? Limit high school students to the prescribed practical subjects, and if for some reason they’re still interested in the frivolous hard sciences or humanities we barely taught, then and only then let them major in it?
Speaking of ROI, I’m not sure those six figure salaries for the math grads on Wall Street were such a good deal for the rest of us.
I’m seeing this weird contradiction in your argument where on the one hand you’re saying students should have more choices to find what interests them, but on the other shouldn’t have their interests cultivated in the ‘wrong’ subjects that might lead them to a less lucrative career. It seems to boil down to “students should be allowed to choose to learn whatever they want, but we must limit their choices so that they only choose what we say is the best for them.”
So why this dance? Teenagers are terrible decision makers anyway. Why not do away with electives altogether and use the metric of projected earnings to chart out the entirety of their education for them?
After all, we might have gotten a banker out of Scott, rather than an author known for channeling his wanky philosophy education into works of epic fantasy.
As a counterbalance to Wegner, Bargh, & Co., I offer a couple of pieces by UC Berkeley social psychologist John Kihlstrom:
– Is There A ‘People Are Stupid’ School in Social Psychology? Short and pithy.
–The Automaticity Juggernaut Longer, still rather pithy.
@Scott: I think it’s important to separate my dismissal of the book’s apparent “sticking it to the man” goals and the actual narrative. The latter is well done, the prose enjoyable as always and some of the invective is important to the “oomph” factor.
The former just makes me groan, think how I’ve read this before in one of the Earwa books, and give up.
I was more responding to Andrew’s excitement at sticking it to the literati. It’s an instinctual emotion I have as a long time genre reader, but I think it’s worth examining my own animosity to those ivory tower bastards alongside the idea that this very animosity buttresses/validates those whose judgement I find erroneous.
I suspect someday soon there’ll be a document analyzer trained on “literary” works and we can then utilize it to classify which books in genre should be deemed worthy of academic study. Seems like a good resolution to the Scarf-Wars.
Until then, some more directed responses follow:
I appreciate that craft is necessary for literature, but you seem to be suggesting that it’s sufficient.
I think that’s just me being sloppy in my use of the word “craft.”
“Silly dick-wagging contest,” for example, is something that Dylan would love. He would say, “Show me something that isn’t?”
I think Dylan and I are in agreement here. The dick-wagging contest seems orthogonal to the creation of art.
My guess is that if you laid a bunch of ‘genuine voices’ about ‘actual things in the world’ on him, he would laugh even harder and say, “So you’re just another Christian after all!”
I don’t even know what this means. He might as well tell me I’m another Eagles fan. Regardless, I don’t take “Christian” as an automatically pejorative label, which may be part of the confusion.
He would say, “Sounds like the only way you can handle me is warning others against grabbing hold!”
It’s possible. We’ll know when I hopefully return to and complete the whole book. I think part of the problem was I was rereading good chunks of White Luck Warrior and the repetitiveness of the same message in both works became irritating.
Does this mean no one should bother writing anything critical? Does this mean that they should just ‘write what they love’? That’s the bullshit myth in a nutshell! Writing for yourself is writing for people like yourself is condemning yourself to be an apologist is simply being yet one more entertainer.
Does one need a narrative work to criticize? Dylan emailing me a PowerPoint presentation would likely suffice. 😉
I think I fall in line with, IIRC, Virginia Woolf’s criticism of moralizing in narrative. If nothing else, less telling, more showing is a good guideline.
There’s a difference between critiquing an aspect of society and critiquing critics. It’s hard for me to believe that one can have a genuine passion for the latter, whereas the former has produced some incredible pieces of art.
There’s also a problem, IMO, with saying the same thing, in the same way, no matter the book.
As for the scarf wars, as soon as governments and universities and endless mainstream cultural institutions begin throwing reams of capital (monetary and cultural) at the Watchmen to the exclusion of X-men then I’ll be inclined to agree with your analogy.
Ah, excellent use of “Watchmen” in your riposte! 🙂
I do think there are a lot of problems in mainstream culture, but sometimes worthwhile stuff does get disseminated.
Ware the Volvos. Much of this stuff comes out of the attic, no doubt, but most of it hasn’t made its way into any of my books in any way. Since I have such a conflicted relationship to so much of my thought, I made Dylan the nihilistic believer I can’t bring myself to be. In many ways this work is ‘autotheoretical’ in addition to being ‘autobiographical,’ and I’m actually quite pleased with the way it turned out, how the blurring of identities keeps stacking, from the level of the story told, through the level of story-and-theory-telling, all the way to the level of composition. Even this ‘crowdsourcing’ exercise is something I hope to incorporate at the narrative level.
Dylan would take ‘orthogonal’ to mean ‘fundamentally constitutive’ – is this the way you mean it?
And like all rules of thumb, it depends on how and what the author is trying to achieve. Dylan is quite explicit: given that he thinks the ‘demure author’ is simply one more ideological power play, he opts for the second person, for direct address. The same way he turns the Bildungsroman on its head, he works a figure-field switch on the reader: ‘You are the unreliable one. You are the fiction. It is the only place where you and I overlap, and become something more than what we are.’ The point is that no one stands outside the circuit of the natural, that it consumes all and everything.
For my own part, I’m dreadfully interested in the intersection of narrative and theory, the ways they complement and contradict one another – and most important, the artificiality of thinking they can (let alone should) be kept apart. Especially in times like these, where change has so totally outrun our meagre powers of reflection. (Apparently the big reason there’s so little research on the neurological and psychological effects of information technology is that the funding-acquisition/study-execution/publication cycle is so cumbersome that results always arrive too late.)
You’re channelling Fennel again. Like me, Dylan would argue that most of the former is actually an apologetic exercise, self-congratulation posturing as ‘critique,’ a condemnation of the already condemned for the edification of those condemning, and that the latter is what makes the former possible. The more obvious point, however, is that critics are most certainly ‘an aspect of society,’ and an incredibly powerful one, given the degree to which they sort the ‘serious’ from the ‘silly.’ Their time is long, long overdue, wouldn’t you agree?
“There’s a difference between critiquing an aspect of society and critiquing critics. It’s hard for me to believe that one can have a genuine passion for the latter, whereas the former has produced some incredible pieces of art.”
Then you suffer from a failure of imagination! The passion for critiquing critics is the passion for philosophy. 🙂
@Scott, I only read the first four sections of the book so far. It’s very, very strange to read a novel that seems to present a condensed version of the blog posts I’ve read here over the past 18 months, with a narrative to boot. As far as criticisms of that (quite small) section I’ve read:
1) The editor’s foreword about publishing the mysterious manuscript of a missing person seemed a bit stale to me; it struck me as a cheap dramatic trick I’ve seen some version of too many times. Still, I must admit that it provides a great frame of reference to understand Dylan in, providing a dismissive academic to sound off about the dismissals of several other academics before launching into Dylan’s sneering dismissal of academia.
2) Chapter 30, where Dylan claims that this work is smarter than you, the critic, just seems like a silly degree of posturing given his nihilism.
3) Dylan’s nihilism germinates in a poker game with a Russian? It reminds me of an intellectual version of a 90’s action movie scene, where the demonized Ruski ends up being as genuine as he says he is, and the insignificant bit of information he provided 30 minutes in ends up saving the day. Or at least Dylan’s theoretical rigor.
4) The wank sections have a certain pattern to them. You start wanking harder and harder, layering up your premises, providing your examples, reaching your conclusion, and then following with a single-line paragraph that usually either sums things up or puts the conclusion in different terms.
Stop concluding with so many of these.
~Ross
Great stuff.
You’re not the first to complain about the false foreword, and I largely agree with you. Tired tropes are okay so long as they actually do some substantial heavy-lifting…
I’ll take another look at 30: all these ‘in-your-face’ passages make me nervous, for the predictable suite of reasons. I think they would be more effective if they were actually more manipulative, if they could trick the reader into agreeing, then pull the carpet from beneath their feet. I actually hope to do more of this kind of bait-and-switch stuff in the rewrite.
I’m curious what troubled you specifically with the one-line zingers, Ross. Too many of them too glib, or is it the overall strategy that made them seem stale by the end?
My main issue with the one-liners was that they felt very formulaic after a while. I came to anticipate them quite early in my reading, which made me eventually dread them.
As for the “glib” factor, that’s not quite how I would put it, but I did get a vaguely similar vibe now that you mention it. I perceived the one-liners as one of those things that people pepper their books with when they know they’re being consciously literary. Of course, I’m not speaking from much of a perspective because Kurt Vonnegut is about as literary as I normally go (though I did push my boundaries last year by nearly finishing Pynchon’s Gravity’s Rainbow, haha.)
Of course I’m being careful to phrase that all in intentional terms like how it “felt” or how I “perceived” it because I wish to make no claims that my perceptions are even accurate, let alone telling. This is simply one of those times when you requested unfettered bitching, and I’ve delivered.
~Ross
Saajan,
but I think it’s worth examining my own animosity to those ivory tower bastards alongside the idea that this very animosity buttresses/validates those whose judgement I find erroneous.
You seem to be reading it as some sort of superiority position grab? As if grabbing the judgement cap, and everyone else but the judgement cap wearer is judged?
To me LTG seems more like the way a murder detective can themselves be suspected, investigated, charged and even jailed for murder. I.e., a murder detective isn’t somehow outside of what he investigates. Everyone’s inside. Indeed, yes, this does validate investigation – but it does not validate investigators/particular people. LTG seems to simply turn the investigation onto the investigators. But it seems you read it as being some sort of grab for judgemental power (and with grabbing all such power and no one else having any, an invulnerability to being judged)?
Haven’t had the chance to read all the other comments, so I hope I’m not repeating anyone’s ideas here.
Your description of the sufficiency issue and the mechanism behind it seems frighteningly plausible.
I do wonder about one thing, however.
The idea that the gut brain invisibly influences the conscious brain explains a lot of clearly observable phenomena.
And though the invisibility of the former to the latter is an important part of this mechanism, I can’t help but wonder if the conscious brain doesn’t invisibly influence the gut brain in turn (if in a very different manner).
This might explain a variety of other observations, such as a person’s ability to change her behaviour due to (the closest approximation one can muster of) conscious reasoning and introspection (which could be just as skewed and biased as any other rationalization-masking-as-reasoning, but still requires a backward communication channel between the conscious and the gut brain).
I hope I’m not just grasping at straws here, but other than the fact that I do think this kind of a relationship exists in the mechanism of consciousness (whatever the hell it might be), I would like to think that we have some hope of overcoming our fallible, self-deluded minds…
I’m wondering if, as much as someone else telling you you are wrong, the so-called public relations brain potentially switches from telling others they are wrong to telling themselves they are wrong (having similar effects as others telling you you are wrong – which seemingly does have effects, if at the very least, peer pressure), thus modifying their own behaviour and creating an increasingly complex feedback loop. I’m even thinking of this as an aberration of the public relations calling others wrong – kind of like using a wrench to bang in nails is an aberration, even if it works.
This is certainly the case, and the thrust of the behavioural modification programs (and the dual process models of cognition they turn on) that are presently sweeping the world. I’m not arguing epiphenomenalism, that consciousness is a side-effect, a causally inert shadow thrown on the wall, just that since it lacks access to its neurofunctional role, it is constitutively blind to it. The kinds of ‘Skinner-box’ strategies these programs utilize amount to a gradual, albeit fuzzy, understanding of what those roles are – as well as giving up on the phantom of ‘willpower.’ We use the deliberative brain to train the automatic brain to better feed the deliberate brain to better train the automatic brain…
Something I embraced almost twenty years ago to rebuild my shattered life.
Our ability to influence and “communicate” to our gut brain, and thus alter and to some extent steer our behavior, to me means that willpower is not a complete illusion (a concept that in addition to being totally depressing, might be interpreted as relieving people from any responsibility for their behavior), but rather simply and essentially different from our ignorant and simplified view of it.
Rather than being a “real time” willpower, it’s more of a retrospective, constantly evolving self conditioning willpower.
I would prefer ‘auto-conditioning,’ but only because I think this way of phrasing it avoids the peril of lapsing into ‘cake and eat it too’ type thinking regarding the will – the ‘free won’t’ approaches you find. All our auto-conditioning is auto-conditioned. There really is no getting around the problem this poses for responsibility as we presently understand it, and either we find some way around it, or I fear our culture will become very schizophrenic indeed.
You seem to be reading it as some sort of superiority position grab? As if grabbing the judgement cap, and everyone else but the judgement cap wearer is judged?
Well, I think I’m taking this post off the rails so I’ll stop. My point was, if this is all sort of silly, and the ivory tower is more and more irrelevant, then what do those of us outside of it have to be concerned about?
It’s their problem to change or fade, while the delightfully mad like Grant Morrison get honored by the Queen. 🙂
Delightfully mad is about right.
“The Invisibles is a comic book that is my attempt to explain what happened to me after I was abducted by aliens in Kathmandu in 1994 after I went to Kathmandu in 1994 to be abducted by aliens.” The aliens apparently looked like Terence McKenna.
I was reading about that in Supergods, his collage of comic history and biography. The aliens were hyper-dimensional blobs apparently.
Hey, if nothing else they made him rich. I think he’s one of the few actual millionaire creators to come out of comics.
Yeesh. The thing that always kills me about New Age Wank is its crazy conservatism – the mise en abyme always features infinite reflections of the self. The ALIEN is always so bloody familiar… and comforting for it.
The irrelevance of the establishment IS the problem. It’s certainly what Dylan comes back to time and again, how he (and everyone he knew) was failed by an institution more bent on communicating (amongst themselves, in painfully predictable self-congratulatory ways) ABOUT him (and everyone he knew) rather than TO him (and everyone he knew). By abandoning the cultural commons they became simply another self-regarding institution, one bent on soaking up (and so neutralizing) as much critical potential as they can. Something parasitic pretending to be symbiotic.
That said, Sci, these are precisely the rails I was hoping the comments would ride!
Glad I didn’t veer us off too far from the charted waters, Railsea. I have enough manners to worry I take things too quickly into the “Here there be Monsters!” parts of the map while everyone else is just looking for a new route to India.
Though in the interest of fairness I feel I should leave space for those who’ve, heh, actually finished the book!
As to the topic of school, debt, and the admitted guesswork of the blame game, I find Kanye West can offer up a good closer:
“This the real world, homie school finished.
They done stole your dreams, you don’t know who did it.”
-Gorgeous
if this is all sort of silly, and the ivory tower is more and more irrelevant, then what do those of us outside of it have to be concerned about?
It depends if you think there’s some sort of role for an organisation that brings about some kinda critical thinking in the larger population (or at least raises question marks that haunt popular consciousness). And if you do, who do you punt that to – a capitalistic business, who might go bankrupt if they really make some unpleasant challenge for the larger population? Never mind how pandering to the preconception is far more profitable, i.e. what businesses are supposed to pursue.
Then you have the issue that there is an organisation ostensibly for this role, having lots of money funneled to it from the government, but are they actually doing the job they are being paid for, or engaged in a massive, paid-for circle jerk? That practically borders on fraud. I mean, they even ad hoc wrote out a 20k check to some crazy-haired guy to write some book… (heh, yeah, this goes against the idea somewhat)
Finally it depends on how much you think your world is self sufficient enough to continue (for any reading for pleasure to occur), whilst the rest of the world crashes? If it is self sufficient, then has it become isolationist and indifferent to the rest of the world’s population? Or if it isn’t self sufficient, then just focusing on reading for pleasure…well…?
When has the critic ever enacted massive changes in the consciousness/conscience of the public at large?
Isn’t that always the responsibility of the artist and the activist?
“No church in the wild” yo. 😉
-Sciborg 2.0, June 22, 8:49pm
What is an artist and activist but a critic?
Oh, you mean critics as in those who rag on about movies and other media? Fair enough confusion on the use of the word, but I mean the story isn’t just someone ragging on about the state of literature (see prior blog posts for that! hehe!). Or…does it kinda seem too close to the source, kinda disappearing up its own arse because it’s artists/critics criticising artists/critics? It perhaps just doesn’t seem enough of an artistic performance and more like a ragging on? Just trying to guess your position ’cause I could imagine it feeling that way and being kinda blah because of it. Am I way off?
Yo!
Well, so long as this isn’t a derailment-
You said that academia could be useful, if they could open the gateway between what Scott calls the “silly” and the “serious”.
Yet it seems to me that this gate is for a tower we don’t even need. Fayanal and the gang have bypassed it on the way to the capital.
So my thinking is, if this tower isn’t all that useful, why ask people to pay for it? At least we should divide the rations a little differently, no? Obviously the subjects of the humanities are invaluable, though like calculus and discrete mathematics they aren’t that valuable after a point.
It seems the only thing stopping people here from coming to my conclusion – that if the humanities are pointless they can afford to have their funding gutted, given the lack of protest to “Science is the only claim making institution worth caring about” – is a sentimental attachment to the departments they either come from or work in.
Either that, or more people should speak up when the “Science is the only…” claim about claims is made. Otherwise Average Joes like myself start feeding the counter-elitist monsters in the back of our minds….
ps. No Church in the Wild is a new Jay-Z + Kanye West song, used in the Great Gatsby trailer. Interesting that rap music is our go-to for decadence, even when portraying the decadence of a largely white class in the 1920s.
Very interesting use of the modern to frame the perception of the past, no? Maybe all time is a physical structure, a grand architecture, and those crazy comic book writers are on to something….
“I just need to open my eyes and the river of mind cartoons ceases.”
-Alan Moore, The Courtyard
-Sciborg 2.0 July 23rd 10:13 AM.
If you look at the literature and humanities as being the upper hemispheres of a brain above the pop culture instinctive brain below, yeah, you might very well ask why it gets so many calories passed to it when it just aggrandises itself while the instinctive elephant brain just does what it was going to do anyway?
But then again, it depends if you have an abiding faith in that instinctive culture to go just fine by itself?
These words were published yesterday in the journal Science, in a review of “Consciousness: Confessions of a Romantic Reductionist” by Christof Koch:
“Can neuroscience be reconciled with living a happy, meaningful, moral, and yet nondelusional life? I will confess that this question also occasionally keeps me lying awake at night. However, I cannot but think that Koch is, unfortunately, misguided and that Crick and Monod are, unfortunately, right: In the eternal silence of neuronal spaces, our anguish probably goes unheeded. My own optimism consists of enjoying every single moment of consciousness while it lasts.”
The author of the review, Stanislas Dehaene, critiques Koch’s adherence to Chalmerian panpsychism, given that Koch shows in the early chapters of his book that certain complicated brain systems are themselves not conscious. Complexity is not sufficient! Information integration (Tononi) is presented as the critical component.
If anything, it shows that the people doing this work are not insensitive to the deeper implications.
Also, if you ever find yourself holding the opposite opinion to Francis Crick AND Jacques Monod, you are probably wrong.
This doesn’t seem like a definitive rebuttal.
Admittedly I am in the armchair “amateur consciousness hour” group, but it seems the issue depends on whether Koch is a Teilhard-type panpsychist or more into panexperientialism, which differentiates between aggregates that represent higher levels of consciousness (life) and those that don’t (rocks and shit).
Schwitzgebel has been arguing on his blog that Tononi himself is committed to some kind of panpsychism, given his claims to date. And Koch, who (according to his SA column at least) has embraced IITC with enthusiasm, would be in the same boat. I actually used to worry about these things, but in a sense I think BBT renders the issue moot. Once you strip away all the things it seems to strip away, then consciousness ceases to seem all that ‘special,’ and Chalmerian type revisions of physicalism become almost theatrical.
Monod needs to be written into a superhero.
Hello Scott
I’ve been reading your blog for a while now without commenting. First, thanks for all the fascinating ideas. Also thought you might find this article interesting:
http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/daniel-kahneman-bias-studies.html?mobify=0
It’s from a somewhat different angle, but the last two paras could have been taken directly off this blog. Seems the Argument is going mainstream.
“What explains this result? One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.
The problem with this introspective approach is that the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.”
It’s a great article – one that certainly tweaks my confirmation biases! If you get a chance to check out Kahneman’s Thinking, Fast and Slow, do so. Single best intro to this world that I’ve read. Thanks, Jonas!
What worries me is when I think of the issue in terms of consciousness transfer. You know, the ol’ “wanna transfer my mind from my aging body into a robot” thang. The mind, that is, not just popping the brain/the meaty hardware out and putting it into a cyborg body.
The thing is, you want to go from seeing from behind the eyes you have right now, to seeing from behind the camera lenses. You want it to be a contiguous experience – you certainly don’t want to just DIE and have a duplicate in the robot body (though good luck to him, but nah…). If it ends up as a duplication, that is no good – something has to actually be transferred over. But what?
What is the thing being transferred, rather than just duplicated? Okay, you have synaptic connections – but that doesn’t transfer! All you’d be doing is duplicating that connection in the robot’s synaptic simulation hardware. You can’t shift that synaptic hardening…it’s there in the brain, or you’ve just made a duplicate (or you’ve wrecked the original), one or the other.
So if nothing can be moved…if there’s nothing to move…that thing you want to transfer (before you die of old age), that thing you think exists (and you think is currently alive) to transfer…I mean, I’d like to think one exists as much as computer programs do. But this program can’t even switch platforms while continuing to run? That’s what strikes me – the impossibility of going from seeing from one set of eyes to seeing from behind another. It just seems it’ll always end in a duplication, not a transfer.
At best all I got is an appeal to determinism, oddly enough. Of the origins of one, growing out of a certain set of past events whilst the duplicate is growing out of a very different set of past events. That’s the only tangible element I can really see in it. And that tangible element isn’t movable without breaking/killing it. At best you could maybe grow into a cyborg brain, making millions of synapse-to-cyber connections, growing into it and then hoping you are not too much of a wraith when the organic history dies and only what did grow into the cyber continues.
@Callan: (Forgive if I’m erecting straw men from your statements, not my intent.)
If you look at the literature and humanities as being the upper hemispheres of a brain above the pop culture instinctive brain below, yeah, you might very well ask why it gets so many calories passed to it when it just aggrandises itself while the instinctive elephant brain just does what it was going to do anyway?
It seems to me, though I’m happy to be supplied evidence to the contrary, that the whole reformation of the gatekeepers relies on a paternalistic notion that art and society need academia.
But then again, it depends if you have an abiding faith in that instinctive culture to go just fine by itself?
Whether one believes this or not, the question is whether academia has anything to offer in shepherding culture…or for that matter, if it could even do so at this point.
p.s. The Courtyard keeps making me think of this blog, maybe just because I know there are comic fans around:
“Wza-Y’ei is a word for the negative conceptual space left surrounding a positive concept. The class of things larger than thought. Being what thought excludes.
It applies to so many things…everything that is conceived…”
Also, for some reason, these lines from The Wire:
“It’s a cold world.”
“I thought you said the world’s getting warmer.”
“World goes one way, people another.”
It seems to me, though I’m happy to be supplied evidence to the contrary, that the whole reformation of the gatekeepers relies on a paternalistic notion that art and society need academia.
Which is to say, instinctive culture gets along fine by itself?
It seems to be turning the whole thing into a paternalism question, which avoids the question: do art and society actually have any fitness metrics they measure themselves by, to see if they have gone wayward from some standard they set out to abide by?
I can kinda see the rot and the hate here for academia that Dylan refers to. So much hate, it’s an abandonment of any sort of organisation that checks such fitness measures. Academia has given it a bad name. Just be yourself and believe in yourself and all that, as some blog guy rants about sometimes.
I mean, what evidence are you happy to see? The very idea of holding to a standard is missing out on some things you like to do (or most likely this will be the case to some degree). Unpleasantness. Unless you’ve committed to some sort of unpleasantness occurring, you’ve absolutely no interest in being supplied evidence for some group getting in the way of your fun. It’d just seem a bunch of gatekeepers.
So there will never be any evidence to the contrary. Because academia fucked that generationally ongoing commitment up.
What evidence could I give that doesn’t seem false because…it simply wouldn’t seem fun? It’d be a downer? What’s your personal budget for missing out on fun? If it’s 0, I could never sell you an organisation that’s down on any particular fun. You just wouldn’t have it in your budget *shrug*.
Moi: But then again, it depends if you have an abiding faith in that instinctive culture to go just fine by itself?
Saajan: Whether one believes this or not, the question is whether academia has anything to offer in shepherding culture…or for that matter, if it could even do so at this point.
No, if you believe culture will go just fine by itself, it’s not a question at all. My perception is that the culture question was avoided. I’m fine with a hypothetical commitment either way (doing both, one after the other, is cool as well), but a hypothetical commitment to “culture will do just fine” is a commitment that wouldn’t bother with any question about academia.
So, you don’t have any evidence? Or how about just anec-data on why you think it’s important. You seem to believe the ivory tower fat I’m saying we should trim still matters.
Why?
As for academia being the responsible parent ruining the fun, this is actually making me doubt you have much ground to stand on…It sounds way too similar to “I know I won’t be able to convince you to accept Jesus. You see, some people just aren’t meant to be saved.”
How did Dylan put it? “So you’re just another Christian!”
I’m not even convinced quality is necessary for a work to engender real world change. I remember ragging on the gay bashing storyline in Green Lantern – it was poorly written, terrible dialogue, just shit. Then some kid tells me he never thought about gay bashing until he read the comic.
The better argument, it seems to me, is to laud the opportunities for teaching craft. But this seems to be something orthogonal to the idea of the English department as a gatekeeper.
My perception is that the culture question was avoided.
I agree, I think I did unintentionally sidestep that one. So help me out here -> What is “instinctive” culture? Where do you think it’s headed?
Either academia is going to raise up the quality of life, or it’s going to save us….from what exactly?
Let’s conclude with Alan Moore, this time from Voice of the Fire:
“How shall we speak, straight-faced, about the local whore who turned a trick so Nan could buy a jar of Marmite for the kids?
Maudlin Northampton shite, all tarts with hearts and we-were-so-poor-rickets-was-a-luxury. And yet a girl whose name has not survived would take a stranger up her in a back yard for her neighbor’s children, and how is it we no longer have a language to contain such things?”
-Alan Moore, Voice of the Fire
So, you don’t have any evidence?
None. There is no reason to not have fun (Dats the joy of a nihilistic universe!), so how do I build evidence on no reason? How do you think it works – there’s either some deep vein of meaning to draw upon for why you should do what I say or it can just be dismissed?
You seem to believe the ivory tower fat I’m saying we should trim still matters.
Why?
I thought your original position was “why bother thinking about or critiquing academia at all”?
That’s what I’ve aimed my lance at, anyway. I don’t think it involves arguing against cutting fat.
“It sounds way too similar to “I know I won’t be able to convince you to accept Jesus. You see, some people just aren’t meant to be saved.”
How did Dylan put it? “So you’re just another Christian!”
Well, I don’t hear you saying what your budget for no fun is? And am I supposed to say a practice of sucking up some no fun is not a cult practice? There’s some line of practices which aren’t cults?
I will say though, suffering some no fun is the sort of cult practice you can stop/start. Kinda like going to the gym. So perhaps more like just another gym junkie?
I presume a lot of money is going and will go to academia, rather than Green Lantern and The Simpsons and such, which do tend to drop at least a few odd thoughts into pop culture. Hey, if you can wrest the money away from academia, maybe there isn’t a reason to bother critiquing them then, so if so, then fair point.
What is “instinctive” culture? Where do you think it’s headed?
Either academia is going to raise up the quality of life, or it’s going to save us….from what exactly?
Well, a patriarchal, sexist-against-females society seems to be the default the instinctual goes toward, as one example. Other defaults are no judge, jury or trial, just condemnation at the reflexive drop of a hat. I guess I don’t think we’ve changed the brains we’re working with, just the culture. Which can recede and revert to the default, I guess I’m asserting.
Can you define fun? I don’t think The Wire is fun, but as an educational tool it’s likely going to do more good than any paper written by the literati.
It just seems like you’ve created this specter of fun, as if the masses which include oncologists and paramedics are lost in an opium stupor waiting to be rescued by some scarf-wearing fool who knows a lot about Kant and Proust.
I can’t tell if you’re being sarcastic about the meaningless universe, but you seem to believe in the humanities in a religious fashion.
Well, a patriarchal, sexist-against-females society seems to be the default the instinctual goes toward, as one example
There’s a ton of work being done on that on the ‘Net. Alyssa Rosenberg likely reaches a good number of people, for example. Then there’s the women in video games documentary coming out, Grant Morrison and Alan Moore noting the misogyny in comics, and so on.
Having lived in Philly and met people from the Annenberg School of Communication, I can assure you that a lot of work is done on everything from comics to Hustler.
It just seems like you’ve created this specter of fun, as if the masses which include oncologists and paramedics
Well, take the paramedic. How many OD’ing drug users do they patch up (or attempt to)?
So, do they try and write something to perhaps get some ideas about drugs in before or early on in someone’s drug-using history (and I’m not talking pamphlets, of course)? Or do they get home and watch some reality TV (with the rest of us)?
So they keep patching them up (or not).
Now Scott’s approach seems to be to ride the genre wave – to ride the fun, in other words (I guess). But subvert it to, at least partially, be quite unfun (towards a cause).
I dunno, do his books strike you as not fun in parts (and I don’t mean ‘bad writing’)? If not…well, that makes conversation complicated so I’ll wait on an answer there.
But if they do seem unfun, well yeah, you’ve got this paramedic in a loop, simply plastering over the symptoms, over and over again. No? Of course, that’s kind of assuming a writer can affect a person before a drug becomes their life path. Maybe not, so if you want to hit me up with that possibility, it’s a fair shot to take (has my blessings!)
but you seem to believe in the humanities in a religious fashion.
I think I said “I presume a lot of money is going and will go to academia, rather than Green Lantern and The Simpsons and such, which do tend to drop at least a few odd thoughts into pop culture. Hey, if you can wrest the money away from academia, maybe there isn’t a reason to bother critiquing them then, so if so, then fair point.”
If you can shift the money, cool. Perhaps I believe in money in a religious way? >:)
There’s a ton of work being done on that on the ‘Net. Alyssa Rosenberg likely reaches a good number of people, for example. Then there’s the women in video games documentary coming out, Grant Morrison and Alan Moore noting the misogyny in comics, and so on.
Having lived in Philly and met people from the Annenberg School of Communication, I can assure you that a lot of work is done on everything from comics to Hustler.
Who’s reading the work? What’s their distribution method and is it working? I.e., are people who don’t already agree with them picking up their works and reading them?
You seem to be defending artists rather than critics or academia.
When I read things like Krystal’s backhanded “It’s okay to like bad books” snub I just smell the desperation of someone whose chosen stacking of cultural capital is on the chopping block.
Then I begin to wonder about more and more cost cutting measures. How much bloat is funded by tuition dollars, how many dissertations divorced from STEM are really necessary? Can they be morally justified?
Mind you, it’s only “bloat” if Scott’s right about Science’s supremacy as a claims making institution.
Callan, I’ll do your work for you (haha really this is from Galleymac over at Westeros):
They did a study at University of Toronto that found that people who read fiction scored higher on tests of empathy and of interpreting social situations. The fiction gives people a chance to learn how to identify emotions and emotional responses in others (and in themselves) — yes, in a “safe” environment. Not to mention learning that the emotions are normal and ok to have. So it is practice. (Also one of the reasons I’ve been so ridiculously in love with my Islas quote below since 1997.)
Again, this seems to veer more toward teaching and facilitating discussion than the gatekeeping or researcher role. In general it seems what’s needed is a separate evaluation system: STEM tenure can be based on research, but humanities weighted far more heavily on teaching.
Probably just need Masters students for a lot of this, likely another good cost saving measure.
Hey RSB, are you still planning to share a chapter from the Unholy Consult? Not that I’m complaining or anything about getting a free book (plus a peek at your writing process), but, well, you know……….
Being impatient (I accidentally wrote ‘impolite’ but wouldn’t that be wrong, huh?) seems to be the new cool thing 😉
*sings* “Write Scott, write like the wind…na na na…”
I want to apologise in advance for my sloppy formulations, which might be due to lack of sleep and my bad English, but I really hope it is clear anyway what I wanted to say, which is quite a lot, now that I look at it. Of course I’m open to explain what I meant in detail if any questions arise.
The thing I tried with this one was finding possible approaches for critique, not only (if any at all) good ones.
5
“This is why inventive philosophers will often produce multiple, competing interpretations (accomplish in the space of a single paper what typically takes communities of more slow-witted philosophers several years) to discredit claims. Ambiguity tells against theoretical cognition–generally speaking. And pretty much any theoretical claim can be made ambiguous, qualified and redefined unto contradiction.”
Nothing to criticize really, but: The last sentence is only true if solely competing interpretations are possible. Good philosophers should try to be precise in their claims. If you want to be a successful philosopher, being inventive is more productive, though.
I think this is clear anyway, but some people might interpret this as you meaning that the above is the only way it is done.
–
X (the one after 6)
“Theories allow us to be more stupid more efficiently. They idiot-proof the world. A theoretician is someone who sees shortcuts at every turn, ways to condense thinking into concepts and their relations–things as apparently static as the human soul.”
One could argue against it like this, which seems to be a common one: Theories make life easier. In fact, living a life the way we do wouldn’t be possible without theorising. We make theories about everything, so that we don’t think about the same things more than one time for example.
Apart from that you are right. Theorising might be a path that leads away from “the truth”. But who cares about truth, especially when there is no meaning in this universe?
You say theories allow us to be more stupid, but wouldn’t it be more stupid not to trust in at least some theories, some that seem to be “right” most of the time?
33
“Of course the brain can’t recognize itself. Of course the brain can’t interpret its own processes the way it interprets environmental processes. Since the thalamocortical system is hardwired to the rest of the brain it simply cannot ‘gain perspective’ on itself, which is to say, sample its neurological environment the variable way it can its ecological.
But there’s bigger problems. There’s process asymmetry, the fact that whatever recursive processing the brain develops will simply add to the load of ‘blind processing.’ ”
It’s quite clear what you want to say. But what does it really mean that ‘the brain recognizes anything’ or that ‘the brain sees something’? With these expressions you seem to act like the brain is something like a conscious mind(?) and not the machine that creates the illusion of one. All we have is the data, the signals the brain uses. Doesn’t using the words in this way just come down to interpretations of deluded beings?
Furthermore, how can anyone know what the brain can’t do? You can prove that a pig can fly if you ever see one spread its wings and fly away, but you can’t prove that it can’t. So how can you know what the “blind processing” includes?
–
I may have already said something similar, but if you say things like ‘the brain interprets’, ‘the brain gains perspective’ etc., you say things that fit into your theory of the brain. All the words you use seem to have meaning. But:
“Purpose, meaning, morality, and so on, are simply on-the-spot fixes, reliable because of their broader functional role, yet misapprehensions all the same.”
Many sections of LTG say a lot of theoretical things about the brain. So these things must be misapprehensions too, right?
–
You use language here in a way that leaves room for some interpretations (the things concerning the brain, for instance ;). You seem to act like one of those clever philosophers you criticized above (5). Of course this is surely necessary most of the time because LTG would maybe become unreadable and boring otherwise.
–
47
“When it comes to all the questions that matter, neither of us know fuck all.”
Now the question comes to mind, which questions matter and why, when there is no meaning…
All we have is data. Everything else is just interpretations of deluded minds…
FYI: The June 3 and June 4 links on the Light, Time & Gravity page both point to Part V.
(I posted a similar comment on the LT&G page)
It’s slow going for me ‘cos I am reading a couple other books right now (Tigana by Guy Gavriel Kay and Imajica by Clive Barker). I have made it through part V of the story and I am enjoying the experience. My favorite scene so far is the hockey scene in Part 1 ( 3 – 2001).
I have been trying to get word out regarding the piece. I have sent out a couple tweets and linked to the piece on Huffington Post and Goodreads. Hopefully, this will result in some traffic and feedback.
Skipping topics, what’s happening with the Montana ruling on campaign contributions? As if ‘money talks’ is a freedom of speech or something? Oh, and an obligatory ‘folks here are immune to that sort of thing’ response.
Though not Gödel’s Incompleteness, this is an interesting use of Cantor’s Diagonal to assert that there are true but unprovable statements:
http://www.coopertoons.com/education/diagonal/diagonalargument.html
…mathematicians who are either younger and closer to their coursework or just smarter might have a better hold of logic to point out possible errors. Seems to work IMO.
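For anyone who doesn’t want to click through, here’s a minimal sketch of the diagonal construction the page turns on – my own toy illustration, not the article’s code, with the infinite sequences stood in for by finite strings:

```python
def diagonal(rows):
    # rows: a list of equal-length binary strings, a finite stand-in
    # for an enumeration of infinite binary sequences. Flip the i-th
    # digit of the i-th row: the result differs from every row at
    # position i, so it can't appear anywhere in the list.
    return "".join("1" if row[i] == "0" else "0" for i, row in enumerate(rows))

rows = ["0110", "1011", "0001", "1110"]
d = diagonal(rows)
assert all(d != row for row in rows)  # the diagonal escapes the list
print(d)  # "1111" for this particular list
```

Whatever enumeration you feed it, the same trick produces a sequence the enumeration missed – which is the whole “more truths than proofs” punchline.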
Interesting 🙂
I just don’t see how this follows:
“But at most we can only use a countably infinite number of symbols.”
Maybe I didn’t understand it right, but isn’t the number of symbols we can use uncountably infinite? For every subset of the natural numbers (the subsets gained by taking the power set) you could create a symbol, so we would have an uncountably infinite number of symbols. Is there any error in my thinking?
Apart from this, it all seems fine to me.
At first glance it seemed to hold, because each symbol can be associated with a natural number. So there can only be a countably infinite number of symbols.
It’s like how no matter the base – binary or decimal or octal or hex or whatever – the system uses a finite number of symbols.
But isn’t also every real number associated with a symbol?
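For what it’s worth, here’s the usual way to square those two intuitions (my gloss, not the linked article’s): a notation system only ever hands you finite strings over a fixed alphabet, and those are countable, so at most countably many reals ever get an individual name.

```latex
% Finite strings over a countable alphabet Sigma are countable:
% enumerate them by length, then lexicographically within each length.
\[
\Sigma^{*} = \bigcup_{n \ge 1} \Sigma^{n},
\qquad
|\Sigma^{n}| \le \aleph_0 \ \Rightarrow\ |\Sigma^{*}| \le \aleph_0 .
\]
% Giving every real (or every subset of the naturals) its own
% primitive symbol would therefore require an uncountable alphabet,
% which no workable notation supplies:
\[
|\mathbb{R}| = 2^{\aleph_0} > \aleph_0 \ge |\Sigma^{*}| .
\]
```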
What is ‘proving’ it, really? I mean, if you’re gonna do something anyway, the whole ‘prove it’ really comes down to proving something so much so you make someone else do something. I won’t deny there’s a certain gravitation to proving something to be felt, but why is that gravitation there?
‘Proving’ is more like showing what can be done. So if you know something can’t be done, you don’t have to try it. I’m thinking of Hilbert’s problems, which were a list of mathematical problems that were unsolved at the time they were published (for those who don’t know). Gödel proved with his second incompleteness theorem that the second problem can’t be solved.
Well, without proofs there are no foundations. A lot of stuff that seems routine (derivatives, integration) needs proofs to be justifiable.
RSA encryption requires a proof, though there are some PhDs out there who think they can pull a “Sneakers” and break the encryption in reasonable time. Apparently without quantum computing.
A proof allows for confidence in using some algorithm, knowing that the worst case won’t cost you lives or millions.
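To make that concrete, here’s a toy RSA round-trip – a minimal sketch with textbook-sized numbers, nothing remotely like a secure key – where the final check is guaranteed by Euler’s theorem rather than by testing every possible message:

```python
# Toy RSA. Correctness (decrypt(encrypt(m)) == m for every m < n)
# rests on a proof -- Euler's theorem -- not on exhaustive testing.
p, q = 61, 53              # toy primes; real keys use primes ~1024 bits long
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient of n: 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse (Python 3.8+)

m = 42                     # a message, 0 <= m < n
c = pow(m, e, n)           # encrypt: m^e mod n
assert pow(c, d, n) == m   # decrypt recovers m, as the proof guarantees
```

That assert never firing, for any message, is exactly the worst-case confidence a proof buys you.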
‘Proving’ is more like showing what can be done.
Why? Towards what end?
So if you know something can’t be done, you don’t have to try it.
How do you know?
lol html cause it is an annoying question.
Gödel proved with his second incompleteness theorem the second problem can’t be solved.
What theorem did you use to determine Godel proved such?
It’s funny how one’s own evaluation is both inherent, and yet all too easily seems utterly absent. Or so I evaluate.
“Why? Towards what end?”
You could ask that about almost everything. There are many possible answers. Some people just have fun proving things; it makes them happy. Isn’t that a nice thing? Some people try to create coherent systems, and without “proving” it would come down to guesswork.
Etc.
“How do you know?”
It seems irrational to me to say: “You have to do X. X can’t be done”
“What theorem did you use to determine Godel proved such?”
How do you define theorem?
“It’s funny how ones own evaluation is both inherent, and yet all too easily seems utterly absent. Or so I evaluate.”
How do you know?
What I’m getting at is that ‘proven’ seems to be a profoundly social statement (i.e., it’s used as such). It’s not just proven to you; if it’s proven to you, it’s proven to that guy over there, his sister, the rest of the continent, the rest of man for eternity. Everyone should believe it, because it’s proven. When you think of that scope, in terms of “someone just likes to prove things, makes them happy”? It almost seems of the magnitude of creating new strains of anthrax to make oneself happy, even if not of the same negative association. Playing with something rather enormous, simply to be happy? Particularly “Some people try to create coherent systems and without ‘proving’ it would come down to guesswork.” It’s like one must round up all of mankind into this proof – simply for a fear of dealing in guesswork? What is coherency – it seems a deeply social beast?
It seems irrational to me to say: “You have to do X. X can’t be done”
So until you get everyone else to say X can be done, you can’t do X?
How do you define theorem?
Same way you defined your use of the word.
Moi: “It’s funny how one’s own evaluation is both inherent, and yet all too easily seems utterly absent. Or so I evaluate.”
How do you know?
Evaluate, not know. A mastery of non-committal caveats (craven, even) is part of the dealio.
That or more fun is ‘I know in the same way that, for the question you ask, you would understand and know the answer I give’
Of course that’s just a naughty hijack of credibility, heh!
But anyway, it was ‘evaluate’. Seems off topic to ask how I know?
“What I’m getting at is that ‘proven’ seems to be a profoundly social statement (ie, it’s used as such). It’s not just proven to you, if it’s proven to you, it’s proven to that guy over there”
I think you are overinterpreting what ‘proven’ means. Maybe many people do this, but that’s not at all the way I look at it. I don’t think ‘proving something’ is a social statement, only ‘proving something to X’. When you prove something you don’t automatically prove it to everybody.
“Everyone should believe it, because it’s proven.”
Again, maybe people use it that way, but that’s just too much. Nobody needs to believe something just because it’s proven. When you prove something you must make several assumptions. If someone doesn’t agree with the assumptions, he/she doesn’t have to agree with the conclusion.
Furthermore, in order to prove something to someone, you have to make this person understand the proof. No one needs to believe what he doesn’t understand. And even if this person understands the proof, I think it’s too much to ask for everybody to be a rational thinker and accept the rules of logic.
“So until you get everyone else to say X can be done, you can’t do X?”
No, I didn’t want to imply that at all. What I want to say is that if:
(1) person A wants to do X,
(2) person B proves that X can’t be done, and
(3) person A accepts the proof, thereby accepting that X can’t be done,
then person A would be irrational to try doing X anyway.
And with this I don’t want to imply that being irrational is wrong or that no one should be irrational.
“What theorem did you use to determine Godel proved such?”
I don’t fully understand the proof, so in a strict sense of “knowing” I don’t know that the statement I gave is true; it was based on the assumption that I could trust the people that wrote/talked about it.
Oh, god, scientists are afraid of math as well:
http://www.globalpost.com/dispatch/news/science/120626/scientists-are-afraid-math-too-says-study
Well, it’s official. We’re fucked, the future now left to the evopsych “just so” storytellers….
4x – 25 = 1
BOO!
AHHHHH! Whew, nearly gave me a heart attack. 😉
But isn’t also every real number associated with a symbol?
Heh, was thinking about that myself. Theorems, on the basis that they can be proven via programs, can be represented as zeroes and ones. But the set of binary strings is uncountable, via the diagonalization argument itself.
So, yeah, it does seem the set of all theorems by its very nature is uncountable, without recourse to the uncountable nature of power sets.
Meh, it’s been quite a while for me. I’m pretty sure I’m the wrong person to ask.
It just seemed to me that he came to the right conclusion from a wrong premise, but okay…that’s enough of those horror stories for me. Only the thought of this lets me sleep at night:
Möbelixman keeps away all those evil numbers.
I agree, he doesn’t seem to justify that theorems are countably infinite. Aren’t the number of relations between N elements always countable, and by induction (bear with me, it’s been a while) you can posit that N+1 elements are thus countable?
Yeah, it would have been helpful if he had proven that the number of theorems must be countably infinite.
I don’t think that the number of relations is always countable, if you have infinite elements.
But if you have a non-infinite number of elements, then the number of relations must be countable.
If you are dealing with infinity, you often get to quite paradoxical thoughts. I’m not sure all of this makes that much sense.
Apart from that, it was a nice article about this kind of thinking 🙂
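To pin the relations point down (my formulation, not anything from the article): a relation on a set A is just a subset of A × A, so the finite case stays finite while the countably infinite case is already uncountable.

```latex
% Relations on A are subsets of A x A.
\[
|A| = n \ \Rightarrow\ \#\{\text{relations on } A\} = 2^{\,n^{2}}
\quad \text{(finite)},
\]
\[
A = \mathbb{N} \ \Rightarrow\
|\mathcal{P}(\mathbb{N} \times \mathbb{N})| = |\mathcal{P}(\mathbb{N})| = 2^{\aleph_0}
\quad \text{(uncountable, by Cantor)}.
\]
```

Which is also why induction over finite cases never reaches the infinite one – the “N, then N+1” step only ever covers finite sets.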
One of the people I badgered to read the book responded that even though Timothy Taylor is trying his hardest to seize the “authentic voice” while you treat it as a gimmick, you managed to pull it off much better than he did. That made me laugh.
At the same time though, the reviewer felt the opening sections weren’t attention grabbing or vivid enough for her to sustain her interest. I suppose even a parodic depiction of the quotidian can skate too close to the line, and come off as too grimy even for those with an academic bent. But then again she also loves LoTR and doesn’t think too highly of Taylor, so maybe that was to be expected.
Interesting. I think I’m so in love with that little hockey vignette that I have a hard time understanding how anyone could fail to be gripped! But there is a lot of tobacco field exposition, I suppose…
I’ve exchanged a couple of emails with Fawcett by the way, and am loving up Cambodia!
Not being much of a hockey fan, my reaction to that passage was “haha I remember seeing that on the news. This sure is a very Canadian start to the novel.” But then I forgot about it. Only after you mentioned it was significant for interpreting the ending did I go back and reread it. Initially I didn’t recall the passage even while, two years later, Dylan was talking about justice in the final chapters.
Nothing wrong with the vignette, but it’s not midnight croquet.