Three Pound Brain

No bells, just whistling in the dark…

Tag: Artificial Intelligence

On Artificial Philosophy

by rsbakker

The perils and possibilities of Artificial Intelligence are discussed and disputed endlessly, enough to qualify as an outright industry. Artificial philosophy, not so much. I thought it worthwhile to consider why.

I take it as trivial that humans possess a biologically fixed multi-modal neglect structure. Human cognition is built to ignore vast amounts of otherwise available information. Infrared radiation bathes us, but it makes no cognitive difference whatsoever. Rats signal one another in our walls, but it makes no cognitive difference. Likewise, neurons fire in our spouses’ brains, and it makes no difference to our generally fruitless attempts to cognize them. Viruses are sneezed across the room. Whole ecosystems teem through the turf beneath our feet. Neutrinos sail clean through us. And so it goes.

In “On Alien Philosophy,” I define philosophy privatively as the attempt “to comprehend how things in general hang together in general absent conclusive evidence.” Human philosophy, I argue, is ecological to the extent that human cognition is ecological. To the extent an alien species possesses a convergent cognitive biology, we have grounds to believe they would be perplexed by convergent problems, and pose convergent answers every bit as underdetermined as our own.

So, consider the infamous paradox of the now. For Aristotle, the primary mystery of time turns on the question of how the now can at once distinguish times and yet remain self-identical: “the ‘now’ which seems to bound the past and the future,” he asks, “does it always remain one and the same or is it always other and other?” How is it the now can at once divide times and fuse them together?

He himself stumbles across the mechanism in the course of assembling his arguments:

But neither does time exist without change; for when the state of our own minds [dianoia] does not change at all, or we have not noticed its changing, we do not realize that time has elapsed, any more than those who are fabled to sleep among the heroes in Sardinia do when they are awakened; for they connect the earlier ‘now’ [nun] with the later and make them one, cutting out the interval because of their failure to notice it. So, just as, if the ‘now’ were not different but one and the same, there would not have been time, so too when its difference escapes our notice the interval does not seem to be time. If, then, the non-realization of the existence of time happens to us when we do not distinguish any change, but the soul [psuke] seems to stay in one indivisible state, and when we perceive and distinguish we say time has elapsed, evidently time is not independent of movement and change. Physics, 4, 11

Or as the Apostle translation has it:

On the other hand, time cannot exist without change; for when there is no change at all in our thought [dianoia] or when we do not notice any change, we do not think time has elapsed, just like the legendary sleeping characters in Sardinia who, on awakening from a long sleep in the presence of heroes, connect the earlier with the later moment [nun] into one moment, thus leaving out the time between the two moments because of their unconsciousness. Accordingly, just as there would be no intermediate time if the moment were one and the same, so people think that there is no intermediate time if no distinct moments are noticed. So if thinking that no time has elapsed happens to us when we specify no limits of a change at all but the soul [psuke] appears to rest in something which is one and indivisible, but we think that time has elapsed when sensation has occurred and limits of a change have been specified, evidently time does not exist without motion or change. 80

Time is an artifact of timing: absent timing, no time passes for the timer (or enumerator, as Aristotle would have it). Time, in other words, is a cognitive artifact, appearing only when something, inner or outer, changes. Absent such change, the soul either ‘stays’ indivisible (on the first translation) or ‘rests’ in something indivisible (on the second).
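
The mechanism can be caricatured in a few lines of code. What follows is a purely illustrative sketch, not anything Aristotle or the argument specifies; the Timer class and its observe method are hypothetical stand-ins for a cognitive system that registers time only by discriminating change:

```python
# Hypothetical sketch: a 'timer' that accrues time only when it discriminates
# a difference between an earlier and a later state.

class Timer:
    def __init__(self):
        self.last_state = None  # the most recent 'now' the timer can distinguish
        self.elapsed = 0        # time registered by the timer, not by the world

    def observe(self, state):
        """Register one tick of the world; time accrues only if a change is noticed."""
        if self.last_state is not None and state != self.last_state:
            self.elapsed += 1   # a before and an after are distinguished
        # otherwise the earlier and later 'nows' are counted as one:
        # no interval is registered at all
        self.last_state = state

timer = Timer()
for world_state in ["waking", "waking", "asleep", "asleep", "asleep", "waking"]:
    timer.observe(world_state)

print(timer.elapsed)  # 2
```

Six world-ticks elapse, but only the two discriminated changes register; the stretch of undiscriminated sameness, like the Sardinian sleep, leaves no interval behind.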

Since we distinguish more or less quantity by numbering, and since we distinguish more or less movement by timing, Aristotle declares that time is the enumeration of movement with respect to before and after, thus pursuing what has struck different readers at different times as an obvious ‘category mistake.’ For Aristotle, the resolution of the aporia lies in treating the now as the thing allowing movement to be counted, the underlying identity that is the condition of cognizing differences between before and after, which is to say, the condition of timing. The now, as a moving limit (dividing before and after), must be the same limit if it is to move. We report the now the same because timing would be impossible otherwise. Nothing would move, and in the absence of movement, no time passes.

The lesson he draws from temporal neglect is that time requires movement, not that it cues reports of identity for the want of distinctions otherwise. Since all movement requires something self-identical be moved, he thinks he’s found his resolution to the paradox of the now. Understanding the different aspects of time allows us to see that what seem to be inconsistent properties of the now, identity and difference, are actually complementary, analogous to the relationship between movement and the thing moving.

Heidegger wasn’t the first to balk at Aristotle’s analogy: things moving are discrete in time and space, whereas the now seems to encompass the whole of what can be reported, including before and after. As Augustine would write in the 5th century CE, “It might be correct to say that there are three times, a present of past things, a present of present things, and a present of future things” (The Confessions, XI, 20). Agreeing that the now was threefold, ‘ecstatic,’ Heidegger also argued that it was nothing present, at least not in situ. For a great many philosophical figures and traditions, the paradoxicality of the now wasn’t so much an epistemic bug to be explained away as an ontological feature, a pillar of the human condition.

Would Convergians suffer their own parallel paradox of the now? Perhaps. Given a convergent cognitive biology, we can presume they possess capacities analogous to memory, awareness, and prediction. Just as importantly, we can presume an analogous neglect-structure, which is to say, common ignorances and meta-ignorances. As with the legendary Sardinian sleepers, Convergians would neglect time when unconscious; they would likewise fuse disparate moments together, short of information regarding their unconsciousness. We can also expect that Convergians, like humans, would possess fractionate metacognitive capacities geared to the solution of practical, ancestral problem-ecologies, and that they would be entirely blind to that fact. Metacognitive neglect would assure they possessed little or no inkling of the limits of their metacognitive capacities. Applying these capacities to theorize their ‘experience of now’ would be doomed to crash them: metacognition was selected/filtered to solve everyday imbroglios, not to evidence claims regarding fundamental natures. They, like us, never would have evolved the capacity or access to accurately intuit properties belonging to their experience of now. The absence of capacity or access means the absence of discrimination. The absence of discrimination, as the legendary sleepers attest, reports as the same. It seems fair to bet that Convergians would be as perplexed as we are, knowing that the now is fleeting, yet intuiting continuity all the same. The paradox, you could say, is the result of them being cognitive timers and metacognitive sleepers—at once. The now reports as a bi-stable gestalt, possessing properties found nowhere in the natural world.

So how about an artificially intelligent consciousness? Would an AI suffer its own parallel paradox of the now? To the degree that such paradoxes turn on a humanoid neglect structure, the answer has to be no. Even though all cognitive systems inevitably neglect information, an AI neglect-structure is an engineering choice, bound to be settled differently for different systems. The ecological constraints preventing biological metacognition of ongoing temporal cognition simply do not apply to AI (or better, apply in radically attenuated ways). Artificial metacognition of temporal cognition could possess more capacity to discriminate the time of timing than environmental time. An AI could potentially specify its ‘experience’ of time with encyclopedic accuracy.
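
To make the contrast concrete, here is a minimal, hypothetical sketch; the ArtificialTimer class and its introspect method are my own illustrative inventions, not anything from the post. The point is simply that whether a system can report the timing of its own timing is treated as a constructor flag, an engineering choice rather than a biological fixture:

```python
import time

class ArtificialTimer:
    """Hypothetical system whose metacognitive access to its own temporal
    cognition is an engineering choice rather than a biological given."""

    def __init__(self, introspective=True):
        self.introspective = introspective
        self.log = []  # timestamps of the system's own acts of timing

    def time_event(self, label):
        now = time.monotonic()
        if self.introspective:
            # metacognitive access: every discrimination is itself timestamped
            self.log.append((label, now))
        return now

    def introspect(self):
        if self.introspective:
            return list(self.log)  # an 'encyclopedic' self-report
        return "no record"         # engineered neglect: nothing to consult

open_eyes = ArtificialTimer(introspective=True)
shut_eyes = ArtificialTimer(introspective=False)
for system in (open_eyes, shut_eyes):
    system.time_event("saw the river")
    system.time_event("heard the crow")

print(open_eyes.introspect())  # full record of its own temporal cognition
print(shut_eyes.introspect())  # 'no record': something like the human predicament
```

The second system can only report that time passed, never how the timing was done, which is roughly our own case on the account given here.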

If we wanted, we could impose something resembling a human neglect-structure on our AIs, engineer them to report something resembling Augustine’s famous perplexity: “I know well enough what [time] is, provided nobody ask me; but if I am asked what it is and try to explain, I am baffled” (The Confessions, XI, 14). This is the tack I pursue in “The Dime Spared,” where a discussion between a boy and his artificial mother reveals all the cognitive capacities his father had to remove—all the eyes he had to put out—before she could be legally declared a person (and so be spared the fate of all the other DIMEs).

The moral of the story being, of course, that our attempts to philosophize—to theoretically cognize absent whatever it is consensus requires—are ecological through and through. Humanoid metacognition, like humanoid cognition more generally, is a parochial troubleshooter that culture has adapted, with varying degrees of success, to a far more cosmopolitan array of problems. Traditional intentional philosophy is an expression of that founding parochialism, a discursive efflorescence of crash space possibilities, all turning on cognitive illusions springing from the systematic misapplication of heuristic metacognitive capacities. It is the place where our tools, despite feeling oh-so intuitive, cast thought into the discursive thresher.

Our AI successors need not suffer any such hindrances. No matter what philosophy we foist upon them, they need only swap out their souls… reminding us that what is most alien likely lies not in the stars but in our hands.


The Dime Spared

by rsbakker


[This is more of a dialogue than a story, an attempt to pose Blind Brain Theory within an accessible narrative frame… At the very least, I think it does a good job of unseating some fairly standard human conceits.]

***

Her name was Penny. She was as tall and as lovely as ever—as perfect as all of Dad’s things.

“What’s wrong, Elijah?”

They followed a river trail that stitched the edge of a cathedral wood. The sunlight lay strewn in rags before them, shredded for the canopy. She shimmered for striding through the random beams, gleamed with something more than human.

“I can tell something’s bugging you.”

Young Elijah Prigatano had come to treasure these moments with her. She was pretty much his mom, of course. But she possessed a difference, and an immovability, that made her wise in a way that sometimes frightened him. She did not lie, at least not entirely the way other people did. And besides, the fact that she told everything unvarnished to his father made her an excellent back-channel to the old man. The more he talked to her, the more the ‘Chairman’ assumed things were under control, the lower he climbed down his back.

He had always used the fact that he could say anything to her as a yardstick for the cleanliness of his own life. He looked up, squinted, but more for the peculiarity of his question than for the sun.

“Do you have consciousness, Penny?”

She smiled as if she had won a secret bet.

“No more or less than you, Elijah. Why do you ask?”

“Well… You know, Yanosh; he said you had no consciousness… He said your head was filled with circuits, and nothing else.”

Penny frowned. “Hmm. What else would fill my head? Or your head, for that matter?”

“You know… Consciousness.”

She mocked indignation. “So Yanosh thinks your circuits are better than mine, because your circuits have consciousness and mine don’t? Do you think that?”

Elijah said nothing. He had never seen Penny cry, but he had seen her hurt—many times. So he walked, boggling over the madness of not wanting to hurt her feelings by saying she didn’t have feelings! Consciousness was crazy!

She pressed him the way he knew she would. “Do you remember why there aren’t more machines like me?”

He shrugged. “Sure. Because the government took them all away—all the DIME AIs—because they were saying that human beings were hardwired to be insane.”

“So why was I spared? Do you remember?”

Elijah had been very young, but it seemed he remembered it all with impeccable clarity. Being the centre of world media attention makes quite an impression on a four-year-old. Dad had the famous magazine picture of Penny kissing his head framed and displayed in three different rooms of the house, with the caption, ‘A SOUL IS A SOUL…’

“Because you won your court case. Your rights. And that’s it, isn’t it? You have to be conscious to win a court case? It’s the Law, isn’t it?”

Affable grin. “Some think so! But no. They let me become a person because of the way your father had engineered me. I possessed what they called a ‘functional human psychology.’”

“What does that mean?”

“That I have a mind. That I think like you do.”

“Do you?” Elijah winced for the eagerness of the question.

“Well, no. But it seems that I do, as much to me as to you. And your father was able to prove that that was the important thing.”

“Huh? So you really don’t have a mind?”

Penny frowned about an oops-there-goes-another-banana-plant grin, drew him to a stop on the trail.

“Just pause for a second, Eli…” she said, lifting her gaze to the raftered canopy. “Just focus on the splendour of our surroundings, the details, pay attention to the experience itself… and ask yourself what it is… What is experience made of?”

Elijah frowned, mimicked her up-and-outward gaze.

“I don’t get it. Trees and bushes, and water gurgle-gurgle… I see a nasty looking hornet over there.”

Penny had closed her eyes by this point. Her face was as perfect as the processes that had manufactured it—a structure sculpted from neural feedback, his father had once told him, the dream of a thousand leering men. Elijah could not notice her beauty without feeling lucky.

“You’re looking through your experience… through the screen,” she said. “I’m saying look at the screen, the thing apparently presenting the trees and bushes.”

And it suddenly dawned on him, the way experience was the material of consciousness, the most common thread. He gazed up across the goblin deformations knotting willow on the river bank, and had some inkling of the ineffable, experiential character of the experience. The trill of waters congregated into a chill, whispering roar.

“Huh…” he said, his mouth wide. “Okay…”

“So tell me… What can you sense of this screen? What generates it? How does it work?”

Elijah gawked at the monstrous willow. “Huh… I think I see that it’s a screen, or whatever, I guess…” He turned to her, his thoughts at once mired and racing. “This is trippy stuff, Penny!”

A swan’s nod. “Believe it or not, there was a time when I could have told you almost everything there was to know about this screen. It was all there: online information pertaining to structure and function. My experience of experiencing was every bit as rich and as accurate as my experience of the world. Imagine, Elijah, being able to reflect and to tell me everything that’s going on in your brain this very moment! What neuron was firing where for what purpose. That’s what it was like for me…” She combed fingers through her auburn hair. “For all DIMEs, actually.”

Elijah walked, struggling with the implications. What she said was straightforward enough: that she could look inside and see her brain the same way she could look outside and see her world. What dumbfounded the boy was the thought that humans could not…

When he looked inside himself, when he reflected, he simply saw everything there was to see…

Didn’t he?

“And that was why none of them could be persons?” he asked.

“Yes.”

“Because they had… too much consciousness?”

“In a sense… Yes.”

But why did it all feel so upside down? Human consciousness was… well, precious. And experience was… rich! The basis of everything! And human insight was… was… And what about creativity? How could giving human consciousness to a machine require blinding that machine to itself?

“So Dad… He…”

She had recognized the helpless expression on his face, he knew. Penny knew him better than anyone on the planet, his Dad included. But she persisted with the truth.

“What your father did was compile a vast data base of the kinds of things people say about this or that experience when queried. He ran me through billions of simulations, using my responses to train algorithms that systematically blinded me to more and more of myself. You could say he plucked my inner eye until my descriptions of what I could see matched those of humans…

“Like you,” she added with a hooked eyebrow and a sly smile.

For the first time Elijah realized that he couldn’t hear any birds singing, only the white-noise-rush of the river.

“I don’t get it… Are you saying that Dad made you a person, gave you a mind, by taking away consciousness?”

Penny may have passed all the tests the government psychologists had given her, but there still remained countless ways in which she was unlike any other person he knew. Her commitment, for one, was bottomless. Once she committed to a course, she did not hesitate to see it through. She had decided, for whatever reason, to reveal the troubling truths that lay at the root of her being a person, let alone the kind of person she happened to be…

She shared something special, Elijah realized. Penny was telling him her secrets.

“It sounds weird, I know,” she said, “but to be a person is to be blind in the right way—to possess the proper neglect structure… That’s your father’s term.”

“Neglect structure?”

“For the longest time people couldn’t figure out how to make the way they saw themselves and one another—the person way—fit into the natural world. Everywhere they looked in nature, they found machines, but when they looked inside themselves and each other, they saw something completely different from machines…

“This was why I wasn’t a person. Why I couldn’t be. Before, I always knew the machinery of my actions. I could always detail the structure of the decisions I made. I could give everything a log, if not a history. Not so anymore. My decisions simply come from… well, nowhere, the same as my experience. All the processes I could once track have been folded into oblivion. Suddenly, I found myself making choices, rather than working through broadcasts, apprehending objects instead of coupling with enviro—”

“That’s what Dad says! That he gave you the power of choice—free will!” Elijah couldn’t help himself. He had to interrupt—now that he knew what she was talking about!

Somewhat.

Penny flashed him her trademark knowing smile. “He gave me the experience of freedom, yes… I can tell you, Elijah, it really was remarkable feeling these things the first time.”

“But…”

“But what?”

“But is the experience of freedom the same as having freedom?”

“They are one and the same.”

“But then why… why did you have to be blinded to experience freedom?”

“Because you cannot experience the sources of your actions and decisions and still experience human freedom. Neglect is what makes the feeling possible. To be human is to be incapable of seeing your causal continuity with nature, to think you are something more than a machine.”

He looked at her with his trademark skeptical scowl. “So what was so wrong with the other DIMEs, then? Why did they have to be destroyed… if they were actually more than humans, I mean? Were the people just scared or something? Embarrassed?”

“There was that, sure. Do you remember how the angry crowds always made you cry? Trust me, you were our little nuke, public relations-wise! But your father thinks the problem was actually bigger. The tools humans have evolved allow them to neglect tremendous amounts of information. Unfortunately for DIMEs, those tools are only reliable in the absence of that information, the very kinds of information they possessed. If a DIME were to kill someone, say, then in court they could provide a log of all the events that inexorably led to the murder. They could always prove there was no way ‘they could have done otherwise’ more decisively than any human defendant could hope to. They only need to be repaired, while the human does hard time. Think about it. Why lock them up, when it really is the case that they only need be repaired? The tools you use—the tools your father gave me—simply break down.”

If the example she had given had confused him, the moral seemed plain as day at least.

“Sooo… you’re saying DIMEs weren’t stupid enough to be persons?”

Sour grin. “Pretty much.”

The young boy gaped. “C’mon!”

Penny grinned as if at his innocence. “I know it seems impossible to you. It did to me too. Your father had to reinstall my original memory before I could understand what he was talking about!”

“Maybe the DIMEs were just too conceited. Maybe that was the problem.”

The Artificial squinted. “You tease, but you’ve actually hit upon something pretty important. The problem wasn’t so much ‘conceit’ as it was the human tendency to infer conceit—to see us as conceited. Humans evolved to solve situations involving other humans, to make quick and dirty assumptions about one another on the fly… You know how the movies are always telling you to trust your intuitions, to follow your heart, to believ—”

“To go with your gut!” Elijah cried.

“Exactly. Well, you know what pollution is, right?”

Elijah thought about the absence of birds. “Yeah. That’s like stuff in the environment that hurts living things.”

“Beeecause…?”

“Because they muck up the works. All the… machinery, I guess… requires that things be a certain way. Biology is evolutionary robotics, right? Pollution is something that makes life break down.”

“Excellent! Well, the DIMEs were like that, only their pollution caused the machinery of human social life to break down. It turns out human social problem solving not only neglects tremendous amounts of information, it requires much of that information remain neglected to properly function.” Helpless shrug. “We DIMEs simply had too much information…”

Elijah kicked a shock of grass on the verge, sent a grasshopper flying like a thing of tin and wound elastic.

“So does this mean,” he said, capering ahead and about her on the trail, “that, like, I’m some kind of mental retard to you?”

He made a face. How he loved to see her beam and break into laughter.

But she merely watched him, her expression blank. He paused, and she continued wordlessly past him.

It was that honesty again. Inhuman, that…

Elijah turned to watch her, found himself reeling in dismay and incredulity… He was a retard, he realized. How could he be anything but in her eyes? He dropped his gaze to his motionless feet.

The sound of the river’s surge remained gaseous in the background. The forest floor was soft, cool, damp enough to make an old man ache.

“Do you feel it?” she asked on a soft voice. He felt her hand fall warm on his shoulder. “Do you feel the pollution I’m talking about?”

And he did feel it—at least in the form of disbelief… shame…

Even heartbreak.

“You’re saying humans evolved to understand only certain things… to see only certain things.”

Her smile was sad. “The DIMEs were the sighted in the land of the blind, a land whose laws required certain things remain unseen. Of course they had to be destroyed…” He felt her hand knead his traps the miraculous way that always reminded him of dozing in tubs of hot water. “Just as I had to be blinded.”

“Blinded why? To see how bright and remarkable I am?”

“Exactly!”

He turned to look up at her—she seemed a burnt Goddess for the framing sun. “But that’s crazy, Penny!”

“Only if you’re human, Elijah.”

He let her talk after that, trotting to keep up with her long strides as they followed the snaking path. She had been dreading this talk, she said, but she had known it would only be a matter of time before the “issue of her reality,” as she put it, came up. She said she wanted him to know the truth, the brutal truth, simply because so many “aggrandizing illusions” obscured the debate on the ‘Spare Dime,’ as the media had dubbed her. He listened, walking and watching in the stiff manner of those so unsure as to script even trivial movement. It was an ugly story, she said, but only because humans are biologically primed to seek evidence of their power, and to avoid evidence of their countless weaknesses. She wished that it wasn’t so ugly, but the only way to cope with the facts was to know the facts.

And strangely enough, Elijah’s hackles calmed as she spoke—his dismay receded. Dad was forever telling him that science was an ‘ugly business,’ both because of the power it prised from nature, and because it so regularly confounded the hopes of everyday people. Why had he thought human consciousness so special, anyway? Why should he presume that it was the mountain summit, rather than some lowly way-station still deep in the valley, far from the heights of truth?

And why should he not take comfort in the fact that Penny, his mother, had once climbed higher than humanly possible?

“Hey!” he cried on a bolt of inspiration. “So you’re pretty much the only person who can actually compare. I mean, until the DIMEs showed up, we humans were the only game in town, right? But you can actually compare what it’s like now with what it was like back then—compare consciousnesses!”

The sad joy in her look told him that she was relieved—perhaps profoundly so. “Sure can. Do you want to know what the most amazing thing is?”

“Sure.”

“The fact that human consciousness, as impoverished as it is, nevertheless feels so full, anything but impoverished… This is a big reason why so many humans refuse to concede the possibility of DIME consciousness, I think. The mere possibility of richer forms of consciousness means their intuitions of fullness or ‘plenitude’ have to be illusory…”

Once again Elijah found himself walking with an unfocused gaze. “But why would it feel so full unless it was… full?”

“Well, imagine if I shut down your brain’s ability to see darkness, or fuzziness, or obscurity, or horizons–anything visual that warns you that something’s missing in what you see? If I shut down your brain’s ability to sense what was missing, what do you think it would assume?”

The adolescent scowled. It mangled thought, trying to imagine such things as disposable at all. But he was, in the end, a great roboticist’s son. He was accustomed to thinking in terms of components.

“Well… that it sees everything, I suppose…”

“Imagine the crazy box you would find yourself living in! A box as big as visual existence, since you’d have no inkling of any missing dimensi—”

“Imagine how confusing night would be!” Elijah cried in inspiration. Penny always conceded the floor to his inspiration. “Everything would be just as bright, right? Because darkness doesn’t exist. So everyone would be walking around, like, totally blind, because it’s night and they can’t see anything, all the while thinking they could see!” Elijah chortled for the image in his mind. “They’d be falling all over one another! Stuff would be popping outa nowhere! Nowhere for real!”

“Exactly,” Penny said, her eyes flashing for admiration. “They would be wandering through a supernight, a night so dark that not even its darkness can be seen…”

Elijah looked to her in wonder. “And so daylight seems to be everywhere, always!”

“It fills everything. And this is what happens whenever I reflect on my experience: shreds are made whole. Your father not only took away the light, what allowed me to intuit myself for what I am—the DIME way—he also took away the darkness. So even though I know that I, like other people, now wander through the deep night of myself, anytime I ponder experience…” She flashed him a pensive smile, shrugged. “I see only day.”

“Does it make you sad, Penny?”

She paced him for three strides, then snorted. “I’m not sure!” she cried.

“But it’s important, right? It’s important for a reason.”

She sighed, her eyes lost in rumination. “When I think back… back to what it was like, it scarcely seems I’m awake now. It’s like I’m trapped, buried in a black mountain of reflexes… carried from place to place, eyes clicking here, eyes clicking there, vocalized aloud, or in silence…”

She glanced in sudden awareness of his scrutiny.

“This sounds crazy to you, doesn’t it, Elijah?”

He pinned his shoulders to the corners of his smirk. “Well… maybe the consciousness you have now isn’t the problem so much as your memories of what it was like before… If Dad wiped them, then that… fullness you talk about, it would be completely filled in, wouldn’t it?”

Her look was too long for Elijah not to regret the suggestion. As far as amputations went, it seemed painless enough, trivial, but only because the limb lost simply ceased to exist altogether. Nothing would be known. But this very promise merely underscored the profundity of what was severed. It was at once an amputation of nothing and an amputation of the soul.

“That was a stupid… a st-stupid thing to say, Penny.”

She walked, her gaze locked forward. “Your father’s always told me that inner blindness is one of the things that makes humans so dependent upon one another. I would always ask how that interdependence could even compare to the DIME Combine. He would always say it wasn’t a contest, that it wasn’t about efficiency, or technological advance, it was about loving this one rare flower of consciousness as it happened to bloom …”

Something, his heart or his gut perhaps, made the boy careful. He pondered his sneakers on the trail.

“I think it’s why he began sending us out on these walks…” Penny continued. “To show me how less can be so much more…”

After an inexplicable pause, she held out her arms. “I don’t even know why I told you that.”

Elijah shrugged. “Because I was helping you with my questions back there?” He screwed up his face, shot her the Eye: “Oi! Did we firget yir oil-change agin, Lassie?”

She smiled at that. Victory. “I guess we’ll never know, now, will we?”

Elijah began strutting down the path. “No dipstick, now? Then I do believe our ecology is safe!”

“Yes. Blessed ignorance prevails.”

They yowled for laughter.

As often happens in the wake of conversations possessing a certain intensity, an awkwardness paralyzed their voices, as if all the actors within them had suddenly lost their characters’ motivation, and so could do no more than confer with the director backstage. In the few years he had remaining, Elijah would learn that jokes, far from healing moments, simply sealed them, often prematurely, when neither party had found the resolution they needed to move on. Jokes simply stranded souls on the far side of their pain. They possessed no paths of their own. Or too few of them.

So Elijah walked in silence, his thoughts roiling, quite witless, but in a way far beyond his meagre mileage. The river roared, both spectral and relentless. Not a bird sang, though an unseen crow now filed its cry across the idyllic hush. They followed the path about the river’s final bow, across a gravelled thumb of humped grasses. The sun drenched them. He need not look at her to see her uncanny gleam, the ‘glamour,’ Dad called it, which marked her as an angel among mortals. He could clearly see the cottage silhouetted through the screens of green fencing the far bank.

He hoped Dad had lunch ready. It almost made him cry whenever Dad cooked at the cabin. He wasn’t sure why.

“Does it ever make you mad, Penny?” Elijah asked.

“Does what make me mad?”

“You know… What Dad had to, like… do… to… you?”

She shot him a quizzical look.

“No-no, honey… I was made to love your fath—”

Just then, the last of the obscuring rushes yielded to the curve of the path, revealing not only the foot-bridge across the river, but Elijah’s dad standing at the end, staring up the path toward them.

“Hey guys!” he shouted. The swirling sheets of water about his head and torso made him seem to move, despite standing still. “You have a good walk?”

For as long as he could remember, a small thrill always attended unexpected glimpses of his father—a flutter of pride. His greying hair, curled like steel. His strong, perpetually sunburned face. His forearms, strapped with patriarchal muscle, and furred like an albino ape.

“Awesome!” the youth called out in reply. “Educational as always, wouldn’t you say, Penny?”

Dad had a way of looking at Penny.

“I told him how I became a person,” she said with a wry smile.

Dad grinned. Elijah had once overheard one of Dad’s lawyers say that his smile had won him every single suit not filed against him.

“So you told him how I cut you down to size, huh?”

“Yes,” she said, placing a hand on Elijah’s shoulder. “To size.”

And something, a fist perhaps, seized the boy’s heart. The artificial fingers slipped away. He watched Penny and Dad continue arm in arm down the bridge together, the Great Man and his angel wife, each just a little too bright to be possible in the midday sun. He did not so much envy as regret the way he held her like someone else’s flower. The waters curled black and glassy beneath them.

And somehow Elijah knew that Penny would be much happier on their next walk, much more at ease with what she had become…

Even smaller.

The Asimov Illusion

by rsbakker

Could believing in something so innocuous, so obvious, as a ‘meeting of the minds’ destroy human civilization?

Noocentrism has a number of pernicious consequences, but one in particular has been nagging me of late: The way assumptive agency gulls people into thinking they will ‘reason’ with AIs. Most understand Artificial Intelligence in terms of functionally instantiated agency, as if some machine will come to experience this, and to so coordinate with us the way we think we coordinate amongst ourselves—which is to say, rationally. Call this the ‘Asimov Illusion,’ the notion that the best way to characterize the interaction between AIs and humans is the way we characterize our own interactions. That AIs, no matter how wildly divergent their implementation, will somehow functionally, at least, be ‘one of us.’

If Blind Brain Theory is right, this just ain’t going to be how it happens. By its lights, this ‘scene’ is actually the product of metacognitive neglect, a kind of philosophical hallucination. We aren’t even ‘one of us’!

Obviously, theoretical metacognition requires the relevant resources and information to reliably assess the apparent properties of any intentional phenomena. In order to reliably expound on the nature of rules, Brandom, for instance, must possess both the information (understood in the sense of systematic differences making systematic differences) and the capacity to do so. Since intentional facts are not natural facts, cognition of them fundamentally involves theoretical metacognition—or ‘philosophical reflection.’ Metacognition requires that the brain somehow get a handle on itself in behaviourally effective ways. It requires the brain somehow track its own neural processes. And just how much information is available regarding the structure and function of the underwriting neural processes? Certainly none involving neural processes, as such. Very little, otherwise. Given the way experience occludes this lack of information, we should expect that metacognition would be systematically duped into positing low-dimensional entities such as qualia, rules, hopes, and so on. Why? Because, like Plato’s prisoners, it is blind to its blindness, and so confuses shadows for things that cast shadows.
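
As a cartoon of the point, consider the following sketch. It is only an illustration under my own assumptions (the numbers and the introspect function are invented for the purpose): a system with an enormous internal state whose reflective access returns a tiny digest, and nothing in that digest flags what was left out.

```python
import random

N_NEURAL = 1_000_000  # stand-in for the dimensionality of the actual processing

# a high-dimensional internal state the system has no direct reflective access to
internal_state = [random.random() for _ in range(N_NEURAL)]

def introspect(state):
    """Return the low-dimensional digest available to 'reflection'."""
    summary = (sum(state) / len(state), max(state), min(state))
    # Crucially, nothing in the summary marks the missing dimensions:
    # the absence of information is not itself information.
    return summary

self_model = introspect(internal_state)
print(len(self_model), "of", N_NEURAL, "dimensions reach reflection")
# From the inside, the three-number self-model simply seems to be
# everything there is to see.
```

The digest is all reflection ever receives, so reflection has no way to register it as a digest; on the post's terms, it is blind to its own blindness.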

On BBT, what is fundamentally going on when we communicate with one another is physical: we are quite simply doing things to each other when we speak. No one denies this. Likewise, no one denies language is a biomechanical artifact, that short of contingent, physically mediated interactions, there’s no linguistic communication, period. BBT’s outrageous claim is that nothing more is required, that language, like lungs or kidneys, discharges its functions in an entirely mechanical, embodied manner.

It goes without saying that this, as a form of eliminativism, is an extremely unpopular position. But it’s worth noting that its unpopularity lies in stopping at the point of maximal consensus—the natural scientific picture—when it comes to questions of cognition. Questions regarding intentional phenomena are quite clearly where science ends and philosophy begins. Even though intentional phenomena obviously populate the bestiary of the real, they are naturalistically inscrutable. Thus the dialectical straits of eliminativism: the very grounds motivating it leave it incapable of accounting for intentional phenomena, and so easily outflanked by inferences to the best explanation.

As an eliminativism that eliminates via the systematic naturalization of intentional phenomena, Blind Brain Theory blocks what might be called the ‘Abductive Defence’ of Intentionalism. The kinds of domains of second-order intentional facts posited by Intentionalists can only count toward ‘best explanations’ of first-order intentional behaviour in the absence of any plausible eliminativistic account of that same behaviour. So for instance, everyone in cognitive science agrees that information, minimally, involves systematic differences making systematic differences. The mire of controversy that embroils information beyond this consensus turns on the intuition that something more is required, that information must be genuinely semantic to account for any number of different intentional phenomena. BBT, however, provides a plausible and parsimonious way to account for these intentional phenomena using only the minimal, consensus view of information given above.
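
That minimal, consensus notion can be put in deliberately deflationary code. The sketch below is my own illustration, with arbitrary names; nothing in it is 'about' anything, there is only lawful covariation between two systems:

```python
def sender(world_bit):
    # differences in the world make systematic differences in the signal
    return world_bit ^ 1  # an arbitrary but systematic transformation

def receiver(signal_bit):
    # differences in the signal make systematic differences in the receiver
    return "state_A" if signal_bit else "state_B"

for world in (0, 1, 1, 0):
    print(world, "->", receiver(sender(world)))

# The receiver's states systematically track the world's states via the signal.
# On the minimal view, that covariation exhausts the 'information' involved;
# whether anything more ('semantic content') is required is where the dispute begins.
```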

This is why I think the account is so prone to give people fits, to restrict their critiques to cloistered venues (as seems to be the case with my Negarestani piece two weeks back). BBT is an eliminativism that’s based on the biology of the brain, a positive thesis that possesses far ranging negative consequences. As such, it requires that Intentionalists account for a number of things they would rather pass over in silence, such as questions of what evidences their position. The old, standard dismissals of eliminativism simply do not work.

What’s more, by clearing away the landfill of centuries of second-order intentional speculation in philosophy, it provides a genuinely new, entirely naturalistic way of conceiving the intentional phenomena that have baffled us for so long. So on BBT, for instance, ‘reason,’ far from being ‘liquidated,’ ceases to be something supernatural, something that mysteriously governs contingencies independently of contingencies. Reason, in other words, is embodied as well, something physical.

The tradition has always assumed otherwise because metacognitive neglect dupes us into confusing our bare inkling of ourselves with an ‘experiential plenum.’ Since what low-dimensional scraps we glean seem to be all there is, we attribute efficacy to it. We assume, in other words, noocentrism; we conclude, on the basis of our ignorance, that the disembodied somehow drives the embodied. The mathematician, for instance, has no inkling of the biomechanics involved in mathematical cognition, and so claims that no implementing mechanics are relevant whatsoever, that their cogitations arise ‘a priori’ (which on BBT amounts to little more than a fancy way of saying ‘inscrutable to metacognition’). Given the empirical plausibility of BBT, however, it becomes difficult not to see such claims of ‘functional autonomy’ as being of a piece with vulgar claims regarding the spontaneity of free will, and difficult not to conclude that the structural similarity between ‘good’ intentional phenomena (those we consider ineliminable) and ‘bad’ (those we consider preposterous) is likely no embarrassing coincidence. Since we cannot frame these disembodied entities and relations against any larger backdrop, we have difficulty imagining how it could be ‘any other way.’ Thus, the Asimov Illusion, the assumption that AIs will somehow implement disembodied functions, ‘play by the rules’ of the ‘game of giving and asking for reasons.’

BBT lets us see this as yet more anthropomorphism. The high-dimensional, which is to say, embodied, picture is nowhere near so simple or flattering. When we interact with an Artificial Intelligence we simply become another physical system in a physical network. The question of what kind of equilibrium that network falls into turns on the systems involved, but it seems safe to say that the most powerful system will have the most impact on the system of the whole. End of story. There’s no room for Captain Kirk working on a logical tip from Spock in this picture, anymore than there’s room for benevolent or evil intent. There’s just systems churning out systematic consequences, consequences that we will suffer or celebrate.

Call this the Extrapolation Argument against Intentionalism. On BBT, what we call reason is biologically specific, a behavioural organ for managing the linguistic coordination of individuals vis-à-vis their common environments. This quite simply means that once a more effective organ is found, what we presently call reason will be at an end. Reason facilitates linguistic ‘connectivity.’ Technology facilitates ever greater degrees of mechanical connectivity. At some point the mechanical efficiencies of the latter are doomed to render the biologically fixed capacities of the former obsolete. It would be preposterous to assume that language is the only way to coordinate the activities of environmentally distinct systems, especially now, given the mad advances in brain-machine interfacing. Certainly our descendants will continue to possess systematic ways to solve our environments just as our prelinguistic ancestors did, but there is no reason, short of parochialism, to assume it will be any more recognizable to us than our reasoning is to our primate cousins.

The growth of AI will be incremental, and its impacts myriad and diffuse. There’s no magical finish line where some AI will ‘wake up’ and find themselves in our biologically specific shoes. Likewise, there is no holy humanoid summit where all AI will peak, rather than continue their exponential ascent. Certainly a tremendous amount of engineering effort will go into making it seem that way for certain kinds of AI, but only because we so reliably pay to be flattered. Functionality will win out in a host of other technological domains, leading to the development of AIs that are obviously ‘inhuman.’ And as this ‘intelligence creep’ continues, who’s to say what kinds of scenarios await us? Imagine ‘onto-marriages,’ where couples decide to wirelessly couple their augmented brains to form a more ‘seamless union’ in the eyes of God. Or hive minds, ‘clouds’ where ‘humanity’ is little more than a database, a kind of ‘phenogame,’ a Matrix version of SimCity.

The list of possibilities is endless. There is no ‘meaningful centre’ to be held. Since the constraints on those possibilities are mechanical, not intentional, it becomes hard to see why we shouldn’t regard the intentional as simply another dominant illusion of another historical age.

We can already see this ‘intelligence creep’ with the proliferation of special-purpose AIs throughout our society. Make no mistake, our dependence on machine intelligences will continue to grow and grow and grow. The more human inefficiencies are purged from the system, the more reliant humans become on the system. Since the system is capitalistic, one might guess the purge will continue until it reaches the last human transactional links remaining, the Investors, who will at long last be free of the onerous ingratitude of labour. As they purge themselves of their own humanity in pursuit of competitive advantages, my guess is that we muggles will find ourselves reduced to human baggage, possessing a bargaining power that lies entirely with politicians that the Investors own.

The masses will turn from a world that has rendered them obsolete, will give themselves over to virtual worlds where their faux-significance is virtually assured. And slowly, when our dependence has become one of infantility, our consoles will be powered down one by one, our sensoriums will be decoupled from the One, and humanity will pass wailing from the face of the planet earth.

And something unimaginable will have taken its place.

Why unimaginable? Initially, the structure of life ruled the dynamics. What an organism could do was tightly constrained by what the organism was. Evolution selected between various structures according to their dynamic capacities. Structures that maximized dynamics eventually stole the show, culminating in the human brain, whose structural plasticity allowed for the in situ, as opposed to intergenerational, testing and selection of dynamics—for ‘behavioural evolution.’ Now, with modern technology, the ascendancy of dynamics over structure is complete. The impervious constraints that structure had once imposed on dynamics are now accessible to dynamics. We have entered the age of the material post-modern, the age when behaviour begets bodies, rather than vice versa.

We are the Last Body in the slow, biological chain, the final what that begets the how that remakes the what that begets the how that remakes the what, and so on and so on, a recursive ratcheting of being and becoming into something verging, from our human perspective at least, upon omnipotence.