Interstellar Dualists and X-phi Alien Superfreaks
by rsbakker
I came up with this little alien thought experiment to illustrate a cornerstone of the Blind Brain Theory: the way systems can mistake information deficits for positive ontological properties, using a species I call the Walleyes (pronounced ‘Wally’s’):
Walleyes possess two very different visual systems, the one high-dimensional, adapted to tracking motion and resolving innumerable details, the other myopic in the extreme, adapted to resolving blurry gestalts at best, blobs of shape and colour. Both are exquisitely adapted to solve their respective problem-ecologies, however; those ecologies just happen to be radically divergent. The Walleyes, it turns out, inhabit the twilight line of a world that forever keeps one face turned to its sun. They grow in a linear row that tracks the same longitude around the entire planet, at least wherever there’s land. The high-capacity eye is the eye possessing dayvision, adapted to take down mobile predators using poisonous darts. The low-capacity eye is the eye possessing nightvision, adapted to send tendrils out to feed on organic debris. The Walleyes, in fact, have nearly a 360-degree view of their environment: only the margin of each defeats them.
The problem, however, is that Walleyes, like anemones, are a kind of animal that is rooted in place. Save for the odd storm, which blows the ‘head’ about from time to time, there is very little overlap in their respective visual fields, even though each engages (two very different halves of) the same environment. What’s more, the nightvision eye, despite its manifest myopia, continually signals that it possesses a greater degree of fidelity than the first.
Now imagine an advanced alien species introduces a virus that rewires Walleyes for discursive, conscious experience. Since their low-dimensional nightvision system insists (by default) that it sees everything there is to be seen, and their high-dimensional system, always suspicious of camouflaged predators, regularly signals estimates of reliability, the Walleyes have no reason to think heuristic neglect is a problem. Nothing signals the possibility that the problem might be perspectival (related to issues of information access and problem-solving capacity), so the metacognitive default of the Walleyes is to construe themselves as special beings that dwell on the interstice of two very different worlds. They become natural dualists…
The same way we seem to be.
Perhaps some X-phi super-aliens are snickering as they read this!
“…the nightvision eye, despite its manifest myopia, continually signals that it possesses A GREATER DEGREE OF FIDELITY than the first.” [Caps mine, since the italics from the original do not transpose to comments.]
One must first stipulate a conscious experience prior to the virus for that visual signal to be quantified in the manner you stated.
What am I missing?
That hawks can pick out their prey at great height better than, say, a shrew can pick out an insect at that height – clearly there is a greater degree of fidelity in the hawk’s vision in order to do that. It doesn’t require (our) conscious experience for that to occur.
Why so?
Doesn’t it make it even harder that they actually do live in two very different worlds? One side is drastically different from the other? Particularly if you roll up the dark side of the world into a tiny little skull-sized ball?
There still seems a black box element as to why the blurry vision provides greater fidelity than the other (in the example ‘it just does’)? I’d propose an idea that the creature’s brainpower is enough to consistently solve the blurry vision, so it feels like higher fidelity, while the other vision strains the system and the system can feel that strain (perhaps just in how much of the visual range is not being processed). However the hiccup I guess is that to ‘consistently solve’ is to solve within the parameters of the creature’s mind – the mind solves what the body sets before it. Sure, that might solve what is set before it, but what was set before it by the body might utterly ignore various predators (or suchlike!). Rather like being good at tests a particular teacher might set, but those tests really aren’t any primer for dealing with life. I’m having trouble thinking of some fiction for that at the moment – perhaps if I hadn’t stayed up late playing GTA…
Further: The external senses generally overlap – you can see something but you can also touch it/you can touch something but you can also see it, for example. These two different ways of sensing the one thing allow a sort of triangulation to occur. But with the mind, we basically have just one sense. There’s no second perspective – and so no sort of triangulation available as a result. Unless you adopt scientific investigation as a sort of prosthetic second sense! But Jim Butcher said science is just another religion, man! So why would you trust it as a prosthetic sense, man!?
I think you’re hitting on an important part of the puzzle here regarding what kind of baseline is used for ‘resolution’ or ‘fidelity.’ But I would hesitate before placing too much significance on body as opposed to mind, and rather think it through in entirely bodily terms: what solves (facilitates reproduction) is the whole behavioural, morphological, organic apparatus. We simply pick out those bits which seem to make the most significant contributions.
It seems like picking and choosing when to make distinctions and when to collapse one thing into another? Why talk about bodies or living things when we can just talk about how in a particular X,Y, Z grid co-ordinate raw physics will determine what happens next (and the only question mark there is why a universe exists at all – though that’s a big question!)?
If ‘life’ and ‘the body’ still warrant a distinction, why doesn’t ‘the mind’ warrant a distinction as much as the various symbiote creatures that live and help support a human body also warrant a distinction? None of those symbiotes know much or anything about themselves or how they work, yet they both help keep the body going and as much as the body gets distinguished from just more physics, those symbiotes are called symbiotes and not just more physics in a grid position.
It’s like you want to render it right back to raw physics, but still keep the living body distinctions going. It seems to be picking and choosing to me.
Either tear it all down, not just our little skull treehouse, or acknowledge your distinctions and the implications of those distinctions made, I think (and no, those distinctions don’t necessarily then support the common cultural notion of the mind or even my notion of the mind – they might not or do not. But you can’t have the distinction but ignore its implications when it benefits your argument).
I tried to think it through in entirely bodily terms – I instantly rammed straight into the conceit of ‘the body’ as somehow being worthy of distinction from raw physics. It didn’t disassemble any conceit about ‘the mind’, it simply made me run into another conceit about there being ‘a body’.
Does it work as conversation when we tear down certain distinctions/conceits, but treat other ones (ones very close to the topic) not as distinctions at all and ‘just there’?
I’m left with an image rather like how fairy floss turns to goop upon touching water, a figure of a person turning to goop from the inside of the skull, down – even as the rest of the body is floss. Incongruent melting.
To continue (as if I hadn’t posted enough already!): It’s like bird wings – you don’t say that’s all there is, the bird’s body. No, they engage certain qualities of aerodynamics.
So it’s a question of whether logic, i.e. AND/OR gates, is just as naturalistic as the qualities of air that grant aerodynamics.
Because the brain is like the wing – it merely engages the naturalistic logic.
Yes, just like the difference between an albatross and a hummingbird and how their bodies engage aerodynamics in very different ways because of morphology, what naturalistic logic the body engages (if at all!) is a matter of morphology. But just as much as it’s not just the bird’s wing but the wing in combination with something else, here it’s still something else engaged. Yes, morphology essentially double dips – it both decides what information/signals are fed in, and it also even decides the configuration of the structures it uses to engage logic (and so determines what logic it calls upon, if any).
And what we call logic is, I’d postulate, simply the way certain physical structures behave – like air behaves in a certain way, or water trickles this way or that, magnets attract or repel, or electricity somehow (I don’t know how) finds the shortest path to ground.
I guess you could go for how you could replicate a synaptic AND gate by using magnets to make such gates or water behaviour or other physical materials or electrical configurations, then question whether somehow the same logic exists in all of them? I.e., question whether that logic exists at all, by such an example? I just find reduction to just the body as not really making sense – it’s selective about where it’ll call something hard physics and where it’ll recognise distinctions from inanimate matter like organs/life.
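The point about realizing the same AND gate in magnets, water, or electricity can be sketched in a few lines. The following is just an illustrative toy (the three function names and the “substrates” are my own invention, not anything from the discussion): three physically different rules that nonetheless agree on the same truth table, which is the sense in which “the same logic” shows up in different materials.

```python
def and_boolean(a: bool, b: bool) -> bool:
    """Symbolic substrate: AND as a truth-functional operator."""
    return a and b

def and_threshold(a: int, b: int) -> int:
    """Neuron-like substrate: 'fire' (1) only if summed input
    reaches a threshold of 2, i.e. both inputs are active."""
    return 1 if (a + b) >= 2 else 0

def and_valve(flow_a: float, flow_b: float) -> float:
    """Water-like substrate: downstream flow is limited by the
    weaker of two inlet flows, so output is nonzero only if both are."""
    return min(flow_a, flow_b)

# All three 'substrates' agree on the AND truth table.
for a in (0, 1):
    for b in (0, 1):
        assert bool(and_boolean(bool(a), bool(b))) \
            == bool(and_threshold(a, b)) \
            == bool(and_valve(float(a), float(b)))
```

Whether that shared truth table means one logic “exists in” all three, or is just a description we lay over them, is exactly the question the comment raises.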
For what it’s worth, anyway.
This is très elegant. And I don’t think it matters much whether we apply the “c” word to the Walleyes’ content fixations. All we need to assume is that they have information-bearing states that are transparent (dark, auto-epistemically closed, etc.) like Tye’s PANIC states. Maybe consciousness is something woo involving non-physical sludge. Maybe not. It doesn’t matter, since the take-home moral is ontological, or rather, meta-ontological.
[…] shares this intersubjective phenomenological world? Could there be minds – like Scott Bakker’s anemone-like Walleyes – so alien that their phenomenological worlds don’t coincide with […]
I’m reminded of a passage from Borges’s Book of Imaginary Beings
“The other creature raised by the problem of consciousness is the ‘hypothetical animal’ of Rudolf Hermann Lotze. Lonelier than the statue that smells roses and finally becomes a man, this being has in its skin but one movable sensitive point – at the extremity of an antenna. Its structure denies it, as is obvious, more than one perception at a time. Lotze argues that the ability to retract or extend its sensitive antenna will enable this all but bereft animal to discover the external world (without the aid of the Kantian categories of time and space) and distinguish a stationary from a moving object.”
Does a google car think it is special?
It is, it drives, it makes decisions: it is not conscious, or is it? Instead of asking: “What is consciousness?” Let us ask: “Does driving entail first-person-singular?” If not, then eliminate “first-person-singular” from our vocabulary, for it seems an obsolete concept.
It has yet to crash on its own; there are only two known crashes, and both were due to human intervention (in one, the chauffeur decided to manually drive; in the other, a car hit a Google car from behind at a stop light).
So that’s what Walleyes are! But then why do they grant magical powers?
http://jojo.wikia.com/wiki/Wall_Eyes
ehehehehe
Hey Scott,
Care to cheer up my weekend with some news on Unholy Consult? The wait is almost unbearable, and you did say a few weeks ago that you might need two more weeks until you sent it out to your agent and editors? Has that happened?
I’m still hoping for a 2014 release.
-James
If a creature had hawk eyes designed to pick out rabbits from 1000 feet and shrew eyes designed to pick out beetles from six inches it would at the least need to be able to decide when to look with its hawk eyes and when to look with its shrew eyes. You might not need consciousness to have a sensory experience, but you might need consciousness to decide what kind of sensory experience you want to have. I think any creature which has the ability to focus its sensory equipment onto specific features of its environment has consciousness. After all, when we tell a child to ‘pay attention’ we’re telling him to focus his sensory equipment on certain inputs and ignore certain other inputs. If a walleye has the ability to focus its sensory equipment on a target well enough to shoot it down I’d say a walleye already has consciousness. I think what the alien virus gave the walleyes was a sort of 2nd order consciousness, the ability to think about its seeing and therefore think about its thinking. Before they received this gift (or were the victims of this prank) the walleyes had no reason and no capacity to compare their day vision to their night vision. When they look out into the night they see all the little there is to be seen. The trouble is not that there are no objects. The trouble is that there is no light. The danger is that if the night side suddenly became brilliantly illuminated the walleyes wouldn’t suddenly see better. They would go blind.
I think you have to consider reflexive motion tracking – was that a decision to see what you saw? How ‘conscious’ is that, when an automatic door does pretty much the same thing?
But just as much as you refer to the idea of 2nd order consciousness, perhaps there are scales of consciousness we could refer to for various interactions?
The danger is that if the night side suddenly became brilliantly illuminated the walleyes wouldn’t suddenly see better. They would go blind.
Wow, yeah. Well put!
Truth shines!
(sorry, couldn’t resist a quote from the books!)
http://www.partiallyexaminedlife.com/2014/02/24/precognition-90-david-brin/
I watched a David Attenborough doco the other day and they noted how mammals generally see in black and white – reptiles and birds have the great colour vision. But the interesting thing was that early primates, although mammals, managed to regain their colour vision from their reptile heritage so as to spot whether fruits are ripe.
I think it’s interesting to the topic as to why seeing something in black and white is appealing to us – it harks back to our origins. While seeing in colour is possibly to see things like a reptile does – and possibly with it, to think more like a reptile thinks.