
Thursday, 9 July 2015

Piercing perception, part 2: The plug and play brain.

Where is that damn installation disc?

I remember a time, in the not-too-distant past, when every peripheral you bought came with an installation disc containing the vital drivers required to allow the computer to make use of the device. Without them, your gadget was no more useful than a rock tethered to a computer via string (or, if the device was wireless... just a rock). Each new computer required the user to scrabble about in old boxes and search through disc spindles or CD wallets to find the right disc. Failing that, a trawl through the manufacturer's support pages was required to track down the specific software that would tell the computer how to interpret and make use of the electronic signals being sent from the input device. Nowadays, thankfully, you plug a mouse into the USB port and the computer is already installing it, ready to use in seconds. Marvellous.

If we were to consider our eyes, nose, ears, tongue and nerves as input devices, and our brain as the computer, then we already know that we have the right drivers to make sense of the signals we receive from them. However, if the brain can, as the computer does, find its own drivers, can we make use of prosthetic peripherals to enable us to sense the world in ways that have never been achieved before?

Sense and Sensing ability.

In the last post, Piercing perception, part 1: A mole new world, I discussed how we perceive only a fraction of what the universe has to offer, due to the restrictions in our ability to process sensory information, as well as our inability to interact with the happenings of the universe on a macro or micro scale (in a meaningful way) without the backing of a well-stocked science lab. Furthermore, we discussed the super senses of animals and how technology could potentially be used to harness their abilities, hypothetically allowing us to experience the world in new and exciting ways: expanding our umwelt (the world as it is experienced by a particular organism).

Artificial Inference

In the TED talk Can we create new senses in humans?, Eagleman (2015) explains that we already know our brains are capable of adapting to receiving information from electronic devices, stating that many thousands of people can hear thanks to cochlear implants or can see thanks to retinal implants. In a cochlear implant, a microphone picks up the sound and turns it into a digital signal, which is relayed to the inner ear. Likewise, with a retinal implant, a camera captures the images and turns them into digital signals, which are directed to the optic nerve. The brain can adapt to these new devices and find its own drivers, enabling it to make sense of the new signals.
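To make the first step of that pipeline concrete, here is a toy Python sketch of what "turning sound into a digital signal" involves: sampling a continuous waveform at regular intervals and quantising each sample to a fixed number of levels. The sample rate, bit depth and the 440 Hz test tone are illustrative choices, not the specification of any real implant's processor.

```python
import math

def digitise(signal_fn, duration_s=0.01, sample_rate=8000, bits=8):
    """Sample a continuous signal and quantise each sample to 2**bits levels,
    roughly the analogue-to-digital step a sound processor performs before
    the result can be encoded for the implant."""
    levels = 2 ** bits
    samples = []
    for i in range(int(duration_s * sample_rate)):
        t = i / sample_rate
        value = signal_fn(t)                       # analogue value in [-1, 1]
        q = round((value + 1) / 2 * (levels - 1))  # map to integer 0..levels-1
        samples.append(q)
    return samples

# A 440 Hz tone standing in for sound arriving at the microphone.
tone = lambda t: math.sin(2 * math.pi * 440 * t)
digital = digitise(tone)
print(len(digital), min(digital), max(digital))
```

The point of the sketch is simply that, after this step, "sound" is nothing but a stream of integers; it is then the brain's job, not the device's, to learn what those numbers mean.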

But how does the brain speak digital? How does it translate these signals and convert them into something more familiar? The answer is that when the brain, for example, sees an object, it does not really see anything at all. All that happens is that it receives electrochemical signals from the eyes; likewise, when you hear something, the brain actually hears nothing but receives electrochemical signals. The brain sorts out these signals and makes meaning of them. It does not discriminate between the kinds of data it receives; it just takes in everything and then figures out what to do with it, which Eagleman explains provides an evolutionary advantage, allowing "...Mother Nature to tinker around with different types of input channels". So perhaps if we had the heat pits of a snake, the electrosensors of the ghost knifefish, or the magnetite that helps some birds navigate, our brains would be able to adapt to what they could pick up, allowing us to perceive our world in new ways.
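If the brain really is indifferent to where its data comes from, then any consistent mapping from one modality to another should, in principle, be learnable. As a toy illustration of such a mapping (my own invented encoding, not the scheme used by any actual sensory-substitution device), the sketch below turns one column of image pixels into tones: a pixel's height sets the pitch and its brightness sets the loudness.

```python
def image_column_to_tones(column, f_min=200.0, f_max=4000.0):
    """Map one column of pixel brightnesses (0-255, listed top to bottom)
    to (frequency_hz, amplitude) pairs: higher pixels get higher pitch,
    brighter pixels get louder tones."""
    tones = []
    n = len(column)
    for row, brightness in enumerate(column):
        # Row 0 is the top of the image, so it gets the highest frequency.
        frac = 1 - row / (n - 1) if n > 1 else 0.5
        freq = f_min * (f_max / f_min) ** frac  # logarithmic pitch scale
        amp = brightness / 255.0
        tones.append((round(freq, 1), round(amp, 3)))
    return tones

# A column with a bright spot at the top and a dimmer one at the bottom.
tones = image_column_to_tones([255, 40, 0, 0, 120])
for freq, amp in tones:
    print(f"{freq:7.1f} Hz  amplitude {amp:.3f}")
```

The encoding itself is arbitrary; what matters is that it is consistent, so that a listening brain has a stable pattern from which to extract its own "drivers".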

In summary.

Our brains do not see, hear, taste, smell or touch, any more than a computer can see the digital feed from a camera. The computer just takes the patterns of electronic signals and sorts, uses and stores them in such a way as to derive meaning. Our brains do the same. They are not fussy about what type of information they receive; they just find a way of using it. This versatility allows those who have sensory impairments to experience the world as they would without their limitations, by imitating the peripherals that time and evolution have bestowed upon us. The question now is not whether our brains are capable of receiving digital input, but "How adaptable are our brains?", "Can our brains cope with new sensors which evolution has denied us?" and "How will this alter our perception of the world?".


MRC (2012), Cochlear Implant, viewed 9th July 2015, <http://www.mrc-cbu.cam.ac.uk/improving-health-and-wellbeing/cochlear-implant/>

Shutterstock (2012), altered by Novis, J. (2015) Plug and play brain, viewed 9th July 2015, <http://images.gizmag.com/hero/brainpolymer-1.jpg>
