Can technology change how we sense the world? Could we ‘feel’ data? By The Brain’s David Eagleman

The fascinating series The Brain, with neuroscientist David Eagleman, is currently showing on BBC4. In this edited extract from his book, he explains why the brain’s ability to adjust means we may not be limited in the future to the senses we have now


Over the last 100,000 years our species has been on quite a journey: we’ve gone from primitive hunter-gatherers surviving on scraps to a planet-conquering, hyperconnected species that defines its own destiny. And we owe our runaway success to the special properties of the three pounds of matter housed inside our skulls.

What is it about the human brain that has made this journey possible? If we can understand the secrets behind our achievements, then perhaps we can direct the brain’s strengths in careful, purposeful ways, opening a new chapter in the human story.

What do the next thousand years have in store for us? In the far future, what will the human race be like?

The secret to understanding our success – and our future opportunity – is the brain’s tremendous ability to adjust, known as brain plasticity. In this critical way, the brain is fundamentally unlike the hardware in our digital computers. Instead, it’s ‘liveware’. It reconfigures its own circuitry.

Though the adult brain isn’t quite as flexible as a child’s, it still retains an astonishing ability to adapt and change. Every time we learn something new, the brain changes itself. It’s this property of the brain – its plasticity – that enables a new marriage between our technology and our biology.

Technology making us see and hear differently

We’ve become progressively better at plugging machinery directly into our bodies. You may not realise it, but currently hundreds of thousands of people are walking around with artificial hearing and artificial vision.

With a device called a cochlear implant, an external microphone picks up sound, a processor digitises it, and an implanted electrode array passes the signal on to the auditory nerve. Similarly, a retinal implant digitises the signal from a camera and sends it through an electrode grid attached to the retina at the back of the eye.
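To make that signal chain a little more concrete, here is a minimal, hypothetical Python sketch of the general idea: digitise a frame of sound, split it into frequency bands, and treat each band’s energy as the drive for one electrode channel. The sample rate, channel count and processing are illustrative assumptions, not how any real implant is programmed.

```python
# Toy illustration only: digitise sound, split it into frequency bands,
# and map each band's energy to one (hypothetical) stimulation channel.
import numpy as np

SAMPLE_RATE = 16_000   # samples per second (assumed)
N_CHANNELS = 8         # hypothetical number of electrode channels

def band_energies(frame: np.ndarray, n_channels: int = N_CHANNELS) -> np.ndarray:
    """Return the relative energy (0..1) in n_channels frequency bands."""
    spectrum = np.abs(np.fft.rfft(frame))          # magnitude spectrum
    bands = np.array_split(spectrum, n_channels)   # low -> high frequency
    energy = np.array([band.mean() for band in bands])
    return energy / (energy.max() + 1e-9)          # normalise for 'stimulation'

# Example: a 100 ms frame of a 440 Hz tone mostly excites the lowest channel.
t = np.arange(int(0.1 * SAMPLE_RATE)) / SAMPLE_RATE
print(band_energies(np.sin(2 * np.pi * 440 * t)).round(2))
```

Real cochlear implants do something loosely analogous, mapping frequency bands onto electrode positions along the cochlea, though with far more sophisticated processing than this sketch suggests.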

Even though the implants deliver slightly different signals from those of our natural sense organs, the brain figures out how to make do with the information it gets. These devices have restored hearing and sight to deaf and blind people around the planet.

[Video: David Eagleman on cochlear and retinal implants]

What sensory opportunities does that open up? We come into the world with a standard set of basic senses: hearing, touch, sight, smell, and taste, along with other senses such as balance, vibration and temperature. The sensors we have are the portals by which we pick up signals from our environment. However, these senses only allow us to experience a tiny fraction of the world around us. All the information sources for which we don’t have sensors are invisible to us.

I think of our sensory portals as peripheral plug-and-play devices. Whatever information comes in, the brain figures out what to do with it. Mother Nature only needed to invent the principles of brain operation once and then she was freed up to tinker with designing new input channels.

Brains that experience a different reality

Just look across the animal kingdom, and you’ll find a boggling variety of peripheral sensors in use by animal brains. Snakes have heat sensors. The glass knifefish has electrosensors for interpreting changes in the local electrical field. Cows and birds have magnetite, with which they can orient themselves to Earth’s magnetic field. Most animals can see in ultraviolet; elephants can hear at very long distances; dogs experience a richly scented reality.

The crucible of natural selection is the ultimate hacker space, and these are just some of the ways that genes have figured out how to channel data from the outside world into the internal world. The end result is that evolution has built a brain that can experience many different slices of reality.

The consequence I want to highlight is that there may be nothing special or fundamental about the sensors we’re used to. They’re just what we’ve inherited from a complex history of evolutionary constraints. We’re not stuck with them.

Sensory substitution that’s already happening

One of the projects in my laboratory is to build a platform for enabling sensory substitution. Specifically, we have built a wearable technology called the Versatile Extra-Sensory Transducer, or VEST. The VEST, worn inconspicuously under the clothing, is covered with tiny vibratory motors. These motors convert data streams into dynamic patterns of vibration across the torso. We’re using the VEST to give hearing to the deaf.
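To give a feel for what ‘converting data streams into dynamic patterns of vibration’ could involve, here is a small hypothetical sketch: it stretches a frame of feature values (for example, the band energies of a sound) across an array of motors and quantises each value to a drive level. The motor count and 8-bit drive levels are assumptions for illustration; the actual VEST encoding is not described here.

```python
# Hypothetical sketch: map one frame of feature values onto an array of
# vibration motors. Motor count and drive resolution are assumptions.
import numpy as np

N_MOTORS = 32   # assumed number of vibration motors on the garment

def features_to_motor_drives(features: np.ndarray) -> np.ndarray:
    """Stretch a feature vector (values in 0..1) over the motor array and
    quantise each value to an 8-bit drive level (0 = off, 255 = full)."""
    positions = np.linspace(0, len(features) - 1, N_MOTORS)
    stretched = np.interp(positions, np.arange(len(features)), features)
    return np.clip(stretched * 255, 0, 255).astype(np.uint8)

# Example: a frame with energy concentrated in the low bands drives the
# motors at one end of the array hard and leaves the rest almost still.
frame = np.array([1.0, 0.6, 0.2, 0.05, 0.0, 0.0, 0.0, 0.0])
print(features_to_motor_drives(frame))
```

Updated many times a second, a mapping like this would produce the kind of continuously shifting pattern across the torso that the text describes.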

After about five days, a person born deaf can correctly identify spoken words. Although these experiments are still at an early stage, we expect that after several months of wearing the VEST, people will come to have a direct perceptual experience – essentially the equivalent of hearing.

It may seem strange that a person can come to hear via moving patterns of vibration on the torso. But the trick is this: the brain doesn’t care how it gets the information, as long as it gets it.

What if we could use this technology to extend our sensory inventory? To this end, my students and I are currently adding new senses to the human repertoire to augment our experience of the world.

Real-time data fed directly into the body

Consider this: the internet is streaming petabytes of interesting data, but currently we can only access that information by staring at a phone or computer screen. What if you could have real-time data streamed into your body, so that it became part of your direct experience of the world? In other words, what if you could feel data?

This could be weather data, stock exchange data, Twitter data, cockpit data from an airplane, or data about the state of a factory – all encoded as a new vibratory language that the brain learns to understand.
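As a purely illustrative sketch of what such a vibratory language could look like, the snippet below assigns each data feed its own group of motors and scales its latest reading into a vibration strength. The feeds, value ranges and motor groups are all invented for the example.

```python
# Hypothetical 'vibratory language': each data feed gets a fixed group of
# motors, and its current reading sets how hard that group vibrates.
import numpy as np

N_MOTORS = 32

# Invented assignment of motor groups and value ranges to data feeds.
FEEDS = {
    "rain_probability": {"motors": slice(0, 8),   "range": (0.0, 1.0)},
    "stock_change_pct": {"motors": slice(8, 16),  "range": (-5.0, 5.0)},
    "mentions_per_min": {"motors": slice(16, 32), "range": (0.0, 500.0)},
}

def encode(readings: dict) -> np.ndarray:
    """Turn the latest reading from each feed into motor drive levels (0-255)."""
    drives = np.zeros(N_MOTORS, dtype=np.uint8)
    for name, value in readings.items():
        low, high = FEEDS[name]["range"]
        level = np.clip((value - low) / (high - low), 0.0, 1.0)  # scale to 0..1
        drives[FEEDS[name]["motors"]] = int(level * 255)
    return drives

# Example: a light chance of rain, a falling stock, a busy Twitter topic.
print(encode({"rain_probability": 0.2,
              "stock_change_pct": -1.5,
              "mentions_per_min": 420.0}))
```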

As you went about your daily tasks, you would have a direct perception of whether it was raining 100 miles away or going to snow tomorrow. Or you could sense what was trending across the Twittersphere, and in this way be tapped into the consciousness of the species.

Although this sounds like science fiction, we’re not far from this future – all thanks to the brain’s talent for extracting patterns, even when we’re not trying. That is the trick that can allow us to absorb complex data and incorporate it into our sensory experience of the world.

Like reading this page, absorbing new data streams will come to feel effortless. Unlike reading, however, it will be a way to take in new information about the world without having to attend to it consciously.

At the moment, we don’t know the limits – or if there are limits – to the kinds of data the brain can incorporate. But it’s clear that we are no longer a natural species that has to wait for sensory adaptations on an evolutionary timescale.

As we move into the future, we will increasingly design our own sensory portals on the world. We will wire ourselves into an expanded sensory reality.

 

David Eagleman is a neuroscientist, director of the Laboratory for Perception and Action at the Baylor College of Medicine, vice-chair of the World Economic Forum’s Global Agenda Council on Neuroscience & Behaviour, a New York Times bestselling author, TED speaker, and writer and presenter of US TV series The Brain. His book The Brain: The Story of You is out now, published by Canongate