All our sensations are formed in the brain. Whatever the incoming information, whether music, smells, or visual images, it ultimately arrives as signals transmitted and decoded by specialized cells. In that sense, apart from these signals, the brain has no direct contact with the external environment. And if so, it is likely that we can create new channels between the brain and the surrounding world and feed it data directly.
Let’s go back a couple of sentences. If all incoming information is just pulses, why is vision so different from smell or taste? Why will you never confuse the sight of a blooming pine wood with the taste of feta cheese, or the friction of sandpaper on your fingertips with the smell of fresh espresso? We might assume it has something to do with the structure of the brain areas involved: the regions that process hearing differ from those that process visual images, and so on. But why, in that case, do people who have lost a sense such as sight show, according to numerous studies, a “reorientation” of the visual areas to strengthen the other senses?
Thus arose the hypothesis that subjective inner experience is determined by the structure of the data itself. In other words, information arriving from the retina has a different structure than data coming from the receptors of the eardrum or the fingertips, and as a result you get a different sensation. It follows that, in theory, we can create entirely new ways of transmitting information: not sight, hearing, taste, touch, or smell, but something completely new.
There are two ways to do this. The first is to implant electrodes directly into the brain. The second is non-invasive delivery of signals to the brain, for example through wearable devices. Imagine wearing a bracelet with multiple vibration motors that stimulate different locations around the wrist to form a data stream. Once a clear correspondence is established between the information and the pattern of touch, people can learn to recognize it. The company NeoSensory is currently working on exactly this kind of vibrating neural interface and plans to present one of these devices in 2019.
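To make the idea of a fixed information-to-touch correspondence concrete, here is a minimal Python sketch. It is not NeoSensory's actual encoding; the four-motor layout, intensity levels, and letter mapping are all invented for illustration. Each character is mapped to a distinct frame of per-motor vibration intensities, so a wearer who learns the fixed mapping could, in principle, decode the stream by feel.

```python
# Hypothetical illustration: encode text as vibration frames for a
# wristband with four motors. The mapping is invented for this sketch.
MOTOR_COUNT = 4
LEVELS = (0, 120, 240)  # off, medium, strong (0-255 intensity scale)


def encode_char(ch: str) -> list[int]:
    """Map one character to a frame of per-motor intensities.

    Each motor vibrates at one of three levels, giving 3**4 = 81
    possible frames, more than enough for the 26-letter alphabet.
    """
    code = ord(ch.lower()) - ord("a") + 1  # 'a' -> 1 ... 'z' -> 26
    if not 1 <= code <= 26:
        return [0] * MOTOR_COUNT  # non-letters: all motors off
    # Read the code as base-3 digits, one digit per motor.
    return [LEVELS[(code // 3**i) % 3] for i in range(MOTOR_COUNT)]


def encode_word(word: str) -> list[list[int]]:
    """Encode a word as a sequence of motor frames, one per character."""
    return [encode_char(c) for c in word]


print(encode_word("hi"))  # -> [[240, 240, 0, 0], [0, 0, 120, 0]]
```

The point of the base-3 trick is simply that four motors with binary on/off states give only 16 patterns, too few for an alphabet, while adding a second intensity level per motor expands the code space without adding hardware.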
“Think about how babies ‘learn’ to use their ears, clapping their hands or babbling and catching the sounds. The same kind of learning can be observed in people born deaf who receive cochlear implants in adulthood. At first, the experience of a cochlear implant is nothing like sound. A friend of mine described it as painless electric shocks; she did not feel it had anything to do with sound. But after about a month, things started to ‘sound’, albeit poorly. Perhaps the same process happened to each of us when we learned to use our ears. We just don’t remember it,” says David Eagleman, one of the authors of the work on neural interfaces.
Based on an article by David Eagleman, professor in the Department of Psychiatry and Behavioral Sciences at Stanford University, author of The Brain: The Story of You, and one of the founders of NeoSensory. Published in Wired.
Do you believe in the development of neural interfaces? Tell us about it in our Telegram chat.