Every human-computer interface is actually a brain-computer interface; it is just a matter of degree. Our intentions might travel from our brain to the computer through our fingers and a keyboard, through a camera that tracks eye movement, or from sensors that read signals from the surface of the scalp or from individual neurons. It is a continuum.
However, when we talk about brain-computer interfaces (BCIs) today, we mean capturing signals directly from the brain and using them to control an electronic device. This can be done in a few ways, for example with electroencephalography (EEG) sensors that record electrical impulses from the brain, or with functional near-infrared spectroscopy (fNIR), which uses light to monitor blood flow in the brain. Sensors can also be implanted, but the less invasive technologies work with sensors worn in headsets that keep them in contact with the scalp.
These technologies are not mind readers, at least not yet, but they can be trained to distinguish patterns in controlled scenarios. We are very far from my putting an EEG cap on your head and, as you think "red car", knowing that you are thinking about a red car. What we can do now, for example, is train the system to recognize a choice among four icons, or to tell that you are thinking "red car" rather than "playing tennis".
So a patient with locked-in syndrome can train a BCI to distinguish between two thoughts, such as "playing tennis" versus "walking down the street", and these can become their "yes" and "no" signals. To an extent, it doesn't matter what the two thoughts are. We can't say which neurons in the brain fire when you think about playing tennis, but we can train a BCI to distinguish that electrical pattern from another pattern.
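In software terms, this kind of two-thought "yes/no" channel is a binary classification problem. The sketch below is purely illustrative: it uses synthetic numbers in place of real EEG band-power features, and the task names and feature layout are assumptions, not a real system.

```python
# Hypothetical sketch: telling two imagined tasks apart from EEG-like features.
# The data here is synthetic; a real BCI would extract band-power features
# from multi-channel EEG recordings during training sessions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Simulate feature vectors for two mental tasks ("tennis" vs "walking").
# Each row is one trial; each column one feature (e.g. power in one band/channel).
tennis = rng.normal(loc=1.0, scale=0.5, size=(80, 8))
walking = rng.normal(loc=0.0, scale=0.5, size=(80, 8))

X = np.vstack([tennis, walking])
y = np.array([1] * 80 + [0] * 80)  # 1 = "yes" thought, 0 = "no" thought

# Train on alternating trials, evaluate on the held-out trials.
clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])
accuracy = clf.score(X[1::2], y[1::2])
print(f"held-out accuracy: {accuracy:.2f}")
```

The point of the sketch is that the system never needs to know *what* "playing tennis" means; it only needs the two signal patterns to be reliably separable.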
Just as everyone walks in much the same way, but with differences in pace, speed and so on, we all use the same types of brainwaves for the same kinds of mental activities, although there are still differences between individuals.
As BCIs have advanced, we have built up a "library" of signals, so that we can make devices able to track three, four or five patterns. We now have robust tools for obtaining clean signals from the brain. It used to be necessary to wear 128 or even 256 leads on your head to get any useful information. Today we can get meaningful data with 16, eight or even four leads, depending on the task and the signal of interest. Now the magic is in the software and what it can do with those signals.
In time we will be able to use these devices seamlessly for tasks such as controlling the cursor on a computer screen or interacting, hands-free, with mobile phones. One of the projects we are currently working on with the U.S. Navy looks at how to use both BCIs and physiological sensing to optimize individual and team training.
For instance, to make training as effective as possible, you might have a BCI that monitors whether you are paying attention. If your attention wanders, the computer might alert you or ask you to describe the material that was just covered. It would be part of an intelligent tutor that paces the learning and content to match your attention and focus.
BCIs could also be used to monitor employees in high-stress environments, such as air traffic controllers, or to recognize post-traumatic stress disorder in military personnel or concussion in contact-sports players.
Given the advances in BCIs, it seems crazy that every time you visit the doctor your blood pressure, temperature, height and weight are checked, but not your brain's vital signs. You don't need a sophisticated BCI to track brain health; even simple reaction-time tests can be a good indicator of your brain's processing speed.
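A reaction-time check needs nothing more than a timer. The console sketch below is a minimal illustration, not a clinical instrument: the delays, the trial structure and the use of the median as a summary statistic are all assumptions made for the example.

```python
# A minimal sketch of a console reaction-time test, one simple proxy
# for processing speed. Delays and structure are illustrative only.
import random
import time

def reaction_trial(respond=lambda: input("Press Enter now! ")):
    """Run one trial: wait an unpredictable delay, then time the response."""
    time.sleep(random.uniform(1.0, 3.0))  # so the user can't anticipate
    start = time.perf_counter()
    respond()                             # blocks until the user reacts
    return time.perf_counter() - start    # reaction time in seconds

def median_reaction_time(times):
    """Median is more robust than the mean to one distracted trial."""
    ordered = sorted(times)
    n = len(ordered)
    mid = n // 2
    return ordered[mid] if n % 2 else (ordered[mid - 1] + ordered[mid]) / 2
```

Run a handful of trials and keep the median; tracking that number over months, like blood pressure, is the kind of routine "brain vital sign" the paragraph above imagines.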
EEG today is just passive monitoring, but it is easy to imagine a future in which energy is directed into the brain. Last year, scientists at MIT used light to activate cells in genetically modified mice, implanting a false memory in their brains. We are a long way from being able to do that with human beings, but we might see an extension of EEG technology that detects when your brain is in the state most receptive to learning.
One profound application for BCIs would be awareness of other people's emotions and brain states. Scientists at Princeton University have studied speaker-listener pairs with both EEG and brain imaging, and have shown that when two people are communicating, speaking and understanding each other, their brains are literally on the same wavelength. Not only that, the listener's brain-wave patterns start to precede the speaker's; the listener begins to anticipate the speaker's brain waves.
That kind of data would have a profound impact on the way people interact. Imagine going into every meeting knowing exactly who is paying attention to you, who is on the same wavelength as you. Having that kind of information would change every single dynamic you encounter.