Picture this. You’re back in school, it’s exam week, you have multiple exams coming up that you absolutely need to do well on, but lately you have been getting so flustered with the tasks you need to get done that you unintentionally lose focus.
Sound like a familiar scenario? Chances are you have been through this hectic spiral of thoughts at one point or another in your life. But what if you were told that you could easily prevent it from ever happening again by putting on a headband for three minutes a day?
With seven sensors, five hours of battery life, and an application that tracks the user’s progress, the Muse Brain Sensing Headband allows users to become more focused through meditation. All you have to do is turn on the headband, open the application, pair the two, put on your headphones, and listen to the guide.
A headband that allows you to be your best self in under three minutes a day sounds like a sweet deal, but how exactly does it work?
As I was researching this topic, I noticed that not many articles give an in-depth analysis of how the brain receives signals, which I believe plays a crucial role in truly understanding brain sensing technology. This is why I have decided to do just that.
In this article, I will first go over the physiology of the brain as I discuss the details of neurotransmission. I will follow that with a short discussion of the technology that is inspired by this physiology. Lastly, I will go over how brain computing (or brain sensing/analyzing) is going to evolve in the coming years.
Let’s Talk Brain Science: Neurotransmission
Let’s first discuss how signal transduction works in the cells of the body. When a ligand, a small molecule that acts as a signal, binds to a receptor, a cascade of reactions occurs within the cell that activates a target protein or molecule.
In other words, a signal molecule latches onto a specific binding site, turns on a target molecule downstream of the receptor, and the signal keeps passing from one target molecule or protein to the next until the designated response is produced. In the nervous system, signals like these travel up the spinal cord to the brain, where neurons process them.
Your brain is home to approximately 100 billion neurons, nerve cells that allow you to react to signals. These minute cells work in a very synchronized manner. A neuron consists of a nucleus, axon, dendrites, myelin sheath, and an axon terminal.
A signal is transmitted to a neuron in the form of an electrical impulse known as an ‘action potential’ which then moves from the dendrites of the neuron to the axon terminal. The action potential causes the nerve cell to release neurotransmitters (signaling molecules) which then bind to the receptor on the next neuron.
The area between the two neurons where all of this takes place is called the synapse. If the combined effect of the neurotransmitters released into the synapse pushes the receiving neuron’s membrane potential above a threshold, a new action potential fires; if not, nothing happens and the signal ends there. Inputs that push the neuron toward firing are called excitatory, and inputs that push it away from firing are called inhibitory.
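The threshold behavior described above can be captured in a toy model: excitatory inputs push the membrane potential up, inhibitory inputs pull it down, and the neuron fires only if the sum crosses the threshold. The millivolt values below are typical textbook figures, and the function is purely illustrative.

```python
# Toy model of a neuron's firing threshold. Inputs are millivolt changes:
# positive values are excitatory, negative values are inhibitory.
RESTING_POTENTIAL_MV = -70.0
THRESHOLD_MV = -55.0

def fires(synaptic_inputs_mv):
    """Return True if the summed inputs depolarize the neuron past threshold."""
    membrane_potential = RESTING_POTENTIAL_MV + sum(synaptic_inputs_mv)
    return membrane_potential >= THRESHOLD_MV

print(fires([8.0, 6.0, -2.0]))   # -58 mV, below threshold: False
print(fires([10.0, 9.0, -2.0]))  # -53 mV, above threshold: True
```

Note that the inhibitory input in the second example is outweighed by the excitatory ones, which is exactly the summation the synapse performs.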
When a group of neurons experiences these changes in electrical impulse together, it generates an electrical field which resembles a small vibration and which can then be detected on the scalp by EEG sensors.
In short: the brain receives an electrical signal, which causes action potentials within neurons. The action potentials move from neuron to neuron across synapses, and this activity generates an electrical field that is detectable by sensors on the scalp.
First recorded in humans in 1924 by the German psychiatrist Hans Berger, electroencephalography, or EEG, works by measuring the changes in the electrical field produced by neurotransmission in real time. In traditional EEG testing, rows of electrodes are placed on a person’s scalp and wired to an amplifier, which strengthens the waves that are picked up, and a computer, which records all of the data.
The data is presented on a graph in real time as the electrodes pick up the electrical field on the scalp. Scientists decode this data by analyzing the types of waves that appear. There are five frequency bands: Delta, Theta, Alpha, Beta, and Gamma (lowest to highest frequency).
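To make the five bands concrete, here is a minimal sketch of how a recorded signal might be split into them. The band boundaries below are common textbook values (exact cutoffs vary by source), and the signal is synthetic rather than real EEG data.

```python
import numpy as np

# Illustrative EEG frequency bands in Hz; exact boundaries vary by source.
BANDS = {
    "delta": (0.5, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
    "gamma": (30, 100),
}

def band_powers(signal, fs):
    """Return the spectral power in each band for a 1-D signal sampled at fs Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return {
        name: power[(freqs >= lo) & (freqs < hi)].sum()
        for name, (lo, hi) in BANDS.items()
    }

# Synthetic example: 4 seconds of a 10 Hz oscillation sampled at 256 Hz.
fs = 256
t = np.arange(fs * 4) / fs
signal = np.sin(2 * np.pi * 10 * t)
powers = band_powers(signal, fs)
print(max(powers, key=powers.get))  # alpha
```

A 10 Hz oscillation falls squarely in the alpha range, so the alpha band dominates the power estimate, which is the same logic an EEG analyst applies when reading a spectrogram.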
These neural patterns picked up by the electrodes are then used by researchers to analyze cognitive behavior. For instance, in sleep research, researchers look for delta waves to gauge how deeply a patient is sleeping. Likewise, they look for higher-frequency waves, such as beta or gamma waves, to check whether the patient is in REM sleep.
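The kind of rule described above can be sketched as a simple lookup from the dominant band to a coarse sleep stage. Real clinical scoring uses many more features than a single dominant band, so the mapping below is purely illustrative.

```python
# Hypothetical mapping from dominant EEG band to a coarse sleep/wake label.
# Actual sleep staging considers amplitude, eye movement, muscle tone, etc.
STAGE_BY_BAND = {
    "delta": "deep sleep",
    "theta": "light sleep",
    "alpha": "relaxed wakefulness",
    "beta": "active wakefulness or REM-like activity",
    "gamma": "active wakefulness or REM-like activity",
}

def coarse_stage(band_powers):
    """Label an epoch by whichever frequency band carries the most power."""
    dominant = max(band_powers, key=band_powers.get)
    return STAGE_BY_BAND[dominant]

print(coarse_stage({"delta": 9.0, "theta": 2.0, "alpha": 1.0,
                    "beta": 0.5, "gamma": 0.2}))  # deep sleep
```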
With its noninvasive method of use, this technology allows scientists and physicians to record when and where a particular activity has taken place in a subject’s brain. From these findings, they are then able to interpret how the subject was feeling during a particular conversation – were they bored and unresponsive? Engaged and thinking critically? Were they focused on the conversation or task without any interruptions?
From sleep behavior to consumer behavior, EEG technology allows us to delve deeper into the human brain on a more factual basis.
The IoT Method to Brain Sensing
In 2014, with almost $170,000 and 644 backers, Joel Murphy and Conor Russomanno successfully released OpenBCI (BCI standing for Brain-Computer Interface), an open source biosensing platform that allows consumers to track the electrical activity produced by the brain, heart, and muscles.
For the first time ever, this technology became accessible to the general public, which paved the way for world changing inventions.
Fast-forward to 2017 and you can find brain sensing products all over the web. From a headband that allows users to meditate to trendy eyewear that helps athletes stay fashionable while also improving their focus, these devices are becoming prevalent in everyday life. But how do these companies incorporate EEG technology into these products in the first place?
Where a traditional EEG cap places electrodes all across the skull, headbands like InteraXon’s Muse Brain Sensing Headband work by placing sensors only along the forehead and behind the ears. Once the headset is paired with its application, the electrical impulses read by the sensors are immediately visualized in the app.
Depending on the types of brainwaves that are picked up, the application determines whether the user needs to become more focused. If the waves increase in frequency, that indicates to the software that the user is distracted from the given task, and as a feedback response, the application increases the volume of the sound the user is hearing in an attempt to get the user to refocus.
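The feedback loop described above can be sketched in a few lines: if the share of high-frequency activity rises past a threshold, nudge the guidance volume up; otherwise ease it back down. The function name, threshold, and step size are all hypothetical, not Muse’s actual algorithm.

```python
# Hypothetical sketch of a neurofeedback volume loop. A high ratio of
# high-frequency (e.g. beta-range) power stands in for "distracted".
def adjust_volume(current_volume, high_freq_power, total_power,
                  distraction_threshold=0.4, step=0.05):
    """Return an updated volume in [0.0, 1.0] based on the high-frequency ratio."""
    if total_power <= 0:
        return current_volume  # no signal: leave the volume alone
    ratio = high_freq_power / total_power
    if ratio > distraction_threshold:           # user looks distracted
        return min(1.0, current_volume + step)  # raise the feedback sound
    return max(0.0, current_volume - step)      # calm: ease the volume down

# Example: 60% of the power is high-frequency, so the volume is nudged up.
print(adjust_volume(0.5, high_freq_power=6.0, total_power=10.0))  # 0.55
```

Running such an update on every window of sensor data is what turns a passive EEG readout into the closed feedback loop the headband relies on.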
While this setup is extremely straightforward, getting the most accurate reading requires electrodes placed all over the scalp, and even around the eyes, since the impulses are spread across the skull like mini vibrations.
With the frontal cortex being the primary location for problem solving, judgment, and impulse control, it makes sense that the Muse Brain Sensing Headband places its sensors along the forehead.
While it may not be able to get an accurate read on the brain as a whole, it is able to track the activity of the frontal lobe, where our ability to control focus resides. Thus, this headband can strongly aid in training the frontal cortex to react more calmly to impulses and think through actions rationally with a more focused mindset.
Brain Chipping: The Future of Brain Computing Technology
Brain sensing technology is prominently used for its many medical benefits, including helping cancer patients relieve stress to improve their rate of recovery. But what if these efforts could be put toward making everyday life easy and seamless as well?
By now, we’ve all heard of the new trend called “chipping.” What if we used this same method to control everyday tasks, such as turning lights on and off, locking the doors to your house, or turning off your alarm clock in the morning?
At the rate this technology is being leveraged by major tech ventures such as Elon Musk’s Neuralink, within the next few decades (possibly earlier), we could see people getting sensors implanted along their skulls and integrated with different software, allowing human beings to take fuller control of their lives.
Looks like movies aren’t all lies after all. The only plot twist in this reality movie is that we are becoming our own robots. Until then, however, there’s a lot of work that has to be done to ensure clear, precise data before we can start merging the human brain with AI.