Author: Justin Mckibben
If there are any other nerds out there like me, you may have come across an animated series called Psycho-Pass that rose to popularity back in 2012. The show's name fits firmly into its central premise: an authoritarian future dystopia where omnipresent public sensors ceaselessly scan the mental state of every passing citizen. In the show, data on both present mentality and aggregated personality is used to gauge the probability of an individual committing a crime, a rating referred to as that citizen's Psycho-Pass. Law enforcement and public security use this mental health tracking technology to preempt possible threats. The characters chase criminals whom the system deems emotionally or psychologically at risk, and the show adds a few good twists of suspense and philosophical paradox.
Needless to say, I am a big fan of the series.
So of course, seeing a headline about a new research project that could make this kind of system a reality stirred up some curiosity. The abstract concept of machines reading the psychological profiles of everyday people as a security measure has jumped right out of the world of sci-fi fantasy and could soon be another innovation that changes our world.
Could a mental health security system be the future of public safety?
How the Mental Health Security System Works
According to new research published in the International Journal of Advanced Intelligence Paradigms, a mental health security system is being developed that analyzes the user's brainwaves.
Most modern security systems require a PIN or password. Other biometric-based systems require a fingerprint or a scan of an iris or retina. We have already seen this kind of thing in the movies: voice-activated locks, palm-print thermal safes, and other cool high-tech gadgets. Now, Violeta Tulceanu of the University of Iasi is adding a truly unique aspect to security: the emotion detector.
Upon reading the brainwaves, the system is designed to automatically determine whether the user is in a fit mental state. Once the test is complete, the system grants access to resources, but only if deemed appropriate.
Violeta Tulceanu states:
“The true engine of motivation is our capacity to perceive pleasure and fear pain, and thus, reward and punishment,”
“Our ability to react to dangerous situations is directly related to our capacity to relate to our environment, and our sense of self-preservation.”
In the new approach, Tulceanu trains the system to recognize a user's "emotional fingerprint" from the patterns of electrical brainwaves the user generates in the presence of specific, evocative stimuli. The system needs a baseline mental signature to cross-reference: each emotional response is matched to a given pattern, and these patterns are then associated with particular configurations of the mental health security system. In other words, someone who is mentally stable sets the standard for their own future readings.
Once the profile is complete, it can allow or deny access to given resources. The next time someone tries to gain access, the system simply measures their current electrical brain activity; if the result of processing these credentials matches the "emotional fingerprint," access is granted or refused accordingly.
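To make the enroll-then-verify idea above concrete, here is a minimal sketch of how such a matcher could work in principle. Everything here is an assumption for illustration: the paper does not publish its algorithm, so the feature extraction (mean spectral power per EEG channel), the cosine-similarity comparison, and the threshold are all hypothetical stand-ins, not Tulceanu's actual method.

```python
import numpy as np

# Hypothetical sketch only: the feature pipeline, similarity measure, and
# threshold are illustrative assumptions, not details from the actual research.

def extract_features(eeg_window):
    """Reduce a raw EEG window (channels x samples) to a feature vector.

    Here: mean magnitude of the FFT spectrum per channel, a crude proxy
    for 'how active is each channel', chosen purely for illustration.
    """
    spectrum = np.abs(np.fft.rfft(eeg_window, axis=1))
    return spectrum.mean(axis=1)

def enroll(stimulus_windows):
    """Build the baseline 'emotional fingerprint'.

    Averages the feature vectors of the EEG windows recorded while the
    user was shown the evocative enrollment stimuli.
    """
    feats = np.array([extract_features(w) for w in stimulus_windows])
    return feats.mean(axis=0)

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def grant_access(fingerprint, current_window, threshold=0.9):
    """Grant access only if the current reading matches the enrolled profile."""
    score = cosine_similarity(fingerprint, extract_features(current_window))
    return score >= threshold
```

In this toy version, a reading that deviates strongly from the enrolled baseline, whether from a different person or (in the spirit of the research) a markedly altered emotional state, simply fails the similarity check and access is refused.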
This is remarkable because the system not only recognizes brainwaves to allow authorized access, but also determines whether someone's current mental health should keep them from having access.
Why a Mental Health Security System Could Matter
Based on the core concept, this could matter a great deal to public safety. If someone is in a well-balanced emotional state, they will probably react to external factors according to:
- Group expectations
- Cultural background
- Social norms
- Personal inclinations
However, humans are emotional. We feel, some of us more intensely than others, but all of us feel. So our decisions can also be subject to our own wants and desires.
We can even be influenced by psychoactive chemicals that might make particular resources inappropriate or hazardous. Perhaps a safe with a gun locked inside should only be accessible by someone of a stable mental and emotional state.
With this kind of mental health security system there could be another step to control:
- Entry to a building
- Access to computer resources
- Even the withdrawal of money from ATMs
The research also indicates this mental health security system could have applications in electronic learning.
What Could the Mental Health Security System Change?
Many may not notice at first, but this is a huge deal, and if it gains momentum it could change a lot about our world. Anyone can suffer from depression, stress, or anxiety, as well as substance abuse. Some of us may not even be aware of our own mental health issues until something devastating has happened. We all have the capacity to make detrimental decisions, and sometimes we have the capacity to do so while accessing sensitive resources.
Let us look at just a few ways this could be a really big deal.
In the interest of public safety, consider access to an airport or a school. With a history of shootings and other attacks perpetrated by people deemed to be in the grips of mental illness, could this new technology save lives by blocking those it perceives to be a threat?
Tulceanu suggests this mental health security system could ensure the safety and security of individuals and those around them that might be at risk if access is granted to particular resources.
The mental health security system might also be able to assess whether a person is acting responsibly and of their own accord. If someone is being forced to access something, the system could measure that emotional response as well and act on behalf of someone who is being robbed or held hostage.
Is a Mental Health Security System Morally Just?
Here comes the philosophical debate. When looking at the possibilities of this technology, we also have to ask ourselves the same questions that crop up in the sci-fi stories: is this moral? Specifically, if it became a government tool, would it violate privacy or civil rights? Really ask yourself: is this a brighter, safer future? Or could it be misused for subliminal, psychological oppression?
Seriously, this is a tough call. It does sound like it could save a lot of lives. But some would ask: who has the right to say whether I'm mentally or emotionally stable enough to get my own money from the bank? Or to get on a plane? Who decides when you are too emotionally or mentally compromised to go to work? What if, years from now, you aren't allowed to live in a certain neighborhood because of your place on the brainwave scale?
Would this kind of restriction, based on an analysis of mental health, reinforce the stigmas attached to mental illness? If so, would people be discriminated against for mental health issues? In the TV show I referenced earlier, people with moderately risky mental health ratings were lawfully mandated into therapy; would that become normal practice if a more comprehensive mental health security system were put into place?
These days, modern research techniques suggest that far from being indefinable, emotion is fundamentally neurological, sitting at the core of our learning mechanisms. This makes it possible to treat emotion more objectively. All of this new research is extremely fascinating. Without being too sure which way to lean, I simply wonder what the world would think of a mental health security system.