Cybersecurity as a safeguard against criminal brain hacking
Data produced by your web browser, your geolocation, the forms you fill in, and the keys you press can reveal very private and intimate information, such as your sexual orientation or your political beliefs. Now imagine what could be discovered, or even inserted and manipulated, with direct access to your thoughts. The brain is the last bastion of privacy, and the arrival of brain-machine interfaces, devices that connect computers directly to the brain, compels us to consider the issue at a deeper level.
Science and neurotechnology have not yet advanced far enough to deliver on the promise of mind reading, but there are already consumer-grade devices on the market that can record sensitive neural signals, neural messaging in essence. Electroencephalography (EEG) headsets, for example, are used in marketing research to analyze emotions and unconscious responses to particular products or events. This brain data can be processed to reveal sensitive information about a person, says Pablo Ballarin, cofounder of the cybersecurity firm Balusian.
The nature of neurorisks
Neurotechnology’s security issues are not a new kind of problem: as in other connected technologies, there could be harassment, organized crime, or trafficking in private data. What is different is the nature of neural data. Because it is generated directly by the brain, it can contain sensitive medical information as well as clues about our identity and the intimate processes behind our private decisions.
To illustrate how real these risks are, researchers have tried to hack neurotechnology already available for purchase. Their aim is to extract data that could be valuable to cybercriminals, thereby exposing potential security weaknesses and loopholes. Teams of scientists and technologists have demonstrated that it is possible to plant spyware in a brain-machine interface, in this case one designed to control video games with the mind, which allows them to steal data from the user.
By inserting subliminal images into the video game, the hackers could probe the user’s unconscious neural responses to particular stimuli, such as postal addresses, bank details, or human faces. In this way they were able to extract information including a bank card’s PIN and a home address.
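The attack described above works because the brain produces a measurably stronger response to stimuli it recognizes. A minimal sketch of the idea, with entirely invented signals and numbers (real attacks rely on event-related potentials such as the P300, and on far noisier data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: EEG epochs time-locked to flashed digits. The digit that
# matches the user's secret PIN digit evokes an added deflection in a
# fixed response window; all amplitudes here are illustrative.

def simulate_epoch(is_target, n_samples=256):
    """One EEG epoch (microvolts); target stimuli get an added bump."""
    epoch = rng.normal(0.0, 5.0, n_samples)
    if is_target:
        epoch[75:125] += 4.0  # simulated recognition response
    return epoch

def guess_digit(epochs_by_digit):
    """Pick the digit whose averaged epochs show the largest mean
    amplitude in the response window (samples 75-125)."""
    scores = {d: np.mean([e[75:125].mean() for e in eps])
              for d, eps in epochs_by_digit.items()}
    return max(scores, key=scores.get)

secret_digit = 7  # what the attacker wants to recover
epochs = {d: [simulate_epoch(d == secret_digit) for _ in range(20)]
          for d in range(10)}
print(guess_digit(epochs))  # recovers the secret digit
```

Averaging over repeated presentations is what makes the attack practical: the noise cancels while the recognition response accumulates.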
Ballarin, a telecommunications engineer and cybersecurity specialist, has tested some of these devices himself. He hacked a brand-name EEG headset and was able to intercept the neural data the device transmitted to a paired cell phone. “If you can process these signals, you can extract information about illnesses, cognitive abilities, or even a person’s tastes, things that can be very private, such as sexual preferences you wouldn’t even discuss with your partner.”
In the worst-case scenario, a bidirectional interface, one that not only reads brain signals but also generates them, for example by sending nerve impulses to a prosthetic arm, a wheelchair, or the nervous system itself, could be hacked with the intent of physically harming the user or another person. The concern is not hypothetical: in 2007, doctors disabled the wireless functionality of a pacemaker fitted to Dick Cheney, then Vice President of the United States, to avert a potential assassination attempt through hacking of the device.
Protecting mental privacy
Measures to protect neural data must be technological, political, and social. Global regulatory frameworks already exist, notably the General Data Protection Regulation (GDPR) in Europe, which in theory restricts the processing and sale of psychological and mental data, like any sensitive personal data, in order to preserve privacy. But organizations do not always explain how they anonymize information, and even where there is relative transparency, Ballarin points out that it is easy to trace anonymized data back to specific individuals.
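Re-identification of the kind Ballarin warns about often works by linking quasi-identifiers left in an “anonymized” dataset to a public roster. A toy sketch with invented records (real linkage attacks use the same join, just at scale):

```python
# Hypothetical "anonymized" neural records: the names are stripped,
# but zip code and birth year remain as quasi-identifiers.
anonymized_neural = [
    {"zip": "28001", "birth_year": 1984, "stress_index": 0.91},
    {"zip": "08012", "birth_year": 1990, "stress_index": 0.33},
]

# A public roster (voter roll, social profile, leaked directory...).
public_roster = [
    {"name": "A. Garcia", "zip": "28001", "birth_year": 1984},
    {"name": "B. Soler", "zip": "08012", "birth_year": 1990},
]

def reidentify(records, roster):
    """Link records to names when the quasi-identifier pair
    (zip, birth_year) matches exactly one person in the roster."""
    matches = []
    for rec in records:
        hits = [p for p in roster
                if p["zip"] == rec["zip"]
                and p["birth_year"] == rec["birth_year"]]
        if len(hits) == 1:  # unique combination: identity recovered
            matches.append((hits[0]["name"], rec["stress_index"]))
    return matches

print(reidentify(anonymized_neural, public_roster))
```

Stripping names is therefore not enough; any combination of attributes that is unique in some public dataset can undo the anonymization.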
In an article published in the journal Nature entitled “Four ethical priorities for neurotechnologies and AI,” Columbia University researcher Rafael Yuste and his colleagues proposed three concrete steps. First, to prevent trafficking in neural data, they proposed that opting out of sharing such data should be the default. Only with the express consent of individual users should organizations be allowed to hand this data over to third parties. This, of course, raises the difficult question of consent when such devices are fitted while the person in question is still a minor.
Even then, the authors note that data volunteered by some users could be used to draw accurate conclusions about others who place a higher value on their privacy. If such technologies are left unregulated, their abuse by malicious actors points toward a dystopian future of surveillance and control.
Second, they proposed that the sale, commercial transfer, and use of neural data be strictly regulated. These regulations, which would also limit people’s ability to give up their neural data, or to have neural activity written directly into their brains, in exchange for financial reward, could be modeled on legislation that restricts the sale of human organs.
Finally, they pointed to technological measures, such as blockchain and federated learning, to avoid processing neural signals in centralized databases. Along the same lines, a separate team of researchers at the University of Washington has proposed that neurotechnological devices separate brain-wave components on the device itself, before transmission. In this way, a brain-machine interface can limit the data sent to its control device (typically a paired mobile phone or computer) and communicate only what is relevant to the task at hand.
For example, EEG sensors designed to control a wheelchair would transmit only the brain-wave components that encode movement intentions, withholding components related to, say, emotions and sensations. By limiting the storage and transmission of raw neural data, the avenues for criminals to hijack sensitive information are limited as well.
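The on-device filtering idea can be sketched very simply: keep only the frequency band relevant to the task and zero out the rest before anything leaves the headset. The band chosen below (8 to 12 Hz, the mu rhythm associated with movement) and the synthetic signal are illustrative assumptions, not the published University of Washington design:

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed for the sketch)

def keep_band(signal, low=8.0, high=12.0, fs=FS):
    """Zero out all FFT components outside [low, high] Hz, so only
    the task-relevant band would ever be transmitted."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# One second of synthetic "raw" EEG: a 10 Hz motor-related component
# plus a 40 Hz component standing in for everything else.
t = np.arange(FS) / FS
raw = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 40 * t)

filtered = keep_band(raw)
# The 10 Hz component survives; the 40 Hz component is removed.
```

Everything outside the transmitted band, including the components a snooper would want, simply never exists on the wire.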
That said, outside scientists have noted that this strategy places high performance demands on the neurotechnological devices themselves, since they would need processing capacity on top of the brain-wave sensors. Moreover, limiting access to the raw signal narrows the possibilities for third-party software development.
The gauntlet has been thrown down on neurotech’s privacy problems. In Ballarin’s view, cybersecurity solutions must come through global regulation, but legislation is typically slow to catch up with new issues, so manufacturers and developers must anticipate the blind spots in their products. Ultimately, it is consumers who will decide through their choices, and for that they need to be informed about the risks and aware of what is in their best interests.