British scientists have developed a technique to use functional magnetic resonance imaging scans to determine a person’s thought patterns. Yes, Virginia, that means they can scan you and tell what you’re thinking. From Yahoo news:
In the study, Maguire and her colleagues Martin Chadwick, Demis Hassabis, and Nikolaus Weiskopf showed 10 people each three very short films before brain scanning. Each movie featured a different actress and a fairly similar everyday scenario.
The researchers scanned the participants’ brains while the participants were asked to recall each of the films. The researchers then ran the imaging data through a computer algorithm designed to identify patterns in the brain activity associated with memories for each of the films.
Finally, they showed that those patterns could be used to accurately predict which film a given person was thinking about when he or she was scanned.
The results imply that the traces of episodic memories are found in the brain, and are identifiable, even over many re-activations, the researchers said.
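The decoding step the researchers describe — learning a distinctive activity pattern for each film, then matching a new scan against those patterns — can be sketched in miniature. This is a hedged illustration only: the study's actual algorithm isn't specified in the excerpt, so the sketch below uses a simple nearest-centroid classifier on synthetic "voxel" data, and names like `N_VOXELS` and the noise level are invented for the example.

```python
# Minimal sketch of pattern-based memory decoding, assuming a
# nearest-centroid approach on synthetic data (NOT the study's method).
import numpy as np

rng = np.random.default_rng(0)
N_FILMS, N_TRIALS, N_VOXELS = 3, 20, 50  # assumed sizes for illustration

# Pretend each film evokes its own characteristic activity pattern,
# and each recall trial is that pattern plus noise.
true_patterns = rng.normal(size=(N_FILMS, N_VOXELS))
X = np.vstack([
    true_patterns[f] + 0.5 * rng.normal(size=(N_TRIALS, N_VOXELS))
    for f in range(N_FILMS)
])
y = np.repeat(np.arange(N_FILMS), N_TRIALS)

# "Train": estimate each film's mean pattern from half the trials.
train = np.arange(len(y)) % 2 == 0
centroids = np.array(
    [X[train & (y == f)].mean(axis=0) for f in range(N_FILMS)]
)

# "Test": assign each held-out scan to the nearest film pattern.
test = ~train
dists = np.linalg.norm(X[test, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

On this toy data the patterns are well separated, so the classifier recovers the "film" far above the one-in-three chance level — which is the essence of the result the researchers report, however much more sophisticated their actual analysis was.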
Now, I’m not one who is typically nervous about new technology. And I can see some positive applications for this kind of technology: imagine mind-reading prosthetics for amputees or paraplegics; memory recovery for patients with amnesia, Alzheimer’s, or brain injuries; virtual reality applications; or complex remote control of robots in dangerous environments, such as space or the deep sea.
But I can also see how this can go horribly, horribly wrong. No, let me restate that – I can see how this will go horribly, horribly wrong.
Van Helsing at Moonbattery beats me to the punch:
Once liberals have established that absolutely nothing falls outside the purview of the nanny state, it will be time for regulators to move on to the next frontier: our private thoughts…
It shouldn’t be hard to win government funding for studies on how to determine which brains harbor “racist” or otherwise incorrect thoughts. After that, scientists will delve into techniques for erasing such thoughts.
There have been all sorts of movies involving manipulation of memories by the government (or some other sinister authority figure), from that one with Ben Affleck to Dark City, The Matrix, Total Recall, and even Men in Black. If this kind of technology is developed, it will be abused. It’s easy to see how it could start with efforts to criminalize thought as “hate speech,” or as a means to detect a propensity to commit violence. And it isn’t too hard to imagine it will soon be adapted to detect politically incorrect or inconvenient thoughts, and eventually to alter or remove them.
I can only hope that the people working on this realize how dangerous it is, and that if they follow it to its eventual conclusion, someone will find a way to abuse it, and the results will be disastrous.