A commentary by John Harris, appearing in The Guardian, 9 February 2012
Recent advances in neuroscience, such as memory manipulation, create compelling ethical dilemmas
This week it was reported that soldiers could potentially, in the near future, have their minds plugged directly into weapons systems, and have their learning boosted by neural stimulation. The Royal Society’s Brain Waves project on new directions in neuroscience gives us much to reflect on and worry about. And it follows the news last week that scientists are developing a “mind-reading” technique to capture thoughts.
Research in all this is in its infancy but, though new understandings of how the brain works generate new treatments for disease and brain damage, they also expose us to many new dangers. The challenge is always to use judgment and, if necessary, force to maximise good and minimise evil. We should be clear, however, that there is no safe, cost-free precautionary option: therapy delayed is rescue denied. As in all other areas of human activity, choice is not an option but a destiny. How should we choose?
The Royal Society report spoke of brain-machine interfaces (BMIs) to connect people’s brains directly to machinery. These interfaces are already being used to control artificial limbs for amputees, but they could also improve the speed and accuracy with which weapons systems are delivered. Rod Flower, chair of the report’s working group, rightly asks: “If you are controlling a drone and you shoot the wrong target or bomb a wedding party, who is responsible for that action? Is it you or the BMI?”
While this is a nice puzzle, the alternative without BMIs might be a greater likelihood that the wrong target will be chosen or hit. If we ban military BMIs, who is responsible for that?
The bigger question, though, is how to reduce the incidence of events where people suffer and others need to be called to account. Think of smart drugs that improve thought. Modafinil, a drug that keeps pilots alert, can indeed aid military pilots – but it also protects civilian passengers. The same drug also enhances other cognitive functioning, including exam performance.
We humans need to be smarter in order to combat a monstrous regiment of dangers that include climate change, meteorite strikes, diseases such as Aids and CJD, and an over-precautionary approach to innovation which may increase, rather than reduce, our vulnerability to these and other dangers. The dilemma is: whither caution? The ability to choose between caution and adventure assumes we can predict accurately – something we humans have been lamentably bad at.
In future, we’re also likely to face an ethical dilemma over memory manipulation. This is now a distinct possibility because drugs are available that can wipe, or certainly dampen, our recollection of events. Why should we tamper with our access to history? Well, one good reason is that memories can be traumatic. The victim of, for example, a brutal rape might well wish to wipe the memory. But what if doing so removes the capacity to identify the perpetrator, leaving him free to ruin others’ lives?
The neurotransmitter serotonin and the molecule oxytocin are hailed as agents which, by increasing reluctance to cause suffering on the one hand and trust on the other, can bring about an improvement in morals. Adjusting the levels of these chemicals in the body will effect changes which bypass decision-making and make certain behaviour, for all practical purposes, automatic. Why should we worry about bypassing morally defective decision-making? One reason is that it takes away our freedom.
Without the ability to reason about our decisions and to act on the basis of judgment – rather than prompted by impulse or by chemical, biological or technological stimulus – we not only lack liberty, the ability to choose; we lack the ability to choose wisely and well, to choose the best “all things considered”.
If we can read minds we might be able literally to see what someone has done and whether they did it on purpose. This would make solving crimes in principle simple and reliable. The problem here will be whether the science can reliably distinguish thoughts that describe fantasies or imaginings from those that record real deeds done.
The idea that neuroscience might enable thoughts to be read and intentions revealed is perhaps the most threatening of all to civil liberties. If we know someone intends to commit a murder or a robbery, why not monitor their thoughts and act pre-emptively? Apart from the obvious difference in quality between a wish or intention and an actual attempt, the reason might be that most of us form intentions that we abandon and wishes we never fulfil.
The price of liberty may be eternal vigilance but we need science, not least because it is our most obvious source of the sort of innovation that saves lives and produces welfare. Our vigilance must be as much to ensure we don’t stifle science as it is to be sure science remains our servant not our master.