Should scientists be allowed to create 'supersoldiers'? Experts warn desensitizing them to pain and emotion raises major ethical issues
- Scientists are already researching how to enhance the capabilities of soldiers
- One concern revolves around a soldier's capacity to give informed consent
- Enhancements could include altering a soldier's digestive system to enable them to digest cellulose, meaning they could use grass as food
- Or their sensitivity to pain could be diminished, or even the severity and likelihood of post-traumatic stress disorder (PTSD) reduced
Enhancing a soldier's capacity to fight is nothing new.
Arguably one of the first forms of enhancement was through improving diet.
The phrase 'an army marches on its stomach' goes back at least to Napoleon, and speaks to the belief that being well fed enhances the soldier's chances of winning a battle.
But recent research has gone well beyond diet to enhance the capabilities of soldiers, such as purposefully altering the structure and function of a soldier's digestive system to enable them to digest cellulose, meaning they could use grass as food.
Perhaps their cognitive capabilities could be substantially altered so they can make more rapid decisions during conflict.
Or their sensitivity to pain could be diminished, or even the severity and likelihood of post-traumatic stress disorder (PTSD) reduced.
Or prostheses could even be wired directly to their brains.
This kind of biological and technological enhancement is often referred to as developing 'supersoldiers'.
It's not science-fiction; research is underway around the world. And it brings with it a host of ethical concerns.
One concern revolves around the capacity for a soldier – or any other member of a military force – to give meaningful informed consent to partake in clinical research or undergo enhancement.
The concern here is twofold.
First, military research is often conducted under conditions of secrecy. This need for secrecy can impact how much information the subjects of enhancement receive, thus impacting the 'informed' part of informed consent.
Second, we might have concerns about whether a soldier can actively consent to enhancement.
That is, the hierarchical command structures and training in the military may impact the soldier's capacity to refuse enhancement.
Given the prominence of informed consent in medical ethics, this is a core issue for enhancement before we even get to conflict.
Numerous forms of enhancement look at ways of indirectly or directly impacting the soldier's cognitive capacities.
One example is countering the need for sleep through the use of drugs like amphetamines or Modafinil, or other longer-lasting neurological interventions.
Another is enhancing a soldier's capacity to make moral decisions.
Another concern is what might happen if we reduce a soldier's capacity to experience trauma with a drug like propranolol, which is being investigated for its ability to dampen the emotional force of particular memories.
If administered rapidly after a particularly traumatic military activity – say, killing a teenage combatant to protect a school full of children – this pharmaceutical intervention could reduce the likelihood or severity of the soldier developing post-traumatic stress disorder (PTSD).
The ethical worries here turn on whether such interventions negatively impact a soldier's capacity to follow the laws of war.
However, if these enhancements don't increase the chances of the soldier committing war crimes, then perhaps there is even a moral obligation to enhance soldiers in such situations.
Conversely, there are reasons to be worried that enhancing soldiers can make their opponents, or even civilians, treat those soldiers immorally.
For example, if it is believed that enemy soldiers are enhanced so that they don't feel pain, some might be more inclined to torture them.
Treating the enemy as inhuman or subhuman is sadly all too common throughout history.
Enhancements may exacerbate this process, particularly if opposing groups can classify their enemies as inhuman mutant supersoldiers.
Another concern is around the soldier's life after conflict ceases or they leave the military.
For instance, does an enhancement have to be reversible? And if not, what special responsibilities does the military have to care for veterans, above and beyond existing supports? Similar issues have already been explored in science-fiction.
In a sense, none of these ethical concerns are especially new.
Informed consent, limiting war crimes and a responsibility to care for veterans are hardly novel ideas.
What enhancement technologies do is shine a light on existing behaviour.
And though we don't need to worry about enhanced soldiers becoming mutant superheroes quite yet, there is value in considering the ethical aspects of such technologies before they are used rather than after the fact.
Swarms of robots scouring enemy terrain ahead of ground troops in 'Iron Man' exoskeletons with built-in weapons may sound like something straight from the pages of a sci-fi novel, but Army experts believe this is actually a glimpse of war in 2025.
The US Army has revealed a 30-page vision of the future of battle, described as a 'mad scientist' concept.
It says that while robots will play a vital role, humans will still be at the heart of battle - albeit heavily augmented by technology.
According to the strategy, the Army envisions that, by 2025, ground troops will conduct foot patrols with robots called 'squad multipurpose equipment transport vehicles' that carry rucksacks and other equipment. Overhead, unmanned aircraft will serve as spotters to warn troops of nearby enemy forces.
'As we look at our increasingly complex world, there's no doubt that robotics, autonomous systems and artificial intelligence will play a role,' said Lt. Gen. Kevin Mangum, the deputy commander for Army Training and Doctrine Command (TRADOC).
'We in the Army need to get our arms around what's in the realm of the possible.'
The 30-page strategy outlines five objectives to guide the technology that will ensure Soldiers can survive and succeed in such an environment.
Speaking at TRADOC's Mad Scientist conference on robotics, artificial intelligence and autonomy, Mangum said he hoped to learn more from industry partners and those in the scientific community about how the Army might fight in the future.
However, no matter how much technology evolves, there will always be room for Soldiers, he said.
Leader-follower vehicles that the Army is currently testing to conduct autonomous convoys, for example, would still need troops to keep them out of enemy hands.
'We're not going to do these things totally autonomously,' Sadowski said.
'We're always going to have Soldiers involved in the process. You don't put treasure on the road without some sort of security.'
With the increasing sophistication of cyber and unmanned systems along with the expectation that fighting in the future will occur more often in densely populated urban areas, the Army has been refining its multi-domain battle concept after officially rolling it out in October.
Small robots could also be used for reconnaissance to increase situational awareness, while unmanned aerial systems deliver cargo to improve sustainment and reduce the reliance on manned rotary-wing support, the strategy notes.
'We feel that we're going to be contested in every domain,' Mangum said at the conference, which is intended to spark dialogue on technology innovation between the military, academia and industry.
'That's why this session is so timely to talk about what the challenges and, more importantly, what the opportunities are for us to be able to operate in that space.'
The strategy predicts autonomous systems being fully integrated into the force by 2040.
But the Army will also need to invest in robust communications and network systems that allow autonomous systems to talk to each other.
Another idea the Army is considering is an Iron Man style 'warrior suit' that Soldiers can wear in combat, an exoskeleton equipped with computerized technology that can provide intelligence updates as well as integrate indirect and direct-fire weapons systems.
Coding and new algorithms also lack funding, and although they may not be 'very sexy' they are still required to make robots move.
'You can have the slickest platform in the world, but if you don't have the right algorithms, it's not going to work,' Fountain said, speaking at the Mad Scientist event.
The two-day conference, he added, should help Army scientists gain more insight from other experts on the potential impacts and capabilities of new technology.
'Scientists are great at developing technology, but sometimes we need the assistance of a larger community to understand what those second and third order impacts are,' he said.
'We need your help in envisioning the role of robotics and artificial intelligence in the future.'