The London Daily News


04 August, 2009 04:05 (GMT +00:00)
"Terminators" on the battlefield could pose threat to human beings

The stuff of films no more, robots with artificial intelligence able to distinguish between "friend" and "foe" could pose a serious risk to human beings, argues Professor Noel Sharkey from the University of Sheffield

News Desk

THE concept that robots might eventually be able to tell friend from foe is deeply flawed, says roboticist Noel Sharkey of the University of Sheffield. He was commenting on a report calling for weapon-wielding military robots to be programmed with the same ethical rules of engagement as human soldiers.

With growing concern in the US and the UK about troop losses in Afghanistan, the development of artificial intelligence that would allow robots to be armed could spare the loss of lives we have seen recently.

A report funded by the Pentagon has warned that companies trying to fulfill the requirement for one-third of US forces to be unmanned by 2015 may ignore the serious ethical concerns that have already been raised, and place commercial interests ahead of all other considerations.

"Fully autonomous systems are in place right now," warns Patrick Lin, the study's author at California State Polytechnic in San Luis Obispo. "The US navy's Phalanx system, for instance, can identify, target, and shoot down missiles without human authorisation."

Excerpts from the report:

“Clearly, there is a tremendous advantage to employing robots on the battlefield, and the US government recognizes this.  Two key Congressional mandates are driving the use of military robotics: by 2010, one‐third of all operational deep‐strike aircraft must be unmanned, and by 2015, one‐third of all ground combat vehicles must be unmanned [National Defense Authorization Act, 2000].”
 
“Most, if not all, of the robotics in use and under development are semi-autonomous at best; and though the technology to (responsibly) create fully autonomous robots is near but not quite in hand, we would expect the US Department of Defense to adopt the same, sensible ‘crawl-walk-run’ approach as with weaponized systems, given the serious inherent risks.”


“Nonetheless, these deadlines apply increasing pressure to develop and deploy robotics, including autonomous vehicles; yet a ‘rush to market’ increases the risk for inadequate design or programming. Worse, without a sustained and significant effort to build in ethical controls in autonomous systems, or even to discuss the relevant areas of ethics and risk, there is little hope that the early generations of such systems and robots will be adequate, making mistakes that may cost human lives.” (This is related to the ‘first-generation’ problem we discuss in sections 6 and 7: that we won’t know exactly what kind of errors and mistaken harms autonomous robots will commit until they have already done so.)

Professor Sharkey questions robots' ability to make judgments and act as humans do:

"It trots out the old notion that robots could behave more ethically than soldiers because they don't get angry or seek revenge." But robots don't have human empathy and can't exercise judgment, he says, and as they can't discriminate between innocent bystanders and soldiers they should not make judgments about lethal force.”

“But the presumptive case for deploying robots on the battlefield is more than about saving human lives or superior efficiency and effectiveness, though saving lives and clearheaded action during frenetic conflicts are significant issues. Robots, further, would be unaffected by the emotions, adrenaline, and stress that cause soldiers to overreact or deliberately overstep the Rules of Engagement and commit atrocities, that is to say, war crimes. We would no longer read (as many) news reports about our own soldiers brutalizing enemy combatants or foreign civilians to avenge the deaths of their brothers in arms…”

Israel and the US have already developed unmanned, remotely controlled "drones" that, like the Predator, can be equipped with two Hellfire laser-guided missiles.



 