Author – David Anderson
Experts in robot ethics are studying “how to equip robots with a set of moral rules.”
Robots have ethics?
No, not yet, but “computer scientists are teaming up with philosophers, psychologists, linguists, lawyers, theologians and human rights experts in the emerging field of robot morality to try to change that,” according to an article by Robin Marantz Henig in the January 9, 2015 edition of the New York Times.
And how are they to help, much less ensure, that robots distinguish between right and wrong?
Answer: installation of an “ethical adapter to help the robot emulate guilt.”
If only Petraeus, or Allen, or Sinclair, or Johnson-the-third – to name a few military men with tarnished brass caught in affairs more or less recently; or some in the Richland (Washington) School District who think morals don’t matter – at least not enough to keep a morals clause in their contracts; or students at Harvard who are unsure what cheating means; or Sandusky, Paterno and Spanier, who were ejected from the Penn State football field over the child sexual abuse scandal; or fill in the blank –
If only all those listed in the hall of shame (with more nominees to be named later) had had an “ethical adapter” on board.
David Petraeus – who may have betrayed us, suspected of sharing classified information, and who “stepped down in November 2012 as head of the CIA after his affair with an Army reserve officer who was writing his biography became public” – now may be prosecuted for allegedly divulging military secrets, report Richard A. Serrano and Timothy M. Phelps in the Los Angeles Times, this January 9th.
How handy an “ethical adapter” would have been to the retired Army general.
And then there are some school administrators in Richland who, according to an article by Ty Beaver in the Tri-City Herald, May 18, 2013, decided, at least then, to “stick with a morality clause in superintendent contracts,” though “some education officials question how to define moral conduct.”
One long-term school administrator quoted in the article said he “would be offended if anyone asked me to sign (a morality clause)."
Still another suggested that while morality “was standard language” at one time, perhaps a review would be in order, given morality’s “ambiguity.”
So, following this line of (non)thinking – that morality is a rather loosey-goosey, careless and imprecise, not to mention passé, concept – how are programmers, being human and all, supposed to program a robot to be moral?
Is it simply a matter of wiring, or, given technological advancements: ‘sensoring’?
With “lightning speed,” for example, a car using onboard “lasers, radar and cameras” mounted on its roof and windshield can now make “rapid probabilistic predictions” and ‘sense’ what it should do – say, avoid the oncoming passenger van by plowing into the Volkswagen slug-bug on the right, ‘since’ fewer people will die and fewer people dying is the programmed ‘value.’
An added benefit of such faceless features, while possibly accompanied by an increase in roadside carnage, would be the elimination of road rage: your laser-radar-camera-equipped car taking control when you lose it.
And given a human’s unpredictability – made as humans are not of wires, but ruled too often by their mistake-prone emotions (e.g., road rage) – battles, to take another example alongside automobiles, should be fought – to hear the roboticists tell it – unclouded by “panic, confusion or fear,” those most undependable and unreliable of emotions, with enemies instead precisely and coldly and robotically obliterated, identified by uniform (unless already wounded) and by target (hopefully before the robot realizes it’s a school or a hospital).
Had Sonny Weaver (“Draft Day,” starring Kevin Costner) an “ethical adapter,” would he, as the General Manager of the Cleveland Browns, have had absolutely no difficulty choosing between picking the player everyone expects him to and going with his gut – selecting the comparative no-account he felt was truly right for the team?
Maybe. But then there wouldn’t have been the movie that left us until the very end on the horns of said dilemma.
If right choices and correct decisions were simply a matter of wiring, and not of substantial and prolonged and (yes, quite emotional) agonizing, would the world have given us Nelson Mandela, Dietrich Bonhoeffer, Martin Luther King, Jr., Robert F. Kennedy, Cicely Saunders, Aung San Suu Kyi, Edith Cavell and Raoul Wallenberg – all subjects of Gordon Brown’s book “Courage,” subtitled “Portraits of bravery in the service of great causes”?
Great victories and abysmal defeats are both the result of choices – the former, history will show, involving “worthwhile and noble sacrifices”; the latter, as history will also show, driven by personal, me-ism self-aggrandizement.
Rightly did Brown conclude of the noteworthy in history:
“Social disapproval, danger, physical pain, and even the risk of death mattered far less to them than personal belief and moral purpose. Quite simply, they seemed to be driven and sustained by higher ideals.”
Which no amount of robotic tinkering by experts can program.