
mention the open letters on Research Priorities for Robust and Beneficial Artificial Intelligence and on Autonomous Weapons, which warn us against the risks associated with the uncontrolled development of AI.

Yet even if drones are not an existential threat per se, so long as they remain tools fully controlled by human agents, they can nonetheless become dangerous in two different ways. First, the danger is intrinsically linked to the intentions of their users or to potential technological flaws: commercial planes are not weapons until someone decides to use them as such. Second, and more subtly, if one accepts the premises of the substantive theory of technology, drones can shape our behaviour without our even realizing it. In both cases, drones will inevitably have an impact on social values. Given the diversity of values, thoroughly evaluating this impact would be highly subjective and contextual. There seems, however, to be at least a weak consensus on some widely shared values threatened by AI-fitted drones. To mention but a few: courage, responsibility, human dignity, civil liberties, work, security and even peace are certainly among the most internationally shared values at risk.

A quick look at the literature on AI shows that living with and among intelligent machines will change our perceptions. The value of humanness itself will be reassessed. The way we see each other and ourselves will evolve to the point where we may no longer feel like valuable beings, since machines can perform better than we ever could. This kind of overconfidence in intelligent machines is already widespread among people who think that autonomous cars or planes are more reliable than those driven or flown by humans. It leads to overreliance on, and then over-use of, machines, on the assumption that algorithms can do better than brains. In the field of war, some even consider that machines will act more morally because, unlike humans, they have no feelings that could impair their judgment. What does that mean for the way we value feelings as behaviour decoders? Should we consider that there is no room for feelings in conflicts? What about compassion? Do we really want indifferent combat machines? And if such indifference is tolerable in war, then we should consider it for the police as well: a real RoboCop that impassively enforces the law would then be praised.

AI-fitted drones, like any AI-fitted machines, could break the already tenuous thread that links individuals to each other. They would dehumanize conflicts and weaken our sense of responsibility, putting a moral buffer between killers and their victims. By delegating our decisions to machines, we would also lose highly valued virtues such as prudence, justice, temperance and courage. AI-fitted machines would alter our relation to other human beings and thus to the very worth of life. Even our understanding of what is moral and/or legal would be impaired, since justice could not apply to machines the same way it does to humans. So would our relation to authority, for governments would lose control over whole segments of our everyday activities. Eventually, AI-fitted drones, along with other AI-equipped systems, could unravel the fragile fabric of societies built on shared experiences, established rules and common values.

