Synapse - Africa’s 4IR Trade & Innovation Magazine - 3rd Quarter 2020 Issue 09 (Show Edition)


AI ETHICS

By Daniel Wertheimer, Data Scientist

As Data Science and AI practitioners, and as organisations looking to adopt AI, we need to be aware of how systemic oppression and discrimination are represented within our data. It is important for us to identify problematic biases and tackle them head-on, to ensure the products we put in people's hands make their lives better.

A biased model, statistically speaking, is one that has learnt the initial data so well that it cannot make accurate predictions on new, unseen data. This is the type of bias that Data Scientists deal with most often, and it is a function of the model they are building. The bias dealt with less often, the bias with ethical implications, is the bias that exists within the data the model uses and within the person building it. We acknowledge that there is systemic oppression; by definition, it is encoded within the institutions we interact with daily and, consequently, in the data we are exposed to.

Recently, an AI tool designed to take pixelated photographs of people and reconstruct them into a more accurate picture was posted on Twitter.

Source: designyourtrust.com

While some results were humorous, one user highlighted that the AI was problematic when it converted an obvious photo of Barack Obama into a white man.

Source: Twitter @Chicken3gg

Other users had similar experiences when using images of non-white groups. The use case for this face depixelizer was to reconstruct facial features from pixelated data; however, what has been created is a model that places more emphasis on white features and works better on white people. This happened because the model was trained on a research dataset named Flickr-Faces-HQ (FFHQ), which consists mainly of white people. The skew towards white representation in datasets extends further than FFHQ, and can be seen simply by googling "beautiful woman", which returns pictures of largely young, white women. This can lead to models working better for white people, in facial recognition for example. While the above are examples of how hegemonic whiteness is perpetuated through imbalanced data, more insidious is how minority groups are represented by models trained on biased data. In April this year, another experiment on Twitter went viral where
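The statistical failure mode the article opens with, a model that learns its training data so well that it cannot handle unseen data, can be illustrated with a minimal sketch. This is not the depixelizer model or any real system: it is a deliberately naive "model" that memorizes random labels, so its training accuracy is perfect while its accuracy on new inputs is no better than guessing.

```python
import random

def memorizing_model(train_x, train_y):
    """A 'model' that memorizes the training set exactly: perfect on
    data it has seen, no better than chance on data it has not."""
    table = dict(zip(train_x, train_y))
    return lambda x: table.get(x, 0)  # arbitrary default for unseen inputs

random.seed(0)
train_x = list(range(100))
train_y = [random.randint(0, 1) for _ in train_x]  # pure-noise labels
predict = memorizing_model(train_x, train_y)

# Perfect recall of the data it was built from...
train_acc = sum(predict(x) == y for x, y in zip(train_x, train_y)) / len(train_x)

# ...but roughly coin-flip accuracy on inputs it has never seen.
test_x = list(range(100, 200))
test_y = [random.randint(0, 1) for _ in test_x]
test_acc = sum(predict(x) == y for x, y in zip(test_x, test_y)) / len(test_x)

print(train_acc)  # 1.0 -- memorized perfectly
print(test_acc)   # around 0.5 -- no generalization
```

The gap between the two accuracies is the symptom Data Scientists routinely monitor; the ethical biases the article goes on to describe do not show up in this kind of check, which is precisely why they are dealt with less often.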
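The FFHQ example suggests one practical habit: audit a dataset's demographic make-up before training on it. The sketch below assumes you already have one demographic label per example; the labels and the distribution are made up for illustration and are not real FFHQ statistics, and the 50% threshold is an arbitrary choice, not an established standard.

```python
from collections import Counter

def representation_report(labels):
    """Return each group's share of the dataset, given one
    demographic label per example."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def is_skewed(labels, threshold=0.5):
    """Flag the dataset if any single group exceeds `threshold` of all
    examples -- a crude proxy for FFHQ-style imbalance."""
    return max(representation_report(labels).values()) > threshold

# Illustrative, made-up label distribution (not real FFHQ numbers)
labels = ["white"] * 70 + ["black"] * 10 + ["asian"] * 12 + ["other"] * 8
print(representation_report(labels))  # {'white': 0.7, 'black': 0.1, ...}
print(is_skewed(labels))              # True: one group dominates
```

A report like this only surfaces imbalance in whatever labels you have; it says nothing about biases in how the data was collected or labelled in the first place, which still require the head-on scrutiny the article calls for.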

