2020 Newcomb: Tech In Mind Zine

This understanding of algorithms is incredibly naïve and fails to acknowledge how such algorithms are developed in the first place. No algorithm writes itself, and the goal of staying objective by substituting an algorithm for human judgment often overlooks this. Though the Apple Card is the most recent case in the news, gender bias in algorithms goes well beyond it. Another tech giant, Amazon, discriminated against women with an algorithm it developed as a hiring tool. Amazon's goal was to automate the hiring process: ideally, its product would go through every resume submitted for a job posting and select the top candidates, saving time and effort. In practice, the algorithm learned to favor male candidates based on differences in language, especially for technical jobs. Because the training set relied on successful past applicants, and because men outnumber women in technical fields, the wording men used in their resumes was deemed higher quality than that of their female counterparts. Amazon eventually pulled the initiative completely after realizing how spectacularly it was failing.
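To make that failure mode concrete, here is a minimal, hypothetical sketch. It is not Amazon's actual system, which was never made public; the resumes, labels, and model choice are invented for illustration. A toy screener trained on historically skewed hiring decisions ends up penalizing wording associated with women rather than measuring qualifications.

```python
# Hypothetical illustration only -- not Amazon's actual system.
# A toy resume screener trained on historically skewed hiring labels
# learns to penalize gendered wording instead of qualifications.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented past resumes and biased historical outcomes (1 = hired).
# The "hired" class is dominated by male-associated wording only
# because of who was hired in the past, not because of skill.
resumes = [
    "captain of men's rugby team, software engineering intern",
    "men's chess club president, built distributed systems",
    "software engineering intern, published compilers research",
    "captain of women's rugby team, software engineering intern",
    "women's chess club president, built distributed systems",
]
hired = [1, 1, 1, 0, 0]

vectorizer = CountVectorizer()          # tokenizes "women's" as "women"
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weights: the token "women" gets one of the most
# negative coefficients, even though it says nothing about ability.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:3])
```

Stripping the explicitly gendered words would not fix the underlying problem, since other wording can act as a proxy for the same signal; the bias lives in the historical labels, not in any single feature.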

The Bottom Line: The common problem in these poorly designed algorithms is that they learn the biases reflected in their training sets, biases that exist because of preexisting social discrimination, and then reinforce them. As we come to rely on algorithms for more of our work, it's important to recognize that time and time again they fail to remove human biases. Instead, they simply automate them.

Read More About It:
Algorithms of Oppression by Safiya Noble
Automating Inequality by Virginia Eubanks
Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech by Sara Wachter-Boettcher

