Finger on the Scale: Bias in Computer Systems

Works Cited

Angwin, Julia, et al. “Machine Bias.” ProPublica, 23 May 2016, www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

Notes: I found their data analysis explanation (Larson) before I found the original article. This article gives human faces to the data, adding the emotional component that the numbers alone lack. One of the reasons I chose this article and the companion analysis (Larson) is that one of the authors also wrote the "It's Complicated" article in our reading. ProPublica is a highly rated non-profit, and the article's authors are award-winning in their field (including a Pulitzer).

Bellovin, Steve. “Yes, ‘Algorithms’ Can Be Biased. Here's Why.” Ars Technica, Condé Nast, 24 Jan. 2019, arstechnica.com/tech-policy/2019/01/yes-algorithms-can-be-biased-heres-why/.

Notes: An initial article to get an easy overview of the topic, as well as several points of research to consider. In the article, Dr. Bellovin, a professor of computer science at Columbia University, breaks down examples of algorithms in play, from Amazon's suggestions to the unintended consequences of facial recognition software. Ars Technica is aimed at a technology-friendly audience and is not an academic site (though Dr. Bellovin does link to news sites like Reuters and non-profits like ProPublica). Dr. Bellovin's engaging style helps me think about how to communicate the key points of my research in a visual medium.

“Community Resources for Justice Inc.” Charity Navigator, June 2018, www.charitynavigator.org/.

Notes: Charity Navigator provides information about non-profit organizations, such as yearly donations and any public IRS information. Query terms: "community resources for justice", "able gamers", "propublica"

Condit, Jessica. “The Terrible, Fantastic Life of AbleGamers COO Steven Spohn.” Engadget, Verizon Media, 25 Mar. 2020, www.engadget.com/2020-03-25-steven-spohn-ablegamers-sxsw-gaming-interview.html.

Notes: AbleGamers and its COO Steven Spohn provide necessary insight into the needs of disabled gamers. The article includes multiple visual examples of gamers playing with specialized controllers, as well as Microsoft's success with the Xbox Adaptive Controller. "Inclusion is a choice" and the Adaptive Controller itself are things I want to portray in my infographic.

Flores, Anthony W., et al. “False Positives, False Negatives, and False Analyses: A Rejoinder to ‘Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Blacks.’” Machine Bias Rejoinder, Community Resources for Justice, Inc., 9 July 2017, www.crj.org/assets/2017/07/9_Machine_bias_rejoinder.pdf.

Notes: A rejoinder to ProPublica's analysis (Larson). The opening paragraph states that the study's "erroneously established conclusions is exacerbated by the large-market outlet in which it was published (ProPublica)". The rest of the piece reads as if it were a personal affront to the CRJ staff: "Before we proceed further, we want to make it clear that we are not supporting or endorsing the idea of using risk assessment at sentencing (but do support its use at certain decision points in the correctional system) nor are we advocating for the Northpointe COMPAS. We also are not making any blanket statements about race, test bias, and all ARAIs." (ARAI: actuarial risk assessment instrument.)

Community Resources for Justice (CRJ) is the non-profit that published the rejoinder. I used Charity Navigator to look up its public information, but since CRJ receives less than 40% of its donations through individual contributions and the rest from government grants, I am less trusting of their data (Flores). How can a proper analysis and audit of the COMPAS software be done if the people reviewing it aren't diverse either? (CRJ's leadership page lists five white men and women.) An additional thought I had was "Why was the non-profit incorporated rather than set up as a limited liability company?", but only because I was doing my own business-related administration and the difference was at the forefront of my mind.
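Because the whole dispute turns on which error rates are compared across groups, a worked sketch helps me keep the terms straight. Below is a minimal Python example of the kind of per-group false positive / false negative calculation that ProPublica-style analyses rely on; the records, groups, and field names are invented for illustration and are not the actual COMPAS data.

```python
# Hypothetical sketch: comparing error rates across groups, in the spirit of
# the ProPublica/CRJ dispute. All records and field names are invented.
from collections import defaultdict

# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("A", True,  False), ("A", True,  True),  ("A", False, False),
    ("A", True,  False), ("B", False, True),  ("B", True,  True),
    ("B", False, False), ("B", False, True),
]

counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
for group, predicted, reoffended in records:
    c = counts[group]
    if reoffended:
        c["pos"] += 1
        if not predicted:
            c["fn"] += 1      # labeled low risk but did reoffend
    else:
        c["neg"] += 1
        if predicted:
            c["fp"] += 1      # labeled high risk but did not reoffend

for group, c in counts.items():
    fpr = c["fp"] / c["neg"] if c["neg"] else 0.0
    fnr = c["fn"] / c["pos"] if c["pos"] else 0.0
    print(f"group {group}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
```

ProPublica's claim was about a gap in exactly these per-group rates, while the rejoinder emphasizes other measures of test fairness; the two sides are largely arguing over which of these numbers should matter.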

Goethe, Taylor Synclair. “Bigotry Encoded: Racial Bias in Technology.” Reporter Magazine, Rochester Institute of Technology, 2 Mar. 2019, reporter.rit.edu/tech/bigotry-encoded-racial-bias-technology.

Notes: One of the initial articles I read regarding bias in hardware and how a lack of diversity within the engineering team led to the issue. The simple illustration style captured my attention, and I am planning to use something similar.

Larson, Jeff, et al. “How We Analyzed the COMPAS Recidivism Algorithm.” ProPublica, 23 May 2016, www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm.

Lee, Nicol Turner, et al. “Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harms.” Brookings, The Brookings Institution, 25 Oct. 2019, www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.

Notes: One of the most thorough treatments of search engines and the algorithms that run them. The inner mechanisms of these engines are opaque, but Noble (an associate professor at UCLA) breaks them down with numerous examples, backed by thorough research, extensive footnotes, and a bibliography. The book provides a viewpoint that I cannot and will not ever be able to fully capture: that of a black woman with children trying to navigate the internet safely. She points out numerous dangerous assumptions, chief among them that Google is reliable and without commercial interest (it is not), and examines search engine echo chambers.

It is interesting to note that my Internet use predates search engines, and that my method of searching is to focus on key phrases and words rather than writing out whole questions (that level of granularity was not available early on). Is the end user being influenced by being able to ask specific questions, and then having those questions influence future search results? The autocomplete option is a prime example of the echo chamber effect. The book also provides another example of deliberate choices on Google's part: blocking search results in countries where the content is illegal (Nazi memorabilia). It is not illegal in the US, so those results were not blocked there. (Twitter has something similar.) This raises an ethical question: should those results be shown at all, regardless of legality? Since I began this research, Google has added the option to auto-delete app data and search histories after a set time period. A welcome change, and I do want to know what prompted it. Again and again, this book emphasizes that Google is not a public organization, a key thing to remember in this time of digital privacy concerns.
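To make the autocomplete echo chamber concrete for myself, here is a toy sketch of a frequency-ranked suggester feeding back on itself: every accepted suggestion adds another count to the phrase it suggested, which makes it even more likely to be suggested next time. The query log and ranking rule are invented assumptions, far simpler than whatever Google actually does.

```python
# Toy autocomplete ranked purely by past query frequency, to illustrate the
# feedback loop Noble describes. Entirely hypothetical seed data.
from collections import Counter

query_log = Counter({
    "why are cats ": 5,
    "why are cats so soft": 3,
    "why are cats so mean": 9,
})

def suggest(prefix, log, k=3):
    """Return the k most frequent past queries starting with the prefix."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix)]
    matches.sort(key=lambda qn: qn[1], reverse=True)
    return [q for q, _ in matches[:k]]

# Each time a user accepts the top suggestion, its count grows, which makes
# it even more likely to be shown next time: the echo chamber.
for _ in range(3):
    top = suggest("why are cats", query_log)[0]
    query_log[top] += 1
    print(top, query_log[top])
```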

Sharpe, Vy, et al. “Move Slow and Fix Things: Teaching Computer Science Majors to Decode Discrimination and Design Diverse Futures.” Transformations: The Journal of Inclusive Scholarship and Pedagogy, vol. 28, no. 2, 2018, p. 202, doi:10.5325/trajincschped.28.2.0202.

Notes: This work focuses on fixing the issues of bias in technology by teaching its future practitioners. The authors proposed a class for students outside their department, with the enthusiastic and full support of the computer science department. (As outlined in my proposal, they identified a problem, called it out, and worked with the CS department to formulate a plan to tackle it.) Pedagogy is the methodology behind teaching (something I will be getting into as a future art educator), and this journal focuses on inclusive pedagogy. It is important to provide examples of solutions, especially successful ones, and I feel this article hit the spot. I want to adapt it in some form as part of my infographic.

Selected passages:

Paragraph 2: "Indeed, at the end of the semester, one student reflected on his experience in the class: “the most important thing I’ve learned in this course has definitely been recognizing all the ways in which I am privileged as a white cisgender male and learning about some ways that I can make computer science and the tech industry more inclusive for everyone.”"

Paragraph 3: Facebook's motto "Move Fast and Break Things" vs. "Move Slow and Fix Things."

Paragraph 4: "The [computer science] department was supportive and immediately saw the need for the course."

Paragraph 6: "One resource that helped students understand how racial bias impacts the development of technology is MIT Media Lab scholar Joy Buolamwini’s 2016 discussion of the “encoded gaze”—a concept she uses to discuss how facial recognition software is encoded with bias through the use of inequitable datasets to train the software’s processes."

Paragraph 6: "While many of our students had taken courses on algorithms and machine learning, they had not learned about racial or other forms of discrimination that can shape the creation and choice of datasets used to train machines to learn. Students learn that the resulting “encoding” of discrimination in algorithms presumed to be more objective has direct implications on how people experience current and emerging technologies."

Paragraph 7: "the prompt, “I have avatars to represent myself in the ways I want” became “People are entitled to have avatars to represent themselves in the ways they want.”"

Personal connection: A Twitch streamer I follow, DeeJayKnight, was overwhelmed to discover that The Outer Worlds video game gave him the option of a black hairstyle that wasn't an afro, that looked good, and that was lit and textured properly. He stopped the stream, took off his glasses, and was just so happy. That is something that has never even entered my world view.

Paragraph 8: "What is the underlying logic of the technology—is it to facilitate an inclusive future or is it to erase human variability and difference?"

Paragraph 9: "...the standard ways of approaching and teaching disability in the context of legal accessibility requirements, or the “checklist” approach, dramatically misses the wide range of user needs and experiences in the world."

Paragraph 13: "By connecting diversity, technology, and ethics in the classroom, we can not only imagine more inclusive futures, but train the technologists to build them"
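Buolamwini's "encoded gaze" point about inequitable training datasets can be shown with a toy classifier: train mostly on one group, and the error rate for the underrepresented group comes out worse even though the model treats everyone "the same." The sketch below uses synthetic numbers and a simple nearest-centroid model, all invented for illustration; it is not how any real facial recognition system works.

```python
# Toy illustration of the "encoded gaze": a classifier trained on a dataset
# that underrepresents one group makes more errors on that group.
# All data is synthetic and invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic 2-class samples for one group; `shift` moves its features."""
    x0 = rng.normal(loc=0.0 + shift, scale=1.0, size=(n, 2))   # class 0
    x1 = rng.normal(loc=2.0 + shift, scale=1.0, size=(n, 2))   # class 1
    return np.vstack([x0, x1]), np.array([0] * n + [1] * n)

# Group A dominates the training set; group B is underrepresented.
xa_tr, ya_tr = make_group(500, shift=0.0)
xb_tr, yb_tr = make_group(25, shift=1.5)
x_tr = np.vstack([xa_tr, xb_tr])
y_tr = np.concatenate([ya_tr, yb_tr])

# Nearest-centroid "model": one centroid per class, fit on the pooled data,
# so the centroids mostly reflect group A's feature distribution.
centroids = np.array([x_tr[y_tr == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Evaluate separately per group: the underrepresented group fares worse.
for name, shift in [("A (well represented)", 0.0), ("B (underrepresented)", 1.5)]:
    x_te, y_te = make_group(1000, shift)
    err = (predict(x_te) != y_te).mean()
    print(f"group {name}: error rate {err:.2%}")
```

The fix in this toy setting is the same one the article argues for in practice: examine who is in the dataset before trusting the aggregate accuracy number.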

Webster, Andrew. “The Last of Us Part II Isn't Just Naughty Dog's Most Ambitious Game - It's the Most Accessible, Too.” The Verge, Vox Media, 1 June 2020, www.theverge.com/21274923/the-last-of-us-part-2-accessibility-features-naughty-dog-interview-ps4.


