
Module 8: Ethics, Biases, and Diversity in the Digital World

Should original audience and intent be a factor in digitization? Who should have access to digitized materials? How do algorithms introduce bias into search results? This week’s module dealt with some incredibly tough questions and problems regarding digital ethics.

Although succinct, Michelle Moravec’s article, “What would you do? Historians’ ethics and digitized archives,” pointedly asks historians to consider context, original intent, and whether digitization would cause harm.[1] Moravec argues that it’s important to think “about our responsibilities as the users of these digitized archival material(s), when what we write is online, and when our reuse of digitized materials may at the least violate copyright and the worst cause harm to individuals.” Her article includes six questions that resulted from a roundtable discussion and serve as a starting point for examining digital ethics. Since the piece was written in 2016, are there other questions or issues that you think should be added to that ethical framework?

One of Moravec’s questions concerns problems with metadata and ties in with Sharon Block’s article, “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories.”[2] Although librarians have long recognized the anti-LGBTQIA+, Eurocentric, and other biases embedded in the Library of Congress Subject Headings, Block went a step further and examined bias in digital databases like JSTOR. Her study revealed “problematic topic indexing,” such as the exclusion of the term “women” (but not “men”) from the topics assigned to Jennifer Morgan’s article, “‘Some Could Suckle over Their Shoulder’: Male Travelers, Female Bodies, and the Gendering of Racial Ideology, 1500-1770.” Block uses this example to illustrate how algorithmic indexing can skew search results.

Going a step further, Safiya Umoja Noble’s research into algorithmic bias is eye-opening. I opted to watch her 2015 talk, “Power, Privilege, and the Imperative to Act,” and was shocked to learn about the pornographic and hypersexualized results that Google searches returned for the term “black girls” (along with Asian and Latina girls). Noble reached out to Bitch Magazine to do a story on the issue and was initially met with reluctance because they assumed that everyone knew this happened. While discussing some of the biases and problematic index terms that influence algorithms, Noble offers her librarian audience several ways to fight the racist framework that so often pervades metadata. She gives the example of how the indexing term “black history” was sometimes tied to racist imagery that perpetuated stereotypes in a popular image database. She includes a quote that elaborates on how this has a ripple effect in society: “Media representations of people of color, particularly African Americans, have been implicated in historical and contemporary racial projects. Such projects use stereotypic images to influence the redistribution of resources in ways that benefit dominant groups at the expense of others.”

Noble ends her talk by bringing to light issues with the extraction industry and how people in the Congo are often trading their lives for the manufacture of our devices. She implores the audience to be more mindful of the larger impact of digitization. I needed to know more about this and found a 2018 article from the Guardian, “Is your phone tainted by the misery of the 35,000 children in Congo’s mines?” This heartbreaking topic is going to be another research rabbit hole for me.

I haven’t had time to read her book, “Algorithms of Oppression: How Search Engines Reinforce Racism,” but I look forward to delving into it in the future. There is so much happening within these larger themes that it leads to more questions and discussion. While we spent some time working through questions about the “right to privacy during the digital age” and “what should be digitized” in our small groups, we could easily spend an entire semester on this topic.


[1] Michelle Moravec, “What would you do? Historians’ ethics and digitized archives,” 2016.

[2] Sharon Block, “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories,” Digital Humanities Quarterly 14, no. 1 (2020).

4 Comments

  • Paola Torrico

Hi Paula! Your post brings up interesting points. Your reference to Safiya Umoja Noble’s research into algorithmic bias is what shocked me the most! Not sure how I missed this point in the article, but the fact that “they assumed that everyone knew this happened” (in regards to the pornographic and hypersexualized results for the term “black girls,” along with Asian and Latina girls) is absolutely mind-blowing, and it is shameful that this continues to this day. I think certain practices and/or guidelines need to be put in place to correct these issues that feed into unwanted stereotypes.

  • Matt Grembowitz

I like that you bring up Noble’s research on the pornographic items that come up when looking up black, Asian, or Latina girls on Google. This is just another way in which people are being made to seem more and more like the “other,” and it perpetuates a fetishizing of these girls of non-white ethnicities. In my eyes this makes them look more like objects than people. I never really thought about how the internet could push these sick ideas just by the search “black girls.”

  • Ellie

Hi Paula! I always like reading your posts because you have insider knowledge of the library side of the humanities (as a librarian). This might be a question without an answer, but have libraries/librarians pushed back against the prejudice ingrained in the LOC Subject Headings? Are there anti-racism movements in the library world led by those invested in the equity of libraries? I guess I’m interested in whether libraries can create change on their own, in their own way, or if there are governing systems (like metadata or algorithms) that nationally affect library organization and, by extension, ethics.

  • Stephen Reiter

I find the ethical questions surrounding digitization to be very interesting, and I agree with a lot of the points made by Moravec. There are the legal concerns, such as the violation of copyright laws, but also moral concerns, such as the harm that these materials may cause to others. I feel like it’s always so important to constantly remind ourselves of these issues as we delve further into the field of digital history. I think we all learned a lot about algorithmic bias from our readings this week. At a minimum, we all certainly learned the extent to which it is a problem. The concerns you mention from Noble’s talk were truly startling, and I know I’ll be thinking about them for a long time.
