A Human Touch

The same is true when it comes to the intersection of big data and the humanities. In fact, Williams’ Mellon faculty co-lead is Todd Presner, chair of the UCLA Department of European Languages and Transcultural Studies and special advisor to the vice chancellor for research. Presner is currently working on a book, Ethics of the Algorithm: Computational Approaches to Holocaust History and Memory, in which he examines the innovations made possible in the field via everything from natural language processing to machine learning to data visualizations.

These approaches have also made a difference in work tied to the ancient world, according to Chris Johanson, associate professor of classics and chair of digital humanities. For example, in aristocratic Roman funeral traditions, mourners would portray multiple generations of the deceased’s most notable ancestors, both real and mythological.

Johanson’s RomeLab project developed reconstructions of these funerals. Drawing on the Digital Prosopography of the Roman Republic, a searchable database of all known members of Roman society’s elite, along with network graph visualizations of their family trees, Johanson and his students created visualizations for every aristocratic funeral that might have occurred during the entirety of the Roman Republic.

“RomeLab is just one microscopic example of how one can work with computationally actionable data in the humanities,” Johanson says. “But it shows how these tools allow students to connect closer to the people and materials of the past than they could have otherwise.”

This philosophy informs the division on a broad scale. For example, John Papadopoulos, professor of classical archaeology, history and culture, has incorporated light detection and ranging (LiDAR) data to create 3D models of an Athenian agora excavation project. And Ashley Sanders Garcia, vice chair of digital humanities, has used text mining and network analysis to recover the history of Algerian women who lived between 1567 and 1837.

Another exciting project involves work being done by Jessica Cook, a doctoral candidate in English writing her dissertation on how 19th-century mnemonics and poetry informed the conceptualization of modern computing, focusing in great detail on Ada Lovelace, the world’s first computer programmer. To access Lovelace’s archive, most of which is unpublished, Cook had to photograph all of its papers and then train an AI model to read and transcribe Lovelace’s Victorian-era handwriting.

Cook’s efforts have proved so successful that she is currently running her model on Lovelace’s entire corpus of writing and will take similar approaches to the handwriting of Lovelace’s important correspondents.

Arguably the most powerful takeaway is that this project will finally make Lovelace’s entire body of work accessible to researchers, who can ensure she receives the rightful credit her male contemporaries have long enjoyed.

“This kind of large-scale digital humanities endeavor is an exciting demonstration of how big data and AI have transformed the field of literary study,” says Cook. “However, this particular project is especially poignant because Ada Lovelace’s contributions to the history of computing were the genesis of the very AI technologies that make this research possible. If Lovelace had not produced the very pieces of writing that I am transcribing, it is possible that the modern computer as we know it may also not have existed.”

Keeping the focus on humanity is key, these researchers agree.

“As much as technology has the ability to distance us from what it means to be human, it’s also able to bring us much closer—to allow us to connect strands and stories of the human experience more efficiently than ever before,” Johanson says. “It’s really exciting to think about what is possible at UCLA and beyond when north and south campus collaborate.”