Sunday, December 22, 2024

Seminar series explores role of data in humanities

John Taylor, Professor of Economics at Stanford University and developer of the "Taylor Rule" for setting interest rates | Stanford University

In today’s digital age, data is omnipresent. From smartphones to voting behaviors, the amount of data generated and collected is unprecedented. This data holds the potential to reveal profound insights about our world but also raises several questions: What biases influence particular data sets? How do we assign authentic and accurate meaning to data? What are its social and political implications? Where should it be stored, and who should own it?

These questions are central to a year-long seminar series titled “The Data that Divides Us: Recalibrating Data Methods for New Knowledge Frameworks Across the Humanities.” Supported by the Andrew W. Mellon Foundation Sawyer Seminar award and hosted by the Center for Spatial and Textual Analysis (CESTA), the series brings together humanities scholars across disciplines to discuss the pervasive nature of data and explore its place in the humanities.

“This series is a venue that allows humanists to bring the conversation around data more cogently together across different humanities fields and in relation to data science,” said Giovanna Ceserani, associate professor of classics in the School of Humanities and Sciences, CESTA faculty director, and one of the faculty leads of the series.

The next seminar will take place on May 30, followed by a symposium on May 31.

While data is often associated with engineers, mathematicians, and data scientists, humanities scholars argue that much can be lost when subjects such as historical events or human behaviors are analyzed strictly by numbers.

“If we start to understand the world only in terms of data, we may lose the more complex interpretations that come with traditional humanistic inquiry,” Ceserani said.

Humanities scholarship can analyze subjects through unique lenses, producing novel insights. In the first seminar, “The Place of Data,” scholars Alan Liu and Roopika Risam explored how data can reflect or cause modern social divisions. Their research analyzes geography, race, and gender-related data to examine how this information intersects with social divisions.

In another seminar, titled “Catastrophe, Data, and Transformation,” historian Jessica Otis discussed her NSF-funded project Death by Numbers, which involved transcribing London’s mortality statistics to understand how plague outbreaks intersected with the mentality of early modern England. Dagomar Degroot discussed modern climate data.

The fifth part of the series featured Marlene Daut from Yale University in a seminar called “Recuperating Forgotten Narratives.” Daut discussed her work digitizing early 19th-century Haitian print media. By making these documents accessible online, she found they often contradicted conventional portrayals of Haiti or corrected false narratives about France’s involvement in slavery there. Haitians visiting the site used it for genealogical research.

“The papers [and] almanacs contained all these names, so people are using this digital archive like municipal archives to help complete their family trees,” Daut said.

Discussion of family tree research continued in “The Data of Enslavement” seminar, presented by historians Greg O’Malley and Alex Borucki. They shared their Intra-American Slave Trade Database project, which documents over 35,000 slave-trading voyages within the Americas. Scholar Lauren Klein discussed ethical approaches to slave trade data, while Ayesha Hardison talked about preserving texts by Black authors.

Data scientists typically acquire large datasets from specific sources for analysis. Humanities research tends to be more selective, explained Chloé Brault, a Stanford PhD candidate in comparative literature.

“We’re often in practice creating our own data,” she said. “For example, that might look like a literary scholar selecting 100 novels to analyze.”

Brault is working on a dissertation investigating Montreal's literary production from the 1970s using computational tools and selected texts.

Ceserani noted that asking “What is the place of data in our work?” forces humanists to scrutinize their study objects differently.

“It forces us to ask questions of data scientists about their sources and evidence,” she said. “It also forces us into productive conversations with them.”

The final seminar, titled “Ancient Data and Its Divisions,” will be presented on May 30 at Wallenberg Hall by Chiara Palladino of Furman University, CESTA collaborator Eric Harvey, and Chris Johanson of UCLA.

The Mellon Sawyer Seminar Series will conclude with a symposium on May 31 at Wallenberg Hall. Faculty PIs include Mark Algee-Hewitt (English and digital humanities), Grant Parker (classics), and Laura Stokes (history), along with graduate fellow Matthew Warner and postdoctoral scholar Nichole Nomura. The series is supported by CESTA, the Stanford Humanities Center (SHC), and the Dean’s Office of Humanities & Sciences.

Alan Liu is an English professor at UC Santa Barbara; Roopika Risam teaches film and media studies at Dartmouth College; Jessica Otis is an assistant professor of history at George Mason University; Dagomar Degroot teaches environmental history at Georgetown University; Greg O’Malley teaches history at UC Santa Cruz; Alex Borucki teaches history at UC Irvine; Lauren Klein teaches quantitative theory and methods at Emory University; and Ayesha Hardison teaches English and gender studies at the University of Kansas.
