About Me

Picture of me

I am a postdoctoral fellow at Penn State University’s Center for Humanities and Information, specializing in the history of computation, Enlightenment studies, Romantic literature, and digital humanities. My scholarship considers what intellectual history can tell us about contemporary technological issues. I am also a programmer working at the intersection of computation and language.

In 2018, I received a PhD in English from the Graduate Center, City University of New York. I also have an M.A. in English from NYU and studied literature, mathematics, and computer science at the undergraduate level. Before starting my PhD, I worked for five years in database programming and data visualization.

The Distance Machine shows how the words in a passage from Moby-Dick went in and out of common use over a period of two centuries.

If you want to get a sense of what my digital humanities work is like, a good place to start is the Distance Machine, a web-based program (which you can try in your browser right now) that identifies words in historical texts that were uncommon at particular points in time. The idea is to find the opposite of archaisms: words that seem ordinary to us now, but that had not yet become common at the time when a text was written. I published an article about the project in American Literature; the source code is on GitHub.
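The core idea can be sketched in a few lines. This is my own illustrative reconstruction, not the Distance Machine's actual implementation: given per-decade relative frequencies for each word (the frequency numbers below are made up for the example), flag any word in a passage that was still rare at the date the text was written, however common it is today.

```python
# Hypothetical frequency data: occurrences per million words, by decade.
# (Invented numbers for illustration only.)
FREQS = {
    "scientist": {1820: 0.0, 1850: 0.4, 1900: 6.0, 1950: 40.0},
    "whale":     {1820: 12.0, 1850: 15.0, 1900: 14.0, 1950: 13.0},
}

def flag_uncommon(words, year, threshold=1.0):
    """Return the words that fell below `threshold` frequency at `year`."""
    flagged = []
    for word in words:
        series = FREQS.get(word)
        if series is None:
            continue  # no data for this word
        # Use the closest decade at or before the text's date.
        decades = [d for d in series if d <= year]
        if decades and series[max(decades)] < threshold:
            flagged.append(word)
    return flagged

print(flag_uncommon(["whale", "scientist"], 1851))  # -> ['scientist']
```

Run against a passage dated 1851, this sketch would pass over "whale" but flag "scientist," a word that reads as ordinary now but was still new in Melville's day; the real project draws its frequencies from large historical corpora rather than a hand-built table.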

I’ve also made some fun things like a poetry generator that can create rhymed verse, a depoeticizer that makes texts more banal, and a neural-network reconstruction of Herman Melville’s novel Moby-Dick. My most recent creative-critical project is A Hundred Visions and Revisions, a program that uses a neural language model to rewrite a poem to be about a different topic.

There is a longer list of my digital projects here; you can also view a list of my major publications or my full CV.

I am currently working on a book project, tentatively titled Modernity and the Algorithm, about the history of algorithmic thinking from the seventeenth century to the nineteenth. Focusing on three mathematicians from successive centuries (Leibniz, Condorcet, and Boole), I show that the social function of algorithms has varied drastically with changing epistemologies. During the Enlightenment, computation was a politically loaded topic; radicals sought to replace the "errors" of the past with mathematical rationality, while conservatives viewed such methods as tyrannical impositions of arbitrary rules on human thought. As my book shows, the Romantic turn around 1800 enabled mathematicians to skirt these political issues by distinguishing the technical aspects of algorithmic systems from the cultural factors involved in making those systems meaningful to people. This nineteenth-century compromise, I argue, set the terms on which algorithms continue to relate to the user interface in modern computers. I published an article on this topic, titled "Romantic Disciplinarity and the Rise of the Algorithm," in the Summer 2020 issue of Critical Inquiry.