Research

This page is due for major revision, but in the meantime is divided into four sections:

1) research toward my 2004 dissertation in digital humanities;
2) work as a member of UVa’s Speculative Computing Lab (SpecLab), ca. 2000-2003;
3) other electronic projects, some of which date back to 1996 or 1997;
4) scholarly editing (Swinburne’s Poems and Ballads, First Series).

Some of my more recent research relates to two NEH grants on which I serve as PI: the Scholars’ Lab’s Institute for Enabling Geospatial Scholarship, and Neatline: Geospatial & Temporal Interpretation of Archival Collections.

Speculative Computing: Instruments for Interpretive Scholarship

Ph.D. in English, University of Virginia, 2004.

About the diss:
My dissertation, Speculative Computing: Instruments for Interpretive Scholarship, drew on my work in humanities computing at the University of Virginia, first at IATH and later with SpecLab and the ARP / 9s / Rossetti group. I promptly CC-licensed my dissertation in 2004, and it’s available below.

Dissertation director: Jerome McGann
Committee members: Johanna Drucker and David Golumbia
Outside reader: Benjamin Ray

An abstract:
Like many modern humanities computing projects, Ramon Llull’s Ars Magna, a system of inscripted, manipulable wheels dating to the thirteenth century, asserts that interpretation can be aided by mechanism without being generated mathematically or mechanically. That this assertion is sometimes lost on the larger academic community is not simply a failure of the devices scholar-technologists produce (although, as the work outlined here seeks to demonstrate, we could do a better job of anticipating and incorporating patently interpretive forms of interaction on the part of our users into the systems we create for them). Instead, it betrays our failure to articulate the humanistic and hermeneutic value of algorithmic work to a lay audience.

This dissertation uses Llull’s Ars Magna to introduce the relationships of algorithm, ars combinatoria, aesthetic provocation, diagrammatic reasoning, and ludic practice to the work of humanities scholarship and then presents two major case studies in the design of digital instruments and environments that open themselves to performance and intervention on the part of interpretive agents. The first is the Temporal Modelling PlaySpace, a composition tool for sketching personalized and inflected timelines that (like temporal relations in humanities data generally) are not necessarily unidirectional, homogeneous, or continuous. Temporal Modelling’s innovation lies in its extraction for re-purposing of well-formed XML from users’ intuitively designed and even deliberately ambiguous diagrammatic models. The second case study deals with computational and interface or visualization strategies for turning problems of subjectivity and deixis into opportunities for critical engagement in the Ivanhoe Game, a ludic subset of the larger IVANHOE project, an interpretive role-playing environment conceived by Jerome McGann and Johanna Drucker.

Both of these projects stem from work in progress at the University of Virginia’s Speculative Computing Laboratory. The goals and methods of SpecLab are demonstrated here — most especially in a trio of creative design exercises or “imaginary solutions” which make use of ideas developed in chapters on Llull, Temporal Modelling, and the Ivanhoe Game — and “speculative computing” is introduced as a new paradigm for exploratory digital work in the humanities.
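The core Temporal Modelling move described above, deriving well-formed XML from a user’s sketched and possibly ambiguous timeline, can be illustrated with a minimal sketch. The element names, attributes, and events below are hypothetical stand-ins, not the project’s actual schema or data:

```python
import xml.etree.ElementTree as ET

# Hypothetical in-memory model of a user-sketched timeline.
# The "certainty" attribute marks a deliberately ambiguous placement.
events = [
    {"label": "composition begins", "date": "1862?", "certainty": "low"},
    {"label": "first edition published", "date": "1866", "certainty": "high"},
]

root = ET.Element("timeline")
for e in events:
    ev = ET.SubElement(root, "event", certainty=e["certainty"])
    ET.SubElement(ev, "label").text = e["label"]
    ET.SubElement(ev, "date").text = e["date"]

# Serialize the model as well-formed XML, ready for re-purposing.
xml_string = ET.tostring(root, encoding="unicode")
```

The point of the sketch is only that an intuitive, inflected model (here, the `certainty` attribute) survives the round-trip into structured markup intact.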

The full text is available online. (Warning: 329 pages; 14MB in PDF format)

The dissertation is licensed under a Creative Commons License.

Selected presentations and essays feeding into Speculative Computing include:

  • “Lullian Method and Interpretation in Humanities Computing” at ACH/ALLC 2003 in Athens, Georgia.
  • “Some Applications of Game Theory to Digital Game Design” (as part of a panel on the Ivanhoe Game for ACH/ALLC 2002: New Directions in Humanities Computing. Tübingen, Germany — July 2002.)
  • “Biblioludica: a game model for teaching material culture” at SHARP 2002 (Society for the History of Authorship, Reading, and Publishing). British Library, London — July 2002.
  • “Ludic Algorithms: or, How to Make Games and Why” (invited speaker in Graduate Student Lecture Series, UVA English Department. April 2002.)
  • “The Playful Scholarly Endeavor” (brief invited talk at the Game Developers Conference (GDC) Academic Summit, San Jose, CA — March 2002.)
  • “The Temporal Modelling Project” (presentation and demo for the project’s funder, the Intel Corporation. October 2001.)
  • “Ivanhoe and Game Design” (panel on the Ivanhoe Game with Johanna Drucker and Jerome McGann, Humanities and Technology Association Conference 2001) — September 2001.

There’s a strong ludic undertow in this dissertation, earlier drafts of which focused much more exclusively on games and game design. I’ll therefore mention some games-related work here: in the fall semester of 2002, I taught a course on the Culture and Aesthetics of Digital Games, and a workshop on Game Design soon followed. I have been involved in a project (with my former student David Patch) to analyze the internal economies of multiplayer games and their design implications, and I supervised Shane Liesegang’s undergraduate independent study on Peter Suber’s rule-making game, Nomic. I was also a founding member of SpecLab, which was (almost) all fun and games.

SpecLab

Update 2008: The most important thing to emerge from SpecLab, in my opinion, was the NINES project, for which I designed Collex. Johanna Drucker has a book forthcoming from Chicago UP, entitled SpecLab. It will chronicle the collective intellectual and design work that (for me) led to NINES.

Background: I was a founding member (with Jerry McGann, Johanna Drucker, Worthy Martin, Andrea Laue, and Steve Ramsay) of UVA’s Speculative Computing Laboratory (SpecLab), an interdisciplinary group supporting exploratory research in digital humanities. We used the technical term “speculative computing” metaphorically:

“Speculative computing is a technique to improve the execution time of certain applications by starting some computations before it is known that the computations are required. A speculative computation will eventually become mandatory or irrelevant. In the absence of side effects irrelevant computations may be aborted. However, a computation which is irrelevant for the value it produces may still be relevant for the side effects it performs.”
(from the proceedings of the 1992 Parallel Symbolic Computing Workshop at MIT)
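In its literal, parallel-computing sense, the technique in that definition can be sketched in a few lines: start a computation before knowing whether it will be needed, then either collect it (it became mandatory) or cancel it (it became irrelevant). This is a generic illustration, not code from any SpecLab project; the functions are invented stand-ins:

```python
from concurrent.futures import ThreadPoolExecutor

def expensive(n):
    # Stand-in for a costly computation whose necessity is not yet known.
    return sum(i * i for i in range(n))

def speculation_needed():
    # Stand-in for the mandatory computation that eventually tells us
    # whether the speculative result is required.
    return True

with ThreadPoolExecutor() as pool:
    # Start the computation before it is known to be required.
    future = pool.submit(expensive, 10_000)
    if speculation_needed():
        result = future.result()  # the speculation became mandatory
    else:
        future.cancel()           # the speculation became irrelevant
        result = None
```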

“Speculative Computing” is also the title of my dissertation. My efforts and projects along these lines include:

other electronic projects

Links to a few samples of my past work, some of which was done under the auspices of IATH, UVA’s Institute for Advanced Technology in the Humanities. There’s more to be seen in my vita.

Some of these items are antediluvian, in Web terms, and are broken beyond repair. They stick around mostly for the sake of fond memories.

scholarly editing

My early experience in the digital humanities was grounded in textual criticism and scholarly editing at the Rossetti Archive, where I worked with Jerome McGann on text encoding, interface design, and project management from ca. 1997 to 2004. For most of that time, I served as the Archive’s Design Editor.

My ongoing scholarly editorial work focuses on Algernon Charles Swinburne, Rossetti’s contemporary — specifically his controversial 1866 volume, Poems and Ballads. This is a book with a fascinatingly vexed production and reception history. I plan to chronicle my work toward a critical edition in the pages of this blog.

Photo: National Portrait Gallery; by London Stereoscopic & Photographic Company, ca. 1865

An early version of my dissertation took the form of a scholarly edition of Poems and Ballads, First Series. The print embodiment of this edition, which includes a textual history, descriptive bibliographies, collation notes, and related material, was nearly complete when I abandoned it for a more speculative dissertation full of case studies in humanities computing.

In the aftermath of that work, we built Juxta at ARP, and my next Swinburnian goal is to put all of the editions of Poems and Ballads that I collated as a grad student into Juxta in XML format and deliver the whole on the Web. I am also collaborating with Matthew Mitchell, formerly of Digital Research & Scholarship at UVA Library, on a Solr and Ruby-on-Rails interface for textual studies. Matt has used my Swinburne XML, proofed with the assistance of Rob Stilling, as fodder for this research-and-development project.
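At its simplest, the kind of witness collation Juxta performs over full XML texts can be sketched with Python’s standard-library difflib. The two “witness” readings below are invented for illustration, not actual Swinburne variants:

```python
import difflib

# Two hypothetical witness readings of the same line of verse.
w1866 = "Before the beginning of years".split()
w1904 = "Before the beginnings of years".split()

# Align the two word sequences and keep only the non-matching spans,
# i.e. the variant readings a collation would report.
matcher = difflib.SequenceMatcher(a=w1866, b=w1904)
variants = [
    (" ".join(w1866[i1:i2]), " ".join(w1904[j1:j2]))
    for tag, i1, i2, j1, j2 in matcher.get_opcodes()
    if tag != "equal"
]
```

A real collation tool works over many witnesses at once and anchors variants to the markup, but the pairwise alignment step looks much like this.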

Some early thinking toward Poems and Ballads can be found in two presentations I gave in 2000:

And a book chapter on some augmented reality (thought) experiments I undertook with Wayne Graham ca. 2015, using this Swinburne dataset and ever-unfinished edition, is forthcoming in 2018 from the University of Michigan Press.