Today was the first meeting of
Broadening the Digital Humanities: The Vectors-IML NEH Summer Institute on Multimodal Scholarship, which took up a number of nuts-and-bolts issues about scholarship with and of computational media, such as the question that
Holly Willis posed in her introduction: "How do you cite a game sequence?" Willis also noted how the work of my UC Irvine colleague
Paul Dourish could help us understand the "crystallization of institutional power" that can become operational when universities stake claims to virtual real estate, such as islands in
Second Life, or to mobile networks that are arrayed in material geography. She provided an overview of current projects at the
Institute for Multimedia Literacy, where the Vectors group would be meeting for a month, and explained how they were adding K-12 initiatives both locally and in New Mexico.
Tara McPherson then encouraged the
11 Vectors fellows to introduce themselves and to offer some reflections about the "digital humanities" and their relationship to the term. A number of people around the table pointed out that they were technically "digital social scientists," but except for a discussion later in the day about human subjects protocols, the disciplinary differences were not strongly marked during our introductory session. Before lunch, the presentations ranged from
Ruben & Lullaby, an iPhone love story that can fit in one's pocket, to "
Yaddo: Making American Culture," which still exists largely in a future in which the 554 boxes in the archive are brought back to life online. I even made a contribution to
California's Living New Deal Project while one of the presenters was explaining it, since I had recently noticed the
WPA mural by Stanton Macdonald-Wright that had been restored at my local library.
After
Cheryl Ball introduced herself through her role as an editor at
Kairos, to which the
Vectors journal served as what she called a kind of "evil twin,"
Katherine Hayles argued that there was a complicated politics that had to be negotiated now that "second-wave digital humanities," which exploits the capacities of multimedia computing for scholarly projects, was making people from the "first wave" of enthusiasts for text-encoding projects and digital archives "upset." As someone affiliated with an
information science/information studies program, Hayles shared my concern that the digital humanities continue to engage with computer science theoretically as well as practically, as did McPherson, who described her recent work on the history of object-oriented programming and how those "particular code structures" could also offer a reading of "race and representation after World War II."
During the day, participants weren't just Twittering; they were also using
BackNoise to provide another channel of commentary that helped people find the work of
Erik Loyer, the
Digital Durham Project, David Shorter’s
Cuaderno, and the book site for
Micki McGee. It was more like a series of guideposts than the snarky backchannel that one might have expected it to be. So it was the
Twitter feed that probably captured some of the best lines of the day, as the different contingents, ranging from those planning to incorporate online digital video to those thinking about mapping and information representation, tried to answer McPherson's question: "How might we visualize theory?" As McPherson narrated the second-wave/first-wave distinction, she argued for the importance of "interactive and visual experiences" in digital scholarship, as second wavers wanted to engage with the "surface of screen." She also asserted that since about 2004, the two communities started to see interaction as possible, and the lessons learned by "old school humanities computing folks" who developed archival resources with rigid standards and strong principles of interoperability informed the back-end practices of those who were focusing on aesthetics, media practices, and interactivities. If Hayles was pointing out how the humanities need an understanding of computing, McPherson reminded her audience that "computing needs the humanities" too, particularly in the case of an intellectual movement like object-oriented programming that propagated "the separation of cause and effect" without seeing any ideological implications in that design philosophy.
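That point about "the separation of cause and effect" is easiest to see in miniature. Here is a minimal sketch in Python, entirely my own illustration rather than anything shown at the institute, of the encapsulation McPherson was describing: the caller's method call is the visible "cause," while the "effects" on internal state stay hidden behind the object's interface.

    # A toy example of object-oriented encapsulation: the caller sees only
    # the public interface, while the state changes it triggers are hidden.
    class LibraryCatalog:
        def __init__(self):
            self._records = {}      # hidden internal state
            self._search_count = 0  # also hidden: invisible bookkeeping

        def add(self, title, location):
            self._records[title] = location

        def find(self, title):
            # The caller never sees this side effect on internal state.
            self._search_count += 1
            return self._records.get(title, "not found")

    catalog = LibraryCatalog()
    catalog.add("Writing Machines", "Shelf 3")  # illustrative toy data
    print(catalog.find("Writing Machines"))     # the interface hides the rest

Whether that hiding is good engineering or an ideology with consequences is exactly the question McPherson was pressing.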
There was also a lot of talk about failure, one of my favorite subjects, since I'm turning my attention to failed digital learning initiatives in my new book projects and made it the subtitle of my first. I plan to follow up on the work of
Bethany Nowviskie and Dot Porter, who are surveying stakeholders about what they call "
graceful degradation" for the many digital humanities projects that go dark. This mindfulness about
hubris, along with an embrace of a philosophy of versioning, would also be important given the structure of this summer institute, in which participants would probably reach only the draft stage while working with fellow scholars and the team of designers.
It was also interesting to hear McPherson discuss a new Mellon grant with Brown, UCSD, NYU, and Rochester that brings together four archives, eight scholars, and three university presses -- MIT, University of California, and Duke -- to take projects from raw digital materials to finished online publication. She also explained how the differences in both scale and philosophies of openness were already beginning to play out for the four archives:
Critical Commons, the
Internet Archive, the
Shoah Foundation, and the
Hemispheric Institute. She also noted that the Mellon's preference for open source tools not only created intellectual property conflicts for USC's lawyers but also prompted more inquiry into the ways that Google's algorithms can still weight results from explicitly open source projects like
TextMap.
Steve Anderson spoke next, as a number of people got out their cameras to document the ironic formulae in his slides, which showed a series of supposedly inverse relationships between text and images, content and design, and design and technology. Instead, Anderson argued that Vectors represented an integrated model of "research," "design," and "technology," with no master lens. He argued that there could be a "dissolution of the binaries of front-end and back-end" with the "database as authoring space" model, in which an emphasis solely on "information presentation" gives way to a deeper engagement with questions about "information architecture," so that scholars can explore and interpret "evocative knowledge objects." To introduce the idea of "dynamic indexing," which the journal promotes (a toy sketch of the idea appears below), the group was shown
SpringGraph and
FreeMind, tools meant to foster the kind of thinking and mapping that lets scholars "map out idea structures," with interfaces far more flexible than PowerPoint and the other presentation tools commonly used in academia. Anderson warned against mere "clever refiguring" and showed Ted Nelson's 1963 illustration of "
ordinary hypertext" as a reminder of how vacuous certain ways of hypertextual linking could be.
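Since "dynamic indexing" can sound abstract, here is the promised toy sketch, again my own invention rather than anything demonstrated at the institute, of the "database as authoring space" idea: content fragments are stored with metadata, and the index that readers navigate is generated on the fly from that metadata rather than fixed in advance like a table of contents.

    # A toy sketch of "dynamic indexing": the navigation structure is
    # computed from stored metadata, not authored as a fixed page order.
    from collections import defaultdict

    # Hypothetical scholarly fragments with tags (illustrative data only).
    fragments = [
        {"id": 1, "text": "Clip analysis of a game sequence", "tags": ["games", "video"]},
        {"id": 2, "text": "Essay on archival standards",      "tags": ["archives"]},
        {"id": 3, "text": "Map layer of WPA murals",          "tags": ["maps", "archives"]},
    ]

    def dynamic_index(frags):
        """Build a tag -> fragment-id index on the fly from the metadata."""
        index = defaultdict(list)
        for frag in frags:
            for tag in frag["tags"]:
                index[tag].append(frag["id"])
        return dict(index)

    # Each reading path is a query against the database, not a fixed order.
    print(dynamic_index(fragments))
    # {'games': [1], 'video': [1], 'archives': [2, 3], 'maps': [3]}

The design choice this miniature captures is the one Anderson emphasized: once the index is computed rather than authored, the database itself becomes the authoring space.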
Labels: digital archives, digital humanities, higher education, institutes