The Future of Writing
My colleague and collaborator Jonathan Alexander often had standing-room-only audiences for the Future of Writing Conference at U.C. Irvine.
The conference was introduced by David Theo Goldberg of the University of California Humanities Research Institute, which has attracted a number of major digital media scholars, particularly from Europe, to the UC Irvine campus. As Goldberg points out, perhaps Barthes announced the death of the author prematurely, although practices of authorship in the university are changing as questions about the "evaluation of labor" become more complicated in everything from grading student essays to assessing faculty for promotion and tenure. Goldberg described a tension between authorship as a "marketplace function" and the "network practices of composition." As others have observed, computational media have also had a profound effect on the design of published material and have reshaped literacy as "whole words" are "letterized." Goldberg points to Katherine Hayles as an example of the rare scholar able to engage with computational functionalities in ways that go beyond "technological determinism" and to the literary creativity of Millie Niss in her "Oulipoems."
As keynote speaker Tara McPherson observed about one of the morning panels, the humanists present were often lamenting how particular tools constrained pedagogical and compositional practices while simultaneously creating powerful engines for expression. One example of highly constrained composition, of course, is Twitter, which Carl Whithaus analyzed in depth as a "short short form" during his talk about dialogism and speech genres and how swarms and group rhetorical purposes were often organized around "links, sexuality, obsession, the mundane." Twitter, he argued, could prove to be important pedagogically as a kind of "elevator pitch" that opens up possibilities for invention. Following Whithaus, Andrew Klobucar explained some of the experiments that he had been doing with creative writing students, who are using computerized lexical tools featured at his Global Telelanguage Resources to create new compounds and neologisms using "LexIcons."
McPherson's keynote emphasized "emerging teaching practices" in the "scholarly vernacular" and how a relatively small coterie of academics were cogitating about "grid computing or visual knowledges" or were able to recognize the specialized jargon of supercomputing in terms like the "petaflop." She pointed to Goldberg's piece for the Vectors digital journal on Hurricane Katrina, "Blue Velvet," as an example of work that could only be shown online. Too often, she argued, online articles merely rehearse the conventions of print monographs, so that "the formal modes have not changed," even in the case of a more progressive electronic publication, such as the Journal for Multimedia History, which still does not go beyond text with images or text with video clips.
McPherson also provided a recap of recent history in the following "typology of the digital humanities":
1. The Computing Humanities of the '60s and '70s, some of whom may only have been fleeing departments with emerging feminist, cultural studies, or postmodern agendas, and who used technology to harness computational power for archiving resources in particular areas of study, as in the case of the William Blake Archive.
2. The Blogging Humanities, who responded to cutbacks in academic presses and problems with peer review by facilitating peer-to-peer conversations online, although their products for the public were often text heavy and disconnected from computing resources and the challenges of "big data" and its associated radically new forms of thinking and knowing.
3. The Multimodal Humanities, who bring together databases, scholarly tools, networked writing, and p2p commentary while also leveraging the potential of the visual and aural media that dominate contemporary life, and who push arguments beyond Stephen Toulmin's model to explore new forms of literacy that see the computer as a platform, a medium, and a display device in scholarship that shows as well as tells.
She argued that Vectors serves as a "long tail" test case, edge case, or testbed: standard scholarly production gets the bulk of the academic audience, while Vectors offers a site where faculty researchers can do more than present the careful subordination of argument found in conventional essayistic writing and can instead experiment with embodiment and the role of immersion. Participants in the project have identified the following possible benefits:
1. Relational thinking: A deep engagement with database forms and algorithmic structures allows humanities scholars to formulate new research questions.
2. Emergent genres of multimodal scholarship: Such genres cover a range of approaches, from the animated archive to the experiential argument to the interactive documentary to the spatialized essay to various forms of simulation or visualization that animate or volatilize materials in the archive. As McPherson says, this process is about becoming interested in something beyond illustration and allowing materials to be filtered in new ways that include but also go beyond functions such as digitize, search, and catalogue.
3. Process as much as product: It is time to shift our notions of humanities scholarship away from a fixation on product toward a new understanding of process, and we need to value collaboration across skill sets as well as failure. Thus McPherson's concerns involve not just technology’s role in the humanities but the humanities’ role in technology, as well as issues of the larger public.
4. Rethinking digital tools: Scholarly tools shouldn't be built a priori but rather in the context of use; the Vectors team began with research questions and built its tools subsequently. Such tools should also accommodate the tactile and emotional parts of argument and expression and encourage academics to work with middleware.
She closed by calling for more participation and collaboration across national boundaries (for example, through the Hemispheric Institute of the Americas) and across disciplines and job titles. Like many digital humanists, she paid tribute to the importance of partnerships with librarians and mentioned that her group may also partner with the Shoah Foundation. She argued that "95% of humanities research is never cited by anyone" and that academics should learn something from Minoo Moallem, who discovered readers for "Nation on the Move" only because they were searching for the term "Persian carpet."
During the question and answer period, perhaps the most interesting query came from Mark Marino, who discussed "repurposing as a critical move" in asking if Vectors would release the Flash code of the journal, so that others could produce scholarly and artistic remixes.
Labels: composition, conferences, teaching