Whither Hypertext
The Hypertext 2007 conference had moved very far from its initial model of authorial one-to-one or one-to-many interactivity to embrace interconnected electronic texts with many-to-many Web 2.0 characteristics and the semantic web’s self-organizing technologies.
From the very beginning of the panels on “Hypertext and the Person,” it was clear that social media paradigms would be important for assessing the role of hypertext in digital culture. As someone who studies institutional websites, I was interested in the paper from Christian Doerr and Daniel von Dincklage about how “websites evolve over time,” based on research into the trails followed by users of the University of Colorado site. The team did manage to create a workable program that could generate “quicklinks” predicting frequently accessed pages at a given point in the academic calendar, but they had to use a notBlog(A) function to filter out results that would contain expired content, maintain a “blacklist” of rhetorically inappropriate front-page choices, and do a certain amount of editing of the automatically generated results. The University of Colorado was also apparently open to “Web 2.0” approaches that would allow visitors to personalize institutional pages. In contrast, Markel Vigo from the University of the Basque Country focused on web accessibility issues; accessibility metadata could also be user-generated, but that may not solve the problems of elderly, disabled, or at-risk users, who are often unable to find information about life-or-death programs.
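The pipeline the Colorado team describes (rank candidate pages by traffic, then filter out blog-like and blacklisted URLs before an editor reviews the rest) could be sketched roughly like this. The function names, URL heuristic, and blacklist entry below are my own hypothetical illustration, not the authors' actual code:

```python
def is_blog(url):
    """Crude stand-in for a notBlog-style filter: treat URLs with
    blog-like path segments as likely to carry expiring content."""
    return any(seg in url.lower() for seg in ("/blog/", "/news/", "/announce"))

# Hypothetical editorially maintained blacklist of pages that are
# popular but rhetorically inappropriate for the front page.
BLACKLIST = {"www.colorado.edu/old-site"}

def pick_quicklinks(url_counts, limit=5):
    """Rank candidate pages by access count, then drop blog-like
    and blacklisted URLs before taking the top few."""
    ranked = sorted(url_counts, key=url_counts.get, reverse=True)
    keep = [u for u in ranked if not is_blog(u) and u not in BLACKLIST]
    return keep[:limit]

counts = {
    "www.colorado.edu/registrar/deadlines": 900,
    "www.colorado.edu/blog/2007/05/update": 800,
    "www.colorado.edu/admissions": 700,
}
print(pick_quicklinks(counts))
```

Even in this toy version, the need for the blacklist and the post-hoc editing the authors describe is apparent: traffic statistics alone cannot tell you whether a popular page belongs on the front page.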
My paper started off the track on “Hypertext, Culture, and Communication,” in which I discussed how web generators can be used for political and social satire. After opening with a discussion of the subversive appeal of the Church Sign Generator, I showed some examples of more pointed institutional critique: the Northwest Boarding Pass Generator, the Postmodern Essay Generator, and SCIgen, an interface for generating computer-science papers, three of which were actually accepted for a deceptively nonselective conference. I closed with a discussion of how the “interactivity” and creative potential of commercialized versions of Web 2.0 could be parodied at sites like Mark Marino’s Web 2.0 App Generator.
Next Monica Schraefel (mc schraefel) held forth about the continuing relevance of Vannevar Bush’s classic essay about information, “As We May Think,” in “What is the Analogue for the Semantic Web and Why is Having One Important?” She defined the semantic web as Bush’s Memex plus a notebook function. She argued that analogies like “the Web is like a page,” even in the case of mash-ups based on maps, encouraged false assumptions about information architecture. Bush, she pointed out, wanted computers to take on repetitive tasks rather than leaving computationally trivial non-AI problems to people.
Schraefel’s analysis included an extended meditation on electronic notebooks and the desire to have “automatic capture so we can forget,” since we are no longer in an oral culture of continuous rehearsal. To illustrate getting beyond mere keyword matching, she talked about sending kids into the woods with PDAs equipped for data capture to encourage the social activity of sharing results. Among the other programs she mentioned were myGrid, Haystack, and mSpace.fm. Since I’ve been having a long-running discussion with my colleagues about the difficulty of getting students to create research notebooks without having them dismissed as busywork, my interest was really piqued by Schraefel’s talk, which received the best paper prize at the conference. In my own academic planning, I’ve been looking at the relative merits of Zotero, Netvibes (which can get students organizing RSS feeds based on keywords), and some of the options from corporate behemoth Google.
She closed with a larger point about valorizing the pidgin generated by semantic web practices that mixed wild and tame (and see more on pidgins and creoles here).
Next up was new Facebook friend Nick Diakopoulos from Georgia Tech, who gave a paper about “The Evolution of Authorship in a Remix Society.” Diakopoulos’s group has been studying the attitudes of video remix makers on the online editing site JumpCut. In analyzing both the discourse of the website and extended interviews with six users, the research team looked at the four sides of Lawrence Lessig’s Norms/Laws/Market/Architecture schema to analyze participants’ and administrators’ differing attitudes about intellectual property.
After lunch Hugh C. Davis presented a great paper, “Towards Better Understanding of Folksonomic Patterns,” about user tagging patterns on Del.icio.us. I was surprised to learn how many tags mimicked conventions from computer programming, and that 34% of them could be categorized as personal-reference tags (“toHugh,” “myBlog,” “toRead,” etc.). Davis argued that with user-generated tagging, only about 5% of tags were useful, although many of the others, with terms like “Kool” and “Kickass,” were also worth studying as evaluative data. Co-author Hend S. Al-Khalifa had done the hard work of categorizing over 10,000 tags. My favorite del.icio.us tag of all time has to be the self-evident “SaveThis.”
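To give a flavor of the kind of sorting Al-Khalifa did by hand on those 10,000-plus tags, here is a minimal sketch of rule-based tag categorization. The regexes, category names, and vocabulary list are my own illustration, not the paper’s actual scheme:

```python
import re
from collections import Counter

# Toy vocabulary of evaluative tags; the real data would need a
# much larger (and messier) list.
EVALUATIVE = {"kool", "kickass", "cool", "awesome"}

def categorize(tag):
    """Sort a raw tag into one of three rough buckets."""
    # camelCase "to..."/"my..." prefixes and "SaveThis" read as
    # personal-reference tags in the paper's sense
    if re.match(r"^(to|my)[A-Z]", tag) or tag.lower() == "savethis":
        return "personal-reference"
    if tag.lower() in EVALUATIVE:
        return "evaluative"
    return "descriptive"

tags = ["toHugh", "myBlog", "toRead", "Kool", "python", "SaveThis"]
print(Counter(categorize(t) for t in tags))
```

Even this toy classifier shows why the programming-convention tags stood out: camelCase prefixes like “to” and “my” are mechanically easy to spot, while evaluative slang requires a curated word list.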
Toward the end of the day, Harris Wu showed an interesting makeover of USA.gov that tried to create more useful navigation than the simple A-Z index that it had before. He also discussed a site with images of American Political History. Also up was Claudia Hess examining authorship patterns in featured articles in the German Wikipedia that were designated as “Exzellent” or “Lesenwert” (worth reading). Finally, the first half of the “Hypertext and Society” programme ended with Martin Halvey discussing his research findings on YouTube, where the average age of a tagging user was 24 and most pages get their views in the first week.
The dinnertime keynote from Wendy Hall, “Back to the Future with Hypertext: A Tale of Two or Three Conferences,” was a surprisingly self-critical look back at the organization’s history and some of the missed opportunities in its conference past. Although she didn’t get into some of Nick Montfort’s critiques about the battle between “hypertext” and “cybertext” in the academy, in which “Cybertext Killed the Hypertext Star,” she did discuss the organization’s blindness to the potential of the World Wide Web. She started her history in 1987 with her own reading of “As We May Think” and the introduction of HyperCard for the Mac. She also cited Douglas Engelbart’s work on augmenting human intellect in her compressed, “intertwingled” (to use the phrase of Ted Nelson, who was in the audience) analysis of influence studies in the field of hypertext. Most amazingly, she told the story of how Web pioneer Tim Berners-Lee presented the World Wide Web at the conference in 1990 (although he did not have a formal peer-reviewed paper) and how his paper in 1991 was rejected, even though his work would turn out to be visionary. By 1993, she said, half of the papers had web demos at the height of the conference in Seattle. Sadly, she said, the group’s failure to “embrace the web” led to a schism with the ACM’s designated web special interest group, which led to a disaster in 1997 in which the two conferences were in competition on the academic schedule. She charged that by then the web conference, which was held in a theme park and charged a pricey registration fee, excluded the “poets, authors, and human aspects” that the original hypertext conference had embraced. Although she talked about the importance of considering hypertext as both an art and a science, she did confess to some interactions with creative writers at past conferences that made her think of the scene with the Vogons in The Hitchhiker’s Guide to the Galaxy.
In closing with an attitude of self-forgiveness, she discussed the difficulty of predicting what people will do with technology with the example of the “mostly okay” Wikipedia as a case in point.
Update: Apparently conference organizers were also asking these questions.
Labels: conferences, generators, participatory culture, professional associations, wikis