Democracy, Privacy, Technology, and Persuasion
At Persuasive 2009, Janet Davis of Grinnell argued in "Design Methods for Ethical Persuasive Technology" that the very phrase "persuasive technology" is "fraught with ethical problems," particularly when "principles are important," although she maintained that "existing design methods can help." Citing
BJ Fogg, she was careful to distinguish "persuasion" from "deception," "coercion," and "manipulation." Moreover, with computational media, novelty can blind users to persuasive intent; computers are seen as intelligent and fair; computers can be ubiquitous and persistent; computers cannot be negotiated with; computers don't have emotions; and computers do not share in moral responsibility.
Yet, according to "Bias in computer systems," a 1996 article by Friedman & Nissenbaum that explored everything from how travel agents book airline flights to the automation of citizenship applications, procedural systems are far from unbiased. Furthermore, Friedman, Millett & Felten have pointed out that privacy concerns are rarely closely examined by busy computer users, who relegate privacy decisions to default behaviors because examining cookies is fatiguing. Johnson and Mulvey's 1995 study "Accountability and computer decision systems" notes that designers rarely can be held accountable for the systems they create. Moreover, Mason and Fleishman identify three "covenants" around reality, values, and transparency for making models, mechanisms, and assumptions visible. Finally, Berdichevsky & Neuenschwander's 1999 article "Toward an Ethics of Persuasive Technology" claims that eight principles are important in "ethical persuasive technology."
Using this research, Davis asked a number of questions of an audience that may have a tendency to focus on gewgaws rather than social consensus, such as "How can we predict outcomes?", "Why do we prioritize privacy?", "How do we ensure that disclosure is understood?", and "Could other values be just as or more important?" (Such values could include courtesy, for example.)
Davis noted that the "Golden Rule," which requires perspective taking, may still be relevant to persuasive technologists, particularly those interested in "value sensitive design," as outlined by Friedman, Kahn, and Borning in 2006. Davis asserted the importance of "empirical investigations of how principles are understood by regular people" that are "iterative and integrative," and of attention to the "context of use, value to be supported, and existing technology." For her, the "dark side scenarios" and "value scenarios" from Berdichevsky & Neuenschwander's 1999 work are also worth paying attention to.
The work of Jessica Miller was also important in the Davis talk, whether it involved "knowledge-sharing," the role of positive as well as negative recognition, considerations of "privacy, trust, awareness," or the "value dams" and "flows" described in "Value tensions in design." Muller's 2003 work on "participatory design" (as opposed to value-centered design) was also key, along with Bødker, Grønbæk, and Kyng's Cooperative Design: Techniques and Experiences From the Scandinavian Scene.
For Davis, this concerned not only the value of democracy, which at the level of "systems development" was "too abstract," but also participatory design that now extends beyond the workplace, as in the case of recent deliberations about voting on Facebook's user agreements. Workshops, games, play, and cooperative prototyping were all critical to the development process, along with the idea of "design as rhetoric" expressed in Carl DiSalvo's 2008 Neighborhood Networks, in which ambient persuasive systems could address poor air quality and other neighborhood problems, such as noise and speeding. Another exemplar came from ADAPT: Audience Design of Ambient Persuasive Technology, by Miller, Rich, and Davis in 2009.
Given that Berdichevsky himself sent a text message to attendee BJ Fogg about the role of intermediaries, and that Fogg addressed the issue of surveillance on Facebook, it seemed that the discussion would be ongoing.
Labels: institutional rhetoric, technology