
Human Values in Information System Design

Batya Friedman

Associate Professor

The Information School

© Batya Friedman 2003

Overview
  • Values in Information Technology
  • Value Sensitive Design
  • Project 1. The Watcher & The Watched: Social Judgments about Privacy in a Public Place
  • Project 2. Cookies, Informed Consent, and Web Browsers
  • Propositions for Technology, Values, and the Justice System

Current UW Collaborators: Faculty, Staff, and Students

Faculty

Alan Borning, Ph.D. (CSE)

Sybil Carrère, Ph.D. (Nursing)

Peter H. Kahn, Jr., Ph.D. (Psych.)

David Notkin, Ph.D. (CSE)

Zoran Popović, Ph.D. (CSE)

Paul Waddell, Ph.D. (Urban Planning)

Research Staff

AJ Brush, Ph.D.

Brian Gill, Ph.D.

Students

Ph.D.

Nathan Freier (iSchool)

Erika Feldman (Psych.)

Janet Davis (CSE)

Irene Alexander (Eng.)

Masters

Nicole Gustine

Rachel Severson

Undergrads

Brandon Rich (recent grad)

Jonathan Sabo

Scott Santens

Robin Sodeman

Anna Stolyar

Collaborators and Consultants from Other Institutions

Cornell University

Lynette I. Millett

New York University

Helen Nissenbaum, Ph.D.

Daniel C. Howe

Princeton University

Edward Felten, Ph.D.

Purdue University

Alan Beck, Ph.D.

Gail Melson, Ph.D.

Nancy Edwards, Ph.D.

Brian Gilbert

Trace Roberts

Rivendel Consulting & Design

Austin Henderson, Ph.D. (consultant)

Values in Information Technology
  • Numerous, strong, and complex interactions between technology and enduring human values
  • Examples (from among many possibilities):
    • Privacy
    • Trust
    • Accountability
    • Ownership and property
    • Freedom from bias
    • Human welfare (safety, psych. well-being)
    • Universal usability
    • Environmental sustainability

Goals for Value Sensitive Design
  • Be proactive: Integrate the consideration of human values with design work (as opposed to providing an outside critique)
  • Develop a design methodology that is principled and comprehensive
  • This is a very hard problem, and VSD is certainly not the only possible approach – but we believe it provides a useful methodology

Interactional Perspective
  • Value Sensitive Design is an interactional theory
    • In general, we don’t view values as inherent in a given technology
    • However, we also don’t view a technology as value-neutral
    • Rather, some technologies are more suitable than others for supporting given values
  • Investigating these value suitabilities (along with what values and whose values) is a key task of VSD

VSD’s Tripartite Methodology
  • Conceptual investigations
    • Philosophically informed analyses of the values and value conflicts involved in the system
  • Technical investigations
    • Identify existing or develop new technical mechanisms; investigate their suitability to support or not support the values we wish to further
  • Empirical investigations
    • Using techniques from the social sciences, investigate issues such as: Who are the stakeholders? Which values are important to them? How do they prioritize these values?
  • These are applied iteratively and integratively

Direct and Indirect Stakeholders
  • Direct stakeholders: Interact with the system being designed and its outputs
  • Indirect stakeholders: Don’t interact directly with the system, but are affected by it in significant ways
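As a minimal sketch, the distinction could be captured in a small data structure for stakeholder analysis. This is purely illustrative and not part of VSD itself; the stakeholder names and values listed are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Stakeholder:
    """One stakeholder group identified in a VSD analysis."""
    name: str
    direct: bool                 # does this group interact with the system itself?
    values_at_stake: list = field(default_factory=list)

# Hypothetical analysis of an office camera-and-display system:
stakeholders = [
    Stakeholder("office occupant viewing the display", direct=True,
                values_at_stake=["psychological well-being"]),
    Stakeholder("passerby captured by the camera", direct=False,
                values_at_stake=["privacy", "informed consent"]),
]

# Indirect stakeholders never use the system, yet are affected by it:
indirect = [s.name for s in stakeholders if not s.direct]
print(indirect)
```

Enumerating both groups explicitly is the point: a design review that iterates only over `direct=True` entries misses exactly the people the second bullet describes.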

Two Projects Investigating Privacy in a Public Place
  • In a Physical Place: The Watcher and The Watched
  • In a Virtual Place: Cookies, Informed Consent, and Web Browsers

Privacy in Public (Warren and Brandeis, Harvard Law Review, 1890)

“[While in earlier times], the state of the photographic art was such that one’s picture could seldom be taken without his consciously ‘sitting’ for the purpose, the law of contract or of trust might afford the prudent man sufficient safeguards against the improper circulation of his portrait; but since the latest advances in photographic art have rendered it possible to take pictures surreptitiously, the doctrines of contract and of trust are inadequate to support the required protection” (p. 179).

Some Empirical Findings on Privacy
  • On the universal side, the empirical evidence points to the existence of and functional need for some form of privacy in all societies studied to date (Cf. Harris et al., 1995; Roberts & Gregor, 1971; Westin, 1984)
  • On the side of variation, the manifestations, regulations, and mechanisms of privacy vary widely across cultures (Cf. Biggs, 1970; Moore, 1984; Murphy, 1964)
  • Recent privacy polls in the United States (poll data table not preserved in this transcript; sources: [1] Equifax-Harris; [2] Harris; [3] Privacy and American Business)

An Inside Office


Room with a View – The Laboratory Experiment (Peter Kahn, Batya Friedman, Jennifer Hagman, Sybil Carrère) (N = 90; 30 university students in each condition)

Room with a View – The Three Conditions

Real Window

Blank Wall

HDTV Real-Time Plasma Display


The Watcher and The Watched: Social Judgments about Privacy in a Public Place (Batya Friedman, Peter Kahn, Jennifer Hagman, 2004) (Surveys: N = 750; Interviews: N = 120)

The Watcher

The Watched

The Camera


[This slide held a results table, garbled in this transcript: for each interview condition (Watchers and Watched, N = 30 each) and the survey (N = 750), the percentages of men and women giving each category of judgment. The individual cell values are not recoverable here.]

Why Do People Hold These Views?
  • For “all right” evaluations (on average):
    • Personal Interest (31%)
    • Functionality (31%)
    • Social Expectations (24%)
  • For “not all right” evaluations (on average):
    • Functionality (34%)
    • Social Expectations (30%)
    • Human Welfare/Safety (25%)
    • Privacy (29%)
    • Informed Consent (38%)

Why Do People Hold These Views?
  • For “all right” evaluations (on average):
    • Personal Interest (31%)
    • Functionality (31%)
    • Social Expectations (24%)
  • For “not all right” evaluations (on average):
    • Functionality (34%; Watcher: women 53%, men 0%)
    • Social Expectations (30%)
    • Human Welfare/Safety (25%)
    • Privacy (29%; Watched: women 16%, men 37%)
    • Informed Consent (38%; Watcher: women 61%, men 0%)

Summary of Key Findings (1)
  • More women were concerned than men. Women were more likely than men to be concerned about the HDTV and display of real-time images.
  • Women’s concerns were less context-sensitive. Men in the position of power (“The Watchers”) tended to be less concerned than men in the vulnerable position (“The Watched”). Strikingly, nearly identical percentages of women expressed concern, independent of context – “Watcher” or “Watched”. (Cf. Asch, 1952; Milgram, 1963)
  • Privacy in public is a multi-faceted issue. Participants expressed a range of reasons for their judgments, with a more diverse set used to support “not all right” evaluations. (Cf. Schoeman, 1984)

Summary of Key Findings (2)
  • Research Method: Indirect Stakeholders. The Value Sensitive Design methodology positioned this research to identify the concerns of indirect stakeholders (“The Watched”) and, among those, the concerns of women.
  • Research Method: Ecological Validity. The results also demonstrate the need to study people’s social judgments in the context of technologies in use (as opposed to “what if” scenarios…).

Project 2 – Privacy in Public in a Virtual Place

Cookies, Informed Consent, and Web Browsers

Cookies and Informed Consent in Web Browsers (Batya Friedman, Edward Felten, Lyn Millett, Daniel Howe, 2000, 2001, 2002)
  • Conceptual Investigation
    • What do we mean by informed consent online?
  • Technical Investigations
    • A retrospective study (Netscape Navigator and Internet Explorer, 1995 – 1999)
    • Redesign of the Mozilla browser
  • Empirical Investigation
    • Formative evaluation of redesign work
    • Traditional usability
    • Value-oriented features
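The consent logic at issue can be sketched in a few lines. This is a hypothetical illustration using Python’s standard `http.cookies` parser, not the actual Mozilla redesign from the project; the purpose categories and the consent record are invented:

```python
from http.cookies import SimpleCookie

# A user's consent record, keyed by an (invented) cookie-purpose category.
# Informed consent as an "on/off switch": nothing is stored without a "yes".
consent = {"session": True, "third_party_tracking": False}

def accept_cookie(set_cookie_header: str, purpose: str) -> bool:
    """Parse a Set-Cookie header; store the cookie only with prior consent."""
    jar = SimpleCookie()
    jar.load(set_cookie_header)          # e.g. "sid=abc123; Path=/"
    if not jar:                          # unparseable header: reject
        return False
    return consent.get(purpose, False)   # unknown purpose: default to "no"

print(accept_cookie("sid=abc123; Path=/", "session"))                         # True
print(accept_cookie("trk=xyz; Domain=.ads.example", "third_party_tracking"))  # False
```

Note the default-deny in the last line: a purpose the user was never asked about is treated as unconsented, which is the value-oriented design choice the slide’s “informed consent” framing calls for.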

Summary: Value Sensitive Design’s Constellation of Features (1)
  • Proactive: seeks to influence the design of technology throughout the design process.
  • Enlarges the arena in which values arise to include not only the work place but also education, the home, commerce, online communities, the justice system, and public life.
  • Integrative methodology that includes conceptual, empirical, and technical investigations.
  • Enlarges the scope of human values beyond those of cooperation, participation, and democracy to include all values, especially those with moral import.

Summary: Value Sensitive Design’s Constellation of Features (2)
  • Identifies and takes seriously both direct and indirect stakeholders
  • Interactional theory: values are viewed neither as inscribed in technology, nor as simply transmitted by social forces
  • Builds from the psychological position that certain values are universally held, but that how such values play out in particular cultures will vary widely (abstract vs. concrete or act-based conceptualizations)
  • Distinguishes between usability and human values with ethical import

Technology, Values & the Justice System: Proposition I (Friedman & Nissenbaum, 1996; Friedman, Kahn, & Borning, 2002)

We can’t anticipate all the value consequences of designing and deploying a particular information technology.

  • Use “best practices” but don’t demand perfection
  • Design systems with the expectation that they will need to be adapted over time

Technology, Values & the Justice System: Proposition II

With respect to privacy, historically the bulk of our protections have come from the difficulty and cost of accessing and manipulating information.

  • When we introduce a technology that enhances access to information, we can expect it to unbalance privacy checks within the social fabric.
  • The justice system may need to reintroduce a reasonable balance. The use of technology may assist the justice system in achieving that balance.

Technology, Values & the Justice System: Proposition III

Informed consent can be a useful tool for creating the conditions in which a balance between privacy and access can flourish.

  • Consent implies the existence of an “on/off” switch

Technology, Values & the Justice System: Proposition IV

Defaults matter.

Most people don’t change the default settings on their machines.

  • The justice system has a good deal at stake in ensuring that technologists get the defaults “right” the first time
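A minimal sketch of why defaults dominate outcomes (the setting name and values are invented for illustration, not drawn from any particular system):

```python
# Because most people never change their settings, the shipped default
# effectively decides the outcome for the bulk of the population.
DEFAULTS = {"share_location": False}   # opt-in: privacy-protective default

def effective_setting(user_prefs: dict, key: str) -> bool:
    """A user's effective setting: their explicit choice, else the default."""
    return user_prefs.get(key, DEFAULTS[key])

# A user who never opens the settings panel inherits the default:
print(effective_setting({}, "share_location"))                         # False
# Only an explicit opt-in turns sharing on:
print(effective_setting({"share_location": True}, "share_location"))   # True
```

Flipping the single `False` to `True` converts the same code from an opt-in regime to an opt-out one, which is exactly the tie between defaults and the next proposition.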

Technology, Values & the Justice System: Proposition V

Opt in?

Or opt out?

(Tied to defaults.)

Technology, Values & the Justice System: Proposition VI

Visible?

Invisible?

(This is about surreptitious data collection.)

Technology, Values & the Justice System: Proposition VII

Mobile Data

(In the form of ubiquitous computing, location sensing, context-aware computing, RFID tags…)

THE END
