
Converging Technologies and Pervasive Computing


Presentation Transcript


  1. Converging Technologies and Pervasive Computing • Cybertechnology is converging with non-cybertechnologies, including biotechnology and nanotechnology. • Cybertechnology is also becoming pervasive as computing devices now pervade our public and private spaces. • Pervasive computing and technological convergence are both facilitated by the miniaturization of computing devices.

  2. Converging Technologies (Continued) • Computers are becoming less visible as distinct entities, as they: • (a) continue to be miniaturized and integrated into ordinary objects, • (b) blend unobtrusively into our surroundings. • Cybertechnology is also becoming less distinguishable from other technologies as boundaries that have previously separated them begin to blur because of convergence.

  3. Technological Convergence • Howard Rheingold (1992) notes that technological convergence occurs when unrelated technologies or technological paths intersect or “converge unexpectedly” to create an entirely new field. • Convergence involving cybertechnology is not new. • Rheingold notes that virtual-reality (VR) technology resulted from the convergence of video technology and computer hardware in the 1980s.

  4. Technological Convergence (Continued) • Cybertechnologies are converging with non-cybertechnologies at an unprecedented pace. • Two areas involving convergence are: • biotechnology and information technology (resulting in the field of bioinformatics); • nanotechnology and computing (giving rise to the field of nanocomputing).

  5. “Enabling Technologies” • Rheingold notes that convergence often depends on enabling technologies, which he defines as: technologies that make other technologies possible. • He points out that for VR, converging elements had to wait for the enabling technologies of electronic miniaturization, computer simulation, and computer graphics to mature in the 1980s.

  6. Miniaturization and Embedded/ Integrated Computing Devices • Technological convergence has been enabled by two key factors: • (1) the miniaturization of computers and computing devices; • (2) the embedding/integrating of computing devices into objects and environments.

  7. Three Areas of Technological Convergence Affecting Ethics • Three converging technologies that raise ethical concerns are: • ambient intelligence (AmI) the convergence of (a) pervasive computing, (b) ubiquitous communication, and (c) intelligent user interfaces; • bioinformatics; • nanocomputing.

  8. Ambient Intelligence (AmI) • Ambient intelligence (or AmI) is defined as a technology that enables people to live and work in environments that respond to them in “intelligent ways” (Aarts and Marzano, 2003; Brey, 2005; Weber et al., 2005). • Consider the example of the “intelligent home” (Raisinghani et al., 2004) described in the text.

  9. AmI (Continued) • Three key technological components make AmI possible: • pervasive computing; • ubiquitous communication; • intelligent user interfaces (IUIs).

  10. Pervasive Computing • According to the Centre for Pervasive Computing, pervasive computing is defined as: a computing environment where information and communication technology are “everywhere, for everyone, at all times.” • Computer technology is integrated into our environments – i.e., from “toys, milk cartons and desktops, to cars, factories, and whole city areas.”

  11. Pervasive Computing (Continued) • Pervasive computing is made possible because of the increasing ease with which circuits can be embedded into objects, including wearable, even disposable items. • Bütschi, Courant, and Hilty (2005) note that computing has already begun to pervade many dimensions of our lives. • For example, it pervades the work sphere, cars, public transportation systems, the health sector, the market, and our homes.

  12. Pervasive Computing (Continued) • Pervasive computing is sometimes also referred to as ubiquitous computing (or ubicomp). • “Ubiquitous computing” was coined by Mark Weiser (1991), who envisioned “omnipresent computers” that serve people in their everyday lives, both at home and at work.

  13. Pervasive Computing (Continued) • Adam Greenfield (2005) believes that ubiquitous or pervasive computing will insinuate itself much more thoroughly into our day-to-day activities than current Internet- and Web-based technologies. • For pervasive computing to operate at its full potential, however, continuous and ubiquitous communication between devices is also needed.

  14. Ubiquitous Communication • Ubiquitous communication aims at ensuring flexible and omnipresent communication between interlinked computer devices (Raisinghani et al., 2004) via: • wireless local area networks (W-LANs), • wireless personal area networks (W-PANs), • wireless body area networks (W-BANs), • Radio Frequency Identification (RFID).
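The wireless links listed above are what let everyday tagged objects report themselves to the environment. As a purely hypothetical sketch of the kind of presence-tracking RFID makes possible, the toy classes below model a networked reader that reports every tag in range; the `Tag` and `Reader` names and all identifiers are invented for illustration and do not correspond to any real RFID API.

```python
# Toy sketch of RFID-style presence sensing in an AmI environment.
# All classes and identifiers here are illustrative inventions, not a real RFID library.

class Tag:
    """A passive tag attached to an everyday object."""
    def __init__(self, tag_id, item):
        self.tag_id = tag_id  # unique identifier stored on the tag
        self.item = item      # the object the tag is attached to

class Reader:
    """A networked reader that reports every tag currently in range."""
    def __init__(self, location):
        self.location = location

    def scan(self, tags_in_range):
        # Each scan links a tag ID to a place -- the raw material for the
        # surveillance concerns discussed in the later slides.
        return [(t.tag_id, t.item, self.location) for t in tags_in_range]

reader = Reader("front door")
tags = [Tag("A1", "coat"), Tag("B2", "key fob")]
print(reader.scan(tags))
# [('A1', 'coat', 'front door'), ('B2', 'key fob', 'front door')]
```

Even this minimal model shows why RFID reappears in the privacy slides: every scan quietly associates identifiers with locations, without any action by the person carrying the tagged items.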

  15. Intelligent User Interfaces (IUIs) • Intelligent User Interfaces (or IUIs) have been made possible by developments in the field of artificial intelligence (AI). • Philip Brey (2005) notes that IUIs go beyond traditional interfaces such as a keyboard, mouse, and monitor.

  16. IUIs (Continued) • IUIs improve human interaction with technology by making it more intuitive and more efficient than was previously possible with traditional interfaces. • With IUIs, computers can “know” and sense far more about a person than was possible with traditional interfaces, including information about that person’s situation, context, or environment.

  17. IUIs (Continued) • With IUIs, AmI remains in the background and is virtually invisible to the user. • Brey notes that with IUIs, people can be surrounded with hundreds of intelligent networked computers that are “aware of their presence, personality, and needs.” • But users may not be aware of the existence of IUIs in their environments.
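The idea that an IUI stays in the background, sensing a user's presence and needs and acting without explicit commands, can be sketched as a simple rule-based stub. This is a toy sketch under invented assumptions: the `respond` function and the context keys are hypothetical, and a real IUI would rely on far richer sensing and AI models.

```python
# Minimal rule-based sketch of an intelligent user interface (IUI).
# The sensed-context keys and rules are invented purely for illustration.

def respond(context):
    """Map sensed context to environment adjustments, with no explicit user input."""
    actions = []
    if context.get("occupant_present") and context.get("light_level", 100) < 30:
        actions.append("raise lighting")
    if context.get("stress_detected"):  # emotion sensing, as slide 27 anticipates
        actions.append("play calming audio")
    return actions or ["do nothing"]

print(respond({"occupant_present": True, "light_level": 10}))  # ['raise lighting']
print(respond({"stress_detected": True}))                      # ['play calming audio']
```

Note that the user never issues a command: the interface acts on sensed data alone, which is precisely why users "may not be aware of the existence of IUIs in their environments."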

  18. Ethical and Social Issues Affecting AmI • Three ethical and social issues affecting AmI: • freedom and autonomy; • technological dependency; • privacy, surveillance, and the “Panopticon.”

  19. Autonomy and Freedom Involving AmI • Will human autonomy and freedom be enhanced or diminished as a result of AmI technology? • AmI’s supporters suggest humans will gain more control over the environments with which they interact because technology will be more responsive to their needs. • Brey notes a paradoxical aspect of this claim, pointing out that “greater control” is presumed to be gained through a “delegation of control to machines.”

  20. Autonomy and Freedom (Continued) • Brey considers three ways in which AmI may make the human environment more controllable, because it can: • (1) become more responsive to the voluntary actions, intentions, and needs of users; • (2) supply humans with detailed and personal information about their environment; • (3) do what people want without their having to engage in any cognitive or physical effort.

  21. Autonomy and Freedom (Continued) • Brey also considers three ways that AmI can diminish the amount of control that humans have over their environments, where users may lose control because a smart object can: • (1) make incorrect inferences about the user, the user’s actions, or the situation; • (2) require corrective actions on the part of the user; • (3) represent the needs and interests of parties other than the user.

  22. Technological Dependency • We have come to depend a great deal on cybertechnology in conducting many activities in our day-to-day lives. • In the future, will humans depend on the kind of smart objects and smart environments made possible by AmI technology in ways that exceed our current dependency on cybertechnology?

  23. Technological Dependency (Continued) • IUIs could relieve us of: • (a) having to worry about performing many of our routine day-to-day tasks, which can be considered tedious and boring, and • (b) much of the cognitive effort that has, in the past, enabled us to be fulfilled and to flourish as humans.

  24. Technological Dependency (Continued) • What would happen to us if we were to lose some of our cognitive capacities because of an increased dependency on cybertechnology? • Review the futuristic scenario (in the textbook) described by E. M. Forster about what happens to a society when it becomes too dependent on machines.

  25. Privacy, Surveillance, and the Panopticon • Marc Langheinrich (2001) argues that with respect to privacy and surveillance, four features differentiate AmI from other kinds of computing applications: • ubiquity, • invisibility, • sensing, • memory amplification.

  26. Privacy, Surveillance, and the Panopticon (Continued) • Langheinrich notes that because: • (1) computing devices are ubiquitous or omnipresent in AmI environments, privacy threats are more pervasive in scope. • (2) computers are virtually invisible in AmI environments, it is likely that users will not always realize that computing devices are present and are being used to collect and disseminate personal data.

  27. Privacy, Surveillance, and the Panopticon (Continued) • Langheinrich also believes that AmI poses a more significant threat to privacy than earlier computing technologies because: • Sensing devices associated with IUIs may become so sophisticated that they will be able to sense (private) human emotions like fear, stress, and excitement. • AmI has the unique potential to create a memory or “life-log” – i.e., a complete record of someone’s past.

  28. Surveillance and the Panopticon • Johann Čas (2004) notes that in AmI environments, no one can be sure that he or she is not being observed. • An individual cannot be sure whether information about his or her presence at any location is being recorded.

  29. Surveillance and the Panopticon (Continued) • Čas believes that the only realistic attitude is to assume that any activity (or inactivity) is being monitored and that this information may be used in any context in the future. • So, people in AmI environments are subject to a virtual “panopticon.” • Review the example of Bentham’s “Inspection House” (described in the textbook). Does it anticipate any threats posed by AmI?

  30. Table 12-1 Ambient Intelligence

  31. Bioinformatics • Bioinformatics is a branch of informatics. • Informatics involves the acquisition, storage, manipulation, analysis, transmission, sharing, visualization, and simulation of information on a computer. • Bioinformatics is the application of the informatics model to the management of biological information.
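A minimal, hypothetical example of the informatics model applied to biological information is computing the GC content of a stored DNA sequence — the fraction of its bases that are G or C, a routine statistic in sequence analysis. The sequence below is a made-up fragment, not real genomic data.

```python
# A minimal bioinformatics sketch: storage, manipulation, and analysis
# of biological data on a computer. The sequence is a fabricated fragment.

def gc_content(seq):
    """Fraction of a DNA sequence made up of G and C bases."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

print(gc_content("ATGCGCAT"))  # 0.5
```

The point of the example is only that once biological material is represented as digital data, it can be stored, copied, queried, and mined like any other information — which is what gives rise to the ethical concerns in the next slides.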

  32. Ethical Aspects of Bioinformatics • Three kinds of social and ethical concerns arise in bioinformatics research and development: • Privacy and Confidentiality; • Autonomy and Informed Consent; • Information Ownership and Property Rights.

  33. Privacy, Confidentiality, and the Role of Data Mining • Review the deCODE Genetics case (described in the textbook). • Many individuals who donated DNA samples to deCODE had the expectation that their personal genetic data was: • confidential information, • protected by the company’s privacy policies and by privacy laws.

  34. Privacy, Confidentiality, and the Role of Data Mining (Continued) • Anton Vedder (2004) notes that privacy protection that applies to personal information about individuals does not necessarily apply to that information once it is: • aggregated, and • cross-referenced with other information (via data mining).
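Vedder's point can be illustrated with a toy cross-referencing sketch: two fabricated data sets, each relatively harmless in isolation, are joined on shared quasi-identifiers, re-identifying the "anonymous" genetic records. All records, names, and field names below are invented for illustration.

```python
# Toy illustration of cross-referencing: "anonymous" study records joined
# with a public registry on shared quasi-identifiers. All data is fabricated.

study_data = [  # genetic study: no names, but quasi-identifiers remain
    {"zip": "10001", "birth_year": 1970, "marker": "risk-variant"},
    {"zip": "10002", "birth_year": 1985, "marker": "no-variant"},
]
public_roll = [  # e.g., a public registry listing names with the same fields
    {"name": "J. Doe", "zip": "10001", "birth_year": 1970},
    {"name": "A. Smith", "zip": "10002", "birth_year": 1985},
]

# Join on (zip, birth_year): the nameless genetic records are now
# linked to named individuals.
linked = [
    {**p, "marker": s["marker"]}
    for s in study_data
    for p in public_roll
    if (p["zip"], p["birth_year"]) == (s["zip"], s["birth_year"])
]
print(linked[0]["name"], linked[0]["marker"])  # J. Doe risk-variant
```

Neither data set violates privacy on its own; it is the cross-reference that produces sensitive, identified genetic information — the step that existing privacy protections may not cover.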

  35. Privacy, Confidentiality, and Data Mining (Continued) • Research subjects could be denied employment, health insurance, or life insurance based on the results of data-mining technology used in genetic research. • For example, a person could end up in a “risky” category based on arbitrary associations and correlations that link trivial non-genetic information with sensitive genetic information.

  36. Privacy, Confidentiality, and Data Mining (Continued) • Individuals who eventually become identified or associated with newly-created groups may have no knowledge that the groups to which they have been assigned actually exist. • These people may also have no chance to correct any inaccuracies or errors that could result from their association with that group.

  37. Autonomy and Informed Consent • The informed-consent process used to obtain permission from human research subjects who participate in genetic studies involving data mining is controversial. • In the deCODE case, did the genetic information acquired from “informed” volunteers meet the required conditions for valid informed consent?

  38. Autonomy and Informed Consent (Continued) • According to the Office of Technology Assessment (OTA) Report, entitled Protecting Privacy in Computerized Medical Information, individuals must: • (i) have adequate disclosure of information about the data dissemination process; • (ii) be able to fully comprehend what they are being told about the procedure or treatment.

  39. Autonomy and Informed Consent (Continued) • Because of the way data-mining technology can manipulate personal information that is authorized for use in one context only, the process of informed consent is opaque. • The conditions required for “valid” informed consent are difficult, if not impossible, to achieve in cases that involve secondary uses of personal genetic information. • So, it is not clear that these research subjects have full autonomy.

  40. Intellectual Property Rights and Ownership Issues • Consider that deCODE Genetics was given exclusive rights (for 12 years) to the information included in the Icelandic health-records database. • This raises property-rights issues that also affect who should (and should not) have access to the information in that database.

  41. Intellectual Property Rights and Ownership Issues (Continued) • Who should own the personal genetic information in deCODE’s database? • Should deCODE have exclusive ownership rights to all of the personal genetic information that resides in its databases? • Should individuals retain (at least) some rights over their personal genetic data when it is stored in a privately owned database?

  42. Intellectual Property Rights and Ownership Issues (Continued) • Have individuals who donated their DNA samples to deCODE necessarily lost all rights to their personal genetic data, once it was stored in that company’s databases? • Should deCODE hold rights to this data in perpetuity, and should deCODE be permitted to do whatever it wishes with that data?

  43. Intellectual Property Rights and Ownership Issues (Continued) • Why are questions involving the ownership of personal genetic information stored in commercial databases so controversial from an ethical point of view? • Recall our discussion in Chapter 5 of a commercial database containing personal information about customers that was owned by Toysmart.com, a now defunct on-line business.

  44. Table 12-2: Ethical Issues Associated with Bioinformatics

  45. Ethical Guidelines and Legislation for Genetic Data/Bioinformatics • ELSI (Ethical, Legal, and Social Implications) Guidelines have been established for federally-funded genomics research. • ELSI requirements do not apply to genomics research in the commercial sector.

  46. Ethical Guidelines and Legislation (Continued) • Some genetic-specific privacy laws and policies have been passed in response to concerns about potential misuses of personal genetic data. • In the U.S., laws affecting genetics have been enacted primarily at the state level.

  47. Ethical Guidelines and Legislation (Continued) • No U.S. federal laws protect personal genetic data per se. • The Health Insurance Portability and Accountability Act (HIPAA) provides broad protection for personal medical information. • HIPAA protects the privacy of “individually identifiable health information” from “inappropriate use and disclosure.”

  48. Ethical Guidelines and Legislation (Continued) • Critics worry that HIPAA does not provide any special privacy protection for personal genetic information. • It is not clear that HIPAA adequately addresses concerns affecting nonconsensual secondary uses of personal medical and genetic information (Baumer, Earp, and Payton, 2006).

  49. Nanotechnology • Rosalyn Berne (2005) defines nanotechnology as: the study, design, and manipulation of natural phenomena, artificial phenomena, and technological phenomena at the nanometer level. • K. Eric Drexler, who coined the term nanotechnology in the 1980s, describes the field as: a branch of engineering dedicated to the development of extremely small electronic circuits and mechanical devices built at the molecular level of matter.

  50. Nanotechnology and Nanocomputing • Drexler (1991) predicted that developments in nanotechnology will result in nanoscale computers, no bigger than bacteria, called nanocomputers. • Nanocomputers can be designed using various types of architectures. • An electronic nanocomputer would operate in a manner similar to present-day computers, differing primarily in terms of size and scale.
