Part IV: Liability, Chapter 15: Security for Ubiquitous Computing



  1. Part IV: Liability, Chapter 15: Security for Ubiquitous Computing. Tobias Straub, Andreas Heinemann

  2. Introduction & Motivation • UC features (not meant to be complete) • large number of peers • spontaneous and autonomous interaction • a priori unknown communication partners • no or just limited established security infrastructure (e.g., PKI) • rich diversity of UC settings and applications • Virgil Gligor, 2005 From the Internet where processing is free and physically protected, but communication is not to UC where neither processing nor communication is free and physically protected Security for UC:

  3. UbiComp changes our world (diagram slide)
  • today: printing from a known, trusted device (PC with virus scanner) to a local printer; common security approaches apply
  • tomorrow: a "virtual printer" as part of the infrastructure (layout, billing, and printing services); PDAs, laptops, and mobile phones interact with various infrastructures & services and unknown networks, at different levels of trust
  • attacker side: new, distributed attacks and various/new monitoring capabilities; privacy is at risk

  4. FOUR UC SETTINGS

  5. UC setting 1: Mobile Computing
  • supports mobile users with connectivity and access to services and backend systems while on the move (aka nomadic computing)
  • relies on a given infrastructure (802.11 WiFi, GSM, UMTS, etc.)
  • user needs to register with a provider
  • access is controlled by the provider
  • closed group of users
  • no user anonymity
  • physical threat because the device is mobile: 4,973 laptops, 5,939 Pocket PCs, and 63,135 mobile phones were lost or forgotten in London taxis within 6 months (see http://www.laptopical.com/laptops-lost-in-taxi.html)
  • danger of device owner impersonation
  • private & business data lost

  6. Mobile Computing Scenario
  Scenario: The Mobile Salesman. While on the road, a salesman needs to regularly download up-to-date client reports from his company's databases. His laptop is equipped with several wireless communication interfaces which can be used to connect via different service providers, depending on what kind of service/infrastructure is available. At the client's office, there is a WiFi network the salesman can access. There are also some networked printers available for guests. However, it is unclear to what extent this infrastructure can be trusted.
  Challenges
  • secure communication to the backend via insecure communication links
  • secure storage of internal data on a mobile device
  • secure device association: Is there a way to securely send a confidential document over the air to a printer located in the office? Does it help if the salesman selects a printer close to him, equipped with a secondary communication interface?

  7. UC setting 2: Ad Hoc Interaction
  • no given infrastructure
  • UC devices build the infrastructure on their own by establishing temporary, wireless, ad hoc communication links between them
  • on the application layer: spontaneous interaction without any central authority that restricts interaction/participation; no managed groups
  • user & device anonymity
  • again: physical device exposure

  8. Ad Hoc Interaction Scenario
  Scenario: Passive Collaboration in Opportunistic Networks. In an opportunistic network, passers-by exchange information, for example digital advertisements (Straub & Heinemann, 2004), while being co-located. After an initial configuration, devices interact autonomously and without the users' attention. Information dissemination is controlled by profiles stored on the users' devices. Such a profile expresses a user's interest in and knowledge about pieces of information to share.
  Challenges
  • devices that are a priori unknown to each other communicate: whom to trust? (see Chapter 16 – Trust II)
  • personal data is stored on the device and exchanged with strangers: user privacy is at risk

  9. UC setting 3: Smart Spaces
  • focus on user friendliness & user empowerment
  • unobtrusive interaction
  • use of contextual information
  • optional: digital IDs in use
  • often based on sensing and tracking capabilities integrated into the environment
  • location privacy issues?

  10. Smart Spaces Scenario
  Scenario: Patient Monitoring. In a hospital, all patient records are digitally stored and maintained in a central database. Records are updated with the results of physical examinations or continuous monitoring. Husemann and Nidd (2005) describe a middleware capable of integrating a wide range of medical analyzers that have a common wireless interface. Consider a battery-driven heartbeat monitor which is attached to the body and sends measurements to the database. The data can also be used for a patient surveillance system that triggers an alarm in case of an anomaly.
  Challenges
  • for new patients, how to unambiguously associate the heartbeat monitor with a record?
  • how to secure the communication link?
  • how to detach a heartbeat monitor from a patient's record after the patient leaves the hospital?

  11. UC setting 4: Real-Time Enterprises
  • effort to leverage UC technology and methods within enterprises
  • goal: have immediate access to comprehensive and up-to-date information about processes and procedures within an enterprise
  • goal: close the information/media gap

  12. Real-Time Enterprise Scenario
  Scenario: RFID-based Warehouse Management. Radio frequency identification (RFID) offers a variety of opportunities for tracking goods (see, e.g., Fleisch & Mattern (2005)). Suppose all goods stocked at a warehouse are tagged with an RFID transponder. With the corresponding readers integrated into storage racks, the process of stocktaking can be completely automated and inventory information is available in real time.
  Challenges
  • how to prevent industrial espionage by unauthorized RFID tag readout?
  • how to prevent surveillance and tracking of humans by unauthorized RFID tag readout?

  13. A TAXONOMY OF UC SECURITY

  14. Basic Terminology and Objectives of IT Security
  ASSETS (data, hardware) to protect in the four scenarios:
  • confidential documents (Scenario 1)
  • an individual's habits and preferences (Scenario 2)
  • medical information (Scenario 3)
  • the stock list at a warehouse (Scenario 4)
  Protection Objectives (CIAA)
  • Confidentiality (C) refers to the aim of keeping pieces of information secret from unauthorized access.
  • Integrity (I) is the requirement that data is safe from changes, whether accidental or deliberate.
  • Authenticity (A) concerns the genuineness of messages or the identity of entities in a networked system.
  • Availability (A) means the provisioning of a system's services to its users in a reliable way.

  15. UC Characteristics and Associated Risks

  16. UC Limitations and Associated Challenges

  17. OVERVIEW OF CRYPTOGRAPHIC TOOLS

  18. Symmetric Cryptosystems
  • a plaintext is transformed into a ciphertext in order to ensure confidentiality between a sender (Alice) and a receiver (Bob)
  • Alice and Bob need to agree on a shared key and an algorithm (3DES, AES, …)
  • symmetric: Alice and Bob use the same key for encryption and decryption (see the sketch below)
  • Kerckhoffs (19th century): a cryptosystem's strength should not be based on the assumption that its algorithm is kept secret, but only on the attacker's uncertainty regarding the key
  • visit http://www.keylength.com for appropriate key lengths
  • open issue: secure key distribution?
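  A minimal sketch of symmetric encryption in Python, assuming the third-party cryptography package is installed (not part of the standard library); Fernet wraps AES encryption plus an integrity check under a single shared key:

      # pip install cryptography  (assumed available)
      from cryptography.fernet import Fernet

      shared_key = Fernet.generate_key()   # must reach Bob over a secure channel
      cipher = Fernet(shared_key)

      ciphertext = cipher.encrypt(b"confidential client report")   # Alice encrypts
      plaintext = cipher.decrypt(ciphertext)                       # Bob decrypts with the same key
      assert plaintext == b"confidential client report"

  The snippet also illustrates the open issue in the last bullet: generating the key is easy, but getting shared_key to Bob securely is exactly the key distribution problem.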

  19. Asymmetric Cryptosystems and PKI
  • avoids the key distribution problem
  • makes use of different keys for encryption and decryption (public and private key)
  • Alice encrypts a message for Bob with Bob's public key; Bob uses his corresponding private key to decrypt the message (see the sketch below)
  • examples: RSA, ElGamal, elliptic curve cryptography
  • new problem: public key authentication – how does Alice know that Bob's public key is genuine?
  • solution: digital certificates managed by PKIs
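  A corresponding asymmetric sketch, again assuming the cryptography package; the key size and OAEP padding are illustrative choices, not a recommendation:

      from cryptography.hazmat.primitives.asymmetric import rsa, padding
      from cryptography.hazmat.primitives import hashes

      # Bob generates a key pair and publishes the public part.
      bob_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      bob_public = bob_private.public_key()

      oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                          algorithm=hashes.SHA256(), label=None)

      ciphertext = bob_public.encrypt(b"hello Bob", oaep)   # Alice needs only the public key
      plaintext = bob_private.decrypt(ciphertext, oaep)     # only Bob's private key can decrypt
      assert plaintext == b"hello Bob"

  Note that nothing in the code tells Alice that bob_public really belongs to Bob; that gap is what certificates and a PKI close.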

  20. Hash Functions & Digital Signatures • Modification detection code (MDC) • ensures data integrity • hash function h: a function that compresses bitstrings of arbitrary finite length to bitstrings of fixed length, common 160 bit • Examples: RIPEMD-160, SHA-1 • has to be 2nd preimage resistant: Given an input x that hashes to h(x), an attacker must not be able to find a value y x such that h(y) = h(x). • Message authentication code (MAC) • hash function + secret key shared between sender and receiver • On receipt, Bob knows: Message is integer and was send by Alice • Each MDC h can be extended to a MAC in the following way: On input x, compute h( (k 7p1) || h( (k 7p2) || x) ) where k is the key, p1, p2 are constant padding strings, 7 is the XOR operation, and || denotes concatenation. • Digital Signatures • used for proof of authorship (different to MAC, where both Alice and Bob know a shared key) • often implemented with public key cryptography, see RSA signature scheme. Security for UC:

  21. Limitations of Cryptography in UC
  • energy consumption is a serious issue in UC
  • a Pocket PC battery with 1500 mAh capacity at 5 V would have lost 20% of its charge after 5,000 executions of a DH protocol or 10,000 RSA signatures (experiments with a 206 MHz Compaq iPAQ H3670; Potlapally, Ravi, Raghunathan, & Jha, 2003)
  • lightweight cryptography needed (new designs, but also new risk and threat analysis)
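  A rough sanity check of these figures as plain arithmetic (illustrative only):

      capacity_j = 1.5 * 5 * 3600    # 1500 mAh at 5 V = 7.5 Wh = 27,000 J
      budget_j = 0.2 * capacity_j    # 20% of the battery = 5,400 J
      print(budget_j / 5000)         # ~1.1 J per Diffie-Hellman run
      print(budget_j / 10000)        # ~0.5 J per RSA signature

  Roughly one joule per public-key operation is why lightweight cryptography matters on battery-powered UC devices.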

  22. SAMPLE SOLUTIONS

  23. Privacy-Enhancing Technologies (I)
  • blurring data (e.g., in a location-based service); a sketch follows below
  • suitable for one-hop communication in opportunistic networks (cf. Scenario 2)
  • avoid static data on all network layers
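  A hypothetical sketch of data blurring in Python (function name and grid size are illustrative): instead of the exact position, the device reports only a coarse grid cell.

      def blur_location(lat, lon, decimals=2):
          # Rounding to two decimal degrees limits precision to roughly 1 km.
          return (round(lat, decimals), round(lon, decimals))

      print(blur_location(49.87512, 8.65829))   # -> (49.88, 8.66)

  The same idea applies to other attributes: the coarser the reported value, the less a single observation reveals about the user.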

  24. Privacy-Enhancing Technologies (II)
  Design principles for UC environments, Langheinrich (2001):
  • Notice: an announcement mechanism that allows users to notice the data collection capabilities in their environment
  • Choice and Consent: the user has the choice of allowing or denying any kind of data collection (respected by the environment)
  • Proximity and Locality: meta information (locality and proximity) for collected data should be used by the environment to enforce access restrictions
  • Access and Recourse: easy user access to collected personal information; reports about the usage of personal data
  Implemented in pawS, Langheinrich (2002)

  25. pawS Architecture. Langheinrich (2002)

  26. Fighting DoS Attacks
  Proof-of-Work (PoW) techniques
  • idea: treat the computational resources of each user of a resource or service as valuable
  • in order to prevent arbitrarily high usage of a common resource by a single user, each user has to prove that she has made some effort, i.e., spent computing resources, before she is allowed to use the service
  • the sender provides the answer to a computational challenge together with the message; if verification of the answer fails, the message is discarded
  • the cost of creating such a proof must be some orders of magnitude higher than the cost of system setup and proof verification (see the sketch below)
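  A hashcash-style proof-of-work sketch in Python (standard library only; the 20-bit difficulty is an illustrative assumption): solving takes about 2^20 hash evaluations on average, while verification costs a single hash.

      import hashlib
      from itertools import count

      DIFFICULTY_BITS = 20

      def solve(message):
          # Sender: search for a nonce so that the hash starts with DIFFICULTY_BITS zero bits.
          for nonce in count():
              digest = hashlib.sha256(message + str(nonce).encode()).digest()
              if (int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS)) == 0:
                  return nonce

      def verify(message, nonce):
          # Receiver: one hash suffices; if the check fails, the message is discarded.
          digest = hashlib.sha256(message + str(nonce).encode()).digest()
          return (int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS)) == 0

      msg = b"advertisement from a passer-by"
      nonce = solve(msg)
      assert verify(msg, nonce)

  The asymmetry between solve() and verify() is the whole point: flooding the receiver becomes expensive for the attacker, while checking stays cheap for the victim.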

  27. Bootstrapping Secure Communication
  Secure transient association – the resurrecting duckling security policy
  • device authentication in the absence of a central and always available authority
  • agreement on a shared key by physical device contact: simple for a user to understand, and the association between the involved devices is unambiguous
  • two devices and two roles involved: a slave (or duckling) obeys a master; a master (or mother duck) controls a slave
  • two states of a slave: imprintable and imprinted (see the four principles on the next slide)

  28. Four formal principles of the resurrecting duckling security policy (Stajano, 2002)
  • Two State principle: a duckling is either imprintable or imprinted. In the imprintable state, anyone can take it over; in the imprinted state, it only obeys its mother duck.
  • Imprinting principle: the transition from imprintable to imprinted, known as imprinting, happens when the mother duck sends an imprinting key to the duckling. This must be done using a channel whose confidentiality and integrity are adequately protected. The mother duck must also create an appropriate backup of the imprinting key.
  • Death principle: the transition from imprinted to imprintable is known as death. It may only occur under a very specific circumstance, depending on the particular variant of the model: death by order of the mother duck, death by old age after a predefined time interval, or death on completion of a specific transaction.
  • Assassination principle: the duckling must be constructed in such a way that it is uneconomical for an attacker to assassinate it, i.e., to cause the duckling's death artificially in circumstances other than the one prescribed by the Death principle of the policy.
  A toy sketch of the first three principles follows below.
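  A toy Python state machine illustrating the Two State, Imprinting, and Death principles (class and method names are hypothetical, for illustration only; a real imprinting step would run over a confidentiality- and integrity-protected channel such as physical contact):

      class Duckling:
          def __init__(self):
              self.imprinting_key = None          # imprintable: no mother duck yet

          def imprint(self, key):
              # Imprinting principle: only an imprintable duckling accepts a key.
              if self.imprinting_key is not None:
                  raise PermissionError("already imprinted")
              self.imprinting_key = key

          def die(self, key):
              # Death principle (variant: death by order of the mother duck).
              if key != self.imprinting_key:
                  raise PermissionError("only the mother duck may kill the duckling")
              self.imprinting_key = None          # back to imprintable

  The Assassination principle is not something code alone can enforce: it is a requirement on how the device is built, so that triggering death outside the policy is uneconomical for an attacker.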

  29. Out-of-Band Channels in UC
  • UC environments may feature a rich set of out-of-band channels that can be used to bootstrap secure communication, e.g.:
  • infrared light
  • dynamically generated 2D barcodes
  • a location-limited audio channel
  • biometric data
  • ultrasound
  • an LED and a pushbutton
  • example: Proximity-Based Authentication for Windows Domains (Aitenbichler & Heinemann, 2007)

  30. RFID – Clipped Tag (IBM)
  • IBM's "Clipped Tag" gives consumers the ability to simply opt out and protect their privacy by tearing or scratching off the RFID antenna, eliminating the tag's ability to communicate with other devices or systems.

  31. Literature
  • Virgil Gligor (2005). Cryptolite: How Lite Can Secure Crypto Get? Information Security Summer School.
  • Straub & Heinemann (2004). An Anonymous Bonus Point System For Mobile Commerce Based On Word-Of-Mouth Recommendation. In Applied Computing 2004: Proceedings of the 2004 ACM Symposium on Applied Computing (pp. 766–773). New York: ACM Press.
  • Husemann & Nidd (2005). Pervasive Patient Monitoring – Take Two at Bedtime. ERCIM News, 70–71.
  • Fleisch & Mattern (2005). Das Internet der Dinge: Ubiquitous Computing und RFID in der Praxis [The Internet of Things: Ubiquitous Computing and RFID in Practice]. Springer.
  • Potlapally, Ravi, Raghunathan, & Jha (2003). Analyzing the Energy Consumption of Security Protocols. In Proc. ISLPED'03 (pp. 30–35).
  • Langheinrich (2001). Privacy by Design – Principles of Privacy-Aware Ubiquitous Systems. In G. D. Abowd, B. Brumitt, & S. A. Shafer (Eds.), Ubicomp (Vol. 2201, pp. 273–291). Springer.
  • Langheinrich (2002). A Privacy Awareness System for Ubiquitous Computing Environments. In G. Borriello & L. E. Holmquist (Eds.), Ubicomp (Vol. 2498, pp. 237–245). Springer.
  • Stajano (2002). Security for Ubiquitous Computing. John Wiley & Sons.
  • Aitenbichler & Heinemann (2007). Proximity-Based Authentication for Windows Domains. To appear at the UbiComp 2007 Workshop on Security for Spontaneous Interaction.
