## Nicolas T. Courtois - University College London

## Cryptographic Message Authentication / Entity Authentication: Passwords, Challenge-Response + Time Stamping

Nicolas T. Courtois, January 2009

## Two Main Areas in Authentication

- Cryptographic message authentication
  - MACs / digital signatures + complex protocols
- Entity authentication
  - Passwords
    - static = bad
  - Challenge-response
    - dynamic: the right answer to all questions at the exam

## Vocabulary

Basic concepts:

1. Identification: declare who you are.
2. [Entity] Authentication: prove it.

But a Secure Identification Scheme = 1 + 2 = an Entity Authentication Scheme; the two can be considered synonyms.

## Crypto Revision (in CompSec, crypto = black boxes)

## Goals of Cryptography

- Confidentiality: privacy, anonymity or pseudonymity.
- Authenticity, integrity, non-repudiation…
- Fair play and resistance to malicious behaviour in multiparty protocols…
- Meta: trust (or accountability), openness, governance, compliance, auditing, alerting, risk assessment...

## Security? A 3-Point Formal Approach

What is security? The inability to achieve:

- the adversarial goal (security against what),
- given the resources of the adversary (against whom): money, human resources, computing power, memory, risk, expertise, etc.,
- and a given access to the system.

## Security? A 3-Point Formal Approach (cont.)

A security notion / definition = a triple:

- Adversarial goal.
- Resources of the adversary.
- Access / attack.

One can ONLY talk about security w.r.t. a given triple; it may not hold for another triple.

## Authenticity - Vocabulary

Two main areas:

- Message authentication.
- Entity authentication / identification.

Closely related…

## Entity Authentication / Identification

3 FACTORS: a person/device can be authenticated by

- Something that he/it knows.
  - PIN, password, knowledge of an AES key, a private RSA key, etc.
- Something that he/it has.
  - Smart card, USB key, TPM module, and other tamper-resistant hardware…
- Something that he/it is.
  - Biometrics: unique physical characteristics (cf. a snowflake).

## Multi-Factor Authentication

To enter the office, one needs:

- A PIN.
- A smart card.

We speak of a 2-factor system. High-security systems (e.g. a bank vault, a military lab) require all 3 factors to be used systematically and simultaneously => good security.

## Message Authenticity – Goals

Different security levels:

1. Correct transmission – no (random) transmission error; a malicious attacker can always modify the message.
   - Achieved with CRCs and/or error correction/detection codes.
2. Integrity – no modification is possible if the "tag/digest" is authentic. If we cannot guarantee the authenticity of the tag, a malicious attacker can still modify the message and re-compute the hash.
   - Achieved with cryptographic hash functions (= MDC), e.g. SHA-1.
3. Authenticity – a specific source, authenticated with some secret information (a key).
   - Achieved with a MAC (= a hash function with a key = a secret-key signature).

4a. Non-repudiation – a very strong requirement: only one person/entity/device can produce this document.
   - Achieved with digital signatures, the strongest method of message authentication.

4b. Public verifiability – everybody can be convinced of the authenticity (trust the bank?).
   - Achieved with digital signatures.

## Signatures

Can be:

- Public key:
  - Real, full-fledged digital signatures.
- Secret key:
  - Not « real signatures » but MACs.
  - Widely used in practice; in some cases OK…

## MACs = "Secret-Key Signatures"

[Figure: the sender computes a tag on m with the MAC algorithm under the secret key sk; the verifier recomputes the tag from (m, tag) with the same sk and outputs yes/no; a forgery must be produced without sk.]
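The MAC flow in the figure can be sketched with Python's standard library. This is a minimal illustration only: HMAC-SHA256 stands in for a generic MAC algorithm, and the key and messages are made up for the example.

```python
import hmac
import hashlib
import os

def mac_tag(sk: bytes, m: bytes) -> bytes:
    """Sender side: compute the tag MAC_sk(m) with HMAC-SHA256."""
    return hmac.new(sk, m, hashlib.sha256).digest()

def mac_verify(sk: bytes, m: bytes, tag: bytes) -> bool:
    """Verifier side: recompute the tag with the same secret key
    and compare in constant time, outputting yes/no."""
    return hmac.compare_digest(mac_tag(sk, m), tag)

sk = os.urandom(32)                  # shared secret key sk
m = b"transfer 100 GBP to Alice"     # message m (illustrative)
tag = mac_tag(sk, m)

assert mac_verify(sk, m, tag)        # the genuine pair (m, tag) is accepted
assert not mac_verify(sk, b"transfer 900 GBP to Eve", tag)  # a forgery is rejected
```

Note that verification requires the same sk: anyone who can verify can also forge, which is exactly why a plain MAC provides neither non-repudiation nor public verifiability.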
## Digital Signatures

[Figure: the signer runs the signing algorithm on m with the private key sk; the verifier runs the verification algorithm on (m, signature) with the public key pk and outputs yes/no; a forgery must be produced without sk.]

## Digital Signatures with Message Recovery

[Figure: only the signature is transmitted; the verification algorithm with the public key pk recovers m and outputs yes/no.]

## Signatures - Requirements

- Authenticity – guarantees that the document was signed by…
- Non-repudiation – normally only possible with public-key signatures.
  - Unless we assume tamper-resistant hardware (e.g. a smart card) is available, in which case non-repudiation can be achieved with a MAC based on AES!
- Public verifiability – normally only possible with public-key signatures.
  - Unless there is a trusted third party (e.g. an independent and trusted authority, an electronic notary service); then public verifiability can be achieved with a MAC based on AES!

CONCLUSION: secret-key signatures can work in practice… but they are fundamentally either less secure or less practical (what if the notary stops responding, or the smart card destroys itself because it thinks it is being attacked, etc.).

## Digital Signatures: Top of the Top

The strongest known form of message authentication:

- Integrity, and more:
- Authenticity, and more:
- Public verifiability (unlike secret-key signatures / MACs), and more:
- Non-repudiation: I'm the only person who can sign…

## Digital Signatures vs. Authentication

- The strongest known form of message authentication.
- Also allows authentication of a token/device/person (e.g. EMV DDA, US passport):
  - challenge-response (just sign the challenge).
- The reverse does not hold:
  - It is not always possible to transform authentication into a signature.
- Signatures are more costly in general! Sym. encryption << P.K. authentication < signature.

## Part 3: Cryptographic Hashing

## Hash Functions
## What Do We Sign? The Problem

Public-key crypto is very slow. Signing a long message directly with RSA is impossible, even on a 40 GHz CPU!

- Use a hash function.
- Sign a short « digest » of the message.

## Hashing

In computer science we have:

- hashing (weak): no security, just some mixing and chopping…
  - must be very fast.
  - Example: hash tables, such as hash_set<> in the C++ STL.
- cryptographic hashing (strong):
  - nobody should ever find any weakness in it;
  - should be very fast, but NOT at the expense of security!

## [Cryptographic] Hash Function

A hash function (or hash algorithm) is a reproducible method of turning data (usually a message or a file) into a number suitable to be handled by a computer. These functions provide a way of creating a small digital "fingerprint" from any kind of data. The function chops and mixes (i.e., substitutes or transposes) the data to create the fingerprint, often called a hash value. The hash value is commonly represented as a short string of random-looking letters and numbers (binary data written in hexadecimal notation).

[Figure: H maps a message m of any length (0–∞ bits) to a digest H(m) of ≥160 bits, e.g. A94A8FE5 CCB19BA6 1C4C0873 D391E987 982FBBD3.]

## Hash-then-Sign

[Figure: a message m of any length is first hashed to a digest H(m) of ≥160 bits (e.g. 098f6bcd4621d373cade4e832627b4), which is then signed with a digital signature scheme, e.g. RSA-PSS, giving a signature s with ≥80 bits of security.]

## Hash Functions = MDC
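As a concrete illustration of the fingerprint idea above, here is a short sketch using Python's hashlib; SHA-256 is chosen as the example algorithm (the slide's sample digests are SHA-1-sized, 160 bits).

```python
import hashlib

m = b"A hash function is a reproducible method of turning data into a number."
print(hashlib.sha256(m).hexdigest())   # the short hexadecimal "fingerprint" H(m)

# Inputs of any length map to the same fixed output length:
assert len(hashlib.sha256(b"").digest()) == 32          # 256 bits
assert len(hashlib.sha256(m * 10000).digest()) == 32    # 256 bits

# Changing a single character gives a completely different-looking digest:
m2 = b"a hash function is a reproducible method of turning data into a number."
assert hashlib.sha256(m2).hexdigest() != hashlib.sha256(m).hexdigest()
```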
## Requirements

- a public function, no secret keys or parameters;
- arbitrary (or very long) length -> fixed length;
- easy/fast to compute;
- hard to:

## Requirements (cont.)

- OWF
- SPR
- CR

## Cryptographic Hash Functions

Hash functions – typical requirements:

- OWHF = One-Way Hash Functions. The strict minimum:
  - OWF
  - SPR
- CRHF = Collision-Resistant Hash Functions. A lot / too little?
  - OWF
  - CR – already hard to achieve…
- Many people demand even much more of hash functions:
  - OWF
  - SPR
  - CR – already hard to achieve…
  - PRF – a very strong requirement;
  - very fast, standardized, with partial security proofs, etc.

## One-Way Functions (OWF)

[Diagram: computing y = f(x) from x is easy; given y, finding an x such that x = f⁻¹(y) is hard.]

## Preimage Resistance == OWF

OWF = preimage resistant: let y be chosen at random. It is "hard" to find an x s.t. H(x) = y.

Hard = ? Concrete security: let y be on n bits.

- It should take time about 2^n.
- Remark: if it takes 2^(n/3), it is an OWF in the asymptotic sense, yet very insecure in practice!

Note: OW seems quite easy to achieve.

## Another Important Requirement

SPR – Second Preimage Resistance. Note: seems very feasible to achieve.

Hard = ? Concrete security:

- It should take time about 2^n.
- Knowing one x can help reduce the difficulty if there is a weakness somewhere…
- For a well-designed function, knowing one x doesn't seem to help a lot…

## Passwords

## The Key Idea

The Prover sends a password to the Verifier. The channel is assumed to be private.

- Integrity?
  - The channel doesn't really have to be authenticated or noise-free…
  - this will affect usability and availability, but not security.

## Areas of Study

Care is needed when:

- Choosing the password
  - (and the technology: e.g.
visual passwords)
- Storing the password on each side
  - cryptography
  - software / hardware security
- Using/typing the password:
  - *** vs shoulder surfing
- Transmitting the password
  - (encrypted in some way?) neither necessary nor sufficient…
- Destroying the password (why not?)

## Attacks Taxonomy

- Guessing
- Snooping / shoulder surfing
- Eavesdropping / sniffing
- Spoofing (a fake login page)

Impersonation = masquerading = illegitimate access with correct credentials.

## How to Measure Password Strength

## Threat Models for Password Inherent Strength

If interception is possible => replay attacks; security is lost.

Without interception:

- online guessing: pass or fail;
- offline password cracking.

Target:

- against one user;
- many users, target one: can be easier!
- target many users.

## Measures of Strength

- Choosing the password:
  - Entropy:
    - a single user's password: how hard is it to guess? Answer: ~23.4 bits, i.e. about 2^23.4 attempts.
  - Min-entropy = -log2(P of the most frequent password):
    - the weakest == the most frequent password;
    - important in attacks against multiple users.
  - Conditional entropy:
    - similar to an old password,
    - the same as another password,
    - correlated with memorable places, dates, names, etc.

## Revision About Entropy

## A Random Variable

By definition, a [real-valued] random variable X is a mapping X: Ω → IR. For each realisation of the experiment, X takes some value. Each random variable has a probability distribution. Assume that a source X outputs one of the values x1..xm. Then the probability distribution of X is defined by p_i =def= Pr[X = x_i].

## Entropy of a Source

Again let X be a random variable (with a finite or infinite number of possible outcomes x_i).
The entropy of X [Shannon] is:

H(X) =def= - Σ_x Pr[X=x] · log2 Pr[X=x]

It depends only on the probability distribution:

H(X) = - Σ_i p_i · log2 p_i

## Properties of the Entropy

For a joint source:

- H(X,Y) >= H(X), with equality if and only if Y can be written as f(X). (The joint entropy is bigger than that of one source alone, except if the second source is fully dependent on the first; then Y does not bring any additional uncertainty.)
- H(X,Y) <= H(X) + H(Y), with equality if and only if X and Y are independent. (When the variables are independent, the uncertainties add up. If not, the uncertainty is less than the sum of the two.)

## Properties of the Entropy (cont.)

A very important theorem:

- If there are n possible values x_i with Pr[X=x_i] > 0, then H(X) <= log2(n), with equality if and only if the distribution is uniform. (Biased sources yield less information! E.g. advertisements on TV: there is not much uncertainty in what they will say.)

## Corollary: Theorem 12-1 in Bishop

The average expected time to guess a password [for one fixed user] is maximised when all the possible passwords are equiprobable.

Proof: from the last page, H(X) <= log2(n), with equality if and only if the distribution is uniform.

## Conditional Entropy

The same, but the universe "shrinks". The entropy of X knowing Y:

H(X | Y)

It measures the amount of uncertainty remaining about X when Y has been observed and is known.

## Conditional Entropy - Formulas

The entropy of X knowing Y (also called the equivocation of Y about X):

H(X | Y) = Σ_y p(y) · H(X | Y=y), where H(X | Y=y) = - Σ_x Pr[X=x | Y=y] · log2 Pr[X=x | Y=y]
         = - Σ_{x,y} p(x,y) · log2 p(x|y)
         = - Σ_{x,y} p(x|y) · p(y) · log2 p(x|y)

## Conditional Entropy - Properties

- H(X | Y) >= 0 and H(X | X) = 0.
  (There is no uncertainty left about X when we know X.)
- H(X | Y) = H(X,Y) – H(Y). (The conditional entropy is the joint entropy minus the entropy of Y, because we know Y.)
- H(X | Y) <= H(X), with equality if and only if X and Y are independent. (The entropy of X can only decrease when we know Y; if it doesn't, X does not depend on Y at all.)

## Mutual Information

- I(X;Y) =def= H(X) – H(X | Y) = H(X) + H(Y) – H(X,Y)

(how much information is common to X and Y; a symmetric value)

## Password Management

## Bad User

Users fail to manage passwords properly, and in various ways, including highly comical ones.
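The entropy definitions and identities above can be checked numerically. A minimal sketch: the password probabilities and the joint distribution below are toy values chosen purely for illustration.

```python
from collections import defaultdict
from math import log2, isclose

def H(dist):
    """Shannon entropy in bits: H(X) = -sum_x p(x) * log2 p(x)."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Toy password distribution: one very popular password, three rarer ones.
pw = {"123456": 0.7, "password": 0.1, "qwerty": 0.1, "hunter2": 0.1}
shannon = H(pw)
min_entropy = -log2(max(pw.values()))         # -log2(P of the most frequent password)
assert min_entropy < shannon < log2(len(pw))  # a biased source stays below log2(n)

# A toy joint distribution for (X, Y), with its marginals.
joint = {("a", 0): 0.4, ("a", 1): 0.1, ("b", 0): 0.1, ("b", 1): 0.4}
px, py = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

Hxy = H(joint)
Hx_given_y = Hxy - H(py)              # H(X|Y) = H(X,Y) - H(Y)
mutual = H(px) + H(py) - Hxy          # I(X;Y) = H(X) + H(Y) - H(X,Y)

assert Hx_given_y <= H(px)                   # conditioning never increases entropy
assert isclose(mutual, H(px) - Hx_given_y)   # the two forms of I(X;Y) agree
```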