
On the (Im)possibility of Obfuscating Programs


Presentation Transcript


  1. On the (Im)possibility of Obfuscating Programs Boaz Barak, Oded Goldreich, Russell Impagliazzo, Steven Rudich, Amit Sahai, Salil Vadhan and Ke Yang Presented by Shai Rubin Security Seminar, Fall 2003

  2. Theory/Practice "Gap" • In practice: hackers successfully obfuscate viruses; researchers successfully obfuscate programs [2,4]; companies sell obfuscation products [3] • In theory [1]: there is no good algorithm for obfuscating programs • Which side are you on? Security Seminar, Fall 2003

  3. Why Do I Give This Talk? • To understand the Theory/Practice gap • An example of a good paper • An example of interesting research: • shows how to model a practical problem in terms of complexity theory • illustrates techniques used by theoreticians • I did not understand the paper; I thought that explaining it to others would help me understand it • To hear your opinion (free consulting) • To learn how to pronounce 'obfuscation' Security Seminar, Fall 2003

  4. Disclaimer • This paper is mostly about complexity theory • I'm not a complexity theory expert • I present and discuss only the main result of the paper • The paper describes extensions to the main result which I did not fully explore • Hence, some of my interpretations/conclusions/suggestions may be wrong or incomplete • You are welcome to catch my mistakes Security Seminar, Fall 2003

  5. Talk Structure • Motivation (Theory/Practice Gap) • Theoretician track: Obfuscation Model; Impossibility Proof • Practitioner track: Obfuscation Model Analysis; Other Obfuscation Models • Summary Security Seminar, Fall 2003

  6. Obfuscation Concept A good obfuscator: a virtual black box. "Anything an adversary can compute from an obfuscated program O(P), it can compute given just oracle access to P." The weakest notion of "compute": a predicate, or a property of P. [Diagram: the code of O(Prog.c) + analysis + input/output queries on one side, input/output queries to Prog.c alone on the other; both yield the same predicate p(Prog.c).] Security Seminar, Fall 2003

  7. Turing Machine Obfuscator A Turing machine O is a Turing Machine (TM) Obfuscator if for any Turing machine M: • [Functionality property] O(M) computes the same function as M. • [Efficiency property] O(M)'s running time¹ is the same as M's. • [Black box property] For any efficient algorithm² A (Analysis) that computes a predicate p(M) from O(M), there is an efficient oracle-access algorithm² R^M that for all M computes p(M): Pr[A(O(M)) = p(M)] ≈ Pr[R^M(1^|M|) = p(M)] In words: for every M, there is no predicate that can be (efficiently) computed from the obfuscated version of M but cannot be computed by merely observing the input-output behavior of M. ¹ Polynomial slowdown is permitted. ² Probabilistic polynomial-time Turing machine. Security Seminar, Fall 2003
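A minimal sketch of the black-box property as an experiment, under toy assumptions not in the paper: programs are Python callables, O is some candidate obfuscator mapping a callable to a callable, A may inspect what it receives, and R only gets a query oracle. The names (blackbox_gap, trials) are illustrative.

```python
# Toy rendering of the black-box property: estimate how much better an
# adversary with the obfuscated program does than a simulator that may
# only make input/output queries. A small gap for every A (with some R)
# is what the definition demands.

def blackbox_gap(O, A, R, M, p, trials=1000):
    """Estimate Pr[A(O(M)) = p(M)] - Pr[R^M = p(M)] empirically."""
    target = p(M)
    adv_hits = sum(A(O(M)) == target for _ in range(trials))
    oracle = lambda x: M(x)          # R sees only input/output behavior
    sim_hits = sum(R(oracle) == target for _ in range(trials))
    return (adv_hits - sim_hits) / trials
```

The trials parameter only matters when A or R is randomized; the proof that follows exhibits a specific M and p for which this gap is essentially 1 for every candidate O.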

  8. Talk Structure • Motivation (Theory/Practice Gap) • Theoretician track: Obfuscation Model; Impossibility Proof • Practitioner track: Obfuscation Model Analysis; Other Obfuscation Models • Summary Security Seminar, Fall 2003

  9. Proof Outline 1. You say: "I have an obfuscator: for any machine M, for any (analysis) algorithm A that computes a predicate p(M), there is an oracle-access algorithm R^M that for all M computes p(M)." 2. Really? Please provide O. 3. Given O and my chosen Turing machine E, I compute O(E). 4. I show you a predicate p and an (analysis) algorithm A s.t. A(O(E)) = p(E). You must provide R^M: Pr[R^E(1^|E|) = p(E)] ≈ Pr[A(O(E)) = p(E)]. 5. I choose another machine Z and obfuscate it using O. I show you that Pr[R^Z(1^|Z|) = p(Z)] << Pr[A(O(Z)) = p(Z)]. 6. Conclusion: please try another obfuscator (i.e., you do not have a good obfuscator). Security Seminar, Fall 2003

  10. Building E (1) • Combination machine. For any M, N: COMB_{M,N}(b,x) = M(x) if b = 1, and N(x) if b = 0. • Hence, given the code of COMB_{M,N}, one can recover M and N and compute M(N) (toy sketch below). Security Seminar, Fall 2003
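A toy rendering of the combination machine, assuming programs are Python callables rather than Turing machines; make_comb is an illustrative name.

```python
# Toy combination machine: dispatch on the first argument.

def make_comb(M, N):
    """COMB_{M,N}(b, x): runs M on x when b == 1, and N on x when b == 0."""
    def comb(b, x):
        return M(x) if b == 1 else N(x)
    return comb
```

Note the asymmetry the proof exploits: anyone holding code that computes COMB_{M,N} can specialize it to code for M (fix b = 1) and code for N (fix b = 0), and then run the one on the other; oracle access yields only input/output pairs.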

  11. Building E (2) • Let α, β ∈ {0,1}^k. • Let C_α,β(x) = β if x = α, and 0 otherwise. • Let D_α,β(C) = 1 if C(α) = β, and 0 otherwise. • Note: D_α,β can distinguish between C_α,β and C_α',β' when (α,β) ≠ (α',β'). • E_α,β = COMB_{D_α,β, C_α,β} • Remember: E_α,β can be used to compute D_α,β(C_α,β). Security Seminar, Fall 2003
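Continuing the running toy sketch (make_comb is from the previous snippet; modeling k-bit strings as Python ints below 2**k is an assumption): C is a point function that answers β only on the secret α, and D tests a piece of code by running it on α.

```python
# Toy versions of C_{alpha,beta} and D_{alpha,beta}.

import secrets

k = 16
alpha = secrets.randbits(k)   # secret point
beta = secrets.randbits(k)    # secret value

def C(x):
    """C_{alpha,beta}(x) = beta if x == alpha, else 0."""
    return beta if x == alpha else 0

def D(prog):
    """D_{alpha,beta} takes *code* for a program (here, a callable)
    and accepts iff that program maps alpha to beta."""
    return 1 if prog(alpha) == beta else 0

# E_{alpha,beta} = COMB_{D,C}: whoever holds E's code can compute D(C).
E = make_comb(D, C)
assert D(C) == 1
```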

  12. Proof Outline 1. You say: "I have an obfuscator: for any machine M, for any (analysis) algorithm A that computes a predicate p(M), there is an oracle-access algorithm R^M that for all M computes p(M)." ✓ 2. Really? Please provide O. ✓ 3. Given O and my chosen Turing machine E_α,β, I compute O(E_α,β). ✓ 4. I show you a predicate p and an (analysis) algorithm A s.t. A(O(E_α,β)) = p(E_α,β). You must provide R^M: Pr[R^E_α,β(1^|E_α,β|) = p(E_α,β)] ≈ Pr[A(O(E_α,β)) = p(E_α,β)]. 5. I choose another machine Z and obfuscate it using O. I show you that Pr[R^Z(1^|Z|) = p(Z)] << Pr[A(O(Z)) = p(Z)]. 6. Conclusion: please try another obfuscator (i.e., you do not have a good obfuscator). Security Seminar, Fall 2003

  13. The Analysis Algorithm Input: a combination machine COMB_{M,N}(b,x). Algorithm: • Decompose COMB_{M,N} into M and N: • COMB_{M,N}(1,x) = M(x) • COMB_{M,N}(0,x) = N(x) • Return M(N). Note: A(O(E_α,β)) is a predicate that is always (i.e., with probability 1) true: A(O(E_α,β)) = A(O(COMB_{D_α,β, C_α,β})) = D_α,β(C_α,β) = 1. You must provide an oracle-access algorithm R^M s.t. Pr[R^E_α,β(1^|E_α,β|) = 1] ≈ 1. Security Seminar, Fall 2003
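In the running toy encoding, the "decompose" step can be realized by partial application, so A works on any program with E's functionality, obfuscated or not; analyze is an illustrative name, and E comes from the earlier snippet.

```python
# Toy analysis algorithm A: given ANY program P computing the same
# function as COMB_{M,N}, specialize it to get code for N (x -> P(0, x)),
# then run M on that code via P(1, .). Obfuscation cannot prevent this,
# because A uses only P's input/output functionality.

def analyze(P):
    N_code = lambda x: P(0, x)   # runnable code for N
    return P(1, N_code)          # = M(N); for E = COMB_{D,C} this is D(C)

assert analyze(E) == 1           # holds for E and for any O(E) alike
```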

  14. Proof Outline 1. You say: "I have an obfuscator: for any machine M, for any (analysis) algorithm A that computes a predicate p(M), there is an oracle-access algorithm R^M that for all M computes p(M)." ✓ 2. Really? Please provide O. ✓ 3. Given O and my chosen Turing machine E_α,β, I compute O(E_α,β). ✓ 4. I show you a predicate p and an (analysis) algorithm A s.t. A(O(E_α,β)) = 1. You must provide R^M: Pr[R^E_α,β(1^|E_α,β|) = 1] ≈ Pr[A(O(E_α,β)) = 1] = 1. ✓ 5. I choose another machine Z and obfuscate it using O. I show you that Pr[R^Z(1^|Z|) = p(Z)] << Pr[A(O(Z)) = p(Z)]. 6. Conclusion: please try another obfuscator (i.e., you do not have a good obfuscator). Security Seminar, Fall 2003

  15. The Z Machine • Let Z_k be a machine that always returns 0^k. • Z is similar to E_α,β (COMB_{D_α,β, C_α,β}): replace C_α,β with Z_k. Z = COMB_{D_α,β, Z_k} • Note: A(O(Z)) is a predicate that is always (i.e., with probability 1) false: A(O(Z)) = A(O(COMB_{D_α,β, Z_k})) = D_α,β(Z_k) = 0 • Is Pr[R^Z(1^|Z|) = 0] ≈ 1? If we show that Pr[R^Z(1^|Z|) = 0] << 1, we are done. Security Seminar, Fall 2003
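The same running toy sketch with one line changed; Z and E_Z are illustrative names.

```python
# Toy Z machine: identical to E except the point function is replaced
# by an all-zeros machine, so the analysis algorithm now outputs 0.

def Z(x):
    return 0                 # stand-in for Z_k, which always returns 0^k

E_Z = make_comb(D, Z)        # Z = COMB_{D_{alpha,beta}, Z_k}
assert analyze(E_Z) == 0     # A(O(Z)) = D(Z_k) = 0
# (the assert holds unless beta happens to be 0, probability 2**-k)
```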

  16. D, D, D, D, D, D, Zk C, C, Zk Why Pr[RZ(1|Z|]=0)<<1 ? Let us look at the execution of RE,: RE,: 1 Start End C, When we replace the oracle to C, with oracle to Zk, we get RZ. What will change in the execution? RZ: Out’ Start End Zk Pr(out’=0) = Pr(a query to C, returns non-zero) = Pr(query=) = 2-k 3 Security Seminar, Fall 2003 3 Inaccurate, see paper.

  17. Proof Outline 1. You say: "I have an obfuscator: for any machine M, for any (analysis) algorithm A that computes a predicate p(M), there is an oracle-access algorithm R^M that for all M computes p(M)." ✓ 2. Really? Please provide O. ✓ 3. Given O and my chosen Turing machine E, I compute O(E). ✓ 4. I show you a predicate p and an (analysis) algorithm A s.t. A(O(E)) = 1. You must provide R^M: Pr[R^E(1^|E|) = 1] ≈ Pr[A(O(E)) = 1] = 1. ✓ 5. I choose another machine Z and obfuscate it using O. I show you that Pr[R^Z(1^|Z|) = 0] = 2^-k << Pr[A(O(Z)) = 0] = 1. ✓ 6. Conclusion: please try another obfuscator (i.e., you do not have a good obfuscator). Security Seminar, Fall 2003

  18. Talk Structure • Motivation (Theory/Practice Gap) • Theoretician track: Obfuscation Model; Impossibility Proof • Practitioner track: Obfuscation Model Analysis; Other Obfuscation Models • Summary Security Seminar, Fall 2003

  19. Modeling Obfuscation A good obfuscator: a virtual black box. "Anything that an adversary can compute from an obfuscation O(P), it can also compute given just oracle access to P." [Diagram: code of O(Prog.c) + analysis + input/output queries vs. input/output queries to Prog.c alone; both yield the same knowledge.] • Barak shows: there are properties that cannot be efficiently learned from I/O queries, but can be learned from the code. • However, we informally knew this already: for example, whether a program is written in C or Pascal, or which data structures a program uses. Security Seminar, Fall 2003

  20. Obfuscation Model Space [Diagram: a two-axis space. Axis 1: information hidden by the obfuscator, from a specific predicate to all predicates. Axis 2: difficulty of gaining information from O(P), from efficient to inefficient. Barak's model sits at the extreme: all predicates must be hidden, even from efficient analyses.] Security Seminar, Fall 2003

  21. TM Obfuscator A Turing machine O is a TM obfuscator if for any Turing machine M: • O(M) computes the same function as M. • O(M)'s running time¹ is the same as M's. • For any efficient algorithm² A (Analysis) that computes a predicate p(M), there is an efficient oracle-access algorithm² R^M that for all M computes p(M): Pr[A(O(M)) = p(M)] ≈ Pr[R^M(1^|M|) = p(M)] ¹ Polynomial slowdown is permitted. ² Probabilistic polynomial-time Turing machine. Security Seminar, Fall 2003

  22. Obfuscation Model Space [Diagram: the space extended with a third axis. Axis 1: information gained from O(P), from a specific predicate to all predicates. Axis 2: difficulty of gaining information from O(P), from efficient to inefficient. Axis 3: programs covered, from a specific program to all programs. Barak's model again sits at the extreme corner.] Security Seminar, Fall 2003

  23. TM Obfuscator A Turing machine O is a TM obfuscator if for any Turing machine M: • O(M) computes the same function as M. • O(M)'s running time¹ is the same as M's. • For any efficient algorithm² A (Analysis) that computes a predicate p(M), there is an efficient oracle-access algorithm² R^M that for all M computes p(M): Pr[A(O(M)) = p(M)] ≈ Pr[R^M(1^|M|) = p(M)] ¹ Polynomial slowdown is permitted. ² Probabilistic polynomial-time Turing machine. Security Seminar, Fall 2003

  24. Talk Structure • Motivation (Theory/Practice Gap) • Theoretician track: Obfuscation Model; Impossibility Proof • Practitioner track: Obfuscation Model Analysis; Other Obfuscation Models • Summary Security Seminar, Fall 2003

  25. Other Obfuscation Models [Diagram: placing practical schemes in the model space, alongside Barak's model.] • Static disassembly [2]: not all properties; not difficult; not a virtual black box? • Signature obfuscation: not all properties; not a virtual black box? Security Seminar, Fall 2003

  26. Barak's Model Limitations • Virtual black box: • Not surprising in some sense (but still excellent work) • Does not correspond to what attackers/researchers are doing: "the virtual black box paradigm for obfuscation is inherently flawed" • Too general: • the obfuscator must work for all programs • and for any property (Barak addresses this in the extensions) • Too restrictive: does not allow fitting the oracle algorithm to each Turing machine (does it matter?). Security Seminar, Fall 2003

  27. Alternative Models "Property-Hiding Model": for a given property q: (i) q can be computed from P; (ii) q cannot be (or is more difficult to?) computed from O(P). Given an algorithm A and a Turing machine M such that A(M) = q(M), obfuscate M such that: • [property hiding] for every algorithm A, A(O(M)) ≠ q(M) • [functionality] M and O(M) compute the same function Examples (toy sketch below): • Static disassembly: A(M) = a (particular) disassembler; q(M) = A(M); 90% of the instructions in A(M) differ from the instructions in A(O(M)). • Virus signature obfuscation: A(M) = q(M) = a substring of instructions inside M; O(M) does not contain this substring. Security Seminar, Fall 2003
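A toy instance of the property-hiding model for virus signatures, under strong assumptions not in the talk: programs are Python source strings, the property q is whether a fixed signature substring occurs, and the signature appears only inside one quoted string literal. SIGNATURE, q, and O are illustrative names.

```python
# Property-hiding toy: hide one specific property (a signature substring)
# while preserving functionality, instead of hiding "all predicates".

SIGNATURE = "EVIL_PAYLOAD"                  # hypothetical scanner signature

def q(source: str) -> bool:
    """The property to hide: does the signature occur in the code?"""
    return SIGNATURE in source

def O(source: str) -> str:
    """Split the signature's string literal into a concatenation.
    Assumes the signature occurs only as the quoted literal."""
    half = len(SIGNATURE) // 2
    split = '("%s" + "%s")' % (SIGNATURE[:half], SIGNATURE[half:])
    return source.replace('"%s"' % SIGNATURE, split)

prog = 'payload = "EVIL_PAYLOAD"\nprint(len(payload))'
assert q(prog) and not q(O(prog))           # property hidden
exec(prog); exec(O(prog))                   # same observable behavior
```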

  28. Alternative Models (2) Backdoor Model: hide the functionality for a single input; change the functionality for most other inputs. Given a Turing machine M and an input x: • [obfuscated back door] there exists y such that M(x) = O(M)(y) • [non-functionality] for every z ≠ y, Pr[M(z) ≠ O(M)(z)] is high Security Seminar, Fall 2003
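A toy sketch of this model, assuming programs are callables over k-bit ints; backdoor_obfuscate and the random-junk choice for other inputs are illustrative.

```python
# Backdoor-model toy: O(M) reproduces M(x) on one hidden trigger input y
# and outputs junk everywhere else, so only the back door's behavior
# survives obfuscation.

import secrets

def backdoor_obfuscate(M, x, k=16):
    y = secrets.randbits(k)              # hidden trigger input
    secret = M(x)                        # the single behavior to preserve
    def OM(z):
        if z == y:
            return secret                # back door: O(M)(y) = M(x)
        return secrets.randbits(k)       # junk: differs from M(z) w.h.p.
    return OM, y
```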

  29. Summary What to take home: • The gap is possible because: • the virtual black box paradigm is different from real-world obfuscation • The obfuscation model space • Nice research: Concept → Formalism → Properties • A lot remains to be done Security Seminar, Fall 2003

  30. Bibliography [1] B. Barak, O. Goldreich, R. Impagliazzo, S. Rudich, A. Sahai, S. Vadhan and K. Yang, "On the (Im)possibility of Obfuscating Programs", CRYPTO 2001, Santa Barbara, CA. [2] C. Linn and S. Debray, "Obfuscation of Executable Code to Improve Resistance to Static Disassembly", CCS 2003, Washington, DC. [3] www.cloakware.com [4] C. Collberg, C. Thomborson and D. Low, "Manufacturing Cheap, Resilient, and Stealthy Opaque Constructs", POPL 1998. Security Seminar, Fall 2003
