
Foundational Certified Code in a Metalogical Framework


Presentation Transcript


  1. Foundational Certified Code in a Metalogical Framework Karl Crary and Susmit Sarkar Carnegie Mellon University

  2. Motivation: Grid Computing • Make use of idle computing cycles over the network [e.g. SETI] • Computer owners download and execute code from developers • A key issue: Unknown developers, so consumers are concerned about safety

  3. Certified Code • Package the code with a certificate [PCC, TAL] • Certificate: a machine-verifiable proof of safety • Typically, a proof that the code is well-typed in a safe type system [Diagram: the developer ships code plus certificate to the consumer; the consumer asks "Is the code safe?" and the certificate answers "Code is safe!"]

  4. Type System? Is that safe? • Old answer: fix a type system, trust peer review • New answer: give developers the flexibility of using their own type systems • Need to check that this is safe • Known as Foundational Certified Code [Diagram: the developer's own type system and the machine details now both appear in the certificate shipped with the code]

  5. Roadmap • Our system • Metalogics • Safety policy • A safety proof • Related and future work

  6. Our System [Diagram: the developer ("Code satisfies my Safety Condition", "I can prove it to you!") ships code plus a certificate, which is a Safety Proof; the consumer asks "Why is your safety condition any good?" (answered by the Safety Policy) and "Does the code satisfy the Safety Policy?" (answered by checking the proof)]

  7. Metalogics: meta-theorems • We use LF to express logics: e.g., operational semantics, producers' safety conditions • We care about meta-theorems: if some input derivation exists, then an output derivation exists • e.g., the Safety Theorem (a small example signature follows)
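
  To give a flavor of expressing an operational semantics in LF, here is a minimal sketch in Twelf's concrete syntax. It encodes a toy expression language, not the actual IA32 signature; all names here are illustrative only:

      exp  : type.                % expressions of a toy language
      z    : exp.                 % zero
      s    : exp -> exp.          % successor
      plus : exp -> exp -> exp.   % addition

      step : exp -> exp -> type.  % one-step transition judgment
      step/z : step (plus z E) E.
      step/s : step (plus (s E1) E2) (s (plus E1 E2)).
      step/l : step (plus E1 E2) (plus E1' E2)
                <- step E1 E1'.

  A derivation of step E E' is a first-class object in LF, so a safety condition can be phrased as a statement about which derivations exist.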

  8. How to check meta-theorems? • Choice 1: reflect metalogical reasoning in the framework itself • Choice 2: use a logic designed for metalogical reasoning • e.g., Twelf [Schürmann]

  9. Programming in metalogics • We write logic programs relating derivations • Limited to ∀∃ reasoning; its authors plan a stronger system • Need to do induction on the structure of derivations • The system can check that these logic programs are total (user annotations required; see the sketch below)
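
  A minimal sketch of such a checked logic program (toy addition on unary naturals, not taken from the actual proof): the %mode, %worlds, and %total declarations are the user annotations the slide mentions, and Twelf verifies that every well-moded input leads to an output.

      nat : type.
      z   : nat.
      s   : nat -> nat.

      add : nat -> nat -> nat -> type.
      %mode add +M +N -K.         % M and N are inputs, K is an output
      add/z : add z N N.
      add/s : add (s M) N (s K)
               <- add M N K.
      %worlds () (add _ _ _).     % no local assumptions (closed world)
      %total M (add M _ _).       % checked total by induction on M

  Read as a meta-theorem, this says: for all M and N, some K and a derivation of add M N K exist, which is exactly the ∀∃ form described above.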

  10. Roadmap • Our system • Metalogics • Safety policy • A safety proof • Related and future work

  11. Safety policy - Preliminaries • Formalize the operational semantics of the IA32 architecture • Formalize machine states: memory, register file, stack, instruction pointer • Formalize transitions from state to state • Remove transitions deemed unsafe

  12. Example: transition for addition • addl $5,(%eax) • Load 4 bytes from (%eax) • Load the immediate operand 5 • Add them • Store the result back in (%eax) • Update EFLAGS and advance EIP • This can go wrong, e.g., if %eax points to protected memory • Solution: the formal load and store relations do not apply in such cases (sketched below)
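
  A minimal sketch of the idea, with hypothetical names and drastically simplified from the full IA32 formalization: the load relation is derivable only at readable addresses, so a transition like addl's simply has no derivation when %eax points into protected memory.

      addr : type.
      word : type.
      mem  : type.                          % memory states, left abstract here

      readable : mem -> addr -> type.       % policy: address may be accessed
      contents : mem -> addr -> word -> type.

      load : mem -> addr -> word -> type.   % load M A W: reading A in M yields W
      load/ok : load M A W
                 <- readable M A            % no rule covers an unreadable A
                 <- contents M A W.

  Any transition rule that reads memory takes a load derivation as a premise, so an unsafe read leaves the formal machine stuck rather than misbehaving.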

  13. Safety Policy • Define the initial state on loading program P • We never get to a state where the (formal) machine does not have a transition • Stated another way: the formal machine never gets stuck • The halt state is treated specially (stated schematically below)
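
  Schematically, in our notation (writing \Sigma for machine states and \mapsto for the formal transition relation), the safety policy reads:

      \mathrm{load}(P) = \Sigma_0 \;\wedge\; \Sigma_0 \mapsto^{*} \Sigma
      \;\Longrightarrow\; \Sigma = \mathsf{halt} \;\vee\; \exists\,\Sigma'.\ \Sigma \mapsto \Sigma'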

  14. Why is this safe? • When the real machine's transitions follow the formal machine's transitions, the real machine is performing only safe operations • To perform an unsafe operation, the real machine would have to take a transition that is not in the formal machine • This does not happen in a safe machine

  15. Roadmap • Our system • Metalogics • Safety policy • A safety proof • Related and future work

  16. Example Safety Proof • A particular safety proof • Our safety proof is for TALT [Crary] • A type system for an assembly language • Fairly low-level, but still abstract • Our foundational safety proof is syntactic [Hamid et al.]

  17. Safety • Our conditions will isolate a set of safe states • Safe states cannot transition to stuck states [Diagram: a safe state M1 steps only to states M2 that are not stuck]

  18. Key Lemmas • Progress • Preservation [Diagrams: Progress: a safe state M1 takes a step to some state M2; Preservation: if a safe state M1 steps to M2, then M2 is safe]
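
  In the same schematic notation, the two lemmas pictured above are:

      \text{Progress:}\quad \mathsf{safe}(M_1) \;\Longrightarrow\; \exists M_2.\ M_1 \mapsto M_2
      \text{Preservation:}\quad \mathsf{safe}(M_1) \;\wedge\; M_1 \mapsto M_2 \;\Longrightarrow\; \mathsf{safe}(M_2)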

  19. Putting it together – Safety Theorem • Transitions from a safe state cannot go to a stuck state [Diagram: a safe state M1 steps only to safe states M2, so no stuck state is ever reached]

  20. Idea of proof • Three parts of the proof • Abstract type safety (previous work) • Simulation • Determinism [Diagram: a concrete state M is a safe machine when some typed abstract machine M' implements it]

  21. TALT safety proof [Crary] • This has two top-level lemmas: • Progress: a well-typed abstract machine makes a transition • Preservation: if a well-typed abstract machine makes a transition, the resulting (abstract) machine is well typed (statement sketched below)
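
  A minimal sketch of how such lemmas are stated in Twelf, with hypothetical names standing in for the full TALT judgments: each lemma is a relation between derivations, and the %mode declaration marks which derivations are given and which are produced. The proof itself is a logic program with one clause per case of the induction.

      machine : type.                        % abstract (TALT-level) machines
      ok      : machine -> type.             % well-typedness judgment
      step    : machine -> machine -> type.  % abstract transition judgment

      progress : ok M -> step M M' -> type.
      %mode progress +D -E.                  % given typing, produce a step

      preserv : ok M -> step M M' -> ok M' -> type.
      %mode preserv +D +E -F.                % typing carries along the step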

  22. Concrete Machine Lemmas • Simulation • Determinism [Diagrams: Simulation: if abstract M1 implements concrete M1' and M1 steps to abstract M2, then M1' steps to a concrete M2' that M2 implements; Determinism: if concrete M1 steps to both M2 and M2', then M2 = M2']
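
  Writing A \sim C for "abstract machine A implements concrete state C" (our notation), and \mapsto_a, \mapsto_c for the abstract and concrete transitions, the two lemmas are, schematically:

      \text{Simulation:}\quad M_1 \sim M_1' \;\wedge\; M_1 \mapsto_{a} M_2
      \;\Longrightarrow\; \exists M_2'.\ M_1' \mapsto_{c} M_2' \;\wedge\; M_2 \sim M_2'
      \text{Determinism:}\quad M_1 \mapsto_{c} M_2 \;\wedge\; M_1 \mapsto_{c} M_2'
      \;\Longrightarrow\; M_2 = M_2'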

  23. Progress [Diagram: a safe concrete state M1 is implemented by an abstract, typed machine M1'; abstract progress gives a step from M1' to some M2', and simulation gives a concrete step from M1 to a state M2 that M2' implements]

  24. Preservation [Diagram: a safe concrete state M1 is implemented by a typed abstract M1'; abstract progress and preservation give a typed abstract M2', determinism matches the simulated step with the actual concrete step to M2, so M2' implements M2 and M2 is safe]

  25. Implementation Statistics • Safety policy: 2,081 lines of code • Safety proof: 44,827 lines of code • Time to check: 75 sec • Number of lemmas: 1,466 • Man-years: 1.5

  26. Related work • Foundational PCC - Appel et al. • FTAL - Hamid et al. • Temporal Logic PCC - Bernard and Lee

  27. Future Work • Develop a compiler from Standard ML to TALT • Expand the target language to include many more IA32 instructions • Specify and prove other properties, e.g., running-time bounds

  28. Indeterminism • The data may be indeterminate, e.g., due to input • Safety demands that every instance be safe • We have an oracle that the semantics consults to determine what to do • The oracle is quantified in the safety theorem (see below)
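
  Schematically, in our notation, the oracle \Omega becomes a parameter of the transition relation (writing \mapsto_{\Omega}, an assumption about how the quantification is arranged), and the safety theorem quantifies over it:

      \forall \Omega.\ \mathrm{load}(P) = \Sigma_0 \;\wedge\; \Sigma_0 \mapsto_{\Omega}^{*} \Sigma
      \;\Longrightarrow\; \Sigma = \mathsf{halt} \;\vee\; \exists\,\Sigma'.\ \Sigma \mapsto_{\Omega} \Sigma'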
