
9 th International Common Criteria Conference Report to IEEE P2600 WG






Presentation Transcript


  1. 9th International Common Criteria Conference
     Report to IEEE P2600 WG
     Brian Smithson, Ricoh Americas Corporation, 10/24/2008

  2. Overview
     • Hosted by the Korean Certification Body, part of the IT Security Certification Center of the National Intelligence Service
     • Well organized and produced, very hospitable; included an island tour and some entertainment
     • Conference agenda and other info: http://www.9iccc.kr
     • Slides available on request
     • Next ICCC will be in Norway

  3. CCDB activities for CCv3.1
     • A guidance document for transitioning from 2.x to 3.1 will be published soon
     • Transition dates for assurance continuity will be realigned to be consistent with the transition dates for version 3.1 in general
     • Protection Profiles may be evaluated in the same project and at the same time as their first use in a Security Target (!)
     • Some initial guidance on developer evidence production will be published in about six weeks (from September 23), with more guidance later
     • ISO has published CC version 3.1 parts 2 and 3; part 1 will be published later. Some minor changes to version 3.1 will also be published

  4. CCDB activities for CCv4
     • The CCDB has been considering what vendors need:
       • An assurance process that gives credit to vendors’ other assurance efforts
       • An efficient process
       • A process that can lead to assurance improvement for vendors
       • Results that are widely usable and recognized
     • They are also considering what customers need:
       • Assurance in operation
       • Meaningful information for people who build and operate certified systems and for people who are responsible for the data on those systems
       • Evaluation of real products as they are delivered and used in the marketplace, not just artificially restricted configurations
       • Some qualitative product assurance comparisons

  5. CCDB activities for CCv4 (2)
     • Among the key ideas they are trying to achieve:
       • More direct interaction between developers and assessment teams
       • Evaluation of existing documentation (and code), requiring few documents that are CC-specific
       • Evaluation of product development and update processes, so that ongoing assurance can be predicted
       • Tools to assist evaluators in collecting evidence, building evidence chains, and producing evaluation reports
       • More detailed reports for several different user audiences

  6. CCDB activities for CCv4 (3)
     • To implement some of these ideas, they are forming several workgroups:
       • Evidence-based approach (led by the US scheme)
       • Predictive assurance (led by DE)
       • Tools (led by ES, also UK)
       • Skills and interactions (led by UK, also US)
       • Meaningful reports (led by CA)
       • Lower assurance (EAL1~2) evaluations (led by UK)
     • Individual presentations were given for each of the WGs

  7. CCDB activities for CCv4 (4)
     • The eventual aim of these activities:
       • Evaluations performed by a combination of assurance experts and subject matter experts
       • Creation of a body of knowledge (akin to “case law”)
       • Interaction among evaluators, internationally (but with protection of intellectual property)
       • Evaluations that examine evidence produced as a normal function of product development
       • Evaluation reports that address the various needs of different user audiences (such as procurement, system integrators, IT staff, and data owners)
     • Rough schedule for these activities:
       • Workgroups will use electronic methods (wikis) more extensively, but will have a face-to-face meeting in the US around the beginning of 2Q2009
       • Finalizing changes for CC version 4 should take place sometime around 3Q~4Q2010

  8. Some other interesting presentations…
     • Dealing with the expanding membership of the CCRA:
       • If mutual recognition stops at EAL4, and a product is validated at EAL5 (or higher), why not mutually recognize that product at EAL4?
       • In CCv4, compromise on national differences so the CCRA can be more united in the future
     • Integrating CC and Capability Maturity Model Integration (CMMI)
     • Portfolio approaches to managing CC and FIPS-140
     • Evaluate process assurance (as is done in manufacturing) instead of product assurance
     • Better approaches to security domain separation are needed for components of systems that have inherently different EAL needs
     • Integration of architectural requirements into the CC
     • From our friends at atsec:
       • Introduce usability into the CC to better reflect overall assurance

  9. Some other interesting presentations… (2)
     • Along with TOE assurance, provide some measure of ST assurance to represent the quality of threats, assumptions, OSPs, …
     • Expand process assessment in the CC, matching it with product objectives, and increase the scope of certification to include similar products developed under the same process
     • Apply code analysis tools as part of CC evaluation:
       • CC is good for finding vulnerabilities in design and TOE mechanisms
       • Code analysis tools are good for finding vulnerabilities in implementation and operation
       • Use existing metrics and tools for measuring code complexity
     • Apply failure mode and effects analysis (FMEA) for more structured evaluation
     • Several proposals about tools to automate the evaluation process
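As an aside on the code-complexity metrics mentioned above: one widely used metric is McCabe cyclomatic complexity, which counts independent paths through a function. The presentations did not specify any particular tool; the following is only an illustrative sketch (not any tool proposed at the conference) that approximates the metric for Python source by counting decision points in the abstract syntax tree.

```python
import ast

# AST node types that introduce a decision point (a branch in control flow).
_DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe cyclomatic complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, _DECISION_NODES) for node in ast.walk(tree))
    return 1 + decisions

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for _ in range(x):
        pass
    return "positive"
"""

# Two If nodes (if/elif) plus one For node -> complexity 4.
print(cyclomatic_complexity(sample))  # -> 4
```

Real tools refine this considerably (per-function scoping, boolean operators, language-specific constructs), but the principle is the same: a mechanical count over the parsed code that an evaluator could apply uniformly across a TOE's implementation.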

  10. CCVF meetings
     • Pre-conference, with the CCDB:
       • David Martin (UK scheme) reviewed the CCDB WGs
       • Question about considering non-government needs: the UK doesn’t, AU does, and DE is interested in a broader approach, at least for lower assurance levels
       • CCDB progress seems to have been somewhat minimal since 8ICCC in Rome
     • Post-conference, CCVF only:
       • Decided to have liaisons with each CCDB WG:
         • Track and report progress
         • Collect CCVF opinions/consensus and feed them back to the CCDB
         • Try to help the CCDB with their direction
       • Getting serious about creating a CCVF web site:
         • Promoting awareness and recruiting new members
         • Collaborating and communicating
