Static Code Analysis and Governance


Presentation Transcript


  1. Static Code Analysis and Governance: Effectively Using Source Code Scanners

  2. About Me • Jonathan Carter • Principal Security Consultant @ Pure Hacking • Governance Business Unit • Application Security • Enterprise Security Architect and Designer • Security Researcher @ Fortify • APIs, Frameworks, Threat Intelligence

  3. Presentation Flow • What do scanners do? • How do they do it? • What do you need to worry about? • How do you address these concerns?

  4. What do analyzers do? (Diagram: source code is analyzed against API rules and security intelligence to produce vulnerabilities)

  5. Translation Mechanics • Translation builds a model of how data flows through various layers • Allows full interoperability of languages (Diagram: source code from the presentation, business, and data layers is translated into a single model)

  6. Translation Example 1. Engine reads the .NET source code and encounters: String URLParameter = Request["URLElement"]; 2. Engine translates the statement into an intermediate language: Object 'URLParameter' declared of type String; Temporary object 't1' declared; 't1' = result of executing the 'Request' object's 'GetElement' method; 'URLParameter' = 't1'; 3. Engine adds the new content to the existing translation of the code
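
To make the intermediate form a little more concrete, here is a minimal sketch of how that single assignment might decompose into IR instructions. The instruction names and the tuple-based representation are invented for illustration and do not correspond to any particular vendor's internal model.

    // Illustrative sketch only: a toy intermediate representation for
    //   String URLParameter = Request["URLElement"];
    // Real engines track far more detail (types, scopes, control flow, call graphs).
    using System;
    using System.Collections.Generic;

    class TranslationSketch
    {
        static void Main()
        {
            var ir = new List<(string Op, string Operands)>
            {
                ("declare", "URLParameter : System.String"),            // the local variable
                ("declare", "t1 : System.String"),                      // temporary for the call result
                ("call",    "t1 = Request.GetElement(\"URLElement\")"), // the indexer access, modeled as a call
                ("assign",  "URLParameter = t1"),                       // copy the temporary into the local
            };

            foreach (var (op, operands) in ir)
                Console.WriteLine($"{op,-8} {operands}");
        }
    }

Modeling the indexer access as an explicit call through a temporary is what later lets the scan phase attach a "dangerous source" rule to that call site.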

  7. Translation Pitfalls • Translation step is not easy • Does the Translator Support the Language? • Are there subtle differences between different versions of a particular language? • How will the user know when translation fails? Potential False Negatives: • Language Versions Not Supported • Translation Incorrect
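
As a concrete illustration of the language-version pitfall above, a translator front end pinned to an older grammar can fail to parse newer syntax. The class below is a made-up example using C# 6+ features (expression-bodied members, string interpolation, null-coalescing) that such a front end might silently skip, leaving untranslated code and therefore potential false negatives.

    // Hypothetical example: modern C# syntax that an outdated translator
    // front end might fail to parse, quietly dropping the code from the model.
    using System;

    class GreetingService
    {
        // Expression-bodied member plus string interpolation (C# 6).
        public string Greet(string name) => $"Hello, {name}";

        // Null-coalescing operator folded into another expression-bodied member.
        public string SafeGreet(string name) => Greet(name ?? "guest");
    }

    class Program
    {
        static void Main() => Console.WriteLine(new GreetingService().SafeGreet(null));
    }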

  8. Translation Solutions Here’s What You Can Do: • Verify that the scanner supports all languages involved in your scan • Ask vendors about roadmaps for languages • Ensure you know how to detect translation failures.

  9. Scan Mechanics (Diagram: rule packs such as ASP.NET, ADO.NET, T-SQL, and Java rules, together with security intelligence, are applied to the model to produce vulnerabilities)

  10. Scan Example 1. Engine translates the .NET source code <%= Request["URLElement"] %> into intermediate language 2. Engine recognizes that the 'Request' object is a dangerous source (dangerous source rule applied to the model) 3. Engine recognizes the dangerous output and declares the presence of XSS (.NET XSS rule)
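
Put back into ordinary application code, the flow that the dangerous-source and XSS rules describe looks roughly like the code-behind below: request data (the source) reaches the response (the sink) without encoding. The page class and method names are invented; the Request, Response, and HttpUtility.HtmlEncode APIs are standard ASP.NET.

    using System;
    using System.Web;

    public partial class SearchPage : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Dangerous source: attacker-controlled request parameter.
            string urlElement = Request["URLElement"];

            // Dangerous sink: the raw value is written into the response -> reflected XSS.
            Response.Write(urlElement);

            // Remediated version: encoding the value before it reaches the sink
            // is the kind of fix that should make the XSS rule stop firing.
            Response.Write(HttpUtility.HtmlEncode(urlElement));
        }
    }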

  11. Scan Pitfalls • Scan step is even trickier than translation • Do rules cover a particular library, API? • Are rules accurately describing the conditions for a vulnerability to exist? • Are the analyzers correctly applying a rule all the time? • Are the rules good at detecting the vulnerabilities you care about? • Are the rules being overly paranoid in describing risk?

  12. Scan Pitfalls Potential False Positives: • Engine models data flow and control flow incorrectly • Engine applies rules incorrectly • Rules identify data sources as untrustworthy and your organization disagrees • Rules don’t take into account the dynamic nature of your code • Old rules

  13. Scan Pitfalls Potential False Negatives: • Code is simply missing and the analyzer never applies rules to it • Rules don’t recognize new methods or classes

  14. Scan Pitfall: False Taint Promotion • Engine lacks enough computing resources to perform a full scan • To compensate, the engine cuts corners during the scan phase and makes broad generalizations about various data structures • Engine reports a large number of false positives
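
A hypothetical example of what false taint promotion can look like in practice: if the engine stops tracking individual dictionary entries to save resources, one tainted entry can promote the entire collection to tainted, and the write of the hard-coded value below gets reported as XSS even though it is safe. The handler and key names are made up for this sketch.

    using System.Collections.Generic;
    using System.Web;

    public class SettingsHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            var settings = new Dictionary<string, string>
            {
                // Tainted entry: attacker-controlled request data.
                ["theme"]   = context.Request["theme"],

                // Safe entry: a hard-coded constant.
                ["version"] = "2.1.0"
            };

            // If the engine has generalized the whole dictionary as tainted,
            // this write of a constant is flagged as XSS -- a false positive
            // caused by false taint promotion.
            context.Response.Write(settings["version"]);
        }

        public bool IsReusable
        {
            get { return false; }
        }
    }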

  15. Scan Pitfall: Philosophical Limitations in Static Analysis • Not Really Suited for Identifying Architectural Issues • Not Ideal for Finding Vulnerabilities in Dynamic Code
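
The "dynamic code" limitation is easiest to see with reflection: in the sketch below the handler type and method are only known at runtime, so a static model generally cannot tell which concrete method the tainted value flows into, or whether that method ends in a dangerous sink. All names here are invented for illustration.

    using System;

    class DynamicDispatchExample
    {
        static void Main(string[] args)
        {
            // The target type and method come from runtime input (here, command-line
            // arguments standing in for a configuration file), so the call graph
            // cannot be resolved statically.
            string typeName   = args.Length > 0 ? args[0] : "MyApp.Handlers.EchoHandler";
            string methodName = args.Length > 1 ? args[1] : "Handle";
            string userInput  = Console.ReadLine();          // dangerous source

            Type handlerType = Type.GetType(typeName);
            if (handlerType == null) return;                 // type not present at runtime

            object handler = Activator.CreateInstance(handlerType);

            // Whether this ends in a dangerous sink depends on code the engine
            // may never connect to this call site.
            handlerType.GetMethod(methodName)?.Invoke(handler, new object[] { userInput });
        }
    }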

  16. Scan Solutions Here’s What You Can Do: • Verify that the scanner uses the latest rules • Verify that rules adequately cover all of the libraries your code may use • Ensure that the engine provides detailed evidence of every vulnerability it reports.

  17. Scan Solutions Here’s What You Can Do: • Contact the product’s technical support when the evidence for a vulnerability is simply wrong • Ensure that the scanner’s rules identify any custom data sources and sinks (see the sketch below) • Examine scan logs to ensure scan failures are not occurring.
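
To see why the custom-source bullet matters, consider a hypothetical in-house wrapper like the one below: stock rules know that Request[...] is untrusted, but if the wrapper is compiled into a library outside the scanned code, the taint can be lost at the GetParam boundary unless a custom source rule tells the engine that its return value is untrusted. All class and method names here are invented.

    using System.Web;

    // Hypothetical in-house wrapper around the request object. If this helper
    // lives in a separate, unscanned library, default rules have no way of
    // knowing that GetParam returns attacker-controlled data.
    public static class RequestFacade
    {
        public static string GetParam(HttpContext context, string name)
        {
            return context.Request[name];
        }
    }

    public class ProfileHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // Without a custom "dangerous source" rule for RequestFacade.GetParam,
            // this tainted flow into the response may go unreported (false negative).
            string displayName = RequestFacade.GetParam(context, "displayName");
            context.Response.Write(displayName);
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }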

  18. Scan Solutions Here’s What You Can Do: • Verify that the engine is including all of its rules when performing a scan • Exclude any data source rules for data sources your organization considers trustworthy • Gather feedback from developers about the accuracy of the results

  19. Reporting Mechanics (Diagram: the engine combines the vulnerabilities with project preferences to produce various reports)

  20. Reporting Example 1. Engine identifies an XSS vulnerability (.NET XSS rule) during the scan 2. User has previously specified a classification scheme (risk and vulnerability grouping) for vulnerabilities 3. Engine produces a PDF report with the XSS finding grouped according to that custom scheme

  21. Reporting Pitfalls Potential Problems: • Report does not take into account the risk appetite of the organization • Reports do not capture useful security metrics. • Vulnerability description / remediation advice not satisfactory

  22. Reporting Solutions Here’s What You Can Do: • Demand to see sample reports from vendors before purchasing the scanner • Verify that the report’s risk assessment strategy is in line with your organization’s risk methodology • Inspect the engine’s capability to customize reports based on security metrics

  23. Reporting Solutions Here’s What You Can Do: • Verify that you can produce reports that reflect your organization’s security metrics • Ask your software developers if they find the reports useful in identifying and fixing the issues

  24. Process Impacts • Vendor Engagement • Code Development • Build • Code Review • QA • Security Auditing • Vulnerability Management • Change Management • Risk Assessment

  25. Process Impacts • Impacts to Processes Are Profound • Where should a scan occur in the SDLC? • How should the results be managed? • Should the organization refuse to release until scans are clean? • How does the organization aggregate the risks? • Does every project get a scan or just some? • How does the organization patch and maintain the scanner?

  26. People Impacts • Vendors • Software Developers • Testers • Security Auditors • Release Engineers • Project Managers • Risk Analysts • Operational Staff

  27. People Impacts • Impacts to People Are Profound • Who’s responsible for running the scan? • Who do we turn to when results look suspicious? • Who verifies that things are getting fixed? • Who agrees to audit the results? • Who accepts the risks of the associated vulnerabilities? • Who maintains the rules? • Who audits the quality of the scans?

  28. Conclusions • Source Code Analyzers are powerful and amazingly complex under the covers • Anyone who tells you they are the complete solution is probably in sales ;-)

  29. Conclusions • Developers – • Education about the scanner is critical to identifying false positives and negatives • Risk Staff – • Verify that the scanner’s method of risk assessment is aligned with yours.

  30. Conclusions • Auditors – • Don’t be overwhelmed by a lot of issues. Chances are good there are a lot of non-issues (risk appetite). • Risk Owners – • Insist that the results have been verified by someone who wrote the code

  31. Contact Info
