Evidence-Based Security Assessment BoF presented at ShmooCon 2005

by Crispin Cowan (Immunix), Adam Shostack, Al Potter (ICSA Labs), Ed Reed,

Tags: Forensics

URL: http://web.archive.org/web/20050404000611/www.shmoocon.org/program.html

Summary: How to decide "Is this thing secure?" is a tough problem. It is a lot tougher than most naive security product consumers think it is. Issues like "what threats are you considering?" and "how much is /insecurity/ costing you?" make it tougher. It is also a lot tougher than most security professionals think it is; a reduction from Alan Turing's Halting Problem shows that /automatic/ assessment of system security is undecidable in general, and so the question of "is this thing secure?" will always involve human intervention.
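
A minimal sketch of that undecidability argument, under assumptions of our own (the is_secure oracle, the wrapper construction, and all names below are illustrative, not from the abstract): if a perfect automatic security checker existed, wrapping any program so that an insecure action becomes reachable only after the program halts would turn that checker into a halting-problem decider, which Turing showed cannot exist.

    # Sketch only: assumes a hypothetical perfect oracle is_secure(src) that
    # returns True iff the program in src can never reach an insecure state.
    import textwrap

    def build_wrapper(program_src: str, program_input: str) -> str:
        """Build a program that is insecure exactly when P(x) halts."""
        body = textwrap.indent(program_src, "    ")
        return (
            "import subprocess\n"
            "def P(x):\n"
            f"{body}\n"
            f"P({program_input!r})\n"
            "# The insecure action below is reachable only if P(x) halted:\n"
            'subprocess.call(["/bin/sh"])\n'
        )

    def decides_halting(is_secure, program_src: str, program_input: str) -> bool:
        # A correct, total is_secure would make this a halting-problem
        # decider; no such decider exists, so no such oracle exists.
        return not is_secure(build_wrapper(program_src, program_input))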

Unfortunately, the human approach to assessing security to date has also been sadly lacking. At the formal/government end, we have the Orange Book (TCSEC) and the Common Criteria. Having discovered that assessing /actual/ security is hard, these procedures instead produce very expensive piles of documentation of how hard the vendor /tried/ to provide security. A system can be Common Criteria certified with a mountain of documentation, and have a remote root exploit come out the next day. At the informal/hax0r end, we have random penetration testing by "the community", ideally with full disclosure, and sometimes forensic examination of compromised systems. Here the occasional disclosure of a vulnerability definitively shows a product or system to be *insecure*, but we /never/ get any assurance of security, and can only infer security from long silence.

We propose a panel on a new approach to assessing security: evidence-based security assessment. It's time to seek security expressed in disprovable hypotheses, and experiments designed to test them. This is the heart of the scientific method, and it's time to apply it to security. Is this product or that one more secure? Is that "best practice" really better? Can your CISSP style stand up to the fury of our drunken master style? We will talk about how broad theories are better than narrow ones, and how simple tests are better than complex ones, allowing us to move to more interesting hypotheses and proofs than "This is secure; 0wn.c; patch; goto 10".
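
As one hedged illustration (ours, not the panel's) of a disprovable hypothesis and a simple experiment to test it, the sketch below assumes placeholder artifacts: a plain and a hardened build of the same vulnerable program, plus an exploit script. The hypothesis "the hardened build stops the exploit that defeats the plain build" is either falsified by the run, or merely survives it; surviving is not proof of security.

    # Sketch only: the binary paths, exploit script, and exit-code convention
    # are placeholders, not anything presented by the panelists.
    import subprocess

    def exploit_succeeds(binary: str, exploit: str = "./0wn.py") -> bool:
        """Run the exploit against a target; exit code 0 means it got a shell."""
        result = subprocess.run([exploit, binary], capture_output=True)
        return result.returncode == 0

    def test_mitigation(baseline: str = "./vuln-plain",
                        hardened: str = "./vuln-hardened") -> None:
        # The experiment is informative only if the exploit defeats the
        # unmitigated baseline; otherwise it tests nothing.
        assert exploit_succeeds(baseline), "exploit fails even without the mitigation"
        if exploit_succeeds(hardened):
            print("hypothesis falsified: the mitigation did not stop the exploit")
        else:
            print("hypothesis survives this test (not proven secure, just not disproven)")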

It's time to compare and contrast. It's time to test. It's time to demand evidence-based security. This panel will feature speakers presenting the world's fastest re-introduction to the scientific method, followed by the underlying hypotheses behind other approaches to security and ways to test them. We'll also show some examples of how to use evidence-based approaches to test a variety of technologies that are out there today.

Al Potter: Security Evaluator