at NIST, Building 101, Portrait Room, Gaithersburg, MD, USA,
preceding the Software and Supply Chain Assurance (SSCA) Forum
Software must be developed to have high quality: quality cannot be "tested in." However, auditors, certifiers, and others must assess the quality of the software they receive. "Black-box" software testing cannot realistically find maliciously implanted Trojan horses or subtle errors that have many preconditions. For maximum reliability and assurance, static analysis must be used in addition to good development and testing. Static analyzers are quite capable and are developing quickly, yet developers, auditors, and examiners could use far more capabilities.
The goals of the Static Analysis Tool Exposition (SATE) V are to enable empirical research based on large test sets, to encourage improvements to tools, and to speed adoption of tools by objectively demonstrating their use on production software.
Briefly, participating tool makers run their tools on a set of programs, and researchers led by NIST analyze the tool reports. This workshop is the first chance the public will have to hear SATE V observations and conclusions. For this edition, the set of programs includes five large open-source programs selected for having known (CVE-reported) vulnerabilities, as well as most of the Juliet test suite, almost 90,000 synthetic test cases in C/C++ and Java. We will also recognize sound analyzers through the SATE V Ockham Sound Analysis Criteria.
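To give a sense of the synthetic material, the sketch below is a hypothetical, Juliet-style test case in C (it is not taken from the Juliet suite): each case pairs a flawed "bad" function exhibiting a single weakness (here, a stack-based buffer overflow, CWE-121) with a fixed "good" variant, so tool warnings can be scored automatically against known ground truth.

    /* Hypothetical sketch in the spirit of a Juliet-style test case
     * (not actual Juliet code).  The flawed ("bad") function contains a
     * stack-based buffer overflow (CWE-121); the "good" variant bounds
     * the copy. */
    #include <stdio.h>
    #include <string.h>

    #define DST_SIZE 8

    static void example_bad(void)
    {
        char dst[DST_SIZE];
        const char *src = "this source string is longer than eight bytes";
        strcpy(dst, src);            /* flaw: no bounds check, dst overflows */
        printf("%s\n", dst);
    }

    static void example_good(void)
    {
        char dst[DST_SIZE];
        const char *src = "this source string is longer than eight bytes";
        strncpy(dst, src, DST_SIZE - 1);  /* fix: copy at most DST_SIZE - 1 bytes */
        dst[DST_SIZE - 1] = '\0';         /* ensure null termination */
        printf("%s\n", dst);
    }

    int main(void)
    {
        example_good();
        /* example_bad() is the variant a static analyzer should flag. */
        return 0;
    }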
This workshop has two goals. The first is to gather participants and organizers of SATE to share experiences, report interesting observations, and discuss lessons learned. The workshop is also an opportunity for attendees to help shape the next exposition, SATE VI.
The second goal is to convene researchers, tool developers, and government and industrial users of software assurance tools to define obstacles to urgently needed software assurance capabilities and to identify engineering or research approaches to overcome them.
This workshop follows similar workshops for SATE IV, SATE 2010, SATE 2009, and SATE 2008 (at SAW), the Static Analysis Summit II (at SIGAda 2007), and the first Static Analysis Summit in 2006.
Those who develop, use, purchase, or review software assurance tools and have an interest in the details of tool performance should attend. Academics working on semi- or fully automated tools to review or assess the security properties of software are especially welcome. We encourage participation from researchers, students, developers, and assurance tool users in industry, government, and universities.
The program consists of presentations by participants in and organizers of Static Analysis Tool Exposition (SATE) V.
9:00 AM Welcome to SATE V, Paul E. Black, NIST, organizer
9:10 The Experience of Red Lizard Software, Franck Cassez, NICTA, participant
9:25 SATE V background, Vadim Okun, NIST, organizer
9:50 Parasoft's Experience, Arthur Hicken, Parasoft, participant
10:20 break
10:30 Synthetic Test Cases (Juliet) Analysis Results, Aurelien Delaitre, NIST, organizer
11:00 HP Fortify Experience with SATE V, Yekaterina Tsipenyuk O'Neil and Lu Zhao, HP Fortify, participant
11:30 AM lunch
12:30 PM SATE V Ockham Sound Analysis Criteria, Paul E. Black, NIST, organizer
1:00 The Tool Developer and the SWAMP, James Kupsch, University of Wisconsin-Madison, organizer
1:30 From the Juliet test suite to a real-world SSL implementation: The case for sound static analyzers, Pascal Cuoq, CEA, participant
1:45 break - commemorate π day at 1:59
2:00 Coverity Results and Experiences for SATE V, Peter Henriksen, Coverity, participant
2:30 CVE-Selected Analysis Results, Bertrand Stivalet, NIST, organizer
3:00 Where's My Flying Monoid?, Nathan Ryan, Buguroo, participant
3:30 Static Analysis in the Federal Government, John Keane, OSD, organizer
4:00 Discussion: planning the next SATE, Elizabeth Fong, NIST, organizer
4:30 PM finish
We will announce the new name of the SAMATE Reference Dataset (SRD) during the welcome.
Paul E. Black (NIST), paul.black [at] nist.gov
Elizabeth Fong (NIST), efong [at] nist.gov
Edward Bonver (Symantec)
Thomas Hurt (AT&L/R&E/DASD(Systems Engineering))
John Keane (OSD)
Michael Lowry (NASA)
Mihaela Mattes (Oracle)
Murali Somanchy (Qualcomm)
Todd Wilson (AMRDEC)
Victor Winter (U Nebraska-Omaha)