1 What We Want of a Taxonomy
1.1 What Do We Mean by Tool, Function, etc.?
1.2 Classification Scheme Desiderata
1.3 Questions a Taxonomy Should Address
2 A Taxonomy of Tools
2.1 Life Cycle Process or Activity
2.2 Automation Level
2.3 Approach
2.4 Viewpoint
3 Other Useful Data
3.1 Assessment vs. Development
3.2 Sponsor
3.3 Price
3.4 Platforms
3.5 Languages/Formats
3.6 Assessment/Quality
3.7 Run time
4 References
1 What We Want of a Taxonomy

The SAMATE project needs a taxonomy, or classification, of software assurance (SwA) tools and techniques to organize its Tool Survey and to make discussion of tool classes and capabilities precise.
1.1 What Do We Mean by Tool, Function, etc.?

Here we use "tool" to mean a single distinct function or (manual) technique. "Function" means something that produces a (software assurance) result for the user. A Source Code Security Analysis tool looking for weaknesses is a function. A parser is not (unless it, too, reports flaws while parsing).
1.2 Classification Scheme Desiderata

As far as possible, a classification scheme should be
1.3 Questions a Taxonomy Should Address

What questions need to be answered to complete the SA Tool/Function taxonomy?
Regarding Tool Classes
Regarding Tool Capabilities
One way to validate a taxonomy is to try it against actual classes of tools; the Tool Survey does this.
2 A Taxonomy of Tools

This is a proposed taxonomy. We welcome your comments and suggestions.
This taxonomy is a faceted classification, possibly with further hierarchical organization within each class. There are four facets: life cycle process, automation level, approach, and viewpoint.
2.1 Life Cycle Process or Activity

Primary tools and techniques are used at different times in the software life cycle. Supporting tools and techniques, such as management and configuration tools, apply across the life cycle. This organization unifies IEEE/EIA 12207-1996 [1], Kornecki and Zalewski [2], and SWEBOK 2004 [3].
The notation [n a.b.c] means section a.b.c of reference n below.
Primary Processes

Requirements [2] [3 1.1] [1 5.3.2 & 5.3.4]
Design [2] [3 1.2] [1 5.3.3, 5.3.5, & 5.3.6]
Implementation [2] [3 1.3] [1 5.3.7]
Acquisition [1 5.1]
Maintenance [1 5.5] [3 1.5]
Testing [2] [3 1.4 & 1.9] [1 5.3.7 - 5.3.11]
Operation [1 5.4]
Supporting Processes
SWEBOK [3] lists other categories: Miscellaneous Tools and Software Engineering Methods (Heuristic, Formal, and Prototyping).
2.2 Automation Level

How much does the tool do by itself, and how much does the human need to do? (A code sketch of these levels follows the list.)
0. Manual procedure
1. Analysis aid
2. Semi-automated
3. Automated
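
The list above maps naturally onto an ordered enumeration. This is a minimal Python sketch; the names and the example tools in the comments are illustrative, not part of the taxonomy itself:

    from enum import IntEnum

    class AutomationLevel(IntEnum):
        """How much the tool does by itself; 0 means the human does everything."""
        MANUAL = 0          # manual procedure, e.g., a review checklist
        ANALYSIS_AID = 1    # helps a human analyze, e.g., a call-graph browser
        SEMI_AUTOMATED = 2  # runs on its own but needs human guidance or triage
        AUTOMATED = 3       # runs unattended and reports results

    # IntEnum preserves the ordering, so levels can be compared directly:
    assert AutomationLevel.ANALYSIS_AID < AutomationLevel.AUTOMATED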
2.3 Approach

What approach does this tool or technique take to software assurance?
2.4 Viewpoint

Can we see or "poke at" the internals? External tools do not have access to application software code or configuration and audit data. Internal tools do.
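
Putting the four facets together, a classified tool simply carries one value per facet. The sketch below (Python; all names are illustrative, and it reuses AutomationLevel from the sketch above) leaves the approach facet as free text, since this section does not enumerate approach values:

    from dataclasses import dataclass
    from enum import Enum

    class LifeCycleProcess(Enum):
        ACQUISITION = "acquisition"
        REQUIREMENTS = "requirements"
        DESIGN = "design"
        IMPLEMENTATION = "implementation"
        TESTING = "testing"
        OPERATION = "operation"
        MAINTENANCE = "maintenance"

    class Viewpoint(Enum):
        EXTERNAL = "external"  # no access to code, configuration, or audit data
        INTERNAL = "internal"  # can see and "poke at" the internals

    @dataclass
    class ToolClassification:
        """A faceted classification: one value per facet, no single hierarchy."""
        process: LifeCycleProcess
        automation: AutomationLevel   # from the sketch above
        approach: str                 # approach values are not enumerated here
        viewpoint: Viewpoint

    # Example: an automated source code security analyzer.
    analyzer = ToolClassification(
        process=LifeCycleProcess.IMPLEMENTATION,
        automation=AutomationLevel.AUTOMATED,
        approach="detect weaknesses",
        viewpoint=Viewpoint.INTERNAL,
    )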
3 Other Useful Data

Tools would not be classified by the following attributes (one wouldn't separate commercial from academic tools functionally), but such information would be useful.
"DO-178B differentiates between verification tools that cannot introduce errors but may fail to detect them and development tools whose output is part of airborne software and thus can introduce errors." [2, page 19] (Emphasis in the original.)
3.2 Sponsor

Who fixes it? Can I get it?
3.3 Price

Cost of use is a related, but separate, item.
3.4 Platforms

What does it run on? Linux, Windows, Solaris, ...
3.5 Languages/Formats

What is the target language or format? C++, Java, bytecode, UML, ...
3.6 Assessment/Quality

How well does it work? Number of bugs found. Number of false alarms. Tool pedigree. Maturity of the tool. Performance on benchmarks.
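
Benchmark performance, for instance, can be summarized with standard detection metrics. A minimal Python sketch, with hypothetical flaw identifiers and numbers (not drawn from any real benchmark):

    def detection_metrics(reported, known_flaws):
        """Precision and recall of a tool's reports against a benchmark's
        known flaw locations (both given as sets of identifiers)."""
        true_positives = len(reported & known_flaws)
        precision = true_positives / len(reported) if reported else 0.0
        recall = true_positives / len(known_flaws) if known_flaws else 0.0
        false_alarms = len(reported - known_flaws)
        return precision, recall, false_alarms

    # Hypothetical run: the tool reports 4 sites; 3 are real flaws out of 5 known.
    p, r, fa = detection_metrics({"f1", "f2", "f3", "x9"},
                                 {"f1", "f2", "f3", "f4", "f5"})
    print(p, r, fa)  # 0.75 0.6 1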
3.7 Run time

How long does the tool take per unit (LOC, module, requirement)? Is it quick enough to run after every edit? Every night? Every month? For manual methods, how often are, say, reviews held? Is it scalable?
Computational complexity might be a separate item or a way of quantifying run time.
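
One way to tie run time to complexity, as suggested above, is to fit a power law to measured (size, time) pairs: the slope of log(time) against log(size) estimates the complexity exponent. A minimal Python sketch with made-up measurements:

    import math

    def complexity_exponent(sizes, times):
        """Least-squares slope of log(time) vs. log(size); a slope near 1
        suggests linear scaling, near 2 quadratic, and so on."""
        xs = [math.log(s) for s in sizes]
        ys = [math.log(t) for t in times]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    # Made-up measurements: KLOC analyzed vs. seconds of run time.
    # Prints a slope near 1, i.e., roughly linear scaling.
    print(complexity_exponent([10, 50, 100, 500], [2.1, 11.0, 23.5, 130.0]))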
4 References

[1] IEEE/EIA Std 12207.0-1996, Software Life Cycle Processes.
[2] Andrew J. Kornecki and Janusz Zalewski, "The Qualification of Software Development Tools From the DO-178B Certification Perspective," CrossTalk, pp. 19-23, April 2006.
[3] Guide to the Software Engineering Body of Knowledge (SWEBOK), Chapter 10, 2004. Accessed January 2015.