3.2 Work Products and Static Analysis
Key Takeaways
- Almost any readable work product can be reviewed, including requirements, code, models, testware, contracts, and project documents.
- Static analysis needs a structured work product that a tool can parse or check against rules.
- Static analysis is commonly used for feedback on code quality, maintainability, security, style, and complexity.
- Not every work product is suitable for every static technique; the technique must fit the product.
- CTFL questions often separate manual review from tool-supported analysis.
What Can Be Examined
A work product is any artifact produced during software development, testing, operation, or maintenance. Requirements specifications, user stories, acceptance criteria, product backlog items, design diagrams, architecture descriptions, source code, test plans, test cases, test charters, traceability matrices, contracts, and project documents can all be candidates for static testing.
The key question for a review is simple: can people read and understand the work product well enough to evaluate it? If yes, a review can usually help. A business representative may review a user story for value and clarity. A tester may review acceptance criteria for testability. A developer may review design notes for feasibility.
Static analysis is narrower. A tool needs structure. Source code, configuration files, formal models, structured text, and documents with checkable rules are good candidates. An informal sketch on a whiteboard may support a conversation, but it is not usually suitable for automated static analysis unless it is turned into a structured model.
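The structural requirement can be shown with a minimal sketch: a static-analysis tool first parses the work product into a form it can inspect, which only works because source code has formal syntax. Here Python's standard `ast` module parses a snippet and applies a trivial rule check:

```python
import ast

# Source code is a structured work product: it has formal syntax,
# so a tool can parse it into a tree and check rules against it.
# Free-form prose or a whiteboard sketch offers no such structure.
code = "def total(xs):\n    return sum(xs)\n"
tree = ast.parse(code)  # succeeds because the syntax is formal

# A trivial "rule": count the function definitions in the tree.
functions = [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
print(len(functions))  # 1
```

The same `ast.parse` call would raise a `SyntaxError` on unstructured text, which is exactly why informal artifacts need to be turned into structured models before automated analysis can apply.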
| Work product | Review fit | Static analysis fit |
|---|---|---|
| User story with acceptance criteria | Strong | Limited unless rules are encoded |
| Source code | Strong | Strong |
| Architecture model | Strong | Strong if model syntax is formal |
| Test cases | Strong | Possible for coverage, format, or traceability |
| Contract or regulation | Strong | Limited unless text mining or rules are defined |
| Third-party executable only | Weak | Usually inappropriate without rights and tooling |
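The "limited unless rules are encoded" entries in the table can be made concrete. As a hypothetical example (the pattern and function name below are illustrative, not a standard tool), a team could encode a Given/When/Then rule for acceptance criteria, turning a normally review-only artifact into something a script can check:

```python
import re

# Hypothetical encoded rule: an acceptance criterion must follow
# the Given/When/Then shape before a tool can flag deviations.
PATTERN = re.compile(r"^Given .+ [Ww]hen .+ [Tt]hen .+", re.DOTALL)

def check_criterion(text: str) -> bool:
    """Return True if the criterion matches the encoded rule."""
    return bool(PATTERN.match(text.strip()))

print(check_criterion(
    "Given a logged-in user, when they log out, then the session ends."))  # True
print(check_criterion(
    "The user should be able to log out."))  # False
```

A failed check here is an anomaly to raise in a review, not proof the story is wrong; the rule only captures form, not value or clarity.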
Static analysis can reveal issues that are awkward to expose with runtime tests. Examples include unreachable code, undeclared variables, duplicated code, excessive complexity, inconsistent naming, dependency rule violations, unsafe patterns, and some security vulnerabilities. It can also help with maintainability by showing code that is hard to change or understand.
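One of the classic findings listed above, unreachable code, can be detected without ever running the program. A minimal sketch using Python's `ast` module (a simplified version of what linters such as pylint do):

```python
import ast

# Any statement that follows a return in the same block can
# never execute; a static tool finds this without running the code.
source = """
def discount(price):
    return price * 0.9
    print("never runs")
"""

tree = ast.parse(source)
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        # Look at each statement and its successor within the body.
        for stmt, nxt in zip(node.body, node.body[1:]):
            if isinstance(stmt, ast.Return):
                print(f"unreachable code at line {nxt.lineno}")
```

Note that the `print` call is never executed by the analysis; the tool only inspects the parsed structure, which is what distinguishes this from dynamic testing.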
Static analysis findings are not automatically defects. A tool may report an anomaly, warning, or rule violation that requires human judgment. For example, a complexity warning may be acceptable in generated code but serious in a safety-critical decision module. The exam may use "anomaly" as a broader term than "defect."
A good CTFL answer fits the technique to the artifact. Reviewing a requirement for ambiguity is static testing. Running a linter on source code is static analysis. Executing the program to see whether the requirement is met is dynamic testing. Asking whether a tester can understand a work product is often the clue that a review is appropriate.
Practice Questions
- Which work product is the best candidate for automated static analysis?
- Which items are typical work products that can be examined through static testing? Select all that apply.