CONTENTS
PART Ⅰ A GUIDE TO TOTAL SOFTWARE QUALITY CONTROL
Volume 1 Overview of Techniques and the Planning Process
1. INTRODUCTION = 3
1.1 Purpose of the Guide = 3
1.2 Scope of the Guide = 3
1.3 Organization and Content of the Guide = 4
1.4 What is Software Quality? = 4
1.5 What is Software Quality Control? = 7
1.6 Why is it Difficult to Control Software Quality? = 7
1.7 What Does Quality Control Cost? = 8
1.8 Why is it Important to Control Software Quality? = 8
1.9 Who Controls Software Quality? = 9
1.10 Who Has Been Responsible for Software Quality in the Past? = 10
1.11 When Should Software Quality Control be Applied? = 10
1.11.1 Basic Phases of a System Life Cycle = 11
2. MODEL FOR SOFTWARE QUALITY CONTROL = 13
2.1 Introduction = 13
2.2 General Approaches to Software Quality Control = 13
2.2.1 Rome Laboratory's Software Quality Framework = 13
2.2.2 Goal Question Metric Paradigm = 17
2.2.3 Risk Management Model = 18
2.2.4 The Spiral Model of Software Development = 19
2.2.5 The Plan-Do-Check-Act Model of Quality Control = 19
2.3 The TSQC Model = 20
2.3.1 The Basic Elements of the TSQC Model = 20
2.3.2 What are the TSQC Parameters? = 22
2.3.2.1 Products = 22
2.3.2.2 Processes = 24
2.3.2.3 Resources = 25
2.3.3 The TSQC Model in the System Life Cycle = 26
2.3.3.1 Predevelopment = 27
2.3.3.2 Development = 28
2.3.3.3 Post-Deployment Support = 29
2.4 How Does the TSQC Model Differ from the RSQF? = 29
3. SOFTWARE QUALITY CONTROL TECHNIQUES = 31
3.1 Introduction = 31
3.2 Characteristics of Quality Control Techniques and Measures = 31
3.2.1 Life Cycle Phase = 31
3.2.2 Type of Control = 31
3.2.3 Effect on Quality Control Parameters = 32
3.3 Quality Control Issues and Techniques = 32
3.3.1 What are the Quality Requirements for the Final Product? = 33
3.3.1.1 Operations Concept Document = 33
3.3.1.2 RFP Preparation and Review = 33
3.3.1.3 Prototyping = 33
3.3.2 Who Will be the Developer? = 33
3.3.2.1 RFP Preparation and Review = 34
3.3.2.2 The Software Engineering Institute Software Capability Evaluation = 35
3.3.2.3 Software Development Capability/Capacity Review = 35
3.3.2.4 Software Engineering Exercise = 35
3.3.3 What Can be Done To Prevent Quality Defects? = 36
3.3.3.1 Standards = 36
3.3.3.2 Software Engineering Prototype = 37
3.3.3.3 Configuration Management = 38
3.3.3.4 Performance Engineering = 39
3.3.3.5 Software Engineering Environments = 39
3.3.3.6 Reuse = 40
3.3.4 How Can Quality be Checked? = 41
3.3.4.1 Reviews and Audits = 41
3.3.4.2 Independent Verification and Validation = 42
3.3.4.3 Requirements Verification Matrix = 43
3.3.4.4 Software Quality Assurance = 43
3.3.4.5 Testing = 43
3.3.4.6 Reliability Modeling = 44
3.3.5 What Kinds of Information Should be Available at Checkpoints? = 44
3.3.5.1 Requirements Traceability Matrix = 45
3.3.5.2 Metrics = 45
3.3.5.3 Software Problem Reports Analysis = 46
3.3.5.4 Software Development Files = 46
3.3.6 What Can the Contractor Do to Improve the Process and Resources? = 47
3.3.6.1 Cause and Effect Analysis = 47
3.3.6.2 SEI Self-Assessment = 47
3.4 Characterization of the Quality Control Techniques = 47
3.4.1 Applicability of the Techniques = 48
3.4.2 Type of Control = 48
3.4.3 Relationship to the Quality Control Parameters = 48
4. APPLYING TOTAL SOFTWARE QUALITY CONTROL = 54
4.1 Introduction = 54
4.2 A Review of the TSQC Planning Process = 55
4.3 An Overview of the TSQC Planning Process = 55
4.4 The Total Software Quality Control Plan = 56
4.5 Planning Prior to Development = 57
4.5.1 Defining the Software Quality Requirements = 59
4.5.2 Identifying Constraints = 59
4.5.3 Identifying Risks = 60
4.5.3.1 Sources and Causes of Risk = 60
4.5.3.2 Assessing Risks = 62
4.5.4 Selecting Software Quality Control Techniques and Activities = 65
4.5.4.1 Using a Core Set of Techniques = 66
4.5.4.2 Selecting Additional Techniques to Control Risk = 66
4.5.5 Applying the Techniques = 70
4.5.6 Refining the Selections = 70
4.5.6.1 Reviewing Controls Over the Developer = 71
4.5.6.2 Tailoring the Application of Techniques = 72
4.5.6.3 Selecting the Level of Application of Techniques = 72
4.5.6.4 Selecting Among Alternative Techniques = 73
4.5.7 Planning Checkpoints = 73
4.5.7.1 Planning the Acquisition Strategy = 74
4.5.7.2 Planning Information Requirements at Checkpoints = 75
4.5.7.3 Reviewing and Refining Checkpoints = 75
4.5.8 Documenting an Overview of the TSQC Process = 77
4.5.9 Making Detailed Plans for TSQC Activities = 78
4.6 Updating the TSQC Plan = 78
4.6.1 After Developer Selection = 78
4.6.2 During Development = 79
5. SUMMARY AND CONCLUSIONS = 80
5.1 Summary = 80
5.2 Conclusions = 80
5.2.1 What is the Current Status of Total Software Quality Control? = 80
5.2.2 What is Needed to Gain Greater Control Over Software Quality? = 81
5.2.3 What Can be Done Now? = 82
List of References = 82
APPENDIX A : QUALITY GOALS, QUESTIONS, METRICS, REQUIREMENTS = 85
A.1 Efficiency = 85
A.2 Integrity = 87
APPENDIX B : RISK ASSESSMENT QUESTIONNAIRE = 90
B.1 Risk Areas = 90
B.2 Use of Questionnaire = 91
B.3 Requirements Risk Area = 91
B.4 Development and Test = 93
B.5 Maintenance Planning = 94
B.6 Government = 94
B.7 Developer = 95
B.8 Schedule and Budget = 96
GLOSSARY = 98
PART Ⅰ
Volume 2 Descriptions of Individual Techniques
1. INTRODUCTION = 101
1.1 Purpose of the Guide = 101
1.2 Scope of the Guide = 101
1.3 Organization and Content of the Guide = 102
2. CAUSE AND EFFECT ANALYSIS = 104
3. CONFIGURATION MANAGEMENT = 111
4. INDEPENDENT VERIFICATION AND VALIDATION = 119
5. INSPECTIONS = 126
6. PERFORMANCE ENGINEERING = 131
7. PROTOTYPING = 140
8. RELIABILITY MODELING = 145
9. REQUIREMENTS TRACEABILITY = 152
10. RFP PREPARATION AND REVIEW = 158
11. SEI SOFTWARE CAPABILITY EVALUATION = 164
12. SOFTWARE AUDIT = 171
13. SOFTWARE DESIGN METRICS = 177
14. SOFTWARE DEVELOPMENT CAPABILITY/CAPACITY REVIEW = 182
15. SOFTWARE DEVELOPMENT FILES = 186
16. SOFTWARE ENGINEERING EXERCISE = 189
17. SOFTWARE ENGINEERING PROTOTYPE = 195
18. SOFTWARE MANAGEMENT METRICS = 200
19. SOFTWARE QUALITY ASSURANCE = 210
20. SOFTWARE PROBLEM REPORT ANALYSIS = 215
21. STANDARDS = 223
22. TESTING = 230
GLOSSARY = 237
PART Ⅱ SOFTWARE ERROR ANALYSIS
EXECUTIVE SUMMARY = 240
1. OVERVIEW = 242
1.1 Definitions = 243
2. INTRODUCTION TO SOFTWARE ERROR ANALYSIS = 245
2.1 Cost Benefits of Early Detection = 246
2.2 Approach to Selecting Error Analysis Techniques = 247
3. TECHNIQUES FOR DETECTING ERRORS = 248
3.1 Classes of Error Detecting Techniques = 248
3.2 Techniques Used During the Lifecycle = 249
3.2.1 Requirements = 251
3.2.2 Design = 252
3.2.3 Implementation = 252
3.2.4 Test = 254
3.2.5 Installation and Checkout = 255
3.2.6 Operation and Maintenance = 255
3.3 Benefits of Classes of Error Detection Techniques = 255
4. REMOVAL OF ERRORS = 258
4.1 Identification = 258
4.2 Investigation = 260
4.3 Resolution = 261
4.3.1 Resolution Plan = 261
4.3.2 Resolution Action = 261
4.3.3 Corrective Action = 262
4.3.4 Follow-Up = 262
4.4 Use of Individual Error Data = 262
5. TECHNIQUES FOR THE COLLECTION AND ANALYSIS OF ERROR DATA = 263
5.1 Error History Profile / Database = 263
5.2 Data Collection Process = 264
5.3 Metrics = 266
5.3.1 Metrics Throughout the Lifecycle = 268
5.3.1.1 Metrics Used in All Phases = 268
5.3.1.2 Requirements Metrics = 270
5.3.1.3 Design Metrics = 271
5.3.1.4 Implementation Metrics = 273
5.3.1.5 Test Metrics = 275
5.3.1.6 Installation and Checkout Metrics = 278
5.3.1.7 Operation and Maintenance Metrics = 278
5.4 Statistical Process Control Techniques = 279
5.4.1 Control Chart = 280
5.4.2 Run Chart = 283
5.4.3 Bar Graph = 283
5.4.4 Histogram = 286
5.4.5 Scatter Diagram = 288
5.4.6 Method of Least Squares (Regression Technique) = 289
5.5 Software Reliability Estimation Models = 290
6. SUMMARY = 295
7. REFERENCES = 297
APPENDIX A : ERROR DETECTION TECHNIQUES = 303
A.1 Algorithm Analysis = 303
A.2 Back-to-Back Testing = 303
A.3 Boundary Value Analysis = 304
A.4 Control Flow Analysis / Diagrams = 305
A.5 Database Analysis = 305
A.6 Data Flow Analysis = 305
A.7 Data Flow Diagrams = 306
A.8 Decision Tables (Truth Tables) = 307
A.9 Desk Checking (Code Reading) = 308
A.10 Error Seeding = 309
A.11 Finite State Machines = 310
A.12 Formal Methods (Formal Verification, Proof of Correctness, Formal Proof of Program) = 310
A.13 Information Flow Analysis = 311
A.14 (Fagan) Inspections = 311
A.15 Interface Analysis = 312
A.16 Interface Testing = 313
A.17 Mutation Analysis = 314
A.18 Performance Testing = 315
A.19 Prototyping / Animation = 315
A.20 Regression Analysis and Testing = 316
A.21 Requirements Parsing = 316
A.22 Reviews = 317
A.23 Sensitivity Analysis = 317
A.24 Simulation = 318
A.25 Sizing and Timing Analysis = 319
A.26 Slicing = 320
A.27 Software Sneak Circuit Analysis = 320
A.28 Stress Testing = 321
A.29 Symbolic Execution = 322
A.30 Test Certification = 322
A.31 Traceability Analysis (Tracing) = 323
A.32 Walkthroughs = 324
APPENDIX B : ERROR ANALYSIS TECHNIQUES CITED IN SOFTWARE STANDARDS = 325
PART Ⅲ INCREASING SOFTWARE CONFIDENCE : WHERE WE'RE HEADED IN SOFTWARE TESTING TECHNOLOGY
1. INTRODUCTION = 344
2. BACKGROUND = 344
3. TESTING STRATEGIES = 346
3.1 Mutation Testing = 346
3.2 Decision-to-Decision Path (DD-Path) Testing - Branch Testing = 347
4. AUTOMATED TESTING TOOLS = 349
4.1 MOTHRA = 350
4.2 RXVP80 = 353
5. THE TEST PROGRAM = 354
6. CORRECT PROGRAM = 356
6.1 MOTHRA - Statement Analysis Mutants = 356
6.2 MOTHRA - Predicate and Domain Mutants = 356
6.3 RXVP80 = 363
7. DOMAIN ERROR = 363
7.1 MOTHRA - Statement Analysis Mutants = 363
7.2 MOTHRA - Predicate and Domain Mutants = 368
8. MISSING STATEMENT ERROR = 377
8.1 MOTHRA = 377
8.2 RXVP80 = 375
9. COMPUTATION ERROR = 384
9.1 MOTHRA = 384
9.2 RXVP80 = 389
10. CONCLUSION = 389