Ian Molyneaux's book "The Art of Application Performance Testing" also makes a good attempt at defining the process... but is it an art or is it a science? A good question for another day, perhaps.
Unlike Microsoft, I broke down the process into 9 stages as follows:
- Test Project Initiation
- Develop and Review Non-Functional Requirements and Design
- Test Environment Planning and Design
- Plan and Design Tests
- Test Environment Configuration and Setup
- Implement the Test Design
- Performance Test Execution
- Test Analysis and Reporting
- Test Closure Activities

**A. Test Project Initiation**

|   | Process | Aim |
| --- | --- | --- |
| A1 | Impact Assessment | Assess the impact of the change or development with regard to performance and capacity risks, testing timescales, and the high-level test approach. |
| A2 | Proof of Concept | Technical evaluation of the performance test tool against the target application: identify scripting and data requirements, assess the scripting effort, etc. |

**B. Develop and Review Non-Functional Requirements and Design**

|   | Process | Aim |
| --- | --- | --- |
| B1 | NFR Development and Review | Contribute to the development, review, and sign-off of the non-functional system requirements (including business NFRs). |
| B2 | Design Specification Development and Review | Contribute to the development and review of the design specifications. |

**C. Test Environment Planning and Design**

|   | Process | Aim |
| --- | --- | --- |
| C1 | Environment Planning and Design | Identify the physical test environment and the production environment, as well as the tools and resources available to the test team. The physical environment includes hardware, software, and network configurations. |

**D. Plan and Design Tests**

|   | Process | Aim |
| --- | --- | --- |
| D1 | Performance Test Planning | Present all of the information necessary to plan and control the performance test effort. The plan describes the approach to performance testing and is the top-level document directing the test effort. |
| D2 | Performance Test Design | Identify the key scenarios and workload models, determine the test data required, and define the metrics to be collected, turning the test plan into specific, repeatable test designs. |
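
To make D1 and D2 a little more concrete, here is a minimal sketch of the arithmetic that typically sits behind a workload model, translating a throughput target from the NFRs into virtual-user and pacing figures. All of the numbers are illustrative assumptions, not figures from any real requirement.

```python
# Hypothetical workload model for a single business transaction.
# Every value here is an assumption for illustration only.

target_tph = 3600        # target transactions per hour taken from the NFRs
avg_response_s = 2.0     # expected end-to-end response time per transaction
think_time_s = 28.0      # scripted think time between transactions

# Each virtual user completes one transaction every (response + think) seconds.
pacing_s = avg_response_s + think_time_s
tx_per_user_per_hour = 3600 / pacing_s

# Users needed to hit the target throughput at that pacing.
required_vusers = target_tph / tx_per_user_per_hour

print(f"Pacing per user: {pacing_s:.0f}s -> {tx_per_user_per_hour:.0f} tx/user/hour")
print(f"Virtual users required for {target_tph} tph: {required_vusers:.0f}")
```

Run per business transaction, this kind of calculation is usually enough to sanity-check whether the licensed virtual-user count and injector capacity will cover the planned load.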

**E. Test Environment Configuration and Setup**

|   | Process | Aim |
| --- | --- | --- |
| E1 | Configure the Test Environment | Prepare the test environment, tools, and resources necessary to execute performance tests as features and components become available. Ensure that the test environment is instrumented for resource monitoring and analysis. |
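
As an illustration of the instrumentation point in E1, the sketch below polls basic host counters and writes them to a CSV for later correlation with test results. It assumes the cross-platform psutil library is available; in practice you would more likely rely on the test tool's own monitors or on agents such as perfmon, sar, or nmon.

```python
# Minimal resource-monitoring sketch (assumes `pip install psutil`).
import csv
import time

import psutil

def monitor(out_file="resource_usage.csv", interval_s=5, samples=12):
    """Append CPU and memory utilization samples to a CSV for later analysis."""
    with open(out_file, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "cpu_percent", "mem_percent"])
        for _ in range(samples):
            writer.writerow([
                time.strftime("%Y-%m-%d %H:%M:%S"),
                psutil.cpu_percent(interval=interval_s),  # blocks for interval_s
                psutil.virtual_memory().percent,
            ])

if __name__ == "__main__":
    monitor()
```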

**F. Implement the Test Design**

|   | Process | Aim |
| --- | --- | --- |
| F1 | Develop Test Assets | Develop the performance test in accordance with the test design: create versioned and reviewed test scripts, test data, etc. |
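
F1 does not mandate a particular tool. Purely as an illustration of the kind of asset that comes out of this stage, here is a tool-agnostic sketch of a simple multi-user test script using only the Python standard library; the endpoint, user count, and think time are all assumptions.

```python
# Tool-agnostic load-test script sketch. The URL and load figures are hypothetical.
import concurrent.futures
import time
import urllib.request

TARGET_URL = "http://test-env.example.com/login"   # hypothetical endpoint
VIRTUAL_USERS = 10
ITERATIONS_PER_USER = 5
THINK_TIME_S = 1.0

def one_user(user_id):
    """Simulate one virtual user; return a list of (status, elapsed_seconds)."""
    results = []
    for _ in range(ITERATIONS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET_URL, timeout=30) as resp:
            status = resp.status
        results.append((status, time.perf_counter() - start))
        time.sleep(THINK_TIME_S)
    return results

if __name__ == "__main__":
    with concurrent.futures.ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        all_results = [r for user in pool.map(one_user, range(VIRTUAL_USERS)) for r in user]
    avg = sum(t for _, t in all_results) / len(all_results)
    print(f"{len(all_results)} requests, average response {avg:.3f}s")
```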

**G. Performance Test Execution**

|   | Process | Aim |
| --- | --- | --- |
| G1 | Performance Test Execution | Run and monitor the tests. Validate the tests, test data, and results collection, then execute validated tests for analysis while monitoring the test and the test environment. |
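
Execution is far more repeatable when the run itself is scripted. The sketch below assumes Apache JMeter is the chosen tool and is on the PATH, and shows one way of wrapping a non-GUI run with a basic check that results were actually collected; the file names are hypothetical.

```python
# Execution wrapper sketch, assuming Apache JMeter and hypothetical file names.
import pathlib
import subprocess
import sys

TEST_PLAN = "test_plan.jmx"     # hypothetical versioned test asset from F1
RESULTS_FILE = "results.jtl"

def run_test():
    # -n = non-GUI mode, -t = test plan, -l = results log
    completed = subprocess.run(["jmeter", "-n", "-t", TEST_PLAN, "-l", RESULTS_FILE])
    if completed.returncode != 0:
        sys.exit("Test run failed; check the JMeter log before analysing results.")
    # Basic validation that results were actually collected.
    results = pathlib.Path(RESULTS_FILE)
    if not results.exists() or results.stat().st_size == 0:
        sys.exit("No results were collected; the run is not valid for analysis.")
    print("Run complete; results ready for analysis (stage H).")

if __name__ == "__main__":
    run_test()
```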

**H. Test Analysis and Reporting**

|   | Process | Aim |
| --- | --- | --- |
| H1 | Test Analysis and Reporting | Consolidate and share results data. Analyse the data both individually and as a cross-functional team. |
| H2 | Defect Management | Manage defects: log, track, update, etc. |
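
For H1, a first pass over the raw results usually means summarising response times per transaction before the cross-functional discussion starts. The sketch below assumes a simple CSV export with transaction and elapsed_s columns, which is an assumption about the results format rather than any particular tool's output.

```python
# First-pass results summary sketch; the CSV layout is an assumption.
import csv
import statistics
from collections import defaultdict

def summarise(results_csv="results.csv"):
    times = defaultdict(list)
    with open(results_csv, newline="") as f:
        for row in csv.DictReader(f):
            times[row["transaction"]].append(float(row["elapsed_s"]))

    for name, samples in sorted(times.items()):
        pct90 = statistics.quantiles(samples, n=10)[8]  # 90th percentile
        print(f"{name}: n={len(samples)} "
              f"avg={statistics.mean(samples):.3f}s "
              f"90th={pct90:.3f}s max={max(samples):.3f}s")

if __name__ == "__main__":
    summarise()
```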

**I. Test Closure Activities**

|   | Process | Aim |
| --- | --- | --- |
| I1 | Lessons Learned | Learn from successes and failures. |
| I2 | Process Review | Review the test process with the aim of improving it. |
| I3 | Defect Review | Review defects to ensure there are no outstanding issues and to identify anti-patterns. |
| I4 | Archive Test Assets | Store the versioned test assets from the release. |
| I5 | Post-Release Production Monitoring | Monitor production after release: Sawmill, resource utilization, DB profile, message queues, etc. |
Clearly, it would be both unlikely and unwise for these phases to be carried out in a rigidly linear fashion. To be truly effective, performance testing should be managed in the context of iteration planning and processes.