Performance testing: This is a type of testing intended to determine the responsiveness, throughput, reliability, and/or scalability of a system under a given workload. Performance testing is commonly conducted to accomplish the following:
• Assess production readiness
• Evaluate against performance criteria
• Compare performance characteristics of multiple systems or system configurations
• Find the source of performance problems
• Support system tuning
• Find throughput levels
Core Activities of Performance Testing
Performance testing is typically done to help identify bottlenecks in a system, establish a baseline for future testing, support a performance tuning effort, determine compliance with performance goals and requirements, and/or collect other performance-related data to help stakeholders make informed decisions related to the overall quality of the application being tested. In addition, the results from performance testing and analysis can help you to estimate the hardware configuration required to support the application when you “go live” to production operation.
The core performance testing activities can be described in seven steps, as follows:
1. Identify Test Environment
2. Identify Performance Acceptance Criteria
3. Plan and Design Tests
4. Configure Test Environment
5. Implement Test Design
6. Execute Tests
7. Analyze, Report and Retest
The performance testing approach consists of the following activities:
1. Identify the Test Environment. Identify the physical test environment and the production environment as well as the tools and resources available to the test team. The physical environment includes hardware, software, and network configurations. Having a thorough understanding of the entire test environment at the outset enables more efficient test design and planning and helps you identify testing challenges early in the project. In some situations, this process must be revisited periodically throughout the project’s life cycle.
2. Identify Performance Acceptance Criteria. Identify the response time, throughput, and resource utilization goals and constraints. In general, response time is a user concern, throughput is a business concern, and resource utilization is a system concern. Additionally, identify project success criteria that may not be captured by those goals and constraints, for example, using performance tests to evaluate what combination of configuration settings will result in the most desirable performance characteristics.
3. Plan and Design Tests. Identify key scenarios, determine variability among representative users and how to simulate that variability, define test data, and establish metrics to be collected. Consolidate this information into one or more models of system usage to be implemented, executed, and analyzed. (A sketch of one way to simulate user variability follows this list.)
4. Configure the Test Environment. Prepare the test environment, tools, and resources necessary to execute each strategy as features and components become available for test. Ensure that the test environment is instrumented for resource monitoring as necessary.
5. Implement the Test Design. Develop the performance tests in accordance with the test design.
6. Execute the Test. Run and monitor your tests. Validate the tests, test data, and results collection. Execute validated tests for analysis while monitoring the test and the test environment.
7. Analyze Results, Report, and Retest. Consolidate and share results data. Analyze the data both individually and as a cross-functional team. Reprioritize the remaining tests and re-execute them as needed. When all of the metric values are within accepted limits, none of the set thresholds have been violated, and all of the desired information has been collected, you have finished testing that particular scenario on that particular configuration. (A sketch of checking results against the acceptance criteria from step 2 also follows this list.)
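To make step 3 concrete, below is a minimal sketch of how variability among representative users might be simulated using the open-source Locust load-testing tool. The endpoints (/catalog, /search), the task weights, and the wait times are illustrative assumptions, not prescriptions; any comparable load-testing tool can express the same usage model.

```python
from locust import HttpUser, task, between

class RepresentativeUser(HttpUser):
    # Vary pacing: each simulated user waits 1-5 seconds between actions,
    # approximating human think time instead of uniform hammering.
    wait_time = between(1, 5)

    @task(3)  # weight 3: browsing is assumed three times as common as searching
    def browse_catalog(self):
        self.client.get("/catalog")  # hypothetical endpoint

    @task(1)  # weight 1: occasional search traffic
    def search(self):
        self.client.get("/search", params={"q": "widgets"})  # hypothetical endpoint
```

Weighted tasks and randomized wait times are one simple way to consolidate the usage model from step 3 into an executable test design.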
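And for step 7, here is a minimal sketch in plain Python of comparing collected response times against the acceptance criteria identified in step 2. The threshold values and metric names are assumptions chosen for illustration; real criteria come from your own goals and constraints.

```python
import random
import statistics

# Illustrative acceptance criteria (assumed values, not recommendations).
CRITERIA = {
    "p95_response_ms": 2000,  # 95% of requests must complete within 2 seconds
    "error_rate": 0.01,       # at most 1% of requests may fail
}

def evaluate(response_times_ms, errors, total):
    """Return each metric's measured value and whether it met its criterion."""
    # quantiles(n=20) yields 19 cut points; index 18 is the 95th percentile.
    p95 = statistics.quantiles(response_times_ms, n=20)[18]
    rate = errors / total
    return {
        "p95_response_ms": (round(p95, 1), p95 <= CRITERIA["p95_response_ms"]),
        "error_rate": (rate, rate <= CRITERIA["error_rate"]),
    }

# Example with simulated measurements standing in for real test output.
samples = [max(1.0, random.gauss(900, 300)) for _ in range(1000)]
print(evaluate(samples, errors=6, total=1000))
```

If any criterion fails, that feeds the "reprioritize and retest" part of step 7; if all pass, testing of that scenario on that configuration is complete.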
Why Do Performance Testing?
At the highest level, performance testing is almost always conducted to address one or more risks related to expense, opportunity costs, continuity, and/or corporate reputation. Some more specific reasons for conducting performance testing include:
• Assessing release readiness by: Enabling you to predict or estimate the performance characteristics of an application in production and evaluate whether or not to address performance concerns based on those predictions. These predictions are also valuable to the stakeholders who make decisions about whether an application is ready for release or capable of handling future growth, or whether it requires a performance improvement/hardware upgrade prior to release. Providing data indicating the likelihood of user dissatisfaction with the performance characteristics of the system. Providing data to aid in the prediction of revenue losses or damaged brand credibility due to scalability or stability issues, or due to users being dissatisfied with application response time.
• Assessing infrastructure adequacy by: Evaluating the adequacy of current capacity. Determining the acceptability of stability. Determining the capacity of the application’s infrastructure, as well as determining the future resources required to deliver acceptable application performance. Comparing different system configurations to determine which works best for both the application and the business. Verifying that the application exhibits the desired performance characteristics, within budgeted resource utilization constraints.
• Assessing adequacy of developed software performance by: Determining the application’s desired performance characteristics before and after changes to the software. Providing comparisons between the application’s current and desired performance characteristics.
• Improving the efficiency of performance tuning by: Analyzing the behavior of the application at various load levels. Identifying bottlenecks in the application. Providing information related to the speed, scalability, and stability of a product prior to production release, thus enabling you to make informed decisions about whether and when to tune the system.