Performance testing – 360logica.com
360logica follows a robust performance testing methodology to meet client requirements, performing high-quality tests that focus on Performance Acceptance Criteria and identify bottlenecks in the system.
The performance methodology we follow:
- Identify the Test Environment. Identify the physical test environment and the production environment as well as the tools and resources available to the test team. The physical environment includes hardware, software, and network configurations.
- Identify Performance Acceptance Criteria. Identify the response time, throughput, and resource utilization goals and constraints. In general, response time is a user concern, throughput is a business concern, and resource utilization is a system concern.
- Plan and Design Tests. Identify key scenarios, determine variability among representative users and how to simulate that variability, define test data, and establish metrics to be collected.
- Configure the Test Environment. Prepare the test environment, tools, and resources necessary to execute each strategy as features and components become available for test.
- Implement the Test Design. Develop the performance tests in accordance with the test design.
- Execute the Test. Run and monitor your tests. Validate the tests, test data, and results collection. Execute validated tests for analysis while monitoring the test and the test environment.
- Analyze Results, Report, and Retest. Consolidate and share results data. Analyze the data both individually and as a cross-functional team. Reprioritize the remaining tests and re-execute them as needed.
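Three of the steps above — identifying acceptance criteria, executing the test, and analyzing results — can be sketched as a minimal load-test harness. This is an illustrative sketch, not 360logica's tooling: `call_service` is a stand-in for a real request to the system under test, and the threshold values are hypothetical examples.

```python
import random
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

# Example acceptance criteria (hypothetical values, set per project).
MAX_P95_RESPONSE_MS = 500   # response time: a user concern
MIN_THROUGHPUT_RPS = 50     # throughput: a business concern

def call_service() -> float:
    """Stand-in for one request to the system under test.
    Returns the observed response time in milliseconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # simulate I/O latency
    return (time.perf_counter() - start) * 1000

def run_load_test(users: int = 10, requests_per_user: int = 20):
    """Execute the test: simulate concurrent users and collect latencies."""
    started = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(call_service)
                   for _ in range(users * requests_per_user)]
        latencies = [f.result() for f in futures]
    elapsed = time.perf_counter() - started
    throughput = len(latencies) / elapsed  # requests per second
    return latencies, throughput

def analyze(latencies, throughput):
    """Analyze results against the acceptance criteria."""
    p95 = statistics.quantiles(latencies, n=20)[-1]  # 95th percentile
    return {
        "p95_ms": p95,
        "throughput_rps": throughput,
        "p95_ok": p95 <= MAX_P95_RESPONSE_MS,
        "throughput_ok": throughput >= MIN_THROUGHPUT_RPS,
    }

if __name__ == "__main__":
    latencies, throughput = run_load_test()
    print(analyze(latencies, throughput))
```

In practice the stub would be replaced by real requests issued from a load-generation tool, and resource utilization on the server side would be monitored alongside these client-side metrics.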
Performance testing requires a different mindset and skill set from functional testing. Understanding customer requirements and expectations, as well as user activities and behaviors, is key to designing suitable tests.
Ensure tests represent realistic usage of the application. Test environments must be carefully controlled to prevent unauthorized modifications that could invalidate test results.
Automated test tools coupled with fast backup and restore mechanisms are essential due to the need to repeat tests many times.
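Because each run must start from a known state, the backup-and-restore step is typically scripted so it can run before every repeat. A minimal sketch, assuming a file-based test database; the paths and function names here are illustrative, not part of any particular tool:

```python
import shutil
from pathlib import Path

# Illustrative paths: a known-good snapshot and the file the tests mutate.
BASELINE = Path("snapshots/testdb.baseline")
ACTIVE = Path("data/testdb")

def snapshot_baseline() -> None:
    """Capture the current state once, before the first test run."""
    BASELINE.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(ACTIVE, BASELINE)

def restore_baseline() -> None:
    """Reset the environment to the baseline before each repeat run,
    so results from successive runs are comparable."""
    shutil.copy2(BASELINE, ACTIVE)
```

For a real database server the copy would be replaced by the engine's own dump/restore facilities, but the pattern — snapshot once, restore before every run — is the same.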
System bottlenecks can rapidly become very technical in nature and consume considerable resources and effort to diagnose. Resolution may require substantial rework, or even a redesign of your product.
Even after performing a significant number of tests and gathering a considerable amount of data, developers inexperienced in performance testing may still draw the wrong conclusions from the results.