TESTEVERYTHING

PERFORMANCE


Why should you automate performance testing?
Automated performance testing is a discipline that leverages products, people, and processes to reduce the risks of application, upgrade, or patch deployment. At its core, automated performance testing is about applying production workloads to pre-deployment systems while simultaneously measuring system performance and end-user experience. A well-constructed performance test answers questions such as:
Does the application respond quickly enough for the intended users?
Will the application handle the expected user load and beyond?
Will the application handle the number of transactions required by the business?
Is the application stable under expected and unexpected user loads?
Are you sure that users will have a positive experience on go-live day?

By answering these questions, automated performance testing quantifies the impact of a change in business terms. This in turn makes clear the risks of deployment. An effective automated performance testing process helps you to make more informed release decisions, and prevents system downtime and availability problems.
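As a rough illustration of what "applying a workload while measuring response times" can look like, here is a minimal load-test sketch in Python using only the standard library. The target URL, number of virtual users, test duration, and think time are placeholder assumptions for illustration, not values from this post; a real test would normally use a dedicated tool such as LoadRunner.

import statistics
import threading
import time
import urllib.request

# Placeholder assumptions -- adjust for your own system under test.
TARGET_URL = "http://localhost:8080/health"   # hypothetical endpoint
VIRTUAL_USERS = 10
TEST_DURATION_S = 30
THINK_TIME_S = 1.0

response_times = []
lock = threading.Lock()

def virtual_user(stop_at):
    """One 'virtual user' loop: request, record response time, think, repeat."""
    while time.time() < stop_at:
        start = time.time()
        try:
            urllib.request.urlopen(TARGET_URL, timeout=10).read()
            with lock:
                response_times.append(time.time() - start)
        except Exception:
            pass  # a real test would count and report errors separately
        time.sleep(THINK_TIME_S)  # simulate user think time

stop_at = time.time() + TEST_DURATION_S
threads = [threading.Thread(target=virtual_user, args=(stop_at,))
           for _ in range(VIRTUAL_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

if response_times:
    ordered = sorted(response_times)
    p90 = ordered[int(0.9 * (len(ordered) - 1))]
    print(f"requests completed : {len(response_times)}")
    print(f"throughput         : {len(response_times) / TEST_DURATION_S:.1f} req/s")
    print(f"avg response time  : {statistics.mean(response_times):.3f} s")
    print(f"90th percentile    : {p90:.3f} s")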


Source of the following: Alexander Podelko's blog

Throughput is the rate at which incoming requests are completed. Throughput defines the load on the system and is measured in operations per unit of time. It may be the number of transactions per second or the number of adjudicated claims per hour.

It is also important to see how throughput varies over time. For example, throughput can be defined for a typical hour, a peak hour, and off-hours for each particular kind of load. In some cases, it is important to detail the load further, hour by hour.

The number of users doesn't, by itself, define throughput. Without defining what each user is doing and how intensely (i.e. the throughput of one user), the number of users doesn't make much sense as a measure of load. For example, if 500 users each run one short query per minute, we have a throughput of 30,000 queries per hour. If the same 500 users run the same queries, but only one per hour, the throughput is 500 queries per hour. So we have the same 500 users, but a 60-fold difference between the loads (and, respectively, the hardware requirements for the system).
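A quick way to see this is to treat throughput as the product of the number of users and the per-user request rate. The small Python sketch below simply reproduces the arithmetic of the 500-user example above:

def throughput_per_hour(users, requests_per_user_per_hour):
    """Throughput = number of users x requests per user per time unit."""
    return users * requests_per_user_per_hour

# 500 users, one short query per minute (60 per hour) -> 30,000 queries/hour
print(throughput_per_hour(500, 60))   # 30000

# The same 500 users, one query per hour -> 500 queries/hour
print(throughput_per_hour(500, 1))    # 500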


Scalability is the ability of a system to meet performance requirements as demand increases (usually by adding hardware). Scalability requirements may include demand projections for the system, such as increases in the number of users, transaction volumes, or data size, or the appearance of additional types of load at specific points in the future.
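To make a scalability requirement concrete, a demand projection can be written as a simple calculation. The sketch below is purely illustrative; the starting throughput, the 25% yearly growth rate, and the 3-year horizon are assumptions for the example, not figures from this post:

# Toy demand projection for a scalability requirement: given today's
# throughput and an assumed yearly growth rate, estimate the load the
# system must sustain at future points in time.
current_tph = 30_000        # transactions per hour today (assumed)
yearly_growth = 0.25        # assumed 25% growth per year

for year in range(1, 4):
    projected = current_tph * (1 + yearly_growth) ** year
    print(f"year {year}: ~{projected:,.0f} transactions/hour")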

http://testeverythingqtp.blogspot.com/2015/12/common-issues-while-creating.html

1 comment:

Anonymous said...

I am using LoadRunner for performance testing,
but why is there a difference between the response time of a module when I run it manually and when I run it with LoadRunner and only one vuser?


