This factor that we call contention is driving me crazy...
I have deleted my article on scalability and contention,
because my equation didn't work correctly...
Now my question is:
If you read carefully, you will see that Dr. Gunther uses this
model to predict scalability after simulating a relatively small
number of vusers in LoadRunner (because of licensing costs, this is
cost-effective). He then finds the coefficients of the
second-degree polynomial (quadratic equation) and transforms
those coefficients back to the USL parameters using sigma = b - a
and kappa = a.
After that he extrapolates with the USL model - from just
a small number of measured points - to higher loads
to predict scalability.
But imagine that a large number of clients are making HTTP requests
to a website. What we call logical contention and physical contention
are probabilistic, so if there is a lot of contention in a given time
interval, internet clients may wait longer than 5 seconds, and
this is not good...
Hence, since contention is probabilistic and the throughput can become
much lower when there is a lot of contention in a time interval,
how can the USL predict such a problem from just a small number
of measured points?
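To make the concern concrete, here is a small, hedged simulation of what I mean by contention being probabilistic. The numbers are entirely hypothetical (an exponentially distributed queue length per interval and a 50 ms mean service time are my assumptions, not measurements): the point is only that the tail wait in a bad interval is several times the mean wait, while a model fitted to averaged throughput sees only the mean.

```python
# Hedged sketch: contention modeled as a random variable per interval.
# All parameters are hypothetical, chosen only to illustrate that tail
# waits can far exceed the mean wait that an averaged fit would reflect.
import random

random.seed(42)
mean_service = 0.05   # hypothetical mean service time per request (s)
intervals = 10000
waits = []
for _ in range(intervals):
    # Hypothetical bursty contention: number of requests queued ahead,
    # drawn fresh for each interval (mean of 4 requests ahead)
    queued = random.expovariate(1 / 4.0)
    waits.append(queued * mean_service)

waits.sort()
mean_wait = sum(waits) / len(waits)
p99 = waits[int(0.99 * len(waits))]
print(mean_wait, p99)  # the 99th-percentile wait dwarfs the mean wait
```

Under these assumptions the 99th-percentile wait is several times the mean, which is the gap I do not see how a fit to a few averaged throughput points can capture.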
Amine Moulay Ramdane.