Dear LoadRunner Cloud community,
I am trying to create a report identical to the one generated after a load test in the REPORT pane beside the DASHBOARD. I have managed to identify how the average/min/max trt are computed, as well as failed and passed transactions.
However, I am still trying to crack how the standard deviation is computed.
Would anyone know how to aggregate the data for Standard Deviation?
EDIT
These are the approaches I have tried so far (per transaction):
- calculating the weighted standard deviation over the whole timeseries as
  s = sqrt( Σ w(xi − x-bar)² / ( ((M − 1) / M) · Σ w ) )
  where
  - w = weight
  - xi = average trt value
  - x-bar = weighted mean
  - M = the number of non-zero weights
- calculating the same as above per time_stamp, then taking the mean of the per-timestamp values across the timeseries
- getting the weighted covariance (i.e., the variance, since it is a single series) of the average trt values, and then taking its square root (just using library functions in Python)
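For reference, here is roughly what the first attempt looks like in Python/numpy. This is just my sketch of the formula above, not LoadRunner Cloud's actual implementation, and the inputs are illustrative per-interval average trt values and weights:

```python
import numpy as np

def weighted_std(values, weights):
    """Weighted standard deviation with M = number of non-zero weights
    (the formula from the first attempt above)."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    m = np.count_nonzero(weights)                # M
    xbar = np.average(values, weights=weights)   # weighted mean
    var = np.sum(weights * (values - xbar) ** 2) / (((m - 1) / m) * weights.sum())
    return np.sqrt(var)
```

With all weights equal, this reduces to the ordinary sample standard deviation (numpy's `np.std(..., ddof=1)`), which is a quick sanity check for the implementation.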
Regards,
Leif
PS If anyone is curious about the other metrics:
- the average trt for a transaction through the whole timeseries is a weighted average of all the transaction average trt values--the weights being total_duration/val.
- min trt and max trt are, respectively, the minimum and maximum of the min trt and max trt metrics in the val column
- passed transactions and failed transactions are cumulative sums over the timeseries
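In case it saves anyone some digging, a pandas sketch of those aggregations for one transaction. The column names (metric, val, total_duration) and metric labels are my own guesses at the export layout, so adjust to whatever your data actually uses:

```python
import pandas as pd

def summarize_transaction(df):
    """Aggregate one transaction's timeseries as described in the PS.
    Assumed columns: metric, val, total_duration (hypothetical names)."""
    avg = df[df["metric"] == "avg_trt"]
    # weighted average of the per-interval average trt values,
    # weights = total_duration / val
    w = avg["total_duration"] / avg["val"]
    return {
        "avg_trt": (avg["val"] * w).sum() / w.sum(),
        "min_trt": df.loc[df["metric"] == "min_trt", "val"].min(),
        "max_trt": df.loc[df["metric"] == "max_trt", "val"].max(),
        # the report shows cumulative sums; the final value is the total
        "passed": df.loc[df["metric"] == "passed", "val"].sum(),
        "failed": df.loc[df["metric"] == "failed", "val"].sum(),
    }
```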