TrueTime session file column explanations

Problem:

How are the values calculated for each column in the Function List view of the TrueTime session file?

Resolution:

You can find the definitions for the columns by pressing F5 on your keyboard with a column section highlighted. Here is an extended explanation of how each column is calculated:

Function Name:

Simply put, this is the name of the function that TrueTime has collected data on.

% in Function:

This is the percentage of time spent in the selected function (without any "child" functions' time included) as a percentage of the entire session. For example, if Foo.exe ran for 25 seconds and the FooBar() function ran for 5 seconds of the session, then "% in Function" would be calculated this way: function time divided by session time, or 5s / 25s = 20%.
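
The calculation boils down to simple arithmetic. Here is a minimal sketch in Python, using the hypothetical 25-second session and 5-second function time from the example (these numbers are illustrative only, not something TrueTime exposes in this form):

    # Hypothetical timings from the example above, in seconds.
    session_time = 25.0
    function_time = 5.0   # time spent in FooBar() itself, excluding children

    percent_in_function = function_time / session_time * 100
    print(f"% in Function: {percent_in_function:.0f}%")   # -> 20%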

% with Children:

This is the percentage of time spent in the selected function, with its child functions' time included, as a percentage of the entire session. For example, if Foo.exe ran for 25 seconds, the FooBar() function ran for 5 seconds of the session, and the FooBarAI() function (a child function of FooBar()) ran 5 more seconds, then "% with Children" would be calculated this way: function time plus child function time, divided by session time, or (5s + 5s) / 25s = 40%.
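
A minimal sketch of the same arithmetic, again using the hypothetical numbers from the example:

    # Hypothetical timings from the example above, in seconds.
    session_time = 25.0
    function_time = 5.0    # time spent in FooBar() itself
    child_time = 5.0       # time spent in its child, FooBarAI()

    percent_with_children = (function_time + child_time) / session_time * 100
    print(f"% with Children: {percent_with_children:.0f}%")   # -> 40%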

Called:

This is the number of times the function was called in the entire session.

For example, if the function FooBar() is called 12 times in the session, then 12 would appear in this column. It doesn't matter who called it or why, just that it was called.

Image:

This is the name of the .dll, .exe, or .ocx that the function is part of.

% in Image:

This is the percentage of time spent in the function as a percentage of the total time spent in its image. For example, Foo.exe calls the functions FooBlock() and FooSlab() in FooCrete.dll. FooSlab() runs for 5 seconds and FooBlock() runs for 10 seconds. To calculate "% in Image" you divide the time the function ran by the total time spent in the image. The "% in Image" for FooSlab() would be FooSlab() / (FooSlab() + FooBlock()), or 5s / (5s + 10s) = 33%.
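
Here is the same calculation sketched in Python with the hypothetical FooCrete.dll timings from the example (the per-function times are made up for illustration):

    # Hypothetical time spent in each function of FooCrete.dll, in seconds.
    image_times = {"FooSlab()": 5.0, "FooBlock()": 10.0}
    image_total = sum(image_times.values())   # 15 seconds in the image

    for name, seconds in image_times.items():
        print(f"% in Image for {name}: {seconds / image_total * 100:.0f}%")
    # -> FooSlab() is 33%, FooBlock() is 67%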

Average:

This is the average time it took the function to execute (without its "child" functions). The Average is calculated by taking the sum of the times of every execution of the function and dividing it by the number in the Called column. If it took 12 seconds to execute FooBar() 12 times in a session, then the Average would be Sum Total Time / Called, or 12s / 12 calls = 1s.
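
In code form, the Average is just the total time divided by the call count. The per-call times below are hypothetical values chosen to sum to the 12 seconds in the example:

    # Hypothetical per-call times for FooBar(), excluding children, in seconds.
    call_times = [0.8, 1.2, 0.9, 1.1, 1.0, 1.0, 0.7, 1.3, 1.0, 1.0, 1.5, 0.5]
    called = len(call_times)          # 12, the value in the Called column

    average = sum(call_times) / called
    print(f"Average: {average:.1f}s over {called} calls")   # -> 1.0s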

First:

This is the time it took to execute the function (without "child" functions) the very first time it was called in the session. If the first time FooBar() was called it took 7 seconds to execute, then this column would read 7.0 seconds.

Minimum:

This is the time of the quickest execution of the function in this session. If, out of all the times FooBar() executes in the session, its quickest execution takes 0.2 seconds, then 0.2 seconds is what will appear in this column.

Maximum:

This is the time of the slowest execution of the function in this session. If, out of all the times FooBar() executes in the session, its slowest execution takes 12 seconds, then 12 seconds is what will appear in this column.
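
First, Minimum, and Maximum can all be illustrated with the same hypothetical list of per-call times (the values below are made up to match the examples in the three sections above):

    # Hypothetical per-call times for FooBar(), excluding children, in seconds,
    # listed in the order the calls occurred during the session.
    call_times = [7.0, 0.2, 3.5, 12.0, 1.8]

    print(f"First:   {call_times[0]}s")    # first call of the session -> 7.0s
    print(f"Minimum: {min(call_times)}s")  # quickest execution -> 0.2s
    print(f"Maximum: {max(call_times)}s")  # slowest execution -> 12.0s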

Average with Children:

This is the average time spent in the selected function, with its child functions' time included. For example, suppose the FooBar() function ran for 5 seconds in one execution and the FooBarAI() function (a child function of FooBar()) ran 5 more seconds; on the next execution FooBar() ran for 6 seconds and FooBarAI() ran for 6 more seconds, ending the session. To calculate the "Average with Children" we add up all of the times for the function FooBar() itself (5s + 6s = 11s), then add in the time for its child function (FooBarAI()) to execute (11s + 5s + 6s = 22s). We then take this total and divide it by the number of times FooBar() was called: 22s / 2 = 11s. The "Average with Children" for FooBar() would be 11 seconds.
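
A sketch of that worked example, with the hypothetical per-call times for FooBar() and FooBarAI() spelled out:

    # Hypothetical per-call times from the example above, in seconds.
    foobar_times = [5.0, 6.0]     # time spent in FooBar() itself, per call
    foobarai_times = [5.0, 6.0]   # time spent in its child FooBarAI(), per call

    total_with_children = sum(foobar_times) + sum(foobarai_times)   # 22 seconds
    called = len(foobar_times)                                      # 2 calls

    average_with_children = total_with_children / called
    print(f"Average with Children: {average_with_children:.0f}s")   # -> 11s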

Real:

This is commonly called the "wall clock" time. It is the time that elapsed in the real world during the execution of the function. This is a raw measure of time that does not take into consideration when Windows switches threads from your app to something else for a while. This time would be the same as if you started a stopwatch when you ran your app and stopped it when you exited your app.

Old KB# 11250