Elapsed Time Estimate Inaccuracy



I want to measure the time for one instruction using SI, but I get weird timings that don't make sense (milliseconds!) for copying a pointer. That's too long, isn't it? I am measuring the elapsed time between two breakpoints.


Using SI's "Elapsed Time" to time code is probably one of the worst ways to gather performance data. The elapsed time is gathered by issuing an RDTSC when SI is about to go away and issuing another when SI pops back up. A whole slew of clock ticks happens in between, even if you only trace over one instruction and the OS issues no context switches: SI still has to do its thing to leave and come back up again. Copying a huge chunk of unpredictable video memory to a temporary buffer is part of the coming-back-up process, in addition to possible ring transitions due to the IRETD when going down and the INT 3 being fired off by your breakpoint. You'd be much better off using a simpler technique such as:

#pragma warning( disable : 4035 )   // turn off "no return value" warning
inline __int64 __fastcall RDTSC()
{
    __asm
    {
        _emit 0x0F   ; RDTSC opcode bytes; the result lands in EDX:EAX,
        _emit 0x31   ; which is exactly the __int64 return value
    }
}
#pragma warning( default : 4035 )

void main()
{
    char *tp, *p = (char *)0xDEADBEEF ;
    __int64 t0 = RDTSC() ;
    tp = p ;                     // the instruction being timed
    __int64 t1 = RDTSC() ;      // t1 - t0 = cycle count (includes RDTSC overhead)
}
Old KB# 11061