I'm trying to use the call graph collector in VTune 7.0 to profile and optimize a game server application.
The results I've got from VTune are confusing. It reports the main thread's total time as 61 seconds, but some of the network threads' total times as 225 seconds, even though all the threads were running throughout the test. Why do the threads have different total times?
The network threads have total wait times nearly equal to their total times, while the main thread has a wait time of 13 seconds. Is wait time the amount of time spent in blocking calls?
In this case I would interpret this as the main thread doing useful work roughly 80% of the time (48 of 61 seconds), while the network threads are blocked waiting for work to do nearly 100% of the time. This makes sense, but doesn't explain the difference in total times.
If wait time is blocked time, then my goal should be to reduce the main thread's wait time to 0 (the server uses an event-based architecture). Does this make sense?