User Guide


Compare Results

Compare your analysis results before and after optimization to identify performance gains.
Use this feature regularly for regression testing to quickly see where each version of your target performs better.
You can compare any results that have common performance metrics; Intel® VTune™ Profiler provides comparison data for these common metrics only.
To compare two analysis results:
  1. Click the
     Compare Results
     button on the toolbar. The
     Compare Results
     window opens, with these controls:
       Result 1 / Result 2
       drop-down menus: Specify the results you want to compare. Choose a result of the current project from the drop-down menu, or click the
       Browse
       button to choose a result from a different project.
       Swap Results
       button: Click this button to change the order of the result files you want to compare. Result 1 always serves as the basis for comparison.
       Compare
       button: Click this button to view the difference between the specified result files. The button is only active if the selected results can be compared; otherwise, an error message is displayed.
  2. Specify the two results you want to compare and click the
     Compare
     button. A new result tab opens, providing the difference between the two results per performance metric.
The tab name combines the identifiers of the two results; for example, comparing two Microarchitecture Exploration analysis results produces a tab named after both result identifiers. The data views in comparison mode calculate the difference between the two results in the order you originally defined in the
Compare Results
window and as specified in the tab title.
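The same comparison is also available from the command line. A minimal sketch, assuming the vtune CLI is on your PATH; the result-directory names and the ./my_app target used below are placeholders, not part of this guide:

```shell
# Collect a baseline result, rebuild with your optimization, collect a
# second result, then report the per-metric difference between the two.
# Result-directory names and ./my_app are hypothetical placeholders.
vtune -collect hotspots -result-dir r000_baseline -- ./my_app
# ... apply your optimization and rebuild ./my_app ...
vtune -collect hotspots -result-dir r001_optimized -- ./my_app
# Passing two result directories to -report produces a comparison report:
vtune -report summary -result-dir r000_baseline -result-dir r001_optimized
```

The first result directory passed to -report serves as the basis for comparison, matching the Result 1 behavior described above.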
You can compare performance statistics in the following views:
  Summary
  window: Analyze the difference in the overall application performance between the two results and any system/platform differences. Start exploring the changes from the
  Summary
  window, then move to the
  Bottom-up
  analysis to identify the changes per program unit.
  Bottom-up
  window: Analyze the data columns of the two results and a new column with the difference between these results for a function and its callers.
  Event Count
  window: Compare results and identify differences in event count and performance per hardware event-based metric collected during event-based sampling analysis.
  Top-down Tree
  window: Explore the performance difference between two collection runs for a function and its callees.
  Caller/Callee
  window: Get a holistic picture of the performance changes before and after optimization by comparing data for a function, its callers, and its callees.
  Source/Assembly
  window: Understand how input values, command-line parameters, or compilation options affect performance while you are optimizing your target. Double-click a program unit of interest and compare the performance data for each line of the source/assembly code.
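For the regular regression testing mentioned above, the comparison can be scripted. A sketch under stated assumptions: the vtune CLI is installed, a stored baseline result directory named r_baseline exists, and ./my_app is the target; all of these names are hypothetical:

```shell
#!/bin/sh
# Hypothetical regression check: profile the current build and diff it
# against a stored baseline result. All paths here are placeholders.
set -e
RESULT="r_$(date +%Y%m%d)"
vtune -collect hotspots -result-dir "$RESULT" -- ./my_app
# Emit the comparison as CSV so it can be parsed or archived by CI:
vtune -report summary -result-dir r_baseline -result-dir "$RESULT" \
      -format csv -csv-delimiter comma > "summary_diff_$RESULT.csv"
```

Keeping the baseline result fixed makes each new run's comparison answer the same question: where does the current version perform better or worse than the reference build.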
