Compare Results

Compare your analysis results before and after optimization to identify and quantify the performance gain.

Use this feature regularly for regression testing to quickly see where each version of your target performs better.

You can compare any results that have common performance metrics. Intel® VTune™ Amplifier provides comparison data for these common metrics only.
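
As a rough illustration of that restriction, the sketch below (hypothetical metric names and a hand-rolled helper, not a VTune API) keeps only the metrics that are present in both results:

```python
# Hypothetical per-result metric tables; VTune determines common metrics
# internally -- this only illustrates the idea.
r001 = {"CPU Time": 12.4, "Wait Time": 3.1, "Memory Bound": 0.42}
r005 = {"CPU Time": 9.8, "Wait Time": 2.7}  # "Memory Bound" not collected

def common_metrics(a, b):
    """Return the metric names present in both results, sorted by name."""
    return sorted(a.keys() & b.keys())

# Only these metrics would appear in the comparison.
print(common_metrics(r001, r005))
```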

To compare two analysis results:

  1. Click the Compare Results button on the VTune Amplifier toolbar.

    The Compare Results window opens.

    Result 1 / Result 2 drop-down menus: Specify the results you want to compare. Choose a result of the current project from the drop-down menu, or click the Browse button to choose a result from a different project.

    Swap Results button: Click this button to change the order of the result files you want to compare. Result 1 always serves as the basis for comparison.

    Compare button: Click this button to view the difference between the specified result files. The button is active only if the selected results can be compared; otherwise, an error message is displayed.

  2. Specify two results that you want to compare and click the Compare button.

    A new result tab opens, showing the difference between the two results for each performance metric.

The tab name combines the identifiers of the two results. For example, a comparison of the Microarchitecture Exploration analysis results r001ue and r005ue appears as r001ue-r005ue. In comparison mode, the data views show the difference between the two results, computed in the order you defined in the Compare Results window and reflected in the tab title.
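
To make the ordering concrete, here is a minimal sketch, assuming the difference is taken as Result 1 minus Result 2 over their common metrics (hypothetical metric values and helper, not a VTune API). Swapping the results flips the sign of every difference:

```python
# Hypothetical per-metric difference with Result 1 as the basis:
# a positive value means the metric decreased from Result 1 to Result 2.
def compare(result1, result2):
    """Difference per common metric, in the order result1 - result2."""
    common = result1.keys() & result2.keys()
    return {m: result1[m] - result2[m] for m in sorted(common)}

r001ue = {"CPU Time": 12.4, "Retiring": 0.35}
r005ue = {"CPU Time": 9.8, "Retiring": 0.41}

diff = compare(r001ue, r005ue)     # order as in the tab title r001ue-r005ue
swapped = compare(r005ue, r001ue)  # Swap Results reverses the sign
```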

You can compare performance statistics in the following views:

Summary window: Analyze the difference in overall application performance between the two results, as well as any system/platform differences. Start exploring the changes in the Summary window, then move to the Bottom-up window to identify the changes per program unit.

Bottom-up window: View the data columns of the two results side by side, along with a new column showing the difference between them for each function and its callers.

Event Count window: Identify differences in event counts and in performance per hardware event-based metric collected during event-based sampling analysis.

Top-Down Tree window: Explore the performance difference between the two collection runs for a function and its callees.

Caller/Callee window: Get a holistic picture of the performance changes before and after optimization by comparing data for a function, its callers, and its callees.

Source/Assembly window: Understand how different input values, command-line parameters, or compilation options affect performance when you optimize your target. Double-click a program unit of interest and compare the performance data for each line of source or assembly code.
