This article, sample code, and whitepaper were produced by Leigh Davies (Intel), who examines how to optimize graphics applications so that they self-adjust their CPU/GPU workload to prolong a system's battery life while maintaining acceptable visual quality for the user.
Today, any review of a new processor, whether it's used in a desktop computer, a laptop, a tablet, or a phone, will contain plenty of information about how efficient it is and the technologies used to achieve that efficiency. Operating system developers spend large amounts of time optimizing to improve efficiency and extend battery life, but what can be done by someone designing an application who wants to ensure it runs as efficiently as possible? The aim of this sample is to provide insight into how features in a game can affect the power efficiency of the hardware it's running on, including the importance of frame rate capping, the effect of bandwidth on power, and the cost of running asynchronous CPU work. The sample also demonstrates a way an application can adjust its workload to prolong a system's battery life when it detects a change from AC power to battery; how aggressive the change is can be tuned based on the currently active Windows power scheme.
Figure 1: Power Explorer with onscreen power information from Intel Power Gadget 2.7
The core of the sample was designed around the idea, used in Codemasters GRID2*, that an application can and should adapt its behavior based on whether it's running on battery or AC power. We also wanted to show a way of adapting this behavior to reflect how the user had configured their system, so the decision was taken to tie the amount of adaptation to the currently active Windows power scheme. As well as extending the system's battery life when not running on AC power, this also allows the application to adapt to the fact that the hardware's performance will change based on the power scheme. As the sample was created, it became clear that to tell the complete story of how and why a game needs to adapt based on the power source, we would need the ability to accurately display power information in real time and to allow the user to experiment with a wide range of graphics options to see how they affect power, as not all optimizations will be applicable to all titles. As a result, the sample can be split into three main areas:
- Windows Power APIs for measuring battery life, capturing system notifications on power changes, and retrieving information on the current Windows power scheme.
- Integration of Intel Power Gadget 2.7 and the power information it allows to be displayed on screen.
- A sandbox showing the effect the various graphics options have on power draw and how those options interact with each other. The main effects that can be adjusted are:
- VSync rate
- Backbuffer format (a choice of 32-bit and 64-bit formats, and MSAA)
- HDR pipeline (tone mapping and bloom)
- Resolution (using a simple upscale post-process)
- CPU workload and thread usage
- Shader workload
- Tessellation level
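To make the adaptation idea concrete, here is a minimal sketch of a GRID2-style policy that scales quality back on battery, with the aggressiveness of the reduction tied to the active power scheme. The `ChooseQuality` function, the `QualitySettings` fields, and the specific values chosen are hypothetical illustrations, not the sample's actual code; on Windows, the `onBattery` flag would come from `GetSystemPowerStatus()` (or a `WM_POWERBROADCAST` notification) and the scheme from `PowerGetActiveScheme()`.

```cpp
// Simplified stand-ins for the three Windows power-scheme personalities
// (GUID_MAX_POWER_SAVINGS, GUID_TYPICAL_POWER_SAVINGS, GUID_MIN_POWER_SAVINGS).
enum class PowerScheme { PowerSaver, Balanced, HighPerformance };

// Hypothetical quality settings an application might scale back on battery.
struct QualitySettings {
    int   maxFps;          // frame-rate cap
    bool  hdrPipeline;     // tone mapping + bloom
    bool  msaa;            // multisample anti-aliasing
    float resolutionScale; // render-target scale before the upscale pass
};

// Full quality on AC power; on battery, reduce quality more aggressively
// the more power-saving the user's active scheme is.
QualitySettings ChooseQuality(bool onBattery, PowerScheme scheme)
{
    if (!onBattery)
        return { 60, true, true, 1.0f };   // plugged in: maximum quality

    switch (scheme) {
    case PowerScheme::HighPerformance:
        return { 60, true, true, 1.0f };   // user explicitly chose performance
    case PowerScheme::Balanced:
        return { 30, true, false, 0.75f }; // moderate savings
    case PowerScheme::PowerSaver:
    default:
        return { 30, false, false, 0.5f }; // aggressive savings
    }
}
```

The key design point is that the policy is a pure function of two inputs the OS already exposes, so the application can re-evaluate it whenever a power-change notification arrives.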
More details can be found in the accompanying article below. As a final example of the effect that adjusting settings can have, running the sample on a 47 W 4950HQ system (see Table 1) gave the following results, with over a threefold reduction in power draw between the maximum visual quality and power-saving options.
Figure 2: Power comparison of different settings
The main settings that significantly affected the power draw of the sample can be summarized as:
* Frame rate: limiting the maximum frame rate is the single most important optimization for power.
* Bandwidth: back buffer formats and MSAA affect almost every rendering call; the less bandwidth used, the better.
* CPU work: limit work that doesn't provide tangible benefits on modern systems so that more power is available for the GPU. Avoid spin locks and unnecessary polling, and optimize time-consuming functions even on non-critical threads; just because the CPU is fast enough to do the work without stalling another part of the code doesn't mean it's an efficient use of the power budget.
* Shader cost: balance shader cost against visual quality. Excessive tessellation or highly complex pixel shaders that provide only minimal visual benefit can significantly increase power draw.
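The first and third points above are related: a frame-rate cap only saves power if the CPU actually blocks between frames rather than spin-waiting. The sketch below shows a generic, OS-agnostic frame limiter built on `std::this_thread::sleep_until`; the sample itself controls frame rate through the swap chain's VSync interval, so the `FrameLimiter` class here is an illustrative assumption, not the sample's implementation.

```cpp
#include <chrono>
#include <thread>

// Generic frame-rate limiter: sleeps until the next frame's deadline instead
// of spin-waiting, so the CPU can drop into a low-power state between frames.
class FrameLimiter {
public:
    explicit FrameLimiter(int maxFps)
        : frameTime_(std::chrono::duration_cast<std::chrono::steady_clock::duration>(
              std::chrono::duration<double>(1.0 / maxFps))),
          nextDeadline_(std::chrono::steady_clock::now() + frameTime_) {}

    // Call once per frame after rendering: blocks until the frame deadline,
    // then advances the deadline by one frame interval.
    void Wait()
    {
        std::this_thread::sleep_until(nextDeadline_);
        nextDeadline_ += frameTime_;
    }

private:
    std::chrono::steady_clock::duration   frameTime_;
    std::chrono::steady_clock::time_point nextDeadline_;
};
```

Advancing a fixed deadline (rather than sleeping for a fixed duration each frame) keeps the average rate at the cap even when individual frames vary in cost.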
Table 1: Hardware Spec used in testing