Common Misconceptions of Intel® Processor Graphics

NOTICE:  This article discusses obsolete hardware and has been retired.  To find out more about the latest Intel graphics, please see /en-us/articles/intel-graphics-developers-guides.

Intel® GMA X3000 Processor Graphics Tech Update

The Intel® Graphics Media Accelerator (GMA) X3000 meets the display needs of the majority of business and consumer users who don’t require expensive discrete 3D hardware. The graphics core is built into the chipset and integrated onto the motherboard. A first among Intel processor graphics chips, it can perform hardware transform and lighting. Embedded in the memory controller hub, it shares system memory with the operating system in a way that keeps system overhead low. Intel’s latest graphics core, the Intel® X3000 is integrated into the chipset and delivers substantial graphics performance at very little added cost to the end user. PC buyers have embraced this balanced approach to system design, and Intel graphics technology is one of the most widely adopted graphics solutions among PC users.

This article examines some of the common assumptions about Intel graphics technology and provides insights often overlooked by the trade press. It addresses software vendor concerns and presents information about the Intel X3000, including its features and support, and explains why such a widespread base graphics technology represents a compelling target platform. The hope is that targeting it will broaden your audience and improve the end-user experience as a whole.

What exactly is Intel Processor Graphics?

The Intel X3000 Chipset incorporates key features from previous Intel graphics generations, such as Dynamic Video Memory Technology (DVMT), as well as hardware acceleration for 3D graphics using Microsoft DirectX* 9.0c and OpenGL* 1.5. While the Intel GMA 3x00 parts (946GZ, Q963, Q965, G31, Q33, Q35, G33) support Shader Model 3.0 with software vertex shaders and hardware pixel shaders, the Intel X3000 Chipset supports Shader Model 3.0 with full hardware vertex and pixel shaders when used with the latest drivers.

The Intel GMA graphics core used in the Intel X3000 Chipset is the first implementation of this architecture and is referred to as Intel X3000 graphics. Future versions of the architecture will appear in next-generation chipset platforms with incremental feature and performance improvements.


When combined with Intel® Core™2 Duo processors, the Intel X3000 Chipset delivers fast system-level performance and responsiveness.

Intel X3000 Series Chipset Graphics


Table 1. Intel GMA X3x00 Series Capabilities (columns: GMA 3000, GMA 3100, GMA X3000, GMA X3100, GMA X3500). Rows compared: chipsets (GMA 3100: G31, Q33, Q35; GMA X3100: GM965, GL960), clock speed (MHz), vertex shader model, pixel shader model, pixel pipelines, unified shader processors, hardware vertex shaders ("Not officially supported" on some parts), peak memory bandwidth (GB/s), max video memory (MB), OpenGL support, DirectX API support, MPEG-2 hardware acceleration (motion compensation), and VC-1 hardware acceleration (MC for WMV9 only; MC + in-loop filter).


The Intel X3000 Series represents a significantly more powerful graphics core than previous generations of Intel graphics hardware. With discrete graphics devices, the graphics subsystem resides primarily on an add-in card on the PCI Express bus. With the Intel X3000, by contrast, the graphics subsystem can use the CPU in addition to its own GPU resources: the Graphics and Memory Controller Hub (GMCH), system memory, and the graphics core integrated into the chipset. With 3x00 series graphics, the CPU is used for the first stage of 3D processing (geometry operations), while the chipset handles the rest of the pipeline, thereby placing less reliance on the CPU. By using system memory for both system and graphics workloads, the Intel X3000 balances the two, yielding an optimal price/performance ratio for customers.

The Intel X3000 (G/GM965) features 8 unified shader processors running at 667 MHz, as opposed to the 4 pixel pipelines (at 400 MHz) of its predecessor in the Intel® 945G and Intel® 945GM chipsets. Like its predecessors, the Intel X3000 uses a shared memory architecture. Since memory is shared by graphics and other system applications, memory bandwidth is critically important for quality and performance. Hardware Zone, a PC-focused website, points out, “Not only does it run at a high clock speed of 667MHz, the X3000 finally features and supports hardware Transform and Lighting (T&L) units, Vertex Shader 3.0, Pixel Shader 3.0, Shader Model 3.0 (SM3.0), High Dynamic Range (HDR) and full 32-bit FP compute for graphics processing - all within the graphics engine.”

Intel® Processor Graphics Myths

For a number of years Intel has provided graphics performance that emphasized value. These solutions were appropriate for value conscious consumers and business users. With the Intel X3000 graphics core, Intel is targeting the capabilities required by the broad base of mainstream users. These users purchase a high volume of software and expect a playable experience with the majority of game applications available to them during their system’s lifetime. Intel is working to create hardware that provides additional performance and value for these customers. In this section, we discuss the most common misconceptions about Intel Processor Graphics.

Myth #1: 3D Games cannot run on the Intel processor graphics chipset

Based on older generation Intel Integrated graphics solutions, some conclude that it is difficult to run modern 3D games on Intel hardware. The Intel X3000 graphics core is an integrated solution that has demonstrated the capability to run some of the mainstream games at playable frame rates. Some examples include:

  • Age of Empires 3: War Chiefs Expansion* (Ensemble/Microsoft), 27 FPS
  • F.E.A.R.* (Vivendi), 47 FPS
  • Dungeons & Dragons Online* (Turbine/Atari), 54 FPS
  • Star Wars: Empire at War* (Lucas Arts), 32 FPS
  • World of Warcraft* (Blizzard/Vivendi), 29 FPS


While the most demanding gaming environments will still require add-in graphics cards for hard-core gaming, mainstream consumer experts agree that a playable experience is achievable with most games on the market using the Intel X3000 series.

Myth #2: Intel Has a Small Segment of the Graphics Market

Within the desktop market, Mercury Research estimates that Intel easily holds the majority of the processor graphics market. In 2006, Intel had the largest market segment share in processor graphics (with ~50%), which is more than 2x the volume of the next closest competitor. Graph 1 shows desktop 3D-capable accelerator market trends, on a unit basis, as projected into 2011. (Source: IDC, Mercury Research PC Graphics Report, John Peddie.)

Desktop Gfx (ku), Total Desktop, Mobile Gfx (ku), Total Mobile (unit-shipment figures not reproduced here)

Table 2: Intel Market Share and projected growth of processor graphics


Graph 1: Intel Market Share and projected growth of processor graphics

Table 2 and Graph 1 show the market segment share with respect to other companies and the projected growth over the next few years of the processor graphics market. In the past, it was understandable that application developers interested in graphics performance would look at the integrated solutions as not providing the features necessary for development of content meant for the mainstream gamer. However, with Intel X3000 Graphics support of Shader model 3.0, features needed to enable a rich graphics experience are readily available and are being enabled across the entire computing continuum. While it would not be reasonable to say that the hardest of hardcore gamers will be moving away from discrete graphics solutions in the near term, there is a trend toward Intel providing greater value to the budget and mid-range portion of the market. Intel continues to win market segment share with PC consumers in this space. Hence, supporting Intel X3000 Graphics can positively influence revenue by growing the available customer base that can have a fun and visually rich experience with your application.

Myth #3: Intel Graphics is too slow while performing Transform and Lighting


TnL stands for “transform and lighting” and refers to hardware transformation and lighting. Application developers often assume that a discrete solution’s hardware transform and lighting is a requirement for achieving a certain level of performance. However, the Intel X3000 graphics drivers route transform and lighting operations to either hardware or software depending on whether the GPU or CPU is better suited to the task, while still providing good application performance. By using the CPU or GPU on an “on demand” basis for TnL operations, the X3000 pipeline is optimized for the best balance of graphics workload.

The reality is that TnL bottlenecks are rarely the real cause of slowness in games; most often the culprit is pixel throughput. If there is a bottleneck in TnL, it may indicate a serious problem in the game’s architecture. To get to the bottom of the problem and determine where the issue actually resides, the Intel® VTune™ Performance Analyzer is an excellent tool.

As shown in Graph 2, Intel graphics performance keeps improving, especially relative to the competition.

General Graphics Performance

Graph 2. Intel processor graphics performance comparison

Source: Anandtech

Myth #4: Intel Graphics Drivers are Highly Problematic

It is a fact that there have been problems with Intel graphics drivers in the past. But the driver team has dedicated itself to fixing them, putting a process in place to track and resolve issues on a prioritized basis. Intel has dedicated engineers in the Software Solutions Group to handle reported issues, and ISVs can report issues through their Intel Relationship Managers. With the largest total market share and exceptional talent on hand, the vision for the next two to three years is for Intel graphics drivers to set the gold standard for the industry. To reach that level, the graphics community, including ISVs and gamers, must help ensure that any bugs encountered are reported to the driver team at Intel. To do this, follow these steps:

  • Download the latest Intel driver.
  • Ensure the bug is reproducible only on Intel X3000 Graphics by testing on parts from other vendors.
  • Capture a detailed description of the problem, along with images or a playback recording.
  • Create a reproducible case (e.g., a .PIXRun file).
  • Submit your test case to your company’s main Intel Corporation Relationship Manager; if your company doesn’t have one, it’s very likely that your publisher does. This should be the same person you turn to for CPU information and optimizations.
  • Another avenue developers can use publicly is our forum located at /en-us/forums/user-community-for-intel-graphics-technology/ - Processor Graphics Software Development.


How to Optimize Applications for Intel® X3000 Graphics

To optimize your applications for Intel X3000 Graphics and get the best performance and functionality, there are a number of things you can do:

  • Use the Intel® VTune™ Performance Analyzer to display performance data from the system-wide level down to the source level so that you can optimize across multiple operating system platforms, development environments, and the latest Intel processors. This tool can help determine the time spent in the graphics driver versus the time spent in the application, and it lets you easily see the “top 10” optimization candidates.
    • Download the latest Microsoft DirectX 9 SDK and its Symbols.
    • Associate the symbols in the Intel VTune Performance Analyzer.
    • Be sure to associate the symbol file with the binary file for Microsoft DirectX and the game executable.
  • Use Microsoft PIX* to find bottlenecks in your application and to determine the time spent in the DirectX API. Many Intel Graphics parts have a PIX plugin provided to monitor chipset and chipset driver graphics performance.



Now that you know the facts about the Intel X3000 family of graphics, how it is leading the market and providing a compelling visual experience and impressive performance with popular games, you can help educate other developers within your organization on why making Intel X3000 graphics a target platform for your application can expand your user community. Expand your knowledge of Intel Graphics by reading the GMA X3000 Development Guide.

This article sought to put to rest some misperceptions in the graphics community regarding Intel’s position in the graphics marketplace and the strength of its graphics technology. Intel X3000 graphics, the latest generation of Intel Graphics, shows major improvements in graphics performance and features. Intel Graphics is unique among competing graphics architectures in that it goes beyond the core itself, providing an integrated balance of all the platform components. Intel X3000 Graphics represents an increasing share of the graphics market, and Intel’s advances have made its graphics one of the top-selling solutions on the market. The result is a great user experience for mainstream PC users, with the additional benefit of low added cost. With Intel leading the graphics marketplace in volume, there is great incentive for ISVs to support this solution for applications targeted at mainstream PC users.

Additional Resources

To help dispel the myths surrounding the Intel X3000 Chipset and strengthen your knowledge of the topic, read the Intel GMA X3000 Development Guide.


About the Author

Chuck DeSylva is a Software Applications Engineering Manager in the Intel Software and Solutions Group. He and his team are responsible for the performance optimization of cutting-edge consumer software titles running on Intel desktop systems. Prior to working on application software optimization, Chuck was a driver developer for Intel Corporation, where he helped develop and deploy the first device drivers for USB, AGP (GART), and Intel’s first graphics devices (i740/810(e)).



