Computer History Museum Panel
Keeping it Simple
The first insight was that some of the work could be done on the PC's main processor instead of on the graphics card. The standard design at the time had the graphics card do all the processing needed to draw an image, which resulted in a very large and expensive system. 3dfx realized that the Intel® Pentium® 90 processor had become fast enough to take care of the geometry phase of the rendering. This meant it was possible to build a graphics card that only did rasterization – which cut down the size and cost quite a bit.
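To make the division of labor concrete, here is a minimal sketch of that split, in C like the 3dfx simulator itself: the CPU runs the geometry phase and hands finished screen-space triangles to the card, which only rasterizes. All names and numbers here are made up for illustration – this is not the actual 3dfx interface.

```c
#include <stdio.h>

typedef struct { float x, y, z; } Vertex3;   /* world/view space */
typedef struct { float x, y, w; } Vertex2;   /* screen space, w = 1/z */

/* Geometry phase, run on the host CPU: perspective-project one vertex. */
static Vertex2 project(Vertex3 v, float focal, int width, int height)
{
    Vertex2 out;
    float inv_z = 1.0f / v.z;                 /* perspective divide */
    out.x = width  * 0.5f + focal * v.x * inv_z;
    out.y = height * 0.5f - focal * v.y * inv_z;
    out.w = inv_z;            /* kept for perspective-correct texturing */
    return out;
}

/* Stand-in for the card: it only ever sees screen-space triangles. */
static void rasterize_triangle(Vertex2 a, Vertex2 b, Vertex2 c)
{
    printf("raster: (%.1f,%.1f) (%.1f,%.1f) (%.1f,%.1f)\n",
           a.x, a.y, b.x, b.y, c.x, c.y);
}

int main(void)
{
    Vertex3 tri[3] = { {-1.0f, -1.0f, 5.0f},
                       { 1.0f, -1.0f, 5.0f},
                       { 0.0f,  1.0f, 4.0f} };
    /* The CPU transforms; the card rasterizes. */
    rasterize_triangle(project(tri[0], 256.0f, 640, 480),
                       project(tri[1], 256.0f, 640, 480),
                       project(tri[2], 256.0f, 640, 480));
    return 0;
}
```

The point of the split is that everything above rasterize_triangle() runs fine on a Pentium 90, so the card never needs floating-point geometry hardware at all.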
The second insight was that good enough was indeed good enough. They targeted games and games only – not CAD applications, where quality is paramount and you really want a line engine that can draw anti-aliased lines, and not general-purpose Windows desktops, where 2D drawing is a must. There was no room for such features given the targeted cost and price. Games are nice to optimize for, since in a game it is more important that things move smoothly than that every detail is perfectly rendered. This made it possible to dial back on bit depths, saving memory and bandwidth.
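To illustrate what dialing back bit depth buys you: packing colors as 16 bits per pixel instead of 32 halves the framebuffer memory and the bus traffic per pixel. A small sketch of the classic 16-bit RGB565 packing (the function name is made up):

```c
#include <stdint.h>
#include <stdio.h>

/* Pack an 8-bit-per-channel color down to 16-bit RGB565: 5 bits red,
 * 6 bits green, 5 bits blue. Half the memory and bandwidth per pixel
 * compared to a 32-bit framebuffer. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    printf("0x%04x\n", pack_rgb565(200, 100, 50));   /* prints 0xcb26 */
    return 0;
}
```

In a fast-moving game scene, the lost low-order color bits are far less noticeable than a dropped frame would be.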
The ideas sound good – but how do you prove that they work?
Simulating to Prove and Demonstrate the Idea
The simulator was coded in C and ran on the same Pentium 90 machine as the geometry phase. In this way, a single PC could also work as a demo system, no extra hardware needed. By their own account, they lugged a PC around quite a bit for live demonstrations using the simulator.
The simulator was a lot slower than the hardware would be, for obvious reasons, so in order to show the graphics in motion they would often use it offline – render a sequence of frames from a demo, and then play it all back at full speed as a video. For a demo to a set of potential investors, the recorded video was the right solution. It showed what the product would be able to do, and 3dfx got funded.
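A minimal sketch of such an offline workflow, with an entirely made-up toy "simulation": each frame is rendered by the slow simulator, dumped as a numbered image file, and the sequence is later assembled into a video for full-speed playback.

```c
#include <stdio.h>
#include <stdint.h>

enum { W = 320, H = 240, FRAMES = 100 };
static uint8_t frame[H][W][3];

/* Stand-in for one step of the (slow) software simulator. */
static void simulate_frame(int n)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            frame[y][x][0] = (uint8_t)(x + n);   /* a moving gradient */
            frame[y][x][1] = (uint8_t)(y + n);
            frame[y][x][2] = (uint8_t)n;
        }
}

/* Dump the frame as a numbered binary PPM image. */
static void dump_frame(int n)
{
    char name[32];
    snprintf(name, sizeof name, "frame%04d.ppm", n);
    FILE *f = fopen(name, "wb");
    if (!f)
        return;
    fprintf(f, "P6\n%d %d\n255\n", W, H);
    fwrite(frame, 1, sizeof frame, f);
    fclose(f);
}

int main(void)
{
    for (int n = 0; n < FRAMES; n++) {   /* however long it takes... */
        simulate_frame(n);
        dump_frame(n);    /* assemble the files into a video afterwards */
    }
    return 0;
}
```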
Using a simulator as a demonstration tool to prove an idea is a great way to get ideas in front of people quickly. A simulator can be built with far fewer resources than the real thing, and in much less time.
Simulation as an Architecture Tool
From the transcript:
Sellers: I'd say that was the simulator we mentioned before. This thing was all software that Gary created; it was sort of the research part of how we would develop all of it. And Gary would map an algorithm the right way.
Sellers: I remember it had all these different--
Tarolli: Oh, yeah. Do it the right way.
Sellers: --flags, where you could do full floating point calculations and do everything kind of the SGI right way. And then Gary would use that as a kind of apples to apples comparison against OK, here's the cheap way. And since this was all about gaming and consumer use, there wasn't a perfect answer, right? Because it ultimately comes down to does it look good enough? And that's very subjective. And so you could really go to the extreme of when you can start visually seeing artifacts and visually seeing something that's not quite right. And then you just kind of come back slightly from that.
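The methodology Sellers describes – compute the answer the "right way", compute it the cheap way, and measure how far apart they end up – could look something like this sketch. The blend operation and the 4-bit blend factor are hypothetical stand-ins, not actual 3dfx design parameters:

```c
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <math.h>

static int full_precision = 1;   /* the kind of flag Sellers describes */

/* Blend two 8-bit color channels with weight t in [0,1]. */
static uint8_t blend(uint8_t a, uint8_t b, float t)
{
    if (full_precision)                          /* the "SGI right way" */
        return (uint8_t)lroundf(a + (b - a) * t);
    /* the cheap way: a 4-bit blend factor and integer math only */
    int ti = (int)(t * 15.0f + 0.5f);
    return (uint8_t)(a + ((b - a) * ti) / 16);
}

int main(void)
{
    int worst = 0;
    for (int a = 0; a <= 255; a += 5)
        for (int b = 0; b <= 255; b += 5)
            for (int i = 0; i <= 10; i++) {
                float t = i / 10.0f;
                full_precision = 1;
                int ref = blend((uint8_t)a, (uint8_t)b, t);
                full_precision = 0;
                int cheap = blend((uint8_t)a, (uint8_t)b, t);
                int err = abs(ref - cheap);
                if (err > worst)
                    worst = err;
            }
    printf("worst-case channel error: %d out of 255\n", worst);
    return 0;
}
```

The number this prints is exactly the kind of apples-to-apples input the quote describes: you cheapen the math until the error becomes visible on screen, then back off slightly.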
Simulation Driving Chip Testing
That design and demonstration software came in handy once the actual chips started to show up! It was basically repurposed as a test and validation vehicle. From the interview:
Sellers: And we keep talking about this simulator that Gary had written. So Gary had this ability to, instead of doing all the rendering in the pure software simulator, strip all that out and send the real commands down to the actual graphics hardware. So when we got the hardware back, pretty quickly we could actually do something interesting with it.
What this meant was that you could pull traffic out of the simulator not just at the point where the software talked to the hardware, but also at boundaries inside the hardware pipeline. For this first test, they only had the framebuffer chip, the last step in the pipeline. Using the simulator, they could drive that chip with the commands it would normally receive from the first chip in the Voodoo two-chip pipeline, and validate that it worked before they had a complete hardware platform.
The real chip was given a set of inputs from a real program, via a simulation of the missing parts of the hardware. This proved that the hardware did indeed work as intended – and seeing real pixels drawn on a real screen gave an amazing sense of validation.
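One way to structure a simulator so that traffic can be tapped at an internal hardware boundary is to route commands through a function pointer: the same command stream either feeds the software model of the framebuffer chip or gets written to the real device. A hedged sketch with made-up names, not the actual 3dfx code:

```c
#include <stdint.h>
#include <stdio.h>

typedef struct { uint32_t opcode; uint32_t payload; } FbCommand;

/* Software model of the framebuffer chip -- the original simulator path. */
static void sim_fb_execute(const FbCommand *cmd)
{
    printf("sim: op=0x%08x payload=0x%08x\n",
           (unsigned)cmd->opcode, (unsigned)cmd->payload);
}

/* Real-hardware path: write the same command into the chip's FIFO.
 * fb_fifo would be set up elsewhere to point at the mapped device. */
static volatile uint32_t *fb_fifo;
static void hw_fb_execute(const FbCommand *cmd)
{
    fb_fifo[0] = cmd->opcode;
    fb_fifo[1] = cmd->payload;
}

/* The rest of the simulator sends commands through this pointer, so
 * swapping the real chip in for its software model is a one-line change. */
static void (*fb_execute)(const FbCommand *) = sim_fb_execute;

int main(void)
{
    /* With real silicon on the bench: fb_execute = hw_fb_execute; */
    FbCommand clear = { 0x01u, 0x0000ffffu };
    fb_execute(&clear);
    return 0;
}
```

Everything upstream of the boundary – the application, the API, the geometry phase, the simulated first chip – runs unchanged either way, which is what made the early bring-up so quick.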
Simulation Enabling the Ecosystem
Enabling the ecosystem took two parts. One was developing a set of feature demos and examples to show developers how to use the API and what they could do with the hardware. The other was making the API available to a few selected developers so they could start development before the hardware was available. The solution to the latter was a bit ironic – 3dfx bought a few high-end graphics workstations and throttled them down to match the performance of the upcoming graphics cards. This let game developers tune the performance and graphics quality of their games to the specific capabilities of the eventual hardware.
Watching the whole panel is highly recommended, as it mixes technology history with business insights. In the end, 3dfx basically created the PC graphics model as we know it today, with a GPU sitting alongside the CPU (usually on the same SoC). They took the PC into the professional flight simulator market and dethroned the old workstations in that space. Apparently, they even got their chips into cockpit avionics displays!