HDMI Audio: Intel's Biggest Little Secret In Home Theater PCs

I won't say my last post here was harsh-- it was heartfelt and survives a reread without me flinching-- but I wanted to be fair and illustrate why I think it's a big issue. What competitive advantage are we squandering / have we squandered here with HDCP issues over HDMI?


I won't go too deep into the history of HDMI and HDCP in PCs; I'll just summarize and say that early HDMI-enabled graphics cards from our competition were missing HDCP protection, and there were threats of lawsuits because they originally advertised their chips (not the actual end-user cards, mind you, their chips) as "HDCP-ready".


Though Intel was comparatively late to the game on HDMI, as far back as the 945G we supported HDMI through the use of conversion chips from our partners. These chips sit on the SDVO bus (as I've mentioned earlier, this is an interface which is muxed into PCIe from our northbridge) and convert the SDVO video into the HDMI signalling protocol.


Keep in mind that when the SDVO-HDMI chips (most common are the Chrontel 7315 and the Silicon Image 1390) were being designed, the released HDMI spec was at 1.0, and much of the audio portion of the 1.1 spec (which added a lot of audio capabilities) was still a work in progress.


I don't know which individual at Intel is responsible for deciding to ask the SDVO chip vendors to include an Azalia (aka Intel HD Audio) interface on the SDVO chip, but I hope that person got lots of stock options. It was a risky idea ahead of its time, and although it was ultimately flawed by content protection issues which arose after the design had already been finalized, it still provides Intel with a competitive advantage:


For the past two years, only Intel's HDMI audio solution has offered 7.1 channels of lossless, high-def sound.


I write this on April 25, 2008, and that advantage is barely publicized but already falling away.


What happened?


Technical History


Speculating on what goes on in the conference rooms of our competitors is ultimately a fruitless exercise, but one could guess that, being graphics companies, they simply de-prioritized the audio portions of HDMI in favor of the video. Went with the part they understood best, in other words.


I can sympathize-- I'm alternately surprised and impressed that we didn't make the same mistake.


Some technical discussion is in order. HDMI was designed as a single-cable solution for audio and video. The video passes in much the same way it has on older interfaces like DVI and VGA: draw a screen line by line, finish the screen, wait an instant, and then start drawing the next screen; do this 50 or 60 times per second. In HDMI, the "wait an instant" portion of the time (known as the "blanking interval" by video nerds) is actually long enough to pass a significant amount of data, and it's referred to as the Data Island.
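To put a rough number on "long enough", here's a back-of-envelope calculation using the standard 1080p60 timing (2200x1125 total pixels per frame at 60 frames per second, 1920x1080 of them visible). It ignores guard bands, packet headers, and control periods, so treat it as an illustration of scale rather than a real bandwidth budget.

```python
# Back-of-envelope: how much "wait an instant" time does 1080p60 leave?
# Numbers are the standard 1080p60 timing; data-island overhead (guard
# bands, packet headers, control periods) is ignored for simplicity.

total_h, total_v = 2200, 1125      # total pixels per line / lines per frame
active_h, active_v = 1920, 1080    # visible pixels per line / lines per frame
fps = 60

total_clocks = total_h * total_v * fps          # 148.5 million pixel clocks/s
active_clocks = active_h * active_v * fps       # 124.4 million
blanking_clocks = total_clocks - active_clocks  # ~24.1 million clocks/s of blanking

print(f"pixel clock:     {total_clocks/1e6:.1f} MHz")
print(f"blanking clocks: {blanking_clocks/1e6:.1f} M per second "
      f"({100*blanking_clocks/total_clocks:.0f}% of the link)")
```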


The HDMI spec allows the Data Island to be stuffed with all sorts of things, but germane to this discussion is that as of HDMI 1.1 it can be filled with audio packets.


All sorts of audio packets. A typical CD player puts out digital audio in a format called Linear Pulse Code Modulation (LPCM): up to 2 channels' worth of data at 16-bit resolution and a 44.1 kHz sampling frequency. That's allowed over HDMI. A DVD player can put out 2 channels of LPCM at 24-bit resolution and a 96 kHz sampling frequency. HDMI permits this, too. DVD players can also put out 6-channel lossy compressed sound in the form of Dolby Digital or DTS. HDMI is pleased to offer this service as well.


Of course, you can get all this over the old S/PDIF protocol (the optical or coaxial digital cable which comes out of a DVD player and goes into a receiver or TV), and lots of people do. The real benefit to HDMI audio is the ability to transfer massively more sound data than S/PDIF was designed for.


With HDMI 1.1, you can send 8 channels of lossless, uncompressed sound in the form of LPCM at up to 24-bit resolution and a 192 kHz sampling rate. Even those crazed audiophiles who think the sound of old vinyl records still beats the digitized CD versions concede that at that resolution and sampling rate there is no way for the human ear to distinguish the recording from analog. It's the Holy Grail of multichannel audio, essentially equivalent to or better than the studio masters used to make the film prints that go to theaters.
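For a sense of how much bigger "massively more" really is, here's a quick payload-only data-rate calculation for the LPCM formats mentioned above (no framing or error-correction overhead included):

```python
# Raw (uncompressed) LPCM data rates for the formats mentioned above.
# rate = channels * bits per sample * samples per second; framing overhead ignored.

def lpcm_mbps(channels, bits, sample_rate_hz):
    return channels * bits * sample_rate_hz / 1e6

formats = {
    "CD (2ch, 16-bit, 44.1 kHz)":           (2, 16, 44_100),
    "DVD stereo (2ch, 24-bit, 96 kHz)":     (2, 24, 96_000),
    "HDMI 1.1 max (8ch, 24-bit, 192 kHz)":  (8, 24, 192_000),
}

for name, args in formats.items():
    print(f"{name:38s} {lpcm_mbps(*args):6.2f} Mbit/s")
```

The 8-channel figure works out to roughly 37 Mbit/s; a typical consumer S/PDIF link carrying stereo at 48 kHz moves on the order of 3 Mbit/s, which is why the full-resolution multichannel formats simply don't fit there.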


It's nice, and it's only in the past three years or so that there have been HDMI receivers available which can actually process that level of sound; prior to that, high-definition multichannel sound had to be converted to analog and sent to the receiver that way. That worked, but the quality of the result was always limited by the Digital-to-Analog Converters (DACs) which convert high-def sound data to analog signals-- and the DACs in most motherboard solutions are passable but not great. With an HDMI solution, the data is sent digitally and the receiver's DACs are used, and generally speaking the DACs found in an A/V receiver are going to be significantly better than ones chosen by a motherboard vendor trying to save a buck or two. (I'm not criticizing... that's just the difference between business models for the two industries.)


HDMI, then, is really the most efficient way to pass high-end multichannel audio from the PC to the receiver. As mentioned before, older digital methods like S/PDIF are limited to 2 channels or compressed 6-channel sound which (though it typically sounds very good) loses something in the translation. HDMI is the interface of choice for those who want to take advantage of the high-resolution multichannel sound formats found on DVD-Audio, HD DVD, and Blu-ray discs.


The Ball: Dropped


Despite the fact that HDMI 1.1 (and greater) has been spec'd out for several years, video cards have, at best, provided S/PDIF-level sound. A good part of this may be the focus of their business (video rather than audio), and another part of it is the "who cares?" factor: S/PDIF audio is good enough for most people. Humans are very visually oriented in general, and it's far easier to notice compression artifacts in an MPEG-2 image than it is to notice the equivalent in a Dolby Digital soundtrack. That doesn't mean the audio artifacts are not there, annoying those who can hear them, but it's evident to those in the industry that high quality is not the highest goal of most PC listeners: MP3s, flawed as they may be, are way more popular than high-def audio formats like DVD-Audio or Sony's SACD.


But that doesn't mean the market for a high-def audio solution does not exist, just that it's a niche market. You'd think the sound card folks would have stepped in, right? That is their business, after all. Alas, no; the HDMI spec assumes you're sending video along with the audio, and evidently the sound card makers decided it was not worth the bother.


So: HDMI is the best way to transmit high-definition audio, but video card makers-- even when they do send the audio-- don't support the high-def specs and audio card makers by and large just aren't playing along.


Enter the Intel solution. Late to the HDMI game, and only on motherboards (no discrete cards yet), Intel piggybacked its HDMI implementation on the pre-existing Intel HD Audio functionality by hooking the SDVO chips into the HD Audio bus on the motherboard as just another codec. (For those not familiar with Azalia-speak, essentially the SDVO-HDMI chip is recognized as just an extra sound device to write to-- just like a Realtek or Sigmatel or other normal audio chip found on the motherboard.)
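To make the "just another codec" point concrete, here's a minimal sketch of what that looks like from the software side, assuming a Linux box running the HD Audio (Azalia) driver and ALSA's standard /proc layout; on Windows the same thing simply shows up as an additional audio playback device. The paths and the enumeration approach here are illustrative, not part of any Intel driver.

```python
# List the HD Audio codecs the driver has enumerated, assuming ALSA's
# standard /proc layout. On an Intel SDVO-HDMI board the HDMI audio device
# appears here as one more codec alongside the usual analog codec.

import glob

for codec_file in sorted(glob.glob("/proc/asound/card*/codec#*")):
    with open(codec_file) as f:
        for line in f:
            if line.startswith("Codec:"):
                # e.g. "Codec: Realtek ALC888" for the analog codec, plus a
                # separate entry for the HDMI audio codec.
                print(f"{codec_file}: {line.strip()}")
                break
```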


There have been bugs (and still are: the repeater mode bug I mentioned in my previous post is one of two serious ones), but to a pretty good approximation It Just Works. It may sound hard to believe, but a not-insignificant number of HTPC people are buying Intel integrated graphics solutions for this reason alone-- full-resolution 6- and 8-channel audio simply cannot be passed digitally in any way other than on certain specific Intel platforms.


That's a win, but it's been a pretty quiet one. I've been crowing about it among the enthusiast crowd ever since I figured it out (the functionality is buried in our graphics drivers and not exactly obvious unless you know what you're looking for), and it's pretty well known now in those circles, but I haven't seen Intel as a company using it for bragging rights.


Given some of the other stuff we do brag about, I'll admit this is a bit mysterious to me, but I'll chalk it up to internal ignorance: even our own customer support folks don't seem to know the functionality is there, and in some cases they don't believe our customers who ask for assistance with it. The sound chip drivers are, after all, the responsibility of Realtek or Sigmatel-- that's what we've been instructing customer support to say for years-- and Intel doesn't offer assistance for those. The end user must be mistaken when he or she claims that there's an Intel-generated and -supported HDMI Audio driver, right?


Bottom line: excellent implementation, kudos to the Intel HDMI Audio driver team. Poor publicity. I'm sure when it debuted in the 945G timeframe there was little to crow about-- back then you could barely even find a receiver which could accept the audio over HDMI. But once HD DVD (RIP) and Blu-ray came out and the G965 was capable of playing them, I think we probably should have started evangelizing our solution more in the press.


"The only way to get the full audio you deserve." Or something like that. See, there's a reason I'm not in Marketing.


The Future


I mentioned before that our lead here is going away; it seems at least one of our competitors is finally claiming to supply the same full audio capabilities over HDMI on its latest motherboard chipset. The functionality is still not working in its drivers, which is why I say "claiming", but let's be conservative and assume it's only a matter of time. We can only assume that our other competitor will follow sooner rather than later.


Advantage lost, apparently.  :(  Are there ways we can still provide added value in this space?


The answer is yes, but time is of the essence, because dollars to donuts the competition is working on this, too.


The HDMI spec has been revised twice since the initial 8-channel LPCM capability was added. Rev 1.2 added support for natively passing SACD's one-bit DSD streams without decoding them into LPCM first (DVD-A's multichannel LPCM was already covered), and Rev 1.3 added support for natively passing the new high-def lossless formats (Dolby TrueHD and DTS-HD Master Audio) used in Blu-ray. Any player worth its salt can already decode these new formats into LPCM, so why bother sending the original bitstream instead?


The answer lies in various "helpful" things which get done to the sound once it leaves a Blu-ray disc. In order to make sure the audio from the movie can be interrupted seamlessly by pleasant sounds like the Windows "warning" noise or the "you've got mail" sound, the LPCM isn't passed directly to the audio port-- it's mixed in with whatever other sounds are "needed".
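Here's a toy illustration of why that matters, with made-up 16-bit sample values; it's not how any particular mixer is implemented, but it shows that the moment another stream is summed in, the bits that reach the HDMI port are no longer the bits that came off the disc:

```python
# Why "mixed in" matters: once any other stream is summed with the movie's
# LPCM, the samples sent to the HDMI port are no longer the samples on the
# disc. Toy 16-bit integer samples; values are made up.

movie  = [12000, -8000, 30000,  25000]   # pristine decoded movie samples
system = [    0,     0, 15000, -20000]   # an OS "ding" arrives mid-stream

mixed = []
for m, s in zip(movie, system):
    total = m + s
    # The mixer has to keep the result in 16-bit range, so it clips (or, in a
    # real mixer, attenuates everything) -- either way the original is altered.
    mixed.append(max(-32768, min(32767, total)))

print(mixed)   # [12000, -8000, 32767, 5000] -- no longer bit-exact
```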


Few enthusiasts trust the player software to leave the sound alone, and even fewer trust Windows to do the same. (And they have reason not to: Microsoft's kmixer (on XP and before) historically munged the sound enough that many sound card designers bypassed it and wrote their own software instead.) If you can send the raw undecoded bitstream, on the other hand, you are exempt from this tampering. The HDMI 1.3 spec is what enables this, and no solution currently exists to do this on the PC. If Intel wins the race to that functionality, it can retain competitive advantage in the audio space.


Beyond this, there is the matter of what Microsoft has dubbed the "Protected Audio-Video Path" or "PAVP".  Basically, the content owners have set requirements on what a player needs to do in order to maintain control of protected material, and Microsoft has translated these requirements into hardware and driver requirements for graphics/audio suppliers (including Intel).  When you get to the bottom of the matter, what it essentially means is that all audio and video need to be encrypted the entire time they're in the PC until they are sent out an analog port or another encrypted interface like HDCP-protected HDMI or DVI.  If this encryption cannot be maintained over the entire path, the player software is required to artificially reduce the quality of the audio before sending it out on the unprotected path.
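As a rough sketch of what "artificially reduce the quality" means in practice, imagine taking a 24-bit/96 kHz stream down to 16-bit/48 kHz before it heads out over the unprotected path. The naive decimation and truncation below are mine for illustration (a real resampler filters first), but the information loss is the point:

```python
# Rough illustration of audio downconversion: 24-bit/96 kHz -> 16-bit/48 kHz.
# Naive decimation and truncation for clarity; a real resampler would apply
# an anti-alias filter and dither, but information is lost either way.

samples_24_96 = [0x123456, -0x0ABCDE, 0x7FFFFF, 0x000001, -0x400000, 0x1FFFFF]

# Drop every other sample: 96 kHz -> 48 kHz.
samples_48 = samples_24_96[::2]

# Throw away the low 8 bits of each sample: 24-bit -> 16-bit.
samples_16_48 = [s >> 8 for s in samples_48]

print(samples_16_48)   # the fine-grained detail in those low bits is gone
```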


Right now, there are so few audio solutions which accommodate this that all playback software for Blu-ray downconverts all audio to lower quality, though you might not notice it unless you're a sound fanatic.  But later this year, the player software folks will be modifying their players to pass the full audio over "protected" interfaces.  Intel needs to ensure it's in this space-- right now, doing everything over the (non-encrypted) Intel HD Audio bus, we're in the same boat as everyone else in terms of downconversion.


This article was longer than I expected or even wanted it to be.  The message, if you've bothered to read this far, is: Intel was way ahead in the HDMI audio game, but the competition has almost caught up.  For a decisive win with HTPC enthusiasts, we need to ensure our HDMI audio solutions are ready for undecoded bitstream transfer of Dolby TrueHD and DTS-HD Master Audio, and at the same time support whatever PAVP craziness Microsoft has concocted this week.


EDIT: Just a quick note: in my "The Future" section, I should have pointed out that there are a couple of sound card manufacturers now working on releasing a "passthrough", where you feed HDMI video from a video card into your sound card, which adds the audio and then resends it all out over another HDMI port.  This should work, but it doesn't change the need for Intel to hit this space in its integrated graphics solutions.

