When The Perfect Is The Enemy Of The Good
I've already discussed the benefits of Intel Graphics and HDMI Audio in a previous post, and complained about the HDCP repeater mode bug (still unresolved as of graphics driver release 15.9.2) that forces people to use gray-market software if they want to play Blu-ray discs through a receiver with Intel HDMI Audio. But there's still one more nit to pick: there is a very prominent, mid- to high-end consumer electronics supplier whose receivers are still not playing ball with Intel HDMI Audio. Let's explore why this is, and-- at least philosophically-- how I think it should be fixed.
What Can Your Receiver Do?
With HDMI, we finally have a two-way path for communication between the source and endpoint devices in the consumer electronics space. This means a lot for the Digital Rights Management crowd, but it also enables something pretty cool for end customers in general: a source device can query the endpoint device and tailor its output to the best that endpoint can handle.
Computer monitors have been doing this over VGA for years, and later over DVI. There is, in fact, a special data structure called the EDID (Extended Display Identification Data). The source sends a command to the monitor/television to send its EDID, and the monitor/television responds with a 128-byte message which details the native resolution of the screen, any specific timings it likes best, etc.
For HDMI, this structure has been extended further with another 128 bytes. These bytes supply more specific timings and, most relevant to our discussion here, a list of which audio possibilities the end device supports. In the normal case of a TV as the end device, the report back is typically "LPCM 2.0" (lossless stereo sound) or at best "Dolby Digital 5.1" (five channel lossy compressed sound). When you place a nifty new audio receiver in the signal path (as an HDMI "repeater"), however, the receiver itself intercepts the TV's EDID and adds its own information before sending it back to the source device (DVD player or Blu-ray player or PC). The specifics of how this is done are discussed in the consumer electronics spec CEA/EIA-861D, and summarized in the EDID Wikipedia article.
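For the curious, here's a rough sketch (in Python, and definitely not anyone's shipping driver code) of how a source device might walk that extra 128-byte CEA extension to find the audio blocks. The field offsets follow the publicly documented CEA-861 layout; the function name and the lack of error handling are mine, and it assumes the CEA extension is the first extension block:

    def audio_data_blocks(edid):
        cea = edid[128:256]              # assume the CEA-861 extension is extension block 1
        assert cea[0] == 0x02            # CEA-861 extension tag
        dtd_offset = cea[2]              # where the detailed timing descriptors begin
        blocks = []
        i = 4                            # the data block collection starts at byte 4
        while i < dtd_offset:
            tag = cea[i] >> 5            # bits 5-7 of the header byte: block type (1 = audio)
            length = cea[i] & 0x1F       # bits 0-4: payload length in bytes
            if tag == 1:
                blocks.append(cea[i + 1 : i + 1 + length])
            i += 1 + length
        return blocks

Each returned payload is just a run of Short Audio Descriptors, which is where things get interesting below.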
Theoretically, this should allow the source device (in this case, the PC) to send the best possible audio your TV or receiver can decode. Simple, right?
How Does Your PC Handle It?
Let's ignore the graphics case, for now; I can (and eventually will) write an entire article about the lies EDID can tell about TV/monitor resolutions. As ever, let's hit audio.
As discussed in a previous post, Intel's HDMI Audio solution is a bizarre intersection between Intel HD Audio (aka The Artist Formerly Known as "Azalia") and Intel's SDVO video. The interactions between the operating system (using Microsoft's Unified Audio Architecture or UAA) and the Intel HD Audio hub in the southbridge are complex enough before you add HDMI. Suffice it to say that, in a roundabout way, the operating system asks the audio subsystem what it can do so it knows how to send any sounds without something screechy and horrible erupting from your speakers.
Taking this further, in order to report this back to the OS, Intel's HDMI Audio needs to get the ELD (or "EDID-Like Data") from the end device. It does this by... you guessed it... querying the connected device(s) for EDID.
So the path in a typical configuration is:
OS --> (query for capabilities) --> HD Audio Bus --> HDMI Audio chip --> (query for ELD) --> Graphics drivers --> (query for Receiver EDID) --> Receiver --> (query for TV EDID) --> TV
The TV gets the query, reports back what it can do, the receiver adds on its own capabilities, and the answer cascades back to the OS, which reveals in some Audio Properties window what the HDMI Audio is permitted to send to the downstream devices.
Slightly insane, and with this many players you can see the potential for software or firmware breakdown. Needless to say, in order to protect you (and Microsoft's legal department) from sound which will rupture your expensive speakers, the OS will not send sound in any format that the HDMI audio chip does not explicitly state it supports. Therefore, the UAA driver in the OS, the HD Audio Bus driver, the HDMI audio driver, the graphics driver, the receiver firmware, and the TV's firmware are all links in a chain which can mess up the audio capabilities.
In this case, again, we must focus. Pretty much everything in this path works, miraculously enough, but there is one corner case that does not, and it's an ugly one. Denon is a very popular brand of AV receiver-- especially in the demographic that is currently messing around with HDMI audio: the early adopters and the audiophiles. Hooking up a sparkling new Denon 7.1 channel receiver to your nifty Intel HDMI audio solution will net you... 2 channel stereo. Same as most TV sets. Other receivers (Onkyo, Yamaha, etc.) don't have this difficulty.
Where in the signal path is the problem?
Ambiguous Specs and Ambiguous Drivers
The problem, after much debugging by me and some second-level folks in our support group, is that Denon does things differently in its EDID than other consumer electronics manufacturers, and the way Intel's drivers handle it is not helping.
Which puts it somewhere in here:
HDMI Audio chip --> (query for ELD) --> Graphics drivers --> (query for Receiver EDID) --> Receiver
Per spec, the audio data in the EDID is found in one or more Short Audio Descriptors ("SADs"). The way most manufacturers do this is to have a single "tag" byte followed by a number of SADs. For instance, an Onkyo receiver enumerates its audio with:
38 09 7f 07 0f 7f 07 17 07 50 3f 06 c0 4d 02 00 57 06 00 5f 7e 01 67 7e 00
The "tag" byte tells how many bytes follow which are SAD data (in this case, 24), and the remaining bytes are three-byte SADs which detail stuff like which formats are supported (like Dolby Digital, DTS, LPCM, WMA Pro, MLP), which bit rate and depth they are supported at, and how many channels can be played back. In this case the Onkyo parses out as:
LPCM 2 Channel Sound, Frequencies: 192kHz, 176kHz, 96kHz, 88kHz, 48kHz, 44kHz, 32kHz, Bit depth: 24 bit, 20 bit, 16 bit
LPCM 8 Channel Sound, Frequencies: 192kHz, 176kHz, 96kHz, 88kHz, 48kHz, 44kHz, 32kHz, Bit depth: 24 bit, 20 bit, 16 bit
AC-3 8 Channel Sound, Frequencies: 48kHz, 44kHz, 32kHz, Max bitrate: 640 kbps
DTS 8 Channel Sound, Frequencies: 48kHz, 44kHz, Max bitrate: 1536 kbps
SACD 6 Channel Audio
Dolby Digital+ 8 Channel Sound
DTS-HD 8 Channel Sound
MLP/Dolby TrueHD 8 Channel Sound
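To make the bit-twiddling concrete, here is a hedged little sketch of how one three-byte SAD breaks down, again per the public CEA-861 field layout; the helper name and format table are my own shorthand, not anything out of Intel's drivers:

    SAD_FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS"}   # a few of the defined format codes

    def decode_sad(sad):
        fmt = (sad[0] >> 3) & 0x0F           # bits 3-6 of byte 0: audio format code
        channels = (sad[0] & 0x07) + 1       # bits 0-2: max channels minus one
        if fmt == 1:                         # LPCM: byte 2 is a bit-depth mask (16/20/24)
            extra = "bit depth mask 0x%02x" % sad[2]
        else:                                # compressed formats: byte 2 is max bitrate / 8 kbps
            extra = "max bitrate %d kbps" % (sad[2] * 8)
        # byte 1 is a bitmask of supported sample rates (32/44.1/48/88.2/96/176.4/192 kHz)
        return SAD_FORMATS.get(fmt, "code %d" % fmt), channels, extra

    # The Onkyo block's first SAD, 09 7f 07, comes back as ('LPCM', 2, 'bit depth mask 0x07'),
    # matching the "LPCM 2 Channel Sound" line above.
    print(decode_sad(bytes([0x09, 0x7F, 0x07])))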
What happens with Denon? They take a slightly different approach, preceding each SAD with its own "tag" byte:
23 0d 1f 07 23 09 7f 07 23 3d 1f c0 23 15 1f 51
Note the recurring "23". That's "This Audio block has 3 bytes (1 SAD)". The first one is 8 channel LPCM, the second is 2 channel LPCM, the third is 6 channel DTS, and the fourth is 6 channel AC-3.
Herein lies the problem. Intel audio drivers sample the ELD and see the first Audio block has an LPCM value of 8 channel capabilities and are thrilled to offer this... then they keep reading and see there is a completely new Audio block, with a new set of SADs. This second Audio block has a SAD with an LPCM value of 2 channels... which overwrites the 8-channel value. Oops. Now the drivers report back that the receiver can only do 2 channels and the OS will not even offer 7.1 channel LPCM as an option. Wouldn't want to frighten the Denon with sound it can't handle, after all.
At this point, I am uncertain whether the error is made at the HDMI audio driver level or the Intel Graphics driver level-- does the Graphics driver parse the EDID and pass along an ELD with 2-channel values, or does the Graphics driver send the entire EDID, leaving the Audio driver to overwrite the 8-channel value with the 2-channel one all on its own? No idea. But Intel's system does not cope well with Denon's "special" way of doing things.
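Wherever the parse goes wrong, the effect boils down to something like the difference between these two loops. This is purely my illustration-- I have no visibility into the actual driver source-- and the sample bytes are hypothetical Denon-style values (an 8-channel LPCM SAD in one audio block, a 2-channel one in the next), not a dump from a real receiver:

    def lpcm_channels_last_wins(sads):
        channels = 2
        for sad in sads:
            if (sad[0] >> 3) & 0x0F == 1:                 # LPCM SAD
                channels = (sad[0] & 0x07) + 1            # later blocks overwrite earlier ones
        return channels

    def lpcm_channels_merged(sads):
        channels = 2
        for sad in sads:
            if (sad[0] >> 3) & 0x0F == 1:
                channels = max(channels, (sad[0] & 0x07) + 1)   # keep the best value seen
        return channels

    sads = [bytes([0x0F, 0x7F, 0x07]), bytes([0x09, 0x7F, 0x07])]   # 8ch LPCM, then 2ch LPCM
    print(lpcm_channels_last_wins(sads))   # 2 -- what the OS ends up being told
    print(lpcm_channels_merged(sads))      # 8 -- what the receiver can actually decode

Something like the second loop would survive Denon's ordering, whichever layer ends up owning the fix.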
Now, as for that "special way"... it's unconventional-- no one else does it this way-- and it's wasteful of bytes... but is it against spec?
Here's where strict adherence to a crappily-written spec can (and in this case, does) get Intel and Denon into trouble: CEA/EIA-861B states
"The format of the "CEA Data Block Collection" shall conform to that shown in Table 30."
I'm sure it would be a violation of copyright restrictions to include Table 30 here, but please trust me when I say it looks exactly like the Onkyo example above: single "tag" byte, followed by a stream of three-byte SADs.
But wait... in the exact same paragraph:
"Note that the order of the Data Blocks is not constrained. It is also possible to have more than one of a specific type of data block if necessary to include all of the descriptors needed to describe the DTV Monitor’s capabilities."
So... Denon is providing more than one of a specific type of data block (Audio). They don't have to all be in one stream, per this interpretation.
And so the argument extends. Denon sees nothing wrong with their interpretation. Intel says Denon should fix their EDID.
Intel can point to what is arguably their accurate reading of the spec (I tend to read it this way, as well); Denon can point to the fact that HD DVD and Blu-ray players, the PS3, and other devices don't seem to be having difficulties streaming 8-channel sound to Denon receivers... why should they have to distribute new firmware to all the owners of their receivers... why is this their problem again?
The irony is: if Denon had chosen to write LPCM 2 channel first and LPCM 8 channel second, everything would still have worked and we'd never have known about the problem.
Does It Matter?
Ultimately, the consumer doesn't care who's right or wrong in this esoteric technical dispute. They plug an Intel computer into a Denon receiver and it only gives them stereo. They swear at it for a bit, plug in their friend's PS3, and it works fine with 7.1 channels. Do you think they consider it important whether Intel's or Denon's reading of the spec is right?
Of course not. The PS3 and other consumer electronics devices are at best simply ignoring the problem; at worst, they are aware and don't care-- they'd rather have their device be transparent to the consumer than argue the intricacies of specs.
It is my firm belief that Intel should relax its read of the audio bytes of the EDID to accommodate what is arguably an ambiguity in the spec, purely to satisfy customers. I don't know if it's our Graphics driver or our Audio driver, but it should be able to cope with Stupid EDID Tricks.
Yes, Denon is probably at fault. I'm with you driver guys, and will drink a beer with you to commiserate. But in the end-- at least in this matter-- fault isn't important. Results are, and we need to show that our solution works on as much equipment as possible.
Right now we're being stubborn on something that doesn't really matter to us, but matters a whole heck of a lot to the end user.
Which is most important?