Human eyes are capable of seeing many more colors than those shown by video displays currently on the market. Until now, displays have been limited in the number of colors they can produce, and computer systems can represent only a finite number of colors. This article provides an overview of 10-bit color depth compared to 8-bit color depth, with capabilities built on 7th generation Intel® Core™ processors and optimized by Intel® Software Tools. An example of HEVC 10-bit encoding can also be found in the attached code sample.
To understand the differences between 8-bit and 10-bit color, it helps to first outline the concept of ‘color depth’.
Color depth, also known as bit depth, is the number of bits used to display the color of a single pixel. The same image or video frame at different color depths looks different because the number of colors each pixel can take varies with the color depth.
The number of bits for an image refers to the number of bits per channel for each type of color in each pixel. The number of color channels in a pixel depends on the color space used. For example, the color channels of the RGBA color space are Red (R), Green (G), Blue (B), and Alpha (A). Each additional bit doubles the amount of information we can store for each color. In an 8-bit image, each channel can represent 256 tones. Table 1 shows the possible number of colors available for each respective color depth.
| Channel depth | Tones per channel per pixel | Total number of possible tones (3 channels) |
|---|---|---|
| 8-bit | 256 | 16,777,216 |
| 10-bit | 1,024 | 1,073,741,824 |
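The doubling relationship described above can be sketched in a few lines of Python. The function names below are illustrative, not part of any Intel tool; the values simply follow the 2^n rule from the text:

```python
# Tones per channel and total colors for a given bit depth.
# Illustrative sketch only; assumes a 3-channel (RGB) pixel by default.

def tones_per_channel(bits: int) -> int:
    """Each extra bit doubles the number of tones a channel can store."""
    return 2 ** bits

def total_colors(bits: int, channels: int = 3) -> int:
    """Total distinct colors a pixel can take with the given channel count."""
    return tones_per_channel(bits) ** channels

print(tones_per_channel(8), total_colors(8))    # 256 16777216
print(tones_per_channel(10), total_colors(10))  # 1024 1073741824
```

The jump from roughly 16.8 million to over 1 billion colors is why 10-bit content can represent much finer gradations than 8-bit content.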
Most computer and TV monitors on the market are still capable of showing only 8-bit content, so 10-bit content is displayed on them by lowering the bit depth. However, the actual advantages of 10-bit color can be exploited in the following common scenarios.
If content is shot in 10-bit, there is a large margin of safety against losing information when applying the required changes. Otherwise, image processing at lower precision can result in a loss of sharpness, contrast, and other valuable information. If information is lost due to changes applied to 8-bit content, fewer effective bits per pixel remain, which can cause a color banding effect. Color banding is explained with an example below.
When an image sensor captures an image and is unable to distinguish the minimal difference between adjacent colors, inaccurate color representation occurs: neighboring tones are collapsed into a single pixel color value because the in-between colors are unavailable. The result is an image with bands of color instead of a smooth gradation. Color banding (Figure 1) occurs when an image is captured without enough tonal detail, even though the same scene looks smooth in the real world.
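The collapse of neighboring tones can be sketched numerically. The snippet below is a hypothetical illustration, not part of any SDK: it quantizes a smooth 10-bit ramp down to 8 bits by dropping the two least significant bits, which merges every four adjacent tones into one value — the numeric equivalent of a visible band:

```python
# Hypothetical sketch of color banding: reducing a smooth 10-bit gradient
# to 8 bits collapses groups of adjacent tones into single values.

# A smooth 10-bit ramp over a narrow brightness range (64 distinct tones).
ramp_10bit = list(range(512, 576))

# Truncating to 8 bits (dropping the two least significant bits) merges
# every four neighboring tones into one output value.
ramp_8bit = [v >> 2 for v in ramp_10bit]

print(len(set(ramp_10bit)))  # 64 distinct tones -> smooth gradient
print(len(set(ramp_8bit)))   # 16 distinct tones -> 16 visible bands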
Possible solutions to avoid color banding include:
An uncalibrated display can also show banding-like artifacts. In such scenarios, one can try monitor calibration tools or the Intel® Graphics Control Panel applet.
Figure 1 shows the difference between an 8-bit and a 10-bit image with respect to the color banding issue. The image on the left was captured with an 8-bit sensor and the image on the right with a 10-bit sensor. In the left image, the required detail was not captured; fewer bits mean fewer colors, causing the color banding effect. In the right image, the same frame was captured with enough detail, so the transitions between adjacent colors are smooth. To offer smoother color transitions between adjacent pixels, the current color gamut is not sufficient and needs to be widened. A wider color gamut is introduced in the BT.2020 standard, which is briefly described below.
7th generation Intel Xeon and Core processors support the BT.2020 (also known as Rec. 2020) standard in use cases such as 4K ultra-high definition (UHD) content creation/consumption and HDR with 10-bit enablement, and more. UHD monitors have 3840×2160 pixels across several screen sizes. Displays supporting the BT.2020 standard are able to provide enhanced viewing experiences at these high resolutions.
The International Telecommunication Union (ITU) Recommendation BT.2020 represents a much larger range of colors than the previously used BT.709. The comparison between the respective color spaces is shown in Figure 2 (which follows), which represents the CIE 1931 chromaticity diagram. The x and y axes show the chromaticity coordinates, with the wavelengths of the respective colors shown in blue font. The triangle outlined in yellow shows the color space covered by the BT.709 standard, which has limited color information to represent pixels on large displays such as HDTVs. The black triangle shows the BT.2020 color space, in which smoother transitions between adjacent colors are possible because more colors are available. In addition to the color space, BT.2020 also defines various aspects of UHD TV such as display resolution, frame rate, chroma subsampling, and bit depth.
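The size difference between the two gamut triangles can be quantified from the published primary coordinates of the two standards. The sketch below applies the shoelace formula to the R, G, B primaries on the xy chromaticity diagram; it is an illustrative calculation, not part of either recommendation:

```python
# Sketch: comparing the area of the BT.709 and BT.2020 gamut triangles
# on the CIE 1931 xy chromaticity diagram using the shoelace formula.

def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# (x, y) chromaticity of the R, G, B primaries as published in each standard.
bt709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, a2020 = triangle_area(*bt709), triangle_area(*bt2020)
print(f"BT.2020 covers about {a2020 / a709:.1f}x the area of BT.709")
```

On this diagram the BT.2020 triangle covers roughly 1.9 times the area of the BT.709 triangle, which is why far more distinct chromaticities are representable.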
7th generation Intel processors support the HEVC Main 10 profile, VP9 Profile 2, and High Dynamic Range (HDR) video rendering by exploiting the BT.2020 standard.
High Efficiency Video Coding (HEVC), also known as H.265, is a video compression standard and a successor to the widely successful H.264/AVC standard. HEVC enables more sophisticated compression algorithms than its predecessors. See also Learn about the Significance of HEVC (H.265) Codec for more information. The Main 10 profile allows a color depth of 8 to 10 bits per sample with 4:2:0 chroma subsampling.
HEVC 10-bit decode support is available starting with 6th generation Intel® processors. The command below shows how sample_decode from the Intel Media SDK Code Samples can be used to obtain raw frames from an HEVC elementary stream.
sample_decode.exe h265 -p010 -i input.h265 -o raw_frames.yuv -hw
The input (input.h265) used in the above decode session can be downloaded from Free H.265/HEVC bitstreams (the exact file name is mentioned at the end of this article). The output (raw_frames.yuv) from the above decode session is in P010 format, which can be used as the input to the sample_encode operation, as explained in the following paragraph.
HEVC 10-bit hardware acceleration for both the decoder and the encoder with the HEVC/H.265 Main 10 profile is supported in 7th generation Intel processors. The HEVC 10-bit encode capability was verified using the attached 'modified_sample_encode' code, which was modified specifically to support this feature. This sample works with Intel® Media SDK 2016 R2. Related build instructions are available in the Media Samples Guide in the Intel® Media SDK Code Samples.
Below is an example of HEVC 10-bit encoding using sample_encode from the attached 'modified_sample_encode'.
sample_encode.exe h265 -i raw_frames.yuv -o output.265 -w 3840 -h 2160 -p010 -hw
Figure 3 is a screenshot of the Video Quality Caliper tool, which verifies that the encoded stream has 10 bits per pixel (bpp), meaning that each channel can take one of 2^10, or 1,024, tones.
sample_encode supports only classic P010 YUV, which has its 10 data bits in the least significant bit positions. This is in contrast to the FFmpeg P010 format, which has its 10 data bits in the most significant bit positions.
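Since both layouts store each 10-bit sample in a 16-bit word and differ only in where the data bits sit, converting between them is a 6-bit shift per sample. The helper names below are hypothetical, shown only to make the alignment difference concrete:

```python
# Hypothetical sketch: re-aligning 10-bit samples stored in 16-bit words
# between LSB-aligned and MSB-aligned layouts. Only the position of the
# 10 data bits differs, so a 6-bit shift converts one to the other.

def lsb_to_msb(sample: int) -> int:
    """Move 10 data bits from the low bits to the high bits of a 16-bit word."""
    return (sample << 6) & 0xFFFF

def msb_to_lsb(sample: int) -> int:
    """Move 10 data bits from the high bits back down to the low bits."""
    return sample >> 6

lsb = 0x03FF                    # maximum 10-bit value, LSB-aligned
msb = lsb_to_msb(lsb)
print(hex(msb))                 # 0xffc0
print(msb_to_lsb(msb) == lsb)   # True
```

Feeding MSB-aligned data to a tool that expects LSB-aligned samples (or vice versa) without such a conversion produces badly corrupted output, which is why the distinction above matters.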
VP9 is a video coding format developed by Google as a successor to VP8. 7th generation Intel® platforms support VP9 10-bit hardware-accelerated decode, whereas the encode solution is software/CPU-based.
Dynamic range is the ratio between the whitest whites and the blackest blacks in an image. HDR video delivers a wider dynamic range than conventional Standard Dynamic Range (SDR) video, which uses a non-linear operation to encode and decode luminance values in video systems.
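The definition above can be expressed directly as a ratio of luminances, often quoted in "stops" (doublings). The luminance figures in this sketch are illustrative assumptions, not measurements of any particular display:

```python
# Sketch of the dynamic-range definition: the ratio between the brightest
# and darkest luminance, expressed in stops (base-2 doublings).
import math

def dynamic_range_stops(max_nits: float, min_nits: float) -> float:
    """Number of doublings between the darkest and brightest luminance."""
    return math.log2(max_nits / min_nits)

sdr_stops = dynamic_range_stops(100.0, 0.1)     # illustrative SDR display
hdr_stops = dynamic_range_stops(1000.0, 0.005)  # illustrative HDR display
print(round(sdr_stops, 1), round(hdr_stops, 1))
```

With the assumed figures, the HDR display spans several more stops than the SDR one, and the extra tones of 10-bit color help cover that wider range without banding.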
HDR video content is supported using either the HEVC Main 10 or the VP9.2 codec, with full hardware decode support starting with 7th generation Intel processors. To transmit HDR content, the system needs to be equipped with either a DP 1.4 or an HDMI 2.0a port. This feature has currently been tested with pre-release operating systems and is not yet available with public releases. Enabling HDR and its support will be covered in upcoming articles.
As discussed, developers have the opportunity to deliver amazing, true-to-life video and to enrich their content with more brilliant colors, thanks to 10-bit support for the growing market of UHD/HDR-ready devices. With media applications running on 7th generation Intel® processors and optimized by Intel® Media Server Studio or the Intel® Media SDK, developers can deliver video at the BT.2020 standard for 10-bit 4K UHD, and even higher resolutions and frame rates, with smoother color transitions. Going forward, 10-bit content and seamless viewing experiences will be available in more dimensions than described in this article, as many optimized multimedia applications run on multiple types of Intel® processor-based platforms.
The following tools (along with downloadable links) were used to explain the 10-bit supported features in this article:
See also Deep Color Support of Intel® Graphics for Intel hardware and graphics driver support for 10-bit/12-bit.
Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors. These optimizations include SSE2, SSE3, and SSSE3 instruction sets and other optimizations. Intel does not guarantee the availability, functionality, or effectiveness of any optimization on microprocessors not manufactured by Intel. Microprocessor-dependent optimizations in this product are intended for use with Intel microprocessors. Certain optimizations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice.
Notice revision #20110804