A new white paper is available:
Displaying Stereoscopic 3D (S3D) with Intel HD Graphics
This document describes how to display S3D content and set standard and S3D modes of many 3D-capable monitors and TVs using a library that is available in the Intel Media SDK 3.0 (currently Beta 3).
Hi, some questions:
0. You mention in the whitepaper that the requirement to link against the DXVA libs will be dropped in the future?
1. Are you working on a similar API for DX10 and DX11 (comparable to AMD's quad-buffer SDK, for IVB or using DX10-level device creation) so it's useful for modern games?
2. Any option for poor OGL users to expose OpenGL quad-buffered rendering? (Would be awesome, as NVIDIA and AMD require professional cards.)
3. The Windows 8 Developer Preview supports standardized stereo rendering... are you working to expose that in some beta Windows 8 driver?
I'll try to answer some of your questions. :)
I don't recall the whitepaper mentioning anything about what will be dropped or supported in the future.
I can't comment on future features or support. Sorry.
The paper, and MSDK S3D library are for Microsoft Windows 7 only.
I am trying to use igfx_s3dcontrol.lib and igfx_s3dcontrol.h in my DirectShow filter but got the following error message:
Can you try extern IGFXS3DControl * __cdecl CreateIGFXS3DControl(); together with your first filter when the filter's calling convention is stdcall?
I see now that this second issue is not related to calling convention at all. You just need to add comsupp.lib to the linker's additional dependencies in your filter's project. Please try that and let me know if you hit any more problems.
I can now successfully build the filter with the following settings:
As far as I understand it, since the primary use case for the SDK is video, the new S3D control API exposed by the library does not provide any features specific to still images.
One way to display a still image, albeit a bit crude, would be to use the same method as in the S3D sample but feed the same left and right surfaces into the renderer repeatedly.
Hi.. Yes. Displaying a still image is possible :)
Yes ... you have to use VideoProcessBlt(). That is how you choose left or right: use pLEFTVideoProcessor->VideoProcessBlt() for the left view and pRIGHTVideoProcessor->VideoProcessBlt() for the right.
Each processor object will blit to its correct L or R surface (assuming you had set m_pS3DControl->SelectLeftView() when creating the L processor, and m_pS3DControl->SelectRightView() when creating the R processor).
You can think of it as pD3dDevice->GetBackBuffer() getting you a big super-surface (big enough to hold both Left and Right). The VideoProcessor object you use will make the Blt go to the correct place in that super-surface.
Right now, RightProcessor->VideoProcessBlt() is the only way to change the bits seen by the right eye.
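To make the ordering above concrete, here is a minimal sketch of the pattern. The types below are stand-in stubs, not the real interfaces (those are IGFXS3DControl from igfx_s3dcontrol.h and DXVA2's IDirectXVideoProcessor); the stubs just record the call sequence so the required ordering is easy to see.

```cpp
#include <string>
#include <vector>

// Stand-in stubs for IGFXS3DControl and IDirectXVideoProcessor.
// They record the call sequence so the required ordering is visible.
struct CallLog { std::vector<std::string> calls; };

struct S3DControlStub {
    CallLog* log;
    void SelectLeftView()  { log->calls.push_back("SelectLeftView"); }
    void SelectRightView() { log->calls.push_back("SelectRightView"); }
};

struct VideoProcessorStub {
    CallLog* log;
    std::string eye;
    // Stands in for IDirectXVideoProcessor::VideoProcessBlt().
    void VideoProcessBlt() { log->calls.push_back("Blt:" + eye); }
};

void SetupAndRenderFrame(S3DControlStub& s3d, CallLog& log) {
    // Setup: select a view, then create the processor for that eye,
    // so each processor is bound to the correct half of the super-surface.
    s3d.SelectLeftView();
    VideoProcessorStub left{&log, "L"};   // real code: CreateVideoProcessor()
    s3d.SelectRightView();
    VideoProcessorStub right{&log, "R"};  // real code: CreateVideoProcessor()

    // Per frame: blit both eyes into the stereo super-surface, then Present.
    left.VideoProcessBlt();
    right.VideoProcessBlt();
    log.calls.push_back("Present");
}
```

For a still image, the same loop simply resubmits the same left/right surfaces each frame.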
I am working on a real-time stereo 3D application. I assume that to get stereo output using this library I have to render the left and right views to separate surfaces and then use the VideoProcessBlt calls to piece everything together. This is fine, but inefficient. Is there some way I could render the views directly where needed in the "super-surface" back buffer?
Another question: Does this library support all HDMI 1.4 3D formats? For example, I have a 3D monitor that takes Over/Under, Side-by-Side, and Interlaced formats and claims to be HDMI 1.4 compatible (and it seems to be, in that 3D Blu-ray players work fine when connected to it). Will this library also work with this monitor (and negotiate one of those supported formats "under the covers")?
I'm asking since my simple test app (which just draws a green rectangle to the left view and a red rectangle to the right view) is only displaying the red rectangle (right view), even though the call to SwitchTo3D reports success (as do all the other calls, made in the same order as in the Media SDK example code).
A correction - with an HDMI 1.4 display the library supports only frame packing format. Other formats cannot be used.
Thank you very much for this information. I have verified that the device is being created after switching to 3D, and that the SelectView calls come afterwards. In walking through the sample_decode.cpp and RTF files, it appears that we are doing everything in the same order as the sample_decode project.
Of course, the next obvious step is for us to run the sample_decode project with our hardware to see if that is working correctly with the 3D monitor and/or 3D TV we have available here for development & testing.
In order to accomplish that, we needed access to S3D video content in MVC format, since that is the only format that sample_decode will render to a stereo device. Such media files seem hard to come by on the web (looking via a Google search). We mainly do CG and can easily construct separate left & right video output files, so we used the sample_encode executable to create our own stereo MVC file from our own left/right views.
With that, it appears that the sample_decode executable works just like our app -- it returns success from all method calls yet it only displays the right view. So, most likely that means that we need to find other 3D display hardware. But note: we are using a late-model LG 3D TV and it works fine with the various 3D Blu-ray players, and it claims to be HDMI 1.4a compatible, so perhaps further documentation for the IGFX3D_Control is necessary if it doesn't work with this type of 3D TV.
One possible weakness in our analysis: We created our own stereo 3D MVC file but really have no way to verify if we created it correctly. Could you perhaps point me to a place where I can download some sample/test stereo 3D MVC file(s)? With those in hand, we can rerun our sample_decode test and verify with certainty whether the IGFX3D_Control is working with our LG 3D TV or not.
It turned out to be a case of needing to explicitly set the desktop size & refresh to 720p60 or 1080p24 before running the code. I'm still trying to figure out why the auto-set code to choose an acceptable size & refresh isn't working, but that is a "minor" problem compared to actually having stereo 3D output working properly.
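For anyone hitting the same issue, the manual workaround can also be done programmatically with the Win32 ChangeDisplaySettings call before initializing the S3D pipeline. A hedged sketch (Windows-only; 720p60 shown, 1080p24 works the same way; the function name is my own):

```cpp
#include <windows.h>

// Sketch: force the desktop to 1280x720 @ 60 Hz, one of the modes that
// actually produces stereo output. Returns true on success.
bool ForceS3DFriendlyMode() {
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize             = sizeof(dm);
    dm.dmPelsWidth        = 1280;
    dm.dmPelsHeight       = 720;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    // CDS_FULLSCREEN makes the change temporary; the original mode is
    // restored when the app exits.
    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}
```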
Thanks again for your support.
PS It works with both the LG Cinema 3D monitor and the LG 3D TV once the refresh rates are correctly set.
What would have been really helpful is if the call to enter 3D mode had failed with an error code indicating that the display resolution and/or refresh rate wasn't correct. Instead, all return codes indicated that everything was working just fine.
I'll take this opportunity to add my 2 cents for an API method or extension to allow an app to write directly to the left/right portions of the bigger stereo backbuffer. For real-time applications like ours, this saves two big framebuffer copies, since we could then render straight to the left and right portions of the backbuffer rather than to offscreen buffers followed by blits. (BTW, this is how AMD's HD3D implementation works under the covers, so we know from experience that it is more efficient for our type of application.)
Thanks again for your help!
One additional note:
It turns out that the GetS3DCaps method returns a list of supported 3D display modes that includes 1080p60. But this mode doesn't actually support stereo 3D output!
Our program was finding that mode (as the closest -- well, exact -- match to our desktop settings) and hence was trying to use that, getting SUCCESS return codes, but not producing 3D output.
There is a comment in one of the RTF files in the SDK saying that 1080p24, 720p50, and 720p60 are the only valid/supported 3D modes (and I have confirmed that they work). But if that is the case, then the GetS3DCaps method should really only return those values.
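Until that is fixed, one defensive workaround is to filter the capability list yourself against the modes the RTF notes say actually work. A minimal sketch (S3DMode is a hypothetical stand-in for the structure GetS3DCaps actually fills):

```cpp
#include <vector>

// Hypothetical stand-in for a mode entry returned by GetS3DCaps.
struct S3DMode {
    unsigned width;
    unsigned height;
    unsigned refreshRate;  // Hz
};

// Per the RTF notes in the SDK, only 1080p24, 720p50, and 720p60
// actually produce stereo output, even though GetS3DCaps may also
// list modes such as 1080p60 that silently fall back to 2D.
static bool IsUsableS3DMode(const S3DMode& m) {
    return (m.width == 1920 && m.height == 1080 && m.refreshRate == 24) ||
           (m.width == 1280 && m.height == 720 &&
            (m.refreshRate == 50 || m.refreshRate == 60));
}

// Filter the capability list down to modes known to work, instead of
// trusting every entry when matching against the desktop settings.
std::vector<S3DMode> FilterUsableModes(const std::vector<S3DMode>& caps) {
    std::vector<S3DMode> out;
    for (const S3DMode& m : caps)
        if (IsUsableS3DMode(m)) out.push_back(m);
    return out;
}
```

With this filter in place, the "closest match" search described above can no longer pick 1080p60 and report success without producing 3D output.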
I have a new (related) question: Is it possible to use the stereo 3D control interface along with "Nvidia Optimus" and/or "AMD Switchable" graphics?
A lot of laptops targeted at media professionals now provide two GPUs (Intel HD 3000/4000 along with a discrete GPU) plus the option to enable the discrete GPU for certain apps. Apparently, when enabled, the discrete GPU renders application data into the framebuffer controlled by the integrated GPU, so that only one framebuffer is used to render the desktop plus all apps.
I have tried enabling the discrete GPU along with stereo output as described above, and I only seem to get one view actually appearing, though the connected display does go into stereo 3D mode, as verified by the cursor appearing in only one eye.
I was hoping that since the images are being rendered to an offscreen buffer, then blitted back via the DXVA interface for stereo output, that some sort of integration of this dual GPU approach might be possible.
The VideoProcessBlt-plus-Present path is different from other blits; the operation is not really "render to an offscreen buffer, then blit back." For this reason the feature cannot be used when the discrete graphics hardware is the one implementing the VideoProcessBlt call.
I call GetS3DCaps and it returns NO_INTERFACE?
HW: Microsoft Surface Book
Second monitor: Samsung S950 (supports 3D)