Direct3D 11 Conservative Depth Output Details and Demo

Here is an article about one of the little-known Direct3D 11 features: conservative depth output. What is it?

Conservative depth output is something you use in pixel shaders that manually output a depth value. Basically, rather than using SV_Depth, you use a variant that also specifies an inequality, such as SV_DepthGreaterEqual or SV_DepthLessEqual. The depth you output from the shader must then satisfy that inequality relative to the interpolated depth of the rasterized triangle (if it doesn’t, the depth value is clamped for you). This allows the GPU to keep using early-z culling, since it can still trivially reject pixels in cases where the depth test is guaranteed to fail for the specified upper/lower bound. So for instance, if you render a quad and output through SV_DepthGreaterEqual, the GPU can cull pixels where the quad’s interpolated depth already fails the depth test, because the shader’s output can only be greater. Don’t bother looking for this one in the documentation…it’s not in there.
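As a quick sketch of what this looks like in HLSL (a hypothetical shader, not taken from the article — `ComputeImpostorDepth` is an assumed helper), a pixel shader that only ever pushes depth further away, e.g. for sphere impostors, could declare its depth output like this:

```hlsl
// Hypothetical example: declaring the depth output as SV_DepthGreaterEqual
// (instead of SV_Depth) promises the GPU that the written depth is >= the
// interpolated triangle depth, so early-z rejection stays enabled.
float4 PSMain(float4 pos : SV_Position,
              out float depth : SV_DepthGreaterEqual) : SV_Target
{
    // Compute the true surface depth (assumed helper function).
    float trueDepth = ComputeImpostorDepth(pos);

    // The output must satisfy trueDepth >= pos.z; if it doesn't,
    // the hardware clamps it to the interpolated depth for us.
    depth = max(trueDepth, pos.z);

    return float4(1, 1, 1, 1);
}
```

With plain SV_Depth, writing any depth from the shader would force the GPU to disable early-z for that draw; the conservative variant keeps the fast path while still allowing per-pixel depth.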

You can find the complete article (which also covers other lesser-known D3D11 features), as well as the demo, HERE.

And if I’m not wrong, AMD provides a similar feature for OpenGL developers with the GL_AMD_conservative_depth extension.



  • Yes, the AMD_conservative_depth extension provides exactly the same functionality for OpenGL.
    I suppose the reason it is not a well-known DX11 feature, and why there is only an AMD extension for it in OpenGL, is that only AMD hardware supports it; NVIDIA would just emulate it by disabling early-z, which could significantly reduce performance on their cards compared to AMD’s.

  • Thanks Daniel for the confirmation and the extra information!

  • Korvin77

    this extension is present even on my Radeon 4850, so it is partially a DX11 card 😛

  • Actually, you would be surprised how many DX11 features are supported on the Radeon 3000 and 4000 series. This is because many of these features were already available in DX10.1; they just didn’t become popular because only ATI released cards with DX10.1 support.
    Also, API capabilities usually lag behind hardware capabilities.

  • DrBalthar

    Also, a lot of DX11 features that were supposed to be in DX10 were canned because of NVIDIA’s inability to support them. Some tessellation features should already have been included in DX10; that’s why ATI/AMD hardware already had some support built in.