Nice timing… I’m currently coding the new version of MSI Kombustor with brand new Direct3D render paths (a really cool and interesting task, especially for an OpenGL developer!), and Chuck Walbourn, one of Microsoft’s Direct3D gurus, has just published an article called DirectX 11 Hardware Vendor Differences in which he talks about some differences between the Radeon HD 5000 series and the new GeForce GTX 480/470. In particular, he noticed an important difference in the number of multisample antialiasing (MSAA) quality levels exposed by NVIDIA and ATI hardware:
Our work with the NVIDIA hardware for this release has provided insight into some areas that programmers need to pay attention to with respect to different vendor’s cards. The biggest difference I noticed was that number of MSAA quality levels exposed by AMD vs. NVIDIA. This information is obtained via the CheckMultisampleQualityLevels method in Direct3D 10.x and 11. The ATI Radeon HD 5000 Series only provides one quality level per sample count, while the NVIDIA GeForce GTX 470/480 exposes a number of fine-grain quality levels per sample count.
A quality level identifies a sample layout and/or resolve algorithm for a given sample count (2X, 4X or 8X). And according to the Direct3D documentation, the higher the quality level, the lower the performance.
What is the number of quality levels per sample count for NVIDIA?
Since Kombustor’s D3D10 render path is up and running (the D3D11 one is still under development), I logged some details about MSAA sample counts and the number of quality levels.
Here are the details for a GTX 480 (with R257.15 drivers – Win7 64-bit):
Direct3D 10 - Adapter 0 - Description: NVIDIA GeForce GTX 480
Direct3D 10 - Adapter 0 - Dedicated video memory: 1503MB
Direct3D 10 - Adapter 0 - vendorId: 10DE, deviceId: 06C0, revision: 00A3
Direct3D 10 - MSAA 2X supported with 3 quality levels
Direct3D 10 - MSAA 4X supported with 17 quality levels
Direct3D 10 - MSAA 8X supported with 33 quality levels
And here are the details for a HD 5870 (with Catalyst 10.5 – Win7 64-bit):
Direct3D 10 - Adapter 0 - Description: ATI Radeon HD 5800 Series
Direct3D 10 - Adapter 0 - Dedicated video memory: 1014MB
Direct3D 10 - Adapter 0 - vendorId: 1002, deviceId: 6898, revision: 0000
Direct3D 10 - MSAA 2X supported with 1 quality levels
Direct3D 10 - MSAA 4X supported with 1 quality levels
Direct3D 10 - MSAA 8X supported with 1 quality levels
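For reference, log lines like the ones above can be produced with a loop over ID3D10Device::CheckMultisampleQualityLevels. This is only a minimal Windows-only sketch, not Kombustor’s actual code: it assumes a valid ID3D10Device pointer and a DXGI_FORMAT_R8G8B8A8_UNORM back buffer (an assumption on my part).

```cpp
#include <d3d10.h>
#include <cstdio>

// Sketch: query the number of MSAA quality levels for each common
// sample count (2X, 4X, 8X) and print them in a Kombustor-like format.
// 'device' must be a valid ID3D10Device*; the back-buffer format
// DXGI_FORMAT_R8G8B8A8_UNORM is assumed here.
void log_msaa_levels(ID3D10Device* device)
{
    for (UINT sampleCount = 2; sampleCount <= 8; sampleCount *= 2)
    {
        UINT numQualityLevels = 0;
        HRESULT hr = device->CheckMultisampleQualityLevels(
            DXGI_FORMAT_R8G8B8A8_UNORM, sampleCount, &numQualityLevels);

        if (SUCCEEDED(hr) && numQualityLevels > 0)
        {
            printf("Direct3D 10 - MSAA %uX supported with %u quality levels\n",
                   sampleCount, numQualityLevels);
        }
    }
}
```

Valid quality values passed at resource creation range from 0 to numQualityLevels - 1, which is why 33 reported levels means qualities 0 through 32.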
I just looked at the quality levels in more detail, and for NVIDIA they are actually related to CSAA (Coverage Sampling Antialiasing). According to this doc, here is the relation between sample count, quality level and CSAA mode:
- 8X CSAA: sample count = 4 and quality level = 8
- 16X CSAA: sample count = 4 and quality level = 16
- 32X CSAA: sample count = 4 and quality level = 32
- 8XQ (Quality) CSAA: sample count = 8 and quality level = 8
- 16XQ (Quality) CSAA: sample count = 8 and quality level = 16
- 32XQ (Quality) CSAA: sample count = 8 and quality level = 32
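The table above can be encoded in a small helper. This is a hypothetical function of mine (not something from Kombustor or the NVIDIA doc) that maps a Direct3D 10 sample count / quality level pair to the CSAA mode name it selects:

```cpp
#include <string>

// Hypothetical helper: maps a (sample count, quality level) pair to the
// NVIDIA CSAA mode it selects, following the table above.
// Returns an empty string for combinations that are not CSAA modes.
std::string csaa_mode(unsigned sampleCount, unsigned quality)
{
    const char* suffix = nullptr;
    if (sampleCount == 4)
        suffix = "X CSAA";              // standard CSAA modes
    else if (sampleCount == 8)
        suffix = "XQ (Quality) CSAA";   // quality CSAA modes
    else
        return "";

    // Only quality levels 8, 16 and 32 select a CSAA mode; other
    // values (e.g. 0 or 1) mean plain MSAA for that sample count.
    if (quality == 8 || quality == 16 || quality == 32)
        return std::to_string(quality) + suffix;
    return "";
}
```

For example, csaa_mode(4, 16) gives "16X CSAA" and csaa_mode(8, 8) gives "8XQ (Quality) CSAA".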
CSAA is a feature of the GeForce 8 series and higher. That’s why on AMD cards the quality level count is still equal to 1: CSAA is not supported on Radeon boards.
All CSAA modes will be added to Kombustor…
Indeed, AMD DX11 hardware exposes only one MSAA quality level, whereas NVIDIA DX11 hardware offers up to 33 quality levels per sample count, which corresponds to the maximum value defined in the DirectX SDK.
Aaah… Direct3D or OpenGL, there are always differences between NVIDIA and AMD…