Nice timing… I'm currently coding the new version of MSI Kombustor with brand new Direct3D render paths (a really cool and interesting task, especially for an OpenGL developer!), and Chuck Walbourn, one of Microsoft's Direct3D gurus, has just published an article called DirectX 11 Hardware Vendor Differences in which he discusses some differences between the Radeon HD 5000 series and the new GeForce GTX 480/470. In particular, he has noticed important differences in the number of multisample antialiasing (MSAA) quality levels exposed by NVIDIA and ATI hardware:
Our work with the NVIDIA hardware for this release has provided insight into some areas that programmers need to pay attention to with respect to different vendors' cards. The biggest difference I noticed was the number of MSAA quality levels exposed by AMD vs. NVIDIA. This information is obtained via the CheckMultisampleQualityLevels method in Direct3D 10.x and 11. The ATI Radeon HD 5000 Series only provides one quality level per sample count, while the NVIDIA GeForce GTX 470/480 exposes a number of fine-grain quality levels per sample count.
A quality level selects a vendor-specific sample layout and/or resolve algorithm for a given sample count (2X, 4X or 8X). And according to the Direct3D documentation, the higher the quality level, the lower the performance.
What is the number of quality levels per sample count for NVIDIA?
Since Kombustor's D3D10 render path is working (the D3D11 path is still under development), I logged some details about MSAA sample counts and the number of quality levels.
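For reference, here is a minimal sketch of the kind of query behind these logs (not the exact Kombustor code); it assumes a valid ID3D10Device and probes the usual RGBA8 back buffer format:

#include <cstdio>
#include <d3d10.h>

void LogMSAAQualityLevels(ID3D10Device* device)
{
    const UINT sampleCounts[3] = { 2, 4, 8 };
    for (int i = 0; i < 3; ++i)
    {
        UINT numQualityLevels = 0;
        HRESULT hr = device->CheckMultisampleQualityLevels(
            DXGI_FORMAT_R8G8B8A8_UNORM, sampleCounts[i], &numQualityLevels);
        if (SUCCEEDED(hr) && numQualityLevels > 0)
            printf("Direct3D 10 - MSAA %uX supported with %u quality levels\n",
                   sampleCounts[i], numQualityLevels);
    }
}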
Here are the details for a GTX 480 (with R257.15 drivers – Win7 64-bit):
Direct3D 10 - Adapter 0 - Description: NVIDIA GeForce GTX 480
Direct3D 10 - Adapter 0 - Dedicated video memory: 1503MB
Direct3D 10 - Adapter 0 - vendorId: 10DE, deviceId: 06C0, revision: 00A3
Direct3D 10 - MSAA 2X supported with 3 quality levels
Direct3D 10 - MSAA 4X supported with 17 quality levels
Direct3D 10 - MSAA 8X supported with 33 quality levels
And here are the details for an HD 5870 (with Catalyst 10.5 – Win7 64-bit):
Direct3D 10 - Adapter 0 - Description: ATI Radeon HD 5800 Series
Direct3D 10 - Adapter 0 - Dedicated video memory: 1014MB
Direct3D 10 - Adapter 0 - vendorId: 1002, deviceId: 6898, revision: 0000
Direct3D 10 - MSAA 2X supported with 1 quality levels
Direct3D 10 - MSAA 4X supported with 1 quality levels
Direct3D 10 - MSAA 8X supported with 1 quality levels
UPDATE
I just looked at the quality levels in more detail, and it turns out that for NVIDIA they are related to CSAA (Coverage Sampling Antialiasing). According to this doc, here is the relation between sample count, quality level and CSAA mode:
- 8X CSAA: sample count = 4 and quality level = 8
- 16X CSAA: sample count = 4 and quality level = 16
- 32X CSAA: sample count = 4 and quality level = 32
- 8XQ (Quality) CSAA: sample count = 8 and quality level = 8
- 16XQ (Quality) CSAA: sample count = 8 and quality level = 16
- 32XQ (Quality) CSAA: sample count = 8 and quality level = 32
CSAA is a feature of the GeForce 8 series and higher. That's why the quality level is always 1 on AMD cards: CSAA is not supported on Radeon hardware.
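In practice, a CSAA mode is requested simply by passing the corresponding count/quality pair from the table above when creating a multisampled surface. A minimal sketch for the 8X CSAA mode, assuming a valid ID3D10Device (error handling omitted):

#include <d3d10.h>

ID3D10Texture2D* Create8xCSAATarget(ID3D10Device* device, UINT width, UINT height)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width              = width;
    desc.Height             = height;
    desc.MipLevels          = 1;
    desc.ArraySize          = 1;
    desc.Format             = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count   = 4;  // 4 color samples...
    desc.SampleDesc.Quality = 8;  // ...with quality level 8 = 8X CSAA (see table above)
    desc.Usage              = D3D10_USAGE_DEFAULT;
    desc.BindFlags          = D3D10_BIND_RENDER_TARGET;

    ID3D10Texture2D* rt = NULL;
    device->CreateTexture2D(&desc, NULL, &rt); // returns NULL in rt on failure
    return rt;
}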
All CSAA modes will be added to Kombustor…
Indeed, AMD DX11 hardware exposes only one MSAA quality level, whereas NVIDIA DX11 hardware offers up to 33 quality levels per sample count, which corresponds to the maximum value defined in the DirectX SDK.
Aaah… Direct3D or OpenGL, there are always differences between NVIDIA and AMD…
How do games let you select 16x and 32x in D3D with NVIDIA then?
Also what happens under D3D9? Did that even have quality levels?
Seems like Microsoft needs to get this working better…
I thought ATI did have different quality levels for AA as well, since you can access them via the Catalyst Control Center…
Hmm looks like D3D9 uses IDirect3D9::CheckDeviceMultiSampleType
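For what it's worth, a minimal sketch of that D3D9 query, assuming an IDirect3D9* obtained from Direct3DCreate9:

#include <d3d9.h>

DWORD GetNonmaskableQualityLevels(IDirect3D9* d3d9)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d9->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,
        FALSE,                      // fullscreen
        D3DMULTISAMPLE_NONMASKABLE, // driver-defined quality levels
        &qualityLevels);
    // On success, qualityLevels - 1 is the highest MultiSampleQuality
    // value usable in D3DPRESENT_PARAMETERS.
    return SUCCEEDED(hr) ? qualityLevels : 0;
}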
Actually I think 16X and 32X are CSAA modes (Coverage Sampling Antialiasing). I'll try to test CSAA in Kombustor and I'm sure I'll see 16X and 32X for my GTX 480.
Article updated with CSAA…
I was able to use 32x in Second Life on older NVIDIA drivers using http://www.nhancer.com/
After a few driver updates it went back to 16x.
Why has 2x got 3 quality levels?
Other stuff I have noticed…
ATI used to have a 6x mode, but now only exposes 2x, 4x and 8x.
CSAA in OpenGL uses the GL_NV_multisample_coverage and GL_NV_framebuffer_multisample_coverage extensions.
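A minimal sketch of allocating an 8X CSAA renderbuffer with those extensions (assuming a context that exposes them, a loader such as GLEW, and width/height variables):

// 8 coverage samples + 4 color samples = 8X CSAA
GLuint rb = 0;
glGenRenderbuffers(1, &rb);
glBindRenderbuffer(GL_RENDERBUFFER, rb);
glRenderbufferStorageMultisampleCoverageNV(
    GL_RENDERBUFFER,
    8,        // coverage samples
    4,        // color samples
    GL_RGBA8,
    width, height);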
On GeForce 5 to 7:
- D3DMULTISAMPLE_NONMASKABLE = 4 quality levels:
  - level 0 = 2x
  - level 1 = 2xS? (can't find info on this one)
  - level 2 = 4x
  - level 3 = 8xS
- D3DMULTISAMPLE_2_SAMPLES = 1 quality level
- D3DMULTISAMPLE_4_SAMPLES = 1 quality level
On GeForce 3 and 4 MX:
- D3DMULTISAMPLE_2_SAMPLES – 2x
- D3DMULTISAMPLE_3_SAMPLES – 2xQ (Quincunx)
- D3DMULTISAMPLE_4_SAMPLES – 4x
In OpenGL, use glHint to specify the nicest AA filter at 2x to get 2xQ.
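That hint comes from the GL_NV_multisample_filter_hint extension; a one-liner, assuming a 2x multisampled context on the relevant hardware:

// On GeForce 3/4-class hardware, asking for the "nicest" filter on a
// 2x multisample context selects the Quincunx resolve.
glHint(GL_MULTISAMPLE_FILTER_HINT_NV, GL_NICEST);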
Also found this handy tool which shows the sampling patterns:
http://www.users.on.net/~triforce/d3d_fsaaviewer/
And a breakdown of the older NVIDIA MSAA modes:
http://www.nvnews.net/vbulletin/showthread.php?t=30641
This one has some of the ATI modes:
http://techreport.com/articles.x/11686/11
Also, the GeForce 8 has 8 levels of D3DMULTISAMPLE_NONMASKABLE.
So maybe you need to expose two AA method settings (like some PC games do, e.g. GRID): the full set of D3DMULTISAMPLE_NONMASKABLE levels, as well as MSAA and CSAA. That way even older cards will have their 2xQ, 8xS, etc. exposed.