AMD GPU Tessellation SDK



Terrain tessellation on GPU

Ninja character tessellation on GPU



AMD has released an SDK (software development kit) for real-time tessellation on the GPU. GPU tessellation makes it possible to increase polygon density in real time, adding much more realism to the final scene. It provides several key benefits:

  • Compression: Using tessellation allows us to reduce our memory footprint and bandwidth consumption by storing only low-resolution meshes on disk.
  • Bandwidth: Instead of transferring all of the vertex data for a high-polygon mesh over the PCI-E bus, only the coarse mesh is sent to the GPU.
  • Scalability: Because of its recursive nature, subdivision naturally accommodates LOD rendering and adaptive approximation with a variety of metrics (see the sketch after this list).
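To make the scalability point a bit more concrete, here is a minimal C++ sketch of distance-based LOD selection. It is not part of AMD's SDK: the function name, parameters, and distance thresholds are made up for illustration, and a real renderer would feed the resulting level to the hardware tessellator per patch or per draw call.

```cpp
#include <algorithm>
#include <cstdio>

// Minimal sketch of distance-based adaptive tessellation LOD.
// NOTE: this is NOT the AMD SDK API; the function and its parameters are
// hypothetical, just to illustrate how a per-patch tessellation level
// could be derived from camera distance before being handed to the tessellator.
float ComputeTessellationLevel(float distanceToCamera,
                               float nearDistance,  // full detail at or inside this distance
                               float farDistance,   // coarse mesh only beyond this distance
                               float maxLevel)      // e.g. 15, as in the terrain demo
{
    // Map the distance into [0, 1] and interpolate between maxLevel and 1.
    float t = (distanceToCamera - nearDistance) / (farDistance - nearDistance);
    t = std::max(0.0f, std::min(1.0f, t));
    return 1.0f + (maxLevel - 1.0f) * (1.0f - t);
}

int main()
{
    // Nearby patches get the highest level, distant ones fall back to level 1.
    for (float d : {10.0f, 100.0f, 500.0f, 2000.0f})
        std::printf("distance %7.1f -> tessellation level %.1f\n",
                    d, ComputeTessellationLevel(d, 50.0f, 1000.0f, 15.0f));
    return 0;
}
```

Returning a fractional level rather than snapping to integers is what allows smooth LOD transitions, provided the tessellator supports continuous tessellation levels.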

Two demos are provided (terrain and ninja) with binaries and source code. They require the DirectX Redistributable 9.0c (November 2008) (very cool, I can play with them on my XP box!) and a Radeon (I used my Radeon HD 4850 with the latest Catalyst 9.2).

In AMD’s terrain demo, the low resolution terrain mesh has 4,050 triangles and the GPU-tessellated version has 1,664,550 triangles (maximum tessellation level):


Tessellation level: 1 – Polygons: 48,600 – FPS: 1000 – WIREFRAME rendering

Tessellation level: 15 – Polygons: 1,664,550 – FPS: 216 – Still in WIREFRAME!



4 thoughts on “AMD GPU Tessellation SDK”

  1. saew

    Too bad, it only works on ATI cards!
    I think I’d better wait for DX11 so tessellation will run on cards from several vendors.

  2. blend

    Well, you know… That’s just like NVIDIA PhysX, which is hardware accelerated on GeForce cards only.

    But I think it would be much more interesting if AMD released a complete SDK and provided the GL_AMDX_tesselation extension.

  3. BUDA20

    There is an NVIDIA demo that is really good too:
    Google:
    nvidia Instanced Tessellation
    and go to the “Instanced Tessellation” demo.
    It needs DX10.

    PS: AMD FTW 😛

  4. Pingback: [TEST] Hardware Tessellation on Radeon in OpenGL (Part 1/2) - 3D Tech News, Pixel Hacking, Data Visualization and 3D Programming - Geeks3D.com
