AMD has today unveiled its new technology: TressFX. TressFX, co-developed by AMD and Crystal Dynamics, is a new technology for real-time realistic hair rendering. TressFX is based on Microsoft DirectCompute and uses some of AMD’s previous work, such as OIT (Order Independent Transparency) and PPLL (Per-Pixel Linked Lists).
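To make the PPLL idea concrete, here is a minimal CPU-side sketch of the data structure: a head index per pixel pointing into one shared node pool. The `FragmentNode` and `PerPixelLinkedList` names are illustrative assumptions, not AMD’s API, and the real implementation builds these lists in a pixel shader with atomic operations on GPU buffers:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <limits>
#include <vector>

// One transparent fragment stored in the shared node pool.
struct FragmentNode {
    float    depth;  // view-space depth of the fragment
    uint32_t color;  // packed RGBA color
    uint32_t next;   // index of the next node for this pixel, or NIL
};

constexpr uint32_t NIL = std::numeric_limits<uint32_t>::max();

struct PerPixelLinkedList {
    std::vector<uint32_t>     head;  // one list head per pixel
    std::vector<FragmentNode> pool;  // shared node pool for all pixels

    explicit PerPixelLinkedList(std::size_t pixelCount)
        : head(pixelCount, NIL) {}

    // On the GPU this is an atomic counter bump plus an interlocked
    // exchange on the head buffer; it is serial here for clarity.
    void insert(std::size_t pixel, float depth, uint32_t color) {
        const uint32_t node = static_cast<uint32_t>(pool.size());
        pool.push_back({depth, color, head[pixel]});
        head[pixel] = node;
    }

    // Resolve pass: gather this pixel's fragments and sort them
    // back-to-front so they can be blended in the correct order,
    // regardless of the order in which they were submitted.
    std::vector<FragmentNode> sortedFragments(std::size_t pixel) const {
        std::vector<FragmentNode> frags;
        for (uint32_t n = head[pixel]; n != NIL; n = pool[n].next)
            frags.push_back(pool[n]);
        std::sort(frags.begin(), frags.end(),
                  [](const FragmentNode& a, const FragmentNode& b) {
                      return a.depth > b.depth;  // farthest fragment first
                  });
        return frags;
    }
};
```

A single shared pool with one head index per pixel is what keeps memory usage bounded: the cost scales with the number of visible fragments, not with the scene’s depth complexity.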
TressFX Hair revolutionizes Lara Croft’s locks by using the DirectCompute programming language to unlock the massively-parallel processing capabilities of the Graphics Core Next architecture, enabling image quality previously restricted to pre-rendered images. Building on AMD’s previous work on Order Independent Transparency (OIT), this method makes use of Per-Pixel Linked-List (PPLL) data structures to manage rendering complexity and memory usage.

DirectCompute is additionally utilized to perform the real-time physics simulations for TressFX Hair. This physics system treats each strand of hair as a chain with dozens of links, allowing forces like gravity, wind and the movement of the head to move and curl Lara’s hair in a realistic fashion. Furthermore, collision detection is performed to ensure that strands do not pass through one another, or through other solid surfaces such as Lara’s head, clothing and body. Finally, hair styles are simulated by gradually pulling the strands back towards their original shape after they have moved in response to an external force.

Graphics cards featuring the Graphics Core Next architecture, like select AMD Radeon™ HD 7000 Series cards, are particularly well-equipped to handle these types of tasks thanks to their combination of fast on-chip shared memory and massive processing throughput on the order of trillions of operations per second.
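As a rough illustration of the “strand as a chain of links” simulation described above, here is a simplified single-strand sketch using Verlet integration, distance constraints and shape restoration. The names (`Strand`, `simulateStrand`, `stiffness`) are assumptions for illustration, not AMD’s actual code; the real thing runs thousands of strands in parallel in DirectCompute shaders:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// One hair strand: a chain of particles joined by fixed-length links.
// Particle 0 is the root, pinned to the scalp.
struct Strand {
    std::vector<Vec3> pos;      // current particle positions
    std::vector<Vec3> prevPos;  // previous positions (implicit velocity)
    std::vector<Vec3> restPos;  // styled rest shape to pull back toward
    float linkLength;           // rest length of each link
};

void simulateStrand(Strand& s, Vec3 gravity, float dt, float stiffness) {
    // 1. Verlet integration: velocity is implicit in pos - prevPos;
    //    wind or head movement would be added as extra accelerations.
    for (std::size_t i = 1; i < s.pos.size(); ++i) {
        const Vec3 p = s.pos[i];
        s.pos[i] = p + (p - s.prevPos[i]) + gravity * (dt * dt);
        s.prevPos[i] = p;
    }

    // 2. Shape restoration: gradually pull each particle back toward
    //    its styled rest pose, as the article describes.
    for (std::size_t i = 1; i < s.pos.size(); ++i)
        s.pos[i] = s.pos[i] + (s.restPos[i] - s.pos[i]) * stiffness;

    // 3. Distance constraints keep the links of the chain inextensible;
    //    a handful of iterations is enough for visually stable hair.
    for (int iter = 0; iter < 4; ++iter) {
        for (std::size_t i = 1; i < s.pos.size(); ++i) {
            const Vec3 d = s.pos[i] - s.pos[i - 1];
            const float len = length(d);
            if (len < 1e-6f) continue;
            const Vec3 corr = d * ((len - s.linkLength) / len);
            if (i == 1) {
                s.pos[i] = s.pos[i] - corr;  // root stays pinned
            } else {
                s.pos[i - 1] = s.pos[i - 1] + corr * 0.5f;
                s.pos[i]     = s.pos[i]     - corr * 0.5f;
            }
        }
    }
}
```

Collision with the head and body would be handled as one more constraint pass (e.g. pushing particles out of bounding capsules around the character), folded into the same iteration loop.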
TressFX will be used in the upcoming PC version of Tomb Raider:
NVIDIA has PhysX and now AMD has TressFX (and maybe one day PhysX will have its own HairFX module, who knows). But it’s not clear whether or not TressFX can run on NVIDIA’s GeForce cards…
Image from rage3d
Well, it was implemented using DirectCompute, so I don’t see why it wouldn’t run on a GeForce DX11 GPU or any other DX11 hardware. And you can’t compare this with PhysX, which is a complex physics engine. This is only one feature.
BTW, NVIDIA presented their own hair simulation a few years ago:
http://www.youtube.com/watch?v=cECnvJeXXE8
But it’s a plus for AMD that they helped implement something like this in an actual game (I don’t know of a game where NVIDIA’s solution is already implemented).
Nvidia already did hair 10 years ago. It’s crude compared to DX11 rendering these days.
http://www.youtube.com/watch?v=jKef71NQUnQ
http://www.nvidia.com/object/nzone_nalu_makingof1.html
The Fermi hair demo was 3 years ago.
http://www.geforce.com/games-applications/pc-applications/fermi-hair-demo
The Dawn hair rendering was 1 year ago.
http://www.youtube.com/watch?v=SjhM_xc6Yes
Nvidia has more technology compared to ATI: the implementation of CUDA, PhysX and so on.
I do not want to offend ATI, but Nvidia is simply better.
Yeah, but AMD’s TressFX looks much, much better compared with nVidia’s hair tech. It does feel real.
TressFX is not for AMD only.
proof: http://www.bit-tech.net/news/hardware/2013/02/26/amd-tressfx/1
I wonder if there is a mane & tail version 😛
Nvidia had similar tech 5 years ago, and they implemented it in the Nurien tech demo (an Eastern game) along with other physics FX (cloth).
But the Dawn hair demo is better because of the massive quantity of simulated hair.
@Rares:
You can’t say that TressFX “looks better” from static captures alone; a very important aspect of simulating hair is the sense of “animation”.
The Nvidia Dawn demo shows a massive improvement over the implementation in Tomb Raider. Yes, it’s a demo and not a game, but the Tomb Raider implementation isn’t new at all. It’s good to see it, but it isn’t a “revolution” like AMD said.
I found Nvidia’s Luna the best demo ever.
I’ve never seen such a transparency effect in a game.
The most important thing is that it’s good to be a PC gamer.
AMD’s counter-attack against Nvidia’s PhysX…
Very useful; in the vanilla Xbox version of the game, Lara’s hair is very odd and its behaviour with the in-game wind looks very buggy.
Sorry for my English ^^
Using more general-purpose approaches is better because they’re not so vendor-specific. Graphics cards should focus on parallel problems and multithreaded performance, and CPUs on single-threaded performance.
TressFX is cool, but not realistic: try swimming and rolling in the dirt a couple of times and TressFX will still render clean hair. Try fighting and shooting at people for 2-3 days with no shampoo on an island… With the software hair we always have dirty hair; it’s more real. So, I want to have an option to disable TressFX.
lol… people still don’t know what’s special about TressFX and keep comparing it to old Nvidia demos… TressFX uses DirectCompute…
It can run on an nV 660M.
Nvidia are a pain in the ASS! They make tools for developers to implement but make them dependent on Nvidia hardware only (what a joke), so no devs really bother using this stuff, because what’s the point of going through the hassle of implementing it when half your game’s users (ATI / Intel) can’t use it? In other words, you can’t guarantee to your users that the game will run correctly as it should on equally powerful cards from different manufacturers, which leaves 60% of your customers running stupidly slower. Nvidia should take the hint from OpenGL and OpenCL and stop putting the customers of ALL game producers in a stupid position. No tech Nvidia comes up with will be fully adopted by the industry unless it works on all platforms. The selling point should be their brand / logo on these games and maybe a slightly faster implementation than on other cards. Pisses me OFF.