NVIDIA GeForce GTX 580: The Anti-FurMark DX11 Card

NVIDIA GeForce GTX 580

NVIDIA GeForce GTX 580 - Anti FurMark card

NVIDIA has officially released the successor of the GTX 480: the GeForce GTX 580. This card is powered by the GF110 GPU, which is a refresh of the GF100 GPU. For more details about what's new in the GF110, check this page.

But the real novelty is elsewhere: the power draw is now under strict control. Like AMD with the Radeon HD 5000 series (see ATI Cypress (Radeon HD 5870) Cards Have Hardware Protection Against Power Virus Like FurMark and OCCT), NVIDIA has added dedicated hardware to limit the power draw. And still like AMD with the Catalyst drivers (see FurMark Slowdown by Catalyst Graphics Drivers is Intentional!), there are some optimizations in ForceWare R262.xx when FurMark (or OCCT) is detected (hehe, maybe the weak link???). In short, when FurMark is detected, the GTX 580 is throttled back by the power consumption monitoring chips. Now we have the explanation of this strange FurMark screenshot.

1 – GeForce GTX 580 specifications

  • GPU: GF110 @ 772MHz / 40nm
  • Shader cores: 512 @ 1544MHz
  • Memory: 1536MB GDDR5 @ 1002MHz real clock (or 4008MHz effective, see Graphics Cards Memory Speed Demystified for more details), 384-bit bus width
  • Texture units: 64
  • ROPs: 48
  • TDP: 244 watts
  • Power connectors: 6-pin + 8-pin
  • Price: USD $500
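As a quick sanity check of the specs above, the theoretical memory bandwidth follows directly from the memory clock and bus width (a minimal sketch; the ~192 GB/s result is derived from the listed numbers, not stated in the article):

```python
# Theoretical memory bandwidth of the GTX 580, computed from the spec list above.
real_clock_mhz = 1002                      # GDDR5 real clock
effective_clock_mhz = real_clock_mhz * 4   # GDDR5 transfers 4x per clock -> 4008 MHz effective
bus_width_bits = 384

# bandwidth = effective transfer rate (Hz) * bus width (bytes)
bandwidth_gbs = effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # -> 192.4 GB/s
```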

GTX 580 - GPU-Z

2 – GTX 580 Power Draw Monitoring

To shorten the story, NVIDIA uses a mix of hardware monitoring chips AND FurMark detection at the driver level to limit the power draw.
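This two-part mechanism can be sketched roughly as follows. Everything here is illustrative (the clock values, the 300 W threshold applied per sample, and all names are assumptions, not NVIDIA's actual firmware logic); it only mirrors the described behavior: three sensors report 12 V voltage/current per input (PCI-E slot, 6-pin, 8-pin), and clocks are cut when total power exceeds the limit while a stress app is detected, then restored once it drops:

```python
# Hypothetical sketch of the GTX 580 power limiter: hardware monitoring
# (per-rail voltage/current) combined with driver-side app detection.
POWER_LIMIT_W = 300.0
NORMAL_CLOCK_MHZ = 772
THROTTLED_CLOCK_MHZ = 386  # assumed ~50% clock modulation, for illustration

def total_power(rails):
    """rails: list of (voltage_v, current_a) tuples, one per 12 V input."""
    return sum(v * i for v, i in rails)

def next_clock(rails, stress_app_detected):
    # The limiter only engages when the driver has flagged FurMark/OCCT;
    # normal games run at full clocks regardless of the power draw.
    if stress_app_detected and total_power(rails) > POWER_LIMIT_W:
        return THROTTLED_CLOCK_MHZ
    return NORMAL_CLOCK_MHZ

rails = [(12.0, 6.0), (12.0, 10.0), (12.0, 12.0)]   # 336 W total: over the cap
print(next_clock(rails, stress_app_detected=True))   # 386 -> throttled
print(next_clock(rails, stress_app_detected=False))  # 772 -> untouched
```

The key design point, per the quotes below: the hardware always measures, but the throttle decision is gated on driver-side application detection.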

GTX 580 and power hardware monitoring


GTX 580 - Voltage and current monitoring chips
GTX 580 – Voltage and current monitoring chips, labelled U14, U15 and U16

GTX 580 - Clock modulation in FurMark

From W1zzard / TPU:

In order to stay within the 300 W power limit, NVIDIA has added a power draw limitation system to their card. When either Furmark or OCCT are detected running by the driver, three sensors measure the inrush current and voltage on all 12 V lines (PCI-E slot, 6-pin, 8-pin) to calculate power. As soon as the power draw exceeds a predefined limit, the card will automatically clock down and restore clocks as soon as the overcurrent situation has gone away. NVIDIA emphasizes this is to avoid damage to cards or motherboards from these stress testing applications and claims that in normal games and applications such an overload will not happen. At this time the limiter is only engaged when the driver detects Furmark / OCCT, it is not enabled during normal gaming. NVIDIA also explained that this is just a work in progress with more changes to come. From my own testing I can confirm that the limiter only engaged in Furmark and OCCT and not in other games I tested. I am still concerned that with heavy overclocking, especially on water and LN2 the limiter might engage, and reduce clocks which results in reduced performance. Real-time clock monitoring does not show the changed clocks, so besides the loss in performance it could be difficult to detect that state without additional testing equipment or software support.

I did some testing of this feature in Furmark and recorded card only power consumption over time. As you can see the blue line fluctuates heavily over time which also affects clocks and performance accordingly. Even though we see spikes over 300 W in the graph, the average (represented by the purple line) is clearly below 300 W. It also shows that the system is not flexible enough to adjust power consumption to hit exactly 300 W.

GeForce GTX 580 power draw under FurMark:

  • 153 watts with the limiter (hardware chips + driver)
  • 304 watts without the limiter
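W1zzard's observation above (spikes over 300 W, but a time-average clearly below it) is worth restating numerically. A minimal sketch with made-up sample values, only to illustrate the peak-vs-average distinction:

```python
# Illustrative power samples (watts) over time under a throttling limiter:
# individual spikes exceed the 300 W cap, but the average does not.
samples_w = [310, 150, 305, 160, 298, 155, 312, 148]

avg_w = sum(samples_w) / len(samples_w)
print(max(samples_w) > 300)  # True  -> momentary spikes exceed the cap
print(avg_w < 300)           # True  -> time-average stays well under it
```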

GTX 580 - Power consumption under FurMark

Conclusion from W1zzard / TPU:

A feature that will certainly be discussed at length in forums is the new power draw limiting system. When the card senses it is overloaded by either Furmark or OCCT, the card will reduce clocks to keep power consumption within the board power limit of 300 W. Such a system seems justified to avoid damage to motherboard and VGA card and allows NVIDIA to design their product robustness with real loads in mind. NVIDIA stresses that this system is designed not to limit overclocking or voltage tuning and that they will continue making improvements to it. Right now I also see reviewers affected because many rely on Furmark for testing temperatures, noise, power and other things which will make the review production process a bit more complex too. For the every day gamer the power draw limiter will not have any effect on performance.

From Hexus

GTX 480 is one hot-running beastie. Give it some FurMark love and watch the watts spiral out of control, way above the rated 250W TDP, and hear the reference cooler’s fan run fast enough to sound like a turbine. The cooler’s deficiencies have been well-documented in the press. NVIDIA doesn’t like you running FurMark, mainly because it’s not indicative of real-world gameplay and causes the GPU to run out of specification. We like it because it makes high-end cards squeal!

So concerned is NVIDIA with the pathological nature of FurMark and other stress-testing apps, it is putting a stop to it by incorporating hardware-monitoring chips on the PCB. Their job is to ensure that the TDP of the card isn’t breached by such apps, and they do this by monitoring the load on each 12V rail.

Should a specific application hammer the GPU to the point where the power-draw is way past specification, as FurMark does to a GTX 480, the hardware chips will simply clock the card down. Pragmatically, running FurMark v1.8.2 on the GTX 580 results in half the frame-rate (and 75 per cent of the load) that we experience on a ’480 with the same driver. The important point is that the power management is controlled by a combination of software driver and hardware monitoring chips.

NVIDIA goes about the power-management game sensibly, because the TDP cap only comes into play when the driver and chips determine that a stress-testing app is being used – currently limited to FurMark v1.8+ and OCCT – so users wishing to overclock the card and play real-world games are able to run past the TDP without the GPU throttling down. Should new thermal stress-testing apps be discovered, NVIDIA will invoke power capping for them with a driver update.

From AnandTech:

NVIDIA’s reasoning for this change doesn’t pull any punches: it’s to combat OCCT and FurMark. At an end-user level FurMark and OCCT really can be dangerous – even if they can’t break the card any longer, they can still cause other side-effects by drawing too much power from the PSU. As a result having this protection in place more or less makes it impossible to toast a video card or any other parts of a computer with these programs. Meanwhile at a PR level, we believe that NVIDIA is tired of seeing hardware review sites publish numbers showcasing GeForce products drawing exorbitant amounts of power even though these numbers represent non real world scenarios. By throttling FurMark and OCCT like this, we shouldn’t be able to get their cards to pull so much power. We still believe that tools like FurMark and OCCT are excellent load-testing tools for finding a worst case scenario and helping our readers plan system builds with those scenarios in mind, but at the end of the day we can’t argue that this isn’t a logical position for NVIDIA.

Now something really interesting, thanks to FudZilla:

GTX 580 - FurMark 1.8.2
GTX 580 and FurMark 1.8.2: the GPU temp does not exceed 76°C

GTX 580 - FurMark 1.6.0
GTX 580 and FurMark 1.6.x: the GPU temp reaches 90°C!!!

My conclusion: I NEED A GTX 580!!!!

3 – Performance

GTX 580 - TessMark
OpenGL 4.0: TessMark

GTX 580 - Unigine Heaven performances
Direct3D 11: Unigine Heaven performances

4 – Reviews
