1 – GTX 590 Overview
In short, the GTX 590 is slower than the HD 6990 and is not recommended for overclocking. The price is a bit too high, but the GTX 590 is smaller and quieter than the HD 6990…
The GTX 590 is made up of two GF110 GPUs (the same GPU as the GTX 580) clocked at a lower speed (GTX 590: 607MHz, GTX 580: 772MHz) in order to keep power consumption at an acceptable level. And to make sure nobody uses FurMark to show the real power consumption, NVIDIA has aggressively capped FurMark in the drivers…
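The clock cut alone explains much of how two GF110 GPUs fit under a 365W TDP. As a rough first-order rule, dynamic power scales with frequency times voltage squared (P ∝ f·V²). The sketch below is only a back-of-the-envelope estimate, assuming a GTX 580 TDP of ~244W at roughly 1.0V stock voltage (my assumptions, not NVIDIA figures) and the 0.938V GTX 590 default that W1zzard mentions later in this article:

```python
# Rough first-order estimate of why 2x GF110 fits in a 365W TDP.
# Assumptions (not NVIDIA's numbers): GTX 580 TDP ~244W at ~1.0V stock,
# and dynamic power dominating, scaling as frequency x voltage^2.

gtx580_tdp_w = 244.0
gtx580_clock_mhz = 772.0
gtx580_volt = 1.0      # approximate stock voltage (assumption)

gtx590_clock_mhz = 607.0
gtx590_volt = 0.938    # default voltage reported by W1zzard

scale = (gtx590_clock_mhz / gtx580_clock_mhz) * (gtx590_volt / gtx580_volt) ** 2
estimate_w = 2 * gtx580_tdp_w * scale
print(f"Estimated dual-GPU power: {estimate_w:.0f} W")
```

The estimate lands in the same ballpark as the official 365W TDP, which is consistent with the idea that the lower clock and voltage, not a different chip, are what make the dual-GPU board possible.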
GeForce GTX 590 reference specifications
- GPU: 2 x GF110 @ 607MHz, 40nm
- Shader processors: 1024 (2 x 512)
- Memory: 3072MB (1536MB per GPU) GDDR5 @ 853.5MHz real speed or 3414MHz effective speed, 768-bit (384-bit per GPU)
- Texture units: 128 (2×64)
- ROPs: 96 (2×48)
- TDP: 365W
- Power connectors: two 8-pin
- GPU VRM: 10 phases (5 phases per GPU)
- GPU Memory VRM: 4 phases (2 phases per GPU memory)
- Price: US $699
- APIs: OpenGL 4.1, Direct3D 11, OpenCL 1.1, CUDA, PhysX
- Display connectors: three dual-link DVI-I and one mini DisplayPort
- GFlops single precision: 2486 (GTX 580: 1581, HD 6990: 5407)
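The single-precision figure in the spec list follows directly from the shader count and clock: GF110 shader cores run at twice the graphics clock (the "hot clock") and can issue two floating-point operations (one fused multiply-add) per cycle, and GDDR5 transfers 4 bits per pin per clock, hence the "effective" memory speed. A quick sanity check:

```python
# Sanity check of the GTX 590 reference specs listed above.

GRAPHICS_CLOCK_MHZ = 607   # per GPU
SHADERS = 1024             # 2 x 512
MEM_CLOCK_MHZ = 853.5      # real GDDR5 clock

# GF110 shader (hot) clock is 2x the graphics clock,
# and each shader does 2 FLOPs per cycle (one FMA).
hot_clock_mhz = 2 * GRAPHICS_CLOCK_MHZ
gflops = SHADERS * hot_clock_mhz * 2 / 1000.0
print(f"Single precision: {gflops:.0f} GFlops")        # 2486, matches the spec

# GDDR5 is quad data rate: 4 transfers per clock.
effective_mhz = 4 * MEM_CLOCK_MHZ
print(f"Effective memory speed: {effective_mhz:.0f} MHz")  # 3414 MHz
```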
2 – Performance
Unigine Heaven (1920×1080, 4X MSAA, 16X anisotropic filtering, Extreme tessellation)
3 – Power consumption and temperatures
GeForce GTX 590, idle temperature (source)
GeForce GTX 590, load temperature (source)
Two days ago, I witnessed almost live the destruction of a GTX 590, not by FurMark but by one of FurMark’s babies 😉 Now I know why NVIDIA has limited FurMark’s GPU load so aggressively: a non-overclocked GTX 590 can be damaged by FurMark!
UPDATE: FurMark is not required to burn a GTX 590 😀
And according to hardware.fr, the real power consumption of the GTX 590 under a non-blacklisted version of FurMark is 450W!
And here’s what W1zzard (the author of GPU-Z) says:
As a first step, I increased the voltage from 0.938 V default to 1.000 V, maximum stable clock was 815 MHz – faster than GTX 580! Moving on, I tried 1.2 V to see how much could be gained here, at default clocks and with NVIDIA’s power limiter enabled. I went to heat up the card and then *boom*, a sound like popcorn cracking, the system turned off and a burnt electronics smell started to fill up the room. Card dead! Even with NVIDIA power limiter enabled. Now the pretty looking, backlit GeForce logo was blinking helplessly and the fan did not spin, both indicate an error with the card’s 12V supply.
After talking to several other reviewers, this does not seem to be an isolated case, and many of them have killed their cards with similar testing, which is far from being an extreme test.
I most strongly advise anyone to stay away from overclocking this product and use extremely conservative settings, maybe up to 650 MHz and no voltage adjustments.
According to NVIDIA this should not happen. In their official reviewer driver (which I used), the NVIDIA Power limit is designed to be active for all applications, not only Furmark.
AnandTech has a detailed explanation about the latest tweaks added by NVIDIA to limit the GPU load when FurMark is running:
Starting with the ForceWare 257 series drivers, NVIDIA is now using OCP at all times, meaning OCP now protects against any possible program that would generate an excessive load (as defined by NVIDIA), and not just FurMark and OCCT. At this time there’s definitely still a driver component involved as NVIDIA still throttles FurMark and OCCT right off the bat, but everything else seems to be covered by their generic detection methods.
At this point our biggest complaint is that OCP’s operation is still not transparent to the end user. If you trigger it you have no way of knowing unless you know how the game/application should already be performing. NVIDIA tells us that at some point this will be exposed through NVIDIA’s driver API, but today is not that day. Along those lines, at least in the case of FurMark and OCCT OCP still throttles to an excessive degree – whereas AMD gets this right and caps anything and everything at the PowerTune limit, we still see OCP heavily clamp these programs to the point that our GTX 590 draws 100W more under games than it does under FurMark. Clamping down on a program to bring power consumption down to safe levels is a good idea, but clamping down beyond that just hurts the user and we hope to see NVIDIA change this.
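The OCP behavior AnandTech describes can be pictured as a simple control loop in the driver: sample board power, and if it exceeds a threshold, clamp the clocks until the draw falls back in range. The sketch below is purely illustrative; the function names, step sizes, and polling model are my inventions, not NVIDIA's driver internals:

```python
# Illustrative sketch of an OCP-style power limiter loop.
# All names and values are hypothetical, not NVIDIA driver code.

POWER_LIMIT_W = 365        # board TDP
THROTTLE_STEP_MHZ = 50     # how hard to clamp per violation
RECOVER_STEP_MHZ = 10      # how fast to restore the clock
MIN_CLOCK_MHZ = 300
STOCK_CLOCK_MHZ = 607

def ocp_step(board_power_w: float, gpu_clock_mhz: int) -> int:
    """Return the next GPU clock given one board power sample."""
    if board_power_w > POWER_LIMIT_W:
        # Over the limit: clamp the clock down.
        return max(MIN_CLOCK_MHZ, gpu_clock_mhz - THROTTLE_STEP_MHZ)
    # Under the limit: slowly walk back toward the stock clock.
    return min(STOCK_CLOCK_MHZ, gpu_clock_mhz + RECOVER_STEP_MHZ)

# A FurMark-like load pushing the board over the limit gets throttled:
clock = STOCK_CLOCK_MHZ
for power_sample_w in (450, 430, 410, 390, 370, 350):
    clock = ocp_step(power_sample_w, clock)
print(clock)  # 367: well below stock after repeated violations
```

AnandTech's complaint maps onto this sketch: AMD's PowerTune effectively holds the card right at the limit, while NVIDIA's OCP clamps far below it for detected programs, which is why the GTX 590 draws less under FurMark than under games.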
And here is the power consumption test with FurMark (now totally useless!!!)
This is probably the best graph for illustrating just how hard OCP throttles FurMark. Whereas AMD’s PowerTune does a very good job of keeping power consumption near the true power limit on the 6990 (in this case 375W), OCP is far more aggressive. This is why the GTX 590 consumes nearly 100W less, and why FurMark’s status as a worst case scenario test is compromised with overly aggressive OCP. Even the GTX 590 OC with its voltage bump is throttled to the point where it consumes less power than the 6990.