(Tested) ASUS Radeon HD 6670 Review

ASUS Radeon HD 6670 review index

6 – ASUS Radeon HD 6670 Power consumption and temperature

The Radeon HD 6670 is not a card designed for hardcore gamers, so overclocking it is not really important; that's why this article includes no overclocking test. I did, however, run a quick test of power draw and GPU temperature. For this stress test, I used FurMark 1.9.0.

FurMark GPU stress test

At idle, the total power consumption of the testbed is 92W and the GPU temperature of the HD 6670 is 32°C. When FurMark is running (1920×1080 fullscreen, burn-in test), the total power consumption reaches 194W while the GPU temperature tops out at 64°C.

The power draw of the HD 6670 alone can be approximated with:
(194 − 92) × 0.9 ≈ 92W

where 0.9 is the PSU efficiency factor. For the Corsair AX1200 PSU used in the testbed, this factor is around 0.9 (see this article for a graph of the AX1200's efficiency).
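As a minimal sketch of the arithmetic above (Python; the helper name `gpu_power` is mine, not from the review):

```python
# Estimate the GPU-only power draw from total wall measurements,
# correcting for PSU efficiency (the review assumes ~0.9 for the AX1200).

def gpu_power(idle_w, load_w, psu_efficiency=0.9):
    """Power drawn by the card alone, in watts."""
    return (load_w - idle_w) * psu_efficiency

# Figures measured in this review:
print(round(gpu_power(92, 194), 1))  # -> 91.8, rounded to 92W in the text
```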

The official TDP of the HD 6670 is 66W. Keep in mind that this TDP reflects the board's power draw in a typical gaming situation, while FurMark represents peak 3D load. A good quality motherboard is required because the HD 6670 has no additional power connector: all the power it needs comes from the PCI Express slot, which is limited to 75W according to the PCI Express specification. Fortunately, many motherboards can supply more than 75W via the PCI Express slot, but it's something to pay attention to.
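Since the card has no auxiliary connector, the FurMark estimate has to be compared against the 75W slot budget. A trivial check (Python; the constant and function names are mine):

```python
# The PCI Express spec rates the x16 slot at 75 W; with no auxiliary
# power connector, everything the HD 6670 draws must come through the slot.
PCIE_SLOT_LIMIT_W = 75

def exceeds_slot_limit(card_power_w):
    """True if the card would need more than the slot can officially supply."""
    return card_power_w > PCIE_SLOT_LIMIT_W

print(exceeds_slot_limit(92))  # FurMark-based estimate -> True
print(exceeds_slot_limit(66))  # official TDP -> False
```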

The fan of the ASUS HD 6670 VGA cooler is not noisy, but in an open case you can hear it (the default fan speed is 39%). Under FurMark, the noise level remains nearly the same. In any case, the noise level is very acceptable.


9 thoughts on “(Tested) ASUS Radeon HD 6670 Review”

  1. Luxembourgian

    Full HD is a bit of a funny resolution; I'd rather wait for a 16:10 LCD (1920×1200) and for prices to come down.

  2. jK

    You say it competes with the GT 440 (~$75) and the GTS 450 (>$110) and costs $100, yet you only list the GT 440 in the benchmarks (which is obviously slower).
    To me the numbers give the impression that the GTS 450 is the better choice.

  3. JeGX Post Author

    @jK: you're right; according to other tests around the Net, the GTS 450 is a very nice alternative to the HD 6670. But I didn't cover the GTS 450 for two reasons: I don't have one, so I can't compare performance, and the GTS 450 requires an additional power connector. Neither the GT 440 nor the HD 6670 has a power connector.

  4. Luxembourgian

    Yes, and they both score about 12k overall in 3DMark06 and are both priced below $100. You won't be running PhysX at Full HD, so the Radeon would be the better choice here.

  5. ^^

    GTS 450 – 1GB DDR3 99.90€
    GT 440 – 1GB GDDR5 (slow card) 99.90€
    HD 6670 – 1GB GDDR5 (fast card) 99.90€

    Now who wins?
    ^^ HD 6670 ^^ ABSOLUTELY THE BIG WINNER

  6. Sturla

    I feel that the power draw conclusion is a bit off.

    Idle, total power cons. 92W.
    FurMark, total power cons. 194W
    (194 – 92) * 0.9 = 92W

    That would mean that the CPU and the rest of the system don't use any more power when the GPU is stress tested. I think the 66W TDP is pretty accurate, as manufacturers have no need to understate those figures. Also, I doubt that the PSU reaches 90% efficiency below 200W.

  7. Tudor

    Yes, you are correct about the efficiency number; it's around 0.85–0.88 at that low wattage. Redoing the calculation, the Radeon HD 6670 draws about 87–89 watts at full load, which isn't that far off from 92.

    But there is one thing you are wrong about: FurMark stresses ONLY the GPU. CPU usage stays at a normal 1–5% while FurMark runs, so you can't blame the rest of the system for the extra power draw during the stress test. And even with some light CPU usage, the card's draw would still be in the 80W range, somewhat over the limit of what a PCI Express slot can handle. The only downside I see is that you can't overclock the card. Other than that, the numbers look perfectly fine 🙂

  8. Jim

    The review doesn't cover CrossFireX potential. I have an A10-based machine and want to know how this card will perform in CrossFire. My understanding is that you have to master off the A10, but that you can get theoretical boosts equal to some percentage of this card's performance in CrossFire mode, making the machine quite capable for its low power and cost. Any attempt to evaluate CrossFire, or expectations?

Comments are closed.