FurMark under Linux reports a different frame rate than it shows

Started by Ascaris, July 11, 2016, 03:07:28 PM



Ascaris

Hi,

I have an Asus F8Sp laptop with an ATI Mobility Radeon HD 3650 GPU. I recently installed Linux Mint 17.3 on it and found that the most recent proprietary drivers that support this GPU are too old to work with the kernel or Xorg versions of any recent Linux release. As such, I am currently using the open-source radeon driver.

I noticed that the WebGL sphere test in Peacekeeper runs significantly slower in Firefox on Linux than on the same PC under Windows 7 (~60 FPS in Windows vs. ~40 in Linux). That's why I went looking for a cross-platform GPU benchmark I could use to see whether the driver is the issue... which brought me to GpuTest.

I downloaded the Windows and Linux versions and ran the low-resolution FurMark test on each. In Windows, I got 6 FPS; not great by any means, but I already knew this is an old setup.

In Linux, the titlebar reported 5-6 FPS during the burn-in (it bounced back and forth between the two), and the benchmark itself bounced between 4 and 5 FPS before settling on 4 for the result... but that's not at all what I saw in the test itself. It was drawing perhaps one frame every two seconds, which is about an order of magnitude slower than what it was reporting. In Windows, the 6 FPS result matched what I was seeing.

I'd be happy with 4-5 FPS; that's close enough to the Windows result to assure me that I am not losing a ton of performance by using the open driver. 0.5 FPS, though, is terrible, and that's what I actually got in the test.

OpenGL 3 was used in the Linux test; 2 in Windows.
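
My only guess so far is that the counter and the display are measuring different things: if the FPS counter increments on every buffer-swap call and the driver queues swaps without blocking, the counter can run well ahead of what actually reaches the screen. Here's a minimal sketch of what I mean (not GpuTest's actual code; it assumes GLFW 3 just for the window and GL context):

/* Sketch of an FPS counter that increments on every buffer swap.
 * Assumes GLFW 3; build with: gcc fps_probe.c -o fps_probe -lglfw -lGL */
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return 1;
    GLFWwindow *win = glfwCreateWindow(1024, 640, "fps probe", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(0);                /* vsync off, as in the benchmark */

    int frames = 0;
    double t0 = glfwGetTime();
    while (!glfwWindowShouldClose(win) && glfwGetTime() - t0 < 10.0) {
        glClear(GL_COLOR_BUFFER_BIT);   /* stand-in for the real workload */
        glfwSwapBuffers(win);
        /* If the driver queues the swap and returns immediately, this
         * counter runs ahead of what is actually presented on screen.
         * Uncommenting the next line forces the GPU to finish the frame
         * before it is counted: */
        /* glFinish(); */
        frames++;
        glfwPollEvents();
    }
    double dt = glfwGetTime() - t0;
    printf("counted %d frames in %.1f s -> %.1f FPS\n", frames, dt, frames / dt);
    glfwTerminate();
    return 0;
}

With the glFinish() call enabled, the loop can only advance once the GPU has actually completed each frame, so the printed rate should track what's visible on screen much more closely.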

Any ideas on what is happening?


EDIT: Saw a reply to another post suggesting to attach the log file, so here it is. I am trying to download the previous version now, but the download has been stuck for a while, so I'll have to restart it and see how it goes.

2016:07:11@07:44:05(0000000001) < > GpuTest (Linux x64) is starting up...
2016:07:11@07:44:05(0000000002) < > GpuTest: Cross Platform Graphics Benchmark Utility
2016:07:11@07:44:05(0000000003) < > (C)2012-2014 Geeks3D (www.geeks3d.com)
2016:07:11@07:44:05(0000000004) < > Linux information:
2016:07:11@07:44:05(0000000005) < > - sysname: Linux
2016:07:11@07:44:05(0000000006) < > - release: 3.19.0-32-generic
2016:07:11@07:44:05(0000000007) < > - version: #37~14.04.1-Ubuntu SMP Thu Oct 22 09:41:40 UTC 2015
2016:07:11@07:44:05(0000000008) < > - machine: x86_64
2016:07:11@07:44:05(0000000009) < > gxl3d codename: Zhald.
2016:07:11@07:44:05(0000000010) <!> OpenCL support not found: unable to load the OpenCL core library (libOpenCL.so).
2016:07:11@07:44:05(0000000011) < > gxl3d_plugin_gpu_monitor_gml: GPU monitor plugin for gxl3d.. By JeGX / Geeks3D.com
2016:07:11@07:44:05(0000000012) < > [GpuMonitor plugin] Linux - CPU name: Intel(R) Core(TM)2 Duo CPU     T7800  @ 2.60GHz, CPU speed: 2600MHz, RAM: 7984MB
2016:07:11@07:44:05(0000000013) < > [GpuMonitor plugin] Num GPUs found: 1
2016:07:11@07:44:05(0000000014) < > [GpuMonitor plugin] GPU0 - ATI Mobility Radeon HD 3650
2016:07:11@07:44:05(0000000015) < > [GpuMonitor plugin] GPU0 - PCI codes: 0x1002-0x9591
2016:07:11@07:44:05(0000000016) < > GPU monitoring thread started.
2016:07:11@07:44:05(0000000017) < > Display size: width=1280, height=800
2016:07:11@07:44:05(0000000018) < > Window size: width=1024, height=640, left=0, top=0
2016:07:11@07:44:05(0000000019) < > FurMark - OpenGL renderer init OK.
2016:07:11@07:44:05(0000000020) < > FurMark - VSYNC disabled (xxx_SwapInterval_xxx available).
2016:07:11@07:44:05(0000000021) < > FurMark - OpenGL version detected: 3.0
2016:07:11@07:44:05(0000000022) < > FurMark - # OpenGL extensions: 230
2016:07:11@07:44:05(0000000023) < > FurMark - OpenGL - Renderer model: Gallium 0.4 on AMD RV635
2016:07:11@07:44:05(0000000024) < > FurMark - OpenGL - Renderer vendor: X.Org
2016:07:11@07:44:05(0000000025) < > FurMark - OpenGL - API version: 3.0 Mesa 10.5.9
2016:07:11@07:44:05(0000000026) < > FurMark - OpenGL - Shading language version: 1.30
2016:07:11@07:44:05(0000000027) < > FurMark : init OK.
2016:07:11@07:45:10(0000000028) < > [Benchmark_Score] - module: FurMark - Score: 287 points (1024x640 windowed, duration:60000 ms).
2016:07:11@07:45:10(0000000029) < > Exit from render thread
2016:07:11@07:45:10(0000000030) < > # frames rendered: 287
2016:07:11@07:45:10(0000000031) < > GpuTest 0.7.0
http://www.geeks3d.com

Module: FurMark
Score: 287 points (FPS: 4)

Settings:
- 1024x640 windowed
- antialiasing: Off
- duration: 60000 ms

Renderer:
- Gallium 0.4 on AMD RV635
- OpenGL: 3.0 Mesa 10.5.9
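
For what it's worth, the score appears to be just the total frame count: 287 frames over the 60000 ms run works out to 287 / 60 ≈ 4.8 FPS, which the summary rounds down to 4. So the score and the titlebar counter agree with each other; it's only the picture on screen that doesn't match.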


Ascaris

Tried it with version 0.6 of the benchmark; same result.

Also tried it with Linux Mint 18.0; same result again.