Recent Posts

Pages: 1 ... 7 8 [9] 10
English forum / Problem Launching GLSL Hacker
« Last post by bentleyousley on December 23, 2014, 11:27:41 PM »

I'm experiencing problems when launching GLSL Hacker.

The Log:

2014/12/23@16:16:36(0000000001) < > GLSL Hacker v0.8.0.3 (Nov 18 2014@16:05:40)
2014/12/23@16:16:36(0000000002) < > Cross Platform Pixel Hacking Utility
2014/12/23@16:16:36(0000000003) < > (C)2012-2014 Geeks3D (
2014/12/23@16:16:36(0000000004) < > GLSL Hacker is starting up...
2014/12/23@16:16:36(0000000005) < > platform: Windows 64-bit
2014/12/23@16:16:36(0000000006) < > OpenCL: OpenCL computing plugin (OpenCL 1.1, 1.2).. By JeGX /
2014/12/23@16:16:36(0000000007) < > Python27: Python 2.7 (64-bit) host api plugin for gxl engine. By JeGX /
2014/12/23@16:16:36(0000000008) < > [Python27] initializing Python 2.7 plugin...
2014/12/23@16:16:36(0000000009) < > [Python27] Python version: 2.7.6 (default, Nov 10 2013, 19:24:24) [MSC v.1500 64 bit (AMD64)]
2014/12/23@16:16:36(0000000010) < > [Python27] Python compiler: [MSC v.1500 64 bit (AMD64)]
2014/12/23@16:16:36(0000000011) < > [Python27] Python build info: default, Nov 10 2013, 19:24:24
2014/12/23@16:16:36(0000000012) < > [Python27] Python installation detected. Happy coding!
2014/12/23@16:16:36(0000000013) < > [Python27] Python plugin initialization ok.
2014/12/23@16:16:36(0000000014) < > [Python27] Python version: 2.7.6 (default, Nov 10 2013, 19:24:24) [MSC v.1500 64 bit (AMD64)]
2014/12/23@16:16:36(0000000015) < > FBX: Autodesk FBX 3D object loader plugin. Supported formats: *.fbx, *.3ds, *.obj. By JeGX /
2014/12/23@16:16:36(0000000016) < > PhysX3: NVIDIA PhysX3 plugin for gxl3d.. By JeGX /
2014/12/23@16:16:36(0000000017) < > FFmpeg: FFmpeg plugin for gxl3d.. By JeGX /
2014/12/23@16:16:36(0000000018) < > GpuMonitor: GPU monitor plugin for gxl3d.. By JeGX /
2014/12/23@16:16:36(0000000019) < > [GpuMonitor] Operating system detected: Windows 7 64-bit build 7601 (Service Pack 1)
2014/12/23@16:16:36(0000000020) < > [GpuMonitor] Num GPUs found: 1
2014/12/23@16:16:36(0000000021) < > [GpuMonitor] GPU0 - GeForce GTS 250
2014/12/23@16:16:36(0000000022) < > [GpuMonitor] GPU0 - PCI codes: 0x10DE-0x0615
2014/12/23@16:16:36(0000000023) < > [GpuMonitor] GPU0 - BIOS:
2014/12/23@16:16:36(0000000024) < > FMOD: FMOD audio engine plugin. By JeGX /
2014/12/23@16:16:37(0000000025) < > FreeImage: FreeImage image loader. Supported formats: *.JPEG, *.PNG, *.TGA, *.BMP, *.PSD, *.GIF, *.HDR, *.PIC. By JeGX /
2014/12/23@16:16:37(0000000026) < > [FreeImage] version: 3.16.0
2014/12/23@16:16:37(0000000027) < > AntTweakBar: AntTweakBar plugin for GLSL Hacker. By JeGX /
2014/12/23@16:16:37(0000000028) < > LeapMotion: Leap Motion plugin.. By JeGX /
2014/12/23@16:16:37(0000000029) < > [LeapMotion] - no Leap device detected
2014/12/23@16:16:37(0000000030) < > FreeTypeGL: FreeTypeGL. By JeGX /
2014/12/23@16:16:37(0000000031) < > [OpenCL] initializing OpenCL data...
2014/12/23@16:16:37(0000000032) < > [OpenCL] found 1 OpenCL platform(s)
2014/12/23@16:16:37(0000000033) < > [OpenCL] Platform 0
2014/12/23@16:16:37(0000000034) < > [OpenCL] - vendor: NVIDIA Corporation
2014/12/23@16:16:37(0000000035) < > [OpenCL] - name: NVIDIA CUDA
2014/12/23@16:16:37(0000000036) < > [OpenCL] - profile: FULL_PROFILE
2014/12/23@16:16:37(0000000037) < > [OpenCL] - version: OpenCL 1.0 CUDA 3.0.1
2014/12/23@16:16:37(0000000038) < > [OpenCL] Platform 0 - devices count: 1
2014/12/23@16:16:37(0000000039) < > [OpenCL] - Device 0 - name: GeForce GTS 250 - Compute units: 16 @ 1836MHz
2014/12/23@16:16:37(0000000040) < > [OpenCL] OpenCL data initialized ok.
2014/12/23@16:16:37(0000000041) < > [PhysX3] PhysX version detected (Windows): 9100129
2014/12/23@16:16:37(0000000042) < > [PhysX3] PhysX SDK v3.3.2

When I double-click on GLSLHacker.exe, a dialog box opens asking if I want to run the program. When I click OK, nothing happens.

Any ideas about what I need to do to get this running?

Thank you for your help,

General Discussion / Re: Real core clock
« Last post by JeGX on December 04, 2014, 09:10:07 PM »
Yes, GPU Shark shows the real clock speed in the current Pstate section (in blue).
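If you want to read the same value from a script rather than a GUI, NVIDIA's nvidia-smi command-line tool (shipped with the driver) can report the current graphics clock. A minimal Python sketch, assuming nvidia-smi is on the PATH; the "772 MHz" sample line below is purely illustrative, not a value from this thread:

```python
import subprocess

def current_core_clock_mhz(sample=None):
    """Return the current (real) graphics clock in MHz.

    Queries NVIDIA's nvidia-smi CLI; `sample` lets us parse a
    captured output line instead of invoking the tool, which is
    handy for testing or offline parsing.
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=clocks.current.graphics",
             "--format=csv,noheader"], text=True)
    # Output looks like "772 MHz" -- keep the leading number only.
    return int(sample.split()[0])

print(current_core_clock_mhz(sample="772 MHz"))  # → 772
```

Whether the 2014-era driver for an older card like a GTX 580 already exposes this query field is not guaranteed, so treat this as a sketch to adapt.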
General Discussion / Re: Real core clock
« Last post by Darkbluesky on December 03, 2014, 02:40:43 PM »
Thanks a lot.

Does GPU Shark show the real core clock, as FurMark, EVGA OC Scanner X, and MSI Kombustor do (according to the link in the OP)?

General Discussion / Re: Real core clock
« Last post by JeGX on December 03, 2014, 01:42:30 PM »
There is GPU Shark, based more or less on the same GPU monitoring code as FurMark, which displays all you need:
General Discussion / Re: Real core clock
« Last post by Darkbluesky on December 03, 2014, 08:36:30 AM »
Although I was searching the web, I did not try the easiest option: FurMark, OC Scanner X, and MSI Kombustor have been upgraded/modified to show the real core clock.


But then, what I need is to know the real core clock when using another benchmark or game, just as GPU-Z would show me the core clock, but the real one. Does it exist? Is there a way to know it?

General Discussion / Real core clock
« Last post by Darkbluesky on December 03, 2014, 08:17:37 AM »

I have an ASUS GTX580, which still serves me well, and I have seen the article about throttling of the card.

In that article, the analyst shows the REAL core clock and even the throttling. I would like to know how I can see/show the real core clock.

As the article's comments are closed (the article is 2 years old), I can't post there, so I wondered if someone here could help me.

Do you know how it is done?

Thank you very much!
English forum / GL-Z 0.1.0 released
« Last post by JeGX on November 21, 2014, 04:02:31 PM »
EIZO Corporation (TSE: 6737) today announced the new FlexScan EV2730Q, a 26.5-inch square monitor with a 1920 × 1920 resolution (1:1 aspect ratio). The monitor is the newest addition to EIZO’s FlexScan EcoView Series which combines both ergonomic and environmental features for an economical result.

The FlexScan EV2730Q is wide all around – the unique 1920 × 1920 resolution provides users with 78% more pixels compared with a standard widescreen 1920 × 1080 monitor. The extended vertical space is convenient for displaying large amounts of information in long windows, reducing the need for excess scrolling and providing a more efficient view of data. This makes the monitor ideal for displaying information such as CAD or program development data with a more complete overall view on screen.

The non-glare IPS panel has wide viewing angles, making the monitor comfortable to view in any workstation and from any angle. The ergonomically designed stand with height adjustment, tilt, and swivel provides positioning flexibility and user comfort.

To lower eyestrain, the monitor utilizes an EIZO-developed solution that regulates brightness to make flicker unperceivable. In addition, the wide dimming range allows the monitor to be adjusted to just 1% of maximum brightness for higher comfort in dimly-lit work environments.

Five preset modes are included – sRGB, Movie, Paper, and two modes with user-adjustable settings. Paper mode reduces the amount of blue light to help prevent eye fatigue.

The monitor includes EIZO’s own EcoView technologies such as EcoView Optimizer 2, which saves power by reducing the backlight brightness and increasing the gain when displaying mostly dark content. In addition, Auto EcoView automatically adjusts the screen’s brightness in accordance with changes in ambient lighting to trim power usage while reducing eye fatigue.

A presence sensor called EcoView Sense 2 detects when the user leaves the desk and automatically switches to power save mode. When the user returns, EcoView Sense 2 powers the monitor on again. It detects both the user’s movements and body heat for increased accuracy.


3D-Tech News Around The Web / C++11/14/17 Features In VS 2015 Preview
« Last post by JeGX on November 20, 2014, 01:54:01 PM »
Visual Studio 2015 Preview is now available, so here's an updated feature table for the Core Language:

So, you just got access to the latest supercomputer with thousands of GPUs. Obviously this is going to help you a lot with accelerating your scientific calculations, but how are you going to analyze, reduce and visualize this data? Historically, you would be forced to write everything out to disk, just to later read it back into another data analysis cluster.

Wouldn’t it be nice if you could analyze and visualize your data as it is being generated, without having to go through a file system? And wouldn’t it be cool to interact with the simulation, maybe even modifying parameters while the simulation is running?

And wouldn’t it be nice to use your GPU for that as well? As it turns out, you can actually do this. It’s called in-situ visualization, meaning visualization of datasets in-place where they are computed. High-quality, high performance rendering and visualization is just one of the capabilities of the Tesla Accelerated Computing Platform.
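The idea can be sketched in a few lines: instead of writing every timestep to disk, the simulation loop hands the live state to analysis or visualization callbacks as it runs. A toy Python sketch with a hypothetical observer interface (a tiny 1-D diffusion loop standing in for the real simulation; this is not the actual Bonsai2 code):

```python
def simulate(steps, n, observers):
    """Toy 1-D heat-diffusion loop with in-situ observers.

    After every step, each registered observer is called with the
    live state -- analysis happens in place, and nothing touches
    the file system.
    """
    state = [0.0] * n
    state[n // 2] = 1.0  # initial heat spike in the middle
    for step in range(steps):
        # simple explicit diffusion update on interior points
        nxt = [0.0] * n
        for i in range(1, n - 1):
            nxt[i] = state[i] + 0.25 * (state[i - 1] - 2 * state[i] + state[i + 1])
        state = nxt
        for obs in observers:
            obs(step, state)  # in-situ analysis/visualization hook
    return state

# In-situ "analysis": track the peak temperature per step, in memory.
peaks = []
simulate(10, 33, [lambda step, s: peaks.append(max(s))])
print(len(peaks), peaks[0])  # → 10 0.5
```

A real in-situ pipeline would pass GPU-resident buffers to a renderer rather than Python lists, but the structure is the same: the consumer is a callback inside the producer's loop, not a separate job reading files.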


Cosmological simulations like those undertaken by a group led by Professor Simon Portegies-Zwart at the Leiden Observatory in the Netherlands provide a good example of present-day in-situ visualization. To understand how the Milky Way galaxy formed, and how dark matter influenced the process, they run very large-scale GPU-accelerated gravitational simulations with the Bonsai2 code. Their simulations are so powerful and efficient that their code is one of the nominees for this year’s Gordon Bell awards.
