Recent Posts

Pages: [1] 2 3 ... 10
1
General Discussion / Re: Real core clock
« Last post by JeGX on December 04, 2014, 09:10:07 PM »
Yes, GPU Shark shows the real clock speed in the current P-State section (in blue).
2
General Discussion / Re: Real core clock
« Last post by Darkbluesky on December 03, 2014, 02:40:43 PM »
Thanks a lot.

Does GPU Shark show the real core clock, as do (according to the link in the OP) FurMark, EVGA OC Scanner X, and MSI Kombustor?

Thanks!
3
General Discussion / Re: Real core clock
« Last post by JeGX on December 03, 2014, 01:42:30 PM »
There is GPU Shark, based more or less on the same GPU monitoring code as FurMark, which displays all you need:

http://www.geeks3d.com/20141110/gpu-shark-0-9-2-videocard-monitoring-utility-geforce-radeon/
4
General Discussion / Re: Real core clock
« Last post by Darkbluesky on December 03, 2014, 08:36:30 AM »
Although I was searching the web, I did not try the easiest option. FurMark, OC Scanner X, and MSI Kombustor are the tools that have been upgraded/modified to show the real core clock.

Ok.

But then what I need is a way to know the real core clock when using another benchmark or game, just as GPU-Z would show me the core clock, except the real one. Does such a tool exist? Is there a way to know it?

Thanks!
5
General Discussion / Real core clock
« Last post by Darkbluesky on December 03, 2014, 08:17:37 AM »
Hello,

I have an ASUS GTX580, which still serves me well, and I have seen the article http://www.geeks3d.com/20120425/geforce-gtx-580-and-gpu-throttling-with-tdp-apps/ about throttling of the card.

In that article, the analyst shows the REAL core clock and even the throttling. I would like to know how I can see/show the real core clock.

As the article's comments are closed (the article is 2 years old), I can't post there, so I wondered if someone here could help me.

Do you know how it is done?

Thank you very much!
6
English forum / GL-Z 0.1.0 released
« Last post by JeGX on November 21, 2014, 04:02:31 PM »
7
Quote
EIZO Corporation (TSE: 6737) today announced the new FlexScan EV2730Q, a 26.5-inch square monitor with a 1920 × 1920 resolution (1:1 aspect ratio). The monitor is the newest addition to EIZO’s FlexScan EcoView Series which combines both ergonomic and environmental features for an economical result.

The FlexScan EV2730Q is wide all around – the unique 1920 × 1920 resolution provides users with 78% more pixels compared with a standard widescreen 1920 × 1080 monitor. The extended vertical space is convenient for displaying large amounts of information in long windows, reducing the need for excess scrolling and providing a more efficient view of data. This makes the monitor ideal for displaying information such as CAD or program development data with a more complete overall view on screen.

The non-glare IPS panel has wide viewing angles, making the monitor comfortable to view in any workstation and from any angle. The ergonomically designed stand with height adjustment, tilt, and swivel provides positioning flexibility and user comfort.

To lower eyestrain, the monitor utilizes an EIZO-developed solution that regulates brightness to make flicker unperceivable. In addition, the wide dimming range allows the monitor to be adjusted to just 1% of maximum brightness for higher comfort in dimly-lit work environments.

Five preset modes are included – sRGB, Movie, Paper, and two modes with user-adjustable settings. Paper mode reduces the amount of blue light to help prevent eye fatigue.

The monitor includes EIZO’s own EcoView technologies such as EcoView Optimizer 2, which saves power by reducing the backlight brightness and increasing the gain when displaying mostly dark content. In addition, Auto EcoView automatically adjusts the screen’s brightness in accordance with changes in ambient lighting to trim power usage while reducing eye fatigue.

A presence sensor called EcoView Sense 2 detects when the user leaves the desk and automatically switches to power save mode. When the user returns, EcoView Sense 2 powers the monitor on again. It detects both the user’s movements and body heat for increased accuracy.



Links:
- Press release: http://www.eizoglobal.com/press/releases/htmls/ev2730q.html
- Home page: http://www.eizoglobal.com/products/flexscan/ev2730q/index.html

8
3D-Tech News Around The Web / C++11/14/17 Features In VS 2015 Preview
« Last post by JeGX on November 20, 2014, 01:54:01 PM »
Quote
Visual Studio 2015 Preview is now available, so here's an updated feature table for the Core Language:

Link: http://blogs.msdn.com/b/vcblog/archive/2014/11/17/c-11-14-17-features-in-vs-2015-preview.aspx
9
Quote
So, you just got access to the latest supercomputer with thousands of GPUs. Obviously this is going to help you a lot with accelerating your scientific calculations, but how are you going to analyze, reduce and visualize this data?  Historically, you would be forced to write everything out to disk, just to later read it back into another data analysis cluster.

Wouldn’t it be nice if you could analyze and visualize your data as it is being generated, without having to go through a file system? And wouldn’t it be cool to interact with the simulation, maybe even modifying parameters while the simulation is running?

And wouldn’t it be nice to use your GPU for that as well? As it turns out, you can actually do this. It’s called in-situ visualization, meaning visualization of datasets in-place where they are computed. High-quality, high performance rendering and visualization is just one of the capabilities of the Tesla Accelerated Computing Platform.

...

Cosmological simulations like those undertaken by a group led by Professor Simon Portegies-Zwart at the Leiden Observatory in the Netherlands provide a good example of present-day in-situ visualization. To understand how the Milky Way galaxy formed, and how dark matter influenced the process, they run very large-scale GPU-accelerated gravitational simulations with the Bonsai2 code. Their simulations are so powerful and efficient that their code is one of the nominees for this year's Gordon Bell awards.

Link: http://devblogs.nvidia.com/parallelforall/interactive-supercomputing-in-situ-visualization-tesla-gpus/
10
3D-Tech News Around The Web / NVIDIA MFAA tested on GTX 980
« Last post by JeGX on November 19, 2014, 10:34:03 AM »
Quote
First up is a new antialiasing method called MFAA, or Multi-Frame Sampled AA. This new method alternates the AA sample pattern, which is now programmable via software, in both temporal and spatial directions.

The goal is to change the AA sample pattern in a way to produce near 4xMSAA quality at the effective cost of 2x MSAA (in terms of performance).

...

NVIDIA's new Multi-Frame Sampled Anti-Aliasing is finally coming out, two full months behind the release and reveal of the GTX 980 and the MFAA technology in general. Despite that delay, the current shipping driver only supports MFAA on twenty PC games and uses a silent white list method that requires a lot of research on the part of the gamer to determine compatibility. Clearly this isn't what NVIDIA expected or desired, but that is where we are on the launch of the AA method with the baddest name around.

Still, even though we could fairly call this MFAA release small by expectations placed on the tech by NVIDIA, it does appear to work as desired in those games that are supported. In my time with it, the image quality it provided was better than 2x MSAA and nearly to that of 4x MSAA with performance closer to 2x MSAA than 4x MSAA. That alone would give MFAA a spot in our list of favorite features for Maxwell if it just supported more games!

Time will tell if MFAA is a feature that NVIDIA continues to work on and improve or if it will be one of the many graphics technologies from the last 15 years to find its way to the list of also-rans. Even looking at the list of ATI/AMD/NVIDIA specific AA methods alone will leave you dizzy with acronym-confusion. Not having SLI support for MFAA also seems like a really glaring omission considering these are the same types of users that are willing to enable off-shoot options in the control panel like this.

For now though, a very limited subset of NVIDIA's gamers (GTX 980/970) will be able to enjoy the benefits of MFAA on a very limited subset of modern PC games. It has potential, but needs a lot of work and attention from the driver team to keep the plates spinning.

Link: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Multi-Frame-Sampled-Anti-Aliasing-MFAA-Tested-GTX-980/