OpenGL 3 introduced three interpolation qualifiers that are at our disposal in vertex and fragment shaders. These qualifiers let you specify how a vertex shader output is interpolated across a primitive. The OpenGL spec/wiki says:
- flat: the value is not interpolated. The value given to the fragment shader is the value from the Provoking Vertex for that primitive.
- smooth: performs a perspective correct interpolation.
- noperspective: performs a linear interpolation in window space.
When no qualifier is present, the default interpolation qualifier is smooth.
Let’s see in practice how these qualifiers affect the rendering. I used GLSL Hacker to quickly code the demos and play with the GLSL shaders. All the demos of this tutorial are included in the GLSL Hacker Code Sample Pack, in the GLSL_OpenGL_32_Interpolation/ folder.
The first 3D test is simple: a quad with a different color at each vertex:
– vertex 0: red
– vertex 1: green
– vertex 2: blue
– vertex 3: yellow
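GLSL Hacker builds this quad with its own scripting API, so the exact host-side setup is not shown here; the following C sketch only illustrates a hypothetical equivalent vertex layout, with the quad split into the two triangles {0, 1, 2} and {2, 3, 0} that we will refer to below:

/* Hypothetical vertex layout of the colored quad (positions + colors),
   only to make the vertex ordering explicit. */
static const float quad_vertices[] = {
    /*   x      y     z     w       r     g     b     a   */
    -1.0f, -1.0f, 0.0f, 1.0f,   1.0f, 0.0f, 0.0f, 1.0f,  /* vertex 0: red    */
     1.0f, -1.0f, 0.0f, 1.0f,   0.0f, 1.0f, 0.0f, 1.0f,  /* vertex 1: green  */
     1.0f,  1.0f, 0.0f, 1.0f,   0.0f, 0.0f, 1.0f, 1.0f,  /* vertex 2: blue   */
    -1.0f,  1.0f, 0.0f, 1.0f,   1.0f, 1.0f, 0.0f, 1.0f   /* vertex 3: yellow */
};

/* The quad is drawn as two triangles; the vertex order matters for flat shading. */
static const unsigned int quad_indices[] = { 0, 1, 2,   2, 3, 0 };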
Here is the GPU program used to render the quad:
Vertex shader:
#version 150

in vec4 gxl3d_Position;
in vec4 gxl3d_Color;
uniform mat4 gxl3d_ModelViewProjectionMatrix;

//flat out vec4 VertexColor;
smooth out vec4 VertexColor;
//noperspective out vec4 VertexColor;

void main()
{
  gl_Position = gxl3d_ModelViewProjectionMatrix * gxl3d_Position;
  VertexColor = gxl3d_Color;
}
Pixel shader:
#version 150

//flat in vec4 VertexColor;
smooth in vec4 VertexColor;
//noperspective in vec4 VertexColor;

out vec4 Out_Color;

void main()
{
  Out_Color = VertexColor;
}
With the smooth qualifier, the rendering is:
With the flat qualifier, the rendering is:
The flat qualifier disables interpolation and each triangle of the quad is filled with the color of its last vertex:
– triangle 0: vertices {0, 1, 2} – flat color is blue because vertex 2 is blue.
– triangle 1: vertices {2, 3, 0} – flat color is red because vertex 0 is red.
Why the last vertex? Because of the provoking vertex: by default, the provoking vertex is the last vertex of a primitive, in our case the last vertex of each triangle. See HERE for more details about the provoking vertex.
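In plain OpenGL 3.2+, this convention can be changed with glProvokingVertex(). A minimal sketch (assuming a GL 3.2 core context is current and an extension loader such as GLEW has been initialized):

#include <GL/glew.h>

/* Make the FIRST vertex of each primitive the provoking vertex
   (the default is GL_LAST_VERTEX_CONVENTION). */
void use_first_vertex_convention(void)
{
    glProvokingVertex(GL_FIRST_VERTEX_CONVENTION);
    /* With flat shading, triangle {0, 1, 2} would now be red (vertex 0)
       and triangle {2, 3, 0} would be blue (vertex 2). */
}

GLSL Hacker drives the OpenGL state through its own scripting layer, so whether this call is exposed there is another question; the sketch only shows the underlying GL mechanism.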
With the noperspective qualifier, the rendering is:
As you can see, the difference between noperspective and smooth is really small; only a sharp eye can spot some subtle variations.
Now the second test, with a textured quad. Here is the GPU program with texture mapping:
Vertex shader:
#version 150

in vec4 gxl3d_Position;
in vec4 gxl3d_TexCoord0;
uniform mat4 gxl3d_ModelViewProjectionMatrix;

noperspective out vec4 VertexUV;
//smooth out vec4 VertexUV;
//flat out vec4 VertexUV;

void main()
{
  gl_Position = gxl3d_ModelViewProjectionMatrix * gxl3d_Position;
  VertexUV = gxl3d_TexCoord0 * 4;
}
Pixel shader:
#version 150

noperspective in vec4 VertexUV;
//smooth in vec4 VertexUV;
//flat in vec4 VertexUV;

uniform sampler2D tex0;
out vec4 Out_Color;

void main()
{
  Out_Color = texture(tex0, VertexUV.xy);
}
With the smooth qualifier, the rendering is:
This is the rendering we usually expect: texture coordinates are interpolated with perspective correction.
With the flat qualifier, the rendering is:
With flat, every fragment of a triangle receives the texture coordinates of the provoking vertex, so the whole triangle samples a single texel. Since the texture color at vertex 2 and at vertex 0 is red, the quad is uniformly red (not a pure red but RGB:{0.25; 0; 0}).
With the noperspective qualifier, the rendering is:
With the noperspective qualifier, texture coordinates are linearly interpolated in screen space (or window space) and we can clearly see each triangle.
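For reference, here is a sketch of the two interpolation formulas, written in the spirit of the OpenGL specification: a_0, a_1, a_2 are the attribute values at the three vertices, w_0, w_1, w_2 their clip-space w components, and \alpha, \beta, \gamma the barycentric weights of the fragment computed in window space.

a_{\text{noperspective}} = \alpha a_0 + \beta a_1 + \gamma a_2

a_{\text{smooth}} = \frac{\alpha a_0 / w_0 + \beta a_1 / w_1 + \gamma a_2 / w_2}{\alpha / w_0 + \beta / w_1 + \gamma / w_2}

When w_0 = w_1 = w_2 (no perspective across the primitive, e.g. a screen-aligned quad), the two formulas coincide; with real perspective and a regular texture pattern, the window-space linear interpolation makes each triangle clearly visible, as in the screenshot above.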
Last thing: interpolation qualifiers in the vertex and pixel shaders must match. If they mismatch, like in the following example:
Vertex shader:
... smooth out vec4 vertex_color; ...
Pixel shader:
... flat in vec4 vertex_color; ...
you will get some weird results. On Windows + GTX 660, I got a black quad:
Under Mac OS X with the Intel HD 4000 GPU, the quad was simply not displayed:
> As you can see, the difference between noperspective and smooth is really small; only a sharp eye can spot some subtle variations.
I remember trying to find a way to do something similar (but not quite the same) in 2006, so I can easily spot the difference. 🙂
http://www.gamedev.net/topic/419296-skewedsheared-texture-mapping-in-opengl/
I ended up with a solution that used the fixed pipeline, by figuring out a way to dynamically calculate the texture transformation matrix for any given input.
What is the point of such a spec?
Perspective correction was not available before because of its cost in software rendering, or because it was poorly understood how to make it cheap in HW for real time.
Nobody would benefit now in 3D from using non-perspective (linear) interpolation…
It feels like writing a software rasterizer in the middle of the ’90s.
> Nobody would benefit now in 3D from using non-perspective (linear) interpolation…
My guess is that if it is faster than perspective interpolation, then you can save a little bit by using it in 2D rendering (full-screen quads, overlays, etc.).
Also, I had some ideas where you could use 'flat', but right now I'm wondering if you could just use uniforms…
Flat saves you from putting the same value three times, once at each vertex. So it saves you memory-update cost (no need to maintain every vertex, only one out of three); if you want a specific value PER triangle, it may be useful I think.
Now concerning perspective correction, consider it as “free” in terms of performance overhead. The interpolators are dedicated HW units with a fast multiply-by-inverse kind of calculation. Basically, by shutting down the correction you do NOT make it faster, you most likely just skip the data path. I would bet they actually have to ADD transistors to support such a thing (or do some tweak at the driver level? I think it could be possible to mess with the transform matrix to get this result, see the forum link above).
Anyway, it is not like software where every extra operation costs you time. In HW you trade transistor count to fit more calculation within the same timing budget, and perspective correction is already included in this particular case… So I really do not get those specs.
I mean: PS1-level texturing? 🙂
It was horrible: textures were swinging around in racing games (think of the road). Perspective correction is a huge plus that is already integrated. I do not really see any reason to disable it; even for 2D, it changes nothing in the transistor work/path.
@Romain
Well said, many software writers know too little about the hardware implementation of algorithms.