GPU buffers

Started by JeGX, June 23, 2014, 04:31:33 PM

JeGX

GPU buffers are a powerful new feature of GLSL Hacker 0.7.0+. A GPU buffer is a wrapper around OpenGL hardware buffer objects. You can easily create a uniform or shader storage buffer, set values, and use the buffer in a GLSL shader.

Here is a code snippet (in Lua) that illustrates how to use the GPU buffer functions of the new gh_gpu_buffer lib:


storage_size = 32 -- memory size in bytes for two vec4
flags = ""
ssbo = gh_gpu_buffer.create("SHADER_STORAGE", "GL_DYNAMIC_READ", storage_size, flags)
gh_gpu_buffer.bind(ssbo)

-- Map the buffer before writing values from the host script.
gh_gpu_buffer.map(ssbo)

buffer_offset_bytes = 0
gh_gpu_buffer.set_value_4f(ssbo, buffer_offset_bytes, x0, y0, z0, w0)

buffer_offset_bytes = buffer_offset_bytes + 16 -- sizeof(vec4)
gh_gpu_buffer.set_value_4f(ssbo, buffer_offset_bytes, x1, y1, z1, w1)

gh_gpu_buffer.unmap(ssbo)

gh_gpu_buffer.unbind(ssbo)

-- Attach the buffer to a binding point so a GLSL shader can access it.
binding_point_index = 3
gh_gpu_buffer.bind_base(ssbo, binding_point_index)


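On the GLSL side, the buffer attached with bind_base() is accessed through a shader storage block declared with the same binding point. Here is a minimal compute shader sketch (the block and member names are illustrative, they are not taken from a code sample):

#version 430
layout (local_size_x = 1) in;

// Shader storage block bound to binding point 3, matching the
// binding_point_index passed to gh_gpu_buffer.bind_base().
layout (std430, binding = 3) buffer VectorData
{
  vec4 v0; // the vec4 written at byte offset 0
  vec4 v1; // the vec4 written at byte offset 16
};

void main()
{
  // Example: swap the two vectors set from the Lua script.
  vec4 tmp = v0;
  v0 = v1;
  v1 = tmp;
}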

More complete examples are available in the code sample pack:

- host_api/gl-310-arb-uniform-buffer/
- host_api/gl-420-arb-atomic_counters/
- host_api/gl-430-arb-shader-storage-buffer-object/
- host_api/gl-440-arb-bindless-texture/

All these code samples use GPU buffers to share data with GLSL programs.

Adik

Thanks for the GLSL Hacker software and also for the gh_gpu_buffer lib.

I am trying to do some general-purpose GPU computation using GLSL compute shaders and to store my results in an SSBO. Could you please provide an example of how to use the sub_data_read_1ui function to read the results back from the graphics card?
I tried to figure it out by myself, but the function is not used in the code sample pack. I also couldn't find a reference for the gh_gpu_buffer library, and I was not able to work it out from the sources, since I read that the project is closed source.

Thanks in advance,
Adik

JeGX

I have never tested that kind of code, but a code snippet with sub_data_read_1ui could look like this:


gh_gpu_buffer.map(ssbo, "GL_READ_ONLY")
x = gh_gpu_buffer.sub_data_read_1ui(ssbo, offset) -- offset in bytes
gh_gpu_buffer.unmap(ssbo)


Let me know if you need more help.


Adik

First of all, thanks for the reply, and let me say that I don't urgently need this solved.

Second, I have tried what you suggested, but with no success: I only get 0s from the sub_data_read_1ui function for all valid buffer offsets. The data in the SSBO should be fine, because when I use the buffer for rendering, the rendered frame looks correct.

Maybe I overlooked something, so here is the code:

gh_gpu_buffer.map(counts_buffer, "GL_READ_ONLY")

io.output("counts_buffer.txt")
local total = 0

for i = 0, 16 * 16 * 16 - 1 do
  local buffer_offset_bytes = 4 * i
  local count = gh_gpu_buffer.sub_data_read_1ui(counts_buffer, buffer_offset_bytes)
  if (count > 0) then
    io.write(count .. "\n")
  end
  total = total + count
end

io.write("Atoms = " .. total)
io.close()
 
gh_gpu_buffer.unmap(counts_buffer)


If you decide to implement a working example, let me know and I will test it.

JeGX

OK, I'll try to write a small code sample ASAP. Maybe there's a bug and I'll catch it then!

JeGX

I coded a small test demo and it works fine. To read the GPU buffer on the CPU side (Lua), just bind the SSBO, read the data and unbind it:


gh_gpu_buffer.bind(ssbo)

x, y, w, h = gh_gpu_buffer.sub_data_read_4f(ssbo, buffer_offset_bytes)

gh_gpu_buffer.unbind(ssbo)


No need to map the GPU buffer to read it.
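Applied to your counts_buffer loop, that gives something like this (untested sketch, reusing the 16x16x16 layout and the 4-byte unsigned int stride from your post):

gh_gpu_buffer.bind(counts_buffer)

local total = 0
for i = 0, 16 * 16 * 16 - 1 do
  local buffer_offset_bytes = 4 * i -- one 32-bit uint per counter
  total = total + gh_gpu_buffer.sub_data_read_1ui(counts_buffer, buffer_offset_bytes)
end

gh_gpu_buffer.unbind(counts_buffer)

io.output("counts_buffer.txt")
io.write("Atoms = " .. total)
io.close()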
Let me know.

JeGX

And sorry about the GPU buffer reference guide: I forgot to update the main index. Here it is:

http://www.geeks3d.com/glslhacker/reference/scripting_gpu_buffer.php