On 01/26/2016 11:36 AM, Marek Olšák wrote:
I don't understand this. Can you explain it?

This shader buffer for shared storage is actually a fake buffer. Only one shared "fake" buffer can be used, and it contains all of the shared variables defined in the compute shader. Its size is defined by MAX_COMPUTE_SHARED_MEMORY_SIZE, FYI.

I made this change to reserve slot 32 for it, so it won't be allocated by the underlying driver, since I clamp the maximum number of usable shader+atomic buffers to 32.
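To make the arithmetic concrete, here is a minimal standalone sketch (not Mesa code; the driver_max value of 64 is just an assumed example, and the clamp helper is a local stand-in for the _clamp() used in the patch). It shows why the constant is bumped to 33: the state tracker clamps the driver-reported cap to PIPE_MAX_SHADER_BUFFERS - 1 = 32, so slot 32 is never handed out as a regular shader/atomic buffer and stays free for the "fake" shared-storage buffer.

#include <stdio.h>

#define PIPE_MAX_SHADER_BUFFERS 33 /* 32 usable slots + 1 reserved for shared storage */

/* Local stand-in for the _clamp() helper used in st_extensions.c. */
static int _clamp(int a, int min, int max)
{
   if (a < min)
      return min;
   else if (a > max)
      return max;
   else
      return a;
}

int main(void)
{
   /* Assume the driver reports 64 for PIPE_SHADER_CAP_MAX_SHADER_BUFFERS. */
   int driver_max = 64;

   /* Clamp to 32 so slot 32 is never exposed as a regular buffer, then
    * split the usable range between atomic buffers and SSBOs the way
    * the patch does. */
   int usable = _clamp(driver_max, 0, PIPE_MAX_SHADER_BUFFERS - 1);
   int max_atomic_buffers = usable / 2;          /* 16 */
   int max_ssbo_blocks = max_atomic_buffers;     /* 16 */

   printf("usable=%d atomics=%d ssbos=%d\n",
          usable, max_atomic_buffers, max_ssbo_blocks);
   return 0;
}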


AFAIK, shared variables don't need a backing buffer.

Thanks,
Marek

On Sun, Jan 24, 2016 at 10:09 PM, Samuel Pitoiset
<samuel.pitoi...@gmail.com> wrote:
At least one shader buffer must be available for compute shaders.

Signed-off-by: Samuel Pitoiset <samuel.pitoi...@gmail.com>
---
  src/gallium/include/pipe/p_state.h     | 2 +-
  src/mesa/state_tracker/st_extensions.c | 6 ++++--
  2 files changed, 5 insertions(+), 3 deletions(-)

diff --git a/src/gallium/include/pipe/p_state.h b/src/gallium/include/pipe/p_state.h
index 2e4d283..051856e 100644
--- a/src/gallium/include/pipe/p_state.h
+++ b/src/gallium/include/pipe/p_state.h
@@ -61,7 +61,7 @@ extern "C" {
  #define PIPE_MAX_SHADER_INPUTS    80 /* 32 GENERIC + 32 PATCH + 16 others */
  #define PIPE_MAX_SHADER_OUTPUTS   80 /* 32 GENERIC + 32 PATCH + 16 others */
  #define PIPE_MAX_SHADER_SAMPLER_VIEWS 32
-#define PIPE_MAX_SHADER_BUFFERS   32
+#define PIPE_MAX_SHADER_BUFFERS   33
  #define PIPE_MAX_SHADER_IMAGES    32
  #define PIPE_MAX_TEXTURE_LEVELS   16
  #define PIPE_MAX_SO_BUFFERS        4
diff --git a/src/mesa/state_tracker/st_extensions.c b/src/mesa/state_tracker/st_extensions.c
index d066784..c198892 100644
--- a/src/mesa/state_tracker/st_extensions.c
+++ b/src/mesa/state_tracker/st_extensions.c
@@ -219,8 +219,10 @@ void st_init_limits(struct pipe_screen *screen,
                                            pc->MaxUniformBlocks);

        pc->MaxAtomicCounters = MAX_ATOMIC_COUNTERS;
-      pc->MaxAtomicBuffers = screen->get_shader_param(
-            screen, sh, PIPE_SHADER_CAP_MAX_SHADER_BUFFERS) / 2;
+      pc->MaxAtomicBuffers =
+         _clamp(screen->get_shader_param(screen, sh,
+                                         PIPE_SHADER_CAP_MAX_SHADER_BUFFERS),
+                0, PIPE_MAX_SHADER_BUFFERS - 1) / 2;
        pc->MaxShaderStorageBlocks = pc->MaxAtomicBuffers;

        /* Gallium doesn't really care about local vs. env parameters so use the
--
2.6.4

--
-Samuel