GLSL vector * matrix not behaving like HLSL

I have two (identical) shaders, one in hlsl and one in glsl. In a pixel shader, I multiply a vector by a matrix for normal transformations. The code is essentially:

HLSL

float3 v = ...;
float3x3 m = ...;
float3 n = mul(v, m);

      

GLSL

vec3 v = ...;
mat3 m = ...;
vec3 n = v * m;

      

This should perform a row-vector multiplication, but in GLSL it doesn't produce the same result. If I write the algorithm out explicitly, it works in both. As far as I can tell, both the GLSL and HLSL specs say the vector is treated as a row vector when it appears on the left of the multiplication, as it does here.
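For reference, GLSL does define the vector-on-the-left case, but componentwise against the matrix's columns. A sketch of what `v * m` expands to (keeping in mind that `m[i]` in GLSL selects column i, not row i):

```glsl
// In GLSL, m[i] is COLUMN i of m, and v * m treats v as a row vector:
//   (v * m)[i] == dot(v, m[i])
// Written out explicitly, v * m is equivalent to:
vec3 rowTimesMatrix(vec3 v, mat3 m) {
    return vec3(dot(v, m[0]), dot(v, m[1]), dot(v, m[2]));
}
```

So whether `v * m` matches HLSL's `mul(v, m)` depends entirely on whether the GLSL matrix holds the same data in its columns that the HLSL matrix holds in its rows.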

Another confusing thing is that I also multiply a vector by a matrix with the vector on the left in a vertex shader, and that works fine in both GLSL and HLSL. This leads me to assume the problem only occurs in the fragment/pixel shader.

I am passing a matrix from a vertex shader to a fragment shader using:

out vec3 out_vs_TangentToWorldX;
out vec3 out_vs_TangentToWorldY;
out vec3 out_vs_TangentToWorldZ;

out_vs_TangentToWorldX = tangent * world3D;
out_vs_TangentToWorldY = binormal * world3D;
out_vs_TangentToWorldZ = normal * world3D;

      

and in the fragment shader I reconstruct it with:

in vec3 out_vs_TangentToWorldX;
in vec3 out_vs_TangentToWorldY;
in vec3 out_vs_TangentToWorldZ;

mat3 tangentToWorld;
tangentToWorld[0] = out_vs_TangentToWorldX;
tangentToWorld[1] = out_vs_TangentToWorldY;
tangentToWorld[2] = out_vs_TangentToWorldZ;
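Note that `tangentToWorld[i]` in GLSL assigns column i, so this assembles a matrix whose columns (not rows) are the interpolated basis vectors; if the intent was an HLSL-style rows layout, the assembled matrix is the transpose of that. A sketch of one way to keep the row-vector multiply working, assuming the three varyings are meant to be the rows:

```glsl
// tangentToWorld[i] sets COLUMN i, so the basis vectors land in
// columns. Transposing puts them into rows, matching the
// HLSL mul(v, m) convention:
mat3 tangentToWorld = transpose(mat3(out_vs_TangentToWorldX,
                                     out_vs_TangentToWorldY,
                                     out_vs_TangentToWorldZ));
vec3 n = v * tangentToWorld;
```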

      



1 answer


HLSL matrices are row-major; GLSL matrices are column-major. So if you pass the matrix to the GLSL shader with the same memory layout you use for HLSL, your HLSL rows become GLSL columns, and you have to put the vector on the right of the multiplication in GLSL to get the same effect as in HLSL.

Just use



vec3 n = m * v;
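If you prefer to keep the vector on the left, transposing the matrix gives the same result; in GLSL the two forms are interchangeable:

```glsl
// For any mat3 m and vec3 v:
//   m * v == v * transpose(m)
vec3 n1 = m * v;
vec3 n2 = v * transpose(m);  // same result as n1
```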

      
