The shader `const vec3 x = 0.*vec3(1.,0.,1.);` gets minified to `const vec3 f=0.;`, resulting in compiler errors. Expected behavior would be `const vec3 f=vec3(0.);`.
Interestingly, glslang-validator will validate the output of shader_minifier as correct (?), which makes these kinds of issues hard to debug.
Observed on version 1.2.
It's the same issue as #2.
I think we could propagate type information to avoid this kind of bug (at least for the simple cases).
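To make that concrete, here is a small, purely illustrative OCaml sketch of what propagating type information could look like (the AST shape, names such as `fold_zero_mul`, and the inference rules are hypothetical, not Shader Minifier's actual code, which is written in F#): the `0.*x` folding only fires when the expression's type can be inferred, and it produces a zero of that type (`0.` for scalars, `vecN(0.)` for vectors), leaving expressions of unknown type untouched.

```ocaml
(* Hypothetical sketch: guard the 0.*x folding with simple type inference. *)

type ty = Float | Vec of int | Unknown

type expr =
  | FloatLit of float
  | Var of string * ty                 (* type taken from the declaration, if known *)
  | Constructor of string * expr list  (* e.g. vec3(1.,0.,1.) *)
  | Mul of expr * expr

(* Infer the type of an expression, for the simple cases only. *)
let rec infer = function
  | FloatLit _ -> Float
  | Var (_, t) -> t
  | Constructor (name, _) ->
      (match name with
       | "vec2" -> Vec 2 | "vec3" -> Vec 3 | "vec4" -> Vec 4
       | _ -> Unknown)
  | Mul (a, b) ->
      (match infer a, infer b with
       | Float, Float -> Float
       | Vec n, _ | _, Vec n -> Vec n   (* float*vecN and vecN*vecN yield vecN *)
       | _ -> Unknown)

(* Build a zero literal of the right type, if the type is known. *)
let zero_of = function
  | Float -> Some (FloatLit 0.)
  | Vec n -> Some (Constructor (Printf.sprintf "vec%d" n, [FloatLit 0.]))
  | Unknown -> None

(* Fold 0.*e (or e*0.) only when the result type is known;
   otherwise keep the expression unchanged. *)
let fold_zero_mul e =
  match e with
  | Mul (FloatLit 0., _) | Mul (_, FloatLit 0.) ->
      (match zero_of (infer e) with Some z -> z | None -> e)
  | _ -> e

(* Short description of a folding result, for the demo below. *)
let describe = function
  | FloatLit _ -> "folded to 0."
  | Constructor (name, _) -> "folded to " ^ name ^ "(0.)"
  | _ -> "left unchanged"

let () =
  (* 0.*vec3(1.,0.,1.) folds to vec3(0.) instead of the ill-typed 0. *)
  let e1 = Mul (FloatLit 0., Constructor ("vec3", [FloatLit 1.; FloatLit 0.; FloatLit 1.])) in
  (* 0.*y is left alone when y's type is unknown. *)
  let e2 = Mul (FloatLit 0., Var ("y", Unknown)) in
  print_endline (describe (fold_zero_mul e1));
  print_endline (describe (fold_zero_mul e2))
```

With a guard like this, the optimization stays safe even if inference only covers the simple cases, since an unknown type simply disables the rewrite.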
Closing this issue as I've disabled the `0.*x` optimization (#187).
I'll reintroduce it when we have a way to check the type of expressions.