WebGL TextureArray not working #4081
Comments
It looks similar to the bug that should have been fixed by #3641.
@AndriBaal are you on wgpu v0.16 or higher (since they contain the fix @daxpedda referenced)?
I tested this on both 0.16 and 0.17, and it's happening on both versions.
You might be running into #2161 (comment). If you try to avoid setting
When working with only 1 array layer everything works fine; it is only when using more than one layer that some texture arrays break. I can't explain why some work and some don't. The fact that rounding up to 16 makes all my texture arrays work may also just be a coincidence.
You don't get a validation error if you set
Ah yes, it actually does, lol, I completely forgot about that. What I meant was that everything works completely fine when not using arrays.
Hello everyone,
Some friends and I are working on a little Zelda-style game in my game engine. On Linux, Android and Windows it looks like the following:
However, in the browser with WebGL it looks like this:
As you can see, some sprites are messed up; for example, the sprite sheet of the player is swapped with the sprite sheet of the enemy.
I'm getting the following errors in the console:
I'm using the following code to create my `SpriteSheet`:
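As a rough sketch of what such a creation call can look like with wgpu 0.16/0.17 (the function name and parameters `create_sprite_sheet_texture`, `sprite_size` and `layer_count` are placeholders for illustration, not the engine's actual API):

```rust
use wgpu::{
    Extent3d, Texture, TextureDescriptor, TextureDimension, TextureFormat, TextureUsages,
};

// Sketch: one square sprite per array layer of a D2 texture array.
fn create_sprite_sheet_texture(
    device: &wgpu::Device,
    sprite_size: u32,  // width/height of a single sprite in pixels
    layer_count: u32,  // number of sprites, i.e. array layers
) -> Texture {
    device.create_texture(&TextureDescriptor {
        label: Some("sprite_sheet"),
        size: Extent3d {
            width: sprite_size,
            height: sprite_size,
            // Each sprite occupies its own layer of the texture array.
            depth_or_array_layers: layer_count,
        },
        mip_level_count: 1,
        sample_count: 1,
        dimension: TextureDimension::D2,
        format: TextureFormat::Rgba8UnormSrgb,
        usage: TextureUsages::TEXTURE_BINDING | TextureUsages::COPY_DST,
        view_formats: &[],
    })
}
```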
The only way I found to fix the problem was to round `depth_or_array_layers` up to 16. Of course this is no solution, because it allocates up to 14 additional texture layers in the worst case.
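A minimal sketch of that workaround, using the same placeholder names as above (again an illustration, not the engine's actual code):

```rust
// Workaround sketch: pad the layer count up to 16 so the array renders
// correctly under WebGL. With only 2 sprites this wastes 14 layers.
fn create_padded_sprite_sheet_texture(
    device: &wgpu::Device,
    sprite_size: u32,
    layer_count: u32,
) -> wgpu::Texture {
    let padded_layers = layer_count.max(16); // e.g. 3 layers become 16
    device.create_texture(&wgpu::TextureDescriptor {
        label: Some("sprite_sheet_padded"),
        size: wgpu::Extent3d {
            width: sprite_size,
            height: sprite_size,
            depth_or_array_layers: padded_layers,
        },
        mip_level_count: 1,
        sample_count: 1,
        dimension: wgpu::TextureDimension::D2,
        format: wgpu::TextureFormat::Rgba8UnormSrgb,
        usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
        view_formats: &[],
    })
}
```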
Am I doing something wrong or is this just a bug?