WebGL TextureArray not working #4081

Open
AndriBaal opened this issue Aug 22, 2023 · 7 comments
Labels: api: gles (Issues with GLES or WebGL), type: bug (Something isn't working)

Comments

AndriBaal commented Aug 22, 2023

Hello everyone

Some friends and I are working on a little Zelda-style game in my game engine. On Linux, Android and Windows it looks like this:

[screenshot: game rendering correctly on the native platforms]

In the browser with WebGL, however, it looks like this:

[screenshot: sprites rendered incorrectly in the browser]

As you can see, some sprites are messed up; for example, the sprite sheet of the player is swapped with the sprite sheet of the enemy.
I'm getting the following errors in the console:

[screenshot: browser console errors]

I'm using the following code to create my SpriteSheet:

pub fn new<D: Deref<Target = [u8]>>(gpu: &Gpu, desc: SpriteSheetBuilder<D>) -> Self {
    let amount = desc.sprite_amount.x * desc.sprite_amount.y;
    assert!(amount > 1, "SpriteSheet must have at least 2 sprites!");

    let data = [desc.sprite_amount, Vector::new(0, 0)]; // Zero vector appended as padding for 16-byte alignment
    assert!(std::mem::size_of_val(&data) % 16 == 0);
    let size_hint_buffer = gpu
        .device
        .create_buffer_init(&wgpu::util::BufferInitDescriptor {
            label: Some("spritesheet_size_hint_buffer"),
            contents: bytemuck::cast_slice(&data),
            usage: wgpu::BufferUsages::UNIFORM,
        });

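    // Describe a 2D texture array with one layer per sprite.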
    let texture_descriptor = wgpu::TextureDescriptor {
        label: Some("sprite_texture"),
        size: wgpu::Extent3d {
            width: desc.sprite_size.x,
            height: desc.sprite_size.y,
            depth_or_array_layers: amount,
        },
        mip_level_count: 1,
        sample_count: 1,
        dimension: wgpu::TextureDimension::D2,
        format: wgpu::TextureFormat::Rgba8UnormSrgb,
        usage: wgpu::TextureUsages::TEXTURE_BINDING
            | wgpu::TextureUsages::COPY_DST
            | wgpu::TextureUsages::COPY_SRC
            | wgpu::TextureUsages::RENDER_ATTACHMENT,
        view_formats: &[],
    };
    // log::warn!("{}", desc.sprite_size.x * desc.sprite_size.y * amount % 16 == 0);
    let texture = gpu.device.create_texture(&texture_descriptor);
    let sampler = gpu.device.create_sampler(&desc.sampler);

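    // Copy each sprite's RGBA bytes into its own array layer of the texture.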
    for (layer, bytes) in desc.data.iter().enumerate() {
        gpu.queue.write_texture(
            ImageCopyTexture {
                texture: &texture,
                mip_level: 0,
                origin: wgpu::Origin3d {
                    x: 0,
                    y: 0,
                    z: layer as u32,
                },
                aspect: wgpu::TextureAspect::All,
            },
            bytes,
            wgpu::ImageDataLayout {
                offset: 0,
                bytes_per_row: Some(4 * desc.sprite_size.x),
                rows_per_image: Some(desc.sprite_size.y),
            },
            wgpu::Extent3d {
                width: desc.sprite_size.x,
                height: desc.sprite_size.y,
                depth_or_array_layers: 1,
            },
        );
    }

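    // Create a view over the whole texture and bundle it with the sampler and the size hint buffer.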
    let view = texture.create_view(&wgpu::TextureViewDescriptor::default());
    let bind_group = gpu.device.create_bind_group(&wgpu::BindGroupDescriptor {
        entries: &[
            wgpu::BindGroupEntry {
                binding: 0,
                resource: wgpu::BindingResource::TextureView(&view),
            },
            wgpu::BindGroupEntry {
                binding: 1,
                resource: wgpu::BindingResource::Sampler(&sampler),
            },
            wgpu::BindGroupEntry {
                binding: 2,
                resource: wgpu::BindingResource::Buffer(wgpu::BufferBinding {
                    buffer: &size_hint_buffer,
                    offset: 0,
                    size: None,
                }),
            },
        ],
        layout: &gpu.base.sprite_sheet_layout,
        label: Some("sprite_sheet_bindgroup"),
    });

    Self {
        _texture: texture,
        _size_hint_buffer: size_hint_buffer,
        _sampler: sampler,
        bind_group,
        sprite_size: desc.sprite_size,
        sprite_amount: desc.sprite_amount,
    }
}

The only way I found to fix the problem was to align depth_or_array_layers to 16, like this:

size: wgpu::Extent3d {
    width: desc.sprite_size.x,
    height: desc.sprite_size.y,
    depth_or_array_layers: align_to(amount, 16),
},

Of course this is not a real solution, because in the worst case it allocates 14 additional texture layers.
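
For reference, align_to is not shown in the snippet above; assuming it behaves like the usual round-up helper (e.g. wgpu::util::align_to), a minimal sketch would be:

fn align_to(value: u32, alignment: u32) -> u32 {
    // Round `value` up to the next multiple of `alignment`.
    ((value + alignment - 1) / alignment) * alignment
}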

Am I doing something wrong or is this just a bug?

daxpedda (Contributor) commented

It looks similar to the bug that should have been fixed by #3641.

teoxoy (Member) commented Aug 31, 2023

@AndriBaal are you on wgpu v0.16 or higher (since those versions contain the fix @daxpedda referenced)?

AndriBaal (Author) commented

I tested this on both 0.16 and 0.17, and it's happening on both versions.

teoxoy (Member) commented Aug 31, 2023

The only way I found to fix the problem was to align depth_or_array_layers to 16, like this:

You might be running into #2161 (comment).

If you avoid setting depth_or_array_layers to 1 or to a multiple of 6, does it work?
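
A minimal sketch of that experiment, reusing the amount variable from the sheet-creation code above (the padding approach here is only illustrative):

let layers = if amount == 1 || amount % 6 == 0 {
    // Pad with one unused layer so the count is never 1 or a multiple of 6.
    amount + 1
} else {
    amount
};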

AndriBaal (Author) commented

When working with only one array layer, everything works fine; it's only when using more than one layer that some texture arrays break. I can't explain why some work and some don't. The fact that rounding up to 16 makes all my texture arrays work might also just be a coincidence.

teoxoy added the type: bug (Something isn't working) and api: gles (Issues with GLES or WebGL) labels on Aug 31, 2023
teoxoy (Member) commented Aug 31, 2023

Don't you get a validation error if you set depth_or_array_layers to 1 while creating the view with wgpu::TextureViewDescriptor::default() and declaring the texture as texture_2d_array on the shader side?
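
For context, one way to test that case would be to request the array dimension explicitly instead of relying on the default view descriptor (a sketch, not code from the engine):

let view = texture.create_view(&wgpu::TextureViewDescriptor {
    // Force a 2D-array view so a texture with depth_or_array_layers == 1
    // can still be bound as texture_2d_array in the shader.
    dimension: Some(wgpu::TextureViewDimension::D2Array),
    ..Default::default()
});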

AndriBaal (Author) commented

Ah yes, it actually does, lol; I completely forgot about that. What I meant was that everything works completely fine when I'm not using arrays at all.
