[XR] Implementation of WebXR Depth Sensing Feature #13159
Conversation
… processes have been applied.
* correction * make ref 0 vector
…enshot with RenderTargetTexture, because we can just set the RTT active camera instead.
[GUI] add picking for fullscreen ADTs
Fix creation of cube textures from URL
* Start command bar on VSM
* Update Storybook to 7
* Modularizing CSS
* Match style with GUI Editor
* More basic declarations and styling
* Text styling and removing unused styling
* Removing more styling
* Remove more unneeded stuff
* Linting...
* Remove unneeded prefixes
* further prefix removals
* fix build
* fix the running tests
Add applyPostProcess flag on ADV to optionally draw it after the post…
* fix declaration generation * formatting
Fix wrong plugin name check for babylon serialization
* Fix samples support in render targets
* Add labels to some WebGPU descriptors
* Remove unused code
* Fix code injection at start
* Add some more logs
* Fix usage of destroyed textures
* Fix reentrancy problems with observables
* Remove unnecessary call (and fix problem in WebGPU)
* Create the RTT with the right samples from start
* Fix uniform analysis in WebGPU
* Fix reserved keyword (in WebGPU) used as variable
* Factorize visualization test code and fix some tests
* Fix visualization test
* linting
ParticleSystem: Add BILLBOARDMODE_STRETCHED_LOCAL mode
Fix loading the gltf loaders and validations twice
Please make sure to label your PR with "bug", "new feature" or "breaking change" label(s).
Snapshot stored with reference name: refs/pull/13159/merge
Test environment: to test a playground, add its ID to the URL, for example: https://babylonsnapshots.z22.web.core.windows.net/refs/pull/13159/merge/index.html#WGZLGJ#4600
Links to test Babylon tools with this snapshot: https://playground.babylonjs.com/?snapshot=refs/pull/13159/merge
To test the snapshot in the playground with a playground ID, add it after the snapshot query string: https://playground.babylonjs.com/?snapshot=refs/pull/13159/merge#BCU1XR#0
Thank you so much for this PR! Great work :-)
We try not to provide the dev directly with WebXR objects, but abstract the usage as much as we can.
For example, for CPU usage, it seems like we can "init" the XRCPUDepthInformation object internally and then, once available, provide getDepthInMeters directly, or at least abstract that for the dev.
Regarding GPU, we should not provide a WebGL2Texture to the user, but a Babylon texture that the user can use. Internally the user can still have access to the internal texture, but the public API should be a Babylon object and not a WebXR/WebGL object.
We should provide something extra to the dev, otherwise there is no difference between using the feature and implementing it yourself.
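A rough sketch of what such an abstracted public surface could look like (the interface and member names below are illustrative assumptions, not the final Babylon.js API):

```ts
import type { Nullable, BaseTexture } from "@babylonjs/core";

// Illustrative sketch only: a possible public surface for the feature.
// The raw XR objects (XRCPUDepthInformation, WebGLTexture) stay internal.
interface IWebXRDepthSensingSurface {
    // cpu-optimized: the feature wraps XRCPUDepthInformation and forwards the query
    getDepthInMeters(x: number, y: number): Nullable<number>;
    // gpu-optimized: a Babylon texture is exposed instead of a raw WebGLTexture
    readonly latestDepthImageTexture: Nullable<BaseTexture>;
}
```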
@RaananW I've thought about the API specification of the WebXRDepthSensing feature and managed to draw up a proposal.
When the usage is 'cpu-optimized', we can access the depth data through XRCPUDepthInformation:

interface XRCPUDepthInformation extends XRDepthInformation {
    readonly data: ArrayBuffer;
    getDepthInMeters(x: number, y: number): number;
}

The typed-array view over this buffer depends on the depth information data format. When the usage is 'gpu-optimized', we can access the texture field as a WebGLTexture. Maybe, when the usage is 'cpu-optimized', we can generate a RawTexture from the ArrayBuffer. What do you think about this?
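For the cpu-optimized path, a minimal sketch of that RawTexture generation might look like this (assuming the 'luminance-alpha' data format, i.e. uint16 raw values scaled by rawValueToMeters; the helper name is hypothetical and this is not the PR's actual code):

```ts
import { RawTexture, Constants, Scene } from "@babylonjs/core";

// Hypothetical helper: convert cpu-optimized depth data into a single-channel
// float Babylon RawTexture. Assumes the "luminance-alpha" data format, where
// `data` holds uint16 raw values.
function depthInfoToRawTexture(depthInfo: XRCPUDepthInformation, scene: Scene): RawTexture {
    const raw = new Uint16Array(depthInfo.data);
    const meters = new Float32Array(raw.length);
    for (let i = 0; i < raw.length; i++) {
        // rawValueToMeters is defined on XRDepthInformation by the WebXR depth-sensing spec
        meters[i] = raw[i] * depthInfo.rawValueToMeters;
    }
    return RawTexture.CreateRTexture(
        meters,
        depthInfo.width,
        depthInfo.height,
        scene,
        false, // generateMipMaps
        false, // invertY
        Constants.TEXTURE_NEAREST_SAMPLINGMODE,
        Constants.TEXTURETYPE_FLOAT
    );
}
```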
Yes to all! :-) We should use a Babylon texture instead of a WebGL texture, so that the dev can use it in a Babylon material, or just like any other texture, in both the gpu- and cpu-optimized cases.
Those will need to be configurable in the options (if it's not already the case).
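For reference, the usage and data-format values come from the WebXR Depth Sensing module, so the options could surface them along these lines (the interface name here is a placeholder, not necessarily the final one):

```ts
// The string values are defined by the WebXR Depth Sensing module;
// the interface name is a placeholder for this sketch.
type WebXRDepthUsage = "cpu-optimized" | "gpu-optimized";
type WebXRDepthDataFormat = "luminance-alpha" | "float32";

interface IWebXRDepthSensingOptionsSketch {
    // ordered by preference, forwarded to the XR session request
    usagePreference: WebXRDepthUsage[];
    dataFormatPreference: WebXRDepthDataFormat[];
}
```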
@drumath2237 any update on this PR? :-) It looks like a really cool feature.
@sebavan
I have to close this PR because of the repo cleanup. Let's open it again from a clean branch.
…#13563)
* add: definitions of type and interfaces of depth sensing feature
* add: native XRFrame method
* add: WebXR Depth Sensing Feature class implementation
* Add implementations exposing DepthInformation fields and methods
* Remove depth information observers
* Add an implementation of generating a RawTexture from cpu depth information
* Add and fix implementations of texture generation
* Fix implementations of texture generation of cpu mode
* Add process depth information method
* Add documentation comments and fix naming
* Fix import paths
* add: latest depth buffer field
* fix: change Texture format to RGBA and create depth texture
* add: sandbox impl of luminance-alpha depth image texture
* Change Texture format from LuminanceAlpha to RTexture
* Fix cpu depth image processing
* Change image processing implementations
* Rename _latestDepthImageTexture to _cachedDepthImageTexture to distinguish the public latest data from the private cached data
* Changed depth texture pixel algorithm
* small refactor of obtaining rawValueToMeters
* Add getDepthInMeters observer to notify the active one
* Fix comment of webxr.d.ts
* Fix formatting errors
* Change null union types to Nullable generics
* changed to returning InternalTexture instead of WebGLTexture
* Define original depth types to avoid using XR native type
* Create individual private fields of public fields
* add @since to docs comment
* detach if depth usage is unknown
* throw an error in nativeXRFrame depth method until the functionality is implemented
* correct some docs comments
Add a WebXR feature for the Depth Sensing module.
Related issue: #11876
The related forum page is here.
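As a rough usage sketch, enabling the feature from an XR experience could look like this (the WebXRFeatureName.DEPTH_SENSING constant and the option names are assumptions based on the discussion above, not a guarantee of the merged API):

```ts
import { WebXRDefaultExperience, WebXRFeatureName } from "@babylonjs/core";

// Sketch: enable the depth-sensing feature on an existing XR experience.
// Feature name and option names are assumed, not confirmed by this PR.
function enableDepthSensing(xr: WebXRDefaultExperience) {
    return xr.baseExperience.featuresManager.enableFeature(
        WebXRFeatureName.DEPTH_SENSING,
        "latest",
        {
            usagePreference: ["cpu-optimized", "gpu-optimized"],
            dataFormatPreference: ["luminance-alpha", "float32"],
        }
    );
}
```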