> I feel some example code would help with the discussion (I'm not entirely familiar with WebGPU usage): what's the difference between CPU `MLGraph.compute()` vs. GPU `MLGraph.compute()` vs. GPU command encoder + WebGPU interop?
I'd agree.

There are samples for `MLGraph.compute()` (or `MLContext.compute()` in the current spec) in the spec and explainer. Basically, the CPU and GPU contexts share the same code path, except for setting the corresponding device type when creating the context.
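As a sketch of that shared code path (API names follow the current WebNN spec and explainer; the graph shape, input names, and `deviceType` values here are illustrative, not normative):

```javascript
// Sketch: build and compute a small WebNN graph.
// Only the deviceType option differs between the CPU and GPU paths;
// everything after createContext() is identical.
async function computeWithDeviceType(deviceType) {
  // deviceType is 'cpu' or 'gpu' (illustrative values from the spec).
  const context = await navigator.ml.createContext({ deviceType });
  const builder = new MLGraphBuilder(context);

  // A trivial graph: c = a + b over 2x2 float32 tensors.
  const desc = { dataType: 'float32', dimensions: [2, 2] };
  const a = builder.input('a', desc);
  const b = builder.input('b', desc);
  const c = builder.add(a, b);
  const graph = await builder.build({ c });

  const inputs = {
    a: new Float32Array([1, 2, 3, 4]),
    b: new Float32Array([5, 6, 7, 8]),
  };
  const outputs = { c: new Float32Array(4) };

  // Executes on whichever device the context was created for.
  return context.compute(graph, inputs, outputs);
}
```

The point of the sample is that switching between CPU and GPU is a one-argument change at context creation, not a different API surface.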
There is a lack of sample code for GPU command encoder + WebGPU interop. We prototyped WebGPU interop earlier, when investigating real-time video processing; the WebGPU + WebNN blur transformation sample could serve as a reference. But I'd agree we should add sample code for WebGPU interop.
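For the interop case, a rough sketch of the shape such a sample might take (this assumes the earlier `MLCommandEncoder`-style interop proposal, where the ML context is created from an existing WebGPU device so buffers can be shared; the `createCommandEncoder`/`dispatch` names and the blur step are from that prototype and are subject to change):

```javascript
// Sketch: interleave a WebGPU pre-processing pass (e.g. blur) with a
// WebNN inference on shared GPU buffers, then submit both together.
// MLCommandEncoder here follows the earlier interop proposal, not the
// current spec; treat every name below as provisional.
async function blurThenInfer(gpuDevice, graph, inputBuffer, outputBuffer) {
  // A context backed by the same WebGPU device (proposed API shape).
  const context = await navigator.ml.createContext(gpuDevice);

  // Record the WebGPU work, e.g. a blur compute pass writing into
  // inputBuffer. The pass itself is elided in this sketch.
  const gpuEncoder = gpuDevice.createCommandEncoder();
  // ... encode blur pass targeting inputBuffer ...
  const gpuCommands = gpuEncoder.finish();

  // Record the WebNN dispatch against the same shared resources.
  const mlEncoder = context.createCommandEncoder();
  mlEncoder.dispatch(graph, { input: inputBuffer }, { output: outputBuffer });
  const mlCommands = mlEncoder.finish();

  // Submitting on one queue keeps the blur and the inference ordered
  // without a CPU round trip between them.
  gpuDevice.queue.submit([gpuCommands, mlCommands]);
}
```

This is the property the real-time video prototype relied on: the frame never leaves the GPU between the WebGPU transform and the WebNN inference.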
Split from #341, where @wacky6 raised the question quoted above.
/cc @wchao1115