Hi!
I'm currently giving your method a try on a custom scene of mine, but I'm facing a rather disturbing and strange issue: training starts well, but at some point I get a CUDA kernel error I can't make sense of.
I strongly suspect an issue related to the `gaussians.compute_3D_filter` method (https://github.com/autonomousvision/mip-splatting/blob/746a17c9a906be256ed85b8fe18632f5d53e832d/train.py#L164C1-L165C1), but I haven't managed to investigate the error any further. I found a similar issue in the original GS repo (here: graphdeco-inria/gaussian-splatting#41 (comment)), made the corresponding changes, and rebuilt the `diff-gaussian-rasterizer` submodule, but I still get the error. Here is the log stack I get.
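In case it helps, here is a minimal sketch of what I plan to try next to narrow it down: forcing synchronous kernel launches and synchronizing around the suspected call. The wrapper name is mine, and I'm only assuming the `gaussians` / `trainCameras` names and call signature from my reading of train.py:

```python
import os

# Force synchronous kernel launches so the Python traceback points at the
# kernel that actually fails (must be set before the first CUDA call).
os.environ["CUDA_LAUNCH_BLOCKING"] = "1"

import torch


def checked_compute_3D_filter(gaussians, trainCameras):
    """Hypothetical debugging wrapper around the call I suspect in train.py."""
    torch.cuda.synchronize()  # flush any previously queued kernels first
    gaussians.compute_3D_filter(cameras=trainCameras)
    torch.cuda.synchronize()  # if this raises, the failure comes from this call
```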
Do you have any clues or insights on what's happening and why?
Thanks a lot for your time and your work.
Best,
Gaétan.