
Extension of input dim #13

Open
StrivedTye opened this issue Oct 18, 2023 · 1 comment

Comments

@StrivedTye

Hi Authors,

Thanks for your great work!
During training, I want to utilize other geometric features (e.g., normals) in addition to the coordinates (x, y, z). That is, the input shape becomes (B, N, 6) instead of (B, N, 3). After this extension, there are some problems with the calculation of the Jacobian matrix. I still compute the gradient only with respect to x, y, and z: A1 = self.mlp1[0].weight[:, :3, :]. Does this make sense?

Thanks!

@Lilac-Lee
Owner

Hi @StrivedTye, you can view the ReLU-MLP as ReLU(Ax+b) (see eq. 8). You are therefore computing the partial derivative of the network with respect to its input. If your input has size 6, you will get a gradient of size 6, with each dimension corresponding to one input dimension; keeping only the first three columns gives the gradient with respect to (x, y, z). So your understanding is correct.

Cheers.
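To make the point above concrete, here is a minimal NumPy sketch (not the repository's code; the array names and sizes are illustrative). For a layer y = ReLU(Ax + b) with a 6-D input (x, y, z, nx, ny, nz), the Jacobian with respect to the input is the ReLU gate times the weight matrix, and slicing its first three columns is analogous to `self.mlp1[0].weight[:, :3, :]` in the issue:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ReLU-MLP layer: y = ReLU(A x + b), with a 6-D input
# (x, y, z, nx, ny, nz). Sizes are illustrative.
A = rng.standard_normal((16, 6))   # weight, shape (out_dim, in_dim)
b = rng.standard_normal(16)        # bias
x = rng.standard_normal(6)         # one input point with normals

pre = A @ x + b                    # pre-activation
y = np.maximum(pre, 0.0)           # ReLU(Ax + b)

# Full Jacobian dy/dx: the ReLU gate (0/1 per output unit)
# broadcast against the weight matrix, shape (16, 6).
J_full = (pre > 0).astype(float)[:, None] * A

# Gradient with respect to (x, y, z) only: keep the first
# three input columns, analogous to weight[:, :3, :].
J_xyz = J_full[:, :3]              # shape (16, 3)

print(J_full.shape, J_xyz.shape)
```

Each column of the 6-wide Jacobian corresponds to one input dimension, so dropping the last three columns discards the sensitivity to the normals while leaving the (x, y, z) block untouched.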
