
Add support for density compensation estimation with cufinufft #195

Draft · wants to merge 58 commits into master

Conversation

chaithyagr (Member) commented:

This is crucial support for many use cases. I have added some temporary things to make sure the tests still pass.

@paquiteau (Member) left a comment:

Let's see how it goes in flatironinstitute/finufft#564.

@paquiteau changed the title from "Add support for density compenstation estimation with cufinufft" to "Add support for density compensation estimation with cufinufft" on Nov 18, 2024.
@chaithyagr (Member, Author) commented:

cufinufft 2.3.1 is here! I'll update the PR to pull it.

@paquiteau (Member) left a comment:

Some questions and suggestions, but this probably makes cufinufft the best backend for mri-nufft (except for memory consumption, where gpunufft still wins).

python -m pip install finufft pooch brainweb-dl torch fastmri

- name: Install GPU related interfaces
- name: Point to CUDA 12.1 #TODO: This can be combined from other jobs
Member:

I don't get your comment, what combination do you have in mind?

Member Author:

We already do this in other places in the CI; we shouldn't have to redo it again and again.

Member:

Since there are two backends, you could plot the different density compensation vectors to show the differences (as cufinufft and gpunufft do not use the same interpolation kernel).
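
For illustration, a rough sketch of the comparison suggested above (not part of the PR). It assumes the mri-nufft operator factory accepts a `density="pipe"` argument for both backends and exposes the computed weights as `op.density`; the trajectory and image shape are placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt
from mrinufft import get_operator

# Placeholder 2D trajectory (normalized to [-0.5, 0.5]) and image shape.
samples = np.random.uniform(-0.5, 0.5, (10_000, 2)).astype(np.float32)
shape = (256, 256)

weights = {}
for backend in ("cufinufft", "gpunufft"):
    # Assumption: density="pipe" triggers Pipe-style estimation and the
    # resulting weights are stored on the operator as `op.density`.
    op = get_operator(backend)(samples, shape, density="pipe")
    w = op.density
    weights[backend] = w.get() if hasattr(w, "get") else np.asarray(w)

plt.plot(weights["cufinufft"], label="cufinufft")
plt.plot(weights["gpunufft"], label="gpunufft")
plt.xlabel("sample index")
plt.ylabel("density compensation weight")
plt.title("Pipe density weights: cufinufft vs gpunufft")
plt.legend()
plt.show()
```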

@@ -54,7 +54,7 @@ def __init__(self, inital_trajectory):
data=torch.Tensor(inital_trajectory),
requires_grad=True,
)
self.operator = get_operator("gpunufft", wrt_data=True, wrt_traj=True)(
Member:

Why do you prefer cufinufft?

Member Author:

The idea is to run cufinufft with density compensation here, basically to increase the coverage.
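
To make the intent concrete, a hedged sketch of the swap being discussed; `samples` and `shape` stand in for the example's trajectory and image shape, and whether the autodiff wrapper accepts a `density` keyword in this PR is an assumption.

```python
from mrinufft import get_operator

# Sketch only: same call as before, but with the cufinufft backend and
# density compensation enabled so that code path is exercised as well.
operator = get_operator("cufinufft", wrt_data=True, wrt_traj=True)(
    samples,            # k-space trajectory, shape (n_samples, ndim)
    shape,              # image shape, e.g. (256, 256)
    density="pipe",     # assumption: Pipe estimation as added in this PR
)
```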

pyproject.toml Outdated
Comment on lines 17 to 18
torchkbnufft-cpu = ["torchkbnufft", "cupy-cuda12x"]
torchkbnufft-gpu = ["torchkbnufft", "cupy-cuda12x"]
Member:

those "CI backends" could be put all together and with a comment above for explaining.
you could also do something like torchkbnufft-cpu = ["mri-nufft[torchkbnufft]"] to avoid repeating the dependency

cp.array(samples[:, 1], copy=False),
cp.array(samples[:, 2], copy=False) if self.ndim == 3 else None,
)
plan.setpts(self._kx, self._ky, self._kz)
Member:

Just to be sure: this does not create copies of the arrays, right? (Maybe related to #147 as well.)
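
A quick way to check this on the cupy side (what setpts does internally is a separate question); this is a sanity-check sketch, not code from the PR.

```python
import cupy as cp

# Placeholder 3D trajectory on the device.
samples = cp.random.uniform(-0.5, 0.5, (1000, 3), dtype=cp.float32)

# cp.array(..., copy=False) should return a view of the existing device
# buffer when no dtype or layout conversion is required.
kx = cp.array(samples[:, 0], copy=False)

# Same device pointer => no extra allocation was made.
print(kx.data.ptr == samples[:, 0].data.ptr)  # expected: True
```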

Comment on lines 926 to 930
if normalize:
    test_op = MRICufiNUFFT(samples=kspace_loc, shape=original_shape, **kwargs)
    test_im = cp.ones(original_shape, dtype=test_op.cpx_dtype)
    test_im_recon = test_op.adj_op(density_comp * test_op.op(test_im))
    density_comp /= cp.mean(cp.abs(test_im_recon))
Member:

This could be refactored into a `def _normalize_density(backend, samples, shape, density_comp)` so that it can be reused for other density compensation methods (e.g. Voronoi).
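
A rough sketch of what that helper could look like, following the signature suggested above. It generalizes the snippet in this PR by going through `get_operator`, and it assumes the chosen backend accepts the same cupy arrays as the cufinufft path does.

```python
import cupy as cp
from mrinufft import get_operator


def _normalize_density(backend, samples, shape, density_comp, **kwargs):
    """Scale raw density weights so a round-trip of a constant image has unit mean.

    Sketch of the refactor suggested in review, reusable by other density
    estimation methods (e.g. Voronoi) once their raw weights are computed.
    """
    test_op = get_operator(backend)(samples=samples, shape=shape, **kwargs)
    test_im = cp.ones(shape, dtype=test_op.cpx_dtype)
    test_im_recon = test_op.adj_op(density_comp * test_op.op(test_im))
    density_comp /= cp.mean(cp.abs(test_im_recon))
    return density_comp
```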
