
Scalability and non rigid non linear transformations #1

Closed
asmagen opened this issue Mar 28, 2020 · 30 comments

Comments

asmagen commented Mar 28, 2020

Does this package scale well to WSIs of size 15k x 15k pixels, and does it allow for non-rigid registration accurate enough to align individual cells? The tissues I'm working with not only shift but also shrink, and require non-linear transformations.

Thanks


asmagen commented Mar 28, 2020

Also, can you provide a usage example for registering two png images?


jlevy44 commented Mar 28, 2020

Hey @asmagen, this package should scale well to WSIs up to 50-60k pixels in any given spatial dimension, provided the tissue can mostly be separated into distinct connected components (see paper). Our paper only details macro-architectural alignment for now, since that was the dataset we had on hand, but we are working to acquire the necessary dataset to improve cell-level alignment. Once you try the algorithm on your images, perhaps you can let us know how it went?

There are still a few loss functions and deformations we will be adding, but some of them do handle non-linear transforms.

I need to push a quick update to handle file types (e.g. png) other than npy format; I'll update the package in a few hours and send you some commands.

Thank you for this feedback! Very much appreciated :)


jlevy44 commented Mar 28, 2020

Ok, I just updated the code, but have not yet published a new PyPI release, because the new code has not been tested yet.

Feel free to try it out, instructions can be found in the updated README.

Please do let us know if you encounter bugs (probably because I just pushed quite a few changes rapidly) and also if you are able to have some success here (possibly room for fruitful collaborations if there is interest!).

If you encounter bugs, I'll spend some time in the next few days debugging. The PyPI installation works with npy format (it's easy to convert png to numpy), while the latest build works with any format readable by numpy or cv2.
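For reference, the png-to-npy conversion mentioned above is a few lines. A minimal sketch (in practice the array would come from cv2.imread on your slide; here a random array stands in so the snippet is self-contained):

```python
import os
import tempfile

import numpy as np

# In practice: img = cv2.imread("slide.png")  -> uint8 BGR array of shape (H, W, 3).
# A random array stands in for the image so this sketch runs on its own.
img = np.random.randint(0, 256, size=(64, 48, 3), dtype=np.uint8)

# Save in the npy format the PyPI build reads, then round-trip to verify.
path = os.path.join(tempfile.mkdtemp(), "slide.npy")
np.save(path, img)
restored = np.load(path)
assert restored.dtype == np.uint8 and np.array_equal(img, restored)
```

np.save preserves dtype and shape exactly, so nothing is lost in the conversion.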


jlevy44 commented Mar 31, 2020

#5


jlevy44 commented Mar 31, 2020

Just to add: we should have a dataset to test cell-level alignment now, so I may have time over the weekend to try it out myself. I'm running a few jobs right now that make it difficult to free up the memory for this job. You are welcome to continue to contribute. All of our discussion so far has been quite beneficial. :)


asmagen commented Apr 3, 2020

May be of interest with regard to applying deformations:
Warping drop2


asmagen commented Apr 4, 2020

See new and relevant response here: MRtrix3/mrtrix3#2004 (comment)


jlevy44 commented Apr 6, 2020

Hey, thanks for sending this. Unfortunately, I was not able to complete the necessary modifications this weekend, as my qualifying exam is this week.

Getting ITK to MRtrix and/or getting the niis to work will be a priority when things clear up. Meanwhile, I will also be working to get the nonlinear transform code online; it may just take me a little longer than expected. Thanks for all of your help @asmagen


asmagen commented Apr 7, 2020

Thank you @jlevy44 , looking forward to it soon.


asmagen commented Apr 8, 2020

FYI:
airlab-unibas/airlab#19


jlevy44 commented Apr 11, 2020

Ok, just finished my qual. I just pushed some code, but haven't been able to test it. The new command-line option is apply_drop2_transform:

pathflow_mixmatch apply_drop2_transform --source_image [IMAGE TO WARP] --ref_image [WARP TO THIS] --dx [X DISPLACEMENT FROM DROP2] --dy [Y DISPLACEMENT FROM DROP2] --gpu_device -1

You can also try to set gpu_device to 0 and see how memory constraints change when warping on the gpu. Fingers crossed that this works, if not I will debug later today. This should be a quick fix until we debug our own pipeline a bit more.


asmagen commented Apr 12, 2020

Thanks! I'll try it out asap.


asmagen commented Apr 14, 2020

Thanks for incorporating these changes @jlevy44. Since there is a call to a target_image object that is not yet defined, I extracted the code to continue working with the rest of it. Most of it worked, but there's an issue with the object types passed as parameters to displace_image:

>>> source_img=cv2.imread(source_image)
>>> source_img.shape
(4308, 2928, 3)
>>> ref_img=cv2.imread(ref_image)
>>> ref_img.shape
(4302, 2846, 3)
>>> source_img=cv2.resize(source_img,ref_img.shape[:2][::-1])
>>> source_img.shape
(4302, 2846, 3)
>>> dx,dy=nibabel.load(dx).get_fdata(),nibabel.load(dy).get_fdata()
>>> displacement=th.tensor(np.concatenate([dx,dy],-1)).unsqueeze(0).permute(0,2,1,3)
>>> displacement.shape
torch.Size([1, 4302, 2846, 2])
>>> new_img = displace_image(source_img, displacement, gpu_device)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 6, in displace_image
  File "/usr/local/lib/python3.7/site-packages/airlab/transformation/utils.py", line 101, in warp_image
    warped_image = F.grid_sample(image.image, displacement + grid)
  File "/usr/local/lib/python3.7/site-packages/torch/nn/functional.py", line 2711, in grid_sample
    return torch.grid_sampler(input, grid, mode_enum, padding_mode_enum, align_corners)
RuntimeError: grid_sampler(): expected input and grid to have same dtype, but input has float and grid has double

Apparently displacement is torch.float64 and source_img is uint8.
I also changed the cv2.resize call because it generated an issue with the dimensions that were swapped.

I found that it happens in displace_image, requiring a change of the dtype from th.float32 to th.float64: im=al.utils.image.create_tensor_image_from_itk_image(im, dtype=th.float64, device=('cuda:{}'.format(gpu_device) if gpu_device>=0 else 'cpu'))
which made it run and complete, but when I open the image I see only black pixels.

Any idea what's happening?
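For context, the dtype mismatch above can be reproduced and fixed in a few lines. A minimal sketch, with small hypothetical tensor sizes standing in for the WSI data, showing the opposite cast (displacement down to float32 rather than the image up to float64, which also halves memory):

```python
import torch
import torch.nn.functional as F

# Small stand-ins for the tensors in the traceback above (hypothetical sizes).
image = torch.rand(1, 3, 8, 8)                               # float32, like airlab's image tensor
displacement = torch.zeros(1, 8, 8, 2, dtype=torch.float64)  # nibabel's get_fdata() returns float64

# Identity sampling grid in normalized coordinates, float32.
identity = F.affine_grid(torch.eye(2, 3).unsqueeze(0), image.shape, align_corners=False)

# F.grid_sample(image, identity + displacement) would raise the RuntimeError above,
# because float32 + float64 promotes the grid to float64 while the image stays float32.
# Casting the displacement down to float32 resolves the mismatch:
warped = F.grid_sample(image, identity + displacement.float(), align_corners=False)
assert warped.dtype == torch.float32
assert torch.allclose(warped, image)  # zero displacement -> identity warp
```

Casting this way avoids the all-float64 path, which doubles memory for no precision benefit at the grid_sample step.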


jlevy44 commented Apr 14, 2020

It's possible that all of the values were zeroed out at some point, perhaps by some division or multiplication by 255. Just suggesting some ideas; I'll look into it. I think we're close though!


jlevy44 commented Apr 15, 2020


asmagen commented Apr 15, 2020

That was the problem? Is it working for you now?


jlevy44 commented Apr 16, 2020

Yeah, it's working! It just needs some final touches. I would test it out, and then carefully consider trying the following options, which I have not tested exhaustively, so I'm still having trouble finding the ideal transform.

The biggest remaining issue is that it is not readily apparent how dx and dy should be applied to build the displacement matrix, so I have added the temporary options --flip_pattern and --flip_xy to experiment with which combination works: flipping the first two axes of dx and dy, and whether to reverse the order of dx and dy.

In flip_pattern, setting an element to -1 corresponds to [flip dx axis 0, flip dx axis 1, flip dy axis 0, flip dy axis 1], and flip_xy then reverses the order of [dx, dy] to [dy, dx] before concatenation. I am unsure which combination of these options is correct. There are 32 options to select from, and one of them is the correct one; once we find it, I will hard-code it. Check out the code to see how this experiment is being done.
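The 32-configuration search described above can be enumerated mechanically rather than by hand. A sketch (the count comes from two choices per flip_pattern entry times the dx/dy swap; the command template is only illustrative):

```python
from itertools import product

# Four flip_pattern entries, each -1 or 1, times flip_xy in {False, True}:
# 2**4 * 2 = 32 configurations, matching the count above.
configs = [(pattern, flip_xy)
           for pattern in product((-1, 1), repeat=4)
           for flip_xy in (False, True)]
assert len(configs) == 32

# Illustrative command lines for a grid search over all configurations.
commands = [
    "pathflow-mixmatch apply_drop2_transform "
    f"--flip_pattern [{','.join(map(str, pattern))}] --flip_xy {flip_xy} ..."
    for pattern, flip_xy in configs
]
```

Every configuration is distinct, so checking the warped output for each of the 32 commands covers the whole search space exactly once.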

Sorry for the delay!

Try reinstalling from github and running (replace with your own images/files):

 pathflow-mixmatch apply_drop2_transform --flip_pattern [-1,-1,1,1] --flip_xy True --source_image A.png --ref_image B.png --dx field_x.nii.gz --dy field_y.nii.gz --gpu_device -1 --output_file test.warp.png

And let me know which flip_pattern and flip_xy work for you so I can make the changes! I'll experiment more tomorrow. We're close.

Wishing you the best of luck on your research! Seems like a very interesting project.


jlevy44 commented Apr 23, 2020

Ok, I tested all 32 options, and unfortunately, all came back negative. But this is because I forgot to specify --ocompose, as per biomedia-mira/drop2#2.

I think with this specified, I should be able to get it up and running. @asmagen , remember to specify --ocompose to get the entire deformation field.

I will modify my command to obtain the ideal output!


jlevy44 commented Apr 23, 2020

Now, it appears to be working! Should have it fully functional by tomorrow!


jlevy44 commented Apr 23, 2020

@asmagen Ok! Should be good to go. There may be minor defects related to the interpolation method that I can look into, but give it a shot. I was able to get it to work on my data with minor defects that I may seek to further remedy.


jlevy44 commented Apr 23, 2020

pathflow_mixmatch apply_drop2_transform --source_image [IMAGE TO WARP] --ref_image [WARP TO THIS] --dx [X DISPLACEMENT FROM DROP2] --dy [Y DISPLACEMENT FROM DROP2] --gpu_device -1


asmagen commented Apr 29, 2020

I see we got input here regarding the registration issues with airlab. How can we make progress with that solution here?

sumanthratna (Collaborator) commented

Hi @asmagen! I have a draft PR (it's a work in progress) that should fix the nonlinear transformations (#12). It's not ready yet but I'd expect it to be ready in the next few days.

If you're working on a tight deadline, please try out my fork of this repo and let me know if it works for you!


asmagen commented May 8, 2020

Hi @sumanthratna, has it been resolved, or is it still in progress?

sumanthratna (Collaborator) commented

Hi @asmagen! Progress has been a little slow on mixmatch, but I believe that the loss of details in wendland can be resolved by finding the right parameters, which might take some trial-and-error. Unfortunately, I don't know if I'll be able to take another look at it until after May 18.

sumanthratna (Collaborator) commented

Update for @asmagen: it turns out that decreasing the learning rate significantly fixes the issue we saw in airlab-unibas/airlab#26. I think the plan is to do some more testing, but I believe the PR should be merged soon.


jlevy44 commented May 9, 2020

@asmagen we're in the middle of testing. I would try the drop2 transformation, which should work; if not, I agree with Sumanth: definitely fork the repo, or contact the airlab developers to ask about the block of code I added for applying drop2. They may be able to help out.


jlevy44 commented May 9, 2020

I think our code is very close for the drop2 solution, though it may need one or two minute changes.


asmagen commented May 28, 2020

Hi @jlevy44 @sumanthratna, I'm getting back to the registration now and wanted to catch up on the latest developments on the issue we had with the transformation and the learning rate. Where do we stand? What is currently working or requires attention on my end, and how do I run it? To clarify, I'm referring to regular grayscale image registration where you have one or two large tissue chunks rather than multiple small components.
Thanks

@asmagen asmagen closed this as completed Jun 2, 2020