Scalability and non-rigid, non-linear transformations #1
Also, can you provide a usage example for registering two png images?
Hey @asmagen , this package should scale well to WSIs up to 50-60k pixels in any given spatial dimension, given that the tissue can mostly be separated into distinct connected components (see paper). Our paper only details macro-architectural alignment for now, since that was the dataset we had on hand, but we are working to acquire the necessary dataset to improve cell-level alignment. Maybe once you try the algorithm on your images, you can let us know how it went? There are still a few loss functions and deformations we will be adding, but some of them do handle non-linear transforms. I need to push a quick update to handle file types (e.g. png) other than the npy format; I'll update the package in a few hours and send you some commands. Thank you for this feedback! Very much appreciated :)
Ok, I just updated the code, but have not yet pushed a new PyPI release, because the new code has not been tested yet. Feel free to try it out; instructions can be found in the updated README. Please do let us know if you encounter bugs (probable, because I just pushed quite a few changes rapidly) and also if you have some success here (possibly room for fruitful collaboration if there is interest!). If you encounter bugs, I'll spend some time in the next few days debugging. The PyPI installation works with the npy format (it is easy to convert png to numpy), while the latest build works with any format readable by numpy or cv2.
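[Editor's note] The png-to-numpy conversion mentioned above can be sketched as follows. This is a hypothetical helper, not part of the package; it uses Pillow and NumPy to produce the `.npy` files the PyPI release accepts.

```python
# Hypothetical helper (not the package's code): convert a PNG into the
# .npy format accepted by the PyPI release.
import numpy as np
from PIL import Image

def png_to_npy(png_path: str, npy_path: str) -> np.ndarray:
    """Load a PNG as an RGB uint8 array and save it as .npy."""
    img = np.array(Image.open(png_path).convert("RGB"))
    np.save(npy_path, img)
    return img
```

The saved array can then be passed to the tool wherever an `.npy` image is expected.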
Just to add: we should have a dataset to test cell-level alignment now, so I may have time over the weekend to try it out myself. I'm running a few jobs right now that make it difficult to find the memory to run this job. You are welcome to continue to contribute. All of our discussion so far has been quite beneficial. :)
May be of interest with regard to applying deformations:
See the new and relevant response here: MRtrix3/mrtrix3#2004 (comment)
Hey, thanks for sending this. Unfortunately, I was not able to complete the necessary modifications this weekend, as my qualifying exam is this week. Getting ITK to MRtrix and/or getting the niis to work will be a priority when things clear up. Meanwhile, I will also be working to get the nonlinear transform code online; it just may take me a little longer than expected. Thanks for all of your help @asmagen
Thank you @jlevy44 , looking forward to it soon.
Ok, just finished my qual. I just pushed some code, but haven't been able to test it. The new command-line option: apply_drop2_transform
You can also try setting gpu_device to 0 and see how memory constraints change when warping on the GPU. Fingers crossed that this works; if not, I will debug later today. This should be a quick fix until we debug our own pipeline a bit more.
Thanks! I'll try it out asap.
Thanks for incorporating these changes @jlevy44. Since there is a call to the target_image object that's not yet defined, I extracted the code to try to continue working with the rest of it. Most of it worked, but there's an issue with the object types sent as parameters to
Apparently displacement is torch.float64 and source_img is uint8. I found that it happens in Any idea what's happening?
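[Editor's note] The dtype mismatch described here (a float64 displacement field against a uint8 image) is typically resolved by casting both to a common floating type before warping and casting back afterwards. Below is a minimal NumPy sketch of that idea, not the package's actual warping code; the function name and the (dy, dx) displacement layout are assumptions for illustration.

```python
# Sketch (not the package's code): harmonize dtypes before warping.
# displacement arrives as float64, source_img as uint8; both are cast
# to float32, the image is backward-warped with nearest-neighbor
# sampling, and the result is cast back to the original dtype.
import numpy as np

def warp_nearest(source_img: np.ndarray, displacement: np.ndarray) -> np.ndarray:
    """Backward-warp a 2D image by a (2, H, W) displacement (dy, dx)."""
    src = source_img.astype(np.float32)     # uint8 -> float32
    disp = displacement.astype(np.float32)  # float64 -> float32
    h, w = src.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    ys = np.clip(np.rint(yy + disp[0]).astype(int), 0, h - 1)
    xs = np.clip(np.rint(xx + disp[1]).astype(int), 0, w - 1)
    return src[ys, xs].astype(source_img.dtype)  # back to uint8
```

With a zero displacement this returns the input unchanged; the same cast-then-warp-then-cast pattern applies when using torch tensors (e.g. via `.float()`).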
It’s possible that all of the values were zeroed out at some point; I’m not sure, though, and will look into it. It could be some division or multiplication by 255. Just suggesting some ideas; I’ll look into it. I think we’re close though!
Was that the problem? Is it working for you now?
Yeah, it's working! It just needs some final touches. I would test it out, and then carefully consider testing the following options, which I have not tested in totality, and thus am having trouble finding the ideal transform. The biggest issue remaining is that it is not readily apparent how dx and dy should be applied to build the displacement matrix, so I have added the temporary options --flip_pattern and --flip_xy to experiment with which combination of flipping the first two axes of dx and dy, and whether to reverse the order of dx and dy, works. Looking at flip_pattern, setting elements to -1 corresponds to: [flip dx axis 0, flip dx axis 1, flip dy axis 0, flip dy axis 1], and flip_xy then reverses the order of [dx, dy] to [dy, dx] before concatenation. I am unsure which combination of these options is correct. There are 32 options to select from, and one of them is the correct one. When we find it, I will hard-code it. Check out the code to see how this experiment is being done. Sorry for the delay! Try reinstalling from GitHub and running (replace with your own images/files):
And let me know which flip_pattern and flip_xy work for you so I can make the changes! I'll experiment more tomorrow. We're close. Wishing you the best of luck on your research! It seems like a very interesting project.
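[Editor's note] The 32-way search described above (16 flip patterns × 2 axis orders) can be sketched as follows. The function name and argument layout are assumptions for illustration, not the package's actual code; only the semantics of --flip_pattern and --flip_xy come from the comment above.

```python
# Sketch (assumed names, not the package's code) of enumerating the 32
# combinations: four independent +/-1 axis flips of dx/dy, plus an
# optional swap of [dx, dy] to [dy, dx] before stacking.
import itertools
import numpy as np

def build_displacement(dx, dy, flip_pattern, flip_xy):
    """flip_pattern: 4 entries of +/-1 meaning
    [flip dx axis 0, flip dx axis 1, flip dy axis 0, flip dy axis 1];
    flip_xy: reverse [dx, dy] to [dy, dx] before stacking."""
    f = lambda a, ax0, ax1: a[::ax0, ::ax1]  # step -1 flips that axis
    dx = f(dx, flip_pattern[0], flip_pattern[1])
    dy = f(dy, flip_pattern[2], flip_pattern[3])
    fields = [dy, dx] if flip_xy else [dx, dy]
    return np.stack(fields)

flip_patterns = list(itertools.product([1, -1], repeat=4))  # 16 patterns
combos = [(p, xy) for p in flip_patterns for xy in (False, True)]  # 32 total
```

Iterating `combos` and visually inspecting each warped output is one way to identify the single correct combination to hard-code.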
Ok, I tested all 32 options, and unfortunately, all came back negative. But this is because I forgot to specify --ocompose, as per issue biomedia-mira/drop2#2 . I think with this specified, I should be able to get it up and running. @asmagen , remember to specify --ocompose to get the entire deformation field. I will modify my command to obtain the ideal output!
Now it appears to be working! Should have it fully functional by tomorrow!
@asmagen Ok! Should be good to go. There may be minor defects related to the interpolation method that I can look into, but give it a shot. I was able to get it to work on my data, with minor defects that I may seek to further remedy.
I see we got input here regarding the registration issues with airlab. How can we progress with utilizing the solution here?
Hi @sumanthratna, has it been resolved, or is it still in progress?
Hi @asmagen! Progress has been a little slow on mixmatch, but I believe that the loss of detail in wendland can be resolved by finding the right parameters, which might take some trial and error. Unfortunately, I don't know if I'll be able to take another look at it until after May 18.
Update for @asmagen: it turns out that decreasing the learning rate significantly fixes the issue we saw in airlab-unibas/airlab#26. I think the plan is to do some more testing, but I believe the PR should be merged soon.
@asmagen we’re in the middle of testing. I would try the drop2 transformation, which should work; if not, I agree with Sumanth to definitely fork the repo, or contact the airlab developers to ask about the block of code I had added for applying drop2. They may be able to help out.
I think our code is very close for the drop2 solution, though it may need one or two minute changes.
Hi @jlevy44 @sumanthratna I'm getting back to the registration now, and I wanted to catch up with the latest developments on the issue we had with the transformation and the learning rate. Where do we stand in that context? What is currently working or requiring attention on my end, and how do I run it? To clarify, I'm referring to regular grayscale image registration where you have one or two large tissue chunks rather than multiple small components.
Does this package scale well to WSIs of size 15k x 15k pixels, and does it allow for non-rigid registration to the level of aligning well down to individual cells? The tissues I'm working with not only shift but also shrink, and require non-linear transformations.
Thanks