StackOverflow Type Inference Error #44852
Comments
It's unlikely that someone can begin to address this without being able to reproduce it. A MWE would help.
An example of the code can be made available to Julia Computing (I'll have to ask my teammates), but it contains proprietary information which cannot be publicly shared. Unfortunately, the code could only be reduced so much.
I believe there's a process for RelationalAI to share compiler issues with Julia Computing (there's a support contract between the companies). Have you followed that process to let them know and prioritize this issue?
Oops, sorry if I suggested the wrong thing here! I can confirm that I suggested @caseykneale should post a companion issue for the type inference stack overflow in the julia repo. I thought that was part of our standard procedure, but it's possible I am out of date. Anyway, I think this is being sorted out; I just wanted to be sure we had a tracker for the issue on the open julia repo so we have somewhere to coordinate the work. Hope that clarifies things a bit!
Traced this to a call to foldr/reverse on a very long tuple. In that case, we have recursion on a tuple that's smaller on each step, so the compiler allows it and tries to follow it to completion, causing a stack overflow in inference. We can't easily do anything about this. For example, adding a cutoff would make type inference wildly unpredictable, since it would depend not just on tuple length but on how deep the inference stack happened to be when we hit the long tuple function. Doing nothing about it is also justified by the fact that such cases would often cause a runtime stack overflow as well. One thing we can maybe do is avoid trying to print a backtrace when type inference hits a stack overflow, since that can be quite disruptive (lots of output, and it can take a very long time).
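For concreteness, here is a minimal sketch of the pattern described above. This is hypothetical code, not the reporters' proprietary example, and the tuple length needed to actually hit the overflow depends on the Julia version and the inference stack depth at the call site.

```julia
# Hypothetical sketch: foldr/reverse over a very long tuple.
# Each recursive step operates on a strictly shorter tuple type, so inference
# is allowed to follow the recursion element by element and may overflow its
# own stack before the code ever runs.
t = ntuple(identity, 10_000)  # a 10_000-element tuple

# Inference of this call can recurse roughly once per tuple element.
foldr(+, t)
```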
Yeah, ideally we could print an error about the user's code instead of the internal stack overflow from inside of type inference; that would be awesome. But yeah, thanks, this explanation makes sense! Thanks for digging 👍
We ran into an issue where Julia type inference breaks down and results in a stack overflow (SO) while working on an application for a client. The issue is present in both Julia 1.6.5 and Julia 1.7.2. On macOS the SO terminates quickly, but on Ubuntu instances it propagates a stack trace for more than 24 hours (>600 MB of output).
We cannot share the example, but we have made an rr recording available (hopefully it was collected correctly).
Private Link:
https://github.com/RelationalAI/raicode/issues/7471
Possibly related issues:
#43050
#38364