cycle in requirements.txt causes infinite recursion #354
Comments
I think an easy way to handle this could be to just track what requirement files have been previously included (and include the one that is about to be included) because that should also optimize for more complex tree shapes that result in the same file being included multiple times in a way that is not a cycle e.g.
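As a rough illustration of that approach, here is a minimal sketch in Go; the function name, signature, and parsing details are hypothetical, not the actual osv-scanner code:

```go
package lockfile

import (
	"os"
	"path/filepath"
	"strings"
)

// parseRequirements reads a requirements file, following "-r" includes,
// but skips any file that has already been seen. This breaks cycles and
// also means diamond-shaped include graphs read each file only once.
// (Illustrative sketch only; not the actual osv-scanner implementation.)
func parseRequirements(path string, seen map[string]bool) ([]string, error) {
	abs, err := filepath.Abs(path)
	if err != nil {
		return nil, err
	}
	if seen[abs] {
		return nil, nil // already included once: stop here
	}
	seen[abs] = true

	data, err := os.ReadFile(abs)
	if err != nil {
		return nil, err
	}

	var packages []string
	for _, line := range strings.Split(string(data), "\n") {
		line = strings.TrimSpace(line)
		switch {
		case strings.HasPrefix(line, "-r "):
			// Includes are resolved relative to the including file.
			included := filepath.Join(filepath.Dir(abs), strings.TrimSpace(line[len("-r "):]))
			nested, err := parseRequirements(included, seen)
			if err != nil {
				return nil, err
			}
			packages = append(packages, nested...)
		case line != "" && !strings.HasPrefix(line, "#"):
			packages = append(packages, line)
		}
	}
	return packages, nil
}
```

Seeding it with an empty map (`parseRequirements("requirements.txt", map[string]bool{})`) handles self-includes, longer cycles, and diamond-shaped include graphs alike, since every file is read at most once.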
I would also be interested to know if you have any occurrences of that in your dataset - they'd not be noticeable in the final output, since packages are deduplicated.
+1, sounds like a straightforward way to avoid processing the same file twice.
Hmm, I'm not sure we have the data to identify these.
It's hard to comment without knowing the details of your setup, but if you have the ability to run a modified version of the scanner, you could easily have the scanner record every requirements file it reads.
Across the whole 1.2M repos, no, but locally and across a smaller subset of projects with python in the name? I can let it run and see if it finds anything interesting overnight.
I think this is the situation you were talking about:
There's also:
I don't think it creates any duplicate packages though, as you use a map here so it gets de-duped (osv-scanner/pkg/lockfile/parse-requirements-txt.go, lines 119 to 121 in 2c101c1).
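For illustration, de-duplication via a map works like this; `PackageDetails` and the key shape here are assumptions for the sketch rather than the exact code at that permalink:

```go
package main

import "fmt"

// PackageDetails is a stand-in for the scanner's parsed-package type.
type PackageDetails struct {
	Name, Version string
}

func main() {
	// The same file included twice yields the same entries...
	parsed := []PackageDetails{
		{Name: "flask", Version: "2.0.1"},
		{Name: "flask", Version: "2.0.1"},
	}
	// ...but a map keyed on name+version collapses them.
	deduped := map[string]PackageDetails{}
	for _, pkg := range parsed {
		deduped[pkg.Name+"@"+pkg.Version] = pkg
	}
	fmt.Println(len(deduped)) // prints 1
}
```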
Finally there's https://github.com/luxonis/depthai-python, which is a slightly different scenario: several top-level requirements files include the same shared file, so some of the packages will be duplicated across queries, since each query starts from a different lockfile. I think that is the correct behavior one would want to see in the results, but it does cause some extra queries to OSV.
@another-rex can you take a look here?
@oliverchang this is also one I'm working on over in the detector - I've not got a fix yet, but will probably have one in the next couple of days, so feel free to assign it to me if you want |
Ported from G-Rath/osv-detector#191. Note that this is not actually supported by `pip` itself, but doing so optimizes the parser a bit anyway, by only reading each file exactly once regardless of how often it is required. Fixes #354. Co-authored-by: Gareth Jones <[email protected]>
The logic that handles referring to other requirement files can recurse infinitely (or at least until the stack hits the 1 GB goroutine stack limit).
(osv-scanner/pkg/lockfile/parse-requirements-txt.go, lines 96 to 113 in 2c101c1)
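In essence, the parser follows `-r` includes recursively without recording which files it has already visited; a simplified sketch of that failure mode (not the code at the permalink above) looks like this:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// parse follows "-r" includes with no record of visited files, so a
// requirements file that includes itself (directly or via a chain of
// other files) is re-parsed endlessly until the stack limit is hit.
func parse(path string) []string {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil
	}
	var packages []string
	for _, line := range strings.Split(string(data), "\n") {
		line = strings.TrimSpace(line)
		if included, ok := strings.CutPrefix(line, "-r "); ok {
			packages = append(packages, parse(included)...) // unbounded on a cycle
			continue
		}
		if line != "" && !strings.HasPrefix(line, "#") {
			packages = append(packages, line)
		}
	}
	return packages
}

func main() {
	for _, p := range parse("requirements.txt") {
		fmt.Println(p)
	}
}
```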
Consider the following `requirements.txt`:
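(The original file isn't preserved in this extract; a minimal hypothetical example with the same shape is a file that includes itself:)

```
# requirements.txt -- hypothetical minimal reproduction
-r requirements.txt
flask==2.0.1
```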
Running `osv-scanner` on this file will crash the program (and my VM when trying to replicate). For what it's worth, it's a problematic `requirements.txt` for `pip` too.
Would some sort of cycle detection in `osv-scanner` make sense? Or is this too much work to handle what is essentially bad input? The alternative on my end is just removing the offending repo from my dataset, which I'm fine doing.