Improve Find All References performance? #4051
Can "searching" for files containing references be limited to workspace folders, if possible as an option?
It's 2021, but Find All References is still very slow for C++...
2022 now
@ljhm The speed isn't expected to improve by 2023 either, but you could potentially change the settings C_Cpp.references.maxCachedProcesses, C_Cpp.references.maxConcurrentThreads, and C_Cpp.references.maxMemory to increase performance. Whether that helps depends on which part of the processing is the bottleneck in the particular invocation and on the amount of memory/CPU available.
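For reference, those settings go in VS Code's settings.json. A minimal sketch, with illustrative values only (the numbers are assumptions, not recommended defaults; the right values depend on your core count and available RAM):

```jsonc
{
    // Keep more IntelliSense processes cached between queries (costs RAM).
    "C_Cpp.references.maxCachedProcesses": 4,

    // Allow more confirmation work to run in parallel (costs CPU cores).
    "C_Cpp.references.maxConcurrentThreads": 8,

    // Memory budget (in MB) for the Find All References operation.
    "C_Cpp.references.maxMemory": 8192
}
```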
I miss the speed of Source Insight, which was my development tool before VS Code.
It's fast if the files to confirm are already loaded.
@gtianw Do you mean to create an IntelliSense process for all the workspace files? You could file a feature request for that. It would use a lot of RAM, not SSD space; the disk isn't used for memory unless the OS is paging to it.
I mean to create
Can it be cached in a database (on the hard disk), so that we get the result from the database inside cpptools?
@gtianw ipch only has info on the headers for a particular TU, so it would already be used to speed up TU creation. @heartacker Yeah, we're aware of the possibility of writing references to the database -- that is being tracked internally (I suppose we could also open a new issue on GitHub). It's a major change though, since our tag parser that writes to the database doesn't currently do a full compile, e.g. of includes/defines, or parse into function definitions.
Can we pre-process all tag info for a project and store it in a database to boost Find All References speed? This database could be generated once and distributed per project, so devs could just download it instead of generating it locally.
@ODtian We actually already pre-process tag info for the whole project, but with a lexical parser. That approach requires additional symbol confirmation work for Find All References, which slows it down. Semantic parsing requires a lot of processing time up front and (possibly) a larger database. This is something we're currently investigating.
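To illustrate why the lexical approach needs that confirmation pass, here is a small hypothetical C++ snippet: a text-level tag search for references to Timer::reset matches every occurrence of the name, and only a semantic parse of each candidate file can tell which occurrences actually refer to that member.

```cpp
// Three unrelated symbols share the name "reset". A lexical search for
// references to Timer::reset finds all three call sites below; semantic
// confirmation (scopes, types, overload resolution) keeps only t.reset().
struct Timer  { void reset() {} };
struct Buffer { void reset() {} };

void reset() {}  // a free function with the same name

void example(Timer& t, Buffer& b) {
    t.reset();   // confirmed: Timer::reset
    b.reset();   // rejected:  Buffer::reset
    reset();     // rejected:  ::reset
}
```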
Any news on this one? Unfortunately I cannot share the repo, but "Find All References" is insanely slow here. Even if only a few files need to be confirmed, it adds several seconds. Sometimes it will confirm 40-50 files, which takes at least 30 seconds. There seems to be no difference at all when I choose preview (nothing is previewed; I still need to wait until everything is confirmed).
Our testing indicated that performance was on par with Visual Studio, and that VS had some bugs that were causing worse performance (due to it sometimes getting stuck processing on 1 thread) -- @Colengms, do you know the VS bug that you filed?
But... there's probably stuff we could do to improve performance. Since the bottleneck is generally IntelliSense parsing, though, I'm not sure yet whether the performance gain would be worth the effort.
Generally, the "too long" case occurs when there are too many files to "confirm" -- users can cancel, or use preview, if they don't want to wait for confirmation.
Let us know if anyone has a specific repro where you believe we're doing something incorrect that is causing bad performance or too-low CPU usage (i.e. performance is slower than with a similarly configured VS), e.g. if you have a 200k-line file somewhere in your code base, it could get "stuck" lexing on 1 core.
The "searching" phase could also take slightly longer (than VS) for opened files, because we don't re-use the already-opened document object and have to read it from disk.
Also, let us know if anyone encounters too slow "canceling" or too slow "previewing".