
Large Number of Files Make Lab Unresponsive #804

Closed
mlucool opened this issue Oct 20, 2020 · 3 comments

Comments

mlucool (Contributor) commented Oct 20, 2020

Description

If you have a large number of files in a git repo, jupyterlab-git makes all of lab slow/unresponsive.

While I am not sure what should count as "large", nor how the status of each file factors in, I have included a reproducer below as a starting point. This issue was found when a spurious .git in someone's home directory made lab unresponsive, but directories like node_modules can also contain this many files.

While there are many possible fixes, one idea is for the client to specify in its request what it thinks the current state is, so that the backend only sends the diff.
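The diff idea above could be sketched roughly as follows. This is a hypothetical helper, not part of jupyterlab-git's actual API: the client sends the file-to-status mapping it last saw, and the server replies with only the entries that were added, changed, or removed since then.

```python
# Sketch of a status-diff protocol (hypothetical; not jupyterlab-git's real API).
# client_state: the file -> status mapping the frontend last rendered.
# server_state: the current mapping computed from `git status`.

def status_diff(client_state: dict[str, str], server_state: dict[str, str]) -> dict:
    """Return the minimal update the client needs to reach server_state."""
    changed = {
        path: status
        for path, status in server_state.items()
        if client_state.get(path) != status
    }
    removed = [path for path in client_state if path not in server_state]
    return {"changed": changed, "removed": removed}
```

For example, if the client knows `{"a.txt": "untracked", "b.txt": "modified"}` and the server now sees `{"a.txt": "staged", "c.txt": "untracked"}`, only the two changed entries and the one removal go over the wire, instead of the full 10,000-entry listing.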

Reproduce

Create a new directory and run:

git init
for n in {1..10000}; do touch $n.txt; done

Navigate to that directory in lab. Play around and you'll notice lab becomes unusable. If lab is still responsive, increase n above.

Expected behavior

Git should not slow down lab.

Context

[email protected]
Chrome Version 86.0.4240.75 (Official Build) (64-bit)

ianhi (Collaborator) commented Oct 20, 2020

Can you upgrade to v0.22? @fcollonval added virtualized lists in #767, which should fix this problem.

ianhi (Collaborator) commented Oct 20, 2020

While there are many possible fixes, one idea is for the client to specify in its request what it thinks the current state is, so that the backend only sends the diff.

FWIW, the main culprit seemed to be rendering thousands of DOM nodes when creating the untracked files list; see #667 (comment)
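The virtualization fix mentioned above boils down to rendering only the rows visible in the scroll viewport, instead of one DOM node per file. A language-agnostic sketch of the windowing arithmetic (illustrative only; jupyterlab-git uses a React virtualized-list component for this):

```python
# Core arithmetic of a virtualized (windowed) list: given the scroll position,
# compute the small slice of rows that actually needs to be rendered.
# The cost is proportional to the viewport, not to total_rows.

def visible_range(scroll_top: float, viewport_height: float,
                  row_height: float, total_rows: int,
                  overscan: int = 3) -> tuple[int, int]:
    """Return indices [first, last) of the rows to render.

    overscan adds a few extra rows above and below the viewport so that
    fast scrolling does not show blank gaps before the next render.
    """
    first = max(0, int(scroll_top // row_height) - overscan)
    last = min(total_rows,
               int((scroll_top + viewport_height) // row_height) + 1 + overscan)
    return first, last
```

With a 300px viewport and 20px rows, only about 19-22 rows are rendered at a time, whether the repo has 100 files or 10,000, which is why the DOM-node count stops being the bottleneck.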

fcollonval (Member) commented:

0.22 should indeed solve the trouble thanks to the virtualization of the file lists.
