NAS-130653 / 25.04 / Updated debug generate lock queue size #14301
Conversation
What will happen with the subsequent request in the UI?
Could we get a brief description of what issue this is addressing?
If we're at the limit of the queue, every subsequent request from the UI attaches to the same job. One issue with this is that the download URL given is "one time use only", so those extra calls fail. Thinking about the best way to go about this, maybe just reject subsequent calls from the UI entirely?
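The failure mode described above can be illustrated with a minimal sketch of a single-use download token. This is a hypothetical class for illustration only, not the actual middlewared implementation:

```python
class OneTimeDownloadURL:
    """Sketch of a single-use debug download URL (hypothetical class).

    When several UI requests share one debug job, only the first
    fetch of the resulting URL succeeds; the rest fail.
    """

    def __init__(self, path):
        self._path = path
        self._used = False

    def fetch(self):
        # The first caller consumes the token; later callers fail,
        # which is why extra UI requests attached to one job break.
        if self._used:
            raise RuntimeError("download URL already consumed")
        self._used = True
        return self._path
```

Under this assumption, rejecting the duplicate requests up front is cleaner than letting them attach to a job whose download link they can never use.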
So how will the UI behave if we merge these changes? If I request a debug while a debug is already being gathered, will both UI tabs just download the same file?
@aiden3c does the double click lead to the same behavior? We need to resolve this somehow. I would say we refuse this type of request on our side (need modifications in
(force-pushed from e3c464f to 88c58dc)
@yocalebo @anodos325 I found no way to double-click the "Save debug" button in the UI. It first opens a "proceed?" dialog, and even if I do double-click the "yes" button, only one job is started. If I open another tab and start the debug generation there, it displays the new error correctly. Looks like we don't need to do anything on the UI side?
time 1:30
This PR has been merged and conversations have been locked. |
Adding this limit stops people from spamming debug generation, as most of the time just one job is required.
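The behavior agreed on above (reject, rather than queue, extra debug requests while one is running) can be sketched as a lock with a bounded wait queue. The class and method names below are hypothetical, not the actual middlewared job-lock API:

```python
class BoundedLockQueue:
    """Sketch of a job lock with a capped wait queue (hypothetical API).

    With queue_size=0, a second debug-generation request is rejected
    outright while the first job holds the lock, instead of being
    queued behind it.
    """

    def __init__(self, queue_size=0):
        self.queue_size = queue_size   # callers allowed to wait behind the holder
        self.holder = None             # job currently holding the lock
        self.waiters = []              # jobs waiting their turn

    def try_enqueue(self, job_id):
        if self.holder is None:
            self.holder = job_id
            return "running"
        if len(self.waiters) >= self.queue_size:
            # Queue is full: reject instead of attaching to the running job.
            raise RuntimeError("debug generation already in progress")
        self.waiters.append(job_id)
        return "queued"

    def release(self):
        # Hand the lock to the next waiter, if any.
        self.holder = self.waiters.pop(0) if self.waiters else None
```

With this sketch, a second request raises while the first job runs, and a fresh request succeeds again once the lock is released, which matches the error observed in the second-tab test above.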