
Reject dataset uploads if organization storage quota is exceeded #6893

Merged
merged 6 commits into master from upload-check-storage-quota on Mar 27, 2023

Conversation

@fm3 fm3 commented Mar 2, 2023

Steps to test:

  • set reportUsedStorage.enabled = true for your datastore in application.conf (see the config sketch after this list). Note that if your organization is very new, you may also want to lower rescanInterval so that you don't have to wait 24 hours; the initial data sets lastScanTime to creationTime.
  • upload a dataset; it should still work
  • in the local DB, lower your organization's includedStorage field and retry; the upload should now be rejected
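
A minimal config sketch for the first step (the exact key paths and duration syntax are assumptions based on the description above; check the shipped application.conf for the real structure):

```
# Sketch only -- key names and defaults are assumptions, verify against application.conf.
reportUsedStorage.enabled = true   # enable used-storage scanning for this datastore
rescanInterval = 10 minutes        # lower than the default 24 hours so a fresh organization is rescanned soon
```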

Issues:


@fm3 fm3 added the backend label Mar 2, 2023
@fm3 fm3 self-assigned this Mar 2, 2023

fm3 commented Mar 2, 2023

@philippotto It is not simple for the backend to make the top-level error message very explanatory. Instead, the user currently gets this.
(screenshot of the error message currently shown to the user)

If I remember correctly, there are already some places where the frontend checks for specific strings in the error message and shows an additional explanation to the user. Do you think something like that could work here too?
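
For illustration, a rough TypeScript sketch of such a check (the message text, function names, and toast helper are hypothetical, not the actual webknossos frontend code):

```typescript
// Hypothetical sketch -- the error shape, message string, and showErrorToast helper are assumptions.
function handleUploadError(error: { messages?: Array<{ error?: string }> }) {
  const rawMessage = (error.messages ?? []).map((m) => m.error ?? "").join(" ");
  if (rawMessage.includes("storage quota")) {
    // Layer a more explanatory message on top of the backend's generic one.
    showErrorToast(
      "Your organization has exceeded its storage quota. " +
        "Please ask an administrator to upgrade the plan before uploading more datasets.",
    );
  } else {
    showErrorToast(rawMessage || "Dataset upload failed.");
  }
}

// Placeholder for whatever notification mechanism the frontend uses.
function showErrorToast(message: string) {
  console.error(message);
}
```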

Additionally, I noticed that the upload page gets somewhat stuck in its current state if the request fails like this. Maybe it would be possible to (at least partially) reset it in that case, but that's not super important, I guess.


fm3 commented Mar 21, 2023

@philippotto ping :) Could you estimate if this would be a lot of effort? If so, we can also merge this as is.

@philippotto

Sorry, I forgot about that! Looking into it right now :)

@philippotto

As discussed, I went with the early storage check so that the button is disabled (and an alert box is shown). Will look like this:

(screenshot of the upload page with the disabled button and the storage-quota alert box)
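
As a rough illustration of that early check (endpoint, field names, and wiring are assumptions, not the actual implementation):

```typescript
// Hypothetical sketch of an early quota check before enabling the upload form.
type Organization = {
  includedStorageBytes: number; // the quota; a non-finite or zero value is treated as "unlimited" here
  usedStorageBytes: number;     // as reported by the storage scan
};

async function isStorageQuotaExceeded(organizationName: string): Promise<boolean> {
  const response = await fetch(`/api/organizations/${organizationName}`);
  const org: Organization = await response.json();
  if (!Number.isFinite(org.includedStorageBytes) || org.includedStorageBytes <= 0) {
    return false; // no meaningful quota configured
  }
  return org.usedStorageBytes >= org.includedStorageBytes;
}

// The upload view could then disable its submit button and render an alert box:
//   const exceeded = await isStorageQuotaExceeded(activeOrganization);
//   uploadButton.disabled = exceeded;
//   if (exceeded) showAlert("Your organization's storage quota is exceeded; uploads are disabled.");
```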

@fm3 fm3 marked this pull request as ready for review March 23, 2023 10:37
@fm3 fm3 requested a review from daniel-wer March 23, 2023 10:38

@daniel-wer daniel-wer left a comment

LGTM! I couldn't get the storage reporter to run, even after adapting both application.conf keys mentioned in the PR description; the organization page always reported 0.0 GB 🤔 Still, the normal dataset upload succeeded. Also, I investigated and fixed another issue that I encountered during testing.

fm3 commented Mar 23, 2023

Huh, this is weird. Could it be that the displayed number is rounded down to 0.0? What does api/organizations contain?

@daniel-wer

> Huh, this is weird. Could it be that the displayed number is rounded down to 0.0? What does api/organizations contain?

Good point! Indeed, the response contains "usedStorageBytes": 10535441467, which is about 10 GB. If the value is displayed in TB with one digit after the decimal point, this seems to be caused by rounding, as you suspected. Sorry for the false alarm, then :)
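
For reference, the arithmetic behind the rounding (a small TypeScript sketch; the one-decimal TB formatting is an assumption about the display logic):

```typescript
const usedStorageBytes = 10535441467; // value from api/organizations

console.log((usedStorageBytes / 1e9).toFixed(1));  // "10.5" -- about 10 GB
console.log((usedStorageBytes / 1e12).toFixed(1)); // "0.0"  -- rounds down when shown in TB with one decimal
```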

@daniel-wer

I retested and everything works as expected 🎉

@philippotto

> The organization page always reported 0.0 GB

I suppose you mean 0.0 TB? Otherwise, the rounding error explained in the following comments doesn't make sense to me 😅

@fm3 fm3 merged commit 80fa509 into master Mar 27, 2023
@fm3 fm3 deleted the upload-check-storage-quota branch March 27, 2023 08:29
@daniel-wer

> I suppose you mean 0.0 TB? Otherwise, the rounding error explained in the following comments doesn't make sense to me

In fact, the organization page didn't show any unit, because the upper limit was infinity. So at that point I assumed the unit might be GB, but it turned out to be TB.

fm3 commented Mar 27, 2023

Might be worth restructuring that display to avoid this kind of confusion for future users. It might make sense to use separate units for used vs. allowed storage, but that doesn't play well with the current diagram style 🤷

hotzenklotz added a commit that referenced this pull request Apr 3, 2023
…come-toast

* 'master' of github.com:scalableminds/webknossos:
  Log all details on deleting annotation layer (#6950)
  fix typo
  Rename demo instance to wkorg instance (#6941)
  Add LOD mesh support for frontend (#6909)
  Fix layout of view mode switcher and move it (#6949)
  VaultPath no longer extends nio.Path (#6942)
  Release 23.04.0 (#6945)
  Use new zip.js version to allow zip64 uploads (#6939)
  Implement viewing sharded neuroglancer precomputed datasets (#6920)
  Reject dataset uploads if organization storage quota is exceeded (#6893)
  Refactor deprecated antd Dropdown menus (#6898)
hotzenklotz added a commit that referenced this pull request Apr 4, 2023
…wings

* 'master' of github.com:scalableminds/webknossos:
  updates docs for docker installation (#6963)
  Fix misc stuff when viewing tasks/annotations of another user (#6957)
  Remove segment from list and add undo/redo for segments (#6944)
  Log all details on deleting annotation layer (#6950)
  fix typo
  Rename demo instance to wkorg instance (#6941)
  Add LOD mesh support for frontend (#6909)
  Fix layout of view mode switcher and move it (#6949)
  VaultPath no longer extends nio.Path (#6942)
  Release 23.04.0 (#6945)
  Use new zip.js version to allow zip64 uploads (#6939)
  Implement viewing sharded neuroglancer precomputed datasets (#6920)
  Reject dataset uploads if organization storage quota is exceeded (#6893)
  Refactor deprecated antd Dropdown menus (#6898)
Successfully merging this pull request may close these issues.

Block uploading datasets when storage quota is exceeded