There’s an unpublished dataset on dataverse.harvard.edu that has 1,061 files:
https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/HVY5GR
When I try to select all files for download I get:
“Request-URI Too Long The requested URL’s length exceeds the capacity limit for this server.”
…I assume because it’s a URI like https://dataverse.harvard.edu/api/access/datafiles/3851150,3851397,3851153, and so on, with the IDs of all 1,061 files.
Tagging @qqmyers as he mentioned he has some code that handles this issue. Thanks, @qqmyers!
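For anyone hitting this in the meantime, here is a minimal client-side sketch of the obvious workaround: split the selected file IDs into batches so each GET URL stays under a conservative length budget. This is illustrative only, not Dataverse code; the 2,000-character budget is an assumption based on the browser limits discussed below, and the only API fact assumed is the `/api/access/datafiles/{id,id,...}` endpoint shape shown above.

```python
import requests

BASE = "https://dataverse.harvard.edu/api/access/datafiles/"
MAX_URL_LEN = 2000  # conservative assumed budget; see browser limits below

def chunk_ids(file_ids, base=BASE, max_len=MAX_URL_LEN):
    """Yield comma-joined ID batches whose full request URL stays under max_len."""
    batch, length = [], len(base)
    for fid in map(str, file_ids):
        extra = len(fid) + (1 if batch else 0)  # +1 for the comma separator
        if batch and length + extra > max_len:
            yield ",".join(batch)
            batch, length = [], len(base)
            extra = len(fid)
        batch.append(fid)
        length += extra
    if batch:
        yield ",".join(batch)

# Each batched request returns its files as a zip.
file_ids = [3851150, 3851397, 3851153]  # ...plus the rest of the dataset's IDs
for i, ids in enumerate(chunk_ids(file_ids)):
    resp = requests.get(BASE + ids)
    resp.raise_for_status()
    with open(f"datafiles_batch_{i}.zip", "wb") as f:
        f.write(resp.content)
```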
FYI - I think the max URL length is configurable in Apache (not sure about Glassfish/Payara/etc.), so this may be fixable without code changes.
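If Apache httpd is the front end, the relevant knob is the LimitRequestLine directive, which caps the request line (method + URL + protocol) and defaults to 8190 bytes. A sketch, with an illustrative (not recommended) value:

```
# httpd.conf (or a conf.d snippet): raise the request-line cap.
# Apache's default for LimitRequestLine is 8190 bytes.
LimitRequestLine 65536
```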
I think I looked into that, but there are browser limits as well. I believe the lowest is IE/Edge at around 2K characters (current info, AFAIK).
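A pattern that would sidestep both the server and the browser limits is to move the ID list out of the URL and into the request body. Purely a hypothetical sketch: I haven’t confirmed that this endpoint accepts POST, and the `fileIds` form field name is an assumption.

```python
import requests

# Hypothetical: assumes /api/access/datafiles also accepts the ID list as a
# POSTed form field ("fileIds" is an assumed name), which keeps the request
# line short no matter how many files are selected.
ids = ",".join(str(fid) for fid in [3851150, 3851397, 3851153])  # etc.
resp = requests.post(
    "https://dataverse.harvard.edu/api/access/datafiles",
    data={"fileIds": ids},
)
resp.raise_for_status()
```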
Commits by @qqmyers:
- a16adff: Fixed merge conflict related to request access javascript [ref #6684, #6943]
- 90ac3a6: Fixed unnecessary duplicate file download javascript [ref #6684, #6943]