get_bucket() does not work with DigitalOcean Spaces when there are more than 1000 objects (around 4000 in my case). Setting max to a value over 1000 results in duplicate objects, and setting max to Inf results in an infinite loop.
library("aws.s3")
objects <- get_bucket(
  bucket = "my-bucket",
  prefix = "pipeline",
  max = 10000
)
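
A possible workaround, shown as an untested sketch only, is to page through the bucket manually with get_bucket()'s marker argument instead of requesting everything through max. The bucket name, prefix, and the 1000-key page size mirror the example above; it assumes the Spaces endpoint and credentials are configured the same way as in the failing call, and that each listed object exposes a Key element as documented for aws.s3.

library("aws.s3")

# Sketch: fetch pages of at most 1000 keys and resume from the last key seen.
get_all_objects <- function(bucket, prefix = NULL) {
  all_objects <- list()
  marker <- NULL
  repeat {
    page <- get_bucket(
      bucket = bucket,
      prefix = prefix,
      marker = marker,
      max = 1000                 # one listing request per iteration
    )
    if (length(page) == 0) break
    all_objects <- c(all_objects, page)
    marker <- page[[length(page)]][["Key"]]  # resume after the last returned key
    if (length(page) < 1000) break           # short page means the listing is complete
  }
  all_objects
}

objects <- get_all_objects("my-bucket", prefix = "pipeline")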
## session info for your system
R version 4.0.2 (2020-06-22)
Platform: x86_64-apple-darwin17.0 (64-bit)
Running under: macOS 10.16

Matrix products: default
LAPACK: /Library/Frameworks/R.framework/Versions/4.0/Resources/lib/libRlapack.dylib

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
[1] aws.s3_0.3.22

loaded via a namespace (and not attached):
[1] httr_1.4.2          compiler_4.0.2      R6_2.5.0            tools_4.0.2
[5] base64enc_0.1-3     curl_4.3            aws.signature_0.6.0 xml2_1.3.2
[9] digest_0.6.27