collect all maps at one URL? #335
How about here? I emailed one of the people in charge of it a while back and he said we'd be a pretty easy sell.
yeah that looks great to me. they have a deadline coming up on 12/31. Should we go for it?
Started work on a document that fills out some of the fields that require a paragraph form response here: https://docs.google.com/document/d/1JVUDVb39jKEoX80Hjpl7XEtN6j6rWuwylE9K3s0cQeY/edit?usp=sharing
Application here: https://application.opendata.aws/
I'll try to fill it out more tomorrow
I'm checking out for the holidays today, so I won't have time to do this. Very happy for others to follow it up though.
I can handle most of this but I think it makes sense for either @andrewkern or @jeromekelleher to be the actual corresponding individual. What do you think of the text so far @andrewkern?
happy to work on it today with you @ndukler
I think I've filled out the entire form on the googledoc.
Any idea what the data-set license should be? @andrewkern @petrelharp?
this is all not our data so unclear to me? the only thing with possible restrictions I imagine is deCODE
They seem to provide it all here (https://www.decode.com/addendum/) with no limitations?
great so let's proceed with a completely open license
@andrewkern it's complete if you'd like to make some suggestions on the googledoc. Otherwise I'm ready to submit it (under your name).
@ndukler this looks great! i've made a pass with a few quick edits. If you can submit it for me that would be amazing!!
So I submitted it a while ago. Did you get an email?
nope
Hmm, I'll resubmit
Actually I'm gonna hold off for a few days. Tell me if they email you @andrewkern. Not sure how they'd react to two submissions. I gave them your U. Oregon email.
okay Noah thanks
I just re-submitted the form. I meant to do it before the deadline but it got lost in the shuffle. Next deadline is March 31st if the last one didn't go through. Apologies for the screw-up.
thanks @ndukler
i'm wondering if another option might be to host the maps on zenodo and then use their helper to download: https://zenodo.org/record/1261813#.Xhe6-OsnY7w
Good idea to use Zenodo, I guess the sizes of the maps are small enough to fit there. I had a quick look at the tool for downloading, and I don't think it'll help. It's only a CLI and not a Python API, and it's not as robust as what we have now, IMO. Any idea on when we'll hear about the Amazon thing @ndukler?
Depends on which submission went through. But no specific idea.
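The "plain download instead of the Zenodo CLI helper" idea above could be sketched roughly like this. This is only an illustration: the record id and filename are placeholders, and the URL pattern is an assumption based on Zenodo's public file-download links, not something confirmed in the thread.

```python
# Hypothetical sketch: fetching a file attached to a Zenodo record over
# plain HTTPS, with no Zenodo-specific tooling. The record id, filename,
# and URL pattern below are assumptions for illustration only.
from urllib.request import urlretrieve

def zenodo_file_url(record_id: int, filename: str) -> str:
    """Return an assumed direct-download URL for one file in a Zenodo record."""
    return f"https://zenodo.org/record/{record_id}/files/{filename}?download=1"

def download_map(record_id: int, filename: str, dest: str) -> None:
    """Fetch a single map file over plain HTTPS (standard library only)."""
    urlretrieve(zenodo_file_url(record_id, filename), dest)

# Example (placeholder names, not a real stdpopsim file):
# download_map(1261813, "some_genetic_map.tar.gz", "some_genetic_map.tar.gz")
```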
Sorry for being a bit slow to respond these days; I've got some other deadlines, so I'll have to put the popsim stuff on hold for a month or so unless there's something small that I have time for on a weekend.
just heard from Amazon-- we are in. Will forward the email to you two.
What's the status here @andrewkern/@ndukler? Can we upload some maps to AWS and change our code?
Sure, if I get the credentials for the AWS service I can look into uploading all of our resources by the end of the week
just sent you two an email... i think i blew it and we missed the deadline to upload!!
Hey @ndukler, @andrewkern, can either of you give an update about this? @apragsdale needs to upload the Pongo map files somewhere so we can merge #363.
Just to bump this again, is there any news about recombination map storage? Otherwise, would I be able to get some maps uploaded to their current location for the time being?
Apologies for missing the earlier message. It is in fact all up, and we should work on transitioning the code over to use the new storage. To see the documentation, go here and search for stdpopsim. That said, I think @andrewkern would have to upload the new data, although I could do it if he hasn't changed the credentials.
Ok, I still have access and can add it in if you send me the file. I'll set aside tomorrow for working on this and the other stdpopsim stuff I've dropped.
@jeromekelleher Do you just want the URLs or do we want to use the boto package for handling downloads from the S3 bucket?
Awesome, thanks @ndukler! They aren't too large, only about 200 MB combined. I'll put them in a Dropbox folder I can share with you, if that works. Really appreciate it, and let me know if I can help in any other way with it.
I'd prefer to do plain https downloads if we can. Let me know when you've uploaded the files and I'll take a look to see what we can do. (Better to minimise the number of dependencies, if we can.)
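The "plain https, minimal dependencies" preference stated here can be sketched with only the Python standard library: fetch the file and verify it against a known checksum before trusting it. The URL and digest below are placeholders, not the project's actual bucket or hashes.

```python
# Minimal sketch of a dependency-free HTTPS download with checksum
# verification, assuming the maps are served from a plain HTTPS URL.
# The example URL and expected digest are placeholders.
import hashlib
from urllib.request import urlopen

def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def fetch_verified(url: str, expected_sha256: str) -> bytes:
    """Download `url` over HTTPS and fail loudly on a checksum mismatch."""
    with urlopen(url) as resp:
        data = resp.read()
    digest = sha256_of(data)
    if digest != expected_sha256:
        raise ValueError(f"checksum mismatch for {url}: got {digest}")
    return data

# Example (placeholder URL and digest):
# data = fetch_verified(
#     "https://example-bucket.s3.amazonaws.com/maps/pongo_map.tar.gz",
#     "<expected-sha256-here>",
# )
```

Pinning a checksum alongside each map URL also guards against a file being silently replaced in the bucket.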
@ndukler do you need me to do uploads? i can do this today
I can do it @andrewkern :)
Is this all sorted now, so it can be closed?
I think we can. We need some documentation on how this process actually works though - @ndukler, could you update the dev docs with a short section on how someone goes about uploading these maps? I think our bus factor is quite low on this one at the moment!
Uh, right now the process is: send the maps to Andy or me, and we click buttons and drag and drop into AWS. The problem is that this requires private credentials, so it's not really something the general developer base can do. Also, on the note of updating the dev docs, I think we also need to update the QC section to remove the bits about the CLI and add a bit about the automated testing.
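The manual drag-and-drop upload described here could in principle be scripted. This is a hedged sketch only: the bucket name and key prefix are invented placeholders, and actually running the upload requires boto3 plus valid AWS credentials, which is exactly the access limitation noted above.

```python
# Hypothetical sketch of scripting the map upload instead of using the
# AWS console. Bucket name and key prefix are placeholders; the upload
# itself needs boto3 and private credentials.
import os

def s3_key_for(local_path: str, prefix: str = "genetic_maps") -> str:
    """Map a local map file to its (hypothetical) S3 object key."""
    return f"{prefix}/{os.path.basename(local_path)}"

def upload_maps(paths, bucket="example-stdpopsim-bucket"):
    """Upload each map file to S3; requires boto3 and AWS credentials."""
    import boto3  # imported here so the key helper works without boto3
    s3 = boto3.client("s3")
    for path in paths:
        s3.upload_file(path, bucket, s3_key_for(path))

# Example (placeholder file and bucket):
# upload_maps(["PonAbe_genetic_map.tar.gz"])
```

Keeping the key-naming logic in a plain function means the layout can be reviewed and tested without anyone holding the credentials.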
this is less than ideal...
We can at least say that "Andy Kern needs to do this"?
+1 on what Peter says - it doesn't have to be ideal, we just need to document what we actually do.
Yes, good call - can you open an issue to track this please?
This was updated as part of #513. If there's anything still outstanding, please open a new issue.
Closing this as it's all been taken care of now, except the docs, which have their own ticket over at #517.
i think it's time to think about a more permanent home for all of our maps. does anyone have any good suggestions as to where we might home these data?