
Affiliated Package: Consolidate with pyOpenSci? #334

Closed
1 of 6 tasks
pllim opened this issue Mar 17, 2023 · 29 comments · Fixed by astropy/astropy-APEs#87


pllim commented Mar 17, 2023

pyOpenSci started https://github.com/pyOpenSci/software-submission and they have a partnership with JOSS. 👀

JOSS requires a paper, but it can be very short (e.g., two paragraphs). Passing pyOpenSci is similar enough to JOSS that the only additional requirement would be the paper.

Updates from Astropy Coordination Meeting 2023

  • Form a Working Group to work on this. This process will start with CoCo and is already on CoCo agenda.
  • Reach out to both pyOpenSci and JOSS, as the latter might have reviewers that we can tap into (i.e., specialized in the astronomy domain) that pyOpenSci might not have.
  • Identify the additional criteria on top of JOSS/pyOpenSci; make sure that complying with Astropy criteria does not trigger a re-review by JOSS/pyOpenSci, i.e., orthogonal criteria.
  • Ensure we still keep our promise to "foster a sense of community" with this new process, to keep our promise "from the start."
  • Ensure we have a way for packages to maintain affiliation after acceptance.
    • Would be nice to automate this. We should open an issue about it once we have a more concrete path forward.
  • Ensure we have a transition plan for those packages already affiliated and those still under review. We have to be willing to assist them through this process.
pllim added the NumFOCUS label Mar 17, 2023

kelle commented Mar 17, 2023

Nice - great to see more people setting up infrastructure for code review and publication!

How would you envision consolidating with the pyOpenSci process?

While much of our review criteria are similar, Astropy Affiliated packages also have to play nice with Astropy so we have an extra "requirement". How would that work if we consolidated?


pllim commented Mar 17, 2023

Not sure; we probably cannot just "jump over". If people think this is worth pursuing, we likely have to form a subcommittee of some sort, led by the current Affiliated Package Editors (@dhomeier and @WilliamJamieson at the time of writing), to open a conversation with pyOpenSci people in similar roles.

@pllim

This comment was marked as resolved.


pllim commented May 7, 2023

Update: I updated the original post above with notes from Astropy Coordination Meeting 2023. Please let me know if I missed anything. Thanks!


lwasser commented May 8, 2023

Good morning all! 👋 I am developing the pyOpenSci community and structure! i am going to chat with @pllim a bit more about this, but i wanted to introduce myself here and mention a bit about some of the processes that might support your goals! A few notes:

  1. Have a look at our pangeo partnership page here. We developed a partnership where pangeo-affiliated packages can submit to us, and we review against both our broad community guidelines for modern packaging best practices and specific guidelines that need to be implemented to be a "pangeo" package. I am not sure what your checks would be for astropy, but this is a point we can discuss further.
  2. Finding reviewers from the astronomy community shouldn't be too difficult - especially if you help us promote the effort. we often put out calls for particular areas and normally get good responses. In a partnership situation it is also really important to have an editor who understands the community (who can at least guide part of the review), so that is something i think we'd want to talk about as well!
  3. In terms of JOSS, there is not much additional effort needed. JOSS accepts our review as theirs. As such, you can be reviewed by us and become part of our ecosystem and, if we partner, the Astropy ecosystem. The benefit of our reviews is that they are highly Python-specific and focused on best practices, following our work with PyPA and others in the broader ecosystem. We also focus on long-term maintenance. The last step in the process, if someone wants a JOSS citation as well, is to have a short paper reviewed; if the package is in scope for JOSS (JOSS doesn't accept some things such as API wrappers), it will get a JOSS CrossRef DOI. it's a win-win.
  4. We have a package listing that could be used to tag packages as astropy so users could see that we support astropy and find affiliated packages there plus wherever you list them! Pangeo isn't there yet only because we only have 1 package so far!

Packaging best practices

We have been interfacing with both the scientific and broader community, including PyPA, to develop a packaging guide. we plan to keep this current, and we use these recommendations to help move authors toward more modern standards. But of course we never push too hard on that front, recognizing that authors have limited time. We just want to help and encourage authors to move toward better approaches when possible.

I hope that helps. i'm happy to answer any questions as well and look forward to chatting more with you @pllim later this week!


pllim commented May 11, 2023

High level summary from meeting up with @lwasser on 2023-05-11 (thank you, Leah!) as follows.

General notes

  • pyOpenSci is funded "for the next 2 years" and it is also actively pursuing more funding. It has one full-time staff, an intern, and a community of volunteers (see https://www.pyopensci.org/our-community/).
  • Its funding goal includes collaborating with an existing project, which would make us a good candidate, especially given we already have Affiliated program and organizational support.
  • Its review procedures are well documented at https://www.pyopensci.org/software-peer-review/about/intro.html .
  • Its Editor in Chief does a first pass of applications and immediately rejects those that do not meet the basic criteria. Packages that pass move on to the next phase.
  • Like us, they are pondering how to re-review accepted packages. We both agree that some automation is necessary and pyOpenSci hopes to work with Scientific Python on that front (e.g., devstats).
  • Even if Astropy decided to not be part of pyOpenSci, perhaps we could still share some common underlying infrastructure.
  • Leah and I will keep the communication channel open between pyOpenSci and Astropy.

A major difference

Astropy: Reviewers are anonymous by default (though they can choose not to be). Reviews go through an intermediary, one of our two Editors.

pyOpenSci: Reviewers are all public as the whole review process is public. They are unable to make reviewers anonymous as that would conflict with their philosophy.

Question for Astropy: Is this a deal breaker?

Future work

If we decide to proceed...

  1. Hammer out an initial review process that is pyOpenSci + Astropy.
  2. How about a trial period where a couple of packages currently under review go through the pyOpenSci + Astropy process? Are those packages okay with being guinea pigs?
  3. Formulate a transition plan for the following Affiliated categories:
    a. Those accepted a long time ago (let's arbitrarily say 2 years or more?). I think these can go through the process as new applications, since they are up for re-review anyway (if we can ever get to the re-review process).
    b. Those accepted not long ago (let's arbitrarily say less than 2 years?). These need more thought: maintainers might be upset at having to go through the process all over again, having just gone through it.
    c. Those still under review. Perhaps we can find out more how to handle these based on our experiences from the trial period in Step 2 above.
  4. Both sides need to document the new process and communicate it out clearly to respective communities.
  5. Execute the transition plan formulated in Step 3 above.
  6. ???
  7. Profit!!!

astropy-dev

If you want to discuss on astropy-dev mailing list: https://groups.google.com/g/astropy-dev/c/gD-Aw4g_eiQ


Cadair commented May 11, 2023

I personally find it weird that the astropy review process is private anyway, so +1 to open.


eteq commented May 11, 2023

FWIW, I don't think the anonymity thing is a deal-breaker. I think there are some tradeoffs associated with seniority and retaliation and the like, but sometimes it cuts both ways.

Expanding a bit more on that: the advantage of anonymity is that it protects someone who's more junior, or in a position where they might in some way face retaliation (even in subtle ways like "the next time that person sees me at a conference they don't invite me to dinner"). But it also allows people to use the shield of anonymity for bad behavior. IMHO, in Astropy anonymity is better than not, because we have a specific set of reviewers from inside the community with at least some understanding that we want to better the project, whereas any random affiliated package author might or might not share that feeling.

So I think anonymity might be advantageous to keep, but if dropping it is the price we have to pay to collaborate with pyOpenSci, it might be worth it if they can lower our infrastructure and support costs.

That said, I will highlight something @adrn mentioned at the coordination meeting: it's not clear to me that our problem is infrastructure, as opposed to finding people to actually do the reviews. pyOpenSci might not be able to really help with that. But if they are thinking about re-review and have some energy to do that too, then it seems like a good win! Either way, I completely agree we should keep lines of communication open, so thanks for doing that @pllim!


eteq commented May 11, 2023

I guess that raises a question for @lwasser 👋 : can you imagine anonymity being an option at some point in the future for pyOpenSci? I'm not sure it's worth the extra cost even given what I said above, but maybe you already have an opinion here?


pllim commented May 11, 2023

@eteq , even if it does not help with increasing the pool of reviewers (which I would dispute if collaboration also means access to JOSS reviewers, though I am not sure about that), it would help exposure for the Affiliated packages (advertised to the wider scientific Python community rather than just being listed on our own website). But that is my personal opinion.


hamogu commented May 11, 2023

I think when the current editors said "finding reviewers is a problem", they did not mean "we ask a lot of reviewers and they all turn us down and then we don't know whom to ask any longer"; instead they meant "we as editors don't know enough about subfield X to know which person to ask as an appropriate reviewer in the first place".
That's not just a guess, but is based on (a) my own experience as a previous affiliated package editor and (b) a look at the current table of packages to be reviewed that I discussed with one of the current editors at the last Coordination meeting.

That means it's not primarily about increasing the pool of reviewers (though that won't hurt, of course); it's more about having a larger team of editors that covers a broader range of expertise and thus has more knowledge of different sub-fields of astronomy than one or two people can possibly have.

A collaboration with pyOpenSci (and indirectly JOSS) is neither good nor bad for that problem in itself; we could solve it either by increasing the number of our own affiliated package editors, e.g. from 2 to 4, or by collaborating with pyOpenSci and contributing a few people to an existing team of editors.


hamogu commented May 11, 2023

Like @eteq, I realize that there is a reason for having anonymity (at least as an option), but I have a preference for open reviews where possible, and in practice:

  • we don't have many reviewers who are junior (it's less likely to be a problem for a senior reviewer)
  • We currently combine the review with a review by the editor, and a review where multiple reviewers agree is less likely to be attacked by an ungrateful submitter. We could suggest to pyOpenSci to automatically request a second review in case the first review is negative, or to allow the reviewer to ask for a second reviewer who will co-review for a common report signed by both. (See above; in my experience, reviewers usually are willing to do the review when asked and not conflicted.) Some details remain to be worked out, but what I'm saying is that we can probably find some minor tweak of the existing procedures to support reviewers who feel they might be at risk of retaliation.


lwasser commented May 11, 2023

hi everyone! 👋

and hi @eteq !! nice to connect again here. i think we met many moons ago at scipy when we did an early BoF there.

I guess that raises a question for @lwasser 👋 : can you imagine having anonymity being an option or something in the future for pyOpenSci? I'm not sure it's worth it for the extra cost even given what I said above, but maybe you have an opinion already here?

Right now it's very difficult to imagine anonymous reviews. Perhaps we can chat a bit more about this so i can better understand.

Our review model is not adversarial; it's objective and supportive. It is about supporting maintainers and providing constructive feedback on package usability, structure, technical elements, etc. I mentioned to @pllim that rejections are rare, and when a package is rejected it usually happens in the initial editor-in-chief checks. There are a few circumstances for rejection, including out-of-scope submissions and/or unusual / complex / non-standard infrastructure that we know will be difficult to support and maintain if the maintainer steps down.

in our reviews, sometimes reviewers open pull requests. this most recent review might be a good one to look at to see how our review is a conversation with a goal of improving package infrastructure, usability, etc. An adversarial reviewer or author would violate our code of conduct and our values around peer review. Maintainers work hard enough already to develop and maintain their tools. we want to help them and slowly improve / enhance the ecosystem in the process. A "good" reviewer has a goal of helping to improve the package and pointing out issues.

We'd be happy to add additional language to our CoC (which authors agree to prior to a review) around adversarial review environments and repercussions for retaliation (being removed from the ecosystem?). As of now they do have to agree to our CoC.

Junior level reviewers

We embrace reviewers of all career stages and will even support them through our mentorship program if they are new to review. i'd NEVER want to see someone become fearful of their career trajectory being impacted because of a review.

Questions for the group here

  • Can you tell me more about your review process? Do you normally have two reviewers? and do those reviewers have packaging standards that they review against?
    I find that it's nice to have a reviewer with domain-specific knowledge and also one with outside eyes who can look at things like:
    • are the docs clear and accessible, and
    • is the packaging infrastructure modern and in line with standards, etc.

Sometimes we have one very technical reviewer and one more focused on usability. that can vary.

  • Would the retaliation come from disagreement around a specific workflow in a package that someone developed that another person may not approve of? We believe that review should be objective. As such the feedback should only seek to improve the package.

Second reviews of packages

  • When you say "re-review", what does that look like for you? We do not implement a full second review of packages. JOSS won't accept a second round of review unless there are SIGNIFICANT changes to a package's API / functionality that warrant publication.

What we do care about is long(er) term maintenance of tools. As such we do plan to develop processes to flag packages that become unmaintained over time. We do want to support finding new maintainers if we can.

Reviewer pool

@pllim i'm wondering if we can discuss this more in person in seattle? We do not "access" JOSS reviewers per se. Normally, to find reviewers, we reach out to the community to find people who have domain knowledge and packaging knowledge. It could be that someone who reviews for JOSS might also review for us! I need to know a bit more about the challenges you've all experienced finding reviewers.

Things that we offer in terms of peer review

  1. We are developing a community-driven packaging guide that defines general standards, recommendations, and guidelines for packaging. We work with folks from core Python, PyPA, and others to develop this. Authors get feedback on packaging even in the initial editor checks. And we hope it over time helps to unify / clarify packaging approaches in the ecosystem.
  2. We have a lens beyond a single domain specific ecosystem. This allows us to identify package functionality overlap if it exists beyond our ecosystem.
  3. We want to serve as a supportive bridge for communities like yours while encouraging standardized packaging approaches.
  4. We have reach and visibility across all domains.

I hope that is helpful. please let me know if you have other questions! and @pllim i'm super excited to talk IRL !!


hamogu commented May 16, 2023

Long post with details in subsections below.

Bottom line: I personally think we (Astropy) should merge our affiliated package review process with pyOpenSci.
I've looked at the extra pangeo guidelines and I think that model would work very well for us; I'm convinced that what we do and what pyOpenSci is doing are closer to each other than we think. We both get more visibility by merging this, plus the added benefit of the existing JOSS collaboration.

In practice, I'm OK with open (non-anonymous) reviews - we do that for PRs, so why not for affiliated packages?

If we go this route, we would simply set a cut-off date and say "after date X, submit through pyOpenSci and follow their process". The main open question is what to do about packages that are already affiliated with astropy. It does not seem fair to force all of them to go through the full review process again, particularly those that have been recently reviewed. On the other hand, I feel it's hard to set a cut-off and say "packages accepted within the last two years are OK, all others start from scratch".
@lwasser: Would pyOpenSci be open to accepting our old reviews as yours and simply taking all of them (current list here: https://www.astropy.org/affiliated/index.html)? Some of them were accepted years ago and probably don't conform to what is best practice today, and some might be essentially unmaintained right now.

Is retaliation a problem in practice?

I think that "retaliation" is more of an abstract concern translated from what people are used to on the science side, and it should not distract us from what we are trying to accomplish here. It's not something that has ever come up, and we do have community guidelines and codes of conduct that forbid retaliation. Whether they could be enforced (say, I don't hire a post-doc at my institute because I did not like their review three years before) is a different matter. We, in Astropy, also try to be helpful, and I know of cases where reviewers have opened PRs (without revealing that they were the reviewer) or have explicitly pointed to templates or examples for, e.g., specific fixes to documentation.

Number of reviewers in our current process

In our current process, we combine two reviews: the "reviewer", who is normally chosen to be a subject matter expert (e.g. someone who works on galaxy evolution for a package that models galaxies), and the "editor", who usually looks more at the technical side (CI, docs, license files). The intention was to ensure that accepted packages not only follow best practices for packaging etc., but also represent the accepted state of the art in the field, e.g. don't rely on star formation theories that were disproven decades ago. Professional astronomers typically get a certain number of "I can prove Einstein is wrong / only God created the universe / the elites control our thinking / ..., but the scientific establishment ignores me" emails/messages/submissions per year, and we wanted to make sure none of those sneaks into an accepted package. We also wanted to at least carefully screen things that are true science but so experimental, so new, and so specific to one particular paper that there is little potential for reuse and high potential that the theory they are based on turns out to be wrong very soon. Again, in practice, I don't think we've had any submissions of those types, but at least in my mind, that's why we ask for a review from a subject matter expert.

re-review

We've always intended to do re-reviews, but never have. What we mean by that is that packages are checked again after a few years and might be removed from the affiliated list at that time if they have fallen behind the times (e.g., a package might have been accepted when Python 2.6 was out and maybe wasn't updated since) or if they become unmaintained. It's OK to be stable, but if there are no commits at all for the last few years and it's broken with current Python versions, we would not want to recommend it as an affiliated package any longer.

This sounds very similar to "we do plan to develop processes to flag packages that become unmaintained over time", so maybe we can develop that process together.

Some background on "expectation for peer review in the astronomical community" for context of the previous posts:

Most science journals in our field have reviewers anonymous by default; some now push for a double-blind standard where the authors are also anonymous to the reviewer and only known to the editor. Similarly, NASA proposals (and many other organizations in astronomy) have always used anonymous reviewers and NASA policy is now requiring all missions to move to a double-blind process, where the name of the proposer is revealed to the reviewers only after the proposal has been ranked.

Obviously, that's not going to work for a review of code that's on github with signed commits, but I think that some of (at least my) reluctance comes from the fact that we've been trained for the last few years that double-blind is the new standard and is better for inclusivity in proposal and paper reviewing (and there is data to back that up). On the other hand, making all reviews public moves in exactly the opposite direction.


lwasser commented May 17, 2023

hey there @hamogu ! phew, there is a lot to digest! let me give this a go :) but what might be nice is for us all to have a chat at some point, and then we can take notes and share them here??

i also think there are enough details here that a google doc would be helpful to sort through so that decisions are recorded in one place. i have started one and will share it once i've added a bit more to it (it's empty now :) )

How would the transition work for already reviewed and accepted astropy affiliated packages?

@lwasser: Would pyOpenSci be open to accept our old review as yours and simply take all of them (current list here: https://www.astropy.org/affiliated/index.html)? Some of them have been accepted years ago and probably don't conform to what is best practice today, and some might be essentially unmaintained right now.

i think it would be difficult for us to just accept packages by default. it would also be unfair to those who spent time going through review with us. a colleague of ours, @cmarmo, actually discussed this a bit yesterday. BUT here is what i propose as an alternative:

  1. we pull together a small team that can sort through packages that are already accepted
  2. we identify which of those are no longer being maintained reaching out to maintainers as needed
  3. we feel out whether any of those packages might be open to a review by us. i did speak briefly with the maintainers of poliastro at one point, which seems to be affiliated with astropy. they have an open issue about a pyos review.

Based on the above we see how many packages would need to be reviewed, how maintainers feel about that and go from there. If you had intended to do a second review at some point anyway and you have packages that were reviewed many years ago a new review might be good regardless.

Another idea: what if we have a list of affiliated packages somewhere that are not pyos-reviewed (yet)? maybe that list lives on the astropy website, and we slowly review packages and they move over to our listing. my vision for our website would be to add an "astropy" filter to our package page.
[Screenshot: mockup of the pyOpenSci package listing page, 2023-05-17]

Transparency in reviews

Obviously, that's not going to work for a review of code that's on github with signed commits, but I think that some of (at least my) reluctance comes from the fact that we've been trained for the last few years that double-blind is the new standard and is better for inclusivity in proposal and paper reviewing (and there is data to back that up). On the other hand, making all reviews public moves in exactly the opposite direction.

i hear you. I think a LOT of scientific journals follow this policy / approach. in fact, every review i'd ever been through has been blind, and i only interfaced with the editor. But maybe you can consider what your goals are.

I think that the open source community invests a lot of (often unpaid) time in tools that other people use. Open source work is also very much undervalued, particularly in academia. So often someone who is passionate about open work is forced to both publish traditional papers AND continue their work for the good of the community without any recognition. This core issue in academia is precisely what journals like JOSS were designed to address: support for maintainers, rather than too much additional work to fit into that specific academic publication model.

pyOpenSci takes this concept a step further. We want to not only provide a useful, supportive review that improves the quality and potential sustainability of a package; we also want people to know about the tool AND we want to support the maintainer as questions come up about packaging, best practices, etc. we want to create and support community. as such, i think there is great value in an open review where the goal is to improve the package and support the maintainer while also evaluating the package's utility. And hopefully that support will help authors over time as they maintain the tool. And ideally, over time, if they need to move on, we can help find a new maintainer team to keep the tool going.

i have a lot of thoughts here but i'd love us to move away from that traditional academic model in the open source space. It does not feel like a healthy fit for open source maintainers. I do understand it on some levels for academic papers. But do we really need it for code given we are all working openly? That is just my opinion :) but one i feel strongly about.

Tracking maintenance

it would be wonderful to work together a bit on this item. The other day i was introduced to a very cool tool created by the scikit-build dev, Henry. I spoke with him about how we could potentially modify it to automate both our early editor-in-chief checks and our ongoing maintenance checks. We also hope to create some dashboards for our packages using work scientific python is doing to track activity. We could implement all of this with a bot that we are planning to install via a collab with arfon from JOSS and his work on whedon, combined with perhaps a cron job that runs a report / updates a dashboard for us.

Finally, i'd like to maintain a list of maintainer emails (we have a survey that maintainers fill out as a part of the review). this will allow us to send out periodic emails, maybe once a year, to check in on maintenance status and any needs people have. In short, this is something i've been thinking about for a while. i feel confident it is something that we can track over time, and then we either sunset packages as we need to / find new maintainers etc. My next funding pitch will likely include some of this infrastructure work; the whedon bot work is planned for this current funding cycle.

anyway i have a lot of ideas here and we just need to think about use cases and implement this.
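To make the maintenance-tracking idea above concrete, here is a minimal sketch of the kind of staleness check such a cron job could run. The function name, the 2-year cutoff, and the fixed dates are purely illustrative assumptions, not anything pyOpenSci or Astropy has actually built:

```python
from datetime import datetime, timedelta, timezone

# Arbitrary illustrative cutoff: flag packages with no commits in ~2 years.
STALE_AFTER = timedelta(days=730)

def is_possibly_unmaintained(last_commit, now=None):
    """Return True if the most recent commit is older than the cutoff."""
    now = now or datetime.now(timezone.utc)
    return (now - last_commit) > STALE_AFTER

# In a real cron job, last_commit would come from the GitHub API
# (e.g. GET /repos/{owner}/{repo}/commits?per_page=1) or a devstats-style
# dashboard; fixed dates are used here purely for illustration.
now = datetime(2023, 5, 17, tzinfo=timezone.utc)
print(is_possibly_unmaintained(datetime(2019, 1, 1, tzinfo=timezone.utc), now))  # True
print(is_possibly_unmaintained(datetime(2023, 1, 1, tzinfo=timezone.utc), now))  # False
```

A report generated this way would only flag candidates; as discussed above, a human would still reach out to maintainers before sunsetting anything.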

summary

i'd really like to see some changes in the academic model. review should be constructive, and reviewers shouldn't be fearful of providing constructive feedback. I also believe those participating in reviews should come from diverse backgrounds, so we want a mix of early vs. late career, varying gender identities, backgrounds, etc. Feedback shouldn't be opinion-driven or personal; it should be useful and helpful. And it should be a mix of usability (documentation), domain-specific feedback, and technical packaging feedback. (this is why we have editor-in-chief checks, an editor, and 2 reviewers for each review.)

i'd really like to see us support the open source maintainers that are already doing more for the community.

i suspect based on everything that i've read here that we could work together productively producing a win-win situation.

  • Perhaps some of your editors join our editorial board.
  • We also have an editor who is already familiar with astropy who can pitch in!

Things we offer:

  • Cross-community visibility for packages (and astropy would help us here as well), win/win!
  • We have access to reviewers with expertise in modern packaging best practices, but could also tap your community for domain-specific reviewers.
  • We are working to bridge standards in our packaging guide with work being done by PyPA and others.
  • Reviewers and editors get credit and visibility (if they want) for participating as pyos community members.
  • Maintainers now get access to JOSS acceptance via our review.
  • We can track maintenance over time and support finding new maintainers or sunsetting packages if need be.

i hope that is helpful. i'll try to pull together some of this into a more easy-to-digest google document over the next week. And perhaps @pllim we can chat more and work on this document together next week (if everyone here is ok with it)?


pllim commented May 17, 2023

Definitely, I would be happy to chat more next week. I also love the plan for automation and resource sharing.

A follow-up meeting is a great idea, and I think this time it should include at the very least @astropy/coordinators (or a subset of them) and our current Editors (@dhomeier and @WilliamJamieson). Whether we can find a good time, though, is another matter... 😬


lwasser commented May 17, 2023

sounds great. also, please, if y'all have other questions / concerns, i'm happy to do my best to answer them. we just want to support the scientific community however we can as a centralized community resource.

@namurphy

Just stopping by from a Python in Heliophysics Community (PyHC) meeting, where @lwasser just gave a fantastic talk on pyOpenSci. I just wanted to mention that there's a lot of interest among PyHC members in joining pyOpenSci, in particular because of how closely related the heliopythoniverse is to the astropythoniverse.

Also of interest to @jibarnum and @sapols.

Thanks!

@pllim pllim pinned this issue May 19, 2023
@pllim
Member Author

pllim commented May 19, 2023

Since this is in active discussion by multiple groups, I pinned this issue for higher visibility. FYI.

@pllim
Member Author

pllim commented May 24, 2023

I spoke with @lwasser at length today at the Scientific Python Developers Summit 2023. A lot was discussed. Here are two concrete follow-up action items:

  • I have set up a "when2meet" for those interested to meet with Leah (I plan to attend as well): https://www.when2meet.com/?20166207-Qjbb4 (Please sign up if interested!)
  • Leah will clean up the notes from our discussions and share them at a later point.

@lwasser

lwasser commented May 26, 2023

Hi Colleagues 👋
@pllim and I worked on a google document today that summarizes all of the things we've discussed. Everyone here should have comment access to the document, which means you can suggest changes and make comments / ask questions. it would be wonderful to have y'all review this prior to our check-in if that is possible! ✨ i'm hoping that this summary will also be helpful to other communities we are talking to that have similar goals.

Please don't hesitate to reach out with any questions.

Looking forward to chatting more about this potential partnership.

@eteq
Member

eteq commented Jun 15, 2023

As a quick informational item triggered by some out-of-band discussion between @lwasser and others in this thread, I wanted to leave this link here: https://github.com/astropy/astropy-project/blob/main/affiliated/affiliated_package_review_guidelines.md, which has the astropy affiliated package review guidelines for comparison with the pyOpenSci standards.

@lwasser

lwasser commented Jun 15, 2023

thank you @eteq !! i am going to work on a draft pr that attempts to identify the astropy-specific standards that would layer on top of the pyOpenSci standards. Once I have a pr open, I will leave a link here to ensure everyone sees it. i'll likely work on this in the next 2 weeks. I will also work on a landing page mockup for astropy for our website.

It was wonderful to talk to astropy folks today and to get to know the community a bit better. I look forward to moving forward together collaboratively. More soon!

@pllim
Member Author

pllim commented Jun 17, 2023

Meeting notes from 2023-06-15

Attendees: @pllim, @lwasser, @WilliamJamieson, @eteq, @hamogu, @dhomeier

✔️ Yes, we should move forward with this collaboration!

Q: Who owns this process?
A: pyOpenSci already has an ongoing process and it is a community-driven one. It is compatible with Astropy but it is up to the Project to decide if a package applied to pyOpenSci also qualifies as Astropy Affiliated.

Q: Will this new process take away Astropy's community building capability?
A: No. pyOpenSci is providing the infrastructure; it has no intention of taking anything away from Astropy. The Project still has the decision-making power when it comes to which package qualifies to be Astropy Affiliated.

Q: How are the discussion between pyOpenSci and other groups going?
A: Discussions are still ongoing, and other groups seem to want the same things as Astropy with respect to collaboration with pyOpenSci.

Q: Can we talk about this more during Scipy 2023 conference?
A: Absolutely! Those attending could meet and discuss over lunch, etc.

Q: Can pyOpenSci help with expanding the reviewer pool or do we have to also reach out to JOSS?
A: Leah said finding reviewers is always hard, even on the JOSS side; however, things have always worked out one way or another. Moritz agreed, based on Astropy's own experience.

Action Items

Blocked for now

@lwasser

lwasser commented Jul 5, 2023

hi everyone! i am finally focused on the TODO's that were mentioned above.

To begin, please find a pr here that creates a draft partner page akin to what we created with pangeo here.

I wanted to be transparent in my process of developing the text. Thus, in that pr you'll see a link to the google document where i took all of the affiliated package requirements that i could find and tried to merge them with our existing requirements. things like documentation will be addressed in our initial editor-in-chief checks, for instance.

I welcome any and all feedback.

the next step is to work a bit more on the astropy landing page for @eteq and others to review! NOTE: this is going to require a discussion around what metrics make the most sense for ALL packages to track package health over time. As such, we might have a few iterations on the design of this page.

the initial version will be based on what i already have developed for the website. then we will get funding (i'm already talking with funders!) to build out some more infrastructure around this specific dashboard, leveraging as much as we can from work being done at Scientific Python, which already has some funding for their devstats work 😄

reach out or respond here with any general questions. Please leave comments on the requirements in the open PR so we can iterate there.

@lwasser

lwasser commented Jul 5, 2023

One more point - HERE is a start at a discussion around the landing page and what metrics we might want to show.
Please note: if you have design ideas for what that page should have, leave notes here.

To discuss the metrics collected, which will impact all of our packages (a longer-term discussion around how to track maintenance that we will seek funding to implement), please go here. we can implement the partner landing page for packages now, or whenever we are ready, and then keep the discussion going around metrics, which will take more time to develop.

@eteq i hope this gives you what you need to start thinking about the astropy website implementation as well! we can modify the feed to include whatever we want. i haven't actually created it yet but can once we've made some decisions around what we want to include.

@pllim
Member Author

pllim commented Jul 28, 2023

Update from me on 2023-07-28:

  • APE 24 draft is ready for review by co-authors. It is not yet ready for project-wide comments. If you want to be a co-author and are not listed as such, please let me know.
  • I left comments on issues/PRs mentioned by @lwasser . These are also linked from APE 24 document.

I really hope @eteq can chime in soon on the website mock-up. Otherwise, we might just move on without that valuable input.

@lwasser

lwasser commented Aug 1, 2023

i just left feedback on the APE! pls let me know if you need anything else.
