
[REVIEW]: pref_voting: The Preferential Voting Tools package for Python #7020

Closed
editorialbot opened this issue Jul 22, 2024 · 96 comments
Labels
accepted published Papers published in JOSS Python recommend-accept Papers recommended for acceptance in JOSS. review TeX Track: 4 (SBCS) Social, Behavioral, and Cognitive Sciences waitlisted Submissions in the JOSS backlog due to reduced service mode.

Comments

@editorialbot

editorialbot commented Jul 22, 2024

Submitting author: @epacuit (Eric Pacuit)
Repository: https://github.com/voting-tools/pref_voting
Branch with paper.md (empty if default branch):
Version: 1.15.0
Editor: @britta-wstnr
Reviewers: @dmnapolitano, @pkrafft
Archive: 10.5281/zenodo.14675584

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/9e563a36771efcb2f68f679b25ae2ea6"><img src="https://joss.theoj.org/papers/9e563a36771efcb2f68f679b25ae2ea6/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/9e563a36771efcb2f68f679b25ae2ea6/status.svg)](https://joss.theoj.org/papers/9e563a36771efcb2f68f679b25ae2ea6)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@dmnapolitano & @pkrafft, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @britta-wstnr know.

Please start on your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Checklists

📝 Checklist for @dmnapolitano

📝 Checklist for @pkrafft

@editorialbot editorialbot added Python review TeX Track: 4 (SBCS) Social, Behavioral, and Cognitive Sciences waitlisted Submissions in the JOSS backlog due to reduced service mode. labels Jul 22, 2024
@editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot

Software report:

github.com/AlDanial/cloc v 1.90  T=0.19 s (730.7 files/s, 193737.4 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          90           6352           6044          14536
Markdown                        35           1309              0           1589
JSON                             3              0              0            835
Jupyter Notebook                 4              0           5039            828
TeX                              1             32              0            295
YAML                             2              8             17             30
TOML                             1              3              0             28
DOS Batch                        1              8              1             26
reStructuredText                 2             36             57             24
make                             1              4              7              9
-------------------------------------------------------------------------------
SUM:                           140           7752          11165          18200
-------------------------------------------------------------------------------

Commit count by author:

   216	Eric Pacuit
   178	Wesley H. Holliday
     6	Dominik Peters

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.12987/9780300186987 is OK
- 10.1016/S1574-0110(02)80014-5 is OK
- 10.1007/978-3-319-91908-9_4 is OK
- 10.1016/S1574-0110(02)80008-X is OK
- 10.1007/978-0-387-49896-6 is OK
- 10.7551/mitpress/9780262015134.003.0001 is OK
- 10.1017/CBO9781107446984 is OK
- 10.1007/978-1-4020-9688-4_14 is OK
- 10.1007/978-3-642-20441-8_3 is OK
- 10.1515/9781400868339 is OK
- 10.2307/1911681 is OK
- 10.1007/s00355-015-0909-0 is OK
- 10.1007/978-3-662-09925-4 is OK
- 10.21105/joss.04880 is OK
- 10.1007/978-3-642-41575-3_20 is OK
- 10.1515/9781400859504 is OK
- 10.1017/CBO9780511605864 is OK
- 10.1007/978-94-009-3985-1 is OK
- 10.1007/978-3-662-03782-9 is OK
- 10.4159/9780674974616 is OK
- 10.1017/cbo9780511614316 is OK
- 10.4324/9781315259963 is OK
- 10.1017/cbo9781107446984.003 is OK

MISSING DOIs

- No DOI given, and none found for title: Guide to Numerical Experiments on Elections in Com...
- 10.1093/oso/9780190934163.003.0003 may be a valid DOI for title: Voting Procedures
- No DOI given, and none found for title: Computer-aided Methods for Social Choice Theory
- No DOI given, and none found for title: Rolling the dice: Recent results in probabilistic ...
- No DOI given, and none found for title: Learning to Manipulate under Limited Information
- No DOI given, and none found for title: Voting Methods
- No DOI given, and none found for title: Votelib: Evaluation of voting systems in Python
- No DOI given, and none found for title: Pref.Tools: Tools that are useful for analyzing pr...
- No DOI given, and none found for title: abif
- No DOI given, and none found for title: VoteKit

INVALID DOIs

- None

@editorialbot

Paper file info:

📄 Wordcount for paper.md is 1077

✅ The paper includes a Statement of need section

@editorialbot

License info:

✅ License found: MIT License (Valid open source OSI approved license)

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@britta-wstnr

Hello again! 👋

@dmnapolitano @pkrafft
FYI @epacuit

This is the review thread for the paper. All of our higher-level communications will happen here from now on; review comments and discussion can happen in the repository of the project (details below).

📓 Please read the "Reviewer instructions & questions" in the comment from our editorialbot (above).
✅ All reviewers get their own checklist with the JOSS requirements - you generate them as per the details in the editorialbot comment. As you go over the submission, please check any items that you feel have been satisfied.
💻 The JOSS review is different from most other journals: The reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention this issue. That will also help me to keep track!
❓ Please also feel free to comment and ask questions on this thread.
🎯 We aim for the review process to be completed within about 4-6 weeks, but please make a start well ahead of this: JOSS reviews are by their nature iterative, and any early feedback you can provide to the author will be very helpful in meeting this schedule.

@britta-wstnr

FYI: I will be out of office for 2.5 weeks. 🌲 🌻 I will check in again as soon as I am back! In the meantime, see above for how to get your reviewing process started! 🌱

@dmnapolitano

dmnapolitano commented Jul 23, 2024

Review checklist for @dmnapolitano

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/voting-tools/pref_voting?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@epacuit) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@britta-wstnr

Thanks for getting started with your review, @dmnapolitano !
A gentle ping for @pkrafft, could you generate your checklist if you get a moment and confirm the first two points? Thanks! 🙏

@britta-wstnr

For transparency: I just sent @pkrafft an email in case the GitHub notifications are off 🙂

@dmnapolitano

Hi @britta-wstnr , I believe I'm done with my review 🎉 Sorry it took so long, and please let me know if there's anything I missed or anything else I can help out with here.

One question I have...it's minor but the authors use "etc." quite a bit throughout the paper, which is rather informal compared to the rest of the paper (which is very well-written!). Is this ok for JOSS? This is my first time reviewing here so I'm not entirely familiar with the style standards 🙂

Thanks!!

@pkrafft

pkrafft commented Aug 27, 2024

Review checklist for @pkrafft

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/voting-tools/pref_voting?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@epacuit) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@britta-wstnr

Hi @dmnapolitano - thanks a lot for your review! 🙏
I see you raised a question in voting-tools/pref_voting#106 that got resolved; were there any other things the authors should address?
And for the mentioned issue, @epacuit is this now further clarified in the documentation as per the issue comment and if so, @dmnapolitano are you happy with how it's updated?

Regarding your question on style, AFAIK we do not have a strict style guide on this, but if something is disturbing the reading flow, it's definitely worth changing. I had a look at the use of etc. in the paper, and it might indeed make the paper a little more elegant to read if some of those occurrences could be resolved differently in the text. @epacuit can I add this onto your list? Thanks 🙏 and thanks to @dmnapolitano for pointing it out.

@dmnapolitano

Hi @britta-wstnr , thanks for following up! Yes, I'm happy with how the issue was handled and I hope it'll be clarified in the documentation soon if it wasn't already. My issue will hopefully help others, at the very least 😄

And thanks, I agree that "etc." feels out of place in this otherwise well-written paper! 🙌🏻

@epacuit

epacuit commented Sep 4, 2024

@britta-wstnr @dmnapolitano Thanks for the suggestion about removing "etc.". We have updated the paper accordingly.

@pkrafft

pkrafft commented Sep 5, 2024

Hi all and @britta-wstnr, for minor issues, should I make a list in an issue comment, or make direct edits/suggestions through a fork and pull requests?

@britta-wstnr

Hi @pkrafft - both work, whichever you find more convenient. If it's a list of minor comments, you can even post that in the thread here if you prefer. Bigger things, we think, are better documented at the software level, i.e., on the GitHub repo of the software (that makes it easier for people to find them again in the future). Then just make sure to link the issue or PR here.

@britta-wstnr

@pkrafft gentle ping just to make sure this answered your question? 🙂

@pkrafft

pkrafft commented Sep 27, 2024

The issues I've identified in my review are as follows. I think the package, documentation, and write-up are okay but with room for improvement. They are certainly above the standard I would expect for research code but don't quite meet a high open source/free software standard.

  • This might be basic, but for a non-expert it could be helpful to note in the installation instructions that pip3 has to be used for Python 3 on machines where the default pip soft link has not been changed.
  • When trying to import generate_profile, I got an error "ModuleNotFoundError: No module named 'seaborn'" so there might be a problem with the dependency management in the pip file. I also get the error for preflibtools when I try to run the tests. After manually installing seaborn and preflibtools, I can run the code in the README and the tests without dependency errors.
  • It would be helpful if the README included documentation on how the software tests in the tests directory are meant to be run. I had to look at the code to see that they are pytest tests.
  • There is an example in the README, but not a real-world example, or at least no real-world context for the examples is presented in the README. As a reviewer with expertise in Python and passing knowledge of social choice theory, it's not immediately clear to me from the package write-up in the PDF or from the README how the package is intended to be used in detail. Providing more hand-holding and guidance through real-world examples in the README would help address this. As it stands, I have to dig into the Read the Docs to understand what the software is meant to do, rather than have the specific functions and capabilities be clarified by the package description. There are plenty of examples in the Read the Docs, so perhaps one or more of those could be put in the write-up and the README together with a narrative description to give a better sense of the software upfront. In particular, it takes some digging to understand what the tuples that are taken as input for the Profile objects mean.
  • There is extensive functionality of the software, so it is difficult to verify all the functionality. I have been able to run the code in the README without error, but I do get three errors when I run the tests directory with pytest. This is on a fresh install with Python 3.9.6. The errors are copied below.
    FAILED tests/test_all_vms.py::test_all_profile_vms - assert <networkx.cla...t 0x315056430> == [1]
    FAILED tests/test_all_vms.py::test_all_profile_with_ties_vms - assert <networkx.cla...t 0x3143334f0> == [1]
    FAILED tests/test_all_vms.py::test_all_margin_graph_vms - assert <networkx.cla...t 0x31457eb80> == [1]
  • I did not identify any performance claims for the software
  • I am not sure if the community guidelines for contributions to the software are so clear at the moment. As far as I can tell there is just an instruction to email the creators with questions.
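The last point, about what the ballot tuples taken as input mean, can be made concrete with a small, self-contained sketch. This is plain Python, not the pref_voting API; the function name borda_winners and the scoring code are illustrative only, showing what ranked-ballot tuples like (0, 1, 2) encode and how a voting rule turns them into winners.

```python
from collections import Counter

def borda_winners(ballots):
    """Borda winners from ranked ballots (illustrative sketch).

    Each ballot is a tuple of candidate ids listed from most- to
    least-preferred, e.g. (0, 1, 2) means 0 > 1 > 2.  With n
    candidates, the candidate in position p earns n - 1 - p points
    per ballot; the winners are the candidates with the top total.
    """
    scores = Counter()
    for ballot in ballots:
        n = len(ballot)
        for position, candidate in enumerate(ballot):
            scores[candidate] += n - 1 - position
    best = max(scores.values())
    return sorted(c for c, total in scores.items() if total == best)

# Three voters rank 0 > 1 > 2, two voters rank 1 > 2 > 0:
# Borda scores are 0: 6, 1: 7, 2: 2, so candidate 1 wins.
print(borda_winners([(0, 1, 2)] * 3 + [(1, 2, 0)] * 2))  # -> [1]
```

A short narrative like this next to the README examples would likely resolve the reviewer's concern about the meaning of the Profile inputs.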

@epacuit

epacuit commented Sep 30, 2024

Thank you for the feedback!

We are still in the process of updating the README along the lines of your suggestions.

Regarding the errors with the tests and the import issues:

  1. When trying to import generate_profile, I got an error "ModuleNotFoundError: No module named 'seaborn'" so there might be a problem with the dependency management in the pip file. I also get the error for preflibtools when I try to run the tests. After manually installing seaborn and preflibtools, I can run the code in the README and the tests without dependency errors.

  2. There is extensive functionality of the software, so it is difficult to verify all the functionality. I have been able to run the code in the README without error, but I do get three errors when I run the tests directory with pytest. This is on a fresh install with Python 3.9.6. The errors are copied below.
    FAILED tests/test_all_vms.py::test_all_profile_vms - assert <networkx.cla...t 0x315056430> == [1]
    FAILED tests/test_all_vms.py::test_all_profile_with_ties_vms - assert <networkx.cla...t 0x3143334f0> == [1]
    FAILED tests/test_all_vms.py::test_all_margin_graph_vms - assert <networkx.cla...t 0x31457eb80> == [1]

These are both fixed in the commits 8856e72c890d8d7d2b02e856fac499d13c91cd96 and 95f40df3841d1f19419cfc7e23af22553348674f

@editorialbot

👋 @openjournals/sbcs-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#6352, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@editorialbot editorialbot added the recommend-accept Papers recommended for acceptance in JOSS. label Jan 21, 2025
@britta-wstnr

Hi @epacuit, time for me to recommend acceptance for this paper! ✨ This means I hand the torch over to our EiC.

Big thanks to the reviewers @dmnapolitano and @pkrafft for your work, I and JOSS appreciate it! 💙 Thanks everyone for the smooth work on this. 👋

@dmnapolitano
@britta-wstnr thanks so much to you and @pkrafft ! This was my first time reviewing for JOSS and I learned a lot about the process from both of you 😄 🎉 👏🏻

@epacuit

epacuit commented Jan 21, 2025

Thanks everyone for your work on this!

@epacuit

epacuit commented Jan 22, 2025

Sorry, but there is one thing that we just noticed: there is a bib entry that has an incorrect date:

Holliday, W. H., Kristoffersen, A., & Pacuit, E. (ForthcomingForthcoming). Learning to
manipulate under limited information. Proceedings of the 39th Annual AAAI Conference
on Artificial Intelligence (AAAI-25). https://arxiv.org/abs/2401.16412

Do you know how to make sure it says "Forthcoming" rather than "ForthcomingForthcoming"?

The entry in the bib file is correct:

@inproceedings{HKP2025,
title={Learning to Manipulate under Limited Information},
author={Wesley H. Holliday and Alexander Kristoffersen and Eric Pacuit},
year={Forthcoming},
note ={arXiv 2401.16412 [cs.AI]},
booktitle = {Proceedings of the 39th Annual {AAAI} Conference on Artificial Intelligence ({AAAI-25})},
url = {https://arxiv.org/abs/2401.16412}
}

Usually this works fine, but is there a requirement that the "year" field be a number?
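One way to catch this kind of problem before submission is to scan the bib file for non-numeric year values. This is a minimal sketch, not part of JOSS's actual toolchain; the helper name nonnumeric_years and its regex are assumptions for illustration.

```python
import re

def nonnumeric_years(bibtex):
    """Flag BibTeX year fields that are not plain integers.

    Scans the source text for year = {...} or year = "..." fields
    and returns any values that are not purely numeric, since a
    string value like {Forthcoming} triggered the duplicated
    rendering shown above.
    """
    fields = re.findall(r'year\s*=\s*[{"]([^}"]*)[}"]', bibtex, re.IGNORECASE)
    return [year for year in fields if not year.strip().isdigit()]

print(nonnumeric_years('year={Forthcoming},'))  # -> ['Forthcoming']
print(nonnumeric_years('year={2025},'))         # -> []
```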

@britta-wstnr

Mhh ... good question - I do not know the answer. Let's see if the EiC knows once they get to this, otherwise I will inquire with the rest of the team!

@epacuit

epacuit commented Jan 22, 2025

I think that there is a bug when processing the bib file: if the year field is a string, then it will write the field twice. To test this, I set the year field to the string "2025" (i.e., I used year={"2025"}), and this is what is produced:

Holliday, W. H., Kristoffersen, A., & Pacuit, E. (”2025””2025”). Learning to manipulate under limited information. Proceedings of the 39th Annual AAAI Conference on Artificial Intelligence (AAAI-25). https://arxiv.org/abs/2401.16412

We can fix this by setting the year to 2025 (the paper is coming out at the next AAAI meeting in a little over a month...).

I've done that now, so could you use the most recent version from our repo? Thanks!!

@samhforbes

Whew, I can't see immediately why it does that. If it's an ongoing issue we can tag in the dev team, but sounds like it's not an issue for now!

@epacuit

epacuit commented Jan 25, 2025

I just wanted to check if there is anything else I need to do to ensure that the issue with the bib file is corrected in the final version of the paper. The latest version in our repo removed the "Forthcoming" and replaced it with an integer to bypass the bug with processing the bib file. Thanks!

@britta-wstnr

@editorialbot generate pdf

@britta-wstnr

@editorialbot check references

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.12987/9780300186987 is OK
- 10.1016/S1574-0110(02)80014-5 is OK
- 10.1007/978-3-319-91908-9_4 is OK
- 10.24963/ijcai.2024/881 is OK
- 10.1016/S1574-0110(02)80008-X is OK
- 10.1007/978-0-387-49896-6 is OK
- 10.7551/mitpress/9780262015134.003.0001 is OK
- 10.1017/CBO9781107446984 is OK
- 10.1007/978-1-4020-9688-4_14 is OK
- 10.1007/978-3-642-20441-8_3 is OK
- 10.1515/9781400868339 is OK
- 10.2307/1911681 is OK
- 10.1007/s00355-015-0909-0 is OK
- 10.1007/978-3-662-09925-4 is OK
- 10.21105/joss.04880 is OK
- 10.1007/978-3-642-41575-3_20 is OK
- 10.1515/9781400859504 is OK
- 10.1017/CBO9780511605864 is OK
- 10.1007/978-94-009-3985-1 is OK
- 10.1007/978-3-662-03782-9 is OK
- 10.4159/9780674974616 is OK
- 10.1017/cbo9780511614316 is OK
- 10.4324/9781315259963 is OK
- 10.1017/cbo9781107446984.003 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: Computer-aided Methods for Social Choice Theory
- No DOI given, and none found for title: Rolling the dice: Recent results in probabilistic ...
- No DOI given, and none found for title: Learning How to Vote With Principles: Axiomatic In...
- No DOI given, and none found for title: Learning to Manipulate under Limited Information
- No DOI given, and none found for title: Voting Methods
- No DOI given, and none found for title: Votelib: Evaluation of voting systems in Python
- No DOI given, and none found for title: Pref.Tools: Tools that are useful for analyzing pr...
- No DOI given, and none found for title: abif
- No DOI given, and none found for title: VoteKit

❌ MISSING DOIs

- 10.1093/oso/9780190934163.003.0003 may be a valid DOI for title: Voting Procedures

❌ INVALID DOIs

- None

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@britta-wstnr

@samhforbes do I have to recommend-accept again after @epacuit's change to the bibliography?

@samhforbes

That's OK I can do that now.

@samhforbes

@editorialbot recommend-accept

@editorialbot

Attempting dry run of processing paper acceptance...

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.12987/9780300186987 is OK
- 10.1016/S1574-0110(02)80014-5 is OK
- 10.1007/978-3-319-91908-9_4 is OK
- 10.24963/ijcai.2024/881 is OK
- 10.1016/S1574-0110(02)80008-X is OK
- 10.1007/978-0-387-49896-6 is OK
- 10.7551/mitpress/9780262015134.003.0001 is OK
- 10.1017/CBO9781107446984 is OK
- 10.1007/978-1-4020-9688-4_14 is OK
- 10.1007/978-3-642-20441-8_3 is OK
- 10.1515/9781400868339 is OK
- 10.2307/1911681 is OK
- 10.1007/s00355-015-0909-0 is OK
- 10.1007/978-3-662-09925-4 is OK
- 10.21105/joss.04880 is OK
- 10.1007/978-3-642-41575-3_20 is OK
- 10.1515/9781400859504 is OK
- 10.1017/CBO9780511605864 is OK
- 10.1007/978-94-009-3985-1 is OK
- 10.1007/978-3-662-03782-9 is OK
- 10.4159/9780674974616 is OK
- 10.1017/cbo9780511614316 is OK
- 10.4324/9781315259963 is OK
- 10.1017/cbo9781107446984.003 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: Computer-aided Methods for Social Choice Theory
- No DOI given, and none found for title: Rolling the dice: Recent results in probabilistic ...
- No DOI given, and none found for title: Learning How to Vote With Principles: Axiomatic In...
- No DOI given, and none found for title: Learning to Manipulate under Limited Information
- No DOI given, and none found for title: Voting Methods
- No DOI given, and none found for title: Votelib: Evaluation of voting systems in Python
- No DOI given, and none found for title: Pref.Tools: Tools that are useful for analyzing pr...
- No DOI given, and none found for title: abif
- No DOI given, and none found for title: VoteKit

❌ MISSING DOIs

- 10.1093/oso/9780190934163.003.0003 may be a valid DOI for title: Voting Procedures

❌ INVALID DOIs

- None

@editorialbot

👋 @openjournals/sbcs-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#6383, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@samhforbes

@editorialbot accept

@editorialbot

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot

Ensure proper citation by uploading a plain text CITATION.cff file to the default branch of your repository.

If using GitHub, a Cite this repository menu will appear in the About section, containing both APA and BibTeX formats. When exported to Zotero using a browser plugin, Zotero will automatically create an entry using the information contained in the .cff file.

You can copy the contents for your CITATION.cff file here:

CITATION.cff

cff-version: "1.2.0"
authors:
- family-names: Holliday
  given-names: Wesley H.
  orcid: "https://orcid.org/0000-0001-6054-9052"
- family-names: Pacuit
  given-names: Eric
  orcid: "https://orcid.org/0000-0002-0751-9011"
doi: 10.5281/zenodo.14675584
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
  - family-names: Holliday
    given-names: Wesley H.
    orcid: "https://orcid.org/0000-0001-6054-9052"
  - family-names: Pacuit
    given-names: Eric
    orcid: "https://orcid.org/0000-0002-0751-9011"
  date-published: 2025-01-28
  doi: 10.21105/joss.07020
  issn: 2475-9066
  issue: 105
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 7020
  title: "pref_voting: The Preferential Voting Tools package for Python"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.07020"
  volume: 10
title: "pref_voting: The Preferential Voting Tools package for Python"

If the repository is not hosted on GitHub, a .cff file can still be uploaded to set your preferred citation. Users will be able to manually copy and paste the citation.

Find more information on .cff files here and here.

@editorialbot

🐘🐘🐘 👉 Toot for this paper 👈 🐘🐘🐘

@editorialbot

🦋🦋🦋 👉 Bluesky post for this paper 👈 🦋🦋🦋

@editorialbot

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.07020 joss-papers#6384
  2. Wait five minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.07020
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@editorialbot editorialbot added accepted published Papers published in JOSS labels Jan 28, 2025
@samhforbes

Congrats @epacuit on the paper!

Many thanks to @dmnapolitano and @pkrafft for reviewing, and of course @britta-wstnr for editing this one.

@editorialbot

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following

code snippets

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.07020/status.svg)](https://doi.org/10.21105/joss.07020)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.07020">
  <img src="https://joss.theoj.org/papers/10.21105/joss.07020/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.07020/status.svg
   :target: https://doi.org/10.21105/joss.07020

This is how it will look in your documentation:

(rendered DOI badge)

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider supporting JOSS.
