
[REVIEW]: Reducing the efforts to create reproducible analysis code with FieldTrip #5566

Closed
editorialbot opened this issue Jun 16, 2023 · 102 comments
Assignees
Labels
accepted · Makefile · Matlab · published (Papers published in JOSS) · recommend-accept (Papers recommended for acceptance in JOSS) · review · TeX · Track: 2 (BCM) Biomedical Engineering, Biosciences, Chemistry, and Materials

Comments

@editorialbot
Collaborator

editorialbot commented Jun 16, 2023

Submitting author: @matsvanes (Mats W.J. van Es)
Repository: https://github.com/fieldtrip/fieldtrip
Branch with paper.md (empty if default branch): JOSS
Version: 20231220
Editor: @crsl4
Reviewers: @gflofst, @ashahide
Archive: 10.5281/zenodo.10495308

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/7f09b30f96082cab883d4dad60195e97"><img src="https://joss.theoj.org/papers/7f09b30f96082cab883d4dad60195e97/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/7f09b30f96082cab883d4dad60195e97/status.svg)](https://joss.theoj.org/papers/7f09b30f96082cab883d4dad60195e97)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@gflofst & @ashahide, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @crsl4 know.

Please start on your review when you are able, and be sure to complete your review within the next six weeks, at the very latest.

Checklists

📝 Checklist for @gflofst

📝 Checklist for @ashahide

@editorialbot editorialbot added the Makefile, Matlab, review, TeX, and Track: 2 (BCM) Biomedical Engineering, Biosciences, Chemistry, and Materials labels Jun 16, 2023
@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1093/nar/gky379 is OK
- 10.3389/fnins.2018.00261 is OK
- 10.3389/fninf.2012.00007 is OK
- 10.1038/nrn3475 is OK
- 10.1111/nyas.13325 is OK
- 10.1016/j.neuron.2017.10.013 is OK
- 10.3389/fninf.2011.00013 is OK
- 10.1093/bioinformatics/bth361 is OK
- 10.1155/2011/156869 is OK
- 10.1126/science.aac4716 is OK
- 10.1016/j.neuroimage.2019.06.046 is OK
- 10.1016/S1053-8119(03)00185-X is OK
- 10.1177/0956797611417632 is OK
- 10.1371/journal.pbio.2000797 is OK
- 10.1017/S0140525X17001972 is OK

MISSING DOIs

- 10.1007/3-211-27183-x_1 may be a valid DOI for title: MATLAB

INVALID DOIs

- None

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.88  T=17.42 s (418.7 files/s, 69514.6 lines/s)
---------------------------------------------------------------------------------------
Language                             files          blank        comment           code
---------------------------------------------------------------------------------------
MATLAB                                5647          95785         230136         480408
C                                      829          33375          46894         162300
C++                                    109           6363           6643          25657
C/C++ Header                           283           8799          18127          22648
make                                   115           2891           1412           9397
TeX                                     19           1676            623           9097
Fortran 77                              36           1743           3894           7722
Java                                    39           1069           1661           5194
HTML                                    40            598           1060           5020
Bourne Shell                            20            961           1044           2710
Windows Module Definition                8            122              0           2437
Arduino Sketch                           7            536            904           2303
Markdown                                27            499              0           1150
XML                                     14             18             43           1005
TypeScript                               1              0              0           1001
C Shell                                 29            148            127            784
Expect                                   6             17              0            783
Ruby                                     4              0              0            603
Python                                   4            173             94            532
MSBuild script                           2              0              0            455
Mathematica                              3              9              0            220
Perl                                     1             40             81            220
XSLT                                     1             47             21            217
sed                                      5              8              0            164
ProGuard                                 7             50             25            141
CMake                                    3             58             16            107
Ant                                      2             16              2            100
Scheme                                   1              0              0             78
CSS                                      1             14              2             66
INI                                      9              6              5             59
Bourne Again Shell                       3              8              0             40
DOS Batch                                3             11              1             39
reStructuredText                         1             18              0             33
awk                                      6             15              0             29
Qt Project                               2              6              0             18
MUMPS                                    1              4              0             14
YAML                                     2              0              0             11
Solidity                                 2              0              0             10
---------------------------------------------------------------------------------------
SUM:                                  7292         155083         312815         742772
---------------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot
Collaborator Author

Wordcount for paper.md is 1579

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@gflofst

gflofst commented Jun 16, 2023

Review checklist for @gflofst

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at https://github.com/fieldtrip/fieldtrip?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@matsvanes) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data from research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@ashahide

ashahide commented Jun 19, 2023

Review checklist for @ashahide

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at https://github.com/fieldtrip/fieldtrip?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@matsvanes) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data from research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@gflofst

gflofst commented Jun 20, 2023

@matsvanes does not appear to have made any contributions to this code base. Am I looking at this wrong?

@crsl4

crsl4 commented Jun 22, 2023

You seem to be right! Not sure of the expectations for first authorship, but let me check!

@crsl4

crsl4 commented Jun 23, 2023

Hi @matsvanes, in response to @gflofst 's comment, do you mind sharing your author's contributions to FieldTrip? Thanks!

@matsvanes

matsvanes commented Jul 4, 2023

@crsl4 Thanks for reaching out. I can see how my contributions are not clear from FieldTrip's github directly. I hope the following clears things up.
Unfortunately, much of the work submitted here has somewhat dispersed origins. My contributions were mostly in design, testing, and documentation, and haven't ended up in the main branch of the FieldTrip toolbox directly. See below for a trail of my involvement in the various stages.

Design
The toolbox addition was inspired by this early implementation.

Development and testing
Mostly through comments on issues and pulls on the FieldTrip Github, for example: #1617, #1362, #1330, #1317, #973

Documentation
A trail of developing the tests and documentation is found here, and their implementation on the FieldTrip website here.

@gflofst

gflofst commented Jul 5, 2023

@matsvanes JOSS will have to judge whether this is sufficient. I don't have adequate guidance except to see whether "significant" effort was involved on the submitter's part. If you have any design notes or similar materials that can be added to this thread, that would make it pass the bar more easily. I appreciate tests and documentation rather than just code, and I would certainly count those. For software like this, attribution is hard to judge without something explicit.

@matsvanes

Thank you for the suggestion @gflofst. The most important would definitely be the two links above under Documentation. In case it's relevant, we also have a pre-print available.

@matsvanes

Hi @crsl4 - is there any update on the review progress? Thanks!

@crsl4

crsl4 commented Aug 4, 2023

Thanks @matsvanes for your message! I am still waiting to hear back from reviewers. The pre-print opens up a new set of questions. Can you explain the overlap? Is that preprint a journal submission?

@gflofst

gflofst commented Aug 4, 2023

@crsl4, We have a question as to whether the submission meets the required significant-effort bar. In the submitted materials, the author is not in the code repo at all, but was responsible for writing some of the associated papers, provided after the fact. If this is sufficient, I think we need to have the repo updated with links to (or copies of) the additional materials to be able to mark that review criterion as satisfied. I am waiting for your guidance on whether what is in the repo is sufficient, whether the existence of the additional materials is sufficient, whether the existing materials need to be added/linked in the repo to be sufficient, or whether it is insufficient given the lack of code contributions.

@ashahide

ashahide commented Aug 4, 2023

@crsl4, I was wondering the same thing about the preprint. Should the review comments be focused only on the material in the JOSS article, or should we also consider the additional preprint?

Thanks for your help.

@crsl4

crsl4 commented Aug 4, 2023

@ashahide We are supposed to review only the JOSS article, but we should understand the gap between it and the pre-print, and whether the pre-print has significant overlap (because then, why do we have the JOSS article?).

@crsl4

crsl4 commented Aug 4, 2023

@gflofst Thanks for your message! I want to discuss this with the other editors, as I am not sure what the standard expectations are compared to other articles. I will get back to you, thanks!

@ashahide

ashahide commented Aug 4, 2023

Hi @matsvanes , is there a list of the required Matlab toolboxes somewhere that I may be missing? I see that there is an external folder for additional non-Matlab dependencies and that some of the tests check for necessary toolboxes, but I'm unable to find a list in the documentation.

@matsvanes

Hi @matsvanes , is there a list of the required Matlab toolboxes somewhere that I may be missing? I see that there is an external folder for additional non-Matlab dependencies and that some of the tests check for necessary toolboxes, but I'm unable to find a list in the documentation.

Hi @ashahide, the list of external toolboxes can be found here. Most of these are supplied alongside FieldTrip (i.e., in /external), and are added to the path when running ft_defaults (see here for more info on installation)

@matsvanes

@crsl4, We have a question as to whether the submission meets the required significant effort bar. In the submitted materials, the author is not in the code repo at all, but was responsible for writing some of the associated papers, provided after the fact. If this is sufficient, I think we need to have the repo updated with links/copies of the additional materials to be able to check that review criteria as satisfied. I am waiting for your guidance on whether what is in the repo is sufficient, if the additional materials existing are sufficient, if the existing materials need to be added/linked in the repo to be sufficient, or if it is insufficient given the lack of code contributions.

I hope I can clarify these questions with some historic perspective/context.
This project was part of my PhD - a time when I unfortunately was rather new to git(hub). Because of this, my FieldTrip branch was messy, and I used the quick route through my collaborators (co-authors on this paper) to get the code on the master branch. As mentioned before, I have also done most of the testing and documentation of the code.
Regarding the pre-print: this is essentially one chapter out of my PhD thesis. We were interested in submitting to JOSS because we believe its aims align well with our paper, and we wanted to change the narrative from a tutorial-style opinion/recommendation paper to put the focus on the software implementation itself (and get credit for it). Consequently, after opening a pre-submission enquiry, we adapted the pre-print to the JOSS format, and moved the tutorials to the website documentation: 1, 2, 3.

@robertoostenveld

@matsvanes does not appear to have made any contributions to this code base. Am I looking at this wrong?

Let me, as the head of the FieldTrip project and senior author on this submission, confirm Mats' comment: Mats did indeed make significant contributions to the conceptual design, the implementation, the testing, and the documentation (initially in the form of a manuscript that was rejected elsewhere, later also on the website). He fully deserves credit for this as first author.

My own involvement was mainly to supervise the process, to ensure consistency and future maintainability, and to help with the technical aspects of the coding and merging process. Since we are neuroscientists and neither of us are git and GitHub experts, I realize that the representation on github is not as ideal as it could have been. We continue to constantly learn how to make use of version control using git, and also how to pass those skills on to students that are working on concrete projects which involve coding (like this reproducescript one).

@editorialbot
Collaborator Author

👋 @openjournals/bcm-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#5020, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@editorialbot editorialbot added the recommend-accept label Feb 17, 2024
@crsl4

crsl4 commented Feb 17, 2024

@editorialbot accept

@editorialbot
Collaborator Author

I'm sorry @crsl4, I'm afraid I can't do that. That's something only eics are allowed to do.

@Kevin-Mattheus-Moerman
Member

Kevin-Mattheus-Moerman commented Feb 19, 2024

@crsl4 thanks for editing this one. Can you please call @editorialbot create post-review checklist and go through/check the last steps?

In particular, the archive title should match the paper title. Also, the version tag rendered here should match the one for the archive (it does, I think) and should also match a tagged release on their repo (it doesn't, I think; the most recent one I can see is 20240214). Finally, the archive's listed author set should match the paper's.

@matsvanes

  • Can you check if citing this preprint would be appropriate for the brainlife citation: https://arxiv.org/abs/2306.02183 ?
  • Check for consistent use of British/European English (analysing, organisation) versus American English (standardized, initialized, organization); given your affiliations, you may want to adopt only the former.

@matsvanes

@Kevin-Mattheus-Moerman I have checked both points, and updated the reference for BrainLife to the one you provided.

@crsl4

crsl4 commented Feb 20, 2024

@Kevin-Mattheus-Moerman do I need to do the post-review checklist again? I already did that before (date Dec 19, 2023), so not sure if it is needed again.

@Kevin-Mattheus-Moerman
Member

Kevin-Mattheus-Moerman commented Feb 20, 2024

@crsl4 oh sorry I missed that. No in that case there is no need. Although from my comments you can see several boxes were perhaps ticked but the steps were not completed (e.g. title and version tag), so do keep an eye out for that in the future.

@Kevin-Mattheus-Moerman
Member

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1093/nar/gky379 is OK
- 10.3389/fnins.2018.00261 is OK
- 10.3389/fninf.2012.00007 is OK
- 10.1038/nrn3475 is OK
- 10.1111/nyas.13325 is OK
- 10.1016/j.neuron.2017.10.013 is OK
- 10.3389/fninf.2011.00013 is OK
- 10.1093/bioinformatics/bth361 is OK
- 10.1155/2011/156869 is OK
- 10.1126/science.aac4716 is OK
- 10.1016/j.neuroimage.2019.06.046 is OK
- 10.48550/arXiv.2306.02183 is OK
- 10.1016/S1053-8119(03)00185-X is OK
- 10.1177/0956797611417632 is OK
- 10.1371/journal.pbio.2000797 is OK
- 10.1017/S0140525X17001972 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@Kevin-Mattheus-Moerman
Member

@matsvanes thanks for making some of these changes. Can you also create a release on your repository with the tag 20231220 to match the one listed on the archive and associated with the paper? Alternatively, we can edit the JOSS-listed and archive-listed version tags to match your seemingly most recent tag, 20240214. Let me know what you decide.

@editorialbot
Collaborator Author

👋 @openjournals/bcm-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#5024, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@matsvanes

matsvanes commented Feb 20, 2024

20231220

@robertoostenveld Do you have a strong opinion on this?

@robertoostenveld

We have the release https://github.com/fieldtrip/fieldtrip/releases/tag/20231220, right? That is the release which on Zenodo corresponds to https://doi.org/10.5281/zenodo.10495308.

Later tags such as 20240214 are in-between stable versions but not proper releases, and hence not optimal to refer to.

@Kevin-Mattheus-Moerman
Member

@robertoostenveld sorry I missed that. Okay then all is set now, thanks

@Kevin-Mattheus-Moerman
Member

@editorialbot accept

@editorialbot
Collaborator Author

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot
Collaborator Author

Ensure proper citation by uploading a plain text CITATION.cff file to the default branch of your repository.

If using GitHub, a Cite this repository menu will appear in the About section, containing both APA and BibTeX formats. When exported to Zotero using a browser plugin, Zotero will automatically create an entry using the information contained in the .cff file.

You can copy the contents for your CITATION.cff file here:

CITATION.cff

cff-version: "1.2.0"
authors:
- family-names: Es
  given-names: Mats W. J.
  name-particle: van
  orcid: "https://orcid.org/0000-0002-7133-509X"
- family-names: Spaak
  given-names: Eelke
  orcid: "https://orcid.org/0000-0002-2018-3364"
- family-names: Schoffelen
  given-names: Jan-Mathijs
  orcid: "https://orcid.org/0000-0003-0923-6610"
- family-names: Oostenveld
  given-names: Robert
  orcid: "https://orcid.org/0000-0002-1974-1293"
doi: 10.5281/zenodo.10495308
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
  - family-names: Es
    given-names: Mats W. J.
    name-particle: van
    orcid: "https://orcid.org/0000-0002-7133-509X"
  - family-names: Spaak
    given-names: Eelke
    orcid: "https://orcid.org/0000-0002-2018-3364"
  - family-names: Schoffelen
    given-names: Jan-Mathijs
    orcid: "https://orcid.org/0000-0003-0923-6610"
  - family-names: Oostenveld
    given-names: Robert
    orcid: "https://orcid.org/0000-0002-1974-1293"
  date-published: 2024-02-21
  doi: 10.21105/joss.05566
  issn: 2475-9066
  issue: 94
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 5566
  title: Reducing the efforts to create reproducible analysis code with
    FieldTrip
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.05566"
  volume: 9
title: Reducing the efforts to create reproducible analysis code with
  FieldTrip

If the repository is not hosted on GitHub, a .cff file can still be uploaded to set your preferred citation. Users will be able to manually copy and paste the citation.

Find more information on .cff files here and here.
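As a side note (not part of the bot's output): the top-level scalar fields of a CITATION.cff file, such as the `doi` and `cff-version`, can be pulled out programmatically. The sketch below is a deliberately naive, stdlib-only Python illustration that scans single-line `key: value` pairs at column zero; it skips nested blocks (`authors`, `preferred-citation`) and multi-line values, so a real YAML parser (e.g. PyYAML) or a dedicated CFF tool is the right choice in practice.

```python
def cff_scalar_fields(text):
    """Extract top-level single-line "key: value" pairs from CITATION.cff
    text (a YAML subset). Naive by design: nested blocks and multi-line
    values are ignored."""
    fields = {}
    for line in text.splitlines():
        # Top-level keys start in column 0; list items start with "-".
        if line and not line[0].isspace() and not line.startswith("-") and ":" in line:
            key, _, value = line.partition(":")
            value = value.strip().strip('"')
            if value:  # keys that introduce nested blocks have no inline value
                fields[key.strip()] = value
    return fields

sample = '''cff-version: "1.2.0"
doi: 10.5281/zenodo.10495308
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
'''

fields = cff_scalar_fields(sample)
print(fields["cff-version"])  # 1.2.0
print(fields["doi"])          # 10.5281/zenodo.10495308
```

This kind of quick check can confirm, for example, that the `doi` recorded in the repository's CITATION.cff matches the archive DOI listed in the review thread.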

@editorialbot
Collaborator Author

🐘🐘🐘 👉 Toot for this paper 👈 🐘🐘🐘

@editorialbot
Collaborator Author

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.05566 joss-papers#5030
  2. Wait five minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.05566
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@editorialbot editorialbot added the accepted and published labels Feb 21, 2024
@Kevin-Mattheus-Moerman
Member

@matsvanes congratulations on this JOSS publication!

Thanks for editing @crsl4 ! And a special thanks to the reviewers: @gflofst, @ashahide !!!

@editorialbot
Collaborator Author

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.05566/status.svg)](https://doi.org/10.21105/joss.05566)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.05566">
  <img src="https://joss.theoj.org/papers/10.21105/joss.05566/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.05566/status.svg
   :target: https://doi.org/10.21105/joss.05566

This is how it will look in your documentation:

[rendered DOI badge]

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:

@ashahide

ashahide commented Feb 22, 2024 via email

@matsvanes

Thank you all for your hard work and the positive outcome!
