[REVIEW]: MNE-ICALabel: Automatically annotating ICA components with ICLabel in Python #4484

Closed
editorialbot opened this issue Jun 18, 2022 · 51 comments
Labels: accepted, Makefile, published, Python, recommend-accept, review, TeX

Comments

@editorialbot
Collaborator

editorialbot commented Jun 18, 2022

Submitting author: @adam2392 (Adam Li)
Repository: https://github.com/mne-tools/mne-icalabel
Branch with paper.md (empty if default branch): main
Version: v0.4
Editor: @emdupre
Reviewers: @TomDonoghue, @adswa
Archive: 10.5281/zenodo.7017165

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/d91770e35a985ecda4f2e1f124977207"><img src="https://joss.theoj.org/papers/d91770e35a985ecda4f2e1f124977207/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/d91770e35a985ecda4f2e1f124977207/status.svg)](https://joss.theoj.org/papers/d91770e35a985ecda4f2e1f124977207)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@hvgazula & @TomDonoghue, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions or concerns, please let @emdupre know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @TomDonoghue

📝 Checklist for @adswa

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.88  T=0.06 s (988.2 files/s, 81746.2 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          28            485            882           1623
YAML                             4             58             21            421
Markdown                         7            134              0            263
make                             2             66             11            262
DOS Batch                        1             29              1            212
TeX                              2              8              1            151
reStructuredText                 8             93             70            115
HTML                             4              6              6             51
JavaScript                       1              3             10             50
Bourne Shell                     3              7              8             32
TOML                             1              3              0             21
CSS                              1              3              4             19
-------------------------------------------------------------------------------
SUM:                            62            895           1014           3220
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot
Collaborator Author

Wordcount for paper.md is 741

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.3389/fnins.2013.00267 is OK
- 10.1162/neco.1995.7.6.1129 is OK

MISSING DOIs

- None

INVALID DOIs

- https://doi.org/10.1016/j.neuroimage.2019.05.026 is INVALID because of 'https://doi.org/' prefix
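
For context: JOSS expects bare DOIs in the paper's .bib entries; the https://doi.org/ resolver prefix is added when the reference is rendered. A minimal sketch of the fix for the flagged entry, assuming it lives in the paper's .bib file (the field layout shown is illustrative):

doi = {https://doi.org/10.1016/j.neuroimage.2019.05.026}   <- flagged as invalid
doi = {10.1016/j.neuroimage.2019.05.026}                   <- passes the check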

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@emdupre
Member

emdupre commented Jun 18, 2022

Hi @hvgazula and @TomDonoghue 👋 Thanks again for agreeing to review this submission ! The review will take place in this issue, and you can generate your individual reviewer checklists by asking editorialbot directly with @\editorialbot generate my checklist.

In working through the checklist, you're likely to have specific feedback on MNE-ICALabel. Whenever possible, please open relevant issues on the linked software repository (and cross-link them with this issue) rather than discussing them here. This helps to make sure that feedback is translated into actionable items to improve the software !

If you aren't sure how to get started, please see the Reviewing for JOSS guide -- and, of course, feel free to ping me with any questions !

@emdupre
Member

emdupre commented Jul 6, 2022

Hi again @hvgazula and @TomDonoghue, I just wanted to check in here since you haven't yet created your reviewer checklists 📋

Please let me know if you're encountering any issues in this process, or if you have a time window in which you expect to be able to work on this review !

@TomDonoghue

TomDonoghue commented Jul 7, 2022

Review checklist for @TomDonoghue

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at https://github.com/mne-tools/mne-icalabel?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@adam2392) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software, 2) Report issues or problems with the software, and 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@adam2392

Hi @emdupre, just wanted to check to see if there were any updates on this?

Don't want to continue bothering you all w/ notifications :p. Generally, what sort of timeline should we expect?

@TomDonoghue

Sorry for being slow on this - the past month ended up being busier than expected!

I have worked through the review / module check, including getting the module installed locally and running through the provided example on the documentation site. I left some comments, all on the original repository (in the issues linked above); they mostly cover small documentation and paper points (particularly the example), and are all pretty minor. Overall, I think the tool / module is well developed and organized, and I have no substantive comments on the code or project as a whole. Once the authors have a chance to respond to the minor points raised, I think the remaining items above can be checked off from my side.
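
For readers who want a concrete sense of that example, here is a minimal sketch of the mne-icalabel workflow it walks through, under assumed preprocessing choices (the file path and parameter values below are illustrative, not the tutorial's exact ones):

import mne
from mne.preprocessing import ICA
from mne_icalabel import label_components

# Load an EEG recording (hypothetical path)
raw = mne.io.read_raw_fif("sample_eeg_raw.fif", preload=True)

# ICLabel was trained on 1-100 Hz band-pass filtered, average-referenced
# data decomposed with extended infomax, so match that preprocessing
raw.filter(l_freq=1.0, h_freq=100.0)
raw.set_eeg_reference("average")
ica = ICA(n_components=15, method="infomax",
          fit_params=dict(extended=True), random_state=42)
ica.fit(raw)

# Assign each independent component a label (brain, eye blink, muscle
# artifact, ...) along with the model's predicted probability
component_dict = label_components(raw, ica, method="iclabel")
print(component_dict["labels"])
print(component_dict["y_pred_proba"])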

@adam2392

adam2392 commented Aug 3, 2022

Hi @TomDonoghue, I have addressed the points you raised in mne-tools/mne-icalabel#86 (comment). Thanks!

@emdupre
Member

emdupre commented Aug 3, 2022

Thanks, @TomDonoghue and @adam2392 !

Generally what sort of timeline should we expect?

We ask that all reviews be completed within six weeks (so, from our initial start date of 18 June, reviews wouldn't have been expected until the end of last week). Given ongoing summer holidays, though, many folks are in and out of the office, so I'm generally assuming about a two-week buffer period.

Just as an explicit update : I'm following up with @hvgazula via email in case he's missing these GitHub notifications. And thank you again, @TomDonoghue, for your review on this !

@emdupre
Member

emdupre commented Aug 15, 2022

Hi @adam2392 ,

Just to update you on this : I've reached out to @hvgazula several times but unfortunately have not received a response. Hopefully he's OK, but we'll now need to move forward without his review. I'm in the process of securing a second reviewer who could provide their input on an accelerated timeline so that we can move this forward.

I'm sorry for the delay in reviewing this work, and I appreciate your patience.

To try and save time where we can, I'll go ahead and make a few editorial comments here and on the software repo. Please let me know if you have any questions about these, or anything else at this point.

@emdupre
Member

emdupre commented Aug 18, 2022

As an update, @adswa has agreed to review this submission on an accelerated timeline, so I'm adding her as a reviewer now. Thank you, @adswa !

Adapting relevant information from the top comment:

Your review will be checklist-based. You will have a separate checklist that you should update when carrying out your review.
First, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions or concerns, please let @emdupre know.

@emdupre
Member

emdupre commented Aug 18, 2022

@editorialbot add @adswa as reviewer

@editorialbot
Collaborator Author

@adswa added to the reviewers list!

@adswa

adswa commented Aug 18, 2022

Review checklist for @adswa

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at https://github.com/mne-tools/mne-icalabel?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@adam2392) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software, 2) Report issues or problems with the software, and 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@adswa

adswa commented Aug 18, 2022

Hey everybody, happy to be on board! 👋
I got a head start on the review and have already had time to look at almost all of the review criteria. Overall, congrats on a well-made piece of software and this cool addition to the MNE ecosystem! I tested all installation methods and went through the examples in the documentation on a Debian system. I'm a big fan of the examples in MNE's documentation and was happy to see this software offer similar walkthroughs. I also ran the test suite locally on a Debian system, and the automated CIs look good. I found contributing and license information easily in the source repo and in the docs (filed only mne-tools/mne-icalabel#102 for a small bug). I plan to give the tool a test run on two different systems tomorrow, and might leave a comment or two about the paper structure, but you should have my complete review by tomorrow evening.
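
As a point of reference, the lightest-weight of those installation methods is from PyPI (conda and source installs are also covered in the docs); assuming a working Python environment:

pip install mne-icalabel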

@adswa

adswa commented Aug 19, 2022

Alright, I'm done with my review. I filed a range of general observations from trying the tool out a bit more (mne-tools/mne-icalabel#103, mne-tools/mne-icalabel#104, mne-tools/mne-icalabel#107, mne-tools/mne-icalabel#102), none of which I consider necessary to close in order to accept this submission. The only thing I would ask of the authors is to resolve the confusion I described in mne-tools/mne-icalabel#105 about the data requirements and, if my assumptions are correct, to add some clarifying remarks to the documentation, warnings, or paper as they see appropriate, to help users select and adopt the tool more efficiently. Other than that, I'm very happy with the submission and ready to recommend "accept".

@emdupre
Member

emdupre commented Aug 24, 2022

@editorialbot check references

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/j.neuroimage.2019.05.026 is OK
- 10.3389/fnins.2013.00267 is OK
- 10.48550/ARXIV.1912.01703 is OK
- 10.1162/neco.1995.7.6.1129 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@emdupre
Member

emdupre commented Aug 24, 2022

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@emdupre
Member

emdupre commented Aug 24, 2022

@editorialbot set v0.4 as version

@editorialbot
Collaborator Author

Done! version is now v0.4

@openjournals deleted a comment from editorialbot Aug 24, 2022
@emdupre
Member

emdupre commented Aug 24, 2022

@editorialbot set 10.5281/zenodo.7017165 as archive

@editorialbot
Collaborator Author

Done! Archive is now 10.5281/zenodo.7017165

@emdupre
Member

emdupre commented Aug 24, 2022

@editorialbot remove @hvgazula from reviewers

@editorialbot
Collaborator Author

@hvgazula removed from the reviewers list!

@emdupre
Member

emdupre commented Aug 24, 2022

Thank you, @adam2392 !

My only remaining request is to update the Zenodo archive to correct the description's formatting; this can be done without minting a new DOI by editing the metadata of the existing record. Please let me know if you have any issues with this !

Once this is done, I can recommend the submission for publication 🚀

@adam2392

Hi @emdupre, is this what you meant? https://zenodo.org/record/7017165#.YwYu2y-B2fU

@emdupre
Member

emdupre commented Aug 24, 2022

That seems better, thank you @adam2392 !

I'm now happy to recommend MNE-ICALabel to the JOSS EIC team for publication—congratulations on this impressive effort !

@emdupre
Member

emdupre commented Aug 24, 2022

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/j.neuroimage.2019.05.026 is OK
- 10.3389/fnins.2013.00267 is OK
- 10.48550/ARXIV.1912.01703 is OK
- 10.1162/neco.1995.7.6.1129 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot
Collaborator Author

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#3465, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@editorialbot added the recommend-accept label Aug 24, 2022
@arfon
Member

arfon commented Aug 26, 2022

@editorialbot accept

@editorialbot
Collaborator Author

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot
Collaborator Author

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@editorialbot
Collaborator Author

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.04484 joss-papers#3475
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.04484
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@editorialbot added the accepted and published labels Aug 26, 2022
@arfon
Member

arfon commented Aug 26, 2022

@TomDonoghue, @adswa – many thanks for your reviews here and to @emdupre for editing this submission! JOSS relies upon the volunteer effort of people like you and we simply wouldn't be able to do this without you ✨

@adam2392 – your paper is now accepted and published in JOSS ⚡🚀💥

@arfon closed this as completed Aug 26, 2022
@editorialbot
Collaborator Author

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.04484/status.svg)](https://doi.org/10.21105/joss.04484)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.04484">
  <img src="https://joss.theoj.org/papers/10.21105/joss.04484/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.04484/status.svg
   :target: https://doi.org/10.21105/joss.04484

This is how it will look in your documentation:

[DOI status badge]

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:

  • Volunteering to review for us sometime in the future, by adding your name to the JOSS reviewer list.
  • Making a small donation to support our running costs.
