
Improve --local_rank arg comment #8409

Merged (4 commits) into ultralytics:master on Jun 30, 2022

Conversation

@pourmand1376 (Contributor) commented on Jun 30, 2022

Hi,
It took me a while to understand why this parameter exists and what it does. I thought adding this to the help string would be a good idea.

🛠️ PR Summary

Made with ❤️ by Ultralytics Actions

🌟 Summary

Improvement in clarity for the Distributed Data Parallel (DDP) command-line argument description.

📊 Key Changes

  • Modified the help description for the --local_rank argument in the training script's argument parser.

🎯 Purpose & Impact

  • 🎓 Enhanced clarity: The new help text for the --local_rank argument makes it clear that it is used for automatic configuration in multi-GPU setups with Distributed Data Parallel (DDP).
  • 🔄 Minimal impact on end-users: This change mainly affects user understanding; it does not impact the script's functionality. It's particularly helpful for developers or users working with multi-GPU training configurations.
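To make the change concrete, here is a minimal sketch of what an argparse definition for `--local_rank` with a clarified help string might look like. This is an illustration, not the exact YOLOv5 `train.py` code; the help text shown is an assumption based on the PR description.

```python
import argparse

# Hypothetical sketch of the --local_rank argument with a clarified help
# string, mirroring the intent of this PR (not the verbatim YOLOv5 source).
parser = argparse.ArgumentParser()
parser.add_argument(
    "--local_rank",
    type=int,
    default=-1,
    help="Automatic DDP Multi-GPU argument, do not modify",  # assumed wording
)

# In DDP training, the launcher (e.g. torch.distributed.run) passes this
# flag to each worker process; users should not set it by hand.
opt = parser.parse_args(["--local_rank", "0"])
print(opt.local_rank)  # 0
```

The key point the new help text conveys is that `--local_rank` is set automatically by the distributed launcher in multi-GPU DDP runs, so end users should leave it at its default.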

@github-actions bot left a comment


👋 Hello @pourmand1376, thank you for submitting a YOLOv5 🚀 PR! To allow your work to be integrated as seamlessly as possible, we advise you to:

  • ✅ Verify your PR is up-to-date with upstream/master. If your PR is behind upstream/master an automatic GitHub Actions merge may be attempted by writing /rebase in a new comment, or by running the following code, replacing 'feature' with the name of your local branch:
git remote add upstream https://github.com/ultralytics/yolov5.git
git fetch upstream
# git checkout feature  # <--- replace 'feature' with local branch name
git merge upstream/master
git push -u origin -f
  • ✅ Verify all Continuous Integration (CI) checks are passing.
  • ✅ Reduce changes to the absolute minimum required for your bug fix or feature addition. "It is not daily increase but daily decrease, hack away the unessential. The closer to the source, the less wastage there is." -Bruce Lee

@glenn-jocher glenn-jocher changed the title Add more docs for rank parameter Improve --local_rank arg comment Jun 30, 2022
@glenn-jocher glenn-jocher merged commit e50dc38 into ultralytics:master Jun 30, 2022
@glenn-jocher (Member) commented

@pourmand1376 PR is merged. I've tried to strike the right balance here between explanatory comments and concise code.

Thank you for your contributions to YOLOv5 🚀 and Vision AI ⭐

@pourmand1376 (Contributor, Author) commented

Thanks for reviewing.

I think we need a wiki to explain this stuff more thoroughly. Currently, the explanation for each parameter is simply not enough.

@glenn-jocher (Member) commented

@pourmand1376 yes, agreed we need more documentation.

We should be updating our docs at docs.ultralytics.com in the future with more detailed explanations.

@AyushExel FYI

@AyushExel (Contributor) commented

@glenn-jocher I'm putting docs on the agenda of our discussion for Saturday.

@glenn-jocher (Member) commented

@AyushExel yes, good idea.

ctjanuhowski pushed a commit to ctjanuhowski/yolov5 that referenced this pull request Sep 8, 2022
* add more docs

* add more docs

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update train.py

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Glenn Jocher <[email protected]>