Issues: microsoft/Tutel

Issues list

#138: Tutel with pytorch automatic mixed precision package [question], opened May 10, 2022 by MiZhenxing
#161: bp of shared parameters and experts [question], opened Jun 14, 2022 by a157801
#167: Error in load_importance_loss [enhancement], opened Jul 6, 2022 by Luodian
#177: Error when doing deepcopy of the model [enhancement], opened Aug 3, 2022 by yzxing87
#178: Example on saving experts to one model when using distributed training [duplicate], opened Aug 7, 2022 by Luodian
#179: Pretrained MoE model [question], opened Aug 7, 2022 by Luodian
#70: Question about multi-gate refer to multi-task learning [question], opened Dec 26, 2021 by Tokkiu
#195: All2All precision always in fp32, opened Feb 21, 2023 by vchiley
#203: INTERNAL ASSERT FAILED, opened May 2, 2023 by Qicheng-WANG