A user ran into problems while verifying DeepLink with the Llama2-7b model; their original message is quoted below.
--------------------- Original email ---------------------
I am currently verifying DeepLink with the Llama2-7b model and have run into two problems I need help with:
On Ascend, I am training Llama2-7b through DeepLink to verify that the same code trains without modification on both NVIDIA and Ascend. I am using the llama2-chinese script, but while setting up the Llama2 environment I found that flash_attn is tightly coupled to CUDA and cannot be installed on Ascend. I suspect that verifying DeepLink requires a dedicated script, but I could not find one among the evaluated models on your GitHub. Would you mind providing it? Many thanks!
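One common way to sidestep this is to make the flash_attn import optional and fall back to PyTorch's built-in attention when the package is absent. The sketch below is only an illustration, assuming the training script calls flash_attn_func directly and that PyTorch >= 2.0 is available; the wrapper name attention_fn is hypothetical and not part of DeepLink or llama2-chinese.

```python
import torch.nn.functional as F

try:
    # flash_attn is CUDA-only; flash_attn_func expects fp16/bf16 tensors
    # shaped (batch, seqlen, nheads, headdim).
    from flash_attn import flash_attn_func

    def attention_fn(q, k, v, causal=True):
        return flash_attn_func(q, k, v, causal=causal)

except ImportError:
    # On Ascend (or anywhere flash_attn cannot be installed), fall back to
    # PyTorch's built-in scaled_dot_product_attention, which expects
    # (batch, nheads, seqlen, headdim), hence the transposes.
    def attention_fn(q, k, v, causal=True):
        q, k, v = (t.transpose(1, 2) for t in (q, k, v))
        out = F.scaled_dot_product_attention(q, k, v, is_causal=causal)
        return out.transpose(1, 2)
```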
On NVIDIA, training Llama2-7b through DeepLink fails with IndexError: map::at. I initially traced it to the ':7' in device='cuda:7' referring to a device that does not exist. Today I saw that this issue was marked as resolved, but after updating deeplink and re-testing, the same error still occurs. Have you run into this? Is there a temporary workaround?
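As a stopgap for the device-index part of this report, one can clamp the requested index to the devices that actually exist before anything is moved onto it. This is only a hedged sketch of a workaround, not a fix for the underlying map::at lookup in DeepLink; the helper safe_device is hypothetical.

```python
import torch

def safe_device(index: int) -> torch.device:
    """Return cuda:<index> if that device exists, otherwise cuda:0 or cpu."""
    n = torch.cuda.device_count()
    if n == 0:
        return torch.device("cpu")
    return torch.device(f"cuda:{index if index < n else 0}")

# Example: a script configured for 'cuda:7' on a node with fewer GPUs
# falls back to 'cuda:0' instead of requesting an index the runtime
# cannot map.
x = torch.randn(2, 3).to(safe_device(7))
print(x.device)
```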
Regarding the Ascend question: