Since real-world scenarios often involve mixed Chinese/English text, does this model support English, or mixed Chinese/English input?
I'd suggest trying it yourself. The main target scenario is Chinese text. It may be able to handle some English, but its performance is likely worse than a dedicated English-only model.
Thanks a lot for the answer. One more question: are there any BERT models (or BERT variants) that are multilingual, have a small parameter count, and are fast at inference?
So far I have only found Hugging Face's distilbert-base-multilingual, plus https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384
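For reference, here is a minimal sketch of loading one of these multilingual checkpoints with the transformers library and encoding a mixed Chinese/English sentence. It assumes the `-cased` variant of the DistilBERT checkpoint on the Hub; the sample sentence is just an illustration.

```python
# Sketch: encode a mixed Chinese/English sentence with multilingual DistilBERT.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "distilbert-base-multilingual-cased"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

text = "这是一个 mixed Chinese and English 的句子。"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Last hidden state has shape (batch, seq_len, hidden_size).
print(outputs.last_hidden_state.shape)
```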
My use case is bert-vits2, but with multilingual support.
@Jackiexiao Take a look at ALBERT, e.g. uer/albert-base-chinese-cluecorpussmall
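A rough sketch of trying the suggested checkpoint is below. The UER Chinese ALBERT models pair a BERT-style character vocabulary with the ALBERT architecture, so `BertTokenizer` plus `AlbertModel` is used here; the Auto classes may also work, and the sample input is illustrative only.

```python
# Sketch: get contextual embeddings from the suggested Chinese ALBERT model.
import torch
from transformers import BertTokenizer, AlbertModel

name = "uer/albert-base-chinese-cluecorpussmall"
tokenizer = BertTokenizer.from_pretrained(name)
model = AlbertModel.from_pretrained(name)

inputs = tokenizer("中文文本的一个例子", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)
print(out.last_hidden_state.shape)  # (1, seq_len, 768)
```

ALBERT's cross-layer parameter sharing keeps the parameter count small, which fits the low-latency inference requirement mentioned above.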