About the experiment of MWPBert on math23k #21
I got value acc = 40.0 and found that the model uses 'bert-base-uncased' as the encoder by default. Could the reason be that I was not using a Chinese BERT for math23k?

Here is my instruction:

python run_mwptoolkit.py --model=MWPBert --dataset=math23k --task_type=single_equation --equation_fix=prefix --test_step=5 --gpu_id=0

I tried to change config["pretrained_model"] to 'bert-base-chinese', but got some errors showing that it doesn't match the model. Is there any built-in method to change it?

Comments
You can change it with a command-line argument. I hope this will help you.
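A minimal sketch of such an override, assuming run_mwptoolkit.py forwards extra --key=value arguments into its config; the --pretrained_model flag here is inferred from config["pretrained_model"] and is an assumption, not confirmed in this thread:

# Hypothetical config override; --pretrained_model mirrors config["pretrained_model"]
python run_mwptoolkit.py --model=MWPBert --dataset=math23k --task_type=single_equation --equation_fix=prefix --pretrained_model=bert-base-chinese --test_step=5 --gpu_id=0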
I'm very grateful for your help. The model works fine now.
There may be something wrong with my code from the v0.0.6 update; I will check it, and I'm so sorry for that.
I got value acc = 82.5, the latest result of MWPBert on math23k. Here is my instruction:

I have published the result in the result table.