When we ran codegeex models in an ipex-llm 2.2.0b1 environment, the accuracy deviated significantly from ipex-llm 2.1.0b20240515 or 2.1.0b2. We ran a test_logits test and collected the following data:
Below is a comparison of the logits results (optimize_model=True) across ipex-llm versions:
2.1.0b2 <-> 2.1.0b20240515 : identical
2.1.0b2 <-> 2.1.0 : different
2.1.0b2 <-> 2.2.0b1 : different
2.1.0 <-> 2.2.0b1 : different
For the following versions, we also compared the logits results between optimize_model=True and optimize_model=False:
2.1.0b20240515: identical
2.1.0b2: identical
2.1.0: different
2.2.0b1: different
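For reference, a minimal sketch of how such a logits comparison can be set up with ipex-llm is shown below. The model path, prompt, output file names, and the load_in_4bit setting are placeholders for illustration; the actual test_logits script may differ.

```python
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM

# Placeholders: adjust to the actual model checkpoint and prompt used in test_logits.
MODEL_PATH = "THUDM/codegeex2-6b"
PROMPT = "# write a bubble sort in python\n"

def dump_logits(optimize_model: bool, out_file: str) -> None:
    """Dump the last-position logits for PROMPT under a given optimize_model setting."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_PATH,
        load_in_4bit=True,           # assumed low-bit setup; adjust to match the real test
        optimize_model=optimize_model,
        trust_remote_code=True,
    ).eval()
    inputs = tokenizer(PROMPT, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits      # shape: [1, seq_len, vocab_size]
    torch.save(logits[:, -1, :].float().cpu(), out_file)

# Run once per ipex-llm version / setting, then diff the saved tensors.
dump_logits(True, "logits_optimize_true.pt")
dump_logits(False, "logits_optimize_false.pt")
```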
Please help us debug this issue, thanks.

The codegeex models' logits diff has been fixed in 2.2.0b2. The logits are still different, but the errors are much smaller, and for the same prompt 2.2.0b2 generates the same output as 2.1.0b2.
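To quantify how much smaller the remaining errors are, logits dumped from two environments can be diffed directly, e.g. as in the hedged sketch below (the .pt file names are placeholders):

```python
import torch

# Compare logits dumped from two ipex-llm versions (placeholder file names).
ref = torch.load("logits_2.1.0b2.pt").float()
new = torch.load("logits_2.2.0b2.pt").float()

# Maximum absolute deviation between the two logits tensors.
max_abs_err = (ref - new).abs().max().item()
print(f"max abs error: {max_abs_err:.6f}")

# If the argmax tokens match, greedy decoding produces the same next token.
print("argmax tokens match:", torch.equal(ref.argmax(dim=-1), new.argmax(dim=-1)))
```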