This repository has been archived by the owner on Oct 25, 2024. It is now read-only.

Fix SQ baichuan without position_ids for torch and ipex 2.3.0 #1597

Merged
8 commits merged into main from wangchang/fix_baichuan on Jun 11, 2024

Conversation

changwangss (Contributor) commented on Jun 6, 2024

Type of Change

Ticket: https://jira.devtools.intel.com/browse/ILITV-3631
IPEX 2.3.0 optimizes Baichuan 13B without position_ids, so remove position_ids from the model inputs (a hedged sketch of this change follows this list).
At the same time, IPEX 2.3.0 officially supports transformers 4.38.1, so update the tuning and benchmark files accordingly.
Update the list of LLMs supported by IPEX 2.3.0 optimization.
Remove the llama version limit when transformers is higher than 4.36.
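A minimal sketch of how the input gating described above might look, under stated assumptions: the helper name prepare_sq_example_inputs and the version checks are illustrative, not code taken from modeling_auto.py or utility.py.

```python
# Illustrative sketch only -- names and structure are assumptions, not the PR's code.
from packaging import version

import intel_extension_for_pytorch as ipex
import transformers


def prepare_sq_example_inputs(input_ids, attention_mask, position_ids, model_type):
    """Build example inputs for SmoothQuant (SQ) calibration / tracing."""
    inputs = {"input_ids": input_ids, "attention_mask": attention_mask}

    # IPEX 2.3.0 optimizes baichuan without position_ids, so skip it there;
    # other models (and older IPEX releases) still receive it.
    ipex_ge_230 = version.parse(ipex.__version__) >= version.parse("2.3.0")
    if not (model_type == "baichuan" and ipex_ge_230):
        inputs["position_ids"] = position_ids

    return inputs


# With transformers >= 4.36 the llama-specific version pin is no longer needed;
# IPEX 2.3.0 officially supports transformers 4.38.1.
TRANSFORMERS_GE_436 = version.parse(transformers.__version__) >= version.parse("4.36.0")
```

In the actual change these checks would live wherever the example inputs are assembled for SQ quantization; the exact location is not shown here.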

Description

detail description
JIRA ticket: xxx

Expected Behavior & Potential Risk

The expected behavior triggered by this PR.

How has this PR been tested?

how to reproduce the test (including hardware information)

Dependency Change?

any library dependency introduced or removed

Signed-off-by: changwangss <[email protected]>
@changwangss changwangss requested a review from PenghuiCheng as a code owner June 6, 2024 06:31

github-actions bot commented Jun 6, 2024

⚡ Required checks status: All passing 🟢

Groups summary

🟢 Format Scan Tests workflow
Check ID Status Error details
format-scan (pylint) success
format-scan (bandit) success
format-scan (cloc) success
format-scan (cpplint) success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/evaluation/models.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py, intel_extension_for_transformers/transformers/utils/utility.py.

🟢 Optimize Unit Test workflow
Check ID Status Error details
optimize-unit-test-baseline success
optimize-unit-test-PR-test success
Genreate-OptimizeUT-Report success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/evaluation/models.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py, intel_extension_for_transformers/transformers/utils/utility.py.

🟢 NeuralChat Unit Test
Check ID Status Error details
neuralchat-unit-test-baseline success
neuralchat-unit-test-PR-test success
Generate-NeuralChat-Report success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/evaluation/models.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py, intel_extension_for_transformers/transformers/utils/utility.py.

🟢 Engine Unit Test workflow
Check ID Status Error details
engine-unit-test-baseline success
engine-unit-test-PR-test success
Genreate-Engine-Report success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/evaluation/models.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py, intel_extension_for_transformers/transformers/utils/utility.py.

🟢 Chat Bot Test workflow
Check ID Status Error details
call-inference-llama-2-7b-chat-hf / inference test success
call-inference-mpt-7b-chat / inference test success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/evaluation/models.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py, intel_extension_for_transformers/transformers/utils/utility.py.


Thank you for your contribution! 💜

Note
This comment is automatically generated and will be updated every 180 seconds within the next 6 hours. If you have any other questions, contact VincyZhang or XuehaoSun for help.

Signed-off-by: changwangss <[email protected]>
Signed-off-by: changwangss <[email protected]>
@changwangss changwangss changed the title from "Fix SQ baichaun without position_ids" to "Fix SQ baichuan without position_ids for torch and ipex 2.3.0" on Jun 6, 2024
@changwangss changwangss requested a review from XuehaoSun June 6, 2024 08:17
Signed-off-by: Wang, Chang <[email protected]>
@XuehaoSun XuehaoSun merged commit 14734de into main Jun 11, 2024
22 checks passed
@XuehaoSun XuehaoSun deleted the wangchang/fix_baichuan branch June 11, 2024 03:13