Fix SQ baichuan without position_ids for torch and ipex 2.3.0 #1597
Signed-off-by: changwangss <[email protected]>
Signed-off-by: Wang, Chang <[email protected]>
Type of Change
ticket: https://jira.devtools.intel.com/browse/ILITV-3631
IPEX 2.3.0 optimizes Baichuan-13B without `position_ids`, so remove them from the model inputs.
Meanwhile, IPEX 2.3.0 officially supports transformers 4.38.1, so update the tuning and benchmark files accordingly.
Update the IPEX 2.3.0 optimized-LLM supported-model list.
Remove the Llama version limit when transformers is higher than 4.36.
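The fix above can be sketched as a small input-preparation helper: when the installed IPEX is 2.3.0 or newer and the model is Baichuan, `position_ids` are dropped before the inputs reach the optimized graph. This is an illustrative sketch, not the PR's actual code; the function name `prepare_inputs` and its parameters are assumptions for demonstration.

```python
def _ver(s):
    """Parse a plain 'X.Y.Z' version string into a comparable tuple."""
    return tuple(int(x) for x in s.split(".")[:3])

def prepare_inputs(inputs, model_type, ipex_version):
    """Hypothetical helper: adjust generation inputs for the IPEX-optimized model.

    IPEX 2.3.0's optimized Baichuan path takes no `position_ids`, so
    passing them would mismatch the traced model's signature.
    """
    if model_type == "baichuan" and _ver(ipex_version) >= (2, 3, 0):
        inputs.pop("position_ids", None)  # safe no-op if the key is absent
    return inputs
```

Other model types, and Baichuan on older IPEX releases, keep `position_ids` untouched, which matches the PR's scope of only changing the IPEX 2.3.0 Baichuan path.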
Description
detail description
JIRA ticket: xxx
Expected Behavior & Potential Risk
the expected behavior triggered by this PR
How has this PR been tested?
how to reproduce the test (including hardware information)
Dependency Change?
any library dependency introduced or removed