This repository has been archived by the owner on Oct 25, 2024. It is now read-only.

Rag example not working #1688

Open
anayjain opened this issue Aug 14, 2024 · 1 comment

@anayjain

I ran the RAG example in intel_extension_for_transformers/neural_chat/pipeline/plugins/retrieval/README.md and I got this error:
[screenshot of the error: example_py]

Steps to reproduce:

  1. !apt-get install -y ffmpeg
     !apt-get install -y libgl1-mesa-glx libgl1-mesa-dev

  2. !pip3 install intel-extension-for-transformers

  3. !git clone https://github.com/intel/intel-extension-for-transformers.git

  4. %cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/pipeline/plugins/retrieval/
     !pip install -r requirements.txt
     %cd ../../../../../../

  5. %cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/
     !pip install -r requirements.txt
     %cd ../../../

  6. Then I ran the RAG example:
     from intel_extension_for_transformers.neural_chat import PipelineConfig
     from intel_extension_for_transformers.neural_chat import build_chatbot
     from intel_extension_for_transformers.neural_chat import plugins
     plugins.retrieval.enable = True                    # enable the retrieval plugin
     plugins.retrieval.args["input_path"] = "./docs/"   # directory of documents to index
     config = PipelineConfig(plugins=plugins)
     chatbot = build_chatbot(config)

and got the error shown in the screenshot above.
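For context, when build_chatbot(config) succeeds, the retrieval README continues by querying the chatbot. A minimal sketch of that continuation (the prompt is illustrative, not taken from the README):

     # only reachable once build_chatbot(config) returns a chatbot object
     response = chatbot.predict("What do the documents under ./docs/ describe?")
     print(response)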

@benjamin-marie

benjamin-marie commented Aug 17, 2024

The "import from intel_extension_for_transformers.transformers.modeling import AutoModelForCausalLM" also fails with the same error message.
But my installation command is much simpler:
pip install intel-extension-for-transformers
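A minimal reproduction in a fresh environment where only that pip command has been run (a sketch; the model name is just an example and is never reached, since the import itself fails):

     # fails at import time with the same error as the RAG example above
     from intel_extension_for_transformers.transformers.modeling import AutoModelForCausalLM

     model = AutoModelForCausalLM.from_pretrained("Intel/neural-chat-7b-v3-1")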
