# LiteSearch

This repository contains the code for the paper "LiteSearch: Efficacious Tree Search for LLM" (AAAI 2025).

## Getting Started

### Train a Value Network

1. **Obtain Llama-3-8B:** download the Llama-3-8B model from the official Meta website or from Hugging Face (see the example after this list).

2. **Prepare Training Data:** prepare your training data following the guidelines in the paper; the output format should match the example in `train_demo.json`.

3. **Train the Model:** run `Value/train.sh`.
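
As a rough illustration of these steps, the commands below show one way to fetch the base model and launch training. The Hugging Face model ID and local directory are assumptions, and `Value/train.sh` may expect additional arguments or environment variables; check the script before running it.

```bash
# Download Llama-3-8B from Hugging Face (model ID and target directory are assumed;
# the official Llama 3 weights require accepting Meta's license first).
huggingface-cli download meta-llama/Meta-Llama-3-8B --local-dir ./models/llama-3-8b

# Train the value network with the provided script.
bash Value/train.sh
```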

### Run LiteSearch

1. **Set Up the Servers:**

   - For the policy model, serve it with the latest version of vLLM.
   - For the value network, run the provided script `Value/run_hf.sh`.

2. **Run LiteSearch:** launch `LiteSearch/search_batch.py` (see the sketch below).

Wait for the search to complete.
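
As a rough sketch of how these pieces fit together (the model ID, port, and lack of script arguments are assumptions; adapt them to your setup and to the options actually accepted by the repository scripts):

```bash
# Serve the policy model through vLLM's OpenAI-compatible API server (model ID and port assumed).
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Meta-Llama-3-8B-Instruct \
    --port 8000

# In another shell, start the value-network server with the provided script.
bash Value/run_hf.sh

# Once both servers are up, run the batched tree search.
python LiteSearch/search_batch.py
```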

## Notes

All experiments in our paper were conducted on 8 V100 GPUs (32 GB). Ensure you have all necessary dependencies installed and configured before running the scripts; for more details on dependencies and setup, refer to the paper.
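
A minimal environment would look roughly like the following; the exact package list and versions are assumptions, since the full requirements are described in the paper rather than here:

```bash
# Core dependencies assumed by the scripts above (versions not pinned here).
pip install torch transformers vllm
```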

## Citation

```bibtex
@article{wang2024litesearch,
  title={Litesearch: Efficacious tree search for llm},
  author={Wang, Ante and Song, Linfeng and Tian, Ye and Peng, Baolin and Yu, Dian and Mi, Haitao and Su, Jinsong and Yu, Dong},
  journal={arXiv preprint arXiv:2407.00320},
  year={2024}
}
```