
v0.2.0 - Accepted at RecSys '24 🎉

Latest
@Silleellie released this 05 Aug 19:15

This release marks the acceptance of 'Reproducibility of LLM-based Recommender Systems: the case study of P5 paradigm' at the RecSys 2024 conference. The paper presents LaikaLLM, and all of its experiments were carried out with it!

  • This release page and the README.md will be updated with the DOI of the paper as soon as it is available

Added

  • The T5Rec and GPT2Rec models now have the inject_whole_word_embeds parameter, which makes it possible to encode whole-word information into the input embeddings (Link to model docs); see the sketch after this list
  • It is now possible to specify a custom input prefix and target prefix for the GPT2Rec model via the input_prefix and target_prefix parameters (Link to GPT2Rec docs)
  • Added the original P5 prompt templates for the sequential, rating prediction and direct recommendation tasks (Link to doc)
  • Added more experiments to the sample_experiments directory (Link)
  • Added the possibility to manually place the AmazonDataset directory of the P5 paper
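
A minimal sketch of how the new parameters might be used when building the models directly in Python; the import path, the checkpoint argument and the prefix values are assumptions for illustration only, while inject_whole_word_embeds, input_prefix and target_prefix are the parameters introduced in this release (see the linked docs for the actual signatures):

```python
# Illustrative sketch only: the import path and the checkpoint argument are
# assumptions, not necessarily the actual LaikaLLM API.
from laikallm.models import T5Rec, GPT2Rec  # hypothetical import path

# T5Rec: encode whole-word information into the input embeddings
t5_model = T5Rec(
    "google/flan-t5-small",          # assumed checkpoint argument
    inject_whole_word_embeds=True,   # new in v0.2.0
)

# GPT2Rec: custom prefixes prepended to the input and target text
gpt2_model = GPT2Rec(
    "gpt2",                          # assumed checkpoint argument
    inject_whole_word_embeds=True,   # new in v0.2.0
    input_prefix="Input: ",          # new in v0.2.0 (example value)
    target_prefix="Target: ",        # new in v0.2.0 (example value)
)
```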

Changed

  • The parameter inject_personalization of T5Rec has been renamed to inject_user_embeds for better clarity (see the upgrade sketch after this list)
  • The sample_experiments directory now reflects the accepted paper 1:1
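
For users upgrading from v0.1.0, only the parameter name changes; a hedged before/after sketch (the import path and checkpoint argument are assumptions, as above):

```python
from laikallm.models import T5Rec  # hypothetical import path

# v0.1.0 (old name):
# t5_model = T5Rec("google/flan-t5-small", inject_personalization=True)

# v0.2.0 (new name, same behaviour):
t5_model = T5Rec("google/flan-t5-small", inject_user_embeds=True)
```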

Fixed

  • Fixed best-result logging for error metrics: the max was logged instead of the min
  • Improved the error message in case of a faulty AmazonDataset download

Full Changelog: v0.1.0...v0.2.0