Localllm is a web-based chat application built entirely in Rust, demonstrating the potential of frontend web development with Yew. The project pairs a locally running LLM (large language model) with a seamless chat interface.
- 🌟 100% Rust: Entirely built with Rust, showcasing the power and versatility of the language.
- 🖥️ Frontend with Yew: Utilizes Yew for the frontend, demonstrating Rust's capability in web development (a minimal component sketch follows this list).
- 🤖 Local LLM Integration: Chat with a locally running language model.
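To give a flavor of the Yew side, here is a minimal sketch of a function component that keeps the chat history in state. Everything below (the `App` component, the placeholder message, the layout) is an illustrative assumption, not this repository's actual code:

```rust
use yew::prelude::*;

// Hypothetical root component; the real one lives in this repository's source.
#[function_component(App)]
fn app() -> Html {
    // Chat history kept in component state; updating it re-renders the view.
    let messages = use_state(Vec::<String>::new);

    let on_send = {
        let messages = messages.clone();
        Callback::from(move |_: MouseEvent| {
            let mut next = (*messages).clone();
            // In the real app this would be the user's input and the model's reply.
            next.push("Hello, local LLM!".to_string());
            messages.set(next);
        })
    };

    html! {
        <div>
            <ul>
                { for messages.iter().map(|m| html! { <li>{ m }</li> }) }
            </ul>
            <button onclick={on_send}>{ "Send" }</button>
        </div>
    }
}

fn main() {
    yew::Renderer::<App>::new().render();
}
```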
Ensure you have the following installed:

- The Rust toolchain (via [rustup](https://rustup.rs/)), which includes `cargo`
Clone the repository, then build and run the app:

```sh
git clone https://github.com/yourusername/localllm.git
cd localllm
cargo run
```
## Usage

Open your browser and navigate to `http://localhost:8080`, then start chatting with the locally running LLM.
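Under the hood, the chat interface forwards each message to the locally running model over HTTP. As a rough sketch of what such a call could look like, assuming the `gloo-net` crate and a JSON endpoint at `/api/chat` (the endpoint, struct names, and fields here are all assumptions, not the repository's actual API):

```rust
use gloo_net::http::Request;
use serde::{Deserialize, Serialize};

// Hypothetical wire format; the real structs are defined in the repository.
#[derive(Serialize)]
struct ChatRequest {
    prompt: String,
}

#[derive(Deserialize)]
struct ChatResponse {
    reply: String,
}

// Sends one chat turn to the local model and returns its reply.
async fn send_message(prompt: &str) -> Result<String, gloo_net::Error> {
    let body = ChatRequest { prompt: prompt.to_string() };
    let response: ChatResponse = Request::post("/api/chat")
        .json(&body)?
        .send()
        .await?
        .json()
        .await?;
    Ok(response.reply)
}
```

In a Yew component, an async call like this would typically be driven with `wasm_bindgen_futures::spawn_local`.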
## Project Structure

- Handles the complete frontend and the API-calling logic.
- Contains the logic behind the API interactions.
- Defines the structs shared across the application.
- The entry point of the application, where the `app` function is called.

## Contributing

Contributions are welcome! Please fork this repository and submit a pull request:

1. Fork the repository.
2. Create your feature branch (`git checkout -b feature/AmazingFeature`).
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`).
4. Push to the branch (`git push origin feature/AmazingFeature`).
5. Open a pull request.