Localllm

Localllm is a web-based chat application built entirely in Rust, demonstrating the potential of frontend web development with Yew. The project pairs a locally running LLM (Large Language Model) with a seamless chat interface.

Table of Contents

  • Features
  • Getting Started
  • Usage
  • Project Structure
  • Contributing

Features

  • 🌟 100% Rust: Entirely built with Rust, showcasing the power and versatility of the language.
  • 🖥️ Frontend with Yew: Utilizes Yew for building the frontend, demonstrating Rust's capability in web development.
  • 🤖 Local LLM Integration: Chat with a locally running language model.

Getting Started

Prerequisites

Ensure you have the following installed:

  • Rust (with cargo), e.g. via rustup
  • The wasm32-unknown-unknown target, typically required to build Yew frontends (rustup target add wasm32-unknown-unknown)

Installation

Clone the repository, then build and run the app:

    git clone https://github.com/yourusername/localllm.git
    cd localllm
    cargo run

Usage

Open your browser and navigate to http://localhost:8080, then start chatting with the locally running LLM.

Project Structure

app.rs

Handles the complete frontend and the API-calling logic.
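
As a rough illustration, a chat component in app.rs might look like the minimal Yew sketch below. This is not the repository's actual code: the component name, state layout, and send handler are assumptions.

    use yew::prelude::*;

    #[function_component(App)]
    pub fn app() -> Html {
        // Conversation so far and the text currently in the input box.
        let messages = use_state(Vec::<String>::new);
        let draft = use_state(String::new);

        // Keep `draft` in sync with the text field.
        let oninput = {
            let draft = draft.clone();
            Callback::from(move |e: InputEvent| {
                // Requires the web-sys "HtmlInputElement" feature.
                let input: web_sys::HtmlInputElement = e.target_unchecked_into();
                draft.set(input.value());
            })
        };

        // Append the draft to the message list; a real handler would also
        // call into api.rs and push the model's reply when it arrives.
        let onclick = {
            let messages = messages.clone();
            let draft = draft.clone();
            Callback::from(move |_: MouseEvent| {
                let mut next = (*messages).clone();
                next.push((*draft).clone());
                messages.set(next);
                draft.set(String::new());
            })
        };

        html! {
            <div>
                <ul>{ for messages.iter().map(|m| html! { <li>{ m }</li> }) }</ul>
                <input value={(*draft).clone()} {oninput} />
                <button {onclick}>{ "Send" }</button>
            </div>
        }
    }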

api.rs

Contains the logic behind the API interactions.
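
The exact endpoint and payload depend on which local LLM server the project targets. The sketch below assumes an Ollama-style HTTP API at localhost:11434 and uses gloo-net for the browser-side request; all names here are illustrative, not the repository's actual code.

    use gloo_net::http::Request;
    use serde::{Deserialize, Serialize};

    // Request/response shapes for an Ollama-style /api/generate endpoint
    // (an assumption -- adjust to whatever server the app actually talks to).
    #[derive(Serialize)]
    struct GenerateRequest {
        model: String,
        prompt: String,
        stream: bool,
    }

    #[derive(Deserialize)]
    struct GenerateResponse {
        response: String,
    }

    pub async fn ask_llm(prompt: &str) -> Result<String, gloo_net::Error> {
        let body = GenerateRequest {
            model: "llama3".into(),
            prompt: prompt.into(),
            stream: false, // ask for a single JSON reply instead of a stream
        };
        let reply: GenerateResponse = Request::post("http://localhost:11434/api/generate")
            .json(&body)?
            .send()
            .await?
            .json()
            .await?;
        Ok(reply.response)
    }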

types.rs

Defines the necessary structs used across the application.
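
For instance, the shared types might resemble the following (illustrative only; the serde derives are assumed so the structs can travel over the HTTP API, and PartialEq/Clone are what Yew state and props typically need):

    use serde::{Deserialize, Serialize};

    // Who produced a message; useful for styling the chat log.
    #[derive(Clone, PartialEq, Serialize, Deserialize)]
    pub enum Role {
        User,
        Assistant,
    }

    // One entry in the conversation.
    #[derive(Clone, PartialEq, Serialize, Deserialize)]
    pub struct ChatMessage {
        pub role: Role,
        pub content: String,
    }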

main.rs

The entry point of the application, where the app function is called.
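
With Yew 0.20+, that entry point is typically just a few lines (a sketch, assuming the root component from app.rs is exported as App):

    mod api;
    mod app;
    mod types;

    use app::App;

    fn main() {
        // Mount the root component into the document body.
        yew::Renderer::<App>::new().render();
    }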

Contributing

Contributions are welcome! To submit a change:

1. Fork the repository.
2. Create your feature branch (git checkout -b feature/AmazingFeature).
3. Commit your changes (git commit -m 'Add some AmazingFeature').
4. Push to the branch (git push origin feature/AmazingFeature).
5. Open a pull request.

Made with ❤️ using Rust and Yew
