This repository provides a containerized version of LlamaFile, making it easy to deploy and manage. It also includes an example Controller Service built with FastAPI, which demonstrates how to handle incoming requests and forward them to LlamaFile for processing.

- LlamaFile: The main application, ready for containerized deployment.
- Controller Service: A FastAPI-based example service that handles incoming requests and integrates seamlessly with LlamaFile.
- Docker Support: Simplifies deployment using Docker Compose.
- Environment Configuration: Configurable via a `.env` file for easy customization.

Ensure you have the following installed:

- Docker
- Docker Compose

1. Clone the repository:

   ```bash
   git clone https://github.com/hfahrudin/Dockerize-Llamafile.git
   cd Dockerize-Llamafile
   ```

2. Configure environment variables:
   - Create a `.env` file in the root directory.
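
   For example, a minimal `.env` might look like the following; the variable names here are illustrative assumptions, not variables this repository necessarily defines:

   ```
   # Hypothetical values; replace with the variables your services actually read.
   CONTROLLER_PORT=8000
   LLAMAFILE_PORT=8080
   ```
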
3. Build and start the services:

   ```bash
   docker-compose up --build
   ```
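
   To run the stack in the background instead, add Docker Compose's detached flag: `docker-compose up --build -d`.
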
4. Access the Controller Service at `http://localhost:8000` (default).

Here’s an example request to the Controller Service:

```bash
curl -X GET "http://localhost:8000/health" \
  -H "Content-Type: application/json"
```

- LlamaFile Configuration: Update the application in the `llamafile/` directory as needed.
- Controller Logic: Modify the FastAPI code in the `controller/` directory to implement custom logic (a sketch follows this list).
- Docker Compose: Adjust the `docker-compose.yml` file for your infrastructure requirements.

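As a starting point for custom controller logic, here is a minimal sketch of a FastAPI app that forwards a prompt to the LlamaFile container. The `LLAMAFILE_URL` variable, the `/generate` route, and the request payload are illustrative assumptions, not names defined by this repository; the sketch also assumes LlamaFile is running in server mode and exposes llama.cpp's OpenAI-compatible `/v1/chat/completions` endpoint.

```python
import os

import httpx
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Hypothetical setting: where the LlamaFile container is reachable from
# inside the Compose network. Adjust to match your docker-compose.yml.
LLAMAFILE_URL = os.getenv("LLAMAFILE_URL", "http://llamafile:8080")


class Prompt(BaseModel):
    text: str


@app.get("/health")
def health():
    # Liveness check used by the curl example above.
    return {"status": "ok"}


@app.post("/generate")
async def generate(prompt: Prompt):
    # Forward the prompt to LlamaFile and return its response verbatim.
    # Assumes LlamaFile exposes an OpenAI-compatible chat completions API.
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            f"{LLAMAFILE_URL}/v1/chat/completions",
            json={"messages": [{"role": "user", "content": prompt.text}]},
            timeout=60.0,
        )
        resp.raise_for_status()
        return resp.json()
```

You could then exercise the new route with, for example, `curl -X POST http://localhost:8000/generate -H "Content-Type: application/json" -d '{"text": "Hello"}'`.
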
Contributions are welcome! Please fork the repository and submit a pull request with your changes.

This project is licensed under the MIT License.