Extract structured data from local or remote LLMs
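A minimal sketch of one common approach to structured extraction: prompt an OpenAI-compatible chat endpoint for schema-shaped JSON and parse the reply. The endpoint URL, model name, and schema below are illustrative assumptions, not this repo's API.

```python
import json
import requests

# Assumed: any OpenAI-compatible server (local llama-cpp-python, Ollama,
# or a remote provider) exposing /v1/chat/completions on this URL.
ENDPOINT = "http://localhost:8000/v1/chat/completions"

SCHEMA_HINT = (
    "Reply with JSON only, matching this schema: "
    '{"name": string, "founded": integer, "tags": [string]}'
)

def extract(text: str) -> dict:
    """Ask the model to emit schema-shaped JSON, then parse it."""
    resp = requests.post(
        ENDPOINT,
        json={
            "model": "local-model",  # placeholder model name
            "messages": [
                {"role": "system", "content": SCHEMA_HINT},
                {"role": "user", "content": text},
            ],
            "temperature": 0,  # deterministic decoding helps yield valid JSON
        },
        timeout=60,
    )
    resp.raise_for_status()
    content = resp.json()["choices"][0]["message"]["content"]
    return json.loads(content)  # raises ValueError if the model strayed from JSON

if __name__ == "__main__":
    print(extract("SpaceX was founded in 2002 and builds rockets."))
```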
A Chrome extension for querying a local LLM using llama-cpp-python; includes a pip package for running the server ('pip install local-llama' to install)
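A minimal sketch of the underlying llama-cpp-python usage (the model path is a placeholder; the 'local-llama' package's own interface is not shown here):

```python
from llama_cpp import Llama

# Load a local GGUF model; the path is a placeholder.
llm = Llama(model_path="./models/model.gguf", n_ctx=2048)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize llama.cpp in one line."}]
)
print(reply["choices"][0]["message"]["content"])

# Alternatively, llama-cpp-python ships an OpenAI-compatible server that a
# browser extension can call over HTTP:
#   python -m llama_cpp.server --model ./models/model.gguf
```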
An entirely open-source, locally running version of Recall (originally revealed by Microsoft for Copilot+ PCs)
This Julia package addresses the membership problem for local polytopes: it constructs Bell inequalities and local models in multipartite Bell scenarios with binary outcomes.
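An illustrative Python sketch (not the package's Julia API) of the simplest instance of this membership question: the CHSH scenario, where enumerating all deterministic local strategies recovers the local bound of 2.

```python
from itertools import product

def chsh(a, b):
    """CHSH value for deterministic outcome assignments a[x], b[y] in {-1, +1}."""
    E = lambda x, y: a[x] * b[y]  # correlator of a product strategy
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# Enumerate all 16 local deterministic strategies (two inputs, binary outcomes).
# A correlation is local iff it is a convex mixture of these vertices.
best = max(
    chsh(a, b)
    for a in product((-1, 1), repeat=2)
    for b in product((-1, 1), repeat=2)
)
print(best)  # 2 -- the local bound; quantum correlations reach 2*sqrt(2)
```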
Main code chunks used for models in the publication "Exploring the Potential of Adaptive, Local Machine Learning (ML) in Comparison to the Prediction Performance of Global Models: A Case Study from Bayer's Caco-2 Permeability Database"
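An illustrative sketch of the local-vs-global contrast on synthetic data (the Caco-2 dataset and the publication's actual models are not reproduced here); a distance-weighted k-NN regressor stands in for an adaptive local model, a random forest for a global one.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor

# Synthetic stand-in for a permeability dataset.
X, y = make_regression(n_samples=500, n_features=20, noise=0.5, random_state=0)

models = {
    # Global: one model fit across the whole chemical space.
    "global": RandomForestRegressor(random_state=0),
    # Local: each prediction is driven by the nearest training compounds.
    "local": KNeighborsRegressor(n_neighbors=15, weights="distance"),
}

for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {r2:.3f}")
```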