An API to query local language models using different backends. Supported backends include the Koboldcpp provider used in the example below.

📚 API doc
## Install
```bash
npm install @locallm/api
```
## Usage

Example with the Koboldcpp provider:
```ts
import { Lm } from "@locallm/api";

// create a client bound to a local Koboldcpp server
const lm = new Lm({
  providerType: "koboldcpp",
  serverUrl: "http://localhost:5001",
  // stream each generated token to stdout as it arrives
  onToken: (t) => process.stdout.write(t),
});

// fill the model's instruction template with the user prompt
const template = "<s>[INST] {prompt} [/INST]";
const _prompt = template.replace("{prompt}", "list the planets in the solar system");

// run the inference query
await lm.infer(_prompt, {
  stream: true,
  temperature: 0.2,
  n_predict: 200,
});
```
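The template-filling step above can be factored into a small reusable helper. This is a sketch, not part of the @locallm/api package: `formatPrompt` is a hypothetical function name, and it only generalizes the `template.replace` call shown in the example.

```typescript
// Hypothetical helper: fill the {prompt} slot of an instruction template.
// Assumes the template contains exactly one "{prompt}" placeholder.
function formatPrompt(template: string, prompt: string): string {
  return template.replace("{prompt}", prompt);
}

const tpl = "<s>[INST] {prompt} [/INST]";
const filled = formatPrompt(tpl, "list the planets in the solar system");
console.log(filled);
// → <s>[INST] list the planets in the solar system [/INST]
```

The resulting string can be passed to `lm.infer` exactly as in the example above; keeping the template separate makes it easy to swap in a different model's prompt format.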
Check the examples directory for more usage examples.