Extra-small AI SDK for any OpenAI-compatible API.
I'm working on a tiny local-LLM translator, ARPK, which is currently (v0.2.4) 449KB. 26% of that is ollama-js (and another 13% is the pointless whatwg-fetch that ships with ollama-js!).
I wanted to make every byte count, so I started writing a lightweight library: xsAI.
It provides an interface similar to the Vercel AI SDK, is ESM-only, and has zero dependencies, keeping the installation size minimal.
```bash
# without pre-compiling (runtime dependency)
npm i xsai

# pre-compiling/bundling (dev dependency)
npm i -D xsai
```
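For a rough idea of what usage looks like, here is a minimal sketch. It assumes a `generateText` helper with Vercel-AI-SDK-style options (`baseURL`, `model`, `messages`) and a local Ollama endpoint; the exact option names may differ, so check the xsAI docs for the real signature.

```ts
import { generateText } from 'xsai'

// Sketch only: option names ({ baseURL, model, messages }) are assumed
// to mirror the Vercel AI SDK style; verify against the xsAI docs.
const { text } = await generateText({
  // any OpenAI-compatible endpoint, e.g. a local Ollama server
  baseURL: 'http://localhost:11434/v1/',
  model: 'llama3.2',
  messages: [
    { role: 'system', content: 'You are a helpful translator.' },
    { role: 'user', content: 'Translate "hello world" into Japanese.' },
  ],
})

console.log(text)
```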