Next.js, Vercel AI SDK, Ollama & ModelFusion starter

This starter example shows how to use Next.js, the Vercel AI SDK, Ollama, and ModelFusion to create a ChatGPT-like, AI-powered streaming chatbot.

Setup

  1. Install Ollama on your machine.
  2. Pull the model: ollama pull llama2:chat (any other Ollama chat model can be swapped in; see the note after this list)
  3. Clone the repository: git clone https://github.com/lgrammel/modelfusion-ollama-nextjs-starter.git
  4. Install dependencies: npm install
  5. Start the development server: npm run dev
  6. Go to http://localhost:3000/
  7. Code: app/api/chat/route.ts
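
To experiment with a different model, pull it first (ollama pull <name>) and change the model name passed to ModelFusion in app/api/chat/route.ts. The snippet below is only a sketch of that one change; "mistral" is used as an example model and is not something the starter ships with.

import { ollama } from "modelfusion";

// Any chat-capable Ollama model that has been pulled locally works here.
// "mistral" is just an example; replace it with the model you pulled.
const model = ollama
  .ChatTextGenerator({ model: "mistral" })
  .withChatPrompt();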

Example Route

import { ModelFusionTextStream, asChatMessages } from "@modelfusion/vercel-ai";
import { Message, StreamingTextResponse } from "ai";
import { ollama, streamText } from "modelfusion";

export const runtime = "edge";

export async function POST(req: Request) {
  const { messages }: { messages: Message[] } = await req.json();

  // Use ModelFusion to call Ollama:
  const textStream = await streamText({
    model: ollama.ChatTextGenerator({ model: "llama2:chat" }).withChatPrompt(),
    prompt: {
      system:
        "You are an AI chat bot. " +
        "Follow the user's instructions carefully.",

      // map Vercel AI SDK Message to ModelFusion ChatMessage:
      messages: asChatMessages(messages),
    },
  });

  // Return the result using the Vercel AI SDK:
  return new StreamingTextResponse(
    ModelFusionTextStream(
      textStream,
      // optional callbacks:
      {
        onStart() {
          console.log("onStart");
        },
        onToken(token) {
          console.log("onToken", token);
        },
        onCompletion: () => {
          console.log("onCompletion");
        },
        onFinal(completion) {
          console.log("onFinal", completion);
        },
      }
    )
  );
}
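
Example Client Page

The route above only covers the server side. On the client, the Vercel AI SDK's useChat hook can consume the streamed response. The following is a minimal sketch rather than the exact page shipped with this starter; it assumes an app/page.tsx client component and the hook's default /api/chat endpoint.

"use client";

import { useChat } from "ai/react";

export default function Chat() {
  // useChat posts the chat history to /api/chat and appends streamed tokens
  // to the assistant message as they arrive.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          {message.role === "user" ? "User: " : "AI: "}
          {message.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  );
}

Because the route runs on the edge runtime and returns a StreamingTextResponse, tokens show up in the UI as Ollama generates them rather than after the full completion.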