fixed lib path, now can use custom anthropic client #31

Merged (2 commits) on Feb 9, 2025
README.md — 2 changes: 2 additions & 0 deletions

@@ -36,6 +36,8 @@ export interface AiCoder {
}
```

*Note: The `createAiCoder` function now accepts an optional `anthropicClient` parameter that overrides the default Anthropic client. This lets you supply a custom client (for example, one configured with a different API key) when using AiCoder in environments such as the browser, where environment variables are unavailable.*

The AI Coder supports streaming of AI responses and notifying you when the virtual file system (VFS) is updated. To achieve this, you can pass two callback functions when creating an AiCoder instance:
- **onStreamedChunk**: A callback function that receives streamed chunks from the AI. This is useful for logging or updating a UI with gradual progress.
- **onVfsChanged**: A callback function that is invoked whenever the VFS is updated with new content. This is useful for refreshing a file view or triggering further processing.
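The override hook is plain parameter injection. A minimal self-contained sketch of the pattern — `FakeClient` is a stand-in for the real Anthropic client, and `createAiCoder` here is simplified to just the part this PR changes:

```typescript
interface FakeClient {
  label: string
}

// Stand-in for the module-level default client built from env configuration.
const defaultClient: FakeClient = { label: "env-configured" }

function createAiCoder(
  onStreamedChunk: (chunk: string) => void,
  onVfsChanged: () => void,
  anthropicClient?: FakeClient, // the new optional parameter
): { client: FakeClient } {
  // Fall back to the default when no client is supplied, mirroring
  // `anthropicClient || anthropic` in the PR.
  return { client: anthropicClient ?? defaultClient }
}

const viaDefault = createAiCoder(
  (chunk) => console.log("chunk:", chunk), // onStreamedChunk
  () => console.log("VFS updated"),        // onVfsChanged
)
console.log(viaDefault.client.label) // "env-configured"

const viaCustom = createAiCoder(
  () => {},
  () => {},
  // in real use: new Anthropic({ apiKey, dangerouslyAllowBrowser: true })
  { label: "browser-custom" },
)
console.log(viaCustom.client.label) // "browser-custom"
```

Because the parameter is optional, existing callers that pass only the two callbacks keep working unchanged.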
bun.lockb — binary file modified (not shown)
lib/ai/aiCoder.ts — 6 changes: 5 additions & 1 deletion

@@ -17,13 +17,16 @@ export class AiCoderImpl implements AiCoder {
onVfsChanged: () => void
vfs: { [filepath: string]: string } = {}
availableOptions = [{ name: "microController", options: ["pico", "esp32"] }]
anthropicClient: import("@anthropic-ai/sdk").Anthropic | undefined

constructor(
onStreamedChunk: (chunk: string) => void,
onVfsChanged: () => void,
anthropicClient?: import("@anthropic-ai/sdk").Anthropic,
) {
this.onStreamedChunk = onStreamedChunk
this.onVfsChanged = onVfsChanged
this.anthropicClient = anthropicClient
}

async submitPrompt(
@@ -62,6 +65,7 @@ export class AiCoderImpl implements AiCoder {
export const createAiCoder = (
onStreamedChunk: (chunk: string) => void,
onVfsChanged: () => void,
anthropicClient?: import("@anthropic-ai/sdk").Anthropic,
): AiCoder => {
-  return new AiCoderImpl(onStreamedChunk, onVfsChanged)
+  return new AiCoderImpl(onStreamedChunk, onVfsChanged, anthropicClient)
}
lib/ai/anthropic.ts — 15 changes: 12 additions & 3 deletions

@@ -1,7 +1,16 @@
 import Anthropic from "@anthropic-ai/sdk"
-import dotenv from "dotenv"
-dotenv.config()
+
+let apiKey = ""
+if (
+  typeof process !== "undefined" &&
+  process.env &&
+  process.env.ANTHROPIC_API_KEY
+) {
+  import("dotenv").then((dotenv) => dotenv.config())
+  apiKey = process.env.ANTHROPIC_API_KEY
+}
 
 export const anthropic = new Anthropic({
-  apiKey: process.env.ANTHROPIC_API_KEY,
+  apiKey,
   dangerouslyAllowBrowser: true,
 })
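The guard above can be sketched in isolation — a hypothetical `resolveApiKey` helper (the PR inlines this logic rather than defining a function) showing that the key is read only when a Node-style `process.env` is present, and stays empty in the browser:

```typescript
// Hypothetical helper mirroring the environment guard in lib/ai/anthropic.ts.
function resolveApiKey(env?: { ANTHROPIC_API_KEY?: string }): string {
  let apiKey = ""
  // Only read the key when an env object with ANTHROPIC_API_KEY exists (Node).
  if (typeof env !== "undefined" && env && env.ANTHROPIC_API_KEY) {
    apiKey = env.ANTHROPIC_API_KEY
  }
  return apiKey
}

console.log(resolveApiKey({ ANTHROPIC_API_KEY: "sk-test" })) // "sk-test"
console.log(resolveApiKey(undefined)) // "" — browser-like environment
```

In the browser case the default export is constructed with an empty key, which is why the injectable `anthropicClient` parameter added elsewhere in this PR matters there.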
lib/ai/ask-ai-with-previous-attempts.ts — 5 changes: 4 additions & 1 deletion

@@ -10,12 +10,15 @@ export const askAiWithPreviousAttempts = async ({
systemPrompt,
previousAttempts,
onStream,
anthropicClient,
}: {
prompt: string
systemPrompt: string
previousAttempts?: AttemptHistory[]
onStream?: (chunk: string) => void
anthropicClient?: typeof anthropic
}): Promise<string> => {
const client = anthropicClient || anthropic
const messages: { role: "assistant" | "user"; content: string }[] = [
{ role: "user", content: prompt },
]
@@ -50,7 +53,7 @@
onStream(
`Start streaming AI response, attempt: ${(previousAttempts?.length || 0) + 1}`,
)
-  const completionStream = await anthropic.messages.create({
+  const completionStream = await client.messages.create({
model: "claude-3-5-haiku-20241022",
max_tokens: 2048,
system: systemPrompt,
lib/ai/run-ai-with-error-correction.ts — 3 changes: 3 additions & 0 deletions

@@ -45,6 +45,7 @@ export const runAiWithErrorCorrection = async ({
onStream,
onVfsChanged,
vfs,
anthropicClient,
}: {
attempt?: number
logsDir?: string
@@ -56,6 +57,7 @@
onStream?: (chunk: string) => void
onVfsChanged?: () => void
vfs?: Record<string, string>
anthropicClient?: import("@anthropic-ai/sdk").Anthropic
}): Promise<{
code: string
codeBlock: string
@@ -66,6 +68,7 @@
systemPrompt,
previousAttempts,
onStream,
anthropicClient,
})
const codeMatch = aiResponse.match(/```tsx\s*([\s\S]*?)\s*```/)
const code = codeMatch ? codeMatch[1].trim() : ""
tsconfig.json — 3 changes: 2 additions & 1 deletion

@@ -18,7 +18,8 @@
"paths": {
"src/*": ["src/*"],
"tests/*": ["tests/*"],
-    "prompt-templates/*": ["lib/prompt-templates/*"]
+    "prompt-templates/*": ["lib/prompt-templates/*"],
+    "lib/*": ["lib/*"]
},

// Best practices