feat(node): Add @vercel/ai instrumentation #13892
Conversation
Looks good to me :)
* Adds Sentry tracing instrumentation for the [ai](https://www.npmjs.com/package/ai) library.
*
* For more information, see the [`ai` documentation](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry).
*
* @example
* ```javascript
Nice JS doc 👍
Just one concern, otherwise LGTM!
attributes['ai.usage.completionTokens'] != undefined &&
attributes['ai.usage.promptTokens'] != undefined
) {
span.data['ai.tokens.used'] = attributes['ai.usage.completionTokens'] + attributes['ai.usage.promptTokens']; |
looking at our LLM monitoring attribute spec, should this rather be:
- span.data['ai.tokens.used'] = attributes['ai.usage.completionTokens'] + attributes['ai.usage.promptTokens'];
+ span.data['ai.total_tokens.used'] = attributes['ai.usage.completionTokens'] + attributes['ai.usage.promptTokens'];
? Not sure though, maybe there's another spec somewhere for this attribute?
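For readers without the full diff in view, here is a self-contained sketch of the token accounting under discussion. The `addTotalTokens` helper name is hypothetical; the attribute keys are the ones the `ai` SDK emits, as in the hunk above.

```javascript
// Hypothetical helper mirroring the hunk above: sum prompt and completion
// token counts from the `ai` SDK's telemetry attributes onto the Sentry span.
function addTotalTokens(span, attributes) {
  const completionTokens = attributes['ai.usage.completionTokens'];
  const promptTokens = attributes['ai.usage.promptTokens'];
  if (completionTokens != undefined && promptTokens != undefined) {
    // Whether the combined attribute should be 'ai.tokens.used' or
    // 'ai.total_tokens.used' is the open question in this review comment.
    span.data['ai.tokens.used'] = completionTokens + promptTokens;
  }
}
```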
Adds Sentry tracing instrumentation for the `ai` library.

For more information, see the [`ai` documentation](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry).

By default this integration adds tracing support to all `ai` callsites. If you need to disable collecting spans for a specific call, you can do so by setting `experimental_telemetry.isEnabled` to `false` in the first argument of the function call.

If you want to collect inputs and outputs for a specific call, you must specifically opt in to each function call by setting `experimental_telemetry.recordInputs` and `experimental_telemetry.recordOutputs` to `true`.
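As a rough usage sketch of these per-call options (the OpenAI provider import, model id, and prompts are illustrative and not part of this PR; `experimental_telemetry` is the `ai` SDK's documented per-call telemetry setting):

```javascript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Spans for `ai` calls are collected by default. Disable collection for a
// single call via experimental_telemetry.isEnabled:
const untraced = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Summarize the release notes.',
  experimental_telemetry: { isEnabled: false },
});

// Inputs and outputs are not recorded unless you opt in per call:
const traced = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Summarize the release notes.',
  experimental_telemetry: { recordInputs: true, recordOutputs: true },
});
```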
resolves #13679