Summary
The `@google/genai` SDK's multi-turn chat API (`ai.chats.create()` → `chat.sendMessage()` / `chat.sendMessageStream()`) bypasses the `wrapGoogleGenAI` proxy entirely. Calls through the chats interface produce zero Braintrust spans, even though the underlying `models.generateContent()` method IS instrumented.
This is a proxy architecture bug, not a missing-channel issue — the generation calls never pass through the instrumented code path at all.
Root cause
The `GoogleGenAI` constructor stores a direct reference to `this.models` in the `Chats` object at construction time:
```typescript
// Inside GoogleGenAI constructor (upstream SDK):
this.models = new Models(this.apiClient);
this.chats = new Chats(this.models, this.apiClient); // stores original models reference
```
The Braintrust wrapper applies its proxy after construction completes:
```typescript
// js/src/wrappers/google-genai.ts
function wrapGoogleGenAIClass(OriginalGoogleGenAI) {
  return new Proxy(OriginalGoogleGenAI, {
    construct(target, args) {
      const instance = Reflect.construct(target, args); // constructor runs first
      return wrapGoogleGenAIInstance(instance);         // proxy applied second
    },
  });
}
```
`wrapGoogleGenAIInstance` only intercepts `models` property access (lines 74–78), but the `Chats` object already holds a reference to the original, unproxied `Models` instance. When `chat.sendMessage()` internally calls `this.modelsModule.generateContent()`, it uses that stored reference — completely bypassing the proxy.
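The capture problem can be reproduced without the SDK. The sketch below uses hypothetical `Models`/`Chats`/`Client` stand-ins (not the real upstream classes) to show that a `Proxy` applied after construction never affects a reference that was stored during construction:

```typescript
class Models {
  generateContent(): string {
    return "original";
  }
}

class Chats {
  // Captures whatever `models` reference it is handed at construction time.
  constructor(private modelsModule: Models) {}
  sendMessage(): string {
    return this.modelsModule.generateContent();
  }
}

class Client {
  models = new Models();
  chats = new Chats(this.models); // stores the original, unproxied reference
}

// Wrap the instance after construction, analogous to wrapGoogleGenAIInstance.
function wrapInstance(client: Client): Client {
  return new Proxy(client, {
    get(target, prop, receiver) {
      if (prop === "models") {
        // Stand-in for the instrumented Models proxy.
        return { generateContent: () => "traced" };
      }
      return Reflect.get(target, prop, receiver);
    },
  });
}

const wrapped = wrapInstance(new Client());
console.log(wrapped.models.generateContent()); // "traced"   — direct path hits the proxy
console.log(wrapped.chats.sendMessage());      // "original" — chats path bypasses it
```

The `construct` trap cannot help here either: `Reflect.construct` runs the original constructor to completion, so `Chats` has already captured the raw `Models` reference before the proxy exists.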
What instrumentation is missing
All calls through the chats API are untraced:
| SDK Method | Description | Traced? |
| --- | --- | --- |
| `ai.models.generateContent()` | Direct model call | Yes |
| `ai.models.generateContentStream()` | Direct streaming call | Yes |
| `ai.models.embedContent()` | Direct embedding call | Yes |
| `chat.sendMessage()` | Multi-turn chat message | No |
| `chat.sendMessageStream()` | Multi-turn chat streaming | No |
The chats API is the recommended way to do multi-turn conversations in the `@google/genai` SDK. Users following official Google documentation would get zero observability.
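Because every chats call funnels into the same `Models` instance, one way the gap could be closed is to instrument that instance in place rather than proxying the client. This is only a sketch with hypothetical stand-in classes, not the Braintrust implementation:

```typescript
// Hypothetical stand-ins for the upstream classes (not the real SDK).
class Models {
  generateContent(): string {
    return "generated";
  }
}

class Chats {
  constructor(private modelsModule: Models) {} // captures the reference
  sendMessage(): string {
    return this.modelsModule.generateContent();
  }
}

class Client {
  models = new Models();
  chats = new Chats(this.models); // same object the patch below mutates
}

// Patch methods on the shared Models instance in place; every captured
// reference, including the one held inside Chats, sees the instrumentation.
function instrumentInPlace(client: Client): Client {
  const original = client.models.generateContent.bind(client.models);
  client.models.generateContent = () => {
    // A real wrapper would open a Braintrust span around this call.
    return "traced:" + original();
  };
  return client;
}

const client = instrumentInPlace(new Client());
console.log(client.chats.sendMessage()); // flows through the patched method
```

In-place mutation trades the proxy's non-invasiveness for coverage: any reference captured before wrapping still points at the patched object, so the chats path is traced too.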
Braintrust docs status
Unclear — the Braintrust Gemini integration page at https://www.braintrust.dev/docs/integrations/ai-providers/gemini documents `wrapGoogleGenAI` as providing automatic tracing. The chats API (`sendMessage` / `sendMessageStream`) is not explicitly mentioned as supported or unsupported.
Upstream reference
- Google GenAI JS SDK chats API: https://ai.google.dev/gemini-api/docs/text-generation#multi-turn-conversations
- SDK source (`chats.ts`): `sendMessage()` delegates to `this.modelsModule.generateContent()` — the stored reference, not a dynamic property access
- SDK source (`client.ts`): `this.chats = new Chats(this.models, ...)` — models reference captured during construction
- `@google/genai` npm package — this is a stable, documented API surface
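The stored-reference vs. dynamic-lookup distinction is exactly what decides whether the proxy is hit. A small sketch with hypothetical names makes the difference concrete:

```typescript
class Models {
  generateContent(): string {
    return "original";
  }
}

class Client {
  models = new Models();
}

class StoredRefChats {
  constructor(private modelsModule: Models) {}
  send(): string {
    return this.modelsModule.generateContent(); // frozen at construction
  }
}

class DynamicChats {
  constructor(private client: Client) {}
  send(): string {
    return this.client.models.generateContent(); // fresh lookup on every call
  }
}

const client = new Client();
const stored = new StoredRefChats(client.models);
const dynamic = new DynamicChats(client);

// Simulate instrumentation applied after construction by swapping `models`.
client.models = { generateContent: () => "traced" };

console.log(stored.send());  // "original" — still the captured instance
console.log(dynamic.send()); // "traced"   — resolves through the client
```

If upstream `Chats` resolved `models` through the client on each call, the post-construction wrapper would be sufficient; because it stores the reference, the wrapper must reach the `Models` object itself.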
Local files inspected
- `js/src/wrappers/google-genai.ts` — `wrapGoogleGenAIClass` (lines 62–67) applies the proxy post-construction; `wrapGoogleGenAIInstance` (lines 70–80) only intercepts the `models` property
- `js/src/vendor-sdk-types/google-genai.ts` — `GoogleGenAIClient` only declares `models` (lines 17–19), no `chats` property
- `js/src/instrumentation/plugins/google-genai-channels.ts` — channels exist for `generateContent` and `generateContentStream` but are never reached via the chats path
- `e2e/scenarios/google-genai-instrumentation/scenario.impl.mjs` — no test cases using the chats API