Integration guides
Next.js on Vercel
Complete integration for Next.js App Router apps using the Vercel AI SDK.
This guide takes a Next.js 13+ App Router app using the Vercel AI SDK and adds end-to-end tracing: server-side AI calls, browser identity, streaming response flushing, and user/chat context.
Follow it top to bottom. Everything here is load-bearing for correctness; do not skip steps.
Install
```bash
npm install @runotis/sdk
```

No peer dependencies. Everything is bundled.
1. Environment variables
Set these in .env.local and in your Vercel project settings:
```bash
OTIS_API_KEY=sk-otis-xxx
NEXT_PUBLIC_OTIS_API_KEY=sk-otis-xxx
```

- `OTIS_API_KEY` — server-side auth. Also used to derive the HMAC salt for identifier hashing (enabled by default), so raw identifiers (emails, internal user IDs) never reach analytics storage.
- `NEXT_PUBLIC_OTIS_API_KEY` — browser-side. Safe to expose; it's scoped to ingest only.
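To make the hashing guarantee concrete, here is a minimal sketch of HMAC-based identifier hashing. This is illustrative only — the SDK does this internally, and its exact salt derivation is not shown here; `hashIdentifier` is a hypothetical helper:

```typescript
import { createHmac } from "node:crypto";

// Illustrative sketch: an HMAC keyed on the server API key maps each raw
// identifier to a stable, non-reversible hash before it leaves the server.
function hashIdentifier(apiKey: string, rawId: string): string {
  return createHmac("sha256", apiKey).update(rawId).digest("hex");
}

const hashed = hashIdentifier("sk-otis-xxx", "user@example.com");
// Same key + same identifier always yields the same hash, so analytics can
// still correlate a user across events — without ever storing the raw email.
```

Because the hash is keyed, two deployments with different API keys produce different hashes for the same identifier, which is what keeps raw values out of analytics storage.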
2. Server instrumentation
Create instrumentation.ts at the root of your project (Next.js auto-loads this on server startup):
```ts
import { createOtisInstrumentation } from "@runotis/sdk/next/server";

const { register } = createOtisInstrumentation({
  apiKey: process.env.OTIS_API_KEY!,
  serviceName: "my-app",
  serverless: true,
});

export { register };
```

Required on Vercel

`serverless: true` configures the exporter to flush every span immediately instead of batching. Without it, spans are lost when the function freezes.
3. Root layout — browser provider
```tsx
import { OtisProvider, OtisPageView } from "@runotis/sdk/next";

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>
        <OtisProvider
          config={{
            apiKey: process.env.NEXT_PUBLIC_OTIS_API_KEY!,
            serviceName: "my-app",
            browser: {
              autoAnonymousUserId: true,
              autoSessionId: true,
            },
          }}
        >
          <OtisPageView />
          {children}
        </OtisProvider>
      </body>
    </html>
  );
}
```

`OtisPageView` auto-tracks route changes as page view events. Remove it if you don't want pageviews.
GDPR / EU users
EU users need consent
If your app serves EU users, cookies are not written until consent is given. Add a consent banner and call `otis.consentGiven()` when the user accepts. See Browser & consent for CMP adapters (Usercentrics, OneTrust, Cookiebot) and the full consent API.

If your app is internal or not subject to GDPR, add `consent: { mode: "granted" }` to the browser config to skip the consent gate.
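Conceptually, a consent gate works like the sketch below — events are held back until consent arrives, then released. This is a toy model of the pattern, not the SDK's implementation (the SDK's gate also covers cookie writes):

```typescript
type AnalyticsEvent = { name: string };

// Toy consent gate: nothing is sent (and nothing persistent is written)
// until consentGiven() is called; buffered events are then released.
class ConsentGate {
  private granted = false;
  private queue: AnalyticsEvent[] = [];
  readonly sent: AnalyticsEvent[] = [];

  track(event: AnalyticsEvent) {
    if (this.granted) this.sent.push(event);
    else this.queue.push(event);        // held until consent
  }

  consentGiven() {
    this.granted = true;
    this.sent.push(...this.queue.splice(0));
  }
}

const gate = new ConsentGate();
gate.track({ name: "page_view" });      // buffered — consent not yet given
gate.consentGiven();                     // buffered events are released
gate.track({ name: "click" });           // now sent immediately
```

The `mode: "granted"` setting corresponds to constructing the gate already open, which is why it is only appropriate for apps outside GDPR's scope.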
4. next.config — proxy rewrites
```ts
import { withOtisConfig } from "@runotis/sdk/next";

export default withOtisConfig({
  // your Next.js config
});
```

This adds rewrites so the browser SDK sends spans through your Next.js server instead of directly to ingest, which avoids CORS issues and bypasses most ad blockers.
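For intuition, the added rewrite is roughly of the following shape. Both the source path and the destination host here are hypothetical placeholders — the real values are internal to `withOtisConfig`:

```typescript
// Illustrative equivalent of what a first-party proxy rewrite looks like.
// "/_otis/:path*" and "ingest.otis.example" are made-up names for this sketch.
const nextConfig = {
  async rewrites() {
    return [
      {
        source: "/_otis/:path*",                            // same-origin path the browser SDK posts to
        destination: "https://ingest.otis.example/:path*",  // forwarded server-side to ingest
      },
    ];
  },
};
```

Because the browser only ever talks to your own origin, there is no cross-origin preflight, and blocklist-based ad blockers never see the third-party ingest hostname.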
5. Chat route handler
This is the critical path. Every piece below matters.
```ts
import { getServerOtis } from "@runotis/sdk/next/server";
import { contextFromChatRequest } from "@runotis/sdk";
import * as ai from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { waitUntil } from "@vercel/functions";
import { auth } from "@clerk/nextjs/server";

export async function POST(req: Request) {
  const otis = getServerOtis()!;
  const body = await req.json();

  // 1. Auth — pull userId + sessionId from your auth provider
  const { userId, sessionId } = await auth();

  // 2. Derive stable chatId + eventId from the request body
  //    (handles useChat tool roundtrips correctly)
  const ctx = contextFromChatRequest(body, req, {
    userId: userId ?? undefined,
    sessionId: sessionId ?? undefined,
  });

  return otis.withContext(ctx, async () => {
    const { streamText: tracedStreamText } = otis.wrap(ai);

    const result = await tracedStreamText({
      model: anthropic("claude-sonnet-4-6"),
      messages: body.messages,
    });

    // 3. Keep the function alive until spans flush
    waitUntil(otis.flush());

    return result.toTextStreamResponse();
  });
}
```

Why each piece matters
`contextFromChatRequest(body, req, ...)`. When the Vercel AI SDK's `useChat` hook drives a chat with tools, a single user turn can produce multiple POST requests (one per tool roundtrip). Without a stable chat ID, each POST creates a separate trace. `contextFromChatRequest` derives a stable `chatId` and `eventId` from the request body so all roundtrips extend the same trace. It also reads the `__otis_uid` and `__otis_session` cookies from the request, so browser-set identity flows to the server automatically.
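The stable-ID idea can be sketched as follows. This is a conceptual model only — the SDK's actual derivation is internal, and the `id`/message-id fields are assumptions modeled on `useChat`'s request shape:

```typescript
import { createHash } from "node:crypto";

// Conceptual sketch: derive a chat ID from fields that stay constant across
// the tool roundtrips of one user turn, so every POST maps to the same trace.
function deriveChatId(body: { id?: string; messages: { id?: string }[] }): string {
  const seed = body.id ?? body.messages[0]?.id ?? "anonymous";
  return createHash("sha256").update(seed).digest("hex").slice(0, 16);
}

// Two roundtrips of the same turn share the body `id`, even though the
// messages array has grown — so they derive the same chat ID.
const firstRoundtrip = deriveChatId({ id: "chat_42", messages: [{ id: "m1" }] });
const secondRoundtrip = deriveChatId({ id: "chat_42", messages: [{ id: "m1" }, { id: "m2" }] });
```

Hashing the seed (rather than using it raw) keeps whatever the client sent out of span attributes while preserving stability.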
`otis.withContext(ctx, ...)`. Sets `userId`, `sessionId`, `chatId`, and `eventId` on every span created inside the callback (AI spans, custom traced functions, and event helpers). Without it, spans are anonymous.

`otis.wrap(ai)`. Wraps the AI SDK to intercept model calls, tool execution, and streaming. Produces a parent span per AI call with child spans for each model invocation and tool.
`waitUntil(otis.flush())`. Keeps the function alive until the flush promise resolves. On Vercel, the function returns the streaming response before the stream completes; without `waitUntil`, the function can freeze mid-flush and spans are lost.
If `waitUntil` isn't available, await the flush in the `onFinish` hook instead:

```ts
const result = await tracedStreamText({
  model: anthropic("claude-sonnet-4-6"),
  messages: body.messages,
  // Awaiting the flush here keeps it inside the request lifecycle,
  // so no waitUntil is needed.
  onFinish: async () => { await otis.flush(); },
});
```

6. Client components — identify + feedback
"use client";
import { useOtis } from "@runotis/sdk/next";
export function FeedbackButton({ messageId }: { messageId: string }) {
const { sendFeedbackSignal, identifyUser } = useOtis();
return (
<button onClick={() => sendFeedbackSignal(messageId, "thumbs_up")}>
Helpful
</button>
);
}Call identifyUser after sign-in to link the anonymous browser session to the authenticated user:
"use client";
import { useEffect } from "react";
import { useOtis } from "@runotis/sdk/next";
import { useUser } from "@clerk/nextjs";
export function OtisIdentify() {
const { identifyUser } = useOtis();
const { user } = useUser();
useEffect(() => {
if (user) identifyUser(user.id, { email: user.primaryEmailAddress?.emailAddress });
}, [user, identifyUser]);
return null;
}Mount <OtisIdentify /> somewhere inside <OtisProvider>.
Integration checklist
Before shipping, verify:
- `OTIS_API_KEY` and `NEXT_PUBLIC_OTIS_API_KEY` are set in Vercel
- `instrumentation.ts` passes `serverless: true`
- Chat route uses `contextFromChatRequest`, `withContext`, `wrap(ai)`, and `waitUntil(otis.flush())`
- Auth provider's `userId` + `sessionId` flow into `contextFromChatRequest`
- `OtisProvider` wraps the app in `app/layout.tsx`
- Consent banner is wired (EU users) OR `consent: { mode: "granted" }` is set
- A test AI call produces a trace with `user.id`, `chat.id`, and token usage attributes
Next steps
- Tracing — tracing your own functions, supported AI frameworks beyond the Vercel AI SDK
- Events and exceptions — `sendEvent`, `sendException`
- Identity — auth provider pass-through patterns
- Feedback signals — record user feedback on AI responses
- Browser & consent — GDPR consent flow and CMP adapters
- Privacy — PII redaction and identifier hashing
- Customization — `beforeSend`, debug logging, configuration