A simple Next.js App Router scaffold for a chat app built with the Vercel AI SDK and the direct OpenAI provider.
## What's included

- A streaming chat interface powered by `useChat`
- A `ToolLoopAgent` with three mocked server-side tools
- Typed UI messages via `InferAgentUIMessage`
- An API route that streams agent responses with `createAgentUIStreamResponse`
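A condensed sketch of how these pieces fit together on the server. This is hedged, not the scaffold's actual code: exact import paths and option names depend on the AI SDK version the scaffold pins, and the single `weather` tool here is illustrative (the scaffold defines three mocks of its own):

```typescript
// app/api/chat/route.ts (sketch): wires the agent to a streaming route.
// Assumes ToolLoopAgent, createAgentUIStreamResponse, InferAgentUIMessage,
// and tool are exported by the installed "ai" package.
import { openai } from "@ai-sdk/openai";
import {
  ToolLoopAgent,
  createAgentUIStreamResponse,
  tool,
  type InferAgentUIMessage,
} from "ai";
import { z } from "zod";

// One illustrative mocked tool; the real scaffold ships three.
const weather = tool({
  description: "Mocked weather lookup",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ city }) => ({ city, tempF: 72 }),
});

const agent = new ToolLoopAgent({
  model: openai(process.env.OPENAI_MODEL ?? "gpt-5"),
  tools: { weather },
});

// Typed UI messages shared with the useChat client.
export type AgentUIMessage = InferAgentUIMessage<typeof agent>;

export async function POST(req: Request) {
  const { messages } = await req.json();
  return createAgentUIStreamResponse({ agent, messages });
}
```

On the client, `useChat` consumes this route's stream and can be parameterized with `AgentUIMessage` so tool parts stay fully typed.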
## Setup

1. Install dependencies:

   ```bash
   npm install
   ```

2. Create a local env file from the example:

   ```bash
   cp .env.example .env.local
   ```

3. Add your OpenAI API key to `.env.local`:

   ```bash
   OPENAI_API_KEY=your_key_here
   ```

4. Optional: add PromptLayer tracing credentials if you want OpenTelemetry traces exported to PromptLayer:

   ```bash
   PROMPTLAYER_API_KEY=your_promptlayer_key_here
   OTEL_SERVICE_NAME=vercel-chat-app
   ```

5. Start the app:

   ```bash
   npm run dev
   ```

Open http://localhost:3000.
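Putting the steps above together, a complete `.env.local` with both the required key and the optional values might look like this (all values are placeholders):

```bash
OPENAI_API_KEY=your_key_here
# Optional: enable PromptLayer trace export
PROMPTLAYER_API_KEY=your_promptlayer_key_here
OTEL_SERVICE_NAME=vercel-chat-app
# Optional: override the default model
OPENAI_MODEL=your_preferred_model
```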
## Model configuration

The scaffold uses the direct OpenAI provider from `@ai-sdk/openai` and defaults to `gpt-5`, matching the current example in the AI SDK OpenAI provider docs. You can override the model with:

```bash
OPENAI_MODEL=your_preferred_model
```

## PromptLayer tracing

This app can export Vercel AI SDK traces to PromptLayer using OpenTelemetry.
- `PROMPTLAYER_API_KEY` enables OTLP trace export to PromptLayer
- `OTEL_SERVICE_NAME` optionally overrides the service name shown in traces
If `PROMPTLAYER_API_KEY` is not set, the app still runs normally and skips PromptLayer export.
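That conditional behavior can be sketched as a small guard. The helper name and shape below are hypothetical; the scaffold's actual OpenTelemetry wiring may differ:

```typescript
// Hypothetical helper: decide whether to register the PromptLayer exporter.
// Mirrors the rule above: no PROMPTLAYER_API_KEY, no trace export.
interface TracingConfig {
  enabled: boolean;
  serviceName: string;
}

function resolveTracing(env: Record<string, string | undefined>): TracingConfig {
  return {
    enabled: Boolean(env.PROMPTLAYER_API_KEY),
    // OTEL_SERVICE_NAME optionally overrides the default service name.
    serviceName: env.OTEL_SERVICE_NAME ?? "vercel-chat-app",
  };
}

// Usage: only set up the OTLP exporter when tracing is enabled.
const tracing = resolveTracing(process.env);
if (tracing.enabled) {
  // register the PromptLayer OTLP exporter under tracing.serviceName
}
```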
## Scripts

- `npm run dev`
- `npm run lint`
- `npm run typecheck`