MagnivOrg/vercel-chat-app

Relay Chat Demo

A simple Next.js App Router scaffold for a chat app built with the Vercel AI SDK and the direct OpenAI provider.

What is included

  • A streaming chat interface powered by useChat
  • A ToolLoopAgent with three mocked server-side tools
  • Typed UI messages via InferAgentUIMessage
  • An API route that streams agent responses with createAgentUIStreamResponse
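For orientation, here is an untested, TypeScript-flavored sketch of how these pieces could fit together. The names ToolLoopAgent, createAgentUIStreamResponse, and InferAgentUIMessage come from the list above; the mocked tool, the option shapes, and the import paths are assumptions and have not been verified against the SDK:

```typescript
// Sketch only — option shapes and import paths are assumptions.
import { ToolLoopAgent, createAgentUIStreamResponse, tool, type InferAgentUIMessage } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// One of the three mocked server-side tools (hypothetical example).
const weatherTool = tool({
  description: "Mocked weather lookup",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ city }) => ({ city, tempF: 72 }), // mocked, no real API call
});

const agent = new ToolLoopAgent({
  model: openai(process.env.OPENAI_MODEL ?? "gpt-5"),
  tools: { weather: weatherTool },
});

// Typed UI messages for the useChat client.
export type AgentUIMessage = InferAgentUIMessage<typeof agent>;

// app/api/chat/route.ts — stream agent responses to the client.
export async function POST(req: Request) {
  const { messages } = await req.json();
  return createAgentUIStreamResponse({ agent, messages });
}
```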

Setup

  1. Install dependencies:
     npm install
  2. Create a local env file from the example:
     cp .env.example .env.local
  3. Add your OpenAI API key to .env.local:
     OPENAI_API_KEY=your_key_here
  4. Optional: add PromptLayer tracing credentials if you want OpenTelemetry traces exported to PromptLayer:
     PROMPTLAYER_API_KEY=your_promptlayer_key_here
     OTEL_SERVICE_NAME=vercel-chat-app
  5. Start the app:
     npm run dev

Open http://localhost:3000.

Model configuration

The scaffold uses the direct OpenAI provider from @ai-sdk/openai and defaults to gpt-5, matching the current example on the AI SDK OpenAI provider docs. You can override that with:

OPENAI_MODEL=your_preferred_model
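The fallback behavior can be illustrated with a small helper. resolveModelId is a hypothetical name, not part of the scaffold; it just shows the precedence: OPENAI_MODEL wins when set, otherwise the default of gpt-5 applies.

```typescript
// Hypothetical helper: resolve the model id, preferring the env override.
function resolveModelId(env: Record<string, string | undefined>): string {
  return env.OPENAI_MODEL ?? "gpt-5";
}

console.log(resolveModelId({}));                              // "gpt-5"
console.log(resolveModelId({ OPENAI_MODEL: "gpt-4o-mini" })); // "gpt-4o-mini"
```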

PromptLayer tracing

This app can export Vercel AI SDK traces to PromptLayer using OpenTelemetry.

  • PROMPTLAYER_API_KEY enables OTLP trace export to PromptLayer
  • OTEL_SERVICE_NAME optionally overrides the service name shown in traces

If PROMPTLAYER_API_KEY is not set, the app still runs normally and skips PromptLayer export.
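A minimal sketch of that gating logic, assuming the app simply checks for the key at startup (tracingConfig and its shape are illustrative names, not the scaffold's actual code):

```typescript
// Hypothetical sketch: export traces only when a PromptLayer key is present,
// and fall back to a default OTel service name otherwise.
interface TracingConfig {
  enabled: boolean;
  serviceName: string;
}

function tracingConfig(env: Record<string, string | undefined>): TracingConfig {
  return {
    enabled: Boolean(env.PROMPTLAYER_API_KEY),
    serviceName: env.OTEL_SERVICE_NAME ?? "vercel-chat-app",
  };
}

console.log(tracingConfig({}));
// → { enabled: false, serviceName: "vercel-chat-app" }
console.log(tracingConfig({ PROMPTLAYER_API_KEY: "pl_key", OTEL_SERVICE_NAME: "my-app" }));
// → { enabled: true, serviceName: "my-app" }
```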

Useful scripts

  • npm run dev — start the development server
  • npm run lint — lint the codebase
  • npm run typecheck — type-check the codebase
