
Quick Start Guide

This tutorial will walk you through creating a complete chat application with Fractal integration, deployed on Vercel.

Prerequisites

  • Node.js 18+ installed
  • A Fractal consumer API key (get yours at registry.fractalmcp.com)
  • An OpenAI API key
  • Vercel account for deployment

Project Setup

Create a new project directory and initialize it:

mkdir my-fractal-chat
cd my-fractal-chat
npm init -y

Installation

Install the required dependencies:

# Core Fractal dependencies
npm install @fractal-mcp/client @fractal-mcp/render @fractal-mcp/vercel-connector

# AI SDK dependencies
npm install @ai-sdk/openai @ai-sdk/react ai

# React and server dependencies
npm install react react-dom express cors

# Dev dependencies
npm install -D @vitejs/plugin-react typescript vite tailwindcss

Project Structure

Create the following project structure:

my-fractal-chat/
├── package.json
├── index.html
├── vite.config.ts
├── tailwind.config.js
├── server/
│   └── index.ts
└── src/
    ├── main.tsx
    ├── Chat.tsx
    └── index.css
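
If you have not used Vite before: index.html, tailwind.config.js, src/main.tsx, and src/index.css are standard Vite + React + Tailwind boilerplate rather than anything Fractal-specific. As one example, a minimal vite.config.ts might look like the sketch below; the /api proxy target assumes the Express server shown later in this guide listens on port 3001.

vite.config.ts:

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()],
  server: {
    // Forward /api/* requests to the Express backend during development
    proxy: {
      '/api': 'http://localhost:3001',
    },
  },
});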

Chat Component

src/Chat.tsx:

import { useChat } from '@ai-sdk/react';
import { FractalFrameEvent, renderLayoutAsComponent } from '@fractal-mcp/render';
import { useCallback } from 'react';

export default function Chat() {
  // Set up chat functionality with Vercel AI SDK
  const { messages, input, handleInputChange, handleSubmit, append } = useChat({
    api: '/api/chat'
  });

  // Handle events from interactive Fractal components
  const handleFrameEvent = useCallback((event: FractalFrameEvent) => {
    // Send component events back to the server as data messages
    append({
      role: 'data',
      content: JSON.stringify(event),
    });
  }, [append]);

  return (
    <div className="max-w-2xl mx-auto p-4">
      <h1 className="text-2xl font-bold mb-6">My Fractal Chat</h1>

      {/* Display chat messages */}
      <div className="space-y-4 mb-6">
        {messages
          .filter(m => !["system", "data"].includes(m.role))
          .map(message => (
            <div
              key={message.id}
              className={`p-3 rounded-lg ${
                message.role === 'user' ? 'bg-blue-100 ml-12' : 'bg-gray-100 mr-12'
              }`}
            >
              <div className="font-medium text-sm mb-2">
                {message.role === 'user' ? 'You' : 'Assistant'}
              </div>

              {/* Render message parts */}
              {message.parts.map((part, i) => {
                // Regular text
                if (part.type === 'text') {
                  return <div key={i}>{part.text}</div>;
                }

                // Tool invocations (this is where Fractal magic happens!)
                if (part.type === 'tool-invocation') {
                  const toolInvocation = (part as any).toolInvocation;

                  // Render Fractal components
                  if (toolInvocation.toolName === 'renderLayout') {
                    return (
                      <div key={i} className="mt-3">
                        {renderLayoutAsComponent(toolInvocation, handleFrameEvent)}
                      </div>
                    );
                  }

                  // Show other tools being called
                  return (
                    <div key={i} className="mt-2 text-sm text-gray-600">
                      🔧 Using tool: {toolInvocation.toolName}
                    </div>
                  );
                }

                return null;
              })}
            </div>
          ))}
      </div>

      {/* Input form */}
      <form onSubmit={handleSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Ask me anything..."
          className="flex-1 p-3 border rounded-lg focus:outline-none focus:ring-2 focus:ring-blue-500"
        />
        <button
          type="submit"
          disabled={!input.trim()}
          className="px-6 py-3 bg-blue-500 text-white rounded-lg hover:bg-blue-600 disabled:opacity-50"
        >
          Send
        </button>
      </form>
    </div>
  );
}
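
The entry point that mounts this component is ordinary React boilerplate. A minimal sketch of src/main.tsx is shown below; it assumes src/index.css contains your Tailwind directives (for example @tailwind base;, @tailwind components;, and @tailwind utilities; on Tailwind v3).

src/main.tsx:

import React from 'react';
import ReactDOM from 'react-dom/client';
import Chat from './Chat';
import './index.css';

// Mount the chat app into the root div from index.html
ReactDOM.createRoot(document.getElementById('root')!).render(
  <React.StrictMode>
    <Chat />
  </React.StrictMode>
);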

Server Implementation

server/index.ts:

import express from 'express';
import cors from 'cors';
import { streamText, convertToCoreMessages } from 'ai';
import { openai } from '@ai-sdk/openai';
import { FractalSDK } from '@fractal-mcp/client';
import { FractalVercel, cleanMessages } from '@fractal-mcp/vercel-connector';

const app = express();
app.use(cors());
app.use(express.json());

// Simple system message for the AI
const systemMessage = `You are a helpful assistant.`;

let fractalVercel: FractalVercel | null = null;

async function setupFractal() {
  if (!fractalVercel) {
    const client = new FractalSDK({ apiKey: process.env.FRACTAL_CONSUMER_KEY! });
    await client.connect();
    fractalVercel = new FractalVercel(client);
  }
  return fractalVercel;
}

app.post('/api/chat', async (req, res) => {
  const { messages } = req.body;

  // Prevent enormous React components from clogging the model's context
  const cleanedMessages = cleanMessages(messages, ["renderLayout"]);

  const fractal = await setupFractal();

  // Handle component events
  const wasHandled = await fractal.handleDataMessage(cleanedMessages, res);

  if (!wasHandled) {
    // Get available tools and stream response
    const tools = await fractal.getTools();

    const result = streamText({
      model: openai('gpt-4o'),
      system: systemMessage,
      messages: convertToCoreMessages(cleanedMessages),
      tools,
    });

    result.pipeDataStreamToResponse(res);
  }
});

app.listen(3001, () => {
  console.log('Server running on http://localhost:3001');
});

Environment Variables

Create a .env file in your project root:

FRACTAL_CONSUMER_KEY=your_fractal_consumer_key_here
OPENAI_API_KEY=your_openai_api_key_here

Get your Fractal Consumer Key: Visit registry.fractalmcp.com to sign up and get your consumer API key.
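
Note that nothing in the server code above loads .env automatically. If you run the server with plain Node or tsx, one option (an assumption — dotenv is not in the dependency list above) is to install dotenv and add a single import at the top of server/index.ts:

// server/index.ts (first line) — loads variables from .env into process.env
import 'dotenv/config';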

Running Your App

Run your application locally:

# Install dependencies
npm install

# Start development server
npm run dev
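
This guide does not define a dev script in package.json, so adjust the commands to your own setup. One minimal approach is to run the backend and the Vite dev server in two terminals; the tsx runner used here is an assumption and is not in the install list above.

# Terminal 1: Express server exposing /api/chat on port 3001
npx tsx server/index.ts

# Terminal 2: Vite dev server for the React frontend
npx vite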

Open your browser to the local URL Vite prints (http://localhost:5173 by default) and start chatting!

Next Steps

  • Explore available tools in the Fractal registry
  • Customize your chat UI and styling
  • Add authentication and user management
  • Deploy to production with proper monitoring

For more examples and advanced usage, check out the example-consumer-vercel in this repository.