Vercel
Deployment and hosting tools powered by the Vercel API
The @tooly/vercel package provides AI-ready tools for the Vercel deployment and hosting platform. Manage projects, deployments, domains, and teams with AI assistance.
Installation
npm install @tooly/vercel
Quick Start
import { generateText } from 'ai'
import { openai } from '@ai-sdk/openai'
import { createAITools } from '@tooly/vercel'
const tools = createAITools(process.env.VERCEL_BEARER_TOKEN!)
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'List my Vercel projects and show me their latest deployments',
    },
  ],
  tools,
})
console.log(result.text)
Setup
1. Get Your Vercel Bearer Token
- Go to Vercel Account Settings
- Create a new access token
- Copy the token for use in your application
2. Environment Variables
Store your bearer token securely:
VERCEL_BEARER_TOKEN=your_vercel_bearer_token_here
3. Initialize the Tools
import { createAITools } from '@tooly/vercel'
const tools = createAITools(process.env.VERCEL_BEARER_TOKEN!)
Available Tools
The Vercel package provides the following AI tools:
createProject
Creates a new project in Vercel.
Parameters:
- name (string, required): Project name
- framework (string, optional): Framework preset (e.g., "nextjs", "vite", "create-react-app")
- gitRepository (object, optional): Git repository configuration
  - type (string): Git provider ("github", "gitlab", "bitbucket")
  - repo (string): Repository name (e.g., "username/repo-name")
- publicSource (boolean, optional): Whether the project source code is public
- rootDirectory (string, optional): Root directory relative to repository root
- teamId (string, optional): Team ID to create the project under
- slug (string, optional): Team slug to create the project under
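For orientation, the parameters above roughly correspond to a shape like the following TypeScript sketch (a hypothetical interface inferred from this list, not a type exported by the package):
// Hypothetical shape of the createProject tool arguments,
// inferred from the parameter list above.
interface CreateProjectArgs {
  name: string // required project name
  framework?: string // e.g. 'nextjs', 'vite', 'create-react-app'
  gitRepository?: {
    type: 'github' | 'gitlab' | 'bitbucket'
    repo: string // e.g. 'username/repo-name'
  }
  publicSource?: boolean // expose the project source publicly
  rootDirectory?: string // relative to the repository root
  teamId?: string // create the project under a team
  slug?: string // team slug alternative to teamId
}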
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Create a new Next.js project called "my-app" connected to my GitHub repo',
    },
  ],
  tools,
})
getProject
Gets details of a specific Vercel project by ID or name.
Parameters:
- idOrName (string, required): Project ID or name
- teamId (string, optional): Team ID that owns the project
- slug (string, optional): Team slug that owns the project
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Show me details for my project called "portfolio"',
    },
  ],
  tools,
})
listProjects
Lists all Vercel projects.
Parameters:
- teamId (string, optional): Filter by team ID
- slug (string, optional): Filter by team slug
- limit (number, optional): Maximum number of projects to return (1-100)
- since (number, optional): Get projects created after this timestamp
- until (number, optional): Get projects created before this timestamp
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'List all my Vercel projects',
    },
  ],
  tools,
})
updateProject
Updates an existing Vercel project.
Parameters:
- idOrName (string, required): Project ID or name to update
- name (string, optional): New project name
- framework (string, optional): New framework preset
- publicSource (boolean, optional): Whether the project source code is public
- rootDirectory (string, optional): New root directory
- teamId (string, optional): Team ID that owns the project
- slug (string, optional): Team slug that owns the project
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Update my project to use Vite framework and make the source public',
    },
  ],
  tools,
})
deleteProject
Deletes a Vercel project.
Parameters:
- idOrName (string, required): Project ID or name to delete
- teamId (string, optional): Team ID that owns the project
- slug (string, optional): Team slug that owns the project
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Delete my old test project',
    },
  ],
  tools,
})
getDeployment
Gets details of a specific deployment by ID or URL.
Parameters:
- idOrUrl (string, required): Deployment ID or URL
- teamId (string, optional): Team ID that owns the deployment
- slug (string, optional): Team slug that owns the deployment
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Show me details for deployment dpl_abc123',
    },
  ],
  tools,
})
listDeployments
Lists deployments for a project or team.
Parameters:
- projectId (string, optional): Filter by project ID
- teamId (string, optional): Team ID to list deployments for
- slug (string, optional): Team slug to list deployments for
- limit (number, optional): Maximum number of deployments to return (1-100)
- since (number, optional): Get deployments created after this timestamp
- until (number, optional): Get deployments created before this timestamp
- state (string, optional): Filter by deployment state (e.g., "BUILDING", "READY", "ERROR")
- target (string, optional): Filter by deployment target ("staging", "production")
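As a rough illustration of combining these filters, here is a hypothetical argument object for the listDeployments tool (the field names come from the list above; the example project ID and the millisecond timestamp unit are assumptions):
// Hypothetical arguments for listDeployments, inferred from the documented parameters.
const listDeploymentsArgs = {
  projectId: 'prj_abc123', // placeholder project ID
  target: 'production', // only production deployments
  state: 'READY', // only deployments that built successfully
  since: Date.now() - 7 * 24 * 60 * 60 * 1000, // created within the last 7 days (assumes millisecond timestamps)
  limit: 20, // cap the result size (1-100)
}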
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Show me the latest production deployments for my project',
    },
  ],
  tools,
})
listProjectDomains
Lists domains for a specific project.
Parameters:
- idOrName (string, required): Project ID or name
- teamId (string, optional): Team ID that owns the project
- slug (string, optional): Team slug that owns the project
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Show me all domains configured for my portfolio project',
    },
  ],
  tools,
})
getTeam
Gets details of a specific team.
Parameters:
- teamId (string, required): Team ID
Example:
const result = await generateText({
  model: openai('gpt-4.1-nano'),
  messages: [
    {
      role: 'user',
      content: 'Show me details for my team',
    },
  ],
  tools,
})
Integration Examples
OpenAI SDK
import OpenAI from 'openai'
import { createOpenAIFunctions } from '@tooly/vercel'
const openai = new OpenAI()
const { functions, callFunction } = createOpenAIFunctions(process.env.VERCEL_BEARER_TOKEN!)
const response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [
    {
      role: 'user',
      content: 'Create a new project for my portfolio website',
    },
  ],
  functions,
})

if (response.choices[0].message.function_call) {
  const result = await callFunction(response.choices[0].message.function_call)
  console.log(result)
}
Anthropic SDK
import { createAnthropicTools } from '@tooly/vercel'
const { tools, callTool } = createAnthropicTools(process.env.VERCEL_BEARER_TOKEN!)
// Use with Anthropic Claude API
// Implementation depends on your Anthropic integration
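As a rough sketch of what that integration could look like with the Anthropic Messages API (the exact callTool signature below is an assumption, so verify it against the package before relying on it):
import Anthropic from '@anthropic-ai/sdk'
import { createAnthropicTools } from '@tooly/vercel'

const anthropic = new Anthropic()
const { tools, callTool } = createAnthropicTools(process.env.VERCEL_BEARER_TOKEN!)

const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-latest', // example model name
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'List my Vercel projects' }],
  tools,
})

// Claude reports tool calls as `tool_use` content blocks
for (const block of response.content) {
  if (block.type === 'tool_use') {
    // Assumed signature: tool name plus input object
    const result = await callTool(block.name, block.input)
    console.log(result)
  }
}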
Error Handling
All tools include built-in error handling and will throw descriptive errors for common issues:
- Invalid bearer token
- Missing required parameters
- API rate limits
- Network connectivity issues
- Invalid project or deployment IDs
try {
  const result = await generateText({
    model: openai('gpt-4.1-nano'),
    messages: [{ role: 'user', content: 'List my projects' }],
    tools,
  })
} catch (error) {
  console.error('Vercel operation failed:', error.message)
}
Best Practices
- Secure Token Storage: Store your Vercel bearer token securely using environment variables (see the sketch after this list)
- Rate Limiting: Be mindful of Vercel API rate limits when making frequent requests
- Error Handling: Always implement proper error handling for production applications
- Team Management: Use team IDs and slugs to organize projects within team accounts
- Deployment Monitoring: Regularly check deployment status and logs for issues
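For the first point, a minimal sketch of loading and validating the token before creating the tools (reusing the VERCEL_BEARER_TOKEN variable from the setup section):
import { createAITools } from '@tooly/vercel'

// Fail fast if the token is missing rather than passing undefined to the API client
const token = process.env.VERCEL_BEARER_TOKEN
if (!token) {
  throw new Error('VERCEL_BEARER_TOKEN is not set')
}

const tools = createAITools(token)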
Rate Limits
The Vercel API has rate limits that vary by plan:
- Hobby: 100 requests per hour
- Pro: 1,000 requests per hour
- Enterprise: Custom limits
The package automatically handles rate limiting errors and provides meaningful error messages.
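If a burst of requests still exceeds your plan's limit, one option is to retry with a short backoff around the call. This is a hedged sketch that reuses the Quick Start setup and treats any thrown error as potentially transient; narrow the check to the rate-limit message your integration actually surfaces:
import { generateText } from 'ai'
import { openai } from '@ai-sdk/openai'
import { createAITools } from '@tooly/vercel'

const tools = createAITools(process.env.VERCEL_BEARER_TOKEN!)

// Retry a call a few times with exponential backoff (1s, 2s, 4s, ...).
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn()
    } catch (error) {
      lastError = error
      await new Promise((resolve) => setTimeout(resolve, 2 ** i * 1000))
    }
  }
  throw lastError
}

const result = await withRetry(() =>
  generateText({
    model: openai('gpt-4.1-nano'),
    messages: [{ role: 'user', content: 'List my projects' }],
    tools,
  }),
)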
Support
For issues with the Vercel package:
- Check the Vercel API documentation
- Verify your bearer token permissions
- Review the error messages for specific guidance
- Open an issue on the Tooly GitHub repository
For Vercel platform issues, contact Vercel Support.