Vercel Logs Integration ▲
Supercharge your Vercel deployment debugging with Gonzo's terminal-based log analysis. Get real-time insights, pattern detection, and AI-powered analysis for your Vercel applications.
Overview
If you're building on Vercel, you've probably used the vercel logs command or the built-in dashboard to debug your apps. While these tools are useful, they can quickly feel limited when you need:
Deeper insights: Beyond basic log viewing
Real-time context: Pattern recognition across log streams
Better filtering: Advanced search and filtering
AI analysis: Understanding complex errors
Terminal workflow: Stay in your terminal alongside other tools
That's where Gonzo comes in: an open-source terminal UI (TUI) for logs with native support for Vercel's JSON log format.
Quick Start
Basic Usage
The simplest way to use Gonzo with Vercel:
# Stream logs from your Vercel deployment
vercel logs --output json | gonzo
# Follow logs in real-time
vercel logs --follow --output json | gonzo
# Specific deployment
vercel logs <deployment-url> --follow --output json | gonzo
That's it! Gonzo automatically detects and parses Vercel's JSON format.
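Before streaming real traffic, you can sanity-check the pipeline by feeding Gonzo a single hand-written log line; the fields below are only illustrative, since Gonzo will ingest whatever JSON Vercel emits:
# Quick sanity check: pipe one fake Vercel-style log line into Gonzo
echo '{"message":"GET /api/users 200","timestamp":1705315805000,"source":"lambda","statusCode":200}' | gonzo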
Installation
Install Vercel CLI
# Install via npm
npm install -g vercel
# Or via yarn
yarn global add vercel
# Verify installation
vercel --version
Install Gonzo
# Via Go
go install github.com/control-theory/gonzo/cmd/gonzo@latest
# Via Homebrew
brew install gonzo
# Verify installation
gonzo --version
Authentication
# Log in to Vercel
vercel login
# Verify you're authenticated
vercel whoami
Common Use Cases
1. Debugging Deployment Issues
Watch logs as your deployment rolls out:
# Deploy and watch logs
vercel --prod --follow --output json | gonzo
# Or after deployment
vercel logs --follow --output json | gonzo
What you'll see:
Build logs
Function execution logs
Edge function logs
Error messages
Request information
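If you also want a record of the rollout alongside the live view, one option is to keep a copy of the raw stream while you watch; the filename here is just an example:
# Watch the rollout and save the raw JSON for later review
vercel logs --follow --output json | tee "deploy-$(date +%F-%H%M).json" | gonzo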
2. Monitoring Serverless Functions
Debug serverless function execution:
# All function logs
vercel logs --follow --output json | gonzo
# Filter in Gonzo
# Press '/' and type function name
Gonzo helps you:
Identify cold starts
Track execution times
Spot errors quickly
Analyze invocation patterns
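If slow invocations are your main concern, one option is to pre-filter the stream with jq before it reaches Gonzo. This is a minimal sketch that assumes the numeric duration field (in milliseconds) shown in the log format section below:
# Only pass invocations slower than one second through to Gonzo
vercel logs --follow --output json | jq -c --unbuffered 'select((.duration // 0) > 1000)' | gonzo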
3. Analyzing Edge Functions
Monitor Edge Functions with low-latency visualization:
# Follow edge function logs
vercel logs --follow --output json | gonzo
Benefits:
Real-time edge execution visibility
Geographic distribution insights
Error rate tracking
Performance pattern detection
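To focus on edge traffic alone, you can filter on the source field before Gonzo sees the stream; this sketch assumes Edge Function logs carry source set to "edge", as described in the log format section below:
# Keep only edge logs in the stream
vercel logs --follow --output json | jq -c --unbuffered 'select(.source == "edge")' | gonzo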
4. Production Incident Response
Quick investigation during incidents:
# Recent production logs
vercel logs --since 1h --output json | gonzo
# Follow production errors
vercel logs --follow --output json | gonzo
# In Gonzo: Press '/' to filter for "ERROR"
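During a noisy incident you can also narrow the stream to failing requests before it reaches Gonzo; this sketch assumes the numeric statusCode field shown in the log format section:
# Only pass 5xx responses through to Gonzo
vercel logs --since 1h --output json | jq -c 'select((.statusCode // 0) >= 500)' | gonzo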
5. Development Debugging
Debug during local development:
# Development logs
vercel dev --output json 2>&1 | gonzo
# Or tail deployment logs while developing
vercel logs --follow --output json | gonzo
Advanced Usage
Filtering Logs
By Project
# Specific project
vercel logs my-project --follow --output json | gonzo
By Time Range
# Last hour
vercel logs --since 1h --output json | gonzo
# Last 24 hours
vercel logs --since 24h --output json | gonzo
# Specific time range
vercel logs --since 2024-01-15T10:00:00 --output json | gonzo
By Deployment
# Specific deployment URL
vercel logs https://my-app-abc123.vercel.app --follow --output json | gonzo
# Latest production deployment
vercel logs --prod --follow --output json | gonzo
Combining with AI Analysis
Get AI-powered insights on Vercel errors:
# Enable AI analysis
export OPENAI_API_KEY="sk-your-key-here"
# Stream logs with AI
vercel logs --follow --output json | gonzo --ai-model="gpt-4"
In Gonzo:
Navigate to an error log
Press Enter to view details
Press i for instant AI analysis
Get an explanation of the error and suggested fixes
Shell Aliases
Add to your ~/.bashrc or ~/.zshrc:
# Vercel + Gonzo aliases
alias vl='vercel logs --follow --output json | gonzo'
alias vl-prod='vercel logs --prod --follow --output json | gonzo'
alias vl-ai='vercel logs --follow --output json | gonzo --ai-model="gpt-4"'
# With time filters
alias vl-1h='vercel logs --since 1h --output json | gonzo'
alias vl-today='vercel logs --since 24h --output json | gonzo'
Usage:
vl # Follow all logs
vl-prod # Production only
vl-ai # With AI analysis
vl-1h # Last hour
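If you prefer passing the project name as an argument instead of keeping one alias per project, a small shell function works too (vlg is just an illustrative name):
# Follow logs for any project: vlg <project-name>
vlg() {
  vercel logs "$1" --follow --output json | gonzo
}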
Understanding Vercel Log Format
Vercel outputs JSON logs with this structure:
{
"id": "abc123",
"message": "GET /api/users 200",
"timestamp": 1705315805000,
"source": "lambda",
"projectId": "prj_abc123",
"deploymentId": "dpl_xyz789",
"buildId": "bld_456def",
"requestId": "req_123abc",
"statusCode": 200,
"duration": 234
}
Gonzo automatically extracts:
Timestamp: When the log occurred
Source: Lambda, edge, build, etc.
Message: Log content
Metadata: Request ID, deployment ID, status codes
All fields: Available in Attributes panel
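To see which of these fields your own deployments actually emit, you can inspect a single record before building any filters on it:
# List the top-level fields in one of your log records
vercel logs --output json | head -1 | jq 'keys'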
Workflow Examples
Morning Deployment Review
# Check overnight deployments
vercel logs --since 12h --output json | gonzo
# Look for errors in Gonzo:
# 1. Check Word Frequency for common errors
# 2. Filter with '/' for "ERROR"
# 3. Press Enter on Counts to see pattern heatmap
Real-Time Monitoring
# Terminal 1: Monitor production
vercel logs --prod --follow --output json | gonzo
# Terminal 2: Monitor preview
vercel logs --follow --output json | gonzo
Error Investigation
# Find recent errors
vercel logs --since 1h --output json | gonzo
# In Gonzo:
# 1. Press '/' and type: ERROR
# 2. Navigate through errors with ↑↓
# 3. Press Enter to see full details
# 4. Press 'i' for AI analysis of complex errors
Performance Analysis
# Follow logs with focus on performance
vercel logs --follow --output json | gonzo
# Look for:
# - High duration values
# - Cold start indicators
# - Timeout errors
# - Memory issues
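For an offline look at the same data, you can rank the slowest requests in a recent window; this sketch assumes the duration (milliseconds) and message fields shown earlier:
# Rank the ten slowest requests from the last hour
vercel logs --since 1h --output json | jq -s '[.[] | select(.duration != null)] | sort_by(-.duration) | .[:10] | .[] | {duration, message}'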
Integration with Vercel Dashboard
Complementary Workflow
Use both tools for maximum effectiveness:
Vercel Dashboard:
Overview metrics
Deployment management
Team collaboration
Analytics
Gonzo:
Real-time log analysis
Terminal-based workflow
Pattern detection
AI-powered insights
Advanced filtering
When to Use Each
Use Vercel Dashboard when:
Reviewing high-level metrics
Managing deployments
Collaborating with your team
Analyzing long-term trends
Use Gonzo when:
Debugging specific issues
Monitoring in real time
Staying in a terminal-first workflow
Running advanced searches and filters
Leaning on AI assistance for errors
Troubleshooting
No Logs Appearing
Verify Vercel CLI is working:
# Test without Gonzo
vercel logs --output json
# Should show JSON output
Check authentication:
vercel whoami
# Should show your username
Ensure JSON output:
# ✅ Correct
vercel logs --output json | gonzo
# ❌ Wrong (missing --output json)
vercel logs | gonzo
Logs Not Parsing Correctly
Verify JSON format:
# Check raw output
vercel logs --output json | head -1 | jq .
Update Vercel CLI:
npm install -g vercel@latest
vercel --version
Performance Issues
Reduce log volume:
# Limit to a specific source (grep --line-buffered keeps the stream flowing)
vercel logs --follow --output json | grep --line-buffered "lambda" | gonzo
# Time-limited query
vercel logs --since 30m --output json | gonzo
Adjust Gonzo settings:
vercel logs --follow --output json | gonzo --update-interval=2s --log-buffer=5000
Best Practices
1. Always Use JSON Output
# ✅ Always include --output json
vercel logs --follow --output json | gonzo
# ❌ Don't omit it
vercel logs --follow | gonzo
2. Filter Early for Production
# Filter at Vercel level when possible
vercel logs --prod --follow --output json | gonzo
# Then use Gonzo's '/' for additional filtering
3. Leverage AI for Complex Errors
# Set up AI once
export OPENAI_API_KEY="sk-..."
# Use for error investigation
vercel logs --follow --output json | gonzo --ai-model="gpt-4"
4. Create Project-Specific Aliases
# In ~/.bashrc or ~/.zshrc
alias vl-api='vercel logs api-project --follow --output json | gonzo'
alias vl-web='vercel logs web-project --follow --output json | gonzo'
alias vl-docs='vercel logs docs-project --follow --output json | gonzo'
5. Combine with Other Tools
# With jq for pre-filtering (-c keeps one JSON object per line)
vercel logs --output json | jq -c 'select(.statusCode >= 400)' | gonzo
# Save logs while viewing
vercel logs --output json | tee vercel-logs.json | gonzo
Real-World Scenarios
Scenario 1: Deployment Goes Wrong
# Quick investigation
vercel logs --since 10m --output json | gonzo
# In Gonzo:
# - Look for red (ERROR) entries
# - Check Word Frequency for clues
# - Use AI analysis on errors
Scenario 2: User Reports Error
# Find recent errors
vercel logs --since 1h --output json | gonzo
# In Gonzo:
# - Press '/' and filter for user ID or error
# - Navigate to relevant logs
# - View full context with Enter
Scenario 3: Performance Degradation
# Monitor execution times
vercel logs --follow --output json | gonzo
# Watch for:
# - Increased duration values
# - Cold start patterns
# - Timeout errors
Scenario 4: Function Development
# Terminal 1: Local dev
vercel dev
# Terminal 2: Watch deployment logs
vercel logs --follow --output json | gonzo
# Iterate quickly with real-time feedback
Complete Tutorial
For a comprehensive guide including:
Detailed setup instructions
Real-world debugging examples
Advanced filtering patterns
AI-powered error analysis
Production monitoring strategies
Read the full guide: Vercel Logs Meet Gonzo
Configuration Examples
Gonzo Config for Vercel
Create ~/.config/gonzo/vercel-config.yml:
# Optimized for Vercel logs
update-interval: 1s
log-buffer: 3000
memory-size: 15000
skin: controltheory-dark
# AI for error analysis
ai-model: "gpt-3.5-turbo"
Use with:
vercel logs --follow --output json | gonzo --config ~/.config/gonzo/vercel-config.yml
Related Resources
Documentation
AI Integration Guide
Blog Posts
Vercel Logs Meet Gonzo - Complete guide
Community
Support
Having issues with Vercel integration?
Check the Troubleshooting Guide
Ask in GitHub Discussions
Report bugs in GitHub Issues
Pro Tip: Always use --output json with vercel logs for best compatibility with Gonzo. The JSON format provides rich structured data that Gonzo can parse and analyze effectively.