Environment Variables
Complete reference for environment variables that configure Gonzo's behavior. Environment variables provide a convenient way to configure Gonzo without command-line flags or config files.
Overview
Environment variables are useful for:
Container deployments: Configure via Docker/Kubernetes env vars
CI/CD pipelines: Set configuration in pipeline env
Quick testing: Temporary configuration without files
System-wide defaults: Set in shell profile
Priority Order (highest to lowest):
Command line flags
Environment variables
Configuration file
Default values
Gonzo-Specific Variables
Input Configuration
GONZO_FILES
Type: String (comma-separated paths)
Default: "" (uses stdin)
Description: Log files to read
# Single file
export GONZO_FILES="/var/log/app.log"
# Multiple files
export GONZO_FILES="/var/log/app.log,/var/log/error.log"
# Glob pattern (quote to prevent shell expansion)
export GONZO_FILES="/var/log/*.log"
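The comment above matters in practice: an unquoted glob is expanded by the shell before the variable is assigned, while a quoted one stays a literal pattern. A quick sketch (using a throwaway /tmp directory, not a real Gonzo path) shows the difference:

```shell
# Set up two sample log files to match against.
mkdir -p /tmp/gonzo-glob-demo
touch /tmp/gonzo-glob-demo/a.log /tmp/gonzo-glob-demo/b.log

# Quoted: the literal pattern is stored; Gonzo expands it when reading.
export GONZO_FILES="/tmp/gonzo-glob-demo/*.log"
echo "$GONZO_FILES"                          # the literal pattern

# Unquoted: the shell expands the glob immediately.
unquoted=$(echo /tmp/gonzo-glob-demo/*.log)
echo "$unquoted"                             # the expanded file list
```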
Example:
export GONZO_FILES="/var/log/nginx/access.log,/var/log/nginx/error.log"
gonzo
Advanced Settings
GONZO_TEST_MODE
Type: Boolean (true or false)
Default: false
Description: Run without TTY (for testing)
export GONZO_TEST_MODE="true"
Note: For automated testing only, not for normal use.
AI Provider Variables
These configure AI provider connections:
OpenAI API
OPENAI_API_KEY
Type: String (API key)
Required for: OpenAI API access
Description: Your OpenAI API key
export OPENAI_API_KEY="sk-your-actual-api-key-here"
Get your key: https://platform.openai.com/api-keys
OPENAI_API_BASE
Type: String (URL)
Default: https://api.openai.com/v1
Description: API endpoint (for custom providers)
# For OpenAI (default, can omit)
export OPENAI_API_BASE="https://api.openai.com/v1"
# For LM Studio (MUST include /v1)
export OPENAI_API_BASE="http://localhost:1234/v1"
# For Ollama (NO /v1 suffix)
export OPENAI_API_BASE="http://localhost:11434"
# For custom OpenAI-compatible API
export OPENAI_API_BASE="https://your-api.com/v1"
Important Notes:
LM Studio: MUST include the /v1 suffix
Ollama: MUST NOT include the /v1 suffix
Most other providers: include the /v1 suffix
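A small profile helper can encode these rules so the right base URL is always used. The function and provider names below are this sketch's own convention, not part of Gonzo:

```shell
# Map a provider name to its OpenAI-compatible base URL,
# applying the /v1 rules described above.
set_api_base() {
  case "$1" in
    lmstudio) export OPENAI_API_BASE="http://localhost:1234/v1" ;;  # /v1 required
    ollama)   export OPENAI_API_BASE="http://localhost:11434"   ;;  # no /v1 suffix
    *)        export OPENAI_API_BASE="https://api.openai.com/v1" ;; # default
  esac
}

set_api_base ollama
echo "$OPENAI_API_BASE"   # prints http://localhost:11434
```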
OpenAI Configuration
OPENAI_ORGANIZATION
Type: String (organization ID)
Optional: For multi-org accounts
Description: OpenAI organization ID
export OPENAI_ORGANIZATION="org-your-org-id"
System Variables
These system-wide variables affect Gonzo:
Terminal Configuration
TERM
Type: String
Default: Set by terminal
Description: Terminal type
export TERM="xterm-256color"
Recommended values:
xterm-256color - 256-color support
screen-256color - for tmux/screen
alacritty - for the Alacritty terminal
LANG / LC_ALL
Type: String (locale)
Default: System locale
Description: Character encoding
export LANG="en_US.UTF-8"
export LC_ALL="en_US.UTF-8"
Note: UTF-8 recommended for proper display.
Display Control
NO_COLOR
Type: Boolean (any value)
Default: Not set
Description: Disable color output
# Disable colors
export NO_COLOR=1
# Re-enable colors
unset NO_COLOR
Note: Setting to any value disables colors.
Complete Configuration Examples
Basic Development Setup
# ~/.bashrc or ~/.zshrc
export GONZO_FILES="/var/log/app.log"
export GONZO_FOLLOW="true"
export GONZO_SKIN="dracula"
export OPENAI_API_KEY="sk-your-key-here"
Production Monitoring
# Production server configuration
export GONZO_FILES="/var/log/production/*.log"
export GONZO_FOLLOW="true"
export GONZO_UPDATE_INTERVAL="2s"
export GONZO_LOG_BUFFER="5000"
export GONZO_MEMORY_SIZE="20000"
export GONZO_SKIN="default"
export OPENAI_API_KEY="sk-prod-key"
OTLP Receiver
# OTLP log receiver setup
export GONZO_OTLP_ENABLED="true"
export GONZO_OTLP_GRPC_PORT="4317"
export GONZO_OTLP_HTTP_PORT="4318"
export GONZO_UPDATE_INTERVAL="1s"
export GONZO_LOG_BUFFER="2000"
Local AI with Ollama
# Using Ollama for AI features
export OPENAI_API_KEY="ollama"
export OPENAI_API_BASE="http://localhost:11434"
export GONZO_AI_MODEL="llama3"
export GONZO_FILES="/var/log/app.log"
export GONZO_FOLLOW="true"
Local AI with LM Studio
# Using LM Studio for AI features
export OPENAI_API_KEY="local-key"
export OPENAI_API_BASE="http://localhost:1234/v1" # Note: /v1 required
export GONZO_AI_MODEL="" # Auto-select
export GONZO_FILES="/var/log/app.log"
Docker/Container Usage
Docker Compose
# docker-compose.yml
services:
gonzo:
image: gonzo:latest
environment:
- GONZO_OTLP_ENABLED=true
- GONZO_OTLP_GRPC_PORT=4317
- GONZO_OTLP_HTTP_PORT=4318
- GONZO_LOG_BUFFER=2000
- GONZO_UPDATE_INTERVAL=2s
- OPENAI_API_KEY=${OPENAI_API_KEY}
ports:
- "4317:4317"
- "4318:4318"
Docker Run
docker run -e GONZO_OTLP_ENABLED=true \
-e GONZO_OTLP_GRPC_PORT=4317 \
-e GONZO_OTLP_HTTP_PORT=4318 \
-e OPENAI_API_KEY="${OPENAI_API_KEY}" \
-p 4317:4317 \
-p 4318:4318 \
gonzo:latest
Kubernetes
# kubernetes-deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
name: gonzo
spec:
template:
spec:
containers:
- name: gonzo
image: gonzo:latest
env:
- name: GONZO_OTLP_ENABLED
value: "true"
- name: GONZO_OTLP_GRPC_PORT
value: "4317"
- name: GONZO_OTLP_HTTP_PORT
value: "4318"
- name: GONZO_LOG_BUFFER
value: "2000"
- name: OPENAI_API_KEY
valueFrom:
secretKeyRef:
name: gonzo-secrets
key: openai-api-key
ports:
- containerPort: 4317
- containerPort: 4318
Shell Profile Setup
Bash (~/.bashrc)
# Gonzo Configuration
export GONZO_FILES="/var/log/app.log"
export GONZO_FOLLOW="true"
export GONZO_SKIN="dracula"
export GONZO_UPDATE_INTERVAL="2s"
# OpenAI Configuration
export OPENAI_API_KEY="sk-your-key-here"
# Optional: Add completion
if [ -f ~/.gonzo-completion.bash ]; then
source ~/.gonzo-completion.bash
fi
Zsh (~/.zshrc)
# Gonzo Configuration
export GONZO_FILES="/var/log/app.log"
export GONZO_FOLLOW="true"
export GONZO_SKIN="nord"
export GONZO_UPDATE_INTERVAL="2s"
# OpenAI Configuration
export OPENAI_API_KEY="sk-your-key-here"
# Optional: Add completion
if [ -f ~/.gonzo-completion.zsh ]; then
source ~/.gonzo-completion.zsh
fi
Fish (~/.config/fish/config.fish)
# Gonzo Configuration
set -x GONZO_FILES "/var/log/app.log"
set -x GONZO_FOLLOW "true"
set -x GONZO_SKIN "monokai"
set -x GONZO_UPDATE_INTERVAL "2s"
# OpenAI Configuration
set -x OPENAI_API_KEY "sk-your-key-here"
Debugging Environment Variables
View All Gonzo Variables
# Show all GONZO_* variables
env | grep GONZO
# Show all OpenAI variables
env | grep OPENAI
# Show all relevant variables
env | grep -E '(GONZO|OPENAI|TERM|LANG)'
Test Configuration
# Show available flags and their defaults
gonzo --help
# Or check specific values
echo "Files: $GONZO_FILES"
echo "Follow: $GONZO_FOLLOW"
echo "Interval: $GONZO_UPDATE_INTERVAL"
echo "API Key: ${OPENAI_API_KEY:0:10}..." # Show only first 10 chars
Clear All Settings
# Unset all Gonzo variables
unset GONZO_FILES
unset GONZO_FOLLOW
unset GONZO_UPDATE_INTERVAL
unset GONZO_LOG_BUFFER
unset GONZO_MEMORY_SIZE
unset GONZO_AI_MODEL
unset GONZO_OTLP_ENABLED
unset GONZO_OTLP_GRPC_PORT
unset GONZO_OTLP_HTTP_PORT
unset GONZO_SKIN
# Unset OpenAI variables
unset OPENAI_API_KEY
unset OPENAI_API_BASE
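If you'd rather not maintain that list by hand, a loop can discover and clear every variable. This sketch assumes all of Gonzo's variables share the GONZO_ prefix:

```shell
export GONZO_SKIN="dracula"   # example variable to clear

# Collect every GONZO_* name from the environment and unset each one.
for v in $(env | sed -n 's/^\(GONZO_[A-Z_]*\)=.*/\1/p'); do
  unset "$v"
done
unset OPENAI_API_KEY OPENAI_API_BASE

env | grep -c '^GONZO_' || true   # prints 0 when nothing is left
```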
Precedence Examples
Understanding how different configuration methods interact:
Example 1: Command Line Overrides Env Var
export GONZO_UPDATE_INTERVAL="5s"
gonzo --update-interval=1s # Uses 1s, not 5s
Example 2: Env Var Overrides Config File
# config.yml has: update-interval: 10s
export GONZO_UPDATE_INTERVAL="2s"
gonzo --config=config.yml # Uses 2s, not 10s
Example 3: Full Precedence Chain
# config.yml
update-interval: 10s
export GONZO_UPDATE_INTERVAL="5s"
gonzo --config=config.yml --update-interval=1s
# Result: Uses 1s (flag > env > config)
Security Considerations
API Keys
Never commit API keys:
# ❌ BAD - Don't commit to version control
export OPENAI_API_KEY="sk-actual-key"
# ✅ GOOD - Use secrets management
export OPENAI_API_KEY="${OPENAI_KEY_FROM_VAULT}"
# ✅ GOOD - Read from secure file
export OPENAI_API_KEY=$(cat ~/.secrets/openai_key)
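Here is the secure-file pattern end to end, sketched with a throwaway mktemp directory and a dummy key standing in for ~/.secrets and a real key:

```shell
# Create a private file holding the key (owner read/write only).
secret_dir=$(mktemp -d)
printf '%s\n' "sk-example-not-a-real-key" > "$secret_dir/openai_key"
chmod 600 "$secret_dir/openai_key"

# Load it into the environment without the key ever appearing in the profile.
export OPENAI_API_KEY="$(cat "$secret_dir/openai_key")"
case "$OPENAI_API_KEY" in sk-*) echo "key loaded" ;; esac   # prints key loaded
```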
Use environment-specific keys:
# Development
export OPENAI_API_KEY="sk-dev-key"
# Production
export OPENAI_API_KEY="sk-prod-key"
File Permissions
Protect your shell profile:
chmod 600 ~/.bashrc
chmod 600 ~/.zshrc
Kubernetes Secrets
Store sensitive values in secrets:
apiVersion: v1
kind: Secret
metadata:
name: gonzo-secrets
type: Opaque
stringData:
openai-api-key: "sk-your-key-here"
Troubleshooting
Variable Not Applied
Check if set:
echo $GONZO_FILES
# If empty, not set
Check for typos:
# Wrong
export GONZO_FILE="/var/log/app.log" # Missing 'S'
# Correct
export GONZO_FILES="/var/log/app.log"
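A profile guard can catch the near-miss before launch. This is a sketch of the idea, not a Gonzo feature; the export of the misspelled variable is only there to demonstrate the warning:

```shell
export GONZO_FILE="/var/log/app.log"   # the typo, for demonstration

# Warn when the singular variable is set but the real one is not,
# since a misspelled variable is otherwise silently ignored.
if [ -n "${GONZO_FILE:-}" ] && [ -z "${GONZO_FILES:-}" ]; then
  echo "warning: GONZO_FILE is set; did you mean GONZO_FILES?"
fi
```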
Check for overrides:
# Command line flag overrides env var
gonzo -f other.log # Ignores GONZO_FILES
AI Not Working
Check API key:
echo $OPENAI_API_KEY
# Should show your key (sk-...)
Check API base:
echo $OPENAI_API_BASE
# Should be correct for your provider
Test connectivity:
# For OpenAI
curl https://api.openai.com/v1/models \
-H "Authorization: Bearer $OPENAI_API_KEY"
# For LM Studio
curl http://localhost:1234/v1/models
# For Ollama
curl http://localhost:11434/api/tags
OTLP Receiver Not Starting
Check if enabled:
echo $GONZO_OTLP_ENABLED
# Should be "true"
Check port conflicts:
# Check if ports are free
lsof -i :4317
lsof -i :4318
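The lsof checks can be wrapped into a small pre-flight function (this assumes lsof is installed, as in the commands above):

```shell
# Return success when nothing is listening on the given port.
port_free() {
  ! lsof -i ":$1" >/dev/null 2>&1
}

# Check both default OTLP receiver ports before enabling the receiver.
for p in 4317 4318; do
  if port_free "$p"; then
    echo "port $p is free"
  else
    echo "port $p is in use"
  fi
done
```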
Best Practices
Use shell profiles: Set persistent defaults in ~/.bashrc or ~/.zshrc
Separate environments: Use different values for dev/prod
Secure API keys: Never commit keys to version control
Document setup: Comment your environment configuration
Test changes: Verify variables are applied correctly
Use defaults: Only override what you need to change
Related Documentation
Configuration Schema - Configuration file reference
CLI Reference - Command line options
AI Integration - AI provider setup
Security: Never expose API keys in logs, screenshots, or public repositories. Use environment variables or secrets management.
GONZO_FOLLOW
Type: Boolean (true or false)
Default: false
Description: Follow log files in real-time
export GONZO_FOLLOW="true"
Example:
export GONZO_FOLLOW="true"
export GONZO_FILES="/var/log/app.log"
gonzo
Performance Settings
GONZO_UPDATE_INTERVAL
Type: Duration string
Default: "1s"
Valid values: Go duration format (e.g., 500ms, 2s, 5s)
Description: Dashboard update frequency
export GONZO_UPDATE_INTERVAL="2s"
Common values:
"500ms" - Very responsive
"1s" - Default, good balance
"2s" - Reduced CPU usage
"5s" - Low resource usage
GONZO_LOG_BUFFER
Type: Integer
Default: 1000
Valid range: 1 to 100000
Description: Maximum log entries in buffer
export GONZO_LOG_BUFFER="2000"
Guidelines:
Low volume (<100 logs/sec): 500-1000
Medium volume (100-1000 logs/sec): 1000-5000
High volume (>1000 logs/sec): 5000-10000
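As a rough way to apply these guidelines, you could derive a starting buffer size from your log rate. The 5x multiplier and clamping below are this sketch's own heuristic, not a Gonzo recommendation:

```shell
# Suggest a GONZO_LOG_BUFFER value: about 5x the per-second log rate,
# clamped to a 500-10000 window matching the guidelines above.
suggest_buffer() {
  size=$(( $1 * 5 ))
  [ "$size" -lt 500 ] && size=500
  [ "$size" -gt 10000 ] && size=10000
  echo "$size"
}

suggest_buffer 200    # medium volume -> prints 1000
```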
GONZO_MEMORY_SIZE
Type: Integer
Default: 10000
Valid range: 100 to 1000000
Description: Maximum words tracked for frequency analysis
export GONZO_MEMORY_SIZE="15000"
Guidelines:
Minimal: 5000
Standard: 10000 (default)
Extended: 15000-20000
Maximum: 50000+
AI Configuration
GONZO_AI_MODEL
Type: String
Default: "" (auto-select)
Description: AI model for log analysis
# Auto-select best available (recommended)
export GONZO_AI_MODEL=""
# OpenAI models
export GONZO_AI_MODEL="gpt-4"
export GONZO_AI_MODEL="gpt-3.5-turbo"
# Ollama models
export GONZO_AI_MODEL="llama3"
export GONZO_AI_MODEL="mistral"
# LM Studio models
export GONZO_AI_MODEL="openai/gpt-oss-120b"
OTLP Receiver
GONZO_OTLP_ENABLED
Type: Boolean (true or false)
Default: false
Description: Enable OTLP log receiver
export GONZO_OTLP_ENABLED="true"
GONZO_OTLP_GRPC_PORT
Type: Integer
Default: 4317
Valid range: 1024 to 65535
Description: gRPC receiver port
export GONZO_OTLP_GRPC_PORT="4317"
GONZO_OTLP_HTTP_PORT
Type: Integer
Default: 4318
Valid range: 1024 to 65535
Description: HTTP receiver port
export GONZO_OTLP_HTTP_PORT="4318"
Display Settings
GONZO_SKIN
Type: String
Default: "default"
Valid values: Any installed skin name
Description: Color scheme/theme
export GONZO_SKIN="dracula"
Built-in skins:
Dark: default, controltheory-dark, dracula, gruvbox, monokai, nord, solarized-dark
Light: controltheory-light, github-light, solarized-light, vs-code-light, spring