Command Line Reference

Complete reference for all Gonzo command-line flags and options. From basic file input to advanced performance tuning, this guide covers every CLI option available.


Quick Reference: Use gonzo --help or gonzo -h to see a summary of all available flags at any time.

Command Syntax

```
gonzo [flags] [command]
```

```bash
# Basic usage
gonzo -f application.log
gonzo -f "logs/*.log" --follow

# With multiple options
gonzo -f app.log --follow --log-buffer=5000 --ai-model="gpt-4"

# Using configuration file
gonzo --config ~/.config/gonzo/config.yml

# Piping input
cat logs.log | gonzo
kubectl logs -f pod/my-app | gonzo
```

Core Flags

File Input

Specify log files or patterns to analyze:

| Flag | Short | Type | Description | Example |
| --- | --- | --- | --- | --- |
| `--file` | `-f` | string | File or glob pattern to read | `gonzo -f app.log` |
| `--follow` | | boolean | Follow file like `tail -f` | `gonzo -f app.log --follow` |

Examples:
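A few typical invocations (file and directory names here are placeholders):

```bash
# Read a single file, analyze it, and exit
gonzo -f application.log

# Follow a file in real time, like tail -f
gonzo -f application.log --follow

# Multiple sources; quote globs so the shell does not expand them
gonzo -f app.log -f "logs/*.log" --follow
```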

Notes:

  • Can be specified multiple times for multiple sources

  • Glob patterns must be quoted to prevent shell expansion

  • --follow works like tail -f for real-time monitoring

  • Without --follow, Gonzo reads entire file and exits

Configuration

Specify configuration files and options:

| Flag | Short | Type | Description | Example |
| --- | --- | --- | --- | --- |
| `--config` | | string | Path to configuration file | `gonzo --config prod.yml` |

Examples:
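For instance (file names are placeholders):

```bash
# Use an explicit configuration file
gonzo --config prod.yml

# Command-line flags take priority over values from the file
gonzo --config prod.yml --log-buffer=5000
```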

Config File Search Path:

  1. Path specified with --config flag

  2. ./config.yml (current directory)

  3. ~/.config/gonzo/config.yml (user config)

  4. /etc/gonzo/config.yml (system config)

Performance Flags

Buffer and Memory

Control memory usage and buffer sizes:

| Flag | Short | Type | Default | Description |
| --- | --- | --- | --- | --- |
| `--log-buffer` | `-b` | int | 1000 | Maximum log entries to keep |
| `--memory-size` | `-m` | int | 10000 | Maximum frequency entries |
| `--update-interval` | `-u` | duration | 1s | Dashboard update interval |

Examples:
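Illustrative tuning runs (the values are examples, not recommendations):

```bash
# Keep more history for a busy service (uses more memory)
gonzo -f app.log --follow --log-buffer=5000

# Larger frequency table and a slower refresh for long sessions
gonzo -f app.log -m 50000 -u 2s
```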

Guidelines:

  • Larger `--log-buffer` values keep more history visible but use more memory

  • Larger `--memory-size` values track more distinct terms for frequency analysis

  • Shorter `--update-interval` values refresh the dashboard faster at the cost of extra CPU

AI Configuration Flags

AI Model Selection

Configure AI analysis features:

| Flag | Type | Default | Description |
| --- | --- | --- | --- |
| `--ai-model` | string | `auto` | AI model to use for analysis |

Examples:
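For example, using models from the supported list:

```bash
# Pin a specific hosted OpenAI model
gonzo -f app.log --ai-model="gpt-4"

# Use a local Ollama model
gonzo -f app.log --follow --ai-model="llama3"
```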

Supported Models:

  • OpenAI: gpt-4, gpt-4-turbo, gpt-3.5-turbo, gpt-3.5-turbo-16k

  • Ollama: llama3, llama3:70b, mistral, mixtral, codellama

  • LM Studio: Any model loaded in LM Studio

  • Auto: Automatically selects best available model

AI Environment Variables:
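Gonzo's AI backends are typically configured through OpenAI-compatible environment variables; the exact names can vary by version, so verify them against `gonzo --help` before relying on this sketch:

```bash
# OpenAI (key value is a placeholder)
export OPENAI_API_KEY="sk-..."

# Point an OpenAI-compatible client at a local Ollama endpoint
export OPENAI_API_BASE="http://localhost:11434/v1"
```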

OTLP Flags

OpenTelemetry Protocol

Configure OTLP log receiver:

| Flag | Type | Default | Description |
| --- | --- | --- | --- |
| `--otlp-enabled` | boolean | false | Enable OTLP receiver |
| `--otlp-grpc-port` | int | 4317 | gRPC endpoint port |
| `--otlp-http-port` | int | 4318 | HTTP endpoint port |

Examples:
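For instance:

```bash
# Start the OTLP receiver on the default ports (4317 gRPC, 4318 HTTP)
gonzo --otlp-enabled

# Move the receiver to custom ports (port numbers are illustrative)
gonzo --otlp-enabled --otlp-grpc-port=14317 --otlp-http-port=14318
```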

OTLP Endpoints:

  • gRPC: localhost:4317 (or custom port)

  • HTTP: http://localhost:4318/v1/logs (or custom port)

Display and Output Flags

Interface Customization

Control display appearance and behavior:

| Flag | Short | Type | Default | Description |
| --- | --- | --- | --- | --- |
| `--no-color` | | boolean | false | Disable color output |
| `--version` | `-v` | boolean | false | Print version information |
| `--help` | `-h` | boolean | false | Show help message |

Examples:
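Typical uses:

```bash
# Disable colors for terminals or logs that cannot render them
gonzo -f app.log --no-color

# Print version and help information
gonzo --version
gonzo --help
```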

Development and Testing Flags

Testing and Debugging

Options for development and CI/CD:

| Flag | Short | Type | Default | Description |
| --- | --- | --- | --- | --- |
| `--test-mode` | `-t` | boolean | false | Run without TTY for testing |
| `--verbose` | | boolean | false | Enable verbose output |
| `--dry-run` | | boolean | false | Show config without running |
| `--stop-words` | | strings | [] | Additional stop words to filter from analysis |

Examples:
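Illustrative invocations (the `--stop-words` value syntax may vary by version; check `gonzo --help`):

```bash
# Validate flags and input in CI without a TTY
gonzo -f sample.log --test-mode

# Print the effective configuration and exit without analyzing
gonzo --config prod.yml --dry-run

# Filter noise words out of frequency analysis
gonzo -f app.log --stop-words="debug,info"
```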

Test Mode Features:

Stop Words Usage:

Use Cases:

Complete Flag Reference Table

All Flags Alphabetically

| Flag | Short | Type | Default | Description |
| --- | --- | --- | --- | --- |
| `--ai-model` | | string | `auto` | AI model for analysis |
| `--config` | | string | | Configuration file path |
| `--dry-run` | | boolean | false | Show config without running |
| `--file` | `-f` | string | | File or glob pattern to read |
| `--follow` | | boolean | false | Follow file like `tail -f` |
| `--help` | `-h` | boolean | false | Show help message |
| `--log-buffer` | `-b` | int | 1000 | Maximum log entries to keep |
| `--memory-size` | `-m` | int | 10000 | Maximum frequency entries |
| `--no-color` | | boolean | false | Disable color output |
| `--otlp-enabled` | | boolean | false | Enable OTLP receiver |
| `--otlp-grpc-port` | | int | 4317 | OTLP gRPC port |
| `--otlp-http-port` | | int | 4318 | OTLP HTTP port |
| `--quiet` | | boolean | false | Suppress non-essential output |
| `--show-config` | | boolean | false | Display current configuration |
| `--stop-words` | | strings | [] | Additional stop words to filter from analysis |
| `--test-mode` | `-t` | boolean | false | Run without TTY |
| `--update-interval` | `-u` | duration | 1s | Dashboard update interval |
| `--verbose` | | boolean | false | Enable verbose output |
| `--version` | `-v` | boolean | false | Print version information |

Common Command Patterns

Development Workflows
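For example, tailing a local log while iterating (the path is a placeholder):

```bash
# Watch the dev log in real time while you work
gonzo -f dev.log --follow
```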

Production Monitoring
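A typical live-monitoring pipeline (deployment name is illustrative):

```bash
# Stream pod logs into Gonzo with a larger buffer for busy services
kubectl logs -f deployment/my-app | gonzo --log-buffer=10000
```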

Analysis and Investigation
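One-shot analysis of captured logs (paths and model are placeholders):

```bash
# Analyze an incident window with AI assistance, then exit
gonzo -f "incident/*.log" --ai-model="gpt-4"
```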

CI/CD Integration
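A non-interactive run suitable for pipelines:

```bash
# No TTY, no colors; exits after reading the file
gonzo -f build.log --test-mode --no-color
```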

Flag Combination Examples

Performance Optimization
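Combining the performance flags (values are illustrative starting points):

```bash
gonzo -f big.log --log-buffer=20000 --memory-size=50000 --update-interval=2s
```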

AI Configuration
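An illustrative AI setup using models from the supported list:

```bash
# Local model via Ollama
gonzo -f app.log --follow --ai-model="llama3:70b"

# Hosted OpenAI model
gonzo -f app.log --ai-model="gpt-4-turbo"
```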

Multi-Source Analysis
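Combining multiple `--file` sources in one session (file names are placeholders):

```bash
# Repeat -f for each source; quote globs to avoid shell expansion
gonzo -f api.log -f worker.log -f "nginx/*.log" --follow
```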

Environment Variables

Configuration via Environment

Many flags can be set via environment variables:
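The exact variable names depend on your Gonzo release; the `GONZO_`-prefixed names below follow a common CLI convention and are hypothetical, so confirm the supported names with `gonzo --help` or the configuration docs before use:

```bash
# Hypothetical GONZO_<FLAG> names; verify against your release
export GONZO_LOG_BUFFER=5000
export GONZO_AI_MODEL="gpt-4"
gonzo -f app.log
```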

Priority Order:

  1. Command-line flags (highest)

  2. Environment variables

  3. Configuration file

  4. Built-in defaults (lowest)

Shell Completion

Enable Autocomplete
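If your build provides a Cobra-style `completion` subcommand (many Go CLIs do; this is an assumption, so check `gonzo --help` first), setup typically looks like:

```bash
# Bash (assumes a `completion` subcommand exists)
gonzo completion bash > /etc/bash_completion.d/gonzo

# Zsh
gonzo completion zsh > "${fpath[1]}/_gonzo"
```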

Benefits:

  • Tab completion for flags

  • Completion for flag values

  • File path completion

  • Command completion

Troubleshooting CLI Issues

Common Problems

Flag not recognized:

Value not accepted:

Configuration conflicts:
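Some generic checks for each of these problems (illustrative, not exhaustive):

```bash
# Flag not recognized: confirm the spelling and your installed version
gonzo --help | grep -- --log-buffer
gonzo --version

# Value not accepted: durations need a unit, e.g. 1s or 500ms
gonzo -f app.log --update-interval=2s

# Configuration conflicts: print the effective config to see which source wins
gonzo --config prod.yml --show-config
```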

Quick Reference Card

Most Common Commands

Essential Flags
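A condensed cheat sheet (file names are placeholders):

```bash
# Most common commands
gonzo -f app.log                 # analyze a file and exit
gonzo -f app.log --follow        # live tail
cat app.log | gonzo              # read from stdin

# Essential flags:
#   -f/--file, --follow, -b/--log-buffer, --config, --ai-model
```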

What's Next?

Now that you know all CLI flags, explore related topics:

  • Configuration File - Persistent settings via YAML

  • Advanced Configuration - Complex setups and tuning

  • User Guide - Master the Gonzo interface

Or start putting the flags above to work in your own sessions.


Master CLI flags for quick, powerful log analysis! ⚡ From simple file analysis to sophisticated multi-source monitoring, command-line flags give you precise control over every Gonzo session.
