Log Format Issues
Troubleshooting guide for log parsing and format detection problems in Gonzo.
Format Detection Issues
Logs Not Being Parsed
Symptom: Structured logs appear as plain text instead of being parsed.
Diagnosis:
Gonzo automatically detects formats per line based on these rules:
JSON: Lines starting with `{`
Logfmt: Lines containing `key=value` patterns
Plain text: Everything else
Solutions:
Verify JSON is valid
```bash
# Test each line individually
head -1 logs.json | jq .
# If error, JSON is malformed
```
Check line starts with `{`
```bash
# JSON must start with opening brace
head logs.json
# ✅ Good: {"level":"info"...
# ❌ Bad: 2024-01-15 {"level":"info"...
```
Inspect logfmt format
```bash
# Should have key=value pairs
head logs.txt
# ✅ Good: level=info service=api msg="started"
# ❌ Bad: [INFO] service api started
```
Use custom format for non-standard logs
```bash
gonzo --format=my-custom-format -f logs.txt
```
Mixed Format Logs
Symptom: Some lines parse correctly, others don't.
Explanation: Gonzo detects format per line, so mixed formats will have inconsistent parsing.
Example:
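An illustrative mix: the first line parses as JSON, the second as logfmt, and the third falls back to plain text:
```
{"level":"info","msg":"request started"}
level=warn service=api msg="slow query"
[ERROR] connection refused
```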
Solutions:
Accept mixed display
This is expected behavior
Each format renders appropriately
Pre-filter to single format
Convert to uniform format
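A sketch of the pre-filtering approach, assuming the JSON lines are the ones that start with `{` (adjust the pattern for whichever format you want to isolate):
```bash
# Keep only the JSON lines from a mixed stream
grep '^{' mixed.log | gonzo
```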
Attributes Not Extracted
Symptom: Logs parse but attributes panel is empty.
Causes & Solutions:
Plain text logs have no structured attributes
Plain text can't be parsed into fields
Only JSON and logfmt have attributes
Check JSON structure
Nested JSON
Gonzo extracts nested attributes
May need to check nested paths
Missing common attribute names
Gonzo looks for common fields: `level`, `service`, `host`, etc.
Custom fields may not be highlighted
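One way to confirm that structured fields are actually present is to inspect a line with jq (a sketch; assumes JSON logs):
```bash
# List the top-level keys of the first line
head -1 logs.json | jq 'keys'
# Pretty-print the whole object to spot nesting
head -1 logs.json | jq .
```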
JSON Issues
Malformed JSON
Symptom: JSON logs show as plain text or cause errors.
Common JSON Issues:
Trailing commas
Single quotes instead of double
Unescaped quotes in strings
Missing quotes on keys
Validate & Fix:
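A minimal sketch for locating bad lines with jq:
```bash
# Report any lines that fail to parse as JSON
n=0
while IFS= read -r line; do
  n=$((n+1))
  printf '%s\n' "$line" | jq -e . >/dev/null 2>&1 || echo "line $n is not valid JSON"
done < logs.json
```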
Multi-line JSON
Symptom: JSON objects span multiple lines and are not parsed correctly.
Example:
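For instance, pretty-printed output like this spans several lines:
```
{
  "level": "error",
  "msg": "connection failed"
}
```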
Solution:
Gonzo expects one JSON object per line (JSONL/NDJSON format).
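If the file contains pretty-printed objects, jq can rewrite them as one compact object per line before handing them to Gonzo (sketch):
```bash
# Rewrite pretty-printed JSON as NDJSON (one object per line)
jq -c . pretty.json > logs.ndjson
gonzo -f logs.ndjson
```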
JSON with Metadata Prefix
Symptom: Lines have timestamp or metadata before JSON.
Example:
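An illustrative line with a prefix ahead of the JSON payload:
```
2024-01-15T10:30:00Z app[123]: {"level":"info","msg":"started"}
```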
Solution:
Remove prefix before piping to Gonzo:
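A sketch using sed, assuming the JSON object starts at the first `{` on each line:
```bash
# Strip everything before the first opening brace
sed 's/^[^{]*//' app.log | gonzo
```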
Escaped JSON in Strings
Symptom: JSON contains escaped JSON strings.
Example:
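For example, container runtimes often wrap the application's JSON in a string field (illustrative line):
```
{"log":"{\"level\":\"error\",\"msg\":\"timeout\"}\n","stream":"stdout"}
```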
Solution:
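One possible approach, assuming the escaped payload lives in a field named `log` (adjust to your logs), is to unwrap it with jq before piping to Gonzo:
```bash
# Parse the embedded JSON string and emit it as the top-level object
jq -c '.log | rtrimstr("\n") | fromjson' wrapped.json | gonzo
```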
Logfmt Issues
Logfmt Not Detected
Symptom: Key=value logs appear as plain text.
Requirements for logfmt detection:
Must have `key=value` patterns
Multiple pairs per line
Values can be quoted: `key="value with spaces"`
Example:
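An illustrative line that should be detected as logfmt:
```
level=info service=api user_id=42 msg="request completed" duration_ms=125
```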
Solutions:
Verify format
Add more key=value pairs
Single pair may not trigger detection
Multiple pairs more reliably detected
Use custom format
Spaces in Logfmt Values
Symptom: Values with spaces not parsed correctly.
Examples:
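For instance, an unquoted value breaks at the first space, so only the first word lands in `msg` (illustrative):
```
msg=user logged in successfully
```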
Solution:
Ensure spaces in values are properly quoted or escaped:
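Quoted correctly, the whole phrase stays in one value:
```
msg="user logged in successfully"
```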
Logfmt with Nested Structures
Symptom: Nested objects in logfmt don't parse well.
Example:
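For instance, a line that tries to embed an object in a value (illustrative):
```
level=info user={id:123, name:alice} msg="login"
```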
Explanation: Logfmt is flat by design. Nested structures need JSON.
Solution:
Accept flat representation
Gonzo extracts `user.id` and `user.name` as separate attributes
Convert to JSON if needed
Plain Text Issues
No Structure Extracted from Text Logs
Symptom: Plain text logs show no attributes.
Explanation: Plain text logs can't be parsed into structured fields automatically.
Examples:
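Typical unstructured lines (illustrative):
```
[2024-01-15 10:30:00] INFO Starting server on port 8080
Connection from 10.0.0.5 accepted
ERROR: could not open /var/lib/data.db
```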
Solutions:
Accept plain text display
Logs still searchable and analyzable
Just no structured attributes
Create custom format parser
Convert logs to structured format
Modify application to output JSON/logfmt
Use log shipper to add structure (Fluent Bit, Logstash)
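If changing the application is not an option, a quick conversion in the pipeline can add minimal structure. A rough sketch, assuming lines shaped like `[LEVEL] message` and messages without double quotes:
```bash
# Convert "[LEVEL] message" lines to logfmt on the fly
awk '{ lvl=$1; gsub(/[][]/, "", lvl); sub(/^[^ ]+ +/, "");
       printf "level=%s msg=\"%s\"\n", tolower(lvl), $0 }' app.log | gonzo
```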
Severity Not Detected in Text Logs
Symptom: Plain text logs don't show color-coded severity.
Explanation: Gonzo looks for common severity keywords in text logs:
ERROR, FATAL, CRITICAL → Red
WARN, WARNING → Yellow
INFO → Green
DEBUG → Blue
TRACE → White
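Lines that include these keywords (illustrative) are color-coded:
```
2024-01-15 10:30:00 ERROR failed to connect to database
2024-01-15 10:30:01 WARN disk space low
```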
Solutions:
Include severity keywords
Use consistent format
Put severity at start of line
Use standard keywords (ERROR, WARN, INFO, DEBUG)
Create custom format
Define severity extraction pattern
Map custom levels to standard severities
OTLP Format Issues
OTLP Logs Not Appearing
Symptom: OTLP receiver running but no logs in Gonzo.
Diagnosis:
Verify receiver is enabled
Check sender configuration
Test with curl (HTTP)
Check for port conflicts
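A sketch of the last two checks. The ports below are the standard OTLP defaults (4317 gRPC, 4318 HTTP); adjust them if your setup differs:
```bash
# Is anything already listening on the OTLP ports?
lsof -i :4317 -i :4318

# Send a minimal OTLP/HTTP log record to the default endpoint
curl -X POST http://localhost:4318/v1/logs \
  -H "Content-Type: application/json" \
  -d '{"resourceLogs":[{"scopeLogs":[{"logRecords":[{"body":{"stringValue":"hello from curl"},"severityText":"INFO"}]}]}]}'
```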
Solutions:
See Common Issues - OTLP Receiver for detailed fixes.
OTLP Attributes Missing
Symptom: OTLP logs appear but without expected attributes.
Causes:
Attributes in resource vs log record
Resource attributes: service.name, host, etc.
Log record attributes: user_id, request_id, etc.
Both should be extracted
Verify sender includes attributes
Check attribute names
Gonzo shows all attributes
May just be named differently than expected
Custom Format Issues
Custom Format Not Working
Symptom: `--format=my-format` shows an error or doesn't parse.
Diagnosis:
Verify format file exists
Check YAML syntax
Test with built-in format first
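A quick sanity pass; the path and file name below are placeholders, so check the Custom Formats Guide for where Gonzo actually loads formats from:
```bash
# Does the format file exist where you expect it? (placeholder path)
ls -l ~/.config/gonzo/formats/my-format.yaml

# Basic YAML syntax check (uses yq; any YAML linter works)
yq eval '.' ~/.config/gonzo/formats/my-format.yaml > /dev/null
```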
Solutions:
See Custom Formats Guide for:
Format file syntax
Regex patterns
Field mapping
Testing formats
Regex Not Matching
Symptom: Custom format regex doesn't extract fields.
Solutions:
Test regex separately
Use online regex tester
Test at regex101.com
Use example log lines
Verify capture groups
Check for special characters
Start simple, iterate
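A quick local check that a candidate pattern matches a real sample line (GNU grep with PCRE; the pattern is illustrative):
```bash
printf '%s\n' '2024-01-15 10:30:00 ERROR connection refused' |
  grep -qP '^\S+ \S+ (ERROR|WARN|INFO|DEBUG) ' && echo "pattern matches"
```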
Encoding Issues
Special Characters Garbled
Symptom: Non-ASCII characters display incorrectly.
Solutions:
Ensure UTF-8 encoding
Check file encoding
Terminal font support
Use font with good Unicode support
JetBrains Mono, Fira Code, Cascadia Code
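For the encoding checks above, a sketch (`file -i` is GNU/Linux; use `file -I` on macOS, and adjust the source encoding to what `file` reports):
```bash
# Report the detected encoding, then convert to UTF-8 if needed
file -i logs.txt
iconv -f ISO-8859-1 -t UTF-8 logs.txt | gonzo
```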
Binary or Non-Text Data
Symptom: Binary data causes display issues.
Solution:
Performance with Complex Formats
Slow Parsing with Complex Regex
Symptom: Custom format with complex regex causes slowdowns.
Solutions:
Simplify regex patterns
Reduce backtracking
Avoid nested quantifiers: `(.*)*`
Use possessive quantifiers when possible
Anchor patterns: `^` and `$`
Pre-filter logs
Use built-in formats when possible
JSON and logfmt parsing is optimized
Custom regex is slower
Timestamp Issues
Timestamps Not Recognized
Symptom: Logs appear in the wrong order, or timestamps are not extracted.
Common timestamp formats Gonzo recognizes:
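The detected set is not reproduced here; as a general guide, ISO 8601 / RFC 3339 timestamps such as these (illustrative) are the safest choice:
```
2024-01-15T10:30:00Z
2024-01-15T10:30:00.123+02:00
```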
Solutions:
Use ISO 8601 format (recommended)
Ensure timestamp field name
Common names: `timestamp`, `time`, `@timestamp`, `ts`
Gonzo checks these automatically
Custom format for unusual timestamps
Define timestamp extraction in format file
Specify timestamp format
Timezone Issues
Symptom: Timestamps appear in wrong timezone.
Solutions:
Use UTC in logs (recommended)
Include timezone offset
Gonzo displays timestamps as received
No automatic conversion
Format logs consistently at source
Large Log Line Issues
Very Long Lines Truncated
Symptom: Extremely long log lines appear cut off.
Solutions:
Use horizontal scrolling
View in detail modal
Split long lines at source
Configure application to use reasonable line length
Use structured logging to avoid massive single-line logs
Lines Exceed Buffer
Symptom: Some log lines cause errors or don't appear.
Solution:
Gonzo handles lines up to typical buffer limits. For extremely large lines:
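A sketch that caps line length before Gonzo sees it (the 16 KB cutoff is arbitrary):
```bash
cut -c1-16384 huge.log | gonzo
```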
Debugging Format Issues
Test Format Detection
Examine Raw Logs
Compare with Known-Good Format
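A sketch of these checks (`cat -A` is GNU coreutils; use `cat -e` on macOS/BSD):
```bash
# Pipe a single known-good line through Gonzo to confirm detection works at all
echo '{"level":"info","msg":"test","timestamp":"2024-01-15T10:30:00Z"}' | gonzo

# Inspect raw bytes for hidden characters (tabs, carriage returns, etc.)
head -5 logs.txt | cat -A
```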
Common Format Patterns
Application Logs
Go/Logrus:
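Typical logrus JSON output (illustrative):
```
{"level":"info","msg":"request handled","time":"2024-01-15T10:30:00Z"}
```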
✅ Parses as JSON automatically
Python/Logging:
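With a common `%(asctime)s - %(name)s - %(levelname)s - %(message)s` formatter, lines look like (illustrative):
```
2024-01-15 10:30:00,123 - myapp - ERROR - could not connect to database
```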
⚠️ Plain text - create custom format for structure
Node.js/Winston:
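Typical Winston JSON output with a timestamp format enabled (illustrative):
```
{"level":"info","message":"request handled","timestamp":"2024-01-15T10:30:00.123Z"}
```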
✅ Parses as JSON automatically
System Logs
Syslog:
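A classic syslog line (illustrative):
```
Jan 15 10:30:00 web-01 sshd[1234]: Accepted publickey for deploy from 10.0.0.5
```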
⚠️ Plain text - consider custom format
Systemd Journal:
⚠️ Key=value but special format - needs custom parser
Container Logs
Docker JSON:
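A line from Docker's json-file log driver (illustrative):
```
{"log":"GET /healthz 200\n","stream":"stdout","time":"2024-01-15T10:30:00.123456789Z"}
```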
✅ Parses as JSON, extracts nested log
Kubernetes:
✅ Parses as JSON with K8s attributes
Format Best Practices
When Choosing Log Format
Prefer structured formats
JSON or logfmt over plain text
Easier to parse and analyze
Better attribute extraction
Use consistent format
Same format across all services
Easier to aggregate and search
Include standard fields
`level` or `severity`: ERROR, WARN, INFO, DEBUG
`timestamp`: ISO 8601 format
`message` or `msg`: Human-readable message
`service` or `service.name`: Service identifier
Example good JSON log:
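An illustrative line that carries the standard fields above:
```
{"timestamp":"2024-01-15T10:30:00Z","level":"error","service":"checkout-api","msg":"payment declined","order_id":"A-1042"}
```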
When You Can't Change Format
Use custom format definition
Create regex-based parser
Map fields to standard attributes
Pre-process logs
Use awk/sed to restructure
Convert to JSON/logfmt before Gonzo
Use log shipping layer
Fluent Bit, Logstash, Vector
Transform logs to standard format
Getting Help
Provide This Info for Format Issues
Resources
Common Issues - General troubleshooting
Custom Formats Guide - Creating format parsers
GitHub Issues - Report format bugs
Examples Directory - Sample format files