# Log Format Issues
Troubleshooting guide for log parsing and format detection problems in Gonzo.
## Format Detection Issues

### Logs Not Being Parsed
**Symptom:** Structured logs appear as plain text instead of being parsed.

**Diagnosis:**

Gonzo automatically detects formats per line based on these rules:

- **JSON**: Lines starting with `{`
- **Logfmt**: Lines containing `key=value` patterns
- **Plain text**: Everything else

**Solutions:**

1. **Verify JSON is valid**

   ```bash
   # Test each line individually
   head -1 logs.json | jq .
   # If there's an error, the JSON is malformed
   ```

2. **Check the line starts with `{`**

   ```bash
   # JSON must start with an opening brace
   head logs.json
   # ✅ Good: {"level":"info"...
   # ❌ Bad: 2024-01-15 {"level":"info"...
   ```

3. **Inspect the logfmt format**

   ```bash
   # Should have key=value pairs
   head logs.txt
   # ✅ Good: level=info service=api msg="started"
   # ❌ Bad: [INFO] service api started
   ```

4. **Use a custom format for non-standard logs**

   ```bash
   gonzo --format=my-custom-format -f logs.txt
   ```
### Mixed Format Logs
**Symptom:** Some lines parse correctly, others don't.

**Explanation:** Gonzo detects format per line, so mixed formats will have inconsistent parsing.

**Example:**

```
{"level":"info","msg":"API started"}   ← Parsed as JSON
level=info msg="processing request"    ← Parsed as logfmt
[ERROR] Connection timeout             ← Plain text
```

**Solutions:**

1. **Accept the mixed display**
   - This is expected behavior
   - Each format renders appropriately

2. **Pre-filter to a single format**

   ```bash
   # Extract only JSON lines
   grep '^{' mixed.log | gonzo
   # Or only logfmt
   grep '=' mixed.log | gonzo
   ```

3. **Convert to a uniform format**

   ```bash
   # Convert everything to JSON with jq
   cat mixed.log | jq -R -c '. | fromjson? // {"message": .}' | gonzo
   ```
### Attributes Not Extracted
**Symptom:** Logs parse but the attributes panel is empty.

**Causes & Solutions:**

1. **Plain text logs have no structured attributes**
   - Plain text can't be parsed into fields
   - Only JSON and logfmt have attributes

2. **Check the JSON structure**

   ```bash
   # Verify the JSON has the expected fields
   head -1 logs.json | jq .
   # Should show key-value pairs
   ```

3. **Nested JSON** (see the flattening sketch below)

   ```json
   {
     "log": {
       "level": "info",
       "service": "api"
     }
   }
   ```

   - Gonzo extracts nested attributes
   - You may need to check nested paths

4. **Missing common attribute names**
   - Gonzo looks for common fields: `level`, `service`, `host`, etc.
   - Custom fields may not be highlighted
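If deeply nested fields don't surface the way you expect, one option is to flatten them into dotted top-level keys before piping to Gonzo. A minimal jq sketch (the `logs.json` filename is a placeholder):

```bash
# Flatten every nested path into a dotted top-level key,
# e.g. {"log":{"level":"info"}} becomes {"log.level":"info"}
jq -c '[leaf_paths as $p | {($p | map(tostring) | join(".")): getpath($p)}] | add' logs.json | gonzo
```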
## JSON Issues

### Malformed JSON
**Symptom:** JSON logs show as plain text or cause errors.

**Common JSON issues:**

1. **Trailing commas**

   ```
   {"level":"info","msg":"test",}   ❌ Invalid
   {"level":"info","msg":"test"}    ✅ Valid
   ```

2. **Single quotes instead of double**

   ```
   {'level':'info'}   ❌ Invalid
   {"level":"info"}   ✅ Valid
   ```

3. **Unescaped quotes in strings**

   ```
   {"msg":"He said "hi""}     ❌ Invalid
   {"msg":"He said \"hi\""}   ✅ Valid
   {"msg":"He said 'hi'"}     ✅ Valid (single quotes inside a string are fine)
   ```

4. **Missing quotes on keys**

   ```
   {level:"info"}     ❌ Invalid
   {"level":"info"}   ✅ Valid
   ```
**Validate & fix:**

```bash
# Validate JSON
cat logs.json | jq . > /dev/null
# If there are errors, jq shows the line number

# Pretty-print to find issues
jq . logs.json

# Fix and re-format
jq -c . logs.json > fixed.json
gonzo -f fixed.json
```

### Multi-line JSON
**Symptom:** JSON objects span multiple lines and are not parsed correctly.

**Example:**

```json
{
  "level": "info",
  "message": "test"
}
```

**Solution:**

Gonzo expects one JSON object per line (JSONL/NDJSON format).

```bash
# Compact multi-line JSON to single lines
jq -c . pretty.json > compact.json
gonzo -f compact.json

# Or pipe directly
jq -c . pretty.json | gonzo
```

### JSON with Metadata Prefix
**Symptom:** Lines have a timestamp or other metadata before the JSON.

**Example:**

```
2024-01-15 10:30:05 {"level":"info","msg":"test"}
```

**Solution:**

Remove the prefix before piping to Gonzo:

```bash
# Remove the timestamp prefix
sed 's/^[0-9-]* [0-9:]* //' logs.txt | gonzo

# Or use awk
awk '{$1=$2=""; print}' logs.txt | gonzo

# Or extract just the JSON part
grep -o '{.*}' logs.txt | gonzo
```

### Escaped JSON in Strings
**Symptom:** JSON contains escaped JSON strings.

**Example:**

```
{"log":"{\"level\":\"info\",\"msg\":\"test\"}"}
```

**Solution:**

```bash
# Unescape the inner JSON
jq -r '.log | fromjson' logs.json | gonzo

# Or handle both levels
jq -c '.log | fromjson? // .' logs.json | gonzo
```

## Logfmt Issues
### Logfmt Not Detected
**Symptom:** Key=value logs appear as plain text.

**Requirements for logfmt detection:**

- Must have `key=value` patterns
- Multiple pairs per line
- Values can be quoted: `key="value with spaces"`

**Example:**

```
level=info service=api user=123 msg="request completed"   ✅ Detected
INFO service api user 123 request completed               ❌ Not logfmt
```

**Solutions:**

1. **Verify the format**

   ```bash
   # Check for key=value patterns
   grep -E '\w+=\w+' logs.txt | head
   ```

2. **Add more key=value pairs**
   - A single pair may not trigger detection
   - Multiple pairs are detected more reliably

3. **Use a custom format**

   ```bash
   gonzo --format=my-logfmt -f logs.txt
   ```
### Spaces in Logfmt Values
**Symptom:** Values with spaces are not parsed correctly.

**Examples:**

```
msg=hello world     ❌ Breaks: "world" is seen as a separate key
msg="hello world"   ✅ Correct: quotes preserve spaces
msg=hello\ world    ✅ Correct: the escape preserves the space
```

**Solution:**

Ensure spaces in values are properly quoted or escaped:

```bash
# Fix unquoted spaces (requires fixing log generation)
# Or accept partial parsing of problematic lines
```

### Logfmt with Nested Structures
**Symptom:** Nested objects in logfmt don't parse well.

**Example:**

```
user.id=123 user.name=john
```

**Explanation:** Logfmt is flat by design. Nested structures need JSON.

**Solutions:**

1. **Accept the flat representation**
   - Gonzo extracts `user.id` and `user.name` as separate attributes

2. **Convert to JSON if needed** (see the sketch below)
   - If you control the log format, use JSON for nested data
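As a stopgap when you can't change the producer, a naive logfmt-to-JSON conversion can be done in awk. A minimal sketch, assuming unquoted values with no spaces or escapes:

```bash
# Naive logfmt -> JSON: split each whitespace-separated token on the first "="
awk '{
  out = "{"; sep = ""
  for (i = 1; i <= NF; i++) {
    eq = index($i, "=")
    if (eq == 0) continue    # skip tokens that are not key=value
    out = out sep "\"" substr($i, 1, eq - 1) "\":\"" substr($i, eq + 1) "\""
    sep = ","
  }
  print out "}"
}' logs.txt | gonzo
```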
## Plain Text Issues

### No Structure Extracted from Text Logs
**Symptom:** Plain text logs show no attributes.

**Explanation:** Plain text logs can't be parsed into structured fields automatically.

**Examples:**

```
[2024-01-15 10:30:05] ERROR: Connection failed
INFO - api-service - User login successful
```

**Solutions:**

1. **Accept the plain text display**
   - Logs are still searchable and analyzable
   - There are just no structured attributes

2. **Create a custom format parser**

   ```bash
   # Define a regex-based parser
   # See the Custom Formats Guide
   gonzo --format=my-text-format -f logs.txt
   ```

3. **Convert logs to a structured format** (see the sketch below)
   - Modify the application to output JSON/logfmt
   - Use a log shipper to add structure (Fluent Bit, Logstash)
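For bracketed text logs like the first example above, a quick sed rewrite can add structure on the fly. A minimal sketch, assuming lines match the exact `[date time] LEVEL: message` shape shown:

```bash
# Turn "[2024-01-15 10:30:05] ERROR: Connection failed" into
# time="2024-01-15 10:30:05" level=ERROR msg="Connection failed"
sed -E 's/^\[([^]]+)\] ([A-Z]+): (.*)$/time="\1" level=\2 msg="\3"/' logs.txt | gonzo
```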
### Severity Not Detected in Text Logs
**Symptom:** Plain text logs don't show color-coded severity.

**Explanation:** Gonzo looks for common severity keywords in text logs:

- ERROR, FATAL, CRITICAL → Red
- WARN, WARNING → Yellow
- INFO → Green
- DEBUG → Blue
- TRACE → White

**Solutions:**

1. **Include severity keywords**

   ```
   [ERROR] Connection failed   ✅ Detected
   Connection failed           ❌ Not detected
   WARN: Low disk space        ✅ Detected
   Disk space is low           ❌ Not detected
   ```

2. **Use a consistent format**
   - Put the severity at the start of the line
   - Use standard keywords (ERROR, WARN, INFO, DEBUG)

3. **Create a custom format** (or remap levels, as sketched below)
   - Define a severity extraction pattern
   - Map custom levels to standard severities
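If your logs use non-standard level names, you can rewrite them to the keywords listed above before piping to Gonzo. A minimal sketch; the level names on the left are examples, not an exhaustive list, and `\b` word boundaries assume GNU sed:

```bash
# Map custom level names onto keywords Gonzo recognizes (GNU sed)
sed -E -e 's/\bCRIT\b/CRITICAL/g' \
       -e 's/\bERR\b/ERROR/g' \
       -e 's/\bNOTICE\b/INFO/g' app.log | gonzo
```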
## OTLP Format Issues

### OTLP Logs Not Appearing
**Symptom:** The OTLP receiver is running but no logs appear in Gonzo.

**Diagnosis:**

1. **Verify the receiver is enabled**

   ```bash
   # Check Gonzo started with OTLP
   gonzo --otlp-enabled
   # Should show listening on ports 4317 and 4318
   ```

2. **Check the sender configuration**

   ```yaml
   # Verify the endpoint in the sender
   endpoint: localhost:4317                  # gRPC
   endpoint: http://localhost:4318/v1/logs   # HTTP
   ```

3. **Test with curl (HTTP)**

   ```bash
   curl -X POST http://localhost:4318/v1/logs \
     -H "Content-Type: application/json" \
     -d '{"resourceLogs":[]}'
   # Should return 200 OK
   ```

4. **Check for port conflicts**

   ```bash
   lsof -i :4317
   lsof -i :4318
   ```

**Solutions:**

See Common Issues - OTLP Receiver for detailed fixes.
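The empty `resourceLogs` payload above only proves the endpoint answers. To confirm a record actually renders, you can send one log; a minimal sketch using the standard OTLP/HTTP JSON encoding (the service name and body text are placeholders):

```bash
curl -X POST http://localhost:4318/v1/logs \
  -H "Content-Type: application/json" \
  -d '{
    "resourceLogs": [{
      "resource": {
        "attributes": [
          {"key": "service.name", "value": {"stringValue": "curl-test"}}
        ]
      },
      "scopeLogs": [{
        "logRecords": [{
          "severityText": "INFO",
          "body": {"stringValue": "hello from curl"}
        }]
      }]
    }]
  }'
# An INFO line with body "hello from curl" should appear in Gonzo
```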
### OTLP Attributes Missing
**Symptom:** OTLP logs appear but without the expected attributes.

**Causes:**

1. **Attributes live on the resource vs. the log record**
   - Resource attributes: `service.name`, `host`, etc.
   - Log record attributes: `user_id`, `request_id`, etc.
   - Both should be extracted

2. **Verify the sender includes attributes**

   ```python
   # Ensure attributes are set
   logger_provider.add_log_record_processor(processor)
   # Check resource attributes and log attributes
   ```

3. **Check attribute names**
   - Gonzo shows all attributes
   - They may just be named differently than expected
## Custom Format Issues

### Custom Format Not Working
**Symptom:** `--format=my-format` shows an error or doesn't parse.

**Diagnosis:**

1. **Verify the format file exists**

   ```bash
   ls ~/.config/gonzo/formats/my-format.yaml
   ```

2. **Check the YAML syntax**

   ```bash
   cat ~/.config/gonzo/formats/my-format.yaml
   # Validate if you have yamllint
   yamllint ~/.config/gonzo/formats/my-format.yaml
   ```

3. **Test with a built-in format first**

   ```bash
   # Verify custom formats work at all
   gonzo --format=loki-stream -f test.json
   ```

**Solutions:**

See the Custom Formats Guide for:

- Format file syntax
- Regex patterns
- Field mapping
- Testing formats
### Regex Not Matching
**Symptom:** The custom format regex doesn't extract fields.

**Solutions:**

1. **Test the regex separately**

   ```bash
   # Test your regex pattern
   echo "sample log line" | grep -E "your-regex-pattern"
   ```

2. **Use an online regex tester**
   - Test at regex101.com
   - Use example log lines
   - Verify capture groups

3. **Check for special characters**

   ```yaml
   # Escape special characters
   pattern: '\[(\d+)\]'   # Brackets escaped
   ```

4. **Start simple, then iterate**

   ```yaml
   # Begin with a basic pattern
   pattern: '(\w+)'
   # Then add complexity
   pattern: '(\w+)=(\w+)'
   # Finally, the full pattern
   pattern: '(\w+)="([^"]*)"'
   ```
## Encoding Issues

### Special Characters Garbled
**Symptom:** Non-ASCII characters display incorrectly.

**Solutions:**

1. **Ensure UTF-8 encoding**

   ```bash
   export LANG=en_US.UTF-8
   export LC_ALL=en_US.UTF-8
   ```

2. **Check the file encoding**

   ```bash
   file logs.txt
   # Should show: UTF-8 Unicode text
   # Convert if needed
   iconv -f ISO-8859-1 -t UTF-8 logs.txt > logs_utf8.txt
   ```

3. **Terminal font support**
   - Use a font with good Unicode support
   - JetBrains Mono, Fira Code, Cascadia Code
### Binary or Non-Text Data
**Symptom:** Binary data causes display issues.

**Solution:**

```bash
# Filter out binary data
strings logs.bin | gonzo

# Or ensure only text logs are processed
file logs.txt   # Should be "text", not "data"
```

## Performance with Complex Formats
### Slow Parsing with Complex Regex
**Symptom:** A custom format with a complex regex causes slowdowns.

**Solutions:**

1. **Simplify regex patterns**

   ```yaml
   # Instead of:
   pattern: '.*?(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+Z).*'
   # Use:
   pattern: '(\d{4}-\d{2}-\d{2}T[\d:.]+Z)'
   ```

2. **Reduce backtracking**
   - Avoid nested quantifiers: `(.*)*`
   - Use possessive quantifiers when possible
   - Anchor patterns: `^` and `$`
3. **Pre-filter logs**

   ```bash
   # Filter before complex parsing
   grep "pattern" logs.txt | gonzo --format=complex
   ```

4. **Use built-in formats when possible**
   - JSON and logfmt parsing is optimized
   - Custom regex is slower
## Timestamp Issues

### Timestamps Not Recognized
**Symptom:** Logs appear in the wrong order or the timestamp is not extracted.

**Common timestamp formats Gonzo recognizes:**

```
2024-01-15T10:30:05Z          # ISO 8601
2024-01-15T10:30:05.123456Z   # ISO 8601 with microseconds
2024-01-15 10:30:05           # Common format
Jan 15 10:30:05               # Syslog format
1705315805                    # Unix timestamp
```

**Solutions:**

1. **Use ISO 8601 format (recommended)**

   ```json
   {"timestamp":"2024-01-15T10:30:05Z","msg":"test"}
   ```

2. **Ensure the timestamp field name is recognized**
   - Common names: `timestamp`, `time`, `@timestamp`, `ts`
   - Gonzo checks these automatically

3. **Use a custom format for unusual timestamps** (or normalize them, as sketched below)
   - Define timestamp extraction in the format file
   - Specify the timestamp format
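If your JSON logs carry Unix-epoch timestamps, you can also normalize them to ISO 8601 before piping to Gonzo. A minimal jq sketch, assuming an epoch-seconds field named `ts`:

```bash
# Convert an epoch-seconds "ts" field into an ISO 8601 "timestamp" field
jq -c '.timestamp = (.ts | todate) | del(.ts)' logs.json | gonzo
```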
### Timezone Issues
**Symptom:** Timestamps appear in the wrong timezone.

**Solutions:**

1. **Use UTC in logs (recommended)**

   ```json
   {"timestamp":"2024-01-15T10:30:05Z"}   # Z indicates UTC
   ```

2. **Include a timezone offset**

   ```json
   {"timestamp":"2024-01-15T10:30:05-05:00"}   # EST
   ```

3. **Gonzo displays timestamps as received**
   - There is no automatic conversion
   - Format logs consistently at the source
## Large Log Line Issues

### Very Long Lines Truncated
**Symptom:** Extremely long log lines appear cut off.

**Solutions:**

1. **Use horizontal scrolling**

   ```
   ← → or h l   # Scroll horizontally
   ```

2. **View in the detail modal**

   ```
   Enter   # Opens the full log entry
   ```

3. **Split long lines at the source**
   - Configure the application to use a reasonable line length
   - Use structured logging to avoid massive single-line logs
### Lines Exceed Buffer
**Symptom:** Some log lines cause errors or don't appear.

**Solution:**

Gonzo handles lines up to typical buffer limits. For extremely large lines:

```bash
# Pre-process to truncate lines
cut -c 1-10000 massive.log | gonzo

# Or filter out problematic lines
awk 'length($0) < 10000' massive.log | gonzo
```

## Debugging Format Issues

### Test Format Detection
```bash
# Test with a minimal sample
echo '{"level":"info","msg":"test"}' | gonzo
# Should parse as JSON

echo 'level=info msg=test' | gonzo
# Should parse as logfmt

echo 'INFO test message' | gonzo
# Should show as plain text
```

### Examine Raw Logs
```bash
# Check the first few lines
head -5 logs.txt

# Check for hidden characters
cat -A logs.txt | head

# Validate JSON structure
head -1 logs.json | jq .

# Check line endings
file logs.txt   # Shows CRLF vs LF
```

### Compare with a Known-Good Format
```bash
# Test with a working format first
echo '{"level":"info","message":"test"}' > good.json
gonzo -f good.json

# Then compare with the problem logs
diff <(head -1 good.json) <(head -1 problem.json)
```

## Common Format Patterns
### Application Logs
**Go/Logrus:**

```json
{"level":"info","msg":"started","time":"2024-01-15T10:30:05Z"}
```

✅ Parses as JSON automatically

**Python/logging:**

```
2024-01-15 10:30:05,123 INFO module: message
```

⚠️ Plain text - create a custom format for structure

**Node.js/Winston:**

```json
{"level":"info","message":"started","timestamp":"2024-01-15T10:30:05Z"}
```

✅ Parses as JSON automatically
### System Logs
**Syslog:**

```
Jan 15 10:30:05 hostname service[123]: message
```

⚠️ Plain text - consider a custom format (or the sed rewrite below)
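As an alternative to a format file, classic syslog lines can be rewritten to logfmt on the fly. A minimal sketch, assuming the exact `Mon DD HH:MM:SS host proc[pid]: message` shape shown above:

```bash
# "Jan 15 10:30:05 hostname service[123]: message" becomes
# time="Jan 15 10:30:05" host=hostname service=service pid=123 msg="message"
sed -E 's/^([A-Z][a-z]{2} +[0-9]+ [0-9:]+) ([^ ]+) ([^[]+)\[([0-9]+)\]: (.*)$/time="\1" host=\2 service=\3 pid=\4 msg="\5"/' syslog.log | gonzo
```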
**Systemd journal:**

```
MESSAGE=Test log
PRIORITY=6
_HOSTNAME=server
```

⚠️ Key=value but a special multi-line format - needs a custom parser (or see the journalctl tip below)
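Rather than parsing the journal's native export format, it is often easier to have journalctl emit JSON, which Gonzo detects natively (the unit name is a placeholder):

```bash
# One JSON object per line, straight from the journal
journalctl -u my-service.service -o json | gonzo
```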
### Container Logs
**Docker JSON:**

```json
{"log":"application log message\n","stream":"stdout","time":"2024-01-15T10:30:05Z"}
```

✅ Parses as JSON; extracts the nested `log` field
**Kubernetes:**

```json
{"level":"info","msg":"test","pod":"app-123"}
```

✅ Parses as JSON with Kubernetes attributes
## Format Best Practices

### When Choosing a Log Format
1. **Prefer structured formats**
   - JSON or logfmt over plain text
   - Easier to parse and analyze
   - Better attribute extraction

2. **Use a consistent format**
   - The same format across all services
   - Easier to aggregate and search

3. **Include standard fields**
   - `level` or `severity`: ERROR, WARN, INFO, DEBUG
   - `timestamp`: ISO 8601 format
   - `message` or `msg`: human-readable message
   - `service` or `service.name`: service identifier

**Example of a good JSON log:**

```json
{
  "timestamp": "2024-01-15T10:30:05Z",
  "level": "error",
  "service": "api",
  "message": "Database connection failed",
  "error": "connection timeout",
  "host": "prod-server-01"
}
```
### When You Can't Change the Format
1. **Use a custom format definition**
   - Create a regex-based parser
   - Map fields to standard attributes

2. **Pre-process logs** (see the sketch after this list)
   - Use awk/sed to restructure
   - Convert to JSON/logfmt before Gonzo

3. **Use a log shipping layer**
   - Fluent Bit, Logstash, Vector
   - Transform logs to a standard format
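As an example of the pre-processing route, the Python logging lines shown under Common Format Patterns can be restructured into JSON with awk. A minimal sketch, assuming the `date time,ms LEVEL module: message` shape and messages without double quotes:

```bash
# "2024-01-15 10:30:05,123 INFO module: message" becomes
# {"timestamp":"2024-01-15T10:30:05.123","level":"INFO","module":"module","message":"message"}
awk '{
  ts = $2; sub(",", ".", ts)       # 10:30:05,123 -> 10:30:05.123
  mod = $4; sub(/:$/, "", mod)     # strip the trailing colon
  msg = ""
  for (i = 5; i <= NF; i++) msg = msg (i > 5 ? " " : "") $i
  printf "{\"timestamp\":\"%sT%s\",\"level\":\"%s\",\"module\":\"%s\",\"message\":\"%s\"}\n", $1, ts, $3, mod, msg
}' app.log | gonzo
```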
## Getting Help

### Provide This Info for Format Issues
```bash
# Sample log lines (3-5 lines)
head -5 logs.txt

# File encoding
file logs.txt

# Attempted command
echo "gonzo -f logs.txt [--format=...]"

# Expected vs actual behavior
echo "Expected: Parse as JSON"
echo "Actual: Shows as plain text"

# Gonzo version
gonzo --version
```

## Resources
- Common Issues - General troubleshooting
- Custom Formats Guide - Creating format parsers
- GitHub Issues - Report format bugs
- Examples Directory - Sample format files