Integration

Integrate gnaw into your development workflow, CI/CD pipelines, and monitoring systems.

Development Workflow Integration

Editor Integration

VS Code

Create a custom task in .vscode/tasks.json:

{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "Search Codebase",
            "type": "shell",
            "command": "gnaw",
            "args": ["${input:searchPattern}", "${workspaceFolder}"],
            "group": "build",
            "presentation": {
                "echo": true,
                "reveal": "always",
                "focus": false,
                "panel": "new"
            }
        }
    ],
    "inputs": [
        {
            "id": "searchPattern",
            "description": "Search pattern",
            "default": "TODO",
            "type": "promptString"
        }
    ]
}

Vim/Neovim

Add to your .vimrc or init.vim:

" Search with gnaw
nnoremap <leader>g :!gnaw <cword> %<cr>
vnoremap <leader>g :!gnaw <cword> %<cr>

" Search in project
nnoremap <leader>G :!gnaw <cword> .<cr>

Git Hooks

Pre-commit Hook

Create .git/hooks/pre-commit and make it executable (chmod +x .git/hooks/pre-commit), since Git skips hooks that are not executable:

#!/bin/bash
# Check for TODO comments
if gnaw -l "TODO" src/; then
    echo "Warning: TODO comments found in code"
    echo "Consider addressing before committing"
fi

# Check for debug statements
if gnaw -l "console\.log\|print\|debugger" src/; then
    echo "Warning: Debug statements found"
    echo "Remove debug statements before committing"
fi

Commit Message Hook

Create .git/hooks/commit-msg and make it executable (chmod +x .git/hooks/commit-msg):

#!/bin/bash
# Check commit message for common issues
if gnaw -i "fix\|bug\|error" "$1"; then
    echo "Commit message contains issue-related keywords"
    echo "Consider adding issue number or reference"
fi

CI/CD Integration

GitHub Actions

Create .github/workflows/code-quality.yml:

name: Code Quality

on: [push, pull_request]

jobs:
  code-quality:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      
      - name: Install gnaw
        run: |
          curl -L https://github.com/10printhello/gnaw/releases/latest/download/gnaw-linux-x86_64.tar.gz | tar xz
          sudo mv gnaw /usr/local/bin/
      
      - name: Check for security issues
        run: |
          if gnaw -l "password.*=\|api_key.*=\|secret.*=" src/; then
            echo "Security issue: potential hardcoded credentials"
            exit 1
          fi
      
      - name: Check for TODO comments
        run: |
          TODO_COUNT=$(gnaw --raw -c "TODO" src/)
          if [ "$TODO_COUNT" -gt 10 ]; then
            echo "Too many TODO comments: $TODO_COUNT"
            exit 1
          fi
      
      - name: Check for debug statements
        run: |
          if gnaw -l "console\.log\|print\|debugger" src/; then
            echo "Debug statements found in code"
            exit 1
          fi

GitLab CI

Create .gitlab-ci.yml:

stages:
  - quality

code-quality:
  stage: quality
  image: ubuntu:latest
  before_script:
    - apt-get update && apt-get install -y curl
    - curl -L https://github.com/10printhello/gnaw/releases/latest/download/gnaw-linux-x86_64.tar.gz | tar xz
    - mv gnaw /usr/local/bin/
  script:
    - gnaw -l "password.*=\|api_key.*=" src/ || echo "No hardcoded credentials found"
    - gnaw --raw -c "TODO" src/ | awk '{if($1>5) exit 1}'
    - gnaw -l "console\.log\|print" src/ || echo "No debug statements found"

Jenkins Pipeline

Create Jenkinsfile:

pipeline {
    agent any
    
    stages {
        stage('Code Quality') {
            steps {
                sh '''
                    # Install gnaw
                    curl -L https://github.com/10printhello/gnaw/releases/latest/download/gnaw-linux-x86_64.tar.gz | tar xz
                    sudo mv gnaw /usr/local/bin/
                    
                    # Run quality checks
                    if gnaw -l "password.*=\|api_key.*=" src/; then
                        echo "Potential hardcoded credentials found"
                        exit 1
                    fi
                    TODO_COUNT=$(gnaw --raw -c "TODO" src/)
                    if [ "$TODO_COUNT" -gt 10 ]; then
                        echo "Too many TODO comments: $TODO_COUNT"
                        exit 1
                    fi
                '''
            }
        }
    }
}

Monitoring Integration

Log Monitoring

Real-time Error Monitoring

#!/bin/bash
# monitor-errors.sh
tail -f /var/log/app.log | gnaw --stream "ERROR" | while read -r line; do
    echo "$(date): $line" >> /var/log/errors.log

    # Send to monitoring system; jq escapes quotes and backslashes in the line
    curl -X POST -H "Content-Type: application/json" \
         -d "$(jq -cn --arg ts "$(date -Iseconds)" --arg msg "$line" \
               '{timestamp: $ts, level: "ERROR", message: $msg}')" \
         http://monitoring.example.com/api/logs
done
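Interpolating a raw log line straight into a JSON string breaks as soon as the line contains a quote or backslash. If the sender is a Python process rather than a shell loop, json.dumps handles the escaping; a minimal sketch with the same payload fields as the script above (the endpoint stays whatever your monitoring system expects):

```python
import json
from datetime import datetime, timezone

def error_payload(line: str) -> str:
    """Build the monitoring payload; json.dumps escapes quotes and backslashes."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "level": "ERROR",
        "message": line,
    })

# A line with embedded quotes survives intact:
print(error_payload('disk "full" on /dev/sda1'))
```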

Log Analysis Script

#!/bin/bash
# analyze-logs.sh
LOG_FILE="/var/log/app.log"
DATE=$(date +%Y-%m-%d)

# Count errors by type
ERROR_COUNT=$(gnaw --raw -c "ERROR" "$LOG_FILE")
WARN_COUNT=$(gnaw --raw -c "WARN" "$LOG_FILE")

# Get error details
gnaw --json "ERROR" "$LOG_FILE" | jq '.results[] | {time: .line | split(" ")[0:2] | join(" "), message: .line}' > "/tmp/errors-$DATE.json"

# Send summary
curl -X POST -H "Content-Type: application/json" \
     -d "{\"date\":\"$DATE\",\"errors\":$ERROR_COUNT,\"warnings\":$WARN_COUNT}" \
     http://monitoring.example.com/api/summary
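jq is handy for quick extraction, but the same per-file roll-up is easy in Python once the analysis grows. This sketch assumes the gnaw --json schema used elsewhere in this guide (a results array whose entries carry file_name, line_number, and line fields):

```python
import json
from collections import Counter

def errors_per_file(gnaw_json: str) -> Counter:
    """Count matches per file from gnaw --json output."""
    data = json.loads(gnaw_json)
    return Counter(match["file_name"] for match in data.get("results", []))

sample = json.dumps({"results": [
    {"file_name": "app.log", "line_number": 3, "line": "ERROR db down"},
    {"file_name": "app.log", "line_number": 9, "line": "ERROR db down"},
    {"file_name": "worker.log", "line_number": 1, "line": "ERROR oom"},
]})
print(errors_per_file(sample))  # Counter({'app.log': 2, 'worker.log': 1})
```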

Prometheus Integration

Custom Metrics

#!/bin/bash
# prometheus-metrics.sh
ERROR_COUNT=$(gnaw --raw -c "ERROR" /var/log/app.log)
WARN_COUNT=$(gnaw --raw -c "WARN" /var/log/app.log)

# Rewrite the file each run (appending with >> would duplicate the series),
# then move it into place so a scrape never reads a partial file
{
    echo "# HELP app_errors_total Total number of errors"
    echo "# TYPE app_errors_total counter"
    echo "app_errors_total $ERROR_COUNT"

    echo "# HELP app_warnings_total Total number of warnings"
    echo "# TYPE app_warnings_total counter"
    echo "app_warnings_total $WARN_COUNT"
} > /tmp/metrics.tmp
mv /tmp/metrics.tmp /tmp/metrics

API Integration

REST API Wrapper

Create a simple REST API wrapper:

#!/usr/bin/env python3
# gnaw-api.py
from flask import Flask, request, jsonify
import subprocess
import json

app = Flask(__name__)

@app.route('/search', methods=['POST'])
def search():
    data = request.get_json(silent=True) or {}
    pattern = data.get('pattern')
    path = data.get('path', '.')

    if not pattern:
        return jsonify({'error': 'pattern is required'}), 400

    try:
        result = subprocess.run(
            ['gnaw', '--json', pattern, path],
            capture_output=True,
            text=True
        )
        
        if result.returncode == 0:
            return jsonify(json.loads(result.stdout))
        else:
            return jsonify({'error': result.stderr}), 400
            
    except Exception as e:
        return jsonify({'error': str(e)}), 500

@app.route('/agent/search', methods=['POST'])
def agent_search():
    data = request.get_json(silent=True) or {}
    query = data.get('query')
    file_type = data.get('file_type')
    directory = data.get('directory')

    if not query:
        return jsonify({'error': 'query is required'}), 400
    
    cmd = ['gnaw', 'agent', 'ask', '--json', query]
    if file_type:
        cmd.extend(['--type', file_type])
    if directory:
        cmd.extend(['--dir', directory])
    
    try:
        result = subprocess.run(cmd, capture_output=True, text=True)
        
        if result.returncode == 0:
            return jsonify(json.loads(result.stdout))
        else:
            return jsonify({'error': result.stderr}), 400
            
    except Exception as e:
        return jsonify({'error': str(e)}), 500

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)

Webhook Integration

Slack Webhook

#!/bin/bash
# slack-webhook.sh
WEBHOOK_URL="https://hooks.slack.com/services/YOUR/WEBHOOK/URL"

# Monitor for errors; jq escapes quotes in the log line before it is sent
tail -f /var/log/app.log | gnaw --stream "ERROR" | while read -r line; do
    curl -X POST -H "Content-Type: application/json" \
         -d "$(jq -cn --arg text "🚨 Error detected: $line" '{text: $text}')" \
         "$WEBHOOK_URL"
done

Discord Webhook

#!/bin/bash
# discord-webhook.sh
WEBHOOK_URL="https://discord.com/api/webhooks/YOUR/WEBHOOK/URL"

# Monitor for critical errors; jq escapes quotes in the log line before it is sent
tail -f /var/log/app.log | gnaw --stream "FATAL\|CRITICAL" | while read -r line; do
    curl -X POST -H "Content-Type: application/json" \
         -d "$(jq -cn --arg content "🚨 Critical error: $line" '{content: $content}')" \
         "$WEBHOOK_URL"
done

Database Integration

Log Analysis to Database

#!/usr/bin/env python3
# log-to-db.py
import sqlite3
import subprocess
import json
from datetime import datetime

def analyze_logs():
    # Run gnaw analysis
    result = subprocess.run(
        ['gnaw', '--json', 'ERROR', '/var/log/app.log'],
        capture_output=True,
        text=True
    )
    
    if result.returncode == 0:
        data = json.loads(result.stdout)
        
        # Store in database
        conn = sqlite3.connect('logs.db')
        cursor = conn.cursor()
        
        cursor.execute('''
            CREATE TABLE IF NOT EXISTS errors (
                id INTEGER PRIMARY KEY,
                timestamp TEXT,
                file_name TEXT,
                line_number INTEGER,
                message TEXT
            )
        ''')
        
        for error in data['results']:
            cursor.execute('''
                INSERT INTO errors (timestamp, file_name, line_number, message)
                VALUES (?, ?, ?, ?)
            ''', (
                datetime.now().isoformat(),
                error['file_name'],
                error['line_number'],
                error['line']
            ))
        
        conn.commit()
        conn.close()

if __name__ == '__main__':
    analyze_logs()
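Once rows accumulate, simple queries surface hot spots. A sketch against the errors table created above (the table is created on demand so the query also works before the first analysis run; the LIMIT of 5 is arbitrary):

```python
import sqlite3

def noisiest_files(conn: sqlite3.Connection, limit: int = 5):
    """Return (file_name, error_count) pairs from the errors table, busiest first."""
    # Same schema as analyze_logs(); no-op if the table already exists
    conn.execute('''
        CREATE TABLE IF NOT EXISTS errors (
            id INTEGER PRIMARY KEY,
            timestamp TEXT,
            file_name TEXT,
            line_number INTEGER,
            message TEXT
        )
    ''')
    return conn.execute(
        "SELECT file_name, COUNT(*) AS n FROM errors "
        "GROUP BY file_name ORDER BY n DESC, file_name LIMIT ?",
        (limit,),
    ).fetchall()

conn = sqlite3.connect('logs.db')
for file_name, count in noisiest_files(conn):
    print(f"{file_name}: {count}")
```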

Best Practices

- Use streaming for real-time monitoring
- Set appropriate memory limits
- Monitor resource usage
- Validate input patterns
- Sanitize output data
- Use secure communication channels
- Implement error handling
- Use timeouts for long-running operations
- Monitor and log failures

Start with simple integrations and gradually add complexity as your needs grow. gnaw's flexibility makes it easy to integrate into existing workflows.
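"Use timeouts for long-running operations" applies directly to the subprocess calls in the API wrapper: a pathological pattern can make a search run far longer than a web request should. A minimal sketch of the pattern (the 10-second default is an arbitrary illustration):

```python
import subprocess

def run_with_timeout(cmd, timeout_s=10):
    """Run a command, but kill it and return None if it exceeds timeout_s."""
    try:
        return subprocess.run(cmd, capture_output=True, text=True, timeout=timeout_s)
    except subprocess.TimeoutExpired:
        return None

print(run_with_timeout(["echo", "ok"]).stdout.strip())  # ok
print(run_with_timeout(["sleep", "5"], timeout_s=1))    # None
```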