SimpleLogs

A self-hosted log storage and search application with a simple API for log ingestion. The whole stack (server and UI) runs from a single docker-compose up, uses PostgreSQL for scalable storage, supports a team-based workflow, and requires no special logging dependency on the client side.

Features

  • Simple Ingestion API - Send logs with 2 lines of code from any language
  • Full-text Search - Search log messages with PostgreSQL full-text search
  • JSON Metadata - Attach structured data to logs and filter by any field
  • Multi-team - Isolate logs by team with separate API keys
  • Retention Policies - Auto-delete old logs per team
  • Modern Stack - FastAPI + Vue 3 + Vuetify + PostgreSQL
  • Auto HTTPS - Caddy handles SSL certificates automatically

Quick Start

# Clone and configure
git clone <repo-url> simplelogs
cd simplelogs
cp .env.example .env

# Start services
docker-compose up -d

# Access the UI
open http://localhost

Default login: admin@example.com / changeme
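
If you want to confirm from a script that the stack is reachable before logging in, a short poll against the UI URL is enough. This is a minimal sketch using only Python's standard library and the default http://localhost address from above:

import time
import urllib.error
import urllib.request

# Poll the UI until Caddy and the backend answer (up to ~60 seconds).
for attempt in range(30):
    try:
        with urllib.request.urlopen("http://localhost", timeout=2) as resp:
            print("SimpleLogs is up, HTTP", resp.status)
            break
    except (urllib.error.URLError, OSError):
        time.sleep(2)
else:
    print("SimpleLogs did not come up in time")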

Configuration

Edit .env to configure:

# Domain (use real domain for auto-HTTPS)
DOMAIN=localhost

# Database
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=simplelogs

# Security (change in production!)
SECRET_KEY=your-secure-random-string
ADMIN_EMAIL=admin@example.com
ADMIN_PASSWORD=changeme
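
One easy way to generate a suitably random SECRET_KEY is Python's built-in secrets module, for example:

python -c "import secrets; print(secrets.token_urlsafe(48))"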

Sending Logs

Get your API key from the admin panel (Teams → Create Team), then:

curl

curl -X POST http://localhost/api/v1/ingest \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"level": "info", "message": "User logged in", "metadata": {"user_id": 123}}'

Python

import requests

requests.post("http://localhost/api/v1/ingest",
    headers={"X-API-Key": "YOUR_API_KEY"},
    json={
        "level": "error",
        "message": "Payment failed",
        "metadata": {"user_id": 123, "amount": 99.99}
    })

JavaScript

fetch("http://localhost/api/v1/ingest", {
    method: "POST",
    headers: {
        "X-API-Key": "YOUR_API_KEY",
        "Content-Type": "application/json"
    },
    body: JSON.stringify({
        level: "info",
        message: "Order created",
        metadata: { orderId: 456 }
    })
});
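
If your Python application already uses the standard logging module, a small custom handler can forward records to the same endpoint so you don't have to call requests at every log site. This is a minimal sketch, not something shipped with SimpleLogs; it assumes the /api/v1/ingest route and payload fields shown above:

import logging
import requests

class SimpleLogsHandler(logging.Handler):
    """Forward standard logging records to the SimpleLogs ingest API."""

    def __init__(self, url, api_key, source="my-app"):
        super().__init__()
        self.url = url
        self.api_key = api_key
        self.source = source

    def emit(self, record):
        try:
            requests.post(
                self.url,
                headers={"X-API-Key": self.api_key},
                json={
                    # Map Python level names onto the levels SimpleLogs expects
                    "level": {"WARNING": "warn", "CRITICAL": "fatal"}.get(
                        record.levelname, record.levelname.lower()),
                    "message": record.getMessage(),
                    "source": self.source,
                },
                timeout=2,
            )
        except requests.RequestException:
            self.handleError(record)

logging.getLogger().addHandler(
    SimpleLogsHandler("http://localhost/api/v1/ingest", "YOUR_API_KEY"))
logging.getLogger().setLevel(logging.INFO)
logging.info("User logged in")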

Batch Ingestion

Send up to 1000 logs in one request:

curl -X POST http://localhost/api/v1/ingest/batch \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "logs": [
      {"level": "info", "message": "Step 1 complete"},
      {"level": "info", "message": "Step 2 complete"},
      {"level": "error", "message": "Step 3 failed", "metadata": {"error": "timeout"}}
    ]
  }'
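
From Python, the same endpoint pairs naturally with a small buffer that flushes once enough entries have accumulated. A minimal sketch, assuming the /api/v1/ingest/batch route and the 1000-log limit described above:

import requests

class BatchSender:
    """Buffer log entries and ship them to the batch endpoint in chunks."""

    def __init__(self, url, api_key, max_batch=1000):
        self.url = url
        self.api_key = api_key
        self.max_batch = max_batch  # server accepts up to 1000 logs per request
        self.buffer = []

    def add(self, level, message, **metadata):
        self.buffer.append({"level": level, "message": message, "metadata": metadata})
        if len(self.buffer) >= self.max_batch:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        requests.post(
            self.url,
            headers={"X-API-Key": self.api_key},
            json={"logs": self.buffer},
            timeout=5,
        )
        self.buffer = []

sender = BatchSender("http://localhost/api/v1/ingest/batch", "YOUR_API_KEY")
sender.add("info", "Step 1 complete")
sender.add("error", "Step 3 failed", error="timeout")
sender.flush()  # send whatever is left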

Log Format

Field      Type    Required  Description
level      string  No        debug, info, warn, error, fatal (default: info)
message    string  Yes       Log message text
metadata   object  No        JSON object with any additional data
source     string  No        Service/app name
timestamp  string  No        ISO 8601 timestamp (default: server time)
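
Putting the fields together, a fully populated entry might look like this (all values are illustrative):

{
  "level": "warn",
  "message": "Disk usage above 80%",
  "metadata": {"host": "web-1", "usage_percent": 83},
  "source": "monitoring-agent",
  "timestamp": "2024-05-01T12:34:56Z"
}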

Searching Logs

In the UI, you can search by:

  • Text - Full-text search on message content
  • Level - Filter by log level(s)
  • Source - Filter by source/service name
  • Date Range - Filter by time period
  • Metadata - Filter by JSON fields (e.g., user_id=123)

Development

Backend

cd backend
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows
pip install -r requirements.txt

# Run with auto-reload
uvicorn app.main:app --reload

Frontend

cd frontend
npm install
npm run dev

Database Migrations

cd backend

# Initialize (first time)
aerich init -t app.db.TORTOISE_ORM
aerich init-db

# Create migration
aerich migrate --name add_new_field

# Apply migrations
aerich upgrade
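
aerich expects app.db.TORTOISE_ORM to be a standard Tortoise ORM configuration dict. Here is a minimal sketch of what it might look like; the module paths and the DATABASE_URL fallback below are assumptions, so check backend/app/db.py for the real definition:

# app/db.py (sketch, not the actual file)
import os

TORTOISE_ORM = {
    "connections": {
        # e.g. postgres://postgres:postgres@db:5432/simplelogs inside docker-compose
        "default": os.environ.get(
            "DATABASE_URL",
            "postgres://postgres:postgres@localhost:5432/simplelogs"),
    },
    "apps": {
        "models": {
            # aerich.models tracks which migrations have been applied
            "models": ["app.models", "aerich.models"],
            "default_connection": "default",
        },
    },
}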

Production Deployment

  1. Set a real domain in .env:

    DOMAIN=logs.yourdomain.com
  2. Update security settings:

    SECRET_KEY=<generate-a-long-random-string>
    ADMIN_PASSWORD=<strong-password>
    POSTGRES_PASSWORD=<strong-password>
  3. Deploy:

    docker-compose up -d

Caddy will automatically obtain and renew SSL certificates from Let's Encrypt.

License

MIT
