
Datakult logo (icon by Freepik - Flaticon)

Datakult

Review and analyse the media and culture that you consume.

About The Project

A Django application to track and rate the media I consume: movies, TV shows, books, video games, music, and more.

Note: This is a personal project, partly vibe-coded with the help of LLMs. It serves as a learning playground for Django and as a personal tool.

Features

  • Track different media types (books, films, TV series, games, music, podcasts...)
  • Rating system (1-10 scale)
  • Markdown reviews
  • Filters by type, status, year...

(back to top)

Built With

  • Django
  • DaisyUI
  • Tailwind CSS
  • HTMX

(back to top)

Docker Deployment

Production Deployment (using pre-built image)

  1. Download the Docker configuration files:

    mkdir -p docker
    cd docker
    curl -O https://raw.githubusercontent.com/PascalRepond/datakult/main/docker/docker-compose.prod.yml
    curl -O https://raw.githubusercontent.com/PascalRepond/datakult/main/docker/.env.example
  2. Create your .env file from the example:

    cp .env.example .env
  3. Edit the .env file and replace the placeholder values (see the example sketch after these steps):

    • SECRET_KEY: Generate with python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
    • ALLOWED_HOSTS: Add your domain and IP (e.g., datakult.example.com,192.168.1.100,localhost)
    • DJANGO_SUPERUSER_PASSWORD: Use a secure password
    • TMDB_API_KEY: Required if you want to import metadata from TMDB
    • TWITCH_CLIENT_ID and TWITCH_CLIENT_SECRET: Required if you want to import metadata from IGDB
  4. Start the application:

    docker compose -f docker-compose.prod.yml up -d
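For reference, a filled-in .env could look like the sketch below. Only the variables mentioned in step 3 are shown, with placeholder values; treat .env.example as the authoritative template, since it may contain additional settings.

# Example .env — all values are placeholders, replace them with your own
SECRET_KEY=replace-with-a-generated-secret-key
ALLOWED_HOSTS=datakult.example.com,192.168.1.100,localhost
DJANGO_SUPERUSER_PASSWORD=replace-with-a-strong-password
TMDB_API_KEY=your-tmdb-api-key                  # optional, for TMDB imports
TWITCH_CLIENT_ID=your-twitch-client-id          # optional, for IGDB imports
TWITCH_CLIENT_SECRET=your-twitch-client-secret  # optional, for IGDB imports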

The application will be available at http://localhost:8000. Your data will be stored in docker/datakult_data/.

Migrations and superuser creation are handled automatically on first start.

Local Testing (build from source)

To test the Docker build locally before deploying:

# From the project root
docker compose -f docker/docker-compose.local.yml up --build

This builds the image from your current code. Default credentials are admin/admin.

(back to top)

Dev environment

Prerequisites

Before you begin, make sure you have the following installed:

  • uv (Python package manager)
    • Install with: curl -LsSf https://astral.sh/uv/install.sh | sh
  • Node.js 20+ and npm 10+
  • gettext (optional, for translations)
    • Debian/Ubuntu: sudo apt-get install gettext
    • macOS: brew install gettext
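gettext is only needed for the translation workflow. Once the project is set up (see below), the standard Django management commands should apply; this is a sketch assuming the default manage.py workflow (the project may also expose a dedicated poe task), with French as an example locale:

# Extract translatable strings into .po files for the "fr" locale
uv run ./src/manage.py makemessages -l fr

# Compile .po files into the .mo files used at runtime
uv run ./src/manage.py compilemessages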

Setup

# Clone the repository
git clone https://github.com/PascalRepond/datakult.git
cd datakult

# Install dependencies (using uv)
uv sync --frozen

# Initial setup (checks versions and installs everything)
uv run poe bootstrap

# Start the development server
uv run poe server
# The default superuser is admin/admin

Available Commands

uv run poe server          # Dev server with Tailwind hot-reload
uv run poe migrate         # Apply migrations
uv run poe makemigrations  # Create migrations
uv run poe test            # Run tests
uv run poe lint            # Check code with Ruff
uv run poe format          # Format code
uv run poe ci              # Run all checks (format, lint, tests, audits)
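The poe tasks cover the common workflows. Any other Django management command can be run through manage.py in the same way, for example:

uv run ./src/manage.py createsuperuser  # create an additional admin account
uv run ./src/manage.py shell            # open a Django shell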

(back to top)

Backups

Datakult includes a backup system that exports both the database and media files into a .tar.gz archive.

Manual Backup

Local environment:

# Create a backup
uv run poe backup

# Create a backup with automatic rotation (keep only 7 most recent)
uv run ./src/manage.py export_backup --keep=7

# Restore a backup
uv run poe restore path/to/backup.tar.gz

Docker environment:

# Create a backup
docker exec datakult uv run /app/src/manage.py export_backup

# Create a backup with automatic rotation (keep only 7 most recent)
docker exec datakult uv run /app/src/manage.py export_backup --keep=7

# Restore a backup
docker exec datakult uv run /app/src/manage.py import_backup /app/data/backups/backup.tar.gz

Backups are stored in:

  • Local: src/backups/
  • Docker: /app/data/backups/ (mapped to docker/datakult_data/backups/ on the host)
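The exact layout of the archive is not documented here, but since it is a regular .tar.gz file you can inspect a backup with standard tools before restoring it:

# List the contents of a backup without extracting it
tar -tzf path/to/backup.tar.gz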

Automated Backups

For production environments, it's recommended to configure automated backups using your system's cron scheduler:

Example cron configuration (daily backup at 3 AM, keeping 7 most recent):

0 3 * * * docker exec datakult uv run /app/src/manage.py export_backup --keep=7 >> /var/log/datakult-backup.log 2>&1

Local development cron example:

0 3 * * * cd /path/to/datakult && uv run ./src/manage.py export_backup --keep=7 >> /var/log/datakult-backup.log 2>&1

The --keep parameter automatically deletes old backups, maintaining only the N most recent backup files.
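To install the job, add the line to your crontab and check that it is registered; the log file path in the examples above is only a suggestion:

crontab -e   # open your crontab and paste the backup line
crontab -l   # verify that the job is registered
tail -n 20 /var/log/datakult-backup.log   # check recent backup output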

(back to top)

License

Distributed under the GNU General Public License v3.0. See LICENSE for more information.

(back to top)
