A self-hosted master data management (MDM) platform built with Rust. Connect Shops, PIM, CRM, ERP, and any system. Fast, secure, and flexible.
- Dynamic Entity System - Create custom data structures at runtime through the API
- Workflow Engine - DSL-based data pipelines with scheduled and on-demand execution (DSL Documentation)
- API Authentication - JWT and API key support with role-based access control
- Import/Export - CSV, JSON, XML, and third-party API integrations
- Versioning - Full version history for entities, definitions, and workflows
- Self-Hosted - Your data stays on your infrastructure
| Component | Version |
|---|---|
| Docker | 20.10+ |
| PostgreSQL | 14+ |
| Redis | 7+ |
For development, you'll also need:
- Rust 1.92+ (nightly)
- Node.js 22+ (for admin frontend)
- Clone the repository:
```bash
git clone https://github.com/BentBr/r_data_core.git
cd r_data_core
```
- Copy the environment file (and adjust it to your needs):
```bash
cp .env.example .env
```
- Start all services:
```bash
docker compose up -d
```
The application will be available at http://rdatacore.docker if you set up dinghy DNS routing:
- Install Dinghy (just another Docker container) - for macOS:
```bash
docker run -d --restart=always \
  -v /var/run/docker.sock:/tmp/docker.sock:ro \
  -v ~/.dinghy/certs:/etc/nginx/certs \
  -p 80:80 -p 443:443 -p 19322:19322/udp \
  -e CONTAINER_NAME=http-proxy \
  --name http-proxy \
  codekitchen/dinghy-http-proxy
```
- Set up routing:
```bash
sudo mkdir -pv /etc/resolver
sudo bash -c 'echo "nameserver 127.0.0.1" > /etc/resolver/docker'
sudo bash -c 'echo "port 19322" >> /etc/resolver/docker'
```
If you are not on macOS, create a compose.override.yaml and re-assign the web service's ports to your localhost.
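A minimal override might look like the following sketch; the service name `core` matches the `docker compose exec core` commands used further below, and the port mapping assumes the default `API_PORT` of 8888 - adjust both to your setup:

```yaml
# compose.override.yaml - illustrative sketch, not a shipped file;
# service name and ports must match your compose.yaml and .env
services:
  core:
    ports:
      - "8888:8888"  # expose the API on http://localhost:8888
```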
Pull the latest images from GitHub Container Registry:
```bash
# Main application
docker pull ghcr.io/bentbr/r-data-core:latest

# Workflow worker
docker pull ghcr.io/bentbr/r-data-core-worker:latest

# Maintenance worker
docker pull ghcr.io/bentbr/r-data-core-maintenance:latest
```
| Variable | Description |
|---|---|
| `DATABASE_URL` | PostgreSQL connection string |
| `JWT_SECRET` | Secret key for JWT token signing |
| `REDIS_URL` | Redis connection URL |
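Taken together, a minimal `.env` could look like this (the values are placeholders for illustration, not defaults shipped with the project):

```env
DATABASE_URL=postgres://rdata:secret@postgres:5432/r_data_core
JWT_SECRET=replace-with-a-long-random-string
REDIS_URL=redis://redis:6379
```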
| Variable | Default | Description |
|---|---|---|
| `APP_ENV` | `development` | Application environment |
| `API_HOST` | `0.0.0.0` | Server host address |
| `API_PORT` | `8888` | Server port |
| `JWT_EXPIRATION` | `86400` | JWT token expiration (seconds) |
| `API_ENABLE_DOCS` | `true` | Enable Swagger API documentation |
| `CORS_ORIGINS` | `*` | Allowed CORS origins |
| `CACHE_ENABLED` | `true` | Enable caching |
| `CACHE_TTL` | `300` | Default cache TTL (seconds) |
| `CHECK_DEFAULT_ADMIN_PASSWORD` | `true` | Whether the default-admin-password warning is shown in the frontend |
| Variable | Description |
|---|---|
| `VERSION_PURGER_CRON` | Cron expression for the version purger task |
| `REFRESH_TOKEN_CLEANUP_CRON` | Cron expression for the refresh token cleanup task |
| `MAINTENANCE_DATABASE_URL` | PostgreSQL connection string for the maintenance worker |
| `MAINTENANCE_DATABASE_MAX_CONNECTIONS` | Maximum database connections (default: 10) |
| `MAINTENANCE_DATABASE_CONNECTION_TIMEOUT` | Connection timeout in seconds (default: 30) |
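For example, a nightly purge schedule could be set like this (assuming standard five-field cron syntax; check the project's cron parser in case it expects seconds as an extra field):

```env
# Run the version purger every night at 03:00,
# and clean up expired refresh tokens hourly
VERSION_PURGER_CRON=0 3 * * *
REFRESH_TOKEN_CLEANUP_CRON=0 * * * *
```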
See .env.example for the complete list of configuration options.
RDataCore consists of three main components:
- API Server (`r_data_core`) - Handles HTTP requests, authentication, and entity management
- Workflow Worker (`r_data_core_worker`) - Processes workflow jobs from the Redis queue
- Maintenance Worker (`r_data_core_maintenance`) - Runs scheduled maintenance tasks
The main Docker image includes utility binaries for operations and maintenance:
```bash
# Run database migrations
docker compose exec core /usr/local/bin/run_migrations

# Check migration status
docker compose exec core /usr/local/bin/run_migrations --status

# Clear entire Redis cache
docker compose exec core /usr/local/bin/clear_cache --all

# Clear specific cache by prefix
docker compose exec core /usr/local/bin/clear_cache --prefix "entity_definitions:"

# Preview cache deletion (dry-run)
docker compose exec core /usr/local/bin/clear_cache --prefix "api_keys:" --dry-run

# Hash a password for admin users
docker compose exec core /usr/local/bin/hash_password 'YourSecurePassword'
```
| Binary | Description |
|---|---|
| `run_migrations` | Run SQLx database migrations (`--status` to check, `--help` for options) |
| `clear_cache` | Clear Redis cache (`--all` or `--prefix <PREFIX>`, `--dry-run` to preview) |
| `hash_password` | Generate an Argon2 password hash with an SQL update statement |
Key tables:
- `entity_definitions` - Schema definitions for dynamic entities
- `entities_registry` - All entity instances with JSONB field storage
- `workflows` - Workflow definitions with DSL configuration
- `workflow_runs` - Workflow execution history
- `admin_users` - Admin user accounts
- `api_keys` - API authentication keys
Once running, access the API documentation at:
- Public API: http://rdatacore.docker/api/docs/
- Admin API: http://rdatacore.docker/admin/api/docs/
Admin API (requires admin JWT):
- `GET/POST /admin/api/v1/entity-definitions` - Manage entity schemas
- `GET/POST /admin/api/v1/workflows` - Manage workflows
- `GET/POST /admin/api/v1/admin-users` - Manage admin users
- `GET/POST /admin/api/v1/api-keys` - Manage API keys
Public API (JWT or API key):
- `GET/POST /api/v1/entities/{type}` - CRUD operations on entities
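As an illustration of the URL scheme, the sketch below builds (but does not send) a create-entity request. The `Authorization: Bearer` header is an assumption for illustration - verify the exact auth header against the Swagger docs:

```python
import json

BASE_URL = "http://rdatacore.docker"

def build_create_entity_request(entity_type: str, fields: dict, token: str) -> dict:
    """Assemble a POST request for the public entities endpoint."""
    return {
        "method": "POST",
        "url": f"{BASE_URL}/api/v1/entities/{entity_type}",
        # Header name is an assumption; check the API docs for API-key auth.
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(fields),
    }

req = build_create_entity_request("products", {"sku": "SKU-001", "price": 9.99}, "my-token")
print(req["url"])  # http://rdatacore.docker/api/v1/entities/products
```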
Define custom data structures with field types, validation rules, and UI settings:
```json
{
  "entity_type": "products",
  "display_name": "Products",
  "fields": [
    {
      "name": "sku",
      "field_type": "String",
      "required": true,
      "unique": true
    },
    {
      "name": "price",
      "field_type": "Float",
      "required": true
    }
  ]
}
```
- Text: String, Text, Wysiwyg
- Numeric: Integer, Float
- Boolean: Boolean
- Date: Date, DateTime
- Complex: Object, Array, UUID
- Relations: ManyToOne, ManyToMany
- Select: Select, MultiSelect
- Assets: Image, File
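How a definition like the `products` example above can drive validation is sketched below in Python; the type mapping is illustrative only - the platform performs its actual validation server-side in Rust:

```python
# Illustrative checks for a few field types; not the platform's validation code.
TYPE_CHECKS = {
    "String": lambda v: isinstance(v, str),
    "Integer": lambda v: isinstance(v, int) and not isinstance(v, bool),
    "Float": lambda v: isinstance(v, (int, float)) and not isinstance(v, bool),
    "Boolean": lambda v: isinstance(v, bool),
}

def validate(entity: dict, definition: dict) -> list[str]:
    """Return a list of validation errors for an entity against a definition."""
    errors = []
    for field in definition["fields"]:
        name, ftype = field["name"], field["field_type"]
        if field.get("required") and name not in entity:
            errors.append(f"missing required field: {name}")
        elif name in entity and ftype in TYPE_CHECKS and not TYPE_CHECKS[ftype](entity[name]):
            errors.append(f"{name}: expected {ftype}")
    return errors

definition = {
    "entity_type": "products",
    "fields": [
        {"name": "sku", "field_type": "String", "required": True},
        {"name": "price", "field_type": "Float", "required": True},
    ],
}
print(validate({"sku": "SKU-001", "price": 9.99}, definition))  # []
print(validate({"price": "free"}, definition))
```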
Create automated data pipelines using the workflow DSL:
- Fetch Stage - Pull data from external sources (APIs, files, databases)
- Transform Stage - Apply transformations and business logic
- Process Stage - Store, export, or forward processed data
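Since the DSL itself is documented separately, here is only a language-agnostic sketch of the fetch → transform → process shape in Python; the stage functions are hypothetical stand-ins, not the DSL's actual primitives:

```python
# Hypothetical three-stage pipeline mirroring the fetch/transform/process shape.
def fetch() -> list[dict]:
    # A real workflow would pull from an API, file, or database here.
    return [{"sku": "sku-001", "price": "9.99"}]

def transform(rows: list[dict]) -> list[dict]:
    # Business logic: normalize SKUs, parse prices into numbers.
    return [{"sku": r["sku"].upper(), "price": float(r["price"])} for r in rows]

def process(rows: list[dict]) -> int:
    # A real workflow would store, export, or forward; here we count rows.
    return len(rows)

rows = transform(fetch())
print(process(rows), rows)  # 1 [{'sku': 'SKU-001', 'price': 9.99}]
```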
Workflows can be triggered by:
- Cron schedules
- Manual API calls
- Webhook events
- Documentation: API Docs
- Issues: GitHub Issues
- Contact: hello@rdatacore.eu
For development setup, testing, and contribution guidelines, see docs/DEVELOPMENT.md.
See Pricing for license information.
- Free for developers, educators, and small teams
- Commercial licenses available for organizations