A robust data synchronization service that extracts data from Kaseya Autotask's REST API and stores it in PostgreSQL for advanced analytics and monitoring.
- 🔄 Continuous Sync: Automated data synchronization with configurable intervals
- 📊 Analytics Ready: Pre-built analytics endpoints for Grafana integration
- 🏢 Company Health Monitoring: Track contract profitability and customer metrics
- 🎯 Smart Discovery: Automatic API analysis and sync optimization
- 📈 Real-time Monitoring: Health checks, sync status, and performance metrics
- 🔍 Historical Tracking: Complete change history with audit trails
- ⚡ High Performance: Connection pooling, rate limiting, and incremental sync
Autotask API → Sync Engine → PostgreSQL → Analytics API → Grafana
- Autotask API Client: Wrapper around `autotask-node` with rate limiting and pagination
- Sync Engine: Cron-based scheduling with incremental updates
- Database Layer: PostgreSQL with historical tracking and materialized views
- Web API: Monitoring endpoints and analytics for Grafana
- Discovery Service: API analysis and sync recommendations
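The Sync Engine's incremental updates hinge on remembering the last successful sync time per entity and requesting only records modified since then. A minimal sketch of that idea (the function, field names, and filter shape here are illustrative assumptions, not the service's actual implementation):

```typescript
// Illustrative sketch of an incremental-sync filter builder.
// The field name and filter shape are assumptions for demonstration.
interface QueryFilter {
  field: string;
  op: "gt" | "gte" | "eq";
  value: string;
}

// Build a filter that fetches only records modified since the last sync.
// A null lastSync means "first run": fetch everything (no filter).
function buildIncrementalFilter(lastSync: Date | null): QueryFilter[] {
  if (lastSync === null) return [];
  return [
    { field: "lastActivityDate", op: "gt", value: lastSync.toISOString() },
  ];
}

console.log(buildIncrementalFilter(new Date("2024-06-01T00:00:00Z")));
```

On each successful run the engine would persist the new high-water mark (the `sync_state` table described below serves exactly this purpose).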
- Node.js 18+
- PostgreSQL 14+
- Autotask API credentials (username, integration code, secret)
1. Clone the repository

   ```bash
   git clone <repository-url>
   cd autotask-data-sink
   ```

2. Install dependencies

   ```bash
   npm install
   ```

3. Configure environment

   ```bash
   cp env.example .env
   # Edit .env with your configuration
   ```

4. Set up database

   ```bash
   # Create PostgreSQL database
   createdb autotask_data_sink

   # Run migrations
   npm run db:migrate
   ```

5. Start the service

   ```bash
   npm run dev
   ```
| Variable | Description | Required | Default |
|---|---|---|---|
| `AUTOTASK_USER` | Autotask username | ✅ | - |
| `AUTOTASK_INTEGRATION_CODE` | Integration code | ✅ | - |
| `AUTOTASK_SECRET` | API secret | ✅ | - |
| `AUTOTASK_API_URL` | API endpoint URL | ❌ | Auto-detected |
| `DB_HOST` | PostgreSQL host | ❌ | localhost |
| `DB_PORT` | PostgreSQL port | ❌ | 5432 |
| `DB_NAME` | Database name | ❌ | autotask_data_sink |
| `DB_USER` | Database user | ❌ | postgres |
| `DB_PASSWORD` | Database password | ✅ | - |
| `API_KEY` | API key for endpoint auth | ✅ | - |
| `WEB_PORT` | Web service port | ❌ | 3000 |
| `LOG_LEVEL` | Logging level | ❌ | info |
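The table above translates naturally into a small startup-time loader that applies defaults and fails fast on missing required values. A sketch (the loader itself is illustrative; only the variable names and defaults come from the table):

```typescript
// Illustrative config loader matching the environment variable table.
// Required variables throw at startup; optional ones fall back to defaults.
function required(name: string): string {
  const value = process.env[name];
  if (!value) throw new Error(`Missing required environment variable: ${name}`);
  return value;
}

function optional(name: string, fallback: string): string {
  return process.env[name] ?? fallback;
}

// Required values (AUTOTASK_*, DB_PASSWORD, API_KEY) would use required();
// the optional subset shown here uses the defaults from the table.
const config = {
  dbHost: optional("DB_HOST", "localhost"),
  dbPort: Number(optional("DB_PORT", "5432")),
  dbName: optional("DB_NAME", "autotask_data_sink"),
  dbUser: optional("DB_USER", "postgres"),
  webPort: Number(optional("WEB_PORT", "3000")),
  logLevel: optional("LOG_LEVEL", "info"),
};

console.log(config);
```

Failing fast on missing credentials keeps misconfiguration errors at startup rather than surfacing mid-sync.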
Configure sync frequencies in milliseconds:

```bash
SYNC_INTERVAL_TICKETS=300000        # 5 minutes
SYNC_INTERVAL_TIME_ENTRIES=300000   # 5 minutes
SYNC_INTERVAL_COMPANIES=86400000    # 24 hours
SYNC_INTERVAL_CONTACTS=86400000     # 24 hours
SYNC_INTERVAL_CONTRACTS=3600000     # 1 hour
SYNC_INTERVAL_PROJECTS=3600000      # 1 hour
```

```bash
# Development with hot reload
npm run dev

# Production build and start
npm run build
npm start

# Run tests
npm test
```

```bash
# Check migration status
npm run db:migrate -- --status

# Reset and migrate database
npm run db:migrate -- --reset-and-migrate

# Reset database only
npm run db:migrate -- --reset
```

All `/api/*` endpoints require authentication via an `X-API-Key` header or `Authorization: Bearer <key>`.
```bash
# Trigger discovery analysis
curl -X POST -H "X-API-Key: your_key" http://localhost:3000/api/discovery/run

# Trigger manual sync for a specific entity
curl -X POST -H "X-API-Key: your_key" http://localhost:3000/api/sync/trigger/ticket
curl -X POST -H "X-API-Key: your_key" http://localhost:3000/api/sync/trigger/time_entry
```

- `GET /health` - Service health check (unauthenticated)
- `GET /api/sync/status` - Sync job status and recent operations
- `GET /api/stats/database` - Database statistics
All `/api/*` endpoints require an `X-API-Key` header or `Authorization: Bearer <key>`.

- `GET /api/analytics/contract-profitability` - Contract profitability metrics
- `GET /api/analytics/ticket-metrics?timeframe=7d` - Ticket analytics
- `GET /api/analytics/time-metrics?timeframe=30d` - Time entry analytics
- `GET /api/analytics/company-health` - Company health metrics
- `POST /api/sync/trigger/:entityType` - Trigger manual sync
- `POST /api/discovery/run` - Run API discovery analysis
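Either authentication form works from any HTTP client. A small helper (illustrative, not part of the service) that builds the headers and issues a request using the global `fetch` available in Node 18+:

```typescript
// Illustrative client-side helper for the two authentication schemes above.
function authHeaders(
  apiKey: string,
  scheme: "x-api-key" | "bearer" = "x-api-key"
): Record<string, string> {
  return scheme === "bearer"
    ? { Authorization: `Bearer ${apiKey}` }
    : { "X-API-Key": apiKey };
}

// Usage sketch: fetch ticket analytics from a locally running instance.
async function fetchTicketMetrics(apiKey: string): Promise<unknown> {
  const res = await fetch(
    "http://localhost:3000/api/analytics/ticket-metrics?timeframe=7d",
    { headers: authHeaders(apiKey) }
  );
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```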
- companies - Customer/vendor information
- contacts - Contact details linked to companies
- resources - Autotask users (employees, contractors)
- contracts - Service agreements with companies
- projects - Work projects under contracts
- tickets - Support tickets and issues
- time_entries - Time tracking and billing records
- configuration_items - Assets and devices
- sync_operations - Track sync job execution
- sync_state - Maintain sync timestamps per entity
- entity_history - Historical change tracking
- contract_profitability - Revenue vs. cost analysis
- resource_utilization - Resource hours, billable percentage, projects worked
- ticket_sla_performance - SLA compliance per company
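Materialized views need periodic refreshes to reflect newly synced data. A sketch that generates the refresh statements for the views listed above (the scheduling and database wiring are omitted; `CONCURRENTLY` assumes each view has a unique index, which PostgreSQL requires for non-blocking refreshes):

```typescript
// Generate REFRESH statements for the analytics views listed above.
// CONCURRENTLY avoids blocking readers but requires a unique index per view.
const analyticsViews = [
  "contract_profitability",
  "resource_utilization",
  "ticket_sla_performance",
];

function refreshStatements(views: string[]): string[] {
  return views.map((v) => `REFRESH MATERIALIZED VIEW CONCURRENTLY ${v};`);
}

console.log(refreshStatements(analyticsViews).join("\n"));
```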
Contract Profitability

```sql
SELECT
  contract_name,
  revenue_percentage,
  hours_percentage,
  actual_revenue
FROM contract_profitability
WHERE revenue_percentage IS NOT NULL
ORDER BY revenue_percentage DESC;
```

Ticket Volume Over Time

```sql
SELECT
  date,
  total_tickets,
  open_tickets,
  avg_resolution_hours
FROM (
  SELECT
    DATE_TRUNC('day', created_date) AS date,
    COUNT(*) AS total_tickets,
    COUNT(*) FILTER (WHERE status IN (1, 2, 3)) AS open_tickets,
    AVG(EXTRACT(EPOCH FROM (COALESCE(completion_date, NOW()) - created_date)) / 3600) AS avg_resolution_hours
  FROM tickets
  WHERE created_date > NOW() - INTERVAL '30 days'
  GROUP BY DATE_TRUNC('day', created_date)
) t
ORDER BY date;
```

- Executive Dashboard
  - Contract profitability overview
  - Company health metrics
  - Revenue trends
- Operational Dashboard
  - Active ticket counts
  - Response time metrics
  - Resource utilization
- Financial Dashboard
  - Billable hours tracking
  - Revenue vs. cost analysis
  - Profit margin trends
```
src/
├── config/          # Configuration management
├── database/        # Database connection and migration
├── logger/          # Logging infrastructure
├── services/        # Core business logic
│   ├── discovery.ts # API discovery and analysis
│   ├── sync.ts      # Data synchronization
│   └── web.ts       # Web API service
├── autotask/        # Autotask API client
└── index.ts         # Application entry point
tests/               # Test suites
plans/               # Project planning documents
prompt_logs/         # Development prompt logs
```
Tests are organized by functionality level:
- Unit Tests: Individual component testing
- Integration Tests: Database and API integration
- End-to-End Tests: Full sync workflow validation
```bash
# Run all tests
npm test

# Run specific test suites
npm test -- --grep "Database"
npm test -- --grep "Integration"
```

- ESLint: Code linting and style enforcement
- TypeScript: Static type checking
- Jest: Testing framework
- Pre-commit hooks: Automated testing before commits
```yaml
# docker-compose.yml
version: '3.8'
services:
  autotask-data-sink:
    build: .
    environment:
      - NODE_ENV=production
      - DB_HOST=postgres
    depends_on:
      - postgres
    ports:
      - "3000:3000"

  postgres:
    image: postgres:14
    environment:
      - POSTGRES_DB=autotask_data_sink
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=password
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:
```

- Environment Variables: Use secure secret management
- Database: Configure backup and recovery procedures
- Monitoring: Set up alerts for sync failures and performance issues
- Logging: Configure log aggregation and rotation
- Security: Implement network security and access controls
- Indexes: Proper indexing on foreign keys and date columns
- Connection Pooling: Configured for optimal concurrency
- Query Optimization: Use EXPLAIN ANALYZE for slow queries
- Materialized Views: Pre-computed analytics for better performance
- Configurable delays between API calls
- Exponential backoff on errors
- Batch processing for large datasets
- Pagination handling for memory efficiency
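The exponential backoff behavior above can be expressed as a pure delay schedule plus a retry wrapper. A deterministic sketch without jitter (the base delay, cap, and attempt count are illustrative, not the service's actual tuning):

```typescript
// Illustrative exponential backoff schedule: base * 2^attempt, capped.
function backoffDelayMs(attempt: number, baseMs = 1000, capMs = 60_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

// Retry wrapper applying the schedule to a flaky async call.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 5
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Wait before the next attempt: 1s, 2s, 4s, 8s, ... up to the cap.
      await new Promise((resolve) => setTimeout(resolve, backoffDelayMs(attempt)));
    }
  }
  throw lastError;
}
```

Production implementations typically add random jitter to the delay so many retrying clients do not hammer the API in lockstep.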
Sync Failures
- Check Autotask API credentials
- Verify network connectivity
- Review API rate limits
- Check database connection
Performance Issues
- Monitor database query performance
- Check sync job overlap
- Verify adequate system resources
- Review log files for bottlenecks
Data Inconsistencies
- Run discovery analysis
- Check sync state timestamps
- Verify field mappings
- Review error logs
Logs are structured and categorized by component:
- Application: General application events
- Database: Database operations and errors
- Sync: Synchronization activities
- Autotask: API interactions
- Discovery: Discovery analysis
- Follow the existing code style and patterns
- Add tests for new functionality
- Update documentation for significant changes
- Use descriptive commit messages
[License information]
For issues and questions:
- Check the troubleshooting guide
- Review log files for error details
- Open an issue with detailed information
- Include configuration (sanitized) and error logs