diff --git a/apps/temps-cli/README.md b/apps/temps-cli/README.md
index 1b8066ee..123dadef 100644
--- a/apps/temps-cli/README.md
+++ b/apps/temps-cli/README.md
@@ -1,15 +1,340 @@
-# temps-cli
+Temps Platform
+
-To install dependencies:
+# @temps-sdk/cli
+
+npm version · npm downloads · license
+
+Command-line interface for the Temps deployment platform. Deploy, manage, and monitor your applications from the terminal -- no dashboard required.
+
+---
+
+```bash
+# npm
+npm install -g @temps-sdk/cli
+
+# bun
+bun add -g @temps-sdk/cli
+
+# pnpm
+pnpm add -g @temps-sdk/cli
+
+# Or run without installing
+npx @temps-sdk/cli
+bunx @temps-sdk/cli
+```
+
+## Quick Start
+
+```bash
+# Authenticate with your Temps instance
+bunx @temps-sdk/cli login
+
+# Initialize a project in the current directory
+bunx @temps-sdk/cli init
+
+# Deploy
+bunx @temps-sdk/cli up
+
+# Check status
+bunx @temps-sdk/cli status
+```
+
+That's it. The CLI detects your framework, connects your git repo, and deploys -- all in one command.
+
+## Configuration
+
+### Interactive Setup
+
+```bash
+bunx @temps-sdk/cli configure
+```
+
+Walks you through setting your Temps API URL and authentication token, stored in `~/.temps/config.json`.
+
+### Environment Variables
+
+```bash
+TEMPS_API_URL=https://your-instance.temps.dev  # Your Temps API URL
+TEMPS_API_TOKEN=your-token                     # API key or deployment token
+TEMPS_PROJECT=my-app                           # Project slug (optional)
+```
+
+Environment variables take precedence over config files, making CI/CD integration straightforward.
+
+### Project-Level Config
+
+Running `bunx @temps-sdk/cli init` or `bunx @temps-sdk/cli link` creates a `.temps/config.json` in your project directory:
+
+```json
+{
+  "projectSlug": "my-app"
+}
+```
+
+The CLI walks upward from your working directory to find `.temps/config.json` (like `.git` discovery). When found, the `projectSlug` is used to auto-fetch the project ID and all configuration (git connection, preset, environments) from the API -- no need to pass `--project` on every command.
+
+**Resolution order:** `--project` flag > `.temps/config.json` > `TEMPS_PROJECT` env var > global default.
+
+## Developer Workflow
+
+### `init`
+
+Initialize a new Temps project in the current directory. Detects your framework, creates a project on the platform, and links it.
+
 ```bash
-bun install
+bunx @temps-sdk/cli init
 ```
 
-To run:
+### `link <project>`
+
+Link the current directory to an existing Temps project.
 
 ```bash
-bun run index.ts
+bunx @temps-sdk/cli link my-app
+```
+
+### `up`
+
+One-command deploy. If the project is not yet linked, an interactive setup wizard walks you through framework detection, git connection, and service provisioning. If the project is already linked (via `link` or `init`), it fetches the project configuration -- including the git connection and preset -- shows a deployment preview, and triggers the pipeline with a live progress TUI.
+
+```bash
+# Deploy the current directory
+bunx @temps-sdk/cli up
+
+# Deploy a specific branch
+bunx @temps-sdk/cli up --branch main
+
+# Deploy with a specific preset (skip auto-detection)
+bunx @temps-sdk/cli up --preset nextjs
+
+# Manual deployment mode (no git, uploads a local Docker image)
+bunx @temps-sdk/cli up --manual
+```
+
+**What `up` shows for a linked project:**
+
+```
+i Using project acme-api (from local-config)
+✔ Found project: acme-api
+i Repository: acme-org/acme-api
+i Preset: fastapi
+
+╭─ Deployment Preview ───────────────╮
+│ Project:     acme-api              │
+│ Environment: production           │
+│ Branch:      main                  │
+│ Preset:      fastapi               │
+│ Repository:  acme-org/acme-api     │
+╰────────────────────────────────────╯
+
+✔ Deployment started
+  🚀 Deployment Progress
+  ...
+```
+
+If the project has no git provider connected, `up` warns you and suggests how to connect one or fall back to manual deployment.
+
+### `status`
+
+View the current project's deployment status, container health, and domain configuration.
+
+```bash
+bunx @temps-sdk/cli status
+```
+
+### `open`
+
+Open the project's live URL in your default browser.
+
+```bash
+bunx @temps-sdk/cli open
+```
+
+### `rollback`
+
+Roll back to the previous deployment.
+
+```bash
+bunx @temps-sdk/cli rollback
+```
+
+### `env:pull` / `env:push`
+
+Sync environment variables between your local `.env` file and the Temps project.
+
+```bash
+# Download env vars to .env
+bunx @temps-sdk/cli env:pull
+
+# Upload .env to the project
+bunx @temps-sdk/cli env:push
+```
+
+## Deployment Methods
+
+### Git-Based Deploy
+
+```bash
+# Deploy from a branch (default: current branch)
+bunx @temps-sdk/cli deploy
+
+# Deploy a specific branch
+bunx @temps-sdk/cli deploy --branch feature/new-ui
+
+# Deploy to a specific environment
+bunx @temps-sdk/cli deploy --branch main --environment production
+```
+
+### Local Docker Image
+
+Build a Docker image locally, export it, and upload it directly -- useful when your CI builds images or for air-gapped environments.
+
+```bash
+bunx @temps-sdk/cli deploy:local-image --tag my-app:latest
+```
+
+### List Deployments
+
+```bash
+bunx @temps-sdk/cli deployments
+```
+
+## Multi-Instance Management
+
+Manage multiple Temps server instances (self-hosted and cloud) from a single CLI.
+
+```bash
+# List configured instances
+bunx @temps-sdk/cli instances list
+
+# Add a new instance
+bunx @temps-sdk/cli instances add
+
+# Switch active instance
+bunx @temps-sdk/cli instances switch
+```
+
+## Temps Cloud
+
+Connect to Temps Cloud for managed hosting with automatic provisioning.
+
+```bash
+# Login via browser (device code flow)
+bunx @temps-sdk/cli cloud login
+
+# Check current user
+bunx @temps-sdk/cli cloud whoami
+
+# Manage VPS instances
+bunx @temps-sdk/cli cloud vps list
+bunx @temps-sdk/cli cloud vps create
+bunx @temps-sdk/cli cloud vps destroy
+
+# View billing and usage
+bunx @temps-sdk/cli cloud billing
+```
+
+## Platform Migration
+
+Migrate projects from other platforms with an interactive wizard that discovers your projects, snapshots configuration, and generates a step-by-step migration plan.
+
+```bash
+bunx @temps-sdk/cli migrate
+```
+
+**Supported platforms:**
+
+| Platform | What's migrated |
+|----------|-----------------|
+| Vercel   | Projects, env vars, domains |
+| Coolify  | Projects, services, env vars, domains |
+| Dokploy  | Projects, services, env vars, domains |
+
+## Resource Management
+
+The CLI provides full CRUD access to every Temps resource:
+
+```bash
+# Projects
+bunx @temps-sdk/cli projects list
+bunx @temps-sdk/cli projects create
+bunx @temps-sdk/cli projects show <id>
+
+# Domains & SSL
+bunx @temps-sdk/cli domains list
+bunx @temps-sdk/cli domains provision
+
+# Services (PostgreSQL, Redis, S3)
+bunx @temps-sdk/cli services list --project <slug>
+bunx @temps-sdk/cli services create --project <slug>
+
+# Monitoring
+bunx @temps-sdk/cli monitors list
+bunx @temps-sdk/cli monitors create
+
+# Environment variables
+bunx @temps-sdk/cli environments list --project <slug>
+
+# Git providers
+bunx @temps-sdk/cli providers list
+bunx @temps-sdk/cli providers sync
+
+# Backups
+bunx @temps-sdk/cli backups list --project <slug>
+bunx @temps-sdk/cli backups run
+
+# Container management
+bunx @temps-sdk/cli containers list --project <slug>
+
+# Runtime logs (live streaming)
+bunx @temps-sdk/cli runtime-logs --project <slug>
+```
+
+**Full resource list:** projects, deployments, environments, domains, custom-domains, DNS, DNS providers, git providers, services, backups, containers, monitors, incidents, webhooks, API keys, tokens, users, settings, audit logs, proxy logs, errors, DSN, KV, blob, scans, IP access, email domains, email providers, emails, load balancer, templates, presets, funnels, notifications, notification preferences, platform.
+
+## CI/CD Integration
+
+Use environment variables for non-interactive deployments:
+
+```yaml
+# GitHub Actions example
+env:
+  TEMPS_API_URL: ${{ secrets.TEMPS_API_URL }}
+  TEMPS_API_TOKEN: ${{ secrets.TEMPS_API_TOKEN }}
+
+steps:
+  - run: bunx @temps-sdk/cli deploy --branch ${{ github.ref_name }} --project my-app
+```
+
+## Global Options
+
+| Option | Description |
+|--------|-------------|
+| `-V, --version` | Display version number |
+| `--no-color` | Disable colored output |
+| `--debug` | Enable debug output |
+| `-h, --help` | Display help |
+
+## Requirements
+
+- Node.js 18+ or Bun
+- A running Temps instance (self-hosted or Temps Cloud)
+
+## Related
+
+- [`@temps-sdk/kv`](https://www.npmjs.com/package/@temps-sdk/kv) -- Key-value store
+- [`@temps-sdk/blob`](https://www.npmjs.com/package/@temps-sdk/blob) -- File storage
+- [`@temps-sdk/react-analytics`](https://www.npmjs.com/package/@temps-sdk/react-analytics) -- React analytics, session replay, error tracking
+- [`@temps-sdk/node-sdk`](https://www.npmjs.com/package/@temps-sdk/node-sdk) -- Full platform API client and server-side error tracking
+
+## License
+
-This project was created using `bun init` in bun v1.3.1. [Bun](https://bun.com) is a fast all-in-one JavaScript runtime.
+MIT diff --git a/apps/temps-cli/package.json b/apps/temps-cli/package.json index 23057160..27a63e9f 100644 --- a/apps/temps-cli/package.json +++ b/apps/temps-cli/package.json @@ -1,6 +1,6 @@ { "name": "@temps-sdk/cli", - "version": "0.1.9", + "version": "0.1.12", "description": "CLI for Temps deployment platform", "type": "module", "bin": { diff --git a/apps/temps-cli/src/cli.ts b/apps/temps-cli/src/cli.ts index 01f4475d..acf0d6bb 100644 --- a/apps/temps-cli/src/cli.ts +++ b/apps/temps-cli/src/cli.ts @@ -79,7 +79,7 @@ export function createProgram(): Command { program .name('temps') .description('CLI for Temps deployment platform') - .version(VERSION, '-v, --version', 'Display version number') + .version(VERSION, '-V, --version', 'Display version number') .option('--no-color', 'Disable colored output') .option('--debug', 'Enable debug output') .hook('preAction', (thisCommand) => { diff --git a/apps/temps-cli/src/commands/auth/login.ts b/apps/temps-cli/src/commands/auth/login.ts index 055d4dd3..df12c09c 100644 --- a/apps/temps-cli/src/commands/auth/login.ts +++ b/apps/temps-cli/src/commands/auth/login.ts @@ -2,7 +2,7 @@ import { credentials, config } from '../../config/store.js' import { promptPassword, promptText, promptSelect, promptEmail } from '../../ui/prompts.js' import { withSpinner } from '../../ui/spinner.js' import { info, icons, colors, newline, box, warning } from '../../ui/output.js' -import { setupClient, client } from '../../lib/api-client.js' +import { setupClient, client, normalizeApiUrl } from '../../lib/api-client.js' import { getCurrentUser } from '../../api/sdk.gen.js' import { AuthenticationError } from '../../utils/errors.js' @@ -119,7 +119,7 @@ export async function loginWithEmail(emailArg?: string): Promise { }) // Use raw fetch to capture the session cookie - const apiUrl = config.get('apiUrl') + const apiUrl = normalizeApiUrl(config.get('apiUrl')) const authResponse = await withSpinner('Logging in...', async () => { const response = await 
fetch(`${apiUrl}/auth/login`, { @@ -238,7 +238,7 @@ export async function loginWithEmail(emailArg?: string): Promise { export async function loginWithMagicLink(emailArg?: string): Promise { const email = emailArg ?? await promptEmail('Email') - const apiUrl = config.get('apiUrl') + const apiUrl = normalizeApiUrl(config.get('apiUrl')) await withSpinner('Sending magic link...', async () => { const response = await fetch(`${apiUrl}/auth/magic-link/request`, { diff --git a/apps/temps-cli/src/commands/configure.ts b/apps/temps-cli/src/commands/configure.ts index 9e7185e3..d49c39e3 100644 --- a/apps/temps-cli/src/commands/configure.ts +++ b/apps/temps-cli/src/commands/configure.ts @@ -110,22 +110,49 @@ async function runConfigureWizard(options: ConfigureOptions & { disableColors?: console.log(colors.muted('This wizard will help you configure the CLI.\n')) // API URL (skip prompt if provided via flag) - const apiUrl = options.apiUrl ?? await promptUrl( - `API URL [${colors.muted(currentConfig.apiUrl)}]`, - currentConfig.apiUrl - ) + // Use effective value (env var > stored config) as the default + const envApiUrl = process.env.TEMPS_API_URL + let apiUrl: string + + if (options.apiUrl) { + apiUrl = options.apiUrl + } else if (envApiUrl) { + console.log(` API URL: ${colors.bold(envApiUrl)} ${colors.muted('(env: TEMPS_API_URL)')}`) + console.log(colors.muted(' To change, unset TEMPS_API_URL or use --api-url flag\n')) + apiUrl = envApiUrl + } else { + apiUrl = await promptUrl( + `API URL [${colors.muted(currentConfig.apiUrl)}]`, + currentConfig.apiUrl + ) + } // Save API URL first (needed for token validation) - config.set('apiUrl', apiUrl) + // Don't overwrite stored config if env var is active (env var takes precedence at runtime) + if (!envApiUrl || options.apiUrl) { + config.set('apiUrl', apiUrl) + } // API Token configuration let authStatus = 'Not authenticated' + const envApiToken = process.env.TEMPS_TOKEN || process.env.TEMPS_API_TOKEN || process.env.TEMPS_API_KEY + 
const envApiTokenName = process.env.TEMPS_TOKEN ? 'TEMPS_TOKEN' + : process.env.TEMPS_API_TOKEN ? 'TEMPS_API_TOKEN' + : process.env.TEMPS_API_KEY ? 'TEMPS_API_KEY' + : null + if (options.apiToken) { // Token provided via flag const tokenValid = await validateAndSaveToken(options.apiToken, apiUrl) if (tokenValid) { authStatus = `Authenticated as ${await credentials.get('email') ?? 'unknown'}` } + } else if (envApiToken) { + // Token from environment variable — skip prompt + const email = await credentials.get('email') + authStatus = `Authenticated as ${email ?? 'unknown'} ${colors.muted(`(env: ${envApiTokenName})`)}` + console.log(`\n API Token: ${colors.muted('***')} ${colors.muted(`(env: ${envApiTokenName})`)}`) + console.log(colors.muted(` To change, unset ${envApiTokenName} or use --api-token flag`)) } else if (isAuthenticated) { // Already authenticated, ask if they want to update console.log(colors.muted(`\nCurrently authenticated as: ${colors.bold(currentEmail ?? 'unknown')}`)) @@ -190,12 +217,12 @@ async function runConfigureWizard(options: ConfigureOptions & { disableColors?: default: currentConfig.colorEnabled, }) - // Save configuration - config.setAll({ - apiUrl, - outputFormat, - colorEnabled, - }) + // Save configuration (don't overwrite apiUrl if env var is active) + const configToSave: Partial = { outputFormat, colorEnabled } + if (!envApiUrl || options.apiUrl) { + configToSave.apiUrl = apiUrl + } + config.setAll(configToSave) newline() box( @@ -256,13 +283,22 @@ function setConfigValue(key: string, value: string): void { } function listConfig(): void { - const allConfig = config.getAll() + const storedConfig = config.getAll() newline() header(`${icons.folder} Configuration`) - for (const [key, value] of Object.entries(allConfig)) { - keyValue(key, value) + // Show effective values with env var override annotations + for (const [key, storedValue] of Object.entries(storedConfig)) { + const effectiveValue = config.get(key as keyof TempsConfig) + const 
isOverridden = key === 'apiUrl' && process.env.TEMPS_API_URL + + if (isOverridden) { + keyValue(key, `${effectiveValue} ${colors.muted('(env: TEMPS_API_URL)')}`) + keyValue(` ${colors.muted('stored')}`, colors.muted(String(storedValue))) + } else { + keyValue(key, effectiveValue) + } } newline() @@ -279,9 +315,13 @@ async function showConfig(options: { json?: boolean }): Promise { const apiUrlSource = envApiUrl ? 'env' : 'config' const apiUrl = envApiUrl || allConfig.apiUrl - // Check if API key is from environment variable - const envApiKey = process.env.TEMPS_API_TOKEN || process.env.TEMPS_API_KEY + // Check if API key is from environment variable (must match getApiKey() priority) + const envApiKey = process.env.TEMPS_TOKEN || process.env.TEMPS_API_TOKEN || process.env.TEMPS_API_KEY const apiKeySource = envApiKey ? 'env' : 'config' + const apiKeyEnvName = process.env.TEMPS_TOKEN ? 'TEMPS_TOKEN' + : process.env.TEMPS_API_TOKEN ? 'TEMPS_API_TOKEN' + : process.env.TEMPS_API_KEY ? 'TEMPS_API_KEY' + : null // Mask API key - show first 8 characters let maskedApiKey = 'Not configured' @@ -315,7 +355,7 @@ async function showConfig(options: { json?: boolean }): Promise { // API URL if (apiUrlSource === 'env') { - keyValue('API URL', `${apiUrl} ${colors.muted('(from TEMPS_API_URL)')}`) + keyValue('API URL', `${apiUrl} ${colors.muted('(env: TEMPS_API_URL)')}`) } else { keyValue('API URL', apiUrl) } @@ -323,8 +363,8 @@ async function showConfig(options: { json?: boolean }): Promise { // API Key if (apiKey) { const sourceNote = apiKeySource === 'env' - ? colors.muted('(from TEMPS_API_TOKEN)') - : colors.muted('(from config)') + ? 
colors.muted(`(env: ${apiKeyEnvName})`) + : colors.muted('(config)') keyValue('API Key', `${maskedApiKey} ${sourceNote}`) } else { keyValue('API Key', colors.warning('Not configured')) diff --git a/apps/temps-cli/src/commands/deploy/deploy-image.ts b/apps/temps-cli/src/commands/deploy/deploy-image.ts index eafbc3e9..5d476d94 100644 --- a/apps/temps-cli/src/commands/deploy/deploy-image.ts +++ b/apps/temps-cli/src/commands/deploy/deploy-image.ts @@ -1,5 +1,6 @@ import { requireAuth, config, credentials } from '../../config/store.js' -import { setupClient, client } from '../../lib/api-client.js' +import { setupClient, client, normalizeApiUrl } from '../../lib/api-client.js' +import { resolveProjectSlug } from '../../config/resolve-project.js' import { watchDeployment } from '../../lib/deployment-watcher.jsx' import { getProjectBySlug, getProject, getEnvironments } from '../../api/sdk.gen.js' import type { EnvironmentResponse } from '../../api/types.gen.js' @@ -58,17 +59,22 @@ export async function deployImage(options: DeployImageOptions): Promise { return } - // Get project name - const projectName = options.project ?? 
config.get('defaultProject') + // Resolve project + const resolved = await resolveProjectSlug(options.project) - if (!projectName) { + if (!resolved) { warning('No project specified') - info( - 'Use: temps deploy image --project or set a default with temps configure' - ) + info('Use: bunx @temps-sdk/cli deploy:image --project ') + info('Or link this directory: bunx @temps-sdk/cli link ') return } + const projectName = resolved.slug + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(projectName)} (from ${resolved.source})`) + } + // Fetch project details startSpinner('Fetching project details...') @@ -215,7 +221,7 @@ export async function deployImage(options: DeployImageOptions): Promise { // Trigger deployment startSpinner('Starting deployment...') - const apiUrl = config.get('apiUrl') + const apiUrl = normalizeApiUrl(config.get('apiUrl')) const apiKey = await credentials.getApiKey() try { diff --git a/apps/temps-cli/src/commands/deploy/deploy-local-image.ts b/apps/temps-cli/src/commands/deploy/deploy-local-image.ts index 9289e157..9ea4e6f1 100644 --- a/apps/temps-cli/src/commands/deploy/deploy-local-image.ts +++ b/apps/temps-cli/src/commands/deploy/deploy-local-image.ts @@ -1,5 +1,6 @@ import { requireAuth, config, credentials } from '../../config/store.js' -import { setupClient, client } from '../../lib/api-client.js' +import { setupClient, client, normalizeApiUrl } from '../../lib/api-client.js' +import { resolveProjectSlug } from '../../config/resolve-project.js' import { watchDeployment } from '../../lib/deployment-watcher.jsx' import { getProjectBySlug, getProject, getEnvironments, generatePresetDockerfile } from '../../api/sdk.gen.js' import type { EnvironmentResponse } from '../../api/types.gen.js' @@ -48,16 +49,21 @@ export async function deployLocalImage(options: DeployLocalImageOptions): Promis newline() // ─── Step 1: Resolve project and environment ───────────────────────────── - const projectName = options.project ?? 
config.get('defaultProject') + const resolved = await resolveProjectSlug(options.project) - if (!projectName) { + if (!resolved) { warning('No project specified') - info( - 'Use: temps deploy:local-image --project or set a default with temps configure' - ) + info('Use: bunx @temps-sdk/cli deploy:local-image --project ') + info('Or link this directory: bunx @temps-sdk/cli link ') return } + const projectName = resolved.slug + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(projectName)} (from ${resolved.source})`) + } + startSpinner('Fetching project details...') let projectData: { id: number; name: string; slug: string } @@ -317,7 +323,7 @@ export async function deployLocalImage(options: DeployLocalImageOptions): Promis startSpinner('Uploading image to server...') - const apiUrl = config.get('apiUrl') + const apiUrl = normalizeApiUrl(config.get('apiUrl')) const apiKey = await credentials.getApiKey() try { @@ -606,7 +612,11 @@ interface GeneratedDockerfile { async function tryGenerateDockerfile( projectSlug?: string ): Promise { - const slug = projectSlug ?? 
config.get('defaultProject') + let slug = projectSlug + if (!slug) { + const resolved = await resolveProjectSlug() + slug = resolved?.slug + } if (!slug) return null // Fetch the project to get its preset diff --git a/apps/temps-cli/src/commands/deploy/deploy-static.ts b/apps/temps-cli/src/commands/deploy/deploy-static.ts index 38b4cbcd..62f47df2 100644 --- a/apps/temps-cli/src/commands/deploy/deploy-static.ts +++ b/apps/temps-cli/src/commands/deploy/deploy-static.ts @@ -1,5 +1,6 @@ import { requireAuth, config, credentials } from '../../config/store.js' -import { setupClient, client } from '../../lib/api-client.js' +import { setupClient, client, normalizeApiUrl } from '../../lib/api-client.js' +import { resolveProjectSlug } from '../../config/resolve-project.js' import { watchDeployment } from '../../lib/deployment-watcher.jsx' import { getProjectBySlug, getProject, getEnvironments } from '../../api/sdk.gen.js' import type { EnvironmentResponse } from '../../api/types.gen.js' @@ -62,17 +63,22 @@ export async function deployStatic(options: DeployStaticOptions): Promise return } - // Get project name - const projectName = options.project ?? 
config.get('defaultProject') + // Resolve project + const resolved = await resolveProjectSlug(options.project) - if (!projectName) { + if (!resolved) { warning('No project specified') - info( - 'Use: temps deploy static --project or set a default with temps configure' - ) + info('Use: bunx @temps-sdk/cli deploy:static --project ') + info('Or link this directory: bunx @temps-sdk/cli link ') return } + const projectName = resolved.slug + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(projectName)} (from ${resolved.source})`) + } + // Fetch project details startSpinner('Fetching project details...') @@ -253,7 +259,7 @@ export async function deployStatic(options: DeployStaticOptions): Promise // Upload static bundle startSpinner('Uploading static bundle...') - const apiUrl = config.get('apiUrl') + const apiUrl = normalizeApiUrl(config.get('apiUrl')) const apiKey = await credentials.getApiKey() try { diff --git a/apps/temps-cli/src/commands/deploy/deploy.ts b/apps/temps-cli/src/commands/deploy/deploy.ts index d8219e19..cd25d813 100644 --- a/apps/temps-cli/src/commands/deploy/deploy.ts +++ b/apps/temps-cli/src/commands/deploy/deploy.ts @@ -1,21 +1,93 @@ import { requireAuth, config } from '../../config/store.js' -import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' +import { setupClient, client, getErrorMessage, getWebUrl } from '../../lib/api-client.js' +import { resolveProjectSlug } from '../../config/resolve-project.js' import { getProjectBySlug, getEnvironments, triggerProjectPipeline, - getLastDeployment, + getProjectDeployments, + getRepositoryByName, } from '../../api/sdk.gen.js' -import type { EnvironmentResponse, DeploymentResponse } from '../../api/types.gen.js' +import type { EnvironmentResponse, ProjectResponse } from '../../api/types.gen.js' import { promptSelect, promptText } from '../../ui/prompts.js' -import { startSpinner, succeedSpinner, failSpinner, updateSpinner } from '../../ui/spinner.js' -import { 
success, info, warning, newline, icons, colors, header, keyValue, box } from '../../ui/output.js' +import { startSpinner, succeedSpinner, failSpinner } from '../../ui/spinner.js' +import { info, warning, newline, icons, colors, box } from '../../ui/output.js' +import { watchDeployment } from '../../lib/deployment-watcher.jsx' +import { deployLocalImage } from './deploy-local-image.js' +import { deployStatic } from './deploy-static.js' + +// Types for the /repository/{id}/commits endpoint (not yet in generated SDK) +interface CommitInfo { + sha: string + message: string + author: string + author_email: string + date: string +} + +interface CommitListResponse { + commits: CommitInfo[] +} + +/** + * Fetch recent commits for a repository branch from the remote git provider. + */ +async function fetchRemoteCommits( + repositoryId: number, + branch: string, + perPage = 20, +): Promise { + try { + const response = await client.get({ + security: [{ scheme: 'bearer', type: 'http' }], + url: '/repository/{repository_id}/commits', + path: { repository_id: repositoryId }, + query: { branch, per_page: perPage }, + }) + const data = response.data as CommitListResponse | undefined + return data?.commits ?? [] + } catch { + return [] + } +} + +/** + * Look up the repository ID for a project's repo_owner/repo_name. + */ +async function getRepositoryId( + repoOwner: string, + repoName: string, + connectionId?: number | null, +): Promise { + try { + const { data } = await getRepositoryByName({ + client, + path: { owner: repoOwner, name: repoName }, + query: connectionId ? { connection_id: String(connectionId) } : undefined, + }) + return data?.id ?? 
null + } catch { + return null + } +} + +function getRelativeTime(date: Date): string { + const seconds = Math.floor((Date.now() - date.getTime()) / 1000) + if (seconds < 60) return 'just now' + const minutes = Math.floor(seconds / 60) + if (minutes < 60) return `${minutes}m ago` + const hours = Math.floor(minutes / 60) + if (hours < 24) return `${hours}h ago` + const days = Math.floor(hours / 24) + return `${days}d ago` +} interface DeployOptions { project?: string environment?: string environmentId?: string branch?: string + commit?: string wait?: boolean yes?: boolean } @@ -26,19 +98,26 @@ export async function deploy(options: DeployOptions): Promise { newline() - // Get project name - const projectName = options.project ?? config.get('defaultProject') + // Resolve project: --project flag > .temps/config.json > TEMPS_PROJECT env > global default + const resolved = await resolveProjectSlug(options.project) - if (!projectName) { + if (!resolved) { warning('No project specified') - info('Use: temps deploy --project or set a default with temps configure') + info('Use: bunx @temps-sdk/cli deploy --project ') + info('Or link this directory: bunx @temps-sdk/cli link ') return } + const projectName = resolved.slug + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(projectName)} (from ${resolved.source})`) + } + // Fetch project details startSpinner('Fetching project details...') - let projectData: { id: number; name: string } + let project: ProjectResponse let environments: EnvironmentResponse[] = [] try { @@ -48,17 +127,23 @@ export async function deploy(options: DeployOptions): Promise { }) if (error || !data) { + const rawApiUrl = config.get('apiUrl') + const baseUrl = client.getConfig().baseUrl ?? 
rawApiUrl failSpinner(`Project "${projectName}" not found`) + info(`API: ${colors.muted(`${baseUrl}/projects/by-slug/${projectName}`)}`) + if (error) { + info(`Error: ${getErrorMessage(error)}`) + } return } - projectData = data - succeedSpinner(`Found project: ${projectData.name}`) + project = data + succeedSpinner(`Found project: ${project.name}`) // Fetch environments const { data: envData } = await getEnvironments({ client, - path: { project_id: projectData.id }, + path: { project_id: project.id }, }) environments = envData ?? [] } catch (err) { @@ -66,12 +151,114 @@ export async function deploy(options: DeployOptions): Promise { throw err } + // Check source type — delegate to appropriate deploy method + const sourceType = project.source_type + const isGitBased = sourceType === 'git' + const hasGitConnection = isGitBased && !!project.git_provider_connection_id + + if (sourceType === 'static_files') { + info(`Project uses static files deployment`) + info(`Delegating to: ${colors.muted('deploy:static')}`) + newline() + await deployStatic({ + path: '.', + project: projectName, + environment: options.environment, + environmentId: options.environmentId, + wait: options.wait, + yes: options.yes, + }) + return + } + + if (sourceType === 'docker_image' || sourceType === 'manual') { + info(`Project uses ${sourceType === 'docker_image' ? 'Docker image' : 'manual'} deployment`) + info(`Delegating to: ${colors.muted('deploy:local-image')}`) + newline() + await deployLocalImage({ + project: projectName, + environment: options.environment, + environmentId: options.environmentId, + wait: options.wait, + yes: options.yes, + }) + return + } + + // Git-based project — check if git is actually connected + if (!hasGitConnection) { + warning('Project is git-based but no git provider is connected') + newline() + info('Options:') + info(` 1. Connect a git provider: ${colors.muted('bunx @temps-sdk/cli providers add')}`) + info(` 2. 
Deploy a local Docker image: ${colors.muted(`bunx @temps-sdk/cli deploy:local-image -p ${projectName}`)}`) + info(` 3. Deploy static files: ${colors.muted(`bunx @temps-sdk/cli deploy:static -p ${projectName} --path ./dist`)}`) + newline() + + if (!options.yes) { + const choice = await promptSelect({ + message: 'How would you like to deploy?', + choices: [ + { name: 'Build & deploy local Docker image', value: 'local-image' }, + { name: 'Deploy static files', value: 'static' }, + { name: 'Cancel', value: 'cancel' }, + ], + }) + + if (choice === 'local-image') { + await deployLocalImage({ + project: projectName, + environment: options.environment, + environmentId: options.environmentId, + wait: options.wait, + yes: options.yes, + }) + return + } + + if (choice === 'static') { + const staticPath = await promptText({ + message: 'Path to static files', + default: './dist', + }) + await deployStatic({ + path: staticPath, + project: projectName, + environment: options.environment, + environmentId: options.environmentId, + wait: options.wait, + yes: options.yes, + }) + return + } + + // Cancel + return + } + + // Non-interactive with --yes: fall back to local-image + info('Falling back to local Docker image deployment (--yes mode)') + await deployLocalImage({ + project: projectName, + environment: options.environment, + environmentId: options.environmentId, + wait: options.wait, + yes: options.yes, + }) + return + } + + // ─── Git-based deployment with connected provider ─────────────────────── + + if (project.repo_owner && project.repo_name) { + info(`Repository: ${colors.muted(`${project.repo_owner}/${project.repo_name}`)}`) + } + // Get environment let environmentId: number | undefined let environmentName = options.environment || 'production' if (environments.length > 0) { - // If environment ID is specified directly, use it if (options.environmentId) { environmentId = parseInt(options.environmentId, 10) const env = environments.find(e => e.id === environmentId) @@ 
-79,14 +266,12 @@ export async function deploy(options: DeployOptions): Promise { environmentName = env.name } } else if (options.environment) { - // Find by name const env = environments.find(e => e.name === options.environment) if (env) { environmentId = env.id environmentName = env.name } } else if (!options.yes) { - // Interactive: prompt for environment selection const selectedEnv = await promptSelect({ message: 'Select environment', choices: environments.map((env) => ({ @@ -99,7 +284,6 @@ export async function deploy(options: DeployOptions): Promise { environmentId = parseInt(selectedEnv, 10) environmentName = environments.find(e => e.id === environmentId)?.name ?? 'production' } else { - // Non-interactive: use production or first environment const prodEnv = environments.find(e => e.name === 'production') if (prodEnv) { environmentId = prodEnv.id @@ -111,134 +295,160 @@ export async function deploy(options: DeployOptions): Promise { } } - // Get branch - use flag value, or prompt if interactive mode + // Get branch let branch = options.branch if (!branch) { if (options.yes) { - branch = 'main' // Default for automation + branch = project.main_branch || 'main' } else { branch = await promptText({ message: 'Branch to deploy', - default: 'main', + default: project.main_branch || 'main', }) } } - // Confirm deployment (skip if --yes flag) + // Select commit — resolve from flag, interactive picker, or skip (deploy HEAD) + let commit = options.commit + if (!commit && !options.yes && project.repo_owner && project.repo_name) { + // Try to fetch recent commits so the user can pick one + const repositoryId = await getRepositoryId( + project.repo_owner, + project.repo_name, + project.git_provider_connection_id, + ) + + if (repositoryId) { + startSpinner('Fetching recent commits...') + const commits = await fetchRemoteCommits(repositoryId, branch) + if (commits.length > 0) { + succeedSpinner(`Found ${commits.length} recent commits`) + + const HEAD_VALUE = '__HEAD__' + 
const selected = await promptSelect({ + message: 'Select commit to deploy', + choices: [ + { name: `${colors.bold('HEAD')} ${colors.muted('(latest on branch)')}`, value: HEAD_VALUE }, + ...commits.map(c => { + const sha = colors.muted(c.sha.substring(0, 7)) + const msg = c.message.split('\n')[0].substring(0, 60) + const ago = getRelativeTime(new Date(c.date)) + return { + name: `${sha} ${msg} ${colors.muted(`(${c.author}, ${ago})`)}`, + value: c.sha, + } + }), + ], + }) + if (selected !== HEAD_VALUE) { + commit = selected + } + } else { + succeedSpinner('No commits found, deploying HEAD') + } + } + } + + // Deployment preview newline() box( - `Project: ${colors.bold(projectName)}\n` + - `Environment: ${colors.bold(environmentName)}\n` + - `Branch: ${colors.bold(branch)}`, + [ + `Project: ${colors.bold(projectName)}`, + `Environment: ${colors.bold(environmentName)}`, + `Branch: ${colors.bold(branch)}`, + commit ? `Commit: ${colors.bold(commit.substring(0, 7))}` : null, + project.preset ? `Preset: ${colors.bold(project.preset)}` : null, + project.repo_owner && project.repo_name + ? `Repository: ${colors.bold(`${project.repo_owner}/${project.repo_name}`)}` + : null, + ] + .filter(Boolean) + .join('\n'), `${icons.rocket} Deployment Preview` ) newline() - // Start deployment + // Trigger git-based deployment startSpinner('Starting deployment...') try { const { data, error } = await triggerProjectPipeline({ client, - path: { id: projectData.id }, + path: { id: project.id }, body: { branch, + commit: commit ?? undefined, environment_id: environmentId, }, }) if (error || !data) { failSpinner('Failed to start deployment') + const msg = getErrorMessage(error) + if (msg) { + info(`Reason: ${msg}`) + } return } - succeedSpinner(`Deployment started`) + succeedSpinner('Deployment started') info(data.message ?? 
'Pipeline triggered successfully') - if (options.wait !== false) { - await waitForDeployment(projectData.id, environmentId) - } else { - newline() - info('Deployment running in background') - info(`Check status with: temps deployments list --project ${projectName}`) - } - } catch (err) { - failSpinner('Deployment failed') - throw err - } -} - -async function waitForDeployment(projectId: number, environmentId?: number): Promise<void> { - const statusMessages: Record<string, string> = { - pending: 'Waiting in queue...', - building: 'Building application...', - deploying: 'Deploying to servers...', - running: 'Starting containers...', - } - - startSpinner('Waiting for deployment...') - - let lastStatus = '' - let attempts = 0 - const maxAttempts = 180 // 6 minutes with 2s intervals - - while (attempts < maxAttempts) { - attempts++ + const webUrl = getWebUrl() + info(`Dashboard: ${colors.primary(`${webUrl}/projects/${projectName}/deployments`)}`) - const { data: deployment, error } = await getLastDeployment({ - client, - path: { id: projectId }, - }) - - if (error || !deployment) { - await new Promise((resolve) => setTimeout(resolve, 2000)) - continue - } - - // Check if this is the right deployment (for the selected environment) - if (environmentId && deployment.environment_id !== environmentId) { - await new Promise((resolve) => setTimeout(resolve, 2000)) - continue + if (options.wait === false) { + return } - if (deployment.status !== lastStatus) { - lastStatus = deployment.status - updateSpinner(statusMessages[deployment.status] ?? `Status: ${deployment.status}`) - } + // Find the deployment ID so we can watch it with the rich TUI + startSpinner('Waiting for deployment to start...') + let deploymentId: number | null = null + + for (let attempt = 0; attempt < 15; attempt++) { + const { data: deployList, error: deployError } = await getProjectDeployments({ + client, + path: { id: project.id }, + query: { + per_page: 1, + ...(environmentId ? 
{ environment_id: environmentId } : {}), + }, + }) - if (deployment.status === 'success' || deployment.status === 'completed' || deployment.status === 'deployed') { - succeedSpinner(`${icons.rocket} Deployment successful!`) - newline() - header(`${icons.check} Deployment Complete`) - keyValue('Deployment ID', deployment.id) - keyValue('Commit', deployment.commit_hash?.substring(0, 7) ?? '-') - if (deployment.url) { - keyValue('URL', colors.primary(deployment.url)) + if (deployError) { + failSpinner('Failed to fetch deployment status') + info(`Error: ${getErrorMessage(deployError)}`) + info(`Dashboard: ${colors.primary(`${webUrl}/projects/${projectName}/deployments`)}`) + return } - const envDomains = deployment.environment?.domains || [] - const firstDomain = envDomains[0] - if (firstDomain) { - const envUrl = firstDomain.startsWith('http') ? firstDomain : `https://${firstDomain}` - keyValue('Environment', colors.primary(envUrl)) + + const latest = deployList?.deployments?.[0] + if (latest?.id) { + deploymentId = latest.id + break } - newline() - return + + await new Promise((r) => setTimeout(r, 2000)) } - if (deployment.status === 'failed' || deployment.status === 'error' || deployment.status === 'cancelled') { - failSpinner('Deployment failed') - newline() - if (deployment.cancelled_reason) { - info(`Reason: ${deployment.cancelled_reason}`) + if (deploymentId) { + succeedSpinner(`Deployment #${deploymentId} found`) + const result = await watchDeployment({ + projectId: project.id, + deploymentId, + timeoutSecs: 600, + projectName, + }) + + if (!result.success) { + process.exitCode = 1 } - info(`View logs with: temps logs ${projectId}`) - return + } else { + failSpinner('Could not locate the deployment to track') + info(`Dashboard: ${colors.primary(`${webUrl}/projects/${projectName}/deployments`)}`) } - - // Wait before checking again - await new Promise((resolve) => setTimeout(resolve, 2000)) + } catch (err) { + failSpinner('Deployment failed') + throw err } - - 
failSpinner('Deployment timed out') - info('Deployment is still running. Check status with: temps deployments list') } diff --git a/apps/temps-cli/src/commands/deploy/index.ts b/apps/temps-cli/src/commands/deploy/index.ts index a84bb60e..84b59892 100644 --- a/apps/temps-cli/src/commands/deploy/index.ts +++ b/apps/temps-cli/src/commands/deploy/index.ts @@ -18,6 +18,7 @@ export function registerDeployCommands(program: Command): void { .option('-e, --environment ', 'Target environment name') .option('--environment-id ', 'Target environment ID') .option('-b, --branch ', 'Git branch to deploy') + .option('-c, --commit ', 'Specific commit SHA to deploy') .option('--no-wait', 'Do not wait for deployment to complete') .option('-y, --yes', 'Skip confirmation prompts (for automation)') .action((projectArg, options) => { diff --git a/apps/temps-cli/src/commands/deploy/list.ts b/apps/temps-cli/src/commands/deploy/list.ts index c05ebf23..5157491e 100644 --- a/apps/temps-cli/src/commands/deploy/list.ts +++ b/apps/temps-cli/src/commands/deploy/list.ts @@ -1,5 +1,6 @@ -import { requireAuth, config } from '../../config/store.js' +import { requireAuth } from '../../config/store.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' +import { resolveProjectSlug } from '../../config/resolve-project.js' import { getProjectDeployments, getProjectBySlug } from '../../api/sdk.gen.js' import type { DeploymentResponse } from '../../api/types.gen.js' import { withSpinner } from '../../ui/spinner.js' @@ -20,12 +21,17 @@ export async function list(options: ListOptions): Promise { await requireAuth() await setupClient() - const projectName = options.project ?? config.get('defaultProject') + const resolved = await resolveProjectSlug(options.project) - if (!projectName) { - throw new Error('No project specified. Use: temps deployments list --project ') + if (!resolved) { + throw new Error( + 'No project specified. 
Use: bunx @temps-sdk/cli deployments list --project \n' + + 'Or link this directory: bunx @temps-sdk/cli link ' + ) } + const projectName = resolved.slug + const deployments = await withSpinner('Fetching deployments...', async () => { // Get project ID from slug const { data: projectData, error: projectError } = await getProjectBySlug({ diff --git a/apps/temps-cli/src/commands/deploy/rollback.ts b/apps/temps-cli/src/commands/deploy/rollback.ts index 2c16dc68..6b9b50bc 100644 --- a/apps/temps-cli/src/commands/deploy/rollback.ts +++ b/apps/temps-cli/src/commands/deploy/rollback.ts @@ -1,5 +1,6 @@ import { requireAuth } from '../../config/store.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { getProjectBySlug, getProjectDeployments, @@ -19,22 +20,24 @@ export async function rollback(options: RollbackOptions): Promise { await requireAuth() await setupClient() - if (!options.project) { - throw new Error('Project is required. Use: temps deployments rollback --project ') + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) } newline() - warning(`Rolling back ${colors.bold(options.project)} in ${colors.bold(options.environment)}`) + warning(`Rolling back ${colors.bold(resolved.slug)} in ${colors.bold(options.environment)}`) newline() // Get project ID const { data: projectData, error: projectError } = await getProjectBySlug({ client, - path: { slug: options.project }, + path: { slug: resolved.slug }, }) if (projectError || !projectData) { - throw new Error(`Project "${options.project}" not found`) + throw new Error(`Project "${resolved.slug}" not found`) } let targetDeploymentId = options.to ? 
parseInt(options.to, 10) : undefined @@ -51,30 +54,36 @@ export async function rollback(options: RollbackOptions): Promise { throw new Error(getErrorMessage(error)) } - // Filter by environment and status + // Filter by environment and completed status return data.deployments .filter(d => d.environment?.name === options.environment && (d.status === 'success' || d.status === 'completed' || d.status === 'deployed') ) - .slice(0, 5) + .slice(0, 10) }) - if (deployments.length < 2) { - warning('No previous deployments to rollback to') + if (deployments.length === 0) { + warning('No completed deployments found for this environment') return } - // Skip current, show previous deployments - const previousDeployments = deployments.slice(1) - + // Show all deployments, mark which is current const selectedId = await promptSelect({ message: 'Select deployment to rollback to', - choices: previousDeployments.map((d) => ({ - name: `#${d.id} - ${d.branch ?? 'unknown'} (${d.commit_hash?.substring(0, 7) ?? 'unknown'})`, - value: String(d.id), - description: new Date(d.created_at * 1000).toLocaleString(), - })), + choices: deployments.map((d) => { + const isRollback = d.metadata?.isRollback + const branch = d.branch ?? (isRollback ? 'rollback' : 'unknown') + const commit = d.commit_hash?.substring(0, 7) ?? (isRollback ? `from #${d.metadata?.rolledBackFromId ?? '?'}` : '-') + const currentTag = d.is_current ? 
' (current)' : '' + const date = new Date(d.created_at * 1000).toLocaleString() + + return { + name: `#${d.id} - ${branch} (${commit})${currentTag}`, + value: String(d.id), + description: date, + } + }), }) targetDeploymentId = parseInt(selectedId, 10) @@ -112,5 +121,5 @@ export async function rollback(options: RollbackOptions): Promise { keyValue('Status', newDeployment.status) newline() - info(`Track progress with: temps deployments status --project ${options.project} --deployment-id ${newDeployment.id}`) + info(`Track progress with: temps deployments status --project ${resolved.slug} --deployment-id ${newDeployment.id}`) } diff --git a/apps/temps-cli/src/commands/deploy/status.ts b/apps/temps-cli/src/commands/deploy/status.ts index 8bc9bb2d..4a17600d 100644 --- a/apps/temps-cli/src/commands/deploy/status.ts +++ b/apps/temps-cli/src/commands/deploy/status.ts @@ -1,8 +1,9 @@ import { requireAuth } from '../../config/store.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { getDeployment, getProjectBySlug } from '../../api/sdk.gen.js' import { withSpinner } from '../../ui/spinner.js' -import { newline, header, icons, json, colors, formatDate } from '../../ui/output.js' +import { newline, header, icons, json, colors, info, formatDate } from '../../ui/output.js' import { detailsTable, statusBadge } from '../../ui/table.js' interface StatusOptions { @@ -15,22 +16,24 @@ export async function status(options: StatusOptions): Promise { await requireAuth() await setupClient() - if (!options.project) { - throw new Error('Project is required. Use: temps deployments status --project --deployment-id ') + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) } if (!options.deploymentId) { - throw new Error('Deployment ID is required. 
Use: temps deployments status --project <slug> --deployment-id <id>') + throw new Error('Deployment ID is required. Use: temps deployments status --deployment-id <id>') } // Get project ID from slug const { data: projectData, error: projectError } = await getProjectBySlug({ client, - path: { slug: options.project }, + path: { slug: resolved.slug }, }) if (projectError || !projectData) { - throw new Error(`Project "${options.project}" not found`) + throw new Error(`Project "${resolved.slug}" not found`) } const projectId = projectData.id diff --git a/apps/temps-cli/src/commands/domains/index.ts b/apps/temps-cli/src/commands/domains/index.ts index 8ac329da..4098c5d5 100644 --- a/apps/temps-cli/src/commands/domains/index.ts +++ b/apps/temps-cli/src/commands/domains/index.ts @@ -37,6 +37,7 @@ async function findDomainIdByName(domainName: string): Promise { interface AddOptions { domain: string challenge?: string + yes?: boolean } interface VerifyOptions { @@ -105,6 +106,7 @@ export function registerDomainsCommands(program: Command): void { domains .command('add') .description('Add a custom domain') .requiredOption('-d, --domain <domain>', 'Domain name') .option('-c, --challenge <type>', 'Challenge type (http-01 or dns-01)', 'http-01') + .option('-y, --yes', 'Skip confirmation prompts') .action(addDomain) domains @@ -250,10 +252,11 @@ async function addDomain(options: AddOptions): Promise<void> { success(`Domain ${domain} added`) if (result?.dns_challenge_token && result?.dns_challenge_value) { + const baseDomain = domain.startsWith('*.') ? 
domain.slice(2) : domain newline() box( `Type: TXT\n` + - `Name: ${result.dns_challenge_token}\n` + + `Name: _acme-challenge.${baseDomain}\n` + `Value: ${result.dns_challenge_value}`, 'Add this DNS record to verify ownership' ) diff --git a/apps/temps-cli/src/commands/environments/index.ts b/apps/temps-cli/src/commands/environments/index.ts index e2e4d2fe..bb4b0c39 100644 --- a/apps/temps-cli/src/commands/environments/index.ts +++ b/apps/temps-cli/src/commands/environments/index.ts @@ -2,7 +2,9 @@ import type { Command } from 'commander' import * as fs from 'node:fs' import * as path from 'node:path' import { requireAuth } from '../../config/store.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' +import { parseEnvFile } from '../../lib/env-file.js' import { getEnvironments, getEnvironment, @@ -32,31 +34,35 @@ export function registerEnvironmentsCommands(program: Command): void { .description('Manage environments and environment variables') environments - .command('list ') + .command('list') .alias('ls') .description('List environments for a project') + .option('-p, --project ', 'Project slug or ID') .option('--json', 'Output in JSON format') .action(listEnvironments) environments - .command('create ') + .command('create') .description('Create a new environment') + .option('-p, --project ', 'Project slug or ID') .option('-n, --name ', 'Environment name') .option('-b, --branch ', 'Git branch') .option('--preview', 'Set as preview environment') .action(createEnvironmentCmd) environments - .command('delete ') + .command('delete ') .alias('rm') .description('Delete an environment') + .option('-p, --project ', 'Project slug or ID') .option('-f, --force', 'Skip confirmation') .action(deleteEnvironmentCmd) // Environment variables subcommand const vars = environments - .command('vars ') + .command('vars') .description('Manage environment variables') + .option('-p, 
--project ', 'Project slug or ID') vars .command('list') @@ -65,18 +71,18 @@ export function registerEnvironmentsCommands(program: Command): void { .option('-e, --environment ', 'Filter by environment name') .option('--show-values', 'Show actual values (hidden by default)') .option('--json', 'Output in JSON format') - .action((options, cmd) => { - const project = cmd.parent!.args[0] - return listEnvVars(project, options) + .action(async (options, cmd) => { + const projectSlug = cmd.parent!.opts().project + return listEnvVars(projectSlug, options) }) vars .command('get ') .description('Get a specific environment variable') .option('-e, --environment ', 'Specify environment (if variable exists in multiple)') - .action((key, options, cmd) => { - const project = cmd.parent!.parent!.args[0] - return getEnvVar(project, key, options) + .action(async (key, options, cmd) => { + const projectSlug = cmd.parent!.parent!.opts().project + return getEnvVar(projectSlug, key, options) }) vars @@ -85,9 +91,9 @@ export function registerEnvironmentsCommands(program: Command): void { .option('-e, --environments ', 'Comma-separated environment names (interactive if not provided)') .option('--no-preview', 'Exclude from preview environments') .option('--update', 'Update existing variable instead of creating new') - .action((key, value, options, cmd) => { - const project = cmd.parent!.parent!.args[0] - return setEnvVar(project, key, value, options) + .action(async (key, value, options, cmd) => { + const projectSlug = cmd.parent!.parent!.opts().project + return setEnvVar(projectSlug, key, value, options) }) vars @@ -97,9 +103,9 @@ export function registerEnvironmentsCommands(program: Command): void { .description('Delete an environment variable') .option('-e, --environment ', 'Delete only from specific environment') .option('-f, --force', 'Skip confirmation') - .action((key, options, cmd) => { - const project = cmd.parent!.parent!.args[0] - return deleteEnvVar(project, key, options) + 
.action(async (key, options, cmd) => { + const projectSlug = cmd.parent!.parent!.opts().project + return deleteEnvVar(projectSlug, key, options) }) vars @@ -107,9 +113,9 @@ export function registerEnvironmentsCommands(program: Command): void { .description('Import environment variables from a .env file') .option('-e, --environments ', 'Comma-separated environment names') .option('--overwrite', 'Overwrite existing variables') - .action((file, options, cmd) => { - const project = cmd.parent!.parent!.args[0] - return importEnvVars(project, file, options) + .action(async (file, options, cmd) => { + const projectSlug = cmd.parent!.parent!.opts().project + return importEnvVars(projectSlug, file, options) }) vars @@ -117,15 +123,16 @@ export function registerEnvironmentsCommands(program: Command): void { .description('Export environment variables to .env format') .option('-e, --environment ', 'Export from specific environment') .option('-o, --output ', 'Write to file instead of stdout') - .action((options, cmd) => { - const project = cmd.parent!.parent!.args[0] - return exportEnvVars(project, options) + .action(async (options, cmd) => { + const projectSlug = cmd.parent!.parent!.opts().project + return exportEnvVars(projectSlug, options) }) // Resources subcommand environments - .command('resources ') + .command('resources ') .description('View or set CPU/memory resources for an environment') + .option('-p, --project ', 'Project slug or ID') .option('--cpu ', 'CPU limit in millicores (e.g., 500 = 0.5 CPU)') .option('--memory ', 'Memory limit in MB (e.g., 512)') .option('--cpu-request ', 'CPU request in millicores (guaranteed minimum)') @@ -135,24 +142,29 @@ export function registerEnvironmentsCommands(program: Command): void { // Scale subcommand environments - .command('scale [replicas]') + .command('scale') .description('View or set the number of replicas for an environment') + .option('-p, --project ', 'Project slug or ID') + .option('-e, --environment ', 'Environment 
name or slug', 'production') + .option('-r, --replicas ', 'Number of replicas to set') .option('--json', 'Output in JSON format') .action(scaleCmd) // Cron jobs subcommand const crons = environments - .command('crons ') + .command('crons') .description('Manage cron jobs') + .option('-p, --project ', 'Project slug or ID') + .requiredOption('-e, --environment ', 'Environment name or slug') crons .command('list') .alias('ls') .description('List cron jobs for an environment') .option('--json', 'Output in JSON format') - .action((options, cmd) => { - const [project, environment] = cmd.parent!.args - return listCrons(project, environment, options) + .action(async (options, cmd) => { + const parentOpts = cmd.parent!.opts() + return listCrons(parentOpts.project, parentOpts.environment, options) }) crons @@ -160,9 +172,9 @@ export function registerEnvironmentsCommands(program: Command): void { .description('Show cron job details') .requiredOption('--id ', 'Cron job ID') .option('--json', 'Output in JSON format') - .action((options, cmd) => { - const [project, environment] = cmd.parent!.args - return showCron(project, environment, options) + .action(async (options, cmd) => { + const parentOpts = cmd.parent!.opts() + return showCron(parentOpts.project, parentOpts.environment, options) }) crons @@ -173,9 +185,9 @@ export function registerEnvironmentsCommands(program: Command): void { .option('--page ', 'Page number', '1') .option('--per-page ', 'Items per page', '20') .option('--json', 'Output in JSON format') - .action((options, cmd) => { - const [project, environment] = cmd.parent!.args - return listCronExecutions(project, environment, options) + .action(async (options, cmd) => { + const parentOpts = cmd.parent!.opts() + return listCronExecutions(parentOpts.project, parentOpts.environment, options) }) } @@ -190,10 +202,17 @@ async function getProjectId(projectSlug: string): Promise { return data.id } -async function listEnvironments(project: string, options: { json?: boolean 
}): Promise<void> { +async function listEnvironments(options: { project?: string; json?: boolean }): Promise<void> { await requireAuth() await setupClient() + const resolved = await requireProjectSlug(options.project) + const project = resolved.slug + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(project)} (from ${resolved.source})`) + } + const environments = await withSpinner('Fetching environments...', async () => { const projectId = await getProjectId(project) const { data, error } = await getEnvironments({ @@ -225,10 +244,15 @@ async function listEnvironments(project: string, options: { json?: boolean }): P } async function createEnvironmentCmd( - project: string, - options: { name?: string; branch?: string; preview?: boolean } + options: { project?: string; name?: string; branch?: string; preview?: boolean } ): Promise<void> { await requireAuth() + await setupClient() + + const resolved = await requireProjectSlug(options.project) + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } const name = options.name ?? await promptText({ message: 'Environment name', @@ -241,10 +265,8 @@ async function createEnvironmentCmd( default: name === 'production' ? 
'main' : name, }) - await setupClient() - const environment = await withSpinner('Creating environment...', async () => { - const projectId = await getProjectId(project) + const projectId = await getProjectId(resolved.slug) const { data, error } = await createEnvironment({ client, path: { project_id: projectId }, @@ -266,11 +288,16 @@ async function createEnvironmentCmd( } async function deleteEnvironmentCmd( - project: string, environment: string, - options: { force?: boolean } + options: { project?: string; force?: boolean } ): Promise { await requireAuth() + await setupClient() + + const resolved = await requireProjectSlug(options.project) + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } if (environment === 'production') { warning('Cannot delete production environment') @@ -279,7 +306,7 @@ async function deleteEnvironmentCmd( if (!options.force) { const confirmed = await promptConfirm({ - message: `Delete environment "${environment}" from ${project}?`, + message: `Delete environment "${environment}" from ${resolved.slug}?`, default: false, }) if (!confirmed) { @@ -288,10 +315,8 @@ async function deleteEnvironmentCmd( } } - await setupClient() - await withSpinner('Deleting environment...', async () => { - const projectId = await getProjectId(project) + const projectId = await getProjectId(resolved.slug) const { error } = await deleteEnvironment({ client, path: { project_id: projectId, env_id: environment as unknown as number }, @@ -305,12 +330,18 @@ async function deleteEnvironmentCmd( // ============ Environment Variables Commands ============ async function listEnvVars( - project: string, + projectFlag: string | undefined, options: { environment?: string; showValues?: boolean; json?: boolean } ): Promise { await requireAuth() await setupClient() + const resolved = await requireProjectSlug(projectFlag) + const project = resolved.slug + if (resolved.source !== 'flag') { + info(`Using project 
${colors.bold(project)} (from ${resolved.source})`) + } + const [vars, environments] = await withSpinner('Fetching environment variables...', async () => { const projectId = await getProjectId(project) @@ -396,13 +427,19 @@ async function listEnvVars( } async function getEnvVar( - project: string, + projectFlag: string | undefined, key: string, options: { environment?: string } ): Promise { await requireAuth() await setupClient() + const resolved = await requireProjectSlug(projectFlag) + const project = resolved.slug + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(project)} (from ${resolved.source})`) + } + const [vars, environments] = await withSpinner(`Fetching ${key}...`, async () => { const projectId = await getProjectId(project) @@ -473,7 +510,7 @@ async function getEnvVar( } async function setEnvVar( - project: string, + projectFlag: string | undefined, key: string, value: string | undefined, options: { environments?: string; preview?: boolean; update?: boolean } @@ -481,6 +518,12 @@ async function setEnvVar( await requireAuth() await setupClient() + const resolved = await requireProjectSlug(projectFlag) + const project = resolved.slug + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(project)} (from ${resolved.source})`) + } + // Get environments first const [existingVars, envs] = await withSpinner('Fetching environments...', async () => { const projectId = await getProjectId(project) @@ -1087,15 +1130,19 @@ function displayResources(env: EnvironmentResponse | null | undefined): void { // ============ Scale Command ============ async function scaleCmd( - project: string, - environment: string, - replicas: string | undefined, - options: { json?: boolean } + options: { project?: string; environment: string; replicas?: string; json?: boolean } ): Promise { await requireAuth() await setupClient() - const projectId = await getProjectId(project) + const resolved = await requireProjectSlug(options.project) + + if 
(resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + + const environment = options.environment + const projectId = await getProjectId(resolved.slug) // Find environment by slug const envs = await withSpinner('Fetching environments...', async () => { @@ -1117,9 +1164,9 @@ async function scaleCmd( return } - if (replicas !== undefined) { + if (options.replicas !== undefined) { // Set replicas - const replicaCount = parseInt(replicas, 10) + const replicaCount = parseInt(options.replicas, 10) if (isNaN(replicaCount) || replicaCount < 0) { errorOutput('Replicas must be a non-negative number') return @@ -1148,7 +1195,7 @@ async function scaleCmd( } newline() - success(`Scaled ${project}/${environment} to ${replicaCount} replica${replicaCount !== 1 ? 's' : ''}`) + success(`Scaled ${resolved.slug}/${environment} to ${replicaCount} replica${replicaCount !== 1 ? 's' : ''}`) newline() info(`Note: Scaling takes effect on the next deployment or restart`) } else { @@ -1164,12 +1211,12 @@ async function scaleCmd( } newline() - header(`${icons.folder} Scale for ${project}/${environment}`) + header(`${icons.folder} Scale for ${resolved.slug}/${environment}`) newline() keyValue('Current Replicas', String(currentReplicas)) newline() - info(`To scale: ${colors.muted(`temps env scale ${project} ${environment} `)}`) - info(`Example: ${colors.muted(`temps env scale ${project} ${environment} 3`)}`) + info(`To scale: ${colors.muted(`bunx @temps-sdk/cli env scale -p ${resolved.slug} -e ${environment} --replicas `)}`) + info(`Example: ${colors.muted(`bunx @temps-sdk/cli env scale -p ${resolved.slug} -e ${environment} --replicas 3`)}`) } } @@ -1366,35 +1413,4 @@ async function listCronExecutions( } // Helper function to parse .env file content -function parseEnvFile(content: string): Record<string, string> { - const variables: Record<string, string> = {} - - for (const line of content.split('\n')) { - const trimmed = line.trim() - - // Skip empty lines and 
comments - if (!trimmed || trimmed.startsWith('#')) continue - - // Parse KEY=VALUE - const match = trimmed.match(/^([^=]+)=(.*)$/) - if (!match) continue - - const [, key, rawValue] = match - if (!key || rawValue === undefined) continue - - let value = rawValue.trim() - - // Handle quoted values - if ((value.startsWith('"') && value.endsWith('"')) || - (value.startsWith("'") && value.endsWith("'"))) { - value = value.slice(1, -1) - .replace(/\\n/g, '\n') - .replace(/\\"/g, '"') - .replace(/\\'/g, "'") - } - - variables[key.trim()] = value - } - - return variables -} +// parseEnvFile is now imported from '../../lib/env-file.js' diff --git a/apps/temps-cli/src/commands/open/index.ts b/apps/temps-cli/src/commands/open/index.ts index dd670696..c8d031be 100644 --- a/apps/temps-cli/src/commands/open/index.ts +++ b/apps/temps-cli/src/commands/open/index.ts @@ -1,9 +1,8 @@ import type { Command } from 'commander' import { execSync } from 'node:child_process' import { requireAuth } from '../../config/store.js' -import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' +import { setupClient, client, getWebUrl, getErrorMessage } from '../../lib/api-client.js' import { requireProjectSlug } from '../../config/resolve-project.js' -import { config } from '../../config/store.js' import { getProjectBySlug, getEnvironments } from '../../api/sdk.gen.js' import { withSpinner } from '../../ui/spinner.js' import { promptSelect } from '../../ui/prompts.js' @@ -50,8 +49,8 @@ async function open(projectArg: string | undefined, options: OpenOptions): Promi // If --dashboard, open the web dashboard if (options.dashboard) { - const apiUrl = config.get('apiUrl') - const dashboardUrl = `${apiUrl}/dashboard/projects/${resolved.slug}` + const webUrl = getWebUrl() + const dashboardUrl = `${webUrl}/projects/${resolved.slug}` success(`Opening dashboard for ${resolved.slug}`) openUrl(dashboardUrl) return diff --git a/apps/temps-cli/src/commands/projects/create.ts 
b/apps/temps-cli/src/commands/projects/create.ts index bc896794..2b343fcf 100644 --- a/apps/temps-cli/src/commands/projects/create.ts +++ b/apps/temps-cli/src/commands/projects/create.ts @@ -1,5 +1,5 @@ import { requireAuth, config } from '../../config/store.js' -import { promptText, promptConfirm, type SelectOption } from '../../ui/prompts.js' +import { promptText, promptConfirm, promptSelect, type SelectOption } from '../../ui/prompts.js' import { withSpinner } from '../../ui/spinner.js' import { success, @@ -10,10 +10,12 @@ import { keyValue, header, info, + warning, } from '../../ui/output.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' import { createProject } from '../../api/sdk.gen.js' import type { RepositoryResponse } from '../../api/types.gen.js' +import { readEnvFile, findEnvFiles } from '../../lib/env-file.js' // Shared utilities (extracted to avoid duplication with setup wizard) import { @@ -183,32 +185,123 @@ async function configureEnvironmentVariables(): Promise<[string, string][]> { const envVars: [string, string][] = [] - let addMore = true - while (addMore) { - newline() - const key = await promptText({ - message: 'Variable name (e.g., DATABASE_URL)', - required: true, - validate: (v) => { - if (!v) return 'Variable name is required' - if (!/^[A-Z_][A-Z0-9_]*$/i.test(v)) { - return 'Variable name must start with a letter or underscore and contain only letters, numbers, and underscores' - } - return true - }, - }) + // Check for .env files in the current directory + const envFiles = findEnvFiles() - const value = await promptText({ - message: `Value for ${key}`, - required: true, + // Build method choices + const methodChoices: SelectOption[] = [] + + if (envFiles.length > 0) { + methodChoices.push({ + name: `Import from file (${envFiles.join(', ')} found)`, + value: 'file', + description: 'Load variables from a .env file', }) + } + + methodChoices.push( + { + name: 'Enter manually', + value: 'manual', + 
description: 'Type key-value pairs one by one', + }, + { + name: 'Specify file path', + value: 'path', + description: 'Provide a custom path to a .env file', + }, + ) + + const method = methodChoices.length === 1 + ? 'manual' + : await promptSelect({ message: 'How to add variables?', choices: methodChoices }) + + if (method === 'file' || method === 'path') { + let filePath: string + + if (method === 'file') { + if (envFiles.length === 1) { + filePath = envFiles[0]! + } else { + filePath = await promptSelect({ + message: 'Select .env file', + choices: envFiles.map((f) => ({ name: f, value: f })), + }) + } + } else { + filePath = await promptText({ + message: 'Path to .env file', + default: '.env', + required: true, + }) + } + + const parsed = readEnvFile(filePath) + + if (!parsed || Object.keys(parsed).length === 0) { + warning(`No variables found in ${filePath}`) + } else { + const entries = Object.entries(parsed) + newline() + info(`Found ${entries.length} variable(s) in ${colors.bold(filePath)}:`) + newline() - envVars.push([key, value]) + for (const [key, value] of entries) { + const masked = value.length > 30 ? 
`${value.substring(0, 30)}...` : value + keyValue(` ${key}`, colors.muted(masked)) + } + + newline() + const confirm = await promptConfirm({ + message: `Import ${entries.length} variable(s)?`, + default: true, + }) - addMore = await promptConfirm({ - message: 'Add another environment variable?', + if (confirm) { + for (const [key, value] of entries) { + envVars.push([key, value]) + } + success(`Imported ${entries.length} variable(s) from ${filePath}`) + } + } + } + + // Manual entry (either as primary method or to add more after file import) + if (method === 'manual' || envVars.length > 0) { + const shouldAddManual = method === 'manual' || await promptConfirm({ + message: 'Add more variables manually?', default: false, }) + + if (shouldAddManual) { + let addMore = true + while (addMore) { + newline() + const key = await promptText({ + message: 'Variable name (e.g., DATABASE_URL)', + required: true, + validate: (v) => { + if (!v) return 'Variable name is required' + if (!/^[A-Z_][A-Z0-9_]*$/i.test(v)) { + return 'Variable name must start with a letter or underscore and contain only letters, numbers, and underscores' + } + return true + }, + }) + + const value = await promptText({ + message: `Value for ${key}`, + required: true, + }) + + envVars.push([key, value]) + + addMore = await promptConfirm({ + message: 'Add another variable?', + default: false, + }) + } + } } return envVars diff --git a/apps/temps-cli/src/commands/projects/delete.ts b/apps/temps-cli/src/commands/projects/delete.ts index af1a0669..62329bb4 100644 --- a/apps/temps-cli/src/commands/projects/delete.ts +++ b/apps/temps-cli/src/commands/projects/delete.ts @@ -1,12 +1,13 @@ import { requireAuth, config } from '../../config/store.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { promptConfirm } from '../../ui/prompts.js' import { withSpinner } from '../../ui/spinner.js' -import { success, warning, newline, colors } from '../../ui/output.js' +import { success, warning, 
newline, colors, info } from '../../ui/output.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' import { deleteProject, getProjectBySlug } from '../../api/sdk.gen.js' interface DeleteOptions { - project: string + project?: string force?: boolean yes?: boolean } @@ -15,7 +16,13 @@ export async function remove(options: DeleteOptions): Promise { await requireAuth() await setupClient() - const projectIdOrName = options.project + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + + const projectIdOrName = resolved.slug newline() diff --git a/apps/temps-cli/src/commands/projects/index.ts b/apps/temps-cli/src/commands/projects/index.ts index fe4a2303..6814b589 100644 --- a/apps/temps-cli/src/commands/projects/index.ts +++ b/apps/temps-cli/src/commands/projects/index.ts @@ -34,7 +34,7 @@ export function registerProjectsCommands(program: Command): void { .command('show') .alias('get') .description('Show project details') - .requiredOption('-p, --project ', 'Project slug or ID') + .option('-p, --project ', 'Project slug or ID') .option('--json', 'Output in JSON format') .action(show) @@ -42,7 +42,7 @@ export function registerProjectsCommands(program: Command): void { .command('update') .alias('edit') .description('Update project name and description') - .requiredOption('-p, --project ', 'Project slug or ID') + .option('-p, --project ', 'Project slug or ID') .option('-n, --name ', 'New project name') .option('-d, --description ', 'New project description') .option('--json', 'Output in JSON format') @@ -52,7 +52,7 @@ export function registerProjectsCommands(program: Command): void { projects .command('settings') .description('Update project settings (slug, attack mode, preview environments)') - .requiredOption('-p, --project ', 'Project slug or ID') + .option('-p, --project ', 'Project slug or ID') .option('--slug 
', 'Project URL slug') .option('--attack-mode', 'Enable attack mode (CAPTCHA protection)') .option('--no-attack-mode', 'Disable attack mode') @@ -65,7 +65,7 @@ export function registerProjectsCommands(program: Command): void { projects .command('git') .description('Update git repository settings') - .requiredOption('-p, --project ', 'Project slug or ID') + .option('-p, --project ', 'Project slug or ID') .option('--owner ', 'Repository owner') .option('--repo ', 'Repository name') .option('--branch ', 'Main branch') @@ -78,7 +78,7 @@ export function registerProjectsCommands(program: Command): void { projects .command('config') .description('Update deployment configuration (resources, replicas)') - .requiredOption('-p, --project ', 'Project slug or ID') + .option('-p, --project ', 'Project slug or ID') .option('--replicas ', 'Number of container replicas') .option('--cpu-limit ', 'CPU limit in cores (e.g., 0.5, 1, 2)') .option('--memory-limit ', 'Memory limit in MB') @@ -92,7 +92,7 @@ export function registerProjectsCommands(program: Command): void { .command('delete') .alias('rm') .description('Delete a project') - .requiredOption('-p, --project ', 'Project slug or ID') + .option('-p, --project ', 'Project slug or ID') .option('-f, --force', 'Skip confirmation') .option('-y, --yes', 'Skip confirmation (alias for --force)') .action(remove) diff --git a/apps/temps-cli/src/commands/projects/show.ts b/apps/temps-cli/src/commands/projects/show.ts index e55931a5..27403ca6 100644 --- a/apps/temps-cli/src/commands/projects/show.ts +++ b/apps/temps-cli/src/commands/projects/show.ts @@ -1,12 +1,13 @@ import { requireAuth } from '../../config/store.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { withSpinner } from '../../ui/spinner.js' -import { newline, header, icons, json, keyValue, formatDate } from '../../ui/output.js' +import { newline, header, icons, json, keyValue, info, colors, formatDate } from '../../ui/output.js' import { 
detailsTable, statusBadge } from '../../ui/table.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' import { getProject, getProjectBySlug } from '../../api/sdk.gen.js' interface ShowOptions { - project: string + project?: string json?: boolean } @@ -14,7 +15,13 @@ export async function show(options: ShowOptions): Promise { await requireAuth() await setupClient() - const projectIdOrName = options.project + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + + const projectIdOrName = resolved.slug const project = await withSpinner('Fetching project...', async () => { // Try to parse as ID first diff --git a/apps/temps-cli/src/commands/projects/update.ts b/apps/temps-cli/src/commands/projects/update.ts index c1fc6e72..358ad62c 100644 --- a/apps/temps-cli/src/commands/projects/update.ts +++ b/apps/temps-cli/src/commands/projects/update.ts @@ -1,4 +1,5 @@ import { requireAuth } from '../../config/store.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' import { getProject, @@ -13,12 +14,18 @@ import { promptText, promptConfirm, promptSelect } from '../../ui/prompts.js' import { newline, header, icons, json, colors, success, info, warning, keyValue } from '../../ui/output.js' export async function updateProjectAction( - options: { project: string; name?: string; json?: boolean; yes?: boolean } + options: { project?: string; name?: string; json?: boolean; yes?: boolean } ): Promise { await requireAuth() await setupClient() - const projectIdOrSlug = options.project + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + + const projectIdOrSlug = resolved.slug // Get project first const 
project = await withSpinner('Fetching project...', async () => { @@ -88,7 +95,7 @@ export async function updateProjectAction( export async function updateSettingsAction( options: { - project: string + project?: string slug?: string attackMode?: boolean previewEnvs?: boolean @@ -99,7 +106,13 @@ export async function updateSettingsAction( await requireAuth() await setupClient() - const projectIdOrSlug = options.project + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + + const projectIdOrSlug = resolved.slug // Get project first const project = await withSpinner('Fetching project...', async () => { @@ -184,7 +197,7 @@ export async function updateSettingsAction( export async function updateGitAction( options: { - project: string + project?: string owner?: string repo?: string branch?: string @@ -197,7 +210,13 @@ export async function updateGitAction( await requireAuth() await setupClient() - const projectIdOrSlug = options.project + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + + const projectIdOrSlug = resolved.slug // Get project first const project = await withSpinner('Fetching project...', async () => { @@ -320,7 +339,7 @@ export async function updateGitAction( export async function updateConfigAction( options: { - project: string + project?: string replicas?: string cpuLimit?: string memoryLimit?: string @@ -332,7 +351,13 @@ export async function updateConfigAction( await requireAuth() await setupClient() - const projectIdOrSlug = options.project + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + + const projectIdOrSlug = resolved.slug // Get project first const 
project = await withSpinner('Fetching project...', async () => { diff --git a/apps/temps-cli/src/commands/runtime-logs.ts b/apps/temps-cli/src/commands/runtime-logs.ts index ea327290..821312df 100644 --- a/apps/temps-cli/src/commands/runtime-logs.ts +++ b/apps/temps-cli/src/commands/runtime-logs.ts @@ -12,18 +12,20 @@ interface RuntimeLogsOptions { container?: string tail: string timestamps?: boolean + follow?: boolean } export function registerRuntimeLogsCommand(program: Command): void { program .command('runtime-logs') .alias('rlogs') - .description('Stream runtime container logs (not build logs)') + .description('View runtime container logs (use -f to follow in real-time)') .option('-p, --project ', 'Project slug or ID') .option('-e, --environment ', 'Environment name', 'production') .option('-c, --container ', 'Container ID (partial match supported)') .option('-n, --tail ', 'Number of lines to tail', '1000') .option('-t, --timestamps', 'Show timestamps') + .option('-f, --follow', 'Follow log output (stream in real-time)') .action(runtimeLogs) } @@ -131,60 +133,71 @@ async function runtimeLogs(options: RuntimeLogsOptions): Promise { // Remove protocol and any trailing slash, keep the path (e.g., /api) const urlWithoutProtocol = apiUrl.replace(/^https?:\/\//, '').replace(/\/$/, '') + const follow = options.follow ?? false const params = new URLSearchParams() params.append('tail', options.tail) params.append('timestamps', String(options.timestamps ?? 
false)) + params.append('follow', String(follow)) const wsUrl = `${wsProtocol}://${urlWithoutProtocol}/projects/${projectData.id}/environments/${environment.id}/containers/${selectedContainer.container_id}/logs?${params.toString()}` - info(`Connecting to WebSocket...`) - info(`URL: ${colors.muted(wsUrl)}`) + if (follow) { + info(`Streaming logs (follow mode)...`) + } else { + info(`Fetching logs...`) + } newline() // Connect via WebSocket - await connectWebSocket(wsUrl, apiKey) + await connectWebSocket(wsUrl, apiKey, follow) } -async function connectWebSocket(url: string, apiKey: string): Promise { - return new Promise((resolve, reject) => { +function formatLogMessage(raw: string): void { + // Docker log lines include trailing newlines; strip them so + // console.log doesn't produce double-spaced output. + const data = raw.replace(/\r?\n$/, '') + + // Try to parse as JSON for structured logs + try { + const parsed = JSON.parse(data) + if (parsed.error) { + console.log(colors.error(`ERROR: ${parsed.error}`)) + if (parsed.detail) { + console.log(colors.muted(` ${parsed.detail}`)) + } + } else if (parsed.message) { + console.log(parsed.message.replace(/\r?\n$/, '')) + } else { + console.log(data) + } + } catch { + // Plain text log line + console.log(data) + } +} + +async function connectWebSocket(url: string, apiKey: string, follow: boolean): Promise { + return new Promise((resolve) => { const ws = new WebSocket(url, { headers: { 'Authorization': `Bearer ${apiKey}`, }, } as any) + let sigintHandler: (() => void) | null = null + ws.onopen = () => { - console.log(colors.success('✓ Connected to log stream')) - console.log(colors.muted('─'.repeat(60))) - console.log(colors.muted('Press Ctrl+C to stop')) - console.log(colors.muted('─'.repeat(60))) - console.log() + if (follow) { + console.log(colors.success('✓ Connected to log stream')) + console.log(colors.muted('─'.repeat(60))) + console.log(colors.muted('Press Ctrl+C to stop')) + 
console.log(colors.muted('─'.repeat(60))) + console.log() + } } ws.onmessage = (event) => { - const raw = event.data.toString() - - // Docker log lines include trailing newlines; strip them so - // console.log doesn't produce double-spaced output. - const data = raw.replace(/\r?\n$/, '') - - // Try to parse as JSON for structured logs - try { - const parsed = JSON.parse(data) - if (parsed.error) { - console.log(colors.error(`ERROR: ${parsed.error}`)) - if (parsed.detail) { - console.log(colors.muted(` ${parsed.detail}`)) - } - } else if (parsed.message) { - console.log(parsed.message.replace(/\r?\n$/, '')) - } else { - console.log(data) - } - } catch { - // Plain text log line - console.log(data) - } + formatLogMessage(event.data.toString()) } ws.onerror = (error) => { @@ -192,24 +205,40 @@ async function connectWebSocket(url: string, apiKey: string): Promise { } ws.onclose = (event) => { - console.log() - console.log(colors.muted('─'.repeat(60))) - if (event.code === 1000) { - console.log(colors.info('Connection closed normally')) - } else { - console.log(colors.warning(`Connection closed (code: ${event.code})`)) - if (event.reason) { - console.log(colors.muted(`Reason: ${event.reason}`)) + // Clean up the SIGINT handler + if (sigintHandler) { + process.removeListener('SIGINT', sigintHandler) + } + + if (follow) { + console.log() + console.log(colors.muted('─'.repeat(60))) + if (event.code === 1000) { + console.log(colors.info('Connection closed normally')) + } else { + console.log(colors.warning(`Connection closed (code: ${event.code})`)) + if (event.reason) { + console.log(colors.muted(`Reason: ${event.reason}`)) + } } } resolve() } - // Handle Ctrl+C gracefully - process.on('SIGINT', () => { + // Handle Ctrl+C gracefully (only relevant for follow mode, but register always) + sigintHandler = () => { console.log() console.log(colors.muted('Closing connection...')) - ws.close(1000, 'User requested close') - }) + try { + ws.close(1000, 'User requested close') + } 
catch { + // WebSocket may already be closed + } + // Force exit after a short delay in case ws.close doesn't trigger onclose + setTimeout(() => { + process.exit(0) + }, 500) + } + process.on('SIGINT', sigintHandler) }) } diff --git a/apps/temps-cli/src/commands/services/index.ts b/apps/temps-cli/src/commands/services/index.ts index cbc06868..de8f28f2 100644 --- a/apps/temps-cli/src/commands/services/index.ts +++ b/apps/temps-cli/src/commands/services/index.ts @@ -18,8 +18,10 @@ import { unlinkServiceFromProject, getServiceEnvironmentVariables, getServiceEnvironmentVariable, + getProjectBySlug, } from '../../api/sdk.gen.js' import type { ExternalServiceInfo, ServiceTypeRoute } from '../../api/types.gen.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { withSpinner } from '../../ui/spinner.js' import { printTable, statusBadge, type TableColumn } from '../../ui/table.js' import { promptText, promptSelect, promptConfirm } from '../../ui/prompts.js' @@ -32,10 +34,94 @@ const SERVICE_TYPE_LABELS: Record = { s3: 'MinIO (S3)', } +// Default parameters for each service type when using automation mode (-y) +// These match the backend's required fields + sensible defaults +const SERVICE_TYPE_DEFAULTS: Record> = { + postgres: { database: 'myapp', username: 'postgres' }, + mongodb: { database: 'myapp', username: 'mongoadmin' }, + redis: {}, + s3: {}, +} + +// JSON Schema → interactive prompt parameters +interface SchemaProperty { + type?: string + description?: string + default?: unknown + example?: unknown + enum?: string[] +} + +interface JsonSchema { + type?: string + title?: string + properties?: Record + required?: string[] + readonly?: string[] +} + +interface PromptParam { + name: string + label: string + description?: string + default_value?: unknown + required: boolean + readonly: boolean + enum_values?: string[] + param_type: string +} + +function schemaToPromptParams(schema: JsonSchema): PromptParam[] { + if (!schema?.properties) 
return [] + const required = new Set(schema.required ?? []) + const readonly = new Set(schema.readonly ?? []) + return Object.entries(schema.properties).map(([name, prop]) => ({ + name, + label: name.replace(/_/g, ' ').replace(/\b\w/g, c => c.toUpperCase()), + description: prop.description, + default_value: prop.default ?? prop.example, + required: required.has(name), + readonly: readonly.has(name), + enum_values: prop.enum, + param_type: prop.type ?? 'string', + })) +} + +/** + * Parse repeatable --set key=value pairs into a Record. + * Supports type coercion: numbers → number, true/false → boolean, rest → string. + */ +function parseSetPairs(pairs: string[]): Record { + const result: Record = {} + for (const pair of pairs) { + const eqIdx = pair.indexOf('=') + if (eqIdx === -1) { + throw new Error(`Invalid parameter "${pair}". Expected format: key=value`) + } + const key = pair.slice(0, eqIdx).trim() + const raw = pair.slice(eqIdx + 1) + if (!key) { + throw new Error(`Invalid parameter "${pair}". 
Key cannot be empty`) + } + // Type coercion + if (raw === 'true') result[key] = true + else if (raw === 'false') result[key] = false + else if (raw === '0') result[key] = 0 + else if (raw !== '' && !isNaN(Number(raw)) && !raw.startsWith('0')) result[key] = Number(raw) + else result[key] = raw + } + return result +} + +/** Collect repeatable --set values into an array */ +function collectSet(value: string, previous: string[]): string[] { + return previous.concat([value]) +} + interface CreateOptions { type?: string name?: string - parameters?: string + set?: string[] yes?: boolean } @@ -62,7 +148,7 @@ interface ProjectsOptions { interface UpdateOptions { id: string name?: string - parameters?: string + set?: string[] } interface UpgradeOptions { @@ -74,36 +160,52 @@ interface ImportOptions { type?: string name?: string containerId?: string - parameters?: string + set?: string[] version?: string yes?: boolean } interface LinkOptions { id: string - projectId: string + project?: string } interface UnlinkOptions { id: string - projectId: string + project?: string force?: boolean yes?: boolean } interface EnvOptions { id: string - projectId: string + project?: string json?: boolean } interface EnvVarOptions { id: string - projectId: string + project?: string var: string json?: boolean } +/** Resolve project slug (from flag, .temps/config.json, env, global) → project ID */ +async function resolveProjectId(flagValue?: string): Promise<{ id: number; slug: string }> { + const resolved = await requireProjectSlug(flagValue) + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + const { data, error } = await getProjectBySlug({ + client, + path: { slug: resolved.slug }, + }) + if (error || !data) { + throw new Error(`Project "${resolved.slug}" not found`) + } + return { id: data.id, slug: resolved.slug } +} + export function registerServicesCommands(program: Command): void { const services = program 
.command('services') @@ -123,7 +225,7 @@ export function registerServicesCommands(program: Command): void { .description('Create a new external service') .option('-t, --type ', 'Service type (postgres, mongodb, redis, s3)') .option('-n, --name ', 'Service name') - .option('--parameters ', 'Service parameters as JSON string') + .option('-s, --set ', 'Set a parameter (repeatable)', collectSet, []) .option('-y, --yes', 'Skip confirmation prompts (for automation)') .action(createServiceAction) @@ -155,12 +257,18 @@ export function registerServicesCommands(program: Command): void { .requiredOption('--id ', 'Service ID') .action(stopServiceAction) - services + const typesCmd = services .command('types') .description('List available service types') .option('--json', 'Output in JSON format') .action(listServiceTypes) + typesCmd + .command('info ') + .description('Show parameters schema for a service type (useful for automation)') + .option('--json', 'Output as raw JSON schema (default)') + .action(showServiceTypeInfo) + services .command('projects') .description('List projects linked to a service') @@ -173,7 +281,7 @@ export function registerServicesCommands(program: Command): void { .description('Update a service') .requiredOption('--id ', 'Service ID') .option('-n, --name ', 'Docker image name (e.g., postgres:18-alpine)') - .option('--parameters ', 'Service parameters as JSON string') + .option('-s, --set ', 'Set a parameter (repeatable)', collectSet, []) .action(updateServiceAction) services @@ -189,7 +297,7 @@ export function registerServicesCommands(program: Command): void { .option('-t, --type ', 'Service type (postgres, mongodb, redis, s3)') .option('-n, --name ', 'Service name') .option('--container-id ', 'Container ID or name to import') - .option('--parameters ', 'Service parameters as JSON string') + .option('-s, --set ', 'Set a parameter (repeatable)', collectSet, []) .option('--version ', 'Optional version override') .option('-y, --yes', 'Skip confirmation 
prompts (for automation)') .action(importServiceAction) @@ -198,14 +306,14 @@ export function registerServicesCommands(program: Command): void { .command('link') .description('Link a service to a project') .requiredOption('--id ', 'Service ID') - .requiredOption('--project-id ', 'Project ID') + .option('-p, --project ', 'Project slug (auto-detected from .temps/config.json)') .action(linkServiceAction) services .command('unlink') .description('Unlink a service from a project') .requiredOption('--id ', 'Service ID') - .requiredOption('--project-id ', 'Project ID') + .option('-p, --project ', 'Project slug (auto-detected from .temps/config.json)') .option('-f, --force', 'Skip confirmation') .option('-y, --yes', 'Skip confirmation prompts (alias for --force)') .action(unlinkServiceAction) @@ -214,7 +322,7 @@ export function registerServicesCommands(program: Command): void { .command('env') .description('Show environment variables for a linked service') .requiredOption('--id ', 'Service ID') - .requiredOption('--project-id ', 'Project ID') + .option('-p, --project ', 'Project slug (auto-detected from .temps/config.json)') .option('--json', 'Output in JSON format') .action(envAction) @@ -222,7 +330,7 @@ export function registerServicesCommands(program: Command): void { .command('env-var') .description('Get a specific environment variable') .requiredOption('--id ', 'Service ID') - .requiredOption('--project-id ', 'Project ID') + .option('-p, --project ', 'Project slug (auto-detected from .temps/config.json)') .requiredOption('--var ', 'Environment variable name') .option('--json', 'Output in JSON format') .action(envVarAction) @@ -289,8 +397,11 @@ async function createServiceAction(options: CreateOptions): Promise { let name: string let parameters: Record = {} - // Check if automation mode (all required params provided) - const isAutomation = options.yes && options.type && options.name + const hasSetParams = options.set && options.set.length > 0 + + // Automation mode: -y 
with type+name, OR type+name+set (explicit params = no need for -y) + const isAutomation = (options.yes && options.type && options.name) || + (options.type && options.name && hasSetParams) if (isAutomation) { // Validate service type @@ -301,60 +412,67 @@ async function createServiceAction(options: CreateOptions): Promise { serviceType = options.type as ServiceTypeRoute name = options.name! - // Parse parameters if provided - if (options.parameters) { + // Parse --set key=value pairs if provided, otherwise use smart defaults + if (hasSetParams) { try { - parameters = JSON.parse(options.parameters) - } catch { - warning('Invalid JSON in --parameters') + parameters = parseSetPairs(options.set!) + } catch (e) { + warning((e as Error).message) return } + } else { + // Apply default parameters for this service type (e.g., database/username for postgres) + parameters = { ...(SERVICE_TYPE_DEFAULTS[serviceType] ?? {}) } } } else { - // Interactive mode - serviceType = await promptSelect({ - message: 'Service type', - choices: types.map(t => ({ - name: SERVICE_TYPE_LABELS[t] || t, - value: t, - })), - }) as ServiceTypeRoute + // Interactive mode — use --type and --name if provided, prompt for the rest + if (options.type) { + if (!types.includes(options.type as ServiceTypeRoute)) { + warning(`Invalid service type: ${options.type}. 
Available: ${types.join(', ')}`) + return + } + serviceType = options.type as ServiceTypeRoute + info(`Service type: ${colors.bold(SERVICE_TYPE_LABELS[serviceType] || serviceType)}`) + } else { + serviceType = await promptSelect({ + message: 'Service type', + choices: types.map(t => ({ + name: SERVICE_TYPE_LABELS[t] || t, + value: t, + })), + }) as ServiceTypeRoute + } - name = await promptText({ - message: 'Service name', - default: `my-${serviceType}`, - required: true, - }) + if (options.name) { + name = options.name + } else { + name = await promptText({ + message: 'Service name', + default: `my-${serviceType}`, + required: true, + }) + } - // Get parameters schema for the service type + // Get parameters schema for the service type (returns JSON Schema) const { data: typeInfo } = await getServiceTypeParameters({ client, path: { service_type: serviceType }, }) - // Type guard for parameters response - interface ServiceTypeParameter { - name: string - label?: string - default_value?: unknown - required?: boolean - enum_values?: string[] - param_type?: string - } - interface ServiceTypeParametersResponse { - parameters?: ServiceTypeParameter[] - } - const paramResponse = typeInfo as ServiceTypeParametersResponse | undefined + const schema = typeInfo as JsonSchema | undefined + const promptParams = schemaToPromptParams(schema ?? 
{}) + // Only show user-configurable params (skip readonly ones the backend auto-generates) + const configurableParams = promptParams.filter(p => !p.readonly || p.required) - if (paramResponse?.parameters && paramResponse.parameters.length > 0) { + if (configurableParams.length > 0) { info(`\nConfigure ${SERVICE_TYPE_LABELS[serviceType] || serviceType} parameters:`) newline() - for (const param of paramResponse.parameters) { - // Skip parameters that have defaults and aren't required + for (const param of configurableParams) { + // Skip non-required params that have defaults — use the default automatically if (param.default_value !== undefined && !param.required) { const useDefault = await promptConfirm({ - message: `${param.label || param.name}: Use default "${param.default_value}"?`, + message: `${param.label}${param.description ? ` (${param.description})` : ''}: Use default "${param.default_value}"?`, default: true, }) if (useDefault) { @@ -367,19 +485,18 @@ async function createServiceAction(options: CreateOptions): Promise { if (param.enum_values && param.enum_values.length > 0) { value = await promptSelect({ - message: param.label || param.name, + message: param.label, choices: param.enum_values.map((v: string) => ({ name: v, value: v })), }) } else { value = await promptText({ - message: param.label || param.name, - default: param.default_value?.toString() || '', - required: param.required || false, + message: `${param.label}${param.description ? ` (${param.description})` : ''}`, + default: param.default_value?.toString() ?? 
'', + required: param.required, }) } if (value) { - // Try to parse as number if the param type suggests it if (param.param_type === 'integer' || param.param_type === 'number') { parameters[param.name] = parseInt(value, 10) } else if (param.param_type === 'boolean') { @@ -580,6 +697,76 @@ async function listServiceTypes(options: { json?: boolean }): Promise { console.log(` ${colors.bold(SERVICE_TYPE_LABELS[t] || t)} ${colors.muted(`(${t})`)}`) } newline() + info(`Run ${colors.bold('services types info ')} to see parameters for a specific type`) +} + +/** Build an example `services create` command using --set flags with all schema defaults */ +function buildExampleCommand(type: string, schema?: JsonSchema): string { + const setParts: string[] = [] + if (schema?.properties) { + for (const [key, prop] of Object.entries(schema.properties)) { + // Skip params with null defaults (auto-generated like password, port) + if (prop.default === null || prop.default === undefined) continue + setParts.push(`--set ${key}=${prop.default}`) + } + } + const setsStr = setParts.length > 0 ? ` ${setParts.join(' ')}` : '' + return `bunx @temps-sdk/cli services create -t ${type} -n my-${type}${setsStr}` +} + +async function showServiceTypeInfo(type: string): Promise { + await requireAuth() + await setupClient() + + const { data, error } = await getServiceTypeParameters({ + client, + path: { service_type: type as ServiceTypeRoute }, + }) + + if (error) { + warning(`Failed to get parameters for "${type}": ${getErrorMessage(error)}`) + return + } + + const schema = data as JsonSchema | undefined + if (!schema?.properties) { + json({ type, parameters: {}, defaults: SERVICE_TYPE_DEFAULTS[type] ?? {} }) + return + } + + // Build a clean output for agents: each parameter with its metadata + const params: Record = {} + + const requiredKeys = new Set(schema.required ?? []) + const readonlyKeys = new Set(schema.readonly ?? 
[]) + + for (const [name, prop] of Object.entries(schema.properties)) { + params[name] = { + type: prop.type ?? 'string', + ...(prop.description ? { description: prop.description } : {}), + required: requiredKeys.has(name), + readonly: readonlyKeys.has(name), + ...(prop.default !== undefined ? { default: prop.default } : {}), + ...(prop.example !== undefined ? { example: prop.example } : {}), + } + } + + const output = { + service_type: type, + label: SERVICE_TYPE_LABELS[type as ServiceTypeRoute] || type, + parameters: params, + defaults: SERVICE_TYPE_DEFAULTS[type] ?? {}, + example_create: buildExampleCommand(type, schema), + } + + json(output) } async function listLinkedProjects(options: ProjectsOptions): Promise { @@ -634,11 +821,11 @@ async function updateServiceAction(options: UpdateOptions): Promise { } let parameters: Record = {} - if (options.parameters) { + if (options.set && options.set.length > 0) { try { - parameters = JSON.parse(options.parameters) - } catch { - warning('Invalid JSON in --parameters') + parameters = parseSetPairs(options.set) + } catch (e) { + warning((e as Error).message) return } } @@ -728,11 +915,11 @@ async function importServiceAction(options: ImportOptions): Promise { name = options.name! containerId = options.containerId! 
- if (options.parameters) { + if (options.set && options.set.length > 0) { try { - parameters = JSON.parse(options.parameters) - } catch { - warning('Invalid JSON in --parameters') + parameters = parseSetPairs(options.set) + } catch (e) { + warning((e as Error).message) return } } @@ -770,11 +957,11 @@ async function importServiceAction(options: ImportOptions): Promise<void> { required: true, }) - if (options.parameters) { + if (options.set && options.set.length > 0) { try { - parameters = JSON.parse(options.parameters) - } catch { - warning('Invalid JSON in --parameters') + parameters = parseSetPairs(options.set) + } catch (e) { + warning((e as Error).message) return } } @@ -810,18 +997,14 @@ async function linkServiceAction(options: LinkOptions): Promise<void> { return } - const projectId = parseInt(options.projectId, 10) - if (isNaN(projectId)) { - warning('Invalid project ID') - return - } + const project = await resolveProjectId(options.project) - await withSpinner('Linking service to project...', async () => { + await withSpinner(`Linking service to project ${colors.bold(project.slug)}...`, async () => { const { error } = await linkServiceToProject({ client, path: { id }, body: { - project_id: projectId, + project_id: project.id, }, }) if (error) { @@ -829,7 +1012,7 @@ async function linkServiceAction(options: LinkOptions): Promise<void> { } }) - success(`Service ${options.id} linked to project ${options.projectId}`) + success(`Service ${options.id} linked to project ${project.slug}`) } async function unlinkServiceAction(options: UnlinkOptions): Promise<void> { @@ -842,17 +1025,13 @@ async function unlinkServiceAction(options: UnlinkOptions): Promise<void> { return } - const projectId = parseInt(options.projectId, 10) - if (isNaN(projectId)) { - warning('Invalid project ID') - return - } + const project = await resolveProjectId(options.project) const skipConfirmation = options.force || options.yes if (!skipConfirmation) { const confirmed = await promptConfirm({ - message: `Unlink service
${options.id} from project ${options.projectId}?`, + message: `Unlink service ${options.id} from project ${project.slug}?`, default: false, }) if (!confirmed) { @@ -861,17 +1040,17 @@ async function unlinkServiceAction(options: UnlinkOptions): Promise<void> { } } - await withSpinner('Unlinking service from project...', async () => { + await withSpinner(`Unlinking service from project ${colors.bold(project.slug)}...`, async () => { const { error } = await unlinkServiceFromProject({ client, - path: { id, project_id: projectId }, + path: { id, project_id: project.id }, }) if (error) { throw new Error(getErrorMessage(error)) } }) - success(`Service ${options.id} unlinked from project ${options.projectId}`) + success(`Service ${options.id} unlinked from project ${project.slug}`) } async function envAction(options: EnvOptions): Promise<void> { @@ -884,16 +1063,12 @@ async function envAction(options: EnvOptions): Promise<void> { return } - const projectId = parseInt(options.projectId, 10) - if (isNaN(projectId)) { - warning('Invalid project ID') - return - } + const project = await resolveProjectId(options.project) const envVars = await withSpinner('Fetching environment variables...', async () => { const { data, error } = await getServiceEnvironmentVariables({ client, - path: { id, project_id: projectId }, + path: { id, project_id: project.id }, }) if (error) { throw new Error(getErrorMessage(error)) @@ -932,16 +1107,12 @@ async function envVarAction(options: EnvVarOptions): Promise<void> { return } - const projectId = parseInt(options.projectId, 10) - if (isNaN(projectId)) { - warning('Invalid project ID') - return - } + const project = await resolveProjectId(options.project) const envVar = await withSpinner('Fetching environment variable...', async () => { const { data, error } = await getServiceEnvironmentVariable({ client, - path: { id, project_id: projectId, var_name: options.var }, + path: { id, project_id: project.id, var_name: options.var }, }) if (error) { throw new
Error(getErrorMessage(error)) diff --git a/apps/temps-cli/src/commands/tokens/index.ts b/apps/temps-cli/src/commands/tokens/index.ts index ba499ec8..13d7bdcc 100644 --- a/apps/temps-cli/src/commands/tokens/index.ts +++ b/apps/temps-cli/src/commands/tokens/index.ts @@ -1,6 +1,7 @@ import type { Command } from 'commander' import { requireAuth, config, credentials } from '../../config/store.js' -import { setupClient, getErrorMessage } from '../../lib/api-client.js' +import { setupClient, normalizeApiUrl, getWebUrl, getErrorMessage } from '../../lib/api-client.js' +import { requireProjectSlug } from '../../config/resolve-project.js' import { colors, header, icons, info, json, keyValue, newline, success, warning, error as errorOutput } from '../../ui/output.js' import { promptConfirm, promptSelect, promptText } from '../../ui/prompts.js' import { withSpinner } from '../../ui/spinner.js' @@ -48,7 +49,7 @@ const PERMISSIONS = [ ] interface CreateOptions { - project: string + project?: string name?: string permissions?: string expiresIn?: string } @@ -56,18 +57,18 @@ interface ListOptions { - project: string + project?: string json?: boolean } interface ShowOptions { - project: string + project?: string id: string json?: boolean } interface RemoveOptions { - project: string + project?: string id: string force?: boolean yes?: boolean @@ -78,7 +79,7 @@ async function makeRequest<T>( path: string, body?: unknown ): Promise<T> { - const apiUrl = config.get('apiUrl') + const apiUrl = normalizeApiUrl(config.get('apiUrl')) const apiKey = await credentials.getApiKey() const response = await fetch(`${apiUrl}${path}`, { @@ -102,34 +103,25 @@ return response.json() as Promise<T> } -async function resolveProjectId(projectIdentifier: string): Promise<number> { +async function resolveProjectId(projectSlug: string): Promise<number> { // Try to parse as number first - const numId = parseInt(projectIdentifier, 10) + const numId = parseInt(projectSlug, 10) if
(!isNaN(numId)) { return numId } // Otherwise, look up by slug - const apiUrl = config.get('apiUrl') - const apiKey = await credentials.getApiKey() - - const response = await fetch(`${apiUrl}/api/projects?page_size=100`, { - headers: { - 'Authorization': `Bearer ${apiKey}`, - }, - }) - - if (!response.ok) { - throw new Error('Failed to fetch projects') - } + const projects = await makeRequest<{ projects?: Array<{ slug: string; id: number }> }>( + 'GET', + `/projects?page_size=100` + ) - const data = await response.json() as { projects?: Array<{ slug: string; id: number }> } - const project = data.projects?.find((p) => - p.slug === projectIdentifier || p.slug.toLowerCase() === projectIdentifier.toLowerCase() + const project = projects.projects?.find((p) => + p.slug === projectSlug || p.slug.toLowerCase() === projectSlug.toLowerCase() ) if (!project) { - throw new Error(`Project "${projectIdentifier}" not found`) + throw new Error(`Project "${projectSlug}" not found`) } return project.id @@ -145,7 +137,7 @@ export function registerTokensCommands(program: Command): void { .command('list') .alias('ls') .description('List deployment tokens for a project') - .requiredOption('-p, --project <project>', 'Project slug or ID') + .option('-p, --project <project>', 'Project slug or ID') .option('--json', 'Output in JSON format') .action(listTokensAction) @@ -153,7 +145,7 @@ export function registerTokensCommands(program: Command): void { .command('create') .alias('add') .description('Create a new deployment token') - .requiredOption('-p, --project <project>', 'Project slug or ID') + .option('-p, --project <project>', 'Project slug or ID') .option('-n, --name <name>', 'Token name') .option('--permissions <permissions>', 'Comma-separated permissions (e.g., "visitors:enrich,emails:send" or "*" for full access)') .option('-e, --expires-in <days>', 'Expires in N days (7, 30, 90, 365, or "never")') @@ -164,7 +156,7 @@ export function registerTokensCommands(program: Command): void { .command('show') .alias('get') .description('Show deployment
token details') - .requiredOption('-p, --project <project>', 'Project slug or ID') + .option('-p, --project <project>', 'Project slug or ID') .requiredOption('--id <id>', 'Token ID') .option('--json', 'Output in JSON format') .action(showTokenAction) @@ -173,7 +165,7 @@ export function registerTokensCommands(program: Command): void { .command('delete') .alias('rm') .description('Delete a deployment token') - .requiredOption('-p, --project <project>', 'Project slug or ID') + .option('-p, --project <project>', 'Project slug or ID') .requiredOption('--id <id>', 'Token ID') .option('-f, --force', 'Skip confirmation') .option('-y, --yes', 'Skip confirmation (alias for --force)') @@ -190,14 +182,20 @@ async function listTokensAction(options: ListOptions): Promise<void> { await requireAuth() await setupClient() + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + const projectId = await withSpinner('Resolving project...', async () => { - return resolveProjectId(options.project) + return resolveProjectId(resolved.slug) }) const response = await withSpinner('Fetching deployment tokens...', async () => { return makeRequest( 'GET', - `/api/projects/${projectId}/deployment-tokens` + `/projects/${projectId}/deployment-tokens` ) }) @@ -213,7 +211,7 @@ async function listTokensAction(options: ListOptions): Promise<void> { if (tokensList.length === 0) { info('No deployment tokens found') - info(`Run: temps tokens create -p ${options.project} --name my-token -y`) + info(`Run: temps tokens create -p ${resolved.slug} --name my-token -y`) newline() return } @@ -236,8 +234,14 @@ async function createTokenAction(options: CreateOptions): Promise<void> { await requireAuth() await setupClient() + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + const projectId = await
withSpinner('Resolving project...', async () => { - return resolveProjectId(options.project) + return resolveProjectId(resolved.slug) }) let name: string @@ -317,7 +321,7 @@ async function createTokenAction(options: CreateOptions): Promise<void> { const result = await withSpinner('Creating deployment token...', async () => { return makeRequest( 'POST', - `/api/projects/${projectId}/deployment-tokens`, + `/projects/${projectId}/deployment-tokens`, { name, permissions, @@ -354,8 +358,14 @@ async function showTokenAction(options: ShowOptions): Promise<void> { await requireAuth() await setupClient() + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + const projectId = await withSpinner('Resolving project...', async () => { - return resolveProjectId(options.project) + return resolveProjectId(resolved.slug) }) const tokenId = parseInt(options.id, 10) @@ -367,7 +377,7 @@ const token = await withSpinner('Fetching token...', async () => { return makeRequest( 'GET', - `/api/projects/${projectId}/deployment-tokens/${tokenId}` + `/projects/${projectId}/deployment-tokens/${tokenId}` ) }) @@ -392,8 +402,14 @@ async function deleteTokenAction(options: RemoveOptions): Promise<void> { await requireAuth() await setupClient() + const resolved = await requireProjectSlug(options.project) + + if (resolved.source !== 'flag') { + info(`Using project ${colors.bold(resolved.slug)} (from ${resolved.source})`) + } + const projectId = await withSpinner('Resolving project...', async () => { - return resolveProjectId(options.project) + return resolveProjectId(resolved.slug) }) const tokenId = parseInt(options.id, 10) @@ -406,7 +422,7 @@ const token = await withSpinner('Fetching token...', async () => { return makeRequest( 'GET', -
`/api/projects/${projectId}/deployment-tokens/${tokenId}` + `/projects/${projectId}/deployment-tokens/${tokenId}` ) }) @@ -427,7 +443,7 @@ async function deleteTokenAction(options: RemoveOptions): Promise<void> { await withSpinner('Deleting token...', async () => { return makeRequest( 'DELETE', - `/api/projects/${projectId}/deployment-tokens/${tokenId}` + `/projects/${projectId}/deployment-tokens/${tokenId}` ) }) diff --git a/apps/temps-cli/src/commands/up/index.ts b/apps/temps-cli/src/commands/up/index.ts index 9983ab15..a0dc7e74 100644 --- a/apps/temps-cli/src/commands/up/index.ts +++ b/apps/temps-cli/src/commands/up/index.ts @@ -1,15 +1,22 @@ import type { Command } from 'commander' -import { requireAuth } from '../../config/store.js' -import { setupClient, client } from '../../lib/api-client.js' +import { requireAuth, config } from '../../config/store.js' +import { setupClient, client, getErrorMessage } from '../../lib/api-client.js' import { resolveProjectSlug } from '../../config/resolve-project.js' import { hasProjectConfig, writeProjectConfig } from '../../config/project-config.js' -import { deploy } from '../deploy/deploy.js' import { deployLocalImage } from '../deploy/deploy-local-image.js' import { runSetupWizard } from './setup-wizard.js' import { detectGitBranch } from '../../lib/detect-project.js' -import { promptConfirm } from '../../ui/prompts.js' -import { info, warning, newline, colors } from '../../ui/output.js' -import { getProjectBySlug } from '../../api/sdk.gen.js' +import { promptConfirm, promptSelect, promptText } from '../../ui/prompts.js' +import { startSpinner, succeedSpinner, failSpinner } from '../../ui/spinner.js' +import { info, warning, newline, colors, icons, box } from '../../ui/output.js' +import { + getProjectBySlug, + getEnvironments, + triggerProjectPipeline, + getProjectDeployments, +} from '../../api/sdk.gen.js' +import type { ProjectResponse, EnvironmentResponse } from '../../api/types.gen.js' +import { watchDeployment } from
'../../lib/deployment-watcher.jsx' interface UpOptions { project?: string @@ -50,22 +57,122 @@ async function up(projectArg: string | undefined, options: UpOptions): Promise 0) { + if (options.environment) { + const env = environments.find(e => e.name === options.environment) + if (env) { + environmentId = env.id + environmentName = env.name + } + } else if (!options.yes) { + const selectedEnv = await promptSelect({ + message: 'Select environment', + choices: environments.map((env) => ({ + name: env.name, + value: String(env.id), + description: env.is_preview ? 'Preview environment' : undefined, + })), + default: String(environments.find(e => e.name === 'production')?.id ?? environments[0]?.id ?? ''), + }) + environmentId = parseInt(selectedEnv, 10) + environmentName = environments.find(e => e.id === environmentId)?.name ?? 'production' + } else { + const prodEnv = environments.find(e => e.name === 'production') + if (prodEnv) { + environmentId = prodEnv.id + environmentName = prodEnv.name + } else if (environments[0]) { + environmentId = environments[0].id + environmentName = environments[0].name + } + } + } + + // ─── Deploy based on source type ──────────────────────────────────────── if (sourceType === 'manual' || sourceType === 'docker_image') { - // Manual/docker_image projects deploy via local image build + upload + // Show deployment preview + newline() + box( + [ + `Project: ${colors.bold(project.name)}`, + `Environment: ${colors.bold(environmentName)}`, + project.preset ? 
`Preset: ${colors.bold(project.preset)}` : null, + `Deploy: ${colors.bold('Manual (local image upload)')}`, + ] + .filter(Boolean) + .join('\n'), + `${icons.rocket} Deployment Preview` + ) + newline() + await deployLocalImage({ project: resolved.slug, environment: options.environment, @@ -73,7 +180,7 @@ async function up(projectArg: string | undefined, options: UpOptions): Promise setTimeout(r, 2000)) + } + + if (deploymentId) { + succeedSpinner(`Deployment #${deploymentId} found`) + const result = await watchDeployment({ + projectId: project.id, + deploymentId, + timeoutSecs: 600, + projectName: resolved.slug, + }) + + if (!result.success) { + process.exitCode = 1 + } + } else { + failSpinner('Could not locate the deployment to track') + info(`Check status: ${colors.muted('bunx @temps-sdk/cli status')}`) + } + } catch (err) { + failSpinner('Deployment failed') + throw err + } } // Offer to save config if it doesn't exist diff --git a/apps/temps-cli/src/lib/api-client.ts b/apps/temps-cli/src/lib/api-client.ts index d42ff553..40fb8c49 100644 --- a/apps/temps-cli/src/lib/api-client.ts +++ b/apps/temps-cli/src/lib/api-client.ts @@ -4,7 +4,7 @@ import { config, credentials } from '../config/store.js' /** * Setup the API client with the correct base URL and auth headers */ -function normalizeApiUrl(url: string): string { +export function normalizeApiUrl(url: string): string { // Remove trailing slash let normalized = url.replace(/\/+$/, '') // Ensure /api suffix if not already present @@ -31,6 +31,13 @@ export async function setupClient(): Promise { }) } +/** + * Get the web dashboard base URL (API URL without /api suffix) + */ +export function getWebUrl(): string { + return config.get('apiUrl').replace(/\/+$/, '').replace(/\/api$/, '') +} + /** * Extract error message from API error response */ diff --git a/apps/temps-cli/src/lib/deployment-watcher.tsx b/apps/temps-cli/src/lib/deployment-watcher.tsx index 3aa0f1ca..cd0a4f09 100644 --- 
a/apps/temps-cli/src/lib/deployment-watcher.tsx +++ b/apps/temps-cli/src/lib/deployment-watcher.tsx @@ -1,7 +1,8 @@ -import { useState, useEffect } from 'react' +import { useState, useEffect, useCallback } from 'react' import { render, Box, Text, Newline } from 'ink' import Spinner from 'ink-spinner' import { config, credentials } from '../config/store.js' +import { normalizeApiUrl, getWebUrl } from './api-client.js' interface DeploymentEnvironment { id: number @@ -62,6 +63,10 @@ interface JobState { lastLogLine: number } +const TERMINAL_STATUSES = ['success', 'completed', 'deployed', 'failed', 'error', 'cancelled'] +const SUCCESS_STATUSES = ['success', 'completed', 'deployed'] +const FAILURE_STATUSES = ['failed', 'error', 'cancelled'] + // Convert API timestamp to milliseconds function toMs(timestamp: number): number { if (timestamp < 946684800000) { @@ -94,7 +99,7 @@ function StatusIcon({ status }: { status: string }) { case 'success': case 'completed': case 'deployed': - return + return case 'failed': case 'error': return @@ -142,8 +147,8 @@ function LogEntryRow({ entry }: { entry: LogEntry }) { break } - // Clean up the message - const message = entry.message.replace(/^[✅❌⏳📦📋📂🚀📍🔄⬇️🐳🏷️]\s*/, '') + // Clean up the message — strip leading emoji + const message = entry.message.replace(/^[\u{1F300}-\u{1F9FF}\u{2600}-\u{26FF}\u{2700}-\u{27BF}]\s*/u, '') return ( @@ -153,15 +158,15 @@ function LogEntryRow({ entry }: { entry: LogEntry }) { } // Job row component -function JobRow({ jobState, showAllLogs }: { jobState: JobState; showAllLogs?: boolean }) { +function JobRow({ jobState, isFinished }: { jobState: JobState; isFinished?: boolean }) { const { job, logs } = jobState const duration = job.started_at ? formatDuration(job.started_at, job.finished_at ?? undefined) : '' const statusColor = getStatusColor(job.status) - // Show more logs for running jobs, fewer for completed - const maxLogs = showAllLogs ? 20 : (job.status === 'running' ? 
10 : 5) + // Show fewer logs when finished to keep output compact + const maxLogs = isFinished ? 3 : (job.status === 'running' ? 10 : 5) const recentLogs = logs.slice(-maxLogs) return ( @@ -170,19 +175,18 @@ {job.name} {duration && ({duration})} - {logs.length > 0 && [{logs.length} logs]} {/* Error message */} - {(job.status === 'failed' || job.status === 'error') && job.error_message && ( + {FAILURE_STATUSES.includes(job.status) && job.error_message && ( Error: {job.error_message} )} - {/* Logs - always show if there are any */} - {recentLogs.length > 0 && ( - + {/* Logs — show during progress, compact on finish */} + {recentLogs.length > 0 && !isFinished && ( + {recentLogs.map((log, i) => ( + ))} @@ -197,7 +201,7 @@ function DeploymentWatcher({ projectId, deploymentId, timeoutSecs, - projectName: _projectName, + projectName, apiUrl, apiKey, onComplete, @@ -207,9 +211,11 @@ const [startTime] = useState(Date.now()) const [elapsed, setElapsed] = useState('0s') const [error, setError] = useState<string | null>(null) + const [result, setResult] = useState<WatchDeploymentResult | null>(null) // Update elapsed time useEffect(() => { + if (result) return // Stop updating once finished const timer = setInterval(() => { const seconds = Math.floor((Date.now() - startTime) / 1000) if (seconds < 60) { @@ -222,147 +228,148 @@ }, 1000) return () => clearInterval(timer) - }, [startTime]) + }, [startTime, result]) + + // Signal completion after result is rendered + useEffect(() => { + if (!result) return + const timer = setTimeout(() => onComplete(result), 200) + return () => clearTimeout(timer) + }, [result, onComplete]) + + // Fetch jobs helper + const fetchJobs = useCallback(async (currentJobStates: Map<string, JobState>): Promise<Map<string, JobState>> => { + const jobsRes = await fetch( + `${apiUrl}/projects/${projectId}/deployments/${deploymentId}/jobs`, + { headers: { Authorization: `Bearer ${apiKey}` } } + ) + + if
(!jobsRes.ok) return currentJobStates + + const jobsData = (await jobsRes.json()) as { jobs: DeploymentJobResponse[] } + const jobs = jobsData.jobs || [] + jobs.sort((a, b) => a.id - b.id) + + const newJobStates = new Map(currentJobStates) + + for (const job of jobs) { + let state = newJobStates.get(job.job_id) + if (!state) { + state = { job, logs: [], lastLogLine: 0 } + } else { + state = { ...state, job } + } + + // Fetch logs for jobs that have started + if (job.status !== 'pending' && job.status !== 'queued') { + try { + const logsRes = await fetch( + `${apiUrl}/projects/${projectId}/deployments/${deploymentId}/jobs/${job.id}/logs`, + { headers: { Authorization: `Bearer ${apiKey}` } } + ) + + if (logsRes.ok) { + const logsText = await logsRes.text() + if (logsText.trim()) { + const newLogs: LogEntry[] = [] + for (const line of logsText.trim().split('\n')) { + if (!line.trim()) continue + try { + const entry = JSON.parse(line) as LogEntry + if (entry.line > state.lastLogLine) { + newLogs.push(entry) + state.lastLogLine = entry.line + } + } catch { /* skip malformed log lines */ } + } + if (newLogs.length > 0) { + state = { ...state, logs: [...state.logs, ...newLogs] } + } + } + } + } catch { /* skip log fetch errors */ } + } + + newJobStates.set(job.job_id, state) + } + + return newJobStates + }, [apiUrl, apiKey, projectId, deploymentId]) // Main polling effect useEffect(() => { let cancelled = false const timeoutMs = timeoutSecs * 1000 + let latestJobStates = jobStates async function poll() { - while (!cancelled && Date.now() - startTime < timeoutMs) { try { - // Fetch deployment + // 1. 
Fetch deployment status const deploymentRes = await fetch( `${apiUrl}/projects/${projectId}/deployments/${deploymentId}`, { headers: { Authorization: `Bearer ${apiKey}` } } ) + let dep: DeploymentResponse | null = null + if (deploymentRes.ok) { - const dep = (await deploymentRes.json()) as DeploymentResponse + dep = (await deploymentRes.json()) as DeploymentResponse setDeployment(dep) - - // Check terminal states - const isDeploymentTerminal = ['success', 'completed', 'deployed', 'failed', 'error', 'cancelled'].includes(dep.status) - - if (isDeploymentTerminal) { - // For failed deployments, exit immediately - if (['failed', 'error', 'cancelled'].includes(dep.status)) { - onComplete({ - success: false, - deployment: dep, - error: dep.cancelled_reason || 'Deployment failed', - }) - return - } - - // For successful deployments: - // - If URL is available, deployment is live - // - Otherwise check if "Mark Deployment Complete" job is done - const markCompleteJob = Array.from(jobStates.values()).find( - js => js.job.name.toLowerCase().includes('mark deployment complete') - ) - const isMarkCompleteDone = markCompleteJob && - ['success', 'completed'].includes(markCompleteJob.job.status) - - if (dep.url || isMarkCompleteDone) { - onComplete({ success: true, deployment: dep }) - return - } - } } else { - // Show error in UI const errorText = await deploymentRes.text() setError(`API Error ${deploymentRes.status}: ${errorText.substring(0, 200)}`) } - // Fetch jobs - const jobsRes = await fetch( - `${apiUrl}/projects/${projectId}/deployments/${deploymentId}/jobs`, - { headers: { Authorization: `Bearer ${apiKey}` } } - ) - - if (jobsRes.ok) { - const jobsData = (await jobsRes.json()) as { jobs: DeploymentJobResponse[] } - const jobs = jobsData.jobs || [] - jobs.sort((a, b) => a.id - b.id) - - // Update job states and fetch logs - const newJobStates = new Map(jobStates) - - for (const job of jobs) { - let state = newJobStates.get(job.job_id) - if (!state) { - state = { job, 
logs: [], lastLogLine: 0 } - } else { - state = { ...state, job } - } - - // Fetch logs for jobs that have started - if (job.status !== 'pending' && job.status !== 'queued') { - try { - const logsRes = await fetch( - `${apiUrl}/projects/${projectId}/deployments/${deploymentId}/jobs/${job.id}/logs`, - { headers: { Authorization: `Bearer ${apiKey}` } } - ) - - if (logsRes.ok) { - const logsText = await logsRes.text() - if (logsText.trim()) { - const newLogs: LogEntry[] = [] - for (const line of logsText.trim().split('\n')) { - if (!line.trim()) continue - try { - const entry = JSON.parse(line) as LogEntry - if (entry.line > state.lastLogLine) { - newLogs.push(entry) - state.lastLogLine = entry.line - } - } catch {} - } - if (newLogs.length > 0) { - state = { ...state, logs: [...state.logs, ...newLogs] } - } - } - } - } catch {} - } + // 2. Always fetch jobs (so final states are captured) + latestJobStates = await fetchJobs(latestJobStates) + if (!cancelled) { + setJobStates(latestJobStates) + } - newJobStates.set(job.job_id, state) + // 3. Check terminal state AFTER jobs are updated + if (dep && TERMINAL_STATUSES.includes(dep.status)) { + if (FAILURE_STATUSES.includes(dep.status)) { + setResult({ + success: false, + deployment: dep, + error: dep.cancelled_reason || 'Deployment failed', + }) + return } - setJobStates(newJobStates) + // Success — deployment is in a terminal success state + setResult({ success: true, deployment: dep }) + return } - await new Promise((r) => setTimeout(r, 1000)) + await new Promise((r) => setTimeout(r, 1500)) } catch (err) { setError(`Exception: ${err instanceof Error ? 
err.message : String(err)}`) - await new Promise((r) => setTimeout(r, 1000)) + await new Promise((r) => setTimeout(r, 2000)) } } // Timeout if (!cancelled) { - onComplete({ success: false, error: 'Timeout' }) + setResult({ success: false, error: 'Timeout waiting for deployment to complete' }) } } poll() - - return () => { - cancelled = true - } - }, [projectId, deploymentId, timeoutSecs, startTime, onComplete]) + return () => { cancelled = true } + }, []) // eslint-disable-line react-hooks/exhaustive-deps const sortedJobs = Array.from(jobStates.values()).sort((a, b) => a.job.id - b.job.id) const statusColor = deployment ? getStatusColor(deployment.status) : 'gray' + const isFinished = !!result + const webUrl = getWebUrl() return ( {/* Header */} - {' '}🚀 Deployment Progress + {' '}🚀 Deployment #{deploymentId} @@ -385,7 +392,7 @@ function DeploymentWatcher({ {/* Error display */} - {error && ( + {error && !result && ( Error: {error} @@ -395,80 +402,61 @@ function DeploymentWatcher({ {/* Jobs */} {sortedJobs.map((jobState) => ( - + ))} - {sortedJobs.length === 0 && deployment && ( + {sortedJobs.length === 0 && deployment && !result && ( Waiting for jobs... )} - - - ) -} - -// Result display component -function DeploymentResult({ - result, - projectName, -}: { - result: WatchDeploymentResult - projectName?: string -}) { - if (result.success) { - const deployment = result.deployment - const envDomains = deployment?.environment?.domains || [] - // Domain might already include protocol, check before adding https:// - const firstDomain = envDomains[0] - const envUrl = firstDomain - ? (firstDomain.startsWith('http') ? firstDomain : `https://${firstDomain}`) - : null - - return ( - - - {' '}✓ Deployment completed successfully! 
- - - - Deployment ID: - {deployment?.id} - - {deployment?.url && ( - - Deployment URL: - {deployment.url} - - )} - {envUrl && ( - - Environment URL: - {envUrl} - - )} - - - ) - } - - return ( - - - {' '}✗ Deployment failed - - {result.error && ( - - Reason: {result.error} - + {/* Result summary */} + {result && ( + <> + + {result.success ? ( + + + ✓ Deployment completed successfully! + + {deployment?.url && ( + + URL: + {deployment.url} + + )} + {deployment?.environment?.domains?.[0] && ( + + Domain: + + {deployment.environment.domains[0].startsWith('http') + ? deployment.environment.domains[0] + : `https://${deployment.environment.domains[0]}`} + + + )} + + ) : ( + + + ✗ Deployment failed + + {result.error && ( + + {result.error} + + )} + + )} + {projectName && ( + + Dashboard: {webUrl}/projects/{projectName}/deployments + + )} + + )} - {projectName && ( - - View full logs: temps logs {projectName} - - )} - ) } @@ -480,7 +468,7 @@ export async function watchDeployment( options: WatchDeploymentOptions ): Promise { // Fetch credentials before rendering to avoid async issues in React - const apiUrl = config.get('apiUrl') + const apiUrl = normalizeApiUrl(config.get('apiUrl')) const apiKey = await credentials.getApiKey() || '' if (!apiKey) { @@ -488,32 +476,18 @@ export async function watchDeployment( } return new Promise((resolve) => { - let instance: ReturnType | null = null - - const handleComplete = (res: WatchDeploymentResult) => { - // Unmount the watcher and show the result - if (instance) { - instance.unmount() - } - - // Render the result - const resultInstance = render( - - ) - - // Wait a bit then unmount and resolve - setTimeout(() => { - resultInstance.unmount() - resolve(res) - }, 100) - } - - instance = render( + const instance = render( { + // Give Ink time to render the final state, then unmount + setTimeout(() => { + instance.unmount() + resolve(res) + }, 300) + }} /> ) }) diff --git a/apps/temps-cli/src/lib/detect-project.ts 
b/apps/temps-cli/src/lib/detect-project.ts index ed4db1db..ea641af9 100644 --- a/apps/temps-cli/src/lib/detect-project.ts +++ b/apps/temps-cli/src/lib/detect-project.ts @@ -273,6 +273,8 @@ export function isGitRepo(dir?: string): boolean { } } +// ─── Git Commit Detection ──────────────────────────────────────────────────── + // ─── Service Hints Detection ───────────────────────────────────────────────── import type { ServiceTypeRoute } from '../api/types.gen.js' diff --git a/apps/temps-cli/src/lib/env-file.ts b/apps/temps-cli/src/lib/env-file.ts new file mode 100644 index 00000000..696475af --- /dev/null +++ b/apps/temps-cli/src/lib/env-file.ts @@ -0,0 +1,79 @@ +/** + * Shared .env file parsing utility. + * Handles comments, empty lines, quoted values, and escape sequences. + */ +import { existsSync, readFileSync } from 'node:fs' +import { resolve } from 'node:path' + +/** + * Parse a .env file content string into a key-value record. + * Supports: comments (#), empty lines, KEY=VALUE, single/double quoted values, + * escape sequences (\n, \", \'). + */ +export function parseEnvFile(content: string): Record<string, string> { + const variables: Record<string, string> = {} + + for (const line of content.split('\n')) { + const trimmed = line.trim() + + // Skip empty lines and comments + if (!trimmed || trimmed.startsWith('#')) continue + + // Parse KEY=VALUE + const match = trimmed.match(/^([^=]+)=(.*)$/) + if (!match) continue + + const [, key, rawValue] = match + if (!key || rawValue === undefined) continue + + let value = rawValue.trim() + + // Handle quoted values + if ( + (value.startsWith('"') && value.endsWith('"')) || + (value.startsWith("'") && value.endsWith("'")) + ) { + value = value + .slice(1, -1) + .replace(/\\n/g, '\n') + .replace(/\\"/g, '"') + .replace(/\\'/g, "'") + } + + variables[key.trim()] = value + } + + return variables +} + +/** + * Read and parse a .env file from disk. + * Returns null if the file doesn't exist.
+ */
+export function readEnvFile(filePath: string): Record<string, string> | null {
+  const resolved = resolve(filePath)
+  if (!existsSync(resolved)) {
+    return null
+  }
+  const content = readFileSync(resolved, 'utf-8')
+  return parseEnvFile(content)
+}
+
+/**
+ * Look for common .env file names in a directory.
+ * Returns the names of the files that exist, ordered by priority.
+ */
+export function findEnvFiles(dir?: string): string[] {
+  const cwd = dir ?? process.cwd()
+  const candidates = ['.env', '.env.local', '.env.development', '.env.example']
+  const found: string[] = []
+
+  for (const name of candidates) {
+    const fullPath = resolve(cwd, name)
+    if (existsSync(fullPath)) {
+      found.push(name)
+    }
+  }
+
+  return found
+}
diff --git a/crates/temps-analytics-events/src/services/events_service.rs b/crates/temps-analytics-events/src/services/events_service.rs
index c742998f..9de420f0 100644
--- a/crates/temps-analytics-events/src/services/events_service.rs
+++ b/crates/temps-analytics-events/src/services/events_service.rs
@@ -497,6 +497,11 @@ impl AnalyticsEventsService {
                 "events e LEFT JOIN ip_geolocations ig ON e.ip_geolocation_id = ig.id",
                 format!("COALESCE(ig.{}, 'Unknown')", group_by_str),
             )
+        } else if is_referrer_column {
+            (
+                "events e",
+                format!("COALESCE(e.{}, 'Direct')", group_by_str),
+            )
         } else {
             ("events e", format!("e.{}", group_by_str))
         };
@@ -650,11 +655,17 @@ impl AnalyticsEventsService {
         // Check if we need to join with ip_geolocations
         let is_geo_column = matches!(group_by_str, "country" | "region" | "city");
+        let is_referrer_column = group_by_str == "referrer_hostname";
         let (from_clause, select_column) = if is_geo_column {
             (
                 "events e LEFT JOIN ip_geolocations ig ON e.ip_geolocation_id = ig.id",
                 format!("COALESCE(ig.{}, 'Unknown')", group_by_str),
             )
+        } else if is_referrer_column {
+            (
+                "events e",
+                format!("COALESCE(e.{}, 'Direct')", group_by_str),
+            )
         } else {
             ("events e", format!("e.{}", group_by_str))
         };
@@ -1047,6 +1058,10 @@ WHERE project_id = $1
.collect(); // Query 3: Hourly sparkline data per project (current period — raw events for accuracy) + // Uses generate_series to produce the full hour grid and LEFT JOINs actual data. + // This guarantees every project gets a row for every hour in the range, even when + // a project has events in only a single bucket (time_bucket_gapfill inside + // CROSS JOIN LATERAL fails to fill gaps in that edge case). let gapfill_start_idx = project_ids.len() + 3; let gapfill_end_idx = gapfill_start_idx + 1; @@ -1054,21 +1069,27 @@ WHERE project_id = $1 r#" SELECT p.project_id, - sub.bucket::timestamptz as bucket, - COALESCE(sub.count, 0) as count + h.bucket, + COALESCE(d.count, 0) as count FROM unnest(ARRAY[{in_clause}]) AS p(project_id) - CROSS JOIN LATERAL ( + CROSS JOIN generate_series( + date_trunc('hour', ${gapfill_start_idx}::timestamptz), + date_trunc('hour', ${gapfill_end_idx}::timestamptz), + '1 hour'::interval + ) AS h(bucket) + LEFT JOIN ( SELECT - time_bucket_gapfill('1 hour', timestamp, ${gapfill_start_idx}::timestamptz, ${gapfill_end_idx}::timestamptz) as bucket, - COALESCE(COUNT(DISTINCT visitor_id) FILTER (WHERE visitor_id IS NOT NULL), 0) as count + project_id, + date_trunc('hour', timestamp) as bucket, + COUNT(DISTINCT visitor_id) FILTER (WHERE visitor_id IS NOT NULL) as count FROM events - WHERE project_id = p.project_id - AND timestamp >= $1 + WHERE timestamp >= $1 AND timestamp <= $2 + AND project_id IN ({in_clause}) AND event_type = 'page_view' - GROUP BY bucket - ) sub - ORDER BY p.project_id, sub.bucket ASC + GROUP BY project_id, date_trunc('hour', timestamp) + ) d ON d.project_id = p.project_id AND d.bucket = h.bucket + ORDER BY p.project_id, h.bucket ASC "#, ); diff --git a/crates/temps-analytics/src/analytics.rs b/crates/temps-analytics/src/analytics.rs index b5bd2386..6318796b 100644 --- a/crates/temps-analytics/src/analytics.rs +++ b/crates/temps-analytics/src/analytics.rs @@ -315,7 +315,10 @@ impl Analytics for AnalyticsService { 
ig.country_code, ig.timezone, ig.is_eu, - last_event.page_path as current_page + last_event.page_path as current_page, + v.first_referrer, + v.first_referrer_hostname, + v.first_channel FROM visitor v LEFT JOIN ip_geolocations ig ON v.ip_address_id = ig.id LEFT JOIN LATERAL ( @@ -361,6 +364,9 @@ impl Analytics for AnalyticsService { timezone: Option, is_eu: Option, current_page: Option, + first_referrer: Option, + first_referrer_hostname: Option, + first_channel: Option, } let results = VisitorResult::find_by_statement(Statement::from_sql_and_values( @@ -395,6 +401,9 @@ impl Analytics for AnalyticsService { timezone: r.timezone, is_eu: r.is_eu, current_page: r.current_page, + first_referrer: r.first_referrer, + first_referrer_hostname: r.first_referrer_hostname, + first_channel: r.first_channel, }) .collect(); @@ -651,7 +660,10 @@ impl Analytics for AnalyticsService { ig.country, ig.country_code, ig.timezone, - ig.is_eu + ig.is_eu, + v.first_referrer, + v.first_referrer_hostname, + v.first_channel FROM visitor v LEFT JOIN ip_geolocations ig ON v.ip_address_id = ig.id WHERE v.id = $1 @@ -679,6 +691,9 @@ impl Analytics for AnalyticsService { country_code: Option, timezone: Option, is_eu: Option, + first_referrer: Option, + first_referrer_hostname: Option, + first_channel: Option, } let result = DetailResult::find_by_statement(Statement::from_sql_and_values( @@ -710,6 +725,9 @@ impl Analytics for AnalyticsService { country_code: r.country_code, timezone: r.timezone, is_eu: r.is_eu, + first_referrer: r.first_referrer, + first_referrer_hostname: r.first_referrer_hostname, + first_channel: r.first_channel, })) } @@ -2235,6 +2253,9 @@ WHERE project_id = $1 country_code: geo_opt.as_ref().and_then(|g| g.country_code.clone()), timezone: geo_opt.as_ref().and_then(|g| g.timezone.clone()), is_eu: geo_opt.as_ref().map(|g| g.is_eu), + first_referrer: visitor_model.first_referrer, + first_referrer_hostname: visitor_model.first_referrer_hostname, + first_channel: 
visitor_model.first_channel, }; Ok(Some(response)) } @@ -2292,6 +2313,9 @@ WHERE project_id = $1 country_code: geo_opt.as_ref().and_then(|g| g.country_code.clone()), timezone: geo_opt.as_ref().and_then(|g| g.timezone.clone()), is_eu: geo_opt.as_ref().map(|g| g.is_eu), + first_referrer: visitor_model.first_referrer, + first_referrer_hostname: visitor_model.first_referrer_hostname, + first_channel: visitor_model.first_channel, }; Ok(Some(response)) } @@ -2327,7 +2351,10 @@ WHERE project_id = $1 ig.country_code, ig.timezone, ig.is_eu, - last_event.page_path as current_page + last_event.page_path as current_page, + v.first_referrer, + v.first_referrer_hostname, + v.first_channel FROM visitor v LEFT JOIN ip_geolocations ig ON v.ip_address_id = ig.id LEFT JOIN LATERAL ( @@ -2366,6 +2393,9 @@ WHERE project_id = $1 timezone: Option, is_eu: Option, current_page: Option, + first_referrer: Option, + first_referrer_hostname: Option, + first_channel: Option, } let rows = LiveVisitorRow::find_by_statement(Statement::from_sql_and_values( @@ -2404,6 +2434,9 @@ WHERE project_id = $1 timezone: row.timezone, is_eu: row.is_eu, current_page: row.current_page, + first_referrer: row.first_referrer, + first_referrer_hostname: row.first_referrer_hostname, + first_channel: row.first_channel, }) .collect(); @@ -3581,6 +3614,507 @@ WHERE project_id = $1 total_sessions, }) } + + /// Get detailed analytics for a specific event name + async fn get_event_detail( + &self, + project_id: i32, + event_name: &str, + start_date: UtcDateTime, + end_date: UtcDateTime, + environment_id: Option, + bucket_interval: Option<&str>, + ) -> Result { + // Determine bucket interval based on date range + let duration = end_date - start_date; + let interval = bucket_interval.unwrap_or_else(|| { + if duration.num_days() <= 1 { + "hour" + } else if duration.num_days() <= 31 { + "day" + } else if duration.num_days() <= 180 { + "week" + } else { + "month" + } + }); + + let (pg_interval, date_trunc_unit) = match interval 
{ + "hour" => ("1 hour", "hour"), + "day" => ("1 day", "day"), + "week" => ("1 week", "week"), + "month" => ("1 month", "month"), + _ => ("1 day", "day"), + }; + + let env_filter = environment_id + .map(|id| format!("AND e.environment_id = {}", id)) + .unwrap_or_default(); + + // 1. Get summary stats + let stats_sql = format!( + r#" + SELECT + COUNT(*) as total_count, + COUNT(DISTINCT e.visitor_id) as unique_visitors, + COUNT(DISTINCT e.session_id) as unique_sessions + FROM events e + WHERE e.project_id = $1 + AND COALESCE(e.event_name, e.event_type) = $2 + AND e.timestamp >= $3 + AND e.timestamp < $4 + {} + "#, + env_filter + ); + + #[derive(FromQueryResult)] + struct EventStats { + total_count: i64, + unique_visitors: i64, + unique_sessions: i64, + } + + let stats = EventStats::find_by_statement(Statement::from_sql_and_values( + DatabaseBackend::Postgres, + &stats_sql, + vec![ + project_id.into(), + event_name.into(), + start_date.into(), + end_date.into(), + ], + )) + .one(self.db.as_ref()) + .await? + .unwrap_or(EventStats { + total_count: 0, + unique_visitors: 0, + unique_sessions: 0, + }); + + // 2. 
Get time series data + let activity_sql = format!( + r#" + WITH time_buckets AS ( + SELECT generate_series( + date_trunc('{date_trunc}', $3::timestamptz), + date_trunc('{date_trunc}', $4::timestamptz), + '{pg_interval}'::interval + ) AS bucket + ), + event_activity AS ( + SELECT + date_trunc('{date_trunc}', e.timestamp) as bucket, + COUNT(*) as count, + COUNT(DISTINCT e.visitor_id) as unique_visitors + FROM events e + WHERE e.project_id = $1 + AND COALESCE(e.event_name, e.event_type) = $2 + AND e.timestamp >= $3 + AND e.timestamp < $4 + {env_filter} + GROUP BY date_trunc('{date_trunc}', e.timestamp) + ) + SELECT + tb.bucket::timestamptz as timestamp, + COALESCE(ea.count, 0) as count, + COALESCE(ea.unique_visitors, 0) as unique_visitors + FROM time_buckets tb + LEFT JOIN event_activity ea ON tb.bucket = ea.bucket + ORDER BY tb.bucket + "#, + date_trunc = date_trunc_unit, + pg_interval = pg_interval, + env_filter = env_filter, + ); + + #[derive(FromQueryResult)] + struct ActivityBucket { + timestamp: UtcDateTime, + count: i64, + unique_visitors: i64, + } + + let activity_results = ActivityBucket::find_by_statement(Statement::from_sql_and_values( + DatabaseBackend::Postgres, + &activity_sql, + vec![ + project_id.into(), + event_name.into(), + start_date.into(), + end_date.into(), + ], + )) + .all(self.db.as_ref()) + .await?; + + let activity_over_time: Vec = + activity_results + .into_iter() + .map(|b| crate::types::responses::EventActivityBucket { + timestamp: b.timestamp, + count: b.count, + unique_visitors: b.unique_visitors, + }) + .collect(); + + // 3. 
Get top referrers + let referrers_sql = format!( + r#" + WITH referrer_stats AS ( + SELECT + COALESCE(NULLIF(e.referrer_hostname, ''), 'Direct') as referrer, + COUNT(*) as count + FROM events e + WHERE e.project_id = $1 + AND COALESCE(e.event_name, e.event_type) = $2 + AND e.timestamp >= $3 + AND e.timestamp < $4 + {} + GROUP BY COALESCE(NULLIF(e.referrer_hostname, ''), 'Direct') + ), + total AS ( + SELECT SUM(count) as total_count FROM referrer_stats + ) + SELECT + rs.referrer, + rs.count, + CASE WHEN t.total_count > 0 + THEN rs.count::float / t.total_count::float * 100 + ELSE 0 END as percentage + FROM referrer_stats rs + CROSS JOIN total t + ORDER BY rs.count DESC + LIMIT 20 + "#, + env_filter + ); + + #[derive(FromQueryResult)] + struct ReferrerResult { + referrer: String, + count: i64, + percentage: f64, + } + + let referrer_results = ReferrerResult::find_by_statement(Statement::from_sql_and_values( + DatabaseBackend::Postgres, + &referrers_sql, + vec![ + project_id.into(), + event_name.into(), + start_date.into(), + end_date.into(), + ], + )) + .all(self.db.as_ref()) + .await?; + + let referrers: Vec = referrer_results + .into_iter() + .map(|r| crate::types::responses::EventReferrerStats { + referrer: r.referrer, + count: r.count, + percentage: r.percentage, + }) + .collect(); + + // 4. 
Get top countries + let countries_sql = format!( + r#" + WITH country_stats AS ( + SELECT + COALESCE(ig.country, 'Unknown') as country, + ig.country_code, + COUNT(*) as count + FROM events e + LEFT JOIN ip_geolocations ig ON e.ip_geolocation_id = ig.id + WHERE e.project_id = $1 + AND COALESCE(e.event_name, e.event_type) = $2 + AND e.timestamp >= $3 + AND e.timestamp < $4 + {} + GROUP BY COALESCE(ig.country, 'Unknown'), ig.country_code + ), + total AS ( + SELECT SUM(count) as total_count FROM country_stats + ) + SELECT + cs.country, + cs.country_code, + cs.count, + CASE WHEN t.total_count > 0 + THEN cs.count::float / t.total_count::float * 100 + ELSE 0 END as percentage + FROM country_stats cs + CROSS JOIN total t + ORDER BY cs.count DESC + LIMIT 30 + "#, + env_filter + ); + + #[derive(FromQueryResult)] + struct CountryResult { + country: String, + country_code: Option, + count: i64, + percentage: f64, + } + + let country_results = CountryResult::find_by_statement(Statement::from_sql_and_values( + DatabaseBackend::Postgres, + &countries_sql, + vec![ + project_id.into(), + event_name.into(), + start_date.into(), + end_date.into(), + ], + )) + .all(self.db.as_ref()) + .await?; + + let countries: Vec = country_results + .into_iter() + .map(|c| crate::types::responses::EventCountryStats { + country: c.country, + country_code: c.country_code, + count: c.count, + percentage: c.percentage, + }) + .collect(); + + // 5. 
Get top browsers + let browsers_sql = format!( + r#" + WITH browser_stats AS ( + SELECT + COALESCE(e.browser, 'Unknown') as browser, + COUNT(*) as count + FROM events e + WHERE e.project_id = $1 + AND COALESCE(e.event_name, e.event_type) = $2 + AND e.timestamp >= $3 + AND e.timestamp < $4 + {} + GROUP BY COALESCE(e.browser, 'Unknown') + ), + total AS ( + SELECT SUM(count) as total_count FROM browser_stats + ) + SELECT + bs.browser, + bs.count, + CASE WHEN t.total_count > 0 + THEN bs.count::float / t.total_count::float * 100 + ELSE 0 END as percentage + FROM browser_stats bs + CROSS JOIN total t + ORDER BY bs.count DESC + LIMIT 20 + "#, + env_filter + ); + + #[derive(FromQueryResult)] + struct BrowserResult { + browser: String, + count: i64, + percentage: f64, + } + + let browser_results = BrowserResult::find_by_statement(Statement::from_sql_and_values( + DatabaseBackend::Postgres, + &browsers_sql, + vec![ + project_id.into(), + event_name.into(), + start_date.into(), + end_date.into(), + ], + )) + .all(self.db.as_ref()) + .await?; + + let browsers: Vec = browser_results + .into_iter() + .map(|b| crate::types::responses::EventBrowserStats { + browser: b.browser, + count: b.count, + percentage: b.percentage, + }) + .collect(); + + Ok(crate::types::responses::EventDetailResponse { + event_name: event_name.to_string(), + total_count: stats.total_count, + unique_visitors: stats.unique_visitors, + unique_sessions: stats.unique_sessions, + activity_over_time, + referrers, + countries, + browsers, + bucket_interval: interval.to_string(), + }) + } + + /// Get paginated list of visitors who triggered a specific event + async fn get_event_visitors( + &self, + project_id: i32, + event_name: &str, + start_date: UtcDateTime, + end_date: UtcDateTime, + environment_id: Option, + page: u64, + per_page: u64, + ) -> Result { + let per_page = per_page.min(100); + let offset = (page.saturating_sub(1)) * per_page; + + let env_filter = environment_id + .map(|id| format!("AND 
e.environment_id = {}", id)) + .unwrap_or_default(); + + // Get total count of unique visitors + let count_sql = format!( + r#" + SELECT COUNT(DISTINCT e.visitor_id) as total_count + FROM events e + WHERE e.project_id = $1 + AND COALESCE(e.event_name, e.event_type) = $2 + AND e.timestamp >= $3 + AND e.timestamp < $4 + AND e.visitor_id IS NOT NULL + {} + "#, + env_filter + ); + + #[derive(FromQueryResult)] + struct CountResult { + total_count: i64, + } + + let total_count = CountResult::find_by_statement(Statement::from_sql_and_values( + DatabaseBackend::Postgres, + &count_sql, + vec![ + project_id.into(), + event_name.into(), + start_date.into(), + end_date.into(), + ], + )) + .one(self.db.as_ref()) + .await? + .map(|c| c.total_count) + .unwrap_or(0); + + // Get paginated visitors with aggregated stats + let visitors_sql = format!( + r#" + WITH visitor_events AS ( + SELECT + e.visitor_id, + COUNT(*) as event_count, + MIN(e.timestamp) as first_triggered, + MAX(e.timestamp) as last_triggered, + -- Pick the most recent non-null values for each field + (array_agg(ig.country ORDER BY e.timestamp DESC) FILTER (WHERE ig.country IS NOT NULL))[1] as country, + (array_agg(ig.country_code ORDER BY e.timestamp DESC) FILTER (WHERE ig.country_code IS NOT NULL))[1] as country_code, + (array_agg(ig.city ORDER BY e.timestamp DESC) FILTER (WHERE ig.city IS NOT NULL))[1] as city, + (array_agg(e.browser ORDER BY e.timestamp DESC) FILTER (WHERE e.browser IS NOT NULL))[1] as browser, + (array_agg(e.device_type ORDER BY e.timestamp DESC) FILTER (WHERE e.device_type IS NOT NULL))[1] as device_type, + (array_agg(e.referrer_hostname ORDER BY e.timestamp DESC) FILTER (WHERE e.referrer_hostname IS NOT NULL AND e.referrer_hostname != ''))[1] as referrer_hostname + FROM events e + LEFT JOIN ip_geolocations ig ON e.ip_geolocation_id = ig.id + WHERE e.project_id = $1 + AND COALESCE(e.event_name, e.event_type) = $2 + AND e.timestamp >= $3 + AND e.timestamp < $4 + AND e.visitor_id IS NOT NULL + 
{env_filter} + GROUP BY e.visitor_id + ORDER BY event_count DESC, last_triggered DESC + LIMIT $5 OFFSET $6 + ) + SELECT + ve.visitor_id, + COALESCE(v.visitor_id, '') as visitor_uuid, + ve.event_count, + ve.first_triggered, + ve.last_triggered, + ve.country, + ve.country_code, + ve.city, + ve.browser, + ve.device_type, + ve.referrer_hostname + FROM visitor_events ve + LEFT JOIN visitor v ON v.id = ve.visitor_id + ORDER BY ve.event_count DESC, ve.last_triggered DESC + "#, + env_filter = env_filter + ); + + #[derive(FromQueryResult)] + struct VisitorResult { + visitor_id: i32, + visitor_uuid: String, + event_count: i64, + first_triggered: UtcDateTime, + last_triggered: UtcDateTime, + country: Option, + country_code: Option, + city: Option, + browser: Option, + device_type: Option, + referrer_hostname: Option, + } + + let visitor_results = VisitorResult::find_by_statement(Statement::from_sql_and_values( + DatabaseBackend::Postgres, + &visitors_sql, + vec![ + project_id.into(), + event_name.into(), + start_date.into(), + end_date.into(), + (per_page as i64).into(), + (offset as i64).into(), + ], + )) + .all(self.db.as_ref()) + .await?; + + let visitors: Vec = visitor_results + .into_iter() + .map(|v| crate::types::responses::EventVisitorInfo { + visitor_id: v.visitor_id, + visitor_uuid: v.visitor_uuid, + event_count: v.event_count, + first_triggered: v.first_triggered, + last_triggered: v.last_triggered, + country: v.country, + country_code: v.country_code, + city: v.city, + browser: v.browser, + device_type: v.device_type, + referrer_hostname: v.referrer_hostname, + }) + .collect(); + + Ok(crate::types::responses::EventVisitorsResponse { + event_name: event_name.to_string(), + total_count, + page, + per_page, + visitors, + }) + } } #[cfg(test)] diff --git a/crates/temps-analytics/src/handler.rs b/crates/temps-analytics/src/handler.rs index 8f836669..57b9fd16 100644 --- a/crates/temps-analytics/src/handler.rs +++ b/crates/temps-analytics/src/handler.rs @@ -25,6 +25,8 @@ 
pub struct AppState { #[openapi( paths( get_events_count, + get_event_detail, + get_event_visitors, get_visitors, get_visitor_details, get_visitor_info, @@ -142,6 +144,16 @@ pub struct AppState { RecentActivityQuery, RecentActivityResponse, ActivityEvent, + // Event detail types + EventDetailQuery, + EventDetailResponse, + EventActivityBucket, + EventReferrerStats, + EventCountryStats, + EventBrowserStats, + EventVisitorsQuery, + EventVisitorsResponse, + EventVisitorInfo, )), info( title = "Analytics API", @@ -156,6 +168,8 @@ pub fn configure_routes() -> Router> { Router::new() .route("/analytics/general-stats", get(get_general_stats)) .route("/analytics/events", get(get_events_count)) + .route("/analytics/event-detail", get(get_event_detail)) + .route("/analytics/event-visitors", get(get_event_visitors)) .route("/analytics/visitors", get(get_visitors)) .route("/analytics/visitors/{visitor_id}", get(get_visitor_details)) .route( @@ -1429,3 +1443,111 @@ pub async fn get_recent_activity( } } } + +/// Get detailed analytics for a specific event +#[utoipa::path( + tag = "Analytics", + get, + path = "/analytics/event-detail", + params( + ("event_name" = String, Query, description = "Event name to get details for"), + ("project_id" = i32, Query, description = "Project ID"), + ("environment_id" = Option, Query, description = "Environment ID (optional)"), + ("start_date" = String, Query, description = "Start date (ISO 8601)"), + ("end_date" = String, Query, description = "End date (ISO 8601)"), + ("bucket_interval" = Option, Query, description = "Bucket interval: hour, day, week, month (default: auto)") + ), + responses( + (status = 200, description = "Successfully retrieved event details", body = EventDetailResponse), + (status = 400, description = "Invalid parameters"), + (status = 500, description = "Internal server error") + ), + security( + ("bearer_auth" = []) + ) +)] +pub async fn get_event_detail( + RequireAuth(auth): RequireAuth, + State(app_state): State>, + 
Query(query): Query<EventDetailQuery>,
+) -> Result {
+    permission_guard!(auth, AnalyticsRead);
+
+    let start_date: UtcDateTime = query.start_date.into();
+    let end_date: UtcDateTime = query.end_date.into();
+
+    match app_state
+        .analytics_service
+        .get_event_detail(
+            query.project_id,
+            &query.event_name,
+            start_date,
+            end_date,
+            query.environment_id,
+            query.bucket_interval.as_deref(),
+        )
+        .await
+    {
+        Ok(detail) => Ok(Json(detail)),
+        Err(e) => {
+            error!("Analytics error: {:?}", e);
+            Err(handle_analytics_error(e))
+        }
+    }
+}
+
+/// Get paginated list of visitors who triggered a specific event
+#[utoipa::path(
+    tag = "Analytics",
+    get,
+    path = "/analytics/event-visitors",
+    params(
+        ("event_name" = String, Query, description = "Event name to list visitors for"),
+        ("project_id" = i32, Query, description = "Project ID"),
+        ("environment_id" = Option<i32>, Query, description = "Environment ID (optional)"),
+        ("start_date" = String, Query, description = "Start date (ISO 8601)"),
+        ("end_date" = String, Query, description = "End date (ISO 8601)"),
+        ("page" = Option<u64>, Query, description = "Page number (1-based, default: 1)"),
+        ("per_page" = Option<u64>, Query, description = "Items per page (default: 20, max: 100)")
+    ),
+    responses(
+        (status = 200, description = "Successfully retrieved event visitors", body = EventVisitorsResponse),
+        (status = 400, description = "Invalid parameters"),
+        (status = 500, description = "Internal server error")
+    ),
+    security(
+        ("bearer_auth" = [])
+    )
+)]
+pub async fn get_event_visitors(
+    RequireAuth(auth): RequireAuth,
+    State(app_state): State<Arc<AppState>>,
+    Query(query): Query<EventVisitorsQuery>,
+) -> Result {
+    permission_guard!(auth, AnalyticsRead);
+
+    let start_date: UtcDateTime = query.start_date.into();
+    let end_date: UtcDateTime = query.end_date.into();
+    let page = query.page.unwrap_or(1);
+    let per_page = query.per_page.unwrap_or(20).min(100);
+
+    match app_state
+        .analytics_service
+        .get_event_visitors(
+            query.project_id,
+            &query.event_name,
+            start_date,
+            end_date,
+            query.environment_id,
+            page,
+            per_page,
+        )
+        .await
+    {
+        Ok(result) => Ok(Json(result)),
+        Err(e) => {
+            error!("Analytics error: {:?}", e);
+            Err(handle_analytics_error(e))
+        }
+    }
+}
diff --git a/crates/temps-analytics/src/traits.rs b/crates/temps-analytics/src/traits.rs
index d600cc5d..7c708d2e 100644
--- a/crates/temps-analytics/src/traits.rs
+++ b/crates/temps-analytics/src/traits.rs
@@ -254,4 +254,28 @@ pub trait Analytics: Send + Sync {
         environment_id: Option<i32>,
         bucket_interval: Option<&str>,
     ) -> Result;
+
+    /// Get detailed analytics for a specific event name
+    /// Returns total count, unique visitors, timeline, referrers, countries, browsers
+    async fn get_event_detail(
+        &self,
+        project_id: i32,
+        event_name: &str,
+        start_date: UtcDateTime,
+        end_date: UtcDateTime,
+        environment_id: Option<i32>,
+        bucket_interval: Option<&str>,
+    ) -> Result;
+
+    /// Get paginated list of visitors who triggered a specific event
+    async fn get_event_visitors(
+        &self,
+        project_id: i32,
+        event_name: &str,
+        start_date: UtcDateTime,
+        end_date: UtcDateTime,
+        environment_id: Option<i32>,
+        page: u64,
+        per_page: u64,
+    ) -> Result;
 }
diff --git a/crates/temps-analytics/src/types/requests.rs b/crates/temps-analytics/src/types/requests.rs
index 8f5e06e5..bb8a2876 100644
--- a/crates/temps-analytics/src/types/requests.rs
+++ b/crates/temps-analytics/src/types/requests.rs
@@ -300,6 +300,34 @@ pub struct PagePathDetailQuery {
     pub bucket_interval: Option<String>,
 }
 
+/// Query parameters for event detail analytics
+#[derive(Deserialize, Clone, ToSchema)]
+pub struct EventDetailQuery {
+    /// The specific event name to get details for
+    pub event_name: String,
+    pub project_id: i32,
+    pub environment_id: Option<i32>,
+    pub start_date: DateTime<Utc>,
+    pub end_date: DateTime<Utc>,
+    /// Bucket interval for time series: 'hour', 'day', 'week', 'month' (default: auto)
+    pub bucket_interval: Option<String>,
+}
+
+/// Query parameters for event visitors list
+#[derive(Deserialize, Clone, ToSchema)]
+pub 
struct EventVisitorsQuery {
+    /// The specific event name to list visitors for
+    pub event_name: String,
+    pub project_id: i32,
+    pub environment_id: Option<i32>,
+    pub start_date: DateTime<Utc>,
+    pub end_date: DateTime<Utc>,
+    /// Page number (1-based, default: 1)
+    pub page: Option<u64>,
+    /// Items per page (default: 20, max: 100)
+    pub per_page: Option<u64>,
+}
+
 /// Query parameters for page flow analytics
 #[derive(Deserialize, Clone, ToSchema)]
 pub struct PageFlowQuery {
diff --git a/crates/temps-analytics/src/types/responses.rs b/crates/temps-analytics/src/types/responses.rs
index c2469fab..7b09ff10 100644
--- a/crates/temps-analytics/src/types/responses.rs
+++ b/crates/temps-analytics/src/types/responses.rs
@@ -147,6 +147,13 @@ pub struct VisitorInfo {
     pub is_eu: Option<bool>,
     /// Most recent page path visited by this visitor
     pub current_page: Option<String>,
+    // First-visit attribution
+    /// Full referrer URL from the visitor's first session
+    pub first_referrer: Option<String>,
+    /// Hostname extracted from first_referrer
+    pub first_referrer_hostname: Option<String>,
+    /// Marketing channel from the first visit (e.g. "Organic Search", "Direct")
+    pub first_channel: Option<String>,
 }
 
 #[derive(Debug, Serialize, ToSchema)]
@@ -181,6 +188,13 @@ pub struct VisitorDetails {
     pub country_code: Option<String>,
     pub timezone: Option<String>,
     pub is_eu: Option<bool>,
+    // First-visit attribution
+    /// Full referrer URL from the visitor's first session
+    pub first_referrer: Option<String>,
+    /// Hostname extracted from first_referrer
+    pub first_referrer_hostname: Option<String>,
+    /// Marketing channel from the first visit (e.g. 
"Organic Search", "Direct")
+    pub first_channel: Option<String>,
 }
 
 #[derive(Debug, Serialize, ToSchema)]
@@ -447,6 +461,13 @@ pub struct VisitorWithGeolocation {
     pub country_code: Option<String>,
     pub timezone: Option<String>,
     pub is_eu: Option<bool>,
+    // First-visit attribution
+    /// Full referrer URL from the visitor's first session
+    pub first_referrer: Option<String>,
+    /// Hostname extracted from first_referrer
+    pub first_referrer_hostname: Option<String>,
+    /// Marketing channel from the first visit (e.g. "Organic Search", "Direct")
+    pub first_channel: Option<String>,
 }
 
 #[derive(Debug, Serialize, ToSchema)]
@@ -507,6 +528,13 @@ pub struct LiveVisitorInfo {
     pub is_eu: Option<bool>,
     /// Most recent page path visited by this visitor
     pub current_page: Option<String>,
+    // First-visit attribution
+    /// Full referrer URL from the visitor's first session
+    pub first_referrer: Option<String>,
+    /// Hostname extracted from first_referrer
+    pub first_referrer_hostname: Option<String>,
+    /// Marketing channel from the first visit (e.g. "Organic Search", "Direct")
+    pub first_channel: Option<String>,
 }
 
 #[derive(Debug, Serialize, Deserialize, ToSchema)]
@@ -828,6 +856,124 @@ pub struct RecentActivityResponse {
     pub count: usize,
 }
 
+// ============================================================================
+// Event Detail types
+// ============================================================================
+
+/// Summary response for a specific event's analytics
+#[derive(Debug, Serialize, Deserialize, ToSchema)]
+pub struct EventDetailResponse {
+    /// The event name being analyzed
+    pub event_name: String,
+    /// Total number of times this event was triggered in the date range
+    pub total_count: i64,
+    /// Number of unique visitors who triggered this event
+    pub unique_visitors: i64,
+    /// Number of unique sessions where this event occurred
+    pub unique_sessions: i64,
+    /// Time series data for event activity graph
+    pub activity_over_time: Vec<EventActivityBucket>,
+    /// Top referrer hostnames for visitors who triggered this event
+    pub referrers: Vec<EventReferrerStats>,
+    /// Geographic 
distribution of visitors who triggered this event
+    pub countries: Vec<EventCountryStats>,
+    /// Browser distribution of visitors who triggered this event
+    pub browsers: Vec<EventBrowserStats>,
+    /// Bucket interval used for time series ('hour', 'day', etc.)
+    pub bucket_interval: String,
+}
+
+/// Time bucket data point for event activity graph
+#[derive(Debug, Serialize, Deserialize, ToSchema, Clone)]
+pub struct EventActivityBucket {
+    /// Timestamp for this bucket (ISO 8601)
+    #[schema(value_type = String)]
+    pub timestamp: UtcDateTime,
+    /// Number of event occurrences in this bucket
+    pub count: i64,
+    /// Number of unique visitors in this bucket
+    pub unique_visitors: i64,
+}
+
+/// Referrer stats for an event
+#[derive(Debug, Serialize, Deserialize, ToSchema, Clone)]
+pub struct EventReferrerStats {
+    /// Referrer hostname or "Direct"
+    pub referrer: String,
+    /// Number of event occurrences from this referrer
+    pub count: i64,
+    /// Percentage of total events
+    pub percentage: f64,
+}
+
+/// Country stats for an event
+#[derive(Debug, Serialize, Deserialize, ToSchema, Clone)]
+pub struct EventCountryStats {
+    /// Country name
+    pub country: String,
+    /// ISO country code (2-letter)
+    pub country_code: Option<String>,
+    /// Number of event occurrences from this country
+    pub count: i64,
+    /// Percentage of total events
+    pub percentage: f64,
+}
+
+/// Browser stats for an event
+#[derive(Debug, Serialize, Deserialize, ToSchema, Clone)]
+pub struct EventBrowserStats {
+    /// Browser name
+    pub browser: String,
+    /// Number of event occurrences from this browser
+    pub count: i64,
+    /// Percentage of total events
+    pub percentage: f64,
+}
+
+/// A visitor who triggered a specific event
+#[derive(Debug, Serialize, Deserialize, ToSchema, Clone)]
+pub struct EventVisitorInfo {
+    /// Visitor numeric ID
+    pub visitor_id: i32,
+    /// Visitor UUID
+    pub visitor_uuid: String,
+    /// Number of times this visitor triggered the event
+    pub event_count: i64,
+    /// When the visitor first triggered the event in the date 
range
+    #[schema(value_type = String, format = "date-time")]
+    pub first_triggered: UtcDateTime,
+    /// When the visitor last triggered the event in the date range
+    #[schema(value_type = String, format = "date-time")]
+    pub last_triggered: UtcDateTime,
+    /// Visitor's country
+    pub country: Option<String>,
+    /// Visitor's country code
+    pub country_code: Option<String>,
+    /// Visitor's city
+    pub city: Option<String>,
+    /// Browser name
+    pub browser: Option<String>,
+    /// Device type (Desktop, Mobile, Tablet)
+    pub device_type: Option<String>,
+    /// Referrer hostname for the event
+    pub referrer_hostname: Option<String>,
+}
+
+/// Paginated response for event visitors
+#[derive(Debug, Serialize, Deserialize, ToSchema)]
+pub struct EventVisitorsResponse {
+    /// The event name
+    pub event_name: String,
+    /// Total number of unique visitors who triggered this event
+    pub total_count: i64,
+    /// Current page number
+    pub page: u64,
+    /// Items per page
+    pub per_page: u64,
+    /// Individual visitors who triggered this event
+    pub visitors: Vec<EventVisitorInfo>,
+}
+
 /// Complete page flow analytics response
 #[derive(Debug, Serialize, Deserialize, ToSchema)]
 pub struct PageFlowResponse {
diff --git a/crates/temps-cli/src/commands/deploy.rs b/crates/temps-cli/src/commands/deploy.rs
index b09cb6aa..23447da4 100644
--- a/crates/temps-cli/src/commands/deploy.rs
+++ b/crates/temps-cli/src/commands/deploy.rs
@@ -1,7 +1,7 @@
 //! Deploy Command
 //!
-//! Deploy pre-built Docker images or static files to Temps environments
-//! without Git integration.
+//! Deploy pre-built Docker images, static files, or Git commits/branches/tags
+//! to Temps environments.
use clap::{Args, Subcommand}; use colored::Colorize; @@ -21,6 +21,8 @@ enum DeployCommands { Image(DeployImageArgs), /// Deploy static files (tar.gz or zip archive) Static(DeployStaticArgs), + /// Deploy from a Git commit, branch, or tag (triggers the build pipeline) + Git(DeployGitArgs), } #[derive(Args)] @@ -93,6 +95,45 @@ struct DeployStaticArgs { metadata: Option<String>, } +#[derive(Args)] +struct DeployGitArgs { + /// Project slug or ID + #[arg(short, long)] + project: String, + + /// Environment name (default: production) + #[arg(short, long, default_value = "production")] + environment: String, + + /// Git commit SHA to deploy (e.g., "a1b2c3d" or full 40-char hash) + #[arg(short, long, group = "git_ref")] + commit: Option<String>, + + /// Git branch to deploy (e.g., "main", "feature/new-ui") + #[arg(short, long, group = "git_ref")] + branch: Option<String>, + + /// Git tag to deploy (e.g., "v1.0.0") + #[arg(short, long, group = "git_ref")] + tag: Option<String>, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + api_token: String, + + /// Wait for deployment to complete + #[arg(long, default_value = "false")] + wait: bool, + + /// Timeout in seconds for --wait (default: 300) + #[arg(long, default_value = "300")] + timeout: u64, +} + // API Response types #[derive(Debug, Deserialize)] struct ProjectResponse { @@ -120,11 +161,35 @@ struct DeployImageRequest { metadata: Option<serde_json::Value>, } +#[derive(Debug, Serialize)] +struct TriggerPipelineRequest { + #[serde(skip_serializing_if = "Option::is_none")] + branch: Option<String>, + #[serde(skip_serializing_if = "Option::is_none")] + tag: Option<String>, + #[serde(skip_serializing_if = "Option::is_none")] + commit: Option<String>, + environment_id: i32, +} + +#[derive(Debug, Deserialize)] +struct TriggerPipelineApiResponse { + message: String, + #[allow(dead_code)] + project_id: i32, + #[allow(dead_code)] + environment_id: i32, + branch: Option<String>, + tag: Option<String>, + commit: Option<String>, +} + 
impl DeployCommand { pub fn execute(self) -> anyhow::Result<()> { match self.command { DeployCommands::Image(args) => Self::execute_image_deploy(args), DeployCommands::Static(args) => Self::execute_static_deploy(args), + DeployCommands::Git(args) => Self::execute_git_deploy(args), } } @@ -629,6 +694,242 @@ impl DeployCommand { }) } + fn execute_git_deploy(args: DeployGitArgs) -> anyhow::Result<()> { + // Validate that at least one git ref is provided + if args.commit.is_none() && args.branch.is_none() && args.tag.is_none() { + return Err(anyhow::anyhow!( + "At least one of --commit, --branch, or --tag must be provided" + )); + } + + let rt = tokio::runtime::Runtime::new()?; + + rt.block_on(async { + let client = reqwest::Client::new(); + + // Determine what we're deploying for display + let ref_display = if let Some(ref commit) = args.commit { + format!("commit {}", commit) + } else if let Some(ref branch) = args.branch { + format!("branch {}", branch) + } else if let Some(ref tag) = args.tag { + format!("tag {}", tag) + } else { + unreachable!() + }; + + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!("{}", " 🔀 Deploying from Git".bright_white().bold()); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!(); + println!(" {} {}", "Ref:".bright_white(), ref_display.bright_cyan()); + println!( + " {} {}", + "Project:".bright_white(), + args.project.bright_white() + ); + println!( + " {} {}", + "Environment:".bright_white(), + args.environment.bright_white() + ); + println!(); + + // Look up project + println!("{}", "Looking up project...".bright_white()); + let project = + Self::get_project(&client, &args.api_url, &args.api_token, &args.project).await?; + println!( + " {} {} (id: {})", + "✓".bright_green(), + project.slug.bright_cyan(), + project.id + ); + + // Look up environment + println!("{}", "Looking up 
environment...".bright_white()); + let environment = Self::get_environment( + &client, + &args.api_url, + &args.api_token, + project.id, + &args.environment, + ) + .await?; + println!( + " {} {} (id: {})", + "✓".bright_green(), + environment.name.bright_cyan(), + environment.id + ); + + // Trigger pipeline + println!("{}", "Triggering pipeline...".bright_white()); + let trigger_url = format!( + "{}/projects/{}/trigger-pipeline", + args.api_url.trim_end_matches('/'), + project.id + ); + + let request = TriggerPipelineRequest { + branch: args.branch.clone(), + tag: args.tag.clone(), + commit: args.commit.clone(), + environment_id: environment.id, + }; + + let response = client + .post(&trigger_url) + .header("Authorization", format!("Bearer {}", args.api_token)) + .header("Content-Type", "application/json") + .json(&request) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to trigger pipeline: {}", e))?; + + if !response.status().is_success() { + let status = response.status(); + let body = response.text().await.unwrap_or_default(); + return Err(anyhow::anyhow!( + "Pipeline trigger failed with status {}: {}", + status, + body + )); + } + + let pipeline_response: TriggerPipelineApiResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse pipeline response: {}", e))?; + + println!( + " {} {}", + "✓".bright_green(), + pipeline_response.message.bright_white() + ); + + if let Some(ref branch) = pipeline_response.branch { + println!(" {} Branch: {}", "→".bright_white(), branch.bright_cyan()); + } + if let Some(ref commit) = pipeline_response.commit { + println!(" {} Commit: {}", "→".bright_white(), commit.bright_cyan()); + } + if let Some(ref tag) = pipeline_response.tag { + println!(" {} Tag: {}", "→".bright_white(), tag.bright_cyan()); + } + + // Wait for completion if requested + if args.wait { + println!(); + println!( + "{}", + format!( + "Waiting for deployment to complete (timeout: {}s)...", + args.timeout + ) + .bright_white() 
+ ); + + // Poll the last deployment for this project/environment + let deployments_url = format!( + "{}/projects/{}/deployments?environment_id={}", + args.api_url.trim_end_matches('/'), + project.id, + environment.id + ); + + let start = std::time::Instant::now(); + let timeout = std::time::Duration::from_secs(args.timeout); + + // First, wait a moment for the deployment to be created + tokio::time::sleep(std::time::Duration::from_secs(2)).await; + + loop { + if start.elapsed() > timeout { + return Err(anyhow::anyhow!( + "Deployment timed out after {}s", + args.timeout + )); + } + + tokio::time::sleep(std::time::Duration::from_secs(5)).await; + + let status_response = client + .get(&deployments_url) + .header("Authorization", format!("Bearer {}", args.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to get deployment status: {}", e))?; + + if !status_response.status().is_success() { + continue; + } + + // Parse the paginated response to get the latest deployment + let body: serde_json::Value = status_response.json().await.map_err(|e| { + anyhow::anyhow!("Failed to parse deployments response: {}", e) + })?; + + // Try to get the first deployment from the response + let deployment = body + .get("data") + .and_then(|d| d.as_array()) + .and_then(|arr| arr.first()) + .or_else(|| body.as_array().and_then(|arr| arr.first())); + + let state = deployment + .and_then(|d| d.get("state").or_else(|| d.get("status"))) + .and_then(|s| s.as_str()) + .unwrap_or("unknown"); + + print!("\r {} Current state: {} ", "⏳".bright_yellow(), state); + + match state { + "running" => { + println!(); + println!( + " {} Deployment completed successfully!", + "✅".bright_green() + ); + break; + } + "failed" | "cancelled" => { + println!(); + return Err(anyhow::anyhow!("Deployment failed with state: {}", state)); + } + _ => continue, + } + } + } + + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_green() + ); + println!( + 
"{}", + " ✅ Pipeline triggered successfully!" + .bright_green() + .bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_green() + ); + println!(); + + Ok(()) + }) + } + async fn get_project( client: &reqwest::Client, api_url: &str, diff --git a/crates/temps-cli/src/commands/domain.rs b/crates/temps-cli/src/commands/domain.rs index 9499cb69..d8d00806 100644 --- a/crates/temps-cli/src/commands/domain.rs +++ b/crates/temps-cli/src/commands/domain.rs @@ -1,16 +1,20 @@ -//! Domain management commands for certificate import and management +//! Domain management commands via HTTP API +//! +//! Provides CLI commands for managing domains and TLS certificates through the +//! Temps HTTP API. Supports creating domains with ACME challenges (HTTP-01 or DNS-01), +//! managing certificate orders, and importing custom certificates. use anyhow::Context; use chrono::Utc; -use clap::{Args, Subcommand}; +use clap::{Args, Subcommand, ValueEnum}; use colored::Colorize; use sea_orm::{ActiveModelTrait, ColumnTrait, EntityTrait, QueryFilter, Set}; +use serde::{Deserialize, Serialize}; use std::fs; use std::path::{Path, PathBuf}; use temps_core::EncryptionService; use temps_database::establish_connection; use temps_entities::domains; -use tracing::debug; use x509_parser::prelude::*; /// Domain and certificate management commands @@ -22,82 +26,1451 @@ pub struct DomainCommand { #[derive(Subcommand)] pub enum DomainSubcommand { - /// Import a custom certificate for a domain - Import(ImportCertificateCommand), + /// Create a new domain and request a TLS certificate via Let's Encrypt + Add(AddDomainCommand), /// List all domains and their certificate status - List(ListDomainsCommand), + #[command(alias = "ls")] + List(ListDomainsApiCommand), + /// Show details for a specific domain + Show(ShowDomainCommand), + /// Delete a domain + #[command(alias = "rm")] + Delete(DeleteDomainCommand), + /// Import a custom certificate for a domain (direct database 
access) + Import(ImportCertificateCommand), + /// Provision a certificate via HTTP-01 challenge + Provision(ProvisionDomainCommand), + /// Manage ACME certificate orders + Order(OrderCommand), +} + +/// Challenge type for Let's Encrypt validation +#[derive(Clone, ValueEnum, Debug)] +pub enum ChallengeType { + /// HTTP-01 challenge (requires port 80 accessible) + #[value(name = "http-01")] + Http01, + /// DNS-01 challenge (required for wildcard domains) + #[value(name = "dns-01")] + Dns01, +} + +impl std::fmt::Display for ChallengeType { + fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + match self { + ChallengeType::Http01 => write!(f, "http-01"), + ChallengeType::Dns01 => write!(f, "dns-01"), + } + } +} + +// ======================================== +// API-based commands +// ======================================== + +/// Create a new domain and request a TLS certificate +#[derive(Args)] +pub struct AddDomainCommand { + /// Domain name (e.g., "example.com" or "*.example.com") + #[arg(long, short = 'd')] + pub domain: String, + + /// Challenge type for Let's Encrypt validation + #[arg(long, short = 'c', value_enum)] + pub challenge: ChallengeType, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, +} + +/// List all domains via API +#[derive(Args)] +pub struct ListDomainsApiCommand { + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, + + /// Output as JSON + #[arg(long, default_value = "false")] + pub json: bool, +} + +/// Show details for a specific domain +#[derive(Args)] +pub struct ShowDomainCommand { + /// Domain ID + #[arg(long)] + pub id: i32, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub 
api_token: String, + + /// Output as JSON + #[arg(long, default_value = "false")] + pub json: bool, +} + +/// Delete a domain +#[derive(Args)] +pub struct DeleteDomainCommand { + /// Domain name to delete + #[arg(long, short = 'd')] + pub domain: String, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, + + /// Skip confirmation + #[arg(long, short = 'y', default_value = "false")] + pub yes: bool, +} + +/// Provision a certificate via HTTP-01 challenge +#[derive(Args)] +pub struct ProvisionDomainCommand { + /// Domain name to provision + #[arg(long, short = 'd')] + pub domain: String, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, +} + +/// ACME certificate order management +#[derive(Args)] +pub struct OrderCommand { + #[command(subcommand)] + pub command: OrderSubcommand, +} + +#[derive(Subcommand)] +pub enum OrderSubcommand { + /// Create (or recreate) an ACME order for a domain + Create(OrderCreateCommand), + /// Show ACME order details (includes live challenge validation status) + Show(OrderShowCommand), + /// Cancel an ACME order + Cancel(OrderCancelCommand), + /// Finalize an ACME order (complete challenge and obtain certificate) + Finalize(OrderFinalizeCommand), + /// List all ACME orders + #[command(alias = "ls")] + List(OrderListCommand), +} + +/// Create a new ACME order +#[derive(Args)] +pub struct OrderCreateCommand { + /// Domain ID to create order for + #[arg(long)] + pub domain_id: i32, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, +} + +/// Show ACME order details +#[derive(Args)] +pub struct OrderShowCommand { + /// Domain ID to show order for + #[arg(long)] + pub domain_id: i32, + 
+ /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, + + /// Output as JSON + #[arg(long, default_value = "false")] + pub json: bool, +} + +/// Cancel an ACME order +#[derive(Args)] +pub struct OrderCancelCommand { + /// Domain ID to cancel order for + #[arg(long)] + pub domain_id: i32, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, + + /// Skip confirmation + #[arg(long, short = 'y', default_value = "false")] + pub yes: bool, +} + +/// Finalize an ACME order +#[derive(Args)] +pub struct OrderFinalizeCommand { + /// Domain ID to finalize order for + #[arg(long)] + pub domain_id: i32, + + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, +} + +/// List all ACME orders +#[derive(Args)] +pub struct OrderListCommand { + /// Temps API URL + #[arg(long, env = "TEMPS_API_URL")] + pub api_url: String, + + /// Temps API token + #[arg(long, env = "TEMPS_API_TOKEN")] + pub api_token: String, + + /// Output as JSON + #[arg(long, default_value = "false")] + pub json: bool, +} + +// ======================================== +// Import command (direct database access) +// ======================================== + +/// Import a custom certificate for a domain +#[derive(Args)] +pub struct ImportCertificateCommand { + /// Domain name (e.g., "*.localho.st" or "app.example.com") + #[arg(long, short = 'd')] + pub domain: String, + + /// Path to the certificate file (PEM format) + #[arg(long, short = 'c')] + pub certificate: PathBuf, + + /// Path to the private key file (PEM format) + #[arg(long, short = 'k')] + pub private_key: PathBuf, + + /// Database URL + #[arg(long, env = "TEMPS_DATABASE_URL")] + pub database_url: String, + + 
/// Data directory containing the encryption key + #[arg(long, env = "TEMPS_DATA_DIR")] + pub data_dir: Option<PathBuf>, + + /// Overwrite existing certificate for this domain + #[arg(long, default_value = "false")] + pub force: bool, +} + +// ======================================== +// API response types +// ======================================== + +#[derive(Debug, Deserialize, Serialize)] +struct DomainResponse { + id: i32, + domain: String, + status: String, + expiration_time: Option<i64>, + last_renewed: Option<i64>, + dns_challenge_token: Option<String>, + dns_challenge_value: Option<String>, + last_error: Option<String>, + last_error_type: Option<String>, + is_wildcard: bool, + verification_method: String, + created_at: i64, + updated_at: i64, + certificate: Option<String>, +} + +#[derive(Debug, Deserialize, Serialize)] +struct DomainChallengeResponse { + domain: String, + txt_records: Vec<TxtRecord>, + status: String, +} + +#[derive(Debug, Deserialize, Serialize)] +struct TxtRecord { + name: String, + value: String, +} + +#[derive(Debug, Deserialize, Serialize)] +#[serde(tag = "type")] +enum ProvisionApiResponse { + #[serde(rename = "error")] + Error(DomainErrorResponse), + #[serde(rename = "complete")] + Complete(DomainResponse), + #[serde(rename = "pending")] + Pending(DomainChallengeResponse), +} + +#[derive(Debug, Deserialize, Serialize)] +struct DomainErrorResponse { + message: String, + code: String, + details: Option<String>, +} + +#[derive(Debug, Deserialize, Serialize)] +struct ListDomainsResponse { + domains: Vec<DomainResponse>, +} + +#[derive(Debug, Deserialize, Serialize)] +struct AcmeOrderResponse { + id: i32, + order_url: String, + domain_id: i32, + email: String, + status: String, + identifiers: serde_json::Value, + authorizations: Option<serde_json::Value>, + finalize_url: Option<String>, + certificate_url: Option<String>, + error: Option<String>, + error_type: Option<String>, + created_at: i64, + updated_at: i64, + expires_at: Option<i64>, + challenge_validation: Option<ChallengeValidationStatus>, +} + +#[derive(Debug, Deserialize, Serialize)] +struct ChallengeValidationStatus { + #[serde(rename = "type")] + 
challenge_type: String, + url: String, + status: String, + validated: Option<String>, + error: Option<ChallengeError>, + token: String, +} + +#[derive(Debug, Deserialize, Serialize)] +struct ChallengeError { + #[serde(rename = "type")] + error_type: String, + detail: String, + status: i32, +} + +#[derive(Debug, Deserialize, Serialize)] +struct ListOrdersResponse { + orders: Vec<AcmeOrderResponse>, +} + +#[derive(Debug, Serialize)] +struct CreateDomainRequest { + domain: String, + challenge_type: String, +} + +// ======================================== +// Command execution +// ======================================== + +impl DomainCommand { + pub fn execute(self) -> anyhow::Result<()> { + let rt = tokio::runtime::Runtime::new()?; + + rt.block_on(async { + match self.command { + DomainSubcommand::Add(cmd) => execute_add(cmd).await, + DomainSubcommand::List(cmd) => execute_list_api(cmd).await, + DomainSubcommand::Show(cmd) => execute_show(cmd).await, + DomainSubcommand::Delete(cmd) => execute_delete(cmd).await, + DomainSubcommand::Import(cmd) => execute_import(cmd).await, + DomainSubcommand::Provision(cmd) => execute_provision(cmd).await, + DomainSubcommand::Order(cmd) => match cmd.command { + OrderSubcommand::Create(c) => execute_order_create(c).await, + OrderSubcommand::Show(c) => execute_order_show(c).await, + OrderSubcommand::Cancel(c) => execute_order_cancel(c).await, + OrderSubcommand::Finalize(c) => execute_order_finalize(c).await, + OrderSubcommand::List(c) => execute_order_list(c).await, + }, + } + }) + } +} + +// ======================================== +// Helper: build API URL +// ======================================== + +fn api_url(base: &str, path: &str) -> String { + format!("{}{}", base.trim_end_matches('/'), path) +} + +// ======================================== +// Helper: handle API error responses +// ======================================== + +async fn handle_api_error(response: reqwest::Response) -> anyhow::Error { + let status = response.status(); + let body = 
response.text().await.unwrap_or_default(); + anyhow::anyhow!("API request failed (HTTP {}): {}", status, body) +} + +// ======================================== +// Helper: format millis timestamp +// ======================================== + +fn format_millis_timestamp(millis: i64) -> String { + chrono::DateTime::from_timestamp_millis(millis) + .map(|dt| dt.format("%Y-%m-%d %H:%M:%S UTC").to_string()) + .unwrap_or_else(|| "N/A".to_string()) +} + +fn format_millis_date(millis: i64) -> String { + chrono::DateTime::from_timestamp_millis(millis) + .map(|dt| dt.format("%Y-%m-%d").to_string()) + .unwrap_or_else(|| "N/A".to_string()) +} + +// ======================================== +// Add domain +// ======================================== + +async fn execute_add(cmd: AddDomainCommand) -> anyhow::Result<()> { + let is_wildcard = cmd.domain.starts_with("*."); + + // Enforce dns-01 for wildcard domains + if is_wildcard { + if let ChallengeType::Http01 = cmd.challenge { + return Err(anyhow::anyhow!( + "Wildcard domains (*.example.com) require DNS-01 challenge. 
Use --challenge dns-01" + )); + } + } + + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!( + "{}", + " Creating Domain & Requesting Certificate" + .bright_white() + .bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!(); + println!( + " {} {}", + "Domain:".bright_white(), + cmd.domain.bright_cyan() + ); + println!( + " {} {}", + "Challenge:".bright_white(), + cmd.challenge.to_string().bright_cyan() + ); + println!( + " {} {}", + "Type:".bright_white(), + if is_wildcard { "Wildcard" } else { "Single" }.bright_cyan() + ); + println!(); + + let client = reqwest::Client::new(); + let url = api_url(&cmd.api_url, "/domains"); + + let request = CreateDomainRequest { + domain: cmd.domain.clone(), + challenge_type: cmd.challenge.to_string(), + }; + + println!("{} Requesting certificate...", "→".bright_blue()); + + let response = client + .post(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .header("Content-Type", "application/json") + .json(&request) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + } + + let domain_resp: DomainResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + println!( + " {} Domain created (ID: {})", + "✓".bright_green(), + domain_resp.id + ); + + // Show domain details + print_domain_details(&domain_resp); + + // Show challenge instructions based on challenge type + print_challenge_instructions(&cmd.challenge, &domain_resp); + + println!(); + Ok(()) +} + +// ======================================== +// Print challenge instructions (like UI) +// ======================================== + +fn print_challenge_instructions(challenge_type: &ChallengeType, domain: &DomainResponse) { + println!(); + 
println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_yellow() + ); + + match challenge_type { + ChallengeType::Dns01 => { + println!( + "{}", + " DNS-01 Challenge Instructions".bright_yellow().bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_yellow() + ); + println!(); + println!(" Add the following DNS TXT record to verify domain ownership:"); + println!(); + + if let (Some(token), Some(value)) = + (&domain.dns_challenge_token, &domain.dns_challenge_value) + { + println!( + " {} {}", + "Record Name:".bright_white().bold(), + format!("_acme-challenge.{}", domain.domain.trim_start_matches("*.")) + .bright_cyan() + ); + println!( + " {} {}", + "Record Type:".bright_white().bold(), + "TXT".bright_cyan() + ); + println!( + " {} {}", + "Record Value:".bright_white().bold(), + value.bright_cyan() + ); + let _ = token; // token is stored but value is what goes in DNS + } else { + println!(" {} Challenge data not yet available.", "ℹ".bright_blue()); + println!( + " {} Use 'temps domain order show --domain-id {}' to check challenge details.", + "→".bright_blue(), + domain.id + ); + } + + println!(); + println!( + " {} After adding the DNS record:", + "Next steps:".bright_white().bold() + ); + println!(" 1. Wait for DNS propagation (usually 1-5 minutes)"); + println!( + " 2. Verify: {}", + format!( + "dig TXT _acme-challenge.{}", + domain.domain.trim_start_matches("*.") + ) + .bright_white() + ); + println!( + " 3. 
Finalize: {}", + format!("temps domain order finalize --domain-id {}", domain.id).bright_white() + ); + } + ChallengeType::Http01 => { + println!( + "{}", + " HTTP-01 Challenge Instructions".bright_yellow().bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_yellow() + ); + println!(); + println!(" The HTTP-01 challenge requires port 80 to be publicly accessible."); + println!(" Temps will automatically serve the challenge token."); + println!(); + println!( + " {} Ensure your domain {} points to this server.", + "→".bright_blue(), + domain.domain.bright_cyan() + ); + println!(); + println!(" {}", "Next steps:".bright_white().bold()); + println!( + " 1. Verify DNS: {}", + format!("dig A {}", domain.domain).bright_white() + ); + println!( + " 2. Provision: {}", + format!("temps domain provision -d {}", domain.domain).bright_white() + ); + println!( + " {} Or finalize via order: {}", + "→".bright_blue(), + format!("temps domain order finalize --domain-id {}", domain.id).bright_white() + ); + } + } +} + +// ======================================== +// Print domain details +// ======================================== + +fn print_domain_details(domain: &DomainResponse) { + println!(); + println!( + " {} {}", + "Domain:".bright_white(), + domain.domain.bright_cyan() + ); + println!( + " {} {}", + "ID:".bright_white(), + domain.id.to_string().bright_cyan() + ); + + let status_colored = match domain.status.as_str() { + "active" => domain.status.bright_green(), + "pending" | "pending_dns" | "pending_validation" | "pending_http" => { + domain.status.bright_yellow() + } + "failed" | "expired" => domain.status.bright_red(), + _ => domain.status.normal(), + }; + println!(" {} {}", "Status:".bright_white(), status_colored); + println!( + " {} {}", + "Type:".bright_white(), + if domain.is_wildcard { + "Wildcard" + } else { + "Single" + } + .bright_cyan() + ); + println!( + " {} {}", + "Verification:".bright_white(), 
domain.verification_method.bright_cyan() + ); + + if let Some(exp) = domain.expiration_time { + println!( + " {} {}", + "Expires:".bright_white(), + format_millis_timestamp(exp).bright_cyan() + ); + } + + if let Some(ref err) = domain.last_error { + println!(" {} {}", "Last Error:".bright_white(), err.bright_red()); + } +} + +// ======================================== +// List domains (API) +// ======================================== + +async fn execute_list_api(cmd: ListDomainsApiCommand) -> anyhow::Result<()> { + let client = reqwest::Client::new(); + let url = api_url(&cmd.api_url, "/domains"); + + let response = client + .get(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + } + + let list_resp: ListDomainsResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + if cmd.json { + println!( + "{}", + serde_json::to_string_pretty(&list_resp.domains) + .map_err(|e| anyhow::anyhow!("Failed to serialize: {}", e))? 
+ ); + return Ok(()); + } + + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!( + "{}", + " Domain Certificates" + .bright_white() + .bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!(); + + if list_resp.domains.is_empty() { + println!(" {} No domains configured.", "ℹ".bright_blue()); + println!(); + return Ok(()); + } + + println!( + " {:<5} {:<40} {:<18} {:<12} {:<12}", + "ID".bright_white().bold(), + "DOMAIN".bright_white().bold(), + "STATUS".bright_white().bold(), + "TYPE".bright_white().bold(), + "EXPIRES".bright_white().bold() + ); + println!(" {}", "─".repeat(90)); + + for domain in &list_resp.domains { + let status_colored = match domain.status.as_str() { + "active" => domain.status.bright_green(), + "pending" | "pending_dns" | "pending_validation" | "pending_http" => { + domain.status.bright_yellow() + } + "failed" | "expired" => domain.status.bright_red(), + _ => domain.status.normal(), + }; + + let domain_type = if domain.is_wildcard { + "wildcard" + } else { + "single" + }; + + let expiration = domain + .expiration_time + .map(|t| format_millis_date(t)) + .unwrap_or_else(|| "N/A".to_string()); + + println!( + " {:<5} {:<40} {:<18} {:<12} {:<12}", + domain.id.to_string().bright_white(), + domain.domain.bright_cyan(), + status_colored, + domain_type, + expiration + ); + } + + println!(); + Ok(()) +} + +// ======================================== +// Show domain +// ======================================== + +async fn execute_show(cmd: ShowDomainCommand) -> anyhow::Result<()> { + let client = reqwest::Client::new(); + let url = api_url(&cmd.api_url, &format!("/domains/{}", cmd.id)); + + let response = client + .get(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { 
+ return Err(handle_api_error(response).await); + } + + let domain_resp: DomainResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + if cmd.json { + println!( + "{}", + serde_json::to_string_pretty(&domain_resp) + .map_err(|e| anyhow::anyhow!("Failed to serialize: {}", e))? + ); + return Ok(()); + } + + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!( + "{}", + " Domain Details".bright_white().bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + + print_domain_details(&domain_resp); + println!( + " {} {}", + "Created:".bright_white(), + format_millis_timestamp(domain_resp.created_at).bright_cyan() + ); + println!( + " {} {}", + "Updated:".bright_white(), + format_millis_timestamp(domain_resp.updated_at).bright_cyan() + ); + if let Some(renewed) = domain_resp.last_renewed { + println!( + " {} {}", + "Last Renewed:".bright_white(), + format_millis_timestamp(renewed).bright_cyan() + ); + } + + println!(); + Ok(()) +} + +// ======================================== +// Delete domain +// ======================================== + +async fn execute_delete(cmd: DeleteDomainCommand) -> anyhow::Result<()> { + if !cmd.yes { + println!( + "{} Are you sure you want to delete domain '{}'? 
Use --yes to confirm.", + "⚠".bright_yellow(), + cmd.domain.bright_cyan() + ); + return Ok(()); + } + + let client = reqwest::Client::new(); + let url = api_url( + &cmd.api_url, + &format!("/domains/{}", urlencoding::encode(&cmd.domain)), + ); + + let response = client + .delete(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + } + + println!( + " {} Domain '{}' deleted successfully.", + "✓".bright_green(), + cmd.domain.bright_cyan() + ); + + Ok(()) +} + +// ======================================== +// Provision domain (HTTP-01) +// ======================================== + +async fn execute_provision(cmd: ProvisionDomainCommand) -> anyhow::Result<()> { + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!( + "{}", + " Provisioning Certificate (HTTP-01)" + .bright_white() + .bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!(); + + let client = reqwest::Client::new(); + let url = api_url( + &cmd.api_url, + &format!("/domains/{}/provision", urlencoding::encode(&cmd.domain)), + ); + + println!( + "{} Provisioning certificate for {}...", + "→".bright_blue(), + cmd.domain.bright_cyan() + ); + + let response = client + .post(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + } + + let provision_resp: ProvisionApiResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + match provision_resp { + ProvisionApiResponse::Complete(domain) => { + println!( + " {} Certificate provisioned 
successfully!", + "✓".bright_green() + ); + print_domain_details(&domain); + } + ProvisionApiResponse::Pending(challenge) => { + println!( + " {} Challenge is pending. DNS records needed:", + "⏳".bright_yellow() + ); + for record in &challenge.txt_records { + println!( + " {} {} = {}", + "TXT".bright_white(), + record.name.bright_cyan(), + record.value.bright_white() + ); + } + } + ProvisionApiResponse::Error(err) => { + println!( + " {} Provisioning failed: {}", + "✗".bright_red(), + err.message.bright_red() + ); + if let Some(details) = err.details { + println!(" {}", details); + } + } + } + + println!(); + Ok(()) +} + +// ======================================== +// Order: Create +// ======================================== + +async fn execute_order_create(cmd: OrderCreateCommand) -> anyhow::Result<()> { + println!(); + println!( + "{} Creating ACME order for domain ID {}...", + "→".bright_blue(), + cmd.domain_id + ); + + let client = reqwest::Client::new(); + let url = api_url(&cmd.api_url, &format!("/domains/{}/order", cmd.domain_id)); + + let response = client + .post(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + } + + let challenge_resp: DomainChallengeResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + println!( + " {} ACME order created for {}", + "✓".bright_green(), + challenge_resp.domain.bright_cyan() + ); + println!( + " {} {}", + "Status:".bright_white(), + challenge_resp.status.bright_yellow() + ); + + if !challenge_resp.txt_records.is_empty() { + println!(); + println!( + " {} Add the following DNS TXT record(s):", + "DNS Records:".bright_white().bold() + ); + println!(); + for record in &challenge_resp.txt_records { + println!( + " {} {}", + "Name:".bright_white(), + 
record.name.bright_cyan()
+            );
+            println!(
+                " {} {}",
+                "Value:".bright_white(),
+                record.value.bright_white()
+            );
+            println!();
+        }
+        println!(
+            " {} After DNS propagation, finalize: {}",
+            "→".bright_blue(),
+            format!("temps domain order finalize --domain-id {}", cmd.domain_id).bright_white()
+        );
+    }
+
+    println!();
+    Ok(())
 }

-/// Import a custom certificate for a domain
-#[derive(Args)]
-pub struct ImportCertificateCommand {
-    /// Domain name (e.g., "*.localho.st" or "app.example.com")
-    #[arg(long, short = 'd')]
-    pub domain: String,
+// ========================================
+// Order: Show
+// ========================================

-    /// Path to the certificate file (PEM format)
-    #[arg(long, short = 'c')]
-    pub certificate: PathBuf,
+async fn execute_order_show(cmd: OrderShowCommand) -> anyhow::Result<()> {
+    let client = reqwest::Client::new();
+    let url = api_url(&cmd.api_url, &format!("/domains/{}/order", cmd.domain_id));

-    /// Path to the private key file (PEM format)
-    #[arg(long, short = 'k')]
-    pub private_key: PathBuf,
+    let response = client
+        .get(&url)
+        .header("Authorization", format!("Bearer {}", cmd.api_token))
+        .send()
+        .await
+        .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?;

-    /// Database URL
-    #[arg(long, env = "TEMPS_DATABASE_URL")]
-    pub database_url: String,
+    if !response.status().is_success() {
+        return Err(handle_api_error(response).await);
+    }

-    /// Data directory containing the encryption key
-    #[arg(long, env = "TEMPS_DATA_DIR")]
-    pub data_dir: Option<PathBuf>,
+    let order_resp: AcmeOrderResponse = response
+        .json()
+        .await
+        .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?;

-    /// Overwrite existing certificate for this domain
-    #[arg(long, default_value = "false")]
-    pub force: bool,
+    if cmd.json {
+        println!(
+            "{}",
+            serde_json::to_string_pretty(&order_resp)
+                .map_err(|e| anyhow::anyhow!("Failed to serialize: {}", e))?
+ ); + return Ok(()); + } + + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!( + "{}", + " ACME Order Details" + .bright_white() + .bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!(); + + println!( + " {} {}", + "Order ID:".bright_white(), + order_resp.id.to_string().bright_cyan() + ); + println!( + " {} {}", + "Domain ID:".bright_white(), + order_resp.domain_id.to_string().bright_cyan() + ); + println!( + " {} {}", + "Email:".bright_white(), + order_resp.email.bright_cyan() + ); + + let status_colored = match order_resp.status.as_str() { + "valid" | "ready" => order_resp.status.bright_green(), + "pending" | "processing" => order_resp.status.bright_yellow(), + "invalid" | "expired" | "deactivated" | "revoked" => order_resp.status.bright_red(), + _ => order_resp.status.normal(), + }; + println!(" {} {}", "Status:".bright_white(), status_colored); + + println!( + " {} {}", + "Order URL:".bright_white(), + order_resp.order_url.bright_white() + ); + println!( + " {} {}", + "Created:".bright_white(), + format_millis_timestamp(order_resp.created_at).bright_cyan() + ); + + if let Some(expires) = order_resp.expires_at { + println!( + " {} {}", + "Expires:".bright_white(), + format_millis_timestamp(expires).bright_cyan() + ); + } + + if let Some(ref err) = order_resp.error { + println!(" {} {}", "Error:".bright_white(), err.bright_red()); + } + + // Show challenge validation status + if let Some(ref validation) = order_resp.challenge_validation { + println!(); + println!(" {}", "Challenge Validation:".bright_white().bold()); + + let validation_status = match validation.status.as_str() { + "valid" => validation.status.bright_green(), + "pending" => validation.status.bright_yellow(), + "invalid" => validation.status.bright_red(), + _ => validation.status.normal(), + }; + + println!( + " {} {}", + "Type:".bright_white(), + 
validation.challenge_type.bright_cyan() + ); + println!(" {} {}", "Status:".bright_white(), validation_status); + println!( + " {} {}", + "Token:".bright_white(), + validation.token.bright_white() + ); + + if let Some(ref validated) = validation.validated { + println!( + " {} {}", + "Validated:".bright_white(), + validated.bright_green() + ); + } + + if let Some(ref err) = validation.error { + println!( + " {} {} ({})", + "Error:".bright_white(), + err.detail.bright_red(), + err.error_type + ); + } + } + + println!(); + Ok(()) } -/// List all domains and their certificate status -#[derive(Args)] -pub struct ListDomainsCommand { - /// Database URL - #[arg(long, env = "TEMPS_DATABASE_URL")] - pub database_url: String, +// ======================================== +// Order: Cancel +// ======================================== + +async fn execute_order_cancel(cmd: OrderCancelCommand) -> anyhow::Result<()> { + if !cmd.yes { + println!( + "{} Are you sure you want to cancel the order for domain ID {}? 
Use --yes to confirm.", + "⚠".bright_yellow(), + cmd.domain_id + ); + return Ok(()); + } + + let client = reqwest::Client::new(); + let url = api_url(&cmd.api_url, &format!("/domains/{}/order", cmd.domain_id)); + + println!( + "{} Cancelling ACME order for domain ID {}...", + "→".bright_blue(), + cmd.domain_id + ); + + let response = client + .delete(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + } + + let domain_resp: DomainResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + println!( + " {} Order cancelled for domain '{}'", + "✓".bright_green(), + domain_resp.domain.bright_cyan() + ); + println!( + " {} {}", + "Status:".bright_white(), + domain_resp.status.bright_yellow() + ); + + println!(); + Ok(()) } -impl DomainCommand { - pub fn execute(self) -> anyhow::Result<()> { - let rt = tokio::runtime::Runtime::new()?; +// ======================================== +// Order: Finalize +// ======================================== - rt.block_on(async { - match self.command { - DomainSubcommand::Import(cmd) => execute_import(cmd).await, - DomainSubcommand::List(cmd) => execute_list(cmd).await, - } - }) +async fn execute_order_finalize(cmd: OrderFinalizeCommand) -> anyhow::Result<()> { + println!(); + println!( + "{} Finalizing ACME order for domain ID {}...", + "→".bright_blue(), + cmd.domain_id + ); + + let client = reqwest::Client::new(); + let url = api_url( + &cmd.api_url, + &format!("/domains/{}/order/finalize", cmd.domain_id), + ); + + let response = client + .post(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + 
} + + let domain_resp: DomainResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + match domain_resp.status.as_str() { + "active" => { + println!(" {} Certificate issued successfully!", "✓".bright_green()); + print_domain_details(&domain_resp); + } + "failed" => { + println!(" {} Certificate issuance failed.", "✗".bright_red()); + print_domain_details(&domain_resp); + println!(); + println!( + " {} You can recreate the order: {}", + "→".bright_blue(), + format!("temps domain order create --domain-id {}", cmd.domain_id).bright_white() + ); + } + _ => { + println!( + " {} Order finalized. Current status: {}", + "ℹ".bright_blue(), + domain_resp.status.bright_yellow() + ); + print_domain_details(&domain_resp); + } + } + + println!(); + Ok(()) +} + +// ======================================== +// Order: List +// ======================================== + +async fn execute_order_list(cmd: OrderListCommand) -> anyhow::Result<()> { + let client = reqwest::Client::new(); + let url = api_url(&cmd.api_url, "/orders"); + + let response = client + .get(&url) + .header("Authorization", format!("Bearer {}", cmd.api_token)) + .send() + .await + .map_err(|e| anyhow::anyhow!("Failed to connect to API: {}", e))?; + + if !response.status().is_success() { + return Err(handle_api_error(response).await); + } + + let list_resp: ListOrdersResponse = response + .json() + .await + .map_err(|e| anyhow::anyhow!("Failed to parse response: {}", e))?; + + if cmd.json { + println!( + "{}", + serde_json::to_string_pretty(&list_resp.orders) + .map_err(|e| anyhow::anyhow!("Failed to serialize: {}", e))? 
+ ); + return Ok(()); + } + + println!(); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!( + "{}", + " ACME Orders".bright_white().bold() + ); + println!( + "{}", + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_cyan() + ); + println!(); + + if list_resp.orders.is_empty() { + println!(" {} No ACME orders found.", "ℹ".bright_blue()); + println!(); + return Ok(()); + } + + println!( + " {:<6} {:<12} {:<15} {:<30} {:<20}", + "ID".bright_white().bold(), + "DOMAIN ID".bright_white().bold(), + "STATUS".bright_white().bold(), + "EMAIL".bright_white().bold(), + "CREATED".bright_white().bold() + ); + println!(" {}", "─".repeat(85)); + + for order in &list_resp.orders { + let status_colored = match order.status.as_str() { + "valid" | "ready" => order.status.bright_green(), + "pending" | "processing" => order.status.bright_yellow(), + "invalid" | "expired" | "deactivated" | "revoked" => order.status.bright_red(), + _ => order.status.normal(), + }; + + println!( + " {:<6} {:<12} {:<15} {:<30} {:<20}", + order.id.to_string().bright_white(), + order.domain_id.to_string().bright_cyan(), + status_colored, + order.email, + format_millis_date(order.created_at) + ); } + + println!(); + Ok(()) } +// ======================================== +// Import certificate (direct database access) +// ======================================== + async fn execute_import(cmd: ImportCertificateCommand) -> anyhow::Result<()> { println!(); println!( "{}", - "═══════════════════════════════════════════════════════════════".bright_blue() + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_blue() ); println!( "{}", - " Import Custom Certificate " + " Import Custom Certificate" .bright_blue() .bold() ); println!( "{}", - "═══════════════════════════════════════════════════════════════".bright_blue() + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_blue() ); println!(); // Get data 
directory let data_dir = get_data_dir(&cmd.data_dir)?; - debug!("Using data directory: {}", data_dir.display()); // Load encryption key let encryption_key = load_encryption_key(&data_dir)?; @@ -206,12 +1579,12 @@ async fn execute_import(cmd: ImportCertificateCommand) -> anyhow::Result<()> { println!(); println!( "{}", - "═══════════════════════════════════════════════════════════════".bright_green() + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_green() ); println!("{} Certificate imported successfully!", "✓".bright_green()); println!( "{}", - "═══════════════════════════════════════════════════════════════".bright_green() + "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━".bright_green() ); println!(); println!( @@ -243,78 +1616,9 @@ async fn execute_import(cmd: ImportCertificateCommand) -> anyhow::Result<()> { Ok(()) } -async fn execute_list(cmd: ListDomainsCommand) -> anyhow::Result<()> { - println!(); - println!( - "{}", - "═══════════════════════════════════════════════════════════════".bright_blue() - ); - println!( - "{}", - " Domain Certificates " - .bright_blue() - .bold() - ); - println!( - "{}", - "═══════════════════════════════════════════════════════════════".bright_blue() - ); - println!(); - - // Connect to database - let db = establish_connection(&cmd.database_url).await?; - - // List all domains - let domains_list = domains::Entity::find().all(db.as_ref()).await?; - - if domains_list.is_empty() { - println!(" {} No domains configured.", "ℹ".bright_blue()); - println!(); - return Ok(()); - } - - println!( - " {:<40} {:<15} {:<12} {:<20}", - "DOMAIN".bright_white().bold(), - "STATUS".bright_white().bold(), - "TYPE".bright_white().bold(), - "EXPIRES".bright_white().bold() - ); - println!(" {}", "─".repeat(90)); - - for domain in domains_list { - let status_colored = match domain.status.as_str() { - "active" => domain.status.bright_green(), - "pending" | "pending_dns" | "pending_validation" | "pending_http" => { - 
domain.status.bright_yellow()
-            }
-            "failed" | "expired" => domain.status.bright_red(),
-            _ => domain.status.normal(),
-        };
-
-        let domain_type = if domain.is_wildcard {
-            "wildcard"
-        } else {
-            "single"
-        };
-
-        let expiration = domain
-            .expiration_time
-            .map(|t| t.format("%Y-%m-%d").to_string())
-            .unwrap_or_else(|| "N/A".to_string());
-
-        println!(
-            " {:<40} {:<15} {:<12} {:<20}",
-            domain.domain.bright_cyan(),
-            status_colored,
-            domain_type,
-            expiration
-        );
-    }
-
-    println!();
-    Ok(())
-}
+// ========================================
+// Certificate validation helpers
+// ========================================

 fn get_data_dir(data_dir: &Option<PathBuf>) -> anyhow::Result<PathBuf> {
     if let Some(dir) = data_dir {
@@ -504,4 +1808,57 @@ MIIBkTCB+wIJAKHBfpeg...
         assert!(validate_private_key(wrong_type).is_err());
     }
+
+    #[test]
+    fn test_challenge_type_display() {
+        assert_eq!(ChallengeType::Http01.to_string(), "http-01");
+        assert_eq!(ChallengeType::Dns01.to_string(), "dns-01");
+    }
+
+    #[test]
+    fn test_wildcard_requires_dns01() {
+        // Simulating the validation logic from execute_add
+        let domain = "*.example.com";
+        let is_wildcard = domain.starts_with("*.");
+        assert!(is_wildcard);
+
+        // HTTP-01 should be rejected for wildcards
+        let challenge = ChallengeType::Http01;
+        let should_reject = is_wildcard && matches!(challenge, ChallengeType::Http01);
+        assert!(should_reject);
+
+        // DNS-01 should be accepted for wildcards
+        let challenge = ChallengeType::Dns01;
+        let should_reject = is_wildcard && matches!(challenge, ChallengeType::Http01);
+        assert!(!should_reject);
+        let _ = challenge;
+    }
+
+    #[test]
+    fn test_format_millis_timestamp() {
+        let millis = 1700000000000_i64; // 2023-11-14
+        let formatted = format_millis_timestamp(millis);
+        assert!(formatted.contains("2023"));
+        assert!(formatted.contains("UTC"));
+    }
+
+    #[test]
+    fn test_format_millis_date() {
+        let millis = 1700000000000_i64;
+        let formatted = format_millis_date(millis);
+        assert!(formatted.contains("2023"));
+        assert!(!formatted.contains("UTC"));
+    }
+
+    #[test]
+    fn test_api_url() {
+        assert_eq!(
+            api_url("http://localhost:3000", "/domains"),
+            "http://localhost:3000/domains"
+        );
+        assert_eq!(
+            api_url("http://localhost:3000/", "/domains"),
+            "http://localhost:3000/domains"
+        );
+    }
 }
diff --git a/crates/temps-deployments/src/handlers/deployments.rs b/crates/temps-deployments/src/handlers/deployments.rs
index 0888edcf..0d296f31 100644
--- a/crates/temps-deployments/src/handlers/deployments.rs
+++ b/crates/temps-deployments/src/handlers/deployments.rs
@@ -710,7 +710,8 @@ pub async fn list_containers(
        ("start_date" = Option<String>, Query, description = "Start date for logs"),
        ("end_date" = Option<String>, Query, description = "End date for logs"),
        ("tail" = Option<String>, Query, description = "Number of lines to tail (or 'all')"),
-       ("timestamps" = Option<bool>, Query, description = "Include timestamps in log output (default: false)")
+       ("timestamps" = Option<bool>, Query, description = "Include timestamps in log output (default: false)"),
+       ("follow" = Option<bool>, Query, description = "Follow log output in real-time (default: true)")
    ),
    responses(
        (status = 101, description = "WebSocket connection established for streaming container logs"),
@@ -747,6 +748,7 @@ pub async fn get_container_logs_by_id(
                end_date: query.end_date,
                tail: query.tail,
                timestamps: query.timestamps,
+               follow: query.follow,
            },
        )
    }))
@@ -761,6 +763,7 @@ struct ContainerLogParams {
    end_date: Option<String>,
    tail: Option<String>,
    timestamps: bool,
+   follow: bool,
 }

 async fn handle_container_logs_socket(
@@ -785,6 +788,7 @@ async fn handle_container_logs_socket(
            end_date: params.end_date,
            tail: params.tail,
            timestamps: params.timestamps,
+           follow: params.follow,
        },
    )
    .await
@@ -851,7 +855,8 @@
        ("end_date" = Option<String>, Query, description = "End date for logs"),
        ("tail" = Option<String>, Query, description = "Number of lines to tail (or 'all')"),
        ("container_name" = Option<String>, Query, description = "Optional container name (defaults to first/primary container)"),
-       ("timestamps" = Option<bool>, Query, description = "Include timestamps in log output (default: false)")
+       ("timestamps" = Option<bool>, Query, description = "Include timestamps in log output (default: false)"),
+       ("follow" = Option<bool>, Query, description = "Follow log output in real-time (default: true)")
    ),
    responses(
        (status = 101, description = "WebSocket connection established for streaming container logs"),
@@ -888,6 +893,7 @@ pub async fn get_container_logs(
                tail: query.tail,
                container_name: query.container_name,
                timestamps: query.timestamps,
+               follow: query.follow,
            },
        )
    }))
@@ -902,6 +908,7 @@ struct FilteredContainerLogParams {
    tail: Option<String>,
    container_name: Option<String>,
    timestamps: bool,
+   follow: bool,
 }

 async fn handle_filtered_container_logs_socket(
@@ -926,6 +933,7 @@ async fn handle_filtered_container_logs_socket(
            end_date: params.end_date,
            tail: params.tail,
            timestamps: params.timestamps,
+           follow: params.follow,
        },
    )
    .await
diff --git a/crates/temps-deployments/src/handlers/types.rs b/crates/temps-deployments/src/handlers/types.rs
index 6d13f34e..51424be3 100644
--- a/crates/temps-deployments/src/handlers/types.rs
+++ b/crates/temps-deployments/src/handlers/types.rs
@@ -379,12 +379,19 @@ pub struct ContainerLogsQuery {
     /// Include timestamps in log output (default: false)
     #[serde(default = "default_timestamps")]
     pub timestamps: bool,
+    /// Follow log output in real-time (default: true for backward compatibility)
+    #[serde(default = "default_follow")]
+    pub follow: bool,
 }

 fn default_timestamps() -> bool {
     false
 }

+fn default_follow() -> bool {
+    true
+}
+
 #[derive(Deserialize, ToSchema)]
 pub struct JobLogsQuery {
     pub lines: Option,
diff --git a/crates/temps-deployments/src/jobs/deploy_image.rs b/crates/temps-deployments/src/jobs/deploy_image.rs
index 543efb71..4828a3bd 100644
--- a/crates/temps-deployments/src/jobs/deploy_image.rs
+++ b/crates/temps-deployments/src/jobs/deploy_image.rs
@@ -296,8 +296,9 @@ impl DeployImageJob {
     fn find_available_port() -> Result<u16, WorkflowError> {
         use std::net::TcpListener;

-        // Try to bind to port 0, which tells the OS to assign an available port
-        let listener = TcpListener::bind("127.0.0.1:0")
+        // Bind to 0.0.0.0:0 to match Docker's binding address and avoid port collisions
+        // where a port appears free on 127.0.0.1 but is occupied on 0.0.0.0
+        let listener = TcpListener::bind("0.0.0.0:0")
             .map_err(|e| WorkflowError::Other(format!("Failed to find available port: {}", e)))?;

         let port = listener
diff --git a/crates/temps-deployments/src/services/services.rs b/crates/temps-deployments/src/services/services.rs
index 53c4942d..58463bcd 100644
--- a/crates/temps-deployments/src/services/services.rs
+++ b/crates/temps-deployments/src/services/services.rs
@@ -22,6 +22,7 @@ pub struct ContainerLogParams {
     pub end_date: Option<String>,
     pub tail: Option<String>,
     pub timestamps: bool,
+    pub follow: bool,
 }

 #[derive(Error, Debug)]
@@ -158,6 +159,7 @@ impl DeploymentService {
                }),
                tail: params.tail,
                timestamps: params.timestamps,
+               follow: params.follow,
            },
        )
        .await
@@ -232,6 +234,7 @@ impl DeploymentService {
                }),
                tail: params.tail,
                timestamps: params.timestamps,
+               follow: params.follow,
            },
        )
        .await
@@ -604,7 +607,9 @@ impl DeploymentService {
         project_id: i32,
         deployment_id: i32,
     ) -> Result {
-        // Fetch the target deployment
+        use temps_entities::deployments::DeploymentMetadata;
+
+        // Fetch the target deployment (the one we're rolling back TO)
         let target_deployment = deployments::Entity::find_by_id(deployment_id)
             .filter(deployments::Column::ProjectId.eq(project_id))
             .one(self.db.as_ref())
@@ -640,28 +645,93 @@
             .ok_or_else(|| DeploymentError::NotFound("Environment not found".to_string()))?;

         info!(
-            "Initiating rollback for project_id: {}, deployment_id: {}, image: {}, environment_id: {}",
+            "Initiating rollback for project_id: {}, to deployment_id: {}, image: {}, environment_id: {}",
             project_id, deployment_id, image_name,
environment_id ); + let preset = temps_presets::get_preset_by_slug(project.preset.as_str()) .ok_or_else(|| DeploymentError::NotFound("Preset not found".to_string()))?; + + // --- Create a NEW deployment record for the rollback --- + // This gives us fresh timestamps, a unique slug, and proper tracking. + let now = chrono::Utc::now(); + + // Get next deployment number + let deployment_count = deployments::Entity::find() + .filter(deployments::Column::ProjectId.eq(project_id)) + .count(self.db.as_ref()) + .await + .map_err(|e| DeploymentError::Other(format!("Failed to count deployments: {}", e)))?; + let deployment_number = deployment_count + 1; + + let rollback_slug = format!("{}-{}", project.slug, deployment_number); + + let rollback_metadata = DeploymentMetadata { + is_rollback: true, + rolled_back_from_id: Some(deployment_id), + ..Default::default() + }; + + let new_deployment = deployments::ActiveModel { + id: sea_orm::NotSet, + project_id: Set(project_id), + environment_id: Set(environment_id), + slug: Set(rollback_slug.clone()), + state: Set("running".to_string()), + metadata: Set(Some(rollback_metadata)), + branch_ref: Set(target_deployment.branch_ref.clone()), + tag_ref: Set(target_deployment.tag_ref.clone()), + commit_sha: Set(target_deployment.commit_sha.clone()), + commit_message: Set(target_deployment.commit_message.clone()), + commit_author: Set(target_deployment.commit_author.clone()), + commit_json: Set(target_deployment.commit_json.clone()), + image_name: Set(Some(image_name.clone())), + started_at: Set(Some(now)), + finished_at: Set(None), + deploying_at: Set(Some(now)), + ready_at: Set(None), + static_dir_location: Set(target_deployment.static_dir_location.clone()), + screenshot_location: Set(None), + cancelled_reason: Set(None), + context_vars: Set(Some(serde_json::json!({ + "trigger": "rollback", + "source_deployment_id": deployment_id, + }))), + deployment_config: Set(target_deployment.deployment_config.clone()), + created_at: Set(now), + 
updated_at: Set(now), + }; + + let rollback_deployment = new_deployment.insert(self.db.as_ref()).await.map_err(|e| { + DeploymentError::Other(format!("Failed to create rollback deployment: {}", e)) + })?; + + let rollback_deployment_id = rollback_deployment.id; + info!( + "Created rollback deployment #{} (rolling back to #{}, image: {})", + rollback_deployment_id, deployment_id, image_name + ); + // Check if preset is static - if so, just update environment without deploying if preset.project_type() == temps_presets::ProjectType::Static { info!("Rollback: Static preset detected - updating environment only"); - // For static deployments, just point environment to the previous deployment let mut active_env: environments::ActiveModel = environment.into(); - active_env.current_deployment_id = Set(Some(deployment_id)); + active_env.current_deployment_id = Set(Some(rollback_deployment_id)); active_env.update(self.db.as_ref()).await?; + // Mark the rollback deployment as completed + let mut active_dep: deployments::ActiveModel = rollback_deployment.clone().into(); + active_dep.state = Set("completed".to_string()); + active_dep.finished_at = Set(Some(chrono::Utc::now())); + active_dep.update(self.db.as_ref()).await?; + info!( - "Rollback completed - environment {} now points to deployment {}", - environment_id, deployment_id + "Rollback completed - environment {} now points to rollback deployment {}", + environment_id, rollback_deployment_id ); } else { - // Pre-flight check: verify the Docker image still exists locally before - // attempting rollback. Without this check, rollback would fail mid-way - // (during container creation) after potentially disrupting the current deployment. 
+ // Pre-flight check: verify the Docker image still exists locally match self.deployer.image_exists(&image_name).await { Ok(true) => { info!( @@ -670,6 +740,17 @@ impl DeploymentService { ); } Ok(false) => { + // Mark the rollback deployment as failed + let mut active_dep: deployments::ActiveModel = + rollback_deployment.clone().into(); + active_dep.state = Set("failed".to_string()); + active_dep.finished_at = Set(Some(chrono::Utc::now())); + active_dep.cancelled_reason = Set(Some(format!( + "Docker image '{}' no longer exists locally", + image_name + ))); + let _ = active_dep.update(self.db.as_ref()).await; + return Err(DeploymentError::Other(format!( "Cannot rollback: Docker image '{}' no longer exists locally. \ The image may have been removed by Docker pruning. \ @@ -685,37 +766,115 @@ impl DeploymentService { } } - // Rollback workflow for non-static presets: - // 1. Deploy the image using DeployImageJob (will deploy/redeploy containers as needed) - // 2. Mark deployment as complete using MarkDeploymentCompleteJob (will stop previous deployments) - - // Create log path for rollback execution - let rollback_log_id = format!( - "rollback-{}-{}", - deployment_id, - chrono::Utc::now().timestamp() + // --- Create per-job log paths (matching normal deployment pattern) --- + let deploy_log_id = format!( + "{}/{}/{}/{:02}/{:02}/{:02}/{:02}/deployment-{}-job-deploy_container.log", + project.slug, + environment.slug, + now.format("%Y"), + now.format("%m"), + now.format("%d"), + now.format("%H"), + now.format("%M"), + rollback_deployment_id ); + let complete_log_id = format!( + "{}/{}/{}/{:02}/{:02}/{:02}/{:02}/deployment-{}-job-mark_deployment_complete.log", + project.slug, + environment.slug, + now.format("%Y"), + now.format("%m"), + now.format("%d"), + now.format("%H"), + now.format("%M"), + rollback_deployment_id + ); + self.log_service - .create_log_path(&rollback_log_id) + .create_log_path(&deploy_log_id) .await .map_err(|e| { - 
DeploymentError::Other(format!("Failed to create log path for rollback: {}", e)) + DeploymentError::Other(format!("Failed to create deploy log path: {}", e)) })?; + self.log_service + .create_log_path(&complete_log_id) + .await + .map_err(|e| { + DeploymentError::Other(format!("Failed to create complete log path: {}", e)) + })?; + + // --- Create deployment_jobs records so the API can return them --- + use temps_entities::{deployment_jobs, types::JobStatus}; + + let deploy_job_record = deployment_jobs::ActiveModel { + deployment_id: Set(rollback_deployment_id), + job_id: Set("deploy_container".to_string()), + job_type: Set("DeployImageJob".to_string()), + name: Set("Deploy Container".to_string()), + description: Set(Some(format!("Rollback: deploy image {}", image_name))), + status: Set(JobStatus::Running), + log_id: Set(deploy_log_id.clone()), + job_config: Set(None), + dependencies: Set(None), + execution_order: Set(Some(0)), + started_at: Set(Some(now)), + ..Default::default() + }; + let deploy_job_model = + deploy_job_record + .insert(self.db.as_ref()) + .await + .map_err(|e| { + DeploymentError::Other(format!("Failed to create deploy job record: {}", e)) + })?; + + let complete_job_record = deployment_jobs::ActiveModel { + deployment_id: Set(rollback_deployment_id), + job_id: Set("mark_deployment_complete".to_string()), + job_type: Set("MarkDeploymentCompleteJob".to_string()), + name: Set("Mark Deployment Complete".to_string()), + description: Set(Some("Finalize rollback deployment".to_string())), + status: Set(JobStatus::Pending), + log_id: Set(complete_log_id.clone()), + job_config: Set(None), + dependencies: Set(Some( + serde_json::to_value(vec!["deploy_container"]).unwrap_or_default(), + )), + execution_order: Set(Some(1)), + ..Default::default() + }; + let complete_job_model = + complete_job_record + .insert(self.db.as_ref()) + .await + .map_err(|e| { + DeploymentError::Other(format!( + "Failed to create complete job record: {}", + e + )) + })?; + + // --- 
Step 0: Stop current environment containers BEFORE deploying --- + // This prevents port conflicts where the old container still holds a port. + info!( + "Rollback: Stopping current containers for environment {}", + environment_id + ); + self.stop_environment_containers(environment_id, rollback_deployment_id) + .await; info!("Rollback: Deploying image: {}", image_name); // Step 1: Execute DeployImageJob with external image - // CRITICAL: Use "deploy_container" as job_id so MarkDeploymentCompleteJob can find the outputs - // Skip HTTP health checks for rollbacks: the image was already deployed and - // verified healthy before, so we only need to confirm the container starts. + // Use the NEW rollback slug as the container name (not the old deployment's slug) let mut deploy_builder = crate::jobs::DeployImageJobBuilder::new() .job_id("deploy_container".to_string()) - .build_job_id("external-image".to_string()) // Placeholder, will use external image instead + .build_job_id("external-image".to_string()) .target(crate::jobs::DeploymentTarget::Docker { registry_url: "local".to_string(), network: Some(temps_core::NETWORK_NAME.to_string()), }) - .service_name(target_deployment.slug.clone()) + .service_name(rollback_slug.clone()) .health_check_path(None) .replicas( environment @@ -743,7 +902,7 @@ impl DeploymentService { }) .unwrap_or(3000) as u32, ) - .log_id(rollback_log_id.clone()) + .log_id(deploy_log_id.clone()) .log_service(self.log_service.clone()); // Apply container log rotation settings from config @@ -758,13 +917,13 @@ impl DeploymentService { let deploy_job = deploy_builder .build(self.deployer.clone()) .map_err(|e| DeploymentError::Other(format!("Failed to create deploy job: {}", e)))? 
- .with_external_image_tag(image_name.clone()); // Use existing image without rebuild + .with_external_image_tag(image_name.clone()); - // Create workflow context for execution with a mock log writer + // Create workflow context for the NEW rollback deployment let mock_log_writer = Arc::new(crate::test_utils::MockLogWriter::new(0)); let mut rollback_context = temps_core::WorkflowContext::new( - format!("rollback-{}", deployment_id), - deployment_id, + format!("rollback-{}", rollback_deployment_id), + rollback_deployment_id, project_id, environment_id, mock_log_writer, @@ -774,9 +933,38 @@ impl DeploymentService { Ok(job_result) => { info!("Rollback: Deploy job completed successfully"); rollback_context = job_result.context; + + // Update deploy job record to Success + let mut active_job: deployment_jobs::ActiveModel = deploy_job_model.into(); + active_job.status = Set(JobStatus::Success); + active_job.finished_at = Set(Some(chrono::Utc::now())); + let _ = active_job.update(self.db.as_ref()).await; } Err(e) => { error!("Rollback: Deploy job failed: {}", e); + + // Update deploy job record to Failure + let mut active_job: deployment_jobs::ActiveModel = deploy_job_model.into(); + active_job.status = Set(JobStatus::Failure); + active_job.finished_at = Set(Some(chrono::Utc::now())); + active_job.error_message = Set(Some(format!("Deploy failed: {}", e))); + let _ = active_job.update(self.db.as_ref()).await; + + // Cancel the pending complete job + let mut active_complete: deployment_jobs::ActiveModel = + complete_job_model.into(); + active_complete.status = Set(JobStatus::Cancelled); + active_complete.error_message = Set(Some("Deploy job failed".to_string())); + let _ = active_complete.update(self.db.as_ref()).await; + + // Mark the rollback deployment as failed + let mut active_dep: deployments::ActiveModel = + rollback_deployment.clone().into(); + active_dep.state = Set("failed".to_string()); + active_dep.finished_at = Set(Some(chrono::Utc::now())); + 
active_dep.cancelled_reason = Set(Some(format!("Deploy failed: {}", e))); + let _ = active_dep.update(self.db.as_ref()).await; + return Err(DeploymentError::Other(format!( "Failed to deploy image during rollback: {}", e @@ -784,14 +972,32 @@ impl DeploymentService { } } - // Step 2: Execute MarkDeploymentCompleteJob - info!("Rollback: Marking deployment {} as complete", deployment_id); + // Step 2: Execute MarkDeploymentCompleteJob on the NEW rollback deployment + info!( + "Rollback: Marking deployment {} as complete", + rollback_deployment_id + ); + + // Update complete job to Running + let mut active_complete: deployment_jobs::ActiveModel = complete_job_model.into(); + active_complete.status = Set(JobStatus::Running); + active_complete.started_at = Set(Some(chrono::Utc::now())); + let complete_job_model = + active_complete + .update(self.db.as_ref()) + .await + .map_err(|e| { + DeploymentError::Other(format!( + "Failed to update complete job status: {}", + e + )) + })?; let mark_complete_job = crate::jobs::MarkDeploymentCompleteJobBuilder::new() - .job_id(format!("rollback-mark-complete-{}", deployment_id)) - .deployment_id(deployment_id) + .job_id("mark_deployment_complete".to_string()) + .deployment_id(rollback_deployment_id) .db(self.db.clone()) - .log_id(rollback_log_id) + .log_id(complete_log_id) .log_service(self.log_service.clone()) .container_deployer(self.deployer.clone()) .queue(self.queue_service.clone()) @@ -803,9 +1009,23 @@ impl DeploymentService { match mark_complete_job.execute(rollback_context).await { Ok(_) => { info!("Rollback: Mark complete job executed successfully"); + + // Update complete job record to Success + let mut active_job: deployment_jobs::ActiveModel = complete_job_model.into(); + active_job.status = Set(JobStatus::Success); + active_job.finished_at = Set(Some(chrono::Utc::now())); + let _ = active_job.update(self.db.as_ref()).await; } Err(e) => { error!("Rollback: Mark complete job failed: {}", e); + + // Update complete job 
record to Failure + let mut active_job: deployment_jobs::ActiveModel = complete_job_model.into(); + active_job.status = Set(JobStatus::Failure); + active_job.finished_at = Set(Some(chrono::Utc::now())); + active_job.error_message = Set(Some(format!("Mark complete failed: {}", e))); + let _ = active_job.update(self.db.as_ref()).await; + return Err(DeploymentError::Other(format!( "Failed to mark deployment complete during rollback: {}", e @@ -815,15 +1035,87 @@ impl DeploymentService { info!( "Rollback completed - deployment {} is now active", - deployment_id + rollback_deployment_id ); } + // Re-fetch the rollback deployment to get the final state + let final_deployment = deployments::Entity::find_by_id(rollback_deployment_id) + .one(self.db.as_ref()) + .await? + .ok_or_else(|| DeploymentError::Other("Rollback deployment disappeared".to_string()))?; + Ok(self - .map_db_deployment_to_deployment(target_deployment, true, None) + .map_db_deployment_to_deployment(final_deployment, true, None) .await) } + /// Stop all running containers for an environment (used before rollback deploys) + async fn stop_environment_containers(&self, environment_id: i32, exclude_deployment_id: i32) { + // Find all active deployments for this environment + let active_deployments = match deployments::Entity::find() + .filter(deployments::Column::EnvironmentId.eq(environment_id)) + .filter(deployments::Column::Id.ne(exclude_deployment_id)) + .filter(deployments::Column::State.is_in(vec!["running", "completed", "deployed"])) + .all(self.db.as_ref()) + .await + { + Ok(deps) => deps, + Err(e) => { + warn!( + "Failed to fetch active deployments for pre-rollback cleanup: {}", + e + ); + return; + } + }; + + for dep in &active_deployments { + let containers = match deployment_containers::Entity::find() + .filter(deployment_containers::Column::DeploymentId.eq(dep.id)) + .filter(deployment_containers::Column::DeletedAt.is_null()) + .all(self.db.as_ref()) + .await + { + Ok(c) => c, + Err(e) => { + 
warn!( + "Failed to fetch containers for deployment {}: {}", + dep.id, e + ); + continue; + } + }; + + for container in containers { + let container_id = container.container_id.clone(); + if let Err(e) = self.deployer.stop_container(&container_id).await { + warn!( + "Failed to stop container {} during pre-rollback cleanup: {}", + container_id, e + ); + } + if let Err(e) = self.deployer.remove_container(&container_id).await { + warn!( + "Failed to remove container {} during pre-rollback cleanup: {}", + container_id, e + ); + } + + // Mark container as deleted + let mut active_container: deployment_containers::ActiveModel = container.into(); + active_container.deleted_at = Set(Some(chrono::Utc::now())); + active_container.status = Set(Some("removed".to_string())); + let _ = active_container.update(self.db.as_ref()).await; + + info!( + "Pre-rollback: stopped and removed container {}", + container_id + ); + } + } + } + /// Tears down a specific deployment, removing containers and cleaning up resources pub async fn teardown_deployment( &self, @@ -2473,19 +2765,26 @@ mod tests { .rollback_to_deployment(target_deployment.project_id, target_deployment.id) .await?; - // Verify result - assert_eq!(result.id, target_deployment.id); + // Verify result - rollback now creates a NEW deployment record + // The returned deployment ID should be different from the target (it's the new rollback deployment) + assert_ne!(result.id, target_deployment.id); assert!(result.is_current); - // Verify environment was updated to point to target deployment + // Verify the new rollback deployment has the correct metadata + let rollback_dep = deployments::Entity::find_by_id(result.id) + .one(db.as_ref()) + .await? 
+ .unwrap(); + let metadata = rollback_dep.metadata.unwrap(); + assert!(metadata.is_rollback); + assert_eq!(metadata.rolled_back_from_id, Some(target_deployment.id)); + + // Verify environment was updated to point to the NEW rollback deployment let updated_environment = environments::Entity::find_by_id(environment.id) .one(db.as_ref()) .await? .unwrap(); - assert_eq!( - updated_environment.current_deployment_id, - Some(target_deployment.id) - ); + assert_eq!(updated_environment.current_deployment_id, Some(result.id)); Ok(()) } @@ -3002,6 +3301,7 @@ mod tests { end_date: None, tail: None, timestamps: false, + follow: false, }, ) .await; @@ -3597,60 +3897,76 @@ mod tests { let deployment_service = create_deployment_service_for_test(db.clone()); // Test 1: Rollback to deployment2 + // Rollback now creates a NEW deployment record with is_rollback metadata println!("Test 1: Rolling back to deployment 2"); - deployment_service + let rollback1 = deployment_service .rollback_to_deployment(project.id, deployment2.id) .await?; - // Verify deployment2 is now current + // Verify the new rollback deployment is now current (not the original deployment2) let updated_env = environments::Entity::find_by_id(environment.id) .one(db.as_ref()) .await? .expect("Environment should exist"); - assert_eq!(updated_env.current_deployment_id, Some(deployment2.id)); + assert_eq!(updated_env.current_deployment_id, Some(rollback1.id)); + // Verify rollback metadata points to the original deployment + let rollback1_dep = deployments::Entity::find_by_id(rollback1.id) + .one(db.as_ref()) + .await? 
+ .unwrap(); + let meta1 = rollback1_dep.metadata.unwrap(); + assert!(meta1.is_rollback); + assert_eq!(meta1.rolled_back_from_id, Some(deployment2.id)); // Test 2: Rollback to deployment1 (containers redeployed) println!("Test 2: Rolling back to deployment 1 (containers redeployed)"); - deployment_service + let rollback2 = deployment_service .rollback_to_deployment(project.id, deployment1.id) .await?; - // Verify deployment1 is now current + // Verify the new rollback deployment is now current let updated_env = environments::Entity::find_by_id(environment.id) .one(db.as_ref()) .await? .expect("Environment should exist"); - assert_eq!(updated_env.current_deployment_id, Some(deployment1.id)); + assert_eq!(updated_env.current_deployment_id, Some(rollback2.id)); + let rollback2_dep = deployments::Entity::find_by_id(rollback2.id) + .one(db.as_ref()) + .await? + .unwrap(); + let meta2 = rollback2_dep.metadata.unwrap(); + assert!(meta2.is_rollback); + assert_eq!(meta2.rolled_back_from_id, Some(deployment1.id)); // Test 3: Verify rollback chain (3 -> 2 -> 1) println!("Test 3: Full rollback chain (3 -> 2 -> 1)"); - deployment_service + let rollback3 = deployment_service .rollback_to_deployment(project.id, deployment3.id) .await?; let updated_env = environments::Entity::find_by_id(environment.id) .one(db.as_ref()) .await? .expect("Environment should exist"); - assert_eq!(updated_env.current_deployment_id, Some(deployment3.id)); + assert_eq!(updated_env.current_deployment_id, Some(rollback3.id)); - deployment_service + let rollback4 = deployment_service .rollback_to_deployment(project.id, deployment2.id) .await?; let updated_env = environments::Entity::find_by_id(environment.id) .one(db.as_ref()) .await? 
.expect("Environment should exist"); - assert_eq!(updated_env.current_deployment_id, Some(deployment2.id)); + assert_eq!(updated_env.current_deployment_id, Some(rollback4.id)); - deployment_service + let rollback5 = deployment_service .rollback_to_deployment(project.id, deployment1.id) .await?; let updated_env = environments::Entity::find_by_id(environment.id) .one(db.as_ref()) .await? .expect("Environment should exist"); - assert_eq!(updated_env.current_deployment_id, Some(deployment1.id)); + assert_eq!(updated_env.current_deployment_id, Some(rollback5.id)); println!("All rollback tests passed!"); Ok(()) diff --git a/crates/temps-domains/src/handlers/domain_handler.rs b/crates/temps-domains/src/handlers/domain_handler.rs index da2202ee..b867b3a8 100644 --- a/crates/temps-domains/src/handlers/domain_handler.rs +++ b/crates/temps-domains/src/handlers/domain_handler.rs @@ -529,13 +529,89 @@ async fn provision_domain( .build()); } + // Look up the domain to check the stored verification method + let domain_model = app_state + .domain_service + .get_domain(&domain) + .await + .map_err(|e| { + error!("Failed to get domain {}: {}", domain, e); + e + })? 
+ .ok_or_else(|| { + ErrorBuilder::new(StatusCode::NOT_FOUND) + .title("Domain not found") + .detail(format!("Domain {} not found", domain)) + .build() + })?; + + // For DNS-01 domains, use request_challenge + complete_challenge flow + if domain_model.verification_method == "dns-01" { + info!( + "Starting DNS-01 challenge provisioning for domain: {} for user: {}", + domain, + auth.user_id() + ); + + // Try to complete the DNS-01 challenge (assumes TXT records are already set) + let result = app_state + .domain_service + .complete_challenge(&domain, user_email) + .await; + + let result = match result { + Ok(certificate) => { + info!( + "Certificate successfully provisioned via DNS-01 for {}", + domain + ); + Ok(( + StatusCode::OK, + Json(ProvisionResponse::Complete(DomainResponse::from( + certificate, + ))), + )) + } + Err(e) => { + error!( + "Failed to provision certificate via DNS-01 for {}: {}", + domain, e + ); + Ok(( + StatusCode::OK, + Json(ProvisionResponse::Error(DomainError { + message: e.to_string(), + code: "PROVISION_FAILED".to_string(), + details: Some("DNS-01 challenge provisioning failed. 
Ensure TXT records are set correctly.".to_string()), + })), + )) + } + }; + + // Audit log + let audit = DomainAudit { + context: AuditContext { + user_id: auth.user_id(), + ip_address: Some(metadata.ip_address.clone()), + user_agent: metadata.user_agent.clone(), + }, + domain: domain.clone(), + action: "DOMAIN_PROVISIONED".to_string(), + }; + if let Err(e) = app_state.audit_service.create_audit_log(&audit).await { + error!("Failed to create audit log: {}", e); + } + + return result; + } + + // HTTP-01 flow (default) info!( - "Starting HTTP challenge provisioning for domain: {} for user: {}", + "Starting HTTP-01 challenge provisioning for domain: {} for user: {}", domain, auth.user_id() ); - // Try to provision the certificate using HTTP-01 challenge let result = match app_state .tls_service .provision_certificate(&domain, user_email) @@ -556,7 +632,6 @@ async fn provision_domain( domain, msg ); - // Return a challenge response that includes HTTP challenge details let challenge_response = DomainChallengeResponse { domain: domain.clone(), txt_records: vec![TxtRecord { diff --git a/crates/temps-entities/src/visitor.rs b/crates/temps-entities/src/visitor.rs index be267b62..e2d48a27 100644 --- a/crates/temps-entities/src/visitor.rs +++ b/crates/temps-entities/src/visitor.rs @@ -19,6 +19,19 @@ pub struct Model { pub custom_data: Option, /// Flag indicating visitor has recorded events/sessions (not a "ghost" visitor) pub has_activity: bool, + // First-visit attribution fields (set once on visitor creation, never overwritten) + /// Full referrer URL from the visitor's first session + pub first_referrer: Option<String>, + /// Hostname extracted from first_referrer + pub first_referrer_hostname: Option<String>, + /// Marketing channel computed from the first visit (e.g. 
"Organic Search", "Direct") + pub first_channel: Option, + /// UTM source parameter from the first visit + pub first_utm_source: Option, + /// UTM medium parameter from the first visit + pub first_utm_medium: Option, + /// UTM campaign parameter from the first visit + pub first_utm_campaign: Option, } #[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)] diff --git a/crates/temps-git/src/handlers/base.rs b/crates/temps-git/src/handlers/base.rs index b61c41dc..f4d06314 100644 --- a/crates/temps-git/src/handlers/base.rs +++ b/crates/temps-git/src/handlers/base.rs @@ -1,6 +1,6 @@ use super::repositories::{ check_commit_exists, get_branches_by_repository_id, get_repository_branches, - get_repository_tags, get_tags_by_repository_id, + get_repository_tags, get_tags_by_repository_id, list_commits_by_repository_id, }; use super::types::GitAppState as AppState; use super::types::GitAppState; @@ -1304,6 +1304,10 @@ pub fn configure_routes() -> axum::Router> { "/repository/{repository_id}/commits/{commit_sha}", get(check_commit_exists), ) + .route( + "/repository/{repository_id}/commits", + get(list_commits_by_repository_id), + ) } // Helper function to parse auth method diff --git a/crates/temps-git/src/handlers/repositories.rs b/crates/temps-git/src/handlers/repositories.rs index 62609c7c..9972640f 100644 --- a/crates/temps-git/src/handlers/repositories.rs +++ b/crates/temps-git/src/handlers/repositories.rs @@ -608,6 +608,139 @@ pub async fn check_commit_exists( })) } +#[derive(Debug, Deserialize, IntoParams)] +pub struct CommitListQueryParams { + /// Branch name to list commits for + pub branch: String, + /// Number of commits to return (default: 20, max: 100) + pub per_page: Option, +} + +#[derive(Debug, Serialize, Deserialize, ToSchema)] +pub struct CommitInfo { + /// Commit SHA hash + pub sha: String, + /// Commit message + pub message: String, + /// Author name + pub author: String, + /// Author email + pub author_email: String, + /// Commit date in ISO 8601 format + 
#[schema(value_type = String, format = DateTime, example = "2025-10-12T12:15:47.609192Z")] + pub date: chrono::DateTime<chrono::Utc>, +} + +#[derive(Debug, Serialize, Deserialize, ToSchema)] +pub struct CommitListResponse { + pub commits: Vec<CommitInfo>, +} + +/// List recent commits for a repository branch +#[utoipa::path( + get, + path = "/repository/{repository_id}/commits", + params( + ("repository_id" = i32, Path, description = "Repository ID"), + CommitListQueryParams + ), + responses( + (status = 200, description = "List of commits", body = CommitListResponse), + (status = 401, description = "Unauthorized"), + (status = 404, description = "Repository not found"), + (status = 500, description = "Internal server error") + ), + tag = "Repositories", + security( + ("bearer_auth" = []) + ) +)] +pub async fn list_commits_by_repository_id( + RequireAuth(auth): RequireAuth, + State(state): State<Arc<AppState>>, + Path(repository_id): Path<i32>, + Query(params): Query<CommitListQueryParams>, +) -> Result<Json<CommitListResponse>, Problem> { + // Check permission + permission_check!(auth, Permission::GitRepositoriesRead); + + // Find the repository by ID + let repository = state + .git_provider_manager + .get_repository_by_id(repository_id) + .await?; + + // Get the connection and provider + let connection_id = repository.git_provider_connection_id; + + let connection = state + .git_provider_manager + .get_connection(connection_id) + .await + .map_err(|e| { + ErrorBuilder::new(StatusCode::INTERNAL_SERVER_ERROR) + .title("Failed to get git provider connection") + .detail(format!("Error: {}", e)) + .build() + })?; + + let provider_service = state + .git_provider_manager + .get_provider_service(connection.provider_id) + .await + .map_err(|e| { + ErrorBuilder::new(StatusCode::INTERNAL_SERVER_ERROR) + .title("Failed to get git provider service") + .detail(format!("Error: {}", e)) + .build() + })?; + + let access_token = state + .git_provider_manager + .get_connection_token(connection_id) + .await + .map_err(|e| { + 
ErrorBuilder::new(StatusCode::INTERNAL_SERVER_ERROR) + .title("Failed to get access token") + .detail(format!("Error: {}", e)) + .build() + })?; + + let per_page = std::cmp::min(params.per_page.unwrap_or(20), 100); + + // Get commits from the git provider + let commits = provider_service + .list_commits( + &access_token, + &repository.owner, + &repository.name, + &params.branch, + per_page, + ) + .await + .map_err(|e| { + ErrorBuilder::new(StatusCode::INTERNAL_SERVER_ERROR) + .title("Failed to fetch commits") + .detail(format!("Error fetching commits from git provider: {}", e)) + .build() + })?; + + let commit_infos: Vec<CommitInfo> = commits + .into_iter() + .map(|commit| CommitInfo { + sha: commit.sha, + message: commit.message, + author: commit.author, + author_email: commit.author_email, + date: commit.date, + }) + .collect(); + + Ok(Json(CommitListResponse { + commits: commit_infos, + })) +} + #[derive(OpenApi)] #[openapi( paths( @@ -615,7 +748,8 @@ pub async fn check_commit_exists( get_repository_tags, get_branches_by_repository_id, get_tags_by_repository_id, - check_commit_exists + check_commit_exists, + list_commits_by_repository_id ), components( schemas( @@ -623,7 +757,9 @@ pub async fn check_commit_exists( BranchListResponse, TagInfo, TagListResponse, - CommitExistsResponse + CommitExistsResponse, + CommitInfo, + CommitListResponse ) ), tags( diff --git a/crates/temps-git/src/services/git_provider.rs b/crates/temps-git/src/services/git_provider.rs index e0b3486f..f09d16ad 100644 --- a/crates/temps-git/src/services/git_provider.rs +++ b/crates/temps-git/src/services/git_provider.rs @@ -458,6 +458,16 @@ pub trait GitProviderService: Send + Sync { commit_sha: &str, ) -> Result; + /// List recent commits for a branch + async fn list_commits( + &self, + access_token: &str, + owner: &str, + repo: &str, + branch: &str, + per_page: u32, + ) -> Result<Vec<Commit>, GitProviderError>; + /// Create a webhook for repository async fn create_webhook( + &self, diff --git 
a/crates/temps-git/src/services/github_provider.rs b/crates/temps-git/src/services/github_provider.rs index 886a73a6..55284633 100644 --- a/crates/temps-git/src/services/github_provider.rs +++ b/crates/temps-git/src/services/github_provider.rs @@ -1267,6 +1267,79 @@ impl GitProviderService for GitHubProvider { } } + async fn list_commits( + &self, + access_token: &str, + owner: &str, + repo: &str, + branch: &str, + per_page: u32, + ) -> Result<Vec<Commit>, GitProviderError> { + let client = self.get_client(); + let headers = self.get_headers(access_token); + + let url = format!( + "{}/repos/{}/{}/commits?sha={}&per_page={}", + self.api_url, owner, repo, branch, per_page + ); + + let response = self + .send_with_retry(|| client.get(&url).headers(headers.clone())) + .await?; + + if !response.status().is_success() { + return Err(GitProviderError::ApiError(format!( + "Failed to list commits: {}", + response.status() + ))); + } + + #[derive(Deserialize)] + struct GitHubCommitItem { + sha: String, + commit: GitHubCommitItemDetails, + } + + #[derive(Deserialize)] + struct GitHubCommitItemDetails { + message: String, + author: Option<GitHubCommitItemAuthor>, + } + + #[derive(Deserialize)] + struct GitHubCommitItemAuthor { + name: Option<String>, + email: Option<String>, + date: Option<String>, + } + + let items: Vec<GitHubCommitItem> = response + .json() + .await + .map_err(|e| GitProviderError::ApiError(e.to_string()))?; + + let commits = items + .into_iter() + .map(|item| { + let author = item.commit.author.as_ref(); + let date_str = author.and_then(|a| a.date.as_deref()).unwrap_or(""); + let date = chrono::DateTime::parse_from_rfc3339(date_str) + .map(|dt| dt.into()) + .unwrap_or_else(|_| chrono::Utc::now()); + + Commit { + sha: item.sha, + message: item.commit.message, + author: author.and_then(|a| a.name.clone()).unwrap_or_default(), + author_email: author.and_then(|a| a.email.clone()).unwrap_or_default(), + date, + } + }) + .collect(); + + Ok(commits) + } + async fn download_archive( + &self, + access_token: &str, diff --git 
a/crates/temps-git/src/services/gitlab_provider.rs b/crates/temps-git/src/services/gitlab_provider.rs index 22766640..88e69d30 100644 --- a/crates/temps-git/src/services/gitlab_provider.rs +++ b/crates/temps-git/src/services/gitlab_provider.rs @@ -1069,6 +1069,60 @@ impl GitProviderService for GitLabProvider { } } + async fn list_commits( + &self, + access_token: &str, + owner: &str, + repo: &str, + branch: &str, + per_page: u32, + ) -> Result<Vec<Commit>, GitProviderError> { + let client = self.get_client(); + let headers = self.get_headers(access_token); + + let project_path = format!("{}/{}", owner, repo); + let encoded_path = urlencoding::encode(&project_path); + let url = format!( + "{}/api/v4/projects/{}/repository/commits?ref_name={}&per_page={}", + self.base_url, encoded_path, branch, per_page + ); + + let response = self + .send_with_retry(|| client.get(&url).headers(headers.clone())) + .await?; + + if !response.status().is_success() { + return Err(GitProviderError::ApiError(format!( + "Failed to list commits: {}", + response.status() + ))); + } + + let items: Vec = response + .json() + .await + .map_err(|e| GitProviderError::ApiError(e.to_string()))?; + + let commits = items + .into_iter() + .map(|item| { + let date = chrono::DateTime::parse_from_rfc3339(&item.committed_date) + .map(|dt| dt.with_timezone(&chrono::Utc)) + .unwrap_or_else(|_| chrono::Utc::now()); + + Commit { + sha: item.id, + message: item.message, + author: item.author_name, + author_email: item.author_email, + date, + } + }) + .collect(); + + Ok(commits) + } + async fn download_archive( + &self, + access_token: &str, diff --git a/crates/temps-logs/src/docker_logs.rs b/crates/temps-logs/src/docker_logs.rs index 78b6c310..8f7bd559 100644 --- a/crates/temps-logs/src/docker_logs.rs +++ b/crates/temps-logs/src/docker_logs.rs @@ -33,6 +33,7 @@ pub struct ContainerLogOptions { pub end_date: Option, pub tail: Option<String>, // "all" or number of lines pub timestamps: bool, // Include timestamps in log output + pub 
follow: bool, // Follow log output (like `docker logs -f`) } impl DockerLogService { pub fn new(docker: Arc) -> Self { @@ -54,7 +55,7 @@ impl DockerLogService { let tail = options.tail.unwrap_or_else(|| "all".to_string()); let log_options = Some(LogsOptions { - follow: true, + follow: options.follow, stdout: true, stderr: true, timestamps: options.timestamps, @@ -140,6 +141,7 @@ impl DockerLogService { end_date: None, tail: lines.map(|l| l.to_string()), timestamps: true, + follow: false, }; let logs = self.get_logs_with_timestamps(container_id, lines).await?; tokio::fs::write(file_path, logs).await?; @@ -256,6 +258,7 @@ mod tests { end_date: None, tail: Some("10".to_string()), timestamps: true, + follow: false, }; let result = docker_service .get_container_logs("non-existent-container-12345", options) diff --git a/crates/temps-migrations/src/migration/m20260217_000001_add_first_referrer_to_visitor.rs b/crates/temps-migrations/src/migration/m20260217_000001_add_first_referrer_to_visitor.rs new file mode 100644 index 00000000..7de9f89d --- /dev/null +++ b/crates/temps-migrations/src/migration/m20260217_000001_add_first_referrer_to_visitor.rs @@ -0,0 +1,86 @@ +use sea_orm_migration::prelude::*; + +#[derive(DeriveMigrationName)] +pub struct Migration; + +#[async_trait::async_trait] +impl MigrationTrait for Migration { + async fn up(&self, manager: &SchemaManager) -> Result<(), DbErr> { + let db = manager.get_connection(); + + // Step 1: Add first-visit attribution columns to visitor table + db.execute_unprepared( + r#" + ALTER TABLE visitor + ADD COLUMN IF NOT EXISTS first_referrer TEXT, + ADD COLUMN IF NOT EXISTS first_referrer_hostname TEXT, + ADD COLUMN IF NOT EXISTS first_channel TEXT, + ADD COLUMN IF NOT EXISTS first_utm_source TEXT, + ADD COLUMN IF NOT EXISTS first_utm_medium TEXT, + ADD COLUMN IF NOT EXISTS first_utm_campaign TEXT + "#, + ) + .await?; + + // Step 2: Backfill existing visitors from their earliest request_sessions record + db.execute_unprepared( + 
r#" + UPDATE visitor v + SET first_referrer = rs.referrer, + first_referrer_hostname = rs.referrer_hostname, + first_channel = rs.channel, + first_utm_source = rs.utm_source, + first_utm_medium = rs.utm_medium, + first_utm_campaign = rs.utm_campaign + FROM ( + SELECT DISTINCT ON (visitor_id) + visitor_id, + referrer, + referrer_hostname, + channel, + utm_source, + utm_medium, + utm_campaign + FROM request_sessions + WHERE visitor_id IS NOT NULL + ORDER BY visitor_id, started_at ASC + ) rs + WHERE v.id = rs.visitor_id + AND v.first_referrer IS NULL + "#, + ) + .await?; + + // Step 3: Create index on first_channel for filtering/grouping visitors by acquisition channel + db.execute_unprepared( + r#" + CREATE INDEX IF NOT EXISTS idx_visitor_first_channel + ON visitor (project_id, first_channel) + WHERE first_channel IS NOT NULL + "#, + ) + .await?; + + Ok(()) + } + + async fn down(&self, manager: &SchemaManager) -> Result<(), DbErr> { + let db = manager.get_connection(); + + db.execute_unprepared( + r#" + DROP INDEX IF EXISTS idx_visitor_first_channel; + ALTER TABLE visitor + DROP COLUMN IF EXISTS first_referrer, + DROP COLUMN IF EXISTS first_referrer_hostname, + DROP COLUMN IF EXISTS first_channel, + DROP COLUMN IF EXISTS first_utm_source, + DROP COLUMN IF EXISTS first_utm_medium, + DROP COLUMN IF EXISTS first_utm_campaign + "#, + ) + .await?; + + Ok(()) + } +} diff --git a/crates/temps-migrations/src/migration/mod.rs b/crates/temps-migrations/src/migration/mod.rs index b007d049..5c626b65 100644 --- a/crates/temps-migrations/src/migration/mod.rs +++ b/crates/temps-migrations/src/migration/mod.rs @@ -26,6 +26,7 @@ mod m20260122_000001_increase_checksum_length; mod m20260213_000001_create_source_maps; mod m20260214_000001_create_events_hourly_aggregate; mod m20260214_000002_add_analytics_performance_indexes; +mod m20260217_000001_add_first_referrer_to_visitor; pub struct Migrator; @@ -59,6 +60,7 @@ impl MigratorTrait for Migrator { 
Box::new(m20260213_000001_create_source_maps::Migration), Box::new(m20260214_000001_create_events_hourly_aggregate::Migration), Box::new(m20260214_000002_add_analytics_performance_indexes::Migration), + Box::new(m20260217_000001_add_first_referrer_to_visitor::Migration), ] } } diff --git a/crates/temps-proxy/src/proxy.rs b/crates/temps-proxy/src/proxy.rs index 7f539c8b..796c2d53 100644 --- a/crates/temps-proxy/src/proxy.rs +++ b/crates/temps-proxy/src/proxy.rs @@ -356,6 +356,28 @@ impl LoadBalancer { return Ok(()); } + // Compute first-visit attribution from referrer and query string + // These fields are only stored when creating a NEW visitor + let utm = ctx + .query_string + .as_deref() + .map(temps_analytics::parse_utm_params) + .unwrap_or_default(); + let referrer_hostname = ctx + .referrer + .as_deref() + .and_then(temps_analytics::extract_referrer_hostname); + let channel = + temps_analytics::get_channel(&utm, referrer_hostname.as_deref(), Some(&ctx.host)); + let attribution = crate::traits::FirstVisitAttribution { + referrer: ctx.referrer.clone(), + referrer_hostname: referrer_hostname.clone(), + channel: Some(channel.to_string()), + utm_source: utm.utm_source.clone(), + utm_medium: utm.utm_medium.clone(), + utm_campaign: utm.utm_campaign.clone(), + }; + // Create visitor using the trait (only for non-crawlers) let visitor = match self .visitor_manager @@ -364,6 +386,7 @@ impl LoadBalancer { project_context.as_ref(), &ctx.user_agent, ctx.ip_address.as_deref(), + &attribution, ) .await { diff --git a/crates/temps-proxy/src/proxy_test.rs b/crates/temps-proxy/src/proxy_test.rs index 72a2c71e..60d35c6f 100644 --- a/crates/temps-proxy/src/proxy_test.rs +++ b/crates/temps-proxy/src/proxy_test.rs @@ -398,12 +398,14 @@ pub mod proxy_tests { VisitorManagerImpl::new(test_db.db.clone(), crypto.clone(), ip_service); // Test visitor creation + let attribution = crate::traits::FirstVisitAttribution::default(); let visitor = visitor_manager .get_or_create_visitor( None, 
// No existing cookie None, // No project context "Mozilla/5.0 (test)", Some("127.0.0.1"), + &attribution, ) .await .map_err(|_| anyhow::anyhow!("Failed to get or create visitor"))?; @@ -425,7 +427,7 @@ pub mod proxy_tests { // Test bot detection let bot_visitor = convert_send_sync_error( visitor_manager - .get_or_create_visitor(None, None, "Googlebot/2.1", Some("127.0.0.1")) + .get_or_create_visitor(None, None, "Googlebot/2.1", Some("127.0.0.1"), &attribution) .await, )?; diff --git a/crates/temps-proxy/src/services.rs b/crates/temps-proxy/src/services.rs index 3aa0a8ae..41be38c5 100644 --- a/crates/temps-proxy/src/services.rs +++ b/crates/temps-proxy/src/services.rs @@ -440,6 +440,7 @@ impl VisitorManager for VisitorManagerImpl { context: Option<&ProjectContext>, user_agent: &str, ip_address: Option<&str>, + attribution: &FirstVisitAttribution, ) -> Result> { let project_id = context.as_ref().map(|c| c.project.id).unwrap_or(1); let environment_id = context.as_ref().map(|c| c.environment.id).unwrap_or(1); @@ -502,6 +503,13 @@ impl VisitorManager for VisitorManagerImpl { ip_address_id: Set(ip_address_id), is_crawler: Set(is_crawler), crawler_name: Set(crawler_name), + // First-visit attribution (set once, never overwritten) + first_referrer: Set(attribution.referrer.clone()), + first_referrer_hostname: Set(attribution.referrer_hostname.clone()), + first_channel: Set(attribution.channel.clone()), + first_utm_source: Set(attribution.utm_source.clone()), + first_utm_medium: Set(attribution.utm_medium.clone()), + first_utm_campaign: Set(attribution.utm_campaign.clone()), ..Default::default() }; @@ -1566,4 +1574,267 @@ mod tests { "last_accessed_at should be updated on reuse" ); } + + #[tokio::test] + async fn test_new_visitor_stores_first_referrer_attribution() { + let test_db = TestDatabase::with_migrations().await.unwrap(); + let crypto = Arc::new( + temps_core::CookieCrypto::new( + "0000000000000000000000000000000000000000000000000000000000000000", + ) + .unwrap(), 
+ ); + let ip_service = create_mock_ip_service(test_db.connection_arc().clone()); + let visitor_manager = + VisitorManagerImpl::new(test_db.connection_arc().clone(), crypto.clone(), ip_service); + + let context = create_test_project_context(&test_db.connection_arc()).await; + + // Create visitor with referrer attribution from Google organic search + let attribution = FirstVisitAttribution { + referrer: Some("https://www.google.com/search?q=temps+deploy".to_string()), + referrer_hostname: Some("www.google.com".to_string()), + channel: Some("Organic Search".to_string()), + utm_source: None, + utm_medium: None, + utm_campaign: None, + }; + + let new_visitor = visitor_manager + .get_or_create_visitor( + None, + Some(&context), + "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)", + Some("8.8.8.8"), + &attribution, + ) + .await + .unwrap(); + + // Verify attribution was stored on the visitor record + let db_visitor = visitor::Entity::find_by_id(new_visitor.visitor_id_i32) + .one(test_db.connection_arc().as_ref()) + .await + .unwrap() + .expect("Visitor should exist in database"); + + assert_eq!( + db_visitor.first_referrer, + Some("https://www.google.com/search?q=temps+deploy".to_string()) + ); + assert_eq!( + db_visitor.first_referrer_hostname, + Some("www.google.com".to_string()) + ); + assert_eq!(db_visitor.first_channel, Some("Organic Search".to_string())); + assert_eq!(db_visitor.first_utm_source, None); + assert_eq!(db_visitor.first_utm_medium, None); + assert_eq!(db_visitor.first_utm_campaign, None); + } + + #[tokio::test] + async fn test_new_visitor_stores_utm_attribution() { + let test_db = TestDatabase::with_migrations().await.unwrap(); + let crypto = Arc::new( + temps_core::CookieCrypto::new( + "0000000000000000000000000000000000000000000000000000000000000000", + ) + .unwrap(), + ); + let ip_service = create_mock_ip_service(test_db.connection_arc().clone()); + let visitor_manager = + VisitorManagerImpl::new(test_db.connection_arc().clone(), crypto.clone(), 
ip_service); + + let context = create_test_project_context(&test_db.connection_arc()).await; + + // Create visitor with UTM campaign attribution + let attribution = FirstVisitAttribution { + referrer: Some("https://twitter.com/post/123".to_string()), + referrer_hostname: Some("twitter.com".to_string()), + channel: Some("Paid Social".to_string()), + utm_source: Some("twitter".to_string()), + utm_medium: Some("paid_social".to_string()), + utm_campaign: Some("launch_2026".to_string()), + }; + + let new_visitor = visitor_manager + .get_or_create_visitor( + None, + Some(&context), + "Mozilla/5.0 (Windows NT 10.0; Win64; x64)", + Some("1.2.3.4"), + &attribution, + ) + .await + .unwrap(); + + // Verify all UTM fields were stored + let db_visitor = visitor::Entity::find_by_id(new_visitor.visitor_id_i32) + .one(test_db.connection_arc().as_ref()) + .await + .unwrap() + .expect("Visitor should exist"); + + assert_eq!( + db_visitor.first_referrer, + Some("https://twitter.com/post/123".to_string()) + ); + assert_eq!( + db_visitor.first_referrer_hostname, + Some("twitter.com".to_string()) + ); + assert_eq!(db_visitor.first_channel, Some("Paid Social".to_string())); + assert_eq!(db_visitor.first_utm_source, Some("twitter".to_string())); + assert_eq!(db_visitor.first_utm_medium, Some("paid_social".to_string())); + assert_eq!( + db_visitor.first_utm_campaign, + Some("launch_2026".to_string()) + ); + } + + #[tokio::test] + async fn test_returning_visitor_does_not_overwrite_first_referrer() { + let test_db = TestDatabase::with_migrations().await.unwrap(); + let crypto = Arc::new( + temps_core::CookieCrypto::new( + "0000000000000000000000000000000000000000000000000000000000000000", + ) + .unwrap(), + ); + let ip_service = create_mock_ip_service(test_db.connection_arc().clone()); + let visitor_manager = + VisitorManagerImpl::new(test_db.connection_arc().clone(), crypto.clone(), ip_service); + + let context = create_test_project_context(&test_db.connection_arc()).await; + + // First 
visit: from Google organic search + let first_attribution = FirstVisitAttribution { + referrer: Some("https://www.google.com/search?q=temps".to_string()), + referrer_hostname: Some("www.google.com".to_string()), + channel: Some("Organic Search".to_string()), + utm_source: None, + utm_medium: None, + utm_campaign: None, + }; + + let new_visitor = visitor_manager + .get_or_create_visitor( + None, + Some(&context), + "Mozilla/5.0", + Some("8.8.8.8"), + &first_attribution, + ) + .await + .unwrap(); + + // Generate encrypted cookie for the visitor + let cookie = visitor_manager + .generate_visitor_cookie(&new_visitor, false, Some(&context)) + .await + .unwrap(); + let encrypted_visitor_id = cookie + .split(';') + .next() + .unwrap() + .trim() + .split('=') + .nth(1) + .unwrap() + .to_string(); + + // Second visit: from Twitter (different referrer) + let second_attribution = FirstVisitAttribution { + referrer: Some("https://twitter.com/someone/status/123".to_string()), + referrer_hostname: Some("twitter.com".to_string()), + channel: Some("Organic Social".to_string()), + utm_source: Some("twitter".to_string()), + utm_medium: None, + utm_campaign: None, + }; + + let returning_visitor = visitor_manager + .get_or_create_visitor( + Some(&encrypted_visitor_id), + Some(&context), + "Mozilla/5.0", + Some("8.8.8.8"), + &second_attribution, + ) + .await + .unwrap(); + + // Should be the same visitor + assert_eq!(new_visitor.visitor_id_i32, returning_visitor.visitor_id_i32); + + // Verify the FIRST referrer is still preserved (not overwritten) + let db_visitor = visitor::Entity::find_by_id(returning_visitor.visitor_id_i32) + .one(test_db.connection_arc().as_ref()) + .await + .unwrap() + .expect("Visitor should exist"); + + assert_eq!( + db_visitor.first_referrer, + Some("https://www.google.com/search?q=temps".to_string()), + "First referrer should NOT be overwritten on return visit" + ); + assert_eq!( + db_visitor.first_referrer_hostname, + Some("www.google.com".to_string()), + 
"First referrer hostname should NOT be overwritten" + ); + assert_eq!( + db_visitor.first_channel, + Some("Organic Search".to_string()), + "First channel should NOT be overwritten" + ); + } + + #[tokio::test] + async fn test_direct_visitor_has_direct_channel() { + let test_db = TestDatabase::with_migrations().await.unwrap(); + let crypto = Arc::new( + temps_core::CookieCrypto::new( + "0000000000000000000000000000000000000000000000000000000000000000", + ) + .unwrap(), + ); + let ip_service = create_mock_ip_service(test_db.connection_arc().clone()); + let visitor_manager = + VisitorManagerImpl::new(test_db.connection_arc().clone(), crypto.clone(), ip_service); + + let context = create_test_project_context(&test_db.connection_arc()).await; + + // Direct visit: no referrer, no UTM + let attribution = FirstVisitAttribution { + referrer: None, + referrer_hostname: None, + channel: Some("Direct".to_string()), + utm_source: None, + utm_medium: None, + utm_campaign: None, + }; + + let new_visitor = visitor_manager + .get_or_create_visitor( + None, + Some(&context), + "Mozilla/5.0", + Some("1.2.3.4"), + &attribution, + ) + .await + .unwrap(); + + let db_visitor = visitor::Entity::find_by_id(new_visitor.visitor_id_i32) + .one(test_db.connection_arc().as_ref()) + .await + .unwrap() + .expect("Visitor should exist"); + + assert_eq!(db_visitor.first_referrer, None); + assert_eq!(db_visitor.first_referrer_hostname, None); + assert_eq!(db_visitor.first_channel, Some("Direct".to_string())); + } } diff --git a/crates/temps-proxy/src/traits.rs b/crates/temps-proxy/src/traits.rs index 94aa96c5..e0ff587a 100644 --- a/crates/temps-proxy/src/traits.rs +++ b/crates/temps-proxy/src/traits.rs @@ -139,16 +139,38 @@ pub trait ProjectContextResolver: Send + Sync { async fn get_static_path(&self, host: &str) -> Option; } +/// First-visit attribution data for a new visitor +#[derive(Debug, Clone, Default)] +pub struct FirstVisitAttribution { + /// Full referrer URL + pub referrer: Option, + /// 
Hostname extracted from referrer + pub referrer_hostname: Option, + /// Marketing channel (e.g. "Organic Search", "Direct") + pub channel: Option, + /// UTM source parameter + pub utm_source: Option, + /// UTM medium parameter + pub utm_medium: Option, + /// UTM campaign parameter + pub utm_campaign: Option, +} + /// Trait for managing visitors #[async_trait] pub trait VisitorManager: Send + Sync { /// Get or create a visitor from encrypted cookie + /// + /// The `attribution` parameter provides first-visit referrer/UTM/channel data. + /// These fields are only set when creating a NEW visitor and are never overwritten + /// for returning visitors. async fn get_or_create_visitor( &self, visitor_cookie: Option<&str>, context: Option<&ProjectContext>, user_agent: &str, ip_address: Option<&str>, + attribution: &FirstVisitAttribution, ) -> Result>; /// Generate encrypted visitor cookie diff --git a/sdks/node/packages/blob/README.md b/sdks/node/packages/blob/README.md index 33ce7b0b..8a180497 100644 --- a/sdks/node/packages/blob/README.md +++ b/sdks/node/packages/blob/README.md @@ -1,5 +1,5 @@

- Temps Platform + Temps Platform

@temps-sdk/blob

diff --git a/sdks/node/packages/blob/package.json b/sdks/node/packages/blob/package.json index b60917db..d52e02a5 100644 --- a/sdks/node/packages/blob/package.json +++ b/sdks/node/packages/blob/package.json @@ -1,6 +1,6 @@ { "name": "@temps-sdk/blob", - "version": "0.0.1", + "version": "0.0.3", "description": "Blob storage SDK for the Temps platform", "main": "./dist/index.js", "types": "./dist/index.d.ts", diff --git a/sdks/node/packages/kv/README.md b/sdks/node/packages/kv/README.md index 1bd02499..9474521c 100644 --- a/sdks/node/packages/kv/README.md +++ b/sdks/node/packages/kv/README.md @@ -1,5 +1,5 @@

- Temps Platform + Temps Platform

@temps-sdk/kv

diff --git a/sdks/node/packages/kv/package.json b/sdks/node/packages/kv/package.json index 004193d4..6c55832e 100644 --- a/sdks/node/packages/kv/package.json +++ b/sdks/node/packages/kv/package.json @@ -1,6 +1,6 @@ { "name": "@temps-sdk/kv", - "version": "0.0.1", + "version": "0.0.3", "description": "Key-Value store SDK for Temps platform", "main": "./dist/index.js", "types": "./dist/index.d.ts", diff --git a/sdks/node/packages/node-sdk/README.md b/sdks/node/packages/node-sdk/README.md index afca7733..4d3fd8e9 100644 --- a/sdks/node/packages/node-sdk/README.md +++ b/sdks/node/packages/node-sdk/README.md @@ -1,5 +1,5 @@

- Temps Platform + Temps Platform

@temps-sdk/node-sdk

diff --git a/sdks/node/packages/node-sdk/package.json b/sdks/node/packages/node-sdk/package.json index 22a8ac34..0a264f19 100644 --- a/sdks/node/packages/node-sdk/package.json +++ b/sdks/node/packages/node-sdk/package.json @@ -1,6 +1,6 @@ { "name": "@temps-sdk/node-sdk", - "version": "0.0.1-alpha.6", + "version": "0.0.4", "private": false, "type": "module", "main": "./dist/index.js", @@ -19,7 +19,9 @@ "test:watch": "vitest --watch", "test:coverage": "vitest --coverage", "build": "tsc -p tsconfig.build.json", - "generate": "npx @hey-api/openapi-ts@0.82.4 -i ./scripts/openapi.json -o src/client -c @hey-api/client-fetch" + "generate": "npx @hey-api/openapi-ts@0.82.4 -i ./scripts/openapi.json -o src/client -c @hey-api/client-fetch", + "prepublishOnly": "npm run build", + "clean": "rm -rf dist" }, "devDependencies": { "@hey-api/openapi-ts": "^0.82.4", diff --git a/sdks/node/packages/react-analytics/README.md b/sdks/node/packages/react-analytics/README.md index fcbbed7c..062b9577 100644 --- a/sdks/node/packages/react-analytics/README.md +++ b/sdks/node/packages/react-analytics/README.md @@ -1,5 +1,5 @@

- Temps Platform + Temps Platform

@temps-sdk/react-analytics

diff --git a/sdks/node/packages/react-analytics/package.json b/sdks/node/packages/react-analytics/package.json index 2571536b..02fae539 100644 --- a/sdks/node/packages/react-analytics/package.json +++ b/sdks/node/packages/react-analytics/package.json @@ -1,6 +1,6 @@ { "name": "@temps-sdk/react-analytics", - "version": "0.0.1-beta.40", + "version": "0.0.3", "private": false, "main": "./dist/index.js", "types": "./dist/index.d.ts", diff --git a/temps-demo-slow.gif b/temps-demo-slow.gif new file mode 100644 index 00000000..f5ccd5b9 Binary files /dev/null and b/temps-demo-slow.gif differ diff --git a/temps-demo.gif b/temps-demo.gif index f5ccd5b9..221e3d75 100644 Binary files a/temps-demo.gif and b/temps-demo.gif differ diff --git a/web/src/api/client/@tanstack/react-query.gen.ts b/web/src/api/client/@tanstack/react-query.gen.ts index 5dfdd817..743ecb7a 100644 --- a/web/src/api/client/@tanstack/react-query.gen.ts +++ b/web/src/api/client/@tanstack/react-query.gen.ts @@ -1,8 +1,8 @@ // This file is auto-generated by @hey-api/openapi-ts -import { type Options, getPlatformInfo, recordEventMetrics, addSessionReplayEvents, initSessionReplay, recordSpeedMetrics, updateSpeedMetrics, getActiveVisitors, getEventsCount, getGeneralStats, getLiveVisitorsList, getPageFlow, getPageHourlySessions, getPagePathDetail, getPagePathVisitors, getPagePaths, getPagePathsSparklines, getRecentActivity, getSessionDetails, getSessionEvents, getSessionLogs, getVisitors, getVisitorByGuid, getVisitorById, getVisitorDetails, enrichVisitor, getVisitorInfo, getVisitorJourney, getVisitorSessions, getVisitorStats, listApiKeys, createApiKey, getApiKeyPermissions, deleteApiKey, getApiKey, updateApiKey, activateApiKey, deactivateApiKey, chunkUploadOptions, createRelease, listReleaseFiles, uploadReleaseFile, getDeploymentJobLogs, ingestSentryEnvelope, ingestSentryEvent, emailStatus, login, requestMagicLink, verifyMagicLink, requestPasswordReset, resetPassword, verifyEmail, verifyMfaChallenge, 
runExternalServiceBackup, listS3Sources, createS3Source, deleteS3Source, getS3Source, updateS3Source, listSourceBackups, runBackupForSource, listBackupSchedules, createBackupSchedule, deleteBackupSchedule, getBackupSchedule, listBackupsForSchedule, disableBackupSchedule, enableBackupSchedule, getBackup, blobDelete, blobList, blobPut, blobCopy, blobDisable, blobEnable, blobStatus, blobUpdate, blobDownload, getDashboardProjectsAnalytics, getActivityGraph, getScanByDeployment, listProviders, createProvider, deleteProvider, getProvider, updateProvider, listManagedDomains, addManagedDomain, testProviderConnection, listProviderZones, removeManagedDomain, verifyManagedDomain, lookupDnsARecords, listDomains, createDomain, getDomainByHost, cancelDomainOrder, getDomainOrder, createOrRecreateOrder, finalizeOrder, setupDnsChallenge, deleteDomain, getDomainById, getChallengeToken, getHttpChallengeDebug, provisionDomain, renewDomain, checkDomainStatus, listDomains2, createDomain2, getDomainByName, deleteDomain2, getDomain, getDomainDnsRecords, setupDns, verifyDomain, listProviders2, createProvider2, deleteProvider2, getProvider2, testProvider, listEmails, sendEmail, getEmailStats, validateEmail, getEmail, listServices, createService, listAvailableContainers, getServiceBySlug, importExternalService, listProjectServices, getProjectServiceEnvironmentVariables, getProvidersMetadata, getProviderMetadata, getServiceTypes, getServiceTypeParameters, deleteService, getService, updateService, getServicePreviewEnvironmentVariablesMasked, getServicePreviewEnvironmentVariableNames, listServiceProjects, linkServiceToProject, unlinkServiceFromProject, getServiceEnvironmentVariables, getServiceEnvironmentVariable, startService, stopService, upgradeService, listRootContainers, listContainersAtPath, listEntities, getEntityInfo, queryData, downloadObject, getContainerInfo, checkExplorerSupport, getFile, getIpGeolocation, listConnections, deleteConnection, activateConnection, deactivateConnection, 
listRepositoriesByConnection, syncRepositories, updateConnectionToken, validateConnection, listGitProviders, createGitProvider, createGithubPatProvider, createGitlabOauthProvider, createGitlabPatProvider, deleteProvider3, getGitProvider, activateProvider, handleGitProviderOauthCallback, getProviderConnections, deactivateProvider, checkProviderDeletionSafety, startGitProviderOauth, deleteProviderSafely, getPublicRepository, getPublicBranches, detectPublicPresets, discoverWorkloads, executeImport, createPlan, listSources, getImportStatus, getIncident, updateIncidentStatus, getIncidentUpdates, listIpAccessControl, createIpAccessControl, checkIpBlocked, deleteIpAccessControl, getIpAccessControl, updateIpAccessControl, kvDel, kvDisable, kvEnable, kvExpire, kvGet, kvIncr, kvKeys, kvSet, kvStatus, kvTtl, kvUpdate, listRoutes, createRoute, deleteRoute, getRoute, updateRoute, logout, deleteMonitor, getMonitor, getBucketedStatus, getCurrentMonitorStatus, getUptimeHistory, deletePreferences, getPreferences, updatePreferences, listNotificationProviders, createNotificationProvider, createEmailProvider, updateEmailProvider, createSlackProvider, updateSlackProvider, createWebhookProvider, updateWebhookProvider, deleteProvider4, getNotificationProvider, updateProvider2, testProvider2, listOrders, hasPerformanceMetrics, getPerformanceMetrics, getMetricsOverTime, getGroupedPageMetrics, getAccessInfo, getPrivateIp, getPublicIp, listPresets, getProjects, createProject, getProjectBySlug, createProjectFromTemplate, getProjectStatistics, deleteProject, getProject, updateProject, getProjectDeployments, getLastDeployment, triggerProjectPipeline, getActiveVisitors2, getAggregatedBuckets, updateAutomaticDeploy, listCustomDomainsForProject, createCustomDomain, deleteCustomDomain, getCustomDomain, updateCustomDomain, linkCustomDomainToCertificate, updateProjectDeploymentConfig, getDeployment, cancelDeployment, getDeploymentJobs, tailDeploymentJobLogs, getDeploymentOperations, 
executeDeploymentOperation, getDeploymentOperationStatus, pauseDeployment, resumeDeployment, rollbackToDeployment, teardownDeployment, listDsns, createDsn, getOrCreateDsn, regenerateDsn, revokeDsn, getEnvironmentVariables, createEnvironmentVariable, getEnvironmentVariableValue, deleteEnvironmentVariable, updateEnvironmentVariable, getEnvironments, createEnvironment, deleteEnvironment, getEnvironment, getEnvironmentCrons, getCronById, getCronExecutions, getEnvironmentDomains, addEnvironmentDomain, deleteEnvironmentDomain, updateEnvironmentSettings, teardownEnvironment, getContainerLogs, listContainers, getContainerDetail, getContainerLogsById, getContainerMetrics, streamContainerMetrics, restartContainer, startContainer, stopContainer, deployFromImage, deployFromImageUpload, deployFromStatic, getErrorDashboardStats, listErrorGroups, getErrorGroup, updateErrorGroup, listErrorEvents, getErrorEvent, getErrorStats, getErrorTimeSeries, getEventsCount2, getEventTypeBreakdown, getPropertyBreakdown, getPropertyTimeline, getEventsTimeline, getUniqueEvents, listExternalImages, registerExternalImage, deleteExternalImage, getExternalImage, listFunnels, createFunnel, previewFunnelMetrics, deleteFunnel, updateFunnel, getFunnelMetrics, updateGitSettings, hasErrorGroups, hasAnalyticsEvents, getHourlyVisits, listExternalImages2, pushExternalImage, getExternalImage2, listIncidents, createIncident, getBucketedIncidents, listMonitors, createMonitor, deleteReleaseSourceMaps, listSourceMaps, uploadSourceMap, updateProjectSettings, listReleases, deleteSourceMap, listStaticBundles, deleteStaticBundle, getStaticBundle, getStatusOverview, getUniqueCounts, uploadStaticBundle, listProjectScans, triggerScan, getLatestScansPerEnvironment, getLatestScan, listWebhooks, createWebhook, deleteWebhook, getWebhook, updateWebhook, listDeliveries, getDelivery, retryDelivery, getProxyLogs, getProxyLogByRequestId, getTimeBucketStats, getTodayStats, getProxyLogById, listSyncedRepositories, 
getRepositoryByName, getAllRepositoriesByName, getRepositoryPresetByName, getRepositoryBranches, getRepositoryTags, getRepositoryPresetLive, getBranchesByRepositoryId, checkCommitExists, getTagsByRepositoryId, getProjectSessionReplays, getSessionEvents2, getSettings, updateSettings, listTemplates, listTemplateTags, getTemplate, getCurrentUser, listUsers, createUser, updateSelf, disableMfa, setupMfa, verifyAndEnableMfa, deleteUser, updateUser, restoreUser, assignRole, removeRole, getVisitorSessions2, deleteSessionReplay, getSessionReplay, updateSessionDuration, getSessionReplayEvents, addEvents, deleteScan, getScan, getScanVulnerabilities, listEventTypes, triggerWeeklyDigest, listAuditLogs, getAuditLog } from '../sdk.gen'; +import { type Options, getPlatformInfo, recordEventMetrics, addSessionReplayEvents, initSessionReplay, recordSpeedMetrics, updateSpeedMetrics, getActiveVisitors, getEventDetail, getEventVisitors, getEventsCount, getGeneralStats, getLiveVisitorsList, getPageFlow, getPageHourlySessions, getPagePathDetail, getPagePathVisitors, getPagePaths, getPagePathsSparklines, getRecentActivity, getSessionDetails, getSessionEvents, getSessionLogs, getVisitors, getVisitorByGuid, getVisitorById, getVisitorDetails, enrichVisitor, getVisitorInfo, getVisitorJourney, getVisitorSessions, getVisitorStats, listApiKeys, createApiKey, getApiKeyPermissions, deleteApiKey, getApiKey, updateApiKey, activateApiKey, deactivateApiKey, chunkUploadOptions, createRelease, listReleaseFiles, uploadReleaseFile, getDeploymentJobLogs, ingestSentryEnvelope, ingestSentryEvent, emailStatus, login, requestMagicLink, verifyMagicLink, requestPasswordReset, resetPassword, verifyEmail, verifyMfaChallenge, runExternalServiceBackup, listS3Sources, createS3Source, deleteS3Source, getS3Source, updateS3Source, listSourceBackups, runBackupForSource, listBackupSchedules, createBackupSchedule, deleteBackupSchedule, getBackupSchedule, listBackupsForSchedule, disableBackupSchedule, enableBackupSchedule, 
getBackup, blobDelete, blobList, blobPut, blobCopy, blobDisable, blobEnable, blobStatus, blobUpdate, blobDownload, getDashboardProjectsAnalytics, getActivityGraph, getScanByDeployment, listProviders, createProvider, deleteProvider, getProvider, updateProvider, listManagedDomains, addManagedDomain, testProviderConnection, listProviderZones, removeManagedDomain, verifyManagedDomain, lookupDnsARecords, listDomains, createDomain, getDomainByHost, cancelDomainOrder, getDomainOrder, createOrRecreateOrder, finalizeOrder, setupDnsChallenge, deleteDomain, getDomainById, getChallengeToken, getHttpChallengeDebug, provisionDomain, renewDomain, checkDomainStatus, listDomains2, createDomain2, getDomainByName, deleteDomain2, getDomain, getDomainDnsRecords, setupDns, verifyDomain, listProviders2, createProvider2, deleteProvider2, getProvider2, testProvider, listEmails, sendEmail, getEmailStats, validateEmail, getEmail, listServices, createService, listAvailableContainers, getServiceBySlug, importExternalService, listProjectServices, getProjectServiceEnvironmentVariables, getProvidersMetadata, getProviderMetadata, getServiceTypes, getServiceTypeParameters, deleteService, getService, updateService, getServicePreviewEnvironmentVariablesMasked, getServicePreviewEnvironmentVariableNames, listServiceProjects, linkServiceToProject, unlinkServiceFromProject, getServiceEnvironmentVariables, getServiceEnvironmentVariable, startService, stopService, upgradeService, listRootContainers, listContainersAtPath, listEntities, getEntityInfo, queryData, downloadObject, getContainerInfo, checkExplorerSupport, getFile, getIpGeolocation, listConnections, deleteConnection, activateConnection, deactivateConnection, listRepositoriesByConnection, syncRepositories, updateConnectionToken, validateConnection, listGitProviders, createGitProvider, createGithubPatProvider, createGitlabOauthProvider, createGitlabPatProvider, deleteProvider3, getGitProvider, activateProvider, handleGitProviderOauthCallback, 
getProviderConnections, deactivateProvider, checkProviderDeletionSafety, startGitProviderOauth, deleteProviderSafely, getPublicRepository, getPublicBranches, detectPublicPresets, discoverWorkloads, executeImport, createPlan, listSources, getImportStatus, getIncident, updateIncidentStatus, getIncidentUpdates, listIpAccessControl, createIpAccessControl, checkIpBlocked, deleteIpAccessControl, getIpAccessControl, updateIpAccessControl, kvDel, kvDisable, kvEnable, kvExpire, kvGet, kvIncr, kvKeys, kvSet, kvStatus, kvTtl, kvUpdate, listRoutes, createRoute, deleteRoute, getRoute, updateRoute, logout, deleteMonitor, getMonitor, getBucketedStatus, getCurrentMonitorStatus, getUptimeHistory, deletePreferences, getPreferences, updatePreferences, listNotificationProviders, createNotificationProvider, createEmailProvider, updateEmailProvider, createSlackProvider, updateSlackProvider, createWebhookProvider, updateWebhookProvider, deleteProvider4, getNotificationProvider, updateProvider2, testProvider2, listOrders, hasPerformanceMetrics, getPerformanceMetrics, getMetricsOverTime, getGroupedPageMetrics, getAccessInfo, getPrivateIp, getPublicIp, listPresets, generatePresetDockerfile, getProjects, createProject, getProjectBySlug, createProjectFromTemplate, getProjectStatistics, deleteProject, getProject, updateProject, getProjectDeployments, getLastDeployment, triggerProjectPipeline, getActiveVisitors2, getAggregatedBuckets, updateAutomaticDeploy, listCustomDomainsForProject, createCustomDomain, deleteCustomDomain, getCustomDomain, updateCustomDomain, linkCustomDomainToCertificate, updateProjectDeploymentConfig, getDeployment, cancelDeployment, getDeploymentJobs, tailDeploymentJobLogs, getDeploymentOperations, executeDeploymentOperation, getDeploymentOperationStatus, pauseDeployment, resumeDeployment, rollbackToDeployment, teardownDeployment, listDsns, createDsn, getOrCreateDsn, regenerateDsn, revokeDsn, getEnvironmentVariables, createEnvironmentVariable, getEnvironmentVariableValue, 
deleteEnvironmentVariable, updateEnvironmentVariable, getEnvironments, createEnvironment, deleteEnvironment, getEnvironment, getEnvironmentCrons, getCronById, getCronExecutions, getEnvironmentDomains, addEnvironmentDomain, deleteEnvironmentDomain, updateEnvironmentSettings, teardownEnvironment, getContainerLogs, listContainers, getContainerDetail, getContainerLogsById, getContainerMetrics, streamContainerMetrics, restartContainer, startContainer, stopContainer, deployFromImage, deployFromImageUpload, deployFromStatic, getErrorDashboardStats, listErrorGroups, getErrorGroup, updateErrorGroup, listErrorEvents, getErrorEvent, getErrorStats, getErrorTimeSeries, getEventsCount2, getEventTypeBreakdown, getPropertyBreakdown, getPropertyTimeline, getEventsTimeline, getUniqueEvents, listExternalImages, registerExternalImage, deleteExternalImage, getExternalImage, listFunnels, createFunnel, previewFunnelMetrics, deleteFunnel, updateFunnel, getFunnelMetrics, updateGitSettings, hasErrorGroups, hasAnalyticsEvents, getHourlyVisits, listExternalImages2, pushExternalImage, getExternalImage2, listIncidents, createIncident, getBucketedIncidents, listMonitors, createMonitor, deleteReleaseSourceMaps, listSourceMaps, uploadSourceMap, updateProjectSettings, listReleases, deleteSourceMap, listStaticBundles, deleteStaticBundle, getStaticBundle, getStatusOverview, getUniqueCounts, uploadStaticBundle, listProjectScans, triggerScan, getLatestScansPerEnvironment, getLatestScan, listWebhooks, createWebhook, deleteWebhook, getWebhook, updateWebhook, listDeliveries, getDelivery, retryDelivery, getProxyLogs, getProxyLogByRequestId, getTimeBucketStats, getTodayStats, getProxyLogById, listSyncedRepositories, getRepositoryByName, getAllRepositoriesByName, getRepositoryPresetByName, getRepositoryBranches, getRepositoryTags, getRepositoryPresetLive, getBranchesByRepositoryId, listCommitsByRepositoryId, checkCommitExists, getTagsByRepositoryId, getProjectSessionReplays, getSessionEvents2, getSettings, 
updateSettings, listTemplates, listTemplateTags, getTemplate, getCurrentUser, listUsers, createUser, updateSelf, disableMfa, setupMfa, verifyAndEnableMfa, deleteUser, updateUser, restoreUser, assignRole, removeRole, getVisitorSessions2, deleteSessionReplay, getSessionReplay, updateSessionDuration, getSessionReplayEvents, addEvents, deleteScan, getScan, getScanVulnerabilities, listEventTypes, triggerWeeklyDigest, listAuditLogs, getAuditLog } from '../sdk.gen'; import { queryOptions, type UseMutationOptions, type DefaultError, infiniteQueryOptions, type InfiniteData } from '@tanstack/react-query'; -import type { GetPlatformInfoData, RecordEventMetricsData, AddSessionReplayEventsData, AddSessionReplayEventsError, AddSessionReplayEventsResponse, InitSessionReplayData, InitSessionReplayError, InitSessionReplayResponse, RecordSpeedMetricsData, RecordSpeedMetricsError, UpdateSpeedMetricsData, UpdateSpeedMetricsError, GetActiveVisitorsData, GetEventsCountData, GetGeneralStatsData, GetLiveVisitorsListData, GetPageFlowData, GetPageHourlySessionsData, GetPagePathDetailData, GetPagePathVisitorsData, GetPagePathVisitorsResponse, GetPagePathsData, GetPagePathsSparklinesData, GetRecentActivityData, GetSessionDetailsData, GetSessionEventsData, GetSessionEventsResponse, GetSessionLogsData, GetSessionLogsResponse, GetVisitorsData, GetVisitorsResponse, GetVisitorByGuidData, GetVisitorByIdData, GetVisitorDetailsData, EnrichVisitorData, EnrichVisitorResponse2 as EnrichVisitorResponse, GetVisitorInfoData, GetVisitorJourneyData, GetVisitorSessionsData, GetVisitorStatsData, ListApiKeysData, ListApiKeysResponse, CreateApiKeyData, CreateApiKeyResponse2 as CreateApiKeyResponse, GetApiKeyPermissionsData, DeleteApiKeyData, DeleteApiKeyResponse, GetApiKeyData, UpdateApiKeyData, UpdateApiKeyResponse, ActivateApiKeyData, ActivateApiKeyResponse, DeactivateApiKeyData, DeactivateApiKeyResponse, ChunkUploadOptionsData, CreateReleaseData, CreateReleaseResponse, ListReleaseFilesData, 
UploadReleaseFileData, UploadReleaseFileResponse, GetDeploymentJobLogsData, IngestSentryEnvelopeData, IngestSentryEventData, IngestSentryEventResponse, EmailStatusData, LoginData, LoginResponse, RequestMagicLinkData, RequestMagicLinkResponse, VerifyMagicLinkData, RequestPasswordResetData, RequestPasswordResetResponse, ResetPasswordData, ResetPasswordResponse, VerifyEmailData, VerifyMfaChallengeData, VerifyMfaChallengeResponse, RunExternalServiceBackupData, RunExternalServiceBackupError, RunExternalServiceBackupResponse, ListS3SourcesData, CreateS3SourceData, CreateS3SourceError, CreateS3SourceResponse, DeleteS3SourceData, DeleteS3SourceError, DeleteS3SourceResponse, GetS3SourceData, UpdateS3SourceData, UpdateS3SourceError, UpdateS3SourceResponse, ListSourceBackupsData, RunBackupForSourceData, RunBackupForSourceError, RunBackupForSourceResponse, ListBackupSchedulesData, CreateBackupScheduleData, CreateBackupScheduleError, CreateBackupScheduleResponse, DeleteBackupScheduleData, DeleteBackupScheduleError, DeleteBackupScheduleResponse, GetBackupScheduleData, ListBackupsForScheduleData, DisableBackupScheduleData, DisableBackupScheduleResponse, EnableBackupScheduleData, EnableBackupScheduleResponse, GetBackupData, BlobDeleteData, BlobDeleteError, BlobDeleteResponse, BlobListData, BlobListError, BlobListResponse, BlobPutData, BlobPutError, BlobPutResponse, BlobCopyData, BlobCopyError, BlobCopyResponse, BlobDisableData, BlobDisableResponse, BlobEnableData, BlobEnableResponse, BlobStatusData, BlobUpdateData, BlobUpdateResponse, BlobDownloadData, GetDashboardProjectsAnalyticsData, GetActivityGraphData, GetScanByDeploymentData, ListProvidersData, CreateProviderData, CreateProviderResponse, DeleteProviderData, DeleteProviderResponse, GetProviderData, UpdateProviderData, UpdateProviderResponse, ListManagedDomainsData, AddManagedDomainData, AddManagedDomainResponse, TestProviderConnectionData, TestProviderConnectionResponse, ListProviderZonesData, RemoveManagedDomainData, 
RemoveManagedDomainResponse, VerifyManagedDomainData, VerifyManagedDomainResponse, LookupDnsARecordsData, ListDomainsData, CreateDomainData, CreateDomainResponse, GetDomainByHostData, CancelDomainOrderData, CancelDomainOrderResponse, GetDomainOrderData, CreateOrRecreateOrderData, CreateOrRecreateOrderResponse, FinalizeOrderData, FinalizeOrderResponse, SetupDnsChallengeData, SetupDnsChallengeResponse2 as SetupDnsChallengeResponse, DeleteDomainData, DeleteDomainResponse, GetDomainByIdData, GetChallengeTokenData, GetHttpChallengeDebugData, ProvisionDomainData, ProvisionDomainResponse, RenewDomainData, RenewDomainResponse, CheckDomainStatusData, ListDomains2Data, CreateDomain2Data, CreateDomain2Response, GetDomainByNameData, DeleteDomain2Data, DeleteDomain2Response, GetDomainData, GetDomainDnsRecordsData, SetupDnsData, SetupDnsResponse2 as SetupDnsResponse, VerifyDomainData, VerifyDomainResponse, ListProviders2Data, CreateProvider2Data, CreateProvider2Response, DeleteProvider2Data, DeleteProvider2Response, GetProvider2Data, TestProviderData, TestProviderResponse2 as TestProviderResponse, ListEmailsData, ListEmailsResponse, SendEmailData, SendEmailResponse, GetEmailStatsData, ValidateEmailData, ValidateEmailResponse2 as ValidateEmailResponse, GetEmailData, ListServicesData, CreateServiceData, CreateServiceResponse, ListAvailableContainersData, GetServiceBySlugData, ImportExternalServiceData, ImportExternalServiceResponse, ListProjectServicesData, GetProjectServiceEnvironmentVariablesData, GetProvidersMetadataData, GetProviderMetadataData, GetServiceTypesData, GetServiceTypeParametersData, DeleteServiceData, DeleteServiceResponse, GetServiceData, UpdateServiceData, UpdateServiceResponse, GetServicePreviewEnvironmentVariablesMaskedData, GetServicePreviewEnvironmentVariableNamesData, ListServiceProjectsData, LinkServiceToProjectData, LinkServiceToProjectResponse, UnlinkServiceFromProjectData, UnlinkServiceFromProjectResponse, GetServiceEnvironmentVariablesData, 
GetServiceEnvironmentVariableData, StartServiceData, StartServiceResponse, StopServiceData, StopServiceResponse, UpgradeServiceData, UpgradeServiceResponse, ListRootContainersData, ListContainersAtPathData, ListEntitiesData, GetEntityInfoData, QueryDataData, QueryDataResponse2 as QueryDataResponse, DownloadObjectData, GetContainerInfoData, CheckExplorerSupportData, GetFileData, GetIpGeolocationData, ListConnectionsData, ListConnectionsResponse, DeleteConnectionData, DeleteConnectionResponse, ActivateConnectionData, DeactivateConnectionData, ListRepositoriesByConnectionData, ListRepositoriesByConnectionResponse, SyncRepositoriesData, SyncRepositoriesResponse, UpdateConnectionTokenData, UpdateConnectionTokenResponse, ValidateConnectionData, ListGitProvidersData, CreateGitProviderData, CreateGitProviderResponse, CreateGithubPatProviderData, CreateGithubPatProviderResponse, CreateGitlabOauthProviderData, CreateGitlabOauthProviderResponse, CreateGitlabPatProviderData, CreateGitlabPatProviderResponse, DeleteProvider3Data, DeleteProvider3Response, GetGitProviderData, ActivateProviderData, HandleGitProviderOauthCallbackData, GetProviderConnectionsData, DeactivateProviderData, CheckProviderDeletionSafetyData, StartGitProviderOauthData, DeleteProviderSafelyData, DeleteProviderSafelyResponse, GetPublicRepositoryData, GetPublicBranchesData, DetectPublicPresetsData, DiscoverWorkloadsData, DiscoverWorkloadsResponse, ExecuteImportData, ExecuteImportResponse2 as ExecuteImportResponse, CreatePlanData, CreatePlanResponse2 as CreatePlanResponse, ListSourcesData, GetImportStatusData, GetIncidentData, UpdateIncidentStatusData, UpdateIncidentStatusResponse, GetIncidentUpdatesData, ListIpAccessControlData, CreateIpAccessControlData, CreateIpAccessControlResponse, CheckIpBlockedData, DeleteIpAccessControlData, DeleteIpAccessControlResponse, GetIpAccessControlData, UpdateIpAccessControlData, UpdateIpAccessControlResponse, KvDelData, KvDelResponse, KvDisableData, KvDisableResponse, 
KvEnableData, KvEnableResponse, KvExpireData, KvExpireResponse, KvGetData, KvGetResponse, KvIncrData, KvIncrResponse, KvKeysData, KvKeysResponse, KvSetData, KvSetResponse, KvStatusData, KvTtlData, KvTtlResponse, KvUpdateData, KvUpdateResponse, ListRoutesData, CreateRouteData, CreateRouteResponse, DeleteRouteData, DeleteRouteResponse, GetRouteData, UpdateRouteData, UpdateRouteResponse, LogoutData, DeleteMonitorData, DeleteMonitorResponse, GetMonitorData, GetBucketedStatusData, GetCurrentMonitorStatusData, GetUptimeHistoryData, DeletePreferencesData, DeletePreferencesResponse, GetPreferencesData, UpdatePreferencesData, UpdatePreferencesResponse, ListNotificationProvidersData, CreateNotificationProviderData, CreateNotificationProviderResponse, CreateEmailProviderData, CreateEmailProviderResponse, UpdateEmailProviderData, UpdateEmailProviderResponse, CreateSlackProviderData, CreateSlackProviderResponse, UpdateSlackProviderData, UpdateSlackProviderResponse, CreateWebhookProviderData, CreateWebhookProviderResponse, UpdateWebhookProviderData, UpdateWebhookProviderResponse, DeleteProvider4Data, DeleteProvider4Response, GetNotificationProviderData, UpdateProvider2Data, UpdateProvider2Response, TestProvider2Data, TestProvider2Response, ListOrdersData, HasPerformanceMetricsData, GetPerformanceMetricsData, GetMetricsOverTimeData, GetGroupedPageMetricsData, GetAccessInfoData, GetPrivateIpData, GetPublicIpData, ListPresetsData, GetProjectsData, GetProjectsResponse, CreateProjectData, CreateProjectResponse, GetProjectBySlugData, CreateProjectFromTemplateData, CreateProjectFromTemplateResponse2 as CreateProjectFromTemplateResponse, GetProjectStatisticsData, DeleteProjectData, DeleteProjectResponse, GetProjectData, UpdateProjectData, UpdateProjectResponse, GetProjectDeploymentsData, GetProjectDeploymentsResponse, GetLastDeploymentData, TriggerProjectPipelineData, TriggerProjectPipelineResponse, GetActiveVisitors2Data, GetAggregatedBucketsData, UpdateAutomaticDeployData, 
UpdateAutomaticDeployResponse, ListCustomDomainsForProjectData, CreateCustomDomainData, CreateCustomDomainResponse, DeleteCustomDomainData, DeleteCustomDomainResponse, GetCustomDomainData, UpdateCustomDomainData, UpdateCustomDomainResponse, LinkCustomDomainToCertificateData, LinkCustomDomainToCertificateResponse, UpdateProjectDeploymentConfigData, UpdateProjectDeploymentConfigResponse, GetDeploymentData, CancelDeploymentData, CancelDeploymentResponse, GetDeploymentJobsData, TailDeploymentJobLogsData, GetDeploymentOperationsData, ExecuteDeploymentOperationData, ExecuteDeploymentOperationResponse, GetDeploymentOperationStatusData, PauseDeploymentData, PauseDeploymentResponse, ResumeDeploymentData, ResumeDeploymentResponse, RollbackToDeploymentData, RollbackToDeploymentResponse, TeardownDeploymentData, TeardownDeploymentResponse, ListDsnsData, CreateDsnData, CreateDsnResponse, GetOrCreateDsnData, GetOrCreateDsnResponse, RegenerateDsnData, RegenerateDsnResponse, RevokeDsnData, RevokeDsnResponse, GetEnvironmentVariablesData, CreateEnvironmentVariableData, CreateEnvironmentVariableResponse, GetEnvironmentVariableValueData, DeleteEnvironmentVariableData, DeleteEnvironmentVariableResponse, UpdateEnvironmentVariableData, UpdateEnvironmentVariableResponse, GetEnvironmentsData, CreateEnvironmentData, CreateEnvironmentResponse, DeleteEnvironmentData, DeleteEnvironmentResponse, GetEnvironmentData, GetEnvironmentCronsData, GetCronByIdData, GetCronExecutionsData, GetCronExecutionsResponse, GetEnvironmentDomainsData, AddEnvironmentDomainData, AddEnvironmentDomainResponse, DeleteEnvironmentDomainData, DeleteEnvironmentDomainResponse, UpdateEnvironmentSettingsData, UpdateEnvironmentSettingsResponse, TeardownEnvironmentData, TeardownEnvironmentResponse, GetContainerLogsData, ListContainersData, GetContainerDetailData, GetContainerLogsByIdData, GetContainerMetricsData, StreamContainerMetricsData, RestartContainerData, RestartContainerResponse, StartContainerData, 
StartContainerResponse, StopContainerData, StopContainerResponse, DeployFromImageData, DeployFromImageResponse, DeployFromImageUploadData, DeployFromImageUploadResponse, DeployFromStaticData, DeployFromStaticResponse, GetErrorDashboardStatsData, ListErrorGroupsData, ListErrorGroupsResponse, GetErrorGroupData, UpdateErrorGroupData, ListErrorEventsData, ListErrorEventsResponse, GetErrorEventData, GetErrorStatsData, GetErrorTimeSeriesData, GetEventsCount2Data, GetEventTypeBreakdownData, GetPropertyBreakdownData, GetPropertyTimelineData, GetEventsTimelineData, GetUniqueEventsData, GetUniqueEventsResponse, ListExternalImagesData, ListExternalImagesResponse, RegisterExternalImageData, RegisterExternalImageResponse, DeleteExternalImageData, DeleteExternalImageResponse, GetExternalImageData, ListFunnelsData, CreateFunnelData, CreateFunnelResponse2 as CreateFunnelResponse, PreviewFunnelMetricsData, PreviewFunnelMetricsResponse, DeleteFunnelData, UpdateFunnelData, GetFunnelMetricsData, UpdateGitSettingsData, UpdateGitSettingsResponse, HasErrorGroupsData, HasAnalyticsEventsData, GetHourlyVisitsData, ListExternalImages2Data, PushExternalImageData, PushExternalImageResponse, GetExternalImage2Data, ListIncidentsData, CreateIncidentData, CreateIncidentResponse, GetBucketedIncidentsData, ListMonitorsData, CreateMonitorData, CreateMonitorResponse, DeleteReleaseSourceMapsData, DeleteReleaseSourceMapsResponse, ListSourceMapsData, UploadSourceMapData, UploadSourceMapResponse, UpdateProjectSettingsData, UpdateProjectSettingsResponse, ListReleasesData, DeleteSourceMapData, DeleteSourceMapResponse, ListStaticBundlesData, ListStaticBundlesResponse, DeleteStaticBundleData, DeleteStaticBundleResponse, GetStaticBundleData, GetStatusOverviewData, GetUniqueCountsData, UploadStaticBundleData, UploadStaticBundleResponse, ListProjectScansData, ListProjectScansError, ListProjectScansResponse, TriggerScanData, TriggerScanError, TriggerScanResponse2 as TriggerScanResponse, 
GetLatestScansPerEnvironmentData, GetLatestScanData, ListWebhooksData, CreateWebhookData, CreateWebhookResponse, DeleteWebhookData, DeleteWebhookResponse, GetWebhookData, UpdateWebhookData, UpdateWebhookResponse, ListDeliveriesData, GetDeliveryData, RetryDeliveryData, RetryDeliveryResponse, GetProxyLogsData, GetProxyLogsResponse, GetProxyLogByRequestIdData, GetTimeBucketStatsData, GetTodayStatsData, GetProxyLogByIdData, ListSyncedRepositoriesData, ListSyncedRepositoriesResponse, GetRepositoryByNameData, GetAllRepositoriesByNameData, GetRepositoryPresetByNameData, GetRepositoryBranchesData, GetRepositoryTagsData, GetRepositoryPresetLiveData, GetBranchesByRepositoryIdData, CheckCommitExistsData, GetTagsByRepositoryIdData, GetProjectSessionReplaysData, GetProjectSessionReplaysError, GetProjectSessionReplaysResponse2 as GetProjectSessionReplaysResponse, GetSessionEvents2Data, GetSettingsData, UpdateSettingsData, UpdateSettingsResponse, ListTemplatesData, ListTemplateTagsData, GetTemplateData, GetCurrentUserData, ListUsersData, CreateUserData, CreateUserResponse, UpdateSelfData, UpdateSelfResponse, DisableMfaData, DisableMfaResponse, SetupMfaData, SetupMfaResponse, VerifyAndEnableMfaData, VerifyAndEnableMfaResponse, DeleteUserData, DeleteUserResponse, UpdateUserData, UpdateUserResponse, RestoreUserData, RestoreUserResponse, AssignRoleData, RemoveRoleData, RemoveRoleResponse, GetVisitorSessions2Data, GetVisitorSessions2Error, GetVisitorSessions2Response, DeleteSessionReplayData, DeleteSessionReplayError, GetSessionReplayData, UpdateSessionDurationData, UpdateSessionDurationError, UpdateSessionDurationResponse2 as UpdateSessionDurationResponse, GetSessionReplayEventsData, AddEventsData, AddEventsError, AddEventsResponse2 as AddEventsResponse, DeleteScanData, DeleteScanError, DeleteScanResponse, GetScanData, GetScanVulnerabilitiesData, GetScanVulnerabilitiesError, GetScanVulnerabilitiesResponse, ListEventTypesData, TriggerWeeklyDigestData, TriggerWeeklyDigestResponse, 
ListAuditLogsData, ListAuditLogsResponse, GetAuditLogData } from '../types.gen';
+import type { GetPlatformInfoData, RecordEventMetricsData, AddSessionReplayEventsData, AddSessionReplayEventsError, AddSessionReplayEventsResponse, InitSessionReplayData, InitSessionReplayError, InitSessionReplayResponse, RecordSpeedMetricsData, RecordSpeedMetricsError, UpdateSpeedMetricsData, UpdateSpeedMetricsError, GetActiveVisitorsData, GetEventDetailData, GetEventVisitorsData, GetEventVisitorsResponse, GetEventsCountData, GetGeneralStatsData, GetLiveVisitorsListData, GetPageFlowData, GetPageHourlySessionsData, GetPagePathDetailData, GetPagePathVisitorsData, GetPagePathVisitorsResponse, GetPagePathsData, GetPagePathsSparklinesData, GetRecentActivityData, GetSessionDetailsData, GetSessionEventsData, GetSessionEventsResponse, GetSessionLogsData, GetSessionLogsResponse, GetVisitorsData, GetVisitorsResponse, GetVisitorByGuidData, GetVisitorByIdData, GetVisitorDetailsData, EnrichVisitorData, EnrichVisitorResponse2 as EnrichVisitorResponse, GetVisitorInfoData, GetVisitorJourneyData, GetVisitorSessionsData, GetVisitorStatsData, ListApiKeysData, ListApiKeysResponse, CreateApiKeyData, CreateApiKeyResponse2 as CreateApiKeyResponse, GetApiKeyPermissionsData, DeleteApiKeyData, DeleteApiKeyResponse, GetApiKeyData, UpdateApiKeyData, UpdateApiKeyResponse, ActivateApiKeyData, ActivateApiKeyResponse, DeactivateApiKeyData, DeactivateApiKeyResponse, ChunkUploadOptionsData, CreateReleaseData, CreateReleaseResponse, ListReleaseFilesData, UploadReleaseFileData, UploadReleaseFileResponse, GetDeploymentJobLogsData, IngestSentryEnvelopeData, IngestSentryEventData, IngestSentryEventResponse, EmailStatusData, LoginData, LoginResponse, RequestMagicLinkData, RequestMagicLinkResponse, VerifyMagicLinkData, RequestPasswordResetData, RequestPasswordResetResponse, ResetPasswordData, ResetPasswordResponse, VerifyEmailData, VerifyMfaChallengeData, VerifyMfaChallengeResponse, RunExternalServiceBackupData, 
RunExternalServiceBackupError, RunExternalServiceBackupResponse, ListS3SourcesData, CreateS3SourceData, CreateS3SourceError, CreateS3SourceResponse, DeleteS3SourceData, DeleteS3SourceError, DeleteS3SourceResponse, GetS3SourceData, UpdateS3SourceData, UpdateS3SourceError, UpdateS3SourceResponse, ListSourceBackupsData, RunBackupForSourceData, RunBackupForSourceError, RunBackupForSourceResponse, ListBackupSchedulesData, CreateBackupScheduleData, CreateBackupScheduleError, CreateBackupScheduleResponse, DeleteBackupScheduleData, DeleteBackupScheduleError, DeleteBackupScheduleResponse, GetBackupScheduleData, ListBackupsForScheduleData, DisableBackupScheduleData, DisableBackupScheduleResponse, EnableBackupScheduleData, EnableBackupScheduleResponse, GetBackupData, BlobDeleteData, BlobDeleteError, BlobDeleteResponse, BlobListData, BlobListError, BlobListResponse, BlobPutData, BlobPutError, BlobPutResponse, BlobCopyData, BlobCopyError, BlobCopyResponse, BlobDisableData, BlobDisableResponse, BlobEnableData, BlobEnableResponse, BlobStatusData, BlobUpdateData, BlobUpdateResponse, BlobDownloadData, GetDashboardProjectsAnalyticsData, GetActivityGraphData, GetScanByDeploymentData, ListProvidersData, CreateProviderData, CreateProviderResponse, DeleteProviderData, DeleteProviderResponse, GetProviderData, UpdateProviderData, UpdateProviderResponse, ListManagedDomainsData, AddManagedDomainData, AddManagedDomainResponse, TestProviderConnectionData, TestProviderConnectionResponse, ListProviderZonesData, RemoveManagedDomainData, RemoveManagedDomainResponse, VerifyManagedDomainData, VerifyManagedDomainResponse, LookupDnsARecordsData, ListDomainsData, ListDomainsResponse2 as ListDomainsResponse, CreateDomainData, CreateDomainResponse, GetDomainByHostData, CancelDomainOrderData, CancelDomainOrderResponse, GetDomainOrderData, CreateOrRecreateOrderData, CreateOrRecreateOrderResponse, FinalizeOrderData, FinalizeOrderResponse, SetupDnsChallengeData, SetupDnsChallengeResponse2 as 
SetupDnsChallengeResponse, DeleteDomainData, DeleteDomainResponse, GetDomainByIdData, GetChallengeTokenData, GetHttpChallengeDebugData, ProvisionDomainData, ProvisionDomainResponse, RenewDomainData, RenewDomainResponse, CheckDomainStatusData, ListDomains2Data, CreateDomain2Data, CreateDomain2Response, GetDomainByNameData, DeleteDomain2Data, DeleteDomain2Response, GetDomainData, GetDomainDnsRecordsData, SetupDnsData, SetupDnsResponse2 as SetupDnsResponse, VerifyDomainData, VerifyDomainResponse, ListProviders2Data, CreateProvider2Data, CreateProvider2Response, DeleteProvider2Data, DeleteProvider2Response, GetProvider2Data, TestProviderData, TestProviderResponse2 as TestProviderResponse, ListEmailsData, ListEmailsResponse, SendEmailData, SendEmailResponse, GetEmailStatsData, ValidateEmailData, ValidateEmailResponse2 as ValidateEmailResponse, GetEmailData, ListServicesData, ListServicesResponse, CreateServiceData, CreateServiceResponse, ListAvailableContainersData, GetServiceBySlugData, ImportExternalServiceData, ImportExternalServiceResponse, ListProjectServicesData, ListProjectServicesResponse, GetProjectServiceEnvironmentVariablesData, GetProvidersMetadataData, GetProviderMetadataData, GetServiceTypesData, GetServiceTypeParametersData, DeleteServiceData, DeleteServiceResponse, GetServiceData, UpdateServiceData, UpdateServiceResponse, GetServicePreviewEnvironmentVariablesMaskedData, GetServicePreviewEnvironmentVariableNamesData, ListServiceProjectsData, ListServiceProjectsResponse, LinkServiceToProjectData, LinkServiceToProjectResponse, UnlinkServiceFromProjectData, UnlinkServiceFromProjectResponse, GetServiceEnvironmentVariablesData, GetServiceEnvironmentVariableData, StartServiceData, StartServiceResponse, StopServiceData, StopServiceResponse, UpgradeServiceData, UpgradeServiceResponse, ListRootContainersData, ListContainersAtPathData, ListEntitiesData, GetEntityInfoData, QueryDataData, QueryDataResponse2 as QueryDataResponse, DownloadObjectData, 
GetContainerInfoData, CheckExplorerSupportData, GetFileData, GetIpGeolocationData, ListConnectionsData, ListConnectionsResponse, DeleteConnectionData, DeleteConnectionResponse, ActivateConnectionData, DeactivateConnectionData, ListRepositoriesByConnectionData, ListRepositoriesByConnectionResponse, SyncRepositoriesData, SyncRepositoriesResponse, UpdateConnectionTokenData, UpdateConnectionTokenResponse, ValidateConnectionData, ListGitProvidersData, CreateGitProviderData, CreateGitProviderResponse, CreateGithubPatProviderData, CreateGithubPatProviderResponse, CreateGitlabOauthProviderData, CreateGitlabOauthProviderResponse, CreateGitlabPatProviderData, CreateGitlabPatProviderResponse, DeleteProvider3Data, DeleteProvider3Response, GetGitProviderData, ActivateProviderData, HandleGitProviderOauthCallbackData, GetProviderConnectionsData, DeactivateProviderData, CheckProviderDeletionSafetyData, StartGitProviderOauthData, DeleteProviderSafelyData, DeleteProviderSafelyResponse, GetPublicRepositoryData, GetPublicBranchesData, DetectPublicPresetsData, DiscoverWorkloadsData, DiscoverWorkloadsResponse, ExecuteImportData, ExecuteImportResponse2 as ExecuteImportResponse, CreatePlanData, CreatePlanResponse2 as CreatePlanResponse, ListSourcesData, GetImportStatusData, GetIncidentData, UpdateIncidentStatusData, UpdateIncidentStatusResponse, GetIncidentUpdatesData, ListIpAccessControlData, CreateIpAccessControlData, CreateIpAccessControlResponse, CheckIpBlockedData, DeleteIpAccessControlData, DeleteIpAccessControlResponse, GetIpAccessControlData, UpdateIpAccessControlData, UpdateIpAccessControlResponse, KvDelData, KvDelResponse, KvDisableData, KvDisableResponse, KvEnableData, KvEnableResponse, KvExpireData, KvExpireResponse, KvGetData, KvGetResponse, KvIncrData, KvIncrResponse, KvKeysData, KvKeysResponse, KvSetData, KvSetResponse, KvStatusData, KvTtlData, KvTtlResponse, KvUpdateData, KvUpdateResponse, ListRoutesData, CreateRouteData, CreateRouteResponse, DeleteRouteData, 
DeleteRouteResponse, GetRouteData, UpdateRouteData, UpdateRouteResponse, LogoutData, DeleteMonitorData, DeleteMonitorResponse, GetMonitorData, GetBucketedStatusData, GetCurrentMonitorStatusData, GetUptimeHistoryData, DeletePreferencesData, DeletePreferencesResponse, GetPreferencesData, UpdatePreferencesData, UpdatePreferencesResponse, ListNotificationProvidersData, ListNotificationProvidersResponse, CreateNotificationProviderData, CreateNotificationProviderResponse, CreateEmailProviderData, CreateEmailProviderResponse, UpdateEmailProviderData, UpdateEmailProviderResponse, CreateSlackProviderData, CreateSlackProviderResponse, UpdateSlackProviderData, UpdateSlackProviderResponse, CreateWebhookProviderData, CreateWebhookProviderResponse, UpdateWebhookProviderData, UpdateWebhookProviderResponse, DeleteProvider4Data, DeleteProvider4Response, GetNotificationProviderData, UpdateProvider2Data, UpdateProvider2Response, TestProvider2Data, TestProvider2Response, ListOrdersData, ListOrdersResponse2 as ListOrdersResponse, HasPerformanceMetricsData, GetPerformanceMetricsData, GetMetricsOverTimeData, GetGroupedPageMetricsData, GetAccessInfoData, GetPrivateIpData, GetPublicIpData, ListPresetsData, GeneratePresetDockerfileData, GeneratePresetDockerfileResponse, GetProjectsData, GetProjectsResponse, CreateProjectData, CreateProjectResponse, GetProjectBySlugData, CreateProjectFromTemplateData, CreateProjectFromTemplateResponse2 as CreateProjectFromTemplateResponse, GetProjectStatisticsData, DeleteProjectData, DeleteProjectResponse, GetProjectData, UpdateProjectData, UpdateProjectResponse, GetProjectDeploymentsData, GetProjectDeploymentsResponse, GetLastDeploymentData, TriggerProjectPipelineData, TriggerProjectPipelineResponse, GetActiveVisitors2Data, GetAggregatedBucketsData, UpdateAutomaticDeployData, UpdateAutomaticDeployResponse, ListCustomDomainsForProjectData, CreateCustomDomainData, CreateCustomDomainResponse, DeleteCustomDomainData, DeleteCustomDomainResponse, 
GetCustomDomainData, UpdateCustomDomainData, UpdateCustomDomainResponse, LinkCustomDomainToCertificateData, LinkCustomDomainToCertificateResponse, UpdateProjectDeploymentConfigData, UpdateProjectDeploymentConfigResponse, GetDeploymentData, CancelDeploymentData, CancelDeploymentResponse, GetDeploymentJobsData, TailDeploymentJobLogsData, GetDeploymentOperationsData, ExecuteDeploymentOperationData, ExecuteDeploymentOperationResponse, GetDeploymentOperationStatusData, PauseDeploymentData, PauseDeploymentResponse, ResumeDeploymentData, ResumeDeploymentResponse, RollbackToDeploymentData, RollbackToDeploymentResponse, TeardownDeploymentData, TeardownDeploymentResponse, ListDsnsData, CreateDsnData, CreateDsnResponse, GetOrCreateDsnData, GetOrCreateDsnResponse, RegenerateDsnData, RegenerateDsnResponse, RevokeDsnData, RevokeDsnResponse, GetEnvironmentVariablesData, CreateEnvironmentVariableData, CreateEnvironmentVariableResponse, GetEnvironmentVariableValueData, DeleteEnvironmentVariableData, DeleteEnvironmentVariableResponse, UpdateEnvironmentVariableData, UpdateEnvironmentVariableResponse, GetEnvironmentsData, CreateEnvironmentData, CreateEnvironmentResponse, DeleteEnvironmentData, DeleteEnvironmentResponse, GetEnvironmentData, GetEnvironmentCronsData, GetCronByIdData, GetCronExecutionsData, GetCronExecutionsResponse, GetEnvironmentDomainsData, AddEnvironmentDomainData, AddEnvironmentDomainResponse, DeleteEnvironmentDomainData, DeleteEnvironmentDomainResponse, UpdateEnvironmentSettingsData, UpdateEnvironmentSettingsResponse, TeardownEnvironmentData, TeardownEnvironmentResponse, GetContainerLogsData, ListContainersData, GetContainerDetailData, GetContainerLogsByIdData, GetContainerMetricsData, StreamContainerMetricsData, RestartContainerData, RestartContainerResponse, StartContainerData, StartContainerResponse, StopContainerData, StopContainerResponse, DeployFromImageData, DeployFromImageResponse, DeployFromImageUploadData, DeployFromImageUploadResponse, 
DeployFromStaticData, DeployFromStaticResponse, GetErrorDashboardStatsData, ListErrorGroupsData, ListErrorGroupsResponse, GetErrorGroupData, UpdateErrorGroupData, ListErrorEventsData, ListErrorEventsResponse, GetErrorEventData, GetErrorStatsData, GetErrorTimeSeriesData, GetEventsCount2Data, GetEventTypeBreakdownData, GetPropertyBreakdownData, GetPropertyTimelineData, GetEventsTimelineData, GetUniqueEventsData, GetUniqueEventsResponse, ListExternalImagesData, ListExternalImagesResponse, RegisterExternalImageData, RegisterExternalImageResponse, DeleteExternalImageData, DeleteExternalImageResponse, GetExternalImageData, ListFunnelsData, CreateFunnelData, CreateFunnelResponse2 as CreateFunnelResponse, PreviewFunnelMetricsData, PreviewFunnelMetricsResponse, DeleteFunnelData, UpdateFunnelData, GetFunnelMetricsData, UpdateGitSettingsData, UpdateGitSettingsResponse, HasErrorGroupsData, HasAnalyticsEventsData, GetHourlyVisitsData, ListExternalImages2Data, PushExternalImageData, PushExternalImageResponse, GetExternalImage2Data, ListIncidentsData, CreateIncidentData, CreateIncidentResponse, GetBucketedIncidentsData, ListMonitorsData, CreateMonitorData, CreateMonitorResponse, DeleteReleaseSourceMapsData, DeleteReleaseSourceMapsResponse, ListSourceMapsData, UploadSourceMapData, UploadSourceMapResponse, UpdateProjectSettingsData, UpdateProjectSettingsResponse, ListReleasesData, DeleteSourceMapData, DeleteSourceMapResponse, ListStaticBundlesData, ListStaticBundlesResponse, DeleteStaticBundleData, DeleteStaticBundleResponse, GetStaticBundleData, GetStatusOverviewData, GetUniqueCountsData, UploadStaticBundleData, UploadStaticBundleResponse, ListProjectScansData, ListProjectScansError, ListProjectScansResponse, TriggerScanData, TriggerScanError, TriggerScanResponse2 as TriggerScanResponse, GetLatestScansPerEnvironmentData, GetLatestScanData, ListWebhooksData, ListWebhooksResponse, CreateWebhookData, CreateWebhookResponse, DeleteWebhookData, DeleteWebhookResponse, GetWebhookData, 
UpdateWebhookData, UpdateWebhookResponse, ListDeliveriesData, GetDeliveryData, RetryDeliveryData, RetryDeliveryResponse, GetProxyLogsData, GetProxyLogsResponse, GetProxyLogByRequestIdData, GetTimeBucketStatsData, GetTodayStatsData, GetProxyLogByIdData, ListSyncedRepositoriesData, ListSyncedRepositoriesResponse, GetRepositoryByNameData, GetAllRepositoriesByNameData, GetRepositoryPresetByNameData, GetRepositoryBranchesData, GetRepositoryTagsData, GetRepositoryPresetLiveData, GetBranchesByRepositoryIdData, ListCommitsByRepositoryIdData, CheckCommitExistsData, GetTagsByRepositoryIdData, GetProjectSessionReplaysData, GetProjectSessionReplaysError, GetProjectSessionReplaysResponse2 as GetProjectSessionReplaysResponse, GetSessionEvents2Data, GetSettingsData, UpdateSettingsData, UpdateSettingsResponse, ListTemplatesData, ListTemplateTagsData, GetTemplateData, GetCurrentUserData, ListUsersData, CreateUserData, CreateUserResponse, UpdateSelfData, UpdateSelfResponse, DisableMfaData, DisableMfaResponse, SetupMfaData, SetupMfaResponse, VerifyAndEnableMfaData, VerifyAndEnableMfaResponse, DeleteUserData, DeleteUserResponse, UpdateUserData, UpdateUserResponse, RestoreUserData, RestoreUserResponse, AssignRoleData, RemoveRoleData, RemoveRoleResponse, GetVisitorSessions2Data, GetVisitorSessions2Error, GetVisitorSessions2Response, DeleteSessionReplayData, DeleteSessionReplayError, GetSessionReplayData, UpdateSessionDurationData, UpdateSessionDurationError, UpdateSessionDurationResponse2 as UpdateSessionDurationResponse, GetSessionReplayEventsData, AddEventsData, AddEventsError, AddEventsResponse2 as AddEventsResponse, DeleteScanData, DeleteScanError, DeleteScanResponse, GetScanData, GetScanVulnerabilitiesData, GetScanVulnerabilitiesError, GetScanVulnerabilitiesResponse, ListEventTypesData, TriggerWeeklyDigestData, TriggerWeeklyDigestResponse, ListAuditLogsData, ListAuditLogsResponse, GetAuditLogData } from '../types.gen'; import { client } from '../client.gen'; export type QueryKey = 
[
@@ -165,6 +165,106 @@ export const getActiveVisitorsOptions = (options: Options
     });
 };

+export const getEventDetailQueryKey = (options: Options<GetEventDetailData>) => createQueryKey('getEventDetail', options);
+
+/**
+ * Get detailed analytics for a specific event
+ */
+export const getEventDetailOptions = (options: Options<GetEventDetailData>) => {
+    return queryOptions({
+        queryFn: async ({ queryKey, signal }) => {
+            const { data } = await getEventDetail({
+                ...options,
+                ...queryKey[0],
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: getEventDetailQueryKey(options)
+    });
+};
+
+export const getEventVisitorsQueryKey = (options: Options<GetEventVisitorsData>) => createQueryKey('getEventVisitors', options);
+
+/**
+ * Get paginated list of visitors who triggered a specific event
+ */
+export const getEventVisitorsOptions = (options: Options<GetEventVisitorsData>) => {
+    return queryOptions({
+        queryFn: async ({ queryKey, signal }) => {
+            const { data } = await getEventVisitors({
+                ...options,
+                ...queryKey[0],
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: getEventVisitorsQueryKey(options)
+    });
+};
+
+const createInfiniteParams = <K extends Pick<QueryKey<Options>[0], 'body' | 'headers' | 'path' | 'query'>>(queryKey: QueryKey<Options>, page: K) => {
+    const params = {
+        ...queryKey[0]
+    };
+    if (page.body) {
+        params.body = {
+            ...queryKey[0].body as any,
+            ...page.body as any
+        };
+    }
+    if (page.headers) {
+        params.headers = {
+            ...queryKey[0].headers,
+            ...page.headers
+        };
+    }
+    if (page.path) {
+        params.path = {
+            ...queryKey[0].path as any,
+            ...page.path as any
+        };
+    }
+    if (page.query) {
+        params.query = {
+            ...queryKey[0].query as any,
+            ...page.query as any
+        };
+    }
+    return params as unknown as typeof page;
+};
+
+export const getEventVisitorsInfiniteQueryKey = (options: Options<GetEventVisitorsData>): QueryKey<Options<GetEventVisitorsData>> => createQueryKey('getEventVisitors', options, true);
+
+/**
+ * Get paginated list of visitors who triggered a specific event
+ */
+export const getEventVisitorsInfiniteOptions = (options: Options<GetEventVisitorsData>) => {
+    return infiniteQueryOptions<GetEventVisitorsResponse, DefaultError, InfiniteData<GetEventVisitorsResponse>, QueryKey<Options<GetEventVisitorsData>>, number | Pick<QueryKey<Options<GetEventVisitorsData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<GetEventVisitorsData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await getEventVisitors({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: getEventVisitorsInfiniteQueryKey(options)
+    });
+};
+
 export const getEventsCountQueryKey = (options: Options<GetEventsCountData>) => createQueryKey('getEventsCount', options);

 export const getEventsCountOptions = (options: Options<GetEventsCountData>) => {
@@ -300,37 +400,6 @@ export const getPagePathVisitorsOptions = (options: Options
     });
 };

-const createInfiniteParams = <K extends Pick<QueryKey<Options>[0], 'body' | 'headers' | 'path' | 'query'>>(queryKey: QueryKey<Options>, page: K) => {
-    const params = {
-        ...queryKey[0]
-    };
-    if (page.body) {
-        params.body = {
-            ...queryKey[0].body as any,
-            ...page.body as any
-        };
-    }
-    if (page.headers) {
-        params.headers = {
-            ...queryKey[0].headers,
-            ...page.headers
-        };
-    }
-    if (page.path) {
-        params.path = {
-            ...queryKey[0].path as any,
-            ...page.path as any
-        };
-    }
-    if (page.query) {
-        params.query = {
-            ...queryKey[0].query as any,
-            ...page.query as any
-        };
-    }
-    return params as unknown as typeof page;
-};
-
 export const getPagePathVisitorsInfiniteQueryKey = (options: Options<GetPagePathVisitorsData>): QueryKey<Options<GetPagePathVisitorsData>> => createQueryKey('getPagePathVisitors', options, true);

 /**
@@ -1913,6 +1982,35 @@ export const listDomainsOptions = (options?: Options) => {
     });
 };

+export const listDomainsInfiniteQueryKey = (options?: Options<ListDomainsData>): QueryKey<Options<ListDomainsData>> => createQueryKey('listDomains', options, true);
+
+/**
+ * List all domains
+ */
+export const listDomainsInfiniteOptions = (options?: Options<ListDomainsData>) => {
+    return infiniteQueryOptions<ListDomainsResponse, DefaultError, InfiniteData<ListDomainsResponse>, QueryKey<Options<ListDomainsData>>, number | null | Pick<QueryKey<Options<ListDomainsData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<ListDomainsData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await listDomains({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listDomainsInfiniteQueryKey(options)
+    });
+};
+
 /**
  * Create a new domain
  * Creates a new domain and automatically requests a Let's Encrypt challenge.
@@ -2576,6 +2674,35 @@ export const listServicesOptions = (options?: Options) => {
     });
 };

+export const listServicesInfiniteQueryKey = (options?: Options<ListServicesData>): QueryKey<Options<ListServicesData>> => createQueryKey('listServices', options, true);
+
+/**
+ * Get all external services
+ */
+export const listServicesInfiniteOptions = (options?: Options<ListServicesData>) => {
+    return infiniteQueryOptions<ListServicesResponse, DefaultError, InfiniteData<ListServicesResponse>, QueryKey<Options<ListServicesData>>, number | null | Pick<QueryKey<Options<ListServicesData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<ListServicesData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await listServices({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listServicesInfiniteQueryKey(options)
+    });
+};
+
 /**
  * Create new external service
  */
@@ -2670,6 +2797,35 @@ export const listProjectServicesOptions = (options: Options
     });
 };

+export const listProjectServicesInfiniteQueryKey = (options: Options<ListProjectServicesData>): QueryKey<Options<ListProjectServicesData>> => createQueryKey('listProjectServices', options, true);
+
+/**
+ * List services linked to a project
+ */
+export const listProjectServicesInfiniteOptions = (options: Options<ListProjectServicesData>) => {
+    return infiniteQueryOptions<ListProjectServicesResponse, DefaultError, InfiniteData<ListProjectServicesResponse>, QueryKey<Options<ListProjectServicesData>>, number | null | Pick<QueryKey<Options<ListProjectServicesData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<ListProjectServicesData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await listProjectServices({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listProjectServicesInfiniteQueryKey(options)
+    });
+};
+
 export const getProjectServiceEnvironmentVariablesQueryKey = (options: Options<GetProjectServiceEnvironmentVariablesData>) => createQueryKey('getProjectServiceEnvironmentVariables', options);

 /**
@@ -2884,6 +3040,35 @@ export const listServiceProjectsOptions = (options: Options
     });
 };

+export const listServiceProjectsInfiniteQueryKey = (options: Options<ListServiceProjectsData>): QueryKey<Options<ListServiceProjectsData>> => createQueryKey('listServiceProjects', options, true);
+
+/**
+ * List projects linked to service
+ */
+export const listServiceProjectsInfiniteOptions = (options: Options<ListServiceProjectsData>) => {
+    return infiniteQueryOptions<ListServiceProjectsResponse, DefaultError, InfiniteData<ListServiceProjectsResponse>, QueryKey<Options<ListServiceProjectsData>>, number | null | Pick<QueryKey<Options<ListServiceProjectsData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<ListServiceProjectsData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await listServiceProjects({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listServiceProjectsInfiniteQueryKey(options)
+    });
+};
+
 /**
  * Link service to project
  */
@@ -4442,6 +4627,35 @@ export const listNotificationProvidersOptions = (options?: Options
     });
 };

+export const listNotificationProvidersInfiniteQueryKey = (options?: Options<ListNotificationProvidersData>): QueryKey<Options<ListNotificationProvidersData>> => createQueryKey('listNotificationProviders', options, true);
+
+/**
+ * List all notification providers
+ */
+export const listNotificationProvidersInfiniteOptions = (options?: Options<ListNotificationProvidersData>) => {
+    return infiniteQueryOptions<ListNotificationProvidersResponse, DefaultError, InfiniteData<ListNotificationProvidersResponse>, QueryKey<Options<ListNotificationProvidersData>>, number | null | Pick<QueryKey<Options<ListNotificationProvidersData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<ListNotificationProvidersData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await listNotificationProviders({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listNotificationProvidersInfiniteQueryKey(options)
+    });
+};
+
 /**
  * Create a new notification provider
  */
@@ -4656,6 +4870,35 @@ export const listOrdersOptions = (options?: Options) => {
     });
 };

+export const listOrdersInfiniteQueryKey = (options?: Options<ListOrdersData>): QueryKey<Options<ListOrdersData>> => createQueryKey('listOrders', options, true);
+
+/**
+ * List all ACME orders
+ */
+export const listOrdersInfiniteOptions = (options?: Options<ListOrdersData>) => {
+    return infiniteQueryOptions<ListOrdersResponse, DefaultError, InfiniteData<ListOrdersResponse>, QueryKey<Options<ListOrdersData>>, number | null | Pick<QueryKey<Options<ListOrdersData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<ListOrdersData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await listOrders({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listOrdersInfiniteQueryKey(options)
+    });
+};
+
 export const hasPerformanceMetricsQueryKey = (options: Options<HasPerformanceMetricsData>) => createQueryKey('hasPerformanceMetrics', options);

 /**
@@ -4818,6 +5061,26 @@ export const listPresetsOptions = (options?: Options) => {
     });
 };

+/**
+ * Generate a Dockerfile from a preset
+ * Returns the Dockerfile content and build arguments for a given preset slug.
+ * The CLI can use this to build Docker images locally without needing a Dockerfile
+ * in the project directory, enabling zero-config deployments.
+ */
+export const generatePresetDockerfileMutation = (options?: Partial<Options<GeneratePresetDockerfileData>>): UseMutationOptions<GeneratePresetDockerfileResponse, DefaultError, Options<GeneratePresetDockerfileData>> => {
+    const mutationOptions: UseMutationOptions<GeneratePresetDockerfileResponse, DefaultError, Options<GeneratePresetDockerfileData>> = {
+        mutationFn: async (fnOptions) => {
+            const { data } = await generatePresetDockerfile({
+                ...options,
+                ...fnOptions,
+                throwOnError: true
+            });
+            return data;
+        }
+    };
+    return mutationOptions;
+};
+
 export const getProjectsQueryKey = (options?: Options<GetProjectsData>) => createQueryKey('getProjects', options);

 /**
@@ -5325,7 +5588,7 @@ export const tailDeploymentJobLogsQueryKey = (options: Options) => {
     });
 };

+export const listWebhooksInfiniteQueryKey = (options: Options<ListWebhooksData>): QueryKey<Options<ListWebhooksData>> => createQueryKey('listWebhooks', options, true);
+
+/**
+ * List all webhooks for a project
+ */
+export const listWebhooksInfiniteOptions = (options: Options<ListWebhooksData>) => {
+    return infiniteQueryOptions<ListWebhooksResponse, DefaultError, InfiniteData<ListWebhooksResponse>, QueryKey<Options<ListWebhooksData>>, number | null | Pick<QueryKey<Options<ListWebhooksData>>[0], 'body' | 'headers' | 'path' | 'query'>>(
+    // @ts-ignore
+    {
+        queryFn: async ({ pageParam, queryKey, signal }) => {
+            // @ts-ignore
+            const page: Pick<QueryKey<Options<ListWebhooksData>>[0], 'body' | 'headers' | 'path' | 'query'> = typeof pageParam === 'object' ? pageParam : {
+                query: {
+                    page: pageParam
+                }
+            };
+            const params = createInfiniteParams(queryKey, page);
+            const { data } = await listWebhooks({
+                ...options,
+                ...params,
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listWebhooksInfiniteQueryKey(options)
+    });
+};
+
 /**
  * Create a new webhook
  */
@@ -7773,6 +8065,26 @@ export const getBranchesByRepositoryIdOptions = (options: Options
     });
 };

+export const listCommitsByRepositoryIdQueryKey = (options: Options<ListCommitsByRepositoryIdData>) => createQueryKey('listCommitsByRepositoryId', options);
+
+/**
+ * List recent commits for a repository branch
+ */
+export const listCommitsByRepositoryIdOptions = (options: Options<ListCommitsByRepositoryIdData>) => {
+    return queryOptions({
+        queryFn: async ({ queryKey, signal }) => {
+            const { data } = await listCommitsByRepositoryId({
+                ...options,
+                ...queryKey[0],
+                signal,
+                throwOnError: true
+            });
+            return data;
+        },
+        queryKey: listCommitsByRepositoryIdQueryKey(options)
+    });
+};
+
 export const checkCommitExistsQueryKey = (options: Options<CheckCommitExistsData>) => createQueryKey('checkCommitExists', options);

 /**
diff --git a/web/src/api/client/sdk.gen.ts b/web/src/api/client/sdk.gen.ts
index f774a047..37cf8711 100644
--- a/web/src/api/client/sdk.gen.ts
+++ b/web/src/api/client/sdk.gen.ts
@@ -1,7 +1,7 @@
 // This file is auto-generated by @hey-api/openapi-ts
 import type { Options as ClientOptions, Client, TDataShape } from './client';
-import type { GetPlatformInfoData, GetPlatformInfoResponses, GetPlatformInfoErrors, RecordEventMetricsData, RecordEventMetricsResponses, RecordEventMetricsErrors, AddSessionReplayEventsData, AddSessionReplayEventsResponses, AddSessionReplayEventsErrors, InitSessionReplayData, InitSessionReplayResponses, InitSessionReplayErrors, RecordSpeedMetricsData, RecordSpeedMetricsResponses, RecordSpeedMetricsErrors, UpdateSpeedMetricsData, UpdateSpeedMetricsResponses, UpdateSpeedMetricsErrors, GetActiveVisitorsData, GetActiveVisitorsResponses, GetActiveVisitorsErrors, GetEventsCountData, GetEventsCountResponses, GetEventsCountErrors, GetGeneralStatsData, GetGeneralStatsResponses, 
GetGeneralStatsErrors, GetLiveVisitorsListData, GetLiveVisitorsListResponses, GetLiveVisitorsListErrors, GetPageFlowData, GetPageFlowResponses, GetPageFlowErrors, GetPageHourlySessionsData, GetPageHourlySessionsResponses, GetPageHourlySessionsErrors, GetPagePathDetailData, GetPagePathDetailResponses, GetPagePathDetailErrors, GetPagePathVisitorsData, GetPagePathVisitorsResponses, GetPagePathVisitorsErrors, GetPagePathsData, GetPagePathsResponses, GetPagePathsErrors, GetPagePathsSparklinesData, GetPagePathsSparklinesResponses, GetPagePathsSparklinesErrors, GetRecentActivityData, GetRecentActivityResponses, GetRecentActivityErrors, GetSessionDetailsData, GetSessionDetailsResponses, GetSessionDetailsErrors, GetSessionEventsData, GetSessionEventsResponses, GetSessionEventsErrors, GetSessionLogsData, GetSessionLogsResponses, GetSessionLogsErrors, GetVisitorsData, GetVisitorsResponses, GetVisitorsErrors, GetVisitorByGuidData, GetVisitorByGuidResponses, GetVisitorByGuidErrors, GetVisitorByIdData, GetVisitorByIdResponses, GetVisitorByIdErrors, GetVisitorDetailsData, GetVisitorDetailsResponses, GetVisitorDetailsErrors, EnrichVisitorData, EnrichVisitorResponses, EnrichVisitorErrors, GetVisitorInfoData, GetVisitorInfoResponses, GetVisitorInfoErrors, GetVisitorJourneyData, GetVisitorJourneyResponses, GetVisitorJourneyErrors, GetVisitorSessionsData, GetVisitorSessionsResponses, GetVisitorSessionsErrors, GetVisitorStatsData, GetVisitorStatsResponses, GetVisitorStatsErrors, ListApiKeysData, ListApiKeysResponses, ListApiKeysErrors, CreateApiKeyData, CreateApiKeyResponses, CreateApiKeyErrors, GetApiKeyPermissionsData, GetApiKeyPermissionsResponses, GetApiKeyPermissionsErrors, DeleteApiKeyData, DeleteApiKeyResponses, DeleteApiKeyErrors, GetApiKeyData, GetApiKeyResponses, GetApiKeyErrors, UpdateApiKeyData, UpdateApiKeyResponses, UpdateApiKeyErrors, ActivateApiKeyData, ActivateApiKeyResponses, ActivateApiKeyErrors, DeactivateApiKeyData, DeactivateApiKeyResponses, 
DeactivateApiKeyErrors, ChunkUploadOptionsData, ChunkUploadOptionsResponses, CreateReleaseData, CreateReleaseResponses, CreateReleaseErrors, ListReleaseFilesData, ListReleaseFilesResponses, ListReleaseFilesErrors, UploadReleaseFileData, UploadReleaseFileResponses, UploadReleaseFileErrors, GetDeploymentJobLogsData, GetDeploymentJobLogsResponses, GetDeploymentJobLogsErrors, IngestSentryEnvelopeData, IngestSentryEnvelopeResponses, IngestSentryEnvelopeErrors, IngestSentryEventData, IngestSentryEventResponses, IngestSentryEventErrors, EmailStatusData, EmailStatusResponses, EmailStatusErrors, LoginData, LoginResponses, LoginErrors, RequestMagicLinkData, RequestMagicLinkResponses, RequestMagicLinkErrors, VerifyMagicLinkData, VerifyMagicLinkResponses, VerifyMagicLinkErrors, RequestPasswordResetData, RequestPasswordResetResponses, RequestPasswordResetErrors, ResetPasswordData, ResetPasswordResponses, ResetPasswordErrors, VerifyEmailData, VerifyEmailResponses, VerifyEmailErrors, VerifyMfaChallengeData, VerifyMfaChallengeResponses, VerifyMfaChallengeErrors, RunExternalServiceBackupData, RunExternalServiceBackupResponses, RunExternalServiceBackupErrors, ListS3SourcesData, ListS3SourcesResponses, ListS3SourcesErrors, CreateS3SourceData, CreateS3SourceResponses, CreateS3SourceErrors, DeleteS3SourceData, DeleteS3SourceResponses, DeleteS3SourceErrors, GetS3SourceData, GetS3SourceResponses, GetS3SourceErrors, UpdateS3SourceData, UpdateS3SourceResponses, UpdateS3SourceErrors, ListSourceBackupsData, ListSourceBackupsResponses, ListSourceBackupsErrors, RunBackupForSourceData, RunBackupForSourceResponses, RunBackupForSourceErrors, ListBackupSchedulesData, ListBackupSchedulesResponses, ListBackupSchedulesErrors, CreateBackupScheduleData, CreateBackupScheduleResponses, CreateBackupScheduleErrors, DeleteBackupScheduleData, DeleteBackupScheduleResponses, DeleteBackupScheduleErrors, GetBackupScheduleData, GetBackupScheduleResponses, GetBackupScheduleErrors, ListBackupsForScheduleData, 
ListBackupsForScheduleResponses, ListBackupsForScheduleErrors, DisableBackupScheduleData, DisableBackupScheduleResponses, DisableBackupScheduleErrors, EnableBackupScheduleData, EnableBackupScheduleResponses, EnableBackupScheduleErrors, GetBackupData, GetBackupResponses, GetBackupErrors, BlobDeleteData, BlobDeleteResponses, BlobDeleteErrors, BlobListData, BlobListResponses, BlobListErrors, BlobPutData, BlobPutResponses, BlobPutErrors, BlobCopyData, BlobCopyResponses, BlobCopyErrors, BlobDisableData, BlobDisableResponses, BlobDisableErrors, BlobEnableData, BlobEnableResponses, BlobEnableErrors, BlobStatusData, BlobStatusResponses, BlobStatusErrors, BlobUpdateData, BlobUpdateResponses, BlobUpdateErrors, BlobDownloadData, BlobDownloadResponses, BlobDownloadErrors, BlobHeadData, BlobHeadResponses, BlobHeadErrors, GetDashboardProjectsAnalyticsData, GetDashboardProjectsAnalyticsResponses, GetDashboardProjectsAnalyticsErrors, GetActivityGraphData, GetActivityGraphResponses, GetActivityGraphErrors, GetScanByDeploymentData, GetScanByDeploymentResponses, GetScanByDeploymentErrors, ListProvidersData, ListProvidersResponses, ListProvidersErrors, CreateProviderData, CreateProviderResponses, CreateProviderErrors, DeleteProviderData, DeleteProviderResponses, DeleteProviderErrors, GetProviderData, GetProviderResponses, GetProviderErrors, UpdateProviderData, UpdateProviderResponses, UpdateProviderErrors, ListManagedDomainsData, ListManagedDomainsResponses, ListManagedDomainsErrors, AddManagedDomainData, AddManagedDomainResponses, AddManagedDomainErrors, TestProviderConnectionData, TestProviderConnectionResponses, TestProviderConnectionErrors, ListProviderZonesData, ListProviderZonesResponses, ListProviderZonesErrors, RemoveManagedDomainData, RemoveManagedDomainResponses, RemoveManagedDomainErrors, VerifyManagedDomainData, VerifyManagedDomainResponses, VerifyManagedDomainErrors, LookupDnsARecordsData, LookupDnsARecordsResponses, LookupDnsARecordsErrors, ListDomainsData, 
ListDomainsResponses, ListDomainsErrors, CreateDomainData, CreateDomainResponses, CreateDomainErrors, GetDomainByHostData, GetDomainByHostResponses, GetDomainByHostErrors, CancelDomainOrderData, CancelDomainOrderResponses, CancelDomainOrderErrors, GetDomainOrderData, GetDomainOrderResponses, GetDomainOrderErrors, CreateOrRecreateOrderData, CreateOrRecreateOrderResponses, CreateOrRecreateOrderErrors, FinalizeOrderData, FinalizeOrderResponses, FinalizeOrderErrors, SetupDnsChallengeData, SetupDnsChallengeResponses, SetupDnsChallengeErrors, DeleteDomainData, DeleteDomainResponses, DeleteDomainErrors, GetDomainByIdData, GetDomainByIdResponses, GetDomainByIdErrors, GetChallengeTokenData, GetChallengeTokenResponses, GetChallengeTokenErrors, GetHttpChallengeDebugData, GetHttpChallengeDebugResponses, GetHttpChallengeDebugErrors, ProvisionDomainData, ProvisionDomainResponses, ProvisionDomainErrors, RenewDomainData, RenewDomainResponses, RenewDomainErrors, CheckDomainStatusData, CheckDomainStatusResponses, CheckDomainStatusErrors, ListDomains2Data, ListDomains2Responses, ListDomains2Errors, CreateDomain2Data, CreateDomain2Responses, CreateDomain2Errors, GetDomainByNameData, GetDomainByNameResponses, GetDomainByNameErrors, DeleteDomain2Data, DeleteDomain2Responses, DeleteDomain2Errors, GetDomainData, GetDomainResponses, GetDomainErrors, GetDomainDnsRecordsData, GetDomainDnsRecordsResponses, GetDomainDnsRecordsErrors, SetupDnsData, SetupDnsResponses, SetupDnsErrors, VerifyDomainData, VerifyDomainResponses, VerifyDomainErrors, ListProviders2Data, ListProviders2Responses, ListProviders2Errors, CreateProvider2Data, CreateProvider2Responses, CreateProvider2Errors, DeleteProvider2Data, DeleteProvider2Responses, DeleteProvider2Errors, GetProvider2Data, GetProvider2Responses, GetProvider2Errors, TestProviderData, TestProviderResponses, TestProviderErrors, ListEmailsData, ListEmailsResponses, ListEmailsErrors, SendEmailData, SendEmailResponses, SendEmailErrors, GetEmailStatsData, 
GetEmailStatsResponses, GetEmailStatsErrors, ValidateEmailData, ValidateEmailResponses, ValidateEmailErrors, GetEmailData, GetEmailResponses, GetEmailErrors, ListServicesData, ListServicesResponses, ListServicesErrors, CreateServiceData, CreateServiceResponses, CreateServiceErrors, ListAvailableContainersData, ListAvailableContainersResponses, ListAvailableContainersErrors, GetServiceBySlugData, GetServiceBySlugResponses, GetServiceBySlugErrors, ImportExternalServiceData, ImportExternalServiceResponses, ImportExternalServiceErrors, ListProjectServicesData, ListProjectServicesResponses, ListProjectServicesErrors, GetProjectServiceEnvironmentVariablesData, GetProjectServiceEnvironmentVariablesResponses, GetProjectServiceEnvironmentVariablesErrors, GetProvidersMetadataData, GetProvidersMetadataResponses, GetProvidersMetadataErrors, GetProviderMetadataData, GetProviderMetadataResponses, GetProviderMetadataErrors, GetServiceTypesData, GetServiceTypesResponses, GetServiceTypesErrors, GetServiceTypeParametersData, GetServiceTypeParametersResponses, GetServiceTypeParametersErrors, DeleteServiceData, DeleteServiceResponses, DeleteServiceErrors, GetServiceData, GetServiceResponses, GetServiceErrors, UpdateServiceData, UpdateServiceResponses, UpdateServiceErrors, GetServicePreviewEnvironmentVariablesMaskedData, GetServicePreviewEnvironmentVariablesMaskedResponses, GetServicePreviewEnvironmentVariablesMaskedErrors, GetServicePreviewEnvironmentVariableNamesData, GetServicePreviewEnvironmentVariableNamesResponses, GetServicePreviewEnvironmentVariableNamesErrors, ListServiceProjectsData, ListServiceProjectsResponses, ListServiceProjectsErrors, LinkServiceToProjectData, LinkServiceToProjectResponses, LinkServiceToProjectErrors, UnlinkServiceFromProjectData, UnlinkServiceFromProjectResponses, UnlinkServiceFromProjectErrors, GetServiceEnvironmentVariablesData, GetServiceEnvironmentVariablesResponses, GetServiceEnvironmentVariablesErrors, GetServiceEnvironmentVariableData, 
GetServiceEnvironmentVariableResponses, GetServiceEnvironmentVariableErrors, StartServiceData, StartServiceResponses, StartServiceErrors, StopServiceData, StopServiceResponses, StopServiceErrors, UpgradeServiceData, UpgradeServiceResponses, UpgradeServiceErrors, ListRootContainersData, ListRootContainersResponses, ListRootContainersErrors, ListContainersAtPathData, ListContainersAtPathResponses, ListContainersAtPathErrors, ListEntitiesData, ListEntitiesResponses, ListEntitiesErrors, GetEntityInfoData, GetEntityInfoResponses, GetEntityInfoErrors, QueryDataData, QueryDataResponses, QueryDataErrors, DownloadObjectData, DownloadObjectResponses, DownloadObjectErrors, GetContainerInfoData, GetContainerInfoResponses, GetContainerInfoErrors, CheckExplorerSupportData, CheckExplorerSupportResponses, CheckExplorerSupportErrors, GetFileData, GetFileResponses, GetFileErrors, GetIpGeolocationData, GetIpGeolocationResponses, GetIpGeolocationErrors, ListConnectionsData, ListConnectionsResponses, ListConnectionsErrors, DeleteConnectionData, DeleteConnectionResponses, DeleteConnectionErrors, ActivateConnectionData, ActivateConnectionResponses, ActivateConnectionErrors, DeactivateConnectionData, DeactivateConnectionResponses, DeactivateConnectionErrors, ListRepositoriesByConnectionData, ListRepositoriesByConnectionResponses, ListRepositoriesByConnectionErrors, SyncRepositoriesData, SyncRepositoriesResponses, SyncRepositoriesErrors, UpdateConnectionTokenData, UpdateConnectionTokenResponses, UpdateConnectionTokenErrors, ValidateConnectionData, ValidateConnectionResponses, ValidateConnectionErrors, ListGitProvidersData, ListGitProvidersResponses, ListGitProvidersErrors, CreateGitProviderData, CreateGitProviderResponses, CreateGitProviderErrors, CreateGithubPatProviderData, CreateGithubPatProviderResponses, CreateGithubPatProviderErrors, CreateGitlabOauthProviderData, CreateGitlabOauthProviderResponses, CreateGitlabOauthProviderErrors, CreateGitlabPatProviderData, 
CreateGitlabPatProviderResponses, CreateGitlabPatProviderErrors, DeleteProvider3Data, DeleteProvider3Responses, DeleteProvider3Errors, GetGitProviderData, GetGitProviderResponses, GetGitProviderErrors, ActivateProviderData, ActivateProviderResponses, ActivateProviderErrors, HandleGitProviderOauthCallbackData, HandleGitProviderOauthCallbackErrors, GetProviderConnectionsData, GetProviderConnectionsResponses, GetProviderConnectionsErrors, DeactivateProviderData, DeactivateProviderResponses, DeactivateProviderErrors, CheckProviderDeletionSafetyData, CheckProviderDeletionSafetyResponses, CheckProviderDeletionSafetyErrors, StartGitProviderOauthData, StartGitProviderOauthErrors, DeleteProviderSafelyData, DeleteProviderSafelyResponses, DeleteProviderSafelyErrors, GetPublicRepositoryData, GetPublicRepositoryResponses, GetPublicRepositoryErrors, GetPublicBranchesData, GetPublicBranchesResponses, GetPublicBranchesErrors, DetectPublicPresetsData, DetectPublicPresetsResponses, DetectPublicPresetsErrors, DiscoverWorkloadsData, DiscoverWorkloadsResponses, DiscoverWorkloadsErrors, ExecuteImportData, ExecuteImportResponses, ExecuteImportErrors, CreatePlanData, CreatePlanResponses, CreatePlanErrors, ListSourcesData, ListSourcesResponses, ListSourcesErrors, GetImportStatusData, GetImportStatusResponses, GetImportStatusErrors, GetIncidentData, GetIncidentResponses, GetIncidentErrors, UpdateIncidentStatusData, UpdateIncidentStatusResponses, UpdateIncidentStatusErrors, GetIncidentUpdatesData, GetIncidentUpdatesResponses, GetIncidentUpdatesErrors, ListIpAccessControlData, ListIpAccessControlResponses, ListIpAccessControlErrors, CreateIpAccessControlData, CreateIpAccessControlResponses, CreateIpAccessControlErrors, CheckIpBlockedData, CheckIpBlockedResponses, CheckIpBlockedErrors, DeleteIpAccessControlData, DeleteIpAccessControlResponses, DeleteIpAccessControlErrors, GetIpAccessControlData, GetIpAccessControlResponses, GetIpAccessControlErrors, UpdateIpAccessControlData, 
UpdateIpAccessControlResponses, UpdateIpAccessControlErrors, KvDelData, KvDelResponses, KvDelErrors, KvDisableData, KvDisableResponses, KvDisableErrors, KvEnableData, KvEnableResponses, KvEnableErrors, KvExpireData, KvExpireResponses, KvExpireErrors, KvGetData, KvGetResponses, KvGetErrors, KvIncrData, KvIncrResponses, KvIncrErrors, KvKeysData, KvKeysResponses, KvKeysErrors, KvSetData, KvSetResponses, KvSetErrors, KvStatusData, KvStatusResponses, KvStatusErrors, KvTtlData, KvTtlResponses, KvTtlErrors, KvUpdateData, KvUpdateResponses, KvUpdateErrors, ListRoutesData, ListRoutesResponses, ListRoutesErrors, CreateRouteData, CreateRouteResponses, CreateRouteErrors, DeleteRouteData, DeleteRouteResponses, DeleteRouteErrors, GetRouteData, GetRouteResponses, GetRouteErrors, UpdateRouteData, UpdateRouteResponses, UpdateRouteErrors, LogoutData, LogoutResponses, LogoutErrors, DeleteMonitorData, DeleteMonitorResponses, DeleteMonitorErrors, GetMonitorData, GetMonitorResponses, GetMonitorErrors, GetBucketedStatusData, GetBucketedStatusResponses, GetBucketedStatusErrors, GetCurrentMonitorStatusData, GetCurrentMonitorStatusResponses, GetCurrentMonitorStatusErrors, GetUptimeHistoryData, GetUptimeHistoryResponses, GetUptimeHistoryErrors, DeletePreferencesData, DeletePreferencesResponses, DeletePreferencesErrors, GetPreferencesData, GetPreferencesResponses, GetPreferencesErrors, UpdatePreferencesData, UpdatePreferencesResponses, UpdatePreferencesErrors, ListNotificationProvidersData, ListNotificationProvidersResponses, ListNotificationProvidersErrors, CreateNotificationProviderData, CreateNotificationProviderResponses, CreateNotificationProviderErrors, CreateEmailProviderData, CreateEmailProviderResponses, CreateEmailProviderErrors, UpdateEmailProviderData, UpdateEmailProviderResponses, UpdateEmailProviderErrors, CreateSlackProviderData, CreateSlackProviderResponses, CreateSlackProviderErrors, UpdateSlackProviderData, UpdateSlackProviderResponses, UpdateSlackProviderErrors, 
CreateWebhookProviderData, CreateWebhookProviderResponses, CreateWebhookProviderErrors, UpdateWebhookProviderData, UpdateWebhookProviderResponses, UpdateWebhookProviderErrors, DeleteProvider4Data, DeleteProvider4Responses, DeleteProvider4Errors, GetNotificationProviderData, GetNotificationProviderResponses, GetNotificationProviderErrors, UpdateProvider2Data, UpdateProvider2Responses, UpdateProvider2Errors, TestProvider2Data, TestProvider2Responses, TestProvider2Errors, ListOrdersData, ListOrdersResponses, ListOrdersErrors, HasPerformanceMetricsData, HasPerformanceMetricsResponses, HasPerformanceMetricsErrors, GetPerformanceMetricsData, GetPerformanceMetricsResponses, GetPerformanceMetricsErrors, GetMetricsOverTimeData, GetMetricsOverTimeResponses, GetMetricsOverTimeErrors, GetGroupedPageMetricsData, GetGroupedPageMetricsResponses, GetGroupedPageMetricsErrors, GetAccessInfoData, GetAccessInfoResponses, GetAccessInfoErrors, GetPrivateIpData, GetPrivateIpResponses, GetPrivateIpErrors, GetPublicIpData, GetPublicIpResponses, GetPublicIpErrors, ListPresetsData, ListPresetsResponses, ListPresetsErrors, GetProjectsData, GetProjectsResponses, GetProjectsErrors, CreateProjectData, CreateProjectResponses, CreateProjectErrors, GetProjectBySlugData, GetProjectBySlugResponses, GetProjectBySlugErrors, CreateProjectFromTemplateData, CreateProjectFromTemplateResponses, CreateProjectFromTemplateErrors, GetProjectStatisticsData, GetProjectStatisticsResponses, GetProjectStatisticsErrors, DeleteProjectData, DeleteProjectResponses, DeleteProjectErrors, GetProjectData, GetProjectResponses, GetProjectErrors, UpdateProjectData, UpdateProjectResponses, UpdateProjectErrors, GetProjectDeploymentsData, GetProjectDeploymentsResponses, GetProjectDeploymentsErrors, GetLastDeploymentData, GetLastDeploymentResponses, GetLastDeploymentErrors, TriggerProjectPipelineData, TriggerProjectPipelineResponses, TriggerProjectPipelineErrors, GetActiveVisitors2Data, GetActiveVisitors2Responses, 
GetActiveVisitors2Errors, GetAggregatedBucketsData, GetAggregatedBucketsResponses, GetAggregatedBucketsErrors, UpdateAutomaticDeployData, UpdateAutomaticDeployResponses, UpdateAutomaticDeployErrors, ListCustomDomainsForProjectData, ListCustomDomainsForProjectResponses, ListCustomDomainsForProjectErrors, CreateCustomDomainData, CreateCustomDomainResponses, CreateCustomDomainErrors, DeleteCustomDomainData, DeleteCustomDomainResponses, DeleteCustomDomainErrors, GetCustomDomainData, GetCustomDomainResponses, GetCustomDomainErrors, UpdateCustomDomainData, UpdateCustomDomainResponses, UpdateCustomDomainErrors, LinkCustomDomainToCertificateData, LinkCustomDomainToCertificateResponses, LinkCustomDomainToCertificateErrors, UpdateProjectDeploymentConfigData, UpdateProjectDeploymentConfigResponses, UpdateProjectDeploymentConfigErrors, GetDeploymentData, GetDeploymentResponses, GetDeploymentErrors, CancelDeploymentData, CancelDeploymentResponses, CancelDeploymentErrors, GetDeploymentJobsData, GetDeploymentJobsResponses, GetDeploymentJobsErrors, TailDeploymentJobLogsData, TailDeploymentJobLogsErrors, GetDeploymentOperationsData, GetDeploymentOperationsResponses, GetDeploymentOperationsErrors, ExecuteDeploymentOperationData, ExecuteDeploymentOperationResponses, ExecuteDeploymentOperationErrors, GetDeploymentOperationStatusData, GetDeploymentOperationStatusResponses, GetDeploymentOperationStatusErrors, PauseDeploymentData, PauseDeploymentResponses, PauseDeploymentErrors, ResumeDeploymentData, ResumeDeploymentResponses, ResumeDeploymentErrors, RollbackToDeploymentData, RollbackToDeploymentResponses, RollbackToDeploymentErrors, TeardownDeploymentData, TeardownDeploymentResponses, TeardownDeploymentErrors, ListDsnsData, ListDsnsResponses, CreateDsnData, CreateDsnResponses, CreateDsnErrors, GetOrCreateDsnData, GetOrCreateDsnResponses, GetOrCreateDsnErrors, RegenerateDsnData, RegenerateDsnResponses, RegenerateDsnErrors, RevokeDsnData, RevokeDsnResponses, RevokeDsnErrors, 
GetEnvironmentVariablesData, GetEnvironmentVariablesResponses, GetEnvironmentVariablesErrors, CreateEnvironmentVariableData, CreateEnvironmentVariableResponses, CreateEnvironmentVariableErrors, GetEnvironmentVariableValueData, GetEnvironmentVariableValueResponses, GetEnvironmentVariableValueErrors, DeleteEnvironmentVariableData, DeleteEnvironmentVariableResponses, DeleteEnvironmentVariableErrors, UpdateEnvironmentVariableData, UpdateEnvironmentVariableResponses, UpdateEnvironmentVariableErrors, GetEnvironmentsData, GetEnvironmentsResponses, GetEnvironmentsErrors, CreateEnvironmentData, CreateEnvironmentResponses, CreateEnvironmentErrors, DeleteEnvironmentData, DeleteEnvironmentResponses, DeleteEnvironmentErrors, GetEnvironmentData, GetEnvironmentResponses, GetEnvironmentErrors, GetEnvironmentCronsData, GetEnvironmentCronsResponses, GetEnvironmentCronsErrors, GetCronByIdData, GetCronByIdResponses, GetCronByIdErrors, GetCronExecutionsData, GetCronExecutionsResponses, GetCronExecutionsErrors, GetEnvironmentDomainsData, GetEnvironmentDomainsResponses, GetEnvironmentDomainsErrors, AddEnvironmentDomainData, AddEnvironmentDomainResponses, AddEnvironmentDomainErrors, DeleteEnvironmentDomainData, DeleteEnvironmentDomainResponses, DeleteEnvironmentDomainErrors, UpdateEnvironmentSettingsData, UpdateEnvironmentSettingsResponses, UpdateEnvironmentSettingsErrors, TeardownEnvironmentData, TeardownEnvironmentResponses, TeardownEnvironmentErrors, GetContainerLogsData, GetContainerLogsErrors, ListContainersData, ListContainersResponses, ListContainersErrors, GetContainerDetailData, GetContainerDetailResponses, GetContainerDetailErrors, GetContainerLogsByIdData, GetContainerLogsByIdErrors, GetContainerMetricsData, GetContainerMetricsResponses, GetContainerMetricsErrors, StreamContainerMetricsData, StreamContainerMetricsResponses, StreamContainerMetricsErrors, RestartContainerData, RestartContainerResponses, RestartContainerErrors, StartContainerData, StartContainerResponses, 
StartContainerErrors, StopContainerData, StopContainerResponses, StopContainerErrors, DeployFromImageData, DeployFromImageResponses, DeployFromImageErrors, DeployFromImageUploadData, DeployFromImageUploadResponses, DeployFromImageUploadErrors, DeployFromStaticData, DeployFromStaticResponses, DeployFromStaticErrors, GetErrorDashboardStatsData, GetErrorDashboardStatsResponses, GetErrorDashboardStatsErrors, ListErrorGroupsData, ListErrorGroupsResponses, ListErrorGroupsErrors, GetErrorGroupData, GetErrorGroupResponses, GetErrorGroupErrors, UpdateErrorGroupData, UpdateErrorGroupResponses, UpdateErrorGroupErrors, ListErrorEventsData, ListErrorEventsResponses, ListErrorEventsErrors, GetErrorEventData, GetErrorEventResponses, GetErrorEventErrors, GetErrorStatsData, GetErrorStatsResponses, GetErrorStatsErrors, GetErrorTimeSeriesData, GetErrorTimeSeriesResponses, GetErrorTimeSeriesErrors, GetEventsCount2Data, GetEventsCount2Responses, GetEventsCount2Errors, GetEventTypeBreakdownData, GetEventTypeBreakdownResponses, GetEventTypeBreakdownErrors, GetPropertyBreakdownData, GetPropertyBreakdownResponses, GetPropertyBreakdownErrors, GetPropertyTimelineData, GetPropertyTimelineResponses, GetPropertyTimelineErrors, GetEventsTimelineData, GetEventsTimelineResponses, GetEventsTimelineErrors, GetUniqueEventsData, GetUniqueEventsResponses, GetUniqueEventsErrors, ListExternalImagesData, ListExternalImagesResponses, ListExternalImagesErrors, RegisterExternalImageData, RegisterExternalImageResponses, RegisterExternalImageErrors, DeleteExternalImageData, DeleteExternalImageResponses, DeleteExternalImageErrors, GetExternalImageData, GetExternalImageResponses, GetExternalImageErrors, ListFunnelsData, ListFunnelsResponses, ListFunnelsErrors, CreateFunnelData, CreateFunnelResponses, CreateFunnelErrors, PreviewFunnelMetricsData, PreviewFunnelMetricsResponses, PreviewFunnelMetricsErrors, DeleteFunnelData, DeleteFunnelResponses, DeleteFunnelErrors, UpdateFunnelData, UpdateFunnelResponses, 
UpdateFunnelErrors, GetFunnelMetricsData, GetFunnelMetricsResponses, GetFunnelMetricsErrors, UpdateGitSettingsData, UpdateGitSettingsResponses, UpdateGitSettingsErrors, HasErrorGroupsData, HasErrorGroupsResponses, HasErrorGroupsErrors, HasAnalyticsEventsData, HasAnalyticsEventsResponses, HasAnalyticsEventsErrors, GetHourlyVisitsData, GetHourlyVisitsResponses, GetHourlyVisitsErrors, ListExternalImages2Data, ListExternalImages2Responses, ListExternalImages2Errors, PushExternalImageData, PushExternalImageResponses, PushExternalImageErrors, GetExternalImage2Data, GetExternalImage2Responses, GetExternalImage2Errors, ListIncidentsData, ListIncidentsResponses, ListIncidentsErrors, CreateIncidentData, CreateIncidentResponses, CreateIncidentErrors, GetBucketedIncidentsData, GetBucketedIncidentsResponses, GetBucketedIncidentsErrors, ListMonitorsData, ListMonitorsResponses, ListMonitorsErrors, CreateMonitorData, CreateMonitorResponses, CreateMonitorErrors, DeleteReleaseSourceMapsData, DeleteReleaseSourceMapsResponses, DeleteReleaseSourceMapsErrors, ListSourceMapsData, ListSourceMapsResponses, ListSourceMapsErrors, UploadSourceMapData, UploadSourceMapResponses, UploadSourceMapErrors, UpdateProjectSettingsData, UpdateProjectSettingsResponses, UpdateProjectSettingsErrors, ListReleasesData, ListReleasesResponses, ListReleasesErrors, DeleteSourceMapData, DeleteSourceMapResponses, DeleteSourceMapErrors, ListStaticBundlesData, ListStaticBundlesResponses, ListStaticBundlesErrors, DeleteStaticBundleData, DeleteStaticBundleResponses, DeleteStaticBundleErrors, GetStaticBundleData, GetStaticBundleResponses, GetStaticBundleErrors, GetStatusOverviewData, GetStatusOverviewResponses, GetStatusOverviewErrors, GetUniqueCountsData, GetUniqueCountsResponses, GetUniqueCountsErrors, UploadStaticBundleData, UploadStaticBundleResponses, UploadStaticBundleErrors, ListProjectScansData, ListProjectScansResponses, ListProjectScansErrors, TriggerScanData, TriggerScanResponses, TriggerScanErrors, 
GetLatestScansPerEnvironmentData, GetLatestScansPerEnvironmentResponses, GetLatestScansPerEnvironmentErrors, GetLatestScanData, GetLatestScanResponses, GetLatestScanErrors, ListWebhooksData, ListWebhooksResponses, ListWebhooksErrors, CreateWebhookData, CreateWebhookResponses, CreateWebhookErrors, DeleteWebhookData, DeleteWebhookResponses, DeleteWebhookErrors, GetWebhookData, GetWebhookResponses, GetWebhookErrors, UpdateWebhookData, UpdateWebhookResponses, UpdateWebhookErrors, ListDeliveriesData, ListDeliveriesResponses, ListDeliveriesErrors, GetDeliveryData, GetDeliveryResponses, GetDeliveryErrors, RetryDeliveryData, RetryDeliveryResponses, RetryDeliveryErrors, GetProxyLogsData, GetProxyLogsResponses, GetProxyLogsErrors, GetProxyLogByRequestIdData, GetProxyLogByRequestIdResponses, GetProxyLogByRequestIdErrors, GetTimeBucketStatsData, GetTimeBucketStatsResponses, GetTimeBucketStatsErrors, GetTodayStatsData, GetTodayStatsResponses, GetTodayStatsErrors, GetProxyLogByIdData, GetProxyLogByIdResponses, GetProxyLogByIdErrors, ListSyncedRepositoriesData, ListSyncedRepositoriesResponses, ListSyncedRepositoriesErrors, GetRepositoryByNameData, GetRepositoryByNameResponses, GetRepositoryByNameErrors, GetAllRepositoriesByNameData, GetAllRepositoriesByNameResponses, GetAllRepositoriesByNameErrors, GetRepositoryPresetByNameData, GetRepositoryPresetByNameResponses, GetRepositoryPresetByNameErrors, GetRepositoryBranchesData, GetRepositoryBranchesResponses, GetRepositoryBranchesErrors, GetRepositoryTagsData, GetRepositoryTagsResponses, GetRepositoryTagsErrors, GetRepositoryPresetLiveData, GetRepositoryPresetLiveResponses, GetRepositoryPresetLiveErrors, GetBranchesByRepositoryIdData, GetBranchesByRepositoryIdResponses, GetBranchesByRepositoryIdErrors, CheckCommitExistsData, CheckCommitExistsResponses, CheckCommitExistsErrors, GetTagsByRepositoryIdData, GetTagsByRepositoryIdResponses, GetTagsByRepositoryIdErrors, GetProjectSessionReplaysData, GetProjectSessionReplaysResponses, 
GetProjectSessionReplaysErrors, GetSessionEvents2Data, GetSessionEvents2Responses, GetSessionEvents2Errors, GetSettingsData, GetSettingsResponses, GetSettingsErrors, UpdateSettingsData, UpdateSettingsResponses, UpdateSettingsErrors, ListTemplatesData, ListTemplatesResponses, ListTemplatesErrors, ListTemplateTagsData, ListTemplateTagsResponses, ListTemplateTagsErrors, GetTemplateData, GetTemplateResponses, GetTemplateErrors, GetCurrentUserData, GetCurrentUserResponses, GetCurrentUserErrors, ListUsersData, ListUsersResponses, ListUsersErrors, CreateUserData, CreateUserResponses, CreateUserErrors, UpdateSelfData, UpdateSelfResponses, UpdateSelfErrors, DisableMfaData, DisableMfaResponses, DisableMfaErrors, SetupMfaData, SetupMfaResponses, SetupMfaErrors, VerifyAndEnableMfaData, VerifyAndEnableMfaResponses, VerifyAndEnableMfaErrors, DeleteUserData, DeleteUserResponses, DeleteUserErrors, UpdateUserData, UpdateUserResponses, UpdateUserErrors, RestoreUserData, RestoreUserResponses, RestoreUserErrors, AssignRoleData, AssignRoleResponses, AssignRoleErrors, RemoveRoleData, RemoveRoleResponses, RemoveRoleErrors, GetVisitorSessions2Data, GetVisitorSessions2Responses, GetVisitorSessions2Errors, DeleteSessionReplayData, DeleteSessionReplayResponses, DeleteSessionReplayErrors, GetSessionReplayData, GetSessionReplayResponses, GetSessionReplayErrors, UpdateSessionDurationData, UpdateSessionDurationResponses, UpdateSessionDurationErrors, GetSessionReplayEventsData, GetSessionReplayEventsResponses, GetSessionReplayEventsErrors, AddEventsData, AddEventsResponses, AddEventsErrors, DeleteScanData, DeleteScanResponses, DeleteScanErrors, GetScanData, GetScanResponses, GetScanErrors, GetScanVulnerabilitiesData, GetScanVulnerabilitiesResponses, GetScanVulnerabilitiesErrors, ListEventTypesData, ListEventTypesResponses, TriggerWeeklyDigestData, TriggerWeeklyDigestResponses, TriggerWeeklyDigestErrors, ListAuditLogsData, ListAuditLogsResponses, ListAuditLogsErrors, GetAuditLogData, 
GetAuditLogResponses, GetAuditLogErrors } from './types.gen';
+import type { GetPlatformInfoData, GetPlatformInfoResponses, GetPlatformInfoErrors, RecordEventMetricsData, RecordEventMetricsResponses, RecordEventMetricsErrors, AddSessionReplayEventsData, AddSessionReplayEventsResponses, AddSessionReplayEventsErrors, InitSessionReplayData, InitSessionReplayResponses, InitSessionReplayErrors, RecordSpeedMetricsData, RecordSpeedMetricsResponses, RecordSpeedMetricsErrors, UpdateSpeedMetricsData, UpdateSpeedMetricsResponses, UpdateSpeedMetricsErrors, GetActiveVisitorsData, GetActiveVisitorsResponses, GetActiveVisitorsErrors, GetEventDetailData, GetEventDetailResponses, GetEventDetailErrors, GetEventVisitorsData, GetEventVisitorsResponses, GetEventVisitorsErrors, GetEventsCountData, GetEventsCountResponses, GetEventsCountErrors, GetGeneralStatsData, GetGeneralStatsResponses, GetGeneralStatsErrors, GetLiveVisitorsListData, GetLiveVisitorsListResponses, GetLiveVisitorsListErrors, GetPageFlowData, GetPageFlowResponses, GetPageFlowErrors, GetPageHourlySessionsData, GetPageHourlySessionsResponses, GetPageHourlySessionsErrors, GetPagePathDetailData, GetPagePathDetailResponses, GetPagePathDetailErrors, GetPagePathVisitorsData, GetPagePathVisitorsResponses, GetPagePathVisitorsErrors, GetPagePathsData, GetPagePathsResponses, GetPagePathsErrors, GetPagePathsSparklinesData, GetPagePathsSparklinesResponses, GetPagePathsSparklinesErrors, GetRecentActivityData, GetRecentActivityResponses, GetRecentActivityErrors, GetSessionDetailsData, GetSessionDetailsResponses, GetSessionDetailsErrors, GetSessionEventsData, GetSessionEventsResponses, GetSessionEventsErrors, GetSessionLogsData, GetSessionLogsResponses, GetSessionLogsErrors, GetVisitorsData, GetVisitorsResponses, GetVisitorsErrors, GetVisitorByGuidData, GetVisitorByGuidResponses, GetVisitorByGuidErrors, GetVisitorByIdData, GetVisitorByIdResponses, GetVisitorByIdErrors, GetVisitorDetailsData, GetVisitorDetailsResponses,
GetVisitorDetailsErrors, EnrichVisitorData, EnrichVisitorResponses, EnrichVisitorErrors, GetVisitorInfoData, GetVisitorInfoResponses, GetVisitorInfoErrors, GetVisitorJourneyData, GetVisitorJourneyResponses, GetVisitorJourneyErrors, GetVisitorSessionsData, GetVisitorSessionsResponses, GetVisitorSessionsErrors, GetVisitorStatsData, GetVisitorStatsResponses, GetVisitorStatsErrors, ListApiKeysData, ListApiKeysResponses, ListApiKeysErrors, CreateApiKeyData, CreateApiKeyResponses, CreateApiKeyErrors, GetApiKeyPermissionsData, GetApiKeyPermissionsResponses, GetApiKeyPermissionsErrors, DeleteApiKeyData, DeleteApiKeyResponses, DeleteApiKeyErrors, GetApiKeyData, GetApiKeyResponses, GetApiKeyErrors, UpdateApiKeyData, UpdateApiKeyResponses, UpdateApiKeyErrors, ActivateApiKeyData, ActivateApiKeyResponses, ActivateApiKeyErrors, DeactivateApiKeyData, DeactivateApiKeyResponses, DeactivateApiKeyErrors, ChunkUploadOptionsData, ChunkUploadOptionsResponses, CreateReleaseData, CreateReleaseResponses, CreateReleaseErrors, ListReleaseFilesData, ListReleaseFilesResponses, ListReleaseFilesErrors, UploadReleaseFileData, UploadReleaseFileResponses, UploadReleaseFileErrors, GetDeploymentJobLogsData, GetDeploymentJobLogsResponses, GetDeploymentJobLogsErrors, IngestSentryEnvelopeData, IngestSentryEnvelopeResponses, IngestSentryEnvelopeErrors, IngestSentryEventData, IngestSentryEventResponses, IngestSentryEventErrors, EmailStatusData, EmailStatusResponses, EmailStatusErrors, LoginData, LoginResponses, LoginErrors, RequestMagicLinkData, RequestMagicLinkResponses, RequestMagicLinkErrors, VerifyMagicLinkData, VerifyMagicLinkResponses, VerifyMagicLinkErrors, RequestPasswordResetData, RequestPasswordResetResponses, RequestPasswordResetErrors, ResetPasswordData, ResetPasswordResponses, ResetPasswordErrors, VerifyEmailData, VerifyEmailResponses, VerifyEmailErrors, VerifyMfaChallengeData, VerifyMfaChallengeResponses, VerifyMfaChallengeErrors, RunExternalServiceBackupData, 
RunExternalServiceBackupResponses, RunExternalServiceBackupErrors, ListS3SourcesData, ListS3SourcesResponses, ListS3SourcesErrors, CreateS3SourceData, CreateS3SourceResponses, CreateS3SourceErrors, DeleteS3SourceData, DeleteS3SourceResponses, DeleteS3SourceErrors, GetS3SourceData, GetS3SourceResponses, GetS3SourceErrors, UpdateS3SourceData, UpdateS3SourceResponses, UpdateS3SourceErrors, ListSourceBackupsData, ListSourceBackupsResponses, ListSourceBackupsErrors, RunBackupForSourceData, RunBackupForSourceResponses, RunBackupForSourceErrors, ListBackupSchedulesData, ListBackupSchedulesResponses, ListBackupSchedulesErrors, CreateBackupScheduleData, CreateBackupScheduleResponses, CreateBackupScheduleErrors, DeleteBackupScheduleData, DeleteBackupScheduleResponses, DeleteBackupScheduleErrors, GetBackupScheduleData, GetBackupScheduleResponses, GetBackupScheduleErrors, ListBackupsForScheduleData, ListBackupsForScheduleResponses, ListBackupsForScheduleErrors, DisableBackupScheduleData, DisableBackupScheduleResponses, DisableBackupScheduleErrors, EnableBackupScheduleData, EnableBackupScheduleResponses, EnableBackupScheduleErrors, GetBackupData, GetBackupResponses, GetBackupErrors, BlobDeleteData, BlobDeleteResponses, BlobDeleteErrors, BlobListData, BlobListResponses, BlobListErrors, BlobPutData, BlobPutResponses, BlobPutErrors, BlobCopyData, BlobCopyResponses, BlobCopyErrors, BlobDisableData, BlobDisableResponses, BlobDisableErrors, BlobEnableData, BlobEnableResponses, BlobEnableErrors, BlobStatusData, BlobStatusResponses, BlobStatusErrors, BlobUpdateData, BlobUpdateResponses, BlobUpdateErrors, BlobDownloadData, BlobDownloadResponses, BlobDownloadErrors, BlobHeadData, BlobHeadResponses, BlobHeadErrors, GetDashboardProjectsAnalyticsData, GetDashboardProjectsAnalyticsResponses, GetDashboardProjectsAnalyticsErrors, GetActivityGraphData, GetActivityGraphResponses, GetActivityGraphErrors, GetScanByDeploymentData, GetScanByDeploymentResponses, GetScanByDeploymentErrors, 
ListProvidersData, ListProvidersResponses, ListProvidersErrors, CreateProviderData, CreateProviderResponses, CreateProviderErrors, DeleteProviderData, DeleteProviderResponses, DeleteProviderErrors, GetProviderData, GetProviderResponses, GetProviderErrors, UpdateProviderData, UpdateProviderResponses, UpdateProviderErrors, ListManagedDomainsData, ListManagedDomainsResponses, ListManagedDomainsErrors, AddManagedDomainData, AddManagedDomainResponses, AddManagedDomainErrors, TestProviderConnectionData, TestProviderConnectionResponses, TestProviderConnectionErrors, ListProviderZonesData, ListProviderZonesResponses, ListProviderZonesErrors, RemoveManagedDomainData, RemoveManagedDomainResponses, RemoveManagedDomainErrors, VerifyManagedDomainData, VerifyManagedDomainResponses, VerifyManagedDomainErrors, LookupDnsARecordsData, LookupDnsARecordsResponses, LookupDnsARecordsErrors, ListDomainsData, ListDomainsResponses, ListDomainsErrors, CreateDomainData, CreateDomainResponses, CreateDomainErrors, GetDomainByHostData, GetDomainByHostResponses, GetDomainByHostErrors, CancelDomainOrderData, CancelDomainOrderResponses, CancelDomainOrderErrors, GetDomainOrderData, GetDomainOrderResponses, GetDomainOrderErrors, CreateOrRecreateOrderData, CreateOrRecreateOrderResponses, CreateOrRecreateOrderErrors, FinalizeOrderData, FinalizeOrderResponses, FinalizeOrderErrors, SetupDnsChallengeData, SetupDnsChallengeResponses, SetupDnsChallengeErrors, DeleteDomainData, DeleteDomainResponses, DeleteDomainErrors, GetDomainByIdData, GetDomainByIdResponses, GetDomainByIdErrors, GetChallengeTokenData, GetChallengeTokenResponses, GetChallengeTokenErrors, GetHttpChallengeDebugData, GetHttpChallengeDebugResponses, GetHttpChallengeDebugErrors, ProvisionDomainData, ProvisionDomainResponses, ProvisionDomainErrors, RenewDomainData, RenewDomainResponses, RenewDomainErrors, CheckDomainStatusData, CheckDomainStatusResponses, CheckDomainStatusErrors, ListDomains2Data, ListDomains2Responses, ListDomains2Errors, 
CreateDomain2Data, CreateDomain2Responses, CreateDomain2Errors, GetDomainByNameData, GetDomainByNameResponses, GetDomainByNameErrors, DeleteDomain2Data, DeleteDomain2Responses, DeleteDomain2Errors, GetDomainData, GetDomainResponses, GetDomainErrors, GetDomainDnsRecordsData, GetDomainDnsRecordsResponses, GetDomainDnsRecordsErrors, SetupDnsData, SetupDnsResponses, SetupDnsErrors, VerifyDomainData, VerifyDomainResponses, VerifyDomainErrors, ListProviders2Data, ListProviders2Responses, ListProviders2Errors, CreateProvider2Data, CreateProvider2Responses, CreateProvider2Errors, DeleteProvider2Data, DeleteProvider2Responses, DeleteProvider2Errors, GetProvider2Data, GetProvider2Responses, GetProvider2Errors, TestProviderData, TestProviderResponses, TestProviderErrors, ListEmailsData, ListEmailsResponses, ListEmailsErrors, SendEmailData, SendEmailResponses, SendEmailErrors, GetEmailStatsData, GetEmailStatsResponses, GetEmailStatsErrors, ValidateEmailData, ValidateEmailResponses, ValidateEmailErrors, GetEmailData, GetEmailResponses, GetEmailErrors, ListServicesData, ListServicesResponses, ListServicesErrors, CreateServiceData, CreateServiceResponses, CreateServiceErrors, ListAvailableContainersData, ListAvailableContainersResponses, ListAvailableContainersErrors, GetServiceBySlugData, GetServiceBySlugResponses, GetServiceBySlugErrors, ImportExternalServiceData, ImportExternalServiceResponses, ImportExternalServiceErrors, ListProjectServicesData, ListProjectServicesResponses, ListProjectServicesErrors, GetProjectServiceEnvironmentVariablesData, GetProjectServiceEnvironmentVariablesResponses, GetProjectServiceEnvironmentVariablesErrors, GetProvidersMetadataData, GetProvidersMetadataResponses, GetProvidersMetadataErrors, GetProviderMetadataData, GetProviderMetadataResponses, GetProviderMetadataErrors, GetServiceTypesData, GetServiceTypesResponses, GetServiceTypesErrors, GetServiceTypeParametersData, GetServiceTypeParametersResponses, GetServiceTypeParametersErrors, 
DeleteServiceData, DeleteServiceResponses, DeleteServiceErrors, GetServiceData, GetServiceResponses, GetServiceErrors, UpdateServiceData, UpdateServiceResponses, UpdateServiceErrors, GetServicePreviewEnvironmentVariablesMaskedData, GetServicePreviewEnvironmentVariablesMaskedResponses, GetServicePreviewEnvironmentVariablesMaskedErrors, GetServicePreviewEnvironmentVariableNamesData, GetServicePreviewEnvironmentVariableNamesResponses, GetServicePreviewEnvironmentVariableNamesErrors, ListServiceProjectsData, ListServiceProjectsResponses, ListServiceProjectsErrors, LinkServiceToProjectData, LinkServiceToProjectResponses, LinkServiceToProjectErrors, UnlinkServiceFromProjectData, UnlinkServiceFromProjectResponses, UnlinkServiceFromProjectErrors, GetServiceEnvironmentVariablesData, GetServiceEnvironmentVariablesResponses, GetServiceEnvironmentVariablesErrors, GetServiceEnvironmentVariableData, GetServiceEnvironmentVariableResponses, GetServiceEnvironmentVariableErrors, StartServiceData, StartServiceResponses, StartServiceErrors, StopServiceData, StopServiceResponses, StopServiceErrors, UpgradeServiceData, UpgradeServiceResponses, UpgradeServiceErrors, ListRootContainersData, ListRootContainersResponses, ListRootContainersErrors, ListContainersAtPathData, ListContainersAtPathResponses, ListContainersAtPathErrors, ListEntitiesData, ListEntitiesResponses, ListEntitiesErrors, GetEntityInfoData, GetEntityInfoResponses, GetEntityInfoErrors, QueryDataData, QueryDataResponses, QueryDataErrors, DownloadObjectData, DownloadObjectResponses, DownloadObjectErrors, GetContainerInfoData, GetContainerInfoResponses, GetContainerInfoErrors, CheckExplorerSupportData, CheckExplorerSupportResponses, CheckExplorerSupportErrors, GetFileData, GetFileResponses, GetFileErrors, GetIpGeolocationData, GetIpGeolocationResponses, GetIpGeolocationErrors, ListConnectionsData, ListConnectionsResponses, ListConnectionsErrors, DeleteConnectionData, DeleteConnectionResponses, DeleteConnectionErrors, 
ActivateConnectionData, ActivateConnectionResponses, ActivateConnectionErrors, DeactivateConnectionData, DeactivateConnectionResponses, DeactivateConnectionErrors, ListRepositoriesByConnectionData, ListRepositoriesByConnectionResponses, ListRepositoriesByConnectionErrors, SyncRepositoriesData, SyncRepositoriesResponses, SyncRepositoriesErrors, UpdateConnectionTokenData, UpdateConnectionTokenResponses, UpdateConnectionTokenErrors, ValidateConnectionData, ValidateConnectionResponses, ValidateConnectionErrors, ListGitProvidersData, ListGitProvidersResponses, ListGitProvidersErrors, CreateGitProviderData, CreateGitProviderResponses, CreateGitProviderErrors, CreateGithubPatProviderData, CreateGithubPatProviderResponses, CreateGithubPatProviderErrors, CreateGitlabOauthProviderData, CreateGitlabOauthProviderResponses, CreateGitlabOauthProviderErrors, CreateGitlabPatProviderData, CreateGitlabPatProviderResponses, CreateGitlabPatProviderErrors, DeleteProvider3Data, DeleteProvider3Responses, DeleteProvider3Errors, GetGitProviderData, GetGitProviderResponses, GetGitProviderErrors, ActivateProviderData, ActivateProviderResponses, ActivateProviderErrors, HandleGitProviderOauthCallbackData, HandleGitProviderOauthCallbackErrors, GetProviderConnectionsData, GetProviderConnectionsResponses, GetProviderConnectionsErrors, DeactivateProviderData, DeactivateProviderResponses, DeactivateProviderErrors, CheckProviderDeletionSafetyData, CheckProviderDeletionSafetyResponses, CheckProviderDeletionSafetyErrors, StartGitProviderOauthData, StartGitProviderOauthErrors, DeleteProviderSafelyData, DeleteProviderSafelyResponses, DeleteProviderSafelyErrors, GetPublicRepositoryData, GetPublicRepositoryResponses, GetPublicRepositoryErrors, GetPublicBranchesData, GetPublicBranchesResponses, GetPublicBranchesErrors, DetectPublicPresetsData, DetectPublicPresetsResponses, DetectPublicPresetsErrors, DiscoverWorkloadsData, DiscoverWorkloadsResponses, DiscoverWorkloadsErrors, ExecuteImportData, 
ExecuteImportResponses, ExecuteImportErrors, CreatePlanData, CreatePlanResponses, CreatePlanErrors, ListSourcesData, ListSourcesResponses, ListSourcesErrors, GetImportStatusData, GetImportStatusResponses, GetImportStatusErrors, GetIncidentData, GetIncidentResponses, GetIncidentErrors, UpdateIncidentStatusData, UpdateIncidentStatusResponses, UpdateIncidentStatusErrors, GetIncidentUpdatesData, GetIncidentUpdatesResponses, GetIncidentUpdatesErrors, ListIpAccessControlData, ListIpAccessControlResponses, ListIpAccessControlErrors, CreateIpAccessControlData, CreateIpAccessControlResponses, CreateIpAccessControlErrors, CheckIpBlockedData, CheckIpBlockedResponses, CheckIpBlockedErrors, DeleteIpAccessControlData, DeleteIpAccessControlResponses, DeleteIpAccessControlErrors, GetIpAccessControlData, GetIpAccessControlResponses, GetIpAccessControlErrors, UpdateIpAccessControlData, UpdateIpAccessControlResponses, UpdateIpAccessControlErrors, KvDelData, KvDelResponses, KvDelErrors, KvDisableData, KvDisableResponses, KvDisableErrors, KvEnableData, KvEnableResponses, KvEnableErrors, KvExpireData, KvExpireResponses, KvExpireErrors, KvGetData, KvGetResponses, KvGetErrors, KvIncrData, KvIncrResponses, KvIncrErrors, KvKeysData, KvKeysResponses, KvKeysErrors, KvSetData, KvSetResponses, KvSetErrors, KvStatusData, KvStatusResponses, KvStatusErrors, KvTtlData, KvTtlResponses, KvTtlErrors, KvUpdateData, KvUpdateResponses, KvUpdateErrors, ListRoutesData, ListRoutesResponses, ListRoutesErrors, CreateRouteData, CreateRouteResponses, CreateRouteErrors, DeleteRouteData, DeleteRouteResponses, DeleteRouteErrors, GetRouteData, GetRouteResponses, GetRouteErrors, UpdateRouteData, UpdateRouteResponses, UpdateRouteErrors, LogoutData, LogoutResponses, LogoutErrors, DeleteMonitorData, DeleteMonitorResponses, DeleteMonitorErrors, GetMonitorData, GetMonitorResponses, GetMonitorErrors, GetBucketedStatusData, GetBucketedStatusResponses, GetBucketedStatusErrors, GetCurrentMonitorStatusData, 
GetCurrentMonitorStatusResponses, GetCurrentMonitorStatusErrors, GetUptimeHistoryData, GetUptimeHistoryResponses, GetUptimeHistoryErrors, DeletePreferencesData, DeletePreferencesResponses, DeletePreferencesErrors, GetPreferencesData, GetPreferencesResponses, GetPreferencesErrors, UpdatePreferencesData, UpdatePreferencesResponses, UpdatePreferencesErrors, ListNotificationProvidersData, ListNotificationProvidersResponses, ListNotificationProvidersErrors, CreateNotificationProviderData, CreateNotificationProviderResponses, CreateNotificationProviderErrors, CreateEmailProviderData, CreateEmailProviderResponses, CreateEmailProviderErrors, UpdateEmailProviderData, UpdateEmailProviderResponses, UpdateEmailProviderErrors, CreateSlackProviderData, CreateSlackProviderResponses, CreateSlackProviderErrors, UpdateSlackProviderData, UpdateSlackProviderResponses, UpdateSlackProviderErrors, CreateWebhookProviderData, CreateWebhookProviderResponses, CreateWebhookProviderErrors, UpdateWebhookProviderData, UpdateWebhookProviderResponses, UpdateWebhookProviderErrors, DeleteProvider4Data, DeleteProvider4Responses, DeleteProvider4Errors, GetNotificationProviderData, GetNotificationProviderResponses, GetNotificationProviderErrors, UpdateProvider2Data, UpdateProvider2Responses, UpdateProvider2Errors, TestProvider2Data, TestProvider2Responses, TestProvider2Errors, ListOrdersData, ListOrdersResponses, ListOrdersErrors, HasPerformanceMetricsData, HasPerformanceMetricsResponses, HasPerformanceMetricsErrors, GetPerformanceMetricsData, GetPerformanceMetricsResponses, GetPerformanceMetricsErrors, GetMetricsOverTimeData, GetMetricsOverTimeResponses, GetMetricsOverTimeErrors, GetGroupedPageMetricsData, GetGroupedPageMetricsResponses, GetGroupedPageMetricsErrors, GetAccessInfoData, GetAccessInfoResponses, GetAccessInfoErrors, GetPrivateIpData, GetPrivateIpResponses, GetPrivateIpErrors, GetPublicIpData, GetPublicIpResponses, GetPublicIpErrors, ListPresetsData, ListPresetsResponses, 
ListPresetsErrors, GeneratePresetDockerfileData, GeneratePresetDockerfileResponses, GeneratePresetDockerfileErrors, GetProjectsData, GetProjectsResponses, GetProjectsErrors, CreateProjectData, CreateProjectResponses, CreateProjectErrors, GetProjectBySlugData, GetProjectBySlugResponses, GetProjectBySlugErrors, CreateProjectFromTemplateData, CreateProjectFromTemplateResponses, CreateProjectFromTemplateErrors, GetProjectStatisticsData, GetProjectStatisticsResponses, GetProjectStatisticsErrors, DeleteProjectData, DeleteProjectResponses, DeleteProjectErrors, GetProjectData, GetProjectResponses, GetProjectErrors, UpdateProjectData, UpdateProjectResponses, UpdateProjectErrors, GetProjectDeploymentsData, GetProjectDeploymentsResponses, GetProjectDeploymentsErrors, GetLastDeploymentData, GetLastDeploymentResponses, GetLastDeploymentErrors, TriggerProjectPipelineData, TriggerProjectPipelineResponses, TriggerProjectPipelineErrors, GetActiveVisitors2Data, GetActiveVisitors2Responses, GetActiveVisitors2Errors, GetAggregatedBucketsData, GetAggregatedBucketsResponses, GetAggregatedBucketsErrors, UpdateAutomaticDeployData, UpdateAutomaticDeployResponses, UpdateAutomaticDeployErrors, ListCustomDomainsForProjectData, ListCustomDomainsForProjectResponses, ListCustomDomainsForProjectErrors, CreateCustomDomainData, CreateCustomDomainResponses, CreateCustomDomainErrors, DeleteCustomDomainData, DeleteCustomDomainResponses, DeleteCustomDomainErrors, GetCustomDomainData, GetCustomDomainResponses, GetCustomDomainErrors, UpdateCustomDomainData, UpdateCustomDomainResponses, UpdateCustomDomainErrors, LinkCustomDomainToCertificateData, LinkCustomDomainToCertificateResponses, LinkCustomDomainToCertificateErrors, UpdateProjectDeploymentConfigData, UpdateProjectDeploymentConfigResponses, UpdateProjectDeploymentConfigErrors, GetDeploymentData, GetDeploymentResponses, GetDeploymentErrors, CancelDeploymentData, CancelDeploymentResponses, CancelDeploymentErrors, GetDeploymentJobsData, 
GetDeploymentJobsResponses, GetDeploymentJobsErrors, TailDeploymentJobLogsData, TailDeploymentJobLogsErrors, GetDeploymentOperationsData, GetDeploymentOperationsResponses, GetDeploymentOperationsErrors, ExecuteDeploymentOperationData, ExecuteDeploymentOperationResponses, ExecuteDeploymentOperationErrors, GetDeploymentOperationStatusData, GetDeploymentOperationStatusResponses, GetDeploymentOperationStatusErrors, PauseDeploymentData, PauseDeploymentResponses, PauseDeploymentErrors, ResumeDeploymentData, ResumeDeploymentResponses, ResumeDeploymentErrors, RollbackToDeploymentData, RollbackToDeploymentResponses, RollbackToDeploymentErrors, TeardownDeploymentData, TeardownDeploymentResponses, TeardownDeploymentErrors, ListDsnsData, ListDsnsResponses, CreateDsnData, CreateDsnResponses, CreateDsnErrors, GetOrCreateDsnData, GetOrCreateDsnResponses, GetOrCreateDsnErrors, RegenerateDsnData, RegenerateDsnResponses, RegenerateDsnErrors, RevokeDsnData, RevokeDsnResponses, RevokeDsnErrors, GetEnvironmentVariablesData, GetEnvironmentVariablesResponses, GetEnvironmentVariablesErrors, CreateEnvironmentVariableData, CreateEnvironmentVariableResponses, CreateEnvironmentVariableErrors, GetEnvironmentVariableValueData, GetEnvironmentVariableValueResponses, GetEnvironmentVariableValueErrors, DeleteEnvironmentVariableData, DeleteEnvironmentVariableResponses, DeleteEnvironmentVariableErrors, UpdateEnvironmentVariableData, UpdateEnvironmentVariableResponses, UpdateEnvironmentVariableErrors, GetEnvironmentsData, GetEnvironmentsResponses, GetEnvironmentsErrors, CreateEnvironmentData, CreateEnvironmentResponses, CreateEnvironmentErrors, DeleteEnvironmentData, DeleteEnvironmentResponses, DeleteEnvironmentErrors, GetEnvironmentData, GetEnvironmentResponses, GetEnvironmentErrors, GetEnvironmentCronsData, GetEnvironmentCronsResponses, GetEnvironmentCronsErrors, GetCronByIdData, GetCronByIdResponses, GetCronByIdErrors, GetCronExecutionsData, GetCronExecutionsResponses, GetCronExecutionsErrors, 
GetEnvironmentDomainsData, GetEnvironmentDomainsResponses, GetEnvironmentDomainsErrors, AddEnvironmentDomainData, AddEnvironmentDomainResponses, AddEnvironmentDomainErrors, DeleteEnvironmentDomainData, DeleteEnvironmentDomainResponses, DeleteEnvironmentDomainErrors, UpdateEnvironmentSettingsData, UpdateEnvironmentSettingsResponses, UpdateEnvironmentSettingsErrors, TeardownEnvironmentData, TeardownEnvironmentResponses, TeardownEnvironmentErrors, GetContainerLogsData, GetContainerLogsErrors, ListContainersData, ListContainersResponses, ListContainersErrors, GetContainerDetailData, GetContainerDetailResponses, GetContainerDetailErrors, GetContainerLogsByIdData, GetContainerLogsByIdErrors, GetContainerMetricsData, GetContainerMetricsResponses, GetContainerMetricsErrors, StreamContainerMetricsData, StreamContainerMetricsResponses, StreamContainerMetricsErrors, RestartContainerData, RestartContainerResponses, RestartContainerErrors, StartContainerData, StartContainerResponses, StartContainerErrors, StopContainerData, StopContainerResponses, StopContainerErrors, DeployFromImageData, DeployFromImageResponses, DeployFromImageErrors, DeployFromImageUploadData, DeployFromImageUploadResponses, DeployFromImageUploadErrors, DeployFromStaticData, DeployFromStaticResponses, DeployFromStaticErrors, GetErrorDashboardStatsData, GetErrorDashboardStatsResponses, GetErrorDashboardStatsErrors, ListErrorGroupsData, ListErrorGroupsResponses, ListErrorGroupsErrors, GetErrorGroupData, GetErrorGroupResponses, GetErrorGroupErrors, UpdateErrorGroupData, UpdateErrorGroupResponses, UpdateErrorGroupErrors, ListErrorEventsData, ListErrorEventsResponses, ListErrorEventsErrors, GetErrorEventData, GetErrorEventResponses, GetErrorEventErrors, GetErrorStatsData, GetErrorStatsResponses, GetErrorStatsErrors, GetErrorTimeSeriesData, GetErrorTimeSeriesResponses, GetErrorTimeSeriesErrors, GetEventsCount2Data, GetEventsCount2Responses, GetEventsCount2Errors, GetEventTypeBreakdownData, 
GetEventTypeBreakdownResponses, GetEventTypeBreakdownErrors, GetPropertyBreakdownData, GetPropertyBreakdownResponses, GetPropertyBreakdownErrors, GetPropertyTimelineData, GetPropertyTimelineResponses, GetPropertyTimelineErrors, GetEventsTimelineData, GetEventsTimelineResponses, GetEventsTimelineErrors, GetUniqueEventsData, GetUniqueEventsResponses, GetUniqueEventsErrors, ListExternalImagesData, ListExternalImagesResponses, ListExternalImagesErrors, RegisterExternalImageData, RegisterExternalImageResponses, RegisterExternalImageErrors, DeleteExternalImageData, DeleteExternalImageResponses, DeleteExternalImageErrors, GetExternalImageData, GetExternalImageResponses, GetExternalImageErrors, ListFunnelsData, ListFunnelsResponses, ListFunnelsErrors, CreateFunnelData, CreateFunnelResponses, CreateFunnelErrors, PreviewFunnelMetricsData, PreviewFunnelMetricsResponses, PreviewFunnelMetricsErrors, DeleteFunnelData, DeleteFunnelResponses, DeleteFunnelErrors, UpdateFunnelData, UpdateFunnelResponses, UpdateFunnelErrors, GetFunnelMetricsData, GetFunnelMetricsResponses, GetFunnelMetricsErrors, UpdateGitSettingsData, UpdateGitSettingsResponses, UpdateGitSettingsErrors, HasErrorGroupsData, HasErrorGroupsResponses, HasErrorGroupsErrors, HasAnalyticsEventsData, HasAnalyticsEventsResponses, HasAnalyticsEventsErrors, GetHourlyVisitsData, GetHourlyVisitsResponses, GetHourlyVisitsErrors, ListExternalImages2Data, ListExternalImages2Responses, ListExternalImages2Errors, PushExternalImageData, PushExternalImageResponses, PushExternalImageErrors, GetExternalImage2Data, GetExternalImage2Responses, GetExternalImage2Errors, ListIncidentsData, ListIncidentsResponses, ListIncidentsErrors, CreateIncidentData, CreateIncidentResponses, CreateIncidentErrors, GetBucketedIncidentsData, GetBucketedIncidentsResponses, GetBucketedIncidentsErrors, ListMonitorsData, ListMonitorsResponses, ListMonitorsErrors, CreateMonitorData, CreateMonitorResponses, CreateMonitorErrors, DeleteReleaseSourceMapsData, 
DeleteReleaseSourceMapsResponses, DeleteReleaseSourceMapsErrors, ListSourceMapsData, ListSourceMapsResponses, ListSourceMapsErrors, UploadSourceMapData, UploadSourceMapResponses, UploadSourceMapErrors, UpdateProjectSettingsData, UpdateProjectSettingsResponses, UpdateProjectSettingsErrors, ListReleasesData, ListReleasesResponses, ListReleasesErrors, DeleteSourceMapData, DeleteSourceMapResponses, DeleteSourceMapErrors, ListStaticBundlesData, ListStaticBundlesResponses, ListStaticBundlesErrors, DeleteStaticBundleData, DeleteStaticBundleResponses, DeleteStaticBundleErrors, GetStaticBundleData, GetStaticBundleResponses, GetStaticBundleErrors, GetStatusOverviewData, GetStatusOverviewResponses, GetStatusOverviewErrors, GetUniqueCountsData, GetUniqueCountsResponses, GetUniqueCountsErrors, UploadStaticBundleData, UploadStaticBundleResponses, UploadStaticBundleErrors, ListProjectScansData, ListProjectScansResponses, ListProjectScansErrors, TriggerScanData, TriggerScanResponses, TriggerScanErrors, GetLatestScansPerEnvironmentData, GetLatestScansPerEnvironmentResponses, GetLatestScansPerEnvironmentErrors, GetLatestScanData, GetLatestScanResponses, GetLatestScanErrors, ListWebhooksData, ListWebhooksResponses, ListWebhooksErrors, CreateWebhookData, CreateWebhookResponses, CreateWebhookErrors, DeleteWebhookData, DeleteWebhookResponses, DeleteWebhookErrors, GetWebhookData, GetWebhookResponses, GetWebhookErrors, UpdateWebhookData, UpdateWebhookResponses, UpdateWebhookErrors, ListDeliveriesData, ListDeliveriesResponses, ListDeliveriesErrors, GetDeliveryData, GetDeliveryResponses, GetDeliveryErrors, RetryDeliveryData, RetryDeliveryResponses, RetryDeliveryErrors, GetProxyLogsData, GetProxyLogsResponses, GetProxyLogsErrors, GetProxyLogByRequestIdData, GetProxyLogByRequestIdResponses, GetProxyLogByRequestIdErrors, GetTimeBucketStatsData, GetTimeBucketStatsResponses, GetTimeBucketStatsErrors, GetTodayStatsData, GetTodayStatsResponses, GetTodayStatsErrors, GetProxyLogByIdData, 
GetProxyLogByIdResponses, GetProxyLogByIdErrors, ListSyncedRepositoriesData, ListSyncedRepositoriesResponses, ListSyncedRepositoriesErrors, GetRepositoryByNameData, GetRepositoryByNameResponses, GetRepositoryByNameErrors, GetAllRepositoriesByNameData, GetAllRepositoriesByNameResponses, GetAllRepositoriesByNameErrors, GetRepositoryPresetByNameData, GetRepositoryPresetByNameResponses, GetRepositoryPresetByNameErrors, GetRepositoryBranchesData, GetRepositoryBranchesResponses, GetRepositoryBranchesErrors, GetRepositoryTagsData, GetRepositoryTagsResponses, GetRepositoryTagsErrors, GetRepositoryPresetLiveData, GetRepositoryPresetLiveResponses, GetRepositoryPresetLiveErrors, GetBranchesByRepositoryIdData, GetBranchesByRepositoryIdResponses, GetBranchesByRepositoryIdErrors, ListCommitsByRepositoryIdData, ListCommitsByRepositoryIdResponses, ListCommitsByRepositoryIdErrors, CheckCommitExistsData, CheckCommitExistsResponses, CheckCommitExistsErrors, GetTagsByRepositoryIdData, GetTagsByRepositoryIdResponses, GetTagsByRepositoryIdErrors, GetProjectSessionReplaysData, GetProjectSessionReplaysResponses, GetProjectSessionReplaysErrors, GetSessionEvents2Data, GetSessionEvents2Responses, GetSessionEvents2Errors, GetSettingsData, GetSettingsResponses, GetSettingsErrors, UpdateSettingsData, UpdateSettingsResponses, UpdateSettingsErrors, ListTemplatesData, ListTemplatesResponses, ListTemplatesErrors, ListTemplateTagsData, ListTemplateTagsResponses, ListTemplateTagsErrors, GetTemplateData, GetTemplateResponses, GetTemplateErrors, GetCurrentUserData, GetCurrentUserResponses, GetCurrentUserErrors, ListUsersData, ListUsersResponses, ListUsersErrors, CreateUserData, CreateUserResponses, CreateUserErrors, UpdateSelfData, UpdateSelfResponses, UpdateSelfErrors, DisableMfaData, DisableMfaResponses, DisableMfaErrors, SetupMfaData, SetupMfaResponses, SetupMfaErrors, VerifyAndEnableMfaData, VerifyAndEnableMfaResponses, VerifyAndEnableMfaErrors, DeleteUserData, DeleteUserResponses, 
DeleteUserErrors, UpdateUserData, UpdateUserResponses, UpdateUserErrors, RestoreUserData, RestoreUserResponses, RestoreUserErrors, AssignRoleData, AssignRoleResponses, AssignRoleErrors, RemoveRoleData, RemoveRoleResponses, RemoveRoleErrors, GetVisitorSessions2Data, GetVisitorSessions2Responses, GetVisitorSessions2Errors, DeleteSessionReplayData, DeleteSessionReplayResponses, DeleteSessionReplayErrors, GetSessionReplayData, GetSessionReplayResponses, GetSessionReplayErrors, UpdateSessionDurationData, UpdateSessionDurationResponses, UpdateSessionDurationErrors, GetSessionReplayEventsData, GetSessionReplayEventsResponses, GetSessionReplayEventsErrors, AddEventsData, AddEventsResponses, AddEventsErrors, DeleteScanData, DeleteScanResponses, DeleteScanErrors, GetScanData, GetScanResponses, GetScanErrors, GetScanVulnerabilitiesData, GetScanVulnerabilitiesResponses, GetScanVulnerabilitiesErrors, ListEventTypesData, ListEventTypesResponses, TriggerWeeklyDigestData, TriggerWeeklyDigestResponses, TriggerWeeklyDigestErrors, ListAuditLogsData, ListAuditLogsResponses, ListAuditLogsErrors, GetAuditLogData, GetAuditLogResponses, GetAuditLogErrors } from './types.gen';
 import { client } from './client.gen';
 
 export type Options = ClientOptions & {
@@ -120,6 +120,38 @@ export const getActiveVisitors = (options:
     });
 };
 
+/**
+ * Get detailed analytics for a specific event
+ */
+export const getEventDetail = (options: Options) => {
+    return (options.client ?? client).get({
+        security: [
+            {
+                scheme: 'bearer',
+                type: 'http'
+            }
+        ],
+        url: '/analytics/event-detail',
+        ...options
+    });
+};
+
+/**
+ * Get paginated list of visitors who triggered a specific event
+ */
+export const getEventVisitors = (options: Options) => {
+    return (options.client ?? client).get({
+        security: [
+            {
+                scheme: 'bearer',
+                type: 'http'
+            }
+        ],
+        url: '/analytics/event-visitors',
+        ...options
+    });
+};
+
 export const getEventsCount = (options: Options) => {
     return (options.client ?? client).get({
         security: [
@@ -3807,6 +3839,29 @@ export const listPresets = (options?: Opti
     });
 };
 
+/**
+ * Generate a Dockerfile from a preset
+ * Returns the Dockerfile content and build arguments for a given preset slug.
+ * The CLI can use this to build Docker images locally without needing a Dockerfile
+ * in the project directory, enabling zero-config deployments.
+ */
+export const generatePresetDockerfile = (options: Options) => {
+    return (options.client ?? client).post({
+        security: [
+            {
+                scheme: 'bearer',
+                type: 'http'
+            }
+        ],
+        url: '/presets/{slug}/dockerfile',
+        ...options,
+        headers: {
+            'Content-Type': 'application/json',
+            ...options.headers
+        }
+    });
+};
+
 /**
  * Get a list of all projects
  */
@@ -4193,7 +4248,7 @@ export const getDeploymentJobs = (options:
  * upgrade request.
  *
  * **API Client Authentication**: Include API key in Authorization header:
- * ```
+ * ```text
  * Authorization: Bearer tk_your_api_key_here
  * ```
  */
@@ -5849,6 +5904,22 @@ export const getBranchesByRepositoryId = (
     });
 };
 
+/**
+ * List recent commits for a repository branch
+ */
+export const listCommitsByRepositoryId = (options: Options) => {
+    return (options.client ?? client).get({
+        security: [
+            {
+                scheme: 'bearer',
+                type: 'http'
+            }
+        ],
+        url: '/repository/{repository_id}/commits',
+        ...options
+    });
+};
+
 /**
  * Check if a commit exists in a repository
  */
diff --git a/web/src/api/client/types.gen.ts b/web/src/api/client/types.gen.ts
index 6dfd6d9f..37ce44c9 100644
--- a/web/src/api/client/types.gen.ts
+++ b/web/src/api/client/types.gen.ts
@@ -278,6 +278,7 @@ export type ApiKeyResponse = {
  */
 export type AppSettings = {
     allow_readonly_external_access?: boolean;
+    container_logs?: ContainerLogSettings;
     demo_mode?: DemoModeSettings;
     disk_space_alert?: DiskSpaceAlertSettings;
     dns_provider?: DnsProviderSettings;
@@ -294,6 +295,7 @@ export type AppSettings = {
  * Safe response for application settings that masks sensitive fields
  */
 export type AppSettingsResponse = {
+    container_logs: ContainerLogSettings;
     disk_space_alert: DiskSpaceAlertSettings;
     dns_provider: DnsProviderSettingsMasked;
     docker_registry: DockerRegistrySettingsMasked;
@@ -417,7 +419,7 @@ export type AvailableContainerInfo = {
      */
     exposed_ports?: Array;
     /**
-     * Docker image name (e.g., "postgres:17-alpine")
+     * Docker image name (e.g., "postgres:18-alpine")
      */
     image: string;
     /**
@@ -656,6 +658,33 @@ export type CommitExistsResponse = {
     exists: boolean;
 };
 
+export type CommitInfo = {
+    /**
+     * Author name
+     */
+    author: string;
+    /**
+     * Author email
+     */
+    author_email: string;
+    /**
+     * Commit date in ISO 8601 format
+     */
+    date: string;
+    /**
+     * Commit message
+     */
+    message: string;
+    /**
+     * Commit SHA hash
+     */
+    sha: string;
+};
+
+export type CommitListResponse = {
+    commits: Array;
+};
+
 export type ConnectionListQuery = {
     direction?: string | null;
     page?: number | null;
@@ -745,12 +774,41 @@ export type ContainerListResponse = {
     total: number;
 };
 
+/**
+ * Docker container log rotation settings
+ * Controls the `--log-opt max-size` and `--log-opt max-file` for containers
+ */
+export type ContainerLogSettings = {
+    /**
+     * Maximum number of rotated log files to keep (e.g., 3 means up to 3 x max_size total)
+     */
+    max_file?: number;
+    /**
+     * Maximum size of each log file (e.g., "50m", "100m", "1g")
+     * Docker default is unlimited; we default to "50m" to prevent disk exhaustion
+     */
+    max_size?: string;
+    /**
+     * Maximum rotated log files for external service containers
+     */
+    service_max_file?: number;
+    /**
+     * Maximum size for external service container logs (postgres, redis, etc.)
+     * Defaults to "20m" since services are typically less verbose than app containers
+     */
+    service_max_size?: string;
+};
+
 export type ContainerLogsQuery = {
     /**
      * Optional container name to get logs from (if deployment has multiple containers)
     */
     container_name?: string | null;
     end_date?: number | null;
+    /**
+     * Follow log output in real-time (default: true for backward compatibility)
+     */
+    follow?: boolean;
     start_date?: number | null;
     tail?: string | null;
     /**
@@ -2840,14 +2898,132 @@ export type ErrorTimeSeriesQuery = {
     start_time: string;
 };
 
+/**
+ * Time bucket data point for event activity graph
+ */
+export type EventActivityBucket = {
+    /**
+     * Number of event occurrences in this bucket
+     */
+    count: number;
+    /**
+     * Timestamp for this bucket (ISO 8601)
+     */
+    timestamp: string;
+    /**
+     * Number of unique visitors in this bucket
+     */
+    unique_visitors: number;
+};
+
 export type EventBreakdown = 'country' | 'region' | 'city';
 
+/**
+ * Browser stats for an event
+ */
+export type EventBrowserStats = {
+    /**
+     * Browser name
+     */
+    browser: string;
+    /**
+     * Number of event occurrences from this browser
+     */
+    count: number;
+    /**
+     * Percentage of total events
+     */
+    percentage: number;
+};
+
 export type EventCount = {
     count: number;
     event_name: string;
     percentage: number;
 };
 
+/**
+ * Country stats for an event
+ */
+export type EventCountryStats = {
+    /**
+     * Number of event occurrences from this country
+     */
+    count: number;
+    /**
+     * Country name
+     */
+    country: string;
+    /**
+     * ISO country code (2-letter)
+     */
country_code?: string | null; + /** + * Percentage of total events + */ + percentage: number; +}; + +/** + * Query parameters for event detail analytics + */ +export type EventDetailQuery = { + /** + * Bucket interval for time series: 'hour', 'day', 'week', 'month' (default: auto) + */ + bucket_interval?: string | null; + end_date: string; + environment_id?: number | null; + /** + * The specific event name to get details for + */ + event_name: string; + project_id: number; + start_date: string; +}; + +/** + * Summary response for a specific event's analytics + */ +export type EventDetailResponse = { + /** + * Time series data for event activity graph + */ + activity_over_time: Array; + /** + * Browser distribution of visitors who triggered this event + */ + browsers: Array; + /** + * Bucket interval used for time series ('hour', 'day', etc.) + */ + bucket_interval: string; + /** + * Geographic distribution of visitors who triggered this event + */ + countries: Array; + /** + * The event name being analyzed + */ + event_name: string; + /** + * Top referrer hostnames for visitors who triggered this event + */ + referrers: Array; + /** + * Total number of times this event was triggered in the date range + */ + total_count: number; + /** + * Number of unique sessions where this event occurred + */ + unique_sessions: number; + /** + * Number of unique visitors who triggered this event + */ + unique_visitors: number; +}; + export type EventMetricsPayload = { /** * Cumulative Layout Shift (score) @@ -2889,6 +3065,24 @@ export type EventMetricsPayload = { viewport_width?: number | null; }; +/** + * Referrer stats for an event + */ +export type EventReferrerStats = { + /** + * Number of event occurrences from this referrer + */ + count: number; + /** + * Percentage of total events + */ + percentage: number; + /** + * Referrer hostname or "Direct" + */ + referrer: string; +}; + export type EventTimeline = { count: number; date: string; @@ -2943,6 +3137,104 @@ export type 
EventTypesResponse = { total: number; }; +/** + * A visitor who triggered a specific event + */ +export type EventVisitorInfo = { + /** + * Browser name + */ + browser?: string | null; + /** + * Visitor's city + */ + city?: string | null; + /** + * Visitor's country + */ + country?: string | null; + /** + * Visitor's country code + */ + country_code?: string | null; + /** + * Device type (Desktop, Mobile, Tablet) + */ + device_type?: string | null; + /** + * Number of times this visitor triggered the event + */ + event_count: number; + /** + * When the visitor first triggered the event in the date range + */ + first_triggered: string; + /** + * When the visitor last triggered the event in the date range + */ + last_triggered: string; + /** + * Referrer hostname for the event + */ + referrer_hostname?: string | null; + /** + * Visitor numeric ID + */ + visitor_id: number; + /** + * Visitor UUID + */ + visitor_uuid: string; +}; + +/** + * Query parameters for event visitors list + */ +export type EventVisitorsQuery = { + end_date: string; + environment_id?: number | null; + /** + * The specific event name to list visitors for + */ + event_name: string; + /** + * Page number (1-based, default: 1) + */ + page?: number | null; + /** + * Items per page (default: 20, max: 100) + */ + per_page?: number | null; + project_id: number; + start_date: string; +}; + +/** + * Paginated response for event visitors + */ +export type EventVisitorsResponse = { + /** + * The event name + */ + event_name: string; + /** + * Current page number + */ + page: number; + /** + * Items per page + */ + per_page: number; + /** + * Total number of unique visitors who triggered this event + */ + total_count: number; + /** + * Individual visitors who triggered this event + */ + visitors: Array; +}; + export type EventsCountQuery = { /** * Aggregation level: events (raw count), sessions (unique sessions), or visitors (unique visitors) @@ -3194,6 +3486,57 @@ export type GeneralStatsResponse = { 
visitors_trend_percentage?: number | null; }; +/** + * Request body for generating a Dockerfile from a preset + */ +export type GenerateDockerfileRequest = { + /** + * Custom build command (overrides preset default) + */ + build_command?: string | null; + /** + * Custom install command (overrides preset default) + */ + install_command?: string | null; + /** + * Output directory for static builds + */ + output_dir?: string | null; + /** + * Package manager used by the project (npm, yarn, pnpm, bun) + * If not provided, defaults to npm + */ + package_manager?: string | null; + /** + * Project name/slug used for container naming + */ + project_name?: string | null; + /** + * Whether to use BuildKit cache mounts for faster builds + */ + use_buildkit?: boolean; +}; + +/** + * Response containing a generated Dockerfile and build arguments + */ +export type GenerateDockerfileResponse = { + /** + * Build arguments to pass to `docker build --build-arg KEY=VALUE` + */ + build_args: { + [key: string]: string; + }; + /** + * The generated Dockerfile content + */ + dockerfile: string; + /** + * The preset slug used for generation + */ + preset: string; +}; + /** * Response containing geolocation information for an IP address */ @@ -4249,6 +4592,18 @@ export type LiveVisitorInfo = { current_page?: string | null; custom_data?: unknown; environment_id: number; + /** + * Marketing channel from the first visit (e.g. 
"Organic Search", "Direct") + */ + first_channel?: string | null; + /** + * Full referrer URL from the visitor's first session + */ + first_referrer?: string | null; + /** + * Hostname extracted from first_referrer + */ + first_referrer_hostname?: string | null; first_seen: string; id: number; ip_address?: string | null; @@ -7662,7 +8017,7 @@ export type UpdateErrorGroupRequest = { export type UpdateExternalServiceRequest = { /** - * Docker image to use for the service (e.g., "postgres:17-alpine", "timescale/timescaledb-ha:pg17") + * Docker image to use for the service (e.g., "postgres:18-alpine", "timescale/timescaledb-ha:pg18") * When provided, the service will be recreated with the new image while preserving data */ docker_image?: string | null; @@ -7881,7 +8236,7 @@ export type UpdateWebhookRequestBody = { export type UpgradeExternalServiceRequest = { /** - * Docker image to upgrade to (e.g., "postgres:17-alpine") + * Docker image to upgrade to (e.g., "postgres:18-alpine") * This will trigger pg_upgrade for PostgreSQL or equivalent upgrade procedures for other services */ docker_image: string; @@ -8088,6 +8443,18 @@ export type VisitorDetails = { crawler_name?: string | null; custom_data?: unknown; environment_id: number; + /** + * Marketing channel from the first visit (e.g. "Organic Search", "Direct") + */ + first_channel?: string | null; + /** + * Full referrer URL from the visitor's first session + */ + first_referrer?: string | null; + /** + * Hostname extracted from first_referrer + */ + first_referrer_hostname?: string | null; first_seen: string; id: number; ip_address?: string | null; @@ -8115,6 +8482,18 @@ export type VisitorInfo = { current_page?: string | null; custom_data?: unknown; environment_id: number; + /** + * Marketing channel from the first visit (e.g. 
"Organic Search", "Direct") + */ + first_channel?: string | null; + /** + * Full referrer URL from the visitor's first session + */ + first_referrer?: string | null; + /** + * Hostname extracted from first_referrer + */ + first_referrer_hostname?: string | null; first_seen: string; id: number; ip_address?: string | null; @@ -8210,6 +8589,18 @@ export type VisitorWithGeolocation = { crawler_name?: string | null; custom_data?: unknown; environment_id: number; + /** + * Marketing channel from the first visit (e.g. "Organic Search", "Direct") + */ + first_channel?: string | null; + /** + * Full referrer URL from the visitor's first session + */ + first_referrer?: string | null; + /** + * Hostname extracted from first_referrer + */ + first_referrer_hostname?: string | null; first_seen: string; id: number; ip_address?: string | null; @@ -8641,6 +9032,114 @@ export type GetActiveVisitorsResponses = { export type GetActiveVisitorsResponse = GetActiveVisitorsResponses[keyof GetActiveVisitorsResponses]; +export type GetEventDetailData = { + body?: never; + path?: never; + query: { + /** + * Event name to get details for + */ + event_name: string; + /** + * Project ID + */ + project_id: number; + /** + * Environment ID (optional) + */ + environment_id?: number; + /** + * Start date (ISO 8601) + */ + start_date: string; + /** + * End date (ISO 8601) + */ + end_date: string; + /** + * Bucket interval: hour, day, week, month (default: auto) + */ + bucket_interval?: string; + }; + url: '/analytics/event-detail'; +}; + +export type GetEventDetailErrors = { + /** + * Invalid parameters + */ + 400: unknown; + /** + * Internal server error + */ + 500: unknown; +}; + +export type GetEventDetailResponses = { + /** + * Successfully retrieved event details + */ + 200: EventDetailResponse; +}; + +export type GetEventDetailResponse = GetEventDetailResponses[keyof GetEventDetailResponses]; + +export type GetEventVisitorsData = { + body?: never; + path?: never; + query: { + /** + * Event 
name to list visitors for + */ + event_name: string; + /** + * Project ID + */ + project_id: number; + /** + * Environment ID (optional) + */ + environment_id?: number; + /** + * Start date (ISO 8601) + */ + start_date: string; + /** + * End date (ISO 8601) + */ + end_date: string; + /** + * Page number (1-based, default: 1) + */ + page?: number; + /** + * Items per page (default: 20, max: 100) + */ + per_page?: number; + }; + url: '/analytics/event-visitors'; +}; + +export type GetEventVisitorsErrors = { + /** + * Invalid parameters + */ + 400: unknown; + /** + * Internal server error + */ + 500: unknown; +}; + +export type GetEventVisitorsResponses = { + /** + * Successfully retrieved event visitors + */ + 200: EventVisitorsResponse; +}; + +export type GetEventVisitorsResponse = GetEventVisitorsResponses[keyof GetEventVisitorsResponses]; + export type GetEventsCountData = { body?: never; path?: never; @@ -11850,7 +12349,18 @@ export type LookupDnsARecordsResponse = LookupDnsARecordsResponses[keyof LookupD export type ListDomainsData = { body?: never; path?: never; - query?: never; + query?: { + /** + * Page number (1-indexed) + */ + page?: number | null; + /** + * Number of items per page (max 100) + */ + page_size?: number | null; + sort_by?: string | null; + sort_order?: string | null; + }; url: '/domains'; }; @@ -13068,7 +13578,18 @@ export type GetEmailResponse = GetEmailResponses[keyof GetEmailResponses]; export type ListServicesData = { body?: never; path?: never; - query?: never; + query?: { + /** + * Page number (1-indexed) + */ + page?: number | null; + /** + * Number of items per page (max 100) + */ + page_size?: number | null; + sort_by?: string | null; + sort_order?: string | null; + }; url: '/external-services'; }; @@ -13217,7 +13738,18 @@ export type ListProjectServicesData = { */ project_id: number; }; - query?: never; + query?: { + /** + * Page number (1-indexed) + */ + page?: number | null; + /** + * Number of items per page (max 100) + */ + 
page_size?: number | null; + sort_by?: string | null; + sort_order?: string | null; + }; url: '/external-services/projects/{project_id}'; }; @@ -13563,7 +14095,18 @@ export type ListServiceProjectsData = { */ id: number; }; - query?: never; + query?: { + /** + * Page number (1-indexed) + */ + page?: number | null; + /** + * Number of items per page (max 100) + */ + page_size?: number | null; + sort_by?: string | null; + sort_order?: string | null; + }; url: '/external-services/{id}/projects'; }; @@ -16431,7 +16974,18 @@ export type UpdatePreferencesResponse = UpdatePreferencesResponses[keyof UpdateP export type ListNotificationProvidersData = { body?: never; path?: never; - query?: never; + query?: { + /** + * Page number (1-indexed) + */ + page?: number | null; + /** + * Number of items per page (max 100) + */ + page_size?: number | null; + sort_by?: string | null; + sort_order?: string | null; + }; url: '/notification-providers'; }; @@ -16786,7 +17340,18 @@ export type TestProvider2Response = TestProvider2Responses[keyof TestProvider2Re export type ListOrdersData = { body?: never; path?: never; - query?: never; + query?: { + /** + * Page number (1-indexed) + */ + page?: number | null; + /** + * Number of items per page (max 100) + */ + page_size?: number | null; + sort_by?: string | null; + sort_order?: string | null; + }; url: '/orders'; }; @@ -17110,6 +17675,42 @@ export type ListPresetsResponses = { export type ListPresetsResponse2 = ListPresetsResponses[keyof ListPresetsResponses]; +export type GeneratePresetDockerfileData = { + body: GenerateDockerfileRequest; + path: { + /** + * Preset slug (e.g., nextjs, vite, python) + */ + slug: string; + }; + query?: never; + url: '/presets/{slug}/dockerfile'; +}; + +export type GeneratePresetDockerfileErrors = { + /** + * Unauthorized + */ + 401: unknown; + /** + * Preset not found + */ + 404: unknown; + /** + * Internal server error + */ + 500: unknown; +}; + +export type GeneratePresetDockerfileResponses = { + /** + 
* Generated Dockerfile + */ + 200: GenerateDockerfileResponse; +}; + +export type GeneratePresetDockerfileResponse = GeneratePresetDockerfileResponses[keyof GeneratePresetDockerfileResponses]; + export type GetProjectsData = { body?: never; path?: never; @@ -19145,6 +19746,10 @@ export type GetContainerLogsData = { * Include timestamps in log output (default: false) */ timestamps?: boolean; + /** + * Follow log output in real-time (default: true) + */ + follow?: boolean; }; url: '/projects/{project_id}/environments/{environment_id}/container-logs'; }; @@ -19277,6 +19882,10 @@ export type GetContainerLogsByIdData = { * Include timestamps in log output (default: false) */ timestamps?: boolean; + /** + * Follow log output in real-time (default: true) + */ + follow?: boolean; }; url: '/projects/{project_id}/environments/{environment_id}/containers/{container_id}/logs'; }; @@ -21846,7 +22455,18 @@ export type ListWebhooksData = { */ project_id: number; }; - query?: never; + query?: { + /** + * Page number (1-indexed) + */ + page?: number | null; + /** + * Number of items per page (max 100) + */ + page_size?: number | null; + sort_by?: string | null; + sort_order?: string | null; + }; url: '/projects/{project_id}/webhooks'; }; @@ -22933,6 +23553,51 @@ export type GetBranchesByRepositoryIdResponses = { export type GetBranchesByRepositoryIdResponse = GetBranchesByRepositoryIdResponses[keyof GetBranchesByRepositoryIdResponses]; +export type ListCommitsByRepositoryIdData = { + body?: never; + path: { + /** + * Repository ID + */ + repository_id: number; + }; + query: { + /** + * Branch name to list commits for + */ + branch: string; + /** + * Number of commits to return (default: 20, max: 100) + */ + per_page?: number | null; + }; + url: '/repository/{repository_id}/commits'; +}; + +export type ListCommitsByRepositoryIdErrors = { + /** + * Unauthorized + */ + 401: unknown; + /** + * Repository not found + */ + 404: unknown; + /** + * Internal server error + */ + 500: 
unknown; +}; + +export type ListCommitsByRepositoryIdResponses = { + /** + * List of commits + */ + 200: CommitListResponse; +}; + +export type ListCommitsByRepositoryIdResponse = ListCommitsByRepositoryIdResponses[keyof ListCommitsByRepositoryIdResponses]; + export type CheckCommitExistsData = { body?: never; path: { diff --git a/web/src/components/analytics/EventDetail.tsx b/web/src/components/analytics/EventDetail.tsx new file mode 100644 index 00000000..7b7f9784 --- /dev/null +++ b/web/src/components/analytics/EventDetail.tsx @@ -0,0 +1,532 @@ +import { + getEventDetailOptions, + getEventVisitorsOptions, +} from '@/api/client/@tanstack/react-query.gen' +import { ProjectResponse } from '@/api/client/types.gen' +import { Badge } from '@/components/ui/badge' +import { Button } from '@/components/ui/button' +import { + Card, + CardContent, + CardDescription, + CardHeader, + CardTitle, +} from '@/components/ui/card' +import { + Table, + TableBody, + TableCell, + TableHead, + TableHeader, + TableRow, +} from '@/components/ui/table' +import { + Tooltip, + TooltipContent, + TooltipTrigger, +} from '@/components/ui/tooltip' +import { Skeleton } from '@/components/ui/skeleton' +import { useQuery } from '@tanstack/react-query' +import { format } from 'date-fns' +import { + ArrowLeft, + BarChart3, + Chrome, + Globe, + Hash, + Link2, + Loader2, + Users, +} from 'lucide-react' +import { useState } from 'react' +import { useNavigate } from 'react-router-dom' +import { TimeAgo } from '../utils/TimeAgo' + +interface EventDetailProps { + project: ProjectResponse + eventName: string + startDate: Date | undefined + endDate: Date | undefined + environment: number | undefined + onBack: () => void +} + +export function EventDetail({ + project, + eventName, + startDate, + endDate, + environment, + onBack, +}: EventDetailProps) { + const navigate = useNavigate() + const [currentPage, setCurrentPage] = useState(1) + const perPage = 20 + + // Fetch event detail analytics + const { data: 
detailData, isLoading: detailLoading } = useQuery({ + ...getEventDetailOptions({ + query: { + event_name: eventName, + project_id: project.id, + start_date: startDate ? startDate.toISOString() : '', + end_date: endDate ? endDate.toISOString() : '', + environment_id: environment, + }, + }), + enabled: !!startDate && !!endDate, + }) + + // Fetch visitors for this event + const { data: visitorsData, isLoading: visitorsLoading } = useQuery({ + ...getEventVisitorsOptions({ + query: { + event_name: eventName, + project_id: project.id, + start_date: startDate ? startDate.toISOString() : '', + end_date: endDate ? endDate.toISOString() : '', + environment_id: environment, + page: currentPage, + per_page: perPage, + }, + }), + enabled: !!startDate && !!endDate, + }) + + const totalPages = visitorsData + ? Math.ceil(visitorsData.total_count / perPage) + : 0 + + return ( +
+ {/* Back button */} +
+ +
+ + {/* Event name title */} +
+

{eventName}

+

+ {startDate && endDate + ? `${format(startDate, 'LLL dd, y')} - ${format(endDate, 'LLL dd, y')}` + : 'Event analytics'} +

+
+ + {/* Summary stats */} + {detailLoading ? ( +
+ {['total', 'visitors', 'sessions'].map((key) => ( + + +
+ + +
+ +
+
+ ))} +
+ ) : detailData ? ( +
+ } + /> + } + /> + } + /> +
+ ) : null} + + {/* Referrers, Countries, Browsers side by side */} + {detailLoading && ( +
+ {['referrers', 'countries', 'browsers'].map((section) => ( + + + + + +
+ {['a', 'b', 'c', 'd'].map((row) => ( +
+ +
+ + +
+
+ ))} +
+
+
+ ))} +
+ )} + {detailData && ( +
+ {/* Top Referrers */} + } + items={detailData.referrers.slice(0, 8)} + renderItem={(ref) => ({ + label: ref.referrer || '(direct)', + count: ref.count, + percentage: ref.percentage, + })} + emptyMessage="No referrer data" + /> + + {/* Top Countries */} + } + items={detailData.countries.slice(0, 8)} + renderItem={(country) => ({ + label: country.country, + count: country.count, + percentage: country.percentage, + })} + emptyMessage="No location data" + /> + + {/* Top Browsers */} + } + items={detailData.browsers.slice(0, 8)} + renderItem={(browser) => ({ + label: browser.browser, + count: browser.count, + percentage: browser.percentage, + })} + emptyMessage="No browser data" + /> +
+ )} + + {/* Visitors Table */} + + +
+
+ Visitors + + Visitors who triggered this event + {visitorsData && ( + + ({visitorsData.total_count.toLocaleString()} unique) + + )} + +
+ {visitorsLoading && ( +
+ + Loading... +
+ )} +
+
+ + {visitorsLoading && !visitorsData ? ( + + ) : !visitorsData?.visitors || + visitorsData.visitors.length === 0 ? ( +
+

+ No visitors found for this event in the selected date range +

+
+ ) : ( + <> + + + + Visitor + Count + Last Triggered + Browser + Location + Referrer + + + + {visitorsData.visitors.map((visitor) => ( + + navigate( + `/projects/${project.slug}/analytics/visitors/${visitor.visitor_id}` + ) + } + > + +
+ + + {visitor.visitor_id} + +
+
+ + + {visitor.event_count}x + + + + + + + + + + {visitor.browser || '-'} + + + + + + + + {visitor.referrer_hostname || 'Direct'} + + +
+ ))} +
+
+ + {/* Pagination */} + {totalPages > 1 && ( +
+

+ Page {currentPage} of {totalPages} +

+
+ + +
+
+ )} + + )} +
+
+
+ ) +} + +// ============================================================================ +// Helper Components +// ============================================================================ + +interface StatCardProps { + label: string + value: string + icon: React.ReactNode +} + +function StatCard({ label, value, icon }: StatCardProps) { + return ( + + +
+ {icon} + {label} +
+

{value}

+
+
+ ) +} + +interface BreakdownItem { + label: string + count: number + percentage: number +} + +interface BreakdownCardProps { + title: string + icon: React.ReactNode + items: T[] + renderItem: (item: T) => BreakdownItem + emptyMessage: string +} + +function BreakdownCard({ + title, + icon, + items, + renderItem, + emptyMessage, +}: BreakdownCardProps) { + if (items.length === 0) { + return ( + + + + {icon} + {title} + + + +

{emptyMessage}

+
+
+ ) + } + + return ( + + + + {icon} + {title} + + + +
+ {items.map((item) => { + const { label, count, percentage } = renderItem(item) + return ( +
+ + {label} + +
+ {count} + + {percentage.toFixed(1)}% + +
+
+ ) + })} +
+
+
+ ) +} + +interface VisitorLocationProps { + visitor: { + city?: string | null + country?: string | null + country_code?: string | null + } +} + +function VisitorLocation({ visitor }: VisitorLocationProps) { + const parts: string[] = [] + if (visitor.city) parts.push(visitor.city) + if (visitor.country) parts.push(visitor.country) + + if (parts.length === 0) { + return - + } + + return ( + + +
+ + + {parts.join(', ')} + +
+
+ +
+ {visitor.city &&
City: {visitor.city}
} + {visitor.country &&
Country: {visitor.country}
} +
+
+
+ ) +} + +function VisitorsTableSkeleton() { + return ( + + + + Visitor + Count + Last Triggered + Browser + Location + Referrer + + + + {['s1', 's2', 's3', 's4', 's5'].map((key) => ( + + +
+ + +
+
+ + + + + + + + + + + + + + + +
+ ))} +
+
+ ) +} diff --git a/web/src/components/analytics/LiveGlobe.tsx b/web/src/components/analytics/LiveGlobe.tsx index f28bb8bd..f3b33b08 100644 --- a/web/src/components/analytics/LiveGlobe.tsx +++ b/web/src/components/analytics/LiveGlobe.tsx @@ -259,7 +259,7 @@ export function LiveGlobePage({ project }: LiveGlobePageProps) { query: { project_id: project.id, environment_id: selectedEnvironment, - window_minutes: 30, + window_minutes: 5, }, }), refetchInterval: isPaused ? false : 10000, diff --git a/web/src/components/dashboard/VisitorSparkline.tsx b/web/src/components/dashboard/VisitorSparkline.tsx index ed512a20..e4ee348e 100644 --- a/web/src/components/dashboard/VisitorSparkline.tsx +++ b/web/src/components/dashboard/VisitorSparkline.tsx @@ -41,8 +41,8 @@ export function VisitorSparkline({ } // Process the rolling 24-hour window data - // Data comes sorted from newest to oldest, so we reverse it - return [...data].reverse().map((item) => ({ + // Data comes sorted oldest-to-newest (ASC) from the backend + return data.map((item) => ({ hour: item.hour, value: item.count || 0, })) @@ -72,6 +72,10 @@ export function VisitorSparkline({ // Calculate max value for proper scaling const maxValue = Math.max(...chartData.map((d) => d.value), 1) + // Count how many hours have non-zero values to decide whether to show dots + const nonZeroCount = chartData.filter((d) => d.value > 0).length + const showActiveDot = nonZeroCount <= 3 + return (
@@ -81,12 +85,12 @@ export function VisitorSparkline({ margin={{ left: 0, right: 0, - top: 0, + top: 4, bottom: 0, }} height={height} > - + ) => { + const { cx, cy, payload } = props as { + cx: number + cy: number + payload: { value: number } + } + if (!payload || payload.value === 0) return + return ( + + ) + } + : false + } /> diff --git a/web/src/components/project/ProjectAnalytics.tsx b/web/src/components/project/ProjectAnalytics.tsx index 14930ec5..b462d083 100644 --- a/web/src/components/project/ProjectAnalytics.tsx +++ b/web/src/components/project/ProjectAnalytics.tsx @@ -20,7 +20,6 @@ import { Pages } from '@/components/analytics/Pages' import { SessionReplays } from '@/components/analytics/SessionReplays' import { FunnelDetail } from '@/components/funnel/FunnelDetail' import { FunnelManagement } from '@/components/funnel/FunnelManagement' -import { LiveVisitorsList } from '@/components/visitors/LiveVisitorsList' import { LiveVisitors } from '@/pages/LiveVisitors' import { Button } from '@/components/ui/button' import { Calendar } from '@/components/ui/calendar' @@ -61,6 +60,8 @@ import { } from '@/components/ui/table' import { Tabs, TabsList, TabsTrigger } from '@/components/ui/tabs' import VisitorAnalytics from '@/components/visitors/VisitorAnalytics' +import { Input } from '@/components/ui/input' +import { Label } from '@/components/ui/label' import { cn } from '@/lib/utils' import { CreateFunnel } from '@/pages/CreateFunnel' import { EditFunnel } from '@/pages/EditFunnel' @@ -74,6 +75,7 @@ import { Globe, Info, RefreshCw, + RotateCcw, Terminal, } from 'lucide-react' import * as React from 'react' @@ -83,8 +85,10 @@ import { Routes, useLocation, useNavigate, + useParams, useSearchParams, } from 'react-router-dom' +import { EventDetail } from '@/components/analytics/EventDetail' import { Badge } from '@/components/ui/badge' import { useAuth } from '@/contexts/AuthContext' @@ -102,6 +106,7 @@ interface VisitorChartProps { startDate: Date | undefined 
endDate: Date | undefined environment: number | undefined + onZoom?: (from: Date, to: Date) => void } export function VisitorChart({ @@ -109,11 +114,20 @@ export function VisitorChart({ startDate, endDate, environment, + onZoom, }: VisitorChartProps) { const [aggregationLevel, setAggregationLevel] = React.useState< 'events' | 'sessions' | 'visitors' >('visitors') + // Brush zoom state — track timestamps for zoom + pixel X for overlay + const [refAreaLeft, setRefAreaLeft] = React.useState(null) + const [refAreaRight, setRefAreaRight] = React.useState(null) + const [dragPixelLeft, setDragPixelLeft] = React.useState(null) + const [dragPixelRight, setDragPixelRight] = React.useState(null) + const isDragging = React.useRef(false) + const chartContainerRef = React.useRef(null) + const { data, isLoading, error } = useQuery({ ...getHourlyVisitsOptions({ path: { @@ -183,6 +197,79 @@ }) }, [data, startDate, endDate]) + // Helper: get pixel X relative to chart container from a recharts event + const getPixelX = React.useCallback((e: any): number | null => { + // chartX can legitimately be 0 at the chart's left edge, so check for null/undefined explicitly + if (e?.chartX == null) return null + return e.chartX + }, []) + + const handleMouseDown = React.useCallback( + (e: any) => { + if (!e || !onZoom) return + const timestamp = e.activePayload?.[0]?.payload?.timestamp + const px = getPixelX(e) + if (timestamp && px != null) { + isDragging.current = true + setRefAreaLeft(timestamp) + setRefAreaRight(null) + setDragPixelLeft(px) + setDragPixelRight(null) + } + }, + [onZoom, getPixelX] + ) + + const handleMouseMove = React.useCallback( + (e: any) => { + if (!isDragging.current || !e) return + const timestamp = e.activePayload?.[0]?.payload?.timestamp + const px = getPixelX(e) + if (timestamp) { + setRefAreaRight(timestamp) + } + if (px != null) { + setDragPixelRight(px) + } + }, + [getPixelX] + ) + + const handleMouseUp = React.useCallback(() => { + if (!isDragging.current || refAreaLeft == null || refAreaRight == null) { + isDragging.current = false +
setRefAreaLeft(null) + setRefAreaRight(null) + setDragPixelLeft(null) + setDragPixelRight(null) + return + } + isDragging.current = false + + const left = Math.min(refAreaLeft, refAreaRight) + const right = Math.max(refAreaLeft, refAreaRight) + + setRefAreaLeft(null) + setRefAreaRight(null) + setDragPixelLeft(null) + setDragPixelRight(null) + + // Require a minimum drag span of 30 minutes, enough to cover two distinct hourly data points + if (right - left < 1000 * 60 * 30) { + return + } + + onZoom?.(new Date(left), new Date(right)) + }, [refAreaLeft, refAreaRight, onZoom]) + + // Compute the overlay position from pixel coordinates + const selectionOverlay = React.useMemo(() => { + if (dragPixelLeft == null || dragPixelRight == null) return null + const left = Math.min(dragPixelLeft, dragPixelRight) + const width = Math.abs(dragPixelRight - dragPixelLeft) + if (width < 4) return null + return { left, width } + }, [dragPixelLeft, dragPixelRight]) + const getAggregationLabel = () => { switch (aggregationLevel) { case 'events': @@ -213,28 +300,35 @@
{getChartTitle()} -
- setAggregationLevel('events')} - > - Events - - setAggregationLevel('sessions')} - > - Sessions - - setAggregationLevel('visitors')} - > - Visitors - +
+ {onZoom && ( + + Drag on chart to zoom + + )} +
+ setAggregationLevel('events')} + > + Events + + setAggregationLevel('sessions')} + > + Sessions + + setAggregationLevel('visitors')} + > + Visitors + +
{isLoading ? ( @@ -254,40 +348,57 @@ export function VisitorChart({
) : ( - - - - value.toLocaleString()} +
+ {selectionOverlay && ( +
- } /> - - - + )} + + + + value.toLocaleString()} + /> + } /> + + + +
)}
) @@ -417,11 +528,11 @@ function AnalyticsFilters({ {dateRange?.from ? ( dateRange.to ? ( <> - {format(dateRange.from, 'LLL dd, y')} -{' '} - {format(dateRange.to, 'LLL dd, y')} + {format(dateRange.from, 'LLL dd, y HH:mm')} -{' '} + {format(dateRange.to, 'LLL dd, y HH:mm')} ) : ( - format(dateRange.from, 'LLL dd, y') + format(dateRange.from, 'LLL dd, y HH:mm') ) ) : ( Custom range @@ -444,6 +555,62 @@ function AnalyticsFilters({ new Date(new Date().setMonth(new Date().getMonth() - 1)) } /> +
+
+ + { + if (!dateRange?.from) return + const [hours, minutes] = e.target.value + .split(':') + .map(Number) + const updated = new Date(dateRange.from) + updated.setHours(hours, minutes, 0, 0) + onDateRangeChange({ + from: updated, + to: dateRange.to, + }) + }} + disabled={!dateRange?.from} + /> +
+
+ + { + if (!dateRange?.to) return + const [hours, minutes] = e.target.value + .split(':') + .map(Number) + const updated = new Date(dateRange.to) + updated.setHours(hours, minutes, 59, 999) + onDateRangeChange({ + from: dateRange.from, + to: updated, + }) + }} + disabled={!dateRange?.to} + /> +
+
@@ -585,6 +752,133 @@ function PagesTab({ project }: PagesTabProps) { ) } +// Event Detail Tab Component +interface EventDetailTabProps { + project: ProjectResponse +} + +function EventDetailTab({ project }: EventDetailTabProps) { + const { eventName: rawEventName } = useParams<{ eventName: string }>() + const navigate = useNavigate() + const eventName = rawEventName ? decodeURIComponent(rawEventName) : '' + + const [dateFilter, setDateFilter] = React.useState({ + quickFilter: '24hours', + dateRange: undefined, + }) + const [selectedEnvironment, setSelectedEnvironment] = React.useState< + number | undefined + >(undefined) + const [isRefreshing, setIsRefreshing] = React.useState(false) + const queryClient = useQueryClient() + + const getDateRange = React.useCallback(() => { + const now = new Date() + if (dateFilter.quickFilter === 'custom' && dateFilter.dateRange) { + return { + startDate: dateFilter.dateRange.from, + endDate: dateFilter.dateRange.to, + } + } + + switch (dateFilter.quickFilter) { + case 'today': + return { + startDate: new Date(now.setHours(0, 0, 0, 0)), + endDate: new Date(now.setHours(23, 59, 59, 999)), + } + case 'yesterday': { + const yesterday = new Date(now) + yesterday.setDate(yesterday.getDate() - 1) + return { + startDate: new Date(yesterday.setHours(0, 0, 0, 0)), + endDate: new Date(yesterday.setHours(23, 59, 59, 999)), + } + } + case '24hours': { + const twentyFourHoursAgo = new Date(now) + twentyFourHoursAgo.setHours(twentyFourHoursAgo.getHours() - 24) + return { + startDate: twentyFourHoursAgo, + endDate: now, + } + } + case '7days': + return { + startDate: subDays(now, 7), + endDate: now, + } + case '30days': + return { + startDate: subDays(now, 30), + endDate: now, + } + default: + return { + startDate: subDays(now, 7), + endDate: now, + } + } + }, [dateFilter]) + + const { startDate, endDate } = getDateRange() + + const handleRefresh = React.useCallback(() => { + setIsRefreshing(true) + queryClient.invalidateQueries({ + predicate: 
(query) => { + const key = query.queryKey[0] as string + return !!( + key && + typeof key === 'string' && + (key.includes('getEventDetail') || key.includes('getEventVisitors')) + ) + }, + }) + setTimeout(() => setIsRefreshing(false), 1000) + }, [queryClient]) + + if (!eventName) { + return ( +
+

No event specified

+
+ ) + } + + return ( +
+ + setDateFilter((prev) => ({ ...prev, quickFilter: filter })) + } + onDateRangeChange={(range) => + setDateFilter((prev) => ({ + quickFilter: range ? 'custom' : prev.quickFilter, + dateRange: range, + })) + } + onEnvironmentChange={setSelectedEnvironment} + onRefresh={handleRefresh} + isRefreshing={isRefreshing} + /> + + navigate(`/projects/${project.slug}/analytics`)} + /> +
+ ) +} + // Session Replays Tab Component interface SessionReplaysTabProps { project: ProjectResponse @@ -835,6 +1129,7 @@ export function ProjectAnalytics({ project }: ProjectAnalyticsProps) { element={} /> } /> + } /> } /> } /> ({ - quickFilter: '24hours', - dateRange: undefined, - }) + + // Restore date filter from URL search params (enables browser back/forward) + const [dateFilter, setDateFilter] = React.useState( + () => { + const filter = searchParams.get('filter') as QuickFilter | null + const from = searchParams.get('from') + const to = searchParams.get('to') + + if (filter === 'custom' && from && to) { + return { + quickFilter: 'custom', + dateRange: { from: new Date(from), to: new Date(to) }, + } + } + if (filter && QUICK_FILTERS.some((f) => f.value === filter)) { + return { quickFilter: filter, dateRange: undefined } + } + return { quickFilter: '24hours', dateRange: undefined } + } + ) + + // Sync date filter to URL search params + const updateDateFilter = React.useCallback( + (next: AnalyticsDateFilter) => { + setDateFilter(next) + const params = new URLSearchParams() + params.set('filter', next.quickFilter) + if ( + next.quickFilter === 'custom' && + next.dateRange?.from && + next.dateRange?.to + ) { + params.set('from', next.dateRange.from.toISOString()) + params.set('to', next.dateRange.to.toISOString()) + } + setSearchParams(params, { replace: false }) + }, + [setSearchParams] + ) + + // Listen for popstate (browser back/forward) and restore date filter + React.useEffect(() => { + const filter = searchParams.get('filter') as QuickFilter | null + const from = searchParams.get('from') + const to = searchParams.get('to') + + if (filter === 'custom' && from && to) { + setDateFilter({ + quickFilter: 'custom', + dateRange: { from: new Date(from), to: new Date(to) }, + }) + } else if (filter && QUICK_FILTERS.some((f) => f.value === filter)) { + setDateFilter({ quickFilter: filter, dateRange: undefined }) + } + }, [searchParams]) + const 
[selectedEnvironment, setSelectedEnvironment] = React.useState< number | undefined >(undefined) @@ -918,6 +1266,17 @@ function ProjectAnalyticsOverview({ project }: ProjectAnalyticsOverviewProps) { }, [dateFilter]) const { startDate, endDate } = getDateRange() + // Chart zoom handler — sets a custom date range from drag selection + const handleChartZoom = React.useCallback( + (from: Date, to: Date) => { + updateDateFilter({ + quickFilter: 'custom', + dateRange: { from, to }, + }) + }, + [updateDateFilter] + ) + // Check if we have any analytics data using the new endpoint const hasAnalyticsEventsQuery = useQuery({ ...hasAnalyticsEventsOptions({ @@ -1006,13 +1365,13 @@ function ProjectAnalyticsOverview({ project }: ProjectAnalyticsOverviewProps) { dateRange={dateFilter.dateRange} selectedEnvironment={selectedEnvironment} onFilterChange={(filter) => - setDateFilter((prev) => ({ ...prev, quickFilter: filter })) + updateDateFilter({ ...dateFilter, quickFilter: filter, dateRange: undefined }) } onDateRangeChange={(range) => - setDateFilter((prev) => ({ - quickFilter: range ? 'custom' : prev.quickFilter, + updateDateFilter({ + quickFilter: range ? 'custom' : dateFilter.quickFilter, dateRange: range, - })) + }) } onEnvironmentChange={setSelectedEnvironment} onRefresh={handleRefresh} @@ -1027,12 +1386,30 @@ function ProjectAnalyticsOverview({ project }: ProjectAnalyticsOverviewProps) { endDate={endDate} environment={selectedEnvironment} /> - +
+ {dateFilter.quickFilter === 'custom' && dateFilter.dateRange && ( +
+ +
+ )} + +
{/* Globe link */}
- {[...Array(5)].map((_, i) => ( -
+ {['e1', 'e2', 'e3', 'e4', 'e5'].map((key) => ( +
@@ -1180,7 +1558,18 @@ function EventsChart({ project, startDate, endDate, environment }: ChartProps) { {chartData.map((item) => ( - + { + const url = `/projects/${project.slug}/analytics/events/${encodeURIComponent(item.event)}` + if (e.metaKey || e.ctrlKey) { + window.open(url, '_blank') + } else { + navigate(url) + } + }} + > {item.event} {item.count.toLocaleString()} diff --git a/web/src/components/ui/date-range-picker.tsx b/web/src/components/ui/date-range-picker.tsx index cdb68ce8..31354f62 100644 --- a/web/src/components/ui/date-range-picker.tsx +++ b/web/src/components/ui/date-range-picker.tsx @@ -1,12 +1,13 @@ 'use client' -import * as React from 'react' import { format } from 'date-fns' import { Calendar as CalendarIcon } from 'lucide-react' import { DateRange } from 'react-day-picker' import { cn } from '@/lib/utils' import { Button } from '@/components/ui/button' import { Calendar } from '@/components/ui/calendar' +import { Input } from '@/components/ui/input' +import { Label } from '@/components/ui/label' import { Popover, PopoverContent, @@ -72,6 +73,54 @@ export function DateRangePicker({ numberOfMonths={2} className="max-w-screen" /> + {showTime && ( +
+
+ + { + if (!date?.from || !onDateChange) return + const [hours, minutes] = e.target.value + .split(':') + .map(Number) + const updated = new Date(date.from) + updated.setHours(hours, minutes, 0, 0) + onDateChange({ from: updated, to: date.to }) + }} + disabled={!date?.from} + /> +
+
+ + { + if (!date?.to || !onDateChange) return + const [hours, minutes] = e.target.value + .split(':') + .map(Number) + const updated = new Date(date.to) + updated.setHours(hours, minutes, 59, 999) + onDateChange({ from: date.from, to: updated }) + }} + disabled={!date?.to} + /> +
+
+ )}
diff --git a/web/src/components/visitors/VisitorDetail.tsx b/web/src/components/visitors/VisitorDetail.tsx index d031c8b8..78321be1 100644 --- a/web/src/components/visitors/VisitorDetail.tsx +++ b/web/src/components/visitors/VisitorDetail.tsx @@ -55,12 +55,14 @@ import { ChevronLeft, ChevronRight, Clock, + ExternalLink, Globe as MapPinIcon, Loader2, Monitor, Pencil, PlayCircle, Plus, + Share2, Smartphone, Users as UserIcon, } from 'lucide-react' @@ -404,7 +406,7 @@ export function VisitorDetail({ project, visitorId }: VisitorDetailProps) { ) : visitorDetails ? ( <> {/* Visitor Info Cards - Redesigned with better layout */} -
+
{/* Location Card */} @@ -425,6 +427,29 @@ export function VisitorDetail({ project, visitorId }: VisitorDetailProps) { + {/* Source / Referrer Card */} + + + + + Source + + + +
+ {visitorDetails?.first_channel || 'Direct'} +
+ {visitorDetails?.first_referrer_hostname && ( +
+ + + {visitorDetails.first_referrer_hostname} + +
+ )} +
+
+ {/* First Seen Card */} diff --git a/web/src/components/visitors/VisitorsList.tsx b/web/src/components/visitors/VisitorsList.tsx index 8a10d8c1..9a6db27b 100644 --- a/web/src/components/visitors/VisitorsList.tsx +++ b/web/src/components/visitors/VisitorsList.tsx @@ -40,6 +40,7 @@ import { User, ChevronLeft, ChevronRight, + ExternalLink, } from 'lucide-react' import * as React from 'react' import { useNavigate } from 'react-router-dom' @@ -228,6 +229,7 @@ export function VisitorsList({ project }: VisitorsListProps) { Visitor Location + Source Browser / OS First Seen Last Seen @@ -377,6 +379,11 @@ function VisitorRow({ visitor, onClick }: VisitorRowProps) {
+ {/* Source / Referrer */} + + + + {/* Browser / OS */}
@@ -419,3 +426,30 @@ function VisitorRow({ visitor, onClick }: VisitorRowProps) { ) } + +function VisitorSource({ visitor }: { visitor: VisitorInfo }) { + const channel = visitor.first_channel + const hostname = visitor.first_referrer_hostname + + if (!channel && !hostname) { + return Direct + } + + return ( +
+ {channel && ( + + {channel} + + )} + {hostname && ( +
+ + + {hostname} + +
+ )} +
+ ) +} diff --git a/web/src/pages/ApiKeyCreate.tsx b/web/src/pages/ApiKeyCreate.tsx index 48b9e498..d457da38 100644 --- a/web/src/pages/ApiKeyCreate.tsx +++ b/web/src/pages/ApiKeyCreate.tsx @@ -324,6 +324,7 @@ export default function ApiKeyCreate() { value={keyName} onChange={(e) => setKeyName(e.target.value)} className="max-w-md" + autoFocus />

Choose a name that helps you remember what this key is used for