
Feat(feature): Add sync workflow and script to automatically sync AgentKit templates to lamatic.ai agentkits #395

Open
Hussain-Tinwala wants to merge 7 commits into Lamatic:main from Hussain-Tinwala:feature/agentkit-sync-376

Conversation

@Hussain-Tinwala (Contributor) commented on Oct 30, 2025

Fixes #376

Description
This PR introduces a complete GitHub Actions workflow to automatically synchronize the agentkit templates from the /files/templates directory to the https://lamatic.ai/templates/agentkits endpoint, as requested in issue #376.

Changes Implemented
Two new files:

  • New GitHub Actions workflow (.github/workflows/sync-agentkits.yml)
  • New sync script (scripts/sync-agentkits.py)

This is my contribution for Hacktoberfest 2025.

Summary by CodeRabbit

  • Chores
    • Added an automated workflow that runs daily, on relevant repository updates, and can be triggered manually.
    • Introduced a sync operation that creates, updates, or deletes remote AgentKit templates to reflect local changes, with per-item status reporting.
    • Fails loudly on errors to surface issues.
    • Added a Python runtime dependency for the sync job.

vercel bot commented on Oct 30, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Preview | Comments | Updated (UTC)
site | Ready | Preview | Comment | Oct 31, 2025 5:36am

coderabbitai bot commented on Oct 30, 2025

Walkthrough

Adds a new scheduled and on-demand GitHub Actions workflow plus a Python sync script that mirrors local JSON agent kit templates in files/templates to the lamatic.ai agentkits endpoint (create/update/delete) using the LAMATIC_API_KEY secret; also adds requests to requirements.txt.

Changes

  • GitHub Actions workflow (.github/workflows/sync-agentkits.yml): New workflow "Sync AgentKit Templates": triggers daily at 00:00 UTC (cron), on pushes to main affecting files/templates/** and requirements.txt, and via workflow_dispatch. The job runs on ubuntu-latest, checks out the code, sets up Python 3.10 with pip and requirements caching, installs dependencies, and runs scripts/sync-agentkits.py with LAMATIC_API_KEY from secrets. (A hedged sketch of such a workflow follows this list.)
  • Sync script (scripts/sync-agentkits.py): New Python script that discovers JSON files under files/templates, computes SHA-256 hashes, loads remote templates from https://lamatic.ai/templates/agentkits using LAMATIC_API_KEY, and issues POST/PUT/DELETE requests to reconcile remote state. Adds config variables (LOCAL_TEMPLATES_DIR, API_BASE_URL, REQUEST_TIMEOUT, API_KEY, HEADERS) and functions (get_file_hash, get_local_files, get_remote_templates, sync_templates) with error handling and a non-zero exit on failure. (A simplified sketch of the reconciliation logic follows the sequence diagram.)
  • Dependencies (requirements.txt): Adds requests>=2.32.0,<3.0.0, required by the new sync script.
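
For orientation, a minimal workflow matching the summary above could look like the sketch below. Action versions other than actions/setup-python@v5 (which the review diffs show) and the exact step names are assumptions; the file in the PR may differ in detail.

name: Sync AgentKit Templates

on:
  schedule:
    - cron: "0 0 * * *"            # daily at 00:00 UTC
  push:
    branches: [main]
    paths:
      - "files/templates/**"
      - "requirements.txt"
  workflow_dispatch:

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4    # checkout version assumed
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
          cache: "pip"               # caches pip downloads keyed on requirements.txt
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Run sync script
        run: python scripts/sync-agentkits.py
        env:
          LAMATIC_API_KEY: ${{ secrets.LAMATIC_API_KEY }}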

Sequence Diagram(s)

sequenceDiagram
    participant GHA as GitHub Actions
    participant Script as sync-agentkits.py
    participant LocalFS as Local Filesystem
    participant API as lamatic.ai API

    GHA->>Script: Run (env: LAMATIC_API_KEY)

    rect rgba(220,235,255,0.35)
      note over Script,LocalFS: Discover local templates
      Script->>LocalFS: Read `files/templates/*.json`
      LocalFS-->>Script: File contents
      Script->>Script: Compute SHA-256 per file
    end

    rect rgba(220,255,235,0.35)
      note over Script,API: Fetch remote templates
      Script->>API: GET /templates/agentkits (Authorization)
      API-->>Script: Remote templates list
    end

    rect rgba(255,245,215,0.35)
      note over Script,API: Reconcile and act
      alt New local file
        Script->>API: POST /templates/agentkits
      else Updated local file
        Script->>API: PUT /templates/agentkits/{id}
      else Deleted local file
        Script->>API: DELETE /templates/agentkits/{id}
      end
      Script-->>GHA: Log results & exit status
    end
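
Following up on the sync-script entry in the changes list, here is a simplified sketch of the reconciliation step shown in the diagram. Data shapes, payload fields, and the timeout value are assumptions, and it omits reading and uploading the template content, which the real script handles.

import requests

REQUEST_TIMEOUT = 30  # seconds; assumed value


def reconcile(local_files, remote_templates, api_base_url, headers):
    """Mirror local templates to the remote endpoint.

    local_files:      {rel_path: {"path": str, "hash": str}}
    remote_templates: {rel_path: remote_hash}
    Returns True when every request succeeded.
    """
    ok = True

    for rel_path, local in local_files.items():
        try:
            if rel_path not in remote_templates:
                # New local file -> create remotely
                requests.post(api_base_url, headers=headers,
                              json={"name": rel_path, "hash": local["hash"]},
                              timeout=REQUEST_TIMEOUT).raise_for_status()
            elif local["hash"] != remote_templates[rel_path]:
                # Changed locally -> update remotely
                requests.put(f"{api_base_url}/{rel_path}", headers=headers,
                             json={"hash": local["hash"]},
                             timeout=REQUEST_TIMEOUT).raise_for_status()
        except requests.exceptions.RequestException as exc:
            print(f"Error syncing {rel_path}: {exc}")
            ok = False

    for rel_path in remote_templates:
        if rel_path not in local_files:
            try:
                # Removed locally -> delete remotely
                requests.delete(f"{api_base_url}/{rel_path}", headers=headers,
                                timeout=REQUEST_TIMEOUT).raise_for_status()
            except requests.exceptions.RequestException as exc:
                print(f"Error deleting {rel_path}: {exc}")
                ok = False

    return ok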

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

  • Pay extra attention to:
    • HTTP error handling, retry and timeout behavior in scripts/sync-agentkits.py.
    • Correct mapping between local filenames/hashes and remote template IDs to avoid accidental deletions.
    • Validation of JSON parsing and edge cases for malformed template files.
    • Workflow secrets usage and caching steps.

Poem

🐰
I nibble bytes at break of day,
Hashes hop and lead the way,
New ones arrive, old ones part,
I patch and prune with careful heart,
Sync complete — I thump, hooray!

Pre-merge checks and finishing touches

❌ Failed checks (1 inconclusive)
  • Linked Issues Check ❓ Inconclusive
    Explanation: The pull request addresses the core coding requirements from issue #376. The implementation includes a GitHub Actions workflow triggered on a daily cron schedule and on relevant repository changes (automation at regular intervals), a synchronization script that discovers local templates and performs create/update/delete operations (handling new, updated, and deleted files), and comprehensive error handling for file I/O, HTTP requests, and API interactions. However, issue #376's acceptance criteria include "Clear documentation provided for setup and troubleshooting," which is not explicitly addressed in the PR through dedicated documentation files or setup guides, though the workflow file and code comments provide some implicit documentation. While all coding-related requirements from #376 are implemented (automation infrastructure, sync logic for all operations, and error handling), it is unclear whether explicit setup and troubleshooting documentation should be included in this PR or addressed separately.
    Resolution: Either add a documentation file explaining the setup process and troubleshooting steps, or confirm that this documentation requirement will be handled in a follow-up task.
✅ Passed checks (4 passed)
  • Description Check ✅ Passed: Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed: The pull request title "Feat(feature): Add sync workflow and script to automatically sync AgentKit templates to lamatic.ai agentkits" accurately and specifically describes the main changes. It clearly identifies the two key components added (sync workflow and script) and their purpose (automatically synchronizing AgentKit templates to lamatic.ai). The title is sufficiently specific that teammates reviewing history would understand the primary change without ambiguity.
  • Out of Scope Changes Check ✅ Passed: All changes in the pull request are directly in scope and related to implementing issue #376's objectives. The GitHub Actions workflow file provides the scheduling and automation infrastructure, the sync script implements the core template synchronization logic with hash-based change detection and deletion safeguards, and the requirements.txt addition of the requests package is a necessary dependency for the HTTP-based sync operations. No unrelated refactoring, auxiliary fixes, or extraneous modifications are present.
  • Docstring Coverage ✅ Passed: Docstring coverage is 100.00%, which is above the required threshold of 80.00%.

@coderabbitai bot left a comment
Actionable comments posted: 4

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between b61fea9 and e2db578.

📒 Files selected for processing (2)
  • .github/workflows/sync-agentkits.yml (1 hunks)
  • scripts/sync-agentkits.py (1 hunks)
🧰 Additional context used
🪛 Ruff (0.14.2)
scripts/sync-agentkits.py

41-41: Probable use of requests call without timeout

(S113)


48-48: Consider moving this statement to an else block

(TRY300)


50-50: Do not catch blind exception: Exception

(BLE001)


65-65: Probable use of requests call without timeout

(S113)


67-67: Do not catch blind exception: Exception

(BLE001)


74-74: Probable use of requests call without timeout

(S113)


76-76: Do not catch blind exception: Exception

(BLE001)


85-85: Probable use of requests call without timeout

(S113)


87-87: Do not catch blind exception: Exception

(BLE001)

@coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (3)
.github/workflows/sync-agentkits.yml (1)

27-30: Consider caching pip dependencies.

Adding pip cache can speed up workflow runs, especially for the scheduled daily sync.

Apply this diff to add caching:

       - name: Set up Python
         uses: actions/setup-python@v5
         with:
           python-version: '3.10'
+          cache: 'pip'
+          cache-dependency-path: '**/requirements.txt'

       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
           pip install requests

Alternatively, if you create a requirements.txt file with requests listed, the cache will work automatically.

scripts/sync-agentkits.py (2)

92-100: Consider safeguard for bulk deletions.

The deletion logic correctly removes remote templates that no longer exist locally, but a misconfiguration (wrong directory, empty checkout, filesystem issues) could trigger mass deletion of production templates.

Add a safeguard to abort if too many deletions are attempted:

+    # Safeguard: abort if more than 50% of templates would be deleted
+    deletion_count = len([f for f in remote_templates if f not in local_files])
+    if deletion_count > 0 and deletion_count >= len(remote_templates) * 0.5:
+        print(f"ABORT: Would delete {deletion_count}/{len(remote_templates)} templates (>50%). "
+              "This may indicate a misconfiguration.")
+        return False
+
     for filename in remote_templates:
         if filename not in local_files:
             print(f"DELETING: {filename}")

Alternatively, add a --no-delete flag or require an environment variable (e.g., ALLOW_DELETIONS=true) before performing any deletions.
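
If the environment-variable route is preferred, the gate could be as small as the sketch below; ALLOW_DELETIONS and the function name are illustrative, not part of the PR.

import os
import requests

# Hypothetical opt-in flag: deletions are skipped unless explicitly enabled.
ALLOW_DELETIONS = os.environ.get("ALLOW_DELETIONS", "false").lower() == "true"


def delete_stale(remote_templates, local_files, api_base_url, headers, timeout=30):
    """Delete remote templates that no longer exist locally, only when explicitly allowed."""
    for filename in remote_templates:
        if filename in local_files:
            continue
        if not ALLOW_DELETIONS:
            print(f"SKIPPING delete of {filename} (set ALLOW_DELETIONS=true to enable)")
            continue
        print(f"DELETING: {filename}")
        requests.delete(f"{api_base_url}/{filename}", headers=headers,
                        timeout=timeout).raise_for_status()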


70-78: Enhance error messages with HTTP details.

The current error messages don't include HTTP status codes or response bodies, making it harder to diagnose API failures during sync runs.

Apply this diff to add detailed error information:

         if filename not in remote_templates:
             print(f"CREATING: {filename}")
             try:
                 payload = {"name": filename, "content": content, "hash": local_data['hash']}
                 response = requests.post(API_BASE_URL, headers=HEADERS, json=payload, timeout=REQUEST_TIMEOUT)
                 response.raise_for_status()
             except Exception as e:
-                print(f"Error creating {filename}: {e}")
+                status = getattr(getattr(e, 'response', None), 'status_code', 'N/A')
+                print(f"Error creating {filename}: {e} (HTTP {status})")
                 sync_failed = True
                 
         elif local_data['hash'] != remote_templates[filename]:
             print(f"UPDATING: {filename}")
             try:
                 payload = {"content": content, "hash": local_data['hash']}
                 response = requests.put(f"{API_BASE_URL}/{filename}", headers=HEADERS, json=payload, timeout=REQUEST_TIMEOUT)
                 response.raise_for_status()
             except Exception as e:
-                print(f"Error updating {filename}: {e}")
+                status = getattr(getattr(e, 'response', None), 'status_code', 'N/A')
+                print(f"Error updating {filename}: {e} (HTTP {status})")
                 sync_failed = True
                 
 ...
             try:
                 response = requests.delete(f"{API_BASE_URL}/{filename}", headers=HEADERS, timeout=REQUEST_TIMEOUT)
                 response.raise_for_status()
             except Exception as e:
-                print(f"Error deleting {filename}: {e}")
+                status = getattr(getattr(e, 'response', None), 'status_code', 'N/A')
+                print(f"Error deleting {filename}: {e} (HTTP {status})")
                 sync_failed = True

Also applies to: 80-88, 92-100

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between e2db578 and 5a90889.

📒 Files selected for processing (2)
  • .github/workflows/sync-agentkits.yml (1 hunks)
  • scripts/sync-agentkits.py (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
scripts/sync-agentkits.py (1)
lib/templateData.ts (1)
  • templates (16-185)
🪛 Ruff (0.14.2)
scripts/sync-agentkits.py

50-50: Consider moving this statement to an else block

(TRY300)


52-52: Do not catch blind exception: Exception

(BLE001)


65-65: Do not catch blind exception: Exception

(BLE001)


76-76: Do not catch blind exception: Exception

(BLE001)


86-86: Do not catch blind exception: Exception

(BLE001)


98-98: Do not catch blind exception: Exception

(BLE001)

🔇 Additional comments (2)
scripts/sync-agentkits.py (2)

10-10: LGTM!

The timeout constant has been added as suggested in previous reviews, preventing indefinite hangs.


59-59: LGTM!

The failure tracking and non-zero exit code have been added as suggested in previous reviews. The script now correctly reports sync failures to CI.

Also applies to: 102-102, 108-112

@coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (4)
requirements.txt (1)

1-1: Consider pinning the requests version.

Unpinned dependencies can lead to non-reproducible builds and potential breakage from future releases. Since this is a production automation script, pinning to a specific version (or range) improves reliability.

Apply this diff:

-requests
+requests>=2.32.0,<3.0.0

Alternatively, pin to an exact version if strict reproducibility is required:

-requests
+requests==2.32.3
.github/workflows/sync-agentkits.yml (2)

1-15: Add concurrency control to prevent simultaneous executions.

The workflow lacks concurrency control, so a manual workflow_dispatch trigger could run simultaneously with a scheduled execution or a push-triggered run. Concurrent syncs may cause race conditions or conflicting API operations (e.g., both attempting to update the same template).

Apply this diff to add concurrency control:

 name: Sync AgentKit Templates
 
+concurrency:
+  group: sync-agentkit-templates
+  cancel-in-progress: false
+
 on:
   schedule:

This ensures only one sync runs at a time, queuing subsequent triggers until the current run completes.


30-39: LGTM!

Dependency installation and script execution are correctly configured. The API key is properly sourced from repository secrets, and the script path is accurate.

The inline comment on line 33 (# === FIX 2: Install from requirements.txt ===) appears to be a development artifact. Consider removing it for cleaner production code:

-          # === FIX 2: Install from requirements.txt ===
           pip install -r requirements.txt
scripts/sync-agentkits.py (1)

86-92: Consider removing redundant Exception handlers.

After catching RequestException (which covers most requests-related failures), the subsequent except Exception blocks (lines 90-92, 104-106, 126-128) rarely catch additional errors. The pattern is safe but verbose.

If you prefer to keep broad exception catching, the current code is acceptable. For slightly cleaner code, you could consolidate:

             except requests.exceptions.RequestException as e:
                 status = getattr(getattr(e, 'response', None), 'status_code', 'N/A')
                 print(f"Error creating {rel_path}: {e} (HTTP {status})")
                 sync_failed = True
-            except Exception as e:
-                print(f"Error creating {rel_path}: {e}")
-                sync_failed = True

Apply the same pattern to lines 104-106 and 126-128.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 5a90889 and 85c2cc6.

📒 Files selected for processing (3)
  • .github/workflows/sync-agentkits.yml (1 hunks)
  • requirements.txt (1 hunks)
  • scripts/sync-agentkits.py (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
scripts/sync-agentkits.py (1)
lib/templateData.ts (1)
  • templates (16-185)
🪛 Ruff (0.14.2)
scripts/sync-agentkits.py

56-56: Consider moving this statement to an else block

(TRY300)


62-62: Do not catch blind exception: Exception

(BLE001)


75-75: Do not catch blind exception: Exception

(BLE001)


90-90: Do not catch blind exception: Exception

(BLE001)


104-104: Do not catch blind exception: Exception

(BLE001)


126-126: Do not catch blind exception: Exception

(BLE001)

🔇 Additional comments (7)
.github/workflows/sync-agentkits.yml (2)

16-21: LGTM!

Job configuration and checkout step are correctly configured.


23-28: LGTM!

Python setup with caching is correctly configured and will optimize workflow execution time.

scripts/sync-agentkits.py (5)

1-19: LGTM!

The imports, constants, and API key validation are well-structured. The early exit on missing credentials prevents silent failures and provides clear error messaging.


21-24: LGTM!

The hash calculation is correctly implemented using SHA-256 on binary file content, ensuring consistent hashing across platforms.
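
For readers without the diff open, the described helper would look roughly like this; a sketch only, and the PR's version may read the whole file in one pass rather than in chunks.

import hashlib

def get_file_hash(filepath: str) -> str:
    """Return the SHA-256 hex digest of a file read in binary mode."""
    sha256 = hashlib.sha256()
    with open(filepath, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()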


26-43: LGTM!

The function correctly handles subdirectories and normalizes paths with forward slashes for cross-platform consistency. The directory existence check prevents cryptic errors downstream.


110-114: Excellent safeguard against accidental mass deletion!

This protection prevents catastrophic data loss if the local directory is empty or misconfigured. The 50% threshold and clear messaging make this a robust safety mechanism.


132-140: LGTM!

The main block orchestrates the sync cleanly with appropriate exit codes for CI/CD integration. The error handling ensures workflow failures are properly reported to GitHub Actions.

@coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (2)
scripts/sync-agentkits.py (2)

28-44: Consider validating JSON files during discovery.

The function successfully addresses the previous filename collision issue by using relative paths. However, it doesn't validate that discovered JSON files are well-formed or contain expected fields. A corrupted JSON file will only fail later during sync, making debugging harder.

Consider adding basic validation:

     for root, _, files in os.walk(LOCAL_TEMPLATES_DIR):
         for filename in files:
             if filename.endswith(".json"):
                 filepath = os.path.join(root, filename)
-                file_hash = get_file_hash(filepath)
-                rel_path = os.path.relpath(filepath, LOCAL_TEMPLATES_DIR).replace(os.sep, '/')
-                local_files[rel_path] = {"path": filepath, "hash": file_hash}
+                try:
+                    # Validate JSON is well-formed
+                    with open(filepath, 'r') as f:
+                        json.load(f)
+                    file_hash = get_file_hash(filepath)
+                    rel_path = os.path.relpath(filepath, LOCAL_TEMPLATES_DIR).replace(os.sep, '/')
+                    local_files[rel_path] = {"path": filepath, "hash": file_hash}
+                except (json.JSONDecodeError, OSError) as e:
+                    print(f"Warning: Skipping invalid file {filepath}: {e}")
+                    continue

112-128: Excellent safeguard against accidental mass deletion.

The 50% threshold protection is a smart defensive measure that prevents catastrophic mistakes from configuration errors or bugs. The deletion logic properly encodes paths and handles errors consistently.

Consider extracting the threshold as a named constant for clarity:

+# Safeguard: abort if attempting to delete more than this percentage
+MAX_DELETION_THRESHOLD = 0.5
+
 def sync_templates(local_files, remote_templates):
     """Compares local and remote and performs sync operations."""
     
     ...
     
     deletion_count = len([path for path in remote_templates if path not in local_files])
-    if deletion_count > 0 and deletion_count >= (len(remote_templates) * 0.5):
+    if deletion_count > 0 and deletion_count >= (len(remote_templates) * MAX_DELETION_THRESHOLD):
         print(f"ABORT: Safeguard triggered. Attempting to delete {deletion_count}/{len(remote_templates)} templates (>50%).")
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 85c2cc6 and a47a665.

📒 Files selected for processing (3)
  • .github/workflows/sync-agentkits.yml (1 hunks)
  • requirements.txt (1 hunks)
  • scripts/sync-agentkits.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • requirements.txt
  • .github/workflows/sync-agentkits.yml
🧰 Additional context used
🧬 Code graph analysis (1)
scripts/sync-agentkits.py (1)
lib/templateData.ts (1)
  • templates (16-185)
🪛 Ruff (0.14.2)
scripts/sync-agentkits.py

62-62: Consider moving this statement to an else block

(TRY300)


82-82: Do not catch blind exception: Exception

(BLE001)

🔇 Additional comments (6)
scripts/sync-agentkits.py (6)

23-26: LGTM!

The hash calculation is straightforward and correct. Opening in binary mode ensures consistent hashing across platforms.


46-71: Solid error handling with appropriate fail-fast behavior.

The function properly validates content-type before attempting JSON parsing, addressing the previous HTML response concern. Exception handling distinguishes between network errors and JSON decoding failures with clear error messages.

Minor note: Line 60 assumes each template object has name and hash fields. If the API returns objects with different keys, this will raise a KeyError.
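
If that ever bites, a tolerant variant along these lines would skip malformed entries instead of raising; the field names are taken from the comment above and the helper name is illustrative.

def index_remote_templates(items):
    """Map template name -> hash, skipping entries missing the expected fields."""
    indexed = {}
    for item in items:
        name, file_hash = item.get("name"), item.get("hash")
        if name is None or file_hash is None:
            print(f"Warning: skipping remote entry without name/hash: {item}")
            continue
        indexed[name] = file_hash
    return indexed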


87-96: LGTM!

The create operation properly handles errors and includes the timeout. The defensive getattr pattern for extracting the status code prevents exceptions when the response is None.


98-108: LGTM!

The update operation correctly encodes the path while preserving forward slashes (safe='/'), which is essential for subdirectory support. Error handling is consistent and comprehensive.
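
For illustration, this is what safe='/' preserves compared to encoding everything; standard urllib behavior, with a made-up example path.

from urllib.parse import quote

path = "agents/my template.json"
print(quote(path, safe="/"))  # agents/my%20template.json  (slashes kept as path separators)
print(quote(path, safe=""))   # agents%2Fmy%20template.json (slashes percent-encoded)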


132-140: LGTM!

The main block properly orchestrates the sync workflow and exits with the correct status code for CI/CD integration. Clear status messages aid troubleshooting.


8-11: The API endpoint may not exist or may be incorrectly designed.

Lamatic.ai's documented API exclusively uses GraphQL integration, but the script targets a REST-style endpoint (https://lamatic.ai/templates/agentkits) that does not appear in any official documentation. The endpoint could not be verified from the review environment: an unauthenticated request only confirms reachability, and valid credentials are not available in the sandbox.

Before deploying, manually verify the following (a quick check sketch follows this list):

  • The endpoint actually exists and returns JSON (not HTML)
  • Whether the GraphQL API should be used instead, if that is the preferred Lamatic integration pattern
  • Previous reviews flagged this endpoint as returning HTML; confirm with the service owner that this has been resolved
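
One way to run that check before merging, as a sketch: it assumes the same endpoint the script uses and a bearer-style Authorization header, which may not match the script's real header format.

import os
import requests

API_BASE_URL = "https://lamatic.ai/templates/agentkits"
api_key = os.environ["LAMATIC_API_KEY"]

resp = requests.get(API_BASE_URL,
                    headers={"Authorization": f"Bearer {api_key}"},
                    timeout=30)
print(resp.status_code, resp.headers.get("Content-Type"))

# The sync script needs JSON; an HTML content type here means this is the
# public web page rather than a machine-facing API, and the endpoint should
# be confirmed with the service owner before enabling the workflow.
if "application/json" in resp.headers.get("Content-Type", ""):
    print(resp.json())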

@coderabbitai bot left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between a47a665 and b0a7c10.

📒 Files selected for processing (1)
  • scripts/sync-agentkits.py (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
scripts/sync-agentkits.py (1)
lib/templateData.ts (1)
  • templates (16-185)
🪛 Ruff (0.14.2)
scripts/sync-agentkits.py

62-62: Consider moving this statement to an else block

(TRY300)

🔇 Additional comments (5)
scripts/sync-agentkits.py (5)

8-21: Configuration looks solid.

The API key validation and header setup are appropriate. The REQUEST_TIMEOUT constant and defensive Content-Type checking (handled later in the code) address past concerns about API reliability.


23-26: LGTM!

Clean hash implementation using SHA-256.


28-44: Well implemented with proper safeguards.

The directory validation and relative path handling correctly address the filename collision concern from earlier reviews. Path normalization ensures cross-platform compatibility.


73-130: Excellent implementation with outstanding defensive measures.

The sync logic correctly handles all CRUD operations with proper error tracking. The mass-deletion safeguard on lines 112-116 is exemplary defensive programming—aborting when >50% of templates would be deleted prevents catastrophic data loss from misconfigurations.

The specific exception handling (line 82), URL encoding (lines 102, 122), and detailed HTTP status logging all demonstrate production-ready quality.


132-140: LGTM!

Clean orchestration with appropriate exit codes. The conditional exit on line 140 ensures CI will correctly detect sync failures.

@coderabbitai bot left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

Disabled knowledge base sources:

  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between b0a7c10 and 59768c6.

📒 Files selected for processing (1)
  • scripts/sync-agentkits.py (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
scripts/sync-agentkits.py (1)
lib/templateData.ts (1)
  • templates (16-185)
🪛 Ruff (0.14.2)
scripts/sync-agentkits.py

68-68: Consider moving this statement to an else block

(TRY300)

Comment on lines +118 to +134
deletion_count = len([path for path in remote_templates if path not in local_files])
if deletion_count > 0 and deletion_count >= (len(remote_templates) * 0.5):
print(f"ABORT: Safeguard triggered. Attempting to delete {deletion_count}/{len(remote_templates)} templates (>50%).")
print("This may indicate a misconfiguration. Aborting all deletions.")
sync_failed = True
else:
for rel_path in remote_templates:
if rel_path not in local_files:
print(f"DELETING: {rel_path}")
try:
encoded_path = quote(rel_path, safe='/')
response = requests.delete(f"{API_BASE_URL}/{encoded_path}", headers=HEADERS, timeout=REQUEST_TIMEOUT)
response.raise_for_status()
except requests.exceptions.RequestException as e:
status = getattr(getattr(e, 'response', None), 'status_code', 'N/A')
print(f"Error deleting {rel_path}: {e} (HTTP {status})")
sync_failed = True

⚠️ Potential issue | 🟠 Major

Deletion safeguard blocks valid deletions

With the current >50% safeguard, a perfectly legitimate sync is prevented—for example, if the remote set has 1 template and we remove it locally, deletion_count becomes 1, which is ≥ 50% of 1, so the script aborts and exits 1. Any restructuring that removes a majority of templates will fail the same way, violating the requirement that deletions mirror the repo. Please loosen the guard so it only trips when both the percentage and an absolute count threshold are exceeded.

-    if deletion_count > 0 and deletion_count >= (len(remote_templates) * 0.5):
+    if (
+        deletion_count > 0
+        and len(remote_templates) >= 10
+        and deletion_count >= (len(remote_templates) * 0.5)
+    ):


Development

Successfully merging this pull request may close these issues.

Automate Sync: AgentKit templates to lamatic.ai agentkits
