Conversation
## Overview
Migrated from the Express-based REST API to the Model Context Protocol (MCP) using
the FastMCP library. Python clients can now use standard JSON-RPC 2.0 over HTTP
to interact with the VS Code extension.
## Changes
### TypeScript Server
- **Added** `debriefMcpServer.ts` - FastMCP server implementation
- 5 MCP resources: features, selection, time, viewport, plot list
- 9 MCP tools: add/update/delete features, selection, zoom, time, viewport, notify
- Port 60123 with `/mcp` endpoint (same as before)
- Health check at `/health`
### Extension Integration
- **Modified** `extension.ts` - Use DebriefMcpServer instead of DebriefHTTPServer
- **Modified** `serverIndicatorConfigs.ts` - Updated for MCP server
- **Modified** `ServerIndicatorConfig.ts` - Updated documentation
- **Modified** `package.json` - Added fastmcp and zod dependencies, updated esbuild externals
### Python Client
- **Added** `mcp_client.py` - Simple JSON-RPC 2.0 client
- Uses standard `requests` library (no special MCP dependencies)
- Helper methods for all Debrief operations
- Clean error handling with MCPError exception
- **Modified** `test_http_connection.py` - Updated to use new MCP client
## Benefits
- **Standardized Protocol**: Uses MCP (Model Context Protocol) standard
- **Type Safety**: Zod validation on all tool parameters
- **Better Errors**: Structured JSON-RPC error responses
- **Auto-discovery**: Clients can list available resources and tools
- **Simpler Python Client**: Raw HTTP requests, no special dependencies
## Breaking Changes
- Python scripts must use JSON-RPC 2.0 format instead of custom REST format
- Use `mcp_client.py` helper or make raw JSON-RPC requests
## Migration Path
Old: `debrief.notify("Hello")`
New: `client.notify("Hello")` or raw HTTP POST with JSON-RPC 2.0
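For scripts that skip the helper, a raw request might look like the sketch below. Only the port, `/mcp` endpoint, and JSON-RPC 2.0 framing are taken from this PR; the tool name `debrief_notify` and its `message` argument are assumptions.

```python
# Hedged sketch of the "raw HTTP POST with JSON-RPC 2.0" path described above.
# Assumptions: the notify tool is exposed as "debrief_notify" with a "message"
# argument, and the server listens on localhost:60123/mcp.
import requests

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",            # standard MCP method for invoking a tool
    "params": {
        "name": "debrief_notify",      # assumed tool name
        "arguments": {"message": "Hello"},
    },
}

response = requests.post("http://localhost:60123/mcp", json=payload, timeout=10)
response.raise_for_status()
print(response.json())                 # {"jsonrpc": "2.0", "id": 1, "result": ...}
```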
Related: #227
## Cleanup
- **Removed** `debriefHttpServer.ts` - old Express-based REST server
- All Python test scripts now use `mcp_client.py`
## Updated Python Tests
- **test_notify_command.py** - Now uses MCPClient, added notification level tests
- **move_point_north_simple.py** - Updated to use MCP resources and tools
- **toggle_paris_color.py** - Simplified with MCP client, added status messages
- **select_centre_time.py** - Updated to work with ISO 8601 strings from MCP
## Migration Complete
All Python code now uses the JSON-RPC 2.0 MCP protocol instead of the custom REST API. The old `debrief_api.py` remains for reference but is no longer used.
💡 Codex Review
Here are some automated review suggestions for this pull request.
```typescript
async start(): Promise<void> {
  console.warn('Starting Debrief MCP server...');

  await this.server.start({
    transportType: 'httpStream',
    httpStream: {
      port: this.port,
      endpoint: '/mcp'
    }
  });

  console.warn(`Debrief MCP server started on http://localhost:${this.port}/mcp`);
}

async stop(): Promise<void> {
  console.warn('Stopping Debrief MCP server...');
  // FastMCP handles cleanup internally
  // Note: FastMCP v3 doesn't expose a stop method, server lifecycle is managed by the framework
  console.warn('Debrief MCP server stopped');
}

isRunning(): boolean {
  // FastMCP doesn't expose a running state check
  // We rely on start() being called successfully
  return true;
```
Implement actual shutdown for MCP server
The new DebriefMcpServer.stop() is a no‑op and isRunning() always returns true. The status indicator calls onStop and then onStart again when the user restarts the server or when the extension deactivates/reactivates. Because the FastMCP instance keeps listening on port 60123, the subsequent start() call will encounter an EADDRINUSE error and the restart will fail even though the UI reports a successful stop. Please dispose the FastMCP server or at least prevent subsequent starts while the port is still bound.
🚀 VS Code Extension PR Preview Deployed
Your VS Code extension PR preview has been successfully deployed to Fly.io!
🌐 Preview URL: https://pr-231-futuredebrief.fly.dev
This preview will be automatically updated when you push new commits to this PR.
## Problem
The Python client was getting 400 errors when calling `get_features()` without
a filename because it tried to access `plot:///features`, which doesn't match
any resource template.
## Solution
Added non-templated resources that auto-select when only one plot is open:
- `plot://features` (instead of `plot:///features`)
- `plot://selection` (instead of `plot:///selection`)
- `plot://time` (instead of `plot:///time`)
- `plot://viewport` (instead of `plot:///viewport`)
These complement the existing templated resources:
- `plot://{filename}/features`
- `plot://{filename}/selection`
- `plot://{filename}/time`
- `plot://{filename}/viewport`
## Changes
- **debriefMcpServer.ts**: Added 4 auto-select resources with helpful error messages
- **mcp_client.py**: Fixed URI patterns to use `plot://` instead of `plot:///`
## Behavior
- Single plot open: Auto-select resources work seamlessly
- Multiple plots open: Clear error message with list of available plots
- No plots open: Error message indicating no plots available
Fixes toggle_paris_color.py and other scripts that don't specify a filename.
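A hedged sketch of reading the auto-select resource with MCP's standard `resources/read` method; the exact request shape `mcp_client.py` uses is not shown in this PR, so treat the details as illustrative.

```python
# Illustrative only: read the auto-select features resource over JSON-RPC 2.0.
import requests

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "resources/read",
    "params": {"uri": "plot://features"},   # auto-selects when exactly one plot is open
}
# For a specific plot, the templated form would be used instead, e.g.:
# {"uri": "plot://sample.plot.json/features"}  # example filename

resp = requests.post("http://localhost:60123/mcp", json=payload, timeout=10)
resp.raise_for_status()
print(resp.json())
```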
FastMCP and zod need to be loaded from node_modules at runtime rather than bundled into the extension. When bundled, FastMCP tried to access files with undefined paths during initialization. This fixes the extension activation error: 'The argument filename must be a file URL object, file URL string, or absolute path string. Received undefined'. As a bonus, the bundle size drops from 3.5MB to 207KB.
- Updated mcp_client.py to show actual JSON-RPC error messages
- Added debug_mcp.py to help diagnose connection issues
- Errors now show the server's error message instead of just the HTTP status
## Changes
**mcp_client.py:**
- Added unique session ID generation using uuid4
- Include 'Mcp-Session-Id' header in all HTTP requests
- Added context manager support (`__enter__`, `__exit__`)
- Added close() method for explicit cleanup
- Updated documentation with session examples
**debug_mcp.py:**
- Refactored to use session-aware MCPClient
- More readable output with structured tests
- Shows session ID for debugging
## How Sessions Work
Each MCPClient instance:
1. Generates a unique session ID on initialization
2. Includes the session ID in all requests via HTTP header
3. Maintains session for the lifetime of the client
4. Supports context manager for automatic cleanup
## Usage
Basic:
```python
client = MCPClient()
features = client.get_features()
```
Context manager:
```python
with MCPClient() as client:
    features = client.get_features()
```
This is the future-proof solution using full MCP protocol compliance.
Fixes: 'Bad Request: No valid session ID provided' error
The MCP protocol requires clients to call 'initialize' before making any other requests. Without this, the server returns 'Session not found' even though the session ID is being sent.
Changes:
- Added `_initialize_session()` method that calls MCP 'initialize'
- Includes protocol version and client capabilities
- Called automatically in `__init__`
- Proper error handling if initialization fails
This completes the MCP session lifecycle:
1. Client generates session ID
2. Client calls 'initialize' with session ID header
3. Server registers the session
4. Client makes subsequent requests
Fixes: 'MCP Error -32001: Session not found'
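At this stage the client performed an explicit MCP initialize handshake (later removed in favour of stateless mode). A minimal sketch of that handshake; the protocol version and clientInfo values are chosen here for illustration and are not taken from the actual `mcp_client.py`.

```python
# Hedged sketch of the _initialize_session() handshake described above.
import uuid
import requests

session_id = str(uuid.uuid4())
headers = {"Mcp-Session-Id": session_id}

init_payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",   # assumed version string
        "capabilities": {},
        "clientInfo": {"name": "debrief-python-client", "version": "0.1.0"},
    },
}

resp = requests.post("http://localhost:60123/mcp", json=init_payload,
                     headers=headers, timeout=10)
resp.raise_for_status()
# Subsequent requests reuse the same Mcp-Session-Id header.
```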
## Problem
Session-based mode requires persistent connections and initialization handshakes, which doesn't work well with simple HTTP POST requests. This caused:
- 'Session not found' errors
- Complex client initialization
- Chicken-and-egg problems with session establishment
## Solution
Enable stateless mode on the FastMCP server, which is the standard pattern for HTTP-based MCP servers. Each request is now independent and self-contained.
## Changes
**Server (debriefMcpServer.ts):**
- Added `stateless: true` to httpStream configuration
- Updated console log to indicate stateless mode
**Python Client (mcp_client.py):**
- Removed session initialization handshake (`_initialize_session`)
- Simplified `__init__` method
- Updated documentation to reflect stateless operation
- Session ID still generated for request tracking
## Benefits
- ✅ Simpler client code - no initialization needed
- ✅ More reliable - no session lifecycle issues
- ✅ Standard HTTP-based MCP pattern
- ✅ Better for serverless/scaled deployments
- ✅ Still fully MCP protocol compliant
## Why This Is Future-Proof
Stateless HTTP is the recommended MCP transport for request/response patterns. Session management is primarily for bidirectional transports like WebSocket.
After this change:
- Pull latest code
- Rebuild: `cd apps/vs-code && pnpm esbuild-base --sourcemap`
- Reload VS Code
- Python scripts should work immediately!
The Python client was still sending 'Mcp-Session-Id' headers, which caused FastMCP to expect session management even in stateless mode.
Changes:
- Removed Mcp-Session-Id header from all requests
- Removed session_id attribute from MCPClient class
- Removed uuid import (no longer needed)
- Updated debug script to not reference session_id
- Updated documentation to clarify stateless operation
In true stateless mode:
- No session headers are sent
- No session tracking
- Each request is completely independent
- The server creates and discards a temporary session per request
This should finally fix the 'Session not found' errors.
FastMCP's HTTP streaming transport requires clients to accept both:
- application/json (for JSON-RPC responses)
- text/event-stream (for streaming capabilities)
Added an Accept header to all requests:
`'Accept': 'application/json, text/event-stream'`
This fixes the error: 'Not Acceptable: Client must accept both application/json and text/event-stream'
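On the Python side this amounts to sending both media types on every request; a minimal sketch (the header value is quoted from the commit, everything else is illustrative):

```python
import requests

headers = {
    "Content-Type": "application/json",
    # Both media types are required by FastMCP's HTTP streaming transport.
    "Accept": "application/json, text/event-stream",
}

payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}
resp = requests.post("http://localhost:60123/mcp", json=payload,
                     headers=headers, timeout=10)
```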
When the server returns non-JSON content, the client now shows:
- Content-Type header
- HTTP status code
- Response body preview (first 500 chars)
- JSON parsing error details
This will help diagnose what the server is actually returning.
FastMCP's HTTP streaming transport returns responses in SSE format:
```
event: message
id: <uuid>
data: {"jsonrpc": "2.0", "result": ...}
```
Changes:
- Added _parse_sse_response() method to extract JSON from SSE format
- Check Content-Type header and parse accordingly
- Extract JSON from 'data:' field when Content-Type is 'text/event-stream'
- Fall back to plain JSON for other content types
This fixes the 'Expecting value: line 1 column 1' JSON parsing errors.
The client now properly handles FastMCP's SSE response format.
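A minimal sketch of the parsing step; the real `_parse_sse_response()` may differ in details such as multi-line data handling.

```python
import json

def parse_sse_response(text: str) -> dict:
    """Extract the JSON-RPC payload from a text/event-stream response body."""
    for line in text.splitlines():
        if line.startswith("data:"):
            return json.loads(line[len("data:"):].strip())
    raise ValueError("No 'data:' field found in SSE response")

# Dispatch on Content-Type, falling back to plain JSON:
# if "text/event-stream" in response.headers.get("Content-Type", ""):
#     result = parse_sse_response(response.text)
# else:
#     result = response.json()
```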
- Remove effect, sury, and @valibot/to-json-schema from externals (these are transitive dependencies that should be bundled, not external)
- Fix zod version to match fastmcp's requirement (^3.25.76 instead of ^4.x)
- This was causing extension activation failure and preventing plot editors from loading
- Bundle fastmcp and zod into extension (not external)
- Keep @valibot/to-json-schema, sury, effect as external (optional deps)
- Add import.meta.url polyfill for CommonJS compatibility
- This allows the extension to work both locally and in VSIX deployment
- Remove @debrief/shared-types from externals list
- Workspace packages must be bundled, not external
- Fixes 'Cannot find module' error in fly.io deployment
- Extension now bundles: shared-types, fastmcp, zod
- External only: vscode, @valibot/to-json-schema, sury, effect (optional)
- Update pnpm-lock.yaml to reflect zod ^3.25.76 (from ^4.1.12)
- Fixes CI build failure with frozen-lockfile
- Remove debrief_api.py (replaced by mcp_client.py)
- Remove debug_mcp.py (debug file, no longer needed)
- All test scripts now use mcp_client.py with FastMCP
- Migration from REST to MCP is complete
- Add Zod schema generation to shared-types (alongside TypeScript types)
- Install json-schema-to-zod and zod dependencies
- Generate 43 Zod schemas from existing JSON schemas
- Export Zod schemas from shared-types package
MCP Server improvements (TypeScript):
- Replace z.any() with typed Zod schemas in tool parameters
- Add runtime validation for debrief_add_features tool
- Add runtime validation for debrief_update_features tool
- Create union schema for all Debrief feature types
Python client improvements:
- Add optional Pydantic validation to all data methods
- Validate features before sending (add_features, update_features)
- Validate responses after receiving (get_features, get_selection, get_time, get_viewport)
- Graceful fallback when Pydantic models are not available
Build system improvements:
- Update shared-types Makefile to generate and copy Zod schemas
- Fix TypeScript compilation to trigger when Zod files change
- Upgrade TypeScript to ^5.2.0 for fastmcp compatibility (using keyword)
Type compatibility fixes:
- Cast Zod-inferred types to GeoJSONFeature where needed
- Handle id field difference (null vs undefined)
This provides full type safety with runtime validation on both sides of the MCP interface while maintaining backward compatibility through optional validation.
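The "graceful fallback" on the Python side might look roughly like this; the helper name and the `debrief.types` import path are illustrative, not the actual `mcp_client.py` code.

```python
# Hedged sketch: optional Pydantic validation with a graceful fallback when
# the generated models are not installed. Names here are illustrative.
from typing import Any, Dict

try:
    from debrief.types import DebriefFeatureCollection  # assumed import path
    HAS_PYDANTIC_MODELS = True
except ImportError:
    HAS_PYDANTIC_MODELS = False

def validate_feature_collection(data: Dict[str, Any]) -> Dict[str, Any]:
    """Validate a feature collection if models are available; otherwise pass through."""
    if HAS_PYDANTIC_MODELS:
        return DebriefFeatureCollection.model_validate(data).model_dump()
    return data
```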
🎨 Web Components Visual Testing Results
✅ Visual testing completed successfully!
🔍 Visual Review: View visual changes
Visual testing runs automatically when web components are modified.
🔧 Tool Vault Build Complete
Commit: ec2a6ba
The Tool Vault packager has been built and tested successfully.
MCP Client improvements:
- Import all feature types (Track, Point, Annotation) and create a union type
- Change return types from Dict[str, Any] to Pydantic models:
  - get_features() → DebriefFeatureCollection
  - get_selection() → SelectionState
  - get_time() → TimeState
  - get_viewport() → ViewportState
- Change parameter types to accept Pydantic models:
  - add_features(features: List[DebriefFeature])
  - update_features(features: List[DebriefFeature])
  - set_features(feature_collection: DebriefFeatureCollection)
  - set_time(time_state: TimeState)
  - set_viewport(viewport_state: ViewportState)
- Simplified validation: models are always validated; removed the validate parameter complexity
- Auto-serialize Pydantic models to dicts when sending to the server
- Added Dict[str, Any] type hints to args dicts to fix Pylance errors
Example script improvements:
- toggle_paris_color.py: Type-safe property access with Pydantic models
- move_point_north_simple.py: Type-safe geometry manipulation using DebriefPointFeature
- select_centre_time.py: Type-safe time state using TimeState model
Benefits:
- Better IDE autocomplete (knows model fields and types)
- Type checking catches errors at development time
- Clearer, more maintainable code
- Pydantic validation ensures data correctness
- Great examples for users to follow
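With the typed signatures listed above, a script works directly with models rather than dicts. A hedged usage sketch; only the method names and return types come from this commit, the round-trip shown is illustrative.

```python
# Illustrative usage of the typed client methods listed above.
from mcp_client import MCPClient

client = MCPClient()

features = client.get_features()      # -> DebriefFeatureCollection
time_state = client.get_time()        # -> TimeState
viewport = client.get_viewport()      # -> ViewportState

# Models go back in as models; the client serializes them before sending.
client.set_time(time_state)
client.set_viewport(viewport)
client.update_features(features.features)
```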
Problem: Pylance was reporting 'Variable not allowed in type expression' errors because we were setting type aliases to None at runtime when Pydantic wasn't available, then using them in type annotations.
Solution:
- Import Pydantic types in a TYPE_CHECKING block for type checkers (always available)
- Import the same types with a _ prefix at runtime for actual validation (may be None)
- Use non-prefixed types in type annotations (e.g., `def get_features() -> DebriefFeatureCollection`)
- Use _-prefixed types for runtime operations (e.g., `_DebriefFeatureCollection.model_validate()`)
This pattern ensures:
- Type checkers always see valid type expressions
- Runtime code gracefully handles missing Pydantic
- No 'Variable not allowed in type expression' errors
Fixes all 10 Pylance reportInvalidTypeForm errors in mcp_client.py
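The pattern, roughly; the `debrief.types` import path and the placeholder payload are assumptions made for a self-contained sketch.

```python
# Hedged sketch of the TYPE_CHECKING / underscore-prefix pattern described above.
from __future__ import annotations
from typing import TYPE_CHECKING, Any, Dict

if TYPE_CHECKING:
    # Always visible to type checkers, never imported at runtime.
    from debrief.types import DebriefFeatureCollection

try:
    # Runtime import; may legitimately fail if the Pydantic models aren't installed.
    from debrief.types import DebriefFeatureCollection as _DebriefFeatureCollection
except ImportError:
    _DebriefFeatureCollection = None

def get_features() -> "DebriefFeatureCollection":
    raw: Dict[str, Any] = {"type": "FeatureCollection", "features": []}  # placeholder payload
    if _DebriefFeatureCollection is not None:
        return _DebriefFeatureCollection.model_validate(raw)
    return raw  # type: ignore[return-value]  # graceful fallback without Pydantic
```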
🔧 Tool Vault Build Complete
Commit: 9877dbc
The Tool Vault packager has been built and tested successfully.
Problem: Extension activation was failing with 'filename must be a file URL object' error because fastmcp couldn't access import.meta.url correctly when bundled.
Changes:
- Use the clearer variable name __import_meta_url__ instead of importMetaUrl
- Use .href instead of .toString() for the URL (more standard)
- Add fallbacks for __filename and __dirname in case they're undefined
- Provide the fallback URL 'file:///' if all else fails
- More defensive polyfill that handles edge cases in VS Code's loading context
This should fix the extension activation error that prevents maps from showing.
🔧 Tool Vault Build Complete
Commit: 72fbad1
The Tool Vault packager has been built and tested successfully.
Problem: Example scripts had ~25 Pylance type errors because they didn't properly handle Pydantic union types. Errors included:
- 'Cannot access attribute "lower" for class "StrictInt"' (feature.id can be an int)
- 'Cannot access attribute "color" for class "TrackProperties"' (not all types have color)
- 'is not a known attribute of "None"' (properties can be None)
This teaches bad practices to users and makes their IDEs show errors.
Solution: Demonstrate proper type narrowing patterns:
1. Use isinstance() to narrow union types:
   - Before: `feature.properties.color` (ERROR: union type)
   - After: `if isinstance(feature, DebriefPointFeature): feature.properties.color`
2. Check ID type before string methods:
   - Before: `feature.id.lower()` (ERROR: could be int)
   - After: `if isinstance(feature.id, str): feature.id.lower()`
3. Check for None before accessing:
   - Before: `feature.id in selected_ids` (ERROR: could be None)
   - After: `if feature.id is not None and feature.id in selected_ids`
4. Use hasattr() for optional properties:
   - `if feature.properties and hasattr(feature.properties, 'color')`
New files:
- TYPE_SAFETY_GUIDE.md: Comprehensive guide showing patterns, common errors, and best practices for working with Pydantic union types
Updated examples:
- toggle_paris_color.py: Proper isinstance() checks, clear comments
- move_point_north_simple.py: Safe ID handling, geometry access patterns
Benefits:
- Zero Pylance errors (was 25+)
- Better IDE autocomplete
- Teaches users correct patterns
- Catches errors at development time
- Self-documenting code with explicit type checks
This is essential for teaching users how to work with typed Python code!
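Pulled together, the four patterns look roughly like this; the iteration, the `selected_ids` argument, and the `debrief.types` import path are illustrative rather than copied from the example scripts.

```python
# Hedged sketch combining the four narrowing patterns above.
from debrief.types import DebriefPointFeature  # assumed import path

def recolor_selected(features, selected_ids):
    for feature in features:
        # Pattern 3: id can be None, so check before membership tests.
        if feature.id is None or feature.id not in selected_ids:
            continue
        # Pattern 2: id can be an int, so narrow before string methods.
        if isinstance(feature.id, str) and "paris" not in feature.id.lower():
            continue
        # Pattern 1: narrow the union before touching type-specific properties.
        if isinstance(feature, DebriefPointFeature) and feature.properties is not None:
            # Pattern 4: hasattr() guards optional properties.
            if hasattr(feature.properties, "color"):
                feature.properties.color = "#FF0000"
```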
🔧 Tool Vault Build Complete
Commit: 27b1fd0
The Tool Vault packager has been built and tested successfully.
Problem: Extension activation was failing with 'Cannot read properties of undefined (reading run)' because fastmcp's dependency xsschema was trying to dynamically import optional libraries (effect, sury, @valibot/to-json-schema) that weren't installed. When these imports failed, xsschema received undefined and tried to call methods on it, causing the activation error.
Solution: Install the optional dependencies as devDependencies:
- effect@^3.19.2
- sury@11.0.0-alpha.4
- @valibot/to-json-schema@^1.3.0
These remain externalized in the esbuild config (not bundled) but are now available at runtime when xsschema tries to import them dynamically. This fixes the extension activation error while keeping bundle size down by not bundling these optional schema conversion libraries.
🔧 Tool Vault Build Complete
Commit: e5b7592
The Tool Vault packager has been built and tested successfully.
🔧 Tool Vault Build Complete
Commit: 45c1e00
The Tool Vault packager has been built and tested successfully.
…or logging
Problem 1: Extension still failing with 'filename must be a file URL' error despite previous polyfill attempts. The error message from VS Code (notificationsAlerts.ts:40) doesn't show the actual source of the problem, making debugging very difficult.
Problem 2: Lost stack traces in activation errors make it hard to diagnose where failures occur in our code.
Solutions:
1. More robust polyfill:
   - Wrap polyfill in IIFE with try-catch for safer execution
   - Use 'const' instead of 'var' to prevent redeclaration issues
   - Add '[Polyfill Error]' logging if pathToFileURL fails
   - Provide better fallback: 'file:///extension.js' instead of 'file:///'
   - Remove dependency on __filename/__dirname (which might not be available yet)
2. Comprehensive error logging:
   - Wrap entire activate() function in try-catch block
   - Log full error object, message, and stack trace
   - Prefix with '[Extension Activation Error]' for easy filtering
   - Re-throw error after logging so VS Code still shows it
   - Now we'll see BOTH VS Code's error AND our detailed stack trace
Benefits:
- Polyfill is more defensive and handles edge cases
- If polyfill fails, we'll see '[Polyfill Error]' in console
- If activation fails, we'll see full stack trace showing exact line in our code
- Much easier to debug issues in deployed environments
- Fallback URL is valid and more descriptive
This should fix the extension activation while also making future issues much easier to diagnose.
🔧 Tool Vault Build Complete
Commit: 7856f2d
The Tool Vault packager has been built and tested successfully.
Added detailed logging throughout the Python wheel installation process to diagnose why debrief-types package installation is failing in both local and fly.io environments.
Changes:
- Enhanced checkAndInstallPackage() with step-by-step logging
- Added detailed logging to getPythonInterpreter() showing all attempted paths
- Added logging to installPackage() showing each installation strategy attempt
- Added logging to getInstalledVersion() showing version check results
- Enhanced error logging in extension.ts activation to show full error details
- Added 'from __future__ import annotations' to mcp_client.py for PEP 563 compatibility
The logging will help identify:
- Whether the bundled wheel file is found
- Which Python interpreter is detected
- Which installation strategy succeeds/fails
- What error messages pip returns
- Why the installation might be failing silently
All log messages are prefixed with [Python Wheel Installer] for easy filtering.
🔧 Tool Vault Build Complete
Commit: 680a37b
The Tool Vault packager has been built and tested successfully.
Fixed test data files and the toggle_paris_color.py script to use the correct property names as defined in the canonical Pydantic schemas. The Pydantic models are the source of truth for data structure (per CLAUDE.md).
Schema compliance issues fixed:
- Changed "color" to "marker-color" in sample.plot.json (4 features)
- Changed "color" to "marker-color" in colored_points_test.plot.json (7 features)
- Added required "dataType": "reference-point" discriminator to colored_points_test.plot.json
- Fixed toggle_paris_color.py hasattr check from 'color' to use the marker_color property directly
Root cause: The Pydantic PointProperties model defines marker_color with alias "marker-color", not "color". Test data was using incorrect property names, causing Pydantic validation errors: "Extra inputs are not permitted [type=extra_forbidden]"
Following CLAUDE.md guidance: "We should not be forgiving when handling data - if it's not in the expected format, fail-fast and we can fix it."
🔧 Tool Vault Build Complete
Commit: 0768ecc
The Tool Vault packager has been built and tested successfully.
…s change
Fixed issue where updating feature properties (like marker-color) via MCP did not visually update the map display, even though the document was correctly updated.
Root cause: React's reconciliation algorithm uses keys to determine if components should be re-rendered. The previous key was only `feature.id || index`, so when a feature's properties changed (e.g., color update), React saw the same key and did not re-render the component.
Solution: Include relevant properties in the React key:
- feature.id (for identity)
- marker-color (triggers re-render on color changes)
- visible (triggers re-render on visibility changes)
This forces React to treat the updated feature as a new component instance, causing the CircleMarker to re-render with the updated color.
Example key: "paris_city_002-#FF0000-true"
This fixes the issue where Python scripts using mcp_client.py could successfully update features via debrief_update_features, but the changes weren't visible on the map.
🔧 Tool Vault Build Complete
Commit: 3fa1a0f
The Tool Vault packager has been built and tested successfully.
Fixed critical bug where Pydantic model_dump() was serializing using Python
field names instead of JSON field aliases, causing property loss during
feature updates.
Problem:
1. Test data has "marker-color": "#00FF00" (JSON field name with hyphen)
2. Pydantic loads it into marker_color field (Python field name, underscore)
3. Python script modifies marker_color = "#FF0000"
4. model_dump() serializes as "marker_color": "#FF0000" (wrong!)
5. Server saves with wrong field name
6. Renderer looks for "marker-color" or "color", finds neither
7. Falls back to default blue color (#3388ff)
Solution:
Use model_dump(by_alias=True, exclude_none=True) to:
- Serialize using JSON field aliases (marker-color not marker_color)
- Exclude None values to keep JSON clean
This fix applies to both add_features() and update_features() methods.
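A minimal sketch of what that looks like in one of those methods; the surrounding method shape and the `_call_tool` helper are illustrative, not the actual `mcp_client.py` code, while the `model_dump()` arguments and the tool name come from this PR.

```python
# Hedged sketch of the serialization fix described above.
def update_features(self, features):
    payload = [
        f.model_dump(by_alias=True, exclude_none=True)  # "marker-color", not "marker_color"
        for f in features
    ]
    return self._call_tool("debrief_update_features", {"features": payload})
```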
Related schema definition:
```python
marker_color: Optional[str] = Field(
None,
alias="marker-color",
description="Marker color (hex color code)"
)
```
Now toggle_paris_color.py correctly updates the visible marker color!
🔧 Tool Vault Build Complete
Commit: 4795ba6
The Tool Vault packager has been built and tested successfully.
…e_case
Removed the Pydantic field alias to eliminate fragility and ensure consistent naming throughout the codebase. Now using marker_color (snake_case) everywhere instead of mixing marker-color (kebab-case) and marker_color.
Changes to point.py:
- Removed alias="marker-color" from marker_color field
- Removed populate_by_name = True from Config (no longer needed)
This is a breaking change for existing data files, but acceptable since we're not in production yet. All test data and renderers will be updated to use marker_color consistently.
Regenerated all derived files:
- TypeScript types now use marker_color
- JSON schemas now use marker_color
- Zod schemas now use marker_color
- Python wheel rebuilt with updated model
Next steps:
- Update test data files to use marker_color
- Update renderer to look for marker_color
- Revert by_alias=True workaround in mcp_client.py
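After the change described under "Changes to point.py" above, the field definition reduces to roughly the following; the wrapping model is sketched from the earlier aliased definition, and other PointProperties fields are omitted.

```python
# Sketch of the alias-free field; derived from the earlier definition that
# carried alias="marker-color". Other fields of the real model are omitted.
from typing import Optional
from pydantic import BaseModel, Field

class PointProperties(BaseModel):
    # Alias removed: the JSON field name now matches the Python name.
    marker_color: Optional[str] = Field(
        None,
        description="Marker color (hex color code)"
    )
```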
- Update test data files to use marker_color instead of marker-color
- Update React renderers to access marker_color property
- Update featureUtils to use dot notation for marker_color
- Remove by_alias=True from mcp_client.py serialization
This completes the removal of Pydantic field aliases, standardizing on snake_case throughout the codebase for consistency and simplicity.
- Update Pydantic model to use str fields with date-time format
- Preserves JSON schema validation while fixing type compatibility
- Resolves TypeError when test code treats fields as strings
- Maintains ISO 8601 datetime string format throughout the stack
- Update multi_feature_collection.json test data
- Update simple_feature_collection.json test data
- Fixes Pydantic validation errors in tool tests
- Aligns with the marker-color to marker_color refactor
- Track running state with a boolean flag
- Store the HTTP server instance from FastMCP start()
- Prevent starting if already running (throws an error)
- Properly close the HTTP server in stop() to release port 60123
- Return accurate state from isRunning()
Fixes the P1 issue where stop() was a no-op, causing EADDRINUSE errors on restart. The status indicator can now successfully stop and restart the server without port conflicts.
❌ VS Code Extension PR Preview Deployment Failed
The VS Code extension PR preview deployment encountered an error during the build or deployment process.
The deployment will retry automatically when you push new commits to this PR.
🔧 Tool Vault Build Complete
Commit: 598f662
The Tool Vault packager has been built and tested successfully.
🧹 VS Code Extension PR Preview Cleaned Up
Your PR preview environment has been automatically destroyed. ✅ Cleanup successful!
All associated resources have been removed from Fly.io to optimize costs.
🧹 Tool Vault Cleanup Complete
The preview Fly.io app has been removed.
Summary
This PR fixes type consistency issues in the Pydantic models and standardizes property naming conventions across the codebase, resolving runtime errors in Python test scripts.
Changes
Problem: The codebase had inconsistent naming between marker-color (kebab-case, JSON) and marker_color (snake_case, Python), managed through Pydantic field aliases. This created fragility where developers had to remember to use by_alias=True when serializing, leading to bugs where properties would be lost or use the wrong name.
Solution: Removed all Pydantic field aliases and standardized on marker_color (snake_case) throughout the entire stack:
Files Changed:
Benefits:
Problem: The TimeState Pydantic model defined fields as datetime objects, which caused Pydantic to automatically parse JSON datetime strings into Python datetime instances. Test code expected strings and called .replace('Z', '+00:00'), which failed because datetime.replace() expects keyword arguments, not string arguments.
Error:
```
TypeError: 'str' object cannot be interpreted as an integer
```
Solution: Changed TimeState fields from datetime to str with format: "date-time" validation:
Before:
```python
current: datetime = Field(...)
```
After:
```python
current: str = Field(
    ...,
    description="Current time position as ISO 8601 date-time string",
    json_schema_extra={"format": "date-time"}
)
```
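With plain strings, client code parses only when it needs arithmetic. A hedged sketch of the kind of centre-time calculation select_centre_time.py performs; the function below is illustrative, not the actual script, and only the `.replace('Z', '+00:00')` pattern and ISO 8601 strings are taken from this PR.

```python
# Illustrative midpoint calculation over ISO 8601 strings.
from datetime import datetime, timezone

def centre_time(start_iso: str, end_iso: str) -> str:
    start = datetime.fromisoformat(start_iso.replace("Z", "+00:00"))
    end = datetime.fromisoformat(end_iso.replace("Z", "+00:00"))
    centre = start + (end - start) / 2
    return centre.astimezone(timezone.utc).isoformat().replace("+00:00", "Z")

print(centre_time("2024-01-01T00:00:00Z", "2024-01-01T01:00:00Z"))
# -> 2024-01-01T00:30:00Z
```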
Files Changed:
Benefits:
Issue: Map features weren't re-rendering when properties changed because React keys only included feature IDs.
Solution: Updated React keys to include marker_color and visible properties to force re-render on property changes:
```typescript
const key = `${feature.id || index}-${markerColor || ''}-${visible !== undefined ? visible : 'true'}`;
```
Testing:
✅ toggle_paris_color.py - Colors update correctly on map
✅ select_centre_time.py - No longer throws TypeError, correctly calculates center time
✅ All test scripts working without errors
✅ Pydantic validation enforces correct data types
✅ Map updates visually when feature properties change
Migration Notes
Since we're not in production, no backwards compatibility is needed. All JSON data files have been updated to use marker_color instead of marker-color.
This PR establishes a cleaner, more consistent foundation for the type system going forward by eliminating aliases and ensuring type compatibility across the Python and TypeScript stacks.