Daily Test Coverage Improver: Research and Plan #15
Description
Test Coverage Research Summary
I've analyzed the repository to understand the current state of test coverage and develop a systematic improvement plan.
Repository Overview
Technology Stack:
- Frontend: Vanilla JavaScript, TypeScript, React components with Vite build system
- Backend: Python with FastAPI
- Testing Frameworks: Vitest (JS/TS), pytest (Python)
- Focus Area: Enterprise compliance (GDPR, accessibility, audit trails)
Current Test Coverage State
Existing Test Files:
- `tests/ContactForm.test.tsx` - Comprehensive (185 lines) - React component with accessibility, validation, form submission, and error handling
- `tests/unit/emailValidation.test.ts` - Partial - Email validation tests
- `tests/unit/test_validation.py` - Minimal (16 lines) - Basic Python validation tests
- `tests/integration/test_api_endpoints.py` - Basic (20 lines) - Simple API endpoint tests
- `tests/test_contact_handler.py` - Exists but may have issues
Source Files Needing Coverage:
- `src/utils/validation.ts` - Comprehensive validation utilities (217 lines)
- `src/api/contact.ts` - API client with GDPR compliance (261 lines)
- `server/contact_handler.py` - Complex GDPR-compliant form handler (363 lines)
- `backend/api/users.py` - FastAPI user endpoints (17 lines)
- `main.js` - Application entry point with event handling
- `src/components/ContactForm.tsx` - React form component
Test Organization Standards
Directory Structure:
```
tests/
├── unit/          # Unit tests for individual functions/classes
│   ├── *.test.ts  # TypeScript unit tests
│   └── test_*.py  # Python unit tests
├── integration/   # Integration tests for API/system interactions
│   └── test_*.py
└── *.test.tsx     # Component tests
```
Naming Conventions:
- JavaScript/TypeScript: `*.test.ts`, `*.test.tsx`
- Python: `test_*.py`
- Place tests near what they test conceptually (unit vs integration vs component)
Commands for Testing
Build and Test Commands:
```shell
# Install dependencies
npm install
pip install pytest pytest-cov coverage fastapi httpx requests

# Build project
npm run build

# Run JavaScript/TypeScript tests with coverage
npm run test:coverage

# Run Python tests with coverage
python -m pytest tests/ --cov=server --cov=backend --cov-report=html:coverage/python --cov-report=json

# Generate combined coverage report
# (automated in .github/actions/daily-test-improver/coverage-steps/action.yml)
```

Coverage Improvement Strategy
Phase 3 Priorities (High-Impact Areas):

1. Python Backend Coverage (Critical Gap)
   - `server/contact_handler.py` - Complex compliance logic needs thorough testing:
     - Rate limiting (`check_rate_limit`)
     - Data validation (`ContactFormData.validate`)
     - CSRF token validation
     - Audit logging (`AuditLogger`)
     - Data retention policies (`DataRetentionManager`)
     - GDPR handlers (`handle_data_export_request`, `handle_data_deletion_request`)
   - `backend/api/users.py` - Basic endpoints need comprehensive tests

2. JavaScript/TypeScript Utilities (Medium Gap)
   - `src/utils/validation.ts` - Many functions partially tested:
     - `RateLimiter` class
     - `validatePhoneNumber`
     - `validateURL`
     - `hashSensitiveData`
     - `generateSecureToken`
     - `DataRetentionManager` class
   - `src/api/contact.ts` - Complex API logic needs testing:
     - `submitContactForm` with mocking
     - `logAuditEvent`
     - `exportUserData`
     - `deleteUserData`

3. Main Application Logic
   - `main.js` - Event handlers, user interactions, GDPR consent flows

4. Edge Cases & Error Handling
   - Security validation edge cases
   - Error scenarios in API calls
   - Concurrent request handling
   - Data retention policy edge cases
Testing Approach:
- Unit Tests: Test individual functions in isolation with mocks
- Integration Tests: Test API endpoints with TestClient
- Security Tests: Validate XSS prevention, CSRF protection, input sanitization
- Compliance Tests: Verify GDPR data handling, audit logging, retention policies
- Accessibility Tests: Ensure WCAG compliance (forms, error messages, ARIA attributes)
Opportunities for Major Coverage Gains
- Security & Validation Module: `src/utils/validation.ts` has extensive untested code (rate limiting, data retention, security tokens)
- Compliance Handler: `server/contact_handler.py` contains critical business logic that needs comprehensive testing
- API Layer: Both `src/api/contact.ts` and `backend/api/users.py` lack thorough integration tests
- Error Handling: Many error paths are untested across all modules
Questions for Maintainers
- Coverage Targets: What minimum coverage percentage should we aim for (e.g., 80% overall, 90% for critical security code)?
- Test Data: Should we create test fixtures for common scenarios (sample form data, mock API responses)?
- Python Validation Module: The test file `tests/unit/test_validation.py` imports from `src.utils.validation`, but I don't see a Python module at that path - should this be testing TypeScript validation instead?
- CI Integration: Should coverage reports fail the build if coverage drops below a threshold?
How to Control this Workflow
You can manage this workflow using the following commands:
```shell
# Disable the workflow
gh aw disable daily-test-improver --repo microsoftgbb/corporate-website

# Enable the workflow
gh aw enable daily-test-improver --repo microsoftgbb/corporate-website

# Run the workflow on demand with multiple iterations
gh aw run daily-test-improver --repo microsoftgbb/corporate-website --repeat (number-of-repeats)

# View workflow logs
gh aw logs daily-test-improver --repo microsoftgbb/corporate-website
```

You can also add comments to this discussion to provide feedback or request adjustments to the plan.
What Happens Next
- Next Run (Phase 2): The workflow will analyze the codebase to create/validate the coverage steps configuration in `.github/actions/daily-test-improver/coverage-steps/action.yml`
- After Phase 2: Phase 3 will begin on subsequent runs to implement actual test coverage improvements based on this plan
- Repeat Mode: If running with `--repeat`, the workflow will automatically proceed through phases without waiting
- Human Review: You can review this research and add comments before the workflow continues to Phase 2
Note: This was intended to be a discussion, but discussions could not be created due to permissions issues. This issue was created as a fallback.
AI generated by Daily Test Coverage Improver
To add this workflow in your repository, run `gh aw add githubnext/agentics/workflows/daily-test-improver.md@e43596e069e74a65cd7d93315091672d278c2642`. See usage guide.
Expires on Mar 5, 2026, 9:27 AM UTC