This document summarizes the complete Continuous Integration setup for the MetaDetect project.
Triggers:
- Push to `main` branch
- Push to any `features/*` branch
- Pull requests to `main`
Automated Steps:
- Environment Setup
  - Ubuntu latest runner
  - JDK 17 (Temurin distribution)
  - Maven dependency caching
- Build & Compilation
  - Clean workspace
  - Compile all Java source files
  - BLOCKS on failure ❌
- Unit Testing
  - Run all JUnit tests
  - Generate test reports (XML + TXT)
  - Collect coverage data via JaCoCo
  - BLOCKS on failure ❌
- Code Coverage Analysis
  - JaCoCo coverage report generation
  - Line, branch, method, and class coverage
  - HTML reports with detailed breakdowns
  - Non-blocking ⚠️
- Style Checking
  - Checkstyle validation (Google Java Style Guide)
  - Check all source and test files
  - XML and HTML report generation
  - Non-blocking in report mode ⚠️
- Static Code Analysis
  - PMD bug detection and code quality analysis
  - Custom ruleset (`config/pmd/ruleset.xml`)
  - Identify potential bugs, dead code, code smells
  - Non-blocking in report mode ⚠️
- Maven Site Generation
  - Aggregate all reports into unified documentation
  - Project information and dependencies
  - Professional HTML site with navigation
  - Non-blocking ⚠️
- Quality Gates
  - Checkstyle violations check
  - PMD violations check
  - Report but continue ⚠️
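The report-but-continue behavior can be sketched as a small shell step. The report locations below (`target/checkstyle-result.xml`, `target/pmd.xml`) are the Maven plugins' default output paths, not necessarily what this pipeline configures, so treat the sketch as illustrative:

```shell
# Sketch of a report-but-continue quality gate: count findings from the
# Checkstyle and PMD XML reports, print them, and never fail the job.

count_findings() {
  file=$1; tag=$2
  # A missing report simply counts as zero findings.
  [ -f "$file" ] || { echo 0; return; }
  # Each finding appears as an <error .../> (Checkstyle) or <violation ...> (PMD) element.
  grep -o "<$tag" "$file" | wc -l | tr -d '[:space:]'
}

cs=$(count_findings target/checkstyle-result.xml error)
pmd=$(count_findings target/pmd.xml violation)

echo "Checkstyle violations: $cs"
echo "PMD violations: $pmd"
# Report but continue: no non-zero exit here, so the workflow proceeds.
```

A blocking gate would differ only in its last line, e.g. `[ "$cs" -eq 0 ] || exit 1`.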
- Report Visualization
  - Install wkhtmltoimage
  - Convert HTML reports to PNG screenshots
  - Save to `reports/` directory
  - Non-blocking ⚠️
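As a sketch, the HTML-to-PNG conversion might look like the loop below. The directory layout and the `wkhtmltoimage` invocation are assumptions, not a copy of `scripts/html_to_png.sh`:

```shell
# Map each HTML report to a PNG under reports/ and convert it, without
# ever failing the job (the step is non-blocking).

png_target() {
  # e.g. target/site/jacoco/index.html -> reports/jacoco.png
  echo "reports/$(basename "$(dirname "$1")").png"
}

mkdir -p reports
for report in target/site/*/index.html; do
  [ -f "$report" ] || continue   # glob did not match: nothing to convert
  out=$(png_target "$report")
  wkhtmltoimage --quality 80 "$report" "$out" \
    || echo "WARN: could not convert $report"   # report but continue
done
```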
- Artifact Upload
  - `ci-reports-html-xml`: Full HTML/XML reports (30-day retention)
  - `ci-reports-screenshots`: PNG visualizations (30-day retention)
  - `test-results`: JUnit XML files (30-day retention)
  - Always runs ✅
- CI Summary
  - Test counts
  - Available reports list
  - Download instructions
  - Always displayed ✅
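The summary's test counts can be derived from the Surefire XML reports. The `tests="N"` and `failures="N"` attributes below follow the standard Surefire report schema; the exact summary wording is an illustration, not the pipeline's actual output:

```shell
# Sum test and failure counts across all Surefire report files.

summarize_tests() {
  dir=${1:-target/surefire-reports}
  total=0; failed=0
  for f in "$dir"/TEST-*.xml; do
    [ -f "$f" ] || continue
    # Pull the tests="N" and failures="N" attributes off the <testsuite> element.
    t=$(sed -n 's/.*<testsuite[^>]* tests="\([0-9]*\)".*/\1/p' "$f")
    fl=$(sed -n 's/.*<testsuite[^>]* failures="\([0-9]*\)".*/\1/p' "$f")
    total=$((total + ${t:-0}))
    failed=$((failed + ${fl:-0}))
  done
  echo "$total tests, $failed failures"
}
```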
Purpose: Fast build and test validation
Steps:
- Compile code
- Run tests
- Package application
- Upload JAR artifact (7 day retention)
| Tool | Purpose | Configuration |
|---|---|---|
| Maven | Build automation | pom.xml |
| JUnit | Unit testing | Test classes in src/test/java |
| JaCoCo | Code coverage | Maven plugin in pom.xml |
| Checkstyle | Style checking | Google checks, pom.xml config |
| PMD | Static analysis | config/pmd/ruleset.xml |
| wkhtmltoimage | Report screenshots | scripts/html_to_png.sh |
| GitHub Actions | CI/CD orchestration | .github/workflows/*.yml |
- Test Results
  - Location: `target/surefire-reports/`
  - Formats: XML (machine-readable), TXT (human-readable)
  - Per-test execution details
  - Stack traces for failures
- Code Coverage (JaCoCo)
  - Location: `target/site/jacoco/index.html`
  - Metrics: Line, branch, method, and class coverage
  - Color-coded source files
  - Package-level breakdowns
- Checkstyle Report
  - Location: `target/site/checkstyle.html`
  - Violations by severity
  - Grouped by file
  - Line-specific references
- PMD Analysis
  - Location: `target/site/pmd.html`
  - Issues by priority
  - Detailed descriptions
  - Fix suggestions
- Maven Site
  - Location: `target/site/index.html`
  - Aggregated documentation
  - Project information
  - All report links
- PNG Screenshots (if successful)
  - `reports/jacoco.png`
  - `reports/pmd.png`
All reports packaged as workflow artifacts:
- Available for 30 days
- Downloadable as ZIP files
- Can be extracted and viewed offline
Why not automated:
- Requires live Supabase authentication service
- Needs real PostgreSQL database connection
- Uses production S3 storage buckets
- Sensitive credentials (cannot be safely stored in CI)
- Risk of test data polluting production
- Cost implications of live service usage
Manual process:
# Set environment variables
export SPRING_DATASOURCE_URL="jdbc:postgresql://..."
export SPRING_DATASOURCE_USERNAME="..."
export SPRING_DATASOURCE_PASSWORD="..."
export SUPABASE_URL="https://..."
export SUPABASE_ANON_KEY="..."
export SUPABASE_JWT_SECRET="..."
export LIVE_E2E=true
# Run E2E tests
mvn -Dtest=dev.coms4156.project.metadetect.e2e.ClientServiceLiveE2eTest test

Coverage:
- Real user signup and login
- Actual image uploads to storage
- Database persistence validation
- Complete authentication flow
- Storage cleanup verification
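Before a live run, it can help to confirm that every required variable is actually set. The helper below is illustrative, not part of the project's scripts:

```shell
# Fail fast if any of the live-E2E environment variables is unset or empty.

require_env() {
  missing=0
  for name in "$@"; do
    eval "val=\${$name:-}"
    if [ -z "$val" ]; then
      echo "missing required variable: $name"
      missing=1
    fi
  done
  return "$missing"
}

# require_env SPRING_DATASOURCE_URL SPRING_DATASOURCE_USERNAME \
#     SPRING_DATASOURCE_PASSWORD SUPABASE_URL SUPABASE_ANON_KEY SUPABASE_JWT_SECRET \
#   && LIVE_E2E=true mvn -Dtest=dev.coms4156.project.metadetect.e2e.ClientServiceLiveE2eTest test
```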
Why not automated:
- Requires running backend server
- Manual token management and refresh
- File uploads from local filesystem
- Visual validation of responses
- Interactive testing of error scenarios
- Need to verify side effects in external systems
Manual process:
# Start backend
mvn spring-boot:run
# Use cURL or Postman
curl -X POST http://localhost:8080/auth/signup \
-H "Content-Type: application/json" \
-d '{"email": "test@example.com", "password": "pass123"}'
Evidence: See reports/api-testing.png for Postman test screenshots
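A follow-up request usually needs the token from the auth response. As a sketch only: the `accessToken` field name and the `/auth/login` path are assumptions here, so adjust them to the actual response shape:

```shell
# Crude token extraction from a JSON auth response, for use in later
# Authorization headers. A real script might use jq instead of sed.

extract_token() {
  sed -n 's/.*"accessToken":"\([^"]*\)".*/\1/p'
}

# TOKEN=$(curl -s -X POST http://localhost:8080/auth/login \
#   -H "Content-Type: application/json" \
#   -d '{"email": "test@example.com", "password": "pass123"}' | extract_token)
# curl -H "Authorization: Bearer $TOKEN" http://localhost:8080/...
```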
Why not automated:
- Browser-based user interface
- Requires visual validation
- Interactive form submission
- Token persistence in browser storage
- Real-time status updates and polling
- Cross-browser compatibility testing
Manual process:
# Terminal 1: Backend
mvn spring-boot:run
# Terminal 2: Frontend
python3 -m http.server 4173 --directory client
# Browser: http://localhost:4173

Test checklist:
- Landing page rendering
- Signup form submission
- Login flow and redirection
- Image upload interface
- Feed display and updates
- Post deletion
- AI analysis badge updates
- `README.md` (Updated)
  - Complete CI section added
  - Clear explanation of automated vs manual testing
  - Instructions for accessing reports
  - Links to detailed documentation
- `CI-PIPELINE.md` (New)
  - Comprehensive pipeline documentation
  - Step-by-step process breakdown
  - Manual testing procedures
  - Troubleshooting guide
  - Report access instructions
  - Architecture overview
- `CI-QUICKREF.md` (New)
  - Quick reference card
  - Common commands
  - Pre-push checklist
  - Troubleshooting shortcuts
  - Quality targets
- `CI-DIAGRAM.md` (New)
  - Visual pipeline flow
  - ASCII diagrams
  - Stage relationships
  - Manual vs automated comparison
- `scripts/run-ci-locally.sh` (New)
  - Bash script for Linux/macOS
  - Runs complete CI pipeline locally
  - Color-coded output
  - Summary of results
- `scripts/run-ci-locally.ps1` (New)
  - PowerShell script for Windows
  - Same functionality as bash version
  - Windows-specific commands
  - Color output support
- `scripts/html_to_png.sh` (Existing, Enhanced)
  - Converts HTML reports to PNG
  - Error handling improved
  - Multiple search locations for reports
The reports/ directory contains recent CI outputs:
- `checkstyle-report.png` - Style checking visualization
- `branch-report.png` - Code coverage heatmap
- `pmd-report.png` - Static analysis findings
- `api-testing.png` - Manual API test evidence
- `two-users-proof.jpg` - Database testing proof
- `objects-stored-DB.jpg` - Storage bucket proof
These are periodically updated from CI runs and committed for quick reference without needing to download artifacts.
Before pushing code:
- Run local CI validation:
  - Linux/macOS: `./scripts/run-ci-locally.sh`
  - Windows: `.\scripts\run-ci-locally.ps1`
- Fix any errors or warnings
- Push to feature branch
- Check GitHub Actions for CI results
- Address any failures before merging
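The shape of `scripts/run-ci-locally.sh` can be sketched as a step runner that records failures and prints a summary. The stage list in the comments is an assumption about what the real script runs, and the real script adds color output:

```shell
# Run named stages, keep going after failures, and summarize at the end.

failures=0

step() {
  name=$1; shift
  if "$@" >/dev/null 2>&1; then
    printf 'PASS  %s\n' "$name"
  else
    printf 'FAIL  %s\n' "$name"
    failures=$((failures + 1))
  fi
}

# The real script would invoke the Maven stages, roughly:
#   step "compile"    mvn -q clean compile
#   step "unit tests" mvn -q test
#   step "checkstyle" mvn -q checkstyle:checkstyle
#   step "pmd"        mvn -q pmd:pmd
#   printf '%d stage(s) failed\n' "$failures"
```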
On GitHub:
- Navigate to Actions tab
- Click latest workflow run
- View CI summary at bottom
- Download artifacts if needed
Locally:
- Run `./mvnw site`
- Open `target/site/index.html`
- Navigate through report links
E2E Tests:
- Set environment variables (see `.env` example)
- Run with `LIVE_E2E=true mvn test`
API Tests:
- Start backend: `mvn spring-boot:run`
- Use cURL or Postman
- Follow examples in README
Client Tests:
- Start backend and frontend servers
- Open browser to `http://localhost:4173`
- Follow manual checklist
- Test Coverage: ≥ 80%
- Checkstyle Violations: 0
- PMD Critical Issues: 0
- Build Time: < 5 minutes
- Test Success Rate: ≥ 95%
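The ≥ 80% coverage target can be checked mechanically from JaCoCo's CSV export. The column positions below (LINE_MISSED = field 8, LINE_COVERED = field 9) follow the standard `jacoco.csv` layout, but verify them against an actual report before relying on this:

```shell
# Compute overall line coverage (as an integer percentage) from jacoco.csv.

line_coverage() {
  awk -F, 'NR > 1 { missed += $8; covered += $9 }
           END {
             if (missed + covered == 0) { print 0 }
             else { printf "%d\n", (100 * covered) / (missed + covered) }
           }' "$1"
}

# cov=$(line_coverage target/site/jacoco/jacoco.csv)
# [ "$cov" -ge 80 ] || echo "line coverage ${cov}% is below the 80% target"
```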
- Build success rate
- Test execution time
- Coverage trends
- Violation counts
- PMD issue density
- Automated E2E with test database/mock services
- Performance testing (JMeter/Gatling)
- Security scanning (OWASP Dependency Check)
- Automated deployment to staging
- Pull request comment with reports
- Slack/Discord notifications
- Code quality badges in README
- Trend graphs for metrics
- Documentation: Start with `CI-PIPELINE.md`
- Quick Help: Check `CI-QUICKREF.md`
- GitHub Issues: Label with `ci-pipeline`
- Logs: Review GitHub Actions workflow logs
See troubleshooting section in CI-PIPELINE.md for:
- Test failures
- Style violations
- PMD warnings
- Coverage issues
- Artifact access
| Requirement | Status | Implementation |
|---|---|---|
| Automate style checking | ✅ Complete | Checkstyle in CI |
| Automate static analysis | ✅ Complete | PMD in CI |
| Automate unit testing | ✅ Complete | JUnit in CI |
| End-to-end testing | Manual | Explained in README |
| API testing | Manual | Explained in README |
| Include CI reports | ✅ Complete | Reports in reports/ + artifacts |
| README explanation | ✅ Complete | Detailed CI section added |
Per assignment: "If it's not possible to automate the API testing tool (or any other testing and analysis tool), your README should explain."
Our README explains:
- ✅ What is automated (unit tests, style, static analysis)
- ✅ What is manual (E2E, API integration, client UI)
- ✅ Why it's manual (external services, credentials, visual validation)
- ✅ How to run manual tests (detailed instructions)
- ✅ Evidence of manual testing (screenshots in `reports/`)
This CI implementation provides:
- Comprehensive automation of all automatable checks
- Clear documentation of what can't be automated and why
- Easy access to CI reports via GitHub and repository
- Local validation tools for pre-push checks
- Professional quality matching industry standards
The system catches issues early, provides actionable feedback, and maintains high code quality standards while being transparent about limitations.
Implementation Date: November 29, 2025
Author: GitHub Copilot
Status: Complete ✅