242 changes: 90 additions & 152 deletions .github/workflows/dedo-duro-analysis.yml
@@ -1,39 +1,27 @@
 # Dedo-Duro AWS Resource Analysis
 # Automated weekly analysis with on-demand triggering
 
 name: Dedo-Duro AWS Analysis
 
 on:
+  # Run weekly on Monday at 6 AM UTC
   schedule:
-    # Run weekly on Monday at 6 AM UTC
     - cron: '0 6 * * 1'
 
   # Allow manual trigger
   workflow_dispatch:
     inputs:
-      region:
-        description: 'AWS Region to analyze (leave empty for default)'
-        required: false
-        type: string
       resource_types:
-        description: 'Comma-separated resource types (leave empty for all)'
+        description: 'Comma-separated resource types (e.g., ec2,rds,s3)'
         required: false
         type: string
+        default: 'ec2,rds,s3,ebs,lambda,dynamodb'
+      regions:
+        description: 'Comma-separated AWS regions (e.g., us-east-1,us-west-2)'
+        required: false
-      output_format:
-        description: 'Output format'
-        required: false
-        default: 'html'
-        type: choice
-        options:
-          - html
-          - json
-          - csv
-      multi_region:
-        description: 'Analyze all regions'
+        default: 'us-east-1'
+      environment:
+        description: 'Environment filter (prod, test, dev, or empty for all)'
         required: false
-        default: false
-        type: boolean
-      environment_filter:
-        description: 'Environment filter (prod, test, dev)'
-        required: false
-        type: string
         default: ''
 
 env:
   PYTHON_VERSION: '3.11'
@@ -42,165 +30,115 @@ jobs:
   analyze:
     name: Run AWS Resource Analysis
     runs-on: ubuntu-latest
 
     permissions:
-      id-token: write # Required for OIDC authentication
+      id-token: write # For OIDC authentication
       contents: read
 
     steps:
-      - name: Checkout repository
+      - name: Checkout Repository
         uses: actions/checkout@v4
 
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           cache: 'pip'
 
-      - name: Install dependencies
+      - name: Install Dependencies
         run: |
           python -m pip install --upgrade pip
           pip install -r requirements.txt
 
-      - name: Configure AWS credentials
+      - name: Configure AWS Credentials
         uses: aws-actions/configure-aws-credentials@v4
-        env:
-          INPUT_REGION: ${{ inputs.region }}
-          DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}
         with:
           role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
-          aws-region: ${{ inputs.region || secrets.AWS_DEFAULT_REGION || 'us-east-1' }}
+          aws-region: us-east-1
+          # Alternative: Use access keys (less secure)
+          # aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
+          # aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
 
       - name: Run Dedo-Duro Analysis
         id: analysis
-        env:
-          INPUT_REGION: ${{ inputs.region }}
-          INPUT_RESOURCE_TYPES: ${{ inputs.resource_types }}
-          INPUT_OUTPUT_FORMAT: ${{ inputs.output_format }}
-          INPUT_MULTI_REGION: ${{ inputs.multi_region }}
-          INPUT_ENVIRONMENT: ${{ inputs.environment_filter }}
         run: |
-          # Build command with optional parameters using environment variables
-          CMD="python main.py"
-
-          # Add region if specified (validate alphanumeric and hyphens only)
-          if [ -n "$INPUT_REGION" ]; then
-            SAFE_REGION=$(echo "$INPUT_REGION" | grep -E '^[a-z0-9-]+$' || echo "")
-            if [ -n "$SAFE_REGION" ]; then
-              CMD="$CMD --region $SAFE_REGION"
-            fi
-          fi
-
-          # Add resource types if specified (validate alphanumeric, commas, underscores)
-          if [ -n "$INPUT_RESOURCE_TYPES" ]; then
-            SAFE_TYPES=$(echo "$INPUT_RESOURCE_TYPES" | grep -E '^[a-zA-Z0-9_,]+$' || echo "")
-            if [ -n "$SAFE_TYPES" ]; then
-              CMD="$CMD --resource-types $SAFE_TYPES"
-            fi
-          fi
-
-          # Add output format (choice type, already validated)
-          if [ -n "$INPUT_OUTPUT_FORMAT" ]; then
-            CMD="$CMD --output-format $INPUT_OUTPUT_FORMAT"
-          else
-            CMD="$CMD --output-format html"
-          fi
-
-          # Add multi-region flag if enabled
-          if [ "$INPUT_MULTI_REGION" = "true" ]; then
-            CMD="$CMD --multi-region"
-          fi
-
-          # Add environment filter if specified (validate alphanumeric only)
-          if [ -n "$INPUT_ENVIRONMENT" ]; then
-            SAFE_ENV=$(echo "$INPUT_ENVIRONMENT" | grep -E '^[a-zA-Z]+$' || echo "")
-            if [ -n "$SAFE_ENV" ]; then
-              CMD="$CMD --environment $SAFE_ENV"
-            fi
-          fi
-
-          # Run analysis
+          # Set default values
+          RESOURCE_TYPES="${{ github.event.inputs.resource_types || 'ec2,rds,s3,ebs,lambda' }}"
+          REGIONS="${{ github.event.inputs.regions || 'us-east-1' }}"
+          ENVIRONMENT="${{ github.event.inputs.environment || '' }}"
+
+          # Build command
+          CMD="python main.py --resource-types $RESOURCE_TYPES --regions $REGIONS --output-format html,json"
+
+          if [ -n "$ENVIRONMENT" ]; then
+            CMD="$CMD --environment $ENVIRONMENT"
+          fi
+
           echo "Running: $CMD"
-          eval "$CMD"
-
-          # Set output file path
-          REPORT=$(ls aws-optimization-report.* 2>/dev/null | head -1)
-          echo "report_file=$REPORT" >> "$GITHUB_OUTPUT"
-
-      - name: Upload Report Artifact
+          $CMD
+
+          # Set outputs
+          echo "report_date=$(date +%Y%m%d_%H%M%S)" >> $GITHUB_OUTPUT
+
+      - name: Upload HTML Report
         uses: actions/upload-artifact@v4
         with:
-          name: dedo-duro-report-${{ github.run_number }}
+          name: dedo-duro-report-${{ steps.analysis.outputs.report_date }}
           path: |
-            aws-optimization-report.*
-          retention-days: 30
-
-      - name: Upload to S3 (optional)
+            aws_resource_report_*.html
+            aws_resource_report_*.json
+          retention-days: 90
+
+      - name: Upload to S3 (Optional)
         if: ${{ secrets.REPORT_S3_BUCKET != '' }}
-        env:
-          REPORT_FILE: ${{ steps.analysis.outputs.report_file }}
-          S3_BUCKET: ${{ secrets.REPORT_S3_BUCKET }}
         run: |
-          if [ -n "$REPORT_FILE" ] && [ -f "$REPORT_FILE" ]; then
-            TIMESTAMP=$(date +%Y-%m-%d)
-            aws s3 cp "$REPORT_FILE" "s3://${S3_BUCKET}/reports/${TIMESTAMP}/${REPORT_FILE}"
-            echo "Report uploaded to s3://${S3_BUCKET}/reports/${TIMESTAMP}/${REPORT_FILE}"
-          fi
-
-      - name: Create Summary
-        env:
-          REPORT_FILE: ${{ steps.analysis.outputs.report_file }}
-          INPUT_REGION: ${{ inputs.region }}
-          INPUT_OUTPUT_FORMAT: ${{ inputs.output_format }}
-        run: |
-          {
-            echo "## Dedo-Duro Analysis Complete"
-            echo ""
-            echo "**Report:** \`${REPORT_FILE:-no report}\`"
-            echo "**Region:** ${INPUT_REGION:-default}"
-            echo "**Format:** ${INPUT_OUTPUT_FORMAT:-html}"
-            echo ""
-            echo "Download the report from the Artifacts section above."
-          } >> "$GITHUB_STEP_SUMMARY"
+          aws s3 cp aws_resource_report_*.html s3://${{ secrets.REPORT_S3_BUCKET }}/reports/
+          aws s3 cp aws_resource_report_*.json s3://${{ secrets.REPORT_S3_BUCKET }}/reports/
+
+      - name: Post Summary to PR/Issue
+        if: github.event_name == 'workflow_dispatch'
+        uses: actions/github-script@v7
+        with:
+          script: |
+            const fs = require('fs');
+
+            // Read JSON report for summary
+            const files = fs.readdirSync('.').filter(f => f.endsWith('.json') && f.startsWith('aws_resource_report'));
+            if (files.length > 0) {
+              const report = JSON.parse(fs.readFileSync(files[0], 'utf8'));
+
+              let summary = `## Dedo-Duro Analysis Complete\n\n`;
+              summary += `**Date:** ${new Date().toISOString()}\n`;
+              summary += `**Regions:** ${{ github.event.inputs.regions || 'us-east-1' }}\n\n`;
+              summary += `### Summary\n`;
+
+              if (report.summary) {
+                summary += `- **Total Resources:** ${report.summary.total_resources || 'N/A'}\n`;
+                summary += `- **Potential Savings:** $${(report.summary.total_potential_savings || 0).toLocaleString()}/month\n`;
+              }
+
+              core.summary.addRaw(summary).write();
+            }
 
   notify:
     name: Send Notifications
     needs: analyze
     runs-on: ubuntu-latest
     if: always()
 
     steps:
-      - name: Send Slack Notification (optional)
+      - name: Notify Slack
         if: ${{ secrets.SLACK_WEBHOOK_URL != '' }}
-        uses: slackapi/slack-github-action@v1.25.0
+        uses: 8398a7/action-slack@v3
+        with:
+          status: ${{ needs.analyze.result }}
+          fields: repo,message,commit,author,action,eventName,ref,workflow
         env:
           SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
-          SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
-          ANALYZE_RESULT: ${{ needs.analyze.result }}
-          RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
-        with:
-          payload: |
-            {
-              "text": "Dedo-Duro AWS Analysis Complete",
-              "blocks": [
-                {
-                  "type": "header",
-                  "text": {
-                    "type": "plain_text",
-                    "text": "Dedo-Duro AWS Analysis Report"
-                  }
-                },
-                {
-                  "type": "section",
-                  "fields": [
-                    {
-                      "type": "mrkdwn",
-                      "text": "*Status:*\n${{ needs.analyze.result }}"
-                    },
-                    {
-                      "type": "mrkdwn",
-                      "text": "*Run:*\n<${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|View Details>"
-                    }
-                  ]
-                }
-              ]
-            }
+
+      - name: Notify Teams
+        if: ${{ secrets.TEAMS_WEBHOOK_URL != '' && needs.analyze.result != 'success' }}
+        run: |
+          curl -H 'Content-Type: application/json' \
+            -d '{"@type":"MessageCard","title":"Dedo-Duro Analysis","text":"Analysis completed with status: ${{ needs.analyze.result }}"}' \
+            ${{ secrets.TEAMS_WEBHOOK_URL }}
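A side note on the change above: the old `run:` step validated each input with a `grep -E` allow-list before splicing it into the command, while the new step interpolates `${{ github.event.inputs.* }}` directly into the shell — a known script-injection vector in GitHub Actions. A rough sketch of the same allow-list idea in the project's own language (the function and pattern names here are illustrative, not part of the repository):

```python
import re
import shlex

# Allow-list patterns mirroring the grep -E checks the workflow used to run
PATTERNS = {
    "region": re.compile(r"^[a-z0-9-]+$"),
    "resource_types": re.compile(r"^[a-zA-Z0-9_,]+$"),
    "environment": re.compile(r"^[a-zA-Z]+$"),
}


def build_command(region="", resource_types="", environment=""):
    """Build the analyzer command line, silently dropping any input
    that fails its allow-list check (as the original workflow did)."""
    args = ["python", "main.py"]
    for flag, name, value in [
        ("--region", "region", region),
        ("--resource-types", "resource_types", resource_types),
        ("--environment", "environment", environment),
    ]:
        if value and PATTERNS[name].match(value):
            args += [flag, value]
    return shlex.join(args)  # shell-safe quoting for the final string
```

For example, `build_command(region="us-east-1")` keeps the flag, while a value like `us-east-1; rm -rf /` fails the pattern and is dropped rather than reaching the shell.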
55 changes: 50 additions & 5 deletions README.md
@@ -220,6 +220,25 @@ Comprehensive cost optimization for AWS AI/ML services:
 - **CircleCI Config**: Configuration for CircleCI pipelines
 - **Artifact Upload**: Automatic report upload to S3 or CI artifacts
 
+#### Web Dashboard (Enterprise)
+- **Real-time Monitoring**: Flask-based web dashboard for live analysis status
+- **REST API**: Full API for triggering analysis and retrieving results
+- **Report History**: View and compare historical analysis reports
+- **Alert Configuration**: Configure custom alert thresholds via web interface
+
+#### Notifications (Enterprise)
+- **Slack Integration**: Send alerts and reports to Slack channels via webhooks
+- **Microsoft Teams**: Teams channel integration for notifications
+- **Custom Alerts**: Configurable thresholds for cost, security, and idle resources
+- **Alert Severity Levels**: Critical, warning, and info classifications
+
+#### Auto-Remediation (Experimental)
+- **Safe Operations Only**: Tagging and snapshot operations by default
+- **Dry-Run Mode**: All actions simulated unless explicitly enabled
+- **Approval Workflow**: High-risk actions require manual approval
+- **Audit Logging**: Complete audit trail of all remediation actions
+- **Risk Levels**: SAFE, LOW, MEDIUM, HIGH, CRITICAL classifications
+
 ### Advanced Capabilities & Reporting
 
 - **Multi-Region & China Region Support:** Analyzes resources across multiple specified AWS regions simultaneously, including AWS China regions (`cn-north-1`, `cn-northwest-1`).
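The Slack integration described above delivers reports via incoming webhooks. A minimal sketch of what such a helper might look like — the function names and payload shape are assumptions for illustration, not the actual `notifications/slack.py` API:

```python
import json
from urllib import request


def build_slack_payload(status, run_url):
    """Assemble a Slack Block Kit payload summarizing an analysis run."""
    return {
        "text": "Dedo-Duro AWS Analysis Complete",
        "blocks": [
            {
                "type": "header",
                "text": {"type": "plain_text", "text": "Dedo-Duro AWS Analysis Report"},
            },
            {
                "type": "section",
                "fields": [
                    {"type": "mrkdwn", "text": f"*Status:*\n{status}"},
                    {"type": "mrkdwn", "text": f"*Run:*\n<{run_url}|View Details>"},
                ],
            },
        ],
    }


def notify_slack(webhook_url, status, run_url):
    """POST the payload to a Slack incoming webhook (Slack replies 'ok' on success)."""
    data = json.dumps(build_slack_payload(status, run_url)).encode()
    req = request.Request(
        webhook_url, data=data, headers={"Content-Type": "application/json"}
    )
    return request.urlopen(req)
```

Separating payload construction from delivery keeps the formatting logic unit-testable without a live webhook.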
@@ -468,6 +487,26 @@ flowchart LR
 │   └── dedo-duro-analysis.yml  # Automated analysis workflow
 ├── .circleci/                  # CircleCI configuration (v12.0)
 │   └── config.yml              # CircleCI pipeline config
+├── web/                        # Web Dashboard (v12.0-Enterprise)
+│   ├── app.py                  # Flask application
+│   ├── templates/              # HTML templates
+│   │   └── index.html          # Dashboard template
+│   └── static/                 # CSS/JS assets
+│       ├── style.css           # Dashboard styles
+│       └── app.js              # Dashboard JavaScript
+├── notifications/              # Notification System (v12.0-Enterprise)
+│   ├── __init__.py
+│   ├── slack.py                # Slack webhook integration
+│   ├── teams.py                # Microsoft Teams integration
+│   └── alerting.py             # Alert manager with thresholds
+├── remediation/                # Auto-Remediation (v12.0-Enterprise)
+│   ├── __init__.py
+│   ├── base.py                 # Base remediation framework
+│   ├── ec2_remediation.py      # EC2 remediation actions
+│   ├── rds_remediation.py      # RDS remediation actions
+│   └── s3_remediation.py       # S3 remediation actions
+├── docs/                       # Documentation
+│   └── kubernetes_permissions.md  # K8s permissions guide
 └── utils/                      # Utility functions (shared)
     ├── __init__.py
     ├── aws_utils.py            # AWS-specific utilities
@@ -1396,13 +1435,19 @@ Key milestones: v2.0 (architecture), v3.0 (security), v4.0 (Spot), v5.0 (orphan)
 - ~~Reading files with tags and metadata to facilitate the resource grouping process~~ → **Tag-based grouping** (`--grouping-tags`)
 - ~~Create the all-in option - Run for a set of accounts at the same time~~ → **Multi-Account Analysis** (`--accounts-file`, `--all-accounts`)
 
+### Completed in v12.0-Enterprise ✅
+
+- ~~Web interface for real-time monitoring~~ → **Web Dashboard** (`web/app.py` - Flask-based)
+- ~~Auto-remediation capabilities (experimental)~~ → **Remediation Framework** (`remediation/` module)
+- ~~Integration with Slack/Teams for notifications~~ → **Notification System** (`notifications/` module)
+- ~~Custom alerting thresholds~~ → **Alert Manager** (`notifications/alerting.py`)
+- ~~Kubernetes permissions documentation~~ → **Kubernetes Permissions** (`docs/kubernetes_permissions.md`)
+
 ### Pending
 
+- List new permissions required for new functions, such as Kubernetes (partial - see `docs/kubernetes_permissions.md`)
-- Web interface for real-time monitoring
-- Auto-remediation capabilities (experimental)
-- Integration with Slack/Teams for notifications
-- Custom alerting thresholds
+- Enhanced web dashboard with real-time WebSocket updates
+- Remediation approval workflow via web interface
+- Historical trend analysis and forecasting
 
---
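The auto-remediation behavior the README describes — dry-run by default, risk classifications, approval required for high-risk actions — can be sketched roughly as follows. The class and function names are illustrative only, not the actual API of the `remediation/` module:

```python
from dataclasses import dataclass
from enum import Enum


class RiskLevel(Enum):
    """Risk classifications named in the feature list."""
    SAFE = 1
    LOW = 2
    MEDIUM = 3
    HIGH = 4
    CRITICAL = 5


@dataclass
class RemediationAction:
    resource_id: str
    description: str
    risk: RiskLevel


def execute(action, dry_run=True, approved=False):
    """Simulate unless dry-run is explicitly disabled; gate risky actions
    behind manual approval, as the approval-workflow feature describes."""
    if dry_run:
        return f"[DRY-RUN] would apply: {action.description} on {action.resource_id}"
    if action.risk.value >= RiskLevel.HIGH.value and not approved:
        return f"[BLOCKED] {action.description} requires manual approval"
    return f"[APPLIED] {action.description} on {action.resource_id}"
```

With these defaults, a safe tagging action still only simulates until `dry_run=False` is passed, and a `HIGH`-risk action additionally needs `approved=True` — two independent safety gates.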
