94 changes: 94 additions & 0 deletions .github/workflows/opportunity-crawler.yml
@@ -0,0 +1,94 @@
# VolunteerConnect Hub - Opportunity Crawler Workflow
# ===================================================
# Crawls volunteer opportunities from multiple sources and updates the database
# Powered by AGI Board: OpportunityCrawlerAGI

name: Crawl Volunteer Opportunities

on:
  schedule:
    # Run daily at 6 AM UTC (1 AM EST / 2 AM EDT)
    - cron: '0 6 * * *'
  workflow_dispatch:
    inputs:
      source:
        description: 'Source to crawl (all, volunteermatch, idealist, etc.)'
        required: false
        default: 'all'
      dry_run:
        description: 'Dry run (do not update database)'
        required: false
        default: 'false'

# The commit step pushes with the default GITHUB_TOKEN, which needs write access
permissions:
  contents: write

jobs:
  crawl-opportunities:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          pip install requests beautifulsoup4 feedparser supabase python-dotenv

      - name: Run Opportunity Crawler
        env:
          SUPABASE_URL: ${{ secrets.SUPABASE_URL }}
          SUPABASE_SERVICE_KEY: ${{ secrets.SUPABASE_SERVICE_KEY }}
          VOLUNTEERMATCH_API_KEY: ${{ secrets.VOLUNTEERMATCH_API_KEY }}
        run: |
          # inputs are empty on scheduled runs, so fall back to 'all'
          python scripts/crawl_opportunities.py \
            --source ${{ github.event.inputs.source || 'all' }} \
            ${{ github.event.inputs.dry_run == 'true' && '--dry-run' || '' }}

      - name: Update opportunities database
        if: ${{ github.event.inputs.dry_run != 'true' }}
        env:
          SUPABASE_URL: ${{ secrets.SUPABASE_URL }}
          SUPABASE_SERVICE_KEY: ${{ secrets.SUPABASE_SERVICE_KEY }}
        run: |
          python scripts/update_opportunities_db.py

      - name: Commit updated opportunities JSON
        if: ${{ github.event.inputs.dry_run != 'true' }}
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Action - Opportunity Crawler"
          git add data/opportunities.json
          git diff --staged --quiet || git commit -m "Update volunteer opportunities [automated]"
          git push

  generate-recommendations:
    needs: crawl-opportunities
    runs-on: ubuntu-latest
    if: ${{ github.event.inputs.dry_run != 'true' }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          ref: main

      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          pip install supabase python-dotenv

      - name: Generate recommendations for active users
        env:
          SUPABASE_URL: ${{ secrets.SUPABASE_URL }}
          SUPABASE_SERVICE_KEY: ${{ secrets.SUPABASE_SERVICE_KEY }}
        run: |
          python scripts/generate_recommendations.py
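The workflow above assumes `scripts/crawl_opportunities.py` accepts a `--source` flag and an optional `--dry-run` switch. A minimal sketch of that command-line surface is below; only those two flags come from the workflow definition, while the `dedupe` helper and the record shape are hypothetical illustrations of what the script might do.

```python
# Sketch of the CLI surface that the opportunity-crawler workflow assumes
# scripts/crawl_opportunities.py exposes. Only --source and --dry-run are
# taken from the workflow; dedupe() and its record shape are hypothetical.
import argparse


def parse_args(argv=None):
    """Parse the flags the GitHub Actions workflow passes to the crawler."""
    parser = argparse.ArgumentParser(description="Crawl volunteer opportunities")
    parser.add_argument("--source", default="all",
                        help="Source to crawl (all, volunteermatch, idealist, ...)")
    parser.add_argument("--dry-run", action="store_true",
                        help="Crawl and report, but do not touch the database")
    return parser.parse_args(argv)


def dedupe(opportunities):
    """Drop duplicate listings, keyed on their URL, preserving order."""
    seen, unique = set(), []
    for opp in opportunities:
        if opp["url"] not in seen:
            seen.add(opp["url"])
            unique.append(opp)
    return unique
```

Because `--dry-run` maps to the attribute `dry_run`, the workflow's string input `dry_run` lines up naturally with the flag name.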
3 changes: 0 additions & 3 deletions Gemfile
@@ -7,9 +7,6 @@ source "https://rubygems.org"
# Jekyll version
gem "jekyll", "~> 4.3"

# Jekyll plugins
group :jekyll_plugins do
gem "jekyll-feed", "~> 0.12"
37 changes: 31 additions & 6 deletions README.md
@@ -4,6 +4,10 @@

🔗 **Live Site**: [https://pythpythpython.github.io/volunteer-connect-hub/](https://pythpythpython.github.io/volunteer-connect-hub/)

🔒 **Production Status**: Connected to Supabase backend

📊 **Opportunities**: 40+ curated volunteer listings from 10+ organizations

---

## Overview
@@ -87,8 +91,14 @@ volunteer-connect-hub/
│ ├── css/ # Stylesheets
│ └── js/ # JavaScript (auth, database, app)
├── data/
│ └── opportunities.json # 40+ curated opportunities
├── scripts/ # Automation scripts
│ ├── crawl_opportunities.py
│ ├── update_opportunities_db.py
│ ├── generate_recommendations.py
│ └── volunteermatch_api.py
├── agi_boards/ # AGI Board implementations
│ ├── boards_config.json
│ ├── user_profile_board.py
│ ├── database_board.py
│ ├── opportunity_crawler_board.py
@@ -97,6 +107,7 @@ volunteer-connect-hub/
│ └── ux_testing_board.py
├── .github/workflows/ # GitHub Actions
│ ├── deploy.yml
│ ├── opportunity-crawler.yml
│ ├── ux-testing.yml
│ └── data-backup.yml
├── index.html # Home page
@@ -106,6 +117,8 @@
├── ai-assistant.html # AI tools
├── onboarding.html # Profile questionnaire
├── docs/ # Documentation
├── supabase_schema.sql # Database schema
├── SUPABASE_SETUP.md # Setup guide
├── _config.yml # Jekyll configuration
└── README.md
```
@@ -150,16 +163,28 @@ This data is used to:

## Opportunity Sources

40+ real volunteer opportunities are aggregated from:

- **VolunteerMatch** - volunteermatch.org (API integration available)
- **Idealist** - idealist.org
- **Habitat for Humanity** - habitat.org
- **American Red Cross** - redcross.org
- **AmeriCorps** - americorps.gov
- **Feeding America** - feedingamerica.org

- **Big Brothers Big Sisters** - bbbs.org
- **Meals on Wheels** - mealsonwheelsamerica.org
- **Special Olympics** - specialolympics.org
- **Boys & Girls Clubs** - bgca.org
- **United Way** - unitedway.org
- **Crisis Text Line** - crisistextline.org

### Opportunity Crawler Workflow

Data is updated automatically via the `opportunity-crawler.yml` GitHub Actions workflow:
- Runs daily at 6 AM UTC
- Can be triggered manually
- Fetches from RSS feeds and APIs
- Updates Supabase database
- Generates fresh recommendations for users
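The workflow's RSS fetching can be sketched as a small parse-and-map step. The real crawler installs `feedparser` for this; the sketch below uses only the standard library so it is self-contained, and the opportunity field names are assumptions rather than the project's actual schema.

```python
# Sketch of mapping RSS <item> entries to opportunity records. The production
# workflow installs feedparser for real-world feeds; this stdlib version shows
# the same shape. Field names (title/url/description/source) are assumptions.
import xml.etree.ElementTree as ET


def entries_to_opportunities(rss_xml, source_name):
    """Parse an RSS document and return one dict per <item>."""
    root = ET.fromstring(rss_xml)
    opportunities = []
    for item in root.iter("item"):
        opportunities.append({
            "title": item.findtext("title", default=""),
            "url": item.findtext("link", default=""),
            "description": item.findtext("description", default=""),
            "source": source_name,
        })
    return opportunities
```

Deduplicating across sources (for example by URL) would happen after this step, before anything is written to Supabase.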

---

36 changes: 36 additions & 0 deletions SUPABASE_SETUP.md
@@ -146,7 +146,43 @@

This is more than enough for most volunteer organizations. Upgrade only if you exceed these limits.

## Production Deployment

Once you've completed the setup:

1. **Deploy to GitHub Pages**: Push to the main branch and the deploy workflow will run automatically.

2. **Enable Opportunity Crawler**: The `opportunity-crawler.yml` workflow runs daily to fetch new opportunities. You can also trigger it manually.

3. **Optional: VolunteerMatch API**: For live opportunity data, add `VOLUNTEERMATCH_API_KEY` to your secrets. Apply for API access at [VolunteerMatch Business](https://www.volunteermatch.org/business/).

4. **Monitor Data**: Check the Supabase Table Editor to see:
- User profiles and activity
- Hours logged
- Scheduled events
- Generated letters and applications
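Besides the Table Editor, the same tables can be read over Supabase's auto-generated REST API (PostgREST), which is handy for quick scripted spot checks. The sketch below builds such a request with the standard library; the table name `profiles` is an assumption, so check `supabase_schema.sql` for the real table names.

```python
# Sketch of reading a Supabase table via its auto-generated REST API
# (PostgREST) instead of the Table Editor. The table name used in the example
# is an assumption; see supabase_schema.sql for the actual schema.
import json
import urllib.request


def build_table_request(supabase_url, service_key, table, select="*", limit=10):
    """Build an authenticated GET request for the first rows of a table."""
    url = f"{supabase_url}/rest/v1/{table}?select={select}&limit={limit}"
    return urllib.request.Request(url, headers={
        "apikey": service_key,
        "Authorization": f"Bearer {service_key}",
    })


def fetch_rows(request):
    """Execute the request; PostgREST returns a JSON array of row objects."""
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

Keep the service key server-side only (GitHub Actions secrets); the browser client should use the anon key.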

## Workflows

| Workflow | Purpose | Schedule |
|----------|---------|----------|
| `deploy.yml` | Build and deploy Jekyll site | On push to main |
| `opportunity-crawler.yml` | Fetch new volunteer opportunities | Daily at 6 AM UTC |
| `data-backup.yml` | Backup user data | Weekly |

## AGI Board Integration

The platform uses specialized AGI boards for:

- **OpportunityCrawlerAGI**: Fetches and curates opportunities from multiple sources
- **RecommendationAGI**: Generates personalized opportunity matches
- **UserProfileAGI**: Manages comprehensive volunteer profiles
- **LinguaChartAGI**: Powers AI letter and email writing

See `agi_boards/boards_config.json` for quality metrics and board assignments.

## Support

- [Supabase Documentation](https://supabase.com/docs)
- [VolunteerConnect Hub Issues](https://github.com/pythpythpython/volunteer-connect-hub/issues)
- [VolunteerMatch API](https://www.volunteermatch.org/business/api/)
4 changes: 2 additions & 2 deletions assets/js/database.js
@@ -15,8 +15,8 @@
// ============================================
// CONFIGURATION - Replace with your Supabase project details
// ============================================
const SUPABASE_URL = 'https://njnabnhnuwrzcpncdtuu.supabase.co';
const SUPABASE_ANON_KEY = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6Im5qbmFibmhudXdyemNwbmNkdHV1Iiwicm9sZSI6ImFub24iLCJpYXQiOjE3NjUzMzcxODAsImV4cCI6MjA4MDkxMzE4MH0.cIw0wy3Lb4LuIBXICju_n9oxPhTqE8btr5JJCUe8HrY';

// ============================================
// DATABASE CLIENT