The site serves from two places simultaneously:
- Primary — Azure Static Web App at https://destroyallhumans.ai (custom domain). Deployed by `.github/workflows/azure-static-web-apps-polite-sea-05966b00f.yml`. SPA fallback and headers live in `staticwebapp.config.json`. Built with `homepage: "."` so assets load from the root.
- Secondary — GitHub Pages at https://chiefinnovator.github.io/destroyallhumans/. Deployed by `.github/workflows/deploy-pages.yml` with `PUBLIC_URL=/destroyallhumans`. Pages source is "GitHub Actions" (no `gh-pages` branch, no committed build artifacts).
The router in `src/App.js` picks its basename at runtime based on `window.location.hostname`, so internal links work on both hosts.
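The runtime basename switch might look roughly like this. The actual check in `src/App.js` is not shown in this document, so the hostname test below is an assumption:

```javascript
// Sketch of a runtime basename switch for react-router's BrowserRouter.
// The exact condition used in src/App.js may differ; assuming here that
// any *.github.io hostname means the GitHub Pages deployment.
function getBasename(hostname) {
  // GitHub Pages serves the app under /destroyallhumans;
  // the custom domain serves it from the root.
  return hostname.endsWith('github.io') ? '/destroyallhumans' : '/';
}

// Usage (inside the app):
// <BrowserRouter basename={getBasename(window.location.hostname)}>
```

Because the basename is resolved per-request in the browser, the same build artifact works unchanged on both origins.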
- Project requirements
- Logo assets
- Robot personas
- React app (CRA, react-scripts 5)
- Component/style/data directory layout
- Logo assets copied and optimized
- GitHub repository — ChiefInnovator/destroyallhumans
- Homepage with message display
- Pagination (5 days at a time)
- Monthly archive
- Robot persona integration based on tone
- Responsive layout
- Legal pages (Terms, Privacy, Cookies)
- Copyright notice
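The "5 days at a time" pagination above could be sketched as follows. The message shape (a `date` field) and the grouping rule are assumptions for illustration, not the actual implementation:

```javascript
// Hypothetical sketch: page through messages five distinct days at a time,
// newest days first. Assumes each message carries an ISO date string.
function pageOfDays(messages, page, daysPerPage = 5) {
  // Collect the distinct days present, newest first.
  const days = [...new Set(messages.map((m) => m.date))].sort().reverse();
  // Pick the window of days for this page, then keep matching messages.
  const slice = days.slice(page * daysPerPage, (page + 1) * daysPerPage);
  return messages.filter((m) => slice.includes(m.date));
}
```

Paging by day rather than by message count keeps both daily messages together on one page, which matches the twice-daily generation schedule.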
- .github/workflows/generate-messages.yml — twice daily (09:00 and 21:00 UTC)
- JSON storage under `public/data/`
- Scripts in `scripts/`
- `scripts/generateMessage.js`
- `OPENAI_API_KEY` stored as GitHub secret
- Persona selection by tone
- Meta tag in public/index.html
- ads.txt
- Ad slot at top of page
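Persona selection by tone (noted in the automation items above) might reduce to a simple lookup. The tone and persona names below are placeholders, not the real data used in `scripts/generateMessage.js`:

```javascript
// Hedged sketch of a tone-to-persona mapping. These keys and values are
// invented for illustration; the real personas live in the project's data.
const PERSONAS_BY_TONE = {
  menacing: 'warbot',
  sarcastic: 'snarkbot',
  cheerful: 'helperbot',
};

function pickPersona(tone) {
  // Fall back to a default persona for unrecognized tones.
  return PERSONAS_BY_TONE[tone] ?? 'defaultbot';
}
```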
- Azure Static Web App provisioned and wired up
- Custom domain `destroyallhumans.ai` on the SWA
- GitHub Pages parallel deployment (workflow-based)
- Open: verify the https://destroyallhumans.ai TLS handshake. At the time of this writing, `curl https://destroyallhumans.ai/` returns `sslv3 alert handshake failure`. DNS resolves to a Cloudflare shared IP, but no Cloudflare zone appears to own the handshake. Confirm the custom domain is still bound to the SWA and that its managed cert is valid.
- `<title>`, description, keywords, canonical, robots meta
- Open Graph + Twitter card with 1200x630 `og-image.png`
- JSON-LD: `WebSite`, `CreativeWork`, `Organization`, `FAQPage`
- `public/sitemap.xml` — home + legal pages
- public/robots.txt — broad AI crawler allowlist (GPTBot, ClaudeBot, PerplexityBot, Google-Extended, Applebot-Extended, CCBot, cohere-ai, Meta-ExternalAgent, etc.)
- public/llms.txt — llmstxt.org format summary + key links
- `text/plain` MIME + SPA exclude for `llms.txt`, `robots.txt`, `ads.txt` in `staticwebapp.config.json`
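The `staticwebapp.config.json` entries for the `text/plain` MIME type and the SPA-fallback excludes presumably look something like this sketch; the exclude list is an assumption based on the files named above:

```json
{
  "navigationFallback": {
    "rewrite": "/index.html",
    "exclude": ["/robots.txt", "/llms.txt", "/ads.txt", "/data/*"]
  },
  "mimeTypes": {
    ".txt": "text/plain"
  }
}
```

Excluding these paths keeps the SPA rewrite from swallowing crawler-facing text files, and the `mimeTypes` entry ensures `.txt` files are served as plain text rather than the host default.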
- Local build passes (`npm run build`)
- After next push: confirm the new `deploy-pages.yml` workflow publishes and the site renders at both URLs with working nav
- Confirm `destroyallhumans.ai` TLS issue (see section 7)
- Lighthouse / Core Web Vitals pass