Checkmate is a website built by teammate to organize puzzles during Mystery Hunt.
To start a server or to develop, you will need Docker.
You will also need credentials for Discord and Google Drive.
Copy SECRETS.template.yaml to SECRETS.yaml. You will need to populate its
values.
To override values in prod.default.env, copy prod.override.template.env to
prod.override.env and update its values.
- Create or select a project here.
- Enable the Drive API (see Instructions).
- Do the same for the Sheets API.
- Create a service account.
- Create a key and download the json to `credentials/drive-credentials.json`. This filename should match `SECRETS.yaml`.
- Set up an OAuth 2 client application for Google.
- Use a "web" client type.
- Set the authorized redirect urls to `https://localhost:8081/accounts/google/login/callback/` and `https://example.org/accounts/google/login/callback/`.
- Under Audience, add users who should be able to give credentials to become the owner of puzzle sheets to the "Test users".
- Add the OAuth client ID and secret to `SECRETS.yaml`.
- Create a Discord application here.
- In OAuth2, add `https://localhost:8081/accounts/discord/login/callback/` and `https://example.org/accounts/discord/login/callback/` to the redirects list.
- You can use the OAuth2 URL generator to create an invite link to add your Discord bot to your server. You need these scopes and should end up with the url `https://discord.com/oauth2/authorize?client_id=<YOUR_BOT_CLIENT_ID>&permissions=17828880&integration_type=0&scope=bot` (replacing `<YOUR_BOT_CLIENT_ID>`).
- Checkmate supports webhooks for new puzzle alerts and solved puzzle alerts. Create these webhooks in your Discord server settings (Integrations -> New Webhook), and then add the webhook urls to the Bot config in the Django admin settings.
You can rent or host whatever server you like and it should work fine as long as it has Docker.
Here is one example from 2026 for posterity.
- I bought a Linode compute instance running Ubuntu and installed Docker on it. Docker pretty much takes care of everything, so there's no need to install e.g. PostgreSQL separately.
- I added a DNS A record for our domain, say `example.org`, pointing to the IP address of our Linode, so we didn't have to pass IP addresses around.
- On the Linode, I created a user `mate` with sudo access and membership in the `docker` group, and worked as that user (instead of `root`).
- I cloned this repository under `~/checkmate` on the server.
- I ran the `initialize_prod` script on the server and created a Django superuser as below.
- I also went into `/admin` and created a normal user named `teammate` with a password, so people who didn't want to use Discord OAuth could still connect with that username and password.
People on the tech team added their SSH keys to `~/.ssh/authorized_keys` for ease of access.
An example host config on the local computer:

```
Host checkmate
    User mate
    HostName example.org
    IdentityFile ~/.ssh/id_ed25519
```
In fact, set up this way you no longer need the password for `mate`,
since everyone should be using public-key authentication anyway.
You probably then want to disable root access, e.g. see
https://gist.github.com/eliangcs/337beeb3e34e16f617d0
To run in prod mode, run the following:

```shell
./scripts/initialize_prod

# This will create an admin user with access to the Django admin panel.
docker compose exec app /app/backend/manage.py createsuperuser --username admin  # will prompt to pick a password
```

To run in dev mode, run the following:
```shell
./scripts/initialize_dev

# This will create an admin user with access to the Django admin panel.
docker compose exec app /app/backend/manage.py createsuperuser --username admin  # will prompt to pick a password
```

Note that the checkmate-extension zip file is not created in dev mode, but it
exists (unzipped) in the repository. The build_extension script for Firefox
requires credentials for a Firefox developer account.
You can also run prod mode locally (which can test the static build and prod Caddy setup, but not DNS or OAuth redirection) with:

```shell
./scripts/initialize_prod --localhost
```

To stop the server:
```shell
./scripts/teardown
```

The admin page is located at https://example.org/admin. You shouldn't need
to touch most of this in most cases. This is mainly helpful for setting up
puzzle scraping, Discord alerting, and deleting accidental / duplicate puzzles
and rounds.
There should be at most one instance each of the Bot config and the Hunt config. (This is enforced by the backend, so you won't be able to mess it up.) Defaults are used if the respective config does not exist.
The primary config you might want to edit is the Bot config. This tells the scraper how to log in and which page has the list of puzzles. To enable scraping, expect to write some Python to parse the puzzles out of that page. In the unexpected case that the Hunt site is exceedingly complex, you may also need to fiddle with the scraper so that it authenticates as your team correctly. The Bot config also contains the checkbox that enables scraping for real.
The Bot config is also where you can define the webhook urls that can be used to send Discord alerts when new puzzles are created and when puzzles are marked as solved.
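For background, a Discord webhook is just an HTTPS endpoint that accepts a JSON POST with a `content` field. The sketch below is a standalone illustration of what such an alert amounts to, not Checkmate's actual alert code; the message wording and function names are invented.

```python
import json
import urllib.request


def build_alert(puzzle_name, puzzle_url, solved=False):
    """Build a JSON payload for a new-puzzle or solved-puzzle alert.

    The message format here is made up for illustration.
    """
    verb = "Solved" if solved else "New puzzle"
    return {"content": f"{verb}: {puzzle_name} <{puzzle_url}>"}


def send_alert(webhook_url, payload):
    """POST the payload to a Discord webhook URL (obtained from your
    server's Integrations -> Webhooks settings)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```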
The Hunt config generally does not need to be created. It does have some niche options like coloring puzzles on the main sheet based on tags, if you also define tags in the admin panel for a round.
Rounds and puzzles can only be deleted from the admin panel. Additionally, you can update round / puzzle metadata here. Note that puzzle scraping uses a mix of round name and round link for determining which rounds are new, and uses the puzzle link for determining which puzzles are new. Messing with the name or link values could cause duplicates to be created the next time the scraper is run.
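To make the warning concrete, here is a hypothetical sketch (not Checkmate's actual code) of what link-based deduplication implies: renaming a puzzle is harmless, but changing its link makes the scraper see it as brand new.

```python
def find_new_puzzles(existing, scraped):
    """Hypothetical sketch: a scraped puzzle counts as new only if its
    link has not been seen before, regardless of its name.

    existing, scraped: lists of {"name": ..., "link": ...} dicts.
    """
    known_links = {p["link"] for p in existing}
    return [p for p in scraped if p["link"] not in known_links]
```

Under this model, editing a stored puzzle's link means the next scrape no longer matches it against the known links and creates a duplicate.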
If you implement an all puzzles page parser, Checkmate can auto-scrape for new puzzles every minute. (This needs to be set up during the Hunt because you need to know how the Hunt site is structured.)
This is done in backend/services/scraper.py and backend/services/scraper_examples.py.
In backend/services/scraper.py, you will need
to make async_parse parse the all puzzles page and return a dict of
rounds and puzzles. See that file for more info.
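For illustration only, a parser in the spirit of `async_parse` might look like the sketch below. The real return shape is documented in `backend/services/scraper.py`; the HTML structure assumed here (round names in `h2` links, puzzle links with `class="puzzle"`) is invented and would need to be adapted to the actual Hunt site.

```python
import re


async def async_parse(page_html):
    """Hypothetical all-puzzles-page parser (illustrative only).

    Assumes each round is an <h2> containing a link, followed by puzzle
    anchors marked with class="puzzle" -- adjust to the real Hunt site.
    """
    rounds = []
    for round_match in re.finditer(
            r'<h2><a href="(?P<link>[^"]+)">(?P<name>[^<]+)</a></h2>'
            r'(?P<body>.*?)(?=<h2>|\Z)', page_html, re.S):
        puzzles = [
            {"name": m.group(2), "link": m.group(1)}
            for m in re.finditer(
                r'<a class="puzzle" href="([^"]+)">([^<]+)</a>',
                round_match.group("body"))
        ]
        rounds.append({
            "name": round_match.group("name"),
            "link": round_match.group("link"),
            "puzzles": puzzles,
        })
    return {"rounds": rounds}
```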
As you update the scraper, you can visit /api/scraper to check the current
results. It should show the rounds and puzzles it was able to parse and would
create, or the traceback if there was an error.
It's recommended to try this out locally at https://localhost:8081/api/scraper.
Although the Django server does not normally autoreload on edits, the
/api/scraper endpoint is set up to run the scraper function in a new process
so that it picks up local python changes. Thus you can make changes to
scraper_examples.py and reload the
API page to test it until it works.
Hopefully this section is not needed; prefer to debug by visiting /api/scraper.
To test autocreating puzzles, use this command:

```shell
docker compose exec -w /app/backend app celery -A checkmate call services.tasks.auto_create_new_puzzles
```

This will be a dry_run by default and will print a task id. To check the result, use:

```shell
docker compose exec -w /app/backend app celery -A checkmate result [TASK_ID]
```

or, if there was an error:

```shell
docker compose exec -w /app/backend app celery -A checkmate result --traceback [TASK_ID]
```

To migrate a postgres database between servers:
```shell
# on old server
docker compose exec -u postgres postgres pg_dump postgres > dumpfile.dump
# on new server
docker compose exec -T -u postgres postgres psql -U postgres < dumpfile.dump
```

The Checkmate browser extension is needed to display Discord within the Checkmate website. Sometimes this is also needed to display the Hunt website within Checkmate (depending on the Hunt website's x-frame-options).
To build a new version of the Checkmate browser extension for Chrome, run:

```shell
docker compose exec app /app/build_extension
```

Firefox needs a signed version of the app. You can sign one with this command:

```shell
docker compose exec app /app/build_extension --sign
```

Note that Firefox will only sign a version number once, so the version needs to be incremented each time.
If you are not modifying extension code, neither of these steps is necessary:
Checkmate will construct the Chrome extension from source and distribute the
Firefox extension from the web-ext-artifacts folder.