A small, personal-use Python script that downloads images, GIFs, and videos from Reddit.
It supports interactive prompts (when flags are omitted), Reddit galleries, and saves files into a single folder per subreddit.
For personal use only. Follow Reddit’s Terms of Service and each subreddit's rules.
Do not re-share media unless you have rights.
- Interactive mode: if you don’t pass CLI flags, the script will prompt for:
  - subreddit name
  - sort order (`new`/`top`/`rising`/`hot`)
  - top time filter (when `sort=top`)
  - number of posts to fetch (`None` = unlimited)
- Media support:
  - Direct images (`i.redd.it`, `i.imgur.com`, etc.) with retry + exponential backoff
  - Galleries (multi-image posts)
  - Videos and embeds via `yt-dlp`
- Output organization:
  - Saves into one folder per subreddit: `media/<subreddit>/`
- `.env`-based credentials:
  - Loads `CLIENT_ID`, `CLIENT_SECRET`, `USERNAME`, `PASSWORD`, `USER_AGENT` from a local `.env`
  - Optional `TWOFA=true` prompts for a 2FA code at runtime
- Safety knobs:
  - `--limit` can be capped with `SAFE_MAX_LIMIT` (from `.env`, default `500`)
  - `--sleep` adds a delay between posts to be polite to the API (default `0.3` s)
  - `--max-downloads` lets you stop after N media downloads even if more posts exist
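For accounts with 2FA enabled, Reddit’s script-app password grant expects the one-time code appended to the password as `password:code`. A minimal sketch of how the `TWOFA` option could be applied; the helper name and structure are illustrative, not the script’s actual code:

```python
def reddit_password(env, otp=None):
    """Build the password for PRAW's script-app login.

    Hypothetical helper: when TWOFA=true, Reddit expects the
    one-time code appended as "password:code".
    """
    password = env["PASSWORD"]
    if env.get("TWOFA", "").lower() == "true" and otp:
        password = f"{password}:{otp}"
    return password
```

The returned string would then be passed as the `password` argument to `praw.Reddit(...)`.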
- Python 3.9+
- Dependencies: `praw`, `python-dotenv`, `yt-dlp`
```bash
# 1) Create and activate a virtual environment (recommended)
python -m venv .venv
source .venv/bin/activate

# 2) Install dependencies
pip install praw python-dotenv yt-dlp
```
3) Create a `.env` file next to `main.py`:

```ini
CLIENT_ID=your_client_id
CLIENT_SECRET=your_client_secret
USERNAME=your_username
PASSWORD=your_password
USER_AGENT=reddit-downloader/1.0 by u/your_username

# Optional:
# TWOFA=true           # enable if your Reddit account uses 2FA
# SAFE_MAX_LIMIT=500   # soft cap for --limit (default 500)
```

```bash
## Download 100 newest posts from r/wallpapers
python main.py --subreddit wallpapers --sort new --limit 100

## Top posts of all time from r/pics (unlimited by default)
python main.py --subreddit pics --sort top --time all
```

If you run the script without arguments, it will prompt you:
```bash
python main.py
# Enter subreddit: pics
# Sort (new/top/rising/hot) [Default = new]: top
# Top time filter (hour/day/week/month/year/all) [Default = all]: all
# Number of posts to fetch [Default = None]: 200   # or leave blank for unlimited
```

```bash
# Save to a different base folder and slow down requests
python main.py --subreddit pics --media-dir my_media --sleep 0.8

# Stop after 50 media downloads
python main.py --subreddit gifs --max-downloads 50
```

- Folder: `media/<subreddit>/`
- Filenames: `<sanitized-title>-<postid>.<ext>`
- Extensions: guessed from the URL or HTTP `Content-Type` (e.g., `image/jpeg` → `.jpg`)
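The extension-guessing step described above can be sketched like this; the helper name and `.bin` fallback are illustrative, not the script’s actual code:

```python
import mimetypes
from pathlib import PurePosixPath
from urllib.parse import urlparse

def guess_extension(url, content_type=None):
    """Prefer the URL path's suffix; fall back to the HTTP Content-Type."""
    suffix = PurePosixPath(urlparse(url).path).suffix
    if suffix:
        return suffix
    if content_type:
        # e.g. "image/png; charset=binary" -> ".png"
        guessed = mimetypes.guess_extension(content_type.split(";")[0].strip())
        if guessed:
            return guessed
    return ".bin"  # illustrative fallback when nothing can be guessed
```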
- If you omit `--limit`, the script sets `limit=None` (unlimited; PRAW fetches as many posts as it can).
- If you provide `--limit N`, it is capped by `SAFE_MAX_LIMIT` (from `.env`, default `500`).
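The capping rule and the sort-to-listing mapping can be sketched as follows. The function names are illustrative, and `subreddit` is assumed to be a PRAW `Subreddit` object (its `new`/`top`/`rising`/`hot` listing methods all accept a `limit`):

```python
import os

def effective_limit(requested, env=None):
    """Cap a user-supplied --limit at SAFE_MAX_LIMIT (sketch)."""
    env = os.environ if env is None else env
    safe_max = int(env.get("SAFE_MAX_LIMIT", "500"))
    if requested is None:
        return None  # unlimited: PRAW paginates until the API runs out
    return min(requested, safe_max)

def listing(subreddit, sort="new", time_filter="all", limit=None):
    """Map the CLI choices onto PRAW's listing methods (sketch)."""
    if sort == "top":
        return subreddit.top(time_filter=time_filter, limit=limit)
    if sort not in ("new", "rising", "hot"):
        raise ValueError(f"unknown sort: {sort}")
    return getattr(subreddit, sort)(limit=limit)
```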
- Use `--sleep` to avoid hammering the API.
- Use smaller limits if you frequently hit Reddit’s rate limits.
- Unsafe characters in titles are replaced with `_` to keep filenames safe.
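A minimal sketch of that sanitization; the exact character set and length cap the script uses may differ:

```python
import re

def sanitize_title(title, max_len=100):
    """Replace filesystem-unsafe characters with '_' and trim the length.

    Sketch only; the script's real rules may allow more characters.
    """
    safe = re.sub(r"[^A-Za-z0-9._-]+", "_", title).strip("_")
    return safe[:max_len] or "untitled"
```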
- “.env not found” → Make sure the `.env` file is in the same folder as `main.py`.
- 401 / 403 / 404 API errors → Double-check your Reddit app credentials. The app type must be Script in Reddit’s developer settings.
- 2FA prompt not showing → Set `TWOFA=true` in `.env`.
- 429 Too Many Requests (videos) → Lower `--limit`, increase `--sleep`, or wait before retrying.
- Images fail to download → The script retries automatically; persistent failures are logged with the URL for manual download.
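The automatic retry behavior can be sketched as below. `download_with_retry`, `backoff_delays`, and their defaults are illustrative, not the script’s actual functions:

```python
import time
import urllib.request

def backoff_delays(retries=3, base_delay=1.0):
    # Exponential schedule: 1s, 2s, 4s, ...
    return [base_delay * (2 ** i) for i in range(retries)]

def download_with_retry(url, dest, retries=3, base_delay=1.0):
    """Fetch url into dest, sleeping between failed attempts (sketch)."""
    last_err = None
    for attempt, delay in enumerate(backoff_delays(retries, base_delay)):
        try:
            with urllib.request.urlopen(url, timeout=30) as resp, \
                 open(dest, "wb") as out:
                out.write(resp.read())
            return
        except OSError as err:
            last_err = err
            if attempt < retries - 1:
                time.sleep(delay)
    # All attempts failed: surface the last error (this is where the
    # script would log the URL for manual download)
    raise last_err
```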
This tool is for personal archiving.
Respect copyright and the Reddit API Terms of Service.
Do not redistribute or re-host downloaded content without permission.