A production-ready backend service built with Python 3.11 and FastAPI for checking images for Not Safe For Work (NSFW) content.
- Asynchronous API for high performance
- Supports multiple moderation service providers:
  - DeepAI NSFW Detector
  - JigsawStack NSFW Validator
- File validation (size, type)
- Configurable threshold for NSFW content
- Comprehensive error handling
- Fully tested with pytest (unit and integration tests)
- Docker and Docker Compose support
- Health check endpoint
- Python 3.11+
- FastAPI 0.115+
- Other dependencies in requirements.txt
```shell
# 1. Clone repository
git clone https://github.com/ursaloper/image_moderator_api.git
cd image_moderator_api

# 2. Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# 3. Install dependencies
pip install -r requirements.txt

# 4. Create .env file from example
cp .env.example .env

# 5. Edit .env and add your API keys
# DEEPAI_API_KEY=your_key
# JIGSAWSTACK_API_KEY=your_key
# MODERATION_PROVIDER=deepai  # or jigsawstack
```

Run the development server:

```shell
uvicorn app.main:create_app --factory --host 0.0.0.0 --port 8000 --reload
```

Run in production with Gunicorn (the parentheses tell Gunicorn to call the app factory):

```shell
gunicorn "app.main:create_app()" -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000
```

Or run with Docker:

```shell
# Build and run
docker-compose up --build
```

The API can be configured using environment variables:
| Variable | Required | Default | Description |
|---|---|---|---|
| MODERATION_PROVIDER | No | jigsawstack | Provider to use (deepai or jigsawstack) |
| DEEPAI_API_KEY | Yes* | - | DeepAI API key (required when using DeepAI provider) |
| JIGSAWSTACK_API_KEY | Yes* | - | JigsawStack API key (required when using JigsawStack provider) |
| THRESHOLD | No | 0.7 | NSFW score threshold (0-1) |
| LOG_LEVEL | No | info | Logging level |
\* An API key is only required for the selected provider.
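
For illustration, the variables above could be loaded roughly as follows. This is a simplified stand-in for the project's `app/core/config.py` (which may use pydantic settings instead); the field names are assumptions.

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    """Environment-driven configuration (simplified sketch)."""
    moderation_provider: str = field(
        default_factory=lambda: os.getenv("MODERATION_PROVIDER", "jigsawstack"))
    deepai_api_key: str = field(
        default_factory=lambda: os.getenv("DEEPAI_API_KEY", ""))
    jigsawstack_api_key: str = field(
        default_factory=lambda: os.getenv("JIGSAWSTACK_API_KEY", ""))
    threshold: float = field(
        default_factory=lambda: float(os.getenv("THRESHOLD", "0.7")))
    log_level: str = field(
        default_factory=lambda: os.getenv("LOG_LEVEL", "info"))

    def __post_init__(self) -> None:
        # Enforce the footnote: only the selected provider's key is required
        key = {"deepai": self.deepai_api_key,
               "jigsawstack": self.jigsawstack_api_key}.get(self.moderation_provider, "")
        if not key:
            raise ValueError(
                f"API key missing for provider {self.moderation_provider!r}")
```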
Request:

```shell
curl -X POST -F "file=@example.jpg" http://localhost:8000/v1/moderate
```

Success Response (Safe Content):

```json
{
  "status": "OK"
}
```

Success Response (NSFW Content):

```json
{
  "status": "REJECTED",
  "reason": "NSFW content",
  "score": 0.83
}
```

Error Responses:
- 400 Bad Request: Unsupported file type
- 413 Request Entity Too Large: File too large
- 422 Unprocessable Entity: No file provided
- 502 Bad Gateway: Moderation service unavailable
- 500 Internal Server Error: Unexpected error
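
The mapping from an upload and its NSFW score to these responses can be sketched with a small helper. The allowed types and size limit here are illustrative assumptions, not the project's actual constants.

```python
ALLOWED_TYPES = {"image/jpeg", "image/png", "image/webp"}  # assumed whitelist
MAX_SIZE = 5 * 1024 * 1024  # assumed 5 MiB limit


def moderate_decision(content_type: str, size: int, score: float,
                      threshold: float = 0.7) -> tuple[int, dict]:
    """Map an upload and its NSFW score to (HTTP status, response body)."""
    if content_type not in ALLOWED_TYPES:
        return 400, {"detail": "Unsupported file type"}
    if size > MAX_SIZE:
        return 413, {"detail": "File too large"}
    if score >= threshold:
        # NSFW content is still a successful moderation call: HTTP 200,
        # with the rejection expressed in the body
        return 200, {"status": "REJECTED", "reason": "NSFW content",
                     "score": score}
    return 200, {"status": "OK"}
```

Note that a rejected image is still an HTTP 200: the moderation itself succeeded, and the verdict lives in the response body.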
- Swagger UI: `/docs`
- ReDoc: `/redoc`
- OpenAPI Spec: `/openapi.json`
```shell
# Run all tests
pytest

# Run with coverage report
pytest --cov=app

# Run unit tests only
pytest tests/unit/

# Run integration tests only
pytest tests/integration/
```

The service is designed to be easily extended with new moderation providers:
- Create a new service class that implements the `AbstractModerationService` interface
- Add configuration options in `app/core/config.py`
- Update the factory function in `app/services/factory.py`
- Add appropriate tests
This project is licensed under the MIT License - see the LICENSE file for details.