Image Moderator API

A production-ready backend service built with Python 3.11 and FastAPI that screens images for Not Safe For Work (NSFW) content.

Features

  • Asynchronous API for high performance
  • Supports multiple moderation service providers:
    • DeepAI NSFW Detector
    • JigsawStack NSFW Validator
  • File validation (size, type)
  • Configurable threshold for NSFW content
  • Comprehensive error handling
  • Fully tested with pytest (unit and integration tests)
  • Docker and Docker Compose support
  • Health check endpoint

Requirements

  • Python 3.11+
  • FastAPI 0.115+
  • Other dependencies in requirements.txt

Installation

Local Development

# 1. Clone repository
git clone https://github.com/ursaloper/image_moderator_api.git
cd image_moderator_api

# 2. Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# 3. Install dependencies
pip install -r requirements.txt

# 4. Create .env file from example
cp .env.example .env

# 5. Edit .env and add your API keys
# DEEPAI_API_KEY=your_key
# JIGSAWSTACK_API_KEY=your_key
# MODERATION_PROVIDER=deepai  # or jigsawstack

Running the API

Development Mode

uvicorn app.main:create_app --factory --host 0.0.0.0 --port 8000 --reload

Production Mode

gunicorn "app.main:create_app()" -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000
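
Both commands point at an application factory rather than a module-level app object (hence the --factory flag for uvicorn and the call syntax for gunicorn). Below is a minimal, illustrative sketch of such a factory; the real create_app in app/main.py will also wire up routers, settings, and error handlers, and the /health path shown is an assumption.

# Minimal illustrative factory; not the actual contents of app/main.py.
from fastapi import FastAPI

def create_app() -> FastAPI:
    # A factory builds a fresh application per call, which keeps test
    # instances isolated and lets the server construct the app lazily.
    app = FastAPI(title="Image Moderator API")

    @app.get("/health")
    def health() -> dict:
        # Health check endpoint mentioned under Features (path assumed).
        return {"status": "ok"}

    return app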

With Docker

# Build and run
docker-compose up --build

Configuration

The API can be configured using environment variables:

Variable             Required  Default      Description
MODERATION_PROVIDER  No        jigsawstack  Provider to use (deepai or jigsawstack)
DEEPAI_API_KEY       Yes*      -            DeepAI API key (required when using the DeepAI provider)
JIGSAWSTACK_API_KEY  Yes*      -            JigsawStack API key (required when using the JigsawStack provider)
THRESHOLD            No        0.7          NSFW score threshold (0-1)
LOG_LEVEL            No        info         Logging level

* An API key is only required for the selected provider.
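
For orientation, here is a sketch of how these variables could be modeled with pydantic-settings; the actual class in app/core/config.py may differ, and the field names and .env handling here are assumptions.

# Illustrative settings model; not necessarily the contents of
# app/core/config.py.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    moderation_provider: str = "jigsawstack"  # "deepai" or "jigsawstack"
    deepai_api_key: str | None = None         # needed only for the DeepAI provider
    jigsawstack_api_key: str | None = None    # needed only for the JigsawStack provider
    threshold: float = 0.7                    # NSFW score cutoff, 0-1
    log_level: str = "info"

settings = Settings()  # values come from the environment or .env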

API Usage

Moderate an Image

Request:

curl -X POST -F "file=@example.jpg" http://localhost:8000/v1/moderate

Success Response (Safe Content):

{
  "status": "OK"
}

Success Response (NSFW Content):

{
  "status": "REJECTED",
  "reason": "NSFW content",
  "score": 0.83
}

Error Responses:

  • 400 Bad Request: Unsupported file type
  • 413 Request Entity Too Large: File too large
  • 422 Unprocessable Entity: No file provided
  • 502 Bad Gateway: Moderation service unavailable
  • 500 Internal Server Error: Unexpected error
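
For reference, the same request from Python with basic handling for the responses documented above. Any HTTP client works; requests is shown here, and the endpoint, form field, and response fields match the documentation.

import requests

# Upload an image and interpret the documented responses.
with open("example.jpg", "rb") as f:
    resp = requests.post(
        "http://localhost:8000/v1/moderate",
        files={"file": ("example.jpg", f, "image/jpeg")},
    )

if resp.status_code == 200:
    body = resp.json()
    if body["status"] == "OK":
        print("Image is safe")
    else:  # status == "REJECTED"
        print(f"Rejected: {body['reason']} (score={body['score']})")
else:
    # 400/413/422 are client-side problems; 502/500 are service-side.
    print(f"Moderation failed with HTTP {resp.status_code}: {resp.text}")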

API Documentation

  • Swagger UI: /docs
  • ReDoc: /redoc
  • OpenAPI Spec: /openapi.json

Testing

# Run all tests
pytest

# Run with coverage report
pytest --cov=app

# Run unit tests only
pytest tests/unit/

# Run integration tests only
pytest tests/integration/
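
As an illustration of the style these commands assume, here is a minimal unit test against the application factory; the file location and test body are examples, not the repository's actual tests.

# tests/unit/test_moderate.py (illustrative)
from fastapi.testclient import TestClient

from app.main import create_app

def test_moderate_requires_a_file():
    client = TestClient(create_app())
    # Posting without a file should fail request validation, matching
    # the documented 422 "No file provided" error.
    response = client.post("/v1/moderate")
    assert response.status_code == 422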

Adding a New Provider

The service is designed to be easily extended with new moderation providers (a sketch of step 1 follows the list):

  1. Create a new service class that implements the AbstractModerationService interface
  2. Add configuration options in app/core/config.py
  3. Update the factory function in app/services/factory.py
  4. Add appropriate tests
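
The sketch below illustrates step 1. The interface name comes from the list above, but the import path, method name, signature, and return shape (modeled on the public API responses) are all assumptions; check the actual base class in the repository.

# Illustrative sketch of a new provider; "Acme" is a fictional service.
from app.services.base import AbstractModerationService  # assumed path

class AcmeModerationService(AbstractModerationService):
    """Hypothetical provider backed by a fictional Acme NSFW API."""

    def __init__(self, api_key: str, threshold: float) -> None:
        self.api_key = api_key
        self.threshold = threshold

    async def moderate(self, image_bytes: bytes) -> dict:
        score = await self._fetch_score(image_bytes)  # hypothetical helper
        if score >= self.threshold:
            return {"status": "REJECTED", "reason": "NSFW content", "score": score}
        return {"status": "OK"}

    async def _fetch_score(self, image_bytes: bytes) -> float:
        # Replace with a real HTTP call to the provider's scoring endpoint.
        raise NotImplementedError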

License

This project is licensed under the MIT License - see the LICENSE file for details.
