Inpainting-App is a cross-platform Flutter application for offline object removal from images. The system performs on-device inference using ONNX Runtime and combines interactive segmentation with deep learning–based image inpainting.
The application was developed as part of an engineering thesis focused on mobile object removal and visual artifact handling (e.g. shadows and reflections), without relying on cloud services or external APIs.
The application is designed as a modular foundation for experimenting with mobile inpainting workflows. By combining segmentation (MobileSAM) with generative inpainting (MI-GAN), it supports fully offline object removal on mobile devices, and its modular architecture allows new models to be integrated and benchmarked systematically.
The object removal process follows a multi-stage pipeline:
- Image selection by the user
- Interactive mask creation (manual or segmentation-assisted)
- Object segmentation using a MobileSAM-based ONNX model
- Image inpainting using a generative model
- Display and export of the reconstructed image
*(Pipeline screenshots: Start · Image selected · Manual mask · SAM segmentation · Inpainted result)*
Each stage is implemented as a separate module to allow easy replacement and comparative evaluation of different models.
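The separation of stages described above can be sketched as a pair of Dart interfaces. The class and method names here are illustrative only, not the actual types in `lib/services/` or `lib/inpainting/`:

```dart
import 'dart:typed_data';

// Hypothetical stage interfaces -- illustrative names, not the app's real API.
abstract class Segmenter {
  /// Produces a binary mask for the object indicated by the prompt [points].
  Future<Uint8List> segment(Uint8List image, List<(double, double)> points);
}

abstract class Inpainter {
  /// Fills the masked region of [image] with generated content.
  Future<Uint8List> inpaint(Uint8List image, Uint8List mask);
}

/// Runs the removal pipeline; either stage can be swapped independently
/// (e.g. a different ONNX model) for comparative evaluation.
Future<Uint8List> removeObject(
  Uint8List image,
  List<(double, double)> prompt,
  Segmenter segmenter,
  Inpainter inpainter,
) async {
  final mask = await segmenter.segment(image, prompt);
  return inpainter.inpaint(image, mask);
}
```

Because the pipeline only depends on the interfaces, a benchmarking harness can run the same image through several `Segmenter`/`Inpainter` implementations without touching the UI layer.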
- Image selection from the device gallery or camera capture
- Interactive mask drawing
- Object segmentation using MobileSAM (ONNX)
- Image inpainting using deep learning models (ONNX)
- Fully offline, on-device inference
- Cross-platform Flutter implementation
- Modular pipeline designed for experimentation and benchmarking
The application was developed using Flutter and is intended to be cross-platform. However, all experimental evaluation and on-device testing were performed exclusively on iOS devices.
The behavior on Android devices has not been experimentally validated and may require additional adjustments, particularly with respect to ONNX Runtime execution providers and hardware acceleration.
Install dependencies:

```shell
flutter pub get
```

Run the application on a connected device or simulator:

```shell
flutter run
```
- Flutter / Dart – cross-platform UI
- onnxruntime_flutter – optimized on-device ONNX inference
- MI-GAN ONNX – generative inpainting model
- MobileSAM ONNX – lightweight segmentation model for point-based prompting
- Target platforms: Android and iOS
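Loading and running one of the bundled ONNX models might look roughly like the following, assuming the API of the `onnxruntime` Flutter package (`OrtEnv`, `OrtSession`, `OrtValueTensor`). The input tensor name `'input'` is an assumption; check the package documentation and each model's metadata for the exact names and signatures:

```dart
import 'dart:typed_data';
import 'package:flutter/services.dart' show rootBundle;
import 'package:onnxruntime/onnxruntime.dart';

/// Sketch of on-device inference: loads an ONNX model from the asset
/// bundle and runs it on a single float tensor.
Future<List<OrtValue?>> runModel(
  String assetPath,
  Float32List input,
  List<int> shape,
) async {
  OrtEnv.instance.init();
  final bytes = await rootBundle.load(assetPath);
  final session = OrtSession.fromBuffer(
    bytes.buffer.asUint8List(),
    OrtSessionOptions(),
  );
  // 'input' is a placeholder -- the real tensor name depends on the model.
  final tensor = OrtValueTensor.createTensorWithDataList(input, shape);
  final outputs = session.run(OrtRunOptions(), {'input': tensor});
  tensor.release();
  return outputs;
}
```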
- `lib/main.dart` – app entry point
- `lib/app.dart` – global app configuration (theme, navigation)
- `lib/firebase_options.dart` – Firebase initialization options
- `lib/services/` – inference + image processing services
- `lib/ui/` – UI pages and widgets (presentation layer)
- `lib/inpainting/` – inpainting workflow logic and state
- `lib/utils/` – shared utilities (logging, image/tensor helpers)
The application uses ONNX models for both segmentation and inpainting.
Required models:
- MobileSAM (segmentation)
- MI-GAN (inpainting)
Models should be placed in the `assets/` directory.
Ensure that the model paths and input resolutions match the configuration used in the application.
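For the models to be bundled with the app, their paths must also be declared in `pubspec.yaml`. The file names below are placeholders for the actual exported MobileSAM and MI-GAN models:

```yaml
flutter:
  assets:
    # Hypothetical file names -- use the actual exported model names.
    - assets/mobile_sam.onnx
    - assets/migan.onnx
```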
- MobileSAM-Shadow
Research fork of MobileSAM used for fine-tuning segmentation models with support for shadows and reflections:
https://github.com/Michall00/MobileSAM-Shadow