*Figure: ImageSet2Text generates detailed and nuanced descriptions from large sets of images.*
This repository lets users generate natural language descriptions of image collections. It provides the official PyTorch implementation of the method introduced in the following paper:
ImageSet2Text: Describing Sets of Images through Text
Piera Riccio*, Francesco Galati*, Kajetan Schweighofer, Noa Garcia, Nuria M Oliver
*Equal contribution
Accepted at AAAI 2026

**Abstract:** In the era of large-scale visual data, understanding collections of images is a challenging yet important task. To this end, we introduce ImageSet2Text, a novel method to automatically generate natural language descriptions of image sets. Based on large language models, visual-question answering chains, an external lexical graph, and CLIP-based verification, ImageSet2Text iteratively extracts key concepts from image subsets and organizes them into a structured concept graph. We conduct extensive experiments evaluating the quality of the generated descriptions in terms of accuracy, completeness, and user satisfaction. We also examine the method's behavior through ablation studies, scalability assessments, and failure analyses. Results demonstrate that ImageSet2Text combines data-driven AI and symbolic representations to reliably summarize large image collections for a wide range of applications.
Requirements:
- click==8.2.1
- Flask==3.1.1
- inflect==7.5.0
- lmdb==1.7.3
- networkx==3.5
- nltk==3.9.1
- numpy==2.3.2
- omegaconf==2.3.0
- openai==1.98.0
- pandas==2.3.1
- Pillow==11.3.0
- pydantic==2.11.7
- Requests==2.32.4
- scikit_learn==1.7.1
- scipy==1.16.1
- torch==2.6.0+cu124
- tqdm==4.67.1
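After installing these pins, the environment can be sanity-checked with a short script. This is a minimal sketch, not part of the repository; note that several pip package names above differ from their Python import names (e.g., Pillow imports as `PIL`, scikit_learn as `sklearn`):

```python
# Sanity check: confirm that the core dependencies are importable.
import importlib.util

REQUIRED = ["flask", "networkx", "nltk", "numpy", "pandas", "PIL", "sklearn", "torch"]

# find_spec returns None when a package is not installed.
missing = [name for name in REQUIRED if importlib.util.find_spec(name) is None]
if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All core dependencies found.")
```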
First, start the CLIP server and set your OpenAI API key:

```bash
# Start the CLIP server in the background
python serve/clip_server.py &

# Set your OpenAI API key
export OPENAI_API_KEY=your_openai_api_key_here
```
Now, you can generate descriptions for any folder of images with a single command:
```bash
python main.py --path ${IMGs_dir}
```
Replace ${IMGs_dir} with the path to your image set directory (e.g., ./data/examples/meditation).
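To describe several image sets in one go, the command above can be scripted. The sketch below is an assumption, not part of the repository: the `build_command` and `describe_all` helpers are hypothetical, and the subfolder layout (one image set per subfolder of a root directory) is assumed.

```python
import pathlib
import subprocess


def build_command(image_dir: pathlib.Path) -> list[str]:
    """Construct the ImageSet2Text invocation for one image folder
    (hypothetical helper wrapping the documented CLI)."""
    return ["python", "main.py", "--path", str(image_dir)]


def describe_all(root: pathlib.Path) -> None:
    """Run the pipeline once per subfolder of `root` (assumed layout:
    one image set per subfolder, e.g. ./data/examples/meditation)."""
    for folder in sorted(p for p in root.iterdir() if p.is_dir()):
        subprocess.run(build_command(folder), check=True)
```

Remember that the CLIP server must be running and `OPENAI_API_KEY` must be set before invoking `describe_all`.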