Livingdex is a Flutter application that uses Gemini 2.0 Flash to simulate a real-life Pokédex, dedicated to identifying plants and animals.
| ![]() | ![]() | ![]() |
|---|---|---|
| Application Icon | Splash Screen | Dark Mode |

| ![]() | ![]() | ![]() |
|---|---|---|
| Main Screen | Information Screen | Rotomdex Chatbot (full-screen image) |
- 2025 Updates
- Project Description
- Technical Analysis
- Main Features
- Architecture and Technologies
- Configuration and Installation
- Contributions and Future Developments
- Useful Links
## 2025 Updates

- Migration to Gemini 2.0 Flash: The project has been upgraded from Gemini 1.5 to Gemini 2.0 Flash for better performance.
- Introduction of Firebase AI Logic: The architecture has moved from a direct dependency on Firebase Vertex AI to the new Firebase AI Logic.
The main advantage of Firebase AI Logic is flexibility. You can now easily choose which AI provider to use (Vertex AI or Google AI) directly from the code configuration, without having to rewrite the calling logic. This allows you to:
- Test and compare the models and pricing of both providers.
- Simplify maintenance and future updates.
Before (Firebase Vertex AI):

```dart
// The logic was tightly coupled to FirebaseVertexAI
model = FirebaseVertexAI.instance.generativeModel(
  model: geminiModel,
  generationConfig: GenerationConfig(
    temperature: 0,
    responseMimeType: 'application/json',
  ),
);
```

Now (Firebase AI Logic):
With Firebase AI Logic, the provider choice is configured in the file that manages the model-calling logic (in this case, `lib/quick_id.dart`):
👉 Vertex AI Provider (for enterprise solutions and RAG):

```dart
final googleAI = FirebaseAI.vertexAI();
model = googleAI.generativeModel(
  model: geminiModel,
  generationConfig: GenerationConfig(
    temperature: 0.1,
    responseMimeType: 'application/json',
  ),
);
```

👉 Google AI Provider (for prototyping and lower costs):
```dart
final googleAI = FirebaseAI.googleAI();
model = googleAI.generativeModel(
  model: geminiModel,
  generationConfig: GenerationConfig(
    temperature: 0.1,
    responseMimeType: 'application/json',
  ),
);
```
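Because both providers expose the same `generativeModel` call, the switch can even be reduced to a single flag. A minimal sketch, assuming the `firebase_ai` package and a made-up `USE_VERTEX_AI` compile-time flag (neither is part of the project's code):

```dart
import 'package:firebase_ai/firebase_ai.dart';

import 'config.dart'; // provides geminiModel (see Configuration section)

// Hypothetical toggle, e.g. `flutter run --dart-define=USE_VERTEX_AI=true`.
const useVertexAI = bool.fromEnvironment('USE_VERTEX_AI', defaultValue: false);

// The calling logic stays identical; only the provider factory changes.
final model = (useVertexAI ? FirebaseAI.vertexAI() : FirebaseAI.googleAI())
    .generativeModel(
  model: geminiModel,
  generationConfig: GenerationConfig(
    temperature: 0.1,
    responseMimeType: 'application/json',
  ),
);
```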
## Project Description

Livingdex is a personal project that I enjoyed developing. The main goal is to satisfy people's curiosity about the animals and plants they encounter. By taking a photo through the application, you can identify the living being in the image, obtain detailed information (name, weight, height, and a description enriched with curiosities), and interact with a chatbot that answers follow-up questions by consulting certified sources.
Livingdex is designed to encourage people to look around and see their surroundings with a fresh perspective on the environment. Everything is presented in an interface that recalls the aesthetics of a Pokédex, enhanced with additional features like dark mode.
## Technical Analysis

Here you can find the functional analysis of the project and the folder with the unit tests performed:
## Main Features

- Visual Recognition: Identification of plants and animals via Gemini 2.0 Flash.
- Pokédex-Themed Interface: UI inspired by the original design for an immersive experience.
- Integrated Chatbot (Rotomdex): Virtual assistant that provides reliable information from English Wikipedia, thanks to a Reasoning Engine that performs RAG (Retrieval-Augmented Generation).
- Dark Mode: For a customizable and comfortable visual experience.
## Architecture and Technologies

- Language and Framework: Dart and Flutter
- AI and Provider: Gemini 2.0 Flash, Firebase AI Logic (with Vertex AI or Google AI provider)
- Backend: Google Cloud Platform, Cloud Run, Firebase, FlutterFire
To handle requests from the app, a backend on Cloud Run is required. Two approaches are recommended.

### Approach 1: Reasoning Engine (recommended)

This approach orchestrates multiple services to provide high-quality responses (RAG). The service:
- Receives the image from the app via an HTTP endpoint.
- Uploads it to Cloud Storage.
- Performs a search on Vertex AI Search to find relevant information.
- Builds a prompt for Gemini, including the search context.
- Calls the Gemini model via Firebase AI Logic requesting structured JSON output.
- Returns the formatted data to the app.
Structured JSON Response Example:

```json
{
  "id": "req-1234",
  "identified": true,
  "species": "Acer platanoides",
  "common_name": "Norway maple",
  "confidence": 0.93,
  "height_estimate": "5-10 m",
  "description": "Short description...",
  "sources": [
    {"name": "Wikipedia", "url": "https://en.wikipedia.org/...."}
  ]
}
```
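To make the contract concrete, here is a minimal sketch of how the Flutter app might call such an endpoint and decode this JSON. The `/identify` route, the `image` field name, and the `identifyImage` function are illustrative assumptions, not the project's actual API; only `cloudRunHost` comes from `config.dart`:

```dart
import 'dart:convert';
import 'dart:io';

import 'package:http/http.dart' as http;

import 'config.dart'; // provides cloudRunHost

// Hypothetical client call: POST the photo, decode the structured JSON.
Future<Map<String, dynamic>> identifyImage(File image) async {
  final uri = Uri.https(cloudRunHost, '/identify'); // assumed route
  final request = http.MultipartRequest('POST', uri)
    ..files.add(await http.MultipartFile.fromPath('image', image.path));

  final response = await http.Response.fromStream(await request.send());
  if (response.statusCode != 200) {
    throw HttpException('Identification failed (${response.statusCode})');
  }
  return jsonDecode(response.body) as Map<String, dynamic>;
}
```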
### Approach 2: Simple Proxy

A simpler alternative if RAG is not needed: the backend acts as a proxy that authenticates the request and forwards it to Gemini. It's faster and cheaper to implement, but with lower response quality.
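A minimal sketch of such a proxy, written in Dart with the `shelf` package to stay in one language (the README itself suggests Node.js or Python for the backend); the shared-secret check, prompt text, and environment variable names are illustrative assumptions:

```dart
import 'dart:convert';
import 'dart:io';

import 'package:http/http.dart' as http;
import 'package:shelf/shelf.dart';
import 'package:shelf/shelf_io.dart' as shelf_io;

// Public Gemini REST endpoint (Google AI provider).
const _geminiUrl =
    'https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent';

Future<Response> _handler(Request request) async {
  // Hypothetical shared-secret check; a real deployment would verify
  // a Firebase App Check or OAuth token instead.
  final expected = 'Bearer ${Platform.environment['PROXY_TOKEN']}';
  if (request.headers['authorization'] != expected) {
    return Response.forbidden('Unauthorized');
  }

  // Read the raw image bytes from the request body.
  final imageBytes = await request.read().expand((chunk) => chunk).toList();

  // Forward the image to Gemini with a fixed identification prompt.
  final upstream = await http.post(
    Uri.parse('$_geminiUrl?key=${Platform.environment['GEMINI_API_KEY']}'),
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({
      'contents': [
        {
          'parts': [
            {'text': 'Identify the plant or animal in this image.'},
            {
              'inline_data': {
                'mime_type': 'image/jpeg',
                'data': base64Encode(imageBytes),
              }
            },
          ]
        }
      ]
    }),
  );

  // Relay Gemini's response unchanged.
  return Response(upstream.statusCode,
      body: upstream.body, headers: {'Content-Type': 'application/json'});
}

void main() async {
  await shelf_io.serve(_handler, '0.0.0.0', 8080); // Cloud Run default port
}
```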
## Configuration and Installation

The app works on mobile devices and, at the moment, has been tested only on Android. The iOS configuration has not been tested and may cause installation and configuration issues.
Make sure you have the following installed:
- Flutter SDK: Official Guide
- IDE: Visual Studio Code and Android Studio for an optimal development experience.
- Google Cloud & Firebase Account: To use backend and AI services.
This guide is based on the recommended Reasoning Engine approach.
- In your Google Cloud project, create a search data store on Vertex AI Search.
- Configure a search app with the necessary data for identification (e.g., descriptions from Wikipedia).
- Create an application (e.g., in Node.js or Python) that acts as a Reasoning Engine.
- Deploy the app on Cloud Run. This service will orchestrate calls to Vertex AI Search and Gemini.
- In your Firebase project, enable Firebase AI Logic.
- Configure the integration to communicate with your agent's endpoint on Cloud Run.
- Create the `config.dart` file inside `lib/` and insert your Cloud Run service URL and the model you want to use:

  ```dart
  const geminiModel = 'gemini-2.0-flash';
  const cloudRunHost = 'your-cloud-run-service.a.run.app';
  ```
- In `lib/quick_id.dart`, choose which AI provider to use (Vertex AI or Google AI), as shown in the Updates section.
- Run `flutterfire configure` to connect the Flutter project to your Firebase project. This will generate the `lib/firebase_options.dart` file. Example of `lib/firebase_options.dart`:
```dart
// File automatically generated by `flutterfire configure`.
import 'package:firebase_core/firebase_core.dart' show FirebaseOptions;
import 'package:flutter/foundation.dart' show defaultTargetPlatform, TargetPlatform;

class DefaultFirebaseOptions {
  static FirebaseOptions get currentPlatform {
    // Example for Android
    if (defaultTargetPlatform == TargetPlatform.android) {
      return const FirebaseOptions(
        apiKey: 'ANDROID_API_KEY_PLACEHOLDER',
        appId: 'ANDROID_APP_ID_PLACEHOLDER',
        messagingSenderId: 'SENDER_ID_PLACEHOLDER',
        projectId: 'PROJECT_ID_PLACEHOLDER',
        storageBucket: 'PROJECT_ID.appspot.com',
      );
    }
    // Add configurations for other platforms here (e.g., iOS)
    throw UnsupportedError(
      'DefaultFirebaseOptions are not supported for this platform.',
    );
  }
}
```
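For context, this generated file is typically consumed once at startup. A standard FlutterFire bootstrap looks like the sketch below, where `LivingdexApp` is a placeholder for the project's actual root widget:

```dart
import 'package:firebase_core/firebase_core.dart';
import 'package:flutter/material.dart';

import 'firebase_options.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  // Initialize Firebase with the generated per-platform options.
  await Firebase.initializeApp(
    options: DefaultFirebaseOptions.currentPlatform,
  );
  runApp(const LivingdexApp()); // placeholder root widget name
}
```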
- Install all project dependencies:

  ```bash
  flutter pub get
  ```

- Start the application on an emulator or physical device:

  ```bash
  flutter run -d <device-id>
  ```

Tip: It's preferable to run the app on a physical device. Follow this guide for device configuration.
Known issues:

- Image Quality < 360p: If the image quality is below 360p, the Gemini API may misinterpret the subject or fail to recognize the image, reporting that the subject is neither an animal nor a plant. In this case, a generic error message is displayed indicating that the image cannot be identified.
- Slow Loading: Slow loading of the subject's description may be caused by a poor internet connection or by communication with the Gemini API, which can take longer depending on connection quality.
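One mitigation sketch for slow responses: wrap the request in a timeout so the UI can surface an error instead of waiting indefinitely (this reuses the hypothetical `identifyImage` helper sketched earlier; the 30-second limit is an arbitrary example):

```dart
import 'dart:async';
import 'dart:io';

// Fail fast on a slow connection instead of hanging the UI;
// Future.timeout throws a TimeoutException by default.
Future<Map<String, dynamic>> identifyWithTimeout(File image) {
  return identifyImage(image).timeout(const Duration(seconds: 30));
}
```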
## Contributions and Future Developments

- Text-to-Speech: Add a voice reading function for descriptions to improve accessibility.
- iOS Support: Test and resolve any compatibility issues.
- UI/UX Improvements: Optimize the user interface.
Contributions are welcome! The areas of greatest need are those listed above. Open a Pull Request to propose your changes.