Accurate fish population monitoring is vital for sustainable aquaculture management and marine ecological research. Traditional manual fish counting is labour-intensive and prone to errors, often leading to unreliable data and inefficiencies in decision-making. To address these challenges, FishTally leverages advanced computer vision techniques, specifically YOLOv8 for high-precision object detection and ByteTrack for robust multi-object tracking. By integrating these state-of-the-art deep learning models, FishTally ensures accurate species-specific fish counting in both controlled aquaculture settings and complex natural marine environments.

FishTally’s architecture combines real-time detection and association-based tracking, allowing it to maintain high accuracy even under challenging conditions such as occlusion, varying lighting, and water turbidity. Its customisable command-line interface enables users to fine-tune detection thresholds using lines or polygons, making it adaptable to diverse aquatic habitats and experimental setups. Additionally, its ability to process continuous video streams allows for dynamic fish population analysis over time.

FishTally enhances data reliability, minimises human errors, and reduces operational costs by automating fish counting. The system has been rigorously evaluated using datasets like DeepFish, demonstrating strong statistical performance with high precision and recall rates across both single-species (Caranx sexfasciatus) and multi-species (Acanthopagrus palmaris and Lutjanus russellii) models. Its real-time processing capabilities provide significant advantages for time-sensitive decision-making in aquaculture and ecological research. By overcoming the limitations of traditional fish counting methods, FishTally represents a technological advancement in marine resource management, with promising applications in fishery regulation, biodiversity conservation, and sustainable aquaculture practices.
FishTally goes beyond a single line threshold: the source code supports multiple lines and keeps a separate count for each line used.
Polygon zones can also be defined to count species within a particular region, and as many polygons as the user specifies can be used simultaneously.
Short video clips are included to demonstrate the theory in practice. A simple version of FishTally has been packaged as an easy-to-use CLI tool, and the source code can be adjusted as needed.
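The per-zone counting idea can be sketched independently of the CLI: each detection's centre point is tested against every polygon, and a separate tally is kept per zone. The sketch below is illustrative only (the function names are hypothetical, not FishTally's actual API), using a standard ray-casting point-in-polygon test.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: count how many polygon edges a rightward ray crosses."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that horizontal line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def count_per_zone(detection_centers, polygons):
    """Return a per-polygon tally of how many detection centres fall inside it."""
    counts = [0] * len(polygons)
    for center in detection_centers:
        for i, poly in enumerate(polygons):
            if point_in_polygon(center, poly):
                counts[i] += 1
    return counts

# Two zones given as [bottom left, top left, top right, bottom right]
zones = [
    [(100, 400), (100, 100), (400, 100), (400, 400)],
    [(500, 400), (500, 100), (800, 100), (800, 400)],
]
centers = [(200, 200), (600, 300), (900, 300)]
print(count_per_zone(centers, zones))  # [1, 1]: one fish per zone, one outside both
```

In practice the detection centres would come from the YOLOv8 bounding boxes for the chosen class ID, updated every frame.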
Ensure you have the following prerequisites installed on your system:
- Python 3.6 or higher
- Git
- Pip (Python package manager)
Clone the Repository

First, clone the repository containing the `fishtally.py` tool and its associated files to your local machine:

```shell
git clone https://github.com/kluless13/fishtally.git
cd fishtally
```
Install Requirements

The `fishtally.py` tool relies on ByteTrack for object tracking. Run the provided setup script to clone and set up ByteTrack:

```shell
chmod +x setup.sh
./setup.sh
```
Verification
Once the setup completes successfully, you should see the message "Setup complete." in your terminal.
Follow the commands below, copy-pasting them one after the other.
(You can easily reproduce the workflow yourself: the model weights and videos from the 'Assets' folder are already plugged into the snippets.)
First, we check the classes present in the model's weights, using the list_classes.py script:

```shell
python list_classes.py --model_weights ~/fishtally/Assets/multiclass-wts.pt
```

Now that we know we are after class ID '3', we can take a look at our video and decide where to place trigger threshold boundaries, using the frames.py script.
Rename the output image to whatever you want.

```shell
python frames.py --source_video ~/fishtally/Assets/tangtest.mp4 --save_path ~/fishtally/Assets/tangtest_1.jpg
```

Press 's' to save the image so you can examine it. You can then identify points at which to plot your threshold triggers.
Now we use a single-line threshold detector from point A (100, 100) to point B (1100, 600). [Points read as (x, y).] This is a diagonal line that cuts across the screen for maximum coverage. Threshold points can be chosen based on camera movement; this should be straightforward for survey transects.
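The line trigger reduces to a sign test: a tracked fish counts as crossing when the sign of the 2D cross product of the line vector A→B with the vector A→fish flips between frames, i.e. the fish moves from one side of the line to the other. A minimal sketch of that check (the helper names are hypothetical, not FishTally's internal API):

```python
def side_of_line(a, b, p):
    """Sign of the 2D cross product (b - a) x (p - a):
    positive on one side of line AB, negative on the other, zero exactly on it."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def crossed(a, b, prev_pos, curr_pos):
    """True if a tracked object moved from one side of line AB to the other."""
    s1 = side_of_line(a, b, prev_pos)
    s2 = side_of_line(a, b, curr_pos)
    return s1 != 0 and s2 != 0 and (s1 > 0) != (s2 > 0)

# The diagonal line from A(100, 100) to B(1100, 600) used in the example
A, B = (100, 100), (1100, 600)
print(crossed(A, B, (300, 400), (500, 200)))  # True: the sides differ
print(crossed(A, B, (300, 400), (350, 450)))  # False: same side of the line
```

Keeping one previous position per ByteTrack track ID is what makes this work across frames.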
Let's tally some fish!
```shell
python fishtally.py --model_weights ~/fishtally/Assets/multiclass-wts.pt --source_video ~/fishtally/Assets/tangtest.mp4 --target_video ~/fishtally/Assets/demo_1.mp4 --detector_type single_line --line_start 100 100 --line_end 1100 600 --class_id 3
```

You can change the name of the output file.
Congratulations, you've automated fish counting! 🥳
This repository contains tools for detecting and counting fish in video footage using YOLO-based models. There are two main scripts: fishtally.py for processing videos and list_classes.py for listing available classes in the YOLO model.
Before using the fish tallying tool, you can list the available classes in your YOLO model. This will help you identify the correct class ID for detection. [This is the weights file: https://github.com/kluless13/fishtally/blob/main/Assets/multiclass-wts.pt]
List Classes
Run list_classes.py with the path to your model weights:

```shell
python list_classes.py --model_weights <path_to_weights>
```

Replace <path_to_weights> with the path to your YOLO model weights file. This will print out the class names and their corresponding indices.
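A script like list_classes.py is likely just reading the class-name mapping that ultralytics exposes on a loaded model. The sketch below shows that shape; the `list_classes` function and the sample species/IDs are assumptions for illustration, not the repository's actual code, and loading real weights requires `pip install ultralytics`.

```python
def format_classes(names):
    """Render a {class_index: class_name} mapping as printable lines, sorted by index."""
    return [f"{idx}: {name}" for idx, name in sorted(names.items())]

def list_classes(weights_path):
    """Load YOLO weights with ultralytics and print each class ID and name."""
    from ultralytics import YOLO  # pip install ultralytics
    model = YOLO(weights_path)
    # model.names is a dict mapping class index -> class name
    for line in format_classes(model.names):
        print(line)

# Example of the mapping shape (sample names and IDs are made up):
sample_names = {0: "Acanthopagrus palmaris", 3: "Lutjanus russellii"}
print(format_classes(sample_names))  # ['0: Acanthopagrus palmaris', '3: Lutjanus russellii']
```

The index printed next to each name is what you pass to the tool as --class_id.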
To make sure you know where to plot the points for your detector, use frames.py as shown below:
Frame Check
```shell
python frames.py --source_video <path_to_source_video.mp4> --save_path <path_to_reference_img.jpg>
```

After identifying the correct class ID, you can proceed to use fishtally.py.
Open the Terminal
Open a terminal window and navigate to the directory where `fishtally.py` is located.
Running the Tool
Use the following command structure to run the tool:

```shell
python fishtally.py --model_weights <path_to_weights> --source_video <path_to_source_video> --target_video <path_to_output_video> --detector_type <detector_type> --class_id <class_id>
```
Replace <path_to_weights>, <path_to_source_video>, and <path_to_output_video> with the respective paths. For <detector_type>, choose from single_line, multi_line, polygon, or multi_polygon. Replace <class_id> with the ID of the class you want to detect.

- For single_line and multi_line, specify the line coordinates.
- For polygon and multi_polygon, provide the polygon points [bottom left, top left, top right and bottom right].
Example Commands
- Single Line:

```shell
python fishtally.py --model_weights weights.pt --source_video source.mp4 --target_video output.mp4 --detector_type single_line --line_start 100 200 --line_end 300 400 --class_id 3
```
- Multiple Lines:

```shell
python fishtally.py --model_weights weights.pt --source_video source.mp4 --target_video output.mp4 --detector_type multi_line --line1_start 100 200 --line1_end 300 400 --line2_start 500 600 --line2_end 700 800 --class_id 3
```
- Single Polygon:

```shell
python fishtally.py --model_weights weights.pt --source_video source.mp4 --target_video output.mp4 --detector_type polygon --polygon_points 100 200 300 400 500 600 700 800 --class_id 3
```

Replace the coordinates and the class ID in these examples with those relevant to your specific use case. The coordinates are arranged as: bottom left, top left, top right, bottom right.
- Multiple Polygons:

```shell
python fishtally.py --model_weights weights.pt --source_video source.mp4 --target_video output.mp4 --detector_type multi_polygon --polygons 100 200 300 400 500 600 700 800 --polygons 100 200 300 400 500 600 700 800 --class_id 3
```

Replace the coordinates and the class ID in these examples with those relevant to your specific use case. You can add as many polygons as you want, one --polygons flag per polygon. The coordinates are arranged as: bottom left, top left, top right, bottom right.
Viewing the Results
After running the command, the tool will process the video and output the results to the path specified in
--target_video. Check this file to view the results of the fish counting process based on the specified class ID.
- Ensure that the paths to the model weights and videos are correct.
- The coordinates for lines or polygons should be adjusted based on the requirements of your specific task.
- The tool can also be used from the Jupyter notebooks, in case more customisation is required.
- Model weights are necessary before running the tool.
- More than one polygon is supported. This is the first iteration of FishTally, and it will be updated over time.
- Single-line, multi-line, polygon, and multi-polygon setups are available to identify, track, and count species.
- An In/Out counter is present in the source code but not applied to the CLI version; it is used to mitigate double counting.
- The backbone is a lightweight YOLOv8 engine (object detection only); it can be used for fish species, coral species, incoming bodies such as missiles (for naval drone operations), and other debris that needs tracking and counting.
- Semantic segmentation, especially for corals.
- A pipeline for real-time 3D reconstruction.
- Integrating information into a database for long-term use cases.
- Possibly creating a front-end UI where users can drop in videos and run the tool.
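The In/Out counter mentioned in the notes above can be sketched as follows: each tracked fish is remembered once it triggers the line in a given direction, so an individual lingering near the threshold is not counted repeatedly. This is an illustrative sketch under the assumption that counting is keyed on ByteTrack track IDs, not FishTally's actual implementation:

```python
class InOutCounter:
    """Count line crossings per direction, at most once per (track ID, direction)."""

    def __init__(self):
        self.in_count = 0
        self.out_count = 0
        self.seen = set()  # (track_id, direction) pairs already counted

    def update(self, track_id, direction):
        """direction: 'in' or 'out', as decided by a side-of-line test per frame."""
        key = (track_id, direction)
        if key in self.seen:
            return  # already counted once: mitigates double counting
        self.seen.add(key)
        if direction == "in":
            self.in_count += 1
        else:
            self.out_count += 1

    @property
    def net(self):
        """Net count: entries minus exits."""
        return self.in_count - self.out_count

counter = InOutCounter()
counter.update(7, "in")
counter.update(7, "in")   # duplicate trigger from the same track: ignored
counter.update(9, "in")
counter.update(7, "out")  # fish 7 swims back out
print(counter.in_count, counter.out_count, counter.net)  # 2 1 1
```

The net property gives the number of fish currently on the "in" side, which is what mitigating double counting is ultimately for.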