
Question: How to get evaluation metrics (F1-score, etc.)? Seeing 0.0 for many metrics. #35

@Obito0313

Description


Hi, first of all, thank you for your great work and contribution to this project!
I have successfully run your code in a Docker environment.

My Steps:

  1. I used the test data from FORinstance_dataset/TUWIEN/test.las as input (a quick check of this input file is sketched right after these steps).
  2. The code ran successfully and generated an output file at segmentanytree/output/results/home/datascience/results/test_out.laz.
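
As a quick check on the input from step 1, I can list the per-point dimensions of the .las file to confirm that the ground-truth labels are actually present. A minimal sketch, assuming laspy is installed and that the instance labels live in an extra dimension named treeID (my guess for the FOR-instance export, not something I have verified):

```python
import laspy
import numpy as np

las = laspy.read("FORinstance_dataset/TUWIEN/test.las")

# Print every per-point dimension so I can see whether the ground-truth
# labels (semantic class + instance ID) are present in this file.
dims = list(las.point_format.dimension_names)
print(dims)

# "treeID" is an assumption on my part -- replace it with whatever the
# printout above shows for the instance-label dimension.
if "treeID" in dims:
    tree_ids = np.asarray(las["treeID"])
    print("unique instance IDs (first 20):", np.unique(tree_ids)[:20])
```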

My Question:
I would like to evaluate the model's performance on the test dataset and get metrics like the F1-score. However, looking at the terminal output, many of the key evaluation metrics are 0.0. For example, test_F1, test_mIPre, and test_mIRec are all zero.

This makes me think that while the prediction file is being generated correctly, the evaluation against the ground truth labels might not be configured properly.

Could you please provide some guidance on the correct procedure to run the evaluation and generate these metrics? Am I missing a step or a specific configuration for the evaluation part?
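
For context, my understanding is that instance-level precision/recall/F1 are typically computed via a greedy IoU matching between predicted and ground-truth instances. The sketch below is not your evaluation code, just the reference I would use to sanity-check the numbers once the ground-truth instance labels are correctly wired in; both arrays are assumed to be per-point labels over the same point order:

```python
import numpy as np

def instance_f1(gt_ids: np.ndarray, pred_ids: np.ndarray, iou_thr: float = 0.5):
    """gt_ids / pred_ids: per-point instance labels; 0 means 'no instance'."""
    gt_instances = [np.flatnonzero(gt_ids == i) for i in np.unique(gt_ids) if i != 0]
    pred_instances = [np.flatnonzero(pred_ids == i) for i in np.unique(pred_ids) if i != 0]

    matched_gt, tp = set(), 0
    for pred_pts in pred_instances:
        pred_set = set(pred_pts.tolist())
        best_iou, best_gt = 0.0, None
        # Greedily match each prediction to the best still-unmatched GT instance.
        for gi, gt_pts in enumerate(gt_instances):
            if gi in matched_gt:
                continue
            gt_set = set(gt_pts.tolist())
            union = len(pred_set | gt_set)
            iou = len(pred_set & gt_set) / union if union else 0.0
            if iou > best_iou:
                best_iou, best_gt = iou, gi
        if best_iou >= iou_thr:
            tp += 1
            matched_gt.add(best_gt)

    precision = tp / len(pred_instances) if pred_instances else 0.0
    recall = tp / len(gt_instances) if gt_instances else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1
```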

Terminal Output:
Here is the log from my terminal showing the metrics:

[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] - ==================================================
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_acc = 15.637814558351629
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_macc = 15.637814558351629
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_miou = 7.818908279175749
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_iou_per_class = {0: 0.15637815558351498, 1: 1e-08}
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_pos = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_neg = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_Iacc = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_cov = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_wcov = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_mIPre = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_mIRec = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] -     test_F1 = 0.0
[2025-12-05 11:14:43,773][torch_points3d.metrics.base_tracker][INFO] - ==================================================
Done with inference using the config file: /home/datascience/tmp_out_folder/eval.yaml
Renamed /home/datascience/tmp_out_folder/result_0.ply to /home/datascience/tmp_out_folder/instance_segmentation_test_out.ply
Renamed /home/datascience/tmp_out_folder/semantic_result_0.ply to /home/datascience/tmp_out_folder/semantic_segmentation_test_out.ply
Merging point cloud, semantic segmentation and instance segmentation.
  0%|                                                                                                                                                                                | 0/1 [00:00<?, ?it/s]point_cloud: /home/datascience/tmp_out_folder/utm2local/test_out.ply
semantic_segmentation: /home/datascience/tmp_out_folder/semantic_segmentation_test_out.ply
instance_segmentation: /home/datascience/tmp_out_folder/instance_segmentation_test_out.ply
Merging point cloud, semantic segmentation and instance segmentation.
File saved as: /home/datascience/tmp_out_folder/final_results/test_out.laz
Done for:
output_file_path: /home/datascience/tmp_out_folder/final_results/test_out.las
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:40<00:00, 40.41s/it]
Number of files in the final results directory: 1
  adding: home/datascience/results/test_out.laz (deflated 2%)
Processing complete.
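
To double-check what the merged output actually contains, I was planning to inspect the final .laz with something like the sketch below (assuming laspy with a LAZ backend such as lazrs is installed; the name of the prediction dimension is only a placeholder and needs to be replaced with whatever the printout shows):

```python
import laspy
import numpy as np

out = laspy.read("segmentanytree/output/results/home/datascience/results/test_out.laz")

# List all per-point dimensions to find where the instance/semantic
# predictions were written during the merge step.
dims = list(out.point_format.dimension_names)
print(dims)

# "instance_nr" is a guess -- substitute the real prediction field
# from the printout above.
if "instance_nr" in dims:
    pred = np.asarray(out["instance_nr"])
    print("number of predicted instances:", len(np.unique(pred[pred != 0])))
```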

Thank you for your time and any help you can provide!
