Hi,
I'm testing Awesome-MVS on the Family scene from the Tanks and Temples (Intermediate) dataset, but the generated depth maps have very poor quality.
Here’s my current setup:
Dataset: Tanks and Temples – Intermediate / Family
Model: bld_model_000012.ckpt
Command:
python test.py ^
    --normalpath C:...\omnidata_tools\torch\outputs\npy ^
    --testlist ./lists/custom/family.txt ^
    --dataset general_eval ^
    --batch_size 1 ^
    --testpath_single_scene C:...\datasets\tankandtemples\intermediate\Family ^
    --loadckpt ./bld_model_000012.ckpt ^
    --outdir ./outputs/test1 ^
    --numdepth 192 ^
    --ndepths "48,32,8" ^
    --depth_inter_r "4.0,1.0,0.5" ^
    --interval_scale 1.06 ^
    --filter_method "o3d"
I’ve also tried adjusting parameters such as:
--max_h 1024 --max_w 1536
--num_view 8
--thres_view 3
--conf 0.06
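To rule out the fusion/filtering stage, I also inspect the raw per-view depth maps directly. Below is the minimal check I use; it assumes the repo writes depth maps as PFM files (common for MVSNet-style codebases), and the output path in the comment is my guess, not the repo's documented layout:

```python
import re
import numpy as np

def read_pfm(path):
    """Minimal PFM reader ('Pf' = grayscale, 'PF' = 3-channel)."""
    with open(path, "rb") as f:
        header = f.readline().decode("ascii").rstrip()
        if header not in ("PF", "Pf"):
            raise ValueError(f"{path} is not a PFM file")
        width, height = map(int, re.findall(r"\d+", f.readline().decode("ascii")))
        scale = float(f.readline().decode("ascii").rstrip())
        endian = "<" if scale < 0 else ">"  # negative scale = little-endian
        data = np.frombuffer(f.read(), dtype=endian + "f4")
    shape = (height, width, 3) if header == "PF" else (height, width)
    # PFM stores rows bottom-to-top, so flip vertically.
    return np.flipud(data.reshape(shape)).copy()

def depth_stats(depth):
    """Fraction of valid (finite, positive) pixels plus the valid depth range."""
    valid = np.isfinite(depth) & (depth > 0)
    return float(valid.mean()), float(depth[valid].min()), float(depth[valid].max())

# Hypothetical usage -- adjust to wherever the repo actually writes depth maps:
# d = read_pfm(r"outputs/test1/Family/depth_est/00000000.pfm")
# print(depth_stats(d))
```

If the valid fraction is low or the depth range doesn't match the scene's camera range, the problem is upstream of the o3d filtering.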
For the normals, I generated them with Omnidata using:
python demo.py --task normal ^
    --img_path C:...\datasets\tankandtemples\intermediate\Family\images ^
    --output_path outputs/family
The normals look good, so I believe the problem is mostly with the depth estimation stage.
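For completeness, this is how I sanity-check the .npy files before passing them via --normalpath. The expected layout and value range are my assumptions (Omnidata-style normal maps may be stored as (3, H, W) or (H, W, 3), encoded in [0, 1] or [-1, 1]), so the check just normalizes the layout and reports the range:

```python
import numpy as np

def normal_stats(normals):
    """Canonicalise a normal map to (H, W, 3) and report its value range.

    Accepts either (H, W, 3) or (3, H, W) float arrays. The returned range
    tells you whether the normals are encoded in [0, 1] or [-1, 1], which
    matters if the MVS code expects one specific convention.
    """
    if normals.ndim != 3 or 3 not in (normals.shape[0], normals.shape[-1]):
        raise ValueError(f"unexpected normal map shape {normals.shape}")
    if normals.shape[0] == 3 and normals.shape[-1] != 3:
        normals = np.transpose(normals, (1, 2, 0))  # (3, H, W) -> (H, W, 3)
    return normals, float(normals.min()), float(normals.max())

# Hypothetical usage -- point this at one of the Omnidata outputs:
# n, lo, hi = normal_stats(np.load(r"C:...\omnidata_tools\torch\outputs\npy\00000000.npy"))
# print(n.shape, lo, hi)
```

If the range turns out to be [0, 1] while the network expects [-1, 1] (or vice versa), that alone could degrade the depth maps even though the normals "look good" when visualised.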