Added benchmark #39
Open: LevDenisov wants to merge 21 commits into prime-slam:stable from LevDenisov:benchmark-artifact
Changes from all commits (21 commits):
All commits are by LevDenisov:

- d6e7073 Added benchmark-artifact
- 44d47af delete package Python.h
- ac0b772 return directory examples
- 2bf047c Fixed C++ version in CMakeLists file
- c07d909 Added the minimum amount of data for the benchmark to work
- 2e3b0b6 add using uint for windows
- 12e4c53 fixed convertation
- 20bf9ed changed the data type for storing time
- c6bded3 Made edits from the comment
- 7c873ca Removing .DS_Store and fixed types
- ca700a1 Changed units of measurement, fixed types
- 1f46f2f Merge branch 'prime-slam:stable' into benchmark-artifact
- f343a8b fix: Changed variable name and output format
- 0e93da3 feat: Add python benchmark
- 5a35325 fix: delete directory python-data-processing
- 5fa373f fix: rename function
- 7ddc051 Merge branch 'prime-slam:stable' into benchmark-artifact
- 725be05 Merge branch 'prime-slam:stable' into benchmark-artifact
- cb19621 feat: added measurements by image processing and segmentation stages
- 63998cc fix: the directory name has been changed and minor errors have been f…
- a71553d feat: changed the way the image is read
Ignore-file changes (the file name `.gitignore` is inferred from the content), hunk `@@ -44,6 +44,9 @@ install`:

```diff
@@ -44,6 +44,9 @@ install
 # IDE
 .idea

+#macOS
+.DS_Store
+
 # Build dir
 cmake-build-debug
 cmake-build-release
```
New benchmark CMakeLists file (`@@ -0,0 +1,15 @@`):

```cmake
target_compile_options(deplex PRIVATE -O3)

#####################################
# process_cloud.cpp
#####################################
add_executable(benchmark_process_cloud benchmark_process_cloud.cpp)
target_compile_features(benchmark_process_cloud PRIVATE cxx_std_17)
target_link_libraries(benchmark_process_cloud PRIVATE ${PROJECT_NAME})

#####################################
# process_sequence.cpp
#####################################
add_executable(benchmark_process_sequence benchmark_process_sequence.cpp)
target_compile_features(benchmark_process_sequence PRIVATE cxx_std_17)
target_link_libraries(benchmark_process_sequence PRIVATE ${PROJECT_NAME})
```
New file `benchmark_process_cloud.cpp` (`@@ -0,0 +1,156 @@`):

```cpp
#include <deplex/plane_extractor.h>
#include <deplex/utils/depth_image.h>
#include <deplex/utils/eigen_io.h>

#include <algorithm>
#include <chrono>
#include <cmath>
#include <filesystem>
#include <iostream>
#include <numeric>
#include <vector>

// Population variance of the measured times around a precomputed mean.
double variance(const Eigen::VectorXd& data, double mean) {
  double sum = 0;
  for (auto x : data) {
    sum += (x - mean) * (x - mean);
  }
  sum /= static_cast<double>(data.size());
  return sum;
}

int main() {
  std::filesystem::path data_dir =
      std::filesystem::current_path().parent_path().parent_path().parent_path() / "benchmark/data";
  std::filesystem::path image_path = data_dir / "depth/000004415622.png";
  std::filesystem::path intrinsics_path = data_dir / "config/intrinsics.K";
  std::filesystem::path config_path = data_dir / "config/TUM_fr3_long_val.ini";

  auto start_time = std::chrono::high_resolution_clock::now();
  auto end_time = std::chrono::high_resolution_clock::now();

  const int NUMBER_OF_RUNS = 10;

  // Per-run timings for the three stages: read image, build point cloud, segment.
  std::vector<Eigen::Vector3d> execution_time_stage(NUMBER_OF_RUNS, Eigen::Vector3d::Zero());

  deplex::config::Config config = deplex::config::Config(config_path.string());
  Eigen::Matrix3f intrinsics(deplex::utils::readIntrinsics(intrinsics_path.string()));
  deplex::utils::DepthImage image(image_path.string());

  auto algorithm = deplex::PlaneExtractor(image.getHeight(), image.getWidth(), config);
  Eigen::VectorXi labels;

  std::cout << "Image Height: " << image.getHeight() << " Image Width: " << image.getWidth() << "\n\n";

  for (int i = 0; i < NUMBER_OF_RUNS; ++i) {
    start_time = std::chrono::high_resolution_clock::now();
    image.reset(image_path.string());
    end_time = std::chrono::high_resolution_clock::now();
    execution_time_stage[i][0] +=
        static_cast<double>(std::chrono::duration_cast<std::chrono::milliseconds>(end_time - start_time).count());

    start_time = std::chrono::high_resolution_clock::now();
    auto pcd_array = image.toPointCloud(intrinsics);
    end_time = std::chrono::high_resolution_clock::now();
    execution_time_stage[i][1] +=
        static_cast<double>(std::chrono::duration_cast<std::chrono::milliseconds>(end_time - start_time).count());

    start_time = std::chrono::high_resolution_clock::now();
    labels = algorithm.process(pcd_array);
    end_time = std::chrono::high_resolution_clock::now();
    execution_time_stage[i][2] +=
        static_cast<double>(std::chrono::duration_cast<std::chrono::milliseconds>(end_time - start_time).count());

    std::cout << "Iteration #" << i + 1 << " Planes found: " << labels.maxCoeff() << std::endl;
  }

  auto execution_time_segmentation_stage = algorithm.GetExecutionTime();

  Eigen::VectorXd elements = Eigen::VectorXd::Zero(NUMBER_OF_RUNS);
  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    elements[i] = execution_time_stage[i][0];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_cloud_stage_read_image.csv").string());

  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    elements[i] = execution_time_stage[i][1];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_cloud_stage_translate_image.csv").string());

  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    elements[i] = execution_time_stage[i][2];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_cloud_stage_segmentation.csv").string());

  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    elements[i] = execution_time_segmentation_stage[i][0];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_cloud_stage_segmentation_cell_grid.csv").string());

  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    elements[i] = execution_time_segmentation_stage[i][1];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_cloud_stage_segmentation_region_growing.csv").string());

  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    elements[i] = execution_time_segmentation_stage[i][2];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_cloud_stage_segmentation_merge_planes.csv").string());

  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    elements[i] = execution_time_segmentation_stage[i][3];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_cloud_stage_segmentation_labels.csv").string());

  Eigen::VectorXd total_time = Eigen::VectorXd::Zero(NUMBER_OF_RUNS);
  for (auto i = 0; i < NUMBER_OF_RUNS; ++i) {
    total_time[i] = execution_time_stage[i][0] + execution_time_stage[i][1] + execution_time_stage[i][2];
  }
  deplex::utils::savePointCloudCSV(total_time.cast<float>().transpose(),
                                   (data_dir / "process_sequence_total_time.csv").string());

  double elapsed_time_min = *std::min_element(total_time.begin(), total_time.end());
  double elapsed_time_max = *std::max_element(total_time.begin(), total_time.end());
  double elapsed_time_mean = std::accumulate(total_time.begin(), total_time.end(), 0.0) / NUMBER_OF_RUNS;

  double dispersion = variance(total_time, elapsed_time_mean);
  double standard_deviation = std::sqrt(dispersion);
  double standard_error = standard_deviation / std::sqrt(NUMBER_OF_RUNS);

  // 95% confidence interval
  const double t_value = 1.96;
  double lower_bound = elapsed_time_mean - t_value * standard_error;
  double upper_bound = elapsed_time_mean + t_value * standard_error;

  std::cout << "\nDispersion: " << dispersion << '\n';
  std::cout << "Standard deviation: " << standard_deviation << '\n';
  std::cout << "Standard error: " << standard_error << '\n';
  std::cout << "Confidence interval (95%): [" << lower_bound << "; " << upper_bound << "]\n\n";

  std::cout << "Elapsed time (ms.) (min): " << elapsed_time_min << '\n';
  std::cout << "Elapsed time (ms.) (max): " << elapsed_time_max << '\n';
  std::cout << "Elapsed time (ms.) (mean): " << elapsed_time_mean << '\n';
  std::cout << "FPS (max): " << 1000 / elapsed_time_min << '\n';
  std::cout << "FPS (min): " << 1000 / elapsed_time_max << '\n';
  std::cout << "FPS (mean): " << 1000 / elapsed_time_mean << '\n';

  return 0;
}
```
New file `benchmark_process_sequence.cpp` (`@@ -0,0 +1,181 @@`):

```cpp
#include <deplex/plane_extractor.h>
#include <deplex/utils/depth_image.h>
#include <deplex/utils/eigen_io.h>

#include <algorithm>
#include <chrono>
#include <cmath>
#include <filesystem>
#include <iostream>
#include <numeric>
#include <string>
#include <vector>

// Population variance of the measured times around a precomputed mean.
double variance(const Eigen::VectorXd& data, double mean) {
  double sum = 0;
  for (auto x : data) {
    sum += (x - mean) * (x - mean);
  }
  sum /= static_cast<double>(data.size());
  return sum;
}

int main(int argc, char* argv[]) {
  std::filesystem::path data_dir =
      std::filesystem::current_path().parent_path().parent_path().parent_path() / "benchmark/data";
  std::filesystem::path image_path = data_dir / "depth/000004415622.png";
  std::filesystem::path intrinsics_path = data_dir / "config/intrinsics.K";
  std::filesystem::path config_path = data_dir / "config/TUM_fr3_long_val.ini";

  auto start_time = std::chrono::high_resolution_clock::now();
  auto end_time = std::chrono::high_resolution_clock::now();

  const int NUMBER_OF_RUNS = 10;
  const int NUMBER_OF_SNAPSHOT = 1;

  // Per-snapshot timings, accumulated across runs and averaged afterwards.
  std::vector<Eigen::Vector3d> execution_time_stage(NUMBER_OF_SNAPSHOT, Eigen::Vector3d::Zero());

  std::string dataset_path = (argc > 1 ? argv[1] : (data_dir / "depth").string());

  deplex::config::Config config = deplex::config::Config(config_path.string());
  Eigen::Matrix3f intrinsics(deplex::utils::readIntrinsics(intrinsics_path.string()));
  deplex::utils::DepthImage image(image_path.string());

  // Sort data entries
  std::vector<std::filesystem::directory_entry> sorted_input_data;
  for (auto const& entry : std::filesystem::directory_iterator(dataset_path)) {
    if (entry.path().extension() == ".png") {
      sorted_input_data.push_back(entry);
    }
  }
  std::sort(sorted_input_data.begin(), sorted_input_data.end());

  Eigen::VectorXi labels;

  deplex::PlaneExtractor algorithm(image.getHeight(), image.getWidth(), config);
  std::cout << "Image Height: " << image.getHeight() << " Image Width: " << image.getWidth() << "\n\n";

  for (int t = 0; t < NUMBER_OF_RUNS; ++t) {
    std::cout << "LAUNCH #" << t + 1 << std::endl;

    for (Eigen::Index i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
      start_time = std::chrono::high_resolution_clock::now();
      image.reset(sorted_input_data[i].path().string());
      end_time = std::chrono::high_resolution_clock::now();
      execution_time_stage[i][0] +=
          static_cast<double>(std::chrono::duration_cast<std::chrono::milliseconds>(end_time - start_time).count());

      start_time = std::chrono::high_resolution_clock::now();
      auto pcd_array = image.toPointCloud(intrinsics);
      end_time = std::chrono::high_resolution_clock::now();
      execution_time_stage[i][1] +=
          static_cast<double>(std::chrono::duration_cast<std::chrono::milliseconds>(end_time - start_time).count());

      start_time = std::chrono::high_resolution_clock::now();
      labels = algorithm.process(pcd_array);
      end_time = std::chrono::high_resolution_clock::now();
      execution_time_stage[i][2] +=
          static_cast<double>(std::chrono::duration_cast<std::chrono::milliseconds>(end_time - start_time).count());

      std::cout << "Snapshot #" << i + 1 << " Planes found: " << labels.maxCoeff() << std::endl;
    }
  }

  auto execution_time_segmentation_stage = algorithm.GetExecutionTime();

  // Average the accumulated timings over all runs.
  for (auto& v : execution_time_segmentation_stage) {
    for (auto& stage : v) {
      stage /= NUMBER_OF_RUNS;
    }
  }
  for (auto& v : execution_time_stage) {
    for (auto& stage : v) {
      stage /= NUMBER_OF_RUNS;
    }
  }

  Eigen::VectorXd elements = Eigen::VectorXd::Zero(NUMBER_OF_SNAPSHOT);
  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    elements[i] = execution_time_stage[i][0];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_sequence_stage_read_image.csv").string());

  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    elements[i] = execution_time_stage[i][1];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_sequence_stage_translate_image.csv").string());

  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    elements[i] = execution_time_stage[i][2];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_sequence_stage_segmentation.csv").string());

  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    elements[i] = execution_time_segmentation_stage[i][0];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_sequence_stage_segmentation_cell_grid.csv").string());

  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    elements[i] = execution_time_segmentation_stage[i][1];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_sequence_stage_segmentation_region_growing.csv").string());

  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    elements[i] = execution_time_segmentation_stage[i][2];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_sequence_stage_segmentation_merge_planes.csv").string());

  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    elements[i] = execution_time_segmentation_stage[i][3];
  }
  deplex::utils::savePointCloudCSV(elements.cast<float>().transpose(),
                                   (data_dir / "process_sequence_stage_segmentation_labels.csv").string());

  Eigen::VectorXd total_time = Eigen::VectorXd::Zero(NUMBER_OF_SNAPSHOT);
  for (auto i = 0; i < NUMBER_OF_SNAPSHOT; ++i) {
    total_time[i] = execution_time_stage[i][0] + execution_time_stage[i][1] + execution_time_stage[i][2];
  }
  deplex::utils::savePointCloudCSV(total_time.cast<float>().transpose(),
                                   (data_dir / "process_sequence_total_time.csv").string());

  double elapsed_time_min = *std::min_element(total_time.begin(), total_time.end());
  double elapsed_time_max = *std::max_element(total_time.begin(), total_time.end());
  double elapsed_time_mean = std::accumulate(total_time.begin(), total_time.end(), 0.0) / NUMBER_OF_SNAPSHOT;

  double dispersion = variance(total_time, elapsed_time_mean);
  double standard_deviation = std::sqrt(dispersion);
  double standard_error = standard_deviation / std::sqrt(NUMBER_OF_SNAPSHOT);

  // 95% confidence interval
  const double t_value = 1.96;
  double lower_bound = elapsed_time_mean - t_value * standard_error;
  double upper_bound = elapsed_time_mean + t_value * standard_error;

  std::cout << "\nDispersion: " << dispersion << '\n';
  std::cout << "Standard deviation: " << standard_deviation << '\n';
  std::cout << "Standard error: " << standard_error << '\n';
  std::cout << "Confidence interval (95%): [" << lower_bound << "; " << upper_bound << "]\n\n";

  std::cout << "Elapsed time (ms.) (min): " << elapsed_time_min << '\n';
  std::cout << "Elapsed time (ms.) (max): " << elapsed_time_max << '\n';
  std::cout << "Elapsed time (ms.) (mean): " << elapsed_time_mean << '\n';
  std::cout << "FPS (max): " << 1000 / elapsed_time_min << '\n';
  std::cout << "FPS (min): " << 1000 / elapsed_time_max << '\n';
  std::cout << "FPS (mean): " << 1000 / elapsed_time_mean << '\n';

  return 0;
}
```
New benchmark config file (`@@ -0,0 +1,13 @@`; the name `TUM_fr3_long_val.ini` matches the `config_path` used by the benchmarks):

```ini
[Parameters]
patchSize=10
histogramBinsPerCoord=20
minCosAngleForMerge=0.90
maxMergeDist=500
minRegionGrowingCandidateSize=5
minRegionGrowingCellsActivated=4
minRegionPlanarityScore=0.55
depthSigmaCoeff=1.425e-6
depthSigmaMargin=10
minPtsPerCell=3
depthDiscontinuityThreshold=160
maxNumberDepthDiscontinuity=1
```
New camera intrinsics file (`@@ -0,0 +1,3 @@`; the name `intrinsics.K` matches the `intrinsics_path` used by the benchmarks):

```
944.16741943 0. 956.85647899
0. 939.10968018 552.31795789
0. 0. 1
```
Review comments:

> I still can see `.DS_Store` in config directory

Reply:

> fixed