Add support for multiple loss functions in AADForest #267
Conversation
Force-pushed from bce7fb6 to e619237 (compare)
Force-pushed from e619237 to 2ae0b83 (compare)
Pull Request Overview
This PR adds support for multiple loss functions to the AADForest class, refactoring the code to prepare for extensibility. The key changes include moving the _q_tau method from AADForest to AADEvaluator and adding a loss parameter to the constructor.
- Moves the _q_tau method from AADForest to AADEvaluator for better encapsulation
- Adds a loss parameter to the AADForest constructor, with validation
- Updates method calls to use the evaluator's _q_tau method
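For orientation, here is a minimal sketch of the shape this refactor gives the code, assuming _q_tau computes a quantile of the anomaly scores and that the evaluator is reachable through an `evaluator` attribute. The `tau` parameter, the default values, and the method bodies below are illustrative assumptions, not the PR's exact implementation.

```python
import numpy as np

# Only "hinge" is supported so far; the list exists for future extensibility.
LOSSES = ["hinge"]


class AADEvaluator:
    def __init__(self, tau=0.97):
        # tau is the quantile level used by the AAD hinge loss; the default
        # here is a placeholder, not the library's actual default.
        self.tau = tau

    def _q_tau(self, scores):
        # Hypothetical body: the tau-quantile of the current anomaly scores.
        return np.quantile(scores, self.tau)


class AADForest:
    def __init__(self, loss="hinge"):
        # Validate the requested loss name against the supported list.
        if loss not in LOSSES:
            raise ValueError(f"loss is not one of {', '.join(LOSSES)}.")
        self.loss = loss
        self.evaluator = AADEvaluator()
```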
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| src/coniferest/aadforest.py | Refactors _q_tau method to AADEvaluator, adds loss parameter with validation |
| tests/test_aadforest.py | Updates test to call _q_tau through evaluator instead of forest directly |
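The test change described in the table might look roughly like the hedged sketch below; the test name, the random data, and the `evaluator` attribute access are assumptions, not the repository's actual test code.

```python
import numpy as np
from coniferest.aadforest import AADForest


def test_q_tau_called_through_evaluator():
    # Hypothetical setup; the real test in tests/test_aadforest.py may build
    # the forest and obtain scores differently.
    rng = np.random.default_rng(0)
    data = rng.normal(size=(256, 4))

    forest = AADForest()
    forest.fit(data)
    scores = forest.score_samples(data)

    # After the refactor the quantile helper is reached via the evaluator,
    # not the forest itself.
    q = forest.evaluator._q_tau(scores)
    assert scores.min() <= q <= scores.max()
```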
```diff
@@ -231,24 +248,20 @@ def __init__(
         else:
             raise ValueError(f"map_value is neither a callable nor one of {', '.join(MAP_VALUES.keys())}.")
```
[nitpick] The LOSSES list contains only one item, making the validation somewhat redundant. Consider either removing this validation until multiple loss functions are implemented, or adding a comment explaining the future extensibility plan.
```suggestion
# Currently, only the "hinge" loss function is supported.
# This list is designed for future extensibility to include additional loss functions.
```
```diff
+        if loss not in LOSSES:
+            raise ValueError(f"loss is not one of {', '.join(LOSSES)}.")
+
+        self.loss = loss
```
The loss parameter is stored but never used in the implementation. Consider either implementing the loss function logic or adding a comment explaining how this will be used in future implementations.
```suggestion
self.loss = loss
# The `loss` parameter is currently not used in the implementation.
# It is reserved for future extensions where different loss functions
# may be incorporated into the anomaly detection logic.
```
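One hedged illustration of how the stored loss name could eventually be consumed, purely speculative and not part of this PR: map the validated name to a loss callable when the optimisation runs. The registry and function names below are hypothetical.

```python
import numpy as np


def hinge_loss(margin):
    # Classic hinge loss: zero once the margin constraint is satisfied.
    return np.maximum(0.0, 1.0 - margin)


# Hypothetical registry keyed by the same names validated against LOSSES.
LOSS_FUNCTIONS = {"hinge": hinge_loss}


def resolve_loss(name):
    # Mirror the constructor validation, then hand back the callable that the
    # AAD weight optimisation would use.
    try:
        return LOSS_FUNCTIONS[name]
    except KeyError:
        raise ValueError(f"loss is not one of {', '.join(LOSS_FUNCTIONS)}.") from None
```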
CodSpeed Performance Report

Merging #267 will improve performances by ×18. 🎉 Hooray!
| | Benchmark | BASE | HEAD | Change |
|---|---|---|---|---|
| ⚡ | test_benchmark_fit_known | 266.8 ms | 205.8 ms | +29.65% |
| ⚡ | test_benchmark_loss_gradient[1024-1024] | 4.4 ms | 3.9 ms | +13.9% |
| ⚡ | test_benchmark_loss_gradient[1024-65536] | 4.4 ms | 3.9 ms | +13.85% |
| ⚡ | test_benchmark_feature_signature[128] | 555.7 µs | 409.1 µs | +35.84% |
| ⚡ | test_benchmark_feature_signature[2] | 563.4 µs | 414 µs | +36.06% |
| ⚡ | test_benchmark_fit_sklearn[1024] | 5.3 s | 4.5 s | +17.6% |
| ⚡ | test_benchmark_fit_sklearn[128] | 668.1 ms | 568.5 ms | +17.51% |
| ⚡ | test_benchmark_score[1024] | 8.5 ms | 7 ms | +22.72% |
| ⚡ | test_benchmark_score[1048576] | 8.4 s | 6.9 s | +21.5% |
| ⚡ | test_benchmark_score_samples[1-128-1024] | 9.8 ms | 8.3 ms | +18.46% |
| ⚡ | test_benchmark_score_samples[1-128-1048576] | 9.7 s | 8.3 s | +17.46% |
| ⚡ | test_benchmark_score_samples[1-128-1] | 264.8 µs | 125.7 µs | ×2.1 |
| ⚡ | test_benchmark_score_samples[1-128-32] | 614.1 µs | 456.3 µs | +34.6% |
| ⚡ | test_benchmark_score_samples[1-256-1024] | 18 ms | 15.1 ms | +19.68% |
| ⚡ | test_benchmark_score_samples[1-256-1048576] | 17.9 s | 15 s | +19.39% |
| ⚡ | test_benchmark_score_samples[1-256-1] | 276.2 µs | 158.2 µs | +74.56% |
| ⚡ | test_benchmark_score_samples[1-256-32] | 978.7 µs | 773.6 µs | +26.51% |
| ⚡ | test_benchmark_score_samples[1-64-1024] | 5.6 ms | 4.8 ms | +17.86% |
| ⚡ | test_benchmark_score_samples[1-64-1048576] | 5.5 s | 4.7 s | +15.62% |
| ⚡ | test_benchmark_score_samples[1-64-1] | 223.6 µs | 109.7 µs | ×2 |
| ... | ... | ... | ... | ... |
ℹ️ Only the first 20 benchmarks are displayed. Go to the app to view all benchmarks.
Force-pushed from 789beb5 to 830f31a (compare)
Reduce the amount of redundant code by moving loss-function-related logic to AADEvaluator.
Force-pushed from 2a10da0 to 2b74f34 (compare)
No description provided.