feat(core): Add class group-aware metric analysis with multi-config support #12
Open
Go-MinSeong wants to merge 26 commits into dev from
Conversation
…onfig Add configuration files for YOLOv9 Tiny model training from scratch
…lov9-info Update yolov9 info
Add EXIR exporting feature
Release: v1.4.1
Bug fixes:
- Bug1: Fix double-registration of true_class_ids when pred=0 and GT>0 (if→elif chain) causing recall to be ~half the correct value
- Bug2: Register FP predictions when GT=0 so precision is not over-estimated
- Bug3: Fix Precision50 to use IoU=0.5 (index 0) instead of the average over all IoU thresholds
- Bug4: Reset classwise_metric_meters each epoch in MetricFactory.reset_values()

New features:
- Feature5: Enable classwise metrics for train phase (not only valid)
- Feature6: Add weighted_mean (instance-count weighted) alongside unweighted mean for all detection metrics; display as 'All (weighted)' row in stdout logs

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
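The weighted_mean of Feature6 can be sketched as follows. This is a minimal illustration, not the repository's actual code: the function name and signatures are hypothetical, assuming per-class metric values and per-class instance counts keyed by class id.

```python
# Hypothetical sketch of an instance-count weighted mean: each class's metric
# value is weighted by how many ground-truth instances that class has, so
# frequent classes dominate the 'All (weighted)' row while the plain mean
# treats every class equally.
def weighted_mean(classwise: dict[int, float], instance_counts: dict[int, int]) -> float:
    total = sum(instance_counts.get(c, 0) for c in classwise)
    if total == 0:
        # No instance counts available: fall back to the unweighted mean
        return sum(classwise.values()) / max(len(classwise), 1)
    return sum(v * instance_counts.get(c, 0) for c, v in classwise.items()) / total

# Example: class 0 has 90 instances, class 1 has 10
print(weighted_mean({0: 0.9, 1: 0.5}, {0: 90, 1: 10}))  # 0.86 (vs unweighted 0.7)
```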
log_end_evaluation() was explicitly copying only the 'mean' key when converting classwise class numbers to names, silently dropping 'weighted_mean'. Use a dict comprehension to preserve all non-classwise keys (mean, weighted_mean, etc.). Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
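The fix can be illustrated with a small sketch. The function name and result layout here are assumptions for illustration; the point is that aggregate keys are copied generically instead of hard-coding 'mean'.

```python
# Hypothetical sketch of the dict-comprehension fix: remap classwise keys from
# class id to 'id_name' strings, then copy every remaining (non-classwise)
# aggregate key instead of only 'mean'.
def convert_classwise_to_names(results: dict, class_map: dict[int, str]) -> dict:
    converted = {
        f"{cid}_{class_map[cid]}": value
        for cid, value in results["classwise"].items()
    }
    # Preserve all aggregate keys (mean, weighted_mean, ...) generically
    converted.update({k: v for k, v in results.items() if k != "classwise"})
    return converted

out = convert_classwise_to_names(
    {"classwise": {0: 0.9, 1: 0.5}, "mean": 0.7, "weighted_mean": 0.86},
    {0: "car", 1: "bus"},
)
print(out)  # {'0_car': 0.9, '1_bus': 0.5, 'mean': 0.7, 'weighted_mean': 0.86}
```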
Adds class_groups config in logging.yaml (under metrics) to define which classes should be treated as a group. Intra-group predictions are counted as TP, enabling group-level precision/recall/mAP evaluation alongside the existing per-class and overall metrics.

Key changes:
- config/logging.yaml: new class_groups field (null by default)
- metrics/builder.py: parse class_groups → group_map/group_names/group_map_array
- metrics/base.py: MetricFactory stores group info; BaseMetric has optional group meters; result() includes group_classwise/group_mean/group_weighted_mean; reset_values() resets group meters each epoch
- metrics/detection/metric.py: DetectionMetricAdaptor computes parallel group_stats by remapping class IDs → group IDs; each detection metric initialises group meters and calls _update_group_meters() in calibrate()
- loggers/stdout.py: prints a second 'Group-aware metric' table when groups are configured, showing per-group and All(group)/All(group/weighted) rows
- pipelines/base.py: _convert_classwise_to_names() shared helper handles both classwise and group_classwise key conversion (int → 'id_name' string)
- pipelines/builder.py: passes class_map to build_metrics()
- pipelines/train.py + evaluation.py: use base class helper

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
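The class-ID-to-group-ID remapping can be sketched roughly like this. The function name and the scheme for ungrouped classes are assumptions; the idea is that once all members of a group share one id, intra-group confusions look like matches to the unchanged matcher and are counted as TP.

```python
# Hypothetical sketch of a group_map: class ids inside a group collapse onto
# that group's id, while classes outside any group keep a unique id (offset by
# the number of groups so ids never collide).
def build_group_map(class_groups: dict[str, list[int]], num_classes: int) -> list[int]:
    group_map = [len(class_groups) + c for c in range(num_classes)]  # ungrouped stay distinct
    for gid, members in enumerate(class_groups.values()):
        for c in members:
            group_map[c] = gid
    return group_map

# Example: a 'vehicle' group merging classes 0 and 1; class 2 stays separate
gmap = build_group_map({"vehicle": [0, 1]}, num_classes=3)
print([gmap[c] for c in [0, 1, 2]])  # [0, 0, 3] — a class-0 GT matched by a
                                     # class-1 prediction now shares group id 0
```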
Add _enrich_data_stats_with_groups() to BasePipeline: aggregates instances_per_class → instances_per_group using group_map, then appends it to the data_stats dict before log_results(). stdout.py reads instances_per_group to populate '# of Instances' column in the group table, including totals for All(group) rows. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
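The aggregation step can be sketched as below. This is an illustration under assumed data shapes (dicts keyed by class/group id), not the actual BasePipeline code.

```python
from collections import defaultdict

# Hypothetical sketch of the instances_per_class -> instances_per_group
# aggregation: each class's instance count is added to its group's total,
# using the class-id -> group-id mapping.
def aggregate_instances(
    instances_per_class: dict[int, int], group_map: dict[int, int]
) -> dict[int, int]:
    instances_per_group: dict[int, int] = defaultdict(int)
    for class_id, count in instances_per_class.items():
        instances_per_group[group_map[class_id]] += count
    return dict(instances_per_group)

# Example: classes 0 and 1 belong to group 0, class 2 to group 1
print(aggregate_instances({0: 90, 1: 10, 2: 5}, {0: 0, 1: 0, 2: 1}))
# {0: 100, 1: 5}
```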
- detection/metric.py: Replace single-group meter attrs with list-based group_metric_meters/group_classwise_meter_sets/group_weighted_metric_meters; add _init_group_meter_sets, _apply_group_updates helpers; update all 5 metric classes (__init__ + calibrate) to use group_configs list
- base.py: reset_values and result() already use list-based attrs (no change)
- builder.py: _parse_class_groups detects single/multi format; returns List[dict]
- pipelines/base.py: _enrich_data_stats_with_groups produces instances_per_group_list; _convert_classwise_to_names handles group_results list with per-config group_names
- loggers/stdout.py: render one group table per group config using group_results list

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
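The single/multi format detection can be sketched as follows. The exact accepted shapes are an assumption (a single mapping versus a list of mappings); what the commit states is only that _parse_class_groups normalises both into a List[dict] of group configs.

```python
# Hypothetical sketch of single/multi config normalisation: a single
# {group_name: [class_ids]} mapping is wrapped in a one-element list, a list
# of such mappings passes through, and None means no groups — so downstream
# code always iterates over a list of group configs.
def parse_class_groups(class_groups) -> list[dict]:
    if class_groups is None:
        return []
    if isinstance(class_groups, dict):  # single-config shorthand
        return [class_groups]
    return list(class_groups)           # already a list of configs

print(parse_class_groups({"vehicle": [0, 1]}))          # [{'vehicle': [0, 1]}]
print(parse_class_groups([{"a": [0]}, {"b": [1]}]))     # passed through as-is
print(parse_class_groups(None))                         # []
```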
Pass image names through _convert_images_as_readable and use them
in ImageSaver.save_result instead of sequential indices.
Output: {stem}_{key}.png (evaluation/inference)
{epoch:04d}_{stem}_{key}.png (training/validation)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Summary
Adds class group analysis to the detection metrics. Group-level performance, which treats intra-group confusions as correct, is printed as a separate table.
New Features
When class_groups is configured, mAP/Precision/Recall are computed with intra-group class confusions treated as TP; multiple group configs (e.g. by_9class, by_3class) are supported.
YAML Configuration
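A configuration fragment might look like the sketch below. The metrics/class_groups keys in config/logging.yaml and the by_3class config name come from the PR; the group names and class ids are purely illustrative, and the exact multi-config schema is an assumption.

```yaml
# Hypothetical sketch of class_groups in config/logging.yaml
# (class_groups is null by default; shown here with one named group config)
metrics:
  class_groups:
    by_3class:
      vehicle: [0, 1, 2]
      person: [3]
      sign: [4, 5]
```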
Output Example
Modified Files
- metrics/detection/metric.py
- metrics/base.py
- metrics/builder.py: _parse_class_groups for single/multi format detection
- pipelines/base.py: _enrich_data_stats_with_groups, _convert_classwise_to_names for group_results
- pipelines/builder.py: pass class_map to build_metrics
- loggers/stdout.py
- loggers/base.py
- loggers/image.py