feat(core): Add class group-aware metric analysis with multi-config support#12

Open
Go-MinSeong wants to merge 26 commits into dev from feat/group-metric-analysis
Conversation

@Go-MinSeong
Collaborator

Summary

Adds class-group analysis to the detection metrics. Group-level performance, in which intra-group confusion is treated as correct, is printed as a separate table.

Note: This PR is stacked on top of fix/metric-bugs-and-improvements (#11); that PR must be merged first.

New Features

  • Group metrics: when class_groups is set in the YAML, mAP/Precision/Recall are computed with intra-group class confusion counted as TP
  • Multiple group configs: several grouping schemes are supported simultaneously (e.g. by_9class, by_3class)
  • Instance counts: the group table shows the number of GT instances per group
  • Supported in training and evaluation: group metrics are printed in every phase (train, valid, evaluation)
  • Saved image filenames: during evaluation/inference, images are saved under the original filename stem

YAML Configuration

# Single config
metrics:
  class_groups:
    vehicle: [car, truck]
    person: [pedestrian_adult, pedestrian_child]

# Multiple configs
metrics:
  class_groups:
    by_9class:
      car: [passenger_car, suv, van]
      bus: [midsize_bus, fullsize_bus]
    by_3class:
      car: [passenger_car, suv, van, midsize_bus, fullsize_bus]
      emergency: [firetruck, police, ambulance]
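
Both YAML shapes above could be normalised into one internal representation. A minimal sketch of how single- vs multi-config detection might work (the function name `parse_class_groups` and the `"default"` config name are assumptions for illustration, not the actual builder code):

```python
from typing import List

def parse_class_groups(class_groups: dict) -> List[dict]:
    """Normalise class_groups into a list of named group configs.

    Single format:   {group: [classes], ...}            -> one default config
    Multiple format: {config: {group: [classes]}, ...}  -> one entry per config
    """
    if not class_groups:
        return []
    # If every top-level value is itself a dict, treat the keys as config names.
    if all(isinstance(v, dict) for v in class_groups.values()):
        return [{name: groups} for name, groups in class_groups.items()]
    # Otherwise it is a single, unnamed config.
    return [{"default": class_groups}]

single = {"vehicle": ["car", "truck"],
          "person": ["pedestrian_adult", "pedestrian_child"]}
multi = {"by_9class": {"car": ["passenger_car", "suv", "van"]},
         "by_3class": {"car": ["passenger_car", "suv", "van",
                               "midsize_bus", "fullsize_bus"]}}
print(len(parse_class_groups(single)))  # one config
print(len(parse_class_groups(multi)))   # two configs
```

Returning a list either way lets downstream code iterate over group configs uniformly, which matches the list-based meters described below.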

Output Example

Group-aware metric [by_9class] (intra-group confusion = TP):
+----------+-------------+----------------+-------+
| Group ID | Group name  | # of Instances | mAP50 |
+----------+-------------+----------------+-------+
| 0        | car         | 1234           | 0.85  |
| 1        | bus         | 456            | 0.72  |
| -        | All (group) | 1690           | 0.79  |
+----------+-------------+----------------+-------+
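
The core idea behind "intra-group confusion = TP" can be sketched with a class-ID-to-group-ID lookup array; the class names and ID assignments below are illustrative assumptions, not the project's actual class map:

```python
import numpy as np

class_names = ["passenger_car", "suv", "van", "midsize_bus", "fullsize_bus"]
groups = {"car": ["passenger_car", "suv", "van"],
          "bus": ["midsize_bus", "fullsize_bus"]}

# Build a class ID -> group ID lookup (-1 marks ungrouped classes)
group_map = np.full(len(class_names), -1, dtype=int)
for gid, members in enumerate(groups.values()):
    for name in members:
        group_map[class_names.index(name)] = gid

pred_classes = np.array([0, 1, 3])  # passenger_car, suv, midsize_bus
gt_classes = np.array([1, 2, 4])    # suv, van, fullsize_bus

# At the group level, pred/GT pairs that land in the same group count as TP,
# even though every pair here is a per-class mismatch.
same_group = group_map[pred_classes] == group_map[gt_classes]
print(same_group.tolist())  # [True, True, True]
```

Remapping class IDs to group IDs this way lets the existing per-class meter machinery be reused unchanged at the group level.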

Modified Files

File                         Change
metrics/detection/metric.py  Group stats collection, list-based multi-config meters
metrics/base.py              Multi-group meter lists in BaseMetric/MetricFactory
metrics/builder.py           _parse_class_groups for single/multi format detection
pipelines/base.py            _enrich_data_stats_with_groups, _convert_classwise_to_names for group_results
pipelines/builder.py         Pass class_map to build_metrics
loggers/stdout.py            Per-config group table rendering
loggers/base.py              Pass image name stem to ImageSaver
loggers/image.py             Use original filename stem for saved images

hglee98 and others added 26 commits June 11, 2025 08:59
…onfig

Add configuration files for YOLOv9 Tiny model training from scratch
Bug fixes:
- Bug1: Fix double-registration of true_class_ids when pred=0 and GT>0 (if→elif chain)
  causing recall to be ~half the correct value
- Bug2: Register FP predictions when GT=0 so precision is not over-estimated
- Bug3: Fix Precision50 to use IoU=0.5 (index 0) instead of all IoU thresholds average
- Bug4: Reset classwise_metric_meters each epoch in MetricFactory.reset_values()
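
The shape of the Bug1/Bug2 fix can be sketched as an if/elif chain that registers each image's stats exactly once; the function and variable names here are assumptions for illustration, not the project's actual code:

```python
import numpy as np

def collect_stats(pred_classes: np.ndarray, gt_classes: np.ndarray):
    """Register per-image stats exactly once per branch.

    With independent `if` statements, the "no predictions" case and a later
    generic case could both append gt_classes, double-counting GT and roughly
    halving recall; `elif` makes the branches mutually exclusive.
    """
    true_ids, pred_ids = [], []
    if len(pred_classes) == 0 and len(gt_classes) > 0:
        true_ids.append(gt_classes)    # GT with no predictions: FN only
    elif len(pred_classes) > 0 and len(gt_classes) == 0:
        pred_ids.append(pred_classes)  # predictions with no GT: FP only (Bug2)
    elif len(pred_classes) > 0 and len(gt_classes) > 0:
        true_ids.append(gt_classes)
        pred_ids.append(pred_classes)
    return true_ids, pred_ids

t, p = collect_stats(np.array([]), np.array([1, 2]))
print(len(t), len(p))  # GT registered once, no FP entries
```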

New features:
- Feature5: Enable classwise metrics for train phase (not only valid)
- Feature6: Add weighted_mean (instance-count weighted) alongside unweighted mean
  for all detection metrics; display as 'All (weighted)' row in stdout logs

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
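
The weighted_mean from Feature6 can be illustrated with the per-group numbers from the example table; the values below are taken from that table, and the simple dict layout is an assumption (the real code uses meter objects):

```python
# Unweighted vs instance-count-weighted mean over per-class AP values.
ap_per_class = {"car": 0.85, "bus": 0.72}
instances = {"car": 1234, "bus": 456}

# 'All' row: plain mean over classes
mean_ap = sum(ap_per_class.values()) / len(ap_per_class)

# 'All (weighted)' row: each class weighted by its GT instance count
total = sum(instances.values())
weighted_mean_ap = sum(ap_per_class[c] * instances[c]
                       for c in ap_per_class) / total

print(round(mean_ap, 4))           # 0.785
print(round(weighted_mean_ap, 4))  # pulled toward the larger 'car' class
```

The weighted mean shifts toward the majority class, which is why reporting both values side by side is informative on imbalanced datasets.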
log_end_evaluation() explicitly copied only the 'mean' key when converting
classwise class numbers to names, silently dropping 'weighted_mean'.
Use a dict comprehension to preserve all non-classwise keys (mean, weighted_mean, etc.)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Adds class_groups config in logging.yaml (under metrics) to define which
classes should be treated as a group. Intra-group predictions are counted
as TP, enabling group-level precision/recall/mAP evaluation alongside the
existing per-class and overall metrics.

Key changes:
- config/logging.yaml: new class_groups field (null by default)
- metrics/builder.py: parse class_groups → group_map/group_names/group_map_array
- metrics/base.py: MetricFactory stores group info; BaseMetric has optional
  group meters; result() includes group_classwise/group_mean/group_weighted_mean;
  reset_values() resets group meters each epoch
- metrics/detection/metric.py: DetectionMetricAdaptor computes parallel
  group_stats by remapping class IDs → group IDs; each detection metric
  initialises group meters and calls _update_group_meters() in calibrate()
- loggers/stdout.py: prints a second 'Group-aware metric' table when groups
  are configured, showing per-group and All(group)/All(group/weighted) rows
- pipelines/base.py: _convert_classwise_to_names() shared helper handles
  both classwise and group_classwise key conversion (int → 'id_name' string)
- pipelines/builder.py: passes class_map to build_metrics()
- pipelines/train.py + evaluation.py: use base class helper

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
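
The int-to-'id_name' key conversion described above can be sketched as follows; the helper signature and the 'id_name' format are assumptions drawn from the commit message, not the actual pipeline code:

```python
def convert_classwise_to_names(results: dict, class_map: dict) -> dict:
    """Convert int class-ID keys to 'id_name' strings, keeping other keys.

    Non-classwise keys such as 'mean' and 'weighted_mean' pass through
    untouched, which is exactly what the earlier mean-only copy dropped.
    """
    return {(f"{k}_{class_map[k]}" if isinstance(k, int) else k): v
            for k, v in results.items()}

class_map = {0: "car", 1: "bus"}
out = convert_classwise_to_names({0: 0.85, 1: 0.72, "mean": 0.785}, class_map)
print(out)  # {'0_car': 0.85, '1_bus': 0.72, 'mean': 0.785}
```

Because group_classwise results use the same int-keyed shape, one helper can serve both the per-class and per-group conversion paths.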
Add _enrich_data_stats_with_groups() to BasePipeline: aggregates
instances_per_class → instances_per_group using group_map, then
appends it to the data_stats dict before log_results().
stdout.py reads instances_per_group to populate '# of Instances'
column in the group table, including totals for All(group) rows.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
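
The instances_per_class to instances_per_group aggregation reduces to a grouped sum; the class names and counts below are assumptions chosen so the totals match the example table:

```python
from collections import defaultdict

instances_per_class = {"passenger_car": 700, "suv": 400, "van": 134,
                       "midsize_bus": 300, "fullsize_bus": 156}
class_to_group = {"passenger_car": "car", "suv": "car", "van": "car",
                  "midsize_bus": "bus", "fullsize_bus": "bus"}

# Sum per-class GT instance counts into their groups
instances_per_group = defaultdict(int)
for cls, count in instances_per_class.items():
    instances_per_group[class_to_group[cls]] += count

print(dict(instances_per_group))  # {'car': 1234, 'bus': 456}
```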
- detection/metric.py: Replace single-group meter attrs with list-based
  group_metric_meters/group_classwise_meter_sets/group_weighted_metric_meters;
  add _init_group_meter_sets, _apply_group_updates helpers; update all 5
  metric classes (__init__ + calibrate) to use group_configs list
- base.py: reset_values and result() already use list-based attrs (no change)
- builder.py: _parse_class_groups detects single/multi format; returns List[dict]
- pipelines/base.py: _enrich_data_stats_with_groups produces instances_per_group_list;
  _convert_classwise_to_names handles group_results list with per-config group_names
- loggers/stdout.py: render one group table per group config using group_results list

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Pass image names through _convert_images_as_readable and use them
in ImageSaver.save_result instead of sequential indices.

Output: {stem}_{key}.png (evaluation/inference)
        {epoch:04d}_{stem}_{key}.png (training/validation)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
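
The filename formats quoted in the commit message could be produced by a small helper like this; the function name `saved_name` is an assumption, while the two format strings are copied from the commit message:

```python
from pathlib import Path
from typing import Optional

def saved_name(image_path: str, key: str, epoch: Optional[int] = None) -> str:
    """Build the saved-image filename from the original file's stem."""
    stem = Path(image_path).stem
    if epoch is None:                       # evaluation / inference
        return f"{stem}_{key}.png"
    return f"{epoch:04d}_{stem}_{key}.png"  # training / validation

print(saved_name("data/val/frame_000123.jpg", "pred"))
print(saved_name("data/val/frame_000123.jpg", "pred", epoch=7))
```

Keying saved images by the source stem (instead of a sequential index) makes it possible to trace any output image back to its input file.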
@Go-MinSeong Go-MinSeong requested a review from Dain-Jeong March 19, 2026 10:22