Conversation
Walkthrough

Handling for an environment with OSIV (Open Session In View) disabled.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Kafka
    participant Consumer as ClickConsumer
    participant AdRepo as AdContentRepository
    participant ClickLogRepo as ClickLogRepository
    participant Redis
    Kafka->>Consumer: deliver click event
    Consumer->>AdRepo: findByIdWithGroupAndCampaign(id)
    AdRepo-->>Consumer: AdContent + AdGroup + AdCampaign
    alt isSuspect && orgId != null
        Consumer->>Redis: SET click:suspect:alert:org:{orgId} (JSON) EX 60
        Redis-->>Consumer: OK
    end
    Consumer->>ClickLogRepo: save(ClickLog)
    ClickLogRepo-->>Consumer: save complete
```
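The Redis step in the diagram (a `SET` with a 60-second TTL) can be sketched in plain Java. This is a hedged illustration, not the PR's actual code: the key format and TTL come from the diagram, while the `SuspectClickAlert` record, its fields, and the `buildAlertJson` helper are hypothetical. In the real consumer the final write would presumably go through something like `redisTemplate.opsForValue().set(key, json, ttl)`.

```java
import java.time.Duration;

public class SuspectAlertSketch {
    // Hypothetical payload shape; the PR's actual DTO fields are not visible here.
    record SuspectClickAlert(long orgId, long adContentId, String campaignName) {}

    // Key format taken from the sequence diagram: click:suspect:alert:org:{orgId}
    static String alertKey(long orgId) {
        return "click:suspect:alert:org:" + orgId;
    }

    // Minimal hand-rolled JSON for illustration; the consumer presumably uses
    // its configured ObjectMapper instead.
    static String buildAlertJson(SuspectClickAlert a) {
        return "{\"orgId\":" + a.orgId()
                + ",\"adContentId\":" + a.adContentId()
                + ",\"campaignName\":\"" + a.campaignName() + "\"}";
    }

    public static void main(String[] args) {
        SuspectClickAlert alert = new SuspectClickAlert(7L, 42L, "spring-sale");
        String key = alertKey(alert.orgId());
        String json = buildAlertJson(alert);
        Duration ttl = Duration.ofSeconds(60); // matches EX 60 in the diagram
        // In the real consumer (assumption):
        // redisTemplate.opsForValue().set(key, json, ttl);
        System.out.println(key + " -> " + json + " (ttl=" + ttl.getSeconds() + "s)");
    }
}
```

The 60-second expiry means an alert for a given org self-cleans if no consumer (e.g. an SSE endpoint) reads it in time.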
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs
Suggested reviewers
Additional comment (brief): resolving the OSIV-disabled issue directly with a JOIN FETCH is a sound design. In review, please double-check the null-handling of the payload stored in Redis (adGroup/adCampaign names, etc.) and the ObjectMapper configuration (including any custom serializers).

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
🧹 Nitpick comments (2)
src/main/java/com/whereyouad/WhereYouAd/infrastructure/client/kafka/ClickConsumer.java (2)
113-115: The exception-handling strategy is appropriate. Letting the main logic (persisting the ClickLog to the DB) proceed even when the Redis write fails is the right call from a fault-isolation perspective: a transient failure of the alert feature does not affect the core business logic (click-log persistence).

That said, for operational monitoring, including extra context (e.g. orgId, adContentId) in the log.error call would help debugging:

💡 Logging improvement suggestion

```diff
 } catch (Exception e) {
-    log.error("봇 알림 JSON 변환/저장 실패", e);
+    log.error("봇 알림 JSON 변환/저장 실패 - orgId={}, adContentId={}",
+            event.getOrgId(), event.getAdContentId(), e);
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/main/java/com/whereyouad/WhereYouAd/infrastructure/client/kafka/ClickConsumer.java` around lines 113 - 115, In the catch block inside ClickConsumer where Redis save failures are logged (the catch(Exception e) that currently calls log.error("봇 알림 JSON 변환/저장 실패", e)), augment the log message with contextual identifiers (e.g., orgId and adContentId extracted from the surrounding scope or the parsed DTO) so the error record includes those values; update the log.error invocation to include the context fields alongside the exception while keeping the existing behavior (do not rethrow so ClickLog DB save continues).
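The fault-isolation pattern this comment praises can be reduced to a small self-contained sketch. The names below (`publishAlert`, `consume`, the in-memory `savedLogs` list) are stand-ins, not the PR's code; the point is that the alert step may throw, but the log save still runs.

```java
import java.util.ArrayList;
import java.util.List;

public class FaultIsolationSketch {
    final List<String> savedLogs = new ArrayList<>(); // stand-in for ClickLogRepository
    boolean alertShouldFail;                          // simulates a Redis outage

    void publishAlert(String payload) {
        if (alertShouldFail) {
            throw new IllegalStateException("redis down");
        }
    }

    // Mirrors the reviewed pattern: alert failures are caught and logged,
    // never rethrown, so the core click-log save always proceeds.
    void consume(String clickEvent) {
        try {
            publishAlert("alert:" + clickEvent);
        } catch (Exception e) {
            // log.error("bot alert failed - event={}", clickEvent, e); (logging elided)
        }
        savedLogs.add(clickEvent); // core business logic isolated from the failure
    }
}
```

Swallowing the exception is deliberate here; rethrowing would turn a Redis outage into lost click logs (or Kafka redeliveries), which is exactly what the design avoids.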
83-100: The NPE guards are cleanly written! 👏 Setting defaults up front and checking the associated entities step by step is safe.

Since findByIdWithGroupAndCampaign uses a fetch join, adGroup and adCampaign are unlikely to be null in the normal path, but this is good defensive programming against data-integrity issues or exceptional cases.

One tip: hoisting the repeated adContent.getAdGroup().getAdCampaign() call into a local variable would improve readability:

♻️ Optional refactoring suggestion

```diff
 if (adContent.getAdGroup() != null && adContent.getAdGroup().getAdCampaign() != null) {
+    var campaign = adContent.getAdGroup().getAdCampaign();
     // extract the campaign name
-    if (adContent.getAdGroup().getAdCampaign().getName() != null) {
-        campaignNameStr = adContent.getAdGroup().getAdCampaign().getName();
+    if (campaign.getName() != null) {
+        campaignNameStr = campaign.getName();
     }
     // extract the platform (Provider)
-    if (adContent.getAdGroup().getAdCampaign().getProvider() != null) {
-        providerStr = String.valueOf(adContent.getAdGroup().getAdCampaign().getProvider());
+    if (campaign.getProvider() != null) {
+        providerStr = String.valueOf(campaign.getProvider());
     }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/main/java/com/whereyouad/WhereYouAd/infrastructure/client/kafka/ClickConsumer.java` around lines 83 - 100, The repeated chained calls adContent.getAdGroup().getAdCampaign() should be hoisted into a local variable to improve readability and avoid redundant accessor calls; create a local variable (e.g., AdCampaign adCampaign = adContent.getAdGroup() != null ? adContent.getAdGroup().getAdCampaign() : null), then use adCampaign to set providerStr and campaignNameStr, and keep the existing adNameStr logic intact.
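The suggested refactoring can be checked in isolation. The sketch below uses minimal stand-in records (`AdCampaign`, `AdGroup`, `AdContent` here are simplified assumptions, not the project's JPA entities), and the `"UNKNOWN"` default strings are placeholders, since the PR's actual defaults are not shown; it demonstrates that the hoisted `campaign` variable preserves the defensive fallbacks.

```java
public class HoistedGuardSketch {
    // Simplified stand-ins for the real JPA entities.
    record AdCampaign(String name, String provider) {}
    record AdGroup(AdCampaign adCampaign) {}
    record AdContent(AdGroup adGroup) {}

    // Returns {campaignName, provider}, falling back to defaults on any null link.
    static String[] extract(AdContent adContent) {
        String campaignNameStr = "UNKNOWN"; // defaults set up front, as in the PR
        String providerStr = "UNKNOWN";
        if (adContent.adGroup() != null && adContent.adGroup().adCampaign() != null) {
            var campaign = adContent.adGroup().adCampaign(); // hoisted local variable
            if (campaign.name() != null) {
                campaignNameStr = campaign.name();
            }
            if (campaign.provider() != null) {
                providerStr = String.valueOf(campaign.provider());
            }
        }
        return new String[] { campaignNameStr, providerStr };
    }
}
```

Because the outer null checks run before the local is assigned, the hoist changes neither behavior nor null-safety, only readability.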
ℹ️ Review info
⚙️ Run configuration
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: c644e6a3-2565-4207-a669-3fb9de9c0af2
📒 Files selected for processing (2)
- src/main/java/com/whereyouad/WhereYouAd/domains/advertisement/persistence/repository/AdContentRepository.java
- src/main/java/com/whereyouad/WhereYouAd/infrastructure/client/kafka/ClickConsumer.java
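The AdContentRepository change itself is not shown in this conversation, but a fetch-join query of the kind the review describes (a sketch under assumed entity and field names, not the PR's actual code) typically looks like the following Spring Data JPA fragment, which initializes both associations in one query so they are usable after the persistence session closes (OSIV disabled):

```java
public interface AdContentRepository extends JpaRepository<AdContent, Long> {

    // Load the ad together with its group and campaign in a single query, so the
    // associations are already initialized when the consumer reads them.
    @Query("""
            select ac from AdContent ac
            join fetch ac.adGroup g
            join fetch g.adCampaign
            where ac.id = :id
            """)
    Optional<AdContent> findByIdWithGroupAndCampaign(@Param("id") Long id);
}
```

Note that a plain `join fetch` returns no row when the group or campaign is missing, which is one more reason the consumer's null-safe defaults are worth keeping.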
kingmingyu
left a comment
P4: Great work..! This was my first time seeing SSE-related code and the OSIV setting, so I learned a lot!!
📌 Related Issue
🚀 Overview
Added logic to properly store anomalous-click information in Redis, after confirming that ClickConsumer was missing the part that records this information.
📄 Work Details
📸 Screenshots / Test Results (optional)
After the fix, sending a request to the redirect URL via Postman correctly outputs the anomalous-click information.


✅ Checklist
🔍 Review Points
Summary by CodeRabbit
Changes
New Features
Performance Improvements
Documentation