[Feat] code_generate 도구 구현#12

Open
haein45 wants to merge 4 commits into dev from feat/10-code-generate

Conversation

Collaborator

@haein45 haein45 commented May 14, 2026

📌 Related Issue

🏷️ PR Type

  • ✨ Feature
  • 🐛 Bug Fix
  • ♻️ Refactoring
  • 📝 Documentation
  • 🎨 Style
  • ✅ Test

📝 Changes

  • Implemented src/proovy_agent/graph/tools/code_generate.py
  • Implemented src/proovy_agent/common/sse/context.py: ContextVar-based SSE emitter hand-off
  • Defined as a LangChain @tool; takes problem/approach as input and returns runnable Python code
  • Generates code using get_llm("flash")
  • Emits SSE tool_start / tool_result events from inside the tool via the current_emitter ContextVar
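As a rough sketch of what the ContextVar hand-off in context.py might look like (everything beyond the current_emitter name is an assumption, not the project's actual code):

```python
from contextvars import ContextVar
from typing import Any, Optional

# Holds the active SSE emitter for the current async execution context.
# The caller installs it before tools run; tools read it with .get() and
# simply skip event emission when no emitter is installed (None).
current_emitter: ContextVar[Optional[Any]] = ContextVar("current_emitter", default=None)
```

Because ContextVar values are scoped per asyncio task, concurrent requests each see their own emitter without any explicit parameter passing.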

📸 Screenshots

  • N/A (depends on external APIs)

✅ Checklist

  • Ready for code review
  • Tests written and all passing
  • Documentation updated (where needed)
  • Code style guide followed
  • Self-review completed

📎 Other Notes

  • Unit tests not written, due to the external API (ChatOpenRouter) dependency
  • The current_emitter ContextVar is set() by the CoreSolver node before execution and read via get() inside the tool
  • The LLM client / SSE commits included via cherry-pick will be deduplicated automatically once the preceding PRs are merged
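A minimal sketch of the set()/get() lifecycle described above; run_with_emitter is a hypothetical wrapper standing in for what the CoreSolver node would do, not the actual implementation:

```python
import asyncio
from contextvars import ContextVar

current_emitter: ContextVar = ContextVar("current_emitter", default=None)

async def run_with_emitter(emitter, tool_coro_fn):
    # Install the emitter before the tool runs and reset it afterwards,
    # so later code in this task never observes a stale emitter.
    token = current_emitter.set(emitter)
    try:
        return await tool_coro_fn()
    finally:
        current_emitter.reset(token)

async def fake_tool():
    # Tools reference the emitter via the context variable, not a parameter.
    return current_emitter.get()
```

The try/finally reset matters: without it, a tool that raises would leave the emitter installed for whatever runs next in the same context.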

Summary by CodeRabbit

Release Notes

  • New Features
    • Added Python code generation for verifying math problems
    • Improved user feedback through real-time event streaming (task start, progress, and result notifications)
    • Improved large language model management for better performance

Review Change Stack

haein45 and others added 3 commits May 14, 2026 15:15
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

coderabbitai Bot commented May 14, 2026

Warning

Rate limit exceeded

@haein45 has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 37 minutes and 4 seconds before requesting another review.

You’ve run out of usage credits. Purchase more in the billing tab.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: d7db06ad-2651-411c-8314-b8c24f4c0753

📥 Commits

Reviewing files that changed from the base of the PR and between 64cb051 and 0993b24.

📒 Files selected for processing (1)
  • src/proovy_agent/graph/tools/code_generate.py
📝 Walkthrough

Adds OpenRouter-based LLM client caching, SSE event-streaming infrastructure, and a code_generate tool that integrates them. Caches one ChatOpenRouter instance per model alias, supports real-time event streaming through an async-queue-based SSE emitter, and implements Python code generation for verifying math problems.

Changes

LLM client and SSE streaming infrastructure added

OpenRouter LLM client and model caching
src/proovy_agent/common/llm/client.py
Maps the flash/sonnet/opus aliases to OpenRouter model names via the MODEL_MAP constant; get_llm(model) validates the alias, checks that the API key is configured, and caches instances.

SSE event models and emitter
src/proovy_agent/common/sse/events.py, src/proovy_agent/common/sse/emitter.py, src/proovy_agent/common/sse/context.py
Defines ten SSE event types (EventType) and a Pydantic SSEEvent model, and implements asynchronous event streaming in the asyncio-queue-based SSEEmitter class. The current_emitter context variable tracks the active emitter during execution.

code_generate tool
src/proovy_agent/graph/tools/code_generate.py
An asynchronous code-generation tool built with the LangChain @tool decorator; it takes problem/approach as input, obtains an LLM via get_llm("flash"), and returns runnable Python code. Emits start/result events over SSE.
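The alias validation and per-alias caching described for client.py could be sketched roughly as below; the ChatOpenRouter stand-in class and the model IDs are placeholders, not the project's actual values:

```python
import os
from functools import lru_cache

class ChatOpenRouter:
    # Stand-in for the real OpenRouter chat client; only the constructor
    # shape matters for illustrating the caching logic.
    def __init__(self, model: str):
        self.model = model

# Alias -> OpenRouter model name; the IDs here are illustrative only.
MODEL_MAP = {
    "flash": "google/gemini-flash-placeholder",
    "sonnet": "anthropic/claude-sonnet-placeholder",
    "opus": "anthropic/claude-opus-placeholder",
}

@lru_cache(maxsize=None)
def get_llm(model: str) -> ChatOpenRouter:
    # Validate the alias, check the API key, and cache one client per alias.
    if model not in MODEL_MAP:
        raise ValueError(f"unknown model alias: {model!r}")
    if not os.environ.get("OPENROUTER_API_KEY"):
        raise RuntimeError("OPENROUTER_API_KEY is not set")
    return ChatOpenRouter(model=MODEL_MAP[model])
```

lru_cache keyed on the alias gives the "check/create cached instance" step in the sequence diagram for free; repeated get_llm("flash") calls return the same object.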

Sequence Diagram(s)

sequenceDiagram
  participant Caller
  participant Tool as code_generate Tool
  participant Emitter as SSEEmitter
  participant LLMClient as get_llm(flash)
  participant Queue as asyncio.Queue
  
  Caller->>Tool: code_generate(problem, approach)
  alt SSE emitter available
    Tool->>Emitter: emit(tool_start, {name, input})
    Emitter->>Queue: put(SSEEvent)
  end
  Tool->>LLMClient: request ChatOpenRouter instance
  LLMClient->>LLMClient: check/create cached instance
  LLMClient->>Tool: return ChatOpenRouter
  Tool->>Tool: invoke LLM with system prompt
  alt SSE emitter available
    Tool->>Emitter: emit(tool_result, {output})
    Emitter->>Queue: put(SSEEvent)
  end
  Tool->>Caller: return code string
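The asyncio-queue-based emitter shown in the diagram could be sketched as follows; the sentinel-based close() is an assumption about how the stream terminates, not the project's confirmed design:

```python
import asyncio
from typing import AsyncIterator

class SSEEmitter:
    # Async-queue-backed emitter: producers (tools) call emit(), and the
    # HTTP layer iterates stream() to forward events to the client.
    def __init__(self) -> None:
        self._queue: asyncio.Queue = asyncio.Queue()

    async def emit(self, event_type: str, data: dict) -> None:
        await self._queue.put({"event": event_type, "data": data})

    async def stream(self) -> AsyncIterator[dict]:
        while True:
            event = await self._queue.get()
            if event is None:  # sentinel marks end of stream
                break
            yield event

    async def close(self) -> None:
        await self._queue.put(None)
```

Decoupling producers from the consumer through the queue is what lets a tool deep inside the graph emit tool_start / tool_result without knowing anything about the HTTP response.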

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

  • Team-Proovy/proovy-agent#4: the OpenRouter client implementation there includes the same MODEL_MAP and get_llm(model) caching logic, so there is direct module-level duplication.

Suggested reviewers

  • gaeunee2

Poem

🐰 A code-writing rabbit on the run,
weaving LLM and event stream as one,
an alias calls each model by name,
the SSE queue flows in real time,
math-solution checks in Python code,
the proof begins right down this road!

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The PR title '[Feat] code_generate 도구 구현' clearly summarizes the main change (implementation of the code_generate tool).
Linked Issues check ✅ Passed The code_generate tool is implemented to meet the requirements: LangChain @tool decorator, problem/approach inputs, Python code return value, get_llm('flash') usage, async implementation, and adherence to cost-optimization principles.
Out of Scope Changes check ✅ Passed The added files (llm/client.py, sse/context.py, sse/emitter.py, sse/events.py) are all supporting infrastructure needed for the code_generate tool and are within scope.
Docstring Coverage ✅ Passed Docstring coverage is 85.71% which is sufficient. The required threshold is 80.00%.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Commit unit tests in branch feat/10-code-generate

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@haein45 haein45 self-assigned this May 15, 2026
@haein45 haein45 added the 혜인 label May 15, 2026
@haein45 haein45 requested review from chowon442 and gaeunee2 May 15, 2026 09:53
@haein45 haein45 changed the title feat: code_generate 도구 구현 [Feat] code_generate 도구 구현 May 15, 2026
Collaborator Author

haein45 commented May 15, 2026

@coderabbitai review, please


coderabbitai Bot commented May 15, 2026

@haein45 Starting the PR review now! Just a moment 🐇

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (1)
src/proovy_agent/graph/tools/code_generate.py (1)

27-44: ⚡ Quick win

Please add a terminating SSE event when the LLM call fails.

Currently, if an exception is raised, only tool_start is emitted and the result/error event can be lost. Emitting an error event from a try/except keeps the UI state consistent.

Suggested diff
     emitter = current_emitter.get()
     if emitter:
         await emitter.emit(
             "tool_start", {"name": "code_generate", "label": "🔧 검증 코드 생성 중..."}
         )
 
-    llm = get_llm("flash")
-    messages = [
-        {"role": "system", "content": _SYSTEM_PROMPT},
-        {"role": "user", "content": f"문제: {problem}\n\n풀이 방향: {approach}"},
-    ]
-    response = await llm.ainvoke(messages)
-    code = response.content.strip()
+    try:
+        llm = get_llm("flash")
+        messages = [
+            {"role": "system", "content": _SYSTEM_PROMPT},
+            {"role": "user", "content": f"문제: {problem}\n\n풀이 방향: {approach}"},
+        ]
+        response = await llm.ainvoke(messages)
+        code = response.content.strip()
+    except Exception as exc:
+        if emitter:
+            await emitter.emit(
+                "error",
+                {"name": "code_generate", "message": str(exc)},
+            )
+        raise
 
     if emitter:
         await emitter.emit("tool_result", {"name": "code_generate", "output": code})
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/proovy_agent/graph/tools/code_generate.py` around lines 27 - 44, The
code_generate function emits "tool_start" but doesn't emit a follow-up event if
the LLM call fails; wrap the LLM invocation and result emission in a try/except
and on exception call emitter.emit with an error event (e.g., "tool_error" or
"tool_result" with an error payload) including the exception message and the
tool name "code_generate", then re-raise or return an appropriate value so the
UI gets a terminal event; ensure you reference current_emitter.get(),
emitter.emit("tool_start", ...), llm.ainvoke(...), and
emitter.emit("tool_result", ...) and add emitter.emit("tool_error",
{"name":"code_generate", "error": str(e)}) in the except block.
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@src/proovy_agent/common/sse/emitter.py`:
- Around line 22-28: Add the missing return type annotation to the async
generator method stream: change its signature in the Emitter class (async def
stream(self)) to specify AsyncIterator[dict[str, str]] so the function is
declared as returning an AsyncIterator of dict[str, str]; ensure you import
AsyncIterator from typing if not already imported.

In `@src/proovy_agent/graph/tools/code_generate.py`:
- Around line 38-40: The code assumes response.content is always a string;
change the extraction to use the LangChain-standard content_blocks on the
response from llm.ainvoke so it handles string, list, or dict formats: fetch
response.content_blocks, join or pick the primary textual block and then strip
it (with a safe fallback if content_blocks is missing or empty) instead of
calling response.content.strip(); update the logic around the response variable
returned by llm.ainvoke to use content_blocks and a fallback text to avoid
AttributeError.

---

Nitpick comments:
In `@src/proovy_agent/graph/tools/code_generate.py`:
- Around line 27-44: The code_generate function emits "tool_start" but doesn't
emit a follow-up event if the LLM call fails; wrap the LLM invocation and result
emission in a try/except and on exception call emitter.emit with an error event
(e.g., "tool_error" or "tool_result" with an error payload) including the
exception message and the tool name "code_generate", then re-raise or return an
appropriate value so the UI gets a terminal event; ensure you reference
current_emitter.get(), emitter.emit("tool_start", ...), llm.ainvoke(...), and
emitter.emit("tool_result", ...) and add emitter.emit("tool_error",
{"name":"code_generate", "error": str(e)}) in the except block.
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 1228a28e-5f9e-42ef-ae14-f628397fa846

📥 Commits

Reviewing files that changed from the base of the PR and between 8eda78a and 64cb051.

📒 Files selected for processing (5)
  • src/proovy_agent/common/llm/client.py
  • src/proovy_agent/common/sse/context.py
  • src/proovy_agent/common/sse/emitter.py
  • src/proovy_agent/common/sse/events.py
  • src/proovy_agent/graph/tools/code_generate.py

Comment thread src/proovy_agent/common/sse/emitter.py
Comment thread src/proovy_agent/graph/tools/code_generate.py
@haein45 haein45 force-pushed the feat/10-code-generate branch 2 times, most recently from 6841430 to 64cb051 Compare May 15, 2026 11:56
Since calling .strip() raises AttributeError when response.content is a list,
extract the text safely with an isinstance branch

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
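The fix described in that commit could look roughly like this sketch; extract_text is a hypothetical helper name, and the content-block shapes handled below are the common LangChain ones (plain strings and {"type": "text", ...} dicts):

```python
def extract_text(content) -> str:
    # response.content from LangChain chat models may be a str or a list
    # of content blocks; handle both to avoid AttributeError on .strip().
    if isinstance(content, str):
        return content.strip()
    if isinstance(content, list):
        parts = []
        for block in content:
            if isinstance(block, str):
                parts.append(block)
            elif isinstance(block, dict) and block.get("type") == "text":
                parts.append(block.get("text", ""))
        return "".join(parts).strip()
    # Last-resort fallback for unexpected content shapes.
    return str(content).strip()
```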

Labels

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Feat] code_generate 도구 구현

1 participant