
Conversation

kxz2002 (Contributor) commented Dec 25, 2025

Motivation

The process_response_dict call in serving_completion.py is never awaited, so asynchronous data processors such as EB5's cannot be used.


Modifications

Add a check that determines whether process_response_dict is a coroutine function and awaits it when it is, enabling support for asynchronous data processors.
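
A self-contained sketch of the dispatch pattern (the toy SyncProcessor/AsyncProcessor classes are illustrative and not from the repository; only the inspect.iscoroutinefunction dispatch mirrors the change):

```python
import asyncio
import inspect


class SyncProcessor:
    def process_response_dict(self, data, **kwargs):
        data["processed_by"] = "sync"
        return data


class AsyncProcessor:
    async def process_response_dict(self, data, **kwargs):
        data["processed_by"] = "async"
        return data


async def apply_processor(processor, data):
    # Await process_response_dict only when it is a coroutine function,
    # otherwise call it directly; this is the same dispatch the PR introduces.
    fn = processor.process_response_dict
    if inspect.iscoroutinefunction(fn):
        return await fn(data, stream=False)
    return fn(data, stream=False)


print(asyncio.run(apply_processor(SyncProcessor(), {})))   # {'processed_by': 'sync'}
print(asyncio.run(apply_processor(AsyncProcessor(), {})))  # {'processed_by': 'async'}
```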

Usage or Command

No change to user commands.

Accuracy Tests

Not needed.

Checklist

  • Add at least one tag to the PR title.
    • Tag list: [[FDConfig],[APIServer],[Engine], [Scheduler], [PD Disaggregation], [Executor], [Graph Optimization], [Speculative Decoding], [RL], [Models], [Quantization], [Loader], [OP], [KVCache], [DataProcessor], [BugFix], [Docs], [CI], [Optimization], [Feature], [Benchmark], [Others], [XPU], [HPU], [GCU], [DCU], [Iluvatar], [Metax]]
    • You can add new tags based on the PR content, but the semantics must be clear.
  • Format your code and run pre-commit before committing.
  • Add unit tests, or explain in this PR why none are needed.
  • Provide accuracy results.
  • If the current PR is submitting to the release branch, make sure the PR has been submitted to the develop branch, then cherry-pick it to the release branch with the [Cherry-Pick] PR tag.

paddle-bot bot commented Dec 25, 2025

Thanks for your contribution!

paddle-bot bot added the contributor (External developers) label Dec 25, 2025

codecov-commenter commented Dec 25, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
⚠️ Please upload a report for BASE (develop@01c18f3); the BASE report is missing.

Additional details and impacted files
@@            Coverage Diff             @@
##             develop    #5758   +/-   ##
==========================================
  Coverage           ?   65.51%           
==========================================
  Files              ?      337           
  Lines              ?    43082           
  Branches           ?     6639           
==========================================
  Hits               ?    28225           
  Misses             ?    12754           
  Partials           ?     2103           
Flag    Coverage Δ
GPU     65.51% <100.00%> (?)

Flags with carried forward coverage won't be shown.



Copilot AI left a comment

Pull request overview

This PR fixes a bug where the process_response_dict function in serving_completion.py did not support asynchronous data processors. The fix adds runtime checks to determine if the function is a coroutine and appropriately awaits it if needed.

  • Adds inspect.iscoroutinefunction() checks to detect async processors
  • Handles both sync and async code paths for process_response_dict calls
  • Enables compatibility with EB5's processor which requires async support

Comment on lines 513 to 520
if inspect.iscoroutinefunction(self.engine_client.data_processor.process_response_dict):
    await self.engine_client.data_processor.process_response_dict(
        res, stream=True, include_stop_str_in_output=request.include_stop_str_in_output
    )
else:
    self.engine_client.data_processor.process_response_dict(
        res, stream=True, include_stop_str_in_output=request.include_stop_str_in_output
    )

Copilot AI Dec 25, 2025

The inspect.iscoroutinefunction check is performed on every iteration inside the loop, which can impact performance. Consider caching this check result in the __init__ method of the class as an instance variable (e.g., self._is_process_response_dict_async), and then use the cached value here. This would eliminate the repeated introspection overhead during request processing.
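
A minimal sketch of that caching idea, assuming a serving class that receives engine_client in its constructor (the class name and the _apply_processor method are illustrative; only the attribute name follows the suggestion above):

```python
import inspect


class CompletionServingSketch:
    """Illustrative skeleton, not the real serving_completion.py class."""

    def __init__(self, engine_client):
        self.engine_client = engine_client
        # Introspect once at construction time instead of on every streamed chunk.
        self._is_process_response_dict_async = inspect.iscoroutinefunction(
            engine_client.data_processor.process_response_dict
        )

    async def _apply_processor(self, res, request, stream):
        fn = self.engine_client.data_processor.process_response_dict
        if self._is_process_response_dict_async:
            await fn(res, stream=stream, include_stop_str_in_output=request.include_stop_str_in_output)
        else:
            fn(res, stream=stream, include_stop_str_in_output=request.include_stop_str_in_output)
```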

Comment on lines 318 to 325
if inspect.iscoroutinefunction(self.engine_client.data_processor.process_response_dict):
    await self.engine_client.data_processor.process_response_dict(
        data, stream=False, include_stop_str_in_output=request.include_stop_str_in_output
    )
else:
    self.engine_client.data_processor.process_response_dict(
        data, stream=False, include_stop_str_in_output=request.include_stop_str_in_output
    )

Copilot AI Dec 25, 2025

This code block is duplicated across two locations in the file (lines 318-325 and lines 513-520). Consider extracting this logic into a helper method (e.g., async def _call_process_response_dict) to improve maintainability and ensure consistency if the logic needs to change in the future.
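
A possible shape for that helper, reusing the same illustrative skeleton as in the previous sketch (the method name _call_process_response_dict comes from the suggestion; everything else is assumed):

```python
import inspect


class CompletionServingSketch:
    """Illustrative skeleton only."""

    def __init__(self, engine_client):
        self.engine_client = engine_client

    async def _call_process_response_dict(self, data, request, stream):
        # Single dispatch point shared by the streaming and non-streaming paths.
        fn = self.engine_client.data_processor.process_response_dict
        kwargs = dict(stream=stream, include_stop_str_in_output=request.include_stop_str_in_output)
        if inspect.iscoroutinefunction(fn):
            await fn(data, **kwargs)
        else:
            fn(data, **kwargs)
```

Both quoted blocks would then reduce to await self._call_process_response_dict(res, request, stream=True) and await self._call_process_response_dict(data, request, stream=False).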

Comment on lines 318 to 325
if inspect.iscoroutinefunction(self.engine_client.data_processor.process_response_dict):
    await self.engine_client.data_processor.process_response_dict(
        data, stream=False, include_stop_str_in_output=request.include_stop_str_in_output
    )
else:
    self.engine_client.data_processor.process_response_dict(
        data, stream=False, include_stop_str_in_output=request.include_stop_str_in_output
    )

Copilot AI Dec 25, 2025

The new async path for process_response_dict is not covered by tests. Since this file has comprehensive test coverage in tests/entrypoints/openai/test_serving_completion.py, consider adding test cases that verify the async code path works correctly. This would involve mocking process_response_dict as an async function using AsyncMock and verifying it gets called with await.
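
A sketch of such a test, written against a hypothetical free-standing dispatch function rather than the real serving class; wrapping the AsyncMock in a real async def keeps the inspect.iscoroutinefunction check independent of how a given Python version classifies AsyncMock instances:

```python
import inspect
import unittest
from unittest.mock import AsyncMock, MagicMock


async def call_process_response_dict(engine_client, data, **kwargs):
    # Hypothetical stand-in for the dispatch added in serving_completion.py.
    fn = engine_client.data_processor.process_response_dict
    if inspect.iscoroutinefunction(fn):
        return await fn(data, **kwargs)
    return fn(data, **kwargs)


class TestAsyncProcessResponseDict(unittest.IsolatedAsyncioTestCase):
    async def test_async_processor_is_awaited(self):
        engine_client = MagicMock()
        inner = AsyncMock(return_value={"ok": True})

        async def process_response_dict(data, **kwargs):
            return await inner(data, **kwargs)

        engine_client.data_processor.process_response_dict = process_response_dict

        result = await call_process_response_dict(engine_client, {}, stream=False)
        self.assertEqual(result, {"ok": True})
        inner.assert_awaited_once()


if __name__ == "__main__":
    unittest.main()
```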

Comment on lines 318 to 325
if inspect.iscoroutinefunction(self.engine_client.data_processor.process_response_dict):
    await self.engine_client.data_processor.process_response_dict(
        data, stream=False, include_stop_str_in_output=request.include_stop_str_in_output
    )
else:
    self.engine_client.data_processor.process_response_dict(
        data, stream=False, include_stop_str_in_output=request.include_stop_str_in_output
    )

Copilot AI Dec 25, 2025

The inspect.iscoroutinefunction check is performed on every iteration inside the loop, which can impact performance. Consider caching this check result in the __init__ method of the class as an instance variable (e.g., self._is_process_response_dict_async), and then use the cached value here. This would eliminate the repeated introspection overhead during request processing.

LiqinruiG previously approved these changes Dec 26, 2025

LiqinruiG (Collaborator) left a comment

LGTM

LiqinruiG (Collaborator) left a comment

LGTM

LiqinruiG merged commit cad2932 into PaddlePaddle:develop Dec 26, 2025
18 of 20 checks passed
kxz2002 added a commit to kxz2002/FastDeploy that referenced this pull request Dec 26, 2025
…tion (PaddlePaddle#5758)

* support process_response_dict async initial commit

* fixbug

* add unit test

* optimize
LiqinruiG pushed a commit that referenced this pull request Dec 29, 2025
…tion (#5758) (#5802)

* support process_response_dict async initial commit

* fixbug

* add unit test

* optimize

Labels

contributor External developers

3 participants