
WIP : ADD: Batch request (Non-streaming) for OpenAI Plugin #66

Open
thameem-abbas wants to merge 7 commits into openshift-psap:main from thameem-abbas:dev-batch-non-streaming

Conversation

@thameem-abbas (Member)

Adds non-streaming batch request support in OpenAI plugin.
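
For context, here is a minimal sketch of what a non-streaming batch request against an OpenAI-compatible completions endpoint can look like. The endpoint URL, model name, payload fields, and use of the requests library are illustrative assumptions, not the actual code in this PR.

```python
# Hypothetical illustration only, not the code from this PR.
# The OpenAI-style /v1/completions endpoint accepts a list of prompts,
# so a whole batch can go out in a single non-streaming request.
import requests

def request_batch_completions(prompts, host="http://localhost:8000", model="example-model"):
    response = requests.post(
        f"{host}/v1/completions",
        json={
            "model": model,
            "prompt": prompts,   # list of prompts -> one choice per prompt
            "max_tokens": 128,
            "stream": False,     # non-streaming: a single JSON body comes back
        },
        timeout=120,
    )
    response.raise_for_status()
    body = response.json()
    # Each choice carries an "index" that maps it back to its prompt.
    return [c["text"] for c in sorted(body["choices"], key=lambda c: c["index"])]
```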

@thameem-abbas changed the title from "ADD: Batch request for OpenAI Plugin" to "ADD: Batch request (Non-streaming) for OpenAI Plugin" on Nov 4, 2024
@thameem-abbas (Member, Author)

@sjmonson @dagrayvid @npalaska
It would be great if someone could review the PR.

@npalaska (Collaborator) commented Nov 4, 2024

Thanks @thameem-abbas 👍 Initial tests with changes from this PR work well. I'll add more in the afternoon.

Comment thread plugins/openai_plugin.py
Member
I still don't like this approach. It duplicates too much code; that is fine for a one-off, but to merge, it would be preferable to have all request methods take a list of queries and return a list of Results (see later comment).
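
For illustration, the interface this comment seems to be asking for might look roughly like the sketch below: every request method takes a list of queries and returns a list of Results, so a single query is just a one-element list. The method and helper names here are assumptions, not code from the PR.

```python
# Illustrative sketch of the suggested list-in/list-out interface;
# _post_completions and _build_result are hypothetical helpers.
from typing import List

class OpenAIPlugin:
    def request_http(self, queries: List[dict], user_id: int,
                     test_end_time: float = 0) -> List["Result"]:
        # One code path for both cases: a non-batched caller passes a
        # one-element list, a batched caller passes many queries.
        prompts = [q["text"] for q in queries]            # "text" key assumed
        response = self._post_completions(prompts)        # hypothetical helper
        return [self._build_result(query, choice)         # hypothetical helper
                for query, choice in zip(queries, response["choices"])]
```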

Comment thread plugins/openai_plugin.py

return result

def request_batch_http(self, queries, user_id, test_end_time: float = 0):
Member

Changing the interface of request_func() based on calling args is bad practice. See above comment.

Comment thread plugins/openai_plugin.py Outdated
Comment thread plugins/openai_plugin.py Outdated
Comment thread plugins/openai_plugin.py Outdated
Comment thread user.py Outdated
Comment thread user.py
Comment on lines -40 to -41
# if timeout passes, queue.Empty will be thrown
# User should continue to poll for inputs
Member

Why was this comment removed?

Comment thread result.py Outdated
Comment thread plugins/openai_plugin.py Outdated
Comment thread utils.py
if plugin_type == "openai_plugin":
    plugin = openai_plugin.OpenAIPlugin(
-       config.get("plugin_options")
+       config.get("plugin_options"), batch_size
Member

Again, this should be handled in a way that the plugin does not need to know the batch size in advance. However, if you must... just set config["plugin_options"] = batch_size to avoid changing the interface.
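
One way to read that suggestion (assuming plugin_options is a dict and batch_size is an illustrative key name) is to fold the batch size into the existing options before constructing the plugin, so the constructor signature stays unchanged:

```python
# Hypothetical sketch of passing batch_size through plugin_options
# instead of adding a new constructor argument; the key name is assumed.
plugin_options = config.get("plugin_options") or {}
plugin_options["batch_size"] = batch_size

if plugin_type == "openai_plugin":
    plugin = openai_plugin.OpenAIPlugin(plugin_options)
```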

thameem-abbas and others added 5 commits November 4, 2024 12:38

- Agreeing on change of condition.
  Co-authored-by: Samuel Monson <smonson@irbash.net>
- Don't know what was running through my head when I wrote that. Thanks
  Co-authored-by: Samuel Monson <smonson@irbash.net>
- Fix redundant annotation
  Co-authored-by: Samuel Monson <smonson@irbash.net>
@thameem-abbas changed the title from "ADD: Batch request (Non-streaming) for OpenAI Plugin" to "WIP : ADD: Batch request (Non-streaming) for OpenAI Plugin" on Nov 11, 2024
@rgreenberg1

Has this work been picked back up/re-reviewed?

@thameem-abbas (Member, Author)

@rgreenberg1 This hasn't been picked up for a while now, and it's out of sync with main by quite a bit. I can prioritize it if there is a need for this to land in llm-load-test.
