I have verified this feature I'm about to request hasn't been suggested before.

LiteLLM users currently have to wire up an OpenAI-compatible provider, baseURL, auth, and model list by hand. There are several related reports around OpenAI-compatible provider setup, including #5674.

I published a small community plugin that adds LiteLLM to /connect and discovers models from LiteLLM's /v1/models endpoint:

- npm: https://www.npmjs.com/package/@finger_xie/opencode-plugin-litellm
- repo: https://github.com/yuyu1025/opencode-plugin-litellm

The requested docs change is just to list it in the existing ecosystem Plugins table so users can find it. I have a one-line docs PR ready from yuyu1025:add-litellm-plugin-docs.
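For context on the discovery step: LiteLLM exposes the standard OpenAI-compatible model list at /v1/models, so the plugin only needs to parse the conventional `{ object: "list", data: [{ id, … }] }` payload. A minimal sketch of that parsing, assuming the usual OpenAI response shape (the `extractModelIds` helper name and the sample model IDs here are illustrative, not the plugin's actual code):

```typescript
// Sketch: extract model IDs from an OpenAI-compatible /v1/models payload,
// as a LiteLLM proxy would return it. Field names beyond `id` follow the
// OpenAI list convention and are assumptions here.

interface ModelEntry {
  id: string;
  object?: string;
  owned_by?: string;
}

interface ModelListResponse {
  object: string;
  data: ModelEntry[];
}

// De-duplicate and sort the model IDs for a /connect-style picker.
function extractModelIds(payload: ModelListResponse): string[] {
  return [...new Set(payload.data.map((m) => m.id))].sort();
}

// Illustrative payload (model names are placeholders):
const sample: ModelListResponse = {
  object: "list",
  data: [
    { id: "gpt-4o", object: "model", owned_by: "openai" },
    { id: "claude-3-5-sonnet", object: "model", owned_by: "anthropic" },
    { id: "gpt-4o", object: "model", owned_by: "openai" }, // duplicate entry
  ],
};

console.log(extractModelIds(sample)); // ["claude-3-5-sonnet", "gpt-4o"]
```

In the real plugin the payload would come from a `fetch` against the configured LiteLLM baseURL with the user's API key; this sketch only shows the response handling.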