Describe the feature or problem you'd like to solve
GitHub Copilot CLI should be a CLI/batch interface to the VS Code Copilot Chat setup if that has already been set up for an existing VS Code project
Proposed solution
At face value, GitHub Copilot CLI looks like just another proprietary AI coding agent CLI, like Claude Code, calling proprietary external models. If that is the case, it is a disappointment and a missed opportunity.
Note the features that VS Code Copilot Chat currently has, or will soon have, that require non-trivial setup for a VS Code project:
For example, the latter issue of context gathering for C++ projects requires a local CMake configuration of the project and the clangd implementation of the LSP to provide this indexing. A dumb CLI agent will have none of this and will have to fall back on slow, token-hogging find/grep.
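As a sketch of the gap described above (hypothetical project paths and symbol names, for illustration only): the indexed setup that VS Code Copilot Chat can lean on comes from a CMake-generated compile database that clangd consumes, whereas a bare CLI agent is reduced to raw text search.

```shell
# Setup clangd-backed indexing for a C++ project (what the editor-side
# agent can leverage): have CMake emit a compile database, which clangd
# picks up from the project root to build a precise symbol index.
cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
ln -sf build/compile_commands.json compile_commands.json

# Without that index, a CLI agent must brute-force the source tree;
# every matched line ends up as tokens in the model's context window.
grep -rn "Widget::render" src/
```

The contrast is the point: one clangd `textDocument/definition` query returns the single relevant location, while the grep fallback returns every textual match across the tree and pushes all of it into the prompt.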
Any and all of the features and behaviors provided by a custom setup of VS Code Copilot Chat should be available in a CLI/batch interface as per the VS Code backlog item:
What is the relationship between VS Code Copilot Chat and GitHub Copilot CLI? Are they completely separate products with no relation? If GitHub Copilot CLI is just going to be an independent AI agent that goes off of an AGENTS.md file like Codex CLI or Claude Code, then what is the advantage of using GitHub Copilot CLI over Codex CLI? Codex CLI is open source, and I can use it to call locally hosted models with zero external networking (and we are doing that right now with good success). Without leveraging the significant setup of a VS Code project with the VS Code Copilot Chat setup, why would anyone in an enterprise that has to use local models bother with GitHub Copilot CLI?
Microsoft is making a huge mistake if they are not leveraging the connection to VS Code Copilot Chat with a CLI/batch AI agent interface. That connection is a huge advantage that other AI coding agent CLIs can't easily match.
What is the strategy here?
Example prompts or workflows
No response
Additional context
No response