feat: add GLM-4.7 model support to Z.ai provider (#4627)
kilo-code-bot[bot] wants to merge 1 commit into `main`
Conversation
- Added GLM-4.7 documentation link to the file header comments
- Added `glm-4.7` entry to `internationalZAiModels` with:
  - maxTokens: 98,304
  - contextWindow: 200,000 (matching GLM-4.6)
  - supportsImages: false
  - supportsPromptCache: true
  - supportsNativeTools: true
  - defaultToolProtocol: native
  - Pricing aligned with GLM-4.6
- Added `glm-4.7` entry to `mainlandZAiModels` with similar configuration but using mainland China pricing

Cherry-picked from PR #10271
✅ No Issues Found
1 file reviewed | Confidence: 95% | Recommendation: Merge

Review Details
- Files: `packages/types/src/providers/zai.ts`
- Checked: Security, bugs, type safety, code patterns
- Summary: This PR adds GLM-4.7 model configuration to the Z.ai provider. No security, bug, or type safety concerns identified.
Related GitHub Issue
Cherry-picked from: RooCodeInc/Roo-Code#10271
Description
This PR cherry-picks the changes from PR #10271 to add basic support for the GLM-4.7 model to the Z.ai provider.
Changes made:
- Added `glm-4.7` entry to `internationalZAiModels` with the following configuration:
  - maxTokens: 98,304
  - contextWindow: 200,000 (matching GLM-4.6)
  - supportsImages: false
  - supportsPromptCache: true
  - supportsNativeTools: true
  - defaultToolProtocol: "native"
- Added `glm-4.7` entry to `mainlandZAiModels` with similar configuration but using mainland China pricing

This follows the existing pattern used for the GLM-4.5 and GLM-4.6 models already in the codebase.
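As a rough illustration of what such an entry might look like, here is a hedged TypeScript sketch. The interface name `ModelInfoSketch` and the exact field types are assumptions for illustration only; the real `ModelInfo` type and model maps in `packages/types/src/providers/zai.ts` may differ, and the pricing fields (stated only as "aligned with GLM-4.6") are omitted rather than guessed.

```typescript
// Hypothetical shape of a Z.ai model entry; field names mirror those
// listed in the PR description, not the repo's actual ModelInfo type.
interface ModelInfoSketch {
  maxTokens: number;
  contextWindow: number;
  supportsImages: boolean;
  supportsPromptCache: boolean;
  supportsNativeTools: boolean;
  defaultToolProtocol: string;
  // Pricing fields omitted: the PR only says "aligned with GLM-4.6".
}

// Sketch of the international model map with the new glm-4.7 entry.
const internationalZAiModelsSketch: Record<string, ModelInfoSketch> = {
  "glm-4.7": {
    maxTokens: 98_304,
    contextWindow: 200_000, // matches GLM-4.6
    supportsImages: false,
    supportsPromptCache: true,
    supportsNativeTools: true,
    defaultToolProtocol: "native",
  },
};

// The mainland map would carry the same fields with mainland China pricing.
const glm47 = internationalZAiModelsSketch["glm-4.7"];
console.log(glm47.contextWindow); // 200000
```

Because this is a pure data/configuration change, a consumer would simply look the model up by its `glm-4.7` key, as in the last two lines above.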
Test Procedure
Screenshots / Videos
N/A - This is a data/configuration change only.
Documentation Updates
Additional Notes
This PR adds basic support for GLM-4.7 as requested in the original issue. The issue mentions that GLM-4.7 may have "thinking capabilities" - if extended thinking/reasoning support is needed in the future, that can be added in a follow-up PR.