release/1.2.3-alpha #371
Merged
…nd special turn to summarize the conversation context when it gets too big

- Updated version in Solution.props and README badge to 1.2.3-dev.260105
- Added comprehensive CHANGELOG entries for context management features:
  - Context limit tracking and automatic summarization at 80% threshold
  - SummarizeSpecialTurn factory and context exceeded error handling
  - GhJSON canvas tools for start/end nodes with optional runtime data
  - Standardized node classification terminology
…automatic system prompt merging

- Added new AIAgent.Summary enum value and display/serialization support
- Updated SummarizeSpecialTurn to create Summary interactions instead of Assistant
- Implemented MergeSystemAndSummary in the AIProvider base class to automatically merge summaries with the system prompt using the format: "System prompt\n---\nThis is a summary of the previous messages in the conversation:\n\nSummary"
- Updated all providers (
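The merge format quoted in the commit above can be sketched as follows. The repository itself is C#, so this is only an illustrative Python translation of the described MergeSystemAndSummary behavior; the function name and empty-summary handling are assumptions.

```python
def merge_system_and_summary(system_prompt: str, summary: str) -> str:
    """Combine the system prompt with a conversation summary so the
    provider receives a single system message. The separator format
    follows the one quoted in the commit message; skipping the merge
    when there is no summary is an assumption."""
    if not summary:
        return system_prompt
    return (
        f"{system_prompt}\n---\n"
        "This is a summary of the previous messages in the conversation:\n\n"
        f"{summary}"
    )
```
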
…ile data from GhJSON by default

- Added new `SerializationContext.Optimized` that excludes `PersistentData` and `VolatileData` to prevent encrypted/binary strings in GhJSON output, ensuring LLM-safe and token-efficient results
- Fixed `gh_get` tool incorrectly using the `Optimized` context for node classification filters instead of `Standard`, which was breaking `gh_put` restoration of internalized parameter values
- Updated `gh_get`
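The idea of an `Optimized` serialization context that drops heavyweight fields can be sketched like this. Again, the actual implementation is C#; the field names `PersistentData` and `VolatileData` come from the commit message, while the dict-based node representation is a simplification for illustration.

```python
from enum import Enum


class SerializationContext(Enum):
    STANDARD = "standard"
    OPTIMIZED = "optimized"


# Fields excluded in Optimized mode, per the commit message.
EXCLUDED_IN_OPTIMIZED = {"PersistentData", "VolatileData"}


def serialize_node(node: dict, context: SerializationContext) -> dict:
    """Return the fields to emit for one node. Optimized mode strips
    encrypted/binary payloads so the output stays LLM-safe; Standard
    mode keeps everything so gh_put can restore internalized values."""
    if context is SerializationContext.OPTIMIZED:
        return {k: v for k, v in node.items() if k not in EXCLUDED_IN_OPTIMIZED}
    return dict(node)
```
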
…culate the ContextUsagePercent
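The context-usage calculation and the 80% trigger described in the PR can be sketched as below. This is an illustrative Python version of logic the PR implements in C# (ConversationSession/AIMetrics); the function names and the zero-limit guard are assumptions, while the 80% threshold and the use of LastEffectiveTotalTokens against ContextLimit come from the PR description.

```python
# Threshold from the PR description: summarize at 80% of the model's limit.
SUMMARIZE_THRESHOLD_PERCENT = 80.0


def context_usage_percent(last_effective_total_tokens: int, context_limit: int) -> float:
    """Percentage of the model's context window currently in use."""
    if context_limit <= 0:
        # Guard for models without a known limit (assumption).
        return 0.0
    return 100.0 * last_effective_total_tokens / context_limit


def should_summarize(last_effective_total_tokens: int, context_limit: int) -> bool:
    """True once usage reaches the pre-emptive summarization threshold."""
    return context_usage_percent(last_effective_total_tokens, context_limit) >= SUMMARIZE_THRESHOLD_PERCENT
```
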
…roviders

- Added ContextLimit field to all Anthropic models (200,000 tokens)
- Added ContextLimit field to DeepSeek models (64,000 and 60,000 tokens)
- Added ContextLimit field to OpenAI models (ranging from 2,000 to 1,047,576 tokens)
- Added ContextLimit field to OpenRouter models (ranging from 60,000 to 1,048,576 tokens)
- Removed extra whitespace in the OpenAI provider models file
…history

- Added Update button to debug actions in ChatResourceManager
- Added getAllMessageKeys() JavaScript function to retrieve all message keys from the DOM
- Implemented UpdateChatView() method to sync DOM messages with conversation history by comparing cached HTML hashes and updating only changed messages
- Added update event handler in WebChatDialog with platform-specific JSON parsing for WebView.ExecuteScript return values
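The hash-compare strategy behind UpdateChatView can be sketched as follows: hash each message's rendered HTML and touch only the keys whose hash differs from the cache. The real code is C#/JavaScript; this Python sketch is illustrative, and the SHA-256 choice and function names are assumptions.

```python
import hashlib


def html_hash(html: str) -> str:
    """Stable fingerprint of one message's rendered HTML."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()


def diff_messages(cached_hashes: dict, current_html: dict) -> list:
    """Return the message keys whose HTML is new or changed, updating the
    cache in place. Mirrors the update-only-changed-messages strategy:
    unchanged messages are left alone in the DOM."""
    changed = []
    for key, html in current_html.items():
        digest = html_hash(html)
        if cached_hashes.get(key) != digest:
            changed.append(key)
            cached_hashes[key] = digest
    return changed
```
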
## Description

This PR introduces comprehensive context limit management for AI conversations and adds debug improvements to the WebChat interface.

### Context Management Features

- Added ContextLimit property to all AI model capabilities across providers (Anthropic, DeepSeek, OpenAI, OpenRouter)
- Implemented automatic context tracking in ConversationSession with percentage calculation
- Added pre-emptive summarization when context usage exceeds 80% of the model limit
- Added context exceeded error detection and automatic summarization with retry
- Enhanced AIMetrics with a LastEffectiveTotalTokens field for accurate context usage calculation
- Added SummarizeSpecialTurn factory for creating conversation summarization special turns

### GhJSON Serialization Improvements

- Added Optimized serialization context that excludes PersistentData and VolatileData to prevent encrypted/binary strings in GhJSON output
- Fixed gh_get tool incorrectly using the Optimized context for node classification filters, which broke gh_put restoration
- Ensures LLM-safe and token-efficient results by default

### WebChat Debug Enhancements

- Added debug Update button to refresh the chat view from conversation history
- Implemented DOM synchronization by comparing cached HTML hashes and updating only changed messages
- Added JavaScript utility functions for message key retrieval and view updates

## Breaking Changes

None.
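The "context exceeded error detection and automatic summarization with retry" bullet describes a recover-and-retry loop. A minimal sketch, assuming a single retry and treating the provider call and summarizer as injectable callables (the real C# error type and call shape are not shown in the PR):

```python
class ContextExceededError(Exception):
    """Raised when the provider rejects a request for exceeding the
    model's context window (hypothetical name)."""


def send_with_summarize_retry(send, summarize, messages):
    """Try the request once; if the provider reports the context was
    exceeded, summarize the conversation and retry a single time.
    A second failure propagates to the caller."""
    try:
        return send(messages)
    except ContextExceededError:
        return send(summarize(messages))
```

The single-retry design keeps failure behavior predictable: if even the summarized conversation is too large, the error surfaces rather than looping.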
## Testing Done

- Tested automatic summarization when context exceeds the 80% threshold
- Confirmed GhJSON serialization excludes persistent/volatile data in Optimized mode while preserving restoration capability

## Checklist

- [x] This PR is focused on a single feature or bug fix
- [x] Version in Solution.props was updated, if necessary, and follows semantic versioning
- [x] CHANGELOG.md has been updated
- [x] PR title follows [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) format
- [x] PR description follows [Pull Request Description Template](#pull-request-description-template)
This PR updates the version badge in the README.md to match the current version in Solution.props. This is an automated PR created by the Update Version Badge workflow.
… fixes (#370)

This PR prepares the release for version 1.2.3-alpha with version update and code style fixes:

- Updated version in Solution.props
- Updated changelog with closed/solved issues
- Updated README badges
- Swapped License and Ask DeepWiki badge positions in README.md
- Changed DeconstructMetricsComponent GUID from 250D14BA-D96A-4DC0-8703-87468CE2A18D to B8FE17D7-F146-4C94-9673-D2FA04BF7B9F
- Modified PR title validation to accept either Conventional Commits format or release branch name format (release/x.y.z)
- Updated error message to document both accepted formats
- Added examples of the release branch name format in the validation error output
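The either/or validation rule described above can be sketched with two patterns. The exact type list and regex used by the workflow are not shown in the PR, so the patterns below are plausible assumptions, not the repository's actual validation.

```python
import re

# Hypothetical approximation of a Conventional Commits title check.
CONVENTIONAL = re.compile(
    r"^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)"
    r"(\([\w\-]+\))?!?: .+"
)

# Release branch name format from the PR: release/x.y.z with an
# optional pre-release suffix such as -alpha (suffix is an assumption).
RELEASE_BRANCH = re.compile(r"^release/\d+\.\d+\.\d+(-[\w.]+)?$")


def title_is_valid(title: str) -> bool:
    """Accept a title matching either of the two formats."""
    return bool(CONVENTIONAL.match(title) or RELEASE_BRANCH.match(title))
```
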
marc-romu approved these changes on Jan 11, 2026.