Conversation

@github-actions
Contributor

This PR prepares the release for version 1.2.3-alpha with a version update and code style fixes:

- Updated version in Solution.props
- Updated changelog with closed and solved issues
- Updated README badges

marc-romu and others added 14 commits January 3, 2026 17:32
…nd special turn to summarize the conversation context when it gets too big

- Updated version in Solution.props and README badge to 1.2.3-dev.260105
- Added comprehensive CHANGELOG entries for context management features:
  - Context limit tracking and automatic summarization at 80% threshold
  - SummarizeSpecialTurn factory and context exceeded error handling
  - GhJSON canvas tools for start/end nodes with optional runtime data
- Standardized node classification terminology
…automatic system prompt merging

- Added new AIAgent.Summary enum value and display/serialization support
- Updated SummarizeSpecialTurn to create Summary interactions instead of Assistant
- Implemented MergeSystemAndSummary in AIProvider base to automatically merge summaries with system prompt using format: "System prompt\n---\nThis is a summary of the previous messages in the conversation:\n\nSummary"
- Updated all providers (
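The merge format quoted above can be sketched as a small helper. This is an illustrative sketch only: the name `mergeSystemAndSummary` follows the PR text, but the signature and argument handling are assumptions, not the real provider API.

```typescript
// Hypothetical sketch of the summary-merging format described in the PR.
// The separator and summary preamble are taken verbatim from the PR text;
// everything else is an assumption.
function mergeSystemAndSummary(systemPrompt: string, summary: string): string {
  return (
    systemPrompt +
    "\n---\n" +
    "This is a summary of the previous messages in the conversation:\n\n" +
    summary
  );
}
```

With this shape, providers can keep a single system message while still carrying the rolled-up conversation context forward.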
…ile data from GhJSON by default

- Added new `SerializationContext.Optimized` that excludes `PersistentData` and `VolatileData` to prevent encrypted/binary strings in GhJSON output, ensuring LLM-safe and token-efficient results
- Fixed `gh_get` tool incorrectly using `Optimized` context for node classification filters instead of `Standard`, which was breaking `gh_put` restoration of internalized parameter values
- Updated `gh_get`
…roviders

- Added ContextLimit field to all Anthropic models (200,000 tokens)
- Added ContextLimit field to DeepSeek models (64,000 and 60,000 tokens)
- Added ContextLimit field to OpenAI models (ranging from 2,000 to 1,047,576 tokens)
- Added ContextLimit field to OpenRouter models (ranging from 60,000 to 1,048,576 tokens)
- Removed extra whitespace in OpenAI provider models file
…history

- Added Update button to debug actions in ChatResourceManager
- Added getAllMessageKeys() JavaScript function to retrieve all message keys from DOM
- Implemented UpdateChatView() method to sync DOM messages with the conversation history by comparing cached HTML hashes and updating only the changed messages
- Added update event handler in WebChatDialog with platform-specific JSON parsing for WebView.ExecuteScript return values
## Description

This PR introduces comprehensive context limit management for AI
conversations and adds debug improvements to the WebChat interface.

### Context Management Features

- Added ContextLimit property to all AI model capabilities across
providers (Anthropic, DeepSeek, OpenAI, OpenRouter)
- Implemented automatic context tracking in ConversationSession with
percentage calculation
- Added pre-emptive summarization when context usage exceeds 80% of
model limit
- Added context exceeded error detection and automatic summarization
with retry
- Enhanced AIMetrics with LastEffectiveTotalTokens field for accurate
context usage calculation
- Added SummarizeSpecialTurn factory for creating conversation
summarization special turns
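The percentage-based trigger described above can be sketched as follows. This is a minimal sketch under stated assumptions: the names `contextLimit`, `lastEffectiveTotalTokens`, and the `0.8` constant mirror the PR description, but the actual ConversationSession API is .NET code and is not shown here.

```typescript
// Hypothetical sketch of the 80% pre-emptive summarization check.
// Field names follow the PR description; the shapes are assumptions.
const SUMMARIZE_THRESHOLD = 0.8;

interface ModelCapabilities {
  contextLimit: number; // maximum tokens the model accepts
}

interface Metrics {
  lastEffectiveTotalTokens: number; // tokens consumed as of the last turn
}

// Fraction of the model's context window currently in use.
function contextUsage(metrics: Metrics, model: ModelCapabilities): number {
  return metrics.lastEffectiveTotalTokens / model.contextLimit;
}

// Summarize pre-emptively once usage exceeds 80% of the limit.
function shouldSummarize(metrics: Metrics, model: ModelCapabilities): boolean {
  return contextUsage(metrics, model) > SUMMARIZE_THRESHOLD;
}
```

For example, with a 200,000-token Anthropic model, a session at 170,000 effective tokens (85%) would trigger summarization, while one at 100,000 (50%) would not.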

### GhJSON Serialization Improvements

- Added Optimized serialization context that excludes PersistentData and
VolatileData to prevent encrypted/binary strings in GhJSON output
- Fixed gh_get tool incorrectly using Optimized context for node
classification filters, breaking gh_put restoration
- Ensures LLM-safe and token-efficient results by default
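The Optimized context can be pictured as a filter over node fields at serialization time. A hedged sketch: the field names `PersistentData`/`VolatileData` and the context names come from the PR text, but the GhJSON node shape shown here is an assumption for illustration.

```typescript
// Hypothetical sketch of a serialization context that strips fields
// which bloat token counts and may contain encrypted/binary strings.
type SerializationContext = "Standard" | "Optimized";

interface GhJsonNode {
  name: string;
  persistentData?: string; // may hold encrypted/binary payloads
  volatileData?: string;   // runtime data, not needed for LLM output
  [key: string]: unknown;
}

function serializeNode(
  node: GhJsonNode,
  ctx: SerializationContext,
): Record<string, unknown> {
  if (ctx === "Standard") {
    // Standard keeps everything, so gh_put can restore internalized values.
    return { ...node };
  }
  // Optimized: drop the token-heavy, LLM-unsafe fields.
  const { persistentData, volatileData, ...rest } = node;
  return rest;
}
```

This also illustrates the bug fixed here: node classification filters must read through the Standard view, since the Optimized view deliberately discards the data gh_put needs for restoration.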

### WebChat Debug Enhancements

- Added debug Update button to refresh chat view from conversation
history
- Implemented DOM synchronization by comparing cached HTML hashes and
updating only changed messages
- Added JavaScript utility functions for message key retrieval and view
updates
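The hash-comparison idea above can be sketched as follows. This is not the real WebChat implementation (which lives in ChatResourceManager and platform-specific WebView code); the hash function and map shapes are assumptions chosen to show the update-only-what-changed logic.

```typescript
// Hypothetical sketch: re-render only messages whose rendered HTML
// differs from the conversation history. FNV-1a is an arbitrary choice
// of stable hash for illustration.
function hashHtml(html: string): number {
  let h = 2166136261;
  for (let i = 0; i < html.length; i++) {
    h ^= html.charCodeAt(i);
    h = Math.imul(h, 16777619);
  }
  return h >>> 0;
}

function changedKeys(
  domMessages: Map<string, string>, // message key -> HTML currently in the DOM
  history: Map<string, string>,     // message key -> HTML from conversation history
): string[] {
  const keys: string[] = [];
  for (const [key, html] of history) {
    const cached = domMessages.get(key);
    if (cached === undefined || hashHtml(cached) !== hashHtml(html)) {
      keys.push(key); // new or modified message: update only this one
    }
  }
  return keys;
}
```

Comparing hashes rather than full HTML strings keeps the per-update cost low even for long conversations, at the cost of a (negligible) collision risk.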

## Breaking Changes

None.

## Testing Done

- Tested automatic summarization when context exceeds 80% threshold
- Confirmed GhJSON serialization excludes persistent/volatile data in
Optimized mode while preserving restoration capability

## Checklist

- [x] This PR is focused on a single feature or bug fix
- [x] Version in Solution.props was updated, if necessary, and follows
semantic versioning
- [x] CHANGELOG.md has been updated
- [x] PR title follows [Conventional
Commits](https://www.conventionalcommits.org/en/v1.0.0/) format
- [x] PR description follows [Pull Request Description
Template](#pull-request-description-template)
This PR updates the version badge in the README.md to match the current
version in Solution.props.

This is an automated PR created by the Update Version Badge workflow.
… fixes (#370)

This PR prepares the release for version 1.2.3-alpha with a version update
and code style fixes:

- Updated version in Solution.props
- Updated changelog with closed and solved issues
- Updated README badges
@github-actions github-actions bot requested a review from marc-romu as a code owner January 11, 2026 09:55
- Swapped License and Ask DeepWiki badge positions in README.md
- Changed DeconstructMetricsComponent GUID from 250D14BA-D96A-4DC0-8703-87468CE2A18D to B8FE17D7-F146-4C94-9673-D2FA04BF7B9F
- Modified PR title validation to accept either conventional commits format OR release branch name format (release/x.y.z)
- Updated error message to document both accepted formats
- Added examples for release branch name format in validation error output
@marc-romu marc-romu enabled auto-merge January 11, 2026 10:13
@marc-romu marc-romu merged commit 4951632 into main Jan 11, 2026
16 checks passed