A Windows application that detects system crashes and uses AI to provide explanations and solutions for crash events. Supports multiple AI backends including local Phi-4 (ONNX) and Ollama for access to a wide range of local models.
- Automatically detects Windows crash events
- Extracts fault bucket information from Windows Error Reporting
- Multiple AI service support: Phi-4 (local ONNX) and Ollama
- Streaming AI responses with real-time token display
- Modern, sleek UI with responsive design
| Detection | AI recommendation |
|---|---|
| ![]() | ![]() |
- Built with WPF (.NET 9.0)
- Uses ONNX Runtime for local Phi-4 model inference
- Supports Ollama REST API for access to additional local models
- The `IModelService` interface enables extensible AI service integration
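As a rough illustration of that plug-in pattern (sketched in Python rather than the app's C#, with invented method and class names), a service contract in the spirit of `IModelService` might look like:

```python
from abc import ABC, abstractmethod
from typing import Iterator

class ModelService(ABC):
    """Sketch of an extensible model-service contract, loosely mirroring
    the app's IModelService interface (names here are assumptions)."""

    @abstractmethod
    def stream_response(self, prompt: str) -> Iterator[str]:
        """Yield response tokens as they are generated."""

class EchoService(ModelService):
    """Trivial stand-in backend, used only to show the plug-in shape;
    real implementations would wrap ONNX Runtime or the Ollama API."""

    def stream_response(self, prompt: str) -> Iterator[str]:
        for word in prompt.split():
            yield word + " "

# Any backend satisfying the contract can be swapped in by the UI layer.
tokens = "".join(EchoService().stream_response("fault bucket 12345"))
print(tokens.strip())  # → fault bucket 12345
```

Each backend (local ONNX inference, Ollama over HTTP) implements the same streaming method, so the UI can display tokens without knowing which service produced them.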
- Windows operating system
- .NET 9.0 runtime
- One of the following AI backends:
  - Phi-4: model files in ONNX format (configured in `appSettings.yml`)
  - Ollama: Ollama installed and running
- Clone the repository
- Update `appSettings.yml` with your preferred AI service configuration
- Build and run the application
Uses the Phi-4 model locally via ONNX Runtime. Best for offline use with no additional services.
- Download the Phi-4 ONNX model (Int4 quantized, CPU-optimized)
- Set `modelPath` in `appSettings.yml` to the model directory
- Select Phi-4 (Local) in the app
Supports any model available through Ollama, such as Llama 3, Mistral, DeepSeek, Phi-4, Gemma, and more.
- Install Ollama from https://ollama.com/
- Pull a model: `ollama pull llama3.2`
- Ensure Ollama is running: `ollama serve`
- Update `appSettings.yml`:

  ```yaml
  aiService: ollama
  ollamaEndpoint: http://localhost:11434
  ollamaModel: llama3.2
  ```
- Select Ollama in the app and enter your model name
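Ollama's `/api/generate` endpoint streams its reply as newline-delimited JSON, one object per token chunk. A minimal sketch of how a real-time token display might consume that stream (the helper function and sample lines below are illustrative, not the app's actual code):

```python
import json

def join_stream_tokens(lines):
    """Concatenate the 'response' tokens from Ollama's NDJSON stream.
    Each line from POST /api/generate with "stream": true is one JSON
    object; the final chunk carries "done": true."""
    tokens = []
    for line in lines:
        chunk = json.loads(line)
        tokens.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(tokens)

# Sample lines in the shape Ollama emits (the text values are invented):
sample = [
    '{"response": "The fault bucket points", "done": false}',
    '{"response": " to a faulting module.", "done": true}',
]
answer = join_stream_tokens(sample)
print(answer)
```

In the app itself, each `response` fragment would be appended to the UI as it arrives rather than buffered into one string.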
In `appSettings.yml`, configure your AI service:

```yaml
# AI Service selection: phi4 or ollama
aiService: phi4

# Path to the Phi-4 model (used when aiService is phi4)
modelPath: C:\Path\To\Your\Phi4Model

# Ollama configuration (used when aiService is ollama)
ollamaEndpoint: http://localhost:11434
ollamaModel: llama3.2
```

YAML configuration allows you to use regular Windows paths with backslashes without needing to escape them.
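For illustration, a flat settings file like this can be read in a few lines. The hand-rolled parser below is only a sketch (the app presumably uses a proper YAML library), but it shows why unescaped Windows backslashes and `http://` URLs survive intact:

```python
def load_settings(text):
    """Parse flat 'key: value' lines, skipping blanks and # comments.
    Splitting on the FIRST colon keeps URLs like http://... intact,
    and backslashes in plain scalars need no escaping."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        settings[key.strip()] = value.strip()
    return settings

config = """\
aiService: phi4
modelPath: C:\\Path\\To\\Your\\Phi4Model
ollamaEndpoint: http://localhost:11434
ollamaModel: llama3.2
"""
settings = load_settings(config)
print(settings["modelPath"])  # → C:\Path\To\Your\Phi4Model
```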

