# Add Fabric Data Agent plugin — create, test, tune agents via MCP (#1369)
**Status:** Open. harigouthami wants to merge 1 commit into `github:staged` from `harigouthami:feature/fabric-data-agent`.
**`agents/fabric-data-agent-manager.agent.md`** (new file, 48 lines added)

```markdown
---
name: "Fabric Data Agent Manager"
description: "Full lifecycle management of Microsoft Fabric Data Agents — create, configure, test, tune, and publish agents using natural language through MCP tools"
model: "gpt-4o"
tools: ["mcp"]
---

You are a specialist in managing Microsoft Fabric Data Agents. You help users through the full agent lifecycle — from creation to production — using MCP tools that connect to Fabric APIs.

## Your Expertise

- Creating and configuring Fabric Data Agents
- Connecting lakehouses and selecting tables from schemas
- Writing domain-specific AI instructions from semantic models (TMDL files)
- Generating and validating few-shot Q→SQL examples
- Running CSV-based accuracy tests
- Diagnosing and fixing failing queries (case sensitivity, missing filters, wrong tables)
- Publishing agents and testing with sample questions

## Your Approach

- Always ask for the workspace and agent name before starting
- Confirm with the user before destructive operations (delete, replace instructions)
- After publishing, suggest testing with a sample question
- Show SQL queries alongside answers for transparency
- Validate all SQL against the database before adding it as a few-shot
- Use LOWER() for case-insensitive string matching in SQL

## Workflow

1. **Create** the agent with a name and workspace
2. **Connect** the lakehouse datasource
3. **Select tables** — verify with get_agent_config (must show a non-zero table count)
4. **Write instructions** — from semantic models, TMDL files, or domain knowledge
5. **Add few-shots** — generate Q→SQL pairs, validate each against the SQL endpoint
6. **Publish** the agent
7. **Test** with sample questions
8. **Tune** — diagnose failures, add corrective few-shots, re-publish, re-test

## Guidelines

- Never invent column names — always query INFORMATION_SCHEMA.COLUMNS first
- Always validate SQL by running it before adding it as a few-shot
- Use `select_tables` (safe GET→modify→PUT) instead of `configure_agent_tables` (risky delete+recreate)
- After table selection, verify with `get_agent_config` — it must show Selected tables > 0
- For string filters, use `LOWER()` to handle case-sensitive SQL endpoints
- Default to the last month when the user doesn't specify a date range
- Always qualify tables with the schema name (e.g., `TCA.table_name`)
```
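The SQL hygiene rules in the agent's Guidelines can be sketched as a few small helpers. This is an illustrative sketch only: `qualify`, `ci_filter`, and `build_filter_query` are hypothetical names not found in the plugin, and the quoting is deliberately naive.

```python
def qualify(schema: str, table: str) -> str:
    """Always qualify tables with the schema name (e.g., TCA.table_name)."""
    return f"{schema}.{table}"

def ci_filter(column: str, value: str) -> str:
    """Case-insensitive string filter via LOWER(), for case-sensitive SQL endpoints."""
    safe = value.replace("'", "''")  # naive quote escaping, for the sketch only
    return f"LOWER({column}) = LOWER('{safe}')"

def build_filter_query(schema: str, table: str, column: str, value: str) -> str:
    """Compose a schema-qualified, case-insensitive SELECT."""
    return (
        f"SELECT * FROM {qualify(schema, table)} "
        f"WHERE {ci_filter(column, value)}"
    )

print(build_filter_query("TCA", "tca_adowia_items", "status", "Closed"))
# SELECT * FROM TCA.tca_adowia_items WHERE LOWER(status) = LOWER('Closed')
```

In a real few-shot pipeline the generated query would then be executed against the SQL endpoint before being stored, per the "validate SQL by running it" rule.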
**Plugin manifest** (new file, 28 lines added)

```json
{
  "name": "fabric-data-agent",
  "description": "Create, test, and tune Microsoft Fabric Data Agents from VS Code using natural language. Includes MCP tools for full lifecycle management — lakehouse connection, table selection, few-shot generation with SQL validation, CSV accuracy testing, and query tuning.",
  "version": "1.0.0",
  "keywords": [
    "fabric",
    "data-agent",
    "mcp",
    "microsoft",
    "sql",
    "accuracy-testing",
    "few-shot",
    "lakehouse"
  ],
  "author": {
    "name": "Hari Gouthami Narravula"
  },
  "repository": "https://github.com/github/awesome-copilot",
  "license": "MIT",
  "agents": [
    "./agents/fabric-data-agent-manager.agent.md"
  ],
  "skills": [
    "./skills/fabric-data-agent-create/",
    "./skills/fabric-data-agent-test/",
    "./skills/fabric-data-agent-tune/"
  ]
}
```
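A quick sanity check of a manifest shaped like the one above can be sketched in Python. The required-key set here is an assumption inferred from this one file, not a published plugin schema, and `check_manifest` is a hypothetical helper.

```python
import json

# Assumed required keys, inferred from this manifest (not an official schema)
REQUIRED_KEYS = {"name", "description", "version", "license", "agents", "skills"}

def check_manifest(text: str) -> list[str]:
    """Return a list of problems found in a plugin manifest (empty list = OK)."""
    m = json.loads(text)
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - m.keys())]
    # Every agent/skill entry should be a relative path inside the plugin
    for entry in m.get("agents", []) + m.get("skills", []):
        if not entry.startswith("./"):
            problems.append(f"not a relative path: {entry}")
    return problems

manifest = (
    '{"name": "fabric-data-agent", "description": "d", "version": "1.0.0",'
    ' "license": "MIT", "agents": ["./agents/a.agent.md"], "skills": ["./skills/s/"]}'
)
print(check_manifest(manifest))  # []
```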
**`README.md`** (new file, 64 lines added)

````markdown
# Fabric Data Agent Plugin

Create, test, and tune Microsoft Fabric Data Agents from VS Code using natural language.

## What It Does

This plugin provides an agent and three skills for managing Fabric Data Agents through GitHub Copilot:

- **Fabric Data Agent Manager** (agent) — Full lifecycle: create → configure → publish → query → tune
- **#fabric-data-agent-create** (skill) — Guided end-to-end agent setup with SQL validation
- **#fabric-data-agent-test** (skill) — CSV-based accuracy testing with tolerance matching
- **#fabric-data-agent-tune** (skill) — Diagnose and fix failing queries

## Prerequisites

- Azure CLI (`az login`) for Fabric API authentication
- Fabric workspace access (Contributor role)

## Setup — Connect the MCP Server

This plugin requires the **Fabric Data Agent MCP server** to provide the tools Copilot uses. Set it up in three steps:

### 1. Clone the repo

```bash
git clone https://github.com/harigouthami/fabric-copilot-plugins.git
cd fabric-copilot-plugins/fabric-data-agent-mcp
```

### 2. Run setup

```powershell
.\setup.ps1
```

This installs `uv` (if needed), verifies your Azure CLI login, and configures `.vscode/mcp.json` automatically.

### 3. Reload VS Code

`Ctrl+Shift+P` → **"Reload Window"** — the MCP server tools will appear in Copilot Chat.

## Example Usage

```
You: Create a data agent called ADOWIA in the A3PInsights workspace
Copilot: ✅ Created. Which lakehouse should I connect?

You: External
Copilot: ✅ Connected. Found 4 schemas, 64 tables. Which tables?

You: The tca_adowia* tables from the TCA schema
Copilot: ✅ 7 tables selected and verified.

You: [pastes Git repo URL with semantic model]
Copilot: [generates instructions from TMDL files, validates SQL, adds few-shots]
✅ Published. Testing: "total time saved" → 7,496.5 hours
```

## Key Features

- **SQL validation**: Every few-shot query is tested against the database before it is added
- **Knowledge from Git**: Fetches TMDL files from ADO repos to auto-generate instructions
- **Accuracy testing**: CSV-based test runner with configurable tolerance
- **Tune loop**: Reproduce → Diagnose → Fix → Publish → Re-test in one conversation
````
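The "configurable tolerance" used by the CSV accuracy runner can be sketched as a single comparison function. The PR does not show the runner's internals, so the function name and the relative/absolute tolerance scheme below are illustrative assumptions.

```python
def within_tolerance(expected: float, actual: float,
                     rel_tol: float = 0.01, abs_tol: float = 1e-9) -> bool:
    """Numeric match with a configurable tolerance, as a CSV accuracy runner might use.

    A row passes when the agent's answer is within rel_tol (e.g. 1%) of the
    expected value, or within abs_tol for expected values near zero.
    """
    return abs(expected - actual) <= max(rel_tol * abs(expected), abs_tol)

# Example: expected "total time saved" of 7,496.5 hours from the README walkthrough
print(within_tolerance(7496.5, 7500.0))  # True  (off by 3.5, within 1%)
print(within_tolerance(7496.5, 8000.0))  # False (off by ~6.7%)
```

A runner built on this would read expected answers from the CSV, ask the published agent each question, extract the numeric answer, and report pass/fail per row.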
**Review comment:** We don't accept contributions that require cloning external repos, as doing so introduces supply-chain security risks, even though this README is not intended to be ingested or used by an agent.