Demo video: `research-canvas-with-tako.mp4`
An AI-powered research assistant that combines data visualization with web search capabilities. Built with CopilotKit, LangGraph, and the Model Context Protocol (MCP).
This demo showcases how to integrate custom data sources via MCP with a conversational AI interface for research and exploration.
- AI Research Agent: Automatically generates relevant search queries and gathers resources
- Dual Search: Combines structured data (via MCP) with web search (via Tavily)
- Interactive Resources: Preview and explore data visualizations inline
- Report Generation: Compile findings into formatted research reports
- Real-time Collaboration: Built with CopilotKit for seamless AI-human interaction
- OpenAI API Key - Get from platform.openai.com
- Tavily API Key - Get from tavily.com
- Data Source - Configure your MCP server endpoint (optional)
```bash
# Clone the repository
git clone <your-repo>
cd research-canvas

# Install dependencies
npm install

# Copy environment template
cp .env.example .env.local

# Add your API keys to .env.local
# OPENAI_API_KEY=your-openai-key
# TAVILY_API_KEY=your-tavily-key

# Start the development server
npm run dev
```

The application will be available at:
- Frontend: http://localhost:3000
- Agent Backend: http://localhost:2024
```
research-canvas/
├── src/              # Next.js frontend
│   ├── app/          # App router pages
│   ├── components/   # React components
│   └── lib/          # Utilities and types
├── agents/           # LangGraph agent backends
│   ├── typescript/   # TypeScript agent (primary)
│   └── python/       # Python agent (alternative)
├── api/              # API routes and MCP integration
└── .env.example      # Environment variable template
```
Create a `.env.local` file based on `.env.example`:

| Variable | Description | Required |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key for LLM | Yes |
| `TAVILY_API_KEY` | Tavily API key for web search | Yes |
| `TAKO_API_TOKEN` | API token for data source | Optional* |
| `TAKO_MCP_URL` | MCP server endpoint URL | Optional* |
| `TAKO_URL` | Base URL for data source | Optional* |
*Optional fields are only needed if you're connecting to a custom MCP data source.
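For reference, a filled-in `.env.local` might look like the following; every value below is a placeholder, not a real key:

```env
# Placeholder values - replace with your own credentials
OPENAI_API_KEY=sk-your-openai-key
TAVILY_API_KEY=tvly-your-tavily-key

# Only needed when connecting a custom MCP data source
TAKO_API_TOKEN=your-tako-token
TAKO_MCP_URL=https://your-mcp-server.example.com/mcp
TAKO_URL=https://your-data-source.example.com
```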
- Start a Research Session: Enter a research question or topic
- AI Generates Queries: The agent automatically creates data-focused search queries
- Resource Discovery: View discovered resources (data visualizations, web articles)
- Explore Resources: Click on resources to preview and analyze
- Generate Report: Compile your findings into a formatted report
```
┌─────────────────┐
│   Next.js App   │
│  (Frontend UI)  │
└────────┬────────┘
         │
         │ CopilotKit
         ▼
┌─────────────────┐
│    LangGraph    │
│      Agent      │
└────────┬────────┘
         │
         ├── OpenAI (LLM)
         │
         ├── Tavily (Web Search)
         │
         └── MCP Server (Optional Data Source)
```
- CopilotKit: AI copilot framework for React applications
- LangGraph: Framework for building stateful agent workflows
- Model Context Protocol (MCP): Standard for connecting AI systems to data sources
- Next.js: React framework for the frontend
- Tavily: Web search API for research
```bash
# Reinstall dependencies
npm install

# Try running with legacy peer deps
npm install --legacy-peer-deps
```

If you encounter rate limits:
- Check your OpenAI API quota
- Verify Tavily API subscription level
- Consider adding caching to reduce API calls
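One lightweight way to add that caching is an in-memory TTL cache keyed by query. This is a hypothetical helper, not part of this repo — a sketch of the suggestion above:

```typescript
// Hypothetical in-memory TTL cache for search results. Check it before
// calling Tavily/OpenAI to avoid repeating identical requests.
type Entry<T> = { value: T; expiresAt: number };

class SearchCache<T> {
  private store = new Map<string, Entry<T>>();

  // ttlMs: how long a cached result stays valid (default: 5 minutes)
  constructor(private ttlMs: number = 5 * 60_000) {}

  get(query: string): T | undefined {
    const entry = this.store.get(query);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(query); // evict expired entry
      return undefined;
    }
    return entry.value;
  }

  set(query: string, value: T): void {
    this.store.set(query, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

// Usage: consult the cache before issuing a web search.
const cache = new SearchCache<string[]>();
cache.set("gdp growth 2023", ["https://example.com/article"]);
```

A persistent store (Redis, SQLite) would survive restarts, but an in-process map is often enough for a demo.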
- Verify all API keys are set in `.env.local`
- Check browser console for errors
- Ensure agent backend is running on port 2024
Important: Never commit API keys to version control
- Always use `.env.local` for sensitive values
- Add `.env.local` to `.gitignore` (already configured)
- Rotate API keys regularly
- Use environment-specific keys for dev/production
This project uses a simple two-branch strategy for safe deployments.
- `main` - Development branch (CI runs, no auto-deploy)
- `production` - Auto-deploys to production (protected branch)
```bash
# Develop and test on main
git push origin main

# Deploy to production when ready
git checkout production
git merge main
git push origin production
```

See `DEPLOYMENT.md` for detailed deployment instructions, including:
- Setting up Vercel and Railway
- Configuring GitHub Actions secrets
- Branch protection rules
- Rollback procedures
Contributions are welcome! This project demonstrates how to build AI-powered research tools with CopilotKit and MCP.
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
This project includes an example MCP integration. To add your own:
- Implement MCP server for your data source
- Update environment variables with your MCP endpoint
- Modify agent tools to call your MCP methods
- Update UI components to display your data types
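MCP tool calls are JSON-RPC 2.0 messages sent to your server's `tools/call` method. As a sketch of step 3, here is roughly what the agent would send; the tool name (`query_data`) and argument shape are hypothetical examples, not this repo's actual tools:

```typescript
// Shape of an MCP tools/call request (JSON-RPC 2.0).
interface McpToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

// Build the request body for a given tool invocation.
function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): McpToolCall {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// The agent would POST this body to the endpoint in TAKO_MCP_URL, e.g.:
// await fetch(process.env.TAKO_MCP_URL!, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildToolCall(1, "query_data", { query: "US GDP" })),
// });
```

In practice the official MCP SDKs handle this framing for you; the raw message is shown only to make the protocol concrete.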
The LangGraph agent is in `agents/typescript/src/`. Key files:

- `agent.ts` - Agent routing logic
- `chat.ts` - Chat and tool definitions
- `search.ts` - Search node implementation
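Conceptually, each LangGraph node is a function from the current state to a partial state update, which the graph runtime merges and routes onward. A schematic of that shape — the state fields and query heuristics here are illustrative, not the repo's actual implementation:

```typescript
// Illustrative agent state; the real shape lives in agents/typescript/src/.
interface AgentState {
  question: string;
  queries: string[];
  resources: { url: string; title: string }[];
}

// A query-generation node might derive search queries from the research
// question. Real nodes would call the LLM; this sketch is deterministic.
function generateQueriesNode(state: AgentState): Partial<AgentState> {
  const base = state.question.trim();
  return {
    queries: [base, `${base} statistics`, `${base} recent data`],
  };
}

// The runtime merges the returned partial into the state, then routes to
// the next node (e.g. the search node that actually fetches resources).
```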
The Next.js frontend is in `src/`. Key components:

- `components/ResearchCanvas.tsx` - Main research interface
- `components/Resources.tsx` - Resource display and management
MIT License - See LICENSE file for details
Built as a demonstration of CopilotKit + LangGraph + MCP integration