A lightweight and modern chat interface for MCP and LLM interactions with Markdown support!
A minimalist chat interface built with React and TypeScript, designed to be easily integrated with any backend. Features a clean and modern design.
```
├── Chatbot_ui
│   ├── LICENSE
│   ├── components.json
│   ├── eslint.config.js
│   ├── index.html
│   ├── package-lock.json
│   ├── package.json
│   ├── postcss.config.js
│   ├── public
│   │   └── integrations
│   │       └── integration.yaml
│   ├── src
│   │   ├── App.css
│   │   ├── App.tsx
│   │   ├── components
│   │   │   ├── custom
│   │   │   │   ├── actions.tsx
│   │   │   │   ├── chatinput.tsx
│   │   │   │   ├── header.tsx
│   │   │   │   ├── icons.tsx
│   │   │   │   ├── markdown.tsx
│   │   │   │   ├── message.tsx
│   │   │   │   ├── overview.tsx
│   │   │   │   ├── sidebar.tsx
│   │   │   │   ├── theme-toggle.tsx
│   │   │   │   └── use-scroll-to-bottom.ts
│   │   │   └── ui
│   │   │       ├── button.tsx
│   │   │       ├── card.tsx
│   │   │       ├── command.tsx
│   │   │       ├── dialog.tsx
│   │   │       ├── icons.tsx
│   │   │       ├── input.tsx
│   │   │       ├── label.tsx
│   │   │       ├── popover.tsx
│   │   │       ├── scroll-area.tsx
│   │   │       └── textarea.tsx
│   │   ├── context
│   │   │   └── ThemeContext.tsx
│   │   ├── index.css
│   │   ├── integrationManager.ts
│   │   ├── integrator.js
│   │   ├── interfaces
│   │   │   └── interfaces.ts
│   │   ├── lib
│   │   │   └── utils.ts
│   │   ├── main.tsx
│   │   ├── pages
│   │   │   └── chat
│   │   │       └── chat.tsx
│   │   └── vite-env.d.ts
│   ├── tailwind.config.js
│   ├── testbackend
│   │   └── test.py
│   ├── tsconfig.app.json
│   ├── tsconfig.app.tsbuildinfo
│   ├── tsconfig.json
│   ├── tsconfig.node.json
│   ├── tsconfig.node.tsbuildinfo
│   └── vite.config.ts
├── MCP.postman_collection.json
├── MCP_client
│   ├── build
│   │   ├── index.js
│   │   ├── mcpClientManager.js
│   │   └── server.js
│   ├── package-lock.json
│   ├── package.json
│   ├── src
│   │   ├── index.ts
│   │   ├── mcpClientManager.ts
│   │   └── server.ts
│   └── tsconfig.json
├── README.md
└── Screenshot
    ├── Mermaid_Chart.png
    └── image.png
```
- The `Chatbot_ui/` directory contains all the frontend code (React + TypeScript), configuration, and assets.
- The `MCP_client/` directory contains the MCP Client, which manages communication with the MCP server and LLM providers.
- The `Screenshot/` folder contains images for documentation.
Below is the architecture diagram of the chatbot:
```mermaid
flowchart LR
    subgraph MCP_Pilot["MCP_Pilot Repo"]
        direction TB
        A["Chatbot UI"]
        B["MCP Client"]
        F["LLM"]
    end
    subgraph MCP_Server["MCP Server"]
        C["Mars Rover Image Tool"]
        D["Earth Image Tool"]
        E["Weather Tool"]
    end
    subgraph NASA_MCP_Server["NASA_MCP_Server Repo"]
        direction TB
        MCP_Server
    end
    A -- Prompt --> B
    B -- Prompt + Tools info --> F
    F -- Tool choice + params --> B
    B -- Final result --> A
    F -- Final Result --> B
    B -- Get available tools --> MCP_Server
    MCP_Server -- Tools info --> B
    B -- Execute tool --> MCP_Server
    MCP_Server -- Tool result --> B
    B -- Tool Result --> F
    style MCP_Server stroke-width:2px,stroke:#888
    style MCP_Pilot stroke-width:2px,stroke:#FF4136
    style NASA_MCP_Server stroke-width:2px,stroke:#FF4136
```
| Description | YouTube Short | Commit/PR | Date |
|---|---|---|---|
| Commit Mermaid diagram on the MCP Chatbot | Watch | Commit | June 18 |
| Vite Learning | Watch | PR | June 17 |
| MCP chatbot architecture and folder structure clean-up | Watch | PR1, PR2 | June 14, June 16 |
| UI for MCP architecture demo | Watch | | |
| MCP backend API testing with Postman | Watch | | |
| Simple command-line interaction with MCP architecture | Watch | | |
| MCP Inspector 🚓🚨 | Watch | | |
| MCP client (TypeScript/Node) + server (Python) with 🌡️ Weather tool | Watch | | |
| MCP architecture diagram | Watch | | |
| VS Code Copilot MCP tool attachment in agent mode | Watch | | |
| Simple MCP | Watch | | |
| Mars Image API integration | Watch | | |
| Earth Image API integration | Watch | | |
| YAML file integration | Watch | | |
| Multi-tool integration + chat history | Watch | | |
| LLM response parsing (XML) | Watch | | |
| Exploring NASA APIs | Watch | | |
| Other | Watch1, Watch2 | | |
```
cd Chatbot_ui
npm install
npm run dev
```

The frontend will be available at http://localhost:8501 by default.
```
cd MCP_client
npm install
npm run dev
```

Create a `.env` file in the `MCP_client` directory and add your API keys:
```
OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
```
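Since a missing or wrong key is a common failure mode (see Troubleshooting below), it can help to fail fast at startup. A minimal sketch, assuming nothing about the project's actual code; `requireApiKey` is a hypothetical helper, not part of `MCP_client`:

```typescript
// Hypothetical startup check: verify that the API key for the chosen
// provider is present before connecting, and fail with a clear message.
function requireApiKey(
  env: Record<string, string | undefined>,
  provider: "openai" | "anthropic",
): string {
  // Variable names match the .env sample above.
  const name = provider === "openai" ? "OPENAI_API_KEY" : "ANTHROPIC_API_KEY";
  const key = env[name];
  if (!key) {
    throw new Error(`Missing ${name} in MCP_client/.env`);
  }
  return key;
}
```

Called as `requireApiKey(process.env, "openai")`, this surfaces a readable error instead of a failed request deep inside the provider SDK.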
The MCP Client configuration is currently hardcoded to use the following server URL:
```ts
const mcpConfig: MCPClientConfig = {
  serverUrl: "http://127.0.0.1:8000/mcp",
  provider: "openai",
};
```

Update this if your MCP server runs on a different URL.
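One way to avoid editing source for each environment is to read the URL from an environment variable. A sketch only; `MCP_SERVER_URL` and `resolveServerUrl` are assumed names, not something the project defines:

```typescript
// Hypothetical helper: prefer an environment variable, fall back to the
// hardcoded default from the config above.
function resolveServerUrl(env: Record<string, string | undefined>): string {
  return env.MCP_SERVER_URL ?? "http://127.0.0.1:8000/mcp";
}

// e.g. when building the config:
//   const mcpConfig: MCPClientConfig = {
//     serverUrl: resolveServerUrl(process.env),
//     provider: "openai",
//   };
```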
- Connect to the WebSocket server at `ws://localhost:8090`.
- Send messages as JSON: `{ "query": "What is the weather like today?" }`
- Or send plain text messages directly.
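The steps above can be sketched as a small client. Only the port and the `{ "query": ... }` message shape come from this README; everything else (names, the single-reply assumption) is illustrative. The socket is injected so the same code works with the browser `WebSocket`, the Node `ws` package, or a test double:

```typescript
// Minimal shape shared by browser WebSocket, the `ws` package, and fakes.
type WebSocketLike = {
  send(data: string): void;
  close(): void;
  onopen: (() => void) | null;
  onmessage: ((event: { data: unknown }) => void) | null;
  onerror: (() => void) | null;
};

// Build the JSON message format shown above.
function buildQueryMessage(query: string): string {
  return JSON.stringify({ query });
}

// Send one query and resolve with the first reply (real clients may stream).
function askChatbot(
  query: string,
  connect: (url: string) => WebSocketLike,
  url = "ws://localhost:8090",
): Promise<string> {
  return new Promise((resolve, reject) => {
    const socket = connect(url);
    socket.onopen = () => socket.send(buildQueryMessage(query));
    socket.onmessage = (event) => {
      resolve(String(event.data));
      socket.close();
    };
    socket.onerror = () => reject(new Error(`Could not reach ${url}`));
  });
}
```

In a browser this would be called as `askChatbot("What is the weather like today?", (url) => new WebSocket(url))`.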
- **Health Check:** `GET http://localhost:3000/health`
- **Execute Query:** `POST http://localhost:3000/mcp/execute` with `Content-Type: application/json` and body `{ "message": "Your question here" }`
- **Check MCP Status:** `GET http://localhost:3000/mcp/status`
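A sketch of calling the execute endpoint. Only the URL, method, header, and body shape come from this README; the response format is an assumption, so treat it as `unknown` until you check the server:

```typescript
// Hypothetical helper: assemble the execute request described above.
function buildExecuteRequest(message: string, base = "http://localhost:3000") {
  return {
    url: `${base}/mcp/execute`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message }),
    },
  };
}

// With fetch (browsers, Node 18+):
//   const { url, init } = buildExecuteRequest("Your question here");
//   const res = await fetch(url, init);
//   const data: unknown = await res.json();
```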
- WebSocket Server: Real-time communication on port 8090
- HTTP REST API: HTTP endpoints on port 3000
- MCP Integration: Seamless tool calling through MCP protocol
- Multi-Provider Support: Both OpenAI and Claude support
- Error Handling: Comprehensive error handling and logging
- Message Queuing: Queues messages when MCP client is initializing
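The message-queuing behavior can be sketched as follows. This is an illustration of the idea, assuming a buffer-then-flush design; the class and method names are hypothetical, not the project's actual API:

```typescript
// Buffer messages while the MCP client initializes; flush in order once ready.
class QueuedDispatcher {
  private ready = false;
  private pending: string[] = [];

  // `deliver` is whatever actually forwards a message to the MCP client.
  constructor(private deliver: (message: string) => void) {}

  send(message: string): void {
    if (this.ready) {
      this.deliver(message);
    } else {
      this.pending.push(message); // hold until initialization completes
    }
  }

  markReady(): void {
    this.ready = true;
    // Drain the queue in arrival order, then deliver directly from now on.
    for (const message of this.pending.splice(0)) this.deliver(message);
  }
}
```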
- **Frontend:** The `Chatbot_ui` directory contains the code for the chatbot frontend, built with TypeScript and React. Running `npm run dev` in this directory starts the development server on port 8501. The frontend acts as a WebSocket client, maintaining a persistent connection with the backend for real-time chat functionality.
- **MCP Client:** The `MCP_client` directory contains the MCP Client, which manages communication with the MCP server and LLM providers (OpenAI, Claude). It exposes both WebSocket (port 8090) and HTTP REST API (port 3000) interfaces for client communication.
- **Backend:** The backend MCP server is not included in this repository. It is available in the NASA MCP Server repository. You must run the backend separately to enable full chat functionality.
- **MCP Client not ready:** Ensure your MCP server path is correct and the Python environment is set up properly.
- **API Key errors:** Make sure your `.env` file contains valid API keys for your chosen provider.
- **Connection errors:** Check that your MCP server script exists and is executable.
- **Port conflicts:** Ensure ports 8090 and 3000 are available.
The server provides comprehensive logging. Check the console output for detailed error messages and connection status.
