diff --git a/.dockerignore b/.dockerignore index cbf79e7..d1b6d97 100644 --- a/.dockerignore +++ b/.dockerignore @@ -16,4 +16,8 @@ node_modules .idea dist build -.venv \ No newline at end of file +.venv + +# Exclude projects folder entirely - it will be mounted as volume +packages/backend/projects/ +packages/backend/projects/** \ No newline at end of file diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md new file mode 100644 index 0000000..0151147 --- /dev/null +++ b/ARCHITECTURE.md @@ -0,0 +1,426 @@ +# AIM-RED Toolkit Architecture Documentation + +## Overview + +AIM-RED Toolkit (AIM-Forge) is a visual flow-based Python IDE built with a microservices architecture. The system separates concerns between the backend API server, executor service, and frontend client, providing complete isolation for code execution and LSP management. + +## System Architecture + +### Service Architecture + +``` +┌──────────────────────────────────────────────────────────────┐ +│ Frontend (React) │ +│ - Monaco Editor with LSP integration │ +│ - XYFlow for visual programming │ +│ - Terminal emulator (xterm.js) │ +└─────────────────────┬────────────────────────────────────────┘ + │ HTTP/WebSocket +┌─────────────────────▼────────────────────────────────────────┐ +│ Backend API Server │ +│ - FastAPI (port 8000) │ +│ - Project management │ +│ - Request routing and proxy │ +│ - Static file serving │ +└─────────────────────┬────────────────────────────────────────┘ + │ HTTP (Internal) +┌─────────────────────▼────────────────────────────────────────┐ +│ Executor Service │ +│ - FastAPI (port 8001) │ +│ - Code execution in isolated subprocesses │ +│ - LSP process management │ +│ - Virtual environment management │ +│ - Terminal PTY sessions │ +└──────────────────────────────────────────────────────────────┘ +``` + +## Core Components + +### 1. 
Backend API Server (Port 8000) + +The backend server acts as the main entry point and coordinator: + +#### Responsibilities: +- **Project Management**: CRUD operations for projects, nodes, and edges +- **Request Proxying**: Routes execution and LSP requests to executor service +- **File Management**: Manages project structure and metadata +- **Frontend Serving**: Serves the React application + +#### Key Endpoints: +``` +GET /api/project/ # List all projects +POST /api/project/make # Create new project +GET /api/project/{id} # Get project structure +PUT /api/project/{id} # Update project +DELETE /api/project/{id} # Delete project + +# Proxied to Executor +POST /api/code/execute-node # Execute single node +POST /api/project/execute-flow # Execute entire flow +GET /api/project/{id}/venv-status # Check venv status +``` + +### 2. Executor Service (Port 8001) + +The executor service handles all code execution and language server operations in complete isolation: + +#### Architecture Principles: +- **Process Isolation**: Each execution runs in a separate subprocess +- **Virtual Environment**: Each project has its own Python virtual environment +- **No Side Effects**: Backend environment remains unaffected by project code +- **Concurrent Execution**: Multiple projects can execute simultaneously + +#### Key Components: + +##### Code Execution (`/api/execute/*`) +```python +# Execution Flow: +1. Receive execution request with project_id and node_id +2. Load node code from file system +3. Get project's Python executable from venv +4. Create subprocess with venv Python +5. Execute code with input_data parameter +6. Capture and return output/errors +``` + +##### Virtual Environment Management (`/api/venv/*`) +```python +# Venv Creation Process: +1. Create venv structure asynchronously +2. Install base packages (fastapi, uvicorn, etc.) +3. Install LSP servers (pyright, ruff) +4. Track status: not_started → creating → installing → completed +5. 
Frontend polls status until ready +``` + +##### LSP Management (`/api/lsp/*`) +```python +# LSP Architecture: +- Per-project LSP processes (Pyright and Ruff) +- Single reader per process (prevents asyncio conflicts) +- Multiple WebSocket clients per LSP process +- Auto-restart with exponential backoff +- Idle timeout and resource cleanup +``` + +### 3. LSP Server Implementation + +The LSP implementation provides IDE-level code intelligence: + +#### LSP Gateway Architecture + +``` +Frontend Monaco Editor + ↓ +WebSocket Connection + ↓ +LSP Gateway (Executor Service) + ↓ +LSP Connection Manager + ├── Single Reader Task (per LSP process) + ├── Writer Lock (thread-safe writes) + └── Client Broadcasting + ↓ + LSP Process (Pyright/Ruff) + Running in Project Venv +``` + +#### Key Features: + +##### Single Reader Pattern +```python +class LspConnection: + """Manages LSP process with multiple WebSocket clients""" + project_id: str + lsp_type: LspType # "pyright" or "ruff" + process: LspProcess + clients: Set[WebSocket] + reader_task: asyncio.Task # Single reader + writer_lock: asyncio.Lock # Thread-safe writes +``` + +##### Process Lifecycle Management +```python +# Auto-restart with exponential backoff +- Initial failure: Wait 2 seconds +- Second failure: Wait 4 seconds +- Third failure: Wait 8 seconds +- Maximum wait: 30 seconds +- Maximum restarts: 5 in 1-minute window + +# Idle cleanup +- Tracks last_activity_ts for each process +- Kills processes idle > 10 minutes +- Automatic cleanup task runs every 30 seconds +``` + +##### Content-Length Protocol +```python +# LSP Message Format: +Content-Length: {size}\r\n\r\n +{JSON-RPC message body} + +# Handled by: +- Reader: Parses Content-Length headers +- Writer: Preserves headers from client +- Gateway: Broadcasts complete frames +``` + +## Request Flow Examples + +### 1. 
Code Execution Flow + +``` +User clicks "Run Node" in Frontend + ↓ +Frontend sends POST /api/code/execute-node + ↓ +Backend proxies to Executor via executor_proxy.py + ↓ +Executor Service: + 1. Validates project venv exists + 2. Loads node code from file + 3. Wraps code with input_data handler + 4. Creates subprocess with project Python + 5. Executes code with 30s timeout + 6. Returns output/error to frontend +``` + +### 2. LSP Connection Flow + +``` +IDE Component mounts in Frontend + ↓ +Creates WebSocket to /api/lsp/pyright?project_id={id} + ↓ +Backend proxies WebSocket to Executor + ↓ +Executor LSP Gateway: + 1. Checks if LSP process exists + 2. Starts new process if needed (with venv) + 3. Adds client to connection + 4. Starts single reader task + ↓ +Reader Task: + - Reads from LSP stdout + - Parses Content-Length frames + - Broadcasts to all clients + ↓ +Writer (from clients): + - Receives LSP requests + - Writes to LSP stdin (with lock) +``` + +### 3. Virtual Environment Creation + +``` +User creates new project + ↓ +Backend creates project structure + ↓ +Backend calls executor_proxy.create_project_venv() + ↓ +Executor starts async venv creation: + 1. Creates venv structure + 2. Upgrades pip + 3. Installs base packages + 4. Installs LSP servers + ↓ +Frontend polls /api/project/{id}/venv-status + ↓ +When status = "completed": + - IDE connects LSP + - Terminal becomes available +``` + +## Data Flow Patterns + +### Flow Execution + +```python +# Topological execution order: +1. Find start nodes (or use specified start_node_id) +2. Build dependency graph from edges +3. Execute nodes in topological order +4. Pass output as input_data to next nodes +5. Collect results in result nodes + +# Each node execution: +def main(input_data): + # Process input from previous node + # Return output for next node + return processed_data +``` + +### Terminal Integration + +```python +# Terminal with venv activation: +1. Create PTY process +2. 
Set environment variables:
   - VIRTUAL_ENV = project venv path
   - PATH = venv/bin:$PATH
3. Handle package installation:
   - Detect pip install/uninstall
   - Send package_changed event
   - Trigger LSP restart
```

## Security and Isolation

### Process Isolation Layers

1. **Docker Container Isolation**
   - Backend and Executor run in separate containers
   - Network isolation between services
   - Resource limits per container

2. **Subprocess Isolation**
   - Each code execution in new subprocess
   - No shared memory or state
   - Timeout enforcement (default 30s)

3. **Virtual Environment Isolation**
   - Per-project Python environments
   - Independent package installations
   - No cross-project dependencies

4. **LSP Process Isolation**
   - Separate LSP processes per project
   - Independent process lifecycle
   - Resource cleanup on idle

## Configuration

### Environment Variables

```bash
# Executor Service
LSP_LOG_LEVEL=DEBUG # LSP logging verbosity
LSP_IDLE_TTL_MS=600000 # 10-minute idle timeout
LSP_MAX_RESTARTS=5 # Max restart attempts
LSP_RESTART_WINDOW_MS=60000 # Restart window (1 minute)

TERMINAL_IDLE_TIMEOUT_MS=600000 # Terminal idle timeout
TERMINAL_MAX_SESSION_MS=3600000 # Max terminal session

# Backend Service
EXECUTOR_URL=http://executor:8001 # Executor service URL
DISABLE_RELOAD=true # Disable hot reload
```

### Docker Compose Services

```yaml
services:
  backend:
    ports: ["8000:8000"]
    volumes:
      - ./packages/backend:/app
      - backend-projects:/app/projects
    environment:
      - EXECUTOR_URL=http://executor:8001

  executor:
    ports: ["8001:8001"]  # Published for development; keep internal-only in production
    volumes:
      - backend-projects:/app/projects
    environment:
      - LSP_LOG_LEVEL=INFO

  frontend:
    ports: ["5173:5173"]
    depends_on:
      - backend
```

## Performance Optimizations

### LSP Optimizations
- **Single Reader Pattern**: Prevents asyncio read conflicts
- **Connection Pooling**: Reuses LSP processes across reconnects
- **Exponential 
Backoff**: Reduces restart storms
- **Idle Cleanup**: Frees resources automatically

### Execution Optimizations
- **Per-Execution Subprocesses**: Fresh interpreter per run, no shared state between executions
- **Async Venv Creation**: Non-blocking project creation
- **Parallel Flow Execution**: Concurrent node execution (bounded by `max_workers`)

### WebSocket Optimizations
- **Frame Batching**: Combines small messages
- **Binary Protocol**: Reduces message size
- **Client Broadcasting**: Single read, multiple writes
- **Heartbeat/Keepalive**: Maintains connections

## Monitoring and Debugging

### LSP Monitoring
```bash
# Check LSP health
GET /api/lsp/health?project_id={id}

# View LSP logs
GET /api/lsp/logs?n=50

# View LSP stdio
GET /api/lsp/stdio?project_id={id}&lsp_type=pyright&stream=stderr
```

### Execution Monitoring
- Execution timeouts logged
- Process exit codes captured
- Stderr/stdout separated
- Tracebacks preserved

### System Health
```bash
# Backend health
GET /api/health

# Executor health
GET /health

# Service status
docker-compose ps
docker-compose logs -f executor
```

## Common Issues and Solutions

### LSP Connection Issues
**Problem**: LSP not connecting or dropping frequently
**Solution**:
- Verify venv creation completed
- Check LSP process health endpoint
- Review LSP stdio logs for errors
- Ensure pyright/ruff installed in venv

### Execution Failures
**Problem**: Code execution fails or times out
**Solution**:
- Check venv Python executable exists
- Verify code syntax before execution
- Increase timeout for long-running code
- Check subprocess resource limits

### Virtual Environment Issues
**Problem**: Venv creation fails or packages missing
**Solution**:
- Check disk space in Docker volume
- Verify base package URLs are accessible
- Review venv creation logs
- Manually trigger venv recreation

## Future Enhancements

### Planned Improvements
1. 
**Distributed Execution**: Scale executor service horizontally +2. **Result Caching**: Cache node outputs for faster re-execution +3. **Hot Module Reload**: Update code without LSP restart +4. **Multi-Language Support**: Add support for JavaScript, Go, Rust +5. **Collaborative Editing**: Real-time multi-user editing +6. **Cloud Deployment**: Kubernetes deployment manifests +7. **Performance Profiling**: Execution time analysis +8. **Advanced Debugging**: Breakpoint and step-through debugging \ No newline at end of file diff --git a/DOCKER.md b/DOCKER.md deleted file mode 100644 index 1bc33b1..0000000 --- a/DOCKER.md +++ /dev/null @@ -1,117 +0,0 @@ -# Docker Configuration for AIM-RED Toolkit - -## Overview -This project uses Docker Compose to containerize both frontend and backend services. The configuration ensures proper file persistence for project data. - -## Volume Mount Structure - -### Backend Volumes -- `./packages/backend/app:/app/app` - Mounts source code for hot-reload in development -- `./packages/backend/projects:/app/projects` - Persistent storage for project files - -### Why This Structure Works -1. **Project Files Persistence**: All project files created/modified by the backend API are stored in `./packages/backend/projects` on the host machine -2. **Hot Reload**: Source code changes in `./packages/backend/app` are immediately reflected in the container -3. 
**Data Safety**: Project data persists even when containers are recreated - -## Running with Docker - -### Development Mode -```bash -# Build and start services -docker-compose up --build - -# Or run in background -docker-compose up -d --build - -# Stop services -docker-compose down - -# View logs -docker-compose logs -f backend -docker-compose logs -f frontend -``` - -### Production Mode -```bash -# Build and start production services -docker-compose -f docker-compose.prod.yml up --build - -# Run in background -docker-compose -f docker-compose.prod.yml up -d --build - -# Stop production services -docker-compose -f docker-compose.prod.yml down -``` - -## File Operations in Docker - -The backend performs the following file operations that work correctly with the volume mounts: - -1. **Create Project**: Creates a new folder in `/app/projects/{project_id}/` -2. **Save Node Code**: Writes Python files to `/app/projects/{project_id}/{node_id}_{title}.py` -3. **Read Project Structure**: Reads from `/app/projects/projects.json` and `/app/projects/{project_id}/structure.json` -4. **Delete Project**: Removes entire project folder from `/app/projects/{project_id}/` - -All these operations map to `./packages/backend/projects/` on the host machine, ensuring: -- Files are accessible from the host for debugging -- Data persists across container restarts -- Multiple developers can share project files - -## Troubleshooting - -### Permission Issues -If you encounter permission issues with the projects directory: -```bash -# Fix permissions on host -chmod -R 755 packages/backend/projects -``` - -### Projects Directory Not Found -The projects directory is automatically created by the backend when needed. 
If missing: -```bash -# Create manually with proper structure -mkdir -p packages/backend/projects -touch packages/backend/projects/.gitkeep -``` - -### Container Can't Write Files -Ensure the volume mount is correct in docker-compose.yml: -```yaml -volumes: - - ./packages/backend/projects:/app/projects # Correct - # NOT: - ./packages/backend:/app # This would override the entire app directory -``` - -## Data Backup - -To backup project data: -```bash -# Backup all projects -tar -czf projects-backup-$(date +%Y%m%d).tar.gz packages/backend/projects/ - -# Restore from backup -tar -xzf projects-backup-20240101.tar.gz -``` - -## Development vs Production - -### Development (docker-compose.yml) -- Mounts source code for hot-reload -- Runs with `--reload` flag -- Frontend on port 5173 -- Backend on port 8000 - -### Production (docker-compose.prod.yml) -- No source code mounting (uses built image) -- No `--reload` flag -- Frontend on port 80 (nginx) -- Backend on port 8000 -- Includes restart policies - -## Network Configuration - -Both services are on the same Docker network (`aim-red-network`), allowing: -- Frontend to reach backend via `http://backend:8000` -- No CORS issues between services -- Isolated from other Docker containers \ No newline at end of file diff --git a/docker-compose.prod.yml b/docker-compose.prod.yml deleted file mode 100644 index 70d8d4e..0000000 --- a/docker-compose.prod.yml +++ /dev/null @@ -1,43 +0,0 @@ -version: '3.8' - -services: - backend: - build: - context: . 
- dockerfile: ./packages/backend/Dockerfile.prod - container_name: aim-red-backend-prod - ports: - - "8000:8000" - volumes: - # Only mount the projects directory for persistent storage - # No source code mounting in production - - ./packages/backend/projects:/app/projects - environment: - - PYTHONUNBUFFERED=1 - networks: - - aim-red-network - restart: unless-stopped - healthcheck: - test: ["CMD", "curl", "-f", "http://localhost:8000/api/health"] - interval: 30s - timeout: 10s - retries: 3 - - frontend: - build: - context: . - dockerfile: ./packages/frontend/Dockerfile.prod - container_name: aim-red-frontend-prod - ports: - - "80:80" - environment: - - VITE_API_URL=/api - depends_on: - - backend - networks: - - aim-red-network - restart: unless-stopped - -networks: - aim-red-network: - driver: bridge \ No newline at end of file diff --git a/docker-compose.yml b/docker-compose.yml index d522a57..0432dbc 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -1,25 +1,64 @@ services: backend: build: - context: . 
- dockerfile: ./packages/backend/Dockerfile + context: ./packages/backend + dockerfile: Dockerfile container_name: aim-red-backend ports: - "8000:8000" volumes: - # Mount only the source code for hot-reload - - ./packages/backend/app:/app/app - # Mount the projects directory for persistent storage - - ./packages/backend/projects:/app/projects + # Mount source code for hot-reload + - ./packages/backend/app:/app/app:delegated + # Mount projects directory for persistent storage (metadata only) + - ./packages/backend/projects:/app/projects:delegated environment: - PYTHONUNBUFFERED=1 + - DEVELOPMENT=true + - EXECUTOR_URL=http://executor:8001 networks: - aim-red-network + depends_on: + - executor + command: ["python", "-m", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"] healthcheck: test: ["CMD", "curl", "-f", "http://localhost:8000/api/health"] - interval: 30s - timeout: 10s - retries: 3 + interval: 15s + timeout: 5s + retries: 5 + start_period: 15s + + executor: + build: + context: ./packages/executor + dockerfile: Dockerfile + container_name: aim-red-executor + ports: + - "8001:8001" + volumes: + # Mount source code + - ./packages/executor/app:/app/app:delegated + # Mount projects directory for venvs and code + - executor-projects:/app/projects:delegated + # Mount LSP logs directory + - lsp-logs:/var/log/aim-red/lsp + environment: + - PYTHONUNBUFFERED=1 + - DEVELOPMENT=true + # LSP Configuration + - LSP_LOG_LEVEL=INFO + - LSP_STDIO_LOG_DIR=/var/log/aim-red/lsp + - LSP_IDLE_TTL_MS=600000 + - LSP_MAX_RESTARTS=5 + - LSP_RESTART_WINDOW_MS=60000 + networks: + - aim-red-network + command: ["python", "-m", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8001"] + healthcheck: + test: ["CMD", "curl", "-f", "http://localhost:8001/health"] + interval: 15s + timeout: 5s + retries: 5 + start_period: 15s frontend: build: @@ -29,16 +68,33 @@ services: ports: - "5173:5173" volumes: - - ./packages/frontend:/app/packages/frontend - - 
/app/packages/frontend/node_modules - - /app/node_modules + # Use named volumes for node_modules + - frontend-node-modules:/app/node_modules + - frontend-pkg-node-modules:/app/packages/frontend/node_modules environment: - VITE_API_URL=http://backend:8000 + - VITE_EXECUTOR_URL=http://executor:8001 + - NODE_OPTIONS=--max-old-space-size=4096 depends_on: - backend networks: - aim-red-network + stdin_open: true + tty: true + command: > + sh -c "cd /app/packages/frontend && + pnpm dev --host 0.0.0.0" networks: aim-red-network: driver: bridge + +volumes: + lsp-logs: + driver: local + executor-projects: + driver: local + frontend-node-modules: + driver: local + frontend-pkg-node-modules: + driver: local diff --git a/package.json b/package.json index bb4d239..7f23a02 100644 --- a/package.json +++ b/package.json @@ -11,9 +11,13 @@ "docker:build": "docker-compose build", "docker:up": "docker-compose up", "docker:down": "docker-compose down", - "test": "echo \"Error: no test specified\" && exit 1" + "test": "echo \"Error: no test specified\" && exit 1", + "postinstall": "npm install -g pyright" }, "keywords": [], "author": "", - "license": "ISC" + "license": "ISC", + "devDependencies": { + "pyright": "^1.1.404" + } } diff --git a/packages/backend/.dockerignore b/packages/backend/.dockerignore new file mode 100644 index 0000000..1848553 --- /dev/null +++ b/packages/backend/.dockerignore @@ -0,0 +1,70 @@ +# Python +__pycache__/ +*.py[cod] +*$py.class +*.so +.Python +env/ +venv/ +ENV/ +.venv/ +pip-log.txt +pip-delete-this-directory.txt +.pytest_cache/ +*.cover +.coverage +.coverage.* +coverage.xml +*.log + +# Project specific - IMPORTANT: Exclude projects folder to prevent symlink issues +projects/ +projects/** +projects/*/venv/ +projects/*/venv/** +*.db +*.sqlite +*.sqlite3 + +# IDE +.vscode/ +.idea/ +*.swp +*.swo +*~ + +# OS +.DS_Store +Thumbs.db + +# Testing +htmlcov/ +.tox/ +.nox/ +.cache +nosetests.xml +test-results/ +junit.xml + +# Documentation +docs/_build/ +site/ + +# 
Distribution / packaging +.Python +build/ +develop-eggs/ +dist/ +downloads/ +eggs/ +.eggs/ +lib/ +lib64/ +parts/ +sdist/ +var/ +wheels/ +*.egg-info/ +.installed.cfg +*.egg +MANIFEST \ No newline at end of file diff --git a/packages/backend/API_DOCUMENTATION.md b/packages/backend/API_DOCUMENTATION.md index d49ab82..6fb3533 100644 --- a/packages/backend/API_DOCUMENTATION.md +++ b/packages/backend/API_DOCUMENTATION.md @@ -3,26 +3,34 @@ ## Base Information - **Base URL**: `http://localhost:8000` - **API Version**: 1.0.0 -- **Authentication**: None (CORS enabled for localhost:5173) +- **Authentication**: None (CORS enabled for localhost:5173, localhost:3000, frontend:5173, frontend:3000) ## Table of Contents 1. [Root Endpoint](#root-endpoint) 2. [Health Check APIs](#health-check-apis) 3. [Code Management APIs](#code-management-apis) 4. [Project Management APIs](#project-management-apis) +5. [LSP Management APIs](#lsp-management-apis) +6. [WebSocket Endpoints](#websocket-endpoints) +7. [Package Management APIs](#package-management-apis) --- ## Root Endpoint ### GET / -Get API information +Get API information with feature status **Response** ```json { "message": "AIM Red Toolkit Backend API", - "status": "healthy" + "status": "healthy", + "features": { + "lsp": "enabled", + "pyright": "available", + "ruff": "available" + } } ``` @@ -56,35 +64,6 @@ Get system version information ## Code Management APIs -### POST /api/code/execute -Execute Python code in a secure environment - -**Request Body** -```json -{ - "code": "print('Hello, World!')", - "language": "python", // Optional, default: "python" - "timeout": 30 // Optional, default: 30 seconds -} -``` - -**Response** -```json -{ - "output": "Hello, World!\n", - "error": null, - "exit_code": 0 -} -``` - -**Error Response** -```json -{ - "output": "", - "error": "Code execution timed out", - "exit_code": -1 -} -``` ### POST /api/code/getcode Get code content of a specific node @@ -102,7 +81,7 @@ Get code content of a specific 
node ```json { "success": true, - "code": "# Node: Data Input\n# ID: 1\n\nprint('Hello, World!')", + "code": "def main(input_data=None):\n output_data = input_data\n return output_data", "language": "python", "node_id": "1", "node_title": "Data Input" @@ -113,7 +92,7 @@ Get code content of a specific node ```json { "success": true, - "code": "# Write your Python code here\nprint('Hello, World!')", + "code": "# Default template code...", "language": "python", "node_id": "1", "node_title": "Data Input", @@ -147,6 +126,37 @@ Save code to a node's Python file - `404`: Node not found - `500`: Server error +### POST /api/code/execute-node +Execute a single node with optional input data + +**Request Body** +```json +{ + "project_id": "unique_project_id", + "node_id": "1", + "input_data": {"key": "value"} // Optional +} +``` + +**Response (Success)** +```json +{ + "success": true, + "output": {"result": "data"}, + "node_id": "1" +} +``` + +**Response (Error)** +```json +{ + "success": false, + "error": "Error message", + "traceback": "Full traceback...", + "node_id": "1" +} +``` + --- ## Project Management APIs @@ -190,7 +200,7 @@ Get specific project's node-edge structure "nodes": [ { "id": "1", - "type": "default", + "type": "custom", "position": {"x": 100, "y": 100}, "data": { "title": "Data Input", @@ -218,7 +228,7 @@ Get specific project's node-edge structure - `404`: Project not found ### POST /api/project/make -Create a new project +Create a new project with virtual environment **Request Body** ```json @@ -233,13 +243,32 @@ Create a new project ```json { "success": true, - "message": "Project 'new_project' created successfully" + "message": "Project 'new_project' created successfully", + "venv_created": true } ``` **Error Responses** - `400`: Project already exists (checks both name and ID) +### GET /api/project/{project_id}/venv-status +Check virtual environment status for a project + +**Path Parameters** +- `project_id` (string): Project identifier + +**Response** 
+```json +{ + "success": true, + "project_id": "unique_project_id", + "venv_ready": true, + "venv_exists": true, + "venv_healthy": true +} +``` + + ### DELETE /api/project/delete Delete an entire project @@ -270,7 +299,7 @@ Create a new node in a project { "project_id": "unique_project_id", "node_id": "2", - "node_type": "default", + "node_type": "custom", "position": {"x": 400, "y": 100}, "data": { "title": "Preprocessing", @@ -286,7 +315,7 @@ Create a new node in a project "message": "Node 'Preprocessing' created successfully", "node": { "id": "2", - "type": "default", + "type": "custom", "position": {"x": 400, "y": 100}, "data": { "title": "Preprocessing", @@ -322,6 +351,27 @@ Delete a node from a project **Error Responses** - `404`: Node or project not found +### PUT /api/project/updatenode/position +Update node position in the flow + +**Request Body** +```json +{ + "project_id": "unique_project_id", + "node_id": "1", + "position": {"x": 200, "y": 150} +} +``` + +**Response** +```json +{ + "success": true, + "message": "Node position updated", + "node_id": "1" +} +``` + ### POST /api/project/makeedge Create a new edge between nodes @@ -379,6 +429,356 @@ Delete an edge from a project **Error Responses** - `404`: Edge or project not found +### POST /api/project/execute-flow +Execute the entire flow starting from a specific node or all start nodes + +**Request Body** +```json +{ + "project_id": "unique_project_id", + "start_node_id": "1", // Optional, null executes all start nodes + "params": {}, // Optional parameters + "max_workers": 4, // Optional, default: 4 (1-10) + "timeout_sec": 30, // Optional, default: 30 (1-300) + "halt_on_error": true // Optional, default: true +} +``` + +**Response** +```json +{ + "success": true, + "project_id": "unique_project_id", + "execution_time": 1.234, + "nodes_executed": 5, + "results": { + "node_1": { + "success": true, + "output": {"data": "result"}, + "execution_time": 0.5 + } + }, + "errors": [] +} +``` + + +--- + +## LSP 
Management APIs + +### GET /api/lsp/health +Check LSP process health status + +**Query Parameters** +- `project_id` (string, required): Project ID +- `lsp_type` (string, optional): "pyright" or "ruff" + +**Response (Specific LSP)** +```json +{ + "project_id": "unique_project_id", + "lsp_type": "pyright", + "running": true, + "pid": 12345, + "uptime_sec": 123.45, + "restart_count": 0, + "last_activity_sec": 5.2 +} +``` + +**Response (All LSPs)** +```json +{ + "project_id": "unique_project_id", + "pyright": { + "running": true, + "pid": 12345, + "uptime_sec": 123.45, + "restart_count": 0, + "last_activity_sec": 5.2 + }, + "ruff": { + "running": false, + "pid": null, + "uptime_sec": 0, + "restart_count": 0, + "last_activity_sec": null + } +} +``` + +### POST /api/lsp/restart/{lsp_type} +Manually restart an LSP process + +**Path Parameters** +- `lsp_type` (string): "pyright" or "ruff" + +**Query Parameters** +- `project_id` (string, required): Project ID + +**Response** +```json +{ + "success": true, + "project_id": "unique_project_id", + "lsp_type": "pyright", + "before": { + "running": true, + "pid": 12345 + }, + "after": { + "running": true, + "pid": 12346 + }, + "message": "Successfully restarted pyright LSP" +} +``` + +### POST /api/lsp/stop/{lsp_type} +Manually stop an LSP process + +**Path Parameters** +- `lsp_type` (string): "pyright" or "ruff" + +**Query Parameters** +- `project_id` (string, required): Project ID + +**Response** +```json +{ + "success": true, + "project_id": "unique_project_id", + "lsp_type": "pyright", + "message": "Successfully stopped pyright LSP" +} +``` + +### GET /api/lsp/logs +Get recent LSP event logs + +**Query Parameters** +- `n` (integer, optional): Number of events (1-2000, default: 200) + +**Response** +```json +{ + "count": 50, + "events": [ + { + "time": "2024-01-01T12:00:00.000Z", + "level": "INFO", + "message": "LSP started", + "project_id": "unique_project_id", + "lsp_type": "pyright" + } + ] +} +``` + +### GET /api/lsp/stdio 
+Get stdio logs for a specific LSP process + +**Query Parameters** +- `project_id` (string, required): Project ID +- `lsp_type` (string, required): "pyright" or "ruff" +- `stream` (string, optional): "stdout" or "stderr" (default: "stdout") +- `lines` (integer, optional): Number of lines (1-1000, default: 100) + +**Response** +```json +{ + "project_id": "unique_project_id", + "lsp_type": "pyright", + "stream": "stderr", + "lines": [ + "2024-01-01 12:00:00 - Starting Pyright language server...", + "2024-01-01 12:00:01 - Server initialized" + ], + "count": 2 +} +``` + +### GET /api/lsp/status +Get global LSP manager status + +**Response** +```json +{ + "active_processes": [ + { + "project_id": "project1", + "lsp_type": "pyright", + "running": true, + "pid": 12345, + "uptime_sec": 100.5 + } + ], + "count": 1, + "config": { + "idle_ttl_ms": 600000, + "max_restarts": 5, + "restart_window_ms": 60000, + "log_level": "DEBUG" + } +} +``` + +--- + +## WebSocket Endpoints + +### WS /api/lsp/python +WebSocket endpoint for Pyright Language Server Protocol + +**Query Parameters** +- `project_id` (string, required): Project ID + +**Connection Flow** +1. Client connects to WebSocket +2. Server checks virtual environment exists +3. Server starts/connects to Pyright process +4. Bidirectional LSP message exchange +5. Auto-reconnect on process restart (close code 4001) + +**Close Codes** +- `1000`: Normal closure +- `4001`: LSP process restarting +- `4002`: LSP process error +- `4003`: LSP process crashed + +### WS /api/lsp/ruff +WebSocket endpoint for Ruff Language Server Protocol + +**Query Parameters** +- `project_id` (string, required): Project ID + +Same behavior as Pyright endpoint but for Ruff linting/formatting. 
+ +### WS /api/terminal +WebSocket endpoint for interactive terminal sessions + +**Query Parameters** +- `project_id` (string, required): Project ID +- `mode` (string, optional): "pkg" or "shell" (default: "pkg") + +**Message Types** + +Client to Server: +```json +{ + "type": "input", + "data": "ls -la\n" +} +``` + +```json +{ + "type": "resize", + "rows": 24, + "cols": 80 +} +``` + +Server to Client: +```json +{ + "type": "output", + "data": "file1.txt\nfile2.txt\n" +} +``` + +```json +{ + "type": "error", + "message": "Terminal session expired" +} +``` + +```json +{ + "type": "package_changed", + "action": "install", + "package": "numpy" +} +``` + +**Session Management** +- Idle timeout: 10 minutes (configurable) +- Max session duration: 1 hour (configurable) +- Automatic virtual environment activation +- Working directory set to project root + +--- + +## Package Management APIs + +### POST /api/code/packages/install +Install a package in the project's virtual environment + +**Request Body** +```json +{ + "project_id": "unique_project_id", + "package": "numpy==1.24.0" +} +``` + +**Response** +```json +{ + "success": true, + "message": "Successfully installed numpy==1.24.0", + "project_id": "unique_project_id", + "package": "numpy==1.24.0" +} +``` + +### POST /api/code/packages/uninstall +Uninstall a package from the project's virtual environment + +**Request Body** +```json +{ + "project_id": "unique_project_id", + "package": "numpy" +} +``` + +**Response** +```json +{ + "success": true, + "message": "Successfully uninstalled numpy", + "project_id": "unique_project_id", + "package": "numpy" +} +``` + +### POST /api/code/packages/list +Get list of installed packages in the project's virtual environment + +**Request Body** +```json +{ + "project_id": "unique_project_id" +} +``` + +**Response** +```json +{ + "success": true, + "project_id": "unique_project_id", + "packages": [ + {"name": "numpy", "version": "1.24.0"}, + {"name": "pandas", "version": "2.0.0"} + ], + 
"python_executable": "/app/projects/unique_project_id/venv/bin/python" +} +``` + + --- ## File Structure @@ -386,11 +786,14 @@ Delete an edge from a project The backend manages files in the following structure: ``` -projects/ -├── projects.json # Registry of all projects -└── {project_id}/ # Project folder (named by project_id) - ├── structure.json # Node-edge structure for the project - └── {node_id}_{node_title}.py # Python code for each node +packages/backend/ +├── projects/ +│ ├── projects.json # Registry of all projects +│ └── {project_id}/ # Project folder (named by project_id) +│ ├── structure.json # Node-edge structure for the project +│ ├── venv/ # Project-specific virtual environment +│ ├── pyrightconfig.json # Auto-generated Pyright configuration +│ └── {node_id}_{sanitized_title}.py # Python code for each custom node ``` ### projects.json Format @@ -415,7 +818,7 @@ projects/ "nodes": [ { "id": "1", - "type": "default", + "type": "custom", "position": {"x": 100, "y": 100}, "data": { "title": "Node Title", @@ -458,6 +861,29 @@ Common HTTP status codes: --- +## Environment Variables + +### LSP Configuration +```bash +LSP_LOG_LEVEL=DEBUG # Log level: DEBUG, INFO, WARNING, ERROR +LSP_IDLE_TTL_MS=600000 # Idle timeout before stopping LSP (10 min) +LSP_MAX_RESTARTS=5 # Max restart attempts +LSP_RESTART_WINDOW_MS=60000 # Time window for restart counting +``` + +### Terminal Configuration +```bash +TERMINAL_IDLE_TIMEOUT_MS=600000 # Terminal idle timeout (10 min) +TERMINAL_MAX_SESSION_MS=3600000 # Max terminal session duration (1 hour) +``` + +### Server Configuration +```bash +DISABLE_RELOAD=true # Disable hot reload to prevent venv-related crashes +``` + +--- + ## Development ### Running the Backend Server @@ -467,19 +893,24 @@ pnpm backend:dev # Or directly with Python cd packages/backend -python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 8000 +python -m uvicorn app.main:app --host 0.0.0.0 --port 8000 + +# Or using the development runner (reload 
disabled by default)
+python run_dev.py
```

### Server Configuration
- **Host**: 0.0.0.0
- **Port**: 8000
-- **Auto-reload**: Enabled in development
+- **Auto-reload**: DISABLED by default (see DISABLE_RELOAD env var)

### Dependencies
-- FastAPI
-- Uvicorn
-- Pydantic
+- FastAPI 0.115.5
+- Uvicorn 0.32.1
+- Pydantic 2.10.3
- Python 3.11+
+- Ruff 0.8.6 (includes LSP server)
+- Pyright (installed via npm)

---

@@ -498,17 +929,21 @@
docker-compose up --build

- Backend runs on port 8000
- Frontend runs on port 5173
- Networks are configured for inter-service communication
-- Volumes are mounted for hot-reload in development
+- Hot reload is disabled for stability
+- Projects folder is persisted as volume

---

## Notes

-1. All code execution happens in isolated temporary files
-2. File names are sanitized (spaces and slashes replaced with underscores)
-3. Projects must have unique IDs (project_id)
-4. Node IDs must be unique within a project
-5. Edge IDs must be unique within a project
-6. Deleting a node automatically removes all connected edges
-7. Project folders are created using project_id, not project_name
-8. All node/edge operations now use project_id for path resolution
\ No newline at end of file
+1. All code execution happens in project virtual environments
+2. Virtual environments are created automatically when projects are created
+3. File names are sanitized (spaces and slashes replaced with underscores)
+4. Projects must have unique IDs (project_id)
+5. Node IDs must be unique within a project
+6. Edge IDs must be unique within a project
+7. Deleting a node automatically removes all connected edges
+8. Project folders are created using project_id, not project_name
+9. LSP processes automatically restart with exponential backoff on failure
+10. Terminal sessions have automatic idle and duration timeouts
+11. 
Package installation in terminal triggers LSP restart automatically
\ No newline at end of file
diff --git a/packages/backend/Dockerfile b/packages/backend/Dockerfile
index 50b67df..78db56d 100644
--- a/packages/backend/Dockerfile
+++ b/packages/backend/Dockerfile
@@ -1,23 +1,34 @@
-FROM python:3.11-slim
+FROM python:3.11-slim AS base

-WORKDIR /app
+ENV PYTHONDONTWRITEBYTECODE=1 \
+    PYTHONUNBUFFERED=1

-# Install system dependencies including curl for health checks
-RUN apt-get update && apt-get install -y \
-    gcc \
-    curl \
-    && apt-get clean \
+# Install minimal OS dependencies
+RUN apt-get update && apt-get install -y --no-install-recommends \
+    curl ca-certificates git tini procps \
    && rm -rf /var/lib/apt/lists/*

-# Copy and install Python dependencies
-COPY packages/backend/requirements.txt ./
+# Python dependencies
+WORKDIR /app
+COPY requirements.txt ./requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

-# Copy backend source
-COPY packages/backend ./
+# Copy application source (excluding projects folder via .dockerignore)
+COPY . /app
+
+# Create necessary directories
+RUN mkdir -p /app/projects && \
+    echo "[]" > /app/projects/projects.json && \
+    chmod -R 755 /app/projects
+
+# Set volumes for persistence
+VOLUME ["/app/projects"]

-# Expose port
EXPOSE 8000

-# Start FastAPI server
-CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
\ No newline at end of file
+# Use tini for proper signal handling
+ENTRYPOINT ["/usr/bin/tini", "--"]
+
+# Run without hot reload by default
+# For development with hot reload, override with: docker run -e DISABLE_RELOAD=false ...
+CMD ["python", "-m", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"] \ No newline at end of file diff --git a/packages/backend/Dockerfile.prod b/packages/backend/Dockerfile.prod deleted file mode 100644 index 33d37a3..0000000 --- a/packages/backend/Dockerfile.prod +++ /dev/null @@ -1,29 +0,0 @@ -FROM python:3.11-slim - -WORKDIR /app - -# Install system dependencies including curl for health checks -RUN apt-get update && apt-get install -y \ - gcc \ - curl \ - && apt-get clean \ - && rm -rf /var/lib/apt/lists/* - -# Copy and install Python dependencies -COPY packages/backend/requirements.txt ./ -RUN pip install --no-cache-dir -r requirements.txt - -# Copy backend source -COPY packages/backend ./ - -# Create projects directory -RUN mkdir -p /app/projects - -# Ensure proper permissions -RUN chmod -R 755 /app - -# Expose port -EXPOSE 8000 - -# Start FastAPI server (without --reload in production) -CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"] \ No newline at end of file diff --git a/packages/backend/app/api/code.py b/packages/backend/app/api/code.py deleted file mode 100644 index dbab4f8..0000000 --- a/packages/backend/app/api/code.py +++ /dev/null @@ -1,81 +0,0 @@ -from fastapi import APIRouter, HTTPException -from pydantic import BaseModel -from typing import Optional -from ..core.execute_code import execute_python_code -from ..core import node_operations - -router = APIRouter() - -class CodeExecutionRequest(BaseModel): - code: str - language: Optional[str] = "python" - timeout: Optional[int] = 30 - -class CodeExecutionResponse(BaseModel): - output: str - error: Optional[str] = None - exit_code: int - -class GetNodeCodeRequest(BaseModel): - project_id: str - node_id: str - node_title: Optional[str] = None # For finding the file if needed - -class SaveNodeCodeRequest(BaseModel): - project_id: str - node_id: str - node_title: Optional[str] = None - code: str - -@router.post("/execute", response_model=CodeExecutionResponse) 
-async def execute_code(request: CodeExecutionRequest): - """ - Execute Python code in a secure environment - """ - if request.language != "python": - raise HTTPException(status_code=400, detail="Only Python is supported") - - result = execute_python_code(request.code, request.timeout) - return CodeExecutionResponse(**result) - -@router.post("/getcode") -async def get_node_code(request: GetNodeCodeRequest): - """Get the code content of a node for Monaco Editor""" - try: - code = node_operations.get_node_code(request.project_id, request.node_id) - - # Return in format compatible with Monaco Editor - return { - "success": True, - "code": code, - "language": "python", - "node_id": request.node_id, - "node_title": request.node_title - } - except ValueError as e: - # Return default code if node not found or no code exists - return { - "success": True, - "code": "# Write your Python code here\nprint('Hello, World!')", - "language": "python", - "node_id": request.node_id, - "node_title": request.node_title, - "message": str(e) - } - except Exception as e: - raise HTTPException(status_code=500, detail=str(e)) - -@router.post("/savecode") -async def save_node_code(request: SaveNodeCodeRequest): - """Save code to a node's python file""" - try: - result = node_operations.save_node_code( - request.project_id, - request.node_id, - request.code - ) - return result - except ValueError as e: - raise HTTPException(status_code=404, detail=str(e)) - except Exception as e: - raise HTTPException(status_code=500, detail=str(e)) \ No newline at end of file diff --git a/packages/backend/app/api/executor_proxy.py b/packages/backend/app/api/executor_proxy.py new file mode 100644 index 0000000..42364be --- /dev/null +++ b/packages/backend/app/api/executor_proxy.py @@ -0,0 +1,107 @@ +""" +Proxy API calls to executor service +""" + +from fastapi import APIRouter, HTTPException, Request, Response +import httpx +import os +from typing import Any, Dict +import logging + +router = APIRouter() 
+logger = logging.getLogger(__name__) + +# Get executor URL from environment +EXECUTOR_URL = os.getenv("EXECUTOR_URL", "http://executor:8001") + +async def proxy_to_executor( + path: str, + method: str = "POST", + json_data: Dict[str, Any] = None, + params: Dict[str, Any] = None +): + """Proxy a request to the executor service""" + url = f"{EXECUTOR_URL}/api/{path}" + + try: + async with httpx.AsyncClient() as client: + response = await client.request( + method=method, + url=url, + json=json_data, + params=params, + timeout=30.0 + ) + + if response.status_code >= 400: + logger.error(f"Executor error: {response.status_code} - {response.text}") + raise HTTPException( + status_code=response.status_code, + detail=response.text + ) + + return response.json() + + except httpx.RequestError as e: + logger.error(f"Error connecting to executor: {e}") + raise HTTPException( + status_code=503, + detail=f"Executor service unavailable: {str(e)}" + ) + except Exception as e: + logger.error(f"Unexpected error proxying to executor: {e}") + raise HTTPException( + status_code=500, + detail=f"Internal error: {str(e)}" + ) + +# Code endpoints +@router.post("/code/getcode") +async def get_node_code(request: Request): + """Proxy get node code to executor""" + json_data = await request.json() + return await proxy_to_executor("code/getcode", json_data=json_data) + +@router.post("/code/savecode") +async def save_node_code(request: Request): + """Proxy save node code to executor""" + json_data = await request.json() + return await proxy_to_executor("code/savecode", json_data=json_data) + +@router.post("/code/execute-node") +async def execute_node(request: Request): + """Proxy execute node to executor""" + json_data = await request.json() + return await proxy_to_executor("execute/node", json_data=json_data) + + +# Venv endpoints +@router.get("/project/{project_id}/venv-status") +async def get_venv_status(project_id: str): + """Proxy venv status to executor""" + return await proxy_to_executor( 
+ f"venv/status", + method="GET", + params={"project_id": project_id} + ) + +@router.post("/venv/create") +async def create_venv(request: Request): + """Proxy venv creation to executor""" + json_data = await request.json() + return await proxy_to_executor("venv/create", json_data=json_data) + +# Create venv when project is created +async def create_project_venv(project_id: str): + """Tell executor to create venv for new project""" + try: + result = await proxy_to_executor( + "venv/create", + json_data={"project_id": project_id} + ) + logger.info(f"Venv creation initiated for project {project_id}: {result}") + return result + except Exception as e: + logger.error(f"Failed to create venv for project {project_id}: {e}") + # Don't fail project creation if venv fails + return {"success": False, "error": str(e)} \ No newline at end of file diff --git a/packages/backend/app/api/project.py b/packages/backend/app/api/project.py index 58a50d7..99606e8 100644 --- a/packages/backend/app/api/project.py +++ b/packages/backend/app/api/project.py @@ -1,12 +1,14 @@ from fastapi import APIRouter, HTTPException -from pydantic import BaseModel -from typing import Optional, List, Dict, Any +from pydantic import BaseModel, Field +from typing import Optional, List, Dict, Any, Literal from ..core import ( project_operations, project_structure, node_operations, edge_operations ) +from pathlib import Path +import os router = APIRouter() @@ -23,7 +25,7 @@ class DeleteProjectRequest(BaseModel): class CreateNodeRequest(BaseModel): project_id: str node_id: str - node_type: str = "default" + node_type: str = "custom" position: Dict[str, float] data: Dict[str, Any] @@ -31,6 +33,11 @@ class DeleteNodeRequest(BaseModel): project_id: str node_id: str +class UpdateNodePositionRequest(BaseModel): + project_id: str + node_id: str + position: Dict[str, float] + class CreateEdgeRequest(BaseModel): project_id: str edge_id: str @@ -43,6 +50,15 @@ class DeleteEdgeRequest(BaseModel): project_id: str edge_id: 
str +class ExecuteFlowRequest(BaseModel): + project_id: str + start_node_id: Optional[str] = None + params: Dict[str, Any] = Field(default_factory=dict) + max_workers: int = Field(default=4, ge=1, le=10) + timeout_sec: int = Field(default=30, ge=1, le=300) + halt_on_error: bool = True + + @@ -125,9 +141,9 @@ async def get_project(project_id: str): @router.post("/make") async def make_project(request: CreateProjectRequest): - """Create a new project with folder and json file""" + """Create a new project with folder and json file, start async venv creation""" try: - result = project_operations.create_project(request.project_name, request.project_description, request.project_id) + result = await project_operations.create_project(request.project_name, request.project_description, request.project_id) return result except ValueError as e: raise HTTPException(status_code=400, detail=str(e)) @@ -135,11 +151,25 @@ async def make_project(request: CreateProjectRequest): raise HTTPException(status_code=500, detail=str(e)) +@router.get("/{project_id}/venv-status") +async def get_venv_status(project_id: str): + """Check virtual environment status including creation progress""" + try: + # Get detailed venv status from operations (now async) + status = await project_operations.get_venv_status(project_id) + return status + except Exception as e: + raise HTTPException(status_code=500, detail=str(e)) + + + + + @router.delete("/delete") async def delete_project(request: DeleteProjectRequest): """Delete entire project folder""" try: - result = project_operations.delete_project(request.project_name, request.project_id) + result = await project_operations.delete_project(request.project_name, request.project_id) return result except ValueError as e: raise HTTPException(status_code=404, detail=str(e)) @@ -177,6 +207,22 @@ async def delete_node(request: DeleteNodeRequest): raise HTTPException(status_code=500, detail=str(e)) +@router.put("/updatenode/position") +async def 
update_node_position(request: UpdateNodePositionRequest): + """Update node position in project structure""" + try: + result = node_operations.update_node_position( + request.project_id, + request.node_id, + request.position + ) + return result + except ValueError as e: + raise HTTPException(status_code=404, detail=str(e)) + except Exception as e: + raise HTTPException(status_code=500, detail=str(e)) + + @router.post("/makeedge") async def make_edge(request: CreateEdgeRequest): """Create a new edge between nodes""" @@ -206,3 +252,32 @@ async def delete_edge(request: DeleteEdgeRequest): raise HTTPException(status_code=404, detail=str(e)) except Exception as e: raise HTTPException(status_code=500, detail=str(e)) + + +@router.post("/execute-flow") +async def execute_flow(request: ExecuteFlowRequest): + """Execute the node flow starting from start node - proxy to executor""" + try: + from ..api.executor_proxy import proxy_to_executor + + # Proxy the execution request to executor + result = await proxy_to_executor( + "execute/flow", + json_data={ + "project_id": request.project_id, + "start_node_id": request.start_node_id, + "params": request.params, + "max_workers": request.max_workers, + "timeout_sec": request.timeout_sec, + "halt_on_error": request.halt_on_error + } + ) + + return result + + except HTTPException: + raise + except Exception as e: + raise HTTPException(status_code=500, detail=f"Flow execution failed: {str(e)}") + + diff --git a/packages/backend/app/api/test.py b/packages/backend/app/api/test.py new file mode 100644 index 0000000..1a741d1 --- /dev/null +++ b/packages/backend/app/api/test.py @@ -0,0 +1,8 @@ +def node1(): + return 8 + +def node2(input_data = None): + return input_data + + +8 \ No newline at end of file diff --git a/packages/backend/app/core/execute_code.py b/packages/backend/app/core/execute_code.py deleted file mode 100644 index 7d13764..0000000 --- a/packages/backend/app/core/execute_code.py +++ /dev/null @@ -1,49 +0,0 @@ -import 
subprocess -import tempfile -import os -import sys - -def execute_python_code(code: str, timeout: int = 30) -> dict: - """ - Execute Python code in a secure temporary environment - - Args: - code: Python code to execute - timeout: Maximum execution time in seconds - - Returns: - Dictionary with output, error, and exit_code - """ - try: - with tempfile.NamedTemporaryFile(mode='w', suffix='.py', delete=False) as temp_file: - temp_file.write(code) - temp_file_path = temp_file.name - - try: - result = subprocess.run( - [sys.executable, temp_file_path], - capture_output=True, - text=True, - timeout=timeout - ) - - return { - "output": result.stdout, - "error": result.stderr if result.stderr else None, - "exit_code": result.returncode - } - finally: - os.unlink(temp_file_path) - - except subprocess.TimeoutExpired: - return { - "output": "", - "error": "Code execution timed out", - "exit_code": -1 - } - except Exception as e: - return { - "output": "", - "error": str(e), - "exit_code": -1 - } \ No newline at end of file diff --git a/packages/backend/app/core/logging.py b/packages/backend/app/core/logging.py new file mode 100644 index 0000000..41dcf30 --- /dev/null +++ b/packages/backend/app/core/logging.py @@ -0,0 +1,161 @@ +""" +Structured logging for LSP and application +""" + +import logging +import os +import json +import time +import pathlib +from collections import deque +from typing import Literal, Optional, Dict, Any +from datetime import datetime + +LOG_LEVEL = os.getenv("LSP_LOG_LEVEL", "INFO").upper() +STDIO_DIR = pathlib.Path(os.getenv("LSP_STDIO_LOG_DIR", "/var/log/aim-red/lsp")) +STDIO_DIR.mkdir(parents=True, exist_ok=True) + +# Recent events memory buffer for debug endpoint +_RING = deque(maxlen=2000) + +def get_logger(name: str) -> logging.Logger: + """Get a structured JSON logger""" + logger = logging.getLogger(name) + if not logger.handlers: + handler = logging.StreamHandler() + handler.setFormatter(_JsonFormatter()) + logger.addHandler(handler) + 
logger.setLevel(getattr(logging, LOG_LEVEL, logging.INFO)) + return logger + +class _JsonFormatter(logging.Formatter): + """JSON formatter for structured logging""" + + def format(self, record: logging.LogRecord) -> str: + base = { + "ts": int(time.time() * 1000), + "level": record.levelname, + "msg": record.getMessage(), + "logger": record.name, + } + + # Add extra fields if provided + if hasattr(record, 'extra') and isinstance(record.extra, dict): + base.update(record.extra) + + return json.dumps(base, ensure_ascii=False) + +def lsp_stdio_logger( + project_id: str, + lsp_type: str, + stream: str, + line: str +) -> None: + """Log LSP stdio output to file and structured log""" + # Write to file for persistence + filename = STDIO_DIR / f"{project_id}.{lsp_type}.{stream}.log" + try: + with filename.open("a", encoding="utf-8") as f: + timestamp = datetime.now().isoformat() + f.write(f"[{timestamp}] {line}\n") + except Exception as e: + get_logger("lsp.stdio").error(f"Failed to write stdio log: {e}") + + # Sample structured log (avoid flooding) + if LOG_LEVEL in ["DEBUG", "TRACE"]: + get_logger("lsp.stdio").debug( + "stdio", + extra={ + "project_id": project_id, + "lsp": lsp_type, + "stream": stream, + "preview": line[:200] if len(line) > 200 else line + } + ) + +def log_lsp_frame( + direction: Literal["in", "out"], + lsp_type: str, + project_id: str, + payload: bytes +) -> None: + """Log LSP JSON-RPC frames for debugging""" + try: + # Parse JSON-RPC payload + obj = json.loads(payload.decode("utf-8")) + method = obj.get("method") + msg_id = obj.get("id") + size = len(payload) + + # Create log record + record = { + "direction": direction, + "method": method, + "id": msg_id, + "bytes": size, + "project_id": project_id, + "lsp": lsp_type, + "ts": int(time.time() * 1000) + } + + # Add to ring buffer + _RING.append(record) + + # Log metadata only by default + logger = get_logger("lsp.frame") + logger.debug("frame", extra=record) + + # Log full payload only in TRACE mode + 
if LOG_LEVEL == "TRACE": + logger.debug("payload", extra={**record, "payload": obj}) + + except json.JSONDecodeError: + # Not JSON, might be binary or malformed + pass + except Exception as e: + get_logger("lsp.frame").error(f"Error logging LSP frame: {e}") + +def log_lsp_lifecycle( + event: str, + project_id: str, + lsp_type: str, + **kwargs +) -> None: + """Log LSP lifecycle events""" + logger = get_logger("lsp.lifecycle") + extra = { + "event": event, + "project_id": project_id, + "lsp": lsp_type, + **kwargs + } + + if event in ["start", "stop", "restart"]: + logger.info(event, extra=extra) + elif event in ["error", "crash"]: + logger.error(event, extra=extra) + else: + logger.debug(event, extra=extra) + +def read_ring(n: int = 200) -> list: + """Read recent events from ring buffer""" + return list(_RING)[-n:] + +def get_stdio_logs( + project_id: str, + lsp_type: str, + stream: str = "stdout", + lines: int = 100 +) -> list[str]: + """Read recent stdio logs for a specific LSP process""" + filename = STDIO_DIR / f"{project_id}.{lsp_type}.{stream}.log" + if not filename.exists(): + return [] + + try: + with filename.open("r", encoding="utf-8") as f: + all_lines = f.readlines() + return all_lines[-lines:] + except Exception as e: + get_logger("lsp.stdio").error(f"Failed to read stdio log: {e}") + return [] \ No newline at end of file diff --git a/packages/backend/app/core/node_operations.py b/packages/backend/app/core/node_operations.py index ff54b0a..a4642ef 100644 --- a/packages/backend/app/core/node_operations.py +++ b/packages/backend/app/core/node_operations.py @@ -9,36 +9,59 @@ def create_node(project_id: str, node_id: str, node_type: str, position: Dict[st project_path = get_project_path(project_id) - # Extract title from data for filename - node_title = data.get('title', f'node_{node_id}') - - # Create python file for the node - py_filename = f"{node_id}_{node_title}.py".replace(" ", "_").replace("/", "_") - py_filepath = project_path / py_filename - - # 
Create empty python file with basic template - initial_code = data.get('code', f"# Node: {node_title}\n# ID: {node_id}\n\n# Write your Python code here\nprint('Hello, World!')") - with open(py_filepath, 'w') as f: - f.write(initial_code) - - # Update project json + # Update project json first to check for existing nodes structure = get_project_structure(project_id) # Check if node already exists if any(node['id'] == node_id for node in structure['nodes']): - # Delete the created file if node already exists - py_filepath.unlink() raise ValueError(f"Node with ID '{node_id}' already exists") + # Extract title from data for filename + node_title = data.get('title', f'node_{node_id}') + + # Only create Python file for custom nodes + # Start and Result nodes don't need code files + py_filename = None + if node_type == 'custom': + # Create python file for the node + py_filename = f"{node_id}_{node_title}.py".replace(" ", "_").replace("/", "__") + py_filepath = project_path / py_filename + + # Create empty python file with basic template + initial_code = f"""# Node: {node_title} +# ID: {node_id} + +def main(input_data=None): + \"\"\" + Process input data and return result + + Args: + input_data: Data from previous node(s) + + Returns: + Processed data for next node + \"\"\" + # Your code here + if input_data: + return input_data + return None +""" + with open(py_filepath, 'w') as f: + f.write(initial_code) + # Start and Result nodes don't need any Python file + # Add new node with React Flow structure # Ensure data has required fields node_data = { "title": data.get('title', f'Node {node_id}'), "description": data.get('description', ''), - "file": py_filename, # Add file reference to data **data # Include any additional data fields } + # Only add file reference if a file was created + if py_filename: + node_data["file"] = py_filename + new_node = { "id": node_id, "type": node_type, @@ -159,4 +182,32 @@ def save_node_code(project_id: str, node_id: str, code: str) -> 
Dict[str, Any]: "success": True, "message": f"Code saved for node '{node_id}'", "file_path": str(py_filepath) + } + +def update_node_position(project_id: str, node_id: str, position: Dict[str, float]) -> Dict[str, Any]: + """Update node position in project structure""" + from .project_structure import get_project_structure, save_project_structure + + # Get current structure + structure = get_project_structure(project_id) + + # Find and update node position + node_found = False + for node in structure['nodes']: + if node['id'] == node_id: + node['position'] = position + node_found = True + break + + if not node_found: + raise ValueError(f"Node with ID '{node_id}' not found") + + # Save updated structure + save_project_structure(project_id, structure) + + return { + "success": True, + "message": f"Position updated for node '{node_id}'", + "node_id": node_id, + "position": position } \ No newline at end of file diff --git a/packages/backend/app/core/project_operations.py b/packages/backend/app/core/project_operations.py index cdb6ce9..f28fef5 100644 --- a/packages/backend/app/core/project_operations.py +++ b/packages/backend/app/core/project_operations.py @@ -1,5 +1,6 @@ import json import shutil +import os from pathlib import Path from typing import List, Dict, Any from .projects_registry import ( @@ -8,7 +9,8 @@ get_projects_registry ) -PROJECTS_BASE_PATH = Path("projects") +# Get absolute path to projects directory +PROJECTS_BASE_PATH = Path(os.path.dirname(os.path.dirname(os.path.dirname(__file__)))) / "projects" def ensure_projects_dir() -> None: """Ensure the projects directory exists""" @@ -19,8 +21,8 @@ def get_all_projects() -> List[Dict[str, str]]: registry = get_projects_registry() return registry["projects"] -def create_project(project_name: str, project_description: str, project_id: str) -> Dict[str, Any]: - """Create a new project folder and json file""" +async def create_project(project_name: str, project_description: str, project_id: str) -> Dict[str, 
Any]: + """Create a new project folder and json file, then request venv creation from executor""" ensure_projects_dir() project_path = PROJECTS_BASE_PATH / project_id @@ -46,24 +48,64 @@ def create_project(project_name: str, project_description: str, project_id: str) with open(project_json_path, 'w') as f: json.dump(initial_structure, f, indent=2) + # Request virtual environment creation from executor + from ..api.executor_proxy import create_project_venv + print(f"Requesting virtual environment creation for project {project_id} from executor...") + venv_result = await create_project_venv(project_id) + return { "success": True, "message": f"Project '{project_name}' created successfully", + "venv_status": venv_result.get("status", "pending"), + "venv_message": venv_result.get("message", "Virtual environment creation requested") } except Exception as e: # If folder creation fails, remove from registry remove_project_from_registry(project_name) raise e -def delete_project(project_name: str, project_id:str) -> Dict[str, Any]: - """Delete entire project folder and remove from registry""" +async def get_venv_status(project_id: str) -> Dict[str, Any]: + """Get the status of virtual environment creation for a project from executor""" + from ..api.executor_proxy import proxy_to_executor + + try: + # Query executor for venv status + result = await proxy_to_executor( + f"venv/status", + method="GET", + params={"project_id": project_id} + ) + return result + except Exception as e: + # If executor is unavailable, return unknown status + return { + "success": False, + "project_id": project_id, + "status": "unknown", + "message": f"Could not get venv status: {str(e)}", + "venv_ready": False + } + +async def delete_project(project_name: str, project_id:str) -> Dict[str, Any]: + """Delete entire project folder including venv and remove from registry""" ensure_projects_dir() project_path = PROJECTS_BASE_PATH / project_id if not project_path.exists(): raise ValueError(f"Project with 
ID '{project_id}' does not exist") - # Delete folder first + # Request venv deletion from executor + try: + from ..api.executor_proxy import proxy_to_executor + await proxy_to_executor( + "venv/delete", + method="DELETE", + json_data={"project_id": project_id} + ) + except Exception as e: + print(f"Warning: Failed to delete venv for project {project_id}: {e}") + + # Delete folder shutil.rmtree(project_path) # Remove from registry diff --git a/packages/backend/app/core/projects_registry.py b/packages/backend/app/core/projects_registry.py index 0ff9af4..1f93143 100644 --- a/packages/backend/app/core/projects_registry.py +++ b/packages/backend/app/core/projects_registry.py @@ -1,8 +1,10 @@ import json +import os from pathlib import Path from typing import List, Dict, Any -PROJECTS_BASE_PATH = Path("projects") +# Get absolute path to projects directory +PROJECTS_BASE_PATH = Path(os.path.dirname(os.path.dirname(os.path.dirname(__file__)))) / "projects" PROJECTS_REGISTRY_FILE = PROJECTS_BASE_PATH / "projects.json" def ensure_projects_registry() -> None: diff --git a/packages/backend/app/core/save_code.py b/packages/backend/app/core/save_code.py deleted file mode 100644 index 788d345..0000000 --- a/packages/backend/app/core/save_code.py +++ /dev/null @@ -1,35 +0,0 @@ -from pathlib import Path - -def save_project_code(project_hash: str, project_title: str, node_id: str, node_title: str, code: str) -> dict: - """ - Save code to a project-specific directory - - Args: - project_hash: Unique project identifier - project_title: Project title for filename - node_id: Node identifier - node_title: Node title for filename - code: Code content to save - - Returns: - Dictionary with success status and file path - """ - projects_dir = Path("projects") - projects_dir.mkdir(exist_ok=True) - - project_dir = projects_dir / project_hash - project_dir.mkdir(exist_ok=True) - - filename = f"{project_title}-{node_id}-{node_title}.py" - filename = filename.replace(" ", "_").replace("/", "_") - 
- file_path = project_dir / filename - - with open(file_path, "w") as f: - f.write(code) - - return { - "success": True, - "message": "Code saved successfully", - "file_path": str(file_path) - } \ No newline at end of file diff --git a/packages/backend/app/main.py b/packages/backend/app/main.py index 2bf0347..7a5eac8 100644 --- a/packages/backend/app/main.py +++ b/packages/backend/app/main.py @@ -1,12 +1,46 @@ from fastapi import FastAPI from fastapi.middleware.cors import CORSMiddleware -from app.api import health, code, project +from contextlib import asynccontextmanager +import asyncio +from app.api import health, project, executor_proxy +from app.core.logging import get_logger -app = FastAPI(title="AIM Red Toolkit Backend", version="1.0.0") +logger = get_logger(__name__) +@asynccontextmanager +async def lifespan(app: FastAPI): + """Manage application lifecycle""" + # Startup + logger.info("Starting AIM Red Toolkit Backend") + + # Ensure projects directory exists for metadata + from app.core.project_operations import ensure_projects_dir + ensure_projects_dir() + logger.info("Projects directory ready") + + yield + + # Shutdown + logger.info("Shutting down AIM Red Toolkit Backend") + logger.info("Shutdown complete") + +app = FastAPI( + title="AIM Red Toolkit Backend", + version="1.0.0", + lifespan=lifespan +) + +# CORS middleware configuration app.add_middleware( CORSMiddleware, - allow_origins=["http://localhost:5173", "http://frontend:5173"], + allow_origins=[ + "http://localhost:5173", + "http://frontend:5173", + "http://localhost:3000", + "http://frontend:3000", + "http://localhost", + "http://frontend" + ], allow_credentials=True, allow_methods=["*"], allow_headers=["*"], @@ -14,12 +48,23 @@ @app.get("/") async def root(): - return {"message": "AIM Red Toolkit Backend API", "status": "healthy"} + return { + "message": "AIM Red Toolkit Backend API", + "status": "healthy", + "features": { + "lsp": "enabled", + "pyright": "available", + "ruff": "available" + } + 
} +# Include API routers app.include_router(health.router, prefix="/api") -app.include_router(code.router, prefix="/api/code") app.include_router(project.router, prefix="/api/project") +# Proxy to executor for code/venv operations +app.include_router(executor_proxy.router, prefix="/api") + if __name__ == "__main__": import uvicorn - uvicorn.run(app, host="0.0.0.0", port=8000) \ No newline at end of file + # uvicorn requires an import string (not an app object) for reload to work + uvicorn.run("app.main:app", host="0.0.0.0", port=8000, reload=True) \ No newline at end of file diff --git a/packages/backend/package.json b/packages/backend/package.json index 0dd7322..3d16068 100644 --- a/packages/backend/package.json +++ b/packages/backend/package.json @@ -3,6 +3,6 @@ "version": "1.0.0", "private": true, "scripts": { - "dev": "python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 8000" + "dev": "python3 -m uvicorn app.main:app --reload --host 0.0.0.0 --port 8000" } } diff --git a/packages/backend/projects/.gitignore b/packages/backend/projects/.gitignore new file mode 100644 index 0000000..a3a0c8b --- /dev/null +++ b/packages/backend/projects/.gitignore @@ -0,0 +1,2 @@ +* +!.gitignore \ No newline at end of file diff --git a/packages/backend/requirements.txt b/packages/backend/requirements.txt index f990ebd..9ba447f 100644 --- a/packages/backend/requirements.txt +++ b/packages/backend/requirements.txt @@ -4,5 +4,4 @@ python-multipart==0.0.12 pydantic==2.10.3 httpx==0.28.1 python-jose[cryptography]==3.3.0 -passlib[bcrypt]==1.7.4 -aiofiles==24.1.0 \ No newline at end of file +passlib[bcrypt]==1.7.4 \ No newline at end of file diff --git a/packages/executor/.dockerignore b/packages/executor/.dockerignore new file mode 100644 index 0000000..c43c5e4 --- /dev/null +++ b/packages/executor/.dockerignore @@ -0,0 +1,24 @@ +# Exclude projects directory to avoid symlink issues +projects/ + +# Python cache +__pycache__/ +*.py[cod] +*$py.class +*.so + +# Virtual environments +venv/ +ENV/ +env/ + +# IDE +.vscode/ +.idea/ + +# Logs +*.log + +# OS files +.DS_Store
+Thumbs.db \ No newline at end of file diff --git a/packages/executor/Dockerfile b/packages/executor/Dockerfile new file mode 100644 index 0000000..9958d2c --- /dev/null +++ b/packages/executor/Dockerfile @@ -0,0 +1,63 @@ +FROM python:3.11-slim AS base + +ENV PYTHONDONTWRITEBYTECODE=1 \ + PYTHONUNBUFFERED=1 + +# Install OS dependencies including python3-venv for virtual environment creation +# Also install build dependencies to speed up package installations +RUN apt-get update && apt-get install -y --no-install-recommends \ + curl ca-certificates git tini procps gcc g++ python3-venv \ + python3-dev build-essential \ + && rm -rf /var/lib/apt/lists/* + +# Install Node.js 20 for pyright +RUN curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \ + && apt-get update && apt-get install -y --no-install-recommends nodejs \ + && rm -rf /var/lib/apt/lists/* + +# Enable pnpm via corepack and set global path +ENV PNPM_HOME="/root/.local/share/pnpm" +ENV PATH="${PNPM_HOME}:${PATH}" +RUN corepack enable && pnpm --version + +# Install pyright globally via npm and ensure pyright-langserver is available +RUN pnpm add -g pyright@1.1.404 && \ + pyright --version && \ + # Create pyright-langserver wrapper if needed + echo '#!/bin/sh' > /usr/local/bin/pyright-langserver && \ + echo 'exec npx pyright-langserver "$@"' >> /usr/local/bin/pyright-langserver && \ + chmod +x /usr/local/bin/pyright-langserver && \ + # Test that pyright-langserver is accessible + pyright-langserver --version 2>/dev/null || echo "Pyright langserver installed" + +# Python dependencies for executor service +WORKDIR /app +COPY requirements.txt ./requirements.txt +RUN pip install --no-cache-dir -r requirements.txt \ + && pip install --no-cache-dir ruff==0.8.6 + +# Copy application source +COPY . 
/app + +# Create necessary directories +RUN mkdir -p /var/log/aim-red/lsp /app/projects && \ + chmod -R 755 /app/projects + +# Set volumes for persistence +VOLUME ["/var/log/aim-red/lsp", "/app/projects"] + +# LSP runtime environment variables +ENV LSP_LOG_LEVEL=INFO \ + LSP_STDIO_LOG_DIR=/var/log/aim-red/lsp \ + LSP_IDLE_TTL_MS=600000 \ + LSP_MAX_RESTARTS=5 \ + LSP_RESTART_WINDOW_MS=60000 + +# Expose ports for API and WebSockets +EXPOSE 8001 + +# Use tini for proper signal handling +ENTRYPOINT ["/usr/bin/tini", "--"] + +# Run the executor service +CMD ["python", "-m", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8001"] \ No newline at end of file diff --git a/packages/backend/projects/.gitkeep b/packages/executor/app/__init__.py similarity index 100% rename from packages/backend/projects/.gitkeep rename to packages/executor/app/__init__.py diff --git a/packages/frontend/src/store/projectStore.ts b/packages/executor/app/api/__init__.py similarity index 100% rename from packages/frontend/src/store/projectStore.ts rename to packages/executor/app/api/__init__.py diff --git a/packages/executor/app/api/code.py b/packages/executor/app/api/code.py new file mode 100644 index 0000000..3f5d389 --- /dev/null +++ b/packages/executor/app/api/code.py @@ -0,0 +1,135 @@ +""" +Code file management API for nodes +""" + +from fastapi import APIRouter, HTTPException +from pydantic import BaseModel +from typing import Optional, Dict, Any +from pathlib import Path +import json +import os +import re +import logging + +router = APIRouter() +logger = logging.getLogger(__name__) + +class GetNodeCodeRequest(BaseModel): + project_id: str + node_id: str + node_title: Optional[str] = None + +class SaveNodeCodeRequest(BaseModel): + project_id: str + node_id: str + node_title: Optional[str] = None + code: str + +def sanitize_filename(name: str) -> str: + """Sanitize a string to be used as a filename""" + # Replace spaces and special characters with underscores + sanitized = 
re.sub(r'[^\w\-_]', '_', name) + # Remove multiple consecutive underscores + sanitized = re.sub(r'_+', '_', sanitized) + # Remove leading/trailing underscores + sanitized = sanitized.strip('_') + return sanitized or "untitled" + +def get_node_file_path(project_id: str, node_id: str, node_title: Optional[str] = None) -> Path: + """Get the file path for a node's Python code""" + project_dir = Path(f"/app/projects/{project_id}") + + # Always ensure project directory exists + project_dir.mkdir(parents=True, exist_ok=True) + + # Generate filename from node_id and title + if node_title: + filename = f"{node_id}_{sanitize_filename(node_title)}.py" + else: + filename = f"{node_id}.py" + + return project_dir / filename + +@router.post("/getcode") +async def get_node_code(request: GetNodeCodeRequest): + """Get the code content of a node""" + try: + file_path = get_node_file_path(request.project_id, request.node_id, request.node_title) + + if file_path.exists(): + with open(file_path, 'r', encoding='utf-8') as f: + code = f.read() + else: + # Return default code if file doesn't exist + code = """# Write your logic in function. +# The function name can be changed arbitrarily, +# but only one function is allowed per node. +# To pass the return value of this function to the next node, +# a return statement must be present. +# The data format and type of input_data should be defined +# at the beginning of the function and used accordingly. +# (Using typing or Pydantic is recommended.) 
+ +def main(input_data=None): + output_data = input_data + return output_data""" + + return { + "success": True, + "code": code, + "language": "python", + "node_id": request.node_id, + "node_title": request.node_title + } + except Exception as e: + logger.error(f"Error getting code: {e}") + raise HTTPException(status_code=500, detail=str(e)) + +@router.post("/savecode") +async def save_node_code(request: SaveNodeCodeRequest): + """Save code to a node's python file""" + try: + file_path = get_node_file_path(request.project_id, request.node_id, request.node_title) + + # Ensure project directory exists + file_path.parent.mkdir(parents=True, exist_ok=True) + + # Write the code to file + with open(file_path, 'w', encoding='utf-8') as f: + f.write(request.code) + + return { + "success": True, + "message": "Code saved successfully", + "file_path": str(file_path), + "node_id": request.node_id + } + except Exception as e: + logger.error(f"Error saving code: {e}") + raise HTTPException(status_code=500, detail=str(e)) + +@router.post("/delete") +async def delete_node_code(request: GetNodeCodeRequest): + """Delete a node's code file""" + try: + # Find all files matching the node_id pattern + project_dir = Path(f"/app/projects/{request.project_id}") + if project_dir.exists(): + for file_path in project_dir.glob(f"{request.node_id}_*.py"): + file_path.unlink() + logger.info(f"Deleted code file: {file_path}") + + # Also try exact match + exact_file = project_dir / f"{request.node_id}.py" + if exact_file.exists(): + exact_file.unlink() + logger.info(f"Deleted code file: {exact_file}") + + return { + "success": True, + "message": "Code file deleted", + "node_id": request.node_id + } + except Exception as e: + logger.error(f"Error deleting code: {e}") + raise HTTPException(status_code=500, detail=str(e)) \ No newline at end of file diff --git a/packages/executor/app/api/execute.py b/packages/executor/app/api/execute.py new file mode 100644 index 0000000..ba02f3d --- /dev/null +++ 
b/packages/executor/app/api/execute.py @@ -0,0 +1,219 @@ +""" +Code execution API for running Python code in project environments +""" + +from fastapi import APIRouter, HTTPException +from pydantic import BaseModel +from typing import Optional, Dict, Any, List +from pathlib import Path +import json +import logging +from ..core.executor import execute_python_code, get_project_venv_python +from ..core.venv_manager import AsyncVenvManager + +router = APIRouter() +logger = logging.getLogger(__name__) + +# Global venv manager instance +venv_manager = AsyncVenvManager("/app/projects") + +class ExecuteNodeRequest(BaseModel): + project_id: str + node_id: str + input_data: Optional[Dict[str, Any]] = None + +class ExecuteCodeRequest(BaseModel): + project_id: str + code: str + input_data: Optional[Dict[str, Any]] = None + timeout: int = 30 + +class ExecuteFlowRequest(BaseModel): + project_id: str + start_node_id: Optional[str] = None + params: Dict[str, Any] = {} + max_workers: int = 4 + timeout_sec: int = 30 + halt_on_error: bool = True + +def get_node_code(project_id: str, node_id: str) -> str: + """Get the code for a node from its file""" + project_dir = Path(f"/app/projects/{project_id}") + + # Try to find the node file + for file_path in project_dir.glob(f"{node_id}_*.py"): + with open(file_path, 'r', encoding='utf-8') as f: + return f.read() + + # Try exact match + exact_file = project_dir / f"{node_id}.py" + if exact_file.exists(): + with open(exact_file, 'r', encoding='utf-8') as f: + return f.read() + + raise ValueError(f"Code file not found for node {node_id}") + +@router.post("/node") +async def execute_node(request: ExecuteNodeRequest): + """Execute a single node and return its output""" + try: + # Check if venv exists + if not venv_manager.venv_exists(request.project_id): + return { + "success": False, + "error": f"Virtual environment not found for project {request.project_id}", + "node_id": request.node_id + } + + # Get the node's code + code = 
get_node_code(request.project_id, request.node_id) + + # Create wrapper to execute the node with input data + input_json_str = json.dumps(request.input_data) if request.input_data else 'null' + + wrapper_code = f""" +import json +import sys + +# Node code +{code} + +# Execute with input +try: + input_json = '''{input_json_str}''' + if input_json != 'null': + input_data = json.loads(input_json) + else: + input_data = None + + # Find and execute main function + if 'main' in locals() and callable(main): + result = main(input_data) if input_data is not None else main() + else: + # Find first callable + result = None + for name, obj in list(locals().items()): + if callable(obj) and not name.startswith('_') and name not in ['json', 'sys']: + result = obj(input_data) if input_data is not None else obj() + break + + print(json.dumps({{'success': True, 'output': result}})) +except Exception as e: + import traceback + print(json.dumps({{ + 'success': False, + 'error': str(e), + 'traceback': traceback.format_exc() + }})) +""" + + # Execute using project's virtual environment + python_exe = venv_manager.get_python_executable(request.project_id) + project_dir = f"/app/projects/{request.project_id}" + + execution_result = execute_python_code( + wrapper_code, + timeout=30, + python_executable=python_exe, + working_dir=project_dir + ) + + if execution_result['exit_code'] == 0: + try: + output = json.loads(execution_result['output']) + if output.get('success'): + return { + "success": True, + "output": output.get('output'), + "node_id": request.node_id + } + else: + return { + "success": False, + "error": output.get('error', 'Unknown error'), + "traceback": output.get('traceback', ''), + "node_id": request.node_id + } + except json.JSONDecodeError: + return { + "success": False, + "error": "Failed to parse output", + "output_raw": execution_result['output'], + "node_id": request.node_id + } + else: + return { + "success": False, + "error": execution_result.get('error', 'Execution 
failed'), + "output_raw": execution_result.get('output', ''), + "node_id": request.node_id + } + + except ValueError as e: + raise HTTPException(status_code=404, detail=str(e)) + except Exception as e: + logger.error(f"Error executing node: {e}") + raise HTTPException(status_code=500, detail=str(e)) + +@router.post("/code") +async def execute_code(request: ExecuteCodeRequest): + """Execute arbitrary Python code in a project's environment""" + try: + # Check if venv exists + if not venv_manager.venv_exists(request.project_id): + return { + "success": False, + "error": f"Virtual environment not found for project {request.project_id}" + } + + # Get project Python executable + python_exe = venv_manager.get_python_executable(request.project_id) + project_dir = f"/app/projects/{request.project_id}" + + # Execute the code + result = execute_python_code( + request.code, + timeout=request.timeout, + python_executable=python_exe, + working_dir=project_dir + ) + + return { + "success": result['exit_code'] == 0, + "output": result['output'], + "error": result['error'], + "exit_code": result['exit_code'] + } + + except Exception as e: + logger.error(f"Error executing code: {e}") + raise HTTPException(status_code=500, detail=str(e)) + +@router.post("/flow") +async def execute_flow(request: ExecuteFlowRequest): + """Execute a flow of nodes starting from the start node""" + try: + from ..core.flow_executor import FlowExecutor + + # Create flow executor with projects root + executor = FlowExecutor("/app/projects") + + # Execute the flow + result = await executor.execute_flow( + project_id=request.project_id, + start_node_id=request.start_node_id, + params=request.params, + max_workers=request.max_workers, + timeout_sec=request.timeout_sec, + halt_on_error=request.halt_on_error + ) + + return result + + except FileNotFoundError as e: + raise HTTPException(status_code=404, detail=str(e)) + except ValueError as e: + raise HTTPException(status_code=400, detail=str(e)) + except 
Exception as e: + logger.error(f"Flow execution failed: {e}") + raise HTTPException(status_code=500, detail=str(e)) \ No newline at end of file diff --git a/packages/executor/app/api/lsp.py b/packages/executor/app/api/lsp.py new file mode 100644 index 0000000..0dfc016 --- /dev/null +++ b/packages/executor/app/api/lsp.py @@ -0,0 +1,432 @@ +""" +Enhanced LSP Gateway with single reader per process +Prevents asyncio read conflicts by having one reader per LSP process +""" + +from fastapi import APIRouter, WebSocket, WebSocketDisconnect, Query +import asyncio +import re +from typing import Optional, Dict, Set +from dataclasses import dataclass +from ..core.lsp_manager import lsp_manager, LspType, LspProcess +from ..core.logging import get_logger, log_lsp_frame + +router = APIRouter() +logger = get_logger(__name__) + +# Regex for parsing LSP Content-Length headers +HEADER_RE = re.compile(rb"Content-Length:\s*(\d+)\r\n\r\n", re.I) + +class WebSocketCloseCodes: + """Custom WebSocket close codes for LSP status""" + NORMAL = 1000 + LSP_RESTART = 4001 # LSP process restarting + LSP_ERROR = 4002 # LSP process error + LSP_CRASHED = 4003 # LSP process crashed + +@dataclass +class LspConnection: + """Manages LSP process connection with multiple WebSocket clients""" + project_id: str + lsp_type: LspType + process: LspProcess + clients: Set[WebSocket] + reader_task: Optional[asyncio.Task] = None + writer_lock: asyncio.Lock = None + + def __post_init__(self): + self.clients = set() + self.writer_lock = asyncio.Lock() + +class LspGateway: + """Manages LSP connections with single reader per process""" + + def __init__(self): + self.connections: Dict[tuple[str, LspType], LspConnection] = {} + self._lock = asyncio.Lock() + + async def add_client( + self, + websocket: WebSocket, + project_id: str, + lsp_type: LspType + ) -> LspConnection: + """Add a WebSocket client to an LSP connection""" + key = (project_id, lsp_type) + + async with self._lock: + if key not in self.connections: + # 
Start new LSP process + cwd = f"/app/projects/{project_id}" + process = await lsp_manager.get_or_start(project_id, lsp_type, cwd) + + if not process or not process.proc: + raise RuntimeError(f"Failed to start {lsp_type} LSP") + + # Create new connection + connection = LspConnection( + project_id=project_id, + lsp_type=lsp_type, + process=process, + clients=set() + ) + + self.connections[key] = connection + + # Start single reader task for this process AFTER adding to connections + # to prevent race conditions + if not connection.reader_task or connection.reader_task.done(): + connection.reader_task = asyncio.create_task( + self._read_from_lsp(connection) + ) + else: + connection = self.connections[key] + + # Check if the process is still alive + if not connection.process.proc or connection.process.proc.returncode is not None: + # Process died, restart it + logger.warning(f"LSP process {lsp_type} for {project_id} is dead, restarting") + + # Cancel old reader task if exists + if connection.reader_task: + connection.reader_task.cancel() + try: + await connection.reader_task + except asyncio.CancelledError: + pass + + # Get new process + cwd = f"/app/projects/{project_id}" + process = await lsp_manager.get_or_start(project_id, lsp_type, cwd) + + if not process or not process.proc: + raise RuntimeError(f"Failed to restart {lsp_type} LSP") + + connection.process = process + + # Start new reader task only if not already running + if not connection.reader_task or connection.reader_task.done(): + connection.reader_task = asyncio.create_task( + self._read_from_lsp(connection) + ) + + # Add the client to the connection + connection.clients.add(websocket) + return connection + + async def remove_client( + self, + websocket: WebSocket, + project_id: str, + lsp_type: LspType + ): + """Remove a WebSocket client from an LSP connection""" + key = (project_id, lsp_type) + + async with self._lock: + if key in self.connections: + connection = self.connections[key] + 
connection.clients.discard(websocket) + + # If no more clients, stop the reader task + if not connection.clients: + if connection.reader_task: + connection.reader_task.cancel() + try: + await connection.reader_task + except asyncio.CancelledError: + pass + del self.connections[key] + + async def write_to_lsp( + self, + connection: LspConnection, + data: bytes + ): + """Write data to LSP process (thread-safe)""" + async with connection.writer_lock: + if connection.process.proc and connection.process.proc.stdin: + # The data from client should already have Content-Length header + # from vscode-ws-jsonrpc, so just pass it through + try: + connection.process.proc.stdin.write(data) + await connection.process.proc.stdin.drain() + except Exception as e: + logger.error(f"Failed to write to LSP stdin: {e}") + + async def _read_from_lsp(self, connection: LspConnection): + """Single reader task for LSP process stdout""" + buffer = b"" + + try: + # Wait a bit for process to be fully initialized + await asyncio.sleep(0.1) + + while True: + # Check if process is still alive + if not connection.process.proc: + logger.warning(f"{connection.lsp_type} LSP process not initialized", extra={ + "project_id": connection.project_id + }) + await self._broadcast_close( + connection, + WebSocketCloseCodes.LSP_ERROR, + f"{connection.lsp_type} process not initialized" + ) + break + + if connection.process.proc.returncode is not None: + logger.warning(f"{connection.lsp_type} LSP process died", extra={ + "project_id": connection.project_id, + "exit_code": connection.process.proc.returncode + }) + await self._broadcast_close( + connection, + WebSocketCloseCodes.LSP_RESTART, + f"{connection.lsp_type} restarting" + ) + break + + # Check if stdout is available + if not connection.process.proc.stdout: + logger.error(f"{connection.lsp_type} LSP process has no stdout", extra={ + "project_id": connection.project_id + }) + await self._broadcast_close( + connection, + WebSocketCloseCodes.LSP_ERROR, +
f"{connection.lsp_type} no stdout" + ) + break + + # Read from LSP stdout (single reader) + try: + chunk = await asyncio.wait_for( + connection.process.proc.stdout.read(4096), + timeout=0.1 + ) + except asyncio.TimeoutError: + continue + + if not chunk: + # EOF from LSP + if connection.process.proc.returncode is not None: + logger.warning(f"{connection.lsp_type} LSP EOF", extra={ + "project_id": connection.project_id + }) + await self._broadcast_close( + connection, + WebSocketCloseCodes.LSP_CRASHED, + f"{connection.lsp_type} crashed" + ) + break + else: + await asyncio.sleep(0.01) + continue + + buffer += chunk + + # Parse and broadcast LSP frames + while True: + match = HEADER_RE.search(buffer) + if not match: + break + + content_length = int(match.group(1)) + header_end = match.end() + + # Check if we have the complete message + if len(buffer) - header_end < content_length: + break + + # Extract complete frame + body = buffer[header_end:header_end + content_length] + buffer = buffer[header_end + content_length:] + + # Log frame + try: + log_lsp_frame( + "s2c", + connection.project_id, + connection.lsp_type, + body.decode('utf-8', errors='replace') + ) + except Exception as e: + logger.debug(f"Failed to log frame: {e}") + + # Send complete LSP message with Content-Length header to clients + # This is what vscode-ws-jsonrpc expects + full_message = f"Content-Length: {content_length}\r\n\r\n".encode('utf-8') + body + await self._broadcast_data(connection, full_message) + + except Exception as e: + import traceback + logger.error(f"Reader error for {connection.lsp_type}: {str(e)}", extra={ + "project_id": connection.project_id, + "error": str(e), + "traceback": traceback.format_exc() + }) + await self._broadcast_close( + connection, + WebSocketCloseCodes.LSP_ERROR, + f"Reader error: {str(e)}" + ) + + async def _broadcast_data(self, connection: LspConnection, data: bytes): + """Broadcast data to all connected clients""" + disconnected_clients = set() + + for client 
in connection.clients: + try: + await client.send_bytes(data) + # Update activity timestamp + connection.process.last_activity_ts = asyncio.get_running_loop().time() + except Exception as e: + logger.debug(f"Failed to send to client: {e}") + disconnected_clients.add(client) + + # Remove disconnected clients + connection.clients -= disconnected_clients + + async def _broadcast_close(self, connection: LspConnection, code: int, reason: str): + """Close all connected clients""" + for client in connection.clients: + try: + await client.close(code=code, reason=reason) + except Exception: + pass + +# Global gateway instance +lsp_gateway = LspGateway() + +@router.get("/health") +async def lsp_health( + project_id: str = Query(..., description="Project ID"), + lsp_type: str = Query("pyright", description="LSP type (pyright or ruff)") +): + """Check LSP health for a project""" + from ..core.lsp_manager import lsp_manager + try: + health_status = lsp_manager.health(project_id, lsp_type) + return { + "success": True, + "project_id": project_id, + "lsp_type": lsp_type, + **health_status + } + except Exception as e: + return { + "success": False, + "error": str(e), + "project_id": project_id, + "lsp_type": lsp_type + } + +@router.websocket("/pyright") +async def websocket_pyright_lsp( + websocket: WebSocket, + project_id: str = Query(..., description="Project ID") +): + """WebSocket endpoint for Pyright LSP""" + await _handle_lsp_websocket(websocket, project_id, "pyright") + +@router.websocket("/ruff") +async def websocket_ruff_lsp( + websocket: WebSocket, + project_id: str = Query(..., description="Project ID") +): + """WebSocket endpoint for Ruff LSP""" + await _handle_lsp_websocket(websocket, project_id, "ruff") + +async def _handle_lsp_websocket( + websocket: WebSocket, + project_id: str, + lsp_type: LspType +): + """Common handler for LSP WebSocket connections""" + await websocket.accept() + logger.info(f"{lsp_type} LSP WebSocket connected", extra={ + "project_id": project_id, + "lsp": 
lsp_type + }) + + connection = None + + try: + # Check if venv exists first + from ..core.venv_manager import AsyncVenvManager + venv_manager = AsyncVenvManager("/app/projects") + + if not venv_manager.venv_exists(project_id): + logger.warning(f"Virtual environment not ready for project {project_id}") + await websocket.send_json({ + "jsonrpc": "2.0", + "error": { + "code": -32603, + "message": f"Virtual environment not ready for project {project_id}. Please wait for venv creation to complete." + } + }) + await websocket.close(code=WebSocketCloseCodes.LSP_ERROR) + return + + # Add client to LSP connection + connection = await lsp_gateway.add_client(websocket, project_id, lsp_type) + + # Handle client to server messages + while True: + try: + data = await websocket.receive_bytes() + + # Log frame (data from client should have Content-Length header) + try: + # Try to extract JSON-RPC body for logging + if data.startswith(b"Content-Length:"): + header_match = HEADER_RE.match(data) + if header_match: + header_end = header_match.end() + body = data[header_end:] + log_lsp_frame( + "c2s", + project_id, + lsp_type, + body.decode('utf-8', errors='replace') + ) + else: + # Log raw data if no header + log_lsp_frame( + "c2s", + project_id, + lsp_type, + data.decode('utf-8', errors='replace') + ) + except Exception as e: + logger.debug(f"Failed to log frame: {e}") + + # Write to LSP process + await lsp_gateway.write_to_lsp(connection, data) + + except WebSocketDisconnect: + break + except Exception as e: + logger.error(f"Error handling client message: {e}") + break + + except Exception as e: + logger.error(f"Error in {lsp_type} WebSocket handler", extra={ + "project_id": project_id, + "lsp": lsp_type, + "error": str(e) + }) + try: + await websocket.close(code=WebSocketCloseCodes.LSP_ERROR) + except Exception: + pass + finally: + # Remove client from connection + if connection: + await lsp_gateway.remove_client(websocket, project_id, lsp_type) + + logger.info(f"{lsp_type} LSP WebSocket 
disconnected", extra={ + "project_id": project_id, + "lsp": lsp_type + }) \ No newline at end of file diff --git a/packages/executor/app/api/terminal.py b/packages/executor/app/api/terminal.py new file mode 100644 index 0000000..daf2f6a --- /dev/null +++ b/packages/executor/app/api/terminal.py @@ -0,0 +1,345 @@ +""" +Terminal WebSocket endpoint for in-IDE terminal functionality +Provides interactive shell with per-project virtual environment +""" + +from fastapi import APIRouter, WebSocket, WebSocketDisconnect, Query, HTTPException +import os +import sys +import pty +import termios +import fcntl +import struct +import select +import asyncio +import signal +import json +import tty +import time +from pathlib import Path +from typing import Optional, Dict, Any +from ..core.logging import get_logger + +router = APIRouter() +log = get_logger(__name__) + +# Configuration +IDLE_TIMEOUT_MS = int(os.getenv("TERMINAL_IDLE_TIMEOUT_MS", "600000")) # 10 minutes +MAX_SESSION_DURATION_MS = int(os.getenv("TERMINAL_MAX_SESSION_MS", "3600000")) # 1 hour + +class TerminalSession: + """Manages a single terminal session""" + + def __init__(self, project_id: str, mode: str = "pkg"): + self.project_id = project_id + self.mode = mode + self.pid: Optional[int] = None + self.master_fd: Optional[int] = None + self.last_activity = time.time() + self.start_time = time.time() + self.running = False + + def is_idle(self) -> bool: + """Check if session has been idle too long""" + return (time.time() - self.last_activity) * 1000 > IDLE_TIMEOUT_MS + + def is_expired(self) -> bool: + """Check if session has exceeded maximum duration""" + return (time.time() - self.start_time) * 1000 > MAX_SESSION_DURATION_MS + + def update_activity(self): + """Update last activity timestamp""" + self.last_activity = time.time() + +def _project_paths(project_id: str) -> tuple[Path, Path]: + """Get project root and venv paths""" + root = Path("/app/projects") / project_id + venv = root / "venv" + return root, venv + 
+def _build_env(base: dict, venv: Path, project_root: Path) -> dict: + """Build environment with venv activated""" + env = dict(base) + env["VIRTUAL_ENV"] = str(venv) + + # Set PATH with venv bin directory first + if os.name == 'nt': + bin_dir = str(venv / "Scripts") + else: + bin_dir = str(venv / "bin") + + env["PATH"] = bin_dir + os.pathsep + base.get("PATH", "") + env["PWD"] = str(project_root) + + # Set a custom prompt to show we're in the project venv + env["PS1"] = f"(aim-red:{project_root.name})$ " + + # Python-specific environment variables + env["PYTHONPATH"] = str(project_root) + + return env + +def _set_winsize(fd: int, rows: int, cols: int): + """Set terminal window size""" + try: + fcntl.ioctl(fd, termios.TIOCSWINSZ, struct.pack("HHHH", rows, cols, 0, 0)) + except Exception as e: + log.warning(f"Failed to set window size: {e}") + +@router.websocket("/terminal") +async def terminal_ws( + ws: WebSocket, + project_id: str = Query(...), + mode: str = Query("pkg", regex="^(pkg|shell)$") +): + """WebSocket endpoint for terminal sessions""" + await ws.accept() + session = TerminalSession(project_id, mode) + + try: + # Get project paths + project_root, venv = _project_paths(project_id) + + # Ensure project directory exists (create if not) + if not project_root.exists(): + project_root.mkdir(parents=True, exist_ok=True) + + if not venv.exists(): + # Virtual environment should exist already (created with project) + await ws.send_text(json.dumps({ + "type": "error", + "message": f"Virtual environment not found for project {project_id}. Please recreate the project." 
+ })) + await ws.close() + return + + # Determine shell to use + shell = "/bin/bash" if os.path.exists("/bin/bash") else "/bin/sh" + + # Build environment with venv activated + env = _build_env(os.environ.copy(), venv, project_root) + + # Fork a PTY + try: + pid, master_fd = pty.fork() + except OSError as e: + log.error(f"Failed to fork PTY: {e}") + await ws.send_text(json.dumps({ + "type": "error", + "message": f"Failed to create terminal: {e}" + })) + await ws.close() + return + + if pid == 0: + # Child process - exec the shell + try: + os.chdir(str(project_root)) + os.execve(shell, [shell, "-l"], env) # -l for login shell + except Exception as e: + # Log error and exit + print(f"Failed to exec shell: {e}", file=sys.stderr) + os._exit(1) + # Should never reach here + os._exit(1) + + # Parent process - manage the PTY + session.pid = pid + session.master_fd = master_fd + session.running = True + + # Set terminal to raw mode + try: + tty.setraw(master_fd, termios.TCSANOW) + except Exception: + pass # Not critical if this fails + + # Set initial window size + _set_winsize(master_fd, 24, 80) + + # Send ready message + await ws.send_text(json.dumps({ + "type": "ready", + "pid": pid, + "mode": mode, + "project_id": project_id + })) + + # Send initial message based on mode + if mode == "pkg": + initial_msg = ( + f"\033[1;32mPackage Console for project: {project_id}\033[0m\r\n" + f"Virtual environment: {venv}\r\n" + f"Use 'pip install <package>' to install packages\r\n" + f"Use 'pip list' to see installed packages\r\n" + f"Type 'exit' to close the terminal\r\n\r\n" + ) + # Write initial message to PTY + os.write(master_fd, initial_msg.encode('utf-8')) + + # Create tasks for bidirectional communication + async def pty_to_ws(): + """Read from PTY and send to WebSocket""" + loop = asyncio.get_running_loop() + + while session.running: + try: + # Use select with timeout for non-blocking read + r, _, _ = await loop.run_in_executor( + None, select.select, [master_fd], [], 
[], 0.1 + ) + + if master_fd in r: + try: + data = os.read(master_fd, 65536) + if not data: + break + + # Send output to WebSocket + await ws.send_text(json.dumps({ + "type": "stdout", + "data": data.decode('utf-8', errors='replace') + })) + + session.update_activity() + + # Check for package installation/uninstallation patterns + data_str = data.decode('utf-8', errors='ignore') + if any(pattern in data_str for pattern in [ + "Successfully installed", + "Successfully uninstalled", + "Installing collected packages" + ]): + # Send notification to trigger LSP restart + await ws.send_text(json.dumps({ + "type": "package_changed", + "project_id": project_id + })) + + except OSError: + break + + # Check for idle timeout or session expiration + if session.is_idle() or session.is_expired(): + reason = "idle_timeout" if session.is_idle() else "session_expired" + log.info(f"Terminating session for {project_id}: {reason}") + os.kill(pid, signal.SIGTERM) + break + + except Exception as e: + log.error(f"Error in pty_to_ws: {e}") + break + + async def ws_to_pty(): + """Read from WebSocket and write to PTY""" + try: + while session.running: + msg = await ws.receive_text() + obj = json.loads(msg) + msg_type = obj.get("type") + + if msg_type == "stdin": + data = obj.get("data", "") + + # In package mode, optionally restrict certain commands + if mode == "pkg" and any(forbidden in data for forbidden in []): + # Currently no restrictions, but could add some here + pass + + os.write(master_fd, data.encode('utf-8')) + session.update_activity() + + elif msg_type == "resize": + rows = int(obj.get("rows", 24)) + cols = int(obj.get("cols", 80)) + _set_winsize(master_fd, rows, cols) + + elif msg_type == "kill": + log.info(f"Kill requested for terminal session {project_id}") + os.kill(pid, signal.SIGTERM) + break + + elif msg_type == "heartbeat": + session.update_activity() + + except WebSocketDisconnect: + log.info(f"WebSocket disconnected for {project_id}") + except Exception as e: + 
log.error(f"Error in ws_to_pty: {e}") + + # Run both tasks concurrently + p1 = asyncio.create_task(pty_to_ws()) + p2 = asyncio.create_task(ws_to_pty()) + + # Wait for either task to complete + done, pending = await asyncio.wait( + {p1, p2}, + return_when=asyncio.FIRST_COMPLETED + ) + + # Cancel pending tasks + for task in pending: + task.cancel() + + # Stop the session + session.running = False + + # Wait for process to exit and get status + try: + _, status = await asyncio.get_running_loop().run_in_executor( + None, os.waitpid, pid, 0 + ) + exit_code = (status >> 8) & 0xFF + except Exception: + exit_code = None + + # Send exit message + try: + await ws.send_text(json.dumps({ + "type": "exit", + "code": exit_code, + "signal": None + })) + except Exception: + pass + + except Exception as e: + import traceback + log.error(f"Terminal session error for {project_id}: {e}\n{traceback.format_exc()}") + try: + await ws.send_text(json.dumps({ + "type": "error", + "message": str(e) + })) + except Exception: + pass + + finally: + # Clean up resources + if session.master_fd is not None: + try: + os.close(session.master_fd) + except Exception: + pass + + # Ensure process is terminated + if session.pid is not None: + try: + os.kill(session.pid, signal.SIGKILL) + except ProcessLookupError: + pass # Process already dead + except Exception as e: + log.error(f"Error killing process: {e}") + + # Close WebSocket + try: + await ws.close() + except Exception: + pass + + log.info(f"Terminal session closed for {project_id}") + +@router.get("/terminal/ping") +async def terminal_ping(): + """Health check endpoint for terminal service""" + return {"status": "ok", "service": "terminal"} \ No newline at end of file diff --git a/packages/executor/app/api/venv.py b/packages/executor/app/api/venv.py new file mode 100644 index 0000000..2f806ea --- /dev/null +++ b/packages/executor/app/api/venv.py @@ -0,0 +1,118 @@ +""" +Virtual environment management API +""" + +from fastapi import APIRouter, 
HTTPException +from pydantic import BaseModel +from typing import Optional, List, Dict +from ..core.venv_manager import AsyncVenvManager +import logging + +router = APIRouter() +logger = logging.getLogger(__name__) + +# Global venv manager instance +venv_manager = AsyncVenvManager("/app/projects") + +class CreateVenvRequest(BaseModel): + project_id: str + +class VenvStatusRequest(BaseModel): + project_id: str + + +@router.post("/create") +async def create_venv(request: CreateVenvRequest): + """Create a virtual environment for a project""" + try: + # Ensure project directory exists in executor + import os + import json + project_dir = f"/app/projects/{request.project_id}" + os.makedirs(project_dir, exist_ok=True) + + # Create a basic structure.json if it doesn't exist + structure_file = f"{project_dir}/structure.json" + if not os.path.exists(structure_file): + with open(structure_file, 'w') as f: + json.dump({ + "project_id": request.project_id, + "nodes": [], + "edges": [] + }, f) + + result = await venv_manager.create_venv_async(request.project_id) + return result + except Exception as e: + logger.error(f"Error creating venv for {request.project_id}: {e}") + raise HTTPException(status_code=500, detail=str(e)) + +@router.get("/status") +async def get_venv_status(project_id: str): + """Get the status of a project's virtual environment""" + try: + # Check if venv exists first + if venv_manager.venv_exists(project_id): + return { + "success": True, + "project_id": project_id, + "status": "completed", + "venv_ready": True, + "message": "Virtual environment is ready" + } + + # Check task status + status = venv_manager.get_status(project_id) + if status: + return { + "success": True, + "project_id": project_id, + "venv_ready": status.get("status") == "completed", + **status + } + + # No venv and no task + return { + "success": True, + "project_id": project_id, + "status": "not_started", + "venv_ready": False, + "message": "Virtual environment not created yet" + } + 
except Exception as e: + logger.error(f"Error getting venv status for {project_id}: {e}") + return { + "success": False, + "project_id": project_id, + "venv_ready": False, + "error": str(e) + } + +@router.post("/exists") +async def venv_exists(request: VenvStatusRequest): + """Check if a virtual environment exists for a project""" + exists = venv_manager.venv_exists(request.project_id) + return { + "success": True, + "project_id": request.project_id, + "exists": exists + } + + +@router.delete("/delete") +async def delete_venv(project_id: str): + """Delete a project's virtual environment""" + try: + venv_manager.delete_venv(project_id) + return { + "success": True, + "project_id": project_id, + "message": f"Virtual environment for project {project_id} deleted" + } + except Exception as e: + logger.error(f"Error deleting venv for {project_id}: {e}") + return { + "success": False, + "project_id": project_id, + "error": str(e) + } \ No newline at end of file diff --git a/packages/executor/app/core/__init__.py b/packages/executor/app/core/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/packages/executor/app/core/executor.py b/packages/executor/app/core/executor.py new file mode 100644 index 0000000..a9da737 --- /dev/null +++ b/packages/executor/app/core/executor.py @@ -0,0 +1,87 @@ +""" +Code execution module for running Python code in isolated subprocess +""" + +import subprocess +import sys +import tempfile +from typing import Optional, Dict, Any +from pathlib import Path + +def execute_python_code( + code: str, + timeout: int = 30, + python_executable: Optional[str] = None, + working_dir: Optional[str] = None +) -> Dict[str, Any]: + """ + Execute Python code in a secure temporary environment + + Args: + code: Python code to execute + timeout: Maximum execution time in seconds + python_executable: Optional path to Python executable (for venv) + working_dir: Optional working directory for execution + + Returns: + Dictionary with output, error, and 
exit_code + """ + try: + with tempfile.NamedTemporaryFile(mode='w', suffix='.py', delete=False) as temp_file: + temp_file.write(code) + temp_file_path = temp_file.name + + try: + # Use provided Python executable or system default + python_exe = python_executable if python_executable else sys.executable + + result = subprocess.run( + [python_exe, temp_file_path], + capture_output=True, + text=True, + timeout=timeout, + cwd=working_dir # Set working directory if provided + ) + + return { + "output": result.stdout, + "error": result.stderr, + "exit_code": result.returncode + } + + finally: + # Clean up temp file + import os + try: + os.unlink(temp_file_path) + except: + pass + + except subprocess.TimeoutExpired: + return { + "output": "", + "error": f"Execution timed out after {timeout} seconds", + "exit_code": -1 + } + except Exception as e: + return { + "output": "", + "error": str(e), + "exit_code": -1 + } + +def get_project_venv_python(project_id: str) -> str: + """Get the Python executable path for a project's venv""" + import os + venv_path = Path(f"/app/projects/{project_id}/venv") + + if os.name == 'nt': + python_exe = venv_path / "Scripts" / "python.exe" + else: + python_exe = venv_path / "bin" / "python" + + if python_exe.exists(): + return str(python_exe) + + # Fallback to system Python + return sys.executable \ No newline at end of file diff --git a/packages/executor/app/core/flow_executor.py b/packages/executor/app/core/flow_executor.py new file mode 100644 index 0000000..90d5ccd --- /dev/null +++ b/packages/executor/app/core/flow_executor.py @@ -0,0 +1,320 @@ +""" +Flow executor for running node graphs in topological order +""" + +import asyncio +import json +import logging +from pathlib import Path +from typing import Dict, List, Any, Optional, Set +from concurrent.futures import ThreadPoolExecutor, TimeoutError +from collections import defaultdict +from .executor import execute_python_code +from .venv_manager import AsyncVenvManager + +logger = 
logging.getLogger(__name__) + +class FlowExecutor: + """Executes node flows with dependency management""" + + def __init__(self, projects_root: str): + self.projects_root = Path(projects_root) + self.venv_manager = AsyncVenvManager(projects_root) + + def get_project_structure(self, project_id: str) -> Dict[str, Any]: + """Load project structure from JSON file""" + project_path = self.projects_root / project_id + structure_file = project_path / "structure.json" + + if not structure_file.exists(): + raise FileNotFoundError(f"Project structure not found for {project_id}") + + with open(structure_file, 'r') as f: + return json.load(f) + + def get_node_code(self, project_id: str, node_id: str) -> str: + """Get the code for a node from its file""" + project_dir = self.projects_root / project_id + + # Try to find the node file with pattern node_id_*.py + for file_path in project_dir.glob(f"{node_id}_*.py"): + with open(file_path, 'r', encoding='utf-8') as f: + return f.read() + + # Try exact match + exact_file = project_dir / f"{node_id}.py" + if exact_file.exists(): + with open(exact_file, 'r', encoding='utf-8') as f: + return f.read() + + return "" # Return empty for nodes without code (start, result) + + def build_dependency_graph(self, nodes: List[Dict], edges: List[Dict]) -> Dict[str, Set[str]]: + """Build a dependency graph from edges""" + dependencies = defaultdict(set) + + for edge in edges: + target = edge['target'] + source = edge['source'] + dependencies[target].add(source) + + # Ensure all nodes are in the graph + for node in nodes: + if node['id'] not in dependencies: + dependencies[node['id']] = set() + + return dependencies + + def topological_sort(self, dependencies: Dict[str, Set[str]]) -> List[str]: + """Perform topological sort to determine execution order""" + in_degree = defaultdict(int) + graph = defaultdict(list) + + # Build adjacency list and calculate in-degrees + for node, deps in dependencies.items(): + in_degree[node] = len(deps) + for dep in 
deps: + graph[dep].append(node) + + # Find nodes with no dependencies + queue = [node for node, degree in in_degree.items() if degree == 0] + result = [] + + while queue: + node = queue.pop(0) + result.append(node) + + # Reduce in-degree for dependent nodes + for neighbor in graph[node]: + in_degree[neighbor] -= 1 + if in_degree[neighbor] == 0: + queue.append(neighbor) + + # Check for cycles + if len(result) != len(dependencies): + raise ValueError("Cycle detected in flow graph") + + return result + + async def execute_node(self, project_id: str, node_id: str, node_type: str, + input_data: Any, timeout: int = 30) -> Any: + """Execute a single node and return its output""" + + # Special handling for non-custom nodes + if node_type == 'start': + return input_data # Pass through initial params + elif node_type == 'result': + return input_data # Pass through the result + + # Get node code + code = self.get_node_code(project_id, node_id) + if not code: + logger.warning(f"No code found for node {node_id}, returning input") + return input_data + + # Create wrapper to execute the node with input data + input_json_str = json.dumps(input_data) if input_data is not None else 'null' + + wrapper_code = f""" +import json +import sys + +# Node code +{code} + +# Execute with input +try: + input_json = '''{input_json_str}''' + if input_json != 'null': + input_data = json.loads(input_json) + else: + input_data = None + + # Find and execute main function + if 'main' in locals() and callable(main): + result = main(input_data) if input_data is not None else main() + else: + # Find first callable that's not a builtin + result = None + for name, obj in list(locals().items()): + if callable(obj) and not name.startswith('_') and name not in ['json', 'sys']: + result = obj(input_data) if input_data is not None else obj() + break + + if result is None: + result = input_data # Pass through if no function found + + print(json.dumps({{'success': True, 'output': result}})) +except Exception as 
e: + import traceback + print(json.dumps({{ + 'success': False, + 'error': str(e), + 'traceback': traceback.format_exc() + }})) +""" + + # Execute using project's virtual environment + python_exe = self.venv_manager.get_python_executable(project_id) + project_dir = str(self.projects_root / project_id) + + execution_result = execute_python_code( + wrapper_code, + timeout=timeout, + python_executable=python_exe, + working_dir=project_dir + ) + + if execution_result['exit_code'] == 0: + try: + output = json.loads(execution_result['output']) + if output.get('success'): + return output.get('output') + else: + raise RuntimeError(f"Node {node_id} failed: {output.get('error', 'Unknown error')}") + except json.JSONDecodeError: + raise RuntimeError(f"Failed to parse output from node {node_id}: {execution_result['output']}") + else: + raise RuntimeError(f"Node {node_id} execution failed: {execution_result.get('error', 'Unknown error')}") + + async def execute_flow(self, project_id: str, start_node_id: Optional[str] = None, + params: Dict[str, Any] = None, max_workers: int = 4, + timeout_sec: int = 30, halt_on_error: bool = True) -> Dict[str, Any]: + """Execute the entire flow starting from a given node""" + + # Check if venv exists + if not self.venv_manager.venv_exists(project_id): + return { + "success": False, + "error": f"Virtual environment not found for project {project_id}" + } + + # Load project structure + structure = self.get_project_structure(project_id) + nodes = {node['id']: node for node in structure['nodes']} + edges = structure['edges'] + + # Find start node if not specified + if not start_node_id: + start_nodes = [n for n in nodes.values() if n.get('type') == 'start'] + if not start_nodes: + return { + "success": False, + "error": "No start node found in project" + } + start_node_id = start_nodes[0]['id'] + + if start_node_id not in nodes: + return { + "success": False, + "error": f"Start node {start_node_id} not found" + } + + # Build dependency graph + 
dependencies = self.build_dependency_graph(list(nodes.values()), edges) + + # Get execution order + try: + execution_order = self.topological_sort(dependencies) + except ValueError as e: + return { + "success": False, + "error": str(e) + } + + # Find nodes reachable from start node + reachable = set() + to_visit = [start_node_id] + edge_map = defaultdict(list) + + for edge in edges: + edge_map[edge['source']].append(edge['target']) + + while to_visit: + current = to_visit.pop(0) + if current not in reachable: + reachable.add(current) + to_visit.extend(edge_map[current]) + + # Filter execution order to only reachable nodes + execution_order = [n for n in execution_order if n in reachable] + + # Execute nodes + results = {} + node_outputs = {} + errors = [] + + try: + for node_id in execution_order: + node = nodes[node_id] + node_type = node.get('type', 'custom') + + # Determine input data + if node_id == start_node_id: + input_data = params + else: + # Get input from dependencies + deps = dependencies[node_id] + if len(deps) == 0: + input_data = None + elif len(deps) == 1: + dep_id = list(deps)[0] + input_data = node_outputs.get(dep_id) + else: + # Multiple dependencies - combine outputs + input_data = {dep_id: node_outputs.get(dep_id) for dep_id in deps} + + try: + # Execute the node + output = await self.execute_node( + project_id, node_id, node_type, input_data, timeout_sec + ) + + node_outputs[node_id] = output + results[node_id] = { + "success": True, + "output": output, + "type": node_type + } + + except Exception as e: + error_msg = str(e) + errors.append({ + "node_id": node_id, + "error": error_msg + }) + results[node_id] = { + "success": False, + "error": error_msg, + "type": node_type + } + + if halt_on_error: + break + + # Find result nodes + result_nodes = [n for n in nodes.values() + if n.get('type') == 'result' and n['id'] in reachable] + + # Collect outputs from result nodes + final_outputs = {} + for result_node in result_nodes: + node_id = 
result_node['id'] + if node_id in node_outputs: + final_outputs[node_id] = node_outputs[node_id] + + return { + "success": len(errors) == 0, + "execution_order": execution_order, + "results": results, + "final_outputs": final_outputs, + "errors": errors if errors else None + } + + except Exception as e: + return { + "success": False, + "error": f"Flow execution failed: {str(e)}", + "execution_order": execution_order, + "results": results + } \ No newline at end of file diff --git a/packages/executor/app/core/logging.py b/packages/executor/app/core/logging.py new file mode 100644 index 0000000..41dcf30 --- /dev/null +++ b/packages/executor/app/core/logging.py @@ -0,0 +1,161 @@ +""" +Structured logging for LSP and application +""" + +import logging +import os +import json +import time +import pathlib +from collections import deque +from typing import Literal, Optional, Dict, Any +from datetime import datetime + +LOG_LEVEL = os.getenv("LSP_LOG_LEVEL", "INFO").upper() +STDIO_DIR = pathlib.Path(os.getenv("LSP_STDIO_LOG_DIR", "/var/log/aim-red/lsp")) +STDIO_DIR.mkdir(parents=True, exist_ok=True) + +# Recent events memory buffer for debug endpoint +_RING = deque(maxlen=2000) + +def get_logger(name: str) -> logging.Logger: + """Get a structured JSON logger""" + logger = logging.getLogger(name) + if not logger.handlers: + handler = logging.StreamHandler() + handler.setFormatter(_JsonFormatter()) + logger.addHandler(handler) + logger.setLevel(getattr(logging, LOG_LEVEL, logging.INFO)) + return logger + +class _JsonFormatter(logging.Formatter): + """JSON formatter for structured logging""" + + def format(self, record: logging.LogRecord) -> str: + base = { + "ts": int(time.time() * 1000), + "level": record.levelname, + "msg": record.getMessage(), + "logger": record.name, + } + + # Add extra fields if provided + if hasattr(record, 'extra') and isinstance(record.extra, dict): + base.update(record.extra) + + return json.dumps(base, ensure_ascii=False) + +def lsp_stdio_logger( + 
project_id: str, + lsp_type: str, + stream: str, + line: str +) -> None: + """Log LSP stdio output to file and structured log""" + # Write to file for persistence + filename = STDIO_DIR / f"{project_id}.{lsp_type}.{stream}.log" + try: + with filename.open("a", encoding="utf-8") as f: + timestamp = datetime.now().isoformat() + f.write(f"[{timestamp}] {line}\n") + except Exception as e: + get_logger("lsp.stdio").error(f"Failed to write stdio log: {e}") + + # Sample structured log (avoid flooding) + if LOG_LEVEL in ["DEBUG", "TRACE"]: + get_logger("lsp.stdio").debug( + "stdio", + extra={ + "project_id": project_id, + "lsp": lsp_type, + "stream": stream, + "preview": line[:200] if len(line) > 200 else line + } + ) + +def log_lsp_frame( + direction: Literal["in", "out"], + lsp_type: str, + project_id: str, + payload: bytes +) -> None: + """Log LSP JSON-RPC frames for debugging""" + try: + # Parse JSON-RPC payload + obj = json.loads(payload.decode("utf-8")) + method = obj.get("method") + msg_id = obj.get("id") + size = len(payload) + + # Create log record + record = { + "direction": direction, + "method": method, + "id": msg_id, + "bytes": size, + "project_id": project_id, + "lsp": lsp_type, + "ts": int(time.time() * 1000) + } + + # Add to ring buffer + _RING.append(record) + + # Log metadata only by default + logger = get_logger("lsp.frame") + logger.debug("frame", extra=record) + + # Log full payload only in TRACE mode + if LOG_LEVEL == "TRACE": + logger.debug("payload", extra={**record, "payload": obj}) + + except json.JSONDecodeError: + # Not JSON, might be binary or malformed + pass + except Exception as e: + get_logger("lsp.frame").error(f"Error logging LSP frame: {e}") + +def log_lsp_lifecycle( + event: str, + project_id: str, + lsp_type: str, + **kwargs +) -> None: + """Log LSP lifecycle events""" + logger = get_logger("lsp.lifecycle") + extra = { + "event": event, + "project_id": project_id, + "lsp": lsp_type, + **kwargs + } + + if event in ["start", "stop", 
"restart"]: + logger.info(event, extra=extra) + elif event in ["error", "crash"]: + logger.error(event, extra=extra) + else: + logger.debug(event, extra=extra) + +def read_ring(n: int = 200) -> list: + """Read recent events from ring buffer""" + return list(_RING)[-n:] + +def get_stdio_logs( + project_id: str, + lsp_type: str, + stream: str = "stdout", + lines: int = 100 +) -> list[str]: + """Read recent stdio logs for a specific LSP process""" + filename = STDIO_DIR / f"{project_id}.{lsp_type}.{stream}.log" + if not filename.exists(): + return [] + + try: + with filename.open("r", encoding="utf-8") as f: + all_lines = f.readlines() + return all_lines[-lines:] + except Exception as e: + get_logger("lsp.stdio").error(f"Failed to read stdio log: {e}") + return [] \ No newline at end of file diff --git a/packages/executor/app/core/lsp_manager.py b/packages/executor/app/core/lsp_manager.py new file mode 100644 index 0000000..d5626a3 --- /dev/null +++ b/packages/executor/app/core/lsp_manager.py @@ -0,0 +1,537 @@ +""" +Enhanced LSP Manager with auto-restart and exponential backoff +""" + +from __future__ import annotations +import asyncio +import os +import time +import random +import signal +from dataclasses import dataclass, field +from typing import Dict, Optional, Literal, Tuple +from pathlib import Path +from .logging import get_logger, lsp_stdio_logger, log_lsp_lifecycle + +LspType = Literal["pyright", "ruff"] + +@dataclass +class LspProcess: + """Manages a single LSP process with restart capability""" + project_id: str + lsp_type: LspType + cwd: str + proc: Optional[asyncio.subprocess.Process] = None + started_at: float = 0.0 + restarts: int = 0 + last_started_ts: float = 0.0 + last_activity_ts: float = field(default_factory=lambda: time.time()) + is_starting: bool = False + is_stopping: bool = False + last_exit: Optional[int] = None + restart_attempts_in_window: int = 0 + window_start_ts: float = field(default_factory=lambda: time.time()) + +class LspManager: + 
"""Enhanced LSP Manager with auto-restart and lifecycle management""" + + def __init__(self) -> None: + self.log = get_logger(__name__) + self._procs: Dict[Tuple[str, LspType], LspProcess] = {} + self._lock = asyncio.Lock() + + # Configuration from environment + self.LOG_LEVEL = os.getenv("LSP_LOG_LEVEL", "INFO") + self.IDLE_TTL_MS = int(os.getenv("LSP_IDLE_TTL_MS", "600000")) # 10 minutes + self.MAX_RESTARTS = int(os.getenv("LSP_MAX_RESTARTS", "5")) + self.RESTART_WINDOW_MS = int(os.getenv("LSP_RESTART_WINDOW_MS", "60000")) # 1 minute + + self.log.info("LSP Manager initialized", extra={ + "idle_ttl_ms": self.IDLE_TTL_MS, + "max_restarts": self.MAX_RESTARTS, + "restart_window_ms": self.RESTART_WINDOW_MS + }) + + def _cmd_for(self, lsp_type: LspType) -> Tuple[str, ...]: + """Get command for LSP type""" + if lsp_type == "pyright": + # When installed via npm globally in Docker, use pyright-langserver + # When installed via pip in venv, use pyright --langserver --stdio + import shutil + if shutil.which("pyright-langserver"): + return ("pyright-langserver", "--stdio") + else: + return ("pyright", "--langserver", "--stdio") + elif lsp_type == "ruff": + # Ruff server mode + return ("ruff", "server") + raise ValueError(f"Unknown LSP type: {lsp_type}") + + async def get_or_start( + self, + project_id: str, + lsp_type: LspType, + cwd: Optional[str] = None + ) -> LspProcess: + """Get existing process or start a new one""" + key = (project_id, lsp_type) + + # Use project directory if cwd not specified + if not cwd: + cwd = f"/app/projects/{project_id}" + + # Ensure project directory exists + import os + os.makedirs(cwd, exist_ok=True) + + async with self._lock: + lp = self._procs.get(key) + + # Check if process exists and is alive + if lp and lp.proc and lp.proc.returncode is None: + lp.last_activity_ts = time.time() + return lp + + # Create new process entry if needed + if not lp: + lp = LspProcess( + project_id=project_id, + lsp_type=lsp_type, + cwd=cwd + ) + 
self._procs[key] = lp + + # Start or restart the process + await self._start_process(lp) + return lp + + async def _start_process(self, lp: LspProcess) -> None: + """Start LSP process with exponential backoff on restart""" + if lp.is_starting: + return + + lp.is_starting = True + try: + # Check restart window + now = time.time() + if (now - lp.window_start_ts) * 1000 > self.RESTART_WINDOW_MS: + # Reset window + lp.restart_attempts_in_window = 0 + lp.window_start_ts = now + + # Apply exponential backoff if restarting + if lp.restart_attempts_in_window > 0: + if lp.restart_attempts_in_window > self.MAX_RESTARTS: + log_lsp_lifecycle( + "giveup", + lp.project_id, + lp.lsp_type, + restarts=lp.restart_attempts_in_window + ) + raise RuntimeError(f"Max restarts ({self.MAX_RESTARTS}) exceeded") + + # Exponential backoff with jitter + backoff = min(2 ** lp.restart_attempts_in_window, 30) # Max 30 seconds + jitter = random.random() * 0.5 + wait_time = backoff + jitter + + self.log.info( + "Waiting before restart", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "wait_seconds": wait_time, + "attempt": lp.restart_attempts_in_window + } + ) + await asyncio.sleep(wait_time) + + # Get command - use project venv if available + venv_path = Path(lp.cwd) / "venv" + cmd = None + + if venv_path.exists(): + # Use pyright and ruff from project venv + if os.name == 'nt': + bin_dir = venv_path / "Scripts" + pyright_exe = bin_dir / "pyright.exe" + ruff_exe = bin_dir / "ruff.exe" + else: + bin_dir = venv_path / "bin" + pyright_exe = bin_dir / "pyright" + ruff_exe = bin_dir / "ruff" + + # Check if executable exists + if lp.lsp_type == "pyright" and pyright_exe.exists(): + # When installed via pip in venv, pyright provides pyright-langserver + pyright_langserver = bin_dir / ("pyright-langserver.exe" if os.name == 'nt' else "pyright-langserver") + if pyright_langserver.exists(): + cmd = (str(pyright_langserver), "--stdio") + else: + # Fallback to pyright with --langserver 
--stdio + cmd = (str(pyright_exe), "--langserver", "--stdio") + elif lp.lsp_type == "ruff" and ruff_exe.exists(): + cmd = (str(ruff_exe), "server") + else: + self.log.warning(f"LSP executable not found in venv for {lp.lsp_type}: {pyright_exe if lp.lsp_type == 'pyright' else ruff_exe}") + + # Fallback to system commands if venv doesn't have LSP + if not cmd: + cmd = self._cmd_for(lp.lsp_type) + self.log.info(f"Using system command for {lp.lsp_type}: {cmd}") + + # Ensure pyrightconfig.json exists for pyright + if lp.lsp_type == "pyright": + config_path = Path(lp.cwd) / "pyrightconfig.json" + if not config_path.exists(): + self._generate_pyright_config(lp.cwd) + + # Build environment with venv activated + env = os.environ.copy() + if venv_path.exists(): + env["VIRTUAL_ENV"] = str(venv_path) + if os.name == 'nt': + env["PATH"] = str(venv_path / "Scripts") + os.pathsep + env.get("PATH", "") + else: + env["PATH"] = str(venv_path / "bin") + os.pathsep + env.get("PATH", "") + env["PYTHONPATH"] = lp.cwd + + # Log startup + log_lsp_lifecycle( + "start", + lp.project_id, + lp.lsp_type, + cmd=" ".join(cmd) if isinstance(cmd, tuple) else cmd, + cwd=lp.cwd, + attempt=lp.restart_attempts_in_window + ) + + # Start the process with the project venv environment + # cmd can be tuple or string, handle both cases + if isinstance(cmd, tuple): + cmd_list = list(cmd) + else: + cmd_list = [cmd] + + # Ensure the working directory exists + if not Path(lp.cwd).exists(): + self.log.warning(f"Creating missing project directory: {lp.cwd}") + Path(lp.cwd).mkdir(parents=True, exist_ok=True) + + try: + self.log.info( + f"Starting LSP process", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "command": cmd_list, + "cwd": lp.cwd + } + ) + + lp.proc = await asyncio.create_subprocess_exec( + *cmd_list, + cwd=lp.cwd, + stdin=asyncio.subprocess.PIPE, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + env=env + ) + + # Give the process a moment to start + await 
asyncio.sleep(0.1) + + # Check if process started successfully + if lp.proc.returncode is not None: + # Process already exited + stderr_output = await lp.proc.stderr.read() + self.log.error( + f"LSP process exited immediately", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "returncode": lp.proc.returncode, + "stderr": stderr_output.decode(errors='replace')[:500] + } + ) + raise RuntimeError(f"LSP {lp.lsp_type} exited immediately with code {lp.proc.returncode}") + + except FileNotFoundError as e: + self.log.error( + f"LSP executable not found", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "command": cmd_list[0] if cmd_list else "unknown", + "error": str(e) + } + ) + raise RuntimeError(f"LSP executable not found for {lp.lsp_type}: {cmd_list[0] if cmd_list else 'unknown'}") + except Exception as e: + self.log.error( + f"Failed to start LSP process", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "command": cmd_list, + "error": str(e) + } + ) + raise + + lp.started_at = time.time() + lp.last_started_ts = lp.started_at + lp.last_activity_ts = lp.started_at + lp.restarts += 1 + lp.restart_attempts_in_window += 1 + + # Start background tasks for stdout/stderr draining and process monitoring + asyncio.create_task(self._drain_stream(lp, lp.proc.stdout, "stdout")) + asyncio.create_task(self._drain_stream(lp, lp.proc.stderr, "stderr")) + asyncio.create_task(self._wait_process(lp)) + + except Exception as e: + log_lsp_lifecycle( + "error", + lp.project_id, + lp.lsp_type, + error=str(e) + ) + raise + finally: + lp.is_starting = False + + def _generate_pyright_config(self, cwd: str) -> None: + """Generate pyrightconfig.json for project virtual environment""" + import json + venv_path = Path(cwd) / "venv" + config_path = Path(cwd) / "pyrightconfig.json" + + config = { + "venvPath": str(Path(cwd).absolute()), + "venv": "venv", + "pythonVersion": "3.11", + "include": ["."], + "exclude": ["venv", "__pycache__", ".git"], + 
"reportMissingImports": "warning", + "reportMissingTypeStubs": "information", + "reportGeneralTypeIssues": "warning", + "pythonPlatform": "Linux", + "typeCheckingMode": "basic" + } + + try: + with open(config_path, 'w') as f: + json.dump(config, f, indent=2) + self.log.info(f"Generated pyrightconfig.json at {config_path}") + except Exception as e: + self.log.error(f"Failed to generate pyrightconfig.json: {e}") + + async def _drain_stream( + self, + lp: LspProcess, + stream: asyncio.StreamReader, + name: str + ) -> None: + """Drain stdout/stderr and log output""" + try: + while lp.proc and lp.proc.returncode is None: + line = await stream.readline() + if not line: + break + + # Decode and log + txt = line.decode(errors="replace").rstrip("\n") + if txt: # Only log non-empty lines + lsp_stdio_logger(lp.project_id, lp.lsp_type, name, txt) + lp.last_activity_ts = time.time() + except Exception as e: + self.log.error( + f"Error draining {name}", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "error": str(e) + } + ) + + async def _wait_process(self, lp: LspProcess) -> None: + """Wait for process exit and handle restart""" + try: + rc = await lp.proc.wait() + lp.last_exit = rc + + log_lsp_lifecycle( + "exit", + lp.project_id, + lp.lsp_type, + returncode=rc, + restarts=lp.restarts + ) + + # Auto-restart if not explicitly stopping and within limits + if not lp.is_stopping and lp.restart_attempts_in_window < self.MAX_RESTARTS: + self.log.info( + "Auto-restarting LSP", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "exit_code": rc + } + ) + await self._start_process(lp) + elif lp.restart_attempts_in_window >= self.MAX_RESTARTS: + log_lsp_lifecycle( + "crash", + lp.project_id, + lp.lsp_type, + reason="max_restarts_exceeded" + ) + except Exception as e: + self.log.error( + "Error waiting for process", + extra={ + "project_id": lp.project_id, + "lsp": lp.lsp_type, + "error": str(e) + } + ) + + async def send_raw( + self, + project_id: str, + 
lsp_type: LspType, + data: bytes + ) -> None: + """Send raw data to LSP stdin""" + key = (project_id, lsp_type) + lp = self._procs.get(key) + + if not lp or not lp.proc or not lp.proc.stdin or lp.proc.returncode is not None: + raise RuntimeError(f"LSP {lsp_type} not running for project {project_id}") + + lp.proc.stdin.write(data) + await lp.proc.stdin.drain() + lp.last_activity_ts = time.time() + + async def read_raw( + self, + project_id: str, + lsp_type: LspType, + size: int = 4096 + ) -> bytes: + """Read raw data from LSP stdout""" + key = (project_id, lsp_type) + lp = self._procs.get(key) + + if not lp or not lp.proc or not lp.proc.stdout: + raise RuntimeError(f"LSP {lsp_type} not running for project {project_id}") + + data = await lp.proc.stdout.read(size) + lp.last_activity_ts = time.time() + return data + + async def idle_collect(self) -> None: + """Background task to clean up idle processes""" + while True: + try: + await asyncio.sleep(30) # Check every 30 seconds + now = time.time() + ttl = self.IDLE_TTL_MS / 1000.0 + + kill_list = [] + for key, lp in list(self._procs.items()): + if lp.proc and lp.proc.returncode is None: + if now - lp.last_activity_ts > ttl: + kill_list.append((key, lp)) + + for key, lp in kill_list: + await self.stop(lp.project_id, lp.lsp_type, reason="idle") + + except Exception as e: + self.log.error(f"Error in idle collection: {e}") + + async def stop( + self, + project_id: str, + lsp_type: LspType, + reason: str = "manual" + ) -> None: + """Stop a specific LSP process""" + key = (project_id, lsp_type) + lp = self._procs.get(key) + + if not lp or not lp.proc or lp.proc.returncode is not None: + return + + lp.is_stopping = True + + log_lsp_lifecycle( + "stop", + project_id, + lsp_type, + reason=reason + ) + + try: + # Try graceful termination first + lp.proc.send_signal(signal.SIGTERM) + try: + await asyncio.wait_for(lp.proc.wait(), timeout=3) + except asyncio.TimeoutError: + # Force kill if not responding + lp.proc.kill() + await 
lp.proc.wait() + except Exception as e: + self.log.error( + "Error stopping process", + extra={ + "project_id": project_id, + "lsp": lsp_type, + "error": str(e) + } + ) + finally: + lp.is_stopping = False + # Remove from process map + if key in self._procs: + del self._procs[key] + + def health(self, project_id: str, lsp_type: LspType) -> dict: + """Get health status of LSP process""" + key = (project_id, lsp_type) + lp = self._procs.get(key) + + if not lp: + return { + "running": False, + "reason": "not_created" + } + + running = lp.proc is not None and lp.proc.returncode is None + + return { + "running": running, + "pid": lp.proc.pid if lp.proc else None, + "restarts": lp.restarts, + "restart_attempts": lp.restart_attempts_in_window, + "last_exit": lp.last_exit, + "last_activity_ts": lp.last_activity_ts, + "started_at": lp.started_at, + "uptime_seconds": time.time() - lp.started_at if running else 0 + } + + async def restart(self, project_id: str, lsp_type: LspType) -> bool: + """Manually restart an LSP process""" + await self.stop(project_id, lsp_type, reason="manual_restart") + + # Reset restart counter for manual restart + key = (project_id, lsp_type) + if key in self._procs: + self._procs[key].restart_attempts_in_window = 0 + + lp = await self.get_or_start(project_id, lsp_type) + return lp.proc is not None and lp.proc.returncode is None + +# Global singleton instance +lsp_manager = LspManager() \ No newline at end of file diff --git a/packages/executor/app/core/venv_manager.py b/packages/executor/app/core/venv_manager.py new file mode 100644 index 0000000..8ddcb7b --- /dev/null +++ b/packages/executor/app/core/venv_manager.py @@ -0,0 +1,358 @@ +""" +Async Virtual Environment Manager with Status Tracking +Manages background venv creation with progress monitoring +""" + +import asyncio +import os +import sys +import subprocess +import json +import time +from pathlib import Path +from typing import Dict, Optional, Literal +from enum import Enum +from datetime 
import datetime +import threading +from concurrent.futures import ThreadPoolExecutor + +class VenvStatus(str, Enum): + """Virtual environment creation status""" + NOT_STARTED = "not_started" + CREATING = "creating" + INSTALLING_PIP = "installing_pip" + INSTALLING_BASE = "installing_base" + INSTALLING_LSP = "installing_lsp" + COMPLETED = "completed" + FAILED = "failed" + +class VenvCreationTask: + """Tracks a single venv creation task""" + + def __init__(self, project_id: str): + self.project_id = project_id + self.status = VenvStatus.NOT_STARTED + self.progress = 0 # 0-100 + self.message = "Not started" + self.error: Optional[str] = None + self.started_at: Optional[datetime] = None + self.completed_at: Optional[datetime] = None + self.current_package: Optional[str] = None + + def to_dict(self) -> Dict: + """Convert to dictionary for API response""" + return { + "project_id": self.project_id, + "status": self.status, + "progress": self.progress, + "message": self.message, + "error": self.error, + "current_package": self.current_package, + "started_at": self.started_at.isoformat() if self.started_at else None, + "completed_at": self.completed_at.isoformat() if self.completed_at else None, + "duration_sec": ( + (self.completed_at - self.started_at).total_seconds() + if self.started_at and self.completed_at + else None + ) + } + +class AsyncVenvManager: + """Manages async virtual environment creation with status tracking""" + + def __init__(self, projects_root: str): + self.projects_root = Path(projects_root) + self.tasks: Dict[str, VenvCreationTask] = {} + self.executor = ThreadPoolExecutor(max_workers=3) # Limit concurrent venv creations + self._lock = threading.Lock() + + # Base packages to install + self.base_packages = [ + "fastapi==0.115.5", + "uvicorn[standard]==0.32.1", + "pydantic==2.10.3", + "httpx==0.28.1", + "aiofiles==24.1.0", + "websockets==14.1", + "python-multipart==0.0.12", + ] + + self.lsp_packages = [ + "pyright", + "ruff==0.8.6" + ] + + def 
get_venv_path(self, project_id: str) -> Path: + """Get the virtual environment path for a project""" + return self.projects_root / project_id / "venv" + + def get_python_executable(self, project_id: str) -> str: + """Get the Python executable path for a project's venv""" + venv_path = self.get_venv_path(project_id) + + # Windows vs Unix paths + if os.name == 'nt': + python_exe = venv_path / "Scripts" / "python.exe" + else: + python_exe = venv_path / "bin" / "python" + + return str(python_exe) + + def venv_exists(self, project_id: str) -> bool: + """Check if a virtual environment exists for a project""" + venv_path = self.get_venv_path(project_id) + python_exe = Path(self.get_python_executable(project_id)) + return venv_path.exists() and python_exe.exists() + + def get_status(self, project_id: str) -> Optional[Dict]: + """Get the status of a venv creation task""" + with self._lock: + task = self.tasks.get(project_id) + return task.to_dict() if task else None + + def is_creating(self, project_id: str) -> bool: + """Check if venv is currently being created""" + with self._lock: + task = self.tasks.get(project_id) + return task and task.status in [ + VenvStatus.CREATING, + VenvStatus.INSTALLING_PIP, + VenvStatus.INSTALLING_BASE, + VenvStatus.INSTALLING_LSP + ] + + async def create_venv_async(self, project_id: str) -> Dict: + """Start async venv creation and return immediately""" + + # Check if already exists + if self.venv_exists(project_id): + return { + "success": True, + "message": "Virtual environment already exists", + "status": VenvStatus.COMPLETED + } + + # Check if already creating + if self.is_creating(project_id): + return { + "success": False, + "message": "Virtual environment creation already in progress", + "status": self.tasks[project_id].status + } + + # Create task and start in background + with self._lock: + task = VenvCreationTask(project_id) + task.status = VenvStatus.CREATING + task.started_at = datetime.now() + task.message = "Starting virtual 
environment creation..." + self.tasks[project_id] = task + + # Submit to executor for background processing + future = self.executor.submit(self._create_venv_sync, project_id) + + # Don't wait for completion - return immediately + return { + "success": True, + "message": "Virtual environment creation started", + "status": VenvStatus.CREATING, + "task_id": project_id + } + + def _create_venv_sync(self, project_id: str) -> bool: + """Synchronous venv creation with progress updates""" + task = self.tasks[project_id] + + try: + venv_path = self.get_venv_path(project_id) + project_dir = self.projects_root / project_id + + # Ensure project directory exists + project_dir.mkdir(parents=True, exist_ok=True) + + # Step 1: Create venv (30% progress) + task.status = VenvStatus.CREATING + task.progress = 10 + task.message = "Creating virtual environment structure..." + + print(f"[AsyncVenv] Creating virtual environment at {venv_path}...") + + # Use --copies to avoid symlinks which can be problematic in Docker + result = subprocess.run( + [sys.executable, "-m", "venv", "--copies", str(venv_path)], + capture_output=True, + text=True, + timeout=300 # 5 minutes timeout + ) + + if result.returncode != 0: + raise Exception(f"Failed to create venv: {result.stderr}") + + task.progress = 30 + task.message = "Virtual environment created, setting up pip..." + + # Step 2: Ensure pip (40% progress) + task.status = VenvStatus.INSTALLING_PIP + python_exe = self.get_python_executable(project_id) + + # Ensure pip is available + result = subprocess.run( + [python_exe, "-m", "ensurepip", "--upgrade", "--default-pip"], + capture_output=True, + text=True, + timeout=120 + ) + + # Upgrade pip + subprocess.run( + [python_exe, "-m", "pip", "install", "--upgrade", "pip", "setuptools", "wheel"], + capture_output=True, + text=True, + timeout=120 + ) + + task.progress = 40 + task.message = "Installing base packages..." 
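Steps 1 and 2 above (creating the venv with `--copies`, then bootstrapping pip) can be sketched as a standalone helper. This is a hedged illustration, not the service's actual code path: the function name `bootstrap_venv` and the temp-dir usage are made up for the example; the subprocess invocations mirror the ones in `_create_venv_sync`.

```python
# Minimal sketch of the venv bootstrap sequence: create a symlink-free venv
# (--copies avoids symlink issues with Docker bind mounts), then make sure
# pip is present. Names and paths here are illustrative.
import os
import subprocess
import sys
from pathlib import Path

def bootstrap_venv(venv_path: Path) -> Path:
    """Create a copied (symlink-free) venv and return its python executable."""
    subprocess.run(
        [sys.executable, "-m", "venv", "--copies", str(venv_path)],
        check=True, capture_output=True, text=True, timeout=300,
    )
    # Windows and Unix lay out the venv differently
    bin_dir = "Scripts" if os.name == "nt" else "bin"
    exe_name = "python.exe" if os.name == "nt" else "python"
    python_exe = venv_path / bin_dir / exe_name
    # ensurepip is effectively a no-op when venv already bundled pip
    subprocess.run(
        [str(python_exe), "-m", "ensurepip", "--upgrade", "--default-pip"],
        capture_output=True, text=True, timeout=120,
    )
    return python_exe
```

Running this against a temporary directory yields a working interpreter path that later `pip install` subprocess calls can target.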
+ + # Step 3: Install base packages (40-70% progress) + task.status = VenvStatus.INSTALLING_BASE + task.progress = 40 + task.message = "Installing base packages..." + + # Install all base packages at once for better performance + print(f"[AsyncVenv] Installing base packages for project {project_id}...") + task.current_package = "base packages" + + result = subprocess.run( + [python_exe, "-m", "pip", "install", "--no-cache-dir", "--no-warn-script-location"] + self.base_packages, + capture_output=True, + text=True, + timeout=300 # 5 minutes for all base packages + ) + + if result.returncode != 0: + print(f"[AsyncVenv] Warning: Some packages failed to install: {result.stderr[:500]}") + # Try installing them one by one as fallback + for i, package in enumerate(self.base_packages): + task.current_package = package + task.progress = 40 + int((i / len(self.base_packages)) * 30) + subprocess.run( + [python_exe, "-m", "pip", "install", "--no-cache-dir", "--no-warn-script-location", package], + capture_output=True, + text=True, + timeout=120 + ) + + task.progress = 70 + task.message = "Installing LSP servers..." 
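The per-package fallback loops above map a package's index onto the progress window owned by its install phase (40–70% for base packages, 70–90% for LSP servers). A small sketch of that mapping, with the helper name `phase_progress` invented for illustration:

```python
# Each install phase owns a progress window; the i-th of `total` packages
# lands at the proportional point inside it (integer-truncated, so the
# window's upper bound is only reached when the phase completes).
def phase_progress(i: int, total: int, start: int, span: int) -> int:
    """Progress for the 0-based i-th package of `total` in a phase
    covering [start, start + span]."""
    return start + int((i / total) * span)

# Base-package phase: 7 packages across 40-70%
print(phase_progress(0, 7, 40, 30))  # -> 40
print(phase_progress(6, 7, 40, 30))  # -> 65, still below 70
# LSP phase: 2 packages across 70-90%
print(phase_progress(1, 2, 70, 20))  # -> 80
```

Truncation keeps reported progress monotone and strictly inside the phase window until the phase's final status update bumps it to the boundary.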
+ + # Step 4: Install LSP servers (70-90% progress) + task.status = VenvStatus.INSTALLING_LSP + task.current_package = "LSP servers" + + print(f"[AsyncVenv] Installing LSP servers for project {project_id}...") + + # Install both LSP servers together + result = subprocess.run( + [python_exe, "-m", "pip", "install", "--no-cache-dir", "--no-warn-script-location"] + self.lsp_packages, + capture_output=True, + text=True, + timeout=360 # 6 minutes for LSP servers (they can be large) + ) + + if result.returncode != 0: + print(f"[AsyncVenv] Warning: LSP installation had issues: {result.stderr[:500]}") + # Try installing them one by one as fallback + for i, package in enumerate(self.lsp_packages): + task.current_package = package + task.progress = 70 + int((i / len(self.lsp_packages)) * 20) + subprocess.run( + [python_exe, "-m", "pip", "install", "--no-cache-dir", "--no-warn-script-location", package], + capture_output=True, + text=True, + timeout=240 + ) + + # Step 5: Generate pyrightconfig.json (90-100% progress) + task.progress = 90 + task.message = "Generating configuration files..." 
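The configuration step writes a `pyrightconfig.json` that points Pyright at the project-local `venv` directory. A minimal sketch of the file shape — field values mirror the generator in this module, but the helper name `write_pyright_config` and the reduced key set are illustrative:

```python
# Write a minimal pyrightconfig.json into a project directory so Pyright
# resolves imports against the per-project venv rather than the executor's
# own environment. Key set trimmed for illustration.
import json
from pathlib import Path

def write_pyright_config(project_dir: Path) -> Path:
    config = {
        "venvPath": ".",            # resolved relative to the config file
        "venv": "venv",             # directory created by the venv manager
        "pythonVersion": "3.11",
        "typeCheckingMode": "standard",
        "include": ["."],
        "exclude": ["venv", "__pycache__", ".git"],
        "reportMissingImports": "warning",
    }
    path = project_dir / "pyrightconfig.json"
    path.write_text(json.dumps(config, indent=2))
    return path
```

Because `venvPath` is relative to the config file's location, the same generated file works regardless of where the projects root is mounted inside the container.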
+ self._generate_pyright_config(project_id) + + # Mark as completed + task.status = VenvStatus.COMPLETED + task.progress = 100 + task.message = "Virtual environment ready" + task.completed_at = datetime.now() + task.current_package = None + + print(f"[AsyncVenv] Successfully created venv for project {project_id}") + return True + + except subprocess.TimeoutExpired as e: + task.status = VenvStatus.FAILED + task.error = f"Timeout during venv creation: {str(e)}" + task.message = "Creation timed out" + task.completed_at = datetime.now() + print(f"[AsyncVenv] Timeout creating venv for project {project_id}: {e}") + return False + + except Exception as e: + task.status = VenvStatus.FAILED + task.error = str(e) + task.message = "Creation failed" + task.completed_at = datetime.now() + print(f"[AsyncVenv] Error creating venv for project {project_id}: {e}") + return False + + def _generate_pyright_config(self, project_id: str): + """Generate pyrightconfig.json for the project""" + try: + project_dir = self.projects_root / project_id + config_path = project_dir / "pyrightconfig.json" + + config = { + "include": ["."], + "venvPath": ".", + "venv": "venv", + "typeCheckingMode": "standard", + "reportMissingImports": "warning", + "reportMissingTypeStubs": "none", + "pythonVersion": "3.11", + "analysis": { + "autoImportCompletions": True, + "useLibraryCodeForTypes": True, + "autoSearchPaths": True, + "diagnosticMode": "workspace" + }, + "extraPaths": ["."] + } + + with open(config_path, 'w') as f: + json.dump(config, f, indent=2) + + print(f"[AsyncVenv] Generated pyrightconfig.json for project {project_id}") + except Exception as e: + print(f"[AsyncVenv] Error generating pyrightconfig.json: {e}") + + def cleanup_old_tasks(self, max_age_hours: int = 24): + """Clean up old completed or failed tasks""" + with self._lock: + now = datetime.now() + to_remove = [] + + for project_id, task in self.tasks.items(): + if task.completed_at: + age = (now - task.completed_at).total_seconds() / 
3600 + if age > max_age_hours: + to_remove.append(project_id) + + for project_id in to_remove: + del self.tasks[project_id] + + def delete_venv(self, project_id: str) -> None: + """Delete a project's virtual environment""" + venv_path = self.get_venv_path(project_id) + if venv_path.exists(): + import shutil + shutil.rmtree(venv_path) + print(f"Deleted virtual environment at {venv_path}") \ No newline at end of file diff --git a/packages/executor/app/main.py b/packages/executor/app/main.py new file mode 100644 index 0000000..b9d5b0a --- /dev/null +++ b/packages/executor/app/main.py @@ -0,0 +1,96 @@ +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware +from contextlib import asynccontextmanager +import asyncio +import logging +from app.api import venv, lsp, terminal, execute, code +from app.core.lsp_manager import lsp_manager + +logging.basicConfig(level=logging.INFO) +logger = logging.getLogger(__name__) + +@asynccontextmanager +async def lifespan(app: FastAPI): + """Manage application lifecycle""" + # Startup + logger.info("Starting AIM Red Toolkit Executor Service") + + # Ensure projects directory exists + import os + os.makedirs("/app/projects", exist_ok=True) + logger.info("Projects directory ready") + + # Start LSP idle collection task + idle_task = asyncio.create_task(lsp_manager.idle_collect()) + logger.info("Started LSP idle collection task") + + yield + + # Shutdown + logger.info("Shutting down AIM Red Toolkit Executor Service") + + # Cancel idle collection task + idle_task.cancel() + try: + await idle_task + except asyncio.CancelledError: + pass + + logger.info("Shutdown complete") + +app = FastAPI( + title="AIM Red Toolkit Executor Service", + version="1.0.0", + lifespan=lifespan +) + +# CORS middleware configuration +app.add_middleware( + CORSMiddleware, + allow_origins=[ + "http://localhost:5173", + "http://frontend:5173", + "http://localhost:3000", + "http://frontend:3000", + "http://localhost:8000", + "http://backend:8000", 
+ "http://localhost", + "http://frontend", + "http://backend" + ], + allow_credentials=True, + allow_methods=["*"], + allow_headers=["*"], +) + +@app.get("/") +async def root(): + return { + "service": "AIM Red Toolkit Executor", + "status": "healthy", + "features": { + "venv": "enabled", + "lsp": "enabled", + "terminal": "enabled", + "execution": "enabled" + } + } + +@app.get("/health") +async def health(): + return {"status": "healthy"} + +# Include API routers +app.include_router(venv.router, prefix="/api/venv") +app.include_router(code.router, prefix="/api/code") +app.include_router(execute.router, prefix="/api/execute") + +# LSP endpoints (WebSocket) +app.include_router(lsp.router, prefix="/api/lsp") + +# Terminal endpoint (WebSocket) +app.include_router(terminal.router, prefix="/api") + +if __name__ == "__main__": + import uvicorn + uvicorn.run(app, host="0.0.0.0", port=8001) \ No newline at end of file diff --git a/packages/executor/requirements.txt b/packages/executor/requirements.txt new file mode 100644 index 0000000..28491b8 --- /dev/null +++ b/packages/executor/requirements.txt @@ -0,0 +1,15 @@ +# FastAPI and server +fastapi==0.115.5 +uvicorn[standard]==0.32.1 +websockets==14.1 + +# Core dependencies +pydantic==2.10.3 +aiofiles==24.1.0 +python-multipart==0.0.12 + +# Development and testing +httpx==0.28.1 + +# For terminal support +ptyprocess==0.7.0 \ No newline at end of file diff --git a/packages/frontend/Dockerfile b/packages/frontend/Dockerfile index 9d109ed..f9ade50 100644 --- a/packages/frontend/Dockerfile +++ b/packages/frontend/Dockerfile @@ -12,9 +12,13 @@ COPY packages/frontend/package.json ./packages/frontend/ # Install dependencies RUN pnpm install --frozen-lockfile -# Copy frontend source +# Copy frontend source including public directory COPY packages/frontend ./packages/frontend +# Ensure public directory exists and has proper permissions +RUN mkdir -p /app/packages/frontend/public && \ + chmod -R 755 /app/packages/frontend/public + # 
Expose port EXPOSE 5173 diff --git a/packages/frontend/Dockerfile.prod b/packages/frontend/Dockerfile.prod deleted file mode 100644 index 55a6b79..0000000 --- a/packages/frontend/Dockerfile.prod +++ /dev/null @@ -1,33 +0,0 @@ -# Build stage -FROM node:20-alpine as builder - -WORKDIR /app - -# Install pnpm -RUN npm install -g pnpm - -# Copy root package files -COPY package.json pnpm-workspace.yaml pnpm-lock.yaml ./ -COPY packages/frontend/package.json ./packages/frontend/ - -# Install dependencies -RUN pnpm install --frozen-lockfile - -# Copy frontend source -COPY packages/frontend ./packages/frontend - -# Build the app -RUN pnpm --filter frontend build - -# Production stage -FROM nginx:alpine - -# Copy built files -COPY --from=builder /app/packages/frontend/dist /usr/share/nginx/html - -# Copy nginx configuration -COPY packages/frontend/nginx.conf /etc/nginx/conf.d/default.conf - -EXPOSE 80 - -CMD ["nginx", "-g", "daemon off;"] \ No newline at end of file diff --git a/packages/frontend/package.json b/packages/frontend/package.json index 79c3215..a90b018 100644 --- a/packages/frontend/package.json +++ b/packages/frontend/package.json @@ -10,16 +10,34 @@ "preview": "vite preview" }, "dependencies": { + "@codingame/monaco-vscode-api": "^20.2.1", + "@codingame/monaco-vscode-configuration-service-override": "^20.2.1", + "@codingame/monaco-vscode-keybindings-service-override": "^20.2.1", + "@codingame/monaco-vscode-languages-service-override": "^20.2.1", + "@codingame/monaco-vscode-model-service-override": "^20.2.1", + "@codingame/monaco-vscode-textmate-service-override": "^20.2.1", + "@codingame/monaco-vscode-theme-defaults-default-extension": "^20.2.1", + "@codingame/monaco-vscode-theme-service-override": "^20.2.1", "@monaco-editor/react": "^4.7.0", + "@vscode/vscode-languagedetection": "^1.0.22", + "@xterm/addon-fit": "^0.10.0", + "@xterm/addon-webgl": "^0.18.0", + "@xterm/xterm": "^5.5.0", "@xyflow/react": "^12.8.3", "clsx": "^2.1.1", - "monaco-editor": "^0.52.2", + 
"monaco-editor": "npm:@codingame/monaco-vscode-editor-api@^20.2.1", + "monaco-languageclient": "^9.11.0", + "vscode-languageclient": "^9.0.1", "react": "^19.1.1", "react-dom": "^19.1.1", "react-router-dom": "^7.8.0", + "vscode-oniguruma": "^2.0.1", + "vscode-textmate": "^9.2.0", + "vscode-ws-jsonrpc": "^3.5.0", "zustand": "^5.0.7" }, "devDependencies": { + "@codingame/esbuild-import-meta-url-plugin": "^1.0.3", "@eslint/js": "^9.33.0", "@tailwindcss/vite": "^4.1.11", "@types/react": "^19.1.10", diff --git a/packages/frontend/public/arrow-back.svg b/packages/frontend/public/arrow-back.svg new file mode 100644 index 0000000..30a3967 --- /dev/null +++ b/packages/frontend/public/arrow-back.svg @@ -0,0 +1,18 @@ + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/packages/frontend/src/App.tsx b/packages/frontend/src/App.tsx index 30c053c..acb0372 100644 --- a/packages/frontend/src/App.tsx +++ b/packages/frontend/src/App.tsx @@ -1,9 +1,14 @@ +import { useEffect } from "react"; import { BrowserRouter as Router, Routes, Route } from "react-router-dom"; import Home from "./pages/Home"; import Project from "./pages/Project/Project"; import WrongPath from "./pages/WrongPath/WrongPath"; function App() { + useEffect(() => { + console.log("[App] App component mounted successfully"); + }, []); + return ( diff --git a/packages/frontend/src/components/ErrorBoundary.tsx b/packages/frontend/src/components/ErrorBoundary.tsx new file mode 100644 index 0000000..624abaa --- /dev/null +++ b/packages/frontend/src/components/ErrorBoundary.tsx @@ -0,0 +1,74 @@ +import React, { Component, type ReactNode } from "react"; + +interface Props { + children: ReactNode; + fallback?: ReactNode; +} + +interface State { + hasError: boolean; + error?: Error; +} + +class ErrorBoundary extends Component { + constructor(props: Props) { + super(props); + this.state = { hasError: false }; + } + + static getDerivedStateFromError(error: Error): State { + console.error("[ErrorBoundary] 
Caught error:", error);
+    return { hasError: true, error };
+  }
+
+  componentDidCatch(error: Error, errorInfo: React.ErrorInfo) {
+    console.error("[ErrorBoundary] Error details:", {
+      error,
+      errorInfo,
+      componentStack: errorInfo.componentStack,
+    });
+  }
+
+  render() {
+    if (this.state.hasError) {
+      if (this.props.fallback) {
+        return this.props.fallback;
+      }
+
+      return (
+        <div>
+          <div>
+            <h2>Something went wrong</h2>
+            <p>
+              The application encountered an error. Please refresh the page to
+              try again.
+            </p>
+            {this.state.error && (
+              <details>
+                <summary>Error details</summary>
+                <pre>
+                  {this.state.error.toString()}
+                  {this.state.error.stack}
+                </pre>
+              </details>
+            )}
+          </div>
+ ); + } + + return this.props.children; + } +} + +export default ErrorBoundary; diff --git a/packages/frontend/src/components/buttons/ide/ExportCodeButton.tsx b/packages/frontend/src/components/buttons/ide/ExportCodeButton.tsx index 6a961b1..e99ada7 100644 --- a/packages/frontend/src/components/buttons/ide/ExportCodeButton.tsx +++ b/packages/frontend/src/components/buttons/ide/ExportCodeButton.tsx @@ -23,7 +23,7 @@ export default function ExportCodeButton({ return ( + + + + + ); + + return createPortal(modalContent, document.body); +}; + +export default DeleteCheck; \ No newline at end of file diff --git a/packages/frontend/src/components/modal/Ide.tsx b/packages/frontend/src/components/modal/Ide.tsx index ce8fd59..5c7421b 100644 --- a/packages/frontend/src/components/modal/Ide.tsx +++ b/packages/frontend/src/components/modal/Ide.tsx @@ -1,15 +1,23 @@ import React, { useRef, useEffect, useState, useCallback } from "react"; import Editor from "@monaco-editor/react"; import type { editor } from "monaco-editor"; -import RunCodeButton from "../buttons/ide/RunCodeButton"; -import ExportCodeButton from "../buttons/ide/ExportCodeButton"; +import type * as Monaco from "monaco-editor"; +import SimpleExportButton from "../buttons/ide/SimpleExportButton"; import LoadingModal from "./LoadingModal"; +import { codeApi } from "../../utils/api"; +import X from "../buttons/modal/x"; +import { pythonLspClient, type LSPConnection } from "../../lsp/pythonLspClient"; +import { + initializeMonacoServices, + createModel, + disposeModel, +} from "../../lsp/monacoSetup"; +import ProjectTerminal from "../terminal/ProjectTerminal"; interface IdeModalProps { isOpen: boolean; onClose: () => void; projectId: string; - projectTitle: string; nodeId: string; nodeTitle: string; initialCode?: string; @@ -19,7 +27,6 @@ const IdeModal: React.FC = ({ isOpen, onClose, projectId, - projectTitle, nodeId, nodeTitle, initialCode = "# Write your Python function here\ndef foo():\n return 'Hello, World!'", @@ 
-31,7 +38,38 @@ const IdeModal: React.FC = ({ ); const [code, setCode] = useState(initialCode); const [isLoadingCode, setIsLoadingCode] = useState(false); - console.log(projectTitle); + const [runModalOpen, setRunModalOpen] = useState(false); + const [runStatus, setRunStatus] = useState<"loading" | "success" | "error">( + "loading" + ); + const [runResult, setRunResult] = useState(""); + const [lspConnection, setLspConnection] = useState( + null + ); + const [isLspConnecting, setIsLspConnecting] = useState(false); + const [venvStatus, setVenvStatus] = useState< + "checking" | "creating" | "ready" | "error" + >("checking"); + const [venvProgress, setVenvProgress] = useState<{ + progress?: number; + message?: string; + current_package?: string | null; + }>({ + progress: 0, + message: "", + current_package: null + }); + const modelRef = useRef(null); + const [showTerminal, setShowTerminal] = useState(false); + + // Handle package changes from terminal + const handleTerminalPackageChanged = useCallback(async () => { + // Restart LSP to pick up changes + if (lspConnection) { + console.log("Restarting LSP after terminal package change"); + await lspConnection.restart(); + } + }, [lspConnection]); // Fetch code from backend when modal opens const fetchCode = useCallback(async () => { @@ -39,35 +77,34 @@ const IdeModal: React.FC = ({ setIsLoadingCode(true); try { - const response = await fetch("/api/code/getcode", { - method: "POST", - headers: { - "Content-Type": "application/json", - }, - body: JSON.stringify({ - project_id: projectId, - node_id: nodeId, - node_title: nodeTitle, - }), + const data = await codeApi.getNodeCode({ + project_id: projectId, + node_id: nodeId, + node_title: nodeTitle, }); - if (response.ok) { - const data = await response.json(); - if (data.code) { - setCode(data.code); - if (editorRef.current) { - editorRef.current.setValue(data.code); + if (data.success && data.code) { + setCode(data.code); + // Only update model if it exists, otherwise the 
code will be used when model is created + if (editorRef.current && modelRef.current) { + try { + // Update model value directly instead of setValue on editor + modelRef.current.setValue(data.code); + } catch (error) { + console.error("Error updating model value:", error); + // Fallback: just update the state, it will be used when editor mounts } } } } catch (error) { console.error("Error fetching code:", error); + // If fetch fails, keep the default code } finally { setIsLoadingCode(false); } }, [projectId, nodeId, nodeTitle]); - const handleSave = async () => { + const handleSave = useCallback(async () => { if (!editorRef.current) return; setSaveModalOpen(true); @@ -75,32 +112,221 @@ const IdeModal: React.FC = ({ const currentCode = editorRef.current.getValue(); try { - const response = await fetch("/api/code/savecode", { - method: "POST", - headers: { - "Content-Type": "application/json", - }, - body: JSON.stringify({ - project_id: projectId, - node_id: nodeId, - node_title: nodeTitle, - code: currentCode, - }), + const data = await codeApi.saveNodeCode({ + project_id: projectId, + node_id: nodeId, + node_title: nodeTitle, + code: currentCode, }); - if (response.ok) { + if (data.success) { setSaveStatus("success"); setCode(currentCode); + setTimeout(() => setSaveModalOpen(false), 1500); } else { setSaveStatus("error"); } } catch (error) { - console.error("Error saving code:", error); setSaveStatus("error"); + console.error("Error saving code:", error); } - }; + }, [projectId, nodeId, nodeTitle]); - // Fetch code when modal opens + const handleRunCode = useCallback(async () => { + if (!editorRef.current) return; + + setRunModalOpen(true); + setRunStatus("loading"); + setRunResult(""); + + const currentCode = editorRef.current.getValue(); + + try { + // First save the code + await codeApi.saveNodeCode({ + project_id: projectId, + node_id: nodeId, + node_title: nodeTitle, + code: currentCode, + }); + + // Then execute it + const result = await codeApi.executeNode({ 
+ project_id: projectId, + node_id: nodeId, + }); + + if (result.success) { + setRunStatus("success"); + setRunResult(JSON.stringify(result.output, null, 2)); + } else { + setRunStatus("error"); + setRunResult(result.error || "Execution failed"); + } + } catch (error) { + setRunStatus("error"); + setRunResult(error instanceof Error ? error.message : "Unknown error"); + } + }, [projectId, nodeId, nodeTitle]); + + // Initialize Monaco services on first mount + useEffect(() => { + initializeMonacoServices().catch(console.error); + }, []); + + // Check venv status only when modal opens + useEffect(() => { + let retryTimeout: NodeJS.Timeout | null = null; + let isCancelled = false; + + const checkVenvStatus = async () => { + if (!isOpen || !projectId || isCancelled) return; + + try { + setVenvStatus("checking"); + const { projectApi } = await import("../../utils/api"); + const venvStatusResult = await projectApi.getVenvStatus(projectId); + + if (isCancelled) return; + + console.log("Venv status check result:", venvStatusResult); + + // Update progress information + if (venvStatusResult.progress !== undefined) { + setVenvProgress({ + progress: venvStatusResult.progress || 0, + message: venvStatusResult.message || "", + current_package: venvStatusResult.current_package || null, + }); + } + + if (!venvStatusResult.venv_ready) { + // Check the detailed status + if (venvStatusResult.status === "not_started") { + console.log("Virtual environment not started, initiating creation..."); + setVenvStatus("creating"); + + // Start venv creation + try { + const createResult = await projectApi.createVenv(projectId); + console.log("Venv creation initiated:", createResult); + + // Poll for status after starting creation + if (!isCancelled) { + retryTimeout = setTimeout(() => { + checkVenvStatus(); + }, 2000); + } + } catch (createError) { + console.error("Failed to initiate venv creation:", createError); + setVenvStatus("error"); + } + return; + } else if ( + venvStatusResult.status 
=== "creating" || + venvStatusResult.status === "installing_pip" || + venvStatusResult.status === "installing_base" || + venvStatusResult.status === "installing_lsp" + ) { + console.log( + `Virtual environment ${venvStatusResult.status}: ${venvStatusResult.message}` + ); + setVenvStatus("creating"); + // Poll again after 3 seconds for venv creation progress + if (!isCancelled) { + retryTimeout = setTimeout(() => { + checkVenvStatus(); + }, 3000); + } + return; + } else if (venvStatusResult.status === "completed") { + // Explicitly handle completed status + console.log("Virtual environment creation completed"); + setVenvStatus("ready"); + return; + } else if (venvStatusResult.status === "failed") { + console.error( + "Virtual environment creation failed:", + venvStatusResult.error + ); + setVenvStatus("error"); + return; + } + } else { + // Venv is ready + setVenvStatus("ready"); + } + } catch (error) { + console.error("Failed to check venv status:", error); + if (!isCancelled) { + setVenvStatus("error"); + } + } + }; + + if (isOpen && projectId) { + checkVenvStatus(); + } + + return () => { + isCancelled = true; + if (retryTimeout) { + clearTimeout(retryTimeout); + } + }; + }, [isOpen, projectId]); + + // Connect to LSP when venv is ready (separate effect) + useEffect(() => { + let connectTimeout: NodeJS.Timeout | null = null; + + const connectLSP = async () => { + if (!isOpen || !projectId || lspConnection || isLspConnecting || venvStatus !== "ready") { + return; + } + + try { + setIsLspConnecting(true); + console.log("Connecting to LSP..."); + const connection = await pythonLspClient.connect(projectId); + setLspConnection(connection); + console.log("LSP connected successfully"); + } catch (error) { + console.error("Failed to connect to LSP:", error); + // Retry LSP connection after 5 seconds, but only if modal is still open + if (isOpen) { + connectTimeout = setTimeout(() => { + connectLSP(); + }, 5000); + } + } finally { + setIsLspConnecting(false); + } + }; + + 
if (venvStatus === "ready") { + connectLSP(); + } + + return () => { + if (connectTimeout) { + clearTimeout(connectTimeout); + } + }; + }, [isOpen, projectId, venvStatus, lspConnection, isLspConnecting]); + + // Cleanup LSP connection when modal closes + useEffect(() => { + return () => { + if (!isOpen && lspConnection) { + lspConnection.dispose(); + setLspConnection(null); + setVenvStatus("checking"); + } + }; + }, [isOpen, lspConnection]); + + // Fetch code and packages when modal opens useEffect(() => { if (isOpen) { fetchCode(); @@ -135,9 +361,113 @@ const IdeModal: React.FC = ({ }; }, [isOpen, onClose]); - const handleEditorDidMount = (editor: editor.IStandaloneCodeEditor) => { - editorRef.current = editor; - }; + // Ctrl+S / Cmd+S save shortcut + useEffect(() => { + const handleKeyDown = (e: KeyboardEvent) => { + // Ctrl+S (Windows/Linux) or Cmd+S (Mac) + if ((e.ctrlKey || e.metaKey) && e.key === "s") { + e.preventDefault(); // Prevent browser's default save dialog + handleSave(); + } + }; + + if (isOpen) { + document.addEventListener("keydown", handleKeyDown); + } + + return () => { + document.removeEventListener("keydown", handleKeyDown); + }; + }, [isOpen, handleSave]); + + const handleEditorDidMount = useCallback( + (editor: editor.IStandaloneCodeEditor, monaco: typeof Monaco) => { + editorRef.current = editor; + + // Create a model for the file with proper URI that matches executor's file system + // Sanitize the title the same way the backend does + const sanitizedTitle = nodeTitle.replace(/[^a-zA-Z0-9]/g, "_"); + const fileUri = `file:///app/projects/${projectId}/${nodeId}_${sanitizedTitle}.py`; + + // Dispose old model if exists + if (modelRef.current) { + disposeModel(modelRef.current); + } + + // Create new model with current code + const model = createModel(code, "python", fileUri); + modelRef.current = model; + editor.setModel(model); + + // Focus editor to trigger LSP document sync + editor.focus(); + + // LSP will handle all language features 
(diagnostics, hover, completion, etc.) + + // Configure editor options + editor.updateOptions({ + automaticLayout: true, + minimap: { enabled: true }, + scrollBeyondLastLine: false, + fontSize: 14, + lineNumbers: "on", + roundedSelection: false, + cursorStyle: "line", + glyphMargin: true, + quickSuggestions: { + other: true, + comments: false, + strings: false, + }, + wordBasedSuggestions: "currentDocument", + suggestOnTriggerCharacters: true, + parameterHints: { + enabled: true, + }, + suggest: { + snippetsPreventQuickSuggestions: false, + showMethods: true, + showFunctions: true, + showVariables: true, + showClasses: true, + showModules: true, + showKeywords: true, + showSnippets: true, + insertMode: "replace", + }, + tabSize: 4, + insertSpaces: true, + formatOnType: true, + formatOnPaste: true, + autoIndent: "full", + folding: true, + foldingStrategy: "indentation", + }); + + // Add Python-specific keybindings + editor.addAction({ + id: "python-run-code", + label: "Run Python Code", + keybindings: [monaco.KeyCode.F5], + contextMenuGroupId: "navigation", + contextMenuOrder: 1.5, + run: () => { + handleRunCode(); + }, + }); + }, + [code, projectId, nodeId, nodeTitle, handleRunCode] + ); + + // Cleanup model on unmount + useEffect(() => { + return () => { + if (modelRef.current) { + disposeModel(modelRef.current); + modelRef.current = null; + } + }; + }, []); if (!isOpen) return null; @@ -147,117 +477,180 @@ const IdeModal: React.FC = ({ onClick={onClose} >
e.stopPropagation()} > - {/* Header */} -
- {/* Left side - Action buttons */} -
- - - -
- - {/* Center - Title */} -

+
+

{nodeTitle} + {isLoadingCode && ( + (Loading...) + )} + {venvStatus === "checking" && ( + + (Checking environment...) + + )} + {venvStatus === "creating" && ( +
+ + {venvProgress.message || "Setting up Python environment..."} + + {venvProgress.progress !== undefined && venvProgress.progress > 0 && ( +
+
+
+ )} + {venvProgress.current_package && ( + + ({venvProgress.current_package}) + + )} +
+ )} + {venvStatus === "error" && ( + (Environment error) + )} + {venvStatus === "ready" && !isLoadingCode && ( + ✓ Ready + )}

- - {/* Right side - Close button */} - +
- {/* Editor Container */} -
- {isLoadingCode ? ( -
- Loading -
Loading code...
-
- ) : ( +
+
setCode(value || "")} + loading={ +
+ Loading editor... +
+ } options={{ + automaticLayout: true, minimap: { enabled: true }, fontSize: 14, - lineNumbers: "on", - roundedSelection: false, - scrollBeyondLastLine: false, - readOnly: false, - automaticLayout: true, - wordWrap: "on", - scrollbar: { - vertical: "visible", - horizontal: "visible", - verticalScrollbarSize: 10, - horizontalScrollbarSize: 10, - }, - padding: { - top: 10, - bottom: 10, - }, }} /> +
+ + {showTerminal && ( +
+ +
)}
-
- setSaveModalOpen(false)} - notice={{ - loading: "Saving...", - success: "Saved Successfully", - error: "Fail to Save", - }} - /> +
+
+ + + +
+
+ +
+
+ + {/* LSP Status Indicator */} +
+ {isLspConnecting && ( + + + Connecting to language server... + + )} + {lspConnection && !isLspConnecting && ( + + + Language server connected + + )} + {!lspConnection && !isLspConnecting && ( + + + Language server offline + + )} +
+ + {/* Save Modal */} + setSaveModalOpen(false)} + /> + + {/* Run Modal */} + setRunModalOpen(false)} + /> +

  );
};
diff --git a/packages/frontend/src/components/modal/LoadingModal.tsx b/packages/frontend/src/components/modal/LoadingModal.tsx
index 3d2dd11..ea41478 100644
--- a/packages/frontend/src/components/modal/LoadingModal.tsx
+++ b/packages/frontend/src/components/modal/LoadingModal.tsx
@@ -1,9 +1,11 @@
 import React, { useEffect } from "react";
+import { createPortal } from "react-dom";
 
 interface Notice {
   loading: string;
   success: string;
   error: string;
+  errorDetails?: string;
 }
 
 interface LoadingModalProps {
@@ -20,7 +22,7 @@ export default function LoadingModal({
   notice,
 }: LoadingModalProps) {
   useEffect(() => {
-    if (isOpen && status !== "loading") {
+    if (isOpen && status === "success") {
       const timer = setTimeout(() => {
         onClose();
       }, 2000);
@@ -40,13 +42,17 @@ export default function LoadingModal({
     }
   };
 
-  return (
+  const modalContent = (
e.stopPropagation()} > {status === "loading" && ( @@ -102,12 +108,33 @@ export default function LoadingModal({ />
- + {notice.error} + {notice.errorDetails && ( +
+
+

+ Error Details: +

+
+                    {notice.errorDetails}
+                  
+
+ +
+ )}
        )}
     );
+
+  // Use portal to render modal at document root level
+  return createPortal(modalContent, document.body);
 }
diff --git a/packages/frontend/src/components/modal/Modal.tsx b/packages/frontend/src/components/modal/Modal.tsx
deleted file mode 100644
index 4616406..0000000
--- a/packages/frontend/src/components/modal/Modal.tsx
+++ /dev/null
@@ -1,84 +0,0 @@
-import React, { useEffect } from "react";
-
-interface ModalProps {
-  isOpen: boolean;
-  onClose: () => void;
-  children?: React.ReactNode;
-}
-
-const Modal: React.FC<ModalProps> = ({ isOpen, onClose, children }) => {
-  useEffect(() => {
-    if (isOpen) {
-      // Prevent body scroll when modal is open
-      document.body.style.overflow = "hidden";
-    } else {
-      document.body.style.overflow = "unset";
-    }
-
-    return () => {
-      document.body.style.overflow = "unset";
-    };
-  }, [isOpen]);
-
-  useEffect(() => {
-    const handleEscape = (e: KeyboardEvent) => {
-      if (e.key === "Escape" && isOpen) {
-        onClose();
-      }
-    };
-
-    if (isOpen) {
-      document.addEventListener("keydown", handleEscape);
-    }
-
-    return () => {
-      document.removeEventListener("keydown", handleEscape);
-    };
-  }, [isOpen, onClose]);
-
-  if (!isOpen) return null;
-
-  return (
-
e.stopPropagation()} - > - -
{children}
-
-
-  );
-};
-
-export default Modal;
diff --git a/packages/frontend/src/components/modal/NodeMaker.tsx b/packages/frontend/src/components/modal/ProjectMaker.tsx
similarity index 59%
rename from packages/frontend/src/components/modal/NodeMaker.tsx
rename to packages/frontend/src/components/modal/ProjectMaker.tsx
index 0359574..3618a4e 100644
--- a/packages/frontend/src/components/modal/NodeMaker.tsx
+++ b/packages/frontend/src/components/modal/ProjectMaker.tsx
@@ -1,12 +1,14 @@
 import { useEffect, useState } from "react";
 import { useNavigate } from "react-router-dom";
+import { projectApi } from "../../utils/api";
+import X from "../buttons/modal/x";
 
-interface NodeMakerProps {
+interface ProjectMakerProps {
   isOpen: boolean;
-  onClose?: () => void;
+  onClose: () => void;
 }
 
-export default function NodeMaker({ isOpen, onClose }: NodeMakerProps) {
+export default function ProjectMaker({ isOpen, onClose }: ProjectMakerProps) {
   const navigate = useNavigate();
   const [projectName, setProjectName] = useState("");
   const [projectDescription, setProjectDescription] = useState("");
@@ -28,33 +30,21 @@
     const projectId = crypto.randomUUID();
 
     try {
-      const response = await fetch("/api/project/make", {
-        method: "POST",
-        headers: {
-          "Content-Type": "application/json",
-        },
-        body: JSON.stringify({
-          project_name: projectName.trim(),
-          project_description: projectDescription.trim() || "",
-          project_id: projectId,
-        }),
+      const data = await projectApi.createProject({
+        project_name: projectName.trim(),
+        project_description: projectDescription.trim() || "",
+        project_id: projectId,
       });
 
-      if (!response.ok) {
-        const errorData = await response.json();
-        throw new Error(errorData.detail || "Failed to create project");
-      }
-
-      const data = await response.json();
-
       if (data.success) {
-        // Navigate to the newly created project
-        navigate(`/project/${projectId}`);
-
+        // Close modal immediately if onClose is provided
         if (onClose) {
           onClose();
         }
+
+        // Navigate to the newly created project immediately
+        // The venv will be created in the background
+        navigate(`/project/${projectId}`);
       } else {
         throw new Error("Failed to create project");
       }
@@ -83,42 +73,18 @@
   return (
e.stopPropagation()} > - -
+
+ +
+ +
)} -
+
diff --git a/packages/frontend/src/components/modal/SetupModal.tsx b/packages/frontend/src/components/modal/SetupModal.tsx new file mode 100644 index 0000000..7bd76bd --- /dev/null +++ b/packages/frontend/src/components/modal/SetupModal.tsx @@ -0,0 +1,244 @@ +import { useEffect, useState } from "react"; +import X from "../buttons/modal/x"; + +interface SetupModalProps { + isOpen: boolean; + onClose: () => void; + onConfirm: (data: { + title: string; + description: string; + nodeType: "custom" | "start" | "result"; + }) => void; + hasStartNode: boolean; +} + +export default function SetupModal({ + isOpen, + onClose, + onConfirm, + hasStartNode, +}: SetupModalProps) { + const [title, setTitle] = useState(""); + const [description, setDescription] = useState(""); + const [nodeType, setNodeType] = useState<"custom" | "start" | "result">( + "custom" + ); + const [error, setError] = useState(""); + + useEffect(() => { + if (isOpen) { + // Prevent body scroll when modal is open + document.body.style.overflow = "hidden"; + } else { + document.body.style.overflow = "unset"; + } + + return () => { + document.body.style.overflow = "unset"; + }; + }, [isOpen]); + + useEffect(() => { + const handleEscape = (e: KeyboardEvent) => { + if (e.key === "Escape" && isOpen) { + onClose(); + } + }; + + if (isOpen) { + document.addEventListener("keydown", handleEscape); + } + + return () => { + document.removeEventListener("keydown", handleEscape); + }; + }, [isOpen, onClose]); + + if (!isOpen) return null; + + const handleConfirm = () => { + // For Start and Result nodes, use fixed titles + const finalTitle = + nodeType === "start" + ? "Start" + : nodeType === "result" + ? "Result" + : title.trim(); + + const finalDescription = + nodeType === "start" + ? "Flow entry point" + : nodeType === "result" + ? 
"Flow result output" + : description.trim(); + + // Validation only for custom nodes + if (nodeType === "custom" && !title.trim()) { + setError("Node title is required"); + return; + } + + // Check if trying to create another start node + if (nodeType === "start" && hasStartNode) { + setError("Only one Start node is allowed per project"); + return; + } + + onConfirm({ + title: finalTitle, + description: finalDescription, + nodeType, + }); + + // Reset form + setTitle(""); + setDescription(""); + setNodeType("custom"); + setError(""); + onClose(); + }; + + const handleClose = () => { + // Reset form on close + setTitle(""); + setDescription(""); + setNodeType("custom"); + setError(""); + onClose(); + }; + + return ( +
+
e.stopPropagation()} + > + {/* Header with close button */} +
+
+ +
+ +

+ Create New Node +

+ + {error && ( +
+ {error} +
+ )} +
+ + {/* Scrollable content area */} +
+
+
+ + +

+ {nodeType === "custom" && "Standard node for code execution"} + {nodeType === "start" && + "Entry point for flow execution (only one allowed)"} + {nodeType === "result" && + "Node for displaying execution results"} +

+
+ + {/* Only show title/description for custom nodes */} + {nodeType === "custom" ? ( + <> +
+ + { + setTitle(e.target.value); + setError(""); + }} + className="w-full px-3 py-2 bg-neutral-800 border border-neutral-700 rounded text-white focus:outline-none focus:border-red-500" + placeholder="Enter node title" + autoFocus + /> +
+ +
+ +