A web-based file explorer for browsing and filtering Xenium experiment files stored in AWS S3. This application provides an intuitive interface for searching `experiment.xenium` files with advanced filtering capabilities.
This workflow is currently hosted from `/opt` on boxer.
- Real-time Filtering: Filter files by source type, project, block ID, panel, run ID, and creation date
- Search Functionality: Full-text search across file paths and metadata
- S3 Integration: Direct access to S3 URIs with one-click copy functionality
- Responsive Design: Clean, modern interface with Nord theme styling
- Docker Support: Containerized deployment with nginx
- Cache System: Efficient file metadata caching for fast browsing
- Frontend: Vanilla HTML/CSS/JavaScript with responsive design
- Backend: nginx serving static files
- Data Source: AWS S3 bucket (`cholab-xenium-explorer-storage`)
- Cache: JSON-based file metadata cache updated via a Python script
- Deployment: Docker containerization with volume mounts
- Docker and Docker Compose
- AWS CLI configured with appropriate S3 permissions
- Python 3.x (for cache generation)
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd xenium-explorer-files
   ```

2. Generate the initial cache:

   ```bash
   python3 bin/generate_xenium_cache.py
   ```

3. Start the application:

   ```bash
   docker-compose up -d
   ```

4. Access the web interface: open http://localhost:9753 in your browser.
```
xenium-explorer-files/
├── assets/
│   ├── css/styles.css           # Nord theme styling
│   └── js/main.js               # Frontend JavaScript logic
├── bin/
│   └── generate_xenium_cache.py # Cache generation script
├── logs/                        # Application logs (mounted volume)
├── docker-compose.yml           # Container orchestration
├── Dockerfile                   # Container build instructions
├── index.html                   # Main web interface
├── nginx.conf                   # nginx configuration
├── xenium_cache.json            # File metadata cache (generated)
└── README.md
```
The application uses a JSON cache file to store metadata about all experiment.xenium files in the S3 bucket.
```bash
python3 bin/generate_xenium_cache.py
```

This script:
- Scans the entire S3 bucket for experiment.xenium files
- Extracts metadata from file paths (project, block ID, panel, run ID, etc.)
- Generates `xenium_cache.json` with structured file information
- Logs operations to `logs/xenium_cache.log`
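The scan itself is not reproduced in this README; conceptually, using boto3's `list_objects_v2` paginator, it might look like the sketch below. The function name and returned fields are illustrative, not the script's actual API (in real use, pass in `boto3.client("s3")`):

```python
def scan_for_experiments(s3_client, bucket):
    """Yield experiment.xenium objects from the bucket, one page at a time.

    s3_client is expected to behave like boto3.client("s3").
    """
    paginator = s3_client.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            # Keep only the experiment files; everything else is ignored.
            if obj["Key"].endswith("experiment.xenium"):
                yield {
                    "path": obj["Key"],
                    "size": str(obj["Size"]),
                    "s3_uri": f"s3://{bucket}/{obj['Key']}",
                }
```

Because the client is injected, this piece can be exercised without AWS credentials by passing a stub object.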
```json
{
  "last_updated": "2025-01-15T10:30:00",
  "bucket": "cholab-xenium-explorer-storage",
  "file_count": 150,
  "files": [
    {
      "date": "2025-01-15",
      "time": "10:30:00",
      "size": "1024",
      "s3_uri": "s3://bucket/path/experiment.xenium",
      "path": "path/to/experiment.xenium",
      "metadata": {
        "project": "project-name",
        "brp_id": "12345A",
        "panel": "TUQ97N",
        "run_id": "run-123",
        "source_type": "sopa",
        "full_experiment_id": "12345A-TUQ97N-EA_2025-02-04-0920",
        "created_date": "2025-02-04"
      }
    }
  ]
}
```

- Port: Application runs on port 9753
- Volumes: `xenium_cache.json` mounted for live updates; `logs/` directory for persistent logging
- Restart Policy: Always restart on failure
- Cache Control: Caching disabled for `xenium_cache.json`
- File Size Limit: 10MB maximum request body size
- Error Handling: Custom error pages
1. Install dependencies:

   ```bash
   # No additional dependencies required for the frontend
   # The Python script requires: boto3, aws-cli
   ```

2. Generate a test cache:

   ```bash
   python3 bin/generate_xenium_cache.py
   ```

3. Serve locally:

   ```bash
   # Using Python's built-in server
   python3 -m http.server 8000
   ```
The cache can be updated without rebuilding the container:

```bash
# Update cache
python3 bin/generate_xenium_cache.py

# The cache is automatically available to the running container via the volume mount
```

- Source Type: Filter by 'sopa' or 'oba' outputs
- Project: Filter by project name
- Block ID: Filter by experiment block identifier
- Panel: Filter by panel identifier
- Run ID: Filter by OBA run identifier
- Created Date: Filter by creation date (SOPA files only)
- Search: Full-text search across all file paths
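Client-side, each of these filters amounts to a simple predicate check over the cached entries. The actual logic lives in `assets/js/main.js`; the sketch below shows the idea in Python for brevity (function name and parameter set are illustrative):

```python
def filter_files(files, source_type=None, project=None, search=None):
    """Filter cached entries the way the UI does (illustrative sketch)."""
    matched = []
    for entry in files:
        meta = entry.get("metadata", {})
        # Each active filter must match; inactive filters (None) are skipped.
        if source_type and meta.get("source_type") != source_type:
            continue
        if project and meta.get("project") != project:
            continue
        # Full-text search is a case-insensitive substring match on the path.
        if search and search.lower() not in entry.get("path", "").lower():
            continue
        matched.append(entry)
    return matched
```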
The system automatically extracts metadata from S3 file paths:
- SOPA path pattern: `*/sopa/project/experiment-id/experiment.xenium`; extracts project, block ID, panel, and creation date
- OBA path pattern: `*/oba-outputs/panel/run-id/experiment.xenium`; extracts panel, run ID, and block ID from the output folder
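The extraction rules in `bin/generate_xenium_cache.py` are not reproduced here, but the two patterns above can be captured with regular expressions along these lines (group names follow the cache format in this README; the script's actual parsing may differ):

```python
import re

# SOPA outputs: */sopa/project/experiment-id/experiment.xenium
_SOPA = re.compile(r".*/sopa/(?P<project>[^/]+)/(?P<experiment_id>[^/]+)/experiment\.xenium$")
# OBA outputs: */oba-outputs/panel/run-id/experiment.xenium
_OBA = re.compile(r".*/oba-outputs/(?P<panel>[^/]+)/(?P<run_id>[^/]+)/experiment\.xenium$")

def extract_metadata(path):
    """Classify an S3 key as sopa/oba and pull out its path components."""
    m = _SOPA.match(path)
    if m:
        return {"source_type": "sopa", **m.groupdict()}
    m = _OBA.match(path)
    if m:
        return {"source_type": "oba", **m.groupdict()}
    return None  # not a recognized experiment path
```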
Application logs are stored in the logs/ directory:
- `xenium_cache.log`: Cache generation operations
- Container logs: available via `docker-compose logs`
- Cache not loading
  - Verify `xenium_cache.json` exists
  - Check file permissions
  - Review nginx logs
- S3 access errors
  - Verify AWS credentials
  - Check S3 bucket permissions
  - Ensure the bucket name is correct
- Container startup issues
  - Check that port 9753 is available
  - Verify the Docker daemon is running
  - Review `docker-compose logs`
If the cache becomes corrupted or outdated:
```bash
# Stop container
docker-compose down

# Regenerate cache
python3 bin/generate_xenium_cache.py

# Restart container
docker-compose up -d
```