25 changes: 25 additions & 0 deletions CONTRIBUTING.md
@@ -17,6 +17,31 @@ project documentation. If you cannot find the documentation you're
looking for, please file a GitHub issue with details of what
you'd like to see documented.

## Firebase Overview

cellPACK Studio reads recipe data from Firebase and writes user edits back. The backend handles uploading recipes and running packing jobs.

### Collections Used by cellPACK Studio

| Collection | What It Does | Access |
|------------|--------------|---------------|
| `example_packings` | Maps recipes to UI display names and editable fields | Read |
| `editable_fields` | Defines which recipe fields users can edit | Read |
| `recipes` | Full recipe data with references | Read |
| `objects` | Object definitions (molecules, organelles) | Read |
| `gradients` | Gradient definitions for spatial distributions | Read |
| `composition` | Composition definitions | Read |
| `recipes_edited` | User-modified recipes | Write |
| `job_status` | Packing job progress | Poll (read) |

> **Note:** Collections like `configs` and `results` are managed by the backend. For the complete database schema, see [FIREBASE_SCHEMA.md](FIREBASE_SCHEMA.md).
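
For orientation, here is a minimal sketch of this read/write pattern using the Firebase Web SDK (v9+ modular API). The collection names come from the table above; the config values, document ID, and fields are placeholders rather than this project's actual setup.

```ts
import { initializeApp } from "firebase/app";
import { getFirestore, collection, getDocs, doc, setDoc } from "firebase/firestore";

// Placeholder config -- substitute the values from your own Firebase project.
const app = initializeApp({ apiKey: "...", projectId: "my-dev-project" });
const db = getFirestore(app);

async function demo() {
  // Read: list example packings, e.g. to populate the recipe dropdown.
  const snapshot = await getDocs(collection(db, "example_packings"));
  const packings = snapshot.docs.map((d) => ({ id: d.id, ...d.data() }));
  console.log(packings);

  // Write: store a user-edited recipe under recipes_edited.
  await setDoc(doc(db, "recipes_edited", "example-edited-recipe"), {
    name: "example_recipe_edited", // placeholder fields
    timestamp: Date.now(),
  });
}

demo().catch(console.error);
```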

### Getting Access

- **Development:** Create your own Firebase project in test mode. See [Firebase Firestore tutorial](https://firebase.google.com/docs/firestore).
- **Staging:** Contact the code owner for credentials, then configure your `.env` file (see the README and the configuration sketch below).
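
The pieces usually come together roughly like this (a sketch only; the environment variable names are illustrative, and how they are exposed to the client depends on the bundler, so check the README for the names this project actually uses):

```ts
import { initializeApp } from "firebase/app";
import { getFirestore } from "firebase/firestore";

// Illustrative variable names -- see the README for the real ones.
const app = initializeApp({
  apiKey: process.env.FIREBASE_API_KEY,
  authDomain: process.env.FIREBASE_AUTH_DOMAIN,
  projectId: process.env.FIREBASE_PROJECT_ID,
});

export const db = getFirestore(app);
```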


## How to Contribute

Typical steps to contribute:
19 changes: 19 additions & 0 deletions FIREBASE_SCHEMA.md
@@ -0,0 +1,19 @@
# Firebase Schema Documentation

This document describes the Firestore database schema used by cellPACK. Collections are accessed by the client (cellPACK Studio), the core package (backend), or both.

## Collections

| Collection Name | Purpose | Key Fields | Read | Write | Cleanup Policy | Related Process | Notes |
| -------------------- | ----------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------- | ------------------------------------------------------------ | --------------------------------------------- | ---------------------------------------------------------------- | ----------------------------------------------- |
| **recipes** | Stores full recipe definitions | `name`, `id`, `dedup_hash`, `description`, `version`, `format_version`, `bounding_box`, `composition`, `objects`, `gradients`, `optional_gradients` | Recipe loading, UI display, user edit comparison, packing execution | Backend uploads via admin CLI tool | **No cleanup** - Permanent storage | Recipe loading, reference resolution, comparison with user edits | `optional_gradients` manually added |
| **recipes_edited** | Stores user-modified recipes temporarily | Same as `recipes` + `recipe_path`, `timestamp` | Backend retrieval for packing | User submits modified recipe (differs from default) | **24 hours** - cleaned if older than 24 hours | User recipe submission, packing job submission | Unnested format, no reference resolution needed |
| **configs** | Stores packing configuration settings | Referenced by path (e.g., `firebase:configs/{id}`) | Building packing job request, backend reads during packing | Backend admin uploads via CLI tool | **No cleanup** - Permanent storage | Packing job configuration, submitted to Docker server | Referenced but not directly queried by frontend |
| **objects** | Stores object definitions (molecules, organelles) | `name`, `id`, `dedup_hash`, `type`, `color`, `packing_mode`, `place_method`, `radius`, `inherit`, `gradient`, `representations`, `jitter_attempts` | Recipe reference resolution, inheritance chain resolution, packing execution | Backend admin uploads with deduplication | **No cleanup** - Permanent storage | Reference resolution in recipes, inheritance chain resolution | |
| **gradients** | Stores gradient definitions for spatial distributions | `name`, `id`, `dedup_hash`, `description`, `mode`, `mode_settings`, `weight_mode`, `pick_mode`, `reversed`, `invert` | Recipe/object reference resolution, editable field options | Backend admin uploads with deduplication | **No cleanup** - Permanent storage | Reference resolution, gradient options for editable fields | Can be combined (multiple per object) |
| **composition** | Stores composition definitions (what objects go where) | `name`, `id`, `dedup_hash`, `count`, `molarity`, `object`, `inherit`, `regions` (interior, surface, leaflets) | Recipe reference resolution, region-based placement | Backend admin uploads with topological sort | **No cleanup** - Permanent storage | Reference resolution, region-based object placement | |
| **job_status** | Tracks AWS packing job status | `status`, `error_message`, `outputs_directory`, `result_path`, `timestamp` | Frontend polling (every 500 ms; see the sketch below the table), progress monitoring, result retrieval | Backend updates during job execution (RUNNING → DONE/FAILED) | **24 hours** - cleaned if older than 24 hours | Job monitoring, status polling, result retrieval | |
| **example_packings** | Maps example recipes to configs and defines UI metadata | `name` (display name), `recipe` (recipe ID), `config` (config ID), `editable_fields` (array of field IDs), `result_path` | App startup, recipe dropdown population | Backend admin adds an entry for each recipe that should be packable in Studio | **No cleanup** - Permanent storage | App initialization, recipe dropdown population, UI configuration | |
| **editable_fields** | Defines which recipe fields users can edit in UI | `id`, `name`, `data_type`, `input_type`, `description`, `path`, `min`, `max`, `options`, `gradient_options`, `conversion_factor`, `unit` | App initialization, form generation, input validation | Defined by users through upload script | **No cleanup** - Permanent storage | Dynamic form generation, input validation, gradient options | **Details for upload** - [documentation](https://github.com/mesoscope/cellpack/blob/main/docs/STUDIO_SITE.md#cellpack-studio-site) |
| **results** | Stores Simularium result metadata for auto-opening (backend only) | `batch_job_id`, `timestamp`, `url` (Simularium result S3 URL), `user` | Backend cleanup validation (S3 URL check) | Successful packing completion with Simularium output | **180 days** - Cleaned if S3 URL inaccessible | Result archiving, user history tracking, result URL storage | Cleanup validates S3 existence |
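
As a concrete illustration of the `job_status` polling described in the table, here is a minimal client-side sketch assuming the Firebase Web SDK (v9+ modular API). The collection name, field names, status values, and 500 ms interval come from the table; the function itself is illustrative.

```ts
import { doc, getDoc, Firestore } from "firebase/firestore";

// Poll job_status until the job leaves the RUNNING state.
async function waitForJob(db: Firestore, jobId: string): Promise<string> {
  while (true) {
    const snap = await getDoc(doc(db, "job_status", jobId));
    const status = snap.data()?.status;
    if (status === "DONE" || status === "FAILED") {
      return status;
    }
    // The frontend polls roughly every 500 ms (see table above).
    await new Promise((resolve) => setTimeout(resolve, 500));
  }
}
```

A real client would typically add a timeout and surface `error_message` when the job fails.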

3 changes: 3 additions & 0 deletions README.md
@@ -27,6 +27,9 @@ This client interacts with the cellPACK server, which consists of a variety of b
* **ECR**: The Docker image built from the [cellPACK github repo](https://github.com/mesoscope/cellpack) is published to the `cellpack-private` ECR repository. That image defines the container specifications in which the batch job will run.
* **CloudWatch**: Logs from each AWS Batch job are written to CloudWatch. These logs can be accessed via the GET /logs endpoint.

### cellPACK Database
* **Firebase Firestore**: The cellPACK database is hosted in Firebase Firestore. This database stores all recipes, objects, gradients, compositions, packing configurations, job statuses, and results metadata. See [CONTRIBUTING.md](CONTRIBUTING.md#firebase-overview) for a Firebase overview and [FIREBASE_SCHEMA.md](FIREBASE_SCHEMA.md) for the complete database schema.

#### Resources
* [Server Architecture Overview Diagram](https://docs.google.com/presentation/d/1eG2XCxgYNaoDIYI-M6Tzef17bGuFirZZhmfZlBFTaXc/edit#slide=id.g26c8fd413da_0_34)
* [AWS Batch Dashboard](https://us-west-2.console.aws.amazon.com/batch/home?region=us-west-2#)