
feat(fhir): NASS-1883: Allow SENAITE to specify a custom reference range for a lab test#9126

Open
rohan-bes wants to merge 11 commits into main from feat/nass-1883-senaite-custom-reference-range

Conversation


rohan-bes (Collaborator) commented Feb 6, 2026

Changes

We're now allowing individual LabTests to have a reference range that overrides the default defined in the LabTestType. This range will only be set if we're receiving lab results from SENAITE, which provides more accurate ranges than the ones in Tamanu.
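For context, FHIR carries these ranges in the standard Observation.referenceRange element, so an incoming SENAITE result might include a fragment like the following (values are illustrative, not from this PR):

```json
{
  "resourceType": "Observation",
  "status": "final",
  "referenceRange": [
    {
      "low": { "value": 3.5 },
      "high": { "value": 5.2 }
    }
  ]
}
```

The materialise step extracts low.value and high.value from this element into the new referenceRangeMin and referenceRangeMax fields on the matching LabTest.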

Deploys

  • Deploy to Tamanu Internal

Tests

  • Run E2E Tests

Remember to...

  • ...write or update tests
  • ...add UI screenshots and testing notes to the Linear issue
  • ...add any manual upgrade steps to the Linear issue
  • ...update the config reference, settings reference, or any relevant runbook(s)
  • ...call out additions or changes to config files for the deployment team to take note of

@gemini-code-assist

Summary of Changes

Hello @rohan-bes, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request enables the system to handle more granular lab test reference ranges by allowing individual lab tests to store and display their own specific reference range values. This is particularly useful for integrating with external systems like SENAITE, which can provide more accurate and context-specific ranges. The changes involve updating the database schema, enhancing the FHIR Observation processing logic to capture these ranges, and adjusting the user interface to correctly present them.

Highlights

  • Custom Reference Range Storage: Introduced new database columns (reference_range_min, reference_range_max) to the lab_tests table and updated the LabTest model to store specific reference ranges for individual lab tests.
  • FHIR Observation Processing: Enhanced the FhirObservation model to accept and process referenceRange data from incoming FHIR Observation resources, extracting and persisting the low and high values to the corresponding LabTest fields.
  • Frontend Display Logic: Modified the frontend components (LabResultsPrintout.jsx, PatientLabTestsTable.jsx, LabRequestResultsTable.jsx) and utility functions (labTests.ts) to prioritize and display these new custom LabTest reference ranges over the default LabTestType ranges where available.
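The override precedence described in this last point can be sketched roughly as follows. This is a minimal illustration, not the real implementation: the helper name, the flat min/max shape of labTestType, and the behaviour for a partial override are all assumptions; the actual functions in packages/utils/src/labTests.ts also handle units.

```javascript
// Sketch: prefer the per-test (SENAITE-provided) bounds when either is set,
// otherwise fall back to the LabTestType defaults.
function resolveReferenceRange(labTestType, labTest) {
  const min = labTest?.referenceRangeMin ?? null;
  const max = labTest?.referenceRangeMax ?? null;
  if (min !== null || max !== null) {
    return { min, max };
  }
  return { min: labTestType?.min ?? null, max: labTestType?.max ?? null };
}
```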


Changelog
  • packages/central-server/tests/hl7fhir/materialised/Observation.test.js
    • Added a new test case to verify that LabTest reference range fields are correctly updated when an FHIR Observation with referenceRange is posted.
  • packages/database/src/migrations/1770200000000-addLabTestReferenceRangeColumns.ts
    • Added a new migration to introduce reference_range_min and reference_range_max columns (nullable DOUBLE) to the lab_tests table.
  • packages/database/src/models/LabTest.ts
    • Declared referenceRangeMin and referenceRangeMax properties in the LabTest model.
    • Configured referenceRangeMin and referenceRangeMax as nullable DOUBLE data types in the model definition.
  • packages/database/src/models/fhir/FhirObservation.ts
    • Added referenceRange property to the FhirObservation model to store an array of low/high value objects.
    • Configured referenceRange as a JSONB data type in the model definition.
    • Updated the Yup schema to validate the structure of the referenceRange property.
    • Modified the materialise method to extract low.value and high.value from the incoming FHIR Observation's referenceRange and update the LabTest's referenceRangeMin and referenceRangeMax fields accordingly.
  • packages/facility-server/app/routes/apiv1/patient/patientRelations.js
    • Modified the SQL query to include reference_range_min and reference_range_max in the JSON output for lab test results, making them accessible to the frontend.
  • packages/shared/src/utils/patientCertificates/LabResultsPrintout.jsx
    • Updated the getReferenceRangeWithUnit utility call to pass the individual labTest object, allowing it to use specific reference range overrides for printouts.
  • packages/utils/src/labTests.ts
    • Introduced a new type LabTestReferenceRangeOverride for defining reference range overrides.
    • Modified the GetReferenceRangeProps interface to accept an optional labTest object for reference range overrides.
    • Updated getReferenceRange and getReferenceRangeWithUnit functions to prioritize labTest.referenceRangeMin/Max if provided, over the default labTestType ranges.
  • packages/web/app/views/patients/PatientLabTestsTable.jsx
    • Adjusted the logic for determining the normalRange in the patient lab tests table, giving precedence to referenceRangeMin/Max from the individual lab test result.
    • Updated validation criteria for RangeValidatedCell to handle nullable min values more robustly.
  • packages/web/app/views/patients/components/LabRequestResultsTable.jsx
    • Updated the getReferenceRange utility call to pass the individual labTest object, allowing it to use specific reference range overrides in the lab request results table.
Activity
  • The pull request description indicates that 'Deploy to Tamanu Internal' and 'Run E2E Tests' are part of the planned activities or checks for this change.

Comment on lines +4 to +11
await query.addColumn('lab_tests', 'reference_range_min', {
  type: DataTypes.DOUBLE,
  allowNull: true,
});
await query.addColumn('lab_tests', 'reference_range_max', {
  type: DataTypes.DOUBLE,
  allowNull: true,
});
Collaborator Author


No need to rebuild the lookup table if just adding new columns that can be null
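For reference, a reversible version of this migration might look like the sketch below. Hedged: only the two addColumn calls above come from the diff; the module shape and the down() rollback are assumptions following the usual Sequelize pattern (this requires a live database, so it is untested here).

```typescript
import { DataTypes, QueryInterface } from 'sequelize';

export async function up(query: QueryInterface): Promise<void> {
  // Nullable columns, so no backfill or lookup-table rebuild is needed.
  await query.addColumn('lab_tests', 'reference_range_min', {
    type: DataTypes.DOUBLE,
    allowNull: true,
  });
  await query.addColumn('lab_tests', 'reference_range_max', {
    type: DataTypes.DOUBLE,
    allowNull: true,
  });
}

export async function down(query: QueryInterface): Promise<void> {
  // Assumed rollback: drop the columns in reverse order.
  await query.removeColumn('lab_tests', 'reference_range_max');
  await query.removeColumn('lab_tests', 'reference_range_min');
}
```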


gemini-code-assist Bot left a comment


Code Review

This pull request effectively implements the feature to allow custom reference ranges for lab tests, overriding the defaults from LabTestType. The changes are well-structured, spanning from the database migration and model updates to the backend processing logic and frontend display. The inclusion of a new test case for the Observation resource ensures the new logic is covered.

I have one suggestion to improve code conciseness in FhirObservation.ts. Overall, great work!

Comment on lines +178 to +185
const lowValue =
  first?.low != null && typeof first.low === 'object' && 'value' in first.low
    ? first.low.value
    : null;
const highValue =
  first?.high != null && typeof first.high === 'object' && 'value' in first.high
    ? first.high.value
    : null;

Severity: medium

The logic to extract lowValue and highValue is a bit verbose. Since the data is validated against a yup schema before this method is called, we can be confident about the shape of first.low and first.high. You can simplify this using optional chaining and the nullish coalescing operator, which would make the code more concise and readable.

const lowValue = first?.low?.value ?? null;
const highValue = first?.high?.value ?? null;
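As a quick sanity check (not part of the PR), the two forms can be compared directly; they agree for every shape the yup schema permits, and would differ only for an explicit `value: undefined`, which validated data should never contain:

```javascript
// Verbose extraction as written in materialise().
const extractVerbose = first => ({
  low:
    first?.low != null && typeof first.low === 'object' && 'value' in first.low
      ? first.low.value
      : null,
  high:
    first?.high != null && typeof first.high === 'object' && 'value' in first.high
      ? first.high.value
      : null,
});

// Concise form suggested above.
const extractConcise = first => ({
  low: first?.low?.value ?? null,
  high: first?.high?.value ?? null,
});

// Compare across representative inputs.
const cases = [undefined, {}, { low: {} }, { low: { value: 3.5 }, high: { value: 5.2 } }];
for (const c of cases) {
  console.assert(
    JSON.stringify(extractVerbose(c)) === JSON.stringify(extractConcise(c)),
    'forms should agree',
  );
}
```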


github-actions Bot commented Feb 9, 2026

🍹 up on tamanu-on-k8s/bes/tamanu-on-k8s/feat-nass-1883-senaite-custom-reference-range

Pulumi report
   Updating (feat-nass-1883-senaite-custom-reference-range)

View Live: https://app.pulumi.com/bes/tamanu-on-k8s/feat-nass-1883-senaite-custom-reference-range/updates/12

Downloading plugin random-4.19.0: starting
Downloading plugin random-4.19.0: done
Installing plugin random-4.19.0: starting
Installing plugin random-4.19.0: done

@ Updating....
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running 
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read kubernetes:core/v1:Namespace tamanu-feat-nass-1883-senaite-custom-reference-range
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read pulumi:pulumi:StackReference bes/k8s-core/tamanu-internal-main
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read pulumi:pulumi:StackReference bes/k8s-core/tamanu-internal-main
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read pulumi:pulumi:StackReference bes/core/tamanu-internal
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read pulumi:pulumi:StackReference bes/core/tamanu-internal
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read kubernetes:core/v1:Namespace tamanu-feat-nass-1883-senaite-custom-reference-range
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Waiting for central-db...
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Waiting for facility-1-db...
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Waiting for facility-2-db...
~  kubernetes:apps/v1:Deployment facility-2-web updating (0s) [diff: ~spec]
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read kubernetes:core/v1:ConfigMap actual-provisioning
+  kubernetes:gateway.networking.k8s.io/v1:Gateway central creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-2 creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-1 creating (0s) 
~  kubernetes:apps/v1:Deployment patient-portal-web updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-web updating (0s) [diff: ~spec]
+  kubernetes:gateway.networking.k8s.io/v1:Gateway patient-portal creating (0s) 
~  kubernetes:apps/v1:Deployment facility-1-web updating (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job central-migrator creating replacement (0s) [diff: ~spec]
+  kubernetes:gateway.networking.k8s.io/v1:Gateway patient-portal creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway patient-portal creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway patient-portal created (0.74s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy patient-portal-traffic-policy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway central creating (1s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway central creating (1s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway central created (1s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy central-traffic-policy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-1 creating (1s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-1 creating (1s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-1 created (1s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-2 creating (1s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-2 creating (1s) 
+  kubernetes:gateway.networking.k8s.io/v1:Gateway facility-2 created (1s) 
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running read kubernetes:core/v1:ConfigMap actual-provisioning
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-frontend creating (0s) 
++ kubernetes:batch/v1:Job central-migrator creating replacement (1s) [diff: ~spec]; 
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Secret facility-1-db-superuser not found or not ready: Error: HTTP-Code: 404
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Message: Unknown API Status Code!
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-1-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-1-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Headers: {"audit-id":"79041c23-6321-465a-bb13-ef3a10791480","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Mon, 06 Apr 2026 23:53:58 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Secret facility-2-db-superuser not found or not ready: Error: HTTP-Code: 404
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Message: Unknown API Status Code!
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-2-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-2-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Headers: {"audit-id":"d186a636-f921-4cef-b641-0a68f21c07ce","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Mon, 06 Apr 2026 23:53:58 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Secret central-db-superuser not found or not ready: Error: HTTP-Code: 404
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Message: Unknown API Status Code!
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"central-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"central-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range running Headers: {"audit-id":"afe77244-d154-4f92-89ce-2045a51099da","cache-control":"no-cache, private","connection":"close","content-length":"214","content-type":"application/json","date":"Mon, 06 Apr 2026 23:53:58 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy patient-portal-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy patient-portal-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy patient-portal-traffic-policy created (0.69s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api-legacy created (0.72s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-frontend created (0.72s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute patient-portal-api created (0.76s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-1-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy central-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy central-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy central-traffic-policy created (0.44s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api-legacy created (0.54s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-api created (0.60s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute central-frontend created (0.60s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api-legacy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-2-traffic-policy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-frontend created (0.66s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api created (0.66s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-1-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-1-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-1-traffic-policy created (0.72s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api-legacy created (0.73s) 
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api-legacy creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-1-api-legacy created (0.89s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-2-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-2-traffic-policy creating (0s) 
+  kubernetes:gateway.envoyproxy.io/v1alpha1:ClientTrafficPolicy facility-2-traffic-policy created (0.86s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-api created (0.90s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-frontend creating (0s) 
+  kubernetes:gateway.networking.k8s.io/v1:HTTPRoute facility-2-frontend created (0.86s) 
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-migrator creating replacement (2s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-migrator-296bda60" to start
++ kubernetes:batch/v1:Job central-migrator creating replacement (2s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-migrator-296bda60" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-1-migrator-5cf9f500" to start
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-1-migrator-5cf9f500" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-2-migrator-54f6c34a" to start
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-2-migrator-54f6c34a" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
~  kubernetes:apps/v1:Deployment facility-2-web updating (3s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment central-web updating (3s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment patient-portal-web updating (3s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment facility-1-web updating (3s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment facility-1-web updating (17s) [diff: ~spec]; Waiting for app ReplicaSet to be available (1/2 Pods available)
~  kubernetes:apps/v1:Deployment central-web updating (18s) [diff: ~spec]; Waiting for app ReplicaSet to be available (1/2 Pods available)
~  kubernetes:apps/v1:Deployment facility-2-web updating (19s) [diff: ~spec]; Waiting for app ReplicaSet to be available (1/2 Pods available)
~  kubernetes:apps/v1:Deployment patient-portal-web updating (21s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment patient-portal-web updating (21s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment patient-portal-web updated (21s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-web updating (23s) [diff: ~spec]; warning: [Pod tamanu-feat-nass-1883-senaite-custom-reference-range/central-web-5ced474e-86b8cb74d9-2rxnt]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment facility-1-web updating (23s) [diff: ~spec]; warning: [Pod tamanu-feat-nass-1883-senaite-custom-reference-range/facility-1-web-562d3307-7d46776bcf-p69hv]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment facility-1-web updating (29s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-web updating (29s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-web updated (29s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-web updating (29s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-web updating (29s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-web updating (29s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-web updating (29s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-web updated (30s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-web updated (29s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (31s) [diff: ~spec]; warning: [Pod tamanu-feat-nass-1883-senaite-custom-reference-range/facility-2-migrator-54f6c34a-c9vjr]: Container "migrator" completed with exit code 0
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (31s) [diff: ~spec]; warning: [Pod tamanu-feat-nass-1883-senaite-custom-reference-range/facility-1-migrator-5cf9f500-8trdb]: Container "migrator" completed with exit code 0
++ kubernetes:batch/v1:Job central-migrator creating replacement (33s) [diff: ~spec]; warning: [Pod tamanu-feat-nass-1883-senaite-custom-reference-range/central-migrator-296bda60-vnm9h]: Container "migrator" completed with exit code 0
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (33s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-2-migrator-54f6c34a" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (33s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-1-migrator-5cf9f500" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job central-migrator creating replacement (35s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-migrator-296bda60" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (33s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-2-migrator-54f6c34a" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (33s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-2-migrator created replacement (33s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job facility-2-migrator replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job facility-2-migrator replaced (0.00s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-sync updating (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (33s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/facility-1-migrator-5cf9f500" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (33s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-tasks updating (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job facility-1-migrator created replacement (33s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-api updating (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job central-migrator creating replacement (36s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-migrator-296bda60" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job central-migrator creating replacement (36s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-migrator created replacement (36s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-migrator replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job facility-1-migrator replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job facility-1-migrator replaced (0.00s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-migrator replaced (0.01s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-tasks updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment facility-1-sync updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment facility-1-api updating (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job central-provisioner creating replacement (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job central-provisioner creating replacement (0s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-provisioner creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-provisioner-82c7dd81" to start
++ kubernetes:batch/v1:Job central-provisioner creating replacement (1s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-provisioner-82c7dd81" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
~  kubernetes:apps/v1:Deployment facility-2-api updating (1s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment facility-2-sync updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-sync updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-sync updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-tasks updating (2s) [diff: ~spec]; warning: Replicas scaled to 0 for Deployment "facility-2-tasks-db9e64d1"
~  kubernetes:apps/v1:Deployment facility-2-tasks updating (2s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-tasks updating (2s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-tasks updated (2s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-sync updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-sync updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-sync updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-tasks updating (2s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-tasks updating (2s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-tasks updated (2s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-api updating (2s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-api updating (2s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-api updated (2s) [diff: ~spec]; 
@ Updating.........
~  kubernetes:apps/v1:Deployment facility-2-api updating (8s) [diff: ~spec]; Waiting for app ReplicaSet to be available (1/2 Pods available)
@ Updating....
++ kubernetes:batch/v1:Job central-provisioner creating replacement (8s) [diff: ~spec]; warning: [Pod tamanu-feat-nass-1883-senaite-custom-reference-range/central-provisioner-82c7dd81-qtnk6]: Container "provisioner" completed with exit code 0
@ Updating....
++ kubernetes:batch/v1:Job central-provisioner creating replacement (10s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-provisioner-82c7dd81" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
@ Updating....
++ kubernetes:batch/v1:Job central-provisioner creating replacement (10s) [diff: ~spec]; Waiting for Job "tamanu-feat-nass-1883-senaite-custom-reference-range/central-provisioner-82c7dd81" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job central-provisioner creating replacement (10s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-provisioner created replacement (10s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-provisioner replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-provisioner replaced (0.00s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-api updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-tasks updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-fhir-refresh updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment facility-2-api updating (11s) [diff: ~spec]; warning: [Pod tamanu-feat-nass-1883-senaite-custom-reference-range/facility-2-api-596b956c7f-kgjtm]: containers with unready status: [server]
@ Updating....
~  kubernetes:apps/v1:Deployment central-api updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-api updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-api updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-refresh updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-fhir-refresh updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-resolver updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-refresh updated (1s) [diff: ~spec]; 
@ Updating....
~  kubernetes:apps/v1:Deployment central-tasks updating (1s) [diff: ~spec]; warning: Replicas scaled to 0 for Deployment "central-tasks-52ee2e05"
~  kubernetes:apps/v1:Deployment central-tasks updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-tasks updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-tasks updated (1s) [diff: ~spec]; 
@ Updating............
~  kubernetes:apps/v1:Deployment facility-2-api updating (21s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-api updating (21s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-api updated (22s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-provisioner deleting original (0s) [diff: ~spec]; 
@ Updating....
-- kubernetes:batch/v1:Job central-provisioner deleting original (0s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-provisioner deleted original (0.59s) [diff: ~spec]; 
-  kubernetes:networking.k8s.io/v1:Ingress bare-domain deleting (0s) 
-  kubernetes:networking.k8s.io/v1:Ingress patient-portal deleting (0s) 
-  kubernetes:networking.k8s.io/v1:Ingress facility-1 deleting (0s) 
-  kubernetes:networking.k8s.io/v1:Ingress facility-2 deleting (0s) 
-  kubernetes:networking.k8s.io/v1:Ingress central deleting (0s) 
-- kubernetes:batch/v1:Job facility-1-migrator deleting original (0s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-migrator deleting original (0s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-2-migrator deleting original (0s) [diff: ~spec]; 
@ Updating....
-  kubernetes:networking.k8s.io/v1:Ingress bare-domain deleting (1s) 
-  kubernetes:networking.k8s.io/v1:Ingress bare-domain deleted (1s) 
-- kubernetes:batch/v1:Job facility-1-migrator deleting original (1s) [diff: ~spec]; Job Completed. succeeded: 1/1
-- kubernetes:batch/v1:Job facility-1-migrator deleting original (1s) [diff: ~spec]; Resource scheduled for deletion
-- kubernetes:batch/v1:Job facility-2-migrator deleting original (1s) [diff: ~spec]; Job Completed. succeeded: 1/1
-- kubernetes:batch/v1:Job facility-2-migrator deleting original (1s) [diff: ~spec]; Resource scheduled for deletion
-  kubernetes:networking.k8s.io/v1:Ingress patient-portal deleting (1s) 
-  kubernetes:networking.k8s.io/v1:Ingress patient-portal deleted (1s) 
-  kubernetes:networking.k8s.io/v1:Ingress facility-1 deleting (1s) 
-  kubernetes:networking.k8s.io/v1:Ingress facility-1 deleted (1s) 
-  kubernetes:networking.k8s.io/v1:Ingress central deleting (1s) Resource scheduled for deletion
-  kubernetes:networking.k8s.io/v1:Ingress facility-2 deleting (1s) Resource scheduled for deletion
@ Updating....
-- kubernetes:batch/v1:Job facility-2-migrator deleting original (2s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-2-migrator deleted original (2s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-1-migrator deleting original (2s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-1-migrator deleted original (2s) [diff: ~spec]; 
-  kubernetes:networking.k8s.io/v1:Ingress central deleting (2s) 
-  kubernetes:networking.k8s.io/v1:Ingress central deleted (2s) 
-  kubernetes:networking.k8s.io/v1:Ingress facility-2 deleting (2s) 
-  kubernetes:networking.k8s.io/v1:Ingress facility-2 deleted (2s) 
@ Updating....
-- kubernetes:batch/v1:Job central-migrator deleting original (2s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-migrator deleted original (2s) [diff: ~spec]; 
   pulumi:pulumi:Stack tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range  15 messages
Diagnostics:
 pulumi:pulumi:Stack (tamanu-on-k8s-feat-nass-1883-senaite-custom-reference-range):
   Waiting for central-db...
   Waiting for facility-1-db...
   Waiting for facility-2-db...

   Secret facility-1-db-superuser not found or not ready: Error: HTTP-Code: 404
   Message: Unknown API Status Code!
   Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-1-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-1-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   Headers: {"audit-id":"79041c23-6321-465a-bb13-ef3a10791480","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Mon, 06 Apr 2026 23:53:58 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   Secret facility-2-db-superuser not found or not ready: Error: HTTP-Code: 404
   Message: Unknown API Status Code!
   Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-2-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-2-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   Headers: {"audit-id":"d186a636-f921-4cef-b641-0a68f21c07ce","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Mon, 06 Apr 2026 23:53:58 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   Secret central-db-superuser not found or not ready: Error: HTTP-Code: 404
   Message: Unknown API Status Code!
   Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"central-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"central-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   Headers: {"audit-id":"afe77244-d154-4f92-89ce-2045a51099da","cache-control":"no-cache, private","connection":"close","content-length":"214","content-type":"application/json","date":"Mon, 06 Apr 2026 23:53:58 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}

   [Pulumi Neo] Would you like help with these diagnostics?
   https://app.pulumi.com/bes/tamanu-on-k8s/feat-nass-1883-senaite-custom-reference-range/updates/12?explainFailure

Outputs:
   urls: {
       Central      : "https://central.feat-nass-1883-senaite-custom-reference-range.cd.tamanu.app"
       Facility-1   : "https://facility-1.feat-nass-1883-senaite-custom-reference-range.cd.tamanu.app"
       Facility-2   : "https://facility-2.feat-nass-1883-senaite-custom-reference-range.cd.tamanu.app"
       PatientPortal: "https://portal.feat-nass-1883-senaite-custom-reference-range.cd.tamanu.app"
   }

Resources:
   + 20 created
   ~ 14 updated
   - 5 deleted
   +-4 replaced
   43 changes. 44 unchanged

Duration: 1m9s


Base automatically changed from epic-senaite-results to main February 10, 2026 03:08
@rohan-bes rohan-bes force-pushed the feat/nass-1883-senaite-custom-reference-range branch from 884b149 to 0b485a6 on February 10, 2026 03:34
Contributor

@tcodling tcodling left a comment

Lgtm!

Honestly the only thing for me is that the term referenceRange doesn't seem super clear to someone unfamiliar with this feature/FHIR 🤔 I'd maybe add some more comments around the logic using this and how it relates to integrations if this was me, but that's probably me being a bit overly cautious
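For anyone unfamiliar with the feature, the override behaviour can be illustrated with a small sketch. All names below (resolveReferenceRange, referenceRangeMin, rangeMin, etc.) are hypothetical and not the actual Tamanu model; the idea is just that a range supplied by SENAITE on an individual LabTest takes precedence over the LabTestType default:

```javascript
// Hypothetical sketch: resolve the reference range for a lab test,
// preferring a per-test override (e.g. one supplied by SENAITE) and
// falling back to the defaults on the lab test type. Field names are
// illustrative, not the real Tamanu schema.
function resolveReferenceRange(labTest, labTestType) {
  const hasOverride =
    labTest.referenceRangeMin != null || labTest.referenceRangeMax != null;
  if (hasOverride) {
    // A custom range was stored on the individual test result.
    return { min: labTest.referenceRangeMin, max: labTest.referenceRangeMax };
  }
  // No override: use the defaults defined on the test type.
  return { min: labTestType.rangeMin, max: labTestType.rangeMax };
}

// SENAITE supplied a custom range for this result:
const withOverride = resolveReferenceRange(
  { referenceRangeMin: 3.5, referenceRangeMax: 5.1 },
  { rangeMin: 3.0, rangeMax: 6.0 },
);
console.log(withOverride); // { min: 3.5, max: 5.1 }

// No override, so the test-type default applies:
const withDefault = resolveReferenceRange({}, { rangeMin: 3.0, rangeMax: 6.0 });
console.log(withDefault); // { min: 3, max: 6 }
```

A comment along these lines next to the real lookup would probably address the readability concern above.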

@rohan-bes rohan-bes closed this Mar 9, 2026
@rohan-bes rohan-bes reopened this Mar 9, 2026
@rohan-bes rohan-bes closed this Mar 10, 2026
@rohan-bes rohan-bes reopened this Mar 10, 2026
@rohan-bes rohan-bes closed this Mar 19, 2026
@rohan-bes rohan-bes reopened this Mar 19, 2026