
fix(import): TAMOC-386: Apply same unique restrictions on user admin panel to user reference data import #9256

Closed

dannash100 wants to merge 9 commits into main from fix/tamoc-386-user-import-unique

Conversation

@dannash100 (Contributor) commented Mar 1, 2026

Changes

Deploys

  • Deploy to Tamanu Internal

Tests

  • Run E2E Tests

Review Hero

  • Run Review Hero
  • Auto-fix review suggestions: wait for Review Hero to finish, resolve any comments you disagree with or want to fix manually, then check this to auto-fix the rest.
  • Auto-fix CI failures: check this to auto-fix lint errors, test failures, and other CI issues.

Remember to...

  • ...write or update tests
  • ...add UI screenshots and testing notes to the Linear issue
  • ...add any manual upgrade steps to the Linear issue
  • ...update the config reference, settings reference, or any relevant runbook(s)
  • ...call out additions or changes to config files for the deployment team to take note of

@gemini-code-assist (Contributor)

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances data integrity by enforcing consistent uniqueness constraints for user emails and display names across both the admin panel and reference data import functionalities. It centralizes validation logic into a reusable utility, streamlining user management and preventing duplicate entries from various sources. Additionally, it includes a cleanup of unused invoice product loading features, contributing to a more focused and maintainable codebase.

Highlights

  • User Uniqueness Validation Utility: A new utility function, checkUserUniqueness, has been introduced to centralize the logic for verifying the uniqueness of user emails and display names, including case-insensitive checks for display names and exclusion of the current user during updates.
  • Integration into Admin Panel User Management: The checkUserUniqueness utility has been integrated into the user creation, update, and uniqueness check endpoints within the admin panel, replacing previous redundant and duplicated validation logic.
  • User Reference Data Import Validation: User uniqueness checks have been extended to the reference data import process. The userLoader now utilizes checkUserUniqueness to prevent importing users with duplicate emails or display names, and validateTableRows includes a new validateUsers function to detect duplicates within an import batch.
  • Reference Data Importer Cleanup: The invoiceProductLoader and related constants (DRUG_STOCK_STATUSES, INVOICE_ITEMS_CATEGORIES, INVOICE_ITEMS_CATEGORIES_MODELS) have been removed from the reference data importer, simplifying the codebase.
  • Standardized Error Reporting: The pushError function calls across various loaders in the reference data importer have been simplified by removing the redundant model name argument, leading to more consistent error reporting.
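The uniqueness semantics described above can be sketched in plain JavaScript. The real checkUserUniqueness queries the User model via Sequelize, so this in-memory version is illustrative only; the function shape and field names here are assumptions:

```javascript
// Illustrative sketch of the checkUserUniqueness semantics. The real utility
// queries the database; here the "users" argument stands in for existing
// User rows so the checks are easy to see in isolation.
function checkUserUniqueness(users, { email, displayName, excludeId }) {
  // During updates, the user being edited must not count as a duplicate.
  const others = users.filter(u => u.id !== excludeId);
  return {
    // Email is compared exactly; displayName case-insensitively.
    email: others.some(u => u.email === email),
    displayName: others.some(
      u => u.displayName.toLowerCase() === displayName.toLowerCase(),
    ),
  };
}

const existing = [{ id: 'u1', email: 'jo@example.com', displayName: 'Jo Smith' }];

// A new user reusing the display name (in any casing) is flagged.
console.log(checkUserUniqueness(existing, {
  email: 'new@example.com',
  displayName: 'JO SMITH',
})); // → { email: false, displayName: true }

// Updating u1 with their own details is not a conflict.
console.log(checkUserUniqueness(existing, {
  email: 'jo@example.com',
  displayName: 'Jo Smith',
  excludeId: 'u1',
})); // → { email: false, displayName: false }
```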


Changelog
  • packages/central-server/tests/admin/userUniqueness.test.js
    • Added comprehensive unit tests for the new checkUserUniqueness utility.
    • Added tests for validateTableRows to ensure correct detection of duplicate user emails and display names within an import batch.
  • packages/central-server/app/admin/importer/validateTableRows.js
    • Added a new validateUsers function to perform uniqueness checks for user emails and display names within a batch of imported rows.
    • Integrated validateUsers into the MODEL_VALIDATION object to apply these checks during user data imports.
  • packages/central-server/app/admin/referenceDataImporter/loaders.js
    • Imported the new checkUserUniqueness utility.
    • Removed unused constants related to drug stock statuses and invoice item categories.
    • Updated permissionLoader, taskSetLoader, medicationTemplateLoader, and procedureTypeLoader to use a simplified pushError signature.
    • Removed the order field from LabTestPanelLabTestTypes in labTestPanelLoader.
    • Integrated checkUserUniqueness into userLoader to validate user email and display name uniqueness against existing users.
    • Added an id field to UserFacility objects created in userLoader.
    • Refactored drugLoader to remove complex logic for handling facility-specific drug stock and related error handling.
    • Removed the entire invoiceProductLoader function.
  • packages/central-server/app/admin/userValidation.js
    • Added a new file containing the checkUserUniqueness asynchronous utility function for validating user email and display name uniqueness.
  • packages/central-server/app/admin/users.js
    • Imported the new checkUserUniqueness utility.
    • Refactored the user creation (POST /users), uniqueness check (GET /users/check-uniqueness), and user update (PUT /users/:id) endpoints to utilize checkUserUniqueness for email and display name validation, replacing previous inline logic.
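The in-batch check that validateUsers performs can be sketched as a single pass over the rows. The function name and error format below are hypothetical, but the idea matches the behaviour described above: flag any row whose email or display name already appeared earlier in the same import batch.

```javascript
// Hypothetical sketch of detecting duplicate emails / display names within
// one import batch (the real validateUsers lives in validateTableRows.js and
// its exact signature may differ).
function findBatchDuplicates(rows) {
  const errors = [];
  const seenEmails = new Map(); // lowercased email -> first row index
  const seenNames = new Map();  // lowercased displayName -> first row index

  rows.forEach((row, i) => {
    const email = row.email?.toLowerCase();
    const name = row.displayName?.toLowerCase();

    if (email !== undefined) {
      if (seenEmails.has(email)) {
        errors.push(`Row ${i}: email "${row.email}" duplicates row ${seenEmails.get(email)}`);
      } else {
        seenEmails.set(email, i);
      }
    }
    if (name !== undefined) {
      if (seenNames.has(name)) {
        errors.push(`Row ${i}: display name "${row.displayName}" duplicates row ${seenNames.get(name)}`);
      } else {
        seenNames.set(name, i);
      }
    }
  });

  return errors;
}

console.log(findBatchDuplicates([
  { email: 'a@x.com', displayName: 'Ann' },
  { email: 'A@X.com', displayName: 'Ben' },
]));
// → [ 'Row 1: email "A@X.com" duplicates row 0' ]
```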
Activity
  • No human activity has been recorded on this pull request yet.

@gemini-code-assist Bot left a comment

Code Review

This pull request introduces a centralized function checkUserUniqueness to enforce unique emails and display names for users, and applies it to the user import process and user admin panel APIs. This is a great refactoring that reduces code duplication. I've left a comment to improve the new uniqueness check by making email validation case-insensitive.

However, this PR also includes significant, undocumented changes, such as the removal of the invoiceProductLoader and a drastic simplification of the drugLoader, which removes functionality related to facility-specific drug stock. These changes are outside the scope of the PR's title and should be explained in the description. I've added comments requesting clarification on these removals.

I am having trouble creating individual review comments, so my feedback is collected below.

packages/central-server/app/admin/referenceDataImporter/loaders.js (796-866)

Severity: high

The invoiceProductLoader function has been completely removed. Similar to the changes in drugLoader, this is a significant modification to the importer's functionality that is not mentioned in the pull request description. Could you please add some context about this removal?

packages/central-server/app/admin/userValidation.js (6-8)

Severity: medium

The email uniqueness check is case-sensitive, which is likely not the desired behavior. Emails are generally treated as case-insensitive. For consistency with the displayName check and to prevent duplicate users with different email casing (e.g., 'test@example.com' and 'Test@example.com'), you should use a case-insensitive comparison here as well.

    const where = { email: { [Op.iLike]: email } };
    if (excludeId) where.id = { [Op.ne]: excludeId };
    dupes.email = !!(await User.findOne({ where }));


review-hero Bot commented Mar 1, 2026

🦸 Review Hero Summary
3 agents reviewed this PR | 2 failed | 4 critical | 6 suggestions | 0 nitpicks

dannash100 and others added 2 commits March 2, 2026 10:48
Co-Authored-By: Review Hero <contact@bes.au>

review-hero Bot commented Mar 1, 2026

🦸 Review Hero Auto-Fix
Applied fixes for 3 review comments.

Skipped 4 comments (replied on each thread).

@dannash100 (Contributor, Author)

bugbot run


@cursor Bot left a comment

Cursor Bugbot has reviewed your changes and found 1 potential issue.

Autofix Details

Bugbot Autofix prepared a fix for the issue found in the latest run.

  • ✅ Fixed: Dropped errModel argument silently disables import error stats
    • Restored the second errModel argument to all pushError calls in permissionLoader, taskSetLoader, userLoader, taskTemplateLoader, medicationTemplateLoader, procedureTypeLoader, and invoiceProductLoader to re-enable per-model error statistics tracking.

Create PR

Or push these changes by commenting:

@cursor push 217b1aee68
Preview (217b1aee68)
diff --git a/packages/central-server/app/admin/referenceDataImporter/loaders.js b/packages/central-server/app/admin/referenceDataImporter/loaders.js
--- a/packages/central-server/app/admin/referenceDataImporter/loaders.js
+++ b/packages/central-server/app/admin/referenceDataImporter/loaders.js
@@ -270,7 +270,7 @@
   await validateObjectId(
     { ...item, noun: normalizedNoun, objectId: normalizedObjectId },
     models,
-    pushError,
+    message => pushError(message, 'Permission'),
   );
 
   // Any non-empty value in the role cell would mean the role
@@ -342,7 +342,7 @@
   }).then(tasks => tasks.map(({ id }) => id));
   const nonExistentTaskIds = taskIds.filter(taskId => !existingTaskIds.includes(taskId));
   if (nonExistentTaskIds.length > 0) {
-    pushError(`Tasks ${nonExistentTaskIds.join(', ')} not found`);
+    pushError(`Tasks ${nonExistentTaskIds.join(', ')} not found`, 'TaskSet');
   }
 
   if (!existingTaskIds.length) return [];
@@ -378,8 +378,8 @@
     displayName: otherFields.displayName,
     excludeId: id,
   });
-  if (duplicates.email) pushError(`Email "${otherFields.email}" is already in use by another user`);
-  if (duplicates.displayName) pushError(`Display name "${otherFields.displayName}" is already in use by another user`);
+  if (duplicates.email) pushError(`Email "${otherFields.email}" is already in use by another user`, 'User');
+  if (duplicates.displayName) pushError(`Display name "${otherFields.displayName}" is already in use by another user`, 'User');
 
   const allowedFacilityIds = allowedFacilities
     ? allowedFacilities.split(',').map(t => t.trim())
@@ -440,11 +440,11 @@
   for (const designation of designationIds) {
     const existingData = await models.ReferenceData.findByPk(designation);
     if (!existingData) {
-      pushError(`Designation "${designation}" does not exist`);
+      pushError(`Designation "${designation}" does not exist`, 'User');
       continue;
     }
     if (existingData.visibilityStatus !== VISIBILITY_STATUSES.CURRENT) {
-      pushError(`Designation "${designation}" doesn't have visibilityStatus of current`);
+      pushError(`Designation "${designation}" doesn't have visibilityStatus of current`, 'User');
       continue;
     }
     rows.push({
@@ -498,7 +498,7 @@
   );
   for (const designationId of designationIds) {
     if (!existingDesignationIds.includes(designationId)) {
-      pushError(`Designation "${designationId}" does not exist`);
+      pushError(`Designation "${designationId}" does not exist`, 'TaskTemplate');
       continue;
     }
     rows.push({
@@ -625,11 +625,17 @@
     where: { id: drugReferenceDataId, type: REFERENCE_TYPES.DRUG },
   });
   if (!drug) {
-    pushError(`Drug with ID "${drugReferenceDataId}" does not exist.`);
+    pushError(
+      `Drug with ID "${drugReferenceDataId}" does not exist.`,
+      'ReferenceMedicationTemplate',
+    );
   }
 
   if (isNaN(doseAmount) && doseAmount?.toString().toLowerCase() !== 'variable') {
-    pushError(`Dose amount must be a number or the string "variable".`);
+    pushError(
+      `Dose amount must be a number or the string "variable".`,
+      'ReferenceMedicationTemplate',
+    );
   }
 
   const existingTemplate = await models.ReferenceMedicationTemplate.findOne({
@@ -746,6 +752,7 @@
     if (nonExistentSurveyIds.length > 0) {
       pushError(
         `Linked survey${nonExistentSurveyIds.length > 1 ? 's' : ''} "${nonExistentSurveyIds.join(', ')}" for procedure type "${id}" not found.`,
+        'ProcedureTypeSurvey',
       );
     }
 
@@ -754,6 +761,7 @@
     if (nonProgramSurveys.length > 0) {
       pushError(
         `Survey${nonProgramSurveys.length > 1 ? 's' : ''} "${nonProgramSurveys.map(s => s.id).join(', ')}" for procedure type "${id}" must have survey_type of 'programs'.`,
+        'ProcedureTypeSurvey',
       );
     }
   }
@@ -799,12 +807,12 @@
   const rows = [];
 
   if (!category && sourceRecordId) {
-    pushError(`Must provide a category if providing a sourceRecordId.`);
+    pushError(`Must provide a category if providing a sourceRecordId.`, 'InvoiceProduct');
     return [];
   }
 
   if (category && !sourceRecordId) {
-    pushError(`Must provide a sourceRecordId if providing a category.`);
+    pushError(`Must provide a sourceRecordId if providing a category.`, 'InvoiceProduct');
     return [];
   }
 
@@ -822,19 +830,22 @@
 
   const validCategories = Object.values(INVOICE_ITEMS_CATEGORIES);
   if (!validCategories.includes(category)) {
-    pushError(`Invalid category: "${category}". Must be one of: ${validCategories.join(', ')}.`);
+    pushError(
+      `Invalid category: "${category}". Must be one of: ${validCategories.join(', ')}.`,
+      'InvoiceProduct',
+    );
     return [];
   }
 
   const modelName = INVOICE_ITEMS_CATEGORIES_MODELS[category];
   if (!modelName) {
-    pushError(`No model mapped to category: "${category}".`);
+    pushError(`No model mapped to category: "${category}".`, 'InvoiceProduct');
     return [];
   }
 
   const model = models[modelName];
   if (!model) {
-    pushError(`Model not found: "${modelName}".`);
+    pushError(`Model not found: "${modelName}".`, 'InvoiceProduct');
     return [];
   }
 
@@ -842,7 +853,10 @@
     where: { id: sourceRecordId },
   });
   if (!existingRecord) {
-    pushError(`Source record with ID "${sourceRecordId}" and category "${category}" does not exist.`);
+    pushError(
+      `Source record with ID "${sourceRecordId}" and category "${category}" does not exist.`,
+      'InvoiceProduct',
+    );
     return [];
   }


github-actions Bot commented Mar 1, 2026

Android builds 📱

@dannash100 (Contributor, Author)

@cursor push 217b1ae

The errModel second argument was removed from pushError calls, which
silently disabled per-model error statistics tracking. The pushError
callback in sheet.js uses errModel to call updateStat() for error counts,
but the if (errModel) guard skips the stat update when errModel is
undefined.

This restores the errModel argument to all affected loaders:
- permissionLoader (Permission)
- taskSetLoader (TaskSet)
- userLoader (User)
- taskTemplateLoader (TaskTemplate)
- medicationTemplateLoader (ReferenceMedicationTemplate)
- procedureTypeLoader (ProcedureTypeSurvey)
- invoiceProductLoader (InvoiceProduct)

Applied via @cursor push command

github-actions Bot commented Mar 1, 2026

🍹 up on tamanu-on-k8s/bes/tamanu-on-k8s/fix-tamoc-386-user-import-unique

Pulumi report
   Updating (fix-tamoc-386-user-import-unique)

View Live: https://app.pulumi.com/bes/tamanu-on-k8s/fix-tamoc-386-user-import-unique/updates/5

Downloading plugin random-4.19.0: starting
Downloading plugin random-4.19.0: done
Installing plugin random-4.19.0: starting
Installing plugin random-4.19.0: done

@ Updating....
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running 
@ Updating.....
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read pulumi:pulumi:StackReference bes/k8s-core/tamanu-internal-main
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read kubernetes:core/v1:Namespace tamanu-fix-tamoc-386-user-import-unique
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read pulumi:pulumi:StackReference bes/k8s-core/tamanu-internal-main
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Using tailscale proxy https://k8s-operator-tamanu-internal-main.tail53aef.ts.net
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read pulumi:pulumi:StackReference bes/core/tamanu-internal
@ Updating....
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read pulumi:pulumi:StackReference bes/core/tamanu-internal
@ Updating.....
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read kubernetes:core/v1:Namespace tamanu-fix-tamoc-386-user-import-unique
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Waiting for central-db...
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Waiting for facility-1-db...
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Waiting for facility-2-db...
~  kubernetes:apps/v1:Deployment facility-2-web updating (0s) [diff: ~spec]
@ Updating....
~  kubernetes:apps/v1:Deployment facility-1-web updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-web updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment patient-portal-web updating (0s) [diff: ~spec]
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read kubernetes:core/v1:ConfigMap actual-provisioning
++ kubernetes:batch/v1:Job central-migrator creating replacement (0s) [diff: ~spec]
@ Updating....
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Secret central-db-superuser not found or not ready: Error: HTTP-Code: 404
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Message: Unknown API Status Code!
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"central-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"central-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Headers: {"audit-id":"54999a56-371e-40f7-b514-1bbcba6d8b2d","cache-control":"no-cache, private","connection":"close","content-length":"214","content-type":"application/json","date":"Tue, 03 Mar 2026 01:53:20 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Secret facility-1-db-superuser not found or not ready: Error: HTTP-Code: 404
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Message: Unknown API Status Code!
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-1-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-1-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Headers: {"audit-id":"2ee560c6-5867-454b-93c6-5126b4a499f9","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Tue, 03 Mar 2026 01:53:20 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Secret facility-2-db-superuser not found or not ready: Error: HTTP-Code: 404
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Message: Unknown API Status Code!
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-2-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-2-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running Headers: {"audit-id":"cf55852e-0fde-4247-a298-529dde684d43","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Tue, 03 Mar 2026 01:53:20 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
++ kubernetes:batch/v1:Job central-migrator creating replacement (0s) [diff: ~spec]; 
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique running read kubernetes:core/v1:ConfigMap actual-provisioning
@ Updating....
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job central-migrator creating replacement (1s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-migrator-f76eaa93" to start
++ kubernetes:batch/v1:Job central-migrator creating replacement (1s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-migrator-f76eaa93" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-1-migrator-bd9ddd23" to start
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-1-migrator-bd9ddd23" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-2-migrator-2c9fb88f" to start
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-2-migrator-2c9fb88f" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
@ Updating....
~  kubernetes:apps/v1:Deployment facility-1-web updating (3s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment central-web updating (2s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment facility-2-web updating (3s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment patient-portal-web updating (2s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
@ Updating.............
~  kubernetes:apps/v1:Deployment facility-1-web updating (13s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/facility-1-web-bac7d097-746d848bc8-hljxr]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment central-web updating (12s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/central-web-845620a2-949c65d48-xxzgp]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment facility-2-web updating (13s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/facility-2-web-94703b71-676c68c67-frhd2]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment patient-portal-web updating (12s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/patient-portal-web-8197fd35-95c9cc559-t5xt2]: containers with unready status: [http]
@ Updating....
~  kubernetes:apps/v1:Deployment facility-2-web updating (14s) [diff: ~spec]; Waiting for app ReplicaSet to be available (1/2 Pods available)
~  kubernetes:apps/v1:Deployment facility-1-web updating (14s) [diff: ~spec]; Waiting for app ReplicaSet to be available (1/2 Pods available)
~  kubernetes:apps/v1:Deployment central-web updating (13s) [diff: ~spec]; Waiting for app ReplicaSet to be available (1/2 Pods available)
~  kubernetes:apps/v1:Deployment patient-portal-web updating (14s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment patient-portal-web updating (14s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment patient-portal-web updated (14s) [diff: ~spec]; 
@ Updating............
~  kubernetes:apps/v1:Deployment facility-1-web updating (23s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/facility-1-web-bac7d097-746d848bc8-lwqmh]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment central-web updating (22s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/central-web-845620a2-949c65d48-hbcc7]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment facility-2-web updating (23s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/facility-2-web-94703b71-676c68c67-ljq2v]: containers with unready status: [http]
~  kubernetes:apps/v1:Deployment facility-2-web updating (23s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-web updating (23s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-web updated (23s) [diff: ~spec]; 
@ Updating....
~  kubernetes:apps/v1:Deployment facility-1-web updating (24s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-web updating (24s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-web updated (24s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-web updating (23s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-web updating (23s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-web updated (23s) [diff: ~spec]; 
@ Updating....
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (22s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/facility-2-migrator-2c9fb88f-b7pht]: Container "migrator" completed with exit code 0
@ Updating.....
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (24s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-2-migrator-2c9fb88f" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (25s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-2-migrator-2c9fb88f" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job facility-2-migrator creating replacement (25s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-2-migrator created replacement (25s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job facility-2-migrator replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job facility-2-migrator replaced (0.00s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-api updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment facility-2-tasks updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment facility-2-sync updating (0s) [diff: ~spec]
@ Updating....
~  kubernetes:apps/v1:Deployment facility-2-api updating (0s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-api updating (0s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-api updated (0.82s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-tasks updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-tasks updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-tasks updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-sync updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-2-sync updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-2-sync updated (1s) [diff: ~spec]; 
@ Updating.....................
++ kubernetes:batch/v1:Job central-migrator creating replacement (46s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/central-migrator-f76eaa93-8wzj6]: Container "migrator" completed with exit code 0
@ Updating....
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (44s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/facility-1-migrator-bd9ddd23-lrblz]: Container "migrator" completed with exit code 0
@ Updating....
++ kubernetes:batch/v1:Job central-migrator creating replacement (48s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-migrator-f76eaa93" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
@ Updating....
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (46s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-1-migrator-bd9ddd23" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
++ kubernetes:batch/v1:Job central-migrator creating replacement (48s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-migrator-f76eaa93" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job central-migrator creating replacement (48s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (46s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/facility-1-migrator-bd9ddd23" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job facility-1-migrator creating replacement (46s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-migrator created replacement (48s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job facility-1-migrator created replacement (46s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-migrator replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-migrator replaced (0.00s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-provisioner creating replacement (0s) [diff: ~spec]
+- kubernetes:batch/v1:Job facility-1-migrator replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job facility-1-migrator replaced (0.01s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-tasks updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment facility-1-api updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment facility-1-sync updating (0s) [diff: ~spec]
++ kubernetes:batch/v1:Job central-provisioner creating replacement (0s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-provisioner creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-provisioner-cc94be98" to start
++ kubernetes:batch/v1:Job central-provisioner creating replacement (0s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-provisioner-cc94be98" to succeed (Active: 1 | Succeeded: 0 | Failed: 0)
@ Updating....
~  kubernetes:apps/v1:Deployment facility-1-tasks updating (0s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-tasks updating (0s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-tasks updated (0.98s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-api updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-api updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-api updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-sync updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment facility-1-sync updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment facility-1-sync updated (1s) [diff: ~spec]; 
@ Updating.........
++ kubernetes:batch/v1:Job central-provisioner creating replacement (6s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/central-provisioner-cc94be98-8zmd2]: Container "provisioner" completed with exit code 0
@ Updating....
++ kubernetes:batch/v1:Job central-provisioner creating replacement (8s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-provisioner-cc94be98" to succeed (Active: 0 | Succeeded: 0 | Failed: 0)
@ Updating....
++ kubernetes:batch/v1:Job central-provisioner creating replacement (8s) [diff: ~spec]; Waiting for Job "tamanu-fix-tamoc-386-user-import-unique/central-provisioner-cc94be98" to succeed (Active: 0 | Succeeded: 1 | Failed: 0)
++ kubernetes:batch/v1:Job central-provisioner creating replacement (8s) [diff: ~spec]; 
++ kubernetes:batch/v1:Job central-provisioner created replacement (8s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-provisioner replacing (0s) [diff: ~spec]; 
+- kubernetes:batch/v1:Job central-provisioner replaced (0.00s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-tasks updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-api updating (0s) [diff: ~spec]
~  kubernetes:apps/v1:Deployment central-fhir-refresh updating (0s) [diff: ~spec]
@ Updating....
~  kubernetes:apps/v1:Deployment central-tasks updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-tasks updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-tasks updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (1s) [diff: ~spec]; Waiting for app ReplicaSet to be available (0/1 Pods available)
~  kubernetes:apps/v1:Deployment central-api updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-api updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-api updated (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-refresh updating (1s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-fhir-refresh updating (1s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-refresh updated (1s) [diff: ~spec]; 
@ Updating.............
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (10s) [diff: ~spec]; warning: [Pod tamanu-fix-tamoc-386-user-import-unique/central-fhir-resolver-a07fc594-76bfc5fffc-vx64v]: containers with unready status: [fhir-resolver]
@ Updating........
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (16s) [diff: ~spec]; Deployment initialization complete
~  kubernetes:apps/v1:Deployment central-fhir-resolver updating (16s) [diff: ~spec]; 
~  kubernetes:apps/v1:Deployment central-fhir-resolver updated (16s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-provisioner deleting original (0s) [diff: ~spec]; 
@ Updating....
-- kubernetes:batch/v1:Job central-provisioner deleting original (0s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-provisioner deleted original (0.50s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-2-migrator deleting original (0s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-1-migrator deleting original (0s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-migrator deleting original (0s) [diff: ~spec]; 
@ Updating....
-- kubernetes:batch/v1:Job facility-2-migrator deleting original (0s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-2-migrator deleted original (0.99s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-1-migrator deleting original (1s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job facility-1-migrator deleted original (1s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-migrator deleting original (1s) [diff: ~spec]; 
-- kubernetes:batch/v1:Job central-migrator deleted original (1s) [diff: ~spec]; 
@ Updating....
   pulumi:pulumi:Stack tamanu-on-k8s-fix-tamoc-386-user-import-unique  16 messages
Diagnostics:
 pulumi:pulumi:Stack (tamanu-on-k8s-fix-tamoc-386-user-import-unique):
   Waiting for central-db...
   Waiting for facility-1-db...
   Waiting for facility-2-db...

   Using tailscale proxy https://k8s-operator-tamanu-internal-main.tail53aef.ts.net

   Secret central-db-superuser not found or not ready: Error: HTTP-Code: 404
   Message: Unknown API Status Code!
   Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"central-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"central-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   Headers: {"audit-id":"54999a56-371e-40f7-b514-1bbcba6d8b2d","cache-control":"no-cache, private","connection":"close","content-length":"214","content-type":"application/json","date":"Tue, 03 Mar 2026 01:53:20 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   Secret facility-1-db-superuser not found or not ready: Error: HTTP-Code: 404
   Message: Unknown API Status Code!
   Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-1-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-1-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   Headers: {"audit-id":"2ee560c6-5867-454b-93c6-5126b4a499f9","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Tue, 03 Mar 2026 01:53:20 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}
   Secret facility-2-db-superuser not found or not ready: Error: HTTP-Code: 404
   Message: Unknown API Status Code!
   Body: "{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"secrets \\\"facility-2-db-superuser\\\" not found\",\"reason\":\"NotFound\",\"details\":{\"name\":\"facility-2-db-superuser\",\"kind\":\"secrets\"},\"code\":404}
"
   Headers: {"audit-id":"cf55852e-0fde-4247-a298-529dde684d43","cache-control":"no-cache, private","connection":"close","content-length":"220","content-type":"application/json","date":"Tue, 03 Mar 2026 01:53:20 GMT","x-kubernetes-pf-flowschema-uid":"3fb296fc-e46b-45d1-9306-057e37ddd229","x-kubernetes-pf-prioritylevel-uid":"feccf24d-a074-4fa8-aa6f-db82477fc2f5"}

   [Pulumi Neo] Would you like help with these diagnostics?
   https://app.pulumi.com/bes/tamanu-on-k8s/fix-tamoc-386-user-import-unique/updates/5?explainFailure

Outputs:
   urls: {
       Central      : "https://central.fix-tamoc-386-user-import-unique.cd.tamanu.app"
       Facility-1   : "https://facility-1.fix-tamoc-386-user-import-unique.cd.tamanu.app"
       Facility-2   : "https://facility-2.fix-tamoc-386-user-import-unique.cd.tamanu.app"
       PatientPortal: "https://portal.fix-tamoc-386-user-import-unique.cd.tamanu.app"
   }

Resources:
   ~ 14 updated
   +-4 replaced
   18 changes. 49 unchanged

Duration: 1m24s

   

Contributor

@cursor cursor Bot left a comment

Cursor Bugbot has reviewed your changes and found 1 potential issue.

Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.

Comment thread packages/central-server/app/admin/userValidation.js
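For context on the thread above, the kind of check a shared `userValidation` helper might perform can be sketched as follows. This is a minimal, self-contained illustration, not the PR's actual code — the function name, field names, and return shape are all assumptions:

```javascript
// Hypothetical sketch: flag incoming import rows whose email or display
// name collides (case-insensitively) with an existing user, or with an
// earlier row in the same import sheet. Names here are illustrative only.
function findUniquenessViolations(existingUsers, incomingRows) {
  const seenEmails = new Set(existingUsers.map(u => u.email.toLowerCase()));
  const seenNames = new Set(existingUsers.map(u => u.displayName.toLowerCase()));
  const violations = [];
  for (const row of incomingRows) {
    const email = row.email.toLowerCase();
    const name = row.displayName.toLowerCase();
    if (seenEmails.has(email)) {
      violations.push({ row, field: 'email' });
    }
    if (seenNames.has(name)) {
      violations.push({ row, field: 'displayName' });
    }
    // Later rows in the same sheet must not collide with earlier ones either
    seenEmails.add(email);
    seenNames.add(name);
  }
  return violations;
}
```

Running the same check for both the admin panel and the importer is what keeps the two entry points consistent, which is the point of this PR.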
@dannash100 dannash100 changed the title fix(import): Apply same unique restrictions on user admin panel to user reference data import fix(import): TAMOC-386: Apply same unique restrictions on user admin panel to user reference data import Mar 1, 2026
Collaborator

@rohan-bes rohan-bes left a comment

Just a high level thought, should we be using a database level uniqueness constraint here? Appreciate that it requires a migration and possible data cleanups, but feels a bit more robust. At least some model level validation could be nice?
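To make the suggestion concrete, a database-level constraint could look roughly like this hypothetical Sequelize migration — a sketch only, not part of this PR. The table, column, and index names are assumptions, and any existing duplicate rows would need cleaning up before it could run:

```javascript
// Hypothetical migration fragment adding a unique index on users.email.
// A matching index on the display-name column would follow the same shape.
module.exports = {
  async up(queryInterface) {
    await queryInterface.addIndex('users', ['email'], {
      name: 'users_email_unique',
      unique: true,
    });
  },
  async down(queryInterface) {
    await queryInterface.removeIndex('users', 'users_email_unique');
  },
};
```

A constraint like this catches duplicates from every write path (admin panel, importer, sync), whereas application-level checks only cover the paths that remember to call them.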

@dannash100
Contributor Author

Just a high level thought, should we be using a database level uniqueness constraint here? Appreciate that it requires a migration and possible data cleanups, but feels a bit more robust. At least some model level validation could be nice?

Yeah maybe a good call aye 🤔

@dannash100
Contributor Author

@rohan-bes I will make a new branch I think and update you

@dannash100 dannash100 closed this Mar 4, 2026

3 participants