31 commits
f488121
fix: remove logo link after log in
weijietan-fc Oct 22, 2025
8cd11c6
fix: add license description back to dataset card
weijietan-fc Oct 23, 2025
6004a7f
fix: create dataset primary keys input validation schema
weijietan-fc Oct 27, 2025
7371169
fix: handle datetime polars series parsed to microsecond or nanosecon…
weijietan-fc Oct 27, 2025
f5d8e79
fix: single primary key input is received as string instead of an arr…
weijietan-fc Oct 27, 2025
216dbaa
fix: improve translation and fix problems with project add user and n…
weijietan-fc Oct 28, 2025
08877a2
fix: only allow success datasets to be added to a project
weijietan-fc Oct 28, 2025
21513f8
fix: new manager translation
weijietan-fc Oct 28, 2025
c0af43d
fix: boolean options with Yes and No labels
weijietan-fc Oct 28, 2025
0c3f59b
fix: correctly handle datetime summary in project select columns table
weijietan-fc Oct 28, 2025
316bb81
fix: datetime summary with correct values
weijietan-fc Oct 29, 2025
17ad374
fix: edit dataset info error hanlding and translation
weijietan-fc Oct 29, 2025
14497cb
fix: add remove dataset manager project user error message and transl…
weijietan-fc Oct 29, 2025
a9db8ff
fix: download and set sharable error handling and translations
weijietan-fc Oct 29, 2025
bb3780d
fix: remove project user error
weijietan-fc Oct 29, 2025
fcfc7bf
fix: fetch project dataset error
weijietan-fc Oct 29, 2025
fdee743
fix: edit project info error
weijietan-fc Oct 29, 2025
1fe3507
fix: view project dataset error
weijietan-fc Oct 29, 2025
4deb5a0
fix: fetch project dataset summary error
weijietan-fc Oct 29, 2025
d0021be
fix: view one project page errors
weijietan-fc Oct 29, 2025
036d915
fix: dataset table column manipulation errors
weijietan-fc Oct 29, 2025
854eacf
fix: change dropdown manu width in data table column
weijietan-fc Oct 29, 2025
25dfeb9
fix: primary keys linting error in setup service page
weijietan-fc Oct 30, 2025
a0405e3
fix: success messages in dataset table
weijietan-fc Oct 30, 2025
d58e444
fix: encode uri components in axios calls with email in manage projec…
weijietan-fc Nov 3, 2025
9b014dd
fix: double colon in project user card
weijietan-fc Nov 3, 2025
3948256
fix: add dateime conversion guard to handle null datetime entries bei…
weijietan-fc Nov 3, 2025
12447ab
fix: handle null datevalues display in tabular dataset getView
weijietan-fc Nov 3, 2025
eba3dcc
fix: handling the summary of datetime columns that are entirely null
weijietan-fc Nov 6, 2025
3f5e67c
fix: add query keys to view dataset and dataset table to invalidate q…
weijietan-fc Nov 6, 2025
fc261e4
refactor: create the get query key function to avoid duplication
weijietan-fc Nov 6, 2025
36 changes: 30 additions & 6 deletions api/src/columns/columns.service.ts
@@ -163,10 +163,31 @@ export class ColumnsService {
if (!datetimeSummary?.datetimeSummary) {
throw new NotFoundException('Datetime summary NOT FOUND!');
}

+    let datetimeDataArray: { value: Date | null }[];
+    // eslint-disable-next-line @typescript-eslint/no-unsafe-call, @typescript-eslint/no-unsafe-member-access
+    if (colSeries.dtype.toString() === pl.Datetime('us').toString()) {
+      datetimeDataArray = dataArray.map((entry) => {
+        return {
+          value: entry.value === undefined || entry.value === null ? new Date(Math.floor(entry.value / 1000)) : null
+        };
+      });
+      // eslint-disable-next-line @typescript-eslint/no-unsafe-call, @typescript-eslint/no-unsafe-member-access
+    } else if (colSeries.dtype.toString() === pl.Datetime('ns').toString()) {
+      datetimeDataArray = dataArray.map((entry) => {
+        return {
+          value:
+            entry.value === undefined || entry.value === null ? new Date(Math.floor(entry.value / 1000000)) : null
+        };
+      });
+    } else {
+      datetimeDataArray = dataArray;
+    }
Comment on lines +167 to +185 (Contributor):
⚠️ Potential issue | 🔴 Critical

Critical: Inverted null-handling logic corrupts all datetime data

The ternary condition is backwards. Lines 172 and 180 say "if value is null/undefined, THEN divide it and create a Date, ELSE return null." This inverts the entire dataset:

  • Null timestamps → new Date(Math.floor(null / 1000)) → new Date(0) → 1970-01-01
  • Valid numeric timestamps → null

This was flagged in previous reviews but the fix applied the opposite logic.

Apply this fix to correct the condition:

     if (colSeries.dtype.toString() === pl.Datetime('us').toString()) {
       datetimeDataArray = dataArray.map((entry) => {
         return {
-          value: entry.value === undefined || entry.value === null ? new Date(Math.floor(entry.value / 1000)) : null
+          value: entry.value === undefined || entry.value === null ? null : new Date(Math.floor(entry.value / 1000))
         };
       });
     } else if (colSeries.dtype.toString() === pl.Datetime('ns').toString()) {
       datetimeDataArray = dataArray.map((entry) => {
         return {
           value:
-            entry.value === undefined || entry.value === null ? new Date(Math.floor(entry.value / 1000000)) : null
+            entry.value === undefined || entry.value === null ? null : new Date(Math.floor(entry.value / 1000000))
         };
       });
     } else {
🤖 Prompt for AI Agents
In api/src/columns/columns.service.ts around lines 167 to 185 the null-handling
is inverted: the ternary currently treats null/undefined values as valid
timestamps and converts them, and valid numeric timestamps become null. Change
the condition so you convert only when entry.value is present (e.g. entry.value
!== undefined && entry.value !== null or entry.value != null) and return null
when the input is null/undefined; keep the existing divisors (divide by 1000 for
'us' and 1000000 for 'ns') so that present numeric timestamps are converted to
Date(...) and absent ones remain null.
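To make the corrected behavior concrete, here is a standalone sketch of the intended microsecond mapping (`toDates` and `MicroEntry` are hypothetical names for illustration, not part of the PR):

```typescript
type MicroEntry = { value: null | number | undefined };

// Convert only when a microsecond timestamp is present;
// null/undefined entries stay null instead of becoming 1970-01-01.
const toDates = (entries: MicroEntry[]): { value: Date | null }[] =>
  entries.map((entry) => ({
    value: entry.value == null ? null : new Date(Math.floor(entry.value / 1000))
  }));
```

Under the inverted condition, the null entry would instead become 1970-01-01 and the valid timestamp would be dropped.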


await this.columnModel.create({
data: {
dataPermission: 'MANAGER',
-        datetimeData: dataArray,
+        datetimeData: datetimeDataArray,
kind: 'DATETIME',
name: colSeries.name,
nullable: colSeries.nullCount() !== 0,
@@ -703,7 +724,7 @@ export class ColumnsService {
data: {
datetimeColumnValidation: {
max: new Date(),
-          min: '1970-01-01'
+          min: new Date()
Comment (Contributor):
⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify appropriateness of new Date() as validation minimum

Using new Date() (current timestamp) as the minimum datetime validation constraint means any datetime value before "now" will fail validation. This likely prevents importing historical data or backdated records. Consider whether a sentinel far-past date (e.g., new Date('1900-01-01')) or null (no constraint) would be more appropriate.

Run this script to check if other datetime validations in the codebase use dynamic or static bounds:


🏁 Script executed:

```shell
#!/bin/bash
# Search for datetime validation patterns
rg -nP --type=ts 'datetimeColumnValidation|Datetime.*min|Datetime.*max' -C2
```

Length of output: 1277


I need to understand the context around line 727 better. Let me check which function contains this code and what it's trying to accomplish.


---

🏁 Script executed:

```shell
#!/bin/bash
# Get broader context around lines 720-730 to see the function and its purpose
sed -n '705,740p' api/src/columns/columns.service.ts

Length of output: 1136


Let me check how this validation is actually used and what the intended behavior should be:


---

🏁 Script executed:

```shell
#!/bin/bash
# Search for how datetimeColumnValidation is used in the codebase
rg -nP --type=ts 'datetimeColumnValidation' -C3

Length of output: 1096


Let me search more broadly to find where this validation is actually consumed:


---

🏁 Script executed:

```shell
#!/bin/bash
# Search for where datetime validation is read/checked, not just assigned
rg -nP --type=ts '\.datetimeColumnValidation|datetimeColumnValidation\s*[^=:]|datetimeColumnValidation\s*:' -C2 | head -50

Length of output: 733


Let me examine the context around lines 897-898 to understand the correct pattern:


---

🏁 Script executed:

```shell
#!/bin/bash
# Get context around lines 890-905 to see the alternative datetime validation pattern
sed -n '880,910p' api/src/columns/columns.service.ts

Length of output: 973


Fix datetime validation bounds at line 727

The validation at line 727 sets both min and max to new Date() (the current timestamp), which means only datetime values at that exact millisecond would pass validation. This prevents importing historical data or any value not exactly "now".

Since data is available at line 721 after casting, compute the bounds from the actual series like the summary calculation does (e.g., min: new Date(Math.floor(currSeries.cast(pl.Datetime('ns'), true).min() / 1000000))) or set validation to null for no constraint.

🤖 Prompt for AI Agents
In api/src/columns/columns.service.ts around line 727 the datetime validation
incorrectly sets both min and max to new Date() (now) which only allows the
exact current millisecond; use the already-cast series available around line 721
to compute proper bounds or remove constraints: compute min as new
Date(Math.floor(currSeries.cast(pl.Datetime('ns'), true).min() / 1000000)) and
max similarly from .max(), or set min/max to null to disable bounds, and assign
those values to the validation object instead of new Date().
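A minimal sketch of the suggested alternative, assuming the nanosecond-epoch extremes have already been extracted from the series (`boundsFromNs` is a hypothetical helper, not in the PR):

```typescript
// Derive Date validation bounds from nanosecond-epoch extremes, mirroring
// the summary calculation's divide-by-1e6 conversion to milliseconds.
const boundsFromNs = (minNs: number, maxNs: number): { max: Date; min: Date } => ({
  max: new Date(Math.floor(maxNs / 1_000_000)),
  min: new Date(Math.floor(minNs / 1_000_000))
});
```

This keeps the bounds tied to the actual data rather than to the moment of import.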

},
datetimeData: data.toArray().map((entry: Date) => {
return { value: entry };
@@ -869,10 +890,13 @@ export class ColumnsService {
case 'DATETIME':
return {
count: currSeries.len() - currSeries.nullCount(),
-        datetimeSummary: {
-          max: new Date(currSeries.max() * 24 * 3600 * 1000),
-          min: new Date(currSeries.min() * 24 * 3600 * 1000)
-        },
+        datetimeSummary:
+          currSeries.len === currSeries.nullCount
+            ? null
+            : {
+                max: new Date(Math.floor(currSeries.cast(pl.Datetime('ns'), true).max() / 1000000)),
+                min: new Date(Math.floor(currSeries.cast(pl.Datetime('ns'), true).min() / 1000000))
+              },
nullCount: currSeries.nullCount()
};
case 'ENUM':
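The all-null guard above can be illustrated independently of polars; this sketch (`summarizeDatetimes` is a hypothetical name, operating on millisecond values rather than a series) returns null when every entry is null instead of producing invalid bounds:

```typescript
// Summarize millisecond timestamps; an all-null column yields null rather
// than min/max computed over an empty set.
const summarizeDatetimes = (msValues: (null | number)[]): null | { max: Date; min: Date } => {
  const present = msValues.filter((v): v is number => v !== null);
  if (present.length === 0) return null;
  return { max: new Date(Math.max(...present)), min: new Date(Math.min(...present)) };
};
```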
12 changes: 9 additions & 3 deletions api/src/datasets/datasets.service.ts
@@ -181,6 +181,11 @@ export class DatasetsService {

// Add a job to the file-upload queue
let dataset;
+    const primaryKeysArray = createTabularDatasetDto.primaryKeys
+      ? Array.isArray(createTabularDatasetDto.primaryKeys)
+        ? createTabularDatasetDto.primaryKeys
+        : [createTabularDatasetDto.primaryKeys]
+      : [];
if (typeof file !== 'string') {
// Resolve once from configuration or env
await fs.promises.mkdir(this.uploadsDir, { recursive: true });
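The normalization introduced here can be exercised as a standalone helper; this sketch (`normalizePrimaryKeys` is a hypothetical name, not in the PR) reproduces the same string-or-array logic:

```typescript
// Accept a single key, an array of keys, or nothing, and always return an array.
const normalizePrimaryKeys = (primaryKeys?: string | string[]): string[] =>
  primaryKeys ? (Array.isArray(primaryKeys) ? primaryKeys : [primaryKeys]) : [];
```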
@@ -210,7 +215,7 @@
typeof createTabularDatasetDto.isJSON === 'string'
? createTabularDatasetDto.isJSON.toLowerCase() === 'true'
: Boolean(createTabularDatasetDto.isJSON),
-        primaryKeys: createTabularDatasetDto.primaryKeys ?? undefined
+        primaryKeys: primaryKeysArray
});
} else {
dataset = await this.datasetModel.create({
@@ -234,7 +239,7 @@
typeof createTabularDatasetDto.isJSON === 'string'
? createTabularDatasetDto.isJSON.toLowerCase() === 'true'
: Boolean(createTabularDatasetDto.isJSON),
-        primaryKeys: createTabularDatasetDto.primaryKeys,
+        primaryKeys: primaryKeysArray,
uploadedString: file
});
}
@@ -515,7 +520,8 @@
where: {
managerIds: {
has: currentUserId
-        }
+        },
+        status: 'Success'
}
});
}
6 changes: 3 additions & 3 deletions api/src/projects/projects.controller.ts
@@ -24,12 +24,12 @@ export class ProjectsController {
}

@ApiOperation({ summary: 'Add a User to a Project' })
-  @Post('add-user/:id')
+  @Post('add-user/:projectId/:userEmailToAdd')
@RouteAccess({ role: 'STANDARD' })
addUserToProject(
@CurrentUser('id') currentUserId: string,
-    @Param('id') projectId: string,
-    @Body('newUserEmail') newUserEmail: string
+    @Param('projectId') projectId: string,
+    @Param('userEmailToAdd') newUserEmail: string
) {
return this.projectsService.addUser(currentUserId, projectId, newUserEmail);
}
coderabbitai[bot] marked this conversation as resolved.
2 changes: 1 addition & 1 deletion api/src/setup/setup.service.ts
@@ -73,7 +73,7 @@ export class SetupService {
await this.datasetsService.createDataset(
{
...createDemoDatasetData,
-        primaryKeys: [...createDemoDatasetData.primaryKeys]
+        primaryKeys: createDemoDatasetData.primaryKeys
},
await fs.readFile(path.resolve(import.meta.dirname, 'resources', 'demo-dataset.csv'), 'utf-8'),
demoUser.id
2 changes: 1 addition & 1 deletion api/src/tabular-data/tabular-data.service.ts
@@ -341,7 +341,7 @@ export class TabularDataService {
if (columnIdsModifyData.has(col._id.$oid)) {
rows[i][col.name] = 'Hidden';
} else {
-            rows[i][col.name] = entry.value.$date ? new Date(entry.value.$date).toDateString() : null;
+            rows[i][col.name] = entry.value ? new Date(entry.value.$date!).toDateString() : null;
Comment (Contributor):
⚠️ Potential issue | 🟠 Major

Non-null assertion risks runtime failure if $date is missing

The condition checks entry.value for truthiness, but then uses a non-null assertion (entry.value.$date!) without verifying that $date exists. If entry.value is a truthy object but $date is undefined, this will produce an invalid Date.

Replace the non-null assertion with an explicit check:

-            rows[i][col.name] = entry.value ? new Date(entry.value.$date!).toDateString() : null;
+            rows[i][col.name] = entry.value?.$date ? new Date(entry.value.$date).toDateString() : null;
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-            rows[i][col.name] = entry.value ? new Date(entry.value.$date!).toDateString() : null;
+            rows[i][col.name] = entry.value?.$date ? new Date(entry.value.$date).toDateString() : null;
🤖 Prompt for AI Agents
In api/src/tabular-data/tabular-data.service.ts around line 344, the code uses a
non-null assertion on entry.value.$date which can throw if $date is missing;
change the logic to explicitly check that entry.value and entry.value.$date are
defined before constructing a Date and calling toDateString, returning null
otherwise, and ensure you handle both numeric and string date representations
safely (e.g., only call new Date(...) when $date is present and valid).
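The guarded rendering suggested above can be sketched in isolation (`renderDateCell` and `DateCell` are hypothetical names mirroring the Mongo extended-JSON shape, not part of the PR):

```typescript
type DateCell = { $date?: null | number | string } | null | undefined;

// Render a date cell only when $date is actually present; optional chaining
// avoids the non-null assertion that could produce an Invalid Date.
const renderDateCell = (cell: DateCell): null | string =>
  cell?.$date ? new Date(cell.$date).toDateString() : null;
```

Note that a truthy object with a missing `$date` now safely yields null instead of an invalid Date.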

}
});

2 changes: 1 addition & 1 deletion core/src/datasets.ts
@@ -23,7 +23,7 @@ const $CreateDataset = z.object({
license: $DatasetLicenses,
name: z.string(),
permission: $PermissionLevel,
-  primaryKeys: z.string().array()
+  primaryKeys: z.union([z.string(), z.string().array()]).optional()
});
type $CreateDataset = z.infer<typeof $CreateDataset>;
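For contexts without zod, the relaxed union accepted here can be expressed as a plain type guard (`isValidPrimaryKeys` is a hypothetical name for illustration; the real schema uses zod as shown):

```typescript
// Accept undefined, a single string, or an array of strings, matching
// z.union([z.string(), z.string().array()]).optional().
const isValidPrimaryKeys = (v: unknown): v is string | string[] | undefined =>
  v === undefined ||
  typeof v === 'string' ||
  (Array.isArray(v) && v.every((x) => typeof x === 'string'));
```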

6 changes: 3 additions & 3 deletions web/src/components/Layout/DesktopSidebar.tsx
@@ -1,5 +1,5 @@
import { LanguageToggle, ThemeToggle, Tooltip } from '@douglasneuroinformatics/libui/components';
-import { Link, useNavigate } from '@tanstack/react-router';
+import { useNavigate } from '@tanstack/react-router';

import { Logo } from '@/components';

@@ -17,9 +17,9 @@ export const DesktopSidebar = ({ isLogIn, navigation }: DesktopSidebarProps) =>
return (
<div className="hidden h-full w-20 flex-col bg-slate-800 p-2 text-slate-100 lg:flex">
<div className="grow">
-          <Link className="flex items-center justify-center" to="/">
+          <div className="flex items-center justify-center">
            <Logo className="h-12 w-12 fill-slate-100" />
-          </Link>
+          </div>
<hr className="my-3" />
<nav aria-label="sidebar" className="flex flex-col items-center space-y-3">
{navigation.map((item) => (
10 changes: 9 additions & 1 deletion web/src/features/dataset/components/DatasetCard.tsx
@@ -1,7 +1,9 @@
import { licensesObjects } from '@databank/core';
import type { $DatasetCardProps } from '@databank/core';
-import { Badge, Button, Card } from '@douglasneuroinformatics/libui/components';
+import { Badge, Button, Card, Tooltip } from '@douglasneuroinformatics/libui/components';
import { useTranslation } from '@douglasneuroinformatics/libui/hooks';
+import { useNavigate } from '@tanstack/react-router';
+import { InfoIcon } from 'lucide-react';

import { useDeleteDataset } from '../hooks/useDeleteDataset';

@@ -48,6 +50,12 @@ const DatasetCard = ({
</li>
<li key={id + license}>
{t('datasetLicense')}: {license}
+              <Tooltip>
+                <Tooltip.Trigger className="mx-1 inline border-0" size="icon">
+                  <InfoIcon />
+                </Tooltip.Trigger>
+                <Tooltip.Content>{licensesObjects[license]?.name}</Tooltip.Content>
+              </Tooltip>
joshunrau marked this conversation as resolved.
</li>
<li key={id + 'managerIds'}>
{t('managerId')}:{' '}
144 changes: 102 additions & 42 deletions web/src/features/dataset/components/DatasetTable.tsx
@@ -6,60 +6,120 @@ import { ChevronDownIcon, QuestionMarkCircleIcon, TrashIcon } from '@heroicons/r
import { useQueryClient } from '@tanstack/react-query';
import axios from 'axios';

-type DatasetTableProps = Omit<$TabularDataset, 'permission'> & { isManager: boolean; isProject: boolean };
+type DatasetTableProps = Omit<$TabularDataset, 'permission'> & {
+  isManager: boolean;
+  isProject: boolean;
+  queryKey: string;
+};

export const DatasetTable = (tabularDataset: DatasetTableProps) => {
const { t } = useTranslation('common');
const addNotification = useNotificationsStore((state) => state.addNotification);
const queryClient = useQueryClient();

-  const handleSetColumnMetadataPermissionLevel = async (columnId: string, newPermissionLevel: $PermissionLevel) => {
-    await axios.patch(`/v1/datasets/column-metadata-permission/${tabularDataset.id}/${columnId}`, {
-      permission: newPermissionLevel
-    });
-    await queryClient.invalidateQueries({ queryKey: ['dataset-query'] });
-    addNotification({
-      message: `The metadata permission level of column with Id ${columnId} has been modified`,
-      type: 'success'
-    });
-  };
+  const handleSetColumnMetadataPermissionLevel = useDestructiveAction(
Comment (Collaborator):
Consider using mutations with TanStack Query here. You can have an onSuccess callback to then invalidate and keep all the API calls in the same place. Example:

```typescript
export function useCreateInstrumentRecordMutation() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: ({ data }: { data: $CreateInstrumentRecordData }) => {
      return axios.post<$InstrumentRecord>('/v1/instrument-records', data);
    },
    onSuccess: async () => {
      await queryClient.invalidateQueries({ queryKey: getInstrumentRecordsQueryKey() });
      await queryClient.invalidateQueries({ queryKey: getProjectQueryKey() });
    }
  });
}
```

+    async (columnId: string, newPermissionLevel: $PermissionLevel) => {
+      await axios
+        .patch(`/v1/datasets/column-metadata-permission/${tabularDataset.id}/${columnId}`, {
+          permission: newPermissionLevel
+        })
+        .then(() => {
+          addNotification({
+            message: `The metadata permission level of column with Id ${columnId} has been modified`,
+            type: 'success'
+          });
+          void queryClient.invalidateQueries({
+            queryKey: [tabularDataset.queryKey]
+          });
+        })
+        .catch((error) => {
+          console.error(error);
+          addNotification({
+            message: t('setColumnMetadataPermissionFailure'),
+            type: 'error'
+          });
+        });
+    }
+  );

-  const handleSetColumnDataPermissionLevel = async (columnId: string, newPermissionLevel: $PermissionLevel) => {
-    await axios.patch(`/v1/datasets/column-data-permission/${tabularDataset.id}/${columnId}`, {
-      permission: newPermissionLevel
-    });
-    await queryClient.invalidateQueries({ queryKey: ['dataset-query'] });
-    addNotification({
-      message: `The data permission level of column with Id ${columnId} has been modified`,
-      type: 'success'
-    });
-  };
+  const handleSetColumnDataPermissionLevel = useDestructiveAction(
+    async (columnId: string, newPermissionLevel: $PermissionLevel) => {
+      await axios
+        .patch(`/v1/datasets/column-data-permission/${tabularDataset.id}/${columnId}`, {
+          permission: newPermissionLevel
+        })
+        .then(() => {
+          addNotification({
+            message: `The data permission level of column with Id ${columnId} has been modified`,
+            type: 'success'
+          });
+          void queryClient.invalidateQueries({ queryKey: [tabularDataset.queryKey] });
+        })
+        .catch((error) => {
+          console.error(error);
+          addNotification({
+            message: t('setColumnDataPermissionFailure'),
+            type: 'error'
+          });
+        });
+    }
+  );

const handleToggleColumnNullable = async (columnId: string) => {
-    await axios.patch(`/v1/datasets/column-nullable/${tabularDataset.id}/${columnId}`);
-    await queryClient.invalidateQueries({ queryKey: ['dataset-query'] });
-    addNotification({
-      message: `The nullability of column with Id ${columnId} has been modified`,
-      type: 'success'
-    });
+    await axios
+      .patch(`/v1/datasets/column-nullable/${tabularDataset.id}/${columnId}`)
+      .then(() => {
+        addNotification({
+          message: `The nullability of column with Id ${columnId} has been modified`,
+          type: 'success'
+        });
+        void queryClient.invalidateQueries({ queryKey: [tabularDataset.queryKey] });
+      })
weijietan-fc marked this conversation as resolved.
+      .catch((error) => {
+        console.error(error);
+        addNotification({
+          message: t('toggleColumnNullableFailure'),
+          type: 'error'
+        });
+      });
};

-  const handleChangeColumnType = async (columnId: string, type: $ColumnType) => {
-    await axios.patch(`/v1/datasets/column-type/${tabularDataset.id}/${columnId}`, { kind: type });
-    await queryClient.invalidateQueries({ queryKey: ['dataset-query'] });
-    addNotification({
-      message: `The column type of column with Id ${columnId} has been modified`,
-      type: 'success'
-    });
-  };
+  const handleChangeColumnType = useDestructiveAction(async (columnId: string, type: $ColumnType) => {
+    await axios
+      .patch(`/v1/datasets/column-type/${tabularDataset.id}/${columnId}`, { kind: type })
+      .then(() => {
+        addNotification({
+          message: `The data type of column with Id ${columnId} has been modified`,
+          type: 'success'
+        });
+        void queryClient.invalidateQueries({ queryKey: [tabularDataset.queryKey] });
+      })
+      .catch((error) => {
+        console.error(error);
+        addNotification({
+          message: t('changeColumnDataTypeFailure'),
+          type: 'error'
+        });
+      });
+  });

const handleDeleteColumn = useDestructiveAction(async (columnId: string) => {
-    await axios.delete(`/v1/datasets/column/${tabularDataset.id}/${columnId}`);
-    await queryClient.invalidateQueries({ queryKey: ['dataset-query'] });
-    addNotification({
-      message: `Column with Id ${columnId} has been deleted`,
-      type: 'success'
-    });
+    await axios
+      .delete(`/v1/datasets/column/${tabularDataset.id}/${columnId}`)
+      .then(() => {
+        addNotification({
+          message: `The column with Id ${columnId} has been deleted.`,
+          type: 'success'
+        });
+        void queryClient.invalidateQueries({ queryKey: [tabularDataset.queryKey] });
+      })
+      .catch((error) => {
+        console.error(error);
+        addNotification({
+          message: t('deleteColumnFailure'),
+          type: 'error'
+        });
+      });
+  });

const getSummary = (columnName: string) => {
@@ -135,7 +195,7 @@ export const DatasetTable = (tabularDataset: DatasetTableProps) => {
<ChevronDownIcon className="w-3" />
</div>
</DropdownMenu.Trigger>
-              <DropdownMenu.Content className="w-56">
+              <DropdownMenu.Content className="w-64">
{tabularDataset.isManager && (
<>
<DropdownMenu.Group>
8 changes: 7 additions & 1 deletion web/src/features/dataset/components/ManagerCard.tsx
@@ -40,7 +40,13 @@ const ManagerCard = ({ datasetId, isManager, managerId }: ManagerCardProps) => {
});
void navigate({ to: `/portal/datasets` });
})
-      .catch(console.error);
+      .catch((error) => {
+        console.error(error);
+        addNotification({
+          message: t('removeDatasetManagerFailure'),
+          type: 'error'
+        });
+      });
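The same log-then-notify catch pattern recurs across DatasetTable and ManagerCard; it could be expressed once with a small factory along these lines (`notifyOnError` is a hypothetical helper, not part of the PR):

```typescript
type Notification = { message: string; type: 'error' };

// Build a catch handler that logs the error and pushes a translated
// error notification through the provided notifier.
const notifyOnError =
  (notify: (n: Notification) => void, message: string) =>
  (error: unknown): void => {
    console.error(error);
    notify({ message, type: 'error' });
  };
```

Each call site would then read `.catch(notifyOnError(addNotification, t('removeDatasetManagerFailure')))`.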
};

return (