
Commit 3c891ec

Update google link to use shared drive (#1819)
### Checks
- [ ] Avoid including large-size files in the PR.
- [ ] Clean up long text outputs from code cells in the notebook.
- [ ] For security purposes, please check the contents and remove any sensitive info such as user names and private key.
- [ ] Ensure (1) hyperlinks and markdown anchors are working (2) use relative paths for tutorial repo files (3) put figure and graphs in the `./figure` folder
- [ ] Notebook runs automatically `./runner.sh -t <path to .ipynb file>`

Signed-off-by: YunLiu <55491388+KumoLiu@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent f85cef6 commit 3c891ec

File tree

25 files changed (+51, −45 lines)


3d_classification/densenet_training_array.ipynb

Lines changed: 1 addition & 1 deletion
```diff
@@ -200,7 +200,7 @@
 ],
 "source": [
 "if not os.path.isfile(images[0]):\n",
-" resource = \"https://drive.google.com/file/d/1f5odq9smadgeJmDeyEy_UOjEtE_pkKc0/view?usp=sharing\"\n",
+" resource = \"https://developer.download.nvidia.com/assets/Clara/monai/tutorials/IXI-T1.tar\"\n",
 " md5 = \"34901a0593b41dd19c1a1f746eac2d58\"\n",
 "\n",
 " dataset_dir = os.path.join(root_dir, \"ixi\")\n",
```

3d_regression/densenet_training_array.ipynb

Lines changed: 1 addition & 1 deletion
```diff
@@ -205,7 +205,7 @@
 "outputs": [],
 "source": [
 "if not os.path.isfile(images[0]):\n",
-" resource = \"https://drive.google.com/file/d/1f5odq9smadgeJmDeyEy_UOjEtE_pkKc0/view?usp=sharing\"\n",
+" resource = \"https://developer.download.nvidia.com/assets/Clara/monai/tutorials/IXI-T1.tar\"\n",
 " md5 = \"34901a0593b41dd19c1a1f746eac2d58\"\n",
 "\n",
 " dataset_dir = os.path.join(root_dir, \"ixi\")\n",
```

3d_segmentation/swin_unetr_brats21_segmentation_3d.ipynb

Lines changed: 2 additions & 2 deletions
```diff
@@ -45,7 +45,7 @@
 "\n",
 "https://www.synapse.org/#!Synapse:syn27046444/wiki/616992\n",
 "\n",
-"The JSON file containing training and validation sets (internal split) needs to be downloaded from this [link](https://drive.google.com/file/d/1i-BXYe-wZ8R9Vp3GXoajGyqaJ65Jybg1/view?usp=sharing) and placed in the same folder as the dataset. As discussed in the following, this tutorial uses fold 1 for training a Swin UNETR model on the BraTS 21 challenge.\n",
+"The JSON file containing training and validation sets (internal split) needs to be downloaded from this [link](https://developer.download.nvidia.com/assets/Clara/monai/tutorials/brats21_folds.json) and placed in the same folder as the dataset. As discussed in the following, this tutorial uses fold 1 for training a Swin UNETR model on the BraTS 21 challenge.\n",
 "\n",
 "### Tumor Characteristics\n",
 "\n",
@@ -114,7 +114,7 @@
 " \"TrainingData/BraTS2021_01146/BraTS2021_01146_flair.nii.gz\"\n",
 " 
 "\n",
-"- Download the json file from this [link](https://drive.google.com/file/d/1i-BXYe-wZ8R9Vp3GXoajGyqaJ65Jybg1/view?usp=sharing) and placed in the same folder as the dataset.\n"
+"- Download the json file from this [link](https://developer.download.nvidia.com/assets/Clara/monai/tutorials/brats21_folds.json) and placed in the same folder as the dataset.\n"
 ]
 },
 {
```

3d_segmentation/swin_unetr_btcv_segmentation_3d.ipynb

Lines changed: 1 addition & 1 deletion
```diff
@@ -331,7 +331,7 @@
 "outputs": [],
 "source": [
 "# uncomment this command to download the JSON file directly\n",
-"# wget -O data/dataset_0.json 'https://drive.google.com/uc?export=download&id=1qcGh41p-rI3H_sQ0JwOAhNiQSXriQqGi'"
+"# wget -O data/dataset_0.json 'https://developer.download.nvidia.com/assets/Clara/monai/tutorials/swin_unetr_btcv_dataset_0.json'"
 ]
 },
 {
```

3d_segmentation/vista3d/vista3d_spleen_finetune.ipynb

Lines changed: 1 addition & 1 deletion
```diff
@@ -191,7 +191,7 @@
 }
 ],
 "source": [
-"resource = \"https://drive.google.com/file/d/1Sbe6GjlgH-GIcXolZzUiwgqR4DBYNLQ3/view?usp=drive_link\"\n",
+"resource = \"https://developer.download.nvidia.com/assets/Clara/monai/tutorials/model_zoo/model_vista3d.pt\"\n",
 "if not os.path.exists(os.path.join(root_dir, \"model.pt\")):\n",
 " download_url(url=resource, filepath=os.path.join(root_dir, \"model.pt\"))\n",
 "if os.path.exists(os.path.join(root_dir, \"model.pt\")):\n",
```
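The guard in this cell — check whether the target file exists and fetch only when it is missing — recurs across these notebooks. A minimal, library-agnostic sketch of the pattern (the `fetch` callable stands in for MONAI's `download_url`; the helper name is illustrative):

```python
import os
from typing import Callable


def ensure_file(filepath: str, fetch: Callable[[str], None]) -> str:
    """Invoke fetch(filepath) only when filepath does not already exist, then return it.

    This makes repeated notebook runs idempotent: the (possibly large)
    download happens at most once per destination path.
    """
    if not os.path.exists(filepath):
        fetch(filepath)
    return filepath
```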

competitions/MICCAI/surgtoolloc/preprocess_to_build_detection_dataset.ipynb

Lines changed: 1 addition & 1 deletion
```diff
@@ -71,7 +71,7 @@
 "## Load useful data\n",
 "\n",
 "As described in `readme.md`, we manually labeled 1126 frames in order to build the detection model.\n",
-"Please download the manually labeled bounding boxes from [google drive](https://drive.google.com/file/d/1iO4bXTGdhRLIoxIKS6P_nNAgI_1Fp_Vg/view?usp=sharing), the uncompressed folder `labels` is saved into `label_14_tools_yolo_640_blur/`."
+"Please download the manually labeled bounding boxes from [google drive](https://developer.download.nvidia.com/assets/Clara/monai/tutorials/1126_frame_labels.zip), the uncompressed folder `labels` is saved into `label_14_tools_yolo_640_blur/`."
 ]
 },
 {
```

deployment/ray/mednist_classifier_ray.ipynb

Lines changed: 1 addition & 1 deletion
```diff
@@ -122,7 +122,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"resource = \"https://drive.google.com/uc?id=1zKRi5FrwEES_J-AUkM7iBJwc__jy6ct6\"\n",
+"resource = \"https://developer.download.nvidia.com/assets/Clara/monai/tutorials/deployment/classifier.zip\"\n",
 "dst = os.path.join(\"..\", \"bentoml\", \"classifier.zip\")\n",
 "if not os.path.exists(dst):\n",
 " download_url(resource, dst)"
```

detection/README.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -46,7 +46,7 @@ Then run the following command and go directly to Sec. 3.2.
 python3 luna16_prepare_env_files.py
 ```
 
-Alternatively, you can download the original data and resample them by yourself with the following steps. Users can either download 1) mhd/raw data from [LUNA16](https://luna16.grand-challenge.org/Home/) or its [copy](https://drive.google.com/drive/folders/1-enN4eNEnKmjltevKg3W2V-Aj0nriQWE?usp=share_link), or 2) DICOM data from [LIDC-IDRI](https://wiki.cancerimagingarchive.net/pages/viewpage.action?pageId=1966254) with [NBIA Data Retriever](https://wiki.cancerimagingarchive.net/display/NBIA/Downloading+TCIA+Images).
+Alternatively, you can download the original data and resample them by yourself with the following steps. Users can either download 1) mhd/raw data from [LUNA16](https://luna16.grand-challenge.org/Home/), or 2) DICOM data from [LIDC-IDRI](https://wiki.cancerimagingarchive.net/pages/viewpage.action?pageId=1966254) with [NBIA Data Retriever](https://wiki.cancerimagingarchive.net/display/NBIA/Downloading+TCIA+Images).
 
 The raw CT images in LUNA16 have various voxel sizes. The first step is to resample them to the same voxel size, which is defined in the value of "spacing" in [./config/config_train_luna16_16g.json](./config/config_train_luna16_16g.json).
 
````

federated_learning/breast_density_challenge/data/README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 ## Example breast density data
 
-Download example data from https://drive.google.com/file/d/1Fd9GLUIzbZrl4FrzI3Huzul__C8wwzyx/view?usp=sharing.
+Download example data from https://developer.download.nvidia.com/assets/Clara/monai/tutorials/fl/preprocessed.zip.
 Extract here.
 
 ## Data source
```

federated_learning/openfl/openfl_mednist_2d_registration/run_docker.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -118,7 +118,7 @@ $ fx envoy start --shard-name env_two --disable-tls --envoy-config-path envoy_co
 ```
 [13:48:42] INFO 🧿 Starting the Envoy. envoy.py:53
 Downloading...
-From: https://drive.google.com/uc?id=1QsnnkvZyJPcbRoV_ArW8SnE1OTuoVbKE
+From: https://developer.download.nvidia.com/assets/Clara/monai/tutorials/MedNIST.tar.gz
 To: /tmp/tmpd60wcnn8/MedNIST.tar.gz
 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 61.8M/61.8M [00:04<00:00, 13.8MB/s]
 2022-07-22 13:48:48,735 - INFO - Downloaded: MedNIST.tar.gz
````
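The log above shows the envoy fetching `MedNIST.tar.gz` and then unpacking it. A rough standard-library sketch of the unpack step (illustrative only, not OpenFL's actual shard-descriptor code, which handles this internally):

```python
import tarfile
from pathlib import Path


def extract_archive(archive_path: str, out_dir: str):
    """Extract a .tar.gz archive into out_dir and return the member names.

    Intended for trusted archives only: no path-traversal filtering is done here.
    """
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(out_dir)
        return tar.getnames()
```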

0 commit comments
