Commit cd08cbe

677 Add hybrid programming example for bundle (#678)
* [DLMED] add hybrid placeholders
* [DLMED] update doc
* [DLMED] add scripts
* [DLMED] split to 2 examples
* [DLMED] rename to bundle
* [DLMED] update hybrid programming
* [DLMED] update bundles to bundle
* [DLMED] add README
* [DLMED] update according to comments
* [DLMED] fix format

Signed-off-by: Nic Ma <nma@nvidia.com>
1 parent b66354a commit cd08cbe

21 files changed: +262 -9 lines changed

README.md

Lines changed: 1 addition & 1 deletion
@@ -178,7 +178,7 @@ Demonstrates the use of the `ThreadBuffer` class used to generate data batches d
 Illustrate reading NIfTI files and test speed of different transforms on different devices.

 **modules**
-#### [engines](./modules/bundles)
+#### [bundle](./modules/bundle)
 Get started tutorial and concrete training / inference examples for MONAI bundle features.
 #### [engines](./modules/engines)
 Training and evaluation examples of 3D segmentation based on UNet3D and synthetic dataset with MONAI workflows, which contains engines, event-handlers, and post-transforms. And GAN training and evaluation example for a medical image generative adversarial network. Easy run training script uses `GanTrainer` to train a 2D CT scan reconstruction network. Evaluation script generates random samples from a trained network.

modules/bundle/README.md

Lines changed: 9 additions & 0 deletions
# MONAI bundle
This folder contains the get_started tutorial and concrete training / inference examples for MONAI bundle features.

### [spleen segmentation](./spleen_segmentation)
A bundle example for volumetric (3D) segmentation of the spleen from CT images.
### [customize component](./custom_component)
This example shows how to bring customized Python components, such as transforms, networks, and metrics, into a configuration-based workflow.
### [hybrid programming](./hybrid_programming)
This example shows how to parse the config files in your own Python program, instantiate the necessary components, and execute inference.

modules/bundle/custom_component/README.md

Lines changed: 16 additions & 0 deletions
# Description
This example shows a typical use case of bringing customized Python components (such as transforms, networks, and metrics) into a configuration-based workflow.

Please note that this example depends on the `spleen_segmentation` bundle example and runs by overriding its config file.

## Example commands
To run the workflow with customized components, `PYTHONPATH` should be revised to include the path to the customized components:
```
export PYTHONPATH=$PYTHONPATH:"<path to 'custom_component/scripts'>"
```
Please also make sure that the folder `custom_component/scripts` is a valid Python module (it contains an `__init__.py` file).

Override the `train` config with the customized `transform` and execute training:
```
python -m monai.bundle run training --meta_file <spleen_configs_path>/metadata.json --config_file "['<spleen_configs_path>/train.json','configs/custom_train.json']" --logging_file <spleen_configs_path>/logging.conf
```

modules/bundle/custom_component/configs/custom_train.json

Lines changed: 7 additions & 0 deletions
{
    "train#preprocessing#transforms#6":
    {
        "_target_": "scripts.custom_transforms.PrintEnsureTyped",
        "keys": ["image", "label"]
    }
}
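
As a side note, the override above can also be inspected from plain Python with `monai.bundle.ConfigParser`; this is an illustrative sketch rather than part of the example, and the `spleen_configs/` path stands in for a hypothetical local copy of the `spleen_segmentation` configs:

```
from monai.bundle import ConfigParser

# later config files override matching ids in earlier ones, so item #6 of the
# train preprocessing chain becomes the custom transform defined above
parser = ConfigParser()
parser.read_config(["spleen_configs/train.json", "configs/custom_train.json"])

print(parser["train#preprocessing#transforms#6"])  # the overriding config dict

# instantiating the item requires `scripts` to be importable (see the PYTHONPATH note above)
transform = parser.get_parsed_content("train#preprocessing#transforms#6")
print(type(transform).__name__)  # -> "PrintEnsureTyped"
```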

modules/bundle/custom_component/scripts/__init__.py

Lines changed: 10 additions & 0 deletions
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

modules/bundle/custom_component/scripts/custom_transforms.py

Lines changed: 32 additions & 0 deletions
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from monai.config import KeysCollection
from monai.transforms import EnsureTyped


class PrintEnsureTyped(EnsureTyped):
    """
    Extend the `EnsureTyped` transform to print the image shape.

    Args:
        keys: keys of the corresponding items to be transformed.

    """

    def __init__(self, keys: KeysCollection, data_type: str = "tensor") -> None:
        super().__init__(keys, data_type=data_type)

    def __call__(self, data):
        d = dict(super().__call__(data=data))
        for key in self.key_iterator(d):
            print(f"data shape of {key}: {d[key].shape}")
        return d
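
A quick usage sketch of the transform above on a dummy data dictionary; it is illustrative rather than part of the example, assumes `custom_component/scripts` is on `PYTHONPATH`, and uses an arbitrary array shape:

```
import numpy as np

from scripts.custom_transforms import PrintEnsureTyped

transform = PrintEnsureTyped(keys=["image", "label"])
sample = {
    "image": np.zeros((1, 96, 96, 96), dtype=np.float32),
    "label": np.zeros((1, 96, 96, 96), dtype=np.float32),
}
# converts the arrays to tensors and prints, e.g.:
#   data shape of image: torch.Size([1, 96, 96, 96])
#   data shape of label: torch.Size([1, 96, 96, 96])
result = transform(sample)
```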

modules/bundle/get_started.ipynb

Lines changed: 5 additions & 5 deletions
@@ -8,7 +8,7 @@
 "\n",
 "A MONAI bundle usually includes the stored weights of a model, TorchScript model, JSON files which include configs and metadata about the model, information for constructing training, inference, and post-processing transform sequences, plain-text description, legal information, and other data the model creator wishes to include.\n",
 "\n",
-"For more information about MONAI bundles read the description: https://docs.monai.io/en/latest/bundle_intro.html.\n",
+"For more information about the MONAI bundle, read the description: https://docs.monai.io/en/latest/bundle_intro.html.\n",
 "\n",
 "This notebook is a step-by-step tutorial to help get started to develop a bundle package, which contains a config file to construct the training pipeline and also has a `metadata.json` file to define the metadata information.\n",
 "\n",
@@ -26,7 +26,7 @@
 "- Override config content at runtime.\n",
 "- Hybrid programming with config and python code.\n",
 "\n",
-"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/master/modules/bundles/get_started.ipynb)"
+"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/master/modules/bundle/get_started.ipynb)"
 ]
 },
 {
@@ -143,7 +143,7 @@
 "source": [
 "## Define train config - Set imports and input / output environments\n",
 "\n",
-"Now let's start to define the config file for a regular training task. MONAI bundles support both `JSON` and `YAML` format, here we use `JSON` as the example.\n",
+"Now let's start to define the config file for a regular training task. MONAI bundle supports both `JSON` and `YAML` formats; here we use `JSON` as the example.\n",
 "\n",
 "According to the predefined syntax of MONAI bundle, `$` indicates an expression to evaluate, `@` refers to another object in the config content. For more details about the syntax in bundle config, please check: https://docs.monai.io/en/latest/config_syntax.html.\n",
 "\n",
@@ -463,7 +463,7 @@
 "Usually we need to execute validation for every N epochs during training to verify the model and save the best model.\n",
 "\n",
 "Here we don't define the `validate` section step by step as it's similar to the `train` section. The full config is available: \n",
-"https://github.com/Project-MONAI/tutorials/blob/master/modules/bundles/spleen_segmentation/configs/train.json\n",
+"https://github.com/Project-MONAI/tutorials/blob/master/modules/bundle/spleen_segmentation/configs/train.json\n",
 "\n",
 "Just show an example of `macro text replacement` to simplify the config content and avoid duplicated text. Please note that it's just token text replacement of the config content, not refer to the instantiated python objects."
 ]
@@ -498,7 +498,7 @@
 "We can define a `metadata` file in the bundle, which contains the metadata information relating to the model, including what the shape and format of inputs and outputs are, what the meaning of the outputs are, what type of model is present, and other information. The structure is a dictionary containing a defined set of keys with additional user-specified keys.\n",
 "\n",
 "A typical `metadata` example is available: \n",
-"https://github.com/Project-MONAI/tutorials/blob/master/modules/bundles/spleen_segmentation/configs/metadata.json"
+"https://github.com/Project-MONAI/tutorials/blob/master/modules/bundle/spleen_segmentation/configs/metadata.json"
 ]
 },
 {
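
Since the notebook text above leans on the bundle config syntax (`$` marks an expression to evaluate, `@` references another config item), here is a tiny illustrative sketch, not taken from the notebook, of how the two markers behave when parsed with `monai.bundle.ConfigParser`:

```
from monai.bundle import ConfigParser

# hypothetical config fragment
config = {
    "image_key": "image",
    "load": {"_target_": "LoadImaged", "keys": "@image_key"},  # "@" -> reference
    "num_samples": "$2 + 3",  # "$" -> Python expression evaluated at parse time
}
parser = ConfigParser(config)

print(parser.get_parsed_content("num_samples"))  # -> 5
loader = parser.get_parsed_content("load")       # an instantiated LoadImaged transform
print(type(loader).__name__)                     # -> "LoadImaged"
```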

modules/bundle/hybrid_programming/README.md

Lines changed: 10 additions & 0 deletions
# Description
This example shows a typical use case of parsing the config files in your own Python program, instantiating the necessary components, and executing inference.

## Example commands

Parse the config files and execute inference from the Python program:

```
python -m scripts.inference run --config_file "['configs/data_loading.json','configs/net_inferer.json','configs/post_processing.json']" --ckpt_path <path_to_checkpoint>
```
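
For orientation, here is a minimal sketch of how such a program can parse the config files listed in the command above and build the dataloader; the structure and names are illustrative, not the actual `scripts/inference.py`, and the input path is hypothetical. Loading the checkpoint and running the inferer are sketched after the config files below.

```
from monai.bundle import ConfigParser

# parse and merge the config files, just as the command above does
parser = ConfigParser()
parser.read_config([
    "configs/data_loading.json",
    "configs/net_inferer.json",
    "configs/post_processing.json",
])

# "@input_data" is referenced by the dataset config but is expected to be
# supplied by the calling program
parser["input_data"] = [{"image": "/path/to/volume.nii.gz"}]  # hypothetical path

# lazily instantiate only the components that are needed
dataloader = parser.get_parsed_content("dataloader")
print(type(dataloader).__name__)  # -> "DataLoader"
```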

modules/bundle/hybrid_programming/configs/data_loading.json

Lines changed: 52 additions & 0 deletions
{
    "image_key": "image",
    "preprocessing": {
        "_target_": "Compose",
        "transforms": [
            {
                "_target_": "LoadImaged",
                "keys": "@image_key"
            },
            {
                "_target_": "EnsureChannelFirstd",
                "keys": "@image_key"
            },
            {
                "_target_": "Orientationd",
                "keys": "@image_key",
                "axcodes": "RAS"
            },
            {
                "_target_": "Spacingd",
                "keys": "@image_key",
                "pixdim": [1.5, 1.5, 2.0],
                "mode": "bilinear"
            },
            {
                "_target_": "ScaleIntensityRanged",
                "keys": "@image_key",
                "a_min": -57,
                "a_max": 164,
                "b_min": 0,
                "b_max": 1,
                "clip": true
            },
            {
                "_target_": "EnsureTyped",
                "keys": "@image_key"
            }
        ]
    },
    "dataset": {
        "_target_": "Dataset",
        "data": "@input_data",
        "transform": "@preprocessing"
    },
    "dataloader": {
        "_target_": "DataLoader",
        "dataset": "@dataset",
        "batch_size": 1,
        "shuffle": false,
        "num_workers": 4
    }
}

modules/bundle/hybrid_programming/configs/net_inferer.json

Lines changed: 18 additions & 0 deletions
{
    "network": {
        "_target_": "UNet",
        "spatial_dims": 3,
        "in_channels": 1,
        "out_channels": 2,
        "channels": [16, 32, 64, 128, 256],
        "strides": [2, 2, 2, 2],
        "num_res_units": 2,
        "norm": "batch"
    },
    "inferer": {
        "_target_": "SlidingWindowInferer",
        "roi_size": [96, 96, 96],
        "sw_batch_size": 4,
        "overlap": 0.5
    }
}
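
To close the loop, a hedged sketch of how the `network` and `inferer` defined above might be combined with the dataloader from `data_loading.json`; the checkpoint format (a plain `state_dict` for the UNet) and the input path are assumptions, not taken from the example scripts:

```
import torch
from monai.bundle import ConfigParser

parser = ConfigParser()
parser.read_config(["configs/data_loading.json", "configs/net_inferer.json"])
parser["input_data"] = [{"image": "/path/to/volume.nii.gz"}]  # hypothetical input

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
network = parser.get_parsed_content("network").to(device)
# assumes the checkpoint stores a plain state_dict for the UNet defined above
network.load_state_dict(torch.load("<path_to_checkpoint>", map_location=device))
inferer = parser.get_parsed_content("inferer")

network.eval()
with torch.no_grad():
    for batch in parser.get_parsed_content("dataloader"):
        images = batch["image"].to(device)
        preds = inferer(inputs=images, network=network)  # sliding-window inference
        print(f"prediction shape: {preds.shape}")
```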
