Commit 25eab2b

Update full_gpu_inference_pipeline for monai 1.0.0rc (#908)
* Update client.ipynb
* Update README.md
* Removed "pip install monai==1.0.0rc" in readme
1 parent ef01e6b commit 25eab2b

File tree

2 files changed, +4 -42 lines changed


full_gpu_inference_pipeline/README.md

Lines changed: 3 additions & 41 deletions
@@ -77,49 +77,11 @@ Before installing the packages in your conda environment, make sure that you hav
 export PYTHONNOUSERSITE=True
 ```
 If this variable is not exported and similar packages are installed outside your conda environment, your tar file may not contain all the dependencies required for an isolated Python environment.
-Install Pytorch with CUDA 11.3 support.
-```bash
-pip install torch==1.10.1+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
-```
-Install MONAI and the recommended dependencies.
-```bash
-BUILD_MONAI=1 pip install --no-build-isolation git+https://github.com/Project-MONAI/MONAI#egg=monai
-```
-Then we can verify the installation of MONAI and all its dependencies:
-```bash
-python -c 'import monai; monai.config.print_config()'
-```
-You'll see the output below, which lists the versions of MONAI and relevant dependencies.
+Install MONAI and the recommended dependencies, you can also refer to the [installation guide](https://docs.monai.io/en/latest/installation.html) of MONAI.
 
 ```bash
-MONAI version: 0.8.0+65.g4bd13fe
-Numpy version: 1.21.4
-Pytorch version: 1.10.1+cu113
-MONAI flags: HAS_EXT = True, USE_COMPILED = False
-MONAI rev id: 4bd13fefbafbd0076063201f0982a2af8b56ff09
-MONAI __file__: /usr/local/lib/python3.8/dist-packages/monai/__init__.py
-Optional dependencies:
-Pytorch Ignite version: NOT INSTALLED or UNKNOWN VERSION.
-Nibabel version: 3.2.1
-scikit-image version: 0.19.1
-Pillow version: 9.0.0
-Tensorboard version: NOT INSTALLED or UNKNOWN VERSION.
-gdown version: NOT INSTALLED or UNKNOWN VERSION.
-TorchVision version: NOT INSTALLED or UNKNOWN VERSION.
-tqdm version: NOT INSTALLED or UNKNOWN VERSION.
-lmdb version: NOT INSTALLED or UNKNOWN VERSION.
-psutil version: NOT INSTALLED or UNKNOWN VERSION.
-pandas version: NOT INSTALLED or UNKNOWN VERSION.
-einops version: NOT INSTALLED or UNKNOWN VERSION.
-transformers version: NOT INSTALLED or UNKNOWN VERSION.
-mlflow version: NOT INSTALLED or UNKNOWN VERSION.
-
-For details about installing the optional dependencies, please visit:
-https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies
-```
-Install the dependencies of MONAI:
-```bash
-pip install nibabel scikit-image pillow tensorboard gdown ignite torchvision itk tqdm lmdb psutil cucim pandas einops transformers mlflow matplotlib tensorboardX tifffile cupy
+pip install 'monai[all]'
+pip install cupy
 ```
 Next, we should package the conda environment by using `conda-pack` command, which will produce a package of monai.tar.gz. This file contains all the environments needed by the python backend model and is portable. Then put the created monai.tar.gz under the spleen_seg folder, and the config.pbtxt should be set as:
 ```bash
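After the simplified install step above, a quick sanity check is to print MONAI's configuration from inside the conda environment that will later be packaged with `conda-pack`. This is a minimal sketch using the same `monai.config.print_config()` call the old README relied on:

```python
# Minimal sketch: confirm that `pip install 'monai[all]'` pulled in MONAI and
# its optional dependencies inside the environment to be packed with conda-pack.
import monai

# Prints the MONAI version, the PyTorch version, and the status of optional
# dependencies (nibabel, scikit-image, cupy, ignite, ...).
monai.config.print_config()
```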

full_gpu_inference_pipeline/client/non_ensemble/client.ipynb

Lines changed: 1 addition & 1 deletion
@@ -197,7 +197,7 @@
 " image_bytes = b''\n",
 " for i, nifti_file in enumerate(nifti_files):\n",
 " image = LoadImage(reader=\"NibabelReader\", image_only=True, dtype=np.float32)(nifti_file)\n",
-" input0_data = np.array([image], dtype=np.float32)\n",
+" input0_data = np.expand_dims(image.array,axis=0)\n",
 " print(input0_data.shape)\n",
 " inputs = [\n",
 " httpclient.InferInput(\"INPUT0\", input0_data.shape, tu.np_to_triton_dtype(input0_data.dtype)),\n",

0 commit comments
