This document describes how a backend for RedisAI can be built from this repository. It lists the supported compilation devices on a per-backend basis and describes the tools and commands required. Unless indicated otherwise, a backend is compiled in a Docker container, which is responsible for the configuration and installation of all tools required for a given backend on a per-platform basis.

To follow these instructions, this repository must be cloned with all of its submodules (i.e. *git clone --recursive https://github.com/redisai/redisai*).

GNU Make is used as a runner for the dockerfile generator. Python is the language used for the generator script, and jinja2 is the templating library used to create the Dockerfile from a template *dockerfile.tmpl* that can be found in the directory of a given backend listed below.
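To illustrate the templating step, the following is a minimal sketch of how jinja2 renders a Dockerfile from a template. The template string and the variable names (*base_image*, *gcc_version*) are hypothetical placeholders for illustration, not the actual contents or variables of *dockerfile.tmpl*:

```
# Hypothetical sketch of the jinja2 templating step; the real template and
# variable names live in each backend's dockerfile.tmpl.
from jinja2 import Template

template = Template(
    "FROM {{ base_image }}\n"
    "RUN apt-get update && apt-get install -y gcc-{{ gcc_version }}\n"
)
dockerfile = template.render(base_image="ubuntu:18.04", gcc_version="7")
print(dockerfile)
```

The generator script renders the real template the same way, substituting per-platform values before the resulting Dockerfile is built.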

## Tools

Building the backends requires installation of the following tools:

1. GNU Make
1. Python (3.0 or higher)
1. Docker
1. jinja2

On Ubuntu Bionic, these can be installed by running the following steps, which install Python 3, create a virtual environment, and install the jinja2 templating dependency. Replace */path/to/venv* with your desired virtualenv location.
```
sudo apt install python3 python3-dev python3-venv make docker.io
python3 -m venv /path/to/venv
source /path/to/venv/bin/activate
pip install jinja2
```

-------

## Backends

### onnxruntime

We build the ONNX Runtime library with the DISABLE_EXTERNAL_INITIALIZERS=ON build flag. This means that it is invalid to load ONNX models that use external files to store the initial (usually very large) values of the model's operations. That is, initializer values must be part of the serialized model, which is also the standard use case.

**Compilation target devices:**

1. x86\_64 Linux systems
1. x86\_64 Linux systems with a GPU

**Directory:** opt/build/onnxruntime

**Build options:**

1. To build, run *make*
1. To build with GPU support on x86\_64, run *make GPU=1*
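Putting the steps together, an end-to-end build might look like the following transcript (a sketch; it assumes the tools above are installed and uses the directory noted above):

```
git clone --recursive https://github.com/redisai/redisai
cd redisai/opt/build/onnxruntime
make          # CPU build
make GPU=1    # GPU build, x86_64 only
```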