The project ships with a [`docker-compose.yml`](docker-compose.yml) that can be used to get a local version of the component running:

```bash
docker compose up
```

> Note that for the Composite Handler to be able to interact with the target S3 bucket, the Docker Compose setup assumes that the `AWS_PROFILE` environment variable has been set and that a valid AWS session is available.
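
A session might be prepared like this before bringing the stack up - a sketch, assuming an AWS profile named `dlcs` (the profile name is illustrative):

```shell
# Point the Compose stack at an AWS profile (the name "dlcs" is an example)
export AWS_PROFILE=dlcs
echo "Using AWS profile: $AWS_PROFILE"
```

With the profile exported and a valid session available (e.g. after `aws sso login`), the Compose command above will pick the credentials up from the environment.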
There are 3 possible entrypoints to make the above easier:
* `entrypoint.sh` - this will wait for Postgres to be available and run `manage.py migrate` and `manage.py createcachetable` if `MIGRATE=True`. It will run `manage.py createsuperuser` if `INIT_SUPERUSER=True` (this also needs the `DJANGO_SUPERUSER_*` envvars).
* `entrypoint-api.sh` - this runs the above, then starts an nginx instance fronting a gunicorn process.
* `entrypoint-worker.sh` - this runs the above, then runs `python manage.py qcluster`.

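As a sketch, an environment that drives `entrypoint.sh` through a full bootstrap might look like this (all values are illustrative, not defaults):

```shell
# Illustrative bootstrap environment for entrypoint.sh (example values only)
export MIGRATE=True            # run migrate + createcachetable on startup
export INIT_SUPERUSER=True     # attempt to create a Django superuser
export DJANGO_SUPERUSER_USERNAME=admin
export DJANGO_SUPERUSER_EMAIL=admin@example.com
export DJANGO_SUPERUSER_PASSWORD=change-me
```

The same variables can equally be supplied via `--env-file` when running the image, as shown further down.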
## Configuration
The following environment variables are supported:

| Name | Default | Component | Description |
|---|---|---|---|
|`ENGINE_WORKER_MAX_ATTEMPTS`|`0`| Engine | The number of processing attempts a single task will undergo before it is abandoned. Setting this value to `0` will cause a task to be retried forever. |
|`MIGRATE`| None | API, Engine | If `"True"`, runs migrations and `createcachetable` on startup (when an entrypoint script is used). |
|`INIT_SUPERUSER`| None | API, Engine | If `"True"`, attempts to create a superuser. Needs the standard Django envvars to be set (e.g. `DJANGO_SUPERUSER_USERNAME`, `DJANGO_SUPERUSER_EMAIL`, `DJANGO_SUPERUSER_PASSWORD`) when an entrypoint script is used. |
|`GUNICORN_WORKERS`|`2`| API | The value of the [`--workers`](https://docs.gunicorn.org/en/stable/run.html) argument when running gunicorn. |

Note that in order to access the S3 bucket, the Composite Handler assumes that valid AWS credentials are available in the environment - this can be in the form of [environment variables](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html), or in the form of ambient credentials.

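Where ambient credentials are not available, the standard AWS environment variables can be exported instead - a sketch with placeholder values only (never commit real credentials):

```shell
# Placeholder AWS credentials and region (example values only)
export AWS_ACCESS_KEY_ID=EXAMPLEKEYID
export AWS_SECRET_ACCESS_KEY=examplesecretkey
export AWS_DEFAULT_REGION=eu-west-1
```
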
The project ships with a [`Dockerfile`](./Dockerfile):

```bash
docker build -t dlcs/composite-handler:local .
```

This will produce a single image that can be used to execute any of the supported Django commands, including running the API and the engine:

```bash
docker run dlcs/composite-handler:local python manage.py migrate           # Apply any pending DB schema changes
docker run dlcs/composite-handler:local python manage.py createcachetable  # Create the cache table (if it doesn't exist)
docker run --env-file .env -it --rm dlcs/composite-handler:local /srv/dlcs/entrypoint-api.sh     # Run the API
docker run --env-file .env -it --rm dlcs/composite-handler:local /srv/dlcs/entrypoint-worker.sh  # Run the engine
docker run dlcs/composite-handler:local python manage.py qmonitor          # Monitor the workers
```