For developing the code locally, you need to:
- Compile the C++ adaguc-server
- Run a PostgreSQL server
- Start the application with the Python wrapper
- Test the server with geographically referenced test data
After the Python wrapper is started, the adaguc-server is accessible on your workstation via HTTP. The easiest way to explore datasets is via the autowms feature, which gives you an overview of the data available on your machine via the browser.
To compile adaguc-server you need to have the required dependencies installed. These can be installed via your system's package manager. Scripts are available:
- Dependencies for Red Hat
- Dependencies for Ubuntu 18
- Dependencies for Ubuntu 20
- Dependencies for Ubuntu 22
- Dependencies for Mac
We provide several scripts to set up PostgreSQL on your machine.
Alternatively, run a PostgreSQL database using docker:
docker run --rm -d \
--name adaguc_db_dev \
-e POSTGRES_USER=adaguc \
-e POSTGRES_PASSWORD=adaguc \
-e POSTGRES_DB=adaguc \
-p 5432:5432 \
postgres:17.4
When started, the database is available on localhost with username adaguc, password adaguc, and database name adaguc. You can inspect the database with:
psql "dbname=adaguc user=adaguc password=adaguc host=localhost"
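Once connected, a couple of one-off checks can be handy. A small sketch; `\dt` and `SELECT version()` are standard psql/PostgreSQL, but the commands obviously only work after the database container is up, so they are commented out here:

```shell
# Connection string matching the docker settings above
conn="dbname=adaguc user=adaguc password=adaguc host=localhost"

# Run one-off commands non-interactively with -c (requires a running database):
# psql "$conn" -c '\dt'                # list the tables adaguc has created
# psql "$conn" -c 'SELECT version();'  # confirm the PostgreSQL version

echo "$conn"
```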
To make the application accessible via the web, a Python wrapper is available. This requires at least Python 3.8 and the ability to create a virtualenv with Python.
To install virtualenv, please check Virtual env on Ubuntu.
You have to do this once:
python3 -m venv env
source env/bin/activate
pip3 install --upgrade pip pip-tools
pip3 install -r requirements.txt
pip3 install -r requirements-dev.txt
cd ./python/lib/ && python3 setup.py develop && cd ../../
Make sure the data directories are available:
sudo mkdir -p /data/adaguc-data # For data files connected to dataset configurations
sudo mkdir -p /data/adaguc-autowms # For Exploring your own data
sudo mkdir -p /data/adaguc-datasets # For XML dataset config files
sudo chown $USER: /data -R
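Before starting the server it can help to confirm that these directories exist and are writable by your user. A minimal sketch; the helper name `check_dirs` is ours, not part of adaguc:

```shell
# check_dirs: print a status line per directory (hypothetical helper)
check_dirs() {
  for d in "$@"; do
    if [ -d "$d" ] && [ -w "$d" ]; then
      echo "ok: $d"
    else
      echo "missing or not writable: $d"
    fi
  done
}

check_dirs /data/adaguc-data /data/adaguc-autowms /data/adaguc-datasets
```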
After each restart, you only have to do:
source env/bin/activate
export ADAGUC_PATH=`pwd`
export ADAGUC_DATASET_DIR=/data/adaguc-datasets
export ADAGUC_DATA_DIR=/data/adaguc-data
export ADAGUC_AUTOWMS_DIR=/data/adaguc-autowms
export ADAGUC_CONFIG=${ADAGUC_PATH}/python/lib/adaguc/adaguc-server-config-python-postgres.xml
export ADAGUC_NUMPARALLELPROCESSES=4
export ADAGUC_DB="user=adaguc password=adaguc host=localhost dbname=adaguc"
export ADAGUC_ENABLELOGBUFFER=FALSE
export ADAGUC_TRACE_TIMINGS=FALSE
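Since the same exports are needed after every restart, you could collect them in a small helper file, e.g. `adaguc-env.sh` (a hypothetical name, not part of the repository), and source it alongside the virtualenv:

```shell
# Write the exports to adaguc-env.sh once (hypothetical helper file).
# The quoted 'EOF' keeps `pwd` literal, so it is evaluated when sourced.
cat > adaguc-env.sh <<'EOF'
export ADAGUC_PATH=`pwd`
export ADAGUC_DATASET_DIR=/data/adaguc-datasets
export ADAGUC_DATA_DIR=/data/adaguc-data
export ADAGUC_AUTOWMS_DIR=/data/adaguc-autowms
export ADAGUC_CONFIG=${ADAGUC_PATH}/python/lib/adaguc/adaguc-server-config-python-postgres.xml
export ADAGUC_NUMPARALLELPROCESSES=4
export ADAGUC_DB="user=adaguc password=adaguc host=localhost dbname=adaguc"
export ADAGUC_ENABLELOGBUFFER=FALSE
export ADAGUC_TRACE_TIMINGS=FALSE
EOF

# After each restart, from the adaguc-server checkout:
# source env/bin/activate && source adaguc-env.sh
source adaguc-env.sh
echo "$ADAGUC_DATASET_DIR"
```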
After the dependencies have been installed, execute the following script to compile the adaguc-server binaries:
bash compile.sh
source env/bin/activate
export ADAGUC_PATH=`pwd`
export ADAGUC_DATASET_DIR=/data/adaguc-datasets
export ADAGUC_DATA_DIR=/data/adaguc-data
export ADAGUC_AUTOWMS_DIR=/data/adaguc-autowms
export ADAGUC_CONFIG=${ADAGUC_PATH}/python/lib/adaguc/adaguc-server-config-python-postgres.xml
export ADAGUC_NUMPARALLELPROCESSES=4
export ADAGUC_DB="user=adaguc password=adaguc host=localhost dbname=adaguc"
export ADAGUC_ENABLELOGBUFFER=FALSE
export ADAGUC_TRACE_TIMINGS=TRUE
# To enable core dump generation, additionally do:
# ulimit -c unlimited
# sudo sysctl -w kernel.core_pattern=core-adagucserver
# Then you can use:
# gdb ./bin/adagucserver `ls -Art core-adagucserver.* | tail -n 1`
python3 ./python/python_fastapi_server/main.py
The adaguc-server WMS server will then be accessible at http://127.0.0.1:8080/wms. The autowms can be explored in the adaguc-viewer via the following link: https://adaguc.knmi.nl/adaguc-viewer/index.html?autowms=http://localhost:8080/autowms. Keep in mind that you may have to allow mixed content in your browser, as the local server is not running over https.
Note: the data directories cannot point to a symbolic link; for security purposes, adaguc checks that the path contains no symbolic links.
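A quick way to check this up front is to compare each configured path with its fully resolved form. A sketch; `realpath -m` (GNU coreutils) resolves symlinks without requiring the path to exist, and the helper name is ours:

```shell
# check_no_symlinks: warn when a path resolves differently, i.e. contains a symlink
check_no_symlinks() {
  dir="$1"
  resolved=$(realpath -m "$dir")
  if [ "$resolved" = "$dir" ]; then
    echo "ok: $dir"
  else
    echo "warning: $dir resolves to $resolved"
  fi
}

check_no_symlinks /data/adaguc-autowms
```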
Note: For production purposes the server should be started with gunicorn: Start adaguc-server with gunicorn
You can check the correct functioning of the adaguc-server by running the functional tests:
docker compose -f Docker/docker-compose-test.yml up -Vd
bash runtests_psql.sh
Copy a test NetCDF file and display it:
cp ./data/datasets/testdata.nc /data/adaguc-autowms/
- You can browse your local instance via autowms using https://adaguc.knmi.nl/adaguc-viewer/index.html?autowms=http://localhost:8080/autowms
- You can now load the test dataset via http://localhost:8080/wms?source=testdata.nc& in https://adaguc.knmi.nl/adaguc-viewer/index.html
- Or directly via: https://adaguc.knmi.nl/adaguc-viewer/index.html?#addlayer('http://localhost:8080/wms?source=testdata.nc&','testdata')
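Besides the viewer, you can request a rendered map directly from the WMS endpoint. A sketch of a standard WMS 1.3.0 GetMap request; the `LAYERS=testdata` value is an assumption here, so check the GetCapabilities response for the actual layer name first:

```shell
base="http://localhost:8080/wms?source=testdata.nc"

# Discover the available layers first (requires the server to be running):
# curl "${base}&SERVICE=WMS&REQUEST=GetCapabilities"

# Then request a map; LAYERS=testdata is assumed, adjust to your GetCapabilities output:
getmap="${base}&SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&LAYERS=testdata&CRS=EPSG:4326&BBOX=-90,-180,90,180&WIDTH=800&HEIGHT=400&FORMAT=image/png"
# curl -o testdata.png "$getmap"
echo "$getmap"
```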
First start the docker containers used for testing:
docker compose -f Docker/docker-compose-test.yml up -Vd
Then run the tests by doing:
bash runtests_psql.sh
You need to do these steps once to set up the pre-commit hooks:
- Install the python dependencies:
pip-sync requirements.txt requirements-dev.txt
cd ./python/lib/ && python3 setup.py develop && cd ../../
- Set up pre-commit hooks
git init
pre-commit install
From then on, git commit will automatically run the python formatting tools.
To skip the pre-commit hook, run git commit -m "A message" --no-verify.
To manually execute the pre-commit hooks on all files, run: pre-commit run --all-files.
Important files:
- pyproject.toml for configuration of python formatting
- .pre-commit-config.yaml for pre-commit configuration
- list datasets:
bash ./scripts/scan.sh -l
- scan a dataset:
bash ./scripts/scan.sh -d <dataset name>