Merged
2 changes: 1 addition & 1 deletion .envs/db.mysql
Original file line number Diff line number Diff line change
@@ -2,4 +2,4 @@ MYSQL_DATABASE=dummy_db
MYSQL_USER=dummy_user
MYSQL_PASSWORD=dummy_pw
MYSQL_ROOT_PASSWORD=dummy_root_pw
MYSQL_TCP_PORT=3306
MYSQL_TCP_PORT=3306
2 changes: 1 addition & 1 deletion .envs/db.postgres
@@ -1,4 +1,4 @@
POSTGRES_HOST=postgres
POSTGRES_DB=dummy_db
POSTGRES_USER=dummy_user
POSTGRES_PASSWORD=dummy_pw
POSTGRES_PASSWORD=dummy_pw
2 changes: 1 addition & 1 deletion .envs/epa.mysql
@@ -13,4 +13,4 @@ SQL_USER=dummy_user
SQL_PASSWORD=dummy_pw
SQL_HOST=db
SQL_PORT=3306
DATABASE=mysql
DATABASE=mysql
2 changes: 1 addition & 1 deletion .envs/epa.postgres
@@ -14,4 +14,4 @@ SQL_USER=dummy_user
SQL_PASSWORD=dummy_pw
SQL_HOST=db_pg
SQL_PORT=5432
DATABASE=postgres
DATABASE=postgres
6 changes: 2 additions & 4 deletions .github/workflows/black_linter.yml
@@ -6,7 +6,7 @@ on:
- 'README.md'
pull_request:
branches:
-main
- main

jobs:
build:
@@ -23,9 +23,7 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install black==19.3b pytest
pip install click==8.0.2
pip install -r app/requirements/postgres.txt
pip install -r app/requirements/local.txt
- name: Analysing the code with black
run: |
black --check .
27 changes: 13 additions & 14 deletions README.md
@@ -23,10 +23,9 @@ Before being able to develop locally, you might need to install postgres, simply

1. Create a virtual environment
2. Activate your virtual environment
3. Install the dependencies with `pip install -r app/requirements/postgres.txt`
4. Install extra local development dependencies with `pip install -r app/dev_requirements.txt`
5. Move to the `app` folder with `cd app`
6. Create environment variables (only replace content surrounded by `<>`)
3. Move to the `app` folder with `cd app`
4. Install local development dependencies with `pip install -r requirements/local.txt`
5. Create environment variables (only replace content surrounded by `<>`)
```
SQL_ENGINE=django.db.backends.postgresql
SQL_DATABASE=<your db name>
@@ -36,33 +35,33 @@ SQL_HOST=localhost
SQL_PORT=5432
DEBUG=(True|False)
```
8. Add an environment variable `MVS_HOST_API` and set the url of the simulation server you wish to use for your models
9. Execute the `local_setup.sh` file (`. local_setup.sh` on linux/mac `bash local_setup.sh` on windows) you might have to make it executable first. Answer yes to the question
10. Start the local server with `python manage.py runserver`
11. You can then login with `testUser` and `ASas12,.` or create your own account
6. Add an environment variable `MVS_HOST_API` and set the URL of the simulation server you wish to use for your models
7. Execute the `local_setup.sh` file (`. local_setup.sh` on linux/mac, `bash local_setup.sh` on windows); you might have to make it executable first. Answer yes to the question
8. Start the local server with `python manage.py runserver`
9. You can then log in with `testUser` and `ASas12,.` or create your own account

## Deploy using Docker Compose
The following commands should get everything up and running, using the web-based version of the MVS API.

You need to be able to run docker-compose inside your terminal. If you can't, you should install [Docker desktop](https://www.docker.com/products/docker-desktop/) first.
You need to be able to run docker-compose inside your terminal. If you can't, you should install [Docker desktop](https://www.docker.com/products/docker-desktop/) first.


* Clone the repository locally `git clone --single-branch --branch main https://github.com/open-plan-tool/gui.git open_plan_gui`
* Move inside the created folder (`cd open_plan_gui`)
* Edit the `.envs/epa.postgres` and `.envs/db.postgres` environment files
* Change the value assigned to `EPA_SECRET_KEY` with a [randomly generated one](https://randomkeygen.com/)
* Make sure to replace dummy names with your preferred names
* The value assigned to the variables `POSTGRES_DB`, `POSTGRES_USER`, `POSTGRES_PASSWORD` in `.envs/db.postgres` should match the ones of
* The value assigned to the variables `POSTGRES_DB`, `POSTGRES_USER`, `POSTGRES_PASSWORD` in `.envs/db.postgres` should match the ones of
the variables `SQL_DATABASE`, `SQL_USER`, `SQL_PASSWORD` in `.envs/epa.postgres`, respectively

* Define an environment variable `MVS_HOST_API` in `.envs/epa.postgres` and set the url of the simulation server
* Define an environment variable `MVS_HOST_API` in `.envs/epa.postgres` and set the url of the simulation server
you wish to use for your models (for example `MVS_HOST_API="<url to your favorite simulation server>"`); you can deploy your own [simulation server](https://github.com/open-plan-tool/simulation-server) locally if you need to

* Assign the domain of your website (without `http://` or `https://`) to `TRUSTED_HOST`; see https://docs.djangoproject.com/en/4.2/ref/settings/#csrf-trusted-origins for more information

Next, you can either run the following commands in a terminal (on Ubuntu you might have to prepend `sudo`)
* `docker-compose --file=docker-compose-postgres.yml up -d --build` (you can replace `postgres` by `mysql` if you want to use mysql)
* `docker-compose --file=docker-compose-postgres.yml exec -u root app_pg sh initial_setup.sh` (this will also load a default testUser account with a sample scenario).
* `docker-compose --file=docker-compose-postgres.yml exec -u root app_pg sh initial_setup.sh` (this will also load a default testUser account with a sample scenario).

Or you can run a python script with the following command
* `python deploy.py -db postgres`
@@ -72,7 +71,7 @@ Finally
* You can then login with `testUser` and `ASas12,.` or create your own account

### Proxy settings (optional)
If you use a proxy you will need to set `USE_PROXY=True` and edit `PROXY_ADDRESS=http://proxy_address:port` with your proxy settings in `.envs/epa.postgres`.
If you use a proxy you will need to set `USE_PROXY=True` and edit `PROXY_ADDRESS=http://proxy_address:port` with your proxy settings in `.envs/epa.postgres`.

>**_NOTE:_** If you wish to use mysql instead of postgres, simply replace `postgres` by `mysql` and `app_pg` by `app` in the above commands or filenames
<hr>
@@ -89,7 +88,7 @@ If you use a proxy you will need to set `USE_PROXY=True` and edit `PROXY_ADDRESS

## Tear down (uninstall) docker containers
To remove the application (including relevant images, volumes, etc.), run the following command in a terminal:

`docker-compose --file=docker-compose-postgres.yml down -v`

You can add `--rmi local` if you wish to also remove the images (rebuilding the docker containers from scratch will then take a long time if you redeploy the app later)
2 changes: 1 addition & 1 deletion app/.dockerignore
@@ -26,4 +26,4 @@ webpack.config.js
# !users/**
# !epa.env
# !manage.py
# !setup.sh
# !setup.sh
27 changes: 27 additions & 0 deletions app/.pre-commit-config.yaml
@@ -0,0 +1,27 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks

exclude: 'docs|node_modules|vendors|migrations|.git|.tox'
default_stages: [pre-commit]
fail_fast: true

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.2.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-json
- id: check-added-large-files
- id: requirements-txt-fixer

- repo: https://github.com/pre-commit/mirrors-jshint
rev: v2.13.6
hooks:
- id: jshint

- repo: https://github.com/psf/black
rev: 24.8.0
hooks:
- id: black
34 changes: 19 additions & 15 deletions app/dashboard/helpers.py
@@ -172,21 +172,25 @@ def fetch_user_projects(user):

def kpi_scalars_list(kpi_scalar_values_dict, KPI_SCALAR_UNITS, KPI_SCALAR_TOOLTIPS):
return [
{
"kpi": key.replace("_", " "),
"value": round(val, 3)
if "currency/kWh" in KPI_SCALAR_UNITS[key]
else round(val, 2),
"unit": KPI_SCALAR_UNITS[key],
"tooltip": KPI_SCALAR_TOOLTIPS[key],
}
if key in KPI_SCALAR_UNITS.keys()
else {
"kpi": key.replace("_", " "),
"value": round(val, 3),
"unit": "N/A",
"tooltip": "",
}
(
{
"kpi": key.replace("_", " "),
"value": (
round(val, 3)
if "currency/kWh" in KPI_SCALAR_UNITS[key]
else round(val, 2)
),
"unit": KPI_SCALAR_UNITS[key],
"tooltip": KPI_SCALAR_TOOLTIPS[key],
}
if key in KPI_SCALAR_UNITS.keys()
else {
"kpi": key.replace("_", " "),
"value": round(val, 3),
"unit": "N/A",
"tooltip": "",
}
)
for key, val in kpi_scalar_values_dict.items()
]
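The reformatted comprehension above keeps the original logic: values whose unit contains `currency/kWh` are rounded to 3 decimals, everything else to 2, and keys missing from the unit table fall back to `N/A`. A minimal standalone sketch of that logic (the sample unit and tooltip tables here are made up for illustration):

```python
def kpi_scalars_list(values, units, tooltips):
    """Build display rows for KPI scalars; unknown keys fall back to N/A."""
    return [
        (
            {
                "kpi": key.replace("_", " "),
                "value": (
                    round(val, 3) if "currency/kWh" in units[key] else round(val, 2)
                ),
                "unit": units[key],
                "tooltip": tooltips[key],
            }
            if key in units
            else {
                "kpi": key.replace("_", " "),
                "value": round(val, 3),
                "unit": "N/A",
                "tooltip": "",
            }
        )
        for key, val in values.items()
    ]


# Hypothetical sample tables, for illustration only
UNITS = {"levelized_cost_of_energy": "currency/kWh", "total_emissions": "kgCO2eq"}
TIPS = {"levelized_cost_of_energy": "LCOE", "total_emissions": "Total emissions"}

rows = kpi_scalars_list(
    {"levelized_cost_of_energy": 0.12345, "unknown_kpi": 1.23456}, UNITS, TIPS
)
print(rows[0]["value"])  # 0.123 (currency/kWh keys keep 3 decimals)
print(rows[1]["unit"])   # N/A (key not present in the unit table)
```

The extra parentheses around the conditional expressions are what black 24.x adds; the behaviour is unchanged.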

65 changes: 39 additions & 26 deletions app/dashboard/models.py
@@ -691,9 +691,9 @@ def available_timeseries(self):
MAP_EPA_MVS.get(sub_cat, sub_cat)
)
if storage_subasset is not None:
storage_subasset[
"category"
] = format_storage_subasset_name(category, sub_cat)
storage_subasset["category"] = (
format_storage_subasset_name(category, sub_cat)
)
storage_subasset["type_oemof"] = asset["type_oemof"]
storage_subasset["energy_vector"] = asset[
"energy_vector"
@@ -755,10 +755,10 @@ def single_asset_results(self, asset_name, asset_category=None):
if storage_subasset is not None:
if answer is None:
answer = storage_subasset
answer[
"category"
] = format_storage_subasset_name(
category, sub_cat
answer["category"] = (
format_storage_subasset_name(
category, sub_cat
)
)
answer["energy_vector"] = asset["energy_vector"]
break
@@ -1045,15 +1045,19 @@ def graph_capacities(simulations, y_variables):

installed_capacity_dict = {
"capacity": [],
"name": _("Installed Capacity")
if multi_scenario is False
else _("Inst. Cap.") + f"{simulation.scenario.name}",
"name": (
_("Installed Capacity")
if multi_scenario is False
else _("Inst. Cap.") + f"{simulation.scenario.name}"
),
}
optimized_capacity_dict = {
"capacity": [],
"name": _("Optimized Capacity")
if multi_scenario is False
else _("Opt. Cap.") + f"{simulation.scenario.name}",
"name": (
_("Optimized Capacity")
if multi_scenario is False
else _("Opt. Cap.") + f"{simulation.scenario.name}"
),
}

# read information about the installed capacity
@@ -1304,16 +1308,20 @@ def graph_costs(
y = df.iloc[:, i].values.tolist()
y_values.append(
{
"base": df.iloc[:, :i].sum(axis=1).values.tolist()
if i > 0
else None,
"base": (
df.iloc[:, :i].sum(axis=1).values.tolist()
if i > 0
else None
),
"value": y,
"text": [name for j in range(len(x_values))],
"name": name
if multi_scenario is False
else name + f" {simulation.scenario.name}",
"name": (
name
if multi_scenario is False
else name + f" {simulation.scenario.name}"
),
"hover": "<b>%{text}, </b><br><br>Block value: %{customdata:.2f}$<br>Stacked value: %{y:.2f}$<extra> %{x}</extra>",
"customdata": y
"customdata": y,
# https://stackoverflow.com/questions/59057881/python-plotly-how-to-customize-hover-template-on-with-what-information-to-show
}
)
@@ -1326,14 +1334,18 @@
y = df.iloc[i, :].values.tolist()
y_values.append(
{
"base": df.iloc[:i, :].sum(axis=0).values.tolist()
if i > 0
else None,
"base": (
df.iloc[:i, :].sum(axis=0).values.tolist()
if i > 0
else None
),
"value": y,
"text": [name for j in range(len(x_values))],
"name": name
if multi_scenario is False
else name + f" {simulation.scenario.name}",
"name": (
name
if multi_scenario is False
else name + f" {simulation.scenario.name}"
),
"hover": "<b>%{text}</b><br><br>Block value: %{customdata:.2f}$<br>Stacked value: %{y:.2f}$",
"customdata": y,
}
Expand Down Expand Up @@ -1495,6 +1507,7 @@ def graph_sankey(simulation, energy_vector, timestep=None):
GRAPH_SANKEY: graph_sankey,
}


# # TODO change the form from this model to adapt the choices depending on single scenario/compare scenario or sensitivity
class ReportItem(models.Model):
title = models.CharField(max_length=120, default="", blank=True)
5 changes: 0 additions & 5 deletions app/dev_requirements.txt

This file was deleted.

2 changes: 1 addition & 1 deletion app/dev_setup.sh
@@ -10,4 +10,4 @@ export EXCHANGE_ACCOUNT="dummy@rl-institut.de"
export EXCHANGE_PW="dummy"
export EXCHANGE_EMAIL="dummy@rl-institut.de"
export EXCHANGE_SERVER="dummy"
export RECIPIENTS="dummy@b.de,dummy@a.com"
export RECIPIENTS="dummy@b.de,dummy@a.com"
2 changes: 1 addition & 1 deletion app/djangoq_setup.sh
@@ -1,3 +1,3 @@
#!/usr/local/bin/python
python manage.py collectstatic && \
python manage.py qcluster
python manage.py qcluster
31 changes: 18 additions & 13 deletions app/epa/settings.py
@@ -9,6 +9,7 @@
For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.0/ref/settings/
"""

import ast
import os

@@ -110,19 +111,23 @@
# https://docs.djangoproject.com/en/3.0/ref/settings/#databases
# SQLite is used if no other database system is set via environment variables.
DATABASES = {
"default": {
"ENGINE": os.environ.get("SQL_ENGINE"),
"NAME": os.environ.get("SQL_DATABASE"),
"USER": os.environ.get("SQL_USER"),
"PASSWORD": os.environ.get("SQL_PASSWORD"),
"HOST": os.environ.get("SQL_HOST"),
"PORT": os.environ.get("SQL_PORT"),
}
if os.environ.get("SQL_ENGINE")
else {
"ENGINE": os.environ.get("SQL_ENGINE", "django.db.backends.sqlite3"),
"NAME": os.environ.get("SQL_DATABASE", os.path.join(BASE_DIR, "db.sqlite3")),
}
"default": (
{
"ENGINE": os.environ.get("SQL_ENGINE"),
"NAME": os.environ.get("SQL_DATABASE"),
"USER": os.environ.get("SQL_USER"),
"PASSWORD": os.environ.get("SQL_PASSWORD"),
"HOST": os.environ.get("SQL_HOST"),
"PORT": os.environ.get("SQL_PORT"),
}
if os.environ.get("SQL_ENGINE")
else {
"ENGINE": os.environ.get("SQL_ENGINE", "django.db.backends.sqlite3"),
"NAME": os.environ.get(
"SQL_DATABASE", os.path.join(BASE_DIR, "db.sqlite3")
),
}
)
}

DEFAULT_AUTO_FIELD = "django.db.models.AutoField"
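The reformatted `DATABASES` block above keeps the original fallback behaviour: a fully env-configured database when `SQL_ENGINE` is set, a local SQLite file otherwise. A standalone sketch of that branching (the `BASE_DIR` here is a placeholder, not the project's real value):

```python
import os

BASE_DIR = "/tmp/app"  # placeholder for illustration


def database_config(env):
    """Return a Django-style DATABASES['default'] dict from an env mapping."""
    if env.get("SQL_ENGINE"):
        return {
            "ENGINE": env.get("SQL_ENGINE"),
            "NAME": env.get("SQL_DATABASE"),
            "USER": env.get("SQL_USER"),
            "PASSWORD": env.get("SQL_PASSWORD"),
            "HOST": env.get("SQL_HOST"),
            "PORT": env.get("SQL_PORT"),
        }
    # no engine configured -> fall back to a local SQLite file
    return {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": os.path.join(BASE_DIR, "db.sqlite3"),
    }


print(database_config({})["ENGINE"])  # django.db.backends.sqlite3
print(database_config({"SQL_ENGINE": "django.db.backends.postgresql"})["ENGINE"])
```

This is why the README's local-development steps work without any database setup: with no `SQL_ENGINE` in the environment, Django silently uses SQLite.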
1 change: 1 addition & 0 deletions app/epa/urls.py
@@ -13,6 +13,7 @@
1. Import the include() function: from django.urls import include, path
2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
"""

from django.contrib import admin
from django.conf.urls.i18n import i18n_patterns
from django.urls import path, re_path, include
2 changes: 1 addition & 1 deletion app/fixtures/test_users.json
@@ -35,4 +35,4 @@
"user_permissions": []
}
}
]
]
2 changes: 1 addition & 1 deletion app/fixtures/two_scenarios_fixture.json

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions app/local_setup.sh
@@ -5,4 +5,5 @@ python manage.py migrate && \
python manage.py update_assettype && \
python manage.py loaddata 'fixtures/multivector_fixture.json' && \
echo yes | python manage.py collectstatic && \
pre-commit install && \

Is it added to the dependencies the developers need to install in the README steps? If not, can you add it there?

Good catch, I did forget to update the readme.

echo 'Completed Setup Successfully!!'