This is a work in progress.
We are building a website to address broken supply chains, such as those that occur during natural disasters. The site will let makers and manufacturers match demand for needed articles with their capacity to produce them. Matching demand to supply is a fundamental human problem, and solving it could save lives and relieve distress during the serial natural disasters we face.
Our OKH and OKW libraries are implemented as publicly accessible Azure blob containers:
```json
"Azure_Storage_ServiceName": "https://projdatablobstorage.blob.core.windows.net",
"Azure_Storage_OKH_ContainerName": "okh",
"Azure_Storage_OKW_ContainerName": "okw"
```
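Because the containers are public, their contents can be listed anonymously with Azure Blob Storage's "List Blobs" REST operation (`restype=container&comp=list`). A minimal sketch; the helper names here are ours, not project code:

```typescript
// listBlobsUrl builds the anonymous "List Blobs" REST URL for a public
// Azure blob container (restype=container&comp=list).
function listBlobsUrl(serviceUrl: string, container: string): string {
  return `${serviceUrl.replace(/\/+$/, "")}/${container}?restype=container&comp=list`;
}

// Fetch the XML blob listing for the public OKH container.
async function listOkhBlobs(): Promise<string> {
  const url = listBlobsUrl("https://projdatablobstorage.blob.core.windows.net", "okh");
  const res = await fetch(url);
  if (!res.ok) throw new Error(`listing failed: ${res.status}`);
  return res.text(); // an XML <EnumerationResults> document
}
```

The response is XML rather than JSON, so a real client would parse it or use the `@azure/storage-blob` SDK instead.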
These OKHs and OKWs are taken from our repo: https://github.com/helpfulengineering/library.
Helpful created its own OKW template and added an extension to OKH, both of which are defined here: https://github.com/helpfulengineering/OKF-Schema
We are currently working with the Internet of Production Alliance (IoPA) to unify these extensions with their official schemas.
At present, we are using Microsoft Azure to build a basic website.
Harry Pierson has explored AI-based matching of available tools and capabilities to those needed to make required articles.
The matching engine used for supply-graph flows lives in a separate repository, supply-graph-ai (Open Hardware Manager, OHM): a FastAPI service that this Nuxt app calls from the browser.
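A browser-side call to that service can be sketched as follows. The request body shown is a placeholder only (the real schema is documented at OHM's `/docs` page), and both helper names are ours:

```typescript
// Build the match route from the OHM base URL (VITE_SUPPLY_GRAPH_AI_URL),
// tolerating a trailing slash on the base.
function matchUrl(base: string): string {
  return `${base.replace(/\/+$/, "")}/v1/api/match`;
}

// POST a match request to the Open Hardware Manager and return its JSON reply.
// The payload shape is illustrative; consult supply-graph-ai's /docs page.
async function requestMatch(base: string, payload: unknown): Promise<unknown> {
  const res = await fetch(matchUrl(base), {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`match request failed: ${res.status}`);
  return res.json();
}
```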
We need volunteers. Although we can use a wide variety of skills, we especially need:
- Programmers who can build a website with Azure
- Front-end programmers who, using Nuxt/Vue/Bootstrap, can implement visual components. Apply here: https://helpful.directory/opportunity?id=Software-Developer---Typescript-446
- AI programmers who can use vector matching and LLMs to build a robust matching algorithm.
This repository is a monorepo of separate npm packages (no root `package.json`). Install and start each part from its package directory.
| Component | Directory | Default URL | Purpose |
|---|---|---|---|
| Azure Functions API | `packages/back-end` | http://127.0.0.1:7071/api | OKH/OKW listing, blob-backed product data, incidents (Postgres), etc. |
| Nuxt front end | `packages/front-end` | http://localhost:3000 | Vue UI |
| Open Hardware Manager (supply-graph-ai) | separate clone | http://localhost:8001 | Matching / supply-tree API (`POST /v1/api/match`, docs at `/docs`) |
| Mock match API (optional) | `packages/mock-api` | http://localhost:8001 | Tiny Express stub; same port as OHM, so use one or the other, not both |
- Node.js (LTS recommended) and npm
- Azure Functions Core Tools v4 (installed with `packages/back-end` via npm; ensure `func` is on your `PATH` after `npm install`)
- Azure CLI (`az`) — install from Microsoft's docs. You need subscription access and permissions for team resources (blob storage, etc.) as appropriate.
- PostgreSQL — only if you use endpoints that query the database (e.g. `/api/incidents`). Configure via `packages/back-end/.env` (see `packages/back-end/README.md`).
- supply-graph-ai — Python 3.12, Conda (recommended), and dependencies per that repo's README (local dev or Docker Compose).
- Copy `local.settings.json.template` to `local.settings.json` in `packages/back-end/`. The Functions host reads Azure storage settings from here. The app fails at startup if `Azure_Storage_ServiceName`, `Azure_Storage_OKH_ContainerName`, or `Azure_Storage_OKW_ContainerName` are missing.
- If `func start` reports missing job storage, set `AzureWebJobsStorage` in `local.settings.json` to a valid value (for example an Azurite connection string for local development).
- For database-backed routes, create `packages/back-end/.env` with `PGHOST`, `PGUSER`, `PGPASSWORD`, `PGDATABASE`, and optionally `PGPORT`.
- From the repo root:

  ```shell
  cd packages/back-end
  npm install
  npm run start
  ```

  `npm run start` runs the TypeScript build and then `func start`. HTTP routes are served under `/api/...` (for example http://localhost:7071/api/listOKHsummaries).
- `az login` — use the subscription and permissions your team expects for Azure resources (blob access, etc.).
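For reference, a filled-in `local.settings.json` might look like the fragment below, assuming the public storage settings quoted earlier and Azurite for local job storage. The `FUNCTIONS_WORKER_RUNTIME` value is our assumption for a TypeScript Functions app; confirm against the template file:

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "Azure_Storage_ServiceName": "https://projdatablobstorage.blob.core.windows.net",
    "Azure_Storage_OKH_ContainerName": "okh",
    "Azure_Storage_OKW_ContainerName": "okw"
  }
}
```

`UseDevelopmentStorage=true` is the standard shortcut connection string for the local storage emulator (Azurite).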
```shell
cd packages/front-end
npm install
npm run dev
```

Open http://localhost:3000 (and http://localhost:3000/homepage if you use that route).
The front end uses two different configuration mechanisms:

- `nuxt.config.ts` → `runtimeConfig.public.baseUrl` — used by pages that call `useRuntimeConfig().public.baseUrl` (for example the home page). The default is `http://127.0.0.1:7071/api` (note `127.0.0.1`, not `localhost`; the project has seen browser/runtime quirks with `localhost` here). Override with `BACKEND_URL` when starting Nuxt, for example:

  ```shell
  BACKEND_URL=http://127.0.0.1:7071/api npm run dev
  ```

- `VITE_*` variables — used by supply-graph-related pages (supply-graph-api, product supply tree). Set them when starting Nuxt so they match your local Functions host and OHM:

  ```shell
  BACKEND_URL=http://127.0.0.1:7071/api \
  VITE_API_BASE_URL=http://127.0.0.1:7071/api \
  VITE_SUPPLY_GRAPH_AI_URL=http://localhost:8001 \
  npm run dev
  ```

`nuxt.config.ts` also exposes `SUPPLY_GRAPH_AI_URL` as `runtimeConfig.public.supplyGraphAiUrl` for code that reads runtime config; the supply-graph pages above primarily use `VITE_SUPPLY_GRAPH_AI_URL`.
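Both mechanisms reduce to "use the environment value if set, otherwise the 127.0.0.1 default". A small illustration of that fallback; the helper and its precedence order are ours, not project code:

```typescript
// Pick the backend base URL: prefer VITE_API_BASE_URL, then BACKEND_URL,
// then the default local Functions host (127.0.0.1, not localhost).
function resolveBaseUrl(env: Record<string, string | undefined>): string {
  return env.VITE_API_BASE_URL ?? env.BACKEND_URL ?? "http://127.0.0.1:7071/api";
}
```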
Clone and run supply-graph-ai outside this repo (for example next to it: `../supply-graph-ai`). Typical local startup from that project:

```shell
cd /path/to/supply-graph-ai
conda create -n supply-graph-ai python=3.12
conda activate supply-graph-ai
pip install -r requirements.txt
pip install -e .
cp env.template .env  # edit if your team requires storage or API keys
python run.py
```

The HTTP API listens on http://localhost:8001 by default (`API_PORT` in that project's settings). Interactive docs: http://localhost:8001/docs.
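Before wiring the front end to OHM, it can help to verify that the service is up. Since FastAPI serves its interactive docs at `/docs`, a simple reachability probe suffices; the helper names below are ours, not project code:

```typescript
// URL of OHM's interactive FastAPI docs page.
function docsUrl(base: string): string {
  return `${base.replace(/\/+$/, "")}/docs`;
}

// Return true if OHM answers on its docs route, false if unreachable.
async function ohmIsUp(base = "http://localhost:8001"): Promise<boolean> {
  try {
    const res = await fetch(docsUrl(base));
    return res.ok;
  } catch {
    return false; // connection refused: the service is not running
  }
}
```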
| Direction | What to configure |
|---|---|
| Browser → Azure Functions | `BACKEND_URL` / default `baseUrl` → `http://127.0.0.1:7071/api` (or your deployed API URL). |
| Browser → OHM | `VITE_SUPPLY_GRAPH_AI_URL` → base URL only, no path (e.g. `http://localhost:8001`). The front end appends `/v1/api/match` for match requests. |
| OHM → Azure Blob | Configure OHM's `.env` / storage settings per supply-graph-ai documentation so it can load OKW/OKH data as needed for matching. |
Match endpoint: `POST {VITE_SUPPLY_GRAPH_AI_URL}/v1/api/match` — this is the route the Open Hardware Manager exposes; it is not the same as the stub in `packages/mock-api` (`POST /v1/match`).

CORS: In development, supply-graph-ai typically allows browser calls; if you change origins or run production-like settings, set `CORS_ORIGINS` in OHM's environment so `http://localhost:3000` (and/or `http://127.0.0.1:3000`) is allowed.

Ports: Do not run `packages/mock-api` on port 8001 at the same time as OHM; they conflict. Use the mock only when you want a minimal stub instead of the real Python service.
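An illustrative `.env` fragment for supply-graph-ai that allows the Nuxt dev server origins; the comma-separated value format is our assumption, so confirm it against that repo's settings documentation:

```
# In supply-graph-ai's .env: allow the Nuxt dev server origins
CORS_ORIGINS=http://localhost:3000,http://127.0.0.1:3000
```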
A minimal Express server for quick stubs. The code listens on port 8001 and implements `POST /v1/match`, which is not the same path as OHM's `POST /v1/api/match`. Use it only when you are not running supply-graph-ai on 8001.

```shell
cd packages/mock-api
npm install
npm run dev
```

The older prototype website is implemented with GitHub Pages.
A TypeScript port of the original Project Data Python code.