35 changes: 30 additions & 5 deletions README.md
This repository is a **monorepo of separate npm packages** (no root `package.json`).

| Service | Location | Local URL | Purpose |
|-----------|-----------|-------------|---------|
| Azure Functions API | `packages/back-end` | `http://127.0.0.1:7071/api` | OKH/OKW listing, blob-backed product data, incidents (Postgres), etc. |
| Nuxt front end | `packages/front-end` | `http://localhost:3000` | Vue UI |
| Open Hardware Manager (supply-graph-ai) | *separate clone* (often **Docker Compose**) | `http://localhost:8001` | FastAPI matching / supply-tree API (`POST /v1/api/match`, docs at `/docs`) |
| Mock match API (optional) | `packages/mock-api` | `http://localhost:8001` | Tiny Express stub; **same port as OHM** — use one or the other, not both |

## Prerequisites
- **Azure Functions Core Tools v4** (installed with `packages/back-end` via npm; ensure `func` is on your `PATH` after `npm install`)
- **Azure CLI** (`az`) — install from [Microsoft’s docs](https://learn.microsoft.com/en-us/cli/azure/install-azure-cli). You need subscription access and permissions for team resources (blob storage, etc.) as appropriate.
- **PostgreSQL** — only if you use endpoints that query the database (e.g. `/api/incidents`). Configure via `packages/back-end/.env` (see [packages/back-end/README.md](packages/back-end/README.md)).
- **supply-graph-ai** — easiest path is **Docker Desktop** and **`docker compose`** in that repo (service **`ohm-api`** maps **host port 8001**). Alternative: local Python 3.12 / Conda per [supply-graph-ai README](https://github.com/helpfulengineering/supply-graph-ai/blob/main/README.md).

## 1. Back end (`packages/back-end`)


## 3. Open Hardware Manager — **supply-graph-ai** (matching API)

Clone [supply-graph-ai](https://github.com/helpfulengineering/supply-graph-ai) **outside** this repo (for example next to it: `../supply-graph-ai`).

### Docker Desktop (typical setup)

With **Docker Desktop** running, from the **supply-graph-ai** repository root:

```bash
cd /path/to/supply-graph-ai
cp env.template .env # if you have not already; edit for storage/API keys as needed
docker compose up ohm-api
```

The Compose file publishes the FastAPI app on **host `localhost:8001`** (container service **`ohm-api`**, port mapping `8001:8001`). Confirm it is up:

- **http://localhost:8001/health**
- **http://localhost:8001/docs**

You can use `docker compose up` instead of `docker compose up ohm-api` if you intend to start the full stack defined in that repo’s `docker-compose.yml`.
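For orientation, the relevant service entry in that repo’s `docker-compose.yml` looks roughly like the sketch below. Only the service name and port mapping come from the description above; the remaining fields are assumptions:

```yaml
# Sketch only — consult supply-graph-ai's docker-compose.yml for the real definition.
services:
  ohm-api:
    build: .          # assumption: built from the repo's Dockerfile
    env_file: .env    # picks up the values copied from env.template
    ports:
      - "8001:8001"   # host 8001 -> container 8001 (FastAPI app)
```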

### Local Python (no Docker)

For active development inside supply-graph-ai with hot reload:

```bash
cd /path/to/supply-graph-ai
conda create -n supply-graph-ai python=3.12
conda activate supply-graph-ai
pip install -r requirements.txt
pip install -e .
cp env.template .env
python run.py
```

Same default URL: **`http://localhost:8001`** (`API_PORT` in that project’s settings).

### Wiring project-data-platform-ts ↔ supply-graph-ai

| Direction | What to configure |
|-----------|-------------------|
| Browser → **Azure Functions** | `BACKEND_URL` / default `baseUrl` → `http://127.0.0.1:7071/api` (or your deployed API URL). |
| Browser → **OHM** | `VITE_SUPPLY_GRAPH_AI_URL` → base URL only, **no path** (e.g. `http://localhost:8001`). The front end appends **`/v1/api/match`** for match requests. |
| OKH file URL for match | Optional **`VITE_PUBLIC_OKH_BLOB_BASE`** (default: `https://projdatablobstorage.blob.core.windows.net/okh`). The UI sends **`okh_url`** = `{base}/{fname}` so OHM can fetch the manifest over HTTPS. |
| OHM → **Azure Blob** | Configure OHM’s `.env` / storage settings per [supply-graph-ai documentation](https://github.com/helpfulengineering/supply-graph-ai) so it can load OKW/OKH data as needed for matching. |
| **OKH vs OKW in a match call** | This repo sends **OKH** as **`okh_url`**. **OKW** (capability) files are **not** attached by the browser; OHM’s **`OKWService`** loads them from **OHM-configured storage** and matches against the fetched OKH. See **[docs/match-endpoint-integration.md](docs/match-endpoint-integration.md)** (section *OKH + OKW workflow*). |
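The URL and payload rules in the table can be sketched as two small helpers (hypothetical names for illustration; the real client code lives in `packages/front-end/utils/ohmMatch.ts` and may differ):

```typescript
// Hypothetical sketch of the wiring rules above — not the actual helpers.
const DEFAULT_OKH_BLOB_BASE =
  "https://projdatablobstorage.blob.core.windows.net/okh";

// VITE_SUPPLY_GRAPH_AI_URL is a base URL only (no path);
// the front end appends /v1/api/match itself.
function matchEndpoint(ohmBaseUrl: string): string {
  return `${ohmBaseUrl.replace(/\/+$/, "")}/v1/api/match`;
}

// The UI sends okh_url = {base}/{fname} so OHM can fetch the manifest over HTTPS.
function matchPayload(
  fname: string,
  blobBase: string = DEFAULT_OKH_BLOB_BASE
): { okh_url: string } {
  return { okh_url: `${blobBase.replace(/\/+$/, "")}/${fname}` };
}

console.log(matchEndpoint("http://localhost:8001"));
console.log(JSON.stringify(matchPayload("example.okh.yml")));
```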

**Match endpoint:** **`POST {VITE_SUPPLY_GRAPH_AI_URL}/v1/api/match`** — this is the route the Open Hardware Manager exposes; it is **not** the same as the stub in `packages/mock-api` (`POST /v1/match`).

**End-to-end match testing:** Follow **[docs/match-endpoint-integration.md](docs/match-endpoint-integration.md)** for the ordered verification steps (OHM health, blob URL reachability from Docker, Azure Functions, CORS, and curl). Shared client helpers live in **`packages/front-end/utils/ohmMatch.ts`**. To assert the **same JSON body and headers** as the UI against a running OHM, run **`OKH_FNAME=… node scripts/verify-ohm-match.mjs`** from the repo root (see doc §6).

**CORS:** In development, supply-graph-ai typically allows browser calls; if you change origins or run production-like settings, set **`CORS_ORIGINS`** in OHM’s environment so **`http://localhost:3000`** (and/or `http://127.0.0.1:3000`) is allowed.
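For example, an entry along these lines in OHM’s environment (the variable name comes from above; the exact list format it expects is an assumption — check supply-graph-ai’s settings):

```
CORS_ORIGINS=http://localhost:3000,http://127.0.0.1:3000
```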

**Ports:** Do not run **`packages/mock-api`** on port 8001 at the same time as OHM; they conflict. Use the mock only when you want a minimal stub instead of the real Python service.
139 changes: 36 additions & 103 deletions packages/front-end/pages/products/[id]/supplyTree.vue
@@ -1,36 +1,26 @@
<script setup lang="ts">
import { useRoute } from "#app";
import { ref, onMounted } from "vue";
import D3SupplyTree from "../../../components/D3Tree.vue";
import {
  buildManufacturingMatchPayload,
  parseOhmMatchBody,
  postOhmMatch,
} from "~/utils/ohmMatch";

const route = useRoute();

const productFilename = route.params.id as string;

const okhData = ref<any>([]);
const supplyTreeData = ref<any>(null);
const loading = ref<boolean>(false);
const error = ref<string | null>(null);
const selectedOkh = ref<any>(null);

const sendToSupplyGraphAI = async (o: any) => {
  if (!o) {
@@ -58,112 +48,55 @@
    error.value = null;
    selectedOkh.value = okhItem;
    try {
      const payload = buildManufacturingMatchPayload(productFilename);

      console.log("OHM match payload:", payload);
      const response = await postOhmMatch(payload as Record<string, unknown>);

      if (!response.ok) {
        let errorData: unknown = null;
        try {
          errorData = await response.json();
        } catch {
          console.warn("Could not parse error response as JSON");
        }

        throw new Error(
          `OHM match error: ${response.status} ${response.statusText}` +
            (errorData ? ` ${JSON.stringify(errorData)}` : "")
        );
      }

      const supplyTreeResponse = await response.json();
      const { solutions } = parseOhmMatchBody(supplyTreeResponse);

      if (!solutions.length) {
        // HTTP 200 with empty solutions is valid; not a client/transport error.
        error.value = null;
        treeData.value = {
          name: selectedOkh.value?.name || "Supply Tree",
          children: [{ image: "/okh.png", children: [] }],
        };
        return;
      }

      const formattedSolutions: any[] = [];

      for (const solution of solutions) {
        const tree = (solution.tree as Record<string, unknown> | undefined) || {};
        const capRaw = tree.capabilities_used;
        const caps = Array.isArray(capRaw) ? capRaw : [];
        const children = caps.map((capability: unknown) => ({
          name: String(capability),
          image: "/OKP_icon.png",
        }));

        formattedSolutions.push({
          name: String(solution.facility_name ?? "Facility"),
          image: "/okw_maker.png",
          class: "test",
          children,
        });
      }

      // Update treeData.value to trigger re-render in D3Tree component
      treeData.value = {
        name: selectedOkh.value?.name || "Supply Tree",
        children: [