6 changes: 4 additions & 2 deletions .coderabbit.yaml
@@ -121,6 +121,7 @@ reviews:
contain a README.md or file README.ipynb.
- If an example contains Python code, it should be placed in a subdirectory named `src/` and should
contain a `pyproject.toml` file. Optionally, it might also contain scripts in a `scripts/` directory.
- If an example contains a `pyproject.toml` file, it should be added to the `examples` list in the root `pyproject.toml` file.
- If an example contains a tests directory, it should contain a `pyproject.toml` file.
- If an example contains YAML files, they should be placed in a subdirectory named `configs/`.
- If an example contains sample data files, they should be placed in a subdirectory named `data/`, and should
@@ -135,12 +136,13 @@
- This directory contains packages for the toolkit, each should contain a `pyproject.toml` file.
- Not all packages contain Python code, if they do they should also contain their own set of tests, in a
`tests/` directory at the same level as the `pyproject.toml` file.
- When adding a new package, that new package name (as defined in the `pyproject.toml` file) should
be added as a dependency to the `nvidia-nat-all` package in `packages/nvidia_nat_all/pyproject.toml`.
- path: "packages/*/pyproject.toml"
instructions: >-
- The `pyproject.toml` file should never declare a dependency on the `nvidia-nat` meta package.
- When adding a new package, that new package name (as defined in the `pyproject.toml` file) should
probably be added as a dependency to the most relevant extra in the root `pyproject.toml` file.
- `nvidia-nat-core` should likely be listed as a dependency.
- `nvidia-nat-test` should likely be listed as an optional dependency in the `test` extra.
- A single dependency should be listed on each line and should always have a version specifier.
- All dependencies should be listed under the `[tool.setuptools_dynamic_dependencies]` section.
- Any dependency that is an NVIDIA NeMo Agent Toolkit package should be declared with a version constraint of `== {version}`.
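Some of these dependency rules lend themselves to a quick automated check. The following is a minimal, hypothetical sketch (not part of the repository): it assumes the `[tool.setuptools_dynamic_dependencies]` table maps group names to plain PEP 508 requirement strings, which may not match the real package layout exactly.

```python
#!/usr/bin/env python3
# Hypothetical sketch: lint one package's pyproject.toml against the dependency
# rules above. Assumes [tool.setuptools_dynamic_dependencies] maps group names
# to lists of plain PEP 508 requirement strings.
import re
import sys
import tomllib


def check_pyproject(path: str) -> list[str]:
    """Return human-readable rule violations found in one pyproject.toml."""
    with open(path, "rb") as f:
        data = tomllib.load(f)

    table = data.get("tool", {}).get("setuptools_dynamic_dependencies", {})
    # Flatten every dependency list in the table (core deps and any extras).
    deps = [d for value in table.values() if isinstance(value, list) for d in value]

    problems: list[str] = []
    for dep in deps:
        name = re.split(r"[<>=!~\[; ]", dep, maxsplit=1)[0]
        if name == "nvidia-nat":
            problems.append(f"{dep!r}: depends on the nvidia-nat meta package")
        if not re.search(r"(==|>=|<=|~=|!=|>|<)", dep):
            problems.append(f"{dep!r}: missing a version specifier")
        elif name.startswith("nvidia-nat-") and "==" not in dep:
            problems.append(f"{dep!r}: toolkit packages should be pinned with '== {{version}}'")
    return problems


if __name__ == "__main__":
    for issue in check_pyproject(sys.argv[1] if len(sys.argv) > 1 else "pyproject.toml"):
        print(issue)
```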
24 changes: 9 additions & 15 deletions .cursor/rules/nat-setup/nat-toolkit-installation.mdc
@@ -103,21 +103,15 @@ uv pip install -e '.[profiling]'

## Available Plugin Options

When installing specific plugins, these are the available options:
- `agno` - Agno integration
- `crewai` - CrewAI integration
- `langchain` - LangChain/LangGraph integration
- `llama_index` - LlamaIndex integration
- `mem0ai` - Mem0 integration
- `mysql` - MySQL database integration
- `opentelemetry` - OpenTelemetry observability integration
- `phoenix` - Phoenix observability integration
- `ragaai` - RAGA AI evaluation integration
- `redis` - Redis memory integration
- `s3` - AWS S3 object storage integration
- `semantic_kernel` - Microsoft Semantic Kernel integration
- `weave` - Weights & Biases Weave integration
- `zep_cloud` - Zep Cloud integration
When installing specific plugins, refer to the `[project.optional-dependencies]` section of the root `pyproject.toml` for the authoritative, up-to-date list of available extras (plugins); any list reproduced here may be incomplete.

To view all options, open the root `pyproject.toml` and look for entries under `[project.optional-dependencies]`, or run a command such as:

```bash
uv pip show nvidia-nat | grep Provides-Extra
```
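If `grep` is unavailable or the package is not yet installed, a small Python sketch along these lines (an illustration, assuming Python 3.11+ for `tomllib` and that it is run from the repository root) prints every extra directly from the root `pyproject.toml`:

```python
# Illustrative sketch: print the extras (plugins) defined in the root
# pyproject.toml. Requires Python 3.11+ for tomllib; run from the repo root.
import tomllib

with open("pyproject.toml", "rb") as f:
    project = tomllib.load(f)

# Each key under [project.optional-dependencies] is an installable extra.
for extra in sorted(project.get("project", {}).get("optional-dependencies", {})):
    print(extra)
```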

## Dependency Groups
When installing dependencies, you can use the following groups:
8 changes: 4 additions & 4 deletions .github/workflows/pr.yaml
@@ -79,10 +79,10 @@ jobs:
if: ${{ ! fromJSON(needs.prepare.outputs.has_skip_ci_label) }}
with:
# CI container
container: ghcr.io/astral-sh/uv:python3.13-bookworm
py_11_container: ghcr.io/astral-sh/uv:python3.11-bookworm
py_12_container: ghcr.io/astral-sh/uv:python3.12-bookworm
py_13_container: ghcr.io/astral-sh/uv:python3.13-bookworm
container: ghcr.io/astral-sh/uv:0.9.28-python3.13-bookworm
py_11_container: ghcr.io/astral-sh/uv:0.9.28-python3.11-bookworm
py_12_container: ghcr.io/astral-sh/uv:0.9.28-python3.12-bookworm
py_13_container: ghcr.io/astral-sh/uv:0.9.28-python3.13-bookworm
# Info about the PR. Empty for non PR branches. Useful for extracting PR number, title, etc.
pr_info: ${{ needs.prepare.outputs.pr_info }}
base_sha: ${{ needs.prepare.outputs.base_sha }}
2 changes: 1 addition & 1 deletion .gitlab-ci.yml
@@ -64,7 +64,7 @@ variables:
WORKSPACE_TMP: "${CI_PROJECT_DIR}/.tmp"

default:
image: ghcr.io/astral-sh/uv:python3.13-bookworm
image: ghcr.io/astral-sh/uv:0.9.28-python3.13-bookworm
cache:
- key: $CI_COMMIT_REF_SLUG
paths:
148 changes: 148 additions & 0 deletions ci/scripts/license_diff.py
@@ -0,0 +1,148 @@
#!/usr/bin/env python3
# SPDX-FileCopyrightText: Copyright (c) 2026, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
# SPDX-License-Identifier: Apache-2.0
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Compare dependency licenses between the current and base `uv.lock`.

This script fetches the base lockfile from the GitHub repository and compares it
to the local `uv.lock`. It prints added, removed, and changed third-party
packages and includes license data where possible.

The output is intended for human review during CI checks, not as a machine-
parsable report.
"""

import argparse
import json
import tomllib
import urllib.request


def pypi_license(name: str, version: str | None = None) -> str:
"""Resolve a package license from PyPI metadata.

Args:
name: Distribution name on PyPI.
version: Optional version pin used to query version-specific metadata.

Returns:
A best-effort license string from the available metadata fields.
"""
# Use version-specific metadata when available to avoid mismatches.
try:
url = f"https://pypi.org/pypi/{name}/json" if version is None else f"https://pypi.org/pypi/{name}/{version}/json"
with urllib.request.urlopen(url) as r:
data = json.load(r)
except Exception:
return "(License not found)"

info = data.get("info", {})
candidates = []
lic = (info.get("license_expression") or "").strip()
if lic:
candidates.append(lic)
classifiers = info.get("classifiers") or []
lic_cls = [c for c in classifiers if c.startswith("License ::")]
if lic_cls:
candidates.append("; ".join(lic_cls))
lic = (info.get("license") or "").strip()
if lic:
candidates.append(lic)

if candidates:
return min(candidates, key=len)
return "(License not found)"


def main(base_branch: str) -> None:
"""Compare the local `uv.lock` against a base branch lockfile.

Args:
base_branch: Git branch name used to locate the base `uv.lock` file.
"""
# Read the current lockfile from the workspace.
with open("uv.lock", "rb") as f:
head = tomllib.load(f)

# Fetch the reference lockfile from GitHub for comparison.
try:
with urllib.request.urlopen(
f"https://raw.githubusercontent.com/NVIDIA/NeMo-Agent-Toolkit/{base_branch}/uv.lock") as f:
base = tomllib.load(f)
except Exception:
print(f"Failed to fetch base lockfile from GitHub: {base_branch}")
return

# Index package metadata by name for easy diffing.
head_packages = {pkg["name"]: pkg for pkg in head["package"]}
base_packages = {pkg["name"]: pkg for pkg in base["package"]}

added = head_packages.keys() - base_packages.keys()
removed = base_packages.keys() - head_packages.keys()
intersection = head_packages.keys() & base_packages.keys()

# Track third-party dependency changes only (skip internal `nvidia-nat*`).
added_packages = {pkg: head_packages[pkg] for pkg in added}
removed_packages = {pkg: base_packages[pkg] for pkg in removed}
changed_packages = {pkg: head_packages[pkg] for pkg in intersection if not pkg.startswith("nvidia-nat")}

if added_packages:
print("Added packages:")
for pkg in sorted(added_packages.keys()):
try:
version = head_packages[pkg]["version"]
license = pypi_license(pkg, version)
print(f"- {pkg} {version} {license}")
except KeyError:
# "Source" entries lack pinned versions (VCS or local path).
print(f"- {pkg} (source)")

if removed_packages:
print("Removed packages:")
for pkg in sorted(removed_packages.keys()):
try:
version = base_packages[pkg]["version"]
print(f"- {pkg} {version}")
except KeyError:
print(f"- {pkg} (source)")

printed_header = False
for pkg in sorted(changed_packages.keys()):
try:
head_version = head_packages[pkg]["version"]
base_version = base_packages[pkg]["version"]
if head_version == base_version:
# Only report version or license changes.
continue
head_license = pypi_license(pkg, head_version)
base_license = pypi_license(pkg, base_version)
if not printed_header:
print("Changed packages:")
printed_header = True
if head_license != base_license:
print(f"- {pkg} {base_version} -> {head_version} ({base_license} -> {head_license})")
else:
print(f"- {pkg} {base_version} -> {head_version}")
except KeyError:
if not printed_header:
print("Changed packages:")
printed_header = True
print(f"- {pkg} (source)")


if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Report third-party dependency license changes between lockfiles.")
parser.add_argument("--base-branch", type=str, default="develop")
args = parser.parse_args()
main(args.base_branch)
2 changes: 1 addition & 1 deletion ci/scripts/run_ci_local.sh
@@ -59,7 +59,7 @@ USE_HOST_GIT=${USE_HOST_GIT:-0}
LOCAL_CI_TMP=${LOCAL_CI_TMP:-${NAT_ROOT}/.tmp/local_ci_tmp/${CI_ARCH}}
DOCKER_EXTRA_ARGS=${DOCKER_EXTRA_ARGS:-""}

CI_CONTAINER=${CI_CONTAINER:-"ghcr.io/astral-sh/uv:python3.13-bookworm"}
CI_CONTAINER=${CI_CONTAINER:-"ghcr.io/astral-sh/uv:0.9.28-python3.13-bookworm"}


# These variables are common to all stages
125 changes: 125 additions & 0 deletions ci/scripts/sbom_list.py
@@ -0,0 +1,125 @@
#!/usr/bin/env python3
# SPDX-FileCopyrightText: Copyright (c) 2026, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
# SPDX-License-Identifier: Apache-2.0
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Generate a tab-separated list of dependency licenses from `uv.lock`.

The output is stored as `sbom_list.tsv` and includes package name, version, and
license metadata from PyPI. This is intended for lightweight SBOM checks in CI.
"""

import csv
import json
import tomllib
import urllib.request
from pathlib import Path

from tqdm import tqdm


def pypi_license(name: str, version: str | None = None) -> str:
"""Resolve a package license from PyPI metadata.

Args:
name: Distribution name on PyPI.
version: Optional version pin used to query version-specific metadata.

Returns:
A best-effort license string from the available metadata fields.
"""
# Use version-specific metadata when available to avoid mismatches.
try:
url = f"https://pypi.org/pypi/{name}/json" if version is None else f"https://pypi.org/pypi/{name}/{version}/json"
with urllib.request.urlopen(url) as r:
data = json.load(r)
except Exception:
return "(License not found)"

info = data.get("info", {})
candidates = []
lic = (info.get("license_expression") or "").strip()
if lic:
candidates.append(lic)
classifiers = info.get("classifiers") or []
lic_cls = [c for c in classifiers if c.startswith("License ::")]
if lic_cls:
candidates.append("; ".join(lic_cls))
lic = (info.get("license") or "").strip()
if lic:
candidates.append(lic)

if candidates:
return min(candidates, key=len)
return "(License not found)"


def process_uvlock(uvlock: dict, base_name: str) -> Path:
"""Write a generic license table from a loaded `uv.lock` structure.

Args:
uvlock: Parsed `uv.lock` content.
base_name: Logical label for the source data (kept for compatibility).

Returns:
Path to the generated `licenses.tsv` file.
"""
# Keep packages ordered to make diffs stable between runs.
sorted_packages = sorted(uvlock["package"], key=lambda x: x["name"])

with open("licenses.tsv", "w") as f:
writer = csv.writer(f, delimiter="\t")
writer.writerow(["Name", "Version", "License"])
for pkg in tqdm(sorted_packages, desc="Checking licenses", unit="packages"):
try:
name = pkg["name"]
version = pkg["version"]
license = pypi_license(name, version)
writer.writerow([name, version, license])
except KeyError:
# Skip entries that do not have name/version info.
pass
return Path("licenses.tsv")


def main() -> None:
"""Create `sbom_list.tsv` for third-party license reporting."""
# Load the lockfile that captures the dependency graph.
with open("uv.lock", "rb") as f:
head = tomllib.load(f)

# Index packages by name for quick lookups.
pkgs = {pkg["name"]: pkg for pkg in head["package"]}

sbom_list = []
for pkg in tqdm(pkgs.keys(), desc="Processing packages", unit="packages"):
try:
sbom_list.append({
"name": pkg,
"version": pkgs[pkg]["version"],
"license": pypi_license(pkg, pkgs[pkg]["version"]),
})
except KeyError:
# Skip entries that do not contain a version field.
pass

# Write the final SBOM table in a TSV format to keep it spreadsheet-friendly.
with open("sbom_list.tsv", "w") as f:
writer = csv.writer(f, delimiter="\t")
writer.writerow(["Name", "Version", "License"])
for pkg in sbom_list:
writer.writerow([pkg["name"], pkg["version"], pkg["license"].replace("\n", "\\n")])


if __name__ == "__main__":
main()
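As a hedged sketch of how the resulting TSV might be consumed downstream (this is not an existing CI step): the file name, column names, and `(License not found)` sentinel below match the script above, while the manual-review filter itself is only an example.

```python
# Illustrative consumer for sbom_list.tsv: flag rows whose license could not be
# resolved from PyPI metadata so they can be reviewed by hand.
import csv

with open("sbom_list.tsv", newline="") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        if row["License"] == "(License not found)":
            print(f"{row['Name']} {row['Version']}: license needs manual review")
```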
2 changes: 1 addition & 1 deletion docker/Dockerfile
@@ -24,7 +24,7 @@ FROM --platform=$TARGETPLATFORM ${BASE_IMAGE_URL}:${BASE_IMAGE_TAG}
ARG PYTHON_VERSION
ARG NAT_VERSION

COPY --from=ghcr.io/astral-sh/uv:0.9.15 /uv /uvx /bin/
COPY --from=ghcr.io/astral-sh/uv:0.9.28 /uv /uvx /bin/

ENV PYTHONDONTWRITEBYTECODE=1

@@ -24,7 +24,7 @@ FROM ${BASE_IMAGE_URL}:${BASE_IMAGE_TAG}
ARG PYTHON_VERSION
ARG NAT_VERSION

COPY --from=ghcr.io/astral-sh/uv:0.9.15 /uv /uvx /bin/
COPY --from=ghcr.io/astral-sh/uv:0.9.28 /uv /uvx /bin/

ENV PYTHONDONTWRITEBYTECODE=1

2 changes: 1 addition & 1 deletion examples/frameworks/agno_personal_finance/Dockerfile
@@ -24,7 +24,7 @@ FROM ${BASE_IMAGE_URL}:${BASE_IMAGE_TAG}
ARG PYTHON_VERSION
ARG NAT_VERSION

COPY --from=ghcr.io/astral-sh/uv:0.9.15 /uv /uvx /bin/
COPY --from=ghcr.io/astral-sh/uv:0.9.28 /uv /uvx /bin/

ENV PYTHONDONTWRITEBYTECODE=1

@@ -24,7 +24,7 @@ FROM ${BASE_IMAGE_URL}:${BASE_IMAGE_TAG}
ARG PYTHON_VERSION
ARG NAT_VERSION

COPY --from=ghcr.io/astral-sh/uv:0.9.15 /uv /uvx /bin/
COPY --from=ghcr.io/astral-sh/uv:0.9.28 /uv /uvx /bin/

ENV PYTHONDONTWRITEBYTECODE=1
