10 changes: 10 additions & 0 deletions contrib/templates/llmops-basic/README.md
@@ -0,0 +1,10 @@
# LLMOps basic template

This template provides a base structure for organizing LLMOps projects
using Databricks Asset Bundles (DABs), Unity Catalog, and MLflow.

Install it using:

```
databricks bundle init https://github.com/databricks/bundle-examples --template-dir contrib/templates/llmops-basic
```
35 changes: 35 additions & 0 deletions contrib/templates/llmops-basic/databricks_template_schema.json
@@ -0,0 +1,35 @@
{
  "properties": {
    "project_name": {
      "type": "string",
      "default": "llmops_basic",
      "description": "Name of the LLMOps project (use underscores, no hyphens)",
      "order": 1
    },
    "catalog_name_dev": {
      "type": "string",
      "default": "dev_catalog",
      "description": "Name of the Unity Catalog for the development environment",
      "order": 2
    },
    "catalog_name_prod": {
      "type": "string",
      "default": "prod_catalog",
      "description": "Name of the Unity Catalog for the production environment",
      "order": 3
    },
    "workspace_host_dev": {
      "type": "string",
      "default": "https://your-workspace.azuredatabricks.net/",
      "description": "Databricks workspace URL for development",
      "order": 4
    },
    "workspace_host_prod": {
      "type": "string",
      "default": "https://your-workspace.azuredatabricks.net/",
      "description": "Databricks workspace URL for production",
      "order": 5
    }
  }
}
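
These properties are what `databricks bundle init` prompts for during initialization. As a rough sketch (assuming a CLI version that supports the `--config-file` flag of `bundle init`; the example values are hypothetical), the answers can also be supplied non-interactively:

```bash
# write the template parameters to a JSON file (hypothetical example values)
cat > llmops_init.json <<'EOF'
{
  "project_name": "review_sentiment",
  "catalog_name_dev": "dev_catalog",
  "catalog_name_prod": "prod_catalog",
  "workspace_host_dev": "https://adb-1111111111111111.11.azuredatabricks.net/",
  "workspace_host_prod": "https://adb-2222222222222222.22.azuredatabricks.net/"
}
EOF

# initialize the project from the template without interactive prompts
databricks bundle init https://github.com/databricks/bundle-examples \
  --template-dir contrib/templates/llmops-basic \
  --config-file llmops_init.json
```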

@@ -0,0 +1,22 @@
.databricks/
.bundle/
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
venv/
env/
ENV/
.ipynb_checkpoints/
*.ipynb_checkpoints
.vscode/
.idea/
*.swp
*.swo
*~
.DS_Store
Thumbs.db
mlruns/
*.log

@@ -0,0 +1,44 @@
# {{.project_name}}

End-to-end LLMOps project for sentiment analysis of reviews using Databricks.

## Overview

This example demonstrates a complete LLMOps pipeline for building, evaluating, and deploying a sentiment analysis model.

**Pipeline stages:**
- Data preparation
- Model build and evaluation with MLflow
- Model deployment to serving endpoint
- Batch inference

## Requirements

- Databricks CLI (v0.218.0+)
- Unity Catalog enabled
- Required permissions on the catalog:
  - `USE CATALOG` - to access the catalog
  - `CREATE SCHEMA` - to create schemas
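
If these privileges are missing, a catalog owner or metastore admin can grant them. A minimal sketch, assuming the `grants` command group of a recent Databricks CLI and a hypothetical principal `someone@example.com`:

```bash
# grant the catalog privileges this project needs (principal is an example)
databricks grants update catalog {{.catalog_name_dev}} --json '{
  "changes": [
    {"principal": "someone@example.com", "add": ["USE_CATALOG", "CREATE_SCHEMA"]}
  ]
}'
```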

## Quick Start

**Deploy:**
```bash
databricks bundle deploy -t dev
```
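
To sanity-check the resolved configuration at any point (for example before the first deploy), the CLI's built-in validation can be used:

```bash
databricks bundle validate -t dev
```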

**Run pipeline:**
```bash
databricks bundle run model_preprocessing -t dev
databricks bundle run model_build_evaluation -t dev
databricks bundle run model_endpoint_deploy -t dev
databricks bundle run model_inference -t dev
```

## Configuration

- **Dev Catalog**: `{{.catalog_name_dev}}`
- **Prod Catalog**: `{{.catalog_name_prod}}`

Edit `databricks.yml` to customize catalog names, schema name, model name, and experiment settings.
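
Variables can also be overridden per invocation instead of editing the file, using the bundle CLI's `--var` flag (the catalog name below is a hypothetical example):

```bash
databricks bundle deploy -t dev --var="catalog_name=my_dev_catalog"
```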

@@ -0,0 +1,44 @@
# This is a Databricks Asset Bundle for {{.project_name}}.
bundle:
  name: "{{.project_name}}"

variables:
  catalog_name:
    description: "Name of the UC catalog to use"
    default: "default_catalog"
  schema_name:
    description: "Name of the UC schema to use"
    default: "sentiment_agent_project"
  model_name:
    description: "Name of the UC model to use"
    default: "sentiment_agent"
  experiment_name:
    description: "Name of the MLflow experiment"
    default: "/Users/${workspace.current_user.userName}/${bundle.target}_sentiment_agent"

include:
  - resources/*.yml

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: {{.workspace_host_dev}}
    variables:
      catalog_name: {{.catalog_name_dev}}

  prod:
    mode: production
    workspace:
      host: {{.workspace_host_prod}}
      root_path: /Shared/.bundle/prod/${bundle.name}
    variables:
      catalog_name: {{.catalog_name_prod}}
    {{- if not is_service_principal}}
    run_as:
      # This runs as {{user_name}} in production. Alternatively, a service
      # principal can be specified here via service_principal_name
      # (see the Databricks documentation).
      user_name: {{user_name}}
    {{end -}}
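
The `include: resources/*.yml` line above pulls in the job definitions that the Quick Start commands run. As a rough, hypothetical sketch of how such a resource file typically references the shared variables (this is not the template's actual file; it is written as a heredoc so the snippet stays plain shell):

```bash
# hypothetical resources/model_preprocessing.yml illustrating ${var.*} references
cat > resources/model_preprocessing.yml <<'EOF'
resources:
  jobs:
    model_preprocessing:
      name: "${bundle.target}_model_preprocessing"
      parameters:
        - name: catalog
          default: ${var.catalog_name}
        - name: schema
          default: ${var.schema_name}
      tasks:
        - task_key: preprocess
          notebook_task:
            notebook_path: ../src/preprocess_data.ipynb
EOF
```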