A Python package for converting heating hot water system data to Brick Schema models with comprehensive validation and portable analytics.
Documentation | Getting Started | Examples | Issues
This package provides tools for converting building hot water system data to Brick Schema models and running portable analytics applications.
Core Contributions:
- CSV-to-Brick Converter: Automated conversion from tabular BMS data to Brick Schema 1.4 RDF models
- Multi-Level Validators: Ontology, point count, equipment count, and structural pattern validation
- Portable Analytics: Building-agnostic applications that use SPARQL to auto-discover required sensors
Key Benefits:
- Interoperability: Standardized semantic models work across different BMS platforms
- Portability: Write analytics once, run on any qualified building without recoding
- Quality Assurance: Comprehensive validation ensures model correctness
The package supports five hot water system types (condensing boilers, non-condensing boilers, generic boilers, district hot water, district steam) and has been tested on 216 real buildings.
For Users (when published to PyPI):

```bash
pip install hhw-brick
```

For Development (current method):

```bash
# Clone the repository
git clone https://github.com/CenterForTheBuiltEnvironment/HHW_brick.git
cd HHW_brick

# Install in editable mode
pip install -e .
```

The -e flag installs the package in editable mode, so changes to the source code are immediately reflected without reinstalling.
System Requirements:
- Python 3.8 or higher
- All dependencies are automatically installed (see pyproject.toml)
Input Data: For sample input data format, see: https://doi.org/10.5061/dryad.t4b8gtj8n
View Full Documentation →
For comprehensive guides, tutorials, and API reference, please visit our documentation site:
- Getting Started Guide
- User Guide - Conversion
- User Guide - Validation
- User Guide - Applications
- Examples
The typical workflow consists of three steps: conversion, validation, and application.
```python
from hhw_brick import CSVToBrickConverter

converter = CSVToBrickConverter()
graph = converter.convert_to_brick(
    metadata_csv="metadata.csv",
    vars_csv="vars_available_by_building.csv",
    building_tag="105",
    output_path="building_105.ttl"
)
print(f"Generated {len(graph)} RDF triples")
```

```python
from hhw_brick import BrickModelValidator, GroundTruthCalculator

# Generate ground truth from input CSV (expected counts)
calculator = GroundTruthCalculator()
ground_truth = calculator.calculate(
    metadata_csv="metadata.csv",
    vars_csv="vars_available_by_building.csv",
    output_csv="ground_truth.csv"
)

# Validate the generated model
validator = BrickModelValidator(ground_truth_csv_path="ground_truth.csv")

# Ontology validation
result = validator.validate_ontology("building_105.ttl")
print(f"Valid: {result['valid']}")

# Point count validation
point_result = validator.validate_point_count("building_105.ttl")

# Equipment count validation
equip_result = validator.validate_equipment_count("building_105.ttl")
```

```python
from hhw_brick import apps

# Discover available applications
available_apps = apps.list_apps()

# Load temperature difference analysis
app = apps.load_app("secondary_loop_temp_diff")

# Run on a qualified building
results = app.run(
    brick_model="building_105.ttl",
    timeseries_data="building_105_data.csv"
)
```

Process multiple buildings in parallel:
```python
from hhw_brick import BatchConverter, BrickModelValidator

# Batch conversion
batch = BatchConverter()
results = batch.convert_all_buildings(
    metadata_csv="metadata.csv",
    vars_csv="vars_available_by_building.csv",
    output_dir="output/",
    show_progress=True  # Show progress bar
)

# Batch validation
validator = BrickModelValidator()
validation_results = validator.batch_validate_ontology(
    test_data_dir="output/",
    max_workers=4
)
```

The examples/ folder contains 8 complete examples:
- 01_convert_csv_to_brick.py - Single and batch CSV conversion
- 02_ontology_validation.py - SHACL-based ontology validation
- 03_point_count_validation.py - Sensor/point count verification
- 04_equipment_count_validation.py - Equipment count verification
- 05_subgraph_pattern_matching.py - Structural pattern validation
- 06_application_management.py - Discover and manage analytics apps
- 07_run_application.py - Run analytics on single building
- 08_batch_run_application.py - Batch run analytics on multiple buildings
Run any example:

```bash
python examples/01_convert_csv_to_brick.py
```

Two CSV files are required:
- metadata.csv - Building configuration (system type, equipment counts, building ID)
- vars_available_by_building.csv - Sensor availability matrix (building ID × sensor names)
See example files in examples/fixtures/ for format details.
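To illustrate how the two input files fit together, here is a minimal sketch using only the standard library. The column names (`building_tag`, `system_type`, `hw_supply_temp`, etc.) are hypothetical placeholders, not the package's actual schema; consult examples/fixtures/ for the real format.

```python
import csv
import io

# Hypothetical illustration of the two-file input layout; the real column
# names are defined by the files in examples/fixtures/.
metadata_csv = io.StringIO(
    "building_tag,system_type,num_boilers,num_pumps\n"
    "105,condensing_boiler,2,2\n"
)
vars_csv = io.StringIO(
    "building_tag,hw_supply_temp,hw_return_temp,hw_flow\n"
    "105,1,1,0\n"
)

metadata = {row["building_tag"]: row for row in csv.DictReader(metadata_csv)}
availability = {row["building_tag"]: row for row in csv.DictReader(vars_csv)}

# Sensors flagged "1" are available for building 105
available = [name for name, flag in availability["105"].items()
             if name != "building_tag" and flag == "1"]
print(available)  # ['hw_supply_temp', 'hw_return_temp']
```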
The converter generates Brick Schema 1.4 models in Turtle (.ttl) format with:
- Building and system hierarchy (rec:Building → brick:Hot_Water_System → brick:Hot_Water_Loop)
- Equipment instances (brick:Boiler, brick:Pump)
- Sensor points (brick:Temperature_Sensor, brick:Flow_Sensor, etc.)
- Semantic relationships (brick:hasPart, brick:feeds, brick:isPointOf)
Example:

```turtle
@prefix brick: <https://brickschema.org/schema/Brick#> .
@prefix rec: <https://w3id.org/rec#> .
@prefix hhws: <https://hhws.example.org#> .

hhws:building105 a rec:Building ;
    rec:isLocationOf hhws:building105.hws .

hhws:building105.hws a brick:Hot_Water_System ;
    brick:hasPart hhws:building105.hws.primary_loop .
```

- Automatic system type detection
- Batch processing with parallel execution
- Support for 5 system types (condensing, non-condensing, generic boiler, district hot water, district steam)
- Ontology: SHACL validation against Brick Schema 1.4
- Point Count: Verify sensor counts (with owl:sameAs deduplication)
- Equipment Count: Validate boilers and pumps (with subclass support)
- Structural Pattern: SPARQL-based topology validation
Built-in applications that work across buildings:
- secondary_loop_temp_diff: Secondary loop temperature analysis
- primary_loop_temp_diff: Primary loop temperature analysis (boiler systems)
Applications use SPARQL queries to auto-discover sensors from Brick models, eliminating hardcoded point names.
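The auto-discovery idea can be sketched without the package: an application asks for points by Brick class and relationship rather than by name. Below, a SPARQL-style query is shown as text and then emulated with a plain scan over a toy triple list; the query, triples, and function are illustrative assumptions, not the package's actual implementation.

```python
# Illustrative class-based point discovery (not the package's actual query):
# the app binds to whatever sensors the model declares, never to point names.
DISCOVERY_QUERY = """
SELECT ?sensor WHERE {
    ?sensor a brick:Temperature_Sensor ;
            brick:isPointOf ?loop .
    ?loop a brick:Hot_Water_Loop .
}
"""

# A toy triple store standing in for a building's Brick model.
triples = [
    ("hhws:b105.loop", "a", "brick:Hot_Water_Loop"),
    ("hhws:b105.supply_t", "a", "brick:Temperature_Sensor"),
    ("hhws:b105.supply_t", "brick:isPointOf", "hhws:b105.loop"),
    ("hhws:b105.pump1", "a", "brick:Pump"),
]

def discover_temp_sensors(triples):
    """Emulate the query above with a plain scan over the triples."""
    loops = {s for s, p, o in triples if p == "a" and o == "brick:Hot_Water_Loop"}
    sensors = {s for s, p, o in triples if p == "a" and o == "brick:Temperature_Sensor"}
    return sorted(s for s, p, o in triples
                  if p == "brick:isPointOf" and o in loops and s in sensors)

print(discover_temp_sensors(triples))  # ['hhws:b105.supply_t']
```

In the package itself, such a query would run against the generated .ttl model, so the same application code works on any building whose model declares the required classes.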
The package implements a three-stage validation approach:
- Syntactic Validation: RDF syntax and Brick Schema conformance (SHACL)
- Semantic Validation: Point and equipment counts against ground truth (calculated from input CSV)
- Structural Validation: Subgraph pattern matching for system topology
Ground truth values (expected counts) are computed independently from the input CSV data, not from the generated Brick model, ensuring unbiased validation.
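The count-comparison step of this methodology can be sketched as follows; `compare_counts` is a hypothetical helper, not a package API. Expected counts come from the input CSV, observed counts from the generated model, and validation is a strict comparison of the two.

```python
# Hypothetical sketch of semantic (count) validation: expected counts are
# derived from the input CSV, observed counts from the generated Brick model.
def compare_counts(expected, observed):
    """Return per-class deltas; the model is valid only if every delta is 0."""
    report = {key: observed.get(key, 0) - expected[key] for key in expected}
    return {"valid": all(d == 0 for d in report.values()), "deltas": report}

expected = {"brick:Boiler": 2, "brick:Pump": 2, "brick:Temperature_Sensor": 4}
observed = {"brick:Boiler": 2, "brick:Pump": 2, "brick:Temperature_Sensor": 3}

result = compare_counts(expected, observed)
print(result["valid"])                               # False: one sensor missing
print(result["deltas"]["brick:Temperature_Sensor"])  # -1
```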
Author: Mingchen Li
Email: liwei74123@gmail.com
Repository: https://github.com/CenterForTheBuiltEnvironment/HHW_brick
Issues: https://github.com/CenterForTheBuiltEnvironment/HHW_brick/issues
Version: 0.1.0
Last Updated: 2025-01-01