# Contributing to WANPulse

Thank you for your interest in contributing! WANPulse is a community-driven project and welcomes contributions of all kinds.
## Prerequisites

- Python 3.12+
- Home Assistant development environment (or a running HA instance for manual testing)
- Git
## Setup

```bash
# Clone the repository
git clone https://github.com/polprog-tech/WANPulse.git
cd WANPulse

# Create a virtual environment
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows

# Install development dependencies
pip install -r requirements_test.txt
```

## Running Tests

```bash
# Run all tests
pytest tests/ -v

# Run with coverage
pytest tests/ -v --cov=custom_components/wanpulse --cov-report=term-missing

# Run a specific test file
pytest tests/components/wanpulse/test_models.py -v
```

## Linting and Formatting

```bash
# Check for lint errors
ruff check .

# Auto-fix lint errors
ruff check --fix .

# Format code
ruff format .
```

## Pre-commit Hook

Install the bundled hook to run lint, format, and tests before every commit:

```bash
cp scripts/pre-commit .git/hooks/pre-commit && chmod +x .git/hooks/pre-commit
```

## Project Structure

```
custom_components/wanpulse/
├── __init__.py        # Integration setup and platform forwarding
├── config_flow.py     # Config and options flow
├── const.py           # Constants, enums (ProbeMethod, etc.)
├── models.py          # Domain models (ProbeTarget, ProbeResult, snapshots)
├── coordinator.py     # DataUpdateCoordinator orchestration
├── entity.py          # Base entity class
├── sensor.py          # Sensor entities
├── binary_sensor.py   # Binary sensor entities
├── button.py          # Manual probe button entity
├── diagnostics.py     # Diagnostics support
├── probes/
│   ├── __init__.py    # Probe engine registry and factory
│   ├── base.py        # ProbeEngine abstract base class
│   ├── dns.py         # DNS probe engine
│   ├── http.py        # HTTP probe engine
│   └── tcp.py         # TCP probe engine
├── manifest.json
├── strings.json
└── translations/
    ├── en.json
    └── pl.json
```
## Adding a New Probe Method

- Create `custom_components/wanpulse/probes/your_method.py`
- Extend `ProbeEngine` from `probes/base.py`
- Implement the `async_probe` method
- Register in `probes/__init__.py`
- Add `"your_method"` to `PROBE_METHODS` in `const.py`
- Write tests in `tests/components/wanpulse/test_probes.py`
- Update `strings.json` with any new user-facing text
- Update the probe method table in `README.md`
```python
from .base import ProbeEngine
from ..models import ProbeResult, ProbeTarget


class YourProbeEngine(ProbeEngine):
    async def async_probe(self, target: ProbeTarget, timeout: float) -> ProbeResult:
        ...
```

## Adding a New Entity

- Add a new `EntityDescription` to the appropriate sensor tuple in `sensor.py` or `binary_sensor.py`
- Provide a `value_fn` that extracts the value from `CoordinatorSnapshot` or `TargetSnapshot`
- Set `entity_registry_enabled_default=False` for non-essential entities
- Add translation keys in `strings.json`
- Write tests
## Code Guidelines

- Async-first - never block the event loop
- Typed - all functions must have type annotations
- Keep entities thin - they project backend state; they don't contain business logic
- Use `translation_key` for all entity names - no hardcoded English in entities
- Use appropriate `device_class`, `state_class`, and `native_unit_of_measurement`
- No print statements - use `logging.getLogger(__name__)` instead
- Docstrings - Google-style docstrings for all public classes and functions
- Test everything - aim for high coverage on core logic
- Probes must not require elevated privileges (no raw sockets) and should work on all HA installation types
## Branching and Commits

- `main` - stable release branch
- Feature branches: `feature/description`
- Bug fixes: `fix/description`
- PRs should include tests and pass all CI checks
- Keep commits atomic and well-described
Use conventional commits:

```
feat: add DNS probe engine
fix: handle jitter calculation for single-sample windows
test: add tests for HTTP probe timeout handling
docs: update probe method table in README
refactor: extract common probe logic into base class
```
## Testing Philosophy

WANPulse uses scenario-oriented Given/When/Then (GWT) style tests for readability and maintainability. Every test follows a clear three-part structure:
- Given - Set up preconditions (create targets, configure mocks)
- When - Perform the action under test (run a probe, call a service, submit a flow)
- Then - Assert the expected outcome (check return values, verify state changes)
- Test classes are organized by feature/scenario, not by module - for example, `TestTCPProbeEngine` rather than `TestProbes`.
- Test method names describe the scenario: `test_successful_connect`, `test_timeout_returns_failure`.
- Each test uses GIVEN, WHEN, THEN docstrings - a class-level GIVEN above the `def` (or its decorator), plus WHEN and THEN docstrings inside the method body.
- We include happy-path, edge-case, and failure-path scenarios for every feature.
Example:

```python
class TestTCPProbeOnTimeout:
    """Tests for TCP probe behavior on timeout."""

    """GIVEN a TCP target that does not respond in time"""

    @pytest.mark.asyncio
    async def test_returns_failure_with_error(self):
        target = ProbeTarget(host="10.0.0.1", label="Slow", method=ProbeMethod.TCP, port=443)
        engine = TCPProbeEngine()  # engine under test (name illustrative)

        """WHEN the probe is executed"""
        result = await engine.async_probe(target, timeout=0.001)

        """THEN the result indicates failure"""
        assert result.success is False
        assert result.error is not None
```

- Prefer many small test classes (one scenario each) over large test classes with mixed scenarios.
- Use descriptive class docstrings that read as "Given ..." to set context.
- Aim for high coverage on core logic (models, coordinator, probes).
- Test both the happy path and meaningful edge cases (empty inputs, timeouts, unreachable hosts).
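As a sketch of the "many small classes, meaningful edge cases" guidance, here is a synchronous example. The `compute_jitter` helper is hypothetical - invented for illustration, not WANPulse's actual jitter code - but it mirrors the single-sample-window edge case mentioned in the commit examples earlier:

```python
def compute_jitter(samples: list[float]) -> float:
    """Mean absolute difference between consecutive latency samples.

    Hypothetical helper for illustration only.
    """
    if len(samples) < 2:
        return 0.0  # a single-sample window has no jitter
    diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return sum(diffs) / len(diffs)


class TestJitterOnSingleSampleWindow:
    """GIVEN a window containing exactly one latency sample"""

    def test_returns_zero(self):
        """WHEN jitter is computed THEN it is zero, not a division error"""
        assert compute_jitter([10.0]) == 0.0


class TestJitterOnVaryingSamples:
    """GIVEN a window of varying latency samples"""

    def test_averages_consecutive_deltas(self):
        """WHEN jitter is computed THEN it is the mean of consecutive deltas"""
        assert compute_jitter([10.0, 14.0, 12.0]) == 3.0
```

Each class covers exactly one scenario, so a failure immediately names the behavior that broke.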
## PR Checklist

- All tests pass
- Ruff check passes
- New code has tests
- Strings are translatable (no hardcoded English in entities)
- No blocking I/O on the event loop
- Docstrings present for public API
- CHANGELOG.md updated
## License

See LICENSE.