Summary
Set up cross-platform CI pipelines (GitHub Actions) that automatically run benchmarks on Linux and report results, ensuring the project works reliably beyond Windows.
Motivation
The project claims cross-platform portability via .NET 8 and ILGPU, but all current testing and benchmarking is done on Windows. Without CI validation on Linux, regressions or platform-specific issues could go unnoticed. Automated benchmark runs also provide a performance history that can catch regressions early and give contributors confidence that their changes don't break anything.
Acceptance Criteria
Add a GitHub Actions workflow that builds and runs the benchmark suite on Ubuntu (latest LTS)
CI should run on push to main and on pull requests
CPU-mode benchmarks should always run (no GPU required)
Results should be captured as CI artifacts (using the CSV/Markdown export from #10, if available)
Build failures and benchmark crashes should fail the CI pipeline
Optionally: add a Windows CI job for parity
Optionally: if a GPU runner is available (e.g. self-hosted), run CUDA/OpenCL benchmarks too
Add a CI status badge to the README
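A minimal workflow sketch covering the criteria above (build, CPU-mode benchmark run, artifact upload). The benchmark project path, CLI flags, and output directory are assumptions to be adjusted to the actual repository layout:

```yaml
name: CI

on:
  push:
    branches: [main]
  pull_request:

jobs:
  benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '8.0.x'

      - name: Build
        run: dotnet build -c Release

      - name: Run benchmarks (CPU mode)
        # Hypothetical project path and flag; adjust to the repo layout.
        run: dotnet run -c Release --project src/Benchmarks -- --cpu

      - name: Upload benchmark results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: benchmark-results
          path: BenchmarkResults/   # assumed output directory
```

A non-zero exit code from `dotnet build` or `dotnet run` fails the job automatically, satisfying the "build failures and benchmark crashes fail the pipeline" criterion; the optional Windows job would be a matrix entry (`runs-on: windows-latest`) over the same steps.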
Technical Notes
GitHub-hosted runners don't have GPUs, so CI will primarily validate CPU-mode correctness and performance
Use `dotnet build` and `dotnet run` in the workflow — no special tooling needed
Consider caching NuGet packages for faster CI runs
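NuGet caching can be added with a stock `actions/cache` step before the build; keying on the project files is a reasonable default (a `packages.lock.json`-based key would be more precise if lock files are committed):

```yaml
      - name: Cache NuGet packages
        uses: actions/cache@v4
        with:
          path: ~/.nuget/packages
          key: nuget-${{ runner.os }}-${{ hashFiles('**/*.csproj') }}
          restore-keys: |
            nuget-${{ runner.os }}-
```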
For GPU CI, a self-hosted runner with NVIDIA drivers would be needed (future enhancement)
Benchmark results could be posted as PR comments using GitHub Actions for visibility
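One lightweight option for the PR comment is the preinstalled GitHub CLI, posting the Markdown export from #10. The results file path here is an assumption, and the workflow (or job) needs `permissions: pull-requests: write` for the default token:

```yaml
      - name: Comment results on PR
        if: github.event_name == 'pull_request'
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        # BenchmarkResults/summary.md is a hypothetical path for the Markdown export.
        run: gh pr comment ${{ github.event.pull_request.number }} --body-file BenchmarkResults/summary.md
```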