78 commits
c80d278
ML-396 Converted MLPRegressor to work with NumPower/NDArray related c…
apphp Feb 14, 2026
13acae6
ML-396 removed unneeded export function
apphp Feb 14, 2026
3b65a47
ML-396 added test for NumPower
apphp Feb 14, 2026
d7404f8
ML-396 added USE_NUMPOWER_TRANSPOSE option to Network
apphp Feb 14, 2026
d538799
ML-396 added USE_NUMPOWER_TRANSPOSE option to Network
apphp Feb 14, 2026
f333c67
ML-396 fixed issue with samples normalization
apphp Feb 14, 2026
1583ee3
ML-396 removed unneeded packages from composer
apphp Feb 14, 2026
57037c6
ML-396 removed unneeded packages from composer
apphp Feb 14, 2026
22df3e0
Merge branch '3.0' into 396-convert-mlp-classifier-to-NumPower
apphp Mar 3, 2026
b79ddb6
Merge branch '3.0' into 396-convert-mlp-classifier-to-NumPower
apphp Mar 10, 2026
cab6925
Merge branch '3.0' into 396-convert-mlp-classifier-to-NumPower
apphp Mar 28, 2026
b920665
ML-396 style fixes
apphp Mar 28, 2026
b99d65b
ML-396 migrated MLPRegressor
apphp Mar 28, 2026
e04867b
ML-396 migrated MLPRegressor
apphp Mar 28, 2026
36a282e
ML-396 migrated MLPRegressor
apphp Mar 28, 2026
9bc5107
ML-396 migrated MLPRegressor
apphp Mar 28, 2026
2a69709
ML-396 migrated MLPRegressor
apphp Mar 28, 2026
76093fd
ML-396 migrated Adaline
apphp Mar 28, 2026
0d35e60
ML-396 migrated Adaline
apphp Mar 28, 2026
289b822
ML-396 migrated Adaline
apphp Mar 28, 2026
4e19926
ML-396 migrated ExtraTreeRegressor with Hyperplane
apphp Mar 29, 2026
f18ceaa
ML-396 migrated RegressionTree
apphp Mar 29, 2026
f22c58c
ML-396 migrated GradientBoost
apphp Mar 29, 2026
8a24b57
ML-396 migrated Ridge
apphp Mar 29, 2026
5052675
ML-396 migrated Ridge
apphp Mar 29, 2026
4c31a38
ML-396 migrated Ridge
apphp Mar 29, 2026
b6f3665
ML-396 Added tests for Ridge
apphp Mar 30, 2026
41b144c
ML-396 Added AdalineTest
apphp Mar 30, 2026
7017b0f
ML-396 fixes for STAN
apphp Mar 30, 2026
60a1100
ML-396 fixes for STAN
apphp Mar 30, 2026
cfbd391
ML-396 fixes for Ridge and tests
apphp Apr 4, 2026
f730d69
ML-396 fix for ErrorAnalysisTest
apphp Apr 4, 2026
a700418
ML-396 fix for ErrorAnalysisTest
apphp Apr 4, 2026
320871f
ML-396 fix for tests
apphp Apr 4, 2026
5412c79
ML-396 fix for tests
apphp Apr 4, 2026
d1e9a6d
ML-396 fix for tests
apphp Apr 4, 2026
6016eef
ML-396 fix for tests
apphp Apr 4, 2026
2e627c0
ML-396 fix for tests
apphp Apr 4, 2026
8cb3a16
ML-396 fix for tests
apphp Apr 4, 2026
83d0a63
ML-396 fix for tests
apphp Apr 4, 2026
252003c
ML-396 fix for tests
apphp Apr 4, 2026
95516c4
ML-396 fix for tests
apphp Apr 4, 2026
a932a92
ML-396 fix for tests
apphp Apr 5, 2026
17e6bce
ML-396 fix for tests
apphp Apr 5, 2026
aa1553b
ML-396 additional tests for ExtraTreeRegressorTest
apphp Apr 5, 2026
2b0cbb3
ML-396 additional tests for ExtraTreeRegressorTest
apphp Apr 5, 2026
5c79fa1
ML-396 fix for tests
apphp Apr 5, 2026
6b1af3d
ML-396 fix for tests
apphp Apr 5, 2026
d558360
ML-396 fix for tests
apphp Apr 5, 2026
0de66d5
ML-396 additional tests for GradientBoostTest
apphp Apr 5, 2026
12aee96
ML-396 additional tests for GradientBoostTest
apphp Apr 5, 2026
d79f7a8
ML-396 additional tests for GradientBoostTest
apphp Apr 5, 2026
293837c
ML-396 fix for tests
apphp Apr 5, 2026
5372b35
ML-396 additional tests for MLPRegressorTest
apphp Apr 5, 2026
61f8204
ML-396 additional tests for RegressionTreeTest
apphp Apr 5, 2026
43a6c97
ML-396 fix for tests
apphp Apr 5, 2026
e396c04
ML-396 RadiusNeighborsRegressor migrated to NumPower
apphp Apr 5, 2026
c7e6448
ML-396 fix for tests
apphp Apr 5, 2026
e8197a9
ML-396 fix for tests
apphp Apr 5, 2026
3d92727
ML-396 fix for tests
apphp Apr 5, 2026
080006a
ML-396 fix for tests
apphp Apr 5, 2026
adb2d51
ML-396 fix for tests
apphp Apr 5, 2026
67673cc
ML-396 fix for tests
apphp Apr 5, 2026
02ca4eb
ML-396 fix for tests
apphp Apr 5, 2026
84bc347
ML-396 KNNRegressor migrated to NumPower
apphp Apr 5, 2026
72f08e7
ML-396 KDNeighborsRegressor migrated to NumPower
apphp Apr 12, 2026
bd0c461
ML-396 KDNeighborsRegressor migrated to NumPower
apphp Apr 12, 2026
0a395a4
ML-396 added DataProviderExternal to RidgeTest
apphp Apr 12, 2026
7e64e6f
ML-396 removed unused imports
apphp Apr 12, 2026
42664e1
ML-396 SVR migrated to dedicated namespace and updated dependencies
apphp Apr 12, 2026
2cc5ff4
ML-396 minor code style fixes
apphp Apr 12, 2026
269d17f
ML-396 added determinant test with DataProvider to NumPowerTest
apphp Apr 12, 2026
eaede75
ML-396 updated @var annotation for params in SVR class
apphp Apr 12, 2026
aac88da
ML-396 removed unneeded debug
apphp Apr 12, 2026
a49cc3e
ML-396 removed unneeded debug
apphp Apr 12, 2026
e179684
ML-396 added validation for empty ensemble, cleaned dependencies, upd…
apphp Apr 12, 2026
9daa783
ML-396 fixed regex pattern in PHPStan baseline to escape '#' properly
apphp Apr 12, 2026
75636f8
ML-396 updated changelog with array_pack function and NDArray migrati…
apphp Apr 12, 2026
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -81,6 +81,7 @@ jobs:
run: composer analyze-ci

- name: Unit Tests
+#run: vendor/bin/phpunit --display-warning --display-deprecations --display-notices --testsuite="Anomaly Detectors,Backends,Base,Classifiers,Clusterers,Cross Validation,Datasets,Extractors,Graph,Helpers,Kernels,Loggers,NeuralNet,Persisters,Regressors,Serializers,Specifications,Strategies,Tokenizers,Transformers"
run: composer test

- name: Check Coding Style
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -9,6 +9,8 @@
- RBX Serializer only tracks major library version number
- Convert NeuralNet classes to use NDArray instead of Matrix
- Converted Network back from a class to an interface
+- Added array_pack() function to replace array_map('array_values', $samples)
+- Converted Regressor classes to use NDArray instead of Matrix

- 2.5.0
- Added Vantage Point Spatial tree
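The changelog entry above mentions an `array_pack()` helper introduced to replace `array_map('array_values', $samples)`. As a rough sketch of what such a helper might do — the name is taken from the changelog, but the actual signature and location in this PR may differ — it re-indexes each sample's feature values into a packed, zero-indexed list:

```php
<?php

/**
 * Re-index each sample into a packed, zero-indexed list of values.
 * Hypothetical sketch of the array_pack() helper named in the changelog;
 * behaviorally equivalent to array_map('array_values', $samples).
 *
 * @param array<array<mixed>> $samples
 * @return list<list<mixed>>
 */
function array_pack(array $samples) : array
{
    $packed = [];

    foreach ($samples as $sample) {
        // Drop string/sparse keys so every row is a contiguous list.
        $packed[] = array_values($sample);
    }

    return $packed;
}

$samples = [
    ['a' => 1.0, 'b' => 2.0],
    [3 => 4.0, 7 => 5.0],
];

var_dump(array_pack($samples) === [[1.0, 2.0], [4.0, 5.0]]); // bool(true)
```

A dedicated function like this gives static analyzers (the PR touches a PHPStan baseline) a precise `list<list<mixed>>` return type, which `array_map('array_values', …)` cannot express.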
1 change: 1 addition & 0 deletions composer.json
@@ -38,6 +38,7 @@
"andrewdalpino/okbloomer": "^1.0",
"psr/log": "^1.1|^2.0|^3.0",
"rubix/tensor": "^3.0",
+"rubixml/numpower": "dev-main",
"symfony/polyfill-mbstring": "^1.0",
"symfony/polyfill-php80": "^1.17",
"symfony/polyfill-php82": "^1.27",
4 changes: 2 additions & 2 deletions docs/datasets/generators/hyperplane.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Datasets/Generators/Hyperplane.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Datasets/Generators/Hyperplane/Hyperplane.php">[source]</a></span>

# Hyperplane
Generates a labeled dataset whose samples form a hyperplane in n-dimensional vector space and whose labels are continuous values drawn from a uniform random distribution between -1 and 1. When the number of coefficients is either 1, 2 or 3, the samples form points, lines, and planes respectively. Due to its linearity, Hyperplane is especially useful for testing linear regression models.
@@ -16,7 +16,7 @@ Generates a labeled dataset whose samples form a hyperplane in n-dimensional vec

## Example
```php
-use Rubix\ML\Datasets\Generators\Hyperplane;
+use Rubix\ML\Datasets\Generators\Hyperplane\Hyperplane;

$generator = new Hyperplane([0.1, 3, -5, 0.01], 150.0, 0.25);
```
4 changes: 2 additions & 2 deletions docs/datasets/generators/swiss-roll.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Datasets/Generators/SwissRoll.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Datasets/Generators/SwissRoll/SwissRoll.php">[source]</a></span>

# Swiss Roll
Generate a non-linear 3-dimensional dataset resembling a *swiss roll* or spiral. The labels are the seeds to the swiss roll transformation.
@@ -19,7 +19,7 @@ Generate a non-linear 3-dimensional dataset resembling a *swiss roll* or spiral.

## Example
```php
-use Rubix\ML\Datasets\Generators\SwissRoll;
+use Rubix\ML\Datasets\Generators\SwissRoll\SwissRoll;

$generator = new SwissRoll(5.5, 1.5, -2.0, 10, 21.0, 0.2);
```
8 changes: 4 additions & 4 deletions docs/regressors/adaline.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/Adaline.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/Adaline/Adaline.php">[source]</a></span>

# Adaline
*Adaptive Linear Neuron* is a single layer feed-forward neural network with a continuous linear output neuron suitable for regression tasks. Training is equivalent to solving L2 regularized linear regression ([Ridge](ridge.md)) online using Mini Batch Gradient Descent.
@@ -20,9 +20,9 @@

## Example
```php
-use Rubix\ML\Regressors\Adaline;
-use Rubix\ML\NeuralNet\Optimizers\Adam;
-use Rubix\ML\NeuralNet\CostFunctions\HuberLoss;
+use Rubix\ML\Regressors\Adaline\Adaline;
+use Rubix\ML\NeuralNet\Optimizers\Adam\Adam;
+use Rubix\ML\NeuralNet\CostFunctions\HuberLoss\HuberLoss;

$estimator = new Adaline(256, new Adam(0.001), 1e-4, 500, 1e-6, 5, new HuberLoss(2.5));
```
4 changes: 2 additions & 2 deletions docs/regressors/extra-tree-regressor.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/ExtraTreeRegressor.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/ExtraTreeRegressor/ExtraTreeRegressor.php">[source]</a></span>

# Extra Tree Regressor
*Extremely Randomized* Regression Trees differ from standard [Regression Trees](regression-tree.md) in that they choose candidate splits at random rather than searching the entire feature column for the best value to split on. Extra Trees are also faster to build and their predictions have higher variance than a regular decision tree regressor.
@@ -17,7 +17,7 @@

## Example
```php
-use Rubix\ML\Regressors\ExtraTreeRegressor;
+use Rubix\ML\Regressors\ExtraTreeRegressor\ExtraTreeRegressor;

$estimator = new ExtraTreeRegressor(30, 5, 0.05, null);
```
6 changes: 3 additions & 3 deletions docs/regressors/gradient-boost.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/GradientBoost.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/GradientBoost/GradientBoost.php">[source]</a></span>

# Gradient Boost
Gradient Boost (GBM) is a stage-wise additive ensemble that uses a Gradient Descent boosting scheme for training boosters (Decision Trees) to correct the error residuals of a base learner.
@@ -28,8 +28,8 @@ Gradient Boost (GBM) is a stage-wise additive ensemble that uses a Gradient Desc

## Example
```php
-use Rubix\ML\Regressors\GradientBoost;
-use Rubix\ML\Regressors\RegressionTree;
+use Rubix\ML\Regressors\GradientBoost\GradientBoost;
+use Rubix\ML\Regressors\RegressionTree\RegressionTree;
use Rubix\ML\CrossValidation\Metrics\SMAPE;

$estimator = new GradientBoost(new RegressionTree(3), 0.1, 0.8, 1000, 1e-4, 3, 10, 0.1, new SMAPE());
4 changes: 2 additions & 2 deletions docs/regressors/kd-neighbors-regressor.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/KDNeighborsRegressor.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/KDNeighborsRegressor/KDNeighborsRegressor.php">[source]</a></span>

# K-d Neighbors Regressor
A fast implementation of [KNN Regressor](knn-regressor.md) using a spatially-aware binary tree for nearest neighbors search. K-d Neighbors Regressor works by locating the neighborhood of a sample via binary search and then does a brute force search only on the samples close to or within the neighborhood of the unknown sample. The main advantage of K-d Neighbors over brute force KNN is inference speed, however, it cannot be partially trained.
@@ -16,7 +16,7 @@ A fast implementation of [KNN Regressor](knn-regressor.md) using a spatially-awa

## Example
```php
-use Rubix\ML\Regressors\KDNeighborsRegressor;
+use Rubix\ML\Regressors\KDNeighborsRegressor\KDNeighborsRegressor;
use Rubix\ML\Graph\Trees\BallTree;

$estimator = new KDNeighborsRegressor(20, true, new BallTree(50));
4 changes: 2 additions & 2 deletions docs/regressors/knn-regressor.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/KNNRegressor.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/KNNRegressor/KNNRegressor.php">[source]</a></span>

# KNN Regressor
K Nearest Neighbors (KNN) is a brute-force distance-based learner that locates the k nearest training samples from the training set and averages their labels to make a prediction. K Nearest Neighbors (KNN) is considered a *lazy* learner because it performs most of its computation at inference time.
@@ -19,7 +19,7 @@ K Nearest Neighbors (KNN) is a brute-force distance-based learner that locates t

## Example
```php
-use Rubix\ML\Regressors\KNNRegressor;
+use Rubix\ML\Regressors\KNNRegressor\KNNRegressor;
use Rubix\ML\Kernels\Distance\SafeEuclidean;

$estimator = new KNNRegressor(5, false, new SafeEuclidean());
14 changes: 7 additions & 7 deletions docs/regressors/mlp-regressor.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/MLPRegressor.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/MLPRegressor/MLPRegressor.php">[source]</a></span>

# MLP Regressor
A multilayer feed-forward neural network with a continuous output layer suitable for regression problems. The Multilayer Perceptron regressor is able to handle complex non-linear regression problems by forming higher-order representations of the input features using intermediate user-defined hidden layers. The MLP also has network snapshotting and progress monitoring to ensure that the model achieves the highest validation score per a given training time budget.
@@ -26,12 +26,12 @@ A multilayer feed-forward neural network with a continuous output layer suitable

## Example
```php
-use Rubix\ML\Regressors\MLPRegressor;
-use Rubix\ML\NeuralNet\CostFunctions\LeastSquares;
-use Rubix\ML\NeuralNet\Layers\Dense;
-use Rubix\ML\NeuralNet\Layers\Activation;
-use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;
-use Rubix\ML\NeuralNet\Optimizers\RMSProp;
+use Rubix\ML\Regressors\MLPRegressor\MLPRegressor;
+use Rubix\ML\NeuralNet\CostFunctions\LeastSquares\LeastSquares;
+use Rubix\ML\NeuralNet\Layers\Dense\Dense;
+use Rubix\ML\NeuralNet\Layers\Activation\Activation;
+use Rubix\ML\NeuralNet\ActivationFunctions\ReLU\ReLU;
+use Rubix\ML\NeuralNet\Optimizers\RMSProp\RMSProp;
use Rubix\ML\CrossValidation\Metrics\RSquared;

$estimator = new MLPRegressor([
4 changes: 2 additions & 2 deletions docs/regressors/radius-neighbors-regressor.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/RadiusNeighborsRegressor.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/RadiusNeighborsRegressor/RadiusNeighborsRegressor.php">[source]</a></span>

# Radius Neighbors Regressor
This is the regressor version of [Radius Neighbors](../classifiers/radius-neighbors.md) implementing a binary spatial tree under the hood for fast radius queries. The prediction is a weighted average of each label from the training set that is within a fixed user-defined radius.
@@ -18,7 +18,7 @@ This is the regressor version of [Radius Neighbors](../classifiers/radius-neighb

## Example
```php
-use Rubix\ML\Regressors\RadiusNeighborsRegressor;
+use Rubix\ML\Regressors\RadiusNeighborsRegressor\RadiusNeighborsRegressor;
use Rubix\ML\Graph\Trees\BallTree;
use Rubix\ML\Kernels\Distance\Diagonal;

6 changes: 3 additions & 3 deletions docs/regressors/regression-tree.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/RegressionTree.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/RegressionTree/RegressionTree.php">[source]</a></span>

# Regression Tree
A decision tree based on the CART (*Classification and Regression Tree*) learning algorithm that performs greedy splitting by minimizing the variance of the labels at each node split. Regression Trees can be used on their own or as the booster in algorithms such as [Gradient Boost](gradient-boost.md).
@@ -18,7 +18,7 @@ A decision tree based on the CART (*Classification and Regression Tree*) learnin

## Example
```php
-use Rubix\ML\Regressors\RegressionTree;
+use Rubix\ML\Regressors\RegressionTree\RegressionTree;

$estimator = new RegressionTree(20, 2, 1e-3, 10, null);
```
@@ -50,4 +50,4 @@ public balance() : ?int

## References:
[^1]: W. Y. Loh. (2011). Classification and Regression Trees.
-[^2]: K. Alsabti. et al. (1998). CLOUDS: A Decision Tree Classifier for Large Datasets.
\ No newline at end of file
+[^2]: K. Alsabti. et al. (1998). CLOUDS: A Decision Tree Classifier for Large Datasets.
4 changes: 2 additions & 2 deletions docs/regressors/ridge.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/Ridge.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/Ridge/Ridge.php">[source]</a></span>

# Ridge
L2 regularized linear regression solved using a closed-form solution. The addition of regularization, controlled by the *alpha* hyper-parameter, makes Ridge less likely to overfit the training data than ordinary least squares (OLS).
@@ -14,7 +14,7 @@ L2 regularized linear regression solved using a closed-form solution. The additi

## Example
```php
-use Rubix\ML\Regressors\Ridge;
+use Rubix\ML\Regressors\Ridge\Ridge;

$estimator = new Ridge(2.0);
```
4 changes: 2 additions & 2 deletions docs/regressors/svr.md
@@ -1,4 +1,4 @@
-<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/SVR.php">[source]</a></span>
+<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/Regressors/SVR/SVR.php">[source]</a></span>

# SVR
The Support Vector Machine Regressor (SVR) is a maximum margin algorithm for the purposes of regression. Similarly to the [SVC](../classifiers/svc.md), the model produced by SVR depends only on a subset of the training data, because the cost function for building the model ignores any training data close to the model prediction given by parameter *epsilon*. Thus, the value of epsilon defines a margin of tolerance where no penalty is given to errors.
@@ -33,7 +33,7 @@ public load(string $path) : void

## Example
```php
-use Rubix\ML\Regressors\SVR;
+use Rubix\ML\Regressors\SVR\SVR;
use Rubix\ML\Kernels\SVM\RBF;

$estimator = new SVR(1.0, 0.03, new RBF(), true, 1e-3, 256.0);