From ee8e2dac6e23324fe88473d48e6c5f590a187a57 Mon Sep 17 00:00:00 2001
From: lewardo
Date: Mon, 21 Aug 2023 13:34:44 +0100
Subject: [PATCH 1/2] polynomialregressor reference documentation file

---
 doc/PolynomialRegressor.rst | 66 +++++++++++++++++++++++++++++++++++++
 1 file changed, 66 insertions(+)
 create mode 100644 doc/PolynomialRegressor.rst

diff --git a/doc/PolynomialRegressor.rst b/doc/PolynomialRegressor.rst
new file mode 100644
index 0000000..caf535a
--- /dev/null
+++ b/doc/PolynomialRegressor.rst
@@ -0,0 +1,66 @@
+:digest: Regression with a polynomial
+:species: data
+:sc-categories: Statistical regression
+:sc-related:
+:see-also: DataSet
+:description:
+
+   Perform regression between :fluid-obj:`DataSet`\s using N parallel 1-to-1 polynomial regressors in one object.
+
+:discussion:
+
+   A polynomial regressor is a very simple algorithm that, given a set of input-output univariate pairs (:math:`x` to :math:`y`, for example), will find the curve of best fit for that data. Linear regression is a special case where the ``degree`` of the polynomial is ``1``, meaning the highest power of :math:`x` fitted is ``1``.
+   Essentially, each element of each input point is mapped to the corresponding element of the same output point, hence the data needs to be an N-to-N corpus, which is one limitation of this algorithm.
+   Tikhonov regularisation is an extension of this algorithm, which compensates for noisy data and reduces overfitting in certain situations; a good explanation of how it works can be found on Wikipedia (https://en.wikipedia.org/wiki/Ridge_regression#Tikhonov_regularization).
+
+:control degree:
+
+   An integer that specifies the degree (highest power of :math:`x`) that the fitted polynomial will have; e.g. a degree of 2 means that the polynomial will have the form :math:`y = \alpha + \beta x + \gamma x^2`.
+
+:control tikhonov:
+
+   A floating point value that describes the strength of the Tikhonov filter, often denoted as :math:`\alpha` in explanations of the algorithm.
+
+:message fit:
+
+   :arg sourceDataSet: Source data
+
+   :arg targetDataSet: Target data
+
+   Fit the polynomials to map between a source and target :fluid-obj:`DataSet`.
+
+:message predict:
+
+   :arg sourceDataSet: Input data
+
+   :arg targetDataSet: Output data
+
+   Apply the regressed mapping to a :fluid-obj:`DataSet` (given a regressed set of polynomials).
+
+:message predictPoint:
+
+   :arg sourceBuffer: Input point
+
+   :arg targetBuffer: Output point
+
+   Apply the regressed mapping to a single data point in a |buffer|.
+
+:message clear:
+
+   This will erase all the learned polynomials.
+
+:message print:
+
+   Print object information to the console.
+
+:message read:
+
+   :arg fileName: File to read from (optional, will prompt if not present)
+
+   Load regressed polynomials from a JSON file on disk.
+
+:message write:
+
+   :arg fileName: File to write to (optional, will prompt if not present)
+
+   Write the current regression to a JSON file on disk.

From 20ef9f7ec243591cb05a464c479d2d49e1b94c5a Mon Sep 17 00:00:00 2001
From: lewardo
Date: Wed, 30 Aug 2023 12:06:05 +0100
Subject: [PATCH 2/2] less terse programmer-y description (still needs improvement)

---
 doc/PolynomialRegressor.rst | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/doc/PolynomialRegressor.rst b/doc/PolynomialRegressor.rst
index caf535a..1c9991d 100644
--- a/doc/PolynomialRegressor.rst
+++ b/doc/PolynomialRegressor.rst
@@ -1,17 +1,17 @@
-:digest: Regression with a polynomial
+:digest: Polynomial regression on data sets.
 :species: data
 :sc-categories: Statistical regression
 :sc-related:
 :see-also: DataSet
 :description:

-   Perform regression between :fluid-obj:`DataSet`\s using N parallel 1-to-1 polynomial regressors in one object.
+   Perform regression between fluid.dataset\s using N parallel 1-to-1 polynomial regressors in one object.

 :discussion:

-   A polynomial regressor is a very simple algorithm that, given a set of input-output univariate pairs (:math:`x` to :math:`y`, for example), will find the curve of best fit for that data. Linear regression is a special case where the ``degree`` of the polynomial is ``1``, meaning the highest power of :math:`x` fitted is ``1``.
+   A polynomial regressor is a very simple algorithm that, given a set of input-output pairs (:math:`x` to :math:`y`, for example), will find the curve of best fit for that data. Linear regression is a special case where the ``degree`` of the polynomial is ``1``, meaning the highest power of :math:`x` fitted is ``1`` (i.e. it finds the best :math:`y = mx + c` to fit the data).
    Essentially, each element of each input point is mapped to the corresponding element of the same output point, hence the data needs to be an N-to-N corpus, which is one limitation of this algorithm.
-   Tikhonov regularisation is an extension of this algorithm, which compensates for noisy data and reduces overfitting in certain situations; a good explanation of how it works can be found on Wikipedia (https://en.wikipedia.org/wiki/Ridge_regression#Tikhonov_regularization).
+   Tikhonov regularisation is an improvement of this algorithm, which compensates for noisy data and reduces overfitting in certain situations; a good explanation of how it works can be found on Wikipedia (https://en.wikipedia.org/wiki/Ridge_regression#Tikhonov_regularization).

 :control degree:

@@ -19,7 +19,7 @@

 :control tikhonov:

-   A floating point value that describes the strength of the Tikhonov filter, often denoted as :math:`\alpha` in explanations of the algorithm.
+   A floating point value that describes the strength of the Tikhonov filter, namely how much the algorithm is penalised for overfitting to the data.

 :message fit:

@@ -31,11 +31,11 @@

 :message predict:

-   :arg sourceDataSet: Input data
+   :arg sourceDataSet: Input :fluid-obj:`DataSet`

-   :arg targetDataSet: Output data
+   :arg targetDataSet: Output :fluid-obj:`DataSet`

-   Apply the regressed mapping to a :fluid-obj:`DataSet` (given a regressed set of polynomials).
+   Apply the regressed mapping to a :fluid-obj:`DataSet` and predict the output value for each point.

 :message predictPoint:
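To connect the :discussion: text above with the underlying maths, the following is a minimal NumPy sketch of a single univariate polynomial fit with a Tikhonov penalty, i.e. the per-dimension operation that the object applies N times in parallel. It is purely illustrative: it is not the FluCoMa implementation or its API, and the function names are invented for this example. ::

    # Illustrative sketch only, not the FluCoMa code or API.
    import numpy as np

    def fit_polynomial(x, y, degree=2, tikhonov=0.0):
        """Fit a polynomial of the given degree to (x, y) with a Tikhonov (ridge) penalty."""
        V = np.vander(x, degree + 1, increasing=True)   # columns: x^0, x^1, ..., x^degree
        # Normal equations, with the Tikhonov term added to the diagonal
        A = V.T @ V + tikhonov * np.eye(degree + 1)
        b = V.T @ y
        return np.linalg.solve(A, b)                    # coefficients [a0, a1, ..., a_degree]

    def predict(coeffs, x):
        """Evaluate the fitted polynomial at the points in x."""
        return np.vander(x, len(coeffs), increasing=True) @ coeffs

    # Toy data: a noisy parabola, recovered with degree = 2
    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 50)
    y = 0.5 + 2.0 * x - 1.5 * x**2 + 0.1 * rng.standard_normal(x.size)

    coeffs = fit_polynomial(x, y, degree=2, tikhonov=0.01)
    print(coeffs)                        # approximately [0.5, 2.0, -1.5]
    print(predict(coeffs, np.array([0.25])))

Raising the ``tikhonov`` value shrinks the fitted coefficients towards zero, which is what tames overfitting on noisy data, at the cost of a slightly looser fit to the training points.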