This is a project repository for code developed in the course FYS-STK4155 at the University of Oslo. All code in projects 1 and 2 was written in collaboration with Ellen Reeka and Ines Santandreu.
Implementation of Ordinary Least Squares (OLS), Ridge, and LASSO regression for studying topographic data.
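A minimal sketch of how these three estimators can be computed, on a hypothetical one-dimensional toy data set (the project's actual design matrices and data may differ). OLS and Ridge use their closed-form solutions; LASSO has no closed form, so sklearn's solver is used:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)

# Hypothetical toy data: y = 2 + 3x + noise
x = rng.uniform(0, 1, 100)
y = 2.0 + 3.0 * x + rng.normal(0, 0.1, 100)
X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept column

# OLS closed form: beta = (X^T X)^{-1} X^T y
beta_ols = np.linalg.pinv(X.T @ X) @ X.T @ y

# Ridge closed form: beta = (X^T X + lambda I)^{-1} X^T y
lam = 1e-3
beta_ridge = np.linalg.inv(X.T @ X + lam * np.eye(X.shape[1])) @ X.T @ y

# LASSO: coordinate descent via sklearn (intercept fitted separately)
lasso = Lasso(alpha=1e-4).fit(x.reshape(-1, 1), y)
```

With a small penalty, all three recover coefficients close to the true intercept 2 and slope 3.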
Implementation of a feed-forward neural network (FFNN) from scratch with backpropagation. Under Examples, the code is applied to both regression and binary classification.
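A compact sketch of the kind of from-scratch FFNN this describes: one hidden sigmoid layer trained by backpropagation and plain gradient descent on a hypothetical toy regression problem. The layer size, learning rate, and data are illustrative assumptions, not the project's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy regression data: y = sin(pi * x)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(np.pi * X)

n_hidden = 16
W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    a1 = sigmoid(X @ W1 + b1)   # hidden activation
    return a1, a1 @ W2 + b2     # linear output for regression

_, out0 = forward(X)
mse_before = np.mean((out0 - y) ** 2)

eta = 0.1
for epoch in range(2000):
    a1, out = forward(X)
    # Backpropagation of the MSE gradient
    d_out = 2 * (out - y) / len(X)
    dW2 = a1.T @ d_out
    db2 = d_out.sum(axis=0)
    d_a1 = (d_out @ W2.T) * a1 * (1.0 - a1)   # chain rule through sigmoid
    dW1 = X.T @ d_a1
    db1 = d_a1.sum(axis=0)
    # Gradient-descent update
    W1 -= eta * dW1; b1 -= eta * db1
    W2 -= eta * dW2; b2 -= eta * db2

_, out_final = forward(X)
mse_after = np.mean((out_final - y) ** 2)
```

For binary classification the output layer would instead use a sigmoid with the cross-entropy loss; the backpropagation structure is the same.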
Implementation of Keras DNN and xgboost models to predict thermal conductivity in inorganic materials. Hyperparameter tuning of the regressors is central to this project.
Library: contains calls that return scaled, normalized, and train-test-split data sets.
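A sketch of what such a library call might look like, using sklearn's standard tools; the helper name `prepare_data` and its signature are assumptions for illustration:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def prepare_data(X, y, test_size=0.2, seed=0):
    """Hypothetical helper: split, then scale using training statistics only."""
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=test_size, random_state=seed)
    scaler = StandardScaler().fit(X_train)   # fit on training data to avoid leakage
    return scaler.transform(X_train), scaler.transform(X_test), y_train, y_test

# Toy usage
X = np.random.default_rng(1).normal(size=(50, 3))
y = X.sum(axis=1)
X_tr, X_te, y_tr, y_te = prepare_data(X, y)
```

Fitting the scaler on the training split only, then applying it to both splits, is the standard way to keep test data out of the preprocessing statistics.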
In this code, Keras is used to define a neural-network model builder, which is then wrapped in a Keras Tuner search engine for hyperparameter optimization.
In this code, xgboost is used to instantiate XGBRegressor(). The regressor is wrapped in the hyperparameter search function RandomizedSearchCV() from sklearn to determine the optimal configuration of the XGBRegressor. Once found, a final 'hypermodel' is built with this parameter setup and saved as the tuned regressor.
Here each tuned regressor is loaded and studied with plots and metric computations.
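A minimal sketch of this evaluation step, assuming the tuned regressors were persisted with joblib (the file name is hypothetical) and using standard sklearn metrics; stand-in predictions replace a real model here:

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score
# import joblib
# model = joblib.load("tuned_regressor.joblib")   # hypothetical file name
# y_pred = model.predict(X_test)

# Stand-in values so the metric computations can be shown
y_test = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])

mse = mean_squared_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)
```

The same predictions would feed the diagnostic plots, e.g. predicted-versus-true scatter plots.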