thoughtminers provides a set of recipes and convenience functions for the graphical and probabilistic representation and manipulation of reasoning spaces from large language models (LLMs). Current functionalities include visualisation of the latent activations of reasoning traces and inference-time scaling methods based on uncertainty minimisation.
The last official release can be installed from PyPI:

```bash
pip install thoughtminers
```

The current version under development can be installed from the master branch of the GitHub repository:

```bash
pip install "git+https://github.com/centre-for-humanities-computing/thoughtminers.git"
```

See this tutorial for more details.
This package is developed and maintained by Nicolas Legrand with the support of the Centre for Humanities Computing.

