# Experanto

Experanto is a Python package for interpolating recordings and stimuli in neuroscience experiments. It lets users load single or multiple experiments and build efficient dataloaders for machine learning applications.
> [!IMPORTANT]
> If you're interested in contributing or looking for a Google Summer of Code project, we discuss ideas and planned features in this ideas thread; please share your thoughts there before opening a PR.
## Features

- **Unified Experiment Interface**: Load and query multi-modal neuroscience data (neural responses, eye tracking, treadmill, visual stimuli) through a single `Experiment` class
- **Flexible Interpolation**: Interpolate data at arbitrary time points with support for linear and nearest-neighbor methods
- **Multi-Session Support**: Combine data from multiple recording sessions into a single dataloader
- **Configurable Preprocessing**: YAML-based configuration for sampling rates, normalization, transforms, and filtering
- **PyTorch Integration**: Native PyTorch `Dataset` and `DataLoader` implementations optimized for training
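To illustrate the difference between the two interpolation methods, here is a conceptual sketch in plain NumPy (this is not Experanto's API, just a toy illustration of what linear vs. nearest-neighbor interpolation means):

```python
import numpy as np

# A toy signal sampled at 10 Hz for 1 second.
sample_times = np.arange(0, 1.0, 0.1)        # 10 sample times: 0.0, 0.1, ..., 0.9
signal = np.sin(2 * np.pi * sample_times)    # one sine cycle

# Query at arbitrary time points that fall between samples.
query = np.array([0.05, 0.42, 0.87])

# Linear interpolation: weighted average of the two neighboring samples.
linear = np.interp(query, sample_times, signal)

# Nearest-neighbor interpolation: snap each query time to the closest sample.
idx = np.abs(sample_times[None, :] - query[:, None]).argmin(axis=1)
nearest = signal[idx]

print(linear)
print(nearest)
```

Linear interpolation suits smooth, continuous signals (e.g. treadmill velocity), while nearest-neighbor is appropriate for discrete-valued data such as stimulus frames.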
## Installation

```bash
git clone https://github.com/sensorium-competition/experanto.git
cd experanto
pip install -e .
```

To replicate the `generate_sample` example, use the following command (see `allen_exporter`):

```bash
pip install -e /path/to/allen_exporter
```

To replicate the `sensorium_example` (see `sensorium_2023`), install `neuralpredictors` as well:

```bash
pip install -e /path/to/neuralpredictors
pip install -e /path/to/sensorium_2023
```

## Quick Start

```python
import numpy as np

from experanto.experiment import Experiment

# Load a single experiment
exp = Experiment("/path/to/experiment")

# Query data at specific time points
times = np.linspace(0, 10, 100)  # 100 time points over 10 seconds

# Get interpolated data from all devices
data = exp.interpolate(times)

# Or from a specific device
responses = exp.interpolate(times, device="responses")
```

## Configuration

Experanto uses YAML configuration files. See `configs/default.yaml` for all options:
```yaml
dataset:
  modality_config:
    responses:
      sampling_rate: 8
      chunk_size: 16
      transforms:
        normalization: "standardize"
    screen:
      sampling_rate: 30
      chunk_size: 60
      transforms:
        normalization: "normalize"
dataloader:
  batch_size: 16
  num_workers: 2
```

## Documentation

Full documentation is available at Read the Docs.
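A note on the per-modality settings in the example config: assuming `chunk_size` counts samples at that modality's `sampling_rate` (an interpretation of the sample values above, not an authoritative statement of the API), the values can be sanity-checked to confirm that every modality covers the same time span per chunk:

```python
# Modality settings mirroring the example config above.
modality_config = {
    "responses": {"sampling_rate": 8, "chunk_size": 16},
    "screen": {"sampling_rate": 30, "chunk_size": 60},
}

# Chunk duration in seconds = samples per chunk / samples per second.
durations = {
    name: cfg["chunk_size"] / cfg["sampling_rate"]
    for name, cfg in modality_config.items()
}
print(durations)  # both modalities span 2.0 s per chunk

# Equal durations mean chunks stay temporally aligned across devices.
assert len(set(durations.values())) == 1
```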
## Contributing

Contributions are welcome! Please read our Contributing Guide and open an issue or submit a pull request on GitHub.
## License

This project is licensed under the MIT License. See the `LICENSE` file for details.