
Added ability to hunt for replay, as well as some more minor changes. #15

Open
WilburDoz wants to merge 5 commits into lindermanlab:callback_and_bugfix from WilburDoz:master

Conversation

Contributor

@WilburDoz WilburDoz commented Jun 25, 2023

In this version of the code the user can train the model on some neural data, extract the resulting sequence parameters, fix a subset of those, then use a model with fixed sequence parameters to go looking for replay of those specific sequences in new data.

This is demonstrated in the Jupyter notebook 'songbird-sacred.ipynb'. Sacred is the name we've been using for fixed sequences. To support this there is an additional function, sanctify_model, that takes the parameters describing a set of sequences and fixes them in a particular model.
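A rough sketch of the intended workflow, to make the idea concrete. Only `sanctify_model` is named in this PR; the surrounding function names (`construct_model`, `easy_sample!`, `extract_sequence_parameters`) and signatures are illustrative assumptions, not the actual API:

```julia
# Hypothetical workflow sketch -- names other than sanctify_model are assumed.
model = construct_model(config)              # build a model from the config
easy_sample!(model, train_spikes, config)    # train on the original data

seq_params = extract_sequence_parameters(model)  # assumed helper
sacred = seq_params[chosen_sequences]            # pick the subset to fix

# Pin those sequence parameters in a fresh model, then sample on new
# data to hunt for replay of the fixed sequences.
replay_model = sanctify_model(construct_model(config), sacred)
easy_sample!(replay_model, replay_spikes, config)
```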

Additionally there are some smaller changes:

  1. There are multiple types of warp values, and an additional parameter in the config that chooses between them. Their creation is done in model.jl
  2. Some small plotting edits are made in visualisation.jl
  3. The config is passed between functions a lot more. Much of the time it really seems to make more sense to pass the whole config rather than specific elements of the dict, so I took a step in that direction while implementing the sacred sequences. It's not the most logical mix at the moment, which is why the arguments many functions take have changed.

Member

@degleris1 degleris1 left a comment


Looks good overall. My main suggestion is to replace config = Dict(:key1 => value) with a struct GibbsSamplerParameters (or something like that). We can type the struct and provide default values for less commonly used arguments; this way we catch mismatched types early to prevent weird bugs / undefined behavior.

```julia
extra_split_merge_moves::Int64,
split_merge_window::Float64,
save_every::Int64;
config::Dict;
```
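A typed struct along the lines suggested might look like the following. This is only a sketch: the field names are guessed from the arguments visible in this PR, and the default values are placeholders, not the project's actual defaults. `Base.@kwdef` supplies keyword construction with defaults:

```julia
# Sketch of the suggested typed parameter struct (field names and
# defaults are illustrative, taken from arguments seen in this diff).
Base.@kwdef struct GibbsSamplerParameters
    num_anneals::Int64 = 10
    samples_per_anneal::Int64 = 100
    max_temperature::Float64 = 40.0
    extra_split_merge_moves::Int64 = 0
    split_merge_window::Float64 = 1.0
    save_every::Int64 = 1
end

params = GibbsSamplerParameters(num_anneals = 20)  # others keep defaults
```

With this in place, passing an array where a scalar is expected (e.g. `max_temperature = [40.0]`) fails at construction with a `MethodError` instead of surfacing as a strange bug deep inside the sampler.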
Member


Any reason why the arguments were replaced by config?

Two alternatives that might be safer:

  1. Use keyword arguments, e.g., num_anneals = 10.
  2. Create a struct GibbsParameters(num_anneals, samples_per_anneal, <etc>) and pass this as an argument.

Option 2 is essentially what you have here, but it's typed. Typing helps catch bugs early / avoid unexpected behavior when, for example, someone passes an array instead of a scalar for max_temperature.
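Option 1 could be sketched as below. The function and argument names are assumptions based on this thread, not the repository's real signatures; the point is that type-annotated keyword arguments also give an early `MethodError` on mismatched types:

```julia
# Sketch of option 1: typed keyword arguments with defaults.
# Names and defaults here are illustrative, not the actual API.
function easy_sample!(model, spikes;
                      num_anneals::Int64 = 10,
                      samples_per_anneal::Int64 = 100,
                      max_temperature::Float64 = 40.0)
    # ... sampler body unchanged ...
end

easy_sample!(model, spikes; num_anneals = 20)
# easy_sample!(model, spikes; max_temperature = [40.0])  -> MethodError
```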

Contributor Author

@WilburDoz WilburDoz Jul 31, 2023


Aha, yeah, this does seem non-optimal. Basically I added an extra variable that I needed to pass deep inside some nested functions, got tired of repeatedly changing the argument list for every function on the way down, and so shoved everything into the config. Very open to the idea that this is not the ideal way to do things.

More than happy to have a go at changing it.

```julia
Differs from easy_sample.jl in that it uses masks and masked gibbs sampling.
Used for doing cross-validation on speckled hold-out data.
"""
function easy_sample_masked!(
```
Member


Looks great.

```julia
# Create inverted masks to compute train log likelihood.
inv_masks = compute_complementary_masks(
    masks, num_neurons(model), model.max_time)
    masks, num_neurons(model), model.max_time+0.000000001)
```
Member


What is the purpose of this?

Can we define a keyword argument max_time_tolerance = 1e-8 or something?
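Something like the following, perhaps. The surrounding signature is assumed from context in this PR; only `compute_complementary_masks`, `num_neurons`, and `max_time` appear in the diff above:

```julia
# Sketch of the suggested keyword argument replacing the hard-coded
# epsilon. The argument list here is assumed, not the real signature.
function easy_sample_masked!(model, spikes, masks, config;
                             max_time_tolerance::Float64 = 1e-8)
    # Pad max_time slightly so a spike exactly on the boundary
    # still falls inside a mask.
    inv_masks = compute_complementary_masks(
        masks, num_neurons(model), model.max_time + max_time_tolerance)
    # ...
end
```

This keeps the workaround but makes it visible and tunable rather than a magic constant buried in the call.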

Contributor Author


I think there was some strange behaviour where it would get slightly confused about a spike right on the edge of the max time, and the whole thing would explode. This fixed it (though it is admittedly janky).

