Added ability to hunt for replay, as well as some more minor changes. #15
WilburDoz wants to merge 5 commits into lindermanlab:callback_and_bugfix from
Conversation
… and a new example
degleris1 left a comment
Looks good overall. My main suggestion is to replace `config = Dict(:key1 => value)` with a struct `GibbsSamplerParameters` (or something like that). We can type the struct and provide default values for less commonly used arguments; this way we catch mismatched types early to prevent weird bugs / undefined behavior.
```diff
-    extra_split_merge_moves::Int64,
-    split_merge_window::Float64,
-    save_every::Int64;
+    config::Dict;
```
Any reason why the arguments were replaced by `config`?

Two alternatives that might be safer:

1. Use keyword arguments, e.g., `num_anneals = 10`.
2. Create a struct `GibbsParameters(num_anneals, samples_per_anneal, <etc>)` and pass this as an argument.

Option 2 is essentially what you have here, but it's typed. Typing helps catch bugs early / avoid unexpected behavior when, for example, someone passes an array instead of a scalar for `max_temperature`.
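A minimal sketch of Option 2 using `Base.@kwdef`; the field names are taken from this diff and the comments here, but the defaults are illustrative, not the package's actual values:

```julia
# Typed sampler parameters with defaults (default values are illustrative).
Base.@kwdef struct GibbsParameters
    num_anneals::Int64 = 10
    samples_per_anneal::Int64 = 100
    max_temperature::Float64 = 40.0
    extra_split_merge_moves::Int64 = 0
    split_merge_window::Float64 = 1.0
    save_every::Int64 = 1
end

params = GibbsParameters(num_anneals=5)    # ok; other fields take defaults
# GibbsParameters(max_temperature=[40.0])  # fails fast with a MethodError
```

Mistyped arguments then fail at construction time rather than deep inside the sampler.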
Aha, yeah, this does seem non-optimal. Basically I added an extra variable that I needed to pass deep inside some nesting of functions, and got annoyed having to repeatedly change the argument list for every function on the way down, so I shoved everything into the config. Very open to the idea that this is not the ideal way to do things.

More than happy to have a go at changing it.
```julia
Differs from easy_sample.jl in that it uses masks and masked gibbs sampling.
Used for doing cross-validation on speckled hold-out data.
"""
function easy_sample_masked!(
```
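For context, a "speckled" holdout masks out a random scatter of (neuron, time-bin) cells rather than a contiguous block. A minimal illustrative sketch below; this is not necessarily the mask representation actually consumed by `easy_sample_masked!`:

```julia
using Random

# Illustrative only: build a random "speckled" holdout pattern over a
# (neurons x time-bins) grid. true marks a held-out cell.
function speckled_mask(n_neurons::Int, n_bins::Int;
                       holdout_frac::Float64=0.2, rng=Random.GLOBAL_RNG)
    return rand(rng, n_neurons, n_bins) .< holdout_frac
end

mask = speckled_mask(50, 1000)   # ~20% of cells held out for validation
```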
```diff
 # Create inverted masks to compute train log likelihood.
 inv_masks = compute_complementary_masks(
-    masks, num_neurons(model), model.max_time)
+    masks, num_neurons(model), model.max_time+0.000000001)
```
What is the purpose of this?

Can we define a keyword argument `max_time_tolerance = 1e-8` or something?
I think there was some strange behaviour where it would get slightly confused about a spike sitting right on the edge of the max time, and the whole thing would explode. This fixed it (though it is admittedly janky).
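A rough sketch of the keyword-argument approach suggested above; the surrounding signature is hypothetical, since only `compute_complementary_masks`, `num_neurons`, and `model.max_time` appear in the actual diff:

```julia
# Sketch: thread the tolerance through as a keyword argument instead of
# hard-coding an epsilon at the call site. Signature is hypothetical.
function easy_sample_masked!(
    model, spikes, masks, config::Dict;
    max_time_tolerance::Float64 = 1e-8,
)
    inv_masks = compute_complementary_masks(
        masks, num_neurons(model), model.max_time + max_time_tolerance)
    # ... remainder of the masked sampler unchanged ...
end
```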
In this version of the code, the user can train the model on some neural data, extract the resulting sequence parameters, fix a subset of those, and then use a model with fixed sequence parameters to go looking for replay of those specific sequences in new data.

This is demonstrated in the Jupyter notebook `songbird-sacred.ipynb`. "Sacred" is the name we've been using for fixed sequences. To do this there is an additional function, `sanctify_model`, that takes the parameters describing a set of sequences and fixes them in a particular model; a hypothetical usage sketch follows below.
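Since the exact signature of `sanctify_model` isn't shown in this conversation, the following is a hypothetical sketch of the train → extract → fix → hunt workflow described above; every name other than `sanctify_model` is an assumption:

```julia
# Hypothetical workflow sketch. Only sanctify_model is named in the PR;
# the other helpers, and all signatures, are assumptions for illustration.

# 1. Train on the original data and extract the learned sequence parameters.
model = easy_sample!(train_model, train_spikes, config)    # assumed call
seq_params = extract_sequence_parameters(model)            # assumed helper

# 2. Fix ("sanctify") a subset of those sequences in a fresh model.
sacred_seqs = seq_params[1:2]                     # e.g. keep the first two
replay_model = sanctify_model(fresh_model, sacred_seqs)    # signature assumed

# 3. Sample on the new data: the sanctified sequences stay fixed, so spikes
#    assigned to them indicate replay of the trained sequences.
easy_sample!(replay_model, new_spikes, config)
```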
Additionally, there are some smaller changes: