Conversation
Christian-B
left a comment
remove one _n_synapse_cores and one _allow_delay_extensions
spynnaker/pyNN/spinnaker.py
db.insert_version("lazyarray_version", lazyarray_version)

# Clears all previously added ceiling on the number of neurons per core
AbstractPyNNModel.reset_all()
Not needed, as this is also called by AbstractPyNNNeuronModel.reset_all()
Yes, I wasn't certain whether that actually did the right thing; I guess that is true though!
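The point above can be sketched as follows. This is a hypothetical illustration of why calling the subclass's reset_all() is enough: the class names mirror the PR, but the bodies and the _max_atoms_per_core attribute are illustrative assumptions, not actual sPyNNaker code.

```python
# Hypothetical sketch: if the per-core ceilings live on the shared base
# class, calling reset_all() through the subclass clears them for all
# models, so a second call on the base class is redundant.

class AbstractPyNNModel:
    # Assumed storage for ceilings on neurons per core, keyed by model.
    _max_atoms_per_core: dict = {}

    @classmethod
    def reset_all(cls):
        # Clears all previously added ceilings on neurons per core.
        cls._max_atoms_per_core.clear()


class AbstractPyNNNeuronModel(AbstractPyNNModel):
    # Inherits reset_all(); the dict lives on the shared base class.
    pass


AbstractPyNNModel._max_atoms_per_core["IF_curr_exp"] = 64
AbstractPyNNNeuronModel.reset_all()
assert AbstractPyNNModel._max_atoms_per_core == {}
```

Because the attribute lookup resolves to the base class's dict, the subclass call and the base-class call clear the same state.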
spynnaker/pyNN/models/abstract_models/sends_synaptic_inputs_over_sdram.py
spynnaker/pyNN/extra_algorithms/splitter_components/spynnaker_splitter_selector.py
@abstractmethod
def synapses_per_second(self) -> int:
    """
    Approximate number of synapses that can be processed per second
It would be good to add a comment about whether this underestimates, overestimates, or is as close as possible.
Should that not be the other way around? Underestimating will create a system that cannot handle the received spikes.
No: if you underestimate how many spikes you can process, you can process more than you estimate; that is, your estimate is under what you can really handle. If you overestimate, you are guessing you can do more than you really can, so more spikes will be sent to you than you can handle.
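A small numeric sketch of the argument above, with made-up numbers (the function, rates, and core counts here are illustrative assumptions, not values from the PR or from sPyNNaker):

```python
# Illustration: how the direction of the synapses_per_second estimate
# affects how many cores the partitioner allocates, and thus safety.

def cores_needed(synapse_events_per_second: int,
                 estimated_synapses_per_second: int) -> int:
    # Hypothetical partitioner rule: spread the incoming load over
    # enough cores that each core's share stays within the estimate.
    # (Ceiling division via negation.)
    return -(-synapse_events_per_second // estimated_synapses_per_second)


actual_capacity = 100_000   # what a core can truly process per second
demand = 350_000            # synapse events per second arriving

# Underestimate (80k < 100k): 5 cores, each gets 70k/s, below capacity.
under = cores_needed(demand, 80_000)

# Overestimate (120k > 100k): 3 cores, each gets ~116.7k/s, above the
# real capacity of 100k/s, so the cores are overloaded.
over = cores_needed(demand, 120_000)

print(under, over)  # 5 3
```

So an underestimate wastes cores but stays safe, while an overestimate produces a system that receives more spikes than it can handle, matching the reasoning in the comment above.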
This integrates the execution of neuron code and synapse code on separate cores more completely, making it easier for the user to use. This includes:
What it doesn't do:
Requires:
Tested by: