
Fuse PSM rule change #567

Merged 2 commits into master on Feb 28, 2023
Conversation

neworderofjamie (Contributor)
When fusing postsynaptic models, the decision about which could be fused was previously very conservative: any model with state variables or extra global parameters was immediately excluded. This makes simulating large models with e.g. alpha synapses pretty much impossible, so here I've relaxed the rules to match those used for fusing the pre- and postsynaptic updates of weight update models:

  • Postsynaptic models cannot be merged if they have any extra global parameters that are referenced in their code strings (a bit silly, but consistent with the weight update models).
  • Models with state variables can be merged as long as the variables are initialized to constants; two postsynaptic models will be merged if these constant values are the same.

I think this makes sense 😄
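To make the relaxed rules concrete, here is a minimal sketch of the fusion-compatibility check they describe. The names (`PostsynapticModel`, `is_fusable`, `can_fuse`) and the string-containment test for extra global parameter references are illustrative assumptions, not the actual GeNN implementation:

```python
# Hypothetical sketch of the relaxed PSM fusion rules; names and the
# substring-based "referenced in code" check are illustrative, not GeNN's API.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class PostsynapticModel:
    code: str                                         # postsynaptic code string
    extra_global_params: List[str] = field(default_factory=list)
    # variable name -> constant initial value (None if not constant-initialized)
    var_init: Dict[str, Optional[float]] = field(default_factory=dict)

def is_fusable(psm: PostsynapticModel) -> bool:
    """A model is fusable if no extra global parameter is referenced in its
    code string and every state variable has a constant initial value."""
    if any(egp in psm.code for egp in psm.extra_global_params):
        return False
    return all(v is not None for v in psm.var_init.values())

def can_fuse(a: PostsynapticModel, b: PostsynapticModel) -> bool:
    """Two postsynaptic models can be fused if both are individually fusable,
    they share the same code, and their constant initial values match."""
    return (is_fusable(a) and is_fusable(b)
            and a.code == b.code
            and a.var_init == b.var_init)
```

Under these rules, two exponential-decay models with identical constant decay values would fuse, while a model whose code references an extra global parameter would not.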

@neworderofjamie neworderofjamie added this to the GeNN 4.9.0 milestone Feb 20, 2023
@tnowotny (Member) left a comment:
Difficult business this but you have convinced me!

@neworderofjamie neworderofjamie merged commit 055e22c into master Feb 28, 2023
@neworderofjamie neworderofjamie deleted the fuse_psm_rule_change branch February 28, 2023 16:51
@neworderofjamie neworderofjamie modified the milestones: GeNN 4.9.0, GeNN 4.8.1 Mar 1, 2023