
Walberla: Flag field should not be ghost communicated at every time step #4855

Closed
RudolfWeeber opened this issue Jan 26, 2024 · 1 comment · Fixed by #4864

In benchmarks, Espresso-dev with a pure LB fluid has worse scalability than 4.2.1, but better performance on low numbers of cores.

Possible cause: the flag_field (for boundaries) is part of the m_full_communication pack info, which is used for ghost communication after every time step. To my understanding, this is not necessary: the flag field only changes when cells are marked as fluid/boundary.

Possible solution:
have two pack infos,

  • one m_pdf_vel_force_communication for pdf, last applied force and velocity, to be run once per time step
  • one m_flag_field_communication for the flag field.

The first should be called in LBWalberlaImpl::ghost_communication(), the second only when the fluid/boundary state of a cell changes. A sketch of the split follows below.
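
A minimal sketch of what that split could look like, using waLBerla's UniformBufferedScheme and field::communication::PackInfo. The communicator member names follow the issue text; the Stencil, field types, BlockDataID members, and helper/hook names are illustrative assumptions, not the actual LBWalberlaImpl code:

```cpp
#include "blockforest/communication/UniformBufferedScheme.h"
#include "field/communication/PackInfo.h"

using Scheme = walberla::blockforest::communication::UniformBufferedScheme<Stencil>;

void LBWalberlaImpl::setup_communicators() {  // hypothetical helper
  using walberla::field::communication::PackInfo;

  // Communicated once per time step: pdfs, last applied force, velocity.
  m_pdf_vel_force_communication = std::make_shared<Scheme>(m_blocks);
  m_pdf_vel_force_communication->addPackInfo(
      std::make_shared<PackInfo<PdfField>>(m_pdf_field_id));
  m_pdf_vel_force_communication->addPackInfo(
      std::make_shared<PackInfo<VectorField>>(m_last_applied_force_field_id));
  m_pdf_vel_force_communication->addPackInfo(
      std::make_shared<PackInfo<VectorField>>(m_velocity_field_id));

  // Communicated only when the fluid/boundary state of a cell changes.
  m_flag_field_communication = std::make_shared<Scheme>(m_blocks);
  m_flag_field_communication->addPackInfo(
      std::make_shared<PackInfo<FlagField>>(m_flag_field_id));
}

void LBWalberlaImpl::ghost_communication() {
  (*m_pdf_vel_force_communication)();  // once per time step
}

void LBWalberlaImpl::on_boundary_change() {  // hypothetical hook
  (*m_flag_field_communication)();  // only after cells are re-flagged
}
```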

BTW: the pull-scheme integration already calls ghost_communication(); the push scheme directly calls the call operator of the pack info. This can be changed to ghost_communication() as well.
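
For the push scheme that change would be a one-liner, assuming the integration method looks roughly like this (the method and sweep names are hypothetical):

```cpp
void LBWalberlaImpl::integrate_push_scheme() {  // hypothetical name
  stream_and_collide();  // hypothetical kernel sweep
  // Before: direct call operator of the pack info, flag field included:
  //   (*m_full_communication)();
  // After: reuse the shared helper, which would then skip the flag field:
  ghost_communication();
}
```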

Further options:

  • not ghost-communicating the velocity field and instead recalculating it in the ghost layers from the pdf and the last applied force; this would be more work to implement, though (see the sketch below).
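
A sketch of what that recalculation could look like for a single ghost cell, assuming the common half-force velocity shift u = (Σ_q f_q c_q + F/2) / ρ; Q, the direction table c[q][d], and the field accessors are illustrative placeholders, not the actual field API:

```cpp
#include <array>
#include <cstddef>

constexpr std::size_t Q = 19;                      // e.g. a D3Q19 lattice
extern const std::array<std::array<int, 3>, Q> c;  // lattice direction vectors

template <typename PdfField, typename VectorField>
void recompute_ghost_velocity(PdfField const &pdf, VectorField const &force,
                              VectorField &vel, int x, int y, int z) {
  double rho = 0.0;
  std::array<double, 3> mom = {0.0, 0.0, 0.0};
  // Zeroth and first moments of the pdfs in this cell.
  for (std::size_t q = 0; q < Q; ++q) {
    auto const f_q = pdf.get(x, y, z, q);
    rho += f_q;
    for (int d = 0; d < 3; ++d)
      mom[d] += f_q * c[q][d];
  }
  // Half-force shift: u = (momentum + F/2) / rho.
  for (int d = 0; d < 3; ++d)
    vel.get(x, y, z, d) = (mom[d] + 0.5 * force.get(x, y, z, d)) / rho;
}
```
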
jngrad commented Feb 6, 2024

Quick benchmark on 8 cores: removing the flag field from the ghost communicator speeds up simulations by 2%.
