
Fast logs in stellar collapse #295

Merged: 23 commits merged into main from jmm/fast-logs-stellar-collapse, Aug 31, 2023

Conversation

@Yurlungur (Collaborator) commented Aug 19, 2023

PR Summary

For a while, I've been meaning to pull the fast logs code through the stellar collapse EOS model, as well as the singularity-opac models. This PR implements the first part of that: it threads fast logs through the stellar collapse EOS. A few notes:

  • StellarCollapse supports reading either sp5 files or h5 files in the original stellar collapse format. I maintain that capability and add machinery to re-interpolate the stellar collapse tables to the new fast log grid (see the sketch after this list for the fast-log idea). That said, this means loading from h5 files will be even slower, so loading from sp5 files is highly recommended for production.
  • I cleaned up the docs a little bit.
  • The README was getting pretty out of sync with the build docs. I just removed all non-essential information.
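For context, here is a minimal sketch of the fast-log idea, assuming the frexp-based first-order variant; this is an illustration only, not the exact code in singularity-eos/base/fast-math/logs.hpp. It replaces the true log2 with a piecewise-linear approximation that is cheap to evaluate and cheap to invert, which is what makes a log-spaced table grid fast to index:

```cpp
#include <cmath>

// Hypothetical first-order fast-log sketch (illustrative only): write
// x = m * 2^e with m in [0.5, 1), then replace log2(m) with the straight
// line through its endpoints, log2(m) ~ 2m - 2.
inline double fast_log2(double x) {
  int e;
  const double m = std::frexp(x, &e); // x = m * 2^e, m in [0.5, 1)
  return static_cast<double>(e) + 2.0 * m - 2.0;
}

// Exact inverse of the approximation above, so a table gridded in
// "fast log" space round-trips without additional error.
inline double fast_pow2(double y) {
  const double flr = std::floor(y);
  const double r = y - flr; // fractional part, in [0, 1)
  return std::ldexp(0.5 * (r + 1.0), static_cast<int>(flr) + 1);
}
```

Because the forward and inverse maps here are exact inverses of each other, gridding a table in fast-log space introduces no error by itself; the only error comes from re-interpolating existing h5 tables onto the new grid, which is why pre-gridded sp5 files load faster.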

PR Checklist

  • Adds a test for any bugs fixed. Adds tests for new features.
  • Format your changes by using the make format command after configuring with cmake.
  • Document any new features, update documentation for changes made.
  • Make sure the copyright notice on any files you modified is up to date.
  • After creating a pull request, note it in the CHANGELOG.md file
  • If preparing for a new release, update the version in cmake.

@Yurlungur (Collaborator, Author)

FYI @brryan @AstroBarker

@brryan (Collaborator) left a comment


Some questions but I'll approve now.

Resolved review threads:
  • singularity-eos/base/fast-math/logs.hpp
  • singularity-eos/eos/eos_stellar_collapse.hpp (two threads)
  • doc/sphinx/src/models.rst
@AstroBarker (Collaborator) left a comment


Thanks @Yurlungur ! LGTM

Resolved review thread: README.md
@dholladay00 (Collaborator)

How does this perform, both in speed and accuracy, versus the previous implementation @Yurlungur?

@Yurlungur (Collaborator, Author)

@dholladay00 @jhp-lanl @brryan I believe all comments have been addressed.

> How does this perform, both in speed and accuracy, versus the previous implementation @Yurlungur?

In principle, there's no error due to the fast log gridding itself. However, reinterpolation introduces an error proportional to 1/N², where N is the number of grid points. The performance improvement of P(rho, T) is a factor of a few, similar to what we saw with SpinerEOS for sesame tables. T(rho, e) shows a performance improvement of 20% or so.
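A quick way to see the 1/N² scaling (a standalone sketch, not code from this PR): linear interpolation of a smooth function on an N-point grid has error O(h²) with h ∝ 1/N, so doubling N should cut the maximum error by roughly 4x:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Hypothetical demonstration, not singularity-eos code: tabulate a smooth
// function on N points, evaluate the linear interpolant at cell midpoints,
// and watch the max error drop ~4x each time N doubles, i.e., error ~ 1/N^2.
double max_lerp_error(int N) {
  const double a = 1.0, b = 2.0;
  const double h = (b - a) / (N - 1);
  std::vector<double> y(N);
  for (int i = 0; i < N; ++i) y[i] = std::log(a + i * h); // the "table"
  double err = 0.0;
  for (int i = 0; i < N - 1; ++i) {
    const double xm = a + (i + 0.5) * h;           // midpoint of cell i
    const double interp = 0.5 * (y[i] + y[i + 1]); // linear interpolant there
    err = std::max(err, std::fabs(interp - std::log(xm)));
  }
  return err;
}

int main() {
  for (int N = 64; N <= 1024; N *= 2)
    std::printf("N = %4d  max error = %.3e\n", N, max_lerp_error(N));
  return 0;
}
```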

@dholladay00 (Collaborator)

@Yurlungur it looks like the Python tests are failing. It could be that some of the StellarCollapse changes weren't propagated.

@Yurlungur (Collaborator, Author)

> @Yurlungur it looks like the Python tests are failing. It could be that some of the StellarCollapse changes weren't propagated.

Yep I see. That's exactly what it is. I'll take a look.

@dholladay00 (Collaborator)

This is now passing CI.

@Yurlungur (Collaborator, Author)

Thanks for the fix @dholladay00 ! I'll go ahead and merge this then.

@Yurlungur merged commit 8e56561 into main on Aug 31, 2023
@Yurlungur deleted the jmm/fast-logs-stellar-collapse branch on Aug 31, 2023 at 22:09