
Refactor single stage #317

Closed · wants to merge 6 commits

Conversation

@landreman (Contributor)

In #301, we've discussed how it is awkward that finite_difference.py had to be modified to manipulate the non_dofs attribute, which most Optimizable objects do not have. This PR shows one way things could be refactored to avoid this, so non_dofs is never mentioned. Here the changes to finite_difference.py and optimizable.py in #301 are reverted to master, and instead the MPIFiniteDifference class is extended to allow extra data besides .x to be broadcast among the worker groups. The example single_stage_optimization_finite_beta.py is refactored correspondingly. This example gives identical values for all quantities at the end of the optimization and uses fewer calls to VirtualCasing.from_vmec() in this branch compared to #301.

I'm not wedded to the approach here, but wanted to put it on the table. @rogeriojorge @mbkumar what do you think?
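To illustrate the pattern described above (not simsopt's actual API — the function names and payload layout here are hypothetical): instead of broadcasting only the state vector `x`, the group leaders can pack `x` together with any extra data into a single serialized message, which the workers unpack on receipt. This keeps knowledge of attributes like `non_dofs` out of the finite-difference machinery itself.

```python
import pickle

def pack_payload(x, extra_data):
    """Serialize x plus auxiliary data into one broadcastable message.

    Hypothetical sketch: in an MPI setting, the leader would pass the
    returned bytes to a broadcast call instead of broadcasting x alone.
    """
    return pickle.dumps({"x": list(x), "extra": extra_data})

def unpack_payload(payload):
    """Inverse of pack_payload: recover x and the extra data on a worker."""
    msg = pickle.loads(payload)
    return msg["x"], msg["extra"]

# Example round trip (single process; no MPI needed to show the packing).
payload = pack_payload([0.1, 0.2], {"note": "any picklable extra data"})
x, extra = unpack_payload(payload)
```

The design point is that the broadcast channel carries one opaque payload, so adding new side data later requires no change to the broadcasting code.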

@landreman landreman requested review from mbkumar and rogeriojorge May 26, 2023 21:30

codecov bot commented May 26, 2023

Codecov Report

Patch coverage: 100.00% and project coverage change: -0.41 ⚠️

Comparison is base (dbf0232) 90.78% compared to head (af32454) 90.37%.

Additional details and impacted files
@@                  Coverage Diff                   @@
##           rj/single_stage_PR     #317      +/-   ##
======================================================
- Coverage               90.78%   90.37%   -0.41%     
======================================================
  Files                      63       63              
  Lines                   10719    10724       +5     
======================================================
- Hits                     9731     9692      -39     
- Misses                    988     1032      +44     
Flag | Coverage Δ
unittests | 90.37% <100.00%> (-0.41%) ⬇️

Flags with carried-forward coverage won't be shown.

Impacted Files | Coverage Δ
src/simsopt/_core/finite_difference.py | 93.27% <100.00%> (-0.82%) ⬇️
src/simsopt/_core/optimizable.py | 90.22% <100.00%> (+0.05%) ⬆️

... and 9 files with indirect coverage changes



@rogeriojorge rogeriojorge left a comment


This looks much cleaner than the previous approach. Aside from the leftover commented-out lines, it looks very good to me.

Comment on lines +16 to +18
#from simsopt.util.mpi import log
#log()


Should these comments stay?

@rogeriojorge

@mbkumar are you ok with merging this into rj/single_stage_PR and then master?

@mbkumar mbkumar left a comment


This implementation is tidy and looks good.

@@ -1639,7 +1639,6 @@ class TempOptimizable(Optimizable):
def __init__(self, func, *args, dof_indicators=None, **kwargs):

self.func = func
args = np.ravel(args) # The user may pass a tuple or list
Collaborator:

Why is this deleted?

Contributor:

I think that line was present only in the rj/single_stage_PR branch, and not in the master branch. Therefore, its removal makes ml/single_stage_PR even simpler.
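For context on what the deleted line did: `np.ravel` flattens a tuple, list, or nested sequence into a 1-D NumPy array, which is presumably why it existed — to normalize `args` regardless of whether the user passed a tuple or a list. A minimal illustration with plain NumPy, independent of simsopt:

```python
import numpy as np

# np.ravel normalizes a tuple or list into a flat 1-D ndarray.
args_tuple = np.ravel((1, 2, 3))
args_list = np.ravel([4, 5, 6])
nested = np.ravel([[1, 2], [3, 4]])  # nesting is flattened away

print(args_tuple.tolist())  # [1, 2, 3]
print(args_list.tolist())   # [4, 5, 6]
print(nested.tolist())      # [1, 2, 3, 4]
```

So dropping the line is safe only if callers on this branch never rely on that normalization, which appears to be the reasoning above.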
