Add automatic calibration #293
Merged
Commits (40)
All 40 commits are by odespard:

281e5e5  added RT, MS1 and mobility optimizers for automatic calibration
918bfda  added optimizers for targeted optimization of MS1, MS2, RT and mobility
25ef9d2  added plotting for automatic optimizers
a9eae04  changed calibration and removed now-unused methods
20c91c6  fixed errors
71e98cd  Merge branch 'optimizer_classifier_integration' into additional_optim…
6faa3eb  update to use df
279558e  Merge branch 'optimizer_classifier_integration' into additional_optim…
19dde17  use df for optimization history in automatic optimization
335ec7a  Merge branch 'additional_optimizers' into refactor_calibration
ecbe63a  change check for targeted calibration
de6da9b  temporary_save
00d533d  automatic calibration in peptidecentric.py with required alterations …
3b24324  formatting
09c0d70  changed some TODOs and comments
60b884e  e2e now runs automatic calibration test
1170dcd  fixed bugs with classifier version and workflow.extraction()
599e3a2  make precursor_df consistent
5c4b3da  Merge branch 'development' into add_automatic_calibration
0878a77  formatting
1becbc9  improved logging
b37e70c  remove empty list in completely automatic calibration
0cb800f  enforce minimum training iterations and backtrack for kernel
7d3f930  impose minimum iterations during extraction of optimization data
4542a7e  introduce extra optimization before loop and enforce minimum steps fo…
087a856  check min_steps as part of _check_convergence method
70dba73  add changes to automatic optimizers
e6987c1  improved names and formatting
d91689a  improve docstrings
8d5c989  add unit tests for optimizers
0d9e2eb  merge optimization and calibration_optimization managers
f6d56f1  review comments
157c14a  update unit tests
8c61993  change feature for AutomaticMS1Optimizer
7240857  fixed unit test
84899f8  change e2e test for automatic calibration
a5d01f2  adjust min_steps and optimization_lock_min_steps in default.yaml
4f9b094  formatting
bd55b6c  update unit test
b5ea3bb  modify_e2e_test_for_automatic_calibration
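Several of the commits above (e.g. 087a856, "check min_steps as part of _check_convergence method", and 7d3f930, "impose minimum iterations during extraction of optimization data") concern automatic optimizers that refuse to declare convergence before a minimum number of steps. A minimal sketch of that idea, assuming a toy optimizer class; only the names `min_steps` and `_check_convergence` come from the commit messages, everything else is illustrative:

```python
class AutomaticOptimizer:
    """Toy automatic optimizer that tracks one feature value per step and
    declares convergence only after at least min_steps iterations.
    Hypothetical sketch, not the PR's actual implementation."""

    def __init__(self, min_steps: int = 3, tolerance: float = 0.01):
        self.min_steps = min_steps
        self.tolerance = tolerance
        self.history = []  # one observed feature value per optimization step
        self.has_converged = False

    def step(self, feature_value: float) -> None:
        # Record the latest value, then re-evaluate convergence.
        self.history.append(feature_value)
        self.has_converged = self._check_convergence()

    def _check_convergence(self) -> bool:
        # Enforce the minimum number of steps before any convergence claim,
        # as part of the convergence check itself.
        if len(self.history) < self.min_steps:
            return False
        # Converged when the latest value barely changes relative to the
        # previous one.
        prev, last = self.history[-2], self.history[-1]
        return abs(last - prev) <= self.tolerance * max(abs(prev), 1e-12)
```

Folding the `min_steps` guard into `_check_convergence` keeps the early-stopping policy in one place, so callers only ever inspect `has_converged`.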
Review comment:
This seems to me like OptimizationManager-specific logic that is outside of OptimizationManager :-)
Suggestion: pass the self.config to the OptimizationManager and do this mapping during initialization. Also, I suggest separating between constant values (probably fwhm_rt does not change?) and dynamic values (e.g. classifier_version).
Reply:
Yes, that is true. I'll change it.
I think everything in the optimization manager should change; fwhm_rt, for instance, is updated during calibration.
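The reviewer's suggestion, passing the config to the OptimizationManager and doing the parameter mapping during initialization, could be sketched as follows. This is a hypothetical illustration: the class name, fwhm_rt, and classifier_version come from the discussion, while the config keys, default values, and the fit method are assumptions:

```python
class OptimizationManager:
    """Sketch of the reviewer's suggestion: the manager receives the config
    and maps config entries to its own attributes during initialization,
    rather than having the caller do that mapping externally."""

    def __init__(self, config: dict):
        optimization = config.get("optimization", {})
        # Initial values come from the config; per the reply, all of these
        # may still change later (e.g. fwhm_rt is updated during calibration).
        self.fwhm_rt = optimization.get("fwhm_rt", 5.0)
        self.classifier_version = optimization.get("classifier_version", -1)

    def fit(self, update: dict) -> None:
        # Apply updated values produced during calibration/optimization.
        for key, value in update.items():
            setattr(self, key, value)
```

A caller would then construct the manager directly from the workflow config, e.g. `OptimizationManager(config)`, instead of translating config keys at the call site.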