Pynisher context is passed to metafeatures #1076
Conversation
Codecov Report

@@             Coverage Diff             @@
##           development    #1076    +/-   ##
==============================================
- Coverage        85.46%   85.43%   -0.04%
==============================================
  Files              130      130
  Lines            10334    10344      +10
==============================================
+ Hits              8832     8837       +5
- Misses            1502     1507       +5

Continue to review full report at Codecov.
@mfeurer, there is a consistent problem with Python 3.6 and spawn (when enabling the metalearning context). I do not think we are using spawn, or plan to use spawn, for metalearning. Should it be disabled?
No, we're not using it. I just disabled this.
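For illustration only, here is a minimal sketch of how a test could be parameterized over multiprocessing start methods while leaving spawn out on Python 3.6. It assumes a POSIX platform (fork/forkserver are not available on Windows), and the test and worker names are hypothetical, not the actual code in test_smbo.py.

```python
import sys
import multiprocessing

import pytest

# Hypothetical stand-in for the metalearning/metafeature call exercised by
# the real test in test_smbo.py; it only proves the worker ran in a subprocess.
def _dummy_metafeatures(queue):
    queue.put("ok")

# "spawn" is left out on Python 3.6 because of the issue discussed above.
CONTEXTS = ["fork", "forkserver"] + (["spawn"] if sys.version_info >= (3, 7) else [])

@pytest.mark.parametrize("pynisher_context", CONTEXTS)
def test_metafeatures_run_in_any_context(pynisher_context):
    ctx = multiprocessing.get_context(pynisher_context)
    queue = ctx.Queue()
    proc = ctx.Process(target=_dummy_metafeatures, args=(queue,))
    proc.start()
    assert queue.get(timeout=30) == "ok"
    proc.join()
```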
* MAINT cleanup readme and remove old service yaml file (.landscape.yaml) * MAINT bump to dev version * move from fork to spawn * FIX_1061 (automl#1063) * FIX_1061 * Fxi type of target * Moving to classes_ * classes_ should be np.ndarray * Force float before nan * Pynisher context is passed to metafeatures (automl#1076) * Pynisher context to metafeatures * Update test_smbo.py Co-authored-by: Matthias Feurer <[email protected]> * Calculate loss support (automl#1075) * Calculate loss support * Relaxed log loss test for individual models * Feedback from automl#1075 * Missing loss in comment * Revert back test as well * Fix rank for metrics for which greater value is not good (automl#1079) * Enable Mypy in evaluation (except Train Evaluator) (automl#1077) * Almost all files for evaluation * Feedback from PR * Feedback from comments * Solving rebase artifacts * Revert bytes * Automatically update the Copyright when building the html (automl#1074) * update the year automatically * Fixes for new numpy * Revert test * Prepare new release (automl#1081) * prepare new release * fix unit test * bump version number * Fix 1072 (automl#1073) * Improve selector checking * Remove copy error * Rebase changes to development * No .cache and check selector path * Missing params in signature (automl#1084) * Add size check before trying to split for GMeans (automl#732) * Add size check before trying to split * Rebase to new code Co-authored-by: chico <[email protected]> * Fxi broken links in docs and update parallel docs (automl#1088) * Fxi broken links * Feedback from comments * Update manual.rst Co-authored-by: Matthias Feurer <[email protected]> * automl#660 Enable Power Transformations Update (automl#1086) * Power Transformer * Correct typo * ADD_630 * PEP8 compliance * Fix target type Co-authored-by: MaxGreil <[email protected]> * Stale Support (automl#1090) * Stale Support * Enhanced criteria for stale * Enable weekly cron job * test Co-authored-by: Matthias Feurer <[email protected]> Co-authored-by: Matthias Feurer <[email protected]> Co-authored-by: Rohit Agarwal <[email protected]> Co-authored-by: Pepe Berba <[email protected]> Co-authored-by: MaxGreil <[email protected]>
This enables us to pass the context we use for pynisher (defined in automl) to the metalearning calculation in SMBO.
It also adds a test to make sure that metalearning works in any context.
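To make the description concrete, below is a hedged sketch of what forwarding a multiprocessing start method ("fork", "spawn", or "forkserver") into a metafeature computation can look like. The `calculate_metafeatures` helper and the `pynisher_context` keyword are simplified illustrations, not auto-sklearn's actual API; only the standard-library `multiprocessing` calls are taken as given.

```python
import multiprocessing

def _metafeature_worker(queue, X, y):
    # Placeholder for the real metafeature computation (hypothetical):
    # report basic dataset statistics back to the parent process.
    queue.put({"n_samples": len(X), "n_features": len(X[0])})

def calculate_metafeatures(X, y, pynisher_context="fork"):
    """Run the metafeature computation in a subprocess created with the
    requested start method, mirroring the idea of forwarding the context
    string from AutoML down to the metalearning step in SMBO."""
    ctx = multiprocessing.get_context(pynisher_context)
    queue = ctx.Queue()
    proc = ctx.Process(target=_metafeature_worker, args=(queue, X, y))
    proc.start()
    result = queue.get()  # read before join() to avoid blocking on a full pipe
    proc.join()
    return result

if __name__ == "__main__":
    X = [[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]]
    y = [0, 1, 0]
    print(calculate_metafeatures(X, y, pynisher_context="forkserver"))
```

In the actual change, the same context string that auto-sklearn already uses for its pynisher calls is what gets handed to the metalearning calculation; the sketch above only reproduces the context-selection part with plain multiprocessing.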