Deprecation warnings + bugfix #122

Merged
merged 3 commits into main from deprecation-warnings on Jun 14, 2023
Conversation

mikeknep (Contributor)

Deprecation warnings

Adds deprecation warnings guiding the user away from:

mt = MultiTable(rel_data, gretel_model="amplify")
mt.train_synthetics()

...and towards:

mt = MultiTable(rel_data)
mt.train_synthetics(config="synthetics/amplify")

Each line in the first block above will print a deprecation warning (but will continue to work). In a future release we'll remove the gretel_model parameter completely and make config a required argument to train_synthetics.
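For reference, the standard way to emit this kind of warning in Python is the warnings module. The sketch below is a hypothetical illustration of the pattern, not the actual library code; the class and parameter names simply mirror the snippets above:

import warnings

class MultiTable:
    def __init__(self, rel_data, gretel_model=None):
        if gretel_model is not None:
            # Deprecated path: warn, but keep the old behavior working for now.
            warnings.warn(
                "The `gretel_model` parameter is deprecated and will be removed "
                "in a future release. Pass a config to `train_synthetics` instead, "
                "e.g. mt.train_synthetics(config='synthetics/amplify').",
                DeprecationWarning,
                stacklevel=2,
            )
        self.rel_data = rel_data
        self._gretel_model = gretel_model

Using DeprecationWarning with stacklevel=2 points the warning at the caller's line rather than the library internals, which is the usual convention for deprecating public parameters.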

Bugfix

In the previous PR adding custom synthetics config support, I reused the strategies' existing supported_models property for config validation. However, those values are essentially constants specific to Relational Trainer (specifically, valid values of the gretel_model param). For amplify and actgan they happen to match the model key we expect to see in a config, but that is not the case for the two below:

gretel_model   model key
------------   ----------
lstm           synthetics
tabular-dp     tabular_dp

The fix here is a bit verbose, but only temporarily: we'll remove the supported_gretel_models property when we remove the gretel_model param.
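To make the divergence concrete, here is a hedged sketch of the kind of translation the fix implies; the dict and function names are hypothetical illustrations, not the identifiers actually used in the PR:

# Hypothetical translation table from Trainer's gretel_model values
# to the model key expected in a synthetics config.
GRETEL_MODEL_TO_CONFIG_KEY = {
    "amplify": "amplify",        # overlaps
    "actgan": "actgan",          # overlaps
    "lstm": "synthetics",        # diverges
    "tabular-dp": "tabular_dp",  # diverges
}

def validate_config_model_key(model_key: str, supported_gretel_models: list[str]) -> None:
    # A config is valid only if its model key corresponds to a supported gretel_model.
    expected = {GRETEL_MODEL_TO_CONFIG_KEY[m] for m in supported_gretel_models}
    if model_key not in expected:
        raise ValueError(
            f"Unsupported model key {model_key!r}; expected one of {sorted(expected)}"
        )

The point of the extra indirection is that validation now compares against config model keys rather than gretel_model names, so lstm and tabular-dp configs validate correctly.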

@mikeknep force-pushed the deprecation-warnings branch from 6917b0f to 393800a on June 14, 2023 at 18:37
@mikeknep merged commit 76486c3 into main on Jun 14, 2023
@mikeknep deleted the deprecation-warnings branch on June 14, 2023 at 19:54