
Future regressors modularization #1144

Merged · 8 commits merged into main on Feb 8, 2023

Conversation

@alfonsogarciadecorral (Collaborator) commented Feb 1, 2023

🔬 Background

  • Following the example of the trend component, the future regressors component is modularized.

🔮 Key changes

  • Created a new FutureRegressors abstract base class and a linear implementation, under neuralprophet/components/future_regressors/ (see the sketch below).
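
For orientation, a minimal sketch of the resulting layout, inferred from the files touched in this PR (class names, base class, constructor signatures, and method bodies are illustrative assumptions, not the actual implementation):

```python
# neuralprophet/components/future_regressors/future_regressors.py (sketch)
from abc import abstractmethod

import torch.nn as nn


class FutureRegressors(nn.Module):  # the real base is the PR's BaseComponent
    """Abstract base class for the future regressors component."""

    @abstractmethod
    def forward(self, inputs, mode):
        """Compute the additive or multiplicative future-regressor effect."""


# neuralprophet/components/future_regressors/linear.py (sketch)
class LinearFutureRegressors(FutureRegressors):
    def forward(self, inputs, mode):
        # Linear effect of the regressor features; details are in the PR diff.
        ...
```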

📋 Review Checklist

  • I have performed a self-review of my own code.
  • I have commented my code, added docstrings and data types to function definitions.

Please make sure to follow our best practices in the Contributing guidelines.

@github-actions bot commented Feb 1, 2023

Model Benchmark

| Benchmark | Metric | main | current | diff |
|---|---|---|---|---|
| PeytonManning | MAE_val | 0.64636 | 0.64636 | 0.0% |
| PeytonManning | RMSE_val | 0.79276 | 0.79276 | 0.0% |
| PeytonManning | Loss_val | 0.01494 | 0.01494 | 0.0% |
| PeytonManning | MAE | 0.42701 | 0.42701 | 0.0% |
| PeytonManning | RMSE | 0.57032 | 0.57032 | 0.0% |
| PeytonManning | Loss | 0.00635 | 0.00635 | 0.0% |
| PeytonManning | time | 17.8738 | 13.79 | -22.85% 🎉 |
| YosemiteTemps | MAE_val | 1.72948 | 1.72949 | 0.0% |
| YosemiteTemps | RMSE_val | 2.27386 | 2.27386 | 0.0% |
| YosemiteTemps | Loss_val | 0.00096 | 0.00096 | 0.0% |
| YosemiteTemps | MAE | 1.45189 | 1.45189 | 0.0% |
| YosemiteTemps | RMSE | 2.16631 | 2.16631 | 0.0% |
| YosemiteTemps | Loss | 0.00066 | 0.00066 | 0.0% |
| YosemiteTemps | time | 148.303 | 115.05 | -22.42% 🎉 |
| AirPassengers | MAE_val | 15.4077 | 15.4077 | -0.0% |
| AirPassengers | RMSE_val | 19.5099 | 19.5099 | -0.0% |
| AirPassengers | Loss_val | 0.00196 | 0.00196 | -0.0% |
| AirPassengers | MAE | 9.86947 | 9.86947 | 0.0% |
| AirPassengers | RMSE | 11.7222 | 11.7222 | 0.0% |
| AirPassengers | Loss | 0.00057 | 0.00057 | 0.0% |
| AirPassengers | time | 6.47205 | 4.99 | -22.9% 🎉 |
Model training plots: [training curve images for PeytonManning, YosemiteTemps, and AirPassengers]

@codecov-commenter commented Feb 1, 2023

Codecov Report

Merging #1144 (df8ea73) into main (75ccf1a) will increase coverage by 0.03%.
The diff coverage is 92.85%.


```diff
@@            Coverage Diff             @@
##             main    #1144      +/-   ##
==========================================
+ Coverage   90.24%   90.27%   +0.03%
==========================================
  Files          33       36       +3
  Lines        4992     5018      +26
==========================================
+ Hits         4505     4530      +25
- Misses        487      488       +1
```

| Impacted Files | Coverage Δ |
|---|---|
| .../components/future_regressors/future_regressors.py | 77.27% <77.27%> (ø) |
| ...alprophet/components/future_regressors/__init__.py | 100.00% <100.00%> (ø) |
| ...uralprophet/components/future_regressors/linear.py | 100.00% <100.00%> (ø) |
| neuralprophet/components/router.py | 89.28% <100.00%> (+1.78%) ⬆️ |
| neuralprophet/plot_utils.py | 89.82% <100.00%> (ø) |
| neuralprophet/time_net.py | 90.68% <100.00%> (+0.51%) ⬆️ |
| neuralprophet/utils.py | 80.74% <100.00%> (-0.32%) ⬇️ |


```diff
@@ -465,7 +465,7 @@ def get_valid_configuration(  # move to utils
                 }
             )
         elif validator == "plot_parameters":
-            regressor_param = m.model.get_reg_weights(regressor)[quantile_index, :]
+            regressor_param = m.model.future_regressors.get_reg_weights(regressor)[quantile_index, :]
```
A collaborator commented:

Sweet, I like that you include the regularization already!

@karl-richter left a comment:

There's one if-else statement where I suspect some duplication; please review and change if applicable. Otherwise it looks really good, ready to go from my side.

```python
n_multiplicative_regressor_params = 0
for name, configs in self.regressors_dims.items():
    if configs["mode"] not in ["additive", "multiplicative"]:
        log.error("Regressors mode {} not implemented. Defaulting to 'additive'.".format(configs["mode"]))
```
A collaborator commented:

Just a comment: we should make a consistent decision on where to raise error messages if certain parameters are missing. But fine for now.

Comment on lines -835 to -774 (removed code):

```python
components["future_regressors_additive"] = self.scalar_features_effects(
    features=inputs["regressors"]["additive"], params=self.regressor_params["additive"]
```
A collaborator commented:

Really good solution, I like how much simpler the code is now!

Comment on lines 787 to 792:

```python
if mode == "additive":
    features = inputs["regressors"]["additive"]
    params = self.regressor_params["additive"]
    mode = "additive"
else:
    features = inputs["regressors"]["multiplicative"]
    params = self.regressor_params["multiplicative"]
components[f"future_regressor_{regressor}"] = self.scalar_features_effects(
    features=features, params=params, indices=index
)
```
A collaborator commented:

What exactly is this section of the new code doing? The following looks redundant:

```python
if mode == "additive":
    ...
    mode = "additive"
```
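
One possible cleanup, sketched here purely for illustration (variable names are taken from the snippet above; whether this matches the final fix is up to the author), is to drop the no-op reassignment and index by mode directly, assuming mode is always "additive" or "multiplicative" at this point:

```python
# Sketch only: select features/params by mode, avoiding the redundant
# `mode = "additive"` reassignment flagged above.
features = inputs["regressors"][mode]
params = self.regressor_params[mode]
components[f"future_regressor_{regressor}"] = self.scalar_features_effects(
    features=features, params=params, indices=index
)
```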

A collaborator commented:

@alfonsogarciadecorral can you have a quick look at this?

Comment on lines 13 to 34:

```python
def new_param(dims):
    """Create and initialize a new torch Parameter.

    Parameters
    ----------
    dims : list or tuple
        Desired dimensions of parameter

    Returns
    -------
    nn.Parameter
        initialized Parameter
    """
    if len(dims) > 1:
        return nn.Parameter(nn.init.xavier_normal_(torch.randn(dims)), requires_grad=True)
    else:
        return nn.Parameter(torch.nn.init.xavier_normal_(torch.randn([1] + dims)).squeeze(0), requires_grad=True)


def init_parameter(dims):
    """
    Create and initialize a new torch Parameter.
```
A collaborator commented:

I think we did the same thing here ;)

A collaborator replied:

Changed this in the latest commit.
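
For reference, a sketch of what the consolidated helper might look like, keeping a single init_parameter with the behavior shown in the duplicated code above (the actual commit may differ):

```python
import torch
import torch.nn as nn


def init_parameter(dims):
    """Create and initialize a new torch Parameter with Xavier-normal values.

    Parameters
    ----------
    dims : list
        Desired dimensions of the parameter
    """
    if len(dims) > 1:
        return nn.Parameter(nn.init.xavier_normal_(torch.randn(dims)), requires_grad=True)
    # xavier_normal_ requires at least 2 dims, so 1-D parameters are
    # initialized with a temporary leading unit dim that is squeezed away.
    return nn.Parameter(nn.init.xavier_normal_(torch.randn([1] + dims)).squeeze(0), requires_grad=True)
```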

@karl-richter added the modularity and status: ready (PR is ready to be merged) labels on Feb 8, 2023
@karl-richter added this to the Release 0.5.2 milestone on Feb 8, 2023
@ourownstory (Owner) left a comment:

Great Work!!

@ourownstory (Owner) left a comment:

Only issue: let's not pass the logger.

```python
class FutureRegressors(BaseComponent):
    def __init__(self, config, id_list, quantiles, n_forecasts, device, log, config_trend_none_bool):
```
@ourownstory commented:

The logger can be retrieved at the beginning of a file; it does not need to be passed in (see examples in other files).
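
A minimal sketch of the module-level pattern being suggested (the logger name and the BaseComponent reference are illustrative assumptions, not the actual NeuralProphet code):

```python
import logging

# Module-level logger, fetched once at import time instead of being
# threaded through every constructor. The logger name is an assumption.
log = logging.getLogger("NP.future_regressors")


class FutureRegressors(BaseComponent):  # BaseComponent as in the PR's components module
    def __init__(self, config, id_list, quantiles, n_forecasts, device, config_trend_none_bool):
        ...
        log.debug("Initializing FutureRegressors component")
```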

@karl-richter merged commit 75f7486 into main on Feb 8, 2023
@karl-richter deleted the feature/modular_future_regressors branch on February 8, 2023
Labels: status: ready (PR is ready to be merged)
Projects: none yet
4 participants