Positivity constraint #141
Replies: 13 comments 2 replies
-
Hi Tim, Thank you! That should be feasible. Do you mind explaining what kind of positivity constraint you have in mind: preprocessing of the regressor, or constraining the regressor to have only a positive impact on the target variable? Best,
-
Hi Oskar, having a positive impact on the target variable. This is a fairly common use case in marketing: you might have a regressor like "television spend per day" where you know (from common sense / business / expert knowledge) that it impacts the target (sales or revenue) non-negatively.

In the Stan-based Prophet implementation, that was apparently a bit of a hassle to implement, since you would have to use different (non-negative) prior distributions in the underlying Stan code.

I am not too familiar with PyTorch, but I assume the regressor effect is implemented as a dense layer with linear activation(?). In Keras, there is the option to add a "kernel constraint" on the weights to make them non-negative. Probably something like that exists in PyTorch too? If you could point me to the right parts of the code, I could take a stab at it?
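A minimal PyTorch sketch of that idea (illustrative only, not NeuralProphet code; the layer and variable names are made up): the closest analogue to a Keras NonNeg kernel constraint is to clamp the regressor weights back into the non-negative range after each optimizer step.

```python
import torch
import torch.nn as nn

# Toy sketch: a single regressor coefficient kept non-negative by projection.
# (Hypothetical names; not how NeuralProphet actually wires regressors.)
regressor_layer = nn.Linear(in_features=1, out_features=1, bias=False)
optimizer = torch.optim.SGD(regressor_layer.parameters(), lr=0.01)

x = torch.rand(64, 1)                      # e.g. daily TV spend (toy data)
y = 2.0 * x + 0.1 * torch.randn(64, 1)     # toy target with a positive effect

for _ in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(regressor_layer(x), y)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        # project the weight back onto the non-negative range,
        # mimicking Keras' NonNeg kernel constraint
        regressor_layer.weight.clamp_(min=0.0)
```

An alternative would be to reparameterize the weight (store an unconstrained parameter and pass it through a softplus before use), which avoids the projection step but changes the optimization landscape.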
-
Joining this with issue #151.
-
Let's clarify what @TimKreienkamp and @johnf1004 are looking for:
Re 1 & 2: Are both lagged regressors and future regressors of interest? As both of you pointed out, there are methods to implement this in PyTorch.
-
@TimKreienkamp
-
Thanks, Oskar, I'll take a look and report back here!
-
Hi Team, looking to replicate the add_regressors function from Prophet in NeuralProphet.
-
Re 3: Yes, or non-negative specifically. I guess giving the user the option of using ReLU would be perfect, or softplus if they want to force >0?
-
Hi @mekriti,
We renamed the function to add_future_regressor in order to differentiate it from the new add_lagged_regressor function. While there is no example notebook as of now, it is described in the documentation: https://ourownstory.github.io/neural_prophet/model/future-regressors/
-
@johnf1004 Yes, a softplus would likely be the best option to force the outputs to be positive (for a deeper NN); for a linear model, ReLU would be better, assuming 0 is a valid value.

However, it would not work on its own due to normalization. Imagine a time series that is supposed to be positive, e.g. in [0, 1], but has a few erroneous negative entries of -1. If you use 'minmax' normalization, the model will be fitted on a range of [0, 1] where 0 is mapped to 0.5, and the softplus would make sure none of the normalized values are negative. However, once de-normalized, you could still end up with negative values. This could be avoided by throwing a warning/error if a user sets a positivity constraint and then provides negative training values.

Regarding your problem setting in #151, where many values are zero and positive values are >>0, an additional prediction of the likelihood of the value being non-zero, multiplied by the predicted value, might be appropriate.

If you would like to take a stab at implementing one of these features, I'll be happy to support you in doing so.
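A toy numerical sketch of the de-normalization caveat described above (illustrative values only, not NeuralProphet internals):

```python
import torch
import torch.nn.functional as F

# A series that should be non-negative, but with one erroneous -1 entry.
y = torch.tensor([0.0, 1.0, -1.0, 0.5])
y_min, y_max = y.min(), y.max()
y_norm = (y - y_min) / (y_max - y_min)        # 'minmax': -1 -> 0.0, 0 -> 0.5

# A softplus guarantees positivity only in the normalized space ...
pred_norm = F.softplus(torch.tensor([-3.0]))  # ~0.049, always > 0
# ... but de-normalizing can still yield a negative value.
pred = pred_norm * (y_max - y_min) + y_min    # ~ -0.90
```

Hence the suggestion to warn or error out when a positivity constraint is combined with negative training values.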
-
Hi Team,
Thanks for the clarification.
I'll contribute an example notebook.
Please let me know how to proceed.
Thanks
-
Converted this issue to a discussion (new GitHub feature), as it seems better suited for this topic.
-
@ourownstory
-
Hi,
great package!
One concern, and food for ongoing discussion in the original Prophet, was the need for positivity constraints on external regressors. Is this possible here (if only with slight modifications of the PyTorch code)?
Thanks
Tim