Fixes for quantile regression #148

Merged: 3 commits into master, May 23, 2023

Conversation

tlienart (Collaborator)

Out of convenience, MLJLM defines residuals as $X\theta - y$, whereas the literature typically uses $y - X\theta$. This only matters when the loss is asymmetric, which is the case for the pinball loss used in quantile regression. The report in #147 exposed this.
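The snippet below is a minimal sketch in plain Julia (the `pinball` helper and the numbers are illustrative, not MLJLM code) of why the sign convention matters: because the loss is asymmetric, evaluating it on $X\theta - y$ instead of $y - X\theta$ effectively swaps $\tau$ and $1 - \tau$.

```julia
# Literature convention: residual r = y - Xθ, pinball loss ρ_τ(r) = r * (τ - 𝟙{r < 0}).
pinball(r, τ) = r >= 0 ? τ * r : (τ - 1) * r

τ = 0.9
y = 2.0    # observed value
ŷ = 3.0    # model prediction Xθ (over-prediction here)

pinball(y - ŷ, τ)   # literature residual y - Xθ: 0.1 (over-prediction penalised lightly at τ = 0.9)
pinball(ŷ - y, τ)   # residual with flipped sign Xθ - y: 0.9 (same error, heavy penalty on the wrong side)

# The two conventions are related by flipping the quantile level:
pinball(ŷ - y, τ) == pinball(y - ŷ, 1 - τ)   # true
```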

Note that the report in #147 also compared the performance of an interior-point (IP) method with the result obtained by MLJLM. The objective achieved using MLJLM is about 9% worse than that of the IP method; using a non-IP method from the quantreg package, the objective is 1-2% better. This is not worrying per se: in the case presented, LBFGS simply stops at a non-optimal point, likely getting stuck in a very flat region.

@tlienart tlienart merged commit a872d7c into master May 23, 2023