All inferences have the same predicted value #2401
Comments
In my tests the batch predictions don't give me the same results for all items in the batch, though I think there is still something wrong with the regression example. I never realized that the normalization happens based on the items in the batch. This might lead to issues; we should normalize based on some precomputed statistics instead.
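For reference, a minimal sketch of what normalizing with precomputed statistics could look like (plain Rust, independent of Burn's actual API; `Normalizer` and its methods are illustrative names, not the example's real code):

```rust
/// Per-feature statistics computed once over the training set.
struct Normalizer {
    means: Vec<f32>,
    stds: Vec<f32>,
}

impl Normalizer {
    /// Fit column-wise mean and standard deviation on the training rows.
    fn fit(rows: &[Vec<f32>]) -> Self {
        let n = rows.len() as f32;
        let n_features = rows[0].len();

        let mut means = vec![0.0f32; n_features];
        for row in rows {
            for (m, x) in means.iter_mut().zip(row) {
                *m += x / n;
            }
        }

        let mut stds = vec![0.0f32; n_features];
        for row in rows {
            for ((s, x), m) in stds.iter_mut().zip(row).zip(&means) {
                *s += (x - m).powi(2) / n;
            }
        }
        for s in &mut stds {
            *s = s.sqrt().max(1e-8); // guard against zero variance
        }

        Self { means, stds }
    }

    /// Apply the stored statistics; the result no longer depends on
    /// which other items happen to be in the inference batch.
    fn transform(&self, row: &[f32]) -> Vec<f32> {
        row.iter()
            .zip(&self.means)
            .zip(&self.stds)
            .map(|((x, m), s)| (x - m) / s)
            .collect()
    }
}
```

Fitting once on the training split and reusing the same `Normalizer` at inference would keep single-item and batched predictions consistent.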
This happened when disabling the normalization. Then I enabled it and each item gave a different value. But what made the predicted values the same when normalization was disabled?
I also ran into this problem.
Huh, that's interesting to say the least. My guess is that without normalization the input values are out of the expected range for the trained model (possibly by a lot), so the model degenerates to the same response.
I actually updated the example to use a more representative dataset in #2405 with correct normalization. In my tests the model always had an MSE of ~0.55-0.6 and the predicted values were pretty close to the targets (you can see an almost linear relationship in the predicted vs expected scatter plot). Let me know what you think :)
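To illustrate the saturation guess (a toy example, not the actual model): with un-normalized inputs far outside the training range, a squashing activation like tanh collapses very different inputs to effectively the same output.

```rust
fn main() {
    // A single toy neuron y = tanh(w * x + b), "trained" on inputs roughly in [-1, 1].
    let (w, b) = (1.5f32, 0.1f32);
    let predict = |x: f32| (w * x + b).tanh();

    // In-range (normalized) inputs give distinct outputs...
    for x in [-0.8f32, 0.0, 0.9] {
        println!("x = {x:>7.1} -> y = {:.4}", predict(x));
    }

    // ...while raw, out-of-range inputs all saturate to the same value.
    for x in [120.0f32, 560.0, 3400.0] {
        println!("x = {x:>7.1} -> y = {:.4}", predict(x));
    }
}
```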
@laggui
The example already has a train/valid/test split, but with your shuffling it seems you get an easier validation set, because I had around ~0.59 MSE. The results look pretty similar (though a bit better in your case) 🙂
Hi,
I performed the simple regression example and got a model. Then I ran inference on the test data and found that when using batch data, all the predicted values were the same.
Here is the output:
Then I inferred the items one by one. This time each item gave a different value. How did this happen?
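As discussed in the comments above, one hypothetical way batch and per-item inference can disagree is when normalization statistics are computed from the batch itself (min-max scaling here is an assumption, not necessarily the example's actual preprocessing):

```rust
/// Min-max normalize using statistics from the given batch only.
fn normalize_batch(batch: &[f32]) -> Vec<f32> {
    let min = batch.iter().cloned().fold(f32::INFINITY, f32::min);
    let max = batch.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let range = (max - min).max(1e-8); // guard against a constant batch
    batch.iter().map(|x| (x - min) / range).collect()
}

fn main() {
    // The same feature value, 20.0, lands in two different batches...
    let a = [10.0f32, 20.0, 30.0];
    let b = [20.0f32, 200.0, 2000.0];

    // ...and gets two different normalized values, so the model sees
    // different inputs for the same item depending on its batch-mates.
    println!("20.0 in batch a -> {}", normalize_batch(&a)[1]); // 0.5
    println!("20.0 in batch b -> {}", normalize_batch(&b)[0]); // 0.0
}
```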