We've detected some issues when uploading large samples into production (see #2268, though we've already done some work on streaming) that make the respondent group get out of sync with the respondents actually created.
Specifically, uploading a sample with 92155 respondents yielded a DBConnection.ConnectionError in the app and left the database with only 82000 respondents created - but no error was shown to the user in the UI, and Surveda keeps saying the sample includes 92155 respondents.
Apart from actually being able to upload a large sample, we should make Surveda handle this transactionally: if Surveda can't create all the respondents in the sample, the upload should fail, and the respondent group should not report more respondents than it actually has.
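The all-or-nothing behavior proposed above can be sketched as follows. Surveda is an Elixir/Ecto app (where this would likely be an `Ecto.Multi` or `Repo.transaction` call), so this is only an illustration of the principle using Python's `sqlite3`; the table and column names are hypothetical:

```python
import sqlite3

def create_respondents(conn, phone_numbers):
    """Insert every respondent, or none: any failure rolls the whole batch back."""
    try:
        # `with conn:` opens a transaction: commits on success, rolls back on error
        with conn:
            conn.executemany(
                "INSERT INTO respondents (phone_number) VALUES (?)",
                [(p,) for p in phone_numbers],
            )
        return True
    except sqlite3.Error:
        return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE respondents (phone_number TEXT NOT NULL)")

# A batch containing a bad row (NULL) fails as a whole - nothing is persisted,
# so the respondent count can never disagree with the rows actually created.
ok = create_respondents(conn, ["555-0001", "555-0002", None])
count = conn.execute("SELECT COUNT(*) FROM respondents").fetchone()[0]
print(ok, count)  # False 0
```

With this shape, a mid-batch connection error leaves zero respondents rather than 82000 of 92155, and the failure can be reported honestly to the user.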
This is currently happening on version 0.37.2.
The user actually gets an error in the UI when the upload fails, but it's not at all descriptive - it's a generic JSON parsing error, probably due to Surveda trying to JSON-parse a response that's actually a plain-text error message.
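A minimal sketch of the client-side fix hinted at above: check that the response is actually JSON before parsing it, and surface the raw text otherwise, instead of a confusing parse error. This is a hypothetical Python illustration (Surveda's frontend is JavaScript; function and parameter names here are invented):

```python
import json

def parse_upload_response(body, content_type):
    """Return (is_json, payload): parsed JSON, or the raw error text as-is."""
    if "application/json" in content_type:
        return True, json.loads(body)
    # e.g. a plain-text DBConnection.ConnectionError message from the server
    return False, body

print(parse_upload_response('{"respondents": 92155}', "application/json"))
print(parse_upload_response("DBConnection.ConnectionError", "text/plain"))
```

Either way, the server should also return a structured, descriptive error rather than plain text when respondent creation fails.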