[DEST Snowflake] - Sync succeeded but destination tables are empty #4745
Comments
@philippeboyd is it possible to read this to download Snowflake logs from their side (not completely sure you can do it)? Maybe you can find more information from the error; this could be really helpful to us.
@philippeboyd is there any chance you could also share the logs right before the snippet you provided above? Trying to see if there is any potentially helpful context there. Really, as much as you can share from the log would be great.
There are two issues here:
@jbowlen good find! Same thing — looks like there's a timeout at 30 minutes. @sherifnada any possibility of customizing that timeout?
This line is problematic when handling big data: Line 43 in 40da541
Could the GCS/S3 …
Thanks for the pointers, everyone! Will get to this very soon. I think the fastest solution to unblock you will be to bump the timeout by a lot for now (let's say 12 hours). @philippeboyd what do you mean by "in batch"?
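As an illustration of the customization asked for above, here is a minimal Java sketch of reading the copy timeout from connector configuration instead of hard-coding it. The class name, the `copy_timeout_minutes` config key, and the 12-hour default are assumptions for illustration, not Airbyte's actual code:

```java
import java.time.Duration;
import java.util.Map;

// Hypothetical sketch: make the copy timeout configurable, falling back
// to a generous default instead of a hard-coded 30 minutes.
public final class CopyTimeout {

  private static final Duration DEFAULT_TIMEOUT = Duration.ofHours(12);

  // "copy_timeout_minutes" is an assumed config key, for illustration only.
  public static Duration fromConfig(final Map<String, String> config) {
    final String raw = config.get("copy_timeout_minutes");
    return raw == null ? DEFAULT_TIMEOUT : Duration.ofMinutes(Long.parseLong(raw));
  }
}
```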
@sherifnada by "in batch" I mean split the … What if there are 1B rows to sync, or 5 billion? … I'm just trying to think of a more permanent solution that can scale for any amount of data. But for sure, that 12-hour timeout will surely help for the time being (hopefully). It's still a quick and temporary fix, though.
Agreed - my goal is to unblock y'all ASAP. Your suggestion is a great one, Philippe. Will follow up on it with the team. |
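For what the batching suggestion could look like in practice, here is a rough Java sketch that issues one COPY per fixed-size batch of staged files rather than a single long-running statement. The client interface, the helper method, and the batch size are all hypothetical (though Snowflake's COPY INTO does accept an explicit list of staged files):

```java
import java.util.List;

// Hypothetical sketch: instead of one COPY INTO over the entire stage,
// issue one COPY per fixed-size batch of staged files so that no single
// statement runs long enough to hit a timeout.
public final class BatchedCopy {

  private static final int BATCH_SIZE = 100; // assumed batch size

  public static void copyInBatches(final List<String> stagedFiles,
                                   final SnowflakeClient client) {
    for (int i = 0; i < stagedFiles.size(); i += BATCH_SIZE) {
      final List<String> batch =
          stagedFiles.subList(i, Math.min(i + BATCH_SIZE, stagedFiles.size()));
      // Hypothetical helper issuing COPY INTO ... FILES (...) for this batch.
      client.copyIntoTargetTable(batch);
    }
  }

  // Placeholder standing in for whatever client the connector actually uses.
  public interface SnowflakeClient {
    void copyIntoTargetTable(List<String> files);
  }
}
```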
As far as I can see, the data uploaded to GCS is not compressed. Wouldn't it be better to gzip-compress the data before uploading it? My use case would be uploading from on-premise MSSQL to Snowflake, and there you would benefit from compression.
I have created a CSV file locally with the destination connector "local CSV". This 55 GB file was split into 28 files, gz-compressed, uploaded to GCS, and imported into Snowflake. I saw that there is a "best practices" issue (#4904); possibly this will be covered there as well.
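A minimal sketch of the compression idea, assuming the connector writes records to the staging bucket through a plain `OutputStream` (Snowflake's COPY INTO can read gzip-compressed CSVs, so only the upload side would need to change):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

// Sketch: gzip the record stream on the fly as it is written to the
// staging bucket, so uncompressed data never leaves the source side.
public final class GzipUpload {

  public static void compressTo(final InputStream records, final OutputStream bucketObject)
      throws IOException {
    try (GZIPOutputStream gzip = new GZIPOutputStream(bucketObject)) {
      records.transferTo(gzip);
    }
  }
}
```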
Just stumbled upon this. Was also experiencing the 30-minute timeout! Glad that's been resolved 🎉
Environment
Current Behavior
The sync appears to be successful, but the tables (raw and normalized) in the destination (Snowflake) are empty. In the logs, there appears to be only one error during the data transfer (I guess), which seems to make the whole transfer unsuccessful.
Expected Behavior
There should be exponential backoff on all potential pitfalls during data transfers, so that the failed command is retried.
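To make the expected behavior concrete, a minimal Java sketch of exponential backoff around a failed command; the attempt limit and base delay are illustrative, not values taken from the connector:

```java
import java.time.Duration;
import java.util.concurrent.Callable;

// Sketch of exponential backoff: retry a failed command with a doubling
// delay between attempts, rethrowing once the attempt limit is reached.
public final class Backoff {

  public static <T> T retry(final Callable<T> command, final int maxAttempts)
      throws Exception {
    Duration delay = Duration.ofSeconds(1); // illustrative base delay
    for (int attempt = 1; ; attempt++) {
      try {
        return command.call();
      } catch (final Exception e) {
        if (attempt >= maxAttempts) {
          throw e; // give up after the final attempt
        }
        Thread.sleep(delay.toMillis());
        delay = delay.multipliedBy(2); // double the wait each time
      }
    }
  }
}
```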
Logs
Steps to Reproduce