Hi,
I deployed the solution found at https://github.com/microsoft/Dynamics-365-FastTrack-Implementation-Assets/tree/master/Analytics/DataverseLink/DataIntegration to load the CSV files exported by Synapse Link into our Dedicated SQL Pool database.
The problem is that for large tables, the query against the source interface (Synapse SQL Serverless) fails with a timeout after 30 minutes (the Synapse SQL Serverless limit). In addition, if the pipeline fails, the affected tables are not reprocessed on the next run, because the control table in the destination database still contains an entry for that table with the end date set to null. I have to delete the entry to allow the pipeline to reprocess the table, but then the timeout occurs again because of the table size.
I have also found tables with many records that succeed, but they have fewer columns.
The source CSV files are stored in a data lake and are produced by Dynamics 365 F&O.
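As a workaround for the stuck entries I currently clear them by hand; a minimal sketch of that cleanup, assuming hypothetical control table and column names (check the actual names created by the solution's setup scripts):

```sql
-- Hypothetical cleanup: a failed pipeline run leaves the control-table row
-- with a NULL end date, which blocks reprocessing on the next run.
-- Deleting that row lets the pipeline pick the table up again.
DELETE FROM dbo.DataIntegrationControlTable   -- hypothetical table name
WHERE TableName = 'CustTable'                 -- the table to reprocess
  AND EndDateTime IS NULL;                    -- row left by the failed run
```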
Hi,
I think I found the problem: the issue on Synapse SQL Serverless comes from the nvarchar(max) type. I changed it to nvarchar(4000), and the query now completes in 3 minutes for 1,200,000 records. I will now change max to 4000 in the stored procedures and reload all the tables.
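For anyone hitting the same timeout, a minimal sketch of the change, assuming a hypothetical view over the Synapse Link CSVs (in the deployed solution the column definitions are generated by its stored procedures, so the fix goes there):

```sql
-- Hypothetical serverless SQL view over the exported CSVs. Using
-- NVARCHAR(4000) instead of NVARCHAR(MAX) in the WITH clause keeps the
-- serverless engine off the slow large-object path that caused the timeout.
CREATE OR ALTER VIEW dbo.custtable_example AS
SELECT *
FROM OPENROWSET(
        BULK 'custtable/*.csv',              -- hypothetical data lake path
        DATA_SOURCE = 'DataLakeSource',      -- hypothetical external data source
        FORMAT = 'CSV',
        PARSER_VERSION = '2.0'
    )
    WITH (
        recid      BIGINT,
        accountnum NVARCHAR(4000),  -- was NVARCHAR(MAX): triggered the timeout
        name       NVARCHAR(4000)   -- was NVARCHAR(MAX)
    ) AS rows;
```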