[Bug] Memory usage when uploading files to DataLake Gen2 #2140
/ping
@Marusyk Could you share the code snippet used to make the DataLake calls? This issue can also occur if the Azure Data Lake SDK is used incorrectly, so it'd be great if you could share a small repro or a snippet showing the DataLake calls.
I'm using Azure.Storage.Files.DataLake to send data every minute from my C# code.
The
and added
but the result is the same: it still uses a lot of memory.
@cijothomas It's called every few seconds.
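Roughly, the upload looks like this (a minimal sketch, assuming Azure.Storage.Files.DataLake's DataLakeFileClient.UploadAsync; the connection string, file system, and path names are illustrative):

```csharp
// Minimal sketch (illustrative names): upload a small payload to Data Lake Gen2
// with Azure.Storage.Files.DataLake; in the service this runs on a timer.
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Files.DataLake;

public static class DataLakeUploader
{
    // Reuse a single service client across uploads.
    private static readonly DataLakeServiceClient ServiceClient =
        new DataLakeServiceClient(Environment.GetEnvironmentVariable("DATALAKE_CONNECTION_STRING"));

    public static async Task UploadAsync(string payload)
    {
        // Each payload is tiny (~150 bytes), but uploads happen thousands of times per hour.
        DataLakeFileSystemClient fileSystem = ServiceClient.GetFileSystemClient("telemetry");
        DataLakeFileClient file = fileSystem.GetFileClient($"data/{Guid.NewGuid()}.json");

        using var stream = new MemoryStream(Encoding.UTF8.GetBytes(payload));
        await file.UploadAsync(stream, overwrite: true);
    }
}
```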
Any updates?
We are also facing this issue. We have now implemented a regular restart of our service, but that is far from ideal.
Hi @cijothomas, can anyone help with this?
This issue is stale because it has been open 300 days with no activity. Remove the stale label or comment, or this will be closed in 7 days.
The issue is not fixed.
Microsoft.ApplicationInsights.AspNetCore 2.14.0
Microsoft.ApplicationInsights.WorkerService 2.14.0
Runtime version (from the *.csproj file): netcoreapp3.1
I'm using Azure.Storage.Files.DataLake to send data every minute from my C# code with Application Insights enabled.
There are a lot of operations against DataLake (currently more than 5000 per hour).
My application's memory usage is very high, even though each uploaded file is only ~150 bytes.
App without Application Insights enabled, after 3 days of uptime: ~96 MB
App with Application Insights enabled, after 6 hours of uptime: ~900 MB
Could you please suggest how to avoid this Application Insights memory leak? I've tried, but with no success.
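For context, Application Insights is wired up in the worker service roughly like this (a minimal sketch, assuming the standard AddApplicationInsightsTelemetryWorkerService registration from Microsoft.ApplicationInsights.WorkerService; the option values shown are illustrative, not my exact configuration):

```csharp
// Minimal sketch (illustrative values): Application Insights registration in a
// netcoreapp3.1 worker service via Microsoft.ApplicationInsights.WorkerService.
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public static class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureServices(services =>
            {
                services.AddApplicationInsightsTelemetryWorkerService(options =>
                {
                    // With dependency tracking on, each DataLake HTTP call is recorded
                    // as a dependency item, so these switches drive most of the telemetry
                    // volume at 5000+ operations per hour.
                    options.EnableAdaptiveSampling = true;
                    options.EnableDependencyTrackingTelemetryModule = true;
                });

                // The hosted service that performs the periodic DataLake uploads
                // would be registered here as well.
            })
            .Build()
            .Run();
}
```

With dependency tracking enabled, every DataLake call is tracked as a dependency, so the amount of telemetry buffered in memory grows with the number of operations per hour.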