[BUG] client object increasing in size over multiple requests #178
Comments
Did you configure any logger? Is the logger maybe never flushing and keeping everything in RAM?
See the attached client construction; no logger is defined. Should I define one, or can I disable logging?
I had the same issue, and removing the logger solved it.
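One way to "remove" the logger without losing the client is to pass a PSR-3 no-op logger explicitly. This is a minimal sketch, assuming the opensearch-php `ClientBuilder::setLogger()` method and the `psr/log` `NullLogger`; host and credentials are illustrative placeholders:

```php
<?php

use OpenSearch\ClientBuilder;
use Psr\Log\NullLogger;

// NullLogger discards every record, so no log buffer can grow in memory
// across bulk requests.
$client = ClientBuilder::create()
    ->setHosts(['https://localhost:9200'])   // illustrative host
    ->setLogger(new NullLogger())
    ->build();
```

With a no-op logger, each request's log calls become cheap no-ops instead of accumulating records in a handler buffer.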
@plumthedev Is it a bug in the client? Could you help us fix it, or document how to avoid it?
I don't think it's a bug in the client, but I can check it in a day.
Do you use Monolog? Did you configure a
@shyim

```php
'opensearch' => [
    'days' => 7,
    'driver' => 'daily',
    'formatter' => JsonFormatter::class,
    'level' => 'debug',
    'path' => storage_path('logs/opensearch.jsonl'),
],
```

This uses Monolog's RotatingFileHandler. I don't see any option to set the
I opened a discussion on the Laravel community page: laravel/framework#52659
What is the bug?
When performing multiple bulk imports with the same client object, the memory footprint of the client object grows with each request.
How can one reproduce the bug?
Build a string that can be ingested by the bulk import function. Then, in a loop, import this string multiple times with the same client object and log the memory used after each call; you will see a steady increase.
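The reproduction steps above can be sketched roughly as follows. This is a hypothetical illustration, assuming the opensearch-php `ClientBuilder` and `bulk()` API; the index name, host, and loop counts are made up:

```php
<?php

use OpenSearch\ClientBuilder;

$client = ClientBuilder::create()
    ->setHosts(['https://localhost:9200'])   // illustrative host
    ->build();

// Build one reusable NDJSON bulk payload (action line + document line pairs).
$body = '';
for ($i = 0; $i < 500; $i++) {
    $body .= json_encode(['index' => ['_index' => 'memtest']]) . "\n";
    $body .= json_encode(['field' => 'value ' . $i]) . "\n";
}

// Re-import the same payload with the same client and watch memory grow.
for ($run = 0; $run < 150; $run++) {
    $client->bulk(['body' => $body]);
    // memory_get_usage(true) reports memory allocated to the PHP process.
    printf("run %d: %d bytes\n", $run, memory_get_usage(true));
}
```

If a buffering logger is attached to the client, each `bulk()` call adds request/response records to that buffer, which would explain the monotonic growth.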
What is the expected behavior?
I expected the client object to stay at roughly the same memory usage, even over multiple requests.
What is your host/environment?
PHP 8.1.27
Build System: Linux
Do you have any additional context?
I am using only one client object to prevent our hosting provider from blocking me as a possible DDoS attacker. Therefore I create the client object in __construct of the import class:
Now I loop over all the data that needs to be imported (a lot), but after a while the importer crashes with exhausted memory.
I am paginating over all the results in steps of 500, which is why there are so many bulk imports one after another. The crash happens after about 150 bulk import calls with 1 GB of allowed memory.
If I am misusing the client object and it is not a bug, I would be happy to get some hints on how to do it better.
Cheers