[BigQuery Data Transfer Service, all GAPIC clients]: provide a close() method and context manager to clean up client-owned resources #9457
Comments
This client is 100% auto-generated via our gRPC toolchain. I'll try to summon some of our folks who work at those lower levels to investigate.
@yebrahim This is unrelated, but you only need to create the client once, outside the loop:

```python
from google.cloud import bigquery_datatransfer_v1
import time

# Create the client once; reuse it across iterations.
dts_client = bigquery_datatransfer_v1.DataTransferServiceClient()

while True:
    try:
        dts_client.get_transfer_run('some run id').state
    except Exception:
        print('error!')
    time.sleep(2)
```
Thank you both.
This isn't a "leak", per se: every GAPIC client has a transport that owns an open gRPC channel, and that channel (and its socket) stays open until it is explicitly closed.
FYI: I am able to reproduce this by using the psutil.Process.connections method to observe leaked resources.

```python
import psutil
from google.cloud import bigquery_datatransfer_v1

current_process = psutil.Process()
print("connections before creating client: {}".format(len(current_process.connections())))
client = bigquery_datatransfer_v1.DataTransferServiceClient()
print("connections after creating client: {}".format(len(current_process.connections())))
sources = list(client.list_data_sources("projects/swast-scratch"))
print("connections after API request: {}".format(len(current_process.connections())))
del client
print("connections after deleting client: {}".format(len(current_process.connections())))
```
Output from ☝️:
The workaround of closing `client.transport.channel` explicitly does release the connection:

```python
import psutil
from google.cloud import bigquery_datatransfer_v1

current_process = psutil.Process()
print("connections before creating client: {}".format(len(current_process.connections())))
client = bigquery_datatransfer_v1.DataTransferServiceClient()
print("connections after creating client: {}".format(len(current_process.connections())))
sources = list(client.list_data_sources("projects/swast-scratch"))
print("connections after API request: {}".format(len(current_process.connections())))
client.transport.channel.close()
print("connections after closing client.transport.channel: {}".format(len(current_process.connections())))
```

Output:
Does it make sense to wrap this call in an API method? Something like `client.close()`?
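In the meantime, a caller-side version of that idea is straightforward to assemble. This is a sketch, not part of the library API; it relies only on the `client.transport.channel.close()` call demonstrated above, plus the standard-library `contextlib.closing` helper:

```python
import contextlib

from google.cloud import bigquery_datatransfer_v1

# Sketch: wrap the channel-closing workaround in a context manager so the
# socket is released even if the block raises.
client = bigquery_datatransfer_v1.DataTransferServiceClient()
with contextlib.closing(client.transport.channel):
    sources = list(client.list_data_sources("projects/swast-scratch"))
# client.transport.channel.close() has been called by this point.
```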
We have just added such a method in the main BigQuery client. Personally, I agree that this makes sense to do across all our client libraries. I've drafted an internal proposal to be reviewed by our core libraries team.
Googlers: see the approved proposal to update generated clients at go/closable-clients-python.

As discussed offline, since this requires changes to the generator, this feature will wait until the new Python 3-only generator migration is underway.
I'm only asking this in terms of viability, regarding @tseaver's comment: is there anything wrong with this approach?
The DB-API is a bit of an oddball with regard to context managers. I would expect a Cursor context manager to commit a transaction but not close the connection, which doesn't apply to BigQuery as it is not a transactional database.
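For a non-transactional client like this one, the context-manager contract could simply be "close the channel on exit". A minimal sketch of what this issue asks for, assuming a `close()` that delegates to the transport channel (this is illustrative, not the shipped implementation):

```python
class DataTransferServiceClient:
    # ... generated RPC methods elided ...

    def close(self):
        # Release the gRPC channel (and its socket) owned by this client.
        self.transport.channel.close()

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.close()
```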
This still feels like a pressing issue, unless some of the linked issues have solved it (I can't immediately tell). @parthea can you also take a look? Maybe it needs to be moved to https://github.com/googleapis/gapic-generator-python?
Closing as obsolete. This was completed in googleapis/gapic-generator-python#987
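With that generator change, a generated client should be usable roughly as follows. This is a usage sketch; check the released library for the exact surface, and note that newer releases may require keyword or request-object arguments for `list_data_sources` rather than the positional style used earlier in this thread:

```python
from google.cloud import bigquery_datatransfer_v1

# The client closes its transport when the with-block exits.
with bigquery_datatransfer_v1.DataTransferServiceClient() as client:
    sources = list(client.list_data_sources("projects/swast-scratch"))
```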
Environment details

- OS: Ubuntu
- Python version: 3.5.2
- API: BigQuery Data Transfer Service
- `pip list` output:
Steps to reproduce
Initialize more than one data transfer client.
Code example
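(The original snippet did not survive in this copy. The following is a reconstruction sketch based on the steps above; it is not the reporter's exact code. The fd count of 3 clients and the `/proc` inspection are assumptions for illustration.)

```python
import os

from google.cloud import bigquery_datatransfer_v1

# Create more than one client; each opens its own gRPC channel whose
# file descriptor is never released.
clients = [
    bigquery_datatransfer_v1.DataTransferServiceClient() for _ in range(3)
]

# On Linux (Ubuntu here), the open descriptors are visible under /proc.
print(sorted(os.listdir("/proc/{}/fd".format(os.getpid())), key=int))
```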
Output of `ll /proc/<pid>/fd` after three seconds:

```
0 1 10 11 2 3 4 5 6 7 8 9
```

And it keeps growing.
There's also no way to clean up the client, as it doesn't implement `__exit__`.