Closed
Labels
api: bigquerydatatransfer (Issues related to the BigQuery Data Transfer Service API.)
api: core
type: feature request ('Nice-to-have' improvement, new feature or different behavior or design.)
Description
Environment details
pip list output:
cachetools 3.1.1
certifi 2019.9.11
chardet 3.0.4
google-api-core 1.14.3
google-api-python-client 1.7.11
google-auth 1.6.3
google-auth-httplib2 0.0.3
google-cloud-bigquery 1.20.0
google-cloud-bigquery-datatransfer 0.4.1
google-cloud-core 1.0.3
google-cloud-pubsub 1.0.2
google-resumable-media 0.4.1
googleapis-common-protos 1.6.0
grpc-google-iam-v1 0.12.3
grpcio 1.24.1
httplib2 0.14.0
idna 2.8
pip 19.2.3
protobuf 3.10.0
pyasn1 0.4.7
pyasn1-modules 0.2.6
pytz 2019.3
requests 2.22.0
rsa 4.0
setuptools 41.2.0
six 1.12.0
uritemplate 3.0.0
urllib3 1.25.6
wheel 0.33.6
OS: Ubuntu
Python 3.5.2
API: BigQuery Data Transfer Service
Steps to reproduce
Initialize more than one data transfer client.
Code example
```python
from google.cloud import bigquery_datatransfer_v1
import time

while True:
    dts_client = bigquery_datatransfer_v1.DataTransferServiceClient()
    try:
        dts_client.get_transfer_run('some run id').state
    except:
        print('error!')
    time.sleep(2)
```

Output of `ll /proc/<pid>/fd` after three seconds:
0 1 10 11 2 3 4 5 6 7 8 9
And keeps growing.
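As an alternative to running `ll /proc/<pid>/fd` from a shell, the descriptor count can be checked from inside the process with only the standard library. This is a Linux-only sketch; `open_fd_count` is a hypothetical helper written for this report, not part of any Google library:

```python
import os

def open_fd_count():
    # Number of open file descriptors for this process. Linux-only:
    # reads /proc/self/fd, which itself briefly uses one descriptor,
    # but it does so on every call, so deltas stay comparable.
    return len(os.listdir('/proc/self/fd'))

before = open_fd_count()
f = open('/dev/null')  # any newly opened fd bumps the count by one
after = open_fd_count()
f.close()
```

Calling `open_fd_count()` before and after each `DataTransferServiceClient()` construction in the loop above shows the count increasing on every iteration.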
There's also no way to clean up the client explicitly: it implements neither a `close()` method nor `__exit__`, so it can't be used as a context manager.