Description
Hi,
I have a problem pulling my synthetics API tests with dogmover.
It crashes after pulling about 64 (sometimes a bit fewer, 62 or 63) synthetic API tests.
I tried adding a 60-second timeout to the request, but it had no effect.
Here is the code:
```python
def pull_synthetics_api_tests(options, tag):
    path = False
    count = 0
    tags = [] if not tag else tag
    # List all synthetic tests, then fetch each one individually.
    r = requests.get('{}api/v1/synthetics/tests?api_key={}&application_key={}'.format(
        options["api_host"], options["api_key"], options["app_key"]))
    synthetics = r.json()
    for synthetic in synthetics["tests"]:
        if synthetic["type"] == "api":
            # Keep only tests carrying every requested tag.
            all_tags_found = True
            for tag in tags:
                if tag not in synthetic["tags"]:
                    all_tags_found = False
                    break
            if all_tags_found:
                count = count + 1
                print("Pulling: {} and writing to file: {}".format(synthetic["name"].encode('utf8'), path))
                print("count={}".format(count))
                json_data = requests.get('{}api/v1/synthetics/tests/{}?api_key={}&application_key={}'.format(
                    options["api_host"],
                    synthetic["public_id"],
                    options["api_key"],
                    options["app_key"]
                ), timeout=60).json()
                path = _json_to_file('synthetics_api_tests', synthetic["public_id"], json_data)
    print("Retrieved '{}' synthetic tests.".format(count))
```
And the log & stack trace:

```
python ./dogmover.py pull synthetics_api_tests --tag stream:SOP
Pulling: [SOP] [OPUS] [ERROR] [PROD] [LMFR] TAXONOMY <!channel> and writing to file: False
count=1
Pulling: [SOP] [OPUS] [ERROR] [PROD] [LMFR] CATEGORY <!channel> and writing to file: ./synthetics_api_tests/9pq-ak5-fdr.json
count=2
[...]
Pulling: [SOP] [OPUS] [ERROR] [DEV] [LMFR] SEARCH WRITES RUNNING and writing to file: ./synthetics_api_tests/iu3-sbk-ic9.json
count=64
Traceback (most recent call last):
  File "./dogmover.py", line 522, in <module>
    pull_synthetics_api_tests(_init_options("pull"), arguments["--tag"])
  File "./dogmover.py", line 170, in pull_synthetics_api_tests
    ), timeout=60).json()
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 516, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='api.datadoghq.eu', port=443): Max retries exceeded with url: /api/v1/synthetics/tests/6tq-wxw-fup?api_key=xxxx&application_key=xxxxx (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f9091bdde10>: Failed to establish a new connection: [Errno 101] Network is unreachable',))
```
Do you have any idea?
Is there a way to configure the connection pool with more connections? Or to always reuse the same connection? Or to free connections after each request?
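For reference, connection reuse and retries can be configured with the public `requests`/`urllib3` APIs, by creating one `requests.Session` and mounting an `HTTPAdapter` with a `Retry` policy on it. This is only a sketch of that idea, not dogmover's actual code; the pool size, retry count, and backoff values below are illustrative:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(pool_maxsize=10, retries=3, backoff=1):
    """Build a Session that keeps TCP connections alive between requests
    and retries transient failures with exponential backoff."""
    session = requests.Session()
    retry = Retry(
        total=retries,
        backoff_factor=backoff,                      # waits 0s, 2s, 4s, ... between attempts
        status_forcelist=(429, 500, 502, 503, 504),  # also retry these HTTP statuses
    )
    adapter = HTTPAdapter(
        pool_connections=pool_maxsize,  # number of host pools to cache
        pool_maxsize=pool_maxsize,      # connections kept per host
        max_retries=retry,
    )
    session.mount('https://', adapter)
    session.mount('http://', adapter)
    return session
```

With this in place, each `requests.get(...)` inside the pull loop would become `session.get(...)`, so every call to `api.datadoghq.eu` reuses an existing connection instead of opening a fresh socket per test, which may avoid exhausting local resources around request 64.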