Python requests inside container. Max retries exceeded with url error

Hi, I'm not sure if this is the right place to ask this question. If not, sorry about that.

I'm writing an API scraper that sends GET requests to a site. From time to time, my Docker container stops working as expected: my requests are "blocked". Python error:

raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='TARGET_URL', port=443): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7ffb3bdefa00>: Failed to establish a new connection: [Errno 110] Connection timed out'))

The main problem is that my requests are otherwise fine. I mean, I can still get a response from Google inside the container:

>>> requests.get('https://google.com')
<Response [200]>

I can get a response from both Google and the target URL on my Windows machine:

>>> requests.get('TARGET_URL')
<Response [200]>
>>> requests.get('https://www.google.com')        
<Response [200]>
>>> 
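To narrow down whether the failure happens at the TCP level (rather than somewhere in `requests`), a raw socket connection test inside the container might help; this is just a diagnostic sketch, with the target host left as a placeholder:

```python
import socket

def can_connect(host, port=443, timeout=5):
    """Try a plain TCP connection to host:port.

    Returns True if the TCP handshake succeeds within the timeout,
    False on refusal, timeout, or DNS failure.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Compare inside the container, e.g.:
#   can_connect('www.google.com')   # works for me
#   can_connect('TARGET_HOST')      # presumably times out
```

If `can_connect` fails for the target host but works for Google, the problem is below the HTTP layer (routing, firewall, or rate-limit-based IP blocking) rather than anything in the scraper code.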

But I still can't send requests to TARGET_URL from inside Docker. Does anyone have an idea what's going on? If the target site's backend were blocking me, I wouldn't be able to send requests from my Windows machine either. Or am I wrong? Is it possible that my ISP blocks requests from my Docker container to a specific site when too many requests are sent? I should mention that my friend works on this code too, and he has never been blocked by that site.
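In case the timeouts are transient (e.g. temporary rate limiting), a session with explicit retries, backoff, and a per-request timeout can at least make that visible instead of dying on the first `MaxRetryError`. This is only a sketch; the retry counts are assumptions, and `TARGET_URL` stays a placeholder:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(retries=5, backoff=1.0):
    """Build a requests Session that retries failed connections.

    backoff_factor=1.0 waits roughly 1s, 2s, 4s, ... between attempts,
    so short-lived blocks recover instead of raising immediately.
    """
    retry = Retry(
        total=retries,
        connect=retries,
        backoff_factor=backoff,
        status_forcelist=(429, 500, 502, 503, 504),
    )
    adapter = HTTPAdapter(max_retries=retry)
    session = requests.Session()
    session.mount("https://", adapter)
    session.mount("http://", adapter)
    return session

# Usage (always pass a timeout so a hung connection can't stall forever):
#   session = make_session()
#   session.get('TARGET_URL', timeout=10)
```

If requests still fail after several backed-off retries, the block is likely persistent (IP-based), which would point at the container's outbound IP/NAT rather than the code.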

Docker base image: python:3.9-slim-buster