Quick verdict: Python's requests has NO default timeout — without one, your script can hang forever waiting for a slow server. Always pass timeout=(3.05, 27) (connect, read) or a single timeout=10 as a sane default. Catch requests.Timeout (parent of both ConnectTimeout and ReadTimeout) to handle either case.
This is the bug that has bitten every Python developer at least once:
```python
import requests

r = requests.get("https://slow-or-broken-server.com")
# This can hang forever. There is no default timeout.
```

The requests documentation strongly recommends always setting a timeout. The library leaves the default at None (no timeout) for backwards compatibility, but in production code it is a footgun.
The right call:
```python
r = requests.get("https://api.example.com/data", timeout=10)
```

10 seconds applied to both the connect and the read phase. If either phase (DNS resolution, TCP connect, and TLS handshake on one side; waiting for response data on the other) takes longer, requests raises requests.Timeout.
timeout accepts two forms:
- timeout=10 — same value for both connect and read (10s connect, 10s read)
- timeout=(3.05, 27) — tuple: 3.05s connect, 27s read

The connect timeout is short by convention because the TCP+TLS handshake should complete in a few hundred milliseconds on a healthy network. If it takes longer than 3 seconds, the server is probably down or the network is broken — fail fast.
The read timeout is longer because some endpoints legitimately take time to compute a response (e.g., a search or analytics query). 27 seconds is a commonly cited value; tune it to your API's 99th-percentile latency plus a margin.
The 3.05 specifically follows a documented recommendation: set connect timeouts slightly larger than a multiple of 3 seconds, the default TCP packet retransmission window, so you don't give up just as a retransmitted packet would have arrived.
Three timeout exceptions in the requests hierarchy:
```python
import requests

try:
    r = requests.get(url, timeout=(3.05, 27))
    r.raise_for_status()
except requests.ConnectTimeout:
    # Could not establish TCP+TLS within 3.05s
    print("Server unreachable")
except requests.ReadTimeout:
    # Connected, but the server did not return the body within 27s
    print("Server too slow")
except requests.Timeout:
    # Parent of both ConnectTimeout and ReadTimeout
    # (this catches anything either subclass would catch)
    print("Some timeout")
except requests.RequestException:
    # Parent of everything (network errors, HTTP errors, JSON, etc.)
    print("Generic request failure")
```

The exception hierarchy:
```
requests.RequestException
+-- requests.ConnectionError
|   +-- requests.ConnectTimeout   # subclass of ConnectionError AND Timeout
+-- requests.Timeout
    +-- requests.ConnectTimeout   # also here, via multiple inheritance
    +-- requests.ReadTimeout
```

Catch requests.Timeout to handle either timeout cause; catch requests.RequestException to handle any failure.
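You can verify the multiple inheritance directly in an interpreter:

```python
import requests

# ConnectTimeout inherits from BOTH ConnectionError and Timeout,
# so an except clause for either one will catch it.
print(issubclass(requests.ConnectTimeout, requests.ConnectionError))  # True
print(issubclass(requests.ConnectTimeout, requests.Timeout))          # True

# ReadTimeout only descends from Timeout.
print(issubclass(requests.ReadTimeout, requests.Timeout))             # True
print(issubclass(requests.ReadTimeout, requests.ConnectionError))     # False
```

This is why a bare `except requests.ConnectionError` handler will also swallow connect timeouts, which can surprise you if you expected them to reach an `except requests.Timeout` clause below it.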
requests.Session does not accept a timeout in its constructor. To set a default for every request from a session, subclass it:
```python
import requests

class TimeoutSession(requests.Session):
    def __init__(self, timeout=10):
        super().__init__()
        self.timeout = timeout

    def request(self, *args, **kwargs):
        kwargs.setdefault("timeout", self.timeout)
        return super().request(*args, **kwargs)

session = TimeoutSession(timeout=(3.05, 27))
r = session.get("https://api.example.com/data")               # uses 27s read timeout
r2 = session.get("https://api.example.com/slow", timeout=60)  # per-call override
```

Now every request through this session has a default timeout, but you can override per-call.
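An alternative to subclassing Session is a transport adapter that injects the timeout at the send layer, a common community pattern (the class name TimeoutHTTPAdapter here is our own, not part of requests):

```python
import requests
from requests.adapters import HTTPAdapter

class TimeoutHTTPAdapter(HTTPAdapter):
    """Injects a default timeout unless the caller passes one explicitly."""
    def __init__(self, timeout=(3.05, 27), **kwargs):
        self.timeout = timeout
        super().__init__(**kwargs)

    def send(self, request, **kwargs):
        if kwargs.get("timeout") is None:
            kwargs["timeout"] = self.timeout
        return super().send(request, **kwargs)

session = requests.Session()
adapter = TimeoutHTTPAdapter(timeout=(3.05, 27))
session.mount("http://", adapter)
session.mount("https://", adapter)
```

The adapter approach composes nicely with other mount-level configuration such as urllib3 retries, since both live on the same HTTPAdapter object.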
Common misconception: timeout=10 means "the whole request must finish in 10s." Not quite.
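A quick local demonstration makes this concrete. This sketch (for illustration only) runs a tiny drip-feed server that sends one byte every 0.3 seconds, so the whole response takes about 1.5 seconds, yet a 1-second read timeout never fires because no single gap between bytes exceeds 0.3 seconds:

```python
import http.server
import threading
import time

import requests

class DripHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # 5 bytes total, one every 0.3s: ~1.5s wall clock, 0.3s max gap.
        self.send_response(200)
        self.send_header("Content-Length", "5")
        self.end_headers()
        for _ in range(5):
            self.wfile.write(b"x")
            self.wfile.flush()
            time.sleep(0.3)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), DripHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

start = time.time()
r = requests.get(url, timeout=(3.05, 1.0))  # read timeout < total duration
elapsed = time.time() - start

print(len(r.content))  # 5 -- full body arrived despite taking > 1s overall
print(elapsed > 1.0)   # True -- the read timeout never fired
server.shutdown()
```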
The 10s is the longest allowed gap between bytes: if a server slowly streams a 1 GB body, sending data every 9 seconds, the request never times out (each received chunk resets the read clock). For a hard wall-clock budget, stream the response and track elapsed time explicitly:
```python
import time

import requests

def fetch_with_deadline(url, max_seconds=30):
    start = time.time()
    r = requests.get(url, timeout=10, stream=True)
    chunks = []
    for chunk in r.iter_content(8192):
        if time.time() - start > max_seconds:
            r.close()
            raise TimeoutError("hard deadline exceeded")
        chunks.append(chunk)
    return b"".join(chunks)
```

Proxies add latency. The connect timeout now covers: TCP to the proxy + TLS to the proxy + the proxy's connect to the upstream + TLS to the upstream. Bump the connect side:

```python
proxies = {"http": "http://USER:[email protected]:8000",
           "https": "http://USER:[email protected]:8000"}

r = requests.get(url, proxies=proxies, timeout=(10, 30))
```

10s connect (vs 3s direct) accommodates the double handshake. The read timeout stays at 30s.
For LTE mobile or far-away residential proxies, expect even higher connect latency — 15-20s connect timeout is reasonable.
The httpx library ships different defaults — a 5-second timeout on every operation (connect, read, write, pool) out of the box, which is much safer than requests' None. Configure:
```python
import httpx

client = httpx.Client(
    timeout=httpx.Timeout(10.0, connect=3.0)
)
r = client.get("https://api.example.com/data")
```

httpx exposes finer-grained controls (connect, read, write, pool) than requests does.
- Catch requests.Timeout if you want to retry or fail gracefully; catch requests.RequestException for the generic case.
- Use stream=True and iterate chunks, tracking wall-clock time yourself, if you need a hard deadline.

Related: Python requests retry, Python requests cookies, cURL timeout.