Quick verdict: HTTPX is the modern Python HTTP client — same API as requests for sync code, plus async support and HTTP/2. For new projects in 2026, HTTPX is the recommended pick. Existing requests code works fine; migrate when you need async, HTTP/2, or to consolidate sync + async into one library. Performance: 5-10x faster than requests for concurrent workloads, comparable for single requests.
This guide covers what HTTPX is, when to migrate from requests, performance comparisons, and 8 working examples for sync, async, streaming, HTTP/2, and proxies.
```bash
pip install httpx
# For HTTP/2 support (quotes keep zsh from globbing the brackets)
pip install "httpx[http2]"
```
| Feature | requests | HTTPX | aiohttp |
|---|---|---|---|
| Sync API | Yes | Yes | No |
| Async API | No | Yes | Yes |
| HTTP/2 | No | Yes | No |
| API similarity to requests | Native | ~95% drop-in | Different |
| Throughput (100 concurrent reqs) | Slow (sync) | Fast | Fastest |
```python
import httpx

r = httpx.get("https://api.example.com/items")
print(r.json())
```
```python
import httpx

with httpx.Client() as client:
    r1 = client.get("https://api.example.com/items")
    r2 = client.get("https://api.example.com/users")
    # Connections are pooled and reused, much faster than separate calls
```
```python
import asyncio
import httpx

async def main():
    urls = [f"https://api.example.com/items/{i}" for i in range(100)]
    async with httpx.AsyncClient() as client:
        rs = await asyncio.gather(*[client.get(u) for u in urls])
        return [r.json() for r in rs]

asyncio.run(main())
```
```python
import httpx

# Requires the http2 extra: pip install "httpx[http2]"
with httpx.Client(http2=True) as client:
    r = client.get("https://www.cloudflare.com")
    print(r.http_version)  # "HTTP/2"
```
```python
import httpx

proxy = "http://USER:[email protected]:8080"
# Current HTTPX takes a single proxy= argument; the older scheme-keyed
# proxies= dict was deprecated and then removed
with httpx.Client(proxy=proxy) as client:
    r = client.get("https://api.example.com")
```
```python
import httpx

with httpx.stream("GET", "https://example.com/large-file.zip") as r:
    with open("local.zip", "wb") as f:
        for chunk in r.iter_bytes():
            f.write(chunk)
```
```python
import httpx

# 30 s default for each phase, with connect capped at 10 s and read at 20 s
timeout = httpx.Timeout(30.0, connect=10.0, read=20.0)
with httpx.Client(timeout=timeout) as client:
    r = client.get("https://slow-api.com")
```
```python
import httpx

r = httpx.post(
    "https://api.example.com/items",
    json={"name": "sample", "price": 42},
    headers={"Authorization": "Bearer YOUR_TOKEN"},
)
print(r.json())
```
For most code, the migration is a one-line import change:
```python
# Before
import requests
r = requests.get(url, params={"q": "x"}, headers={"X-Key": "abc"})

# After
import httpx
r = httpx.get(url, params={"q": "x"}, headers={"X-Key": "abc"})
```
The 5% incompatibility cases:

- `requests.Session()` becomes `httpx.Client()`
- `verify=False` works in HTTPX too, but for a custom CA use `verify=ssl.create_default_context(...)`
- requests streams with `r.iter_content()`; HTTPX uses `r.iter_bytes()` inside a stream context
- requests proxies take `{"http": ..., "https": ...}`; HTTPX takes a single `proxy=...` argument (the scheme-keyed `proxies=` dict with trailing slashes was removed in recent versions; per-scheme routing now uses `mounts=`)

Putting it all together: async client, HTTP/2, an authenticated proxy, and a connection limit.

```python
import asyncio
import httpx

PROXY = "http://USER:[email protected]:8080"

async def fetch(client, url):
    r = await client.get(url, timeout=20)
    return r.text

async def main(urls):
    async with httpx.AsyncClient(
        proxy=PROXY,
        http2=True,
        limits=httpx.Limits(max_connections=50),
    ) as client:
        return await asyncio.gather(*[fetch(client, u) for u in urls])

urls = ["https://example.com/" + str(i) for i in range(1000)]
results = asyncio.run(main(urls))
```
Through a rotating residential proxy with HTTPX async + HTTP/2, real-world throughput is 200-500 requests/second on a single client, vs 20-50/second with sync requests.