Quick verdict: use -A "Mozilla/5.0 ..." to set a custom User-Agent (or -H "User-Agent: ...", which is equivalent). cURL's default UA is curl/X.Y.Z, which many sites block on sight. But UA spoofing alone is not enough against modern bot detection: sites also check your TLS fingerprint, HTTP/2 settings, and behavioral signals. Pair a spoofed UA with curl_cffi (Chrome impersonation) or residential proxies for the rest.
```shell
$ curl -v https://httpbin.org/headers 2>&1 | grep User-Agent
> User-Agent: curl/8.1.2
```

This default is honest and useful for debugging cURL itself. Hostile environments — CDN bot filters, anti-scraping tools — recognize this UA and reject immediately.
Two equivalent ways:

```shell
# Short form (-A flag)
curl -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36" https://example.com

# Long form (-H flag)
curl -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36" https://example.com
```

Both produce the same request. Use -A for cleaner scripts when the UA is your only header customization; use -H when you are setting multiple headers anyway.
Pick a UA matching a recent stable release. These rotate — check useragentstring.com for the latest.
```
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36
Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:126.0) Gecko/20100101 Firefox/126.0
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.5 Safari/605.1.15
Mozilla/5.0 (iPhone; CPU iPhone OS 17_5 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.5 Mobile/15E148 Safari/604.1
Mozilla/5.0 (Linux; Android 14; Pixel 8) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Mobile Safari/537.36
```

To rotate among them:

```shell
UAS=(
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36"
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36"
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:126.0) Gecko/20100101 Firefox/126.0"
)

for i in {1..10}; do
  UA="${UAS[$RANDOM % ${#UAS[@]}]}"
  curl -A "$UA" "https://target.com/page/$i"
  sleep 2
done
```

Each request picks a random UA from the list. Combine with proxy rotation for serious scraping.
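One way to combine the two, sketched as bash functions. The proxy endpoints, credentials, and target domain below are placeholders, and scrape_pages / pick_random are hypothetical helper names, not curl features:

```shell
#!/usr/bin/env bash
# Sketch: rotate User-Agent and proxy together per request.
# The proxy URLs and target domain are placeholders -- substitute your own.
UAS=(
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36"
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:126.0) Gecko/20100101 Firefox/126.0"
)
PROXIES=(
  "http://user:pass@proxy1.example.net:8000"
  "http://user:pass@proxy2.example.net:8000"
)

# pick_random NAME -> echo one random element of the named array (bash 4.3+ nameref)
pick_random() {
  local -n arr="$1"
  echo "${arr[RANDOM % ${#arr[@]}]}"
}

# scrape_pages BASE_URL COUNT -> fetch /page/1..COUNT, fresh UA and proxy each time
scrape_pages() {
  local base="$1" count="$2" i
  for ((i = 1; i <= count; i++)); do
    curl -sS --max-time 15 \
      -A "$(pick_random UAS)" \
      -x "$(pick_random PROXIES)" \
      "$base/page/$i"
    sleep $((RANDOM % 3 + 1))  # jittered delay reads less mechanical than a fixed sleep
  done
}
```

Usage: scrape_pages https://target.com 10. The jittered sleep is a small behavioral tweak; it does not defeat rate analysis on its own.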
Empty value:

```shell
curl -A "" https://example.com
# or
curl -H "User-Agent:" https://example.com
```

Some servers reject requests without a UA; most accept it. Setting it empty is different from leaving the default — sites that block "curl/X.Y" specifically might let "" through, and vice versa.
Modern anti-bot services (Cloudflare, Akamai, DataDome, PerimeterX) score your traffic at multiple layers. UA is just one signal:
- TLS fingerprint: use curl_cffi with impersonate="chrome126" for real Chrome TLS.
- Header order: pass headers with -H in browser order to match.
- Client hints: sec-ch-ua, sec-fetch-mode, sec-fetch-site. Real browsers send these; cURL does not unless you add them manually.

For sites running modern anti-bot, plain cURL with a spoofed UA gets blocked even with the most realistic UA string. UA spoofing is necessary but not sufficient.
If you spoof Chrome on Windows, your other headers should match too:

```shell
curl https://api.target.com/data \
  -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36" \
  -H "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8" \
  -H "Accept-Language: en-US,en;q=0.9" \
  -H "Accept-Encoding: gzip, deflate, br" \
  -H "sec-ch-ua: \"Chromium\";v=\"126\", \"Not-A.Brand\";v=\"24\", \"Google Chrome\";v=\"126\"" \
  -H "sec-ch-ua-mobile: ?0" \
  -H "sec-ch-ua-platform: \"Windows\""
```

This passes basic header checks. One caveat: advertising br invites a Brotli-compressed response that plain curl will not decode; add --compressed (on a brotli-enabled build) if you need the body. And the TLS layer still needs curl_cffi for the toughest sites.
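If you use this header set often, it can be wrapped in a small function. chrome_get is a hypothetical helper name, not a curl feature; it uses --compressed instead of a hand-written Accept-Encoding so curl also decodes whatever encoding the server picks:

```shell
# Hypothetical wrapper: fetch a URL with a Chrome-on-Windows header set.
# Extra curl flags can be appended via "$@".
chrome_get() {
  curl -sS --compressed "$@" \
    -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36" \
    -H "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8" \
    -H "Accept-Language: en-US,en;q=0.9" \
    -H 'sec-ch-ua: "Chromium";v="126", "Not-A.Brand";v="24", "Google Chrome";v="126"' \
    -H "sec-ch-ua-mobile: ?0" \
    -H 'sec-ch-ua-platform: "Windows"'
}

# Usage: chrome_get https://api.target.com/data
```

Putting the URL last via "$@" keeps call sites short while still allowing per-call flags, e.g. chrome_get -x http://myproxy:8000 https://api.target.com/data.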
Some legitimate use cases for custom UAs:
- Identified crawler: -A "MyCrawler/1.0 (+https://mysite.com/about)" tells site owners who you are and how to contact you. Good citizenship; some sites whitelist identified scrapers.

UA is the LEAST important fingerprint signal in 2026:
| Layer | What it sees | cURL's default |
|---|---|---|
| UA string | Self-reported identity | curl/X.Y (easy to spoof) |
| TLS ClientHello | Crypto handshake fingerprint | cURL/OpenSSL signature (hard to spoof without curl_cffi) |
| HTTP/2 fingerprint | SETTINGS frame, header order | cURL's pattern (hard to change) |
| IP reputation | ASN, blocklists | Whatever you're on (use residential proxies) |
| JS fingerprint (canvas, WebGL) | Only fires if you run JS | cURL never runs JS (use Playwright) |
For scraping that survives modern detection, fix the bottom layers first.
Related: cURL auth, cURL GET, Browser fingerprinting, Cloudscraper.