Quick verdict: Use curl -OL URL for "download this file with its remote name and follow any redirects." Use curl -o myname.zip URL to set a custom name. Add -C - to resume an interrupted download. For multiple files in parallel, -Z is your friend (cURL 7.66+).
| Flag | What it does | Example |
|---|---|---|
| -O (uppercase) | Save with the URL's remote filename | curl -O https://x.com/file.zip → saves as file.zip |
| -o NAME (lowercase) | Save with a custom name | curl -o backup.zip https://x.com/file.zip |
| (no flag) | Print to stdout | curl https://x.com/file.zip > out.zip |
The default behavior (no flag) prints the body to stdout, which is usually not what you want for binary files — the terminal will show garbage and may misinterpret control bytes.
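When stdout is what you want, though, you can stream the body straight into another tool. A minimal sketch (hypothetical URL) that unpacks a tarball without ever writing the archive to disk:

```bash
# Stream the archive from cURL straight into tar; nothing is saved
# except the extracted files
curl -sL https://example.com/release.tar.gz | tar -xz
```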
Many download URLs are 301/302 redirects to a CDN. Without -L, cURL stops at the redirect and saves the redirect HTML, not the actual file. Always add -L for downloads:
```bash
curl -OL https://github.com/user/repo/archive/main.zip
```

That URL redirects to codeload.github.com/...; without -L you get a 1 KB redirect page instead of the actual archive.
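To see a redirect chain without downloading anything, combine -I (HEAD requests) with -L and filter the headers; a quick sketch:

```bash
# Print each hop's status line and Location header;
# -s hides the progress meter, -I sends HEAD, -L follows the chain
curl -sIL https://github.com/user/repo/archive/main.zip | grep -iE '^(HTTP|location)'
```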
Limit how many redirects cURL follows (the default with -L is 50):
```bash
curl -OL --max-redirs 5 https://example.com/download
```

If a 4 GB download dies at 3.2 GB, you do not want to start over. Resume with -C -:
```bash
curl -O -C - https://example.com/big.iso
```

The - after -C tells cURL "auto-detect where to resume from" by checking the existing local file's size. cURL then sends a Range: header asking the server to start from byte X.
The server must support range requests (most CDNs do). If the server ignores the range and answers 200 instead of 206, cURL aborts with exit code 33 ("HTTP server doesn't seem to support byte ranges") rather than silently re-downloading.
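You can check range support up front with a HEAD request; a quick sketch against the same hypothetical URL:

```bash
# Servers that support resumption usually advertise "Accept-Ranges: bytes"
curl -sI https://example.com/big.iso | grep -i '^accept-ranges'
```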
cURL 7.66+ can download multiple URLs in parallel:
```bash
curl -Z -OL https://x.com/file1.zip https://x.com/file2.zip https://x.com/file3.zip
```

Default parallelism is 50. Tune it:
```bash
curl -Z --parallel-max 10 -OL URL1 URL2 URL3 ...
```

For dozens of URLs, generate them with shell expansion:
```bash
curl -Z -OL https://x.com/img-{001..100}.jpg
```

The shell (bash/zsh), not cURL, expands {001..100} into 100 separate URLs before cURL runs; with -Z they all download in parallel. cURL also has its own globbing if you prefer: quote the URL and use [001-100] instead.
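For arbitrary URL lists that brace expansion cannot produce, you can feed cURL a config file on stdin; a sketch assuming a urls.txt with one URL per line:

```bash
# Turn each line of urls.txt into a "url = ..." config entry, then let
# cURL read the config from stdin (-K -) and fetch everything in parallel;
# --remote-name-all applies -O to every URL
sed 's/^/url = /' urls.txt | curl -Z --remote-name-all -K -
```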
Cap download speed (handy on metered connections):
```bash
curl -O --limit-rate 1M https://x.com/big.iso
```

1M = 1 megabyte per second. The suffix can be K, M, or G; without a suffix, the value is bytes per second.
Abort downloads that are too slow:
```bash
curl -O --speed-time 30 --speed-limit 102400 https://x.com/big.iso
```

This means: if the transfer rate stays below 100 KB/s (102400 bytes/s) for 30 seconds, abort with exit code 28. Useful for failing fast on stalled CDN connections.
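Combined with -C -, that abort becomes safe to retry; a minimal sketch (hypothetical URL) that loops until the download completes:

```bash
# Abort stalled transfers, then resume from the bytes already on disk;
# -C - re-checks the local file size on every attempt
while ! curl -OL -C - --speed-time 30 --speed-limit 102400 https://example.com/big.iso; do
  echo "transfer stalled or failed; retrying in 5s..." >&2
  sleep 5
done
```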
Geo-restricted downloads or bulk scraping need a proxy:
```bash
curl -OL --proxy http://USER:[email protected]:8000 https://geo-locked.com/file.zip
```

For datacenter-speed bulk downloads where you do not need rotation, Static Datacenter proxies ($1.50/proxy/month, unlimited bandwidth) are the cheapest option. For geo-targeting (e.g., downloading a US-only file from outside the US), use residential proxies.
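cURL also honors the standard proxy environment variables, which keeps the credentials out of every individual command; a sketch with the same hypothetical proxy:

```bash
# cURL reads the lower-case https_proxy variable for HTTPS URLs
export https_proxy="http://USER:[email protected]:8000"
curl -OL https://geo-locked.com/file.zip
```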
Authenticated download (e.g., paid software, private S3 bucket):
```bash
curl -OL -u USERNAME:PASSWORD https://private.example.com/file.zip
```
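Passing -u on the command line leaks the password into shell history and the process list; an alternative is a ~/.netrc file (chmod 600), which cURL reads with --netrc:

```bash
# ~/.netrc contains one line:
#   machine private.example.com login USERNAME password PASSWORD
curl -OL --netrc https://private.example.com/file.zip
```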
Or with a bearer token:

```bash
curl -OL -H "Authorization: Bearer eyJ..." https://api.example.com/files/123
```

Some servers send Content-Disposition: attachment; filename="...". Use that name with -J:
```bash
curl -OJL "https://example.com/download?id=123"
```

The URL ?id=123 would normally save as download (no extension). With -J, cURL reads the filename from the response header instead. Note that -J requires -O, and quote URLs containing ? so the shell does not glob them.
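If you want the server-provided name but in a different directory, cURL 7.73+ adds --output-dir; a quick sketch:

```bash
# Save the Content-Disposition filename into ~/Downloads instead of the CWD
curl -OJL --output-dir ~/Downloads "https://example.com/download?id=123"
```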
cURL's default progress meter shows transfer rate and ETA. For a cleaner bar, use -#:
```bash
curl -# -O https://example.com/big.iso
```

Disable the progress meter entirely (handy in scripts):
```bash
curl -s -O https://example.com/big.iso
```

-s = silent. Combine with -S if you still want errors shown: -sS -O.
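In scripts you usually also want -f (--fail), so HTTP errors such as 404 turn into a non-zero exit code you can test; a minimal sketch:

```bash
# -f makes HTTP >= 400 responses fail the command;
# -sS stays quiet except for errors
if ! curl -fsS -O https://example.com/big.iso; then
  echo "download failed" >&2
  exit 1
fi
```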
After download, verify the checksum:
```bash
curl -OL https://example.com/release.tar.gz
curl -OL https://example.com/release.tar.gz.sha256
sha256sum -c release.tar.gz.sha256
```

Or verify in one shot, piping the download through sha256sum without saving the file:
```bash
EXPECTED="abcdef..."
# sha256sum -c reads the checksum list from the process substitution while
# the file data arrives on stdin (the "-" filename); note the two spaces
curl -sL https://x.com/file.iso | sha256sum -c <(echo "$EXPECTED  -")
```

Common failure modes:

- The saved file is a tiny HTML page: you forgot -L.
- Unexplained failures: rerun with -v to see the real status.
- TLS handshake errors: force --tlsv1.2 or update cURL.
- Ugly filename from -O (query strings, no extension): use -o cleanname to override.

For one-off downloads, wget URL is simpler: it follows redirects and uses the remote filename by default. For complex downloads (auth, multipart, JSON APIs), cURL's flexibility wins. See curl vs wget for the full comparison.
Related: cURL timeout fixes, cURL authentication, Mastering cURL.