
How to Download Files With cURL: Complete Guide (2026)

Daniel K. | Published May 10, 2026

Quick verdict: Use curl -OL URL for "download this file with its remote name and follow any redirects." Use curl -o myname.zip URL to set a custom name. Add -C - to resume an interrupted download. For multiple files in parallel, -Z is your friend (cURL 7.66+).

The Two Save Flags: -O vs -o

Flag                  What it does                          Example
-O (uppercase)        Save with the URL's remote filename   curl -O https://x.com/file.zip → saves as file.zip
-o NAME (lowercase)   Save with a custom name               curl -o backup.zip https://x.com/file.zip
(no flag)             Print to stdout                       curl https://x.com/file.zip > out.zip

The default behavior (no flag) prints the body to stdout, which is usually not what you want for binary files — the terminal will show garbage and may misinterpret control bytes.
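Stdout mode is useful on purpose when you want to pipe the body straight into another tool instead of saving it. A minimal offline sketch (assumes curl is installed; a file:// URL and made-up path stand in for a real download):

```shell
# Create a local file to act as the "remote" resource.
printf 'hello\n' > /tmp/curl-demo.txt

# With no -O/-o, the body goes to stdout, so it can feed a pipeline directly:
curl -s "file:///tmp/curl-demo.txt" | sha256sum
```

The same pattern is how people stream archives (`curl -sL URL | tar xz`) without ever touching the disk twice.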

Follow Redirects: -L

Many download URLs are 301/302 redirects to a CDN. Without -L, cURL stops at the redirect and saves the redirect HTML, not the actual file. Always add -L for downloads:

curl -OL https://github.com/user/repo/archive/main.zip

That URL redirects to codeload.github.com/... — without -L you get a 1KB redirect page instead of the actual archive.

Limit how many redirects cURL follows (the built-in default cap is 50):

curl -OL --max-redirs 5 https://example.com/download

Resume Interrupted Downloads: -C

If a 4 GB download dies at 3.2 GB, you do not want to start over. Resume with -C -:

curl -O -C - https://example.com/big.iso

The - after -C tells cURL "auto-detect where to resume from" by checking the existing local file size. cURL sends a Range: header asking the server to start from byte X.

The server must support range requests (most CDNs do). If the server ignores the Range: header and replies with a plain 200 and the whole file, cURL aborts with exit code 33 ("HTTP server doesn't seem to support byte ranges. Cannot resume.") rather than corrupting your partial file.
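The resume arithmetic is easy to see locally: the offset cURL asks for is simply the current size of the partial file on disk. A small sketch (file name is made up):

```shell
# Pretend 5 bytes of big.iso were already downloaded before the connection died.
printf '12345' > /tmp/big.iso

# `-C -` measures the partial file and resumes from that byte offset.
offset=$(( $(wc -c < /tmp/big.iso) ))
echo "curl would send: Range: bytes=${offset}-"
```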

Parallel Downloads: -Z

cURL 7.66+ can download multiple URLs in parallel:

curl -Z -OL \
  https://x.com/file1.zip \
  https://x.com/file2.zip \
  https://x.com/file3.zip

Default parallelism is 50. Tune it:

curl -Z --parallel-max 10 -OL URL1 URL2 URL3 ...

For dozens of URLs, use cURL's own URL globbing (quote the URL so the shell does not interpret the brackets):

curl -Z -OL "https://x.com/img-[001-100].jpg"

cURL expands [001-100] itself and downloads all 100 images in parallel. In bash you can get the same effect with brace expansion, https://x.com/img-{001..100}.jpg, but then the shell, not cURL, generates the URL list.
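To see what bash contributes here, run the brace pattern through echo: the shell expands it into separate arguments before the command ever starts (a three-URL range keeps the output short):

```shell
# bash expands the braces into three separate arguments before echo runs.
echo https://x.com/img-{001..003}.jpg
```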

Speed Limits

Cap download speed (handy on metered connections):

curl -O --limit-rate 1M https://x.com/big.iso

1M = 1 mebibyte (1,048,576 bytes) per second; the K, M, and G suffixes each multiply by 1024. Without a suffix, the value is bytes per second.
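Since the suffixes are 1024-based, a back-of-the-envelope ETA for a rate-limited download is just size divided by rate (the 100 MiB figure is an example):

```shell
rate=$((1024 * 1024))            # --limit-rate 1M = 1,048,576 bytes/second
size=$((100 * 1024 * 1024))      # a 100 MiB file
echo "ETA at 1M: $((size / rate)) seconds"
```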

Abort downloads that are too slow:

curl -O --speed-time 30 --speed-limit 102400 https://x.com/big.iso

"If the transfer rate stays below 100 KB/s for 30 seconds, abort with error 28." Useful for failing fast on stalled CDN connections.

Download Through a Proxy

Geo-restricted downloads or bulk scraping need a proxy:

curl -OL --proxy http://USER:PASS@PROXY_HOST:8000 \
  https://geo-locked.com/file.zip

For datacenter-speed bulk downloads where you do not need rotation, Static Datacenter proxies ($1.50/proxy/month, unlimited bandwidth) are the cheapest. For geo-targeting (e.g., download a US-only file from outside the US), use residential proxies.

Auth + Download

Authenticated download (e.g., paid software, private S3 bucket):

curl -OL -u USERNAME:PASSWORD https://private.example.com/file.zip

Or with a bearer token:

curl -OL -H "Authorization: Bearer eyJ..." https://api.example.com/files/123

Use Server-Suggested Filename

Some servers send Content-Disposition: attachment; filename="...". Use that name with -J:

curl -OJL "https://example.com/download?id=123"

The URL ?id=123 would normally save as download (no extension). With -J, cURL reads the filename from the response header. Note -J requires -O.

Progress Bar

cURL's default progress meter shows transfer rate and ETA. For a cleaner bar, use -#:

curl -# -O https://example.com/big.iso

Disable the progress entirely (handy in scripts):

curl -s -O https://example.com/big.iso

-s = silent. Combine with -S if you still want errors shown: -sS -O.
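In scripts you usually also want -f, so HTTP errors become nonzero exit codes you can branch on. A minimal sketch of that pattern (the fetch helper name is made up, and a file:// URL stands in for a real download so it runs offline; assumes curl is installed):

```shell
# -f: fail on HTTP errors; -s: silent; -S: still print errors to stderr.
fetch() {
  curl -fsS -o "$2" "$1" || { echo "download failed: $1" >&2; return 1; }
}

# Offline demo: file:// stands in for a real URL.
printf 'payload' > /tmp/fetch-src.bin
fetch "file:///tmp/fetch-src.bin" /tmp/fetch-dst.bin && cat /tmp/fetch-dst.bin
```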

Verify Integrity

After download, verify the checksum:

curl -OL https://example.com/release.tar.gz
curl -OL https://example.com/release.tar.gz.sha256
sha256sum -c release.tar.gz.sha256
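The verify step can be rehearsed entirely offline; here is the round trip with a stand-in file (the names are made up):

```shell
# Stand-in for the downloaded release and its published checksum file.
printf 'release data\n' > /tmp/release.tar.gz
sha256sum /tmp/release.tar.gz > /tmp/release.tar.gz.sha256

# -c re-hashes the file and compares it against the recorded checksum.
sha256sum -c /tmp/release.tar.gz.sha256
```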

Or verify on the fly without saving the file, using bash process substitution to hand sha256sum -c the expected line while the download streams in on stdin:

EXPECTED="abcdef..."
curl -sL https://x.com/file.iso | sha256sum -c <(echo "$EXPECTED  -")

Common Errors

  • 0-byte file saved: server returned a redirect; add -L.
  • HTML saved instead of binary: URL is wrong, or server returned an error page; check -v for the real status.
  • "curl: (28) Operation timed out": connection or transfer timeout. See cURL timeout fixes.
  • "curl: (35) SSL connect error": TLS issue. Try --tlsv1.2 or update cURL.
  • Filename has weird characters: server's Content-Disposition has non-ASCII; use -o clean-name.zip to override.

cURL vs wget for Downloads

For one-off downloads, wget URL is simpler — it follows redirects and uses the remote filename by default. For complex downloads (auth, multipart, JSON APIs), cURL's flexibility wins. See curl vs wget for the full comparison.

Related: cURL timeout fixes, cURL authentication, Mastering cURL.