cURL - The Ultimate Reference Guide
cURL, short for “Client for URL”, is a computer software project providing the libcurl library and the curl command-line tool for transferring data, such as downloads and uploads, using various network protocols. The curl tool and libcurl library support a large selection of network protocols, such as: DICT, FILE, FTP, FTPS, GOPHER, GOPHERS, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, MQTT, POP3, POP3S, RTMP, RTMPS, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET, TFTP, WS and WSS. This rich support for common data transfer protocols makes curl and libcurl powerful tools in the arsenal of anyone interacting with network protocols.
I recently watched the master class presentation Mastering the Curl Command Line (slides) by Daniel Stenberg (@badger), the founder and lead developer of cURL and libcurl, which was very detailed and full of useful information about the cURL command-line tool. From the video I added a plethora of new notes to my curl note collection, and I decided to share them as a comprehensive reference and guide to the curl command-line tool. These notes are a compilation of my personal notes as well as notes taken while watching Mastering the Curl Command Line.
Before Starting
If you’re just learning about curl or are a seasoned curler looking to try new commands, I highly recommend installing Wireshark while fiddling with curl. It lets you see the network traffic you’re sending with curl.
Many commands and protocols can be tested against solutions such as Python’s http.server, allowing you to test curl commands against localhost.
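For example, you can start a local test server with Python and point curl at it (port 8000 is an arbitrary choice):
python3 -m http.server 8000
curl http://localhost:8000/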
Installing cURL
Linux
Ubuntu & Debian
Redhat & CentOS
nix
Arch Linux
SUSE and openSUSE
SUSE SLE Micro and openSUSE MicroOS
macOS
Windows
Ensure that winget is installed on your Windows PC.
Docker
The curl command-line tool is available as curlimages/curl on Docker Hub.
Run with Docker
Running curl with docker
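A minimal sketch using the curlimages/curl image against a placeholder URL:
docker run -it --rm curlimages/curl https://example.com/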
Finding Help & Usage
Version
-V, --version - option to show the curl version number and quit
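For example:
curl --version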
Verbose
-v, --verbose - option for verbose output.
--trace-time - option to add timestamps to trace and verbose output.
--trace - option to write a debug trace to a file.
--trace-ascii - option to write a debug trace to a file without hex.
Be careful when sharing trace files!
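For example, against a placeholder URL (trace.txt is an arbitrary file name):
curl -v https://example.com/
curl --trace-ascii trace.txt --trace-time https://example.com/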
Config File
Config files allow users to write complex curl commands by storing options in a config file. Pass a config file to curl using the -K, --config option.
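A minimal sketch, assuming a config file named curl.cfg with one option per line:
cat > curl.cfg <<'EOF'
url = "https://example.com/"
verbose
EOF
curl -K curl.cfg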
Passwords
-u, --user - option to pass a username:password combination.
.netrc
To avoid password leakage you can use a .netrc file or a config file. The -n, --netrc option will look for a .netrc file in the home directory.
Pass your own netrc file using the --netrc-file option.
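A .netrc entry looks like: machine example.com login alice password secret (host and credentials are placeholders). Then:
curl -n https://example.com/
curl --netrc-file ./my-netrc https://example.com/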
Multiple URLs
Progress Meters
-s, --silent, --no-progress-meter - for silent, no output.
Use the -#, --progress-bar option to display a transfer progress bar.
Reset URL State
When using multiple URLs, use the -:, --next option to reset the state between URLs, forcing each URL to use its own unique options.
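For example (placeholder URLs), giving the second URL its own options:
curl -I https://example.com/ -: -d name=admin https://example.com/login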
Write Out
-w, --write-out - option to output variables after command completion.
See the curl documentation for a full list of the more than 50 write-out variables.
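For example, printing only the HTTP response code using the %{variable} syntax:
curl -s -o /dev/null -w '%{response_code}\n' https://example.com/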
Downloads
Supports outputting using the -o, --output (specify filename) and -O, --remote-name (use server name) options.
Use the --remote-header-name option to use the Content-Disposition name for the filename; use it with the -O, --remote-name option.
Output Options
-o, --output - Write to file instead of stdout
-O, --remote-name - Write output to a file named as the remote file
--output-dir - saves the -O, --remote-name data in another directory.
--create-dirs - Create necessary local directory hierarchy (useful to output to a dir that doesn’t exist)
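For example, saving into a directory that may not exist yet (paths are placeholders; --output-dir needs a reasonably recent curl):
curl -O --output-dir /tmp/downloads --create-dirs https://example.com/index.html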
Single URL to File
-o, --output - Write to file instead of stdout
-O, --remote-name - Write output to a file named as the remote file
--remote-header-name - Use the Content-Disposition name for the filename, use with the -O, --remote-name option
Multiple URLs to Files
-o, --output - Write to file instead of stdout
-O, --remote-name - Write output to a file named as the remote file
--remote-name-all - Use the remote name for all URLs
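For example (placeholder URLs):
curl -o one.html -o two.html https://example.com/1 https://example.com/2
curl --remote-name-all https://example.com/a.txt https://example.com/b.txt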
Retry
Useful for transient errors such as the momentary loss of network connectivity to components and services.
--retry - perform a number of retries
--retry-max-time - during this timeframe
--retry-delay - Wait this long before retries
--retry-connrefused - Retry on connection refused (can be considered transient), use with --retry
--retry-all-errors - Consider all errors transient
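For example, retrying a flaky (placeholder) endpoint:
curl --retry 5 --retry-delay 2 --retry-max-time 60 --retry-all-errors https://example.com/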
Uploads
-T, --upload-file - Transfer local file to destination.
If the URL has no file name, the file name from -T, --upload-file is appended to the URL
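For example (placeholder destinations):
curl -T localfile.txt https://example.com/upload/
curl -T localfile.txt ftp://ftp.example.com/incoming/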
Transfer Controls
-Y, --speed-limit - Stop transfers slower than this
-y, --speed-time - Trigger 'speed-limit' abort after this
--limit-rate - Limit transfer speed rate
Transfer limits and time units
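For example, aborting if the transfer drops below 1000 bytes/second for 30 seconds and capping the rate at 100K (placeholder URL):
curl -Y 1000 -y 30 --limit-rate 100K -O https://example.com/big.iso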
Naming Tricks
Provide a name + port => address mapping - use --resolve <host:port:address[,address]...> to resolve the host+port to this address.
Provide a name + port => name + port mapping - use --connect-to <HOST1:PORT1:HOST2:PORT2> to connect to a different host.
Set a fake host header (may fail certificate checks with TLS)
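Examples for the three tricks above (host names and addresses are placeholders):
curl --resolve example.com:443:127.0.0.1 https://example.com/
curl --connect-to example.com:443:staging.example.com:443 https://example.com/
curl -H "Host: example.com" http://127.0.0.1/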
Network Layer Race Conditions
Curl uses both IPv4 and IPv6; fix the Internet Protocol (IP) version with --ipv4 or --ipv6
--ipv4
Resolve names to IPv4 addresses
--ipv6
Resolve names to IPv6 addresses
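For example:
curl --ipv4 https://example.com/
curl --ipv6 https://example.com/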
Connections
Curl options such as --interface, --local-port, --keepalive-time, --dns-ipv4-addr, and --dns-ipv6-addr let you specify the network interface, local ports, keepalive interval, and DNS addresses for granular control over connections.
--interface
Use network INTERFACE (or address)
--local-port
Force use of RANGE for local port numbers
--keepalive-time
Interval time for keepalive probes
--dns-ipv4-addr
IPv4 address to use for DNS requests
--dns-ipv6-addr
IPv6 address to use for DNS requests
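A sketch; the interface name, port range, and addresses are placeholders, and the --dns-* options require a curl built with c-ares:
curl --interface eth0 --local-port 44000-44200 --keepalive-time 60 https://example.com/
curl --dns-ipv4-addr 192.168.1.2 https://example.com/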
Timeouts
Controls the timeouts
-m, --max-time
Maximum time allowed for the transfer
--connect-timeout
Maximum time allowed for connection
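For example, giving up on the connection after 5 seconds and the whole transfer after 30 seconds:
curl --connect-timeout 5 -m 30 https://example.com/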
Exit Status
The numerical value curl returns to the shell as an exit code.
Check the return code in the shell with $?
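For example (0 means success; the non-zero error codes are listed in the curl man page):
curl -f -s https://example.com/might-not-exist
echo $?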
SSH
curl will attempt to use the public key in the ~/.ssh directory
SCP
SFTP
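Minimal sketches for SCP and SFTP against a placeholder host, assuming key-based authentication:
curl -u alice: -O scp://example.com/home/alice/file.txt
curl -u alice: -O sftp://example.com/home/alice/file.txt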
Reading Email
POP3
List message numbers & sizes
Download email 1
--request DELE - delete message 1
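Examples in the same order as the captions above (placeholder mail server and credentials):
curl -u alice:secret pop3://mail.example.com/
curl -u alice:secret pop3://mail.example.com/1
curl -u alice:secret --request DELE pop3://mail.example.com/1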
IMAP
List the emails in the mailbox 'work'
Get the mail with UID 57 from the mailbox 'stuff'
Get the mail with index 57 from the mailbox 'fun'
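Sketches for the three cases above (placeholder server and credentials):
curl -u alice:secret imap://mail.example.com/work
curl -u alice:secret "imap://mail.example.com/stuff;UID=57"
curl -u alice:secret "imap://mail.example.com/fun;MAILINDEX=57"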
Sending Email
Curl can send email over SMTP using -T, --upload-file. The file needs to contain all the mail headers and be formatted correctly.
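A sketch of sending a prepared mail file (server, addresses, and file name are placeholders):
curl smtp://mail.example.com/ --mail-from alice@example.com --mail-rcpt bob@example.com -T mail.txt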
MQTT - IoT Messaging Standard
MQTT is an OASIS standard messaging protocol for the Internet of Things (IoT) using a publish/subscribe messaging transport.
Subscribe
Subscribe to the bedroom temperature topic
-d, --data - Set the kitchen dimmer (publish)
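For example (broker host and topic names are placeholders):
curl mqtt://broker.example.com/home/bedroom/temp
curl -d 75 mqtt://broker.example.com/home/kitchen/dimmer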
TFTP
Trivial File Transfer Protocol (TFTP) is a simple protocol that provides basic file transfer function with no user authentication.
Download
Upload
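Examples in the same order as the captions above (placeholder server and file names):
curl -O tftp://192.168.0.10/boot.img
curl -T boot.img tftp://192.168.0.10/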
Telnet
Teletype Network (Telnet) - Telnet is an early client/server application protocol used to access remote system terminals over networks, transmitting data including usernames and passwords in plaintext, making it unsuitable for secure applications. Its popularity has diminished in favor of more secure protocols like SSH, with some proposed extensions to add encryption to Telnet.
Session
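A sketch: feeding a raw HTTP request over a telnet connection to a placeholder host (curl sends what it reads on stdin):
printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n' | curl telnet://example.com:80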
Dictionary
Provides lookups (downloads)
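For example, looking up a word on the public dict.org server:
curl dict://dict.org/d:curl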
WebSocket
Still experimental
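A minimal sketch, assuming a curl built with WebSocket support and a placeholder local endpoint:
curl ws://localhost:8080/chat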
h2c - Headers to cURL Command Line
h2c allows you to convert a series of headers into a curl command.
--libcurl
The --libcurl option allows you to generate libcurl source code from the command line.
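For example, saving the equivalent libcurl C code to a file (file names are arbitrary):
curl https://example.com/ -o saved.html --libcurl code.c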
TLS Versions
Curl uses the latest TLS versions and ciphers automatically
-1, --tlsv1, --tlsv1.0
Use TLSv1.0 or greater
--tlsv1.1
Use TLSv1.1 or greater
--tlsv1.2
Use TLSv1.2 or greater
--tlsv1.3
Use TLSv1.3 or greater
--tls-max
Set maximum allowed TLS version
--sslv2
Does not work; SSLv2 is disabled for security reasons.
--sslv3
Does not work; SSLv3 is disabled for security reasons.
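For example, pinning the connection to exactly TLS 1.2 (placeholder URL):
curl --tlsv1.2 --tls-max 1.2 https://example.com/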
TLS Certificates
SSL/TLS certificates allow web browsers to identify and establish encrypted network connections to web sites using the SSL/TLS protocol.
--cacert
CA certificate to verify peer against
--capath
CA directory to verify peer against
-k, --insecure
Allow insecure server connections when using SSL, don't use insecure!
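For example, verifying the peer against a specific CA bundle (file name is a placeholder):
curl --cacert my-ca.pem https://example.com/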
Online Certificate Status Protocol (OCSP) Stapling
Online Certificate Status Protocol (OCSP) is an internet protocol described in RFC2560 that checks the validity status of a certificate in real-time. It is an alternative to Certificate Revocation Lists (CRL) described in RFC5280.
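Curl can request OCSP stapling and verify the stapled response with --cert-status (support depends on the TLS backend):
curl --cert-status https://example.com/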
Client Certificates
Client Certificates are digital certificates for users and individuals to prove their identity to a server. Client certificates tend to be used within private organizations to authenticate requests to remote servers.
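For example, presenting a client certificate and key (file names are placeholders):
curl --cert client.pem --key client.key https://example.com/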
Ciphers
Cipher suites are a combination of ciphers used to negotiate security settings during the SSL/TLS handshake.
--ciphers
List of SSL ciphers to use
--tls13-ciphers
List of TLS 1.3 ciphers to use
--proxy-ciphers
List of SSL ciphers to use for proxy
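A sketch; valid cipher names depend on the TLS backend curl was built with:
curl --ciphers ECDHE-RSA-AES256-GCM-SHA384 https://example.com/
curl --tls13-ciphers TLS_AES_256_GCM_SHA384 https://example.com/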
TLS Backends
When curl is built, it gets told to use a specific TLS library (backend). That TLS library (backend) is the engine that provides curl with the powers to speak TLS over the wire.
SSLKEYLOGFILE
The curl command-line tool can push secrets to SSLKEYLOGFILE which can be read by Wireshark to decrypt traffic.
Set the SSLKEYLOGFILE environment variable pointing to a file.
Tell Wireshark to read from the SSLKEYLOGFILE.
Run curl
Read decrypted traffic in Wireshark
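A sketch of the workflow (the file path is arbitrary; the Wireshark setting lives under the TLS protocol preferences):
export SSLKEYLOGFILE=$HOME/tls-keys.txt
curl https://example.com/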
Proxy
A proxy server is a system or router that provides a gateway between users and the internet.
-x, --proxy
Use this proxy
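For example (placeholder proxies):
curl -x http://proxy.example.com:8080 https://example.com/
curl -x socks5://proxy.example.com:1080 https://example.com/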
Proxy Authentication
The authentication method that should be used to gain access to a resource behind a proxy server.
-U, --proxy-user
Proxy user and password
--socks5-basic
Enable username/password auth for SOCKS5 proxies
--socks5-gssapi
Enable GSS-API auth for SOCKS5 proxies
--proxy-basic
Use Basic authentication on the proxy
--proxy-digest
Use Digest authentication on the proxy
--proxy-negotiate
Use HTTP Negotiate (SPNEGO) authentication on the proxy
--proxy-ntlm
Use NTLM authentication on the proxy
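For example, authenticating to a placeholder proxy with Basic:
curl -x http://proxy.example.com:8080 -U alice:secret --proxy-basic https://example.com/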
Proxy Environment Variables
Users may set proxy environment variables with [scheme]_proxy syntax. The curl command-line tool will check for their existence.
1. Set the environment variable
2. Make the curl request
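For example (placeholder proxy):
export https_proxy=http://proxy.example.com:8080
curl https://example.com/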
Proxy Headers
Use --proxy-header to set a header specifically for the proxy and NOT the remote server
--proxy-header
Pass custom header(s) to proxy
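For example, with a hypothetical header name and a placeholder proxy:
curl -x http://proxy.example.com:8080 --proxy-header "X-Proxy-Trace: 1" https://example.com/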
HTTP
Hypertext Transfer Protocol (HTTP) is an application-layer protocol that serves as the foundational protocol for retrieving web resources, including HTML documents, and serves as the backbone for all data exchanges on the Internet. It operates as a client-server protocol, with requests typically initiated by the recipient, often a web browser.
HTTP Methods
HTTP request methods are commands used by a client to indicate the desired action for a resource on a web server, such as GET for retrieving data or POST for submitting data.
HTTP Headers in Terminal
A HEAD request is a type of HTTP request method used to retrieve only the headers of a resource without actually fetching the resource's body content.
-I, --head
Show document info only
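For example:
curl -I https://example.com/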
HTTP Response Code
HTTP response status codes indicate whether a specific HTTP request has been successfully completed.
-f, --fail
Fail silently (no output at all) on HTTP errors
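For example; on an HTTP error curl prints nothing and exits with code 22 (placeholder URL):
curl -f https://example.com/does-not-exist
echo $?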
HTTP Response Headers
A response header is an HTTP header that can be used in an HTTP response and that doesn't relate to the content of the message.
-v, --verbose
Make the operation more talkative
-I, --head
Show document info only
-D, --dump-header
Write the received headers to file
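For example, saving the headers and body to separate files (file names are arbitrary):
curl -D headers.txt -o body.html https://example.com/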
HTTP Response Bodies
HTTP Message Body is the data bytes transmitted in an HTTP transaction message immediately following the headers
-o, --output
Write to file instead of stdout
-O, --remote-name
Write output to a file named as the remote file
--compressed
Request compressed response, will uncompress automatically
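For example:
curl --compressed -o page.html https://example.com/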
HTTP Authentication
HTTP Authentication is defined in RFC7235 which can be used by a server to challenge a client to authenticate.
401 Unauthorized
The HyperText Transfer Protocol (HTTP) 401 Unauthorized response status code indicates that the client request is unauthorized and must authenticate.
407 Proxy Authentication Required
The HTTP 407 Proxy Authentication Required client error status response code indicates the client request is unauthorized and must authenticate to a proxy between it and the server.
-u, --user
Server user and password
--anyauth
Pick any authentication method; curl uses the most secure method the remote server supports.
--basic
Use HTTP Basic Authentication, explicitly request this method.
--digest
Use HTTP Digest Authentication
--ntlm
Use HTTP NTLM authentication
--ntlm-wb
Use HTTP NTLM authentication with winbind
--negotiate
Use HTTP Negotiate (SPNEGO) authentication
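Examples with placeholder credentials and URL:
curl -u alice:secret https://example.com/protected
curl -u alice:secret --digest https://example.com/protected
curl -u alice:secret --anyauth https://example.com/protected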
HTTP Ranges
An HTTP range request asks the server to send back only a portion of the resource.
-r, --range
Retrieve only the bytes within RANGE
-C, --continue-at
Resumed transfer offset
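For example, fetching the first 1000 bytes and then resuming an interrupted download (placeholder URL):
curl -r 0-999 -o first.part https://example.com/file.bin
curl -C - -O https://example.com/file.bin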
HTTP Versions
HTTP (HyperText Transfer Protocol) has undergone many changes in its evolution from a laboratory environment in 1989 to today’s modern world wide web.
--http0.9
Allow HTTP 0.9 responses
--http1.0
Use HTTP 1.0
--http1.1
Use HTTP 1.1
--http2
Use HTTP 2
--http3 (Experimental)
Use HTTP 3
--http2-prior-knowledge
Use HTTP 2 without HTTP/1.1 Upgrade
What HTTP version does the server support?
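A sketch using a write-out variable to report the negotiated version (placeholder URL):
curl -sI -o /dev/null -w '%{http_version}\n' https://example.com/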
HTTP Time Based Conditions
Transfer resource only if… Newer
Transfer resource only if… Older
Transfer resource only if… newer than local file
-R, --remote-time
Set the remote file's time on the local output
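Examples in the same order as above, using the -z, --time-cond option (dates, files, and URLs are placeholders; a leading '-' means older than):
curl -z "Jan 1 2024" -O https://example.com/file.txt
curl -z "-Jan 1 2024" -O https://example.com/file.txt
curl -z local.html -o local.html https://example.com/
curl -R -O https://example.com/file.txt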
HTTP Entity Tag (ETags)
The Entity Tag (ETag) is an identifier for a specific version of a resource.
--etag-save
This option saves an HTTP ETag to the specified file.
--etag-compare
This option makes a conditional HTTP request for the specific ETag read from the given file by sending a custom If-None-Match header using the stored ETag.
Transfer resource only if … resource is different
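A sketch: save an ETag on the first fetch and reuse it later (file names are placeholders):
curl --etag-save etag.txt -o page.html https://example.com/
curl --etag-compare etag.txt --etag-save etag.txt -o page.html https://example.com/
HTTP POST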
--data-raw
HTTP POST data, '@' allowed
--data-binary
HTTP POST binary data
HTTP POST: content-type
The Content-Type header is used to tell the server the media (MIME) type of the resource the client is sending.
-d, --data defaults to Content-Type: application/x-www-form-urlencoded
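Examples with placeholder endpoints and file names:
curl -d "name=admin&shoesize=12" https://example.com/form
curl --data-binary @payload.bin https://example.com/upload
curl -d '{"name": "admin"}' -H "Content-Type: application/json" https://example.com/api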
HTTP POST - JSON
--json
Sends a JSON object to a server
Send a JSON file to a server
Send JSON from STDIN
Create JSON easily
Receive/parse JSON easily
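Examples with a placeholder API endpoint; --json needs a recent curl, and the last line assumes the jo and jq helper tools are installed:
curl --json '{"tool": "curl"}' https://example.com/api
curl --json @data.json https://example.com/api
echo '{"tool": "curl"}' | curl --json @- https://example.com/api
jo name=curl happy=true | curl --json @- https://example.com/api | jq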
HTTP POST - URL Encoding
URL encoding is a method to encode data into the limited set of ASCII characters that conforms to the URI standard.
--data-urlencode
HTTP POST data url encoded
--data-urlencode [content] where content is…
anything - URL encode the content
=anything - URL encode the content (leave out the ‘=’)
any=thing - Send as “any=[URL encoded thing]”
@anything - Read content from file, URL encode and use
any@thing - Send as “any=[URL encoded file contents]”
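For example (placeholder endpoint and file name):
curl --data-urlencode "comment=hello world & more" https://example.com/post
curl --data-urlencode "text@story.txt" https://example.com/post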
HTTP POST - Convert to GET (query)
-G, --get
Put the post data in the URL and use GET
curl --get -d name=admin -d shoesize=12 https://example.com/
--url-query
This is the recommended way as it's more convenient & powerful!
curl --url-query name=mrsmith --url-query color=blue http://example.com
HTTP POST: Expect 100-continue
For HTTP/1.1 only: curl sends an Expect: 100-continue header when a POST or PUT is larger than 1MB. This avoids sending the request body if the server would refuse it; when the server responds with 100, curl continues the transmission.
Disable 100 Continue
This can waste time, as servers often ignore it, so you can disable it with curl
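For example, disabling the Expect: 100-continue handshake by sending an empty header (placeholder endpoint and file):
curl -H "Expect:" -d @large-payload.json https://example.com/upload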
HTTP POST - Chunked
For HTTP/1.1 only, sends data without specifying transmission size at start.
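A sketch, forcing a chunked upload over HTTP/1.1 (placeholder endpoint and file):
curl -H "Transfer-Encoding: chunked" -T localfile.txt https://example.com/upload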
HTTP POST - <form>
Enter <form> fields with curl using a POST; use the action= attribute to see where to send the POST request. You can also use the 'copy as cURL' option in the browser's developer tools.
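A sketch of posting two hypothetical form fields to the URL taken from a form's action attribute:
curl -d "name=Alice" -d "email=alice@example.com" https://example.com/submit.cgi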
HTTP multipart form-data
Use the Content-Type multipart/form-data media type, which uses a series of parts, each part having its own name, headers, file name, etc., separated by a boundary.
-F, --form
Specify multipart MIME data as plain text
Specify multipart MIME data as a file
Specify multipart MIME data as a file upload
Specify multipart MIME data as a file upload using a different file name
Specify multipart MIME data with custom content-type
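Examples matching the cases above (field names, files, and endpoint are placeholders):
curl -F name=Alice https://example.com/upload
curl -F "story=<story.txt" https://example.com/upload
curl -F "file=@photo.jpg" https://example.com/upload
curl -F "file=@photo.jpg;filename=vacation.jpg" https://example.com/upload
curl -F "file=@data.bin;type=application/octet-stream" https://example.com/upload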
HTTP Redirects
A URL redirect is a technique in which the server redirects a client to another page using a 3xx status code.
-L, --location
Follow redirects
--max-redirs
Maximum number of redirects allowed
--location-trusted
Like --location but also send auth credentials to following hosts
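For example (placeholder URL):
curl -L --max-redirs 5 https://example.com/old-page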
HTTP Request Modification
-X, --request
Specify request command to use
-H, --header
Pass custom header(s) to server
Add an HTTP header
Change an HTTP header
Remove an HTTP header
Blank HTTP header
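Examples for the four cases above (header names are placeholders); a header ending in ':' removes it, and ';' sends it with a blank value:
curl -H "X-Custom: 1" https://example.com/
curl -H "User-Agent: myclient/1.0" https://example.com/
curl -H "User-Agent:" https://example.com/
curl -H "X-Empty;" https://example.com/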
--request-target
Specify the target for this request
--user-agent
Send User-Agent <name> to server
-e, --referer
Referrer URL
HTTP PUT
The HTTP PUT method creates or replaces a resource on a target.
-T, --upload-file
Transfer local FILE to destination and/or replace
Transfer from STDIN to destination and/or replace
Transfer local file to destination and place as local filename
Use globbing techniques to PUT multiple files
Living on the rebellious side with -X and -d
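Examples for the cases above (URLs and files are placeholders):
curl -T file.txt https://example.com/dir/
curl -T - https://example.com/resource < file.txt
curl -T "{one.txt,two.txt}" https://example.com/dir/
curl -X PUT -d @file.json https://example.com/resource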
HTTP Cookies
An HTTP cookie (web cookie, browser cookie) is a small piece of data that a server sends to a user's web browser. The browser may store the cookie and send it back to the same server with later requests. Typically cookies are used to store sessions, personalize the browser experience, and track the user.
HTTP Cookies - Sending
-b, --cookie
Send cookies from string/file
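For example, sending a cookie from a string and from a previously saved file:
curl -b "session=abc123" https://example.com/
curl -b cookies.txt https://example.com/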
HTTP Cookies - Start Engine
By default curl ignores cookies; you need to enable the cookie engine first. Once enabled, cookies are stored in memory, cookies that expire are forgotten, and cookies are sent according to the rules.
Start cookies with a blank string or read to file.
Start cookies combined with redirect following.
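Examples for the two cases above; an empty string activates the cookie engine without reading any cookies:
curl -b "" https://example.com/
curl -b "" -L https://example.com/login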
HTTP Cookies - Cookie Jar
Since curl defaults to keeping cookies in memory, to save cookies we can use -c, --cookie-jar, which writes cookies to a file. The cookie jar is a readable text file, uses the Netscape cookie format, and includes sessions.
-c, --cookie-jar
Write cookies to <filename> after operation
Read from and write to the cookie jar (can be different files)
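For example, reading and writing the same jar across a run (file name is arbitrary):
curl -b cookies.txt -c cookies.txt https://example.com/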
HTTP cookies - Session
When interacting with sessions curl does not know when a session ends. You need to specify when a new session starts.
-j, --junk-session-cookies
Ignore session cookies read from file
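For example, starting a new session by discarding stored session cookies:
curl -b cookies.txt -j https://example.com/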
HTTP/2
By default curl attempts to negotiate HTTP/2 for all HTTPS transfers. You can ask curl to use HTTP/2 for plain HTTP transfers with the --http2 option. With HTTP/2, curl can do multiplexed transfers with -Z, --parallel.
-Z, --parallel
Perform transfers in parallel
--parallel-immediate
Do not wait for multiplexing (with --parallel)
--parallel-max
Maximum concurrency for parallel transfers
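For example, fetching two (placeholder) URLs in parallel over HTTP/2:
curl --http2 -Z -O https://example.com/one.jpg -O https://example.com/two.jpg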
HTTP/3
HTTP/3 is done over QUIC (Quick UDP Internet Connections), a new transport protocol.
In curl the HTTP protocol differences are hidden from the user; HTTP/3 is experimental in curl and only available for HTTPS with the --http3 option.
--http3 races HTTP/3 against HTTP/1 and HTTP/2 and picks a winner.
-Z, --parallel
Perform transfers in parallel
--parallel-immediate
Do not wait for multiplexing (with --parallel)
--parallel-max
Maximum concurrency for parallel transfers
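A sketch; this requires a curl built with HTTP/3 support:
curl --http3 https://example.com/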
HTTP alt-svc
The Alt-Svc header allows a server to indicate that another location in the network can be treated as the authority when making requests.
--alt-svc
Enable alt-svc with this cache file
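For example (cache file name is arbitrary):
curl --alt-svc altsvc-cache.txt https://example.com/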
HTTP - HTTP Strict Transport Security (HSTS)
HTTP Strict Transport Security (HSTS) is a standard to protect web users by ensuring that their browsers always connect to a website over HTTPS. Uses the Strict-Transport-Security response header. With curl the HSTS data can be saved and can be used later.
--hsts
Enable HSTS
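For example (cache file name is arbitrary):
curl --hsts hsts-cache.txt https://example.com/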
File Transfer Protocol (FTP)
File Transfer Protocol (FTP) is a way to download, upload, and transfer files from one network location to another. It was originally built on top of the Network Control Protocol (NCP) on ARPANET, a simplex protocol that used two port addresses, establishing two connections. The second connection is made either server-to-client (active) or client-to-server (passive).
--ftp-pasv
Use PASV/EPSV instead of PORT
-P, --ftp-port
Use PORT instead of PASV
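For example (placeholder server); -P - asks the server to connect back to the same address as the control connection:
curl --ftp-pasv -O ftp://ftp.example.com/file.txt
curl -P - -O ftp://ftp.example.com/file.txt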
FTP Authentication
By default curl uses the user 'anonymous' with the password 'ftp@example.com'
-u, --user
Server user and password
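For example (placeholder server and credentials):
curl -u alice:secret -O ftp://ftp.example.com/private/file.txt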
FTP Directory Listing
With FTP, curl can list the contents of a remote directory; use a trailing /
-l, --list-only
List only mode, show only file names
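For example; the trailing slash asks for a directory listing (placeholder server):
curl ftp://ftp.example.com/pub/
curl -l ftp://ftp.example.com/pub/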
FTP Upload
Like other protocols, use the -T, --upload-file option. Uploading normally requires -u, --user to be allowed.
-T, --upload-file
Transfer local FILE to destination with different name
Transfer local FILE to destination
-a, --append
Append to target file when uploading
--ftp-create-dirs
Create the remote dirs if not present
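Examples for the cases above (placeholder server, credentials, and files):
curl -u alice:secret -T local.txt ftp://ftp.example.com/upload/
curl -u alice:secret -T local.txt ftp://ftp.example.com/upload/renamed.txt
curl -u alice:secret -T local.txt -a ftp://ftp.example.com/upload/log.txt
curl -u alice:secret -T local.txt --ftp-create-dirs ftp://ftp.example.com/new/dir/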
FTPS - FTP-Secure or FTP Secure
FTPS (FTP-Secure or FTP Secure) is an extension of FTP with the addition of Transport Layer Security (TLS) formerly Secure Sockets Layer (SSL) for added data security.
--ssl-reqd
Require SSL/TLS
Use FTPS:// if using (rare) implicit TLS. However, encrypting the FTP control channel is more problematic for stateful firewalls to handle.
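For example, requiring TLS on a normal FTP URL, and using implicit TLS (placeholder server and credentials):
curl --ssl-reqd -u alice:secret -O ftp://ftp.example.com/file.txt
curl -u alice:secret -O ftps://ftp.example.com/file.txt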
Conclusion
Mastering the Curl Command Line (slides) with Daniel Stenberg (@badger) was a massive talk that dove into great detail on this amazing tool from the author himself! If you’d like to learn more about curl you can visit the official Everything curl guide as well as check out the official GitHub repo. While you’re at it, be sure to check out the curl releases webpage, which gives information such as curl version, date, bug fixes, changes, and vulnerabilities.
I hope you found this cheatsheet helpful and happy curling!