Complete Guide to Bash curl Command

Introduction to curl Command

curl (Client URL) is a powerful command-line tool for transferring data using various network protocols. It supports HTTP, HTTPS, FTP, FTPS, SCP, SFTP, TFTP, DICT, TELNET, LDAP, and many more. curl is essential for testing APIs, downloading files, debugging web services, and automating web interactions.

Key Concepts

  • URLs: Supports various protocols (http://, https://, ftp://, etc.)
  • Methods: GET, POST, PUT, DELETE, HEAD, OPTIONS
  • Headers: Custom HTTP headers for requests
  • Data: Send data with POST/PUT requests
  • Authentication: Basic, digest, OAuth, certificates
  • Cookies: Store and send cookies
  • Redirects: Follow HTTP redirects
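As a minimal sketch of how these pieces fit together, the snippet below builds one request that sets a method, headers, a body, and redirect handling. The endpoint and token are placeholders, and the assembled command is printed rather than executed so it can be inspected first:

```shell
#!/bin/bash
# Sketch: one request combining method, headers, data, and redirects.
# The URL and token below are placeholders, not a real API.
url="https://api.example.com/users"
token="token123"

# Keeping options in an array preserves each flag/value as one word.
args=(
  -s -L                                # silent, follow redirects
  -X POST                              # HTTP method
  -H "Authorization: Bearer $token"    # auth header
  -H "Content-Type: application/json"  # body type
  -d '{"name":"John"}'                 # request body
)

# Dry run: show the assembled command instead of sending it.
cmd="curl$(printf ' %q' "${args[@]}" "$url")"
echo "$cmd"
# The real call would be:  curl "${args[@]}" "$url"
```

Collecting options in an array (rather than a single string) keeps values with spaces intact when the command is finally run.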

1. Basic curl Syntax

Command Structure

# Basic syntax
curl [options] [URL...]
# Simple GET request
curl https://api.example.com/users
# Save output to file
curl -o output.html https://example.com
# Download with original filename
curl -O https://example.com/file.zip
# Silent mode (no progress meter)
curl -s https://api.example.com/data

Simple Examples

# Basic webpage retrieval
curl https://www.example.com
# Get HTTP headers only
curl -I https://www.example.com
# Follow redirects
curl -L https://example.com
# Verbose output (debugging)
curl -v https://api.example.com/users
# Limit download speed
curl --limit-rate 100K -O https://example.com/bigfile.zip

2. HTTP Methods

GET Requests

# Basic GET
curl https://api.example.com/users
# GET with query parameters
curl "https://api.example.com/users?page=2&limit=10"
# GET with encoded parameters
curl -G --data-urlencode "name=John Doe" --data-urlencode "age=30" https://api.example.com/search
# GET with custom headers
curl -H "Accept: application/json" https://api.example.com/users
# GET with authentication
curl -u username:password https://api.example.com/private

POST Requests

# POST with form data
curl -X POST -d "name=John&age=30" https://api.example.com/users
# POST with JSON data
curl -X POST \
-H "Content-Type: application/json" \
-d '{"name":"John", "age":30}' \
https://api.example.com/users
# POST with file upload
curl -F "file=@document.pdf" \
-F "description=My Document" \
https://api.example.com/upload
# POST with URL-encoded data
curl --data-urlencode "comment=Hello World!" \
https://api.example.com/comments
# POST multiple fields
curl -X POST \
-F "username=johndoe" \
-F "password=secret123" \
-F "email=john@example.com" \
https://api.example.com/register

PUT and PATCH

# PUT request (update entire resource)
curl -X PUT \
-H "Content-Type: application/json" \
-d '{"name":"John Updated", "age":31}' \
https://api.example.com/users/123
# PATCH request (partial update)
curl -X PATCH \
-H "Content-Type: application/json" \
-d '{"age":31}' \
https://api.example.com/users/123

DELETE and Other Methods

# DELETE request
curl -X DELETE https://api.example.com/users/123
# HEAD request (headers only)
curl -I https://api.example.com/users/123
# OPTIONS request (get allowed methods)
curl -X OPTIONS https://api.example.com/users
# TRACE request (debugging; often disabled by servers)
curl -X TRACE https://api.example.com

3. Headers and Authentication

Custom Headers

# Single header
curl -H "X-API-Key: 123456" https://api.example.com/data
# Multiple headers
curl -H "Accept: application/json" \
-H "Authorization: Bearer token123" \
-H "X-Custom-Header: custom-value" \
https://api.example.com/users
# User-Agent header
curl -A "Mozilla/5.0 (Linux; Android 10)" https://example.com
# Referer header
curl -e "https://google.com" https://example.com
# Conditional headers
curl -H "If-Modified-Since: Wed, 21 Oct 2023 07:28:00 GMT" \
https://example.com/file

Authentication Methods

# Basic authentication
curl -u username:password https://api.example.com/private
# Basic with base64 (manual)
curl -H "Authorization: Basic $(echo -n user:pass | base64)" \
https://api.example.com/private
# Bearer token (OAuth 2.0)
curl -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
https://api.example.com/secure
# Digest authentication
curl --digest -u username:password https://api.example.com/digest
# OAuth 2.0 with client credentials
curl -X POST \
-d "grant_type=client_credentials" \
-u "client_id:client_secret" \
https://auth.example.com/token
# AWS Signature v4
curl -X POST \
-H "Authorization: AWS4-HMAC-SHA256 ..." \
https://aws-api.example.com

4. Data Transfer

Downloading Files

# Download with original filename
curl -O https://example.com/file.zip
# Download with custom filename
curl -o custom_name.zip https://example.com/file.zip
# Download multiple files
curl -O https://example.com/file1.zip -O https://example.com/file2.zip
# Resume interrupted download
curl -C - -O https://example.com/largefile.zip
# Download with progress bar
curl -# -O https://example.com/largefile.zip
# Timeout for slow downloads
curl --connect-timeout 10 --max-time 30 -O https://example.com/file.zip
# Download only if newer
curl -z 2024-01-01 -O https://example.com/updated-file.zip

Uploading Files

# Upload file via POST
curl -F "file=@document.pdf" https://api.example.com/upload
# Upload with additional fields
curl -F "photo=@photo.jpg" \
-F "title=My Photo" \
-F "description=Vacation picture" \
https://api.example.com/photos
# Upload via PUT
curl -T "local-file.txt" https://api.example.com/remote-file.txt
# Upload multiple files
curl -F "files[]=@file1.txt" \
-F "files[]=@file2.txt" \
https://api.example.com/upload
# Stream upload from stdin
echo "data" | curl -T - https://api.example.com/stream

5. Cookies and Sessions

Cookie Handling

# Send cookies with request
curl -b "session=abc123; user=john" https://api.example.com/profile
# Save cookies to file
curl -c cookies.txt https://example.com/login
# Load cookies from file
curl -b cookies.txt https://example.com/dashboard
# Cookie jar (load and save)
curl -b cookies.txt -c cookies.txt https://example.com/session
# Session handling with cookies
# First login
curl -c cookies.txt -X POST \
-d "username=john&password=secret" \
https://api.example.com/login
# Use session for subsequent requests
curl -b cookies.txt https://api.example.com/private-data

Advanced Cookie Usage

# Discard session cookies when reading a cookie file
curl -j -b cookies.txt https://example.com
# Send cookies as name=value pairs (Domain/Path are Set-Cookie
# attributes; a client never sends them back)
curl -b "session=abc123" \
https://api.example.com
# Parse cookie from response
curl -i https://example.com/login | grep -i set-cookie
# Multiple cookie files
curl -b cookies1.txt -b cookies2.txt https://example.com

6. Output Handling

Output Control

# Save output to file
curl -o output.html https://example.com
# Silent mode (no output except errors)
curl -s https://api.example.com/users
# Silent but still report errors
curl -sS https://example.com/largefile.zip  # -S shows errors
# Output to stdout
curl https://example.com > output.html
# Discard output
curl -o /dev/null https://example.com/largefile.zip
# Show only headers
curl -I https://example.com
# Show response headers with body
curl -i https://api.example.com/users
# Show verbose output (debug)
curl -v https://api.example.com/users
# Show everything (including SSL)
curl --trace-ascii trace.txt https://example.com

Formatting Output

# Pretty print JSON with jq
curl -s https://api.example.com/users | jq '.'
# Format JSON with Python
curl -s https://api.example.com/users | python -m json.tool
# Extract specific data with grep
curl -s https://example.com | grep -o '<title>[^<]*'
# Save headers and body separately
curl -D headers.txt -o body.txt https://example.com
# Write output to multiple files
curl https://example.com | tee output.html
# Append to file
curl -s https://api.example.com/log >> api.log

7. SSL and Certificates

SSL/TLS Options

# Ignore SSL certificate errors (insecure)
curl -k https://self-signed.example.com
# Specify CA certificate
curl --cacert ca-bundle.crt https://example.com
# Specify client certificate
curl --cert client.pem --key key.pem https://example.com
# Certificate with password
curl --cert client.pem:password https://example.com
# Specify minimum TLS version (SSLv3 is obsolete and removed from modern curl)
curl --tlsv1.2 https://example.com
# Cap the maximum TLS version
curl --tls-max 1.2 https://example.com
# Check certificate details
curl -vI https://example.com 2>&1 | grep -i "certificate"

SSL Troubleshooting

# Show certificate chain
openssl s_client -connect example.com:443 -showcerts
# Test SSL connection
curl -vI https://example.com
# Disable SSL verification (testing only)
curl -k https://example.com
# Use specific cipher
curl --ciphers 'ECDHE-RSA-AES128-GCM-SHA256' https://example.com
# Minimum TLS protocol versions (curl negotiates that version or newer)
curl --tlsv1.0 https://example.com
curl --tlsv1.1 https://example.com
curl --tlsv1.2 https://example.com
curl --tlsv1.3 https://example.com
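The per-version flags above can be combined into a small probe that pins both the floor (`--tlsv1.x`) and the ceiling (`--tls-max`, curl 7.54+) of the negotiation on each attempt. The host name is a placeholder, and the CURL variable is overridable so the loop can be exercised without network access:

```shell
#!/bin/bash
# Sketch: report which TLS versions a server will negotiate.
# Assumes curl 7.54+ for --tls-max; the host is a placeholder.
CURL=${CURL:-curl}

probe_tls() {
    local host=$1 v
    for v in 1.0 1.1 1.2 1.3; do
        # Pin minimum and maximum to the same value so only that
        # single version can be negotiated on this attempt.
        if "$CURL" -s -o /dev/null --connect-timeout 5 \
                "--tlsv$v" --tls-max "$v" "https://$host" 2>/dev/null; then
            echo "TLS $v: handshake succeeded"
        else
            echo "TLS $v: rejected (or unsupported by local curl)"
        fi
    done
}

# Usage: probe_tls example.com
```

A failure here can mean the server rejected the version or the local curl/OpenSSL build cannot offer it, so treat "rejected" as a prompt for further checking, not proof.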

8. Proxy and Network Options

Proxy Configuration

# HTTP proxy
curl -x http://proxy.example.com:8080 https://example.com
# HTTPS proxy
curl -x https://proxy.example.com:8443 https://example.com
# Proxy with authentication
curl -x http://user:[email protected]:8080 https://example.com
# SOCKS proxy
curl --socks5 proxy.example.com:1080 https://example.com
# SOCKS with hostname resolution
curl --socks5-hostname proxy.example.com:1080 https://example.com
# Proxy from environment
export http_proxy="http://proxy.example.com:8080"
curl https://example.com
# Bypass proxy for specific domains
export no_proxy="localhost,127.0.0.1"

Network Settings

# Interface binding
curl --interface eth0 https://example.com
# Local port
curl --local-port 10000-20000 https://example.com
# DNS servers (requires a curl built with c-ares)
curl --dns-servers 8.8.8.8,8.8.4.4 https://example.com
# IPv4 only
curl -4 https://example.com
# IPv6 only
curl -6 https://example.com
# Connection timeout
curl --connect-timeout 5 https://example.com
# Total operation timeout
curl --max-time 30 https://example.com
# Abort if slower than 1000 bytes/sec for 10 seconds
curl --speed-time 10 --speed-limit 1000 https://example.com

9. Rate Limiting and Retries

Retry Options

# Automatic retry
curl --retry 3 https://example.com/unstable
# Retry with delay
curl --retry 5 --retry-delay 10 https://example.com
# Retry on all errors
curl --retry 3 --retry-all-errors https://example.com
# Maximum time for retry
curl --retry 3 --retry-max-time 60 https://example.com
# Also retry when the connection is refused
curl --retry 3 --retry-connrefused https://example.com

Rate Limiting

# Limit download speed
curl --limit-rate 200K https://example.com/largefile.zip
# Limit upload speed
curl --limit-rate 100K -T file.zip https://example.com/upload
# Sleep between requests
for i in {1..10}; do
curl https://api.example.com/data/$i
sleep 2
done
# Read the -w (write-out) format from stdin
curl -s -o /dev/null -w "@-" https://api.example.com << 'EOF'
status: %{http_code}, total: %{time_total}s
EOF

10. API Testing Examples

REST API Testing

#!/bin/bash
# API testing script
API_BASE="https://api.example.com/v1"
TOKEN="your-auth-token"
# Helper function
api_request() {
local method=$1
local endpoint=$2
local data=$3
curl -s -X "$method" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
${data:+-d "$data"} \
"$API_BASE/$endpoint"
}
# Test CRUD operations
echo "=== Testing API ==="
# Create user
echo "Creating user..."
response=$(api_request POST "users" '{"name":"John","email":"john@example.com"}')
user_id=$(echo "$response" | jq -r '.id')
echo "Created user with ID: $user_id"
# Get user
echo "Getting user..."
api_request GET "users/$user_id"
# Update user
echo "Updating user..."
api_request PUT "users/$user_id" '{"name":"John Updated"}'
# List users
echo "Listing users..."
api_request GET "users" | jq '.[:3]'
# Delete user
echo "Deleting user..."
api_request DELETE "users/$user_id"
# Response time measurement
echo "Response times:"
for i in {1..5}; do
curl -w "Request $i: %{time_total}s\n" -o /dev/null -s "$API_BASE/users"
done

GraphQL Queries

#!/bin/bash
# GraphQL query
GRAPHQL_ENDPOINT="https://api.example.com/graphql"
# Simple query
curl -X POST \
-H "Content-Type: application/json" \
-d '{"query": "{ users { id name email } }"}' \
"$GRAPHQL_ENDPOINT"
# Query with variables
curl -X POST \
-H "Content-Type: application/json" \
-d '{
"query": "query getUser($id: ID!) { user(id: $id) { name email posts { title } } }",
"variables": {"id": "123"}
}' \
"$GRAPHQL_ENDPOINT"
# Mutation
curl -X POST \
-H "Content-Type: application/json" \
-d '{
"query": "mutation createUser($input: UserInput!) { createUser(input: $input) { id name } }",
"variables": {"input": {"name": "John", "email": "john@example.com"}}
}' \
"$GRAPHQL_ENDPOINT"
# Batch queries
curl -X POST \
-H "Content-Type: application/json" \
-d '[
{"query": "{ users { name } }"},
{"query": "{ posts { title } }"}
]' \
"$GRAPHQL_ENDPOINT"

11. File Upload Scenarios

Multi-part Form Upload

#!/bin/bash
# File upload with metadata
upload_file() {
local file=$1
local title=$2
local description=$3
curl -X POST \
-F "file=@$file" \
-F "title=$title" \
-F "description=$description" \
-F "tags[]=important" \
-F "tags[]=document" \
-H "X-API-Key: your-api-key" \
https://api.example.com/upload
}
# Upload with progress
upload_with_progress() {
local file=$1
curl -# \
-F "file=@$file" \
https://api.example.com/upload
}
# Resumable upload
resumable_upload() {
local file=$1
local upload_id=$2
local offset=$3
local size=$(stat -c%s "$file")
# Stream the remainder of the file starting at byte $offset
tail -c +$((offset + 1)) "$file" | \
curl -X PATCH \
-H "Upload-ID: $upload_id" \
-H "Content-Range: bytes $offset-$((size - 1))/$size" \
--data-binary @- \
https://api.example.com/upload
}
# Upload multiple files
upload_multiple() {
local dir=$1
for file in "$dir"/*; do
echo "Uploading $file..."
curl -F "file=@$file" https://api.example.com/upload
sleep 1
done
}
# Usage
# upload_file "document.pdf" "My Doc" "Important document"
# upload_with_progress "largefile.zip"

FTP Upload/Download

# FTP upload
curl -T file.txt ftp://ftp.example.com/upload/ --user username:password
# FTP download
curl -O ftp://ftp.example.com/file.txt --user username:password
# FTP over explicit TLS (require TLS upgrade on a plain ftp:// URL)
curl --ssl-reqd -T file.txt ftp://ftp.example.com/upload/ --user username:password
# List FTP directory
curl ftp://ftp.example.com/ --user username:password
# Download a range of files (curl globbing; shell-style * wildcards are not supported)
curl -O "ftp://ftp.example.com/file[1-5].txt" --user username:password
# FTP with resume
curl -C - -O ftp://ftp.example.com/largefile.zip --user username:password

12. Advanced Features

Parallel Downloads

#!/bin/bash
# Download multiple files in parallel
download_parallel() {
local urls=("$@")
local pids=()
for url in "${urls[@]}"; do
curl -O "$url" &
pids+=($!)
done
# Wait for all downloads
for pid in "${pids[@]}"; do
wait "$pid"
done
echo "All downloads complete"
}
# Download with xargs
echo "https://example.com/file1.zip
https://example.com/file2.zip
https://example.com/file3.zip" | xargs -n 1 -P 3 curl -O
# Using GNU parallel
parallel curl -O ::: https://example.com/file{1..10}.zip

URL Manipulation

# URL encode
curl -G -v --data-urlencode "query=hello world" https://example.com/search
# URL decode (curl has no decoder; a printf trick works in bash)
encoded="hello%20world"; printf '%b\n' "${encoded//%/\\x}"
# Follow redirects
curl -L https://bit.ly/shortlink
# Maximum redirects
curl --max-redirs 5 -L https://example.com
# Handle redirects with authentication
curl -L --user user:pass https://example.com

WebSockets (with additional tools)

# WebSocket connection (requires websocat)
websocat wss://echo.websocket.org
# WebSocket with curl-like interface
curl -i -N -H "Connection: Upgrade" \
-H "Upgrade: websocket" \
-H "Sec-WebSocket-Key: $(openssl rand -base64 16)" \
-H "Sec-WebSocket-Version: 13" \
https://websocket.example.com

13. Debugging and Troubleshooting

Verbose Output

# Basic verbose
curl -v https://example.com
# Very verbose
curl --trace-ascii - https://example.com
# Trace with timestamps
curl --trace-time -v https://example.com
# Write trace to file
curl --trace trace.txt https://example.com
# Show only response headers
curl -I https://example.com
# Show request and response headers
curl -i https://example.com
# Debug SSL connection
curl -vI https://example.com 2>&1 | grep -i "ssl\|tls"

Response Analysis

# Show response headers only
curl -s -o /dev/null -D - https://example.com
# Extract status code
curl -s -o /dev/null -w "%{http_code}" https://example.com
# Show timing information
curl -w "\
DNS: %{time_namelookup}s\n\
Connect: %{time_connect}s\n\
SSL: %{time_appconnect}s\n\
Pre-transfer: %{time_pretransfer}s\n\
Start transfer: %{time_starttransfer}s\n\
Total: %{time_total}s\n\
Speed: %{speed_download} B/s\n" \
-o /dev/null -s https://example.com
# Custom output format
curl -w "HTTP %{http_code} | %{size_download} bytes | %{time_total}s\n" \
-o /dev/null -s https://example.com
# Save headers and body
curl -D headers.txt -o body.txt https://example.com

Common Error Handling

#!/bin/bash
# Check HTTP status
check_status() {
local url=$1
local status=$(curl -s -o /dev/null -w "%{http_code}" "$url")
case $status in
200) echo "OK" ;;
301|302) echo "Redirect" ;;
401) echo "Unauthorized" ;;
403) echo "Forbidden" ;;
404) echo "Not Found" ;;
500) echo "Server Error" ;;
*) echo "Unknown status: $status" ;;
esac
}
# Retry on failure
retry_curl() {
local max_attempts=5
local attempt=1
local delay=5
while [ $attempt -le $max_attempts ]; do
echo "Attempt $attempt of $max_attempts"
if curl -f "$@"; then
return 0
fi
echo "Failed, waiting $delay seconds..."
sleep $delay
attempt=$((attempt + 1))
delay=$((delay * 2))
done
return 1
}
# Check SSL certificate expiration
check_cert_expiry() {
local host=$1
local port=${2:-443}
echo | openssl s_client -connect "$host:$port" 2>/dev/null | \
openssl x509 -noout -dates
}
# Usage
check_status "https://example.com"
retry_curl -O https://unstable.example.com/file.zip
check_cert_expiry "example.com"

14. Script Examples

Website Monitoring

#!/bin/bash
# Website monitoring script
MONITOR_URL="https://example.com"
ALERT_EMAIL="admin@example.com"
LOG_FILE="/var/log/website_monitor.log"
check_website() {
local url=$1
local timestamp=$(date '+%Y-%m-%d %H:%M:%S')
# Check HTTP status
http_code=$(curl -s -o /dev/null -w "%{http_code}" "$url")
# Measure response time
response_time=$(curl -s -o /dev/null -w "%{time_total}" "$url")
# Check content
content=$(curl -s "$url" | grep -c "Expected Content")
# Log result
echo "$timestamp | Status: $http_code | Time: ${response_time}s | Content: $content" >> "$LOG_FILE"
# Alert if issues
if [ "$http_code" -ne 200 ]; then
echo "Website $url returned HTTP $http_code at $timestamp" | \
mail -s "Website Alert" "$ALERT_EMAIL"
fi
if (( $(echo "$response_time > 5" | bc -l) )); then
echo "Website $url is slow (${response_time}s) at $timestamp" | \
mail -s "Performance Alert" "$ALERT_EMAIL"
fi
}
# Monitor multiple websites
WEBSITES=(
"https://example.com"
"https://api.example.com"
"https://blog.example.com"
)
while true; do
for site in "${WEBSITES[@]}"; do
check_website "$site"
done
sleep 300  # Check every 5 minutes
done

API Load Tester

#!/bin/bash
# Simple load testing script
LOAD_TEST_URL="https://api.example.com/test"
CONCURRENT=10
REQUESTS=100
load_test() {
local url=$1
local concurrent=$2
local total=$3
echo "Load testing $url"
echo "Concurrent: $concurrent, Total requests: $total"
echo "========================================"
# Create temporary files
temp_dir=$(mktemp -d)
# Run concurrent requests
for i in $(seq 1 $total); do
(
start=$(date +%s%N)
curl -s -o /dev/null -w "%{http_code}" "$url" > "$temp_dir/status_$i"
end=$(date +%s%N)
duration=$(( ($end - $start) / 1000000 ))  # ms
echo "$duration" > "$temp_dir/time_$i"
) &
# Throttle concurrent requests
if [ $((i % concurrent)) -eq 0 ]; then
wait
fi
done
wait
# Analyze results
echo "========================================"
echo "Results:"
# Success rate
success=0
for f in "$temp_dir"/status_*; do
if [ "$(cat "$f")" = "200" ]; then
success=$((success + 1))
fi
done
echo "Success rate: $((success * 100 / total))%"
# Response times
times=()
for f in "$temp_dir"/time_*; do
times+=($(cat "$f"))
done
# Sort times
sorted=($(printf '%s\n' "${times[@]}" | sort -n))
# Calculate percentiles
count=${#sorted[@]}
p50=${sorted[$((count * 50 / 100))]}
p90=${sorted[$((count * 90 / 100))]}
p95=${sorted[$((count * 95 / 100))]}
p99=${sorted[$((count * 99 / 100))]}
echo "Response times (ms):"
echo "  Min: ${sorted[0]}"
echo "  Max: ${sorted[-1]}"
echo "  Avg: $(printf '%s\n' "${times[@]}" | awk '{sum+=$1} END {print sum/NR}')"
echo "  P50: $p50"
echo "  P90: $p90"
echo "  P95: $p95"
echo "  P99: $p99"
# Cleanup
rm -rf "$temp_dir"
}
# Usage
load_test "$LOAD_TEST_URL" "$CONCURRENT" "$REQUESTS"

Webhook Tester

#!/bin/bash
# Webhook testing script
WEBHOOK_URL="https://hooks.example.com/webhook"
send_webhook() {
local event=$1
local payload=$2
curl -X POST \
-H "Content-Type: application/json" \
-H "X-Event-Type: $event" \
-H "X-Signature: $(echo -n "$payload" | openssl dgst -sha256 -hmac "secret" | awk '{print $2}')" \
-d "$payload" \
"$WEBHOOK_URL"
}
# Test different webhook events
events=(
'{"type":"user.created","user":{"id":123,"name":"John"}}'
'{"type":"order.placed","order":{"id":456,"amount":99.99}}'
'{"type":"payment.received","payment":{"id":789,"amount":99.99}}'
)
for event in "${events[@]}"; do
type=$(echo "$event" | jq -r '.type')
echo "Sending $type webhook..."
send_webhook "$type" "$event"
echo
sleep 1
done

15. Integration with Other Tools

With jq for JSON Processing

#!/bin/bash
# Process JSON responses
API="https://api.example.com/users"
# Get specific fields
curl -s "$API" | jq '.[].name'
# Filter data
curl -s "$API" | jq '.[] | select(.age > 25)'
# Transform data
curl -s "$API" | jq '[.[] | {name: .name, email: .email}]'
# Count items
curl -s "$API" | jq length
# Create CSV
curl -s "$API" | jq -r '.[] | [.name, .email, .age] | @csv'
# Pretty print with sorting
curl -s "$API" | jq 'sort_by(.name)'

With sed/awk

# Extract specific data
curl -s https://example.com | sed -n '/<title>/,/<\/title>/p'
# Parse HTML table
curl -s https://example.com | grep -o '<td>[^<]*</td>' | sed 's/<[^>]*>//g'
# Process CSV response
curl -s "https://api.example.com/export.csv" | awk -F',' '{print $1","$3}'
# Clean HTML
curl -s https://example.com | sed 's/<[^>]*>//g' | tr -s ' '

With Python

# Process with Python one-liner
curl -s https://api.example.com/users | python -c "
import json, sys
data = json.load(sys.stdin)
for user in data:
    print(f\"{user['name']}: {user['email']}\")
"
# Advanced processing with Python script
curl -s https://api.example.com/data | python3 -c '
import json
import sys
from collections import Counter
data = json.load(sys.stdin)
categories = Counter(item["category"] for item in data)
for cat, count in categories.most_common():
    print(f"{cat}: {count}")
'

With mail

# Send email with curl output
curl -s https://api.example.com/report | mail -s "Daily Report" admin@example.com
# Email with attachment
curl -o report.pdf https://example.com/report.pdf
echo "Report attached" | mail -s "Monthly Report" -A report.pdf admin@example.com
# HTML email
curl -s https://api.example.com/dashboard | mail -a "Content-Type: text/html" -s "Dashboard" admin@example.com

16. Security Best Practices

Secure Credential Handling

#!/bin/bash
# Don't hardcode credentials
# Bad
curl -u username:password https://api.example.com
# Good - read from environment
curl -u "${API_USER}:${API_PASS}" https://api.example.com
# Good - read from file
read -r API_KEY < ~/.secure/api_key.txt
curl -H "X-API-Key: $API_KEY" https://api.example.com
# Use credential helpers
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://googleapis.com
# Encrypted credentials
API_KEY=$(gpg -d ~/.secure/api_key.gpg 2>/dev/null)
curl -H "X-API-Key: $API_KEY" https://api.example.com

SSL/TLS Best Practices

# Always verify SSL certificates
curl https://example.com  # Good
curl -k https://example.com  # Bad (only for testing)
# Use strong TLS versions
curl --tlsv1.2 https://example.com
# Specify CA bundle
curl --cacert /etc/ssl/certs/ca-certificates.crt https://example.com
# Check certificate pinning
curl --pinnedpubkey "sha256//..." https://example.com
# Certificate validation
curl --cert-status https://example.com

Safe Scripting

#!/bin/bash
# Safe scripting practices
# Validate URLs
validate_url() {
local url=$1
if [[ ! "$url" =~ ^https?:// ]]; then
echo "Invalid URL" >&2
return 1
fi
}
# Validate first, then end option parsing with -- so a hostile
# URL value can never be interpreted as an extra curl option
safe_curl() {
validate_url "$1" || return 1
curl -- "$1"
}
# Error handling
if ! curl -f "$url" -o "$output" 2>/dev/null; then
echo "Download failed" >&2
exit 1
fi
# Rate limiting
for i in {1..10}; do
curl "https://api.example.com/item/$i"
sleep 1  # Be nice to the API
done
# User agent identification
curl -A "MyApp/1.0 (contact@example.com)" https://api.example.com

Conclusion

curl is an incredibly versatile tool for transferring data across networks:

Key Takeaways

  1. Protocol Support: HTTP, HTTPS, FTP, and many more
  2. HTTP Methods: GET, POST, PUT, DELETE, PATCH
  3. Headers: Custom headers for authentication and content type
  4. Data Transfer: Upload and download files
  5. Cookies: Session management
  6. Authentication: Basic, digest, bearer tokens
  7. SSL/TLS: Secure connections with certificate validation
  8. Proxy Support: HTTP, HTTPS, SOCKS proxies

Common Use Cases

Task                   Command
Download file          curl -O https://example.com/file.zip
API GET request        curl https://api.example.com/users
API POST JSON          curl -X POST -H "Content-Type: application/json" -d '{"name":"John"}' https://api.example.com/users
Check response time    curl -w "%{time_total}\n" -o /dev/null -s https://example.com
Test authentication    curl -u user:pass https://api.example.com/private
Follow redirects       curl -L https://bit.ly/shortlink
Upload file            curl -F "file=@document.pdf" https://api.example.com/upload

Best Practices

  1. Use -f to fail on HTTP errors
  2. Always verify SSL certificates in production
  3. Don't hardcode credentials - use environment variables or files
  4. Handle rate limits with appropriate delays
  5. Check HTTP status codes for error handling
  6. Use -L to follow redirects when needed
  7. Set timeouts for production scripts
  8. Specify user agent to identify your application
  9. Use -s for silent mode in scripts
  10. Validate URLs before making requests
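As a sketch, most of these practices collapse into one reusable wrapper. The user agent and URL below are placeholders, and the CURL variable can be overridden (e.g. CURL=echo) to dry-run the wrapper without touching the network:

```shell
#!/bin/bash
# Sketch of a hardened fetch applying the practices above.
CURL=${CURL:-curl}   # override with CURL=echo to dry-run

safe_fetch() {
    local url=$1
    # Practice 10: validate before making the request.
    if [[ ! "$url" =~ ^https?:// ]]; then
        echo "Invalid URL: $url" >&2
        return 1
    fi
    # -f fail on HTTP errors, -sS silent but report errors, -L follow
    # redirects; timeouts and retries keep scripts from hanging.
    "$CURL" -fsSL \
        --connect-timeout 5 --max-time 30 \
        --retry 3 --retry-delay 2 \
        -A "MyApp/1.0 (contact@example.com)" \
        -- "$url"
}

# Usage: safe_fetch "https://api.example.com/users"
```

The trailing `--` stops option parsing, so a URL value that happens to start with a dash cannot inject extra curl options.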

curl is the Swiss Army knife of internet data transfer. Master it for effective API testing, web scraping, and automation!
