# X-Cache Header
**TL;DR:** Non-standard header indicating cache status (HIT, MISS, BYPASS). Used by CDNs and proxies to show whether content was served from cache or the origin.
## What is X-Cache?
The X-Cache header is a non-standard header commonly used by CDNs (Content Delivery Networks) and caching proxies to indicate whether a response was served from cache (HIT) or fetched from the origin server (MISS). It helps developers understand cache behavior and troubleshoot caching issues.
While not part of the HTTP specification, it’s widely adopted by CDN providers like Cloudflare, Fastly, Akamai, and others.
## How X-Cache Works

**Cache hit (served from CDN cache):**

```http
HTTP/1.1 200 OK
Content-Type: text/html
X-Cache: HIT
X-Cache-Hits: 42
Age: 3600
<!DOCTYPE html>
<html>...
```
**Cache miss (fetched from origin):**
```http
HTTP/1.1 200 OK
Content-Type: text/html
X-Cache: MISS
X-Cache-Hits: 0
Age: 0
<!DOCTYPE html>
<html>...
```

## Common Values

### Basic Status

```http
X-Cache: HIT # Served from cache
X-Cache: MISS # Fetched from origin
X-Cache: BYPASS # Bypassed cache (uncacheable)
X-Cache: EXPIRED # Cached but expired, revalidated
X-Cache: STALE # Served stale content
X-Cache: UPDATING # Being updated in background
X-Cache: REVALIDATED # Cache validated with origin
```
### With Location Information
```http
X-Cache: HIT from cloudflare
X-Cache: MISS from edge-server-01
X-Cache: HIT, HIT from cloudflare
```

### Combined Status

```http
# Hit at edge, miss at origin shield
X-Cache: HIT, MISS
```
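Combined values like these can be split programmatically when analyzing responses. A minimal sketch (the function name `parseXCache` is illustrative, not part of any library):

```javascript
// Parse an X-Cache value like "HIT, MISS from edge-server-01" into
// one entry per cache tier, separating the status from the location.
function parseXCache(value) {
  return value.split(',').map((part) => {
    const [status, ...rest] = part.trim().split(/\s+from\s+/)
    return { status: status.trim(), location: rest[0] || null }
  })
}

console.log(parseXCache('HIT, MISS from edge-server-01'))
// First tier: HIT with no location; second tier: MISS from edge-server-01
```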
## Real-World Examples
### Cloudflare
```http
HTTP/1.1 200 OK
CF-Cache-Status: HIT
CF-Ray: 7a1b2c3d4e5f6g7h-SJC
Cache-Control: public, max-age=3600
Age: 1200
# Note: Cloudflare uses CF-Cache-Status instead of X-Cache
```

### Fastly

```http
HTTP/1.1 200 OK
X-Cache: HIT
X-Cache-Hits: 156
X-Served-By: cache-lax-klax8100056-LAX
Cache-Control: public, max-age=86400
Age: 12340
```
### Akamai
```http
HTTP/1.1 200 OK
X-Cache: TCP_HIT from a104-110-79-18.deploy.akamaitechnologies.com
X-Cache-Remote: TCP_HIT from a104-110-79-18.deploy.akamaitechnologies.com
X-Check-Cacheable: YES
```

### Amazon CloudFront

```http
HTTP/1.1 200 OK
X-Cache: Hit from cloudfront
X-Amz-Cf-Pop: LAX50-C1
X-Amz-Cf-Id: abc123def456
Age: 3600
```
### Varnish Cache
```http
HTTP/1.1 200 OK
X-Cache: HIT
X-Varnish: 123456 234567
Age: 600
Via: 1.1 varnish
```

## Implementation

### Node.js (Express with Cache)

```javascript
const express = require('express')
const NodeCache = require('node-cache')

const app = express()
const cache = new NodeCache({ stdTTL: 3600 })

app.use((req, res, next) => {
  const key = req.url

  // Try to get from cache
  const cached = cache.get(key)

  if (cached) {
    // Cache hit: derive Age from the entry's expiry timestamp
    const expiresAt = cache.getTtl(key) // expiry time in ms
    const age = Math.floor((Date.now() - (expiresAt - 3600000)) / 1000)

    // Increment hit counter before reporting it
    const hits = (cache.get(`${key}:hits`) || 0) + 1
    cache.set(`${key}:hits`, hits)

    res.set({
      'X-Cache': 'HIT',
      'X-Cache-Hits': String(hits),
      Age: String(age)
    })
    return res.send(cached)
  }

  // Cache miss
  res.set('X-Cache', 'MISS')

  // Capture response to cache it
  const originalSend = res.send
  res.send = function (data) {
    if (res.statusCode === 200) {
      cache.set(key, data)
      cache.set(`${key}:hits`, 0)
    }
    return originalSend.call(this, data)
  }
  next()
})

app.get('/api/users', (req, res) => {
  // Simulate database query
  const users = [
    { id: 1, name: 'Alice' },
    { id: 2, name: 'Bob' }
  ]
  res.json(users)
})

app.listen(3000)
```
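The Common Values list above also includes BYPASS, which the middleware above never emits. A sketch of the decision logic as a standalone function (the name `cacheStatus` and the rule of bypassing on a `Cache-Control: no-cache` request header are illustrative choices, not a standard):

```javascript
// Decide which X-Cache value to send for a request, given whether a
// cached copy exists. Requests sending Cache-Control: no-cache are
// treated as cache bypasses.
function cacheStatus(requestCacheControl, hasCachedCopy) {
  if (/\bno-cache\b/.test(requestCacheControl || '')) return 'BYPASS'
  return hasCachedCopy ? 'HIT' : 'MISS'
}

console.log(cacheStatus('no-cache', true)) // BYPASS
console.log(cacheStatus(null, true)) // HIT
console.log(cacheStatus(null, false)) // MISS
```

In the Express middleware this check would run before the cache lookup, skipping both the lookup and the store for bypassed requests.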
### Redis-Based Caching
```javascript
const express = require('express')
const redis = require('redis')

const app = express()
const redisClient = redis.createClient()
// node-redis v4+ requires an explicit connection
redisClient.connect().catch(console.error)

const cacheMiddleware = async (req, res, next) => {
  const key = `cache:${req.url}`
  try {
    const cached = await redisClient.get(key)

    if (cached) {
      // Cache hit: INCR returns the new counter value
      const hits = await redisClient.incr(`${key}:hits`)
      const ttl = await redisClient.ttl(key)
      const age = 3600 - ttl

      res.set({
        'X-Cache': 'HIT',
        'X-Cache-Hits': String(hits),
        Age: String(age)
      })
      return res.send(JSON.parse(cached))
    }

    // Cache miss
    res.set('X-Cache', 'MISS')
    res.set('X-Cache-Hits', '0')

    // Capture response and store it before sending
    const originalJson = res.json.bind(res)
    res.json = function (data) {
      redisClient.setEx(key, 3600, JSON.stringify(data)).catch(console.error)
      redisClient.set(`${key}:hits`, 0).catch(console.error)
      return originalJson(data)
    }
    next()
  } catch (error) {
    // On error, bypass cache
    res.set('X-Cache', 'BYPASS')
    next()
  }
}

app.use(cacheMiddleware)

app.get('/api/products', async (req, res) => {
  const products = await db.getProducts()
  res.json(products)
})

app.listen(3000)
```

### Nginx

```nginx
http {
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=1g inactive=60m;

    upstream backend {
        server backend-server:3000;
    }

    server {
        listen 80;
        server_name example.com;

        location / {
            proxy_pass http://backend;
            proxy_cache my_cache;
            proxy_cache_valid 200 1h;

            # Add X-Cache header
            add_header X-Cache $upstream_cache_status;

            # Add cache hit counter (requires a custom module or Lua)
            # add_header X-Cache-Hits $cache_hits;
        }
    }
}
```
Values for `$upstream_cache_status`:
- **HIT** - Response served from cache
- **MISS** - Response not in cache
- **BYPASS** - Cache bypassed
- **EXPIRED** - Cache entry expired
- **STALE** - Stale cache entry served
- **UPDATING** - Cache being updated
- **REVALIDATED** - Cache entry was stale and validated
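To observe BYPASS in practice, Nginx must be told when to skip the cache. A sketch of one common approach, using a query argument and a cookie as bypass triggers (the `nocache` names are arbitrary examples, not a convention):

```nginx
location / {
    proxy_pass http://backend;
    proxy_cache my_cache;

    # Skip the cache (status BYPASS) when ?nocache=1 is present
    # or when the client sends a "nocache" cookie.
    proxy_cache_bypass $arg_nocache $cookie_nocache;

    add_header X-Cache $upstream_cache_status;
}
```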
### Varnish
```vcl
vcl 4.1;

backend default {
    .host = "backend-server";
    .port = "3000";
}

sub vcl_deliver {
    # Add X-Cache header
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
        set resp.http.X-Cache-Hits = obj.hits;
    } else {
        set resp.http.X-Cache = "MISS";
        set resp.http.X-Cache-Hits = 0;
    }

    # Remove internal Varnish headers in production
    # unset resp.http.X-Varnish;
    # unset resp.http.Via;
}
```

### Python (Flask with Cache)

```python
from flask import Flask, jsonify, request, g
from flask_caching import Cache
import time

app = Flask(__name__)
cache = Cache(app, config={
    'CACHE_TYPE': 'RedisCache',
    'CACHE_REDIS_URL': 'redis://localhost:6379/0',
    'CACHE_DEFAULT_TIMEOUT': 3600
})

def cache_key():
    return f"cache:{request.url}"

@app.before_request
def check_cache():
    # Check before the view runs: the cached view stores its result
    # during the request, so an after_request check alone would report
    # HIT even on the first (miss) request.
    g.cache_hit = cache.get(cache_key()) is not None

@app.after_request
def add_cache_headers(response):
    key = cache_key()
    if g.get('cache_hit'):
        hits = (cache.get(f"{key}:hits") or 0) + 1
        cache.set(f"{key}:hits", hits)
        response.headers['X-Cache'] = 'HIT'
        response.headers['X-Cache-Hits'] = str(hits)
    else:
        response.headers['X-Cache'] = 'MISS'
        response.headers['X-Cache-Hits'] = '0'
    return response

@app.route('/api/users')
@cache.cached(timeout=3600, key_prefix=cache_key)
def get_users():
    # Simulate slow database query
    time.sleep(0.5)
    return jsonify([
        {'id': 1, 'name': 'Alice'},
        {'id': 2, 'name': 'Bob'}
    ])

if __name__ == '__main__':
    app.run(port=3000)
```
## Analyzing Cache Performance
### Client-Side Monitoring
```javascript
// Fetch and log cache status
fetch('/api/users')
  .then((response) => {
    const cacheStatus = response.headers.get('X-Cache')
    const cacheHits = response.headers.get('X-Cache-Hits')
    const age = response.headers.get('Age')

    console.log(`Cache Status: ${cacheStatus}`)
    console.log(`Cache Hits: ${cacheHits}`)
    console.log(`Age: ${age}s`)

    return response.json()
  })
  .then((data) => console.log(data))
```

### Cache Hit Rate Calculation

```javascript
const express = require('express')
const app = express()

const stats = {
  hits: 0,
  misses: 0
}

app.use((req, res, next) => {
  const originalSet = res.set
  // res.set may be called as set(field, value) or set(object),
  // so check both forms for the X-Cache header
  res.set = function (field, value) {
    const status =
      typeof field === 'object' ? field['X-Cache'] : (field === 'X-Cache' ? value : undefined)
    if (status === 'HIT') {
      stats.hits++
    } else if (status === 'MISS') {
      stats.misses++
    }
    return originalSet.call(this, field, value)
  }
  next()
})

app.get('/stats', (req, res) => {
  const total = stats.hits + stats.misses
  const hitRate = total > 0 ? (stats.hits / total) * 100 : 0

  res.json({
    hits: stats.hits,
    misses: stats.misses,
    total: total,
    hitRate: `${hitRate.toFixed(2)}%`
  })
})

app.listen(3000)
```
### Logging Cache Events
```javascript
app.use((req, res, next) => {
  const start = Date.now()

  res.on('finish', () => {
    const duration = Date.now() - start
    const cacheStatus = res.get('X-Cache')

    console.log({
      timestamp: new Date().toISOString(),
      method: req.method,
      url: req.url,
      cacheStatus: cacheStatus,
      duration: `${duration}ms`,
      statusCode: res.statusCode
    })
  })

  next()
})
```

## Best Practices

### 1. Always Set X-Cache for Debugging

```javascript
// ✅ Always indicate cache status
if (cached) {
  res.set('X-Cache', 'HIT')
} else {
  res.set('X-Cache', 'MISS')
}

// ❌ Missing cache status
// (no X-Cache header)
```
### 2. Include Hit Counter
```javascript
// ✅ Track cache hits
res.set({
  'X-Cache': 'HIT',
  'X-Cache-Hits': hitCount
})

// ⚠️ Basic hit/miss only
res.set('X-Cache', 'HIT')
```

### 3. Use with Age Header

```http
# ✅ Complete cache information
X-Cache: HIT
X-Cache-Hits: 42
Age: 3600
Cache-Control: public, max-age=7200
# ⚠️ Missing age information
X-Cache: HIT
```
### 4. Remove in Production (Optional)
```javascript
// For security, remove detailed cache headers in production
if (process.env.NODE_ENV === 'production') {
  res.removeHeader('X-Cache-Hits')
  // Keep X-Cache for debugging if needed
}
```

### 5. Document Cache Behavior

```javascript
// ✅ Clear status values
const CACHE_STATUS = {
  HIT: 'HIT', // Served from cache
  MISS: 'MISS', // Fetched from origin
  BYPASS: 'BYPASS', // Cache bypassed
  EXPIRED: 'EXPIRED' // Cache expired, revalidated
}
```
## Testing Cache Behavior
### cURL
```bash
# First request (should be MISS)
curl -I https://example.com/api/users
# Output:
# X-Cache: MISS
# X-Cache-Hits: 0
# Second request (should be HIT)
curl -I https://example.com/api/users
# Output:
# X-Cache: HIT
# X-Cache-Hits: 1
# Age: 5
# Bypass cache with no-cache
curl -I -H "Cache-Control: no-cache" https://example.com/api/users
# Output:
# X-Cache: BYPASS
```

### Automated Testing

```javascript
const assert = require('assert')
const fetch = require('node-fetch')

async function testCache() {
  // First request should be MISS
  const response1 = await fetch('http://localhost:3000/api/users')
  assert.strictEqual(response1.headers.get('X-Cache'), 'MISS')
  console.log('✓ First request: MISS')

  // Second request should be HIT
  const response2 = await fetch('http://localhost:3000/api/users')
  assert.strictEqual(response2.headers.get('X-Cache'), 'HIT')
  console.log('✓ Second request: HIT')

  // Cache hits should increment
  const hits = parseInt(response2.headers.get('X-Cache-Hits'))
  assert(hits > 0, 'Cache hits should be greater than 0')
  console.log(`✓ Cache hits: ${hits}`)
}

testCache().catch(console.error)
```
## Common Patterns
### Multi-Tier Caching
```http
# Edge cache hit, origin cache not checked
X-Cache: HIT
X-Cache-Edge: HIT
X-Cache-Origin: NONE
# Edge miss, origin hit
X-Cache: HIT
X-Cache-Edge: MISS
X-Cache-Origin: HIT
# Complete miss
X-Cache: MISS
X-Cache-Edge: MISS
X-Cache-Origin: MISS
```

### Conditional Requests

```http
# Cache expired, revalidated with 304
X-Cache: REVALIDATED
X-Cache-Revalidated: true
Age: 0
```
### Stale Content
```http
# Serving stale while revalidating
X-Cache: STALE
X-Cache-Stale-While-Revalidate: true
Warning: 110 - "Response is stale"
```

## Variations Across CDNs

### Cloudflare

```http
CF-Cache-Status: HIT
CF-Ray: 7a1b2c3d4e5f6g7h-SJC
```
### Fastly
```http
X-Cache: HIT
X-Cache-Hits: 42
X-Served-By: cache-lax-klax8100056-LAX
```

### AWS CloudFront

```http
X-Cache: Hit from cloudfront
X-Amz-Cf-Pop: LAX50-C1
```
### Google Cloud CDN
```http
X-Cache: HIT
X-Cloud-Trace-Context: 12345678901234567890
```

### Azure CDN

```http
X-Cache: TCP_HIT
X-EC-Debug: x-ec-cache-state: TCP_HIT
```
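Because each provider reports cache status differently, monitoring code often normalizes the values first. A sketch of one way to do it (the header names come from the examples above; the function `normalizeCacheStatus` and its mapping rules are illustrative, not any CDN's API):

```javascript
// Normalize provider-specific cache status headers to HIT/MISS/OTHER.
// Handles Fastly/CloudFront-style X-Cache values ("HIT", "Hit from
// cloudfront"), Akamai/Azure-style TCP_* values, and CF-Cache-Status.
function normalizeCacheStatus(headers) {
  const raw = headers['cf-cache-status'] || headers['x-cache'] || ''
  const value = raw.toUpperCase()
  if (value.includes('HIT')) return 'HIT'
  if (value.includes('MISS')) return 'MISS'
  return value ? 'OTHER' : 'UNKNOWN'
}

console.log(normalizeCacheStatus({ 'x-cache': 'Hit from cloudfront' })) // HIT
console.log(normalizeCacheStatus({ 'x-cache': 'TCP_MISS from edge' })) // MISS
console.log(normalizeCacheStatus({ 'cf-cache-status': 'HIT' })) // HIT
```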
## Debugging Cache Issues
### Check Cache Headers
```bash
# View all cache-related headers
curl -I https://example.com/page | grep -iE "(cache|age|x-)"
# Test specific endpoint
curl -v https://example.com/api/users 2>&1 | grep -i "x-cache"
```

### Force Cache Bypass

```bash
# Various methods to bypass cache
curl -H "Cache-Control: no-cache" https://example.com/api/users
curl -H "Pragma: no-cache" https://example.com/api/users
curl "https://example.com/api/users?nocache=$(date +%s)"
```
### Monitor Cache Performance
```javascript
// Track cache metrics
const metrics = {
  total: 0,
  hits: 0,
  misses: 0,
  bypasses: 0
}

app.use((req, res, next) => {
  res.on('finish', () => {
    metrics.total++
    const status = res.get('X-Cache')

    if (status === 'HIT') metrics.hits++
    else if (status === 'MISS') metrics.misses++
    else if (status === 'BYPASS') metrics.bypasses++

    console.log(`Cache Hit Rate: ${((metrics.hits / metrics.total) * 100).toFixed(2)}%`)
  })
  next()
})
```

## Security Considerations

### Don't Expose Sensitive Info

```http
# ❌ Reveals internal infrastructure
X-Cache: HIT from internal-cache-server-us-west-2-prod.corp.internal
# ✅ Generic information
X-Cache: HIT
```
### Remove Debug Headers in Production
```javascript
// Remove detailed cache headers in production
if (process.env.NODE_ENV === 'production') {
  res.removeHeader('X-Cache-Hits')
  res.removeHeader('X-Served-By')
  res.removeHeader('X-Varnish')
}
```

## Related Headers
- Cache-Control - Cache directives
- Age - Time in cache
- Expires - Cache expiration
- ETag - Cache validation
## Frequently Asked Questions

### What is X-Cache?
X-Cache indicates whether a response was served from cache. Common values are HIT (from cache), MISS (from origin), and variations like STALE or BYPASS.
### What does X-Cache: HIT mean?
HIT means the response was served from cache without contacting the origin server. This is faster and reduces origin load.
### What does X-Cache: MISS mean?
MISS means the cache did not have the content and fetched it from the origin. The response may now be cached for future requests.
### Is X-Cache a standard header?
No, X-Cache is non-standard but widely used by CDNs and caches. Different providers may use different values. Check your CDN documentation.