# Transfer-Encoding Header

Learn how the Transfer-Encoding header specifies encoding formats like chunked transfer for streaming responses when content length is unknown beforehand.


TL;DR: Specifies how the message body is encoded for transfer. Most commonly “chunked” for streaming data when total size is unknown upfront.

## What is Transfer-Encoding?

The Transfer-Encoding header specifies the form of encoding used to safely transfer the message body between the server and client. It’s like choosing how to package a shipment - you might break it into chunks, compress it, or use special handling.

The most common value is “chunked”, which allows servers to send data in pieces without knowing the total size upfront, perfect for streaming and dynamic content.

## How Transfer-Encoding Works

Server sends chunked response:

```http
HTTP/1.1 200 OK
Content-Type: text/plain
Transfer-Encoding: chunked

7\r\n
Mozilla\r\n
9\r\n
Developer\r\n
7\r\n
Network\r\n
0\r\n
\r\n
```

The data is sent in chunks, each prefixed with its size in hexadecimal.
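As a sketch, that framing can be reproduced with a small helper (`chunkEncode` is a hypothetical name, not part of any HTTP library; real servers and clients do this for you):

```javascript
// Hypothetical helper: frame a list of strings as a chunked HTTP body.
function chunkEncode(parts) {
  let body = ''
  for (const part of parts) {
    // Each chunk: size in hexadecimal, CRLF, the data, CRLF
    body += Buffer.byteLength(part).toString(16) + '\r\n' + part + '\r\n'
  }
  // A zero-sized chunk followed by a blank line terminates the body
  return body + '0\r\n\r\n'
}

console.log(JSON.stringify(chunkEncode(['Mozilla', 'Developer', 'Network'])))
// "7\r\nMozilla\r\n9\r\nDeveloper\r\n7\r\nNetwork\r\n0\r\n\r\n"
```

Note that the sizes count the raw bytes of each chunk's data, not the framing CRLFs.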

## Syntax

```http
Transfer-Encoding: chunked
Transfer-Encoding: compress
Transfer-Encoding: deflate
Transfer-Encoding: gzip
Transfer-Encoding: gzip, chunked

```

### Values

- chunked - Data sent in a series of chunks
- compress - Lempel-Ziv-Welch (LZW) compression (obsolete)
- deflate - Deflate compression (rarely used for transfer)
- gzip - Gzip compression (use Content-Encoding instead)

## Common Examples

### Chunked Transfer

```http
Transfer-Encoding: chunked
```

Send data in chunks without knowing total size.

### Compressed and Chunked

```http
Transfer-Encoding: gzip, chunked
```

Apply compression, then send in chunks (order matters).

### Plain Transfer

(No Transfer-Encoding header)

Send data all at once with Content-Length.

## Real-World Scenarios

### Streaming API Response

```http
GET /api/stream HTTP/1.1
Host: api.example.com

HTTP/1.1 200 OK
Content-Type: text/event-stream
Transfer-Encoding: chunked

1A\r\n
data: {"event": "start"}\n\n\r\n
1D\r\n
data: {"event": "update 1"}\n\n\r\n
1D\r\n
data: {"event": "update 2"}\n\n\r\n
18\r\n
data: {"event": "end"}\n\n\r\n
0\r\n
\r\n
```

### Live Search Results

```http
GET /search?q=javascript HTTP/1.1

HTTP/1.1 200 OK
Content-Type: application/json
Transfer-Encoding: chunked

2B\r\n
{"result": 1, "title": "JavaScript Guide"}\n\r\n
2E\r\n
{"result": 2, "title": "JavaScript Tutorial"}\n\r\n
2C\r\n
{"result": 3, "title": "JS Best Practices"}\n\r\n
0\r\n
\r\n
```

### Server-Sent Events

```http
GET /events HTTP/1.1

HTTP/1.1 200 OK
Content-Type: text/event-stream
Transfer-Encoding: chunked
Cache-Control: no-cache

1C\r\n
event: message\ndata: hello\n\n\r\n
1B\r\n
event: update\ndata: world\n\n\r\n
0\r\n
\r\n
```

### Dynamic Page Generation

```http
GET /report HTTP/1.1

HTTP/1.1 200 OK
Content-Type: text/html
Transfer-Encoding: chunked

28\r\n
<html><head><title>Report</title></head>\r\n
13\r\n
<body><h1>Data</h1>\r\n
31\r\n
<p>Processing... (this took time to generate)</p>\r\n
1E\r\n
<p>Complete!</p></body></html>\r\n
0\r\n
\r\n
```

## Server Implementation

### Express.js (Node.js)

```javascript
const express = require('express')
const app = express()

// Chunked streaming
app.get('/stream', (req, res) => {
  // Express automatically sets Transfer-Encoding: chunked
  // when you don't set Content-Length

  res.setHeader('Content-Type', 'text/plain')

  // Send chunks
  res.write('First chunk\n')

  setTimeout(() => {
    res.write('Second chunk\n')
  }, 1000)

  setTimeout(() => {
    res.write('Third chunk\n')
    res.end() // Send terminating chunk
  }, 2000)
})

// Server-Sent Events
app.get('/events', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream')
  res.setHeader('Cache-Control', 'no-cache')
  res.setHeader('Connection', 'keep-alive')

  // Send events periodically
  const intervalId = setInterval(() => {
    const data = { time: new Date().toISOString() }
    res.write(`data: ${JSON.stringify(data)}\n\n`)
  }, 1000)

  // Clean up on client disconnect
  req.on('close', () => {
    clearInterval(intervalId)
  })
})

// Manual chunked encoding
app.get('/manual-chunks', (req, res) => {
  res.setHeader('Transfer-Encoding', 'chunked')
  res.setHeader('Content-Type', 'text/plain')

  const chunks = ['Hello', 'World', 'From', 'Chunks']

  chunks.forEach((chunk, index) => {
    setTimeout(() => {
      res.write(chunk + '\n')

      if (index === chunks.length - 1) {
        res.end()
      }
    }, index * 500)
  })
})

// Stream large file
const fs = require('fs')

app.get('/download/large-file', (req, res) => {
  res.setHeader('Content-Type', 'application/octet-stream')
  // No Content-Length = automatic chunked encoding

  const stream = fs.createReadStream('./large-file.dat')
  stream.pipe(res)
})
```

### Advanced Streaming

```javascript
const { Readable } = require('stream')

app.get('/stream-data', (req, res) => {
  res.setHeader('Content-Type', 'application/json')

  // Create readable stream
  const dataStream = new Readable({
    read() {}
  })

  // Pipe to response (automatic chunking)
  dataStream.pipe(res)

  // Generate data asynchronously
  let count = 0
  const interval = setInterval(() => {
    dataStream.push(JSON.stringify({ count: count++ }) + '\n')

    if (count >= 10) {
      clearInterval(interval)
      dataStream.push(null) // Signal end
    }
  }, 500)
})
```

### FastAPI (Python)

```python
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
import asyncio
import json

app = FastAPI()

@app.get("/stream")
async def stream_response():
    async def generate():
        for i in range(10):
            yield f"Chunk {i}\n"
            await asyncio.sleep(0.5)

    return StreamingResponse(
        generate(),
        media_type="text/plain"
    )

@app.get("/events")
async def server_sent_events():
    async def event_generator():
        for i in range(5):
            data = json.dumps({"count": i})
            yield f"data: {data}\n\n"
            await asyncio.sleep(1)

    return StreamingResponse(
        event_generator(),
        media_type="text/event-stream"
    )

@app.get("/json-stream")
async def json_stream():
    async def generate_json():
        for i in range(100):
            yield json.dumps({"item": i}) + "\n"
            await asyncio.sleep(0.1)

    return StreamingResponse(
        generate_json(),
        media_type="application/x-ndjson"
    )
```

### Django

```python
from django.http import StreamingHttpResponse
import time
import json

def stream_view(request):
    def generate():
        for i in range(10):
            yield f"Chunk {i}\n"
            time.sleep(0.5)

    response = StreamingHttpResponse(
        generate(),
        content_type='text/plain'
    )
    # Django automatically uses chunked transfer encoding
    return response

def event_stream_view(request):
    def event_generator():
        for i in range(5):
            data = json.dumps({'count': i})
            yield f'data: {data}\n\n'
            time.sleep(1)

    return StreamingHttpResponse(
        event_generator(),
        content_type='text/event-stream'
    )
```

### Nginx (Chunked Proxy)

```nginx
server {
    listen 80;
    server_name example.com;

    location /stream {
        proxy_pass http://backend;

        # Enable chunked transfer encoding
        proxy_http_version 1.1;
        proxy_buffering off;

        # Disable proxy buffering for streaming
        proxy_cache off;
    }

    location /events {
        proxy_pass http://backend/events;
        proxy_http_version 1.1;

        # Required for SSE
        proxy_set_header Connection '';
        proxy_buffering off;
        proxy_cache off;
    }
}
```

## Best Practices

### For Servers

**1. Use chunked for dynamic content**

```javascript
// ✅ Don't wait to calculate content length
res.setHeader('Transfer-Encoding', 'chunked')
res.write('Starting...\n')
// ... generate more content ...
res.end('Done\n')

// ❌ Don't buffer everything to get length
const content = generateAllContent()
res.setHeader('Content-Length', content.length)
res.send(content)
```

**2. Don’t use both Content-Length and Transfer-Encoding**

```http
# ❌ Invalid - can't have both
Content-Length: 1024
Transfer-Encoding: chunked

# ✅ Use one or the other
Transfer-Encoding: chunked
```

**3. Use chunked for streaming responses**

```javascript
// ✅ Stream database results
app.get('/large-dataset', async (req, res) => {
  res.setHeader('Content-Type', 'application/json')

  const cursor = db.collection.find().stream()

  cursor.on('data', (doc) => {
    res.write(JSON.stringify(doc) + '\n')
  })

  cursor.on('end', () => {
    res.end()
  })
})
```

**4. Disable buffering for true streaming**

```javascript
// ✅ Disable response buffering
res.setHeader('X-Accel-Buffering', 'no') // Nginx
res.setHeader('Cache-Control', 'no-cache')
```

**5. Handle client disconnections**

```javascript
app.get('/stream', (req, res) => {
  const interval = setInterval(() => {
    res.write('data\n')
  }, 1000)

  // ✅ Clean up when client disconnects
  req.on('close', () => {
    clearInterval(interval)
    console.log('Client disconnected')
  })
})
```

### For Clients

**1. Handle chunked responses**

```javascript
// Fetch API handles chunking automatically
const response = await fetch('/stream')
const reader = response.body.getReader()

while (true) {
  const { done, value } = await reader.read()

  if (done) break

  // Process chunk
  console.log('Chunk:', new TextDecoder().decode(value))
}
```

**2. Process Server-Sent Events**

```javascript
const eventSource = new EventSource('/events')

eventSource.onmessage = (event) => {
  console.log('Event:', event.data)
}

eventSource.onerror = (error) => {
  console.error('SSE error:', error)
  eventSource.close()
}
```

**3. Handle streaming JSON**

```javascript
async function* streamJSON(url) {
  const response = await fetch(url)
  const reader = response.body.getReader()
  const decoder = new TextDecoder()
  let buffer = ''

  while (true) {
    const { done, value } = await reader.read()

    if (done) break

    buffer += decoder.decode(value, { stream: true })

    const lines = buffer.split('\n')
    buffer = lines.pop() // Keep incomplete line in buffer

    for (const line of lines) {
      if (line.trim()) {
        yield JSON.parse(line)
      }
    }
  }
}

// Usage
for await (const item of streamJSON('/json-stream')) {
  console.log('Item:', item)
}
```

## Chunked Encoding Format

### Chunk Structure

```text
[chunk size in hex]\r\n
[chunk data]\r\n
...
0\r\n
\r\n
```

### Example

```http
HTTP/1.1 200 OK
Transfer-Encoding: chunked

5\r\n
Hello\r\n
7\r\n
 World!\r\n
0\r\n
\r\n
```

This sends “Hello World!” in two chunks.
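Decoding reverses the process; a minimal sketch (`parseChunked` is a hypothetical helper; real parsers must also handle chunk extensions, trailers, and partial reads):

```javascript
// Hypothetical helper: recover the original body from a chunked payload.
function parseChunked(raw) {
  let out = ''
  let pos = 0
  while (true) {
    const lineEnd = raw.indexOf('\r\n', pos)
    const size = parseInt(raw.slice(pos, lineEnd), 16) // size line is hex
    if (size === 0) break                              // terminating chunk
    out += raw.slice(lineEnd + 2, lineEnd + 2 + size)  // chunk data
    pos = lineEnd + 2 + size + 2                       // skip data + CRLF
  }
  return out
}

console.log(parseChunked('5\r\nHello\r\n7\r\n World!\r\n0\r\n\r\n')) // Hello World!
```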

## Common Use Cases

### Server-Sent Events (SSE)

```http
Transfer-Encoding: chunked
Content-Type: text/event-stream
```

Real-time updates from server to client.

### Live Logs

```http
Transfer-Encoding: chunked
Content-Type: text/plain
```

Stream server logs to browser.

### Progressive Rendering

```http
Transfer-Encoding: chunked
Content-Type: text/html
```

Send HTML as it's generated.

### Large File Streaming

```http
Transfer-Encoding: chunked
Content-Type: application/octet-stream
```

Stream files without knowing size upfront.

### Database Result Streaming

```http
Transfer-Encoding: chunked
Content-Type: application/x-ndjson
```

Stream query results as they're fetched.

## Testing Transfer-Encoding

### Using curl

```bash
# View response headers (look for Transfer-Encoding: chunked)
curl -v https://example.com/stream

# Show the raw chunk framing without decoding it
curl --raw https://example.com/stream

# Print chunks as they arrive (disable output buffering)
curl -N https://example.com/stream

# Stream Server-Sent Events
curl -N https://example.com/events
```

### Using JavaScript

```javascript
// Check if response is chunked
fetch('/stream').then((response) => {
  console.log('Transfer-Encoding:', response.headers.get('Transfer-Encoding'))

  const reader = response.body.getReader()

  function read() {
    reader.read().then(({ done, value }) => {
      if (done) {
        console.log('Stream complete')
        return
      }

      console.log('Chunk received:', new TextDecoder().decode(value))
      read()
    })
  }

  read()
})
```

## Frequently Asked Questions

### What is Transfer-Encoding?

Transfer-Encoding specifies how the message body is encoded for transfer. The most common value is "chunked" which sends data in pieces without knowing the total size upfront.

### What is chunked transfer encoding?

Chunked encoding sends the body in pieces, each prefixed with its size. It allows streaming responses when the total size is unknown. The transfer ends with a zero-length chunk.

### When should I use chunked encoding?

Use chunked when you do not know the response size upfront, for streaming data, or for server-sent events. It allows sending data as it becomes available.

### What is the difference between Transfer-Encoding and Content-Encoding?

Transfer-Encoding is about how data is transferred (chunked). Content-Encoding is about compression (gzip). Transfer-Encoding is hop-by-hop; Content-Encoding is end-to-end.
