nodejs|April 02, 2026|4 min read

Node.js Architecture — Event Loop Deep Dive

TL;DR

The Node.js event loop has 6 phases (timers, pending callbacks, idle/prepare, poll, check, close). Microtasks (process.nextTick and Promise callbacks) are drained in between; since Node 11 they run after every individual callback, not just between phases. Understanding this is key to writing performant, non-blocking Node.js code.

Why the Event Loop Matters

Node.js runs JavaScript on a single thread, yet handles thousands of concurrent connections. The secret is the event loop — a mechanism that offloads I/O operations to the operating system and processes callbacks when results are ready.

Understanding the event loop isn’t academic — it directly impacts how you debug performance issues, avoid blocking operations, and write efficient async code.

Node.js Architecture Layers

Before diving into the event loop, let’s understand what sits beneath your JavaScript code.

Node.js is built on three core pillars:

  • V8 Engine — Google’s JavaScript engine that compiles JS to machine code via JIT compilation
  • libuv — A C library that provides the event loop, async I/O, and a thread pool for operations the OS can’t handle asynchronously
  • Node.js Bindings — C++ glue code that bridges JavaScript with libuv and other low-level libraries
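You can check exactly which V8 and libuv builds ship with your Node binary at runtime via process.versions:

```javascript
// process.versions reports the exact V8 and libuv builds bundled
// with the running Node.js binary
console.log(`V8:    ${process.versions.v8}`);
console.log(`libuv: ${process.versions.uv}`);
```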

The 6 Phases of the Event Loop

The event loop is not a simple “while(true)” — it’s a series of distinct phases, each with its own queue of callbacks.

Phase 1: Timers

Executes callbacks scheduled by setTimeout() and setInterval(). A timer specifies a threshold after which a callback may execute — not an exact time.

setTimeout(() => {
  console.log('Timer fired');
}, 100);

// This callback runs AFTER 100ms, not exactly AT 100ms
// The actual delay depends on other callbacks in the queue
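To see that drift concretely, here is a small sketch (the 250ms busy-wait is an arbitrary choice) where synchronous work queued ahead of the timer pushes its firing time well past the threshold:

```javascript
const scheduled = Date.now();
let actualDelay;

// Asked for 100ms, but the loop can't reach the timers phase
// until the synchronous code below finishes
setTimeout(() => {
  actualDelay = Date.now() - scheduled;
  console.log(`fired after ${actualDelay}ms (asked for 100ms)`);
}, 100);

// Synchronous busy-wait blocks the loop past the timer's threshold
const end = Date.now() + 250;
while (Date.now() < end) {}
```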

Phase 2: Pending Callbacks

Executes I/O callbacks deferred from the previous loop iteration. This handles certain system-level callbacks like TCP errors (ECONNREFUSED).

Phase 3: Idle / Prepare

Internal use only. Node.js uses this phase for housekeeping — you can’t schedule callbacks here.

Phase 4: Poll

The most important phase. It:

  1. Calculates how long it should block and poll for I/O
  2. Processes events in the poll queue (I/O callbacks like reading files, network responses)

If nothing else is due, the event loop blocks here waiting for new I/O events; if timers are scheduled, it blocks only until the nearest timer's threshold expires.

Phase 5: Check

Executes setImmediate() callbacks. This phase runs immediately after the poll phase completes.

Phase 6: Close Callbacks

Handles close events like socket.on('close') and cleanup operations.

Microtasks vs Macrotasks

This is where most developers get confused. Node.js distinguishes two kinds of task queues: microtasks, which are drained in between callbacks (since Node 11, after every individual callback, not just between phases), and macrotasks, which wait for their event-loop phase:

Microtask Queue (highest priority)

  1. process.nextTick() — Runs before any other microtask
  2. Promise callbacks (.then(), .catch(), await)

Macrotask Queue (runs in event loop phases)

  • setTimeout() / setInterval() → Timers phase
  • setImmediate() → Check phase
  • I/O callbacks → Poll phase

Execution Order Example

console.log('1: sync');

setTimeout(() => console.log('2: setTimeout'), 0);

setImmediate(() => console.log('3: setImmediate'));

Promise.resolve().then(() => console.log('4: Promise'));

process.nextTick(() => console.log('5: nextTick'));

console.log('6: sync');

Output (one possible ordering):

1: sync
6: sync
5: nextTick
4: Promise
2: setTimeout
3: setImmediate

Why this order? The call stack executes sync code first (1, 6). Then the nextTick queue drains (5), then Promises (4), and finally the event loop phases run: timers (2), then check (3). One caveat: when scheduled from the main module, the relative order of setTimeout(..., 0) and setImmediate() is not guaranteed, because the 0ms timer may or may not be due when the loop first enters the timers phase. Inside an I/O callback, setImmediate() always fires first.

Nested Microtasks Can Starve the Event Loop

// DANGER: This blocks the event loop forever
function recurse() {
  process.nextTick(recurse);
}
recurse();

// setTimeout will NEVER fire because nextTick
// queue keeps getting refilled
setTimeout(() => console.log('never reached'), 0);
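A safer counterpart is to recurse via setImmediate(), which yields a full event-loop turn between iterations so timers still get to run. A sketch (the iteration cap is arbitrary):

```javascript
// Unlike nextTick recursion, setImmediate recursion lets every other
// phase run between iterations, so the timer below still fires
let ticks = 0;
let timerFired = false;

function recurse() {
  if (++ticks < 100000 && !timerFired) setImmediate(recurse);
}
recurse();

setTimeout(() => {
  timerFired = true;
  console.log(`timer fired after ${ticks} setImmediate turns`);
}, 10);
```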

libuv and the Thread Pool

Not all async operations use the OS async primitives. Some operations use libuv’s thread pool:

Uses OS async (epoll/kqueue):

  • TCP/UDP sockets
  • DNS resolution (c-ares)
  • Pipes
  • Signals

Uses the thread pool:

  • File system operations
  • DNS lookup (dns.lookup)
  • Crypto (pbkdf2, randomBytes)
  • Zlib compression

The default thread pool size is 4 threads. You can increase it:

// Must take effect before the pool is first used (before any fs,
// crypto, or zlib work); setting it in the environment at launch,
// e.g. UV_THREADPOOL_SIZE=8 node app.js, is the reliable route
process.env.UV_THREADPOOL_SIZE = 8; // max 1024

When the Thread Pool Becomes a Bottleneck

const crypto = require('crypto');

// Each pbkdf2 call uses one thread pool thread
// With default pool size of 4, only 4 run concurrently
const start = Date.now();

for (let i = 0; i < 8; i++) {
  crypto.pbkdf2('password', 'salt', 100000, 512, 'sha512', () => {
    console.log(`Hash ${i}: ${Date.now() - start}ms`);
  });
}

// Output shows two "batches" of 4:
// Hash 0-3: ~90ms  (first batch, 4 threads)
// Hash 4-7: ~180ms (second batch, waited for threads)

Common Pitfalls: Blocking the Event Loop

CPU-Intensive Operations

// BAD: Blocks the event loop
app.get('/fibonacci', (req, res) => {
  const n = parseInt(req.query.n);
  const result = fibonacci(n); // Synchronous, CPU-bound
  res.json({ result });
});

// GOOD: Offload to a worker thread
const { Worker } = require('worker_threads');

app.get('/fibonacci', (req, res) => {
  const worker = new Worker('./fibonacci-worker.js', {
    workerData: { n: parseInt(req.query.n) }
  });

  worker.on('message', (result) => {
    res.json({ result });
  });

  worker.on('error', (err) => {
    res.status(500).json({ error: err.message });
  });
});

Synchronous File Operations

const fs = require('fs');

// BAD: Blocks the whole process during the file read
const data = fs.readFileSync('/large-file.json');

// GOOD: Non-blocking (inside an async function or an ES module)
const data = await fs.promises.readFile('/large-file.json');

JSON Parsing Large Objects

// Parsing a 50MB JSON string blocks the event loop
// for potentially hundreds of milliseconds
const huge = JSON.parse(largeString);

// Use a streaming JSON parser for large payloads
const JSONStream = require('JSONStream'); // third-party npm package
const stream = fs.createReadStream('large.json')
  .pipe(JSONStream.parse('*'));

stream.on('data', (item) => {
  // Process one item at a time
});

Detecting Event Loop Lag

Using monitorEventLoopDelay

const { monitorEventLoopDelay } = require('perf_hooks');

const histogram = monitorEventLoopDelay({ resolution: 20 });
histogram.enable();

setInterval(() => {
  console.log({
    min: histogram.min / 1e6,      // Convert ns to ms
    max: histogram.max / 1e6,
    mean: histogram.mean / 1e6,
    p99: histogram.percentile(99) / 1e6,
  });
  histogram.reset();
}, 5000);

Simple Lag Detection

let lastCheck = Date.now();

setInterval(() => {
  const now = Date.now();
  const lag = now - lastCheck - 1000; // Expected 1000ms
  if (lag > 100) {
    console.warn(`Event loop lag: ${lag}ms`);
  }
  lastCheck = now;
}, 1000);

Key Takeaways

  1. The event loop has 6 phases — timers, pending, idle, poll, check, close
  2. Microtasks (nextTick, Promises) drain after every callback — they can starve macrotasks
  3. process.nextTick has higher priority than Promises
  4. libuv’s thread pool (default 4 threads) handles fs, crypto, and dns.lookup
  5. Never block the event loop — use worker threads for CPU work, streams for large data
  6. Monitor event loop lag in production with monitorEventLoopDelay

Understanding these internals helps you write Node.js code that truly leverages its async nature instead of accidentally fighting it.
