Node.js is an open-source, cross-platform JavaScript runtime built on Chrome's V8 engine. Created by Ryan Dahl in 2009, it lets developers run JavaScript outside the browser.
The key insight: most server workloads are I/O-bound, not CPU-bound. Traditional thread-per-request models waste resources waiting for disk, network, and database responses. Node uses a single-threaded event-driven model instead.
Netflix, LinkedIn, PayPal, Uber, NASA, Walmart, and Trello all run Node in production, alongside millions of startups. npm serves 2+ billion package downloads per week.
Node.js is built on two core components: Google's V8 JavaScript engine and libuv, a cross-platform async I/O library written in C.
V8 — Google's JS engine. JIT-compiles JavaScript to optimised native machine code. Manages the heap, garbage collection, and inline caches.
libuv — provides the event loop, async TCP/UDP sockets, file system ops, child processes, and a thread pool (default: 4 threads) for blocking work.
Node bindings — the C++ glue layer that connects JavaScript to libuv and OS-level APIs, exposed to user code as built-in modules like fs, http, and net.
The event loop is the heart of Node.js. It continuously checks for pending work and dispatches callbacks. Each iteration is called a tick.
Timers phase — executes callbacks from setTimeout and setInterval whose threshold has elapsed.
Poll phase — retrieves new I/O events and executes I/O-related callbacks (almost all except close, timers, and setImmediate).
Check phase — executes setImmediate() callbacks, immediately after the poll phase completes.
Microtasks are drained between phases (and, since Node 11, after each individual callback). process.nextTick() fires before Promise callbacks. Starving the event loop with recursive nextTick is a common anti-pattern.
Blocking: the thread waits until the operation completes. No other work can happen.
const fs = require('fs');
// Thread blocks here until file is fully read
const data = fs.readFileSync('/var/log/app.log');
console.log(data.length);
// Nothing else runs until readFileSync returns
Non-blocking: the call returns immediately. A callback fires when the operation finishes.
const fs = require('fs');
// Returns immediately — callback fires later
fs.readFile('/var/log/app.log', (err, data) => {
if (err) throw err;
console.log(data.length);
});
// Event loop is free to handle other requests
A single blocking call on the main thread freezes the entire server. With 10,000 concurrent connections, one readFileSync blocks all of them. Non-blocking I/O lets Node handle thousands of concurrent connections with a single thread, because the event loop delegates waiting to the OS and libuv thread pool.
Never use *Sync methods in server code. They're acceptable in CLI scripts and startup config loading.
libuv uses 4 threads (configurable via UV_THREADPOOL_SIZE) for inherently blocking ops: DNS lookups, file system, zlib, crypto.
TCP/UDP is handled by OS-level async primitives (epoll, kqueue, IOCP) — no thread pool needed.
Node has two module systems: the original CommonJS (require) and the newer ES Modules (import). Both coexist in modern Node.
// math.js
function add(a, b) { return a + b; }
module.exports = { add };
// app.js
const { add } = require('./math');
console.log(add(2, 3)); // 5
CommonJS traits: uses require(); .js files are CommonJS by default in Node; require() can be called inside conditionals.
// math.mjs
export function add(a, b) { return a + b; }
// app.mjs
import { add } from './math.mjs';
console.log(add(2, 3)); // 5
ES Modules traits: enabled via .mjs or "type": "module" in package.json; top-level await supported.
Resolution: require('foo') → 1. Core module? 2. ./node_modules/foo? 3. Parent node_modules? … walks up to root. Each resolved module is cached by its absolute path.
npm (Node Package Manager) ships with every Node installation. It manages dependencies, scripts, and publishing. The registry hosts 2.1 million+ packages.
# Initialise a new project
npm init -y
# Install dependencies
npm install express # production dep
npm install -D jest # dev dependency
# Key files
package.json # manifest & scripts
package-lock.json # exact dependency tree
node_modules/ # installed packages
{
"scripts": {
"start": "node server.js",
"dev": "node --watch server.js",
"test": "jest --coverage",
"lint": "eslint ."
}
}
MAJOR.MINOR.PATCH
^1.2.3 — allow minor + patch updates
~1.2.3 — allow patch updates only
1.2.3 — exact version
package-lock.json pins the exact version of every transitive dependency. Always commit it. Ensures reproducible installs across machines.
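In package.json these ranges appear in the dependencies block (package names and versions here are illustrative):

```json
{
  "dependencies": {
    "express": "^4.18.2",
    "pg": "~8.11.0",
    "left-pad": "1.3.0"
  }
}
```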
Callbacks follow the error-first convention. Deep nesting leads to "callback hell".
function concatFiles(cb) {
  fs.readFile('a.txt', (err, a) => {
    if (err) return cb(err);
    fs.readFile('b.txt', (err, b) => {
      if (err) return cb(err);
      fs.writeFile('c.txt', a + b, cb);
    });
  });
}
Chainable .then() / .catch(). Flat structure.
const fsp = require('fs').promises;
fsp.readFile('a.txt')
.then(a => fsp.readFile('b.txt')
.then(b => fsp.writeFile('c.txt', a + b)))
.catch(err => console.error(err));
Syntactic sugar over Promises. Reads like sync code.
const fsp = require('fs').promises;
async function merge() {
const a = await fsp.readFile('a.txt');
const b = await fsp.readFile('b.txt');
await fsp.writeFile('c.txt', a + b);
}
merge().catch(console.error);
// Run independent tasks concurrently
const [users, orders] = await Promise.all([
db.query('SELECT * FROM users'),
db.query('SELECT * FROM orders')
]);
try {
const data = await fetchData();
} catch (err) {
// Handle or rethrow
logger.error('Fetch failed', err);
throw err;
}
Streams process data in chunks rather than loading everything into memory. Essential for handling large files, network data, and real-time processing.
| Stream Type | Example | Use |
|---|---|---|
| Readable | fs.createReadStream | Read from source |
| Writable | fs.createWriteStream | Write to destination |
| Duplex | net.Socket | Both read & write |
| Transform | zlib.createGzip | Modify data in transit |
Fixed-size chunks of raw binary data. Allocated outside the V8 heap. Used for file I/O, network protocols, and cryptography.
const buf = Buffer.from('Hello, Node!');
console.log(buf.toString('hex'));
// 48656c6c6f2c204e6f646521
const fs = require('fs');
const zlib = require('zlib');
// Compress a large file using streams
// Memory usage stays low regardless of file size
fs.createReadStream('input.log') // Readable
.pipe(zlib.createGzip()) // Transform
.pipe(fs.createWriteStream('input.log.gz')) // Writable
.on('finish', () => console.log('Done!'));
// Modern alternative — pipeline() from stream/promises (use inside an async function)
const { pipeline } = require('stream/promises');
await pipeline(
fs.createReadStream('input.log'),
zlib.createGzip(),
fs.createWriteStream('input.log.gz')
);
// Automatically handles errors & cleanup
Node's built-in http module can create servers with zero external dependencies. This is what frameworks like Express build on top of.
const http = require('http');
const server = http.createServer((req, res) => {
// req = IncomingMessage (Readable stream)
// res = ServerResponse (Writable stream)
if (req.method === 'GET' && req.url === '/api/health') {
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ status: 'ok' }));
} else {
res.writeHead(404);
res.end('Not Found');
}
});
server.listen(3000, () => {
console.log('Listening on http://localhost:3000');
});
For HTTPS, use https.createServer() with TLS certs.
Hand-rolling routing and headers like this gets tedious — this is why frameworks exist →
const express = require('express');
const app = express();
// Built-in middleware
app.use(express.json()); // parse JSON bodies
app.use(express.static('public'));
// Custom middleware — runs on every request
app.use((req, res, next) => {
console.log(`${req.method} ${req.url}`);
next(); // pass control to next handler
});
// Route handlers
app.get('/api/users', async (req, res) => {
const users = await db.findAll();
res.json(users);
});
app.post('/api/users', async (req, res) => {
const user = await db.create(req.body);
res.status(201).json(user);
});
// Error-handling middleware (4 params)
app.use((err, req, res, next) => {
console.error(err.stack);
res.status(500).json({ error: 'Server error' });
});
app.listen(3000);
Middleware are functions with signature (req, res, next). They execute in order and form a pipeline. Each can modify the request/response, end the cycle, or call next().
Fastify (schema-based, 2-3x faster), Koa (async-native, minimal), Hono (ultrafast, edge-ready), NestJS (opinionated, TypeScript-first).
const fs = require('fs/promises');
const path = require('path');
async function processConfig() {
// Read
const raw = await fs.readFile('config.json', 'utf8');
const config = JSON.parse(raw);
// Write
config.updatedAt = new Date().toISOString();
await fs.writeFile(
'config.json',
JSON.stringify(config, null, 2)
);
// Directory operations
await fs.mkdir('logs', { recursive: true });
const files = await fs.readdir('logs');
// File info
const stats = await fs.stat('config.json');
console.log(`Size: ${stats.size} bytes`);
}
const path = require('path');
path.join('src', 'utils', 'db.js');
// Linux: 'src/utils/db.js'
// Windows: 'src\\utils\\db.js'
path.resolve('..', 'config.json');
// '/home/user/project/config.json'
path.basename('/app/server.js'); // 'server.js'
path.extname('photo.png'); // '.png'
path.dirname('/app/src/index.js'); // '/app/src'
// Common pattern: resolve relative to this file
const configPath = path.join(__dirname, 'config.json');
const { watch } = require('fs/promises');
// The promises API returns an async iterable
const watcher = watch('./src', { recursive: true });
for await (const event of watcher) {
console.log(event.eventType, event.filename);
}
Environment variables separate config from code. The 12-Factor App methodology recommends storing all config in env vars.
# .env (never commit this file!)
PORT=3000
DATABASE_URL=postgres://user:pass@localhost/mydb
JWT_SECRET=super-secret-key
NODE_ENV=production
// Access in Node.js
const port = process.env.PORT || 3000;
const dbUrl = process.env.DATABASE_URL;
// Since Node 20.6 — built-in .env support
// node --env-file=.env server.js
Never commit .env files — add them to .gitignore.
NODE_ENV is the most important env var. Conventional values: development, test, production.
Express and many libraries change behaviour based on this value. In production, Express caches view templates and generates less verbose errors.
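A common pattern is to branch once on NODE_ENV and derive settings from it (the specific settings shown here are illustrative):

```javascript
const isProd = process.env.NODE_ENV === 'production';

// Derive environment-specific settings in one place
const config = {
  logLevel: isProd ? 'warn' : 'debug',
  prettyErrors: !isProd,
};

console.log(`env=${process.env.NODE_ENV || 'development'}`, config);
```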
The process global:
process.env — environment variables
process.argv — command-line arguments
process.cwd() — current working directory
process.pid — process ID
process.exit(code) — terminate the process
process.memoryUsage() — heap statistics
// Express async error wrapper
const asyncHandler = (fn) => (req, res, next) =>
Promise.resolve(fn(req, res, next)).catch(next);
app.get('/users/:id', asyncHandler(async (req, res) => {
const user = await db.findById(req.params.id);
if (!user) {
const err = new Error('User not found');
err.statusCode = 404;
throw err;
}
res.json(user);
}));
// Global error handler middleware
app.use((err, req, res, next) => {
const status = err.statusCode || 500;
const message = status === 500
? 'Internal Server Error'
: err.message;
if (status === 500) logger.error(err);
res.status(status).json({ error: message });
});
// Catch unhandled promise rejections
process.on('unhandledRejection', (reason) => {
logger.fatal('Unhandled rejection', reason);
process.exit(1); // let process manager restart
});
// Catch uncaught exceptions
process.on('uncaughtException', (err) => {
logger.fatal('Uncaught exception', err);
process.exit(1);
});
On SIGTERM, stop accepting new connections, finish in-flight requests, close database pools, then exit. Process managers (PM2, Kubernetes) send SIGTERM before force-killing.
// utils.js
function slugify(text) {
return text.toLowerCase()
.replace(/[^a-z0-9]+/g, '-')
.replace(/^-|-$/g, '');
}
module.exports = { slugify };
// utils.test.js
const { slugify } = require('./utils');
describe('slugify', () => {
test('converts spaces to hyphens', () => {
expect(slugify('Hello World')).toBe('hello-world');
});
test('strips special characters', () => {
expect(slugify('Node.js & npm!'))
.toBe('node-js-npm');
});
test('handles empty string', () => {
expect(slugify('')).toBe('');
});
});
const request = require('supertest');
const app = require('./app');
describe('GET /api/users', () => {
test('returns 200 and user list', async () => {
const res = await request(app)
.get('/api/users')
.expect('Content-Type', /json/)
.expect(200);
expect(res.body).toBeInstanceOf(Array);
expect(res.body[0]).toHaveProperty('name');
});
test('returns 404 for unknown user', async () => {
await request(app)
.get('/api/users/99999')
.expect(404);
});
});
npx jest # run all tests
npx jest --watch # re-run on changes
npx jest --coverage # coverage report
| Threat | Mitigation |
|---|---|
| Injection (SQL, NoSQL, command) | Parameterised queries, input validation, never concatenate user input into commands |
| XSS | Escape output, use helmet for CSP headers, avoid innerHTML |
| Prototype Pollution | Freeze prototypes, validate JSON keys, use Object.create(null) |
| ReDoS | Avoid complex regex on user input, set timeouts, use re2 |
| Dependency attacks | Audit with npm audit, pin versions, use lockfiles |
const express = require('express');
const helmet = require('helmet');
const rateLimit = require('express-rate-limit');
const app = express();
// Security headers (CSP, HSTS, X-Frame, etc.)
app.use(helmet());
// Rate limiting
app.use(rateLimit({
windowMs: 15 * 60 * 1000, // 15 min
max: 100, // 100 req/window
}));
// Never expose stack traces in production
app.set('env', 'production');
// Parameterised query — prevents SQL injection
const user = await db.query(
'SELECT * FROM users WHERE id = $1',
[req.params.id] // NOT string concatenation
);
Run npm audit regularly. Use npm audit fix to auto-patch. Integrate into CI.
Node.js LTS gets security patches for 30 months. Even-numbered releases (18, 20, 22) are LTS. Odd-numbered are current/experimental.
Run Node processes as non-root. Restrict file system access. Use minimal IAM roles in cloud environments.
Node runs JS on one thread, but the cluster module spawns worker processes to use all CPU cores.
const cluster = require('cluster');
const os = require('os');
if (cluster.isPrimary) {
const cpus = os.cpus().length;
console.log(`Forking ${cpus} workers`);
for (let i = 0; i < cpus; i++) {
cluster.fork();
}
cluster.on('exit', (worker) => {
console.log(`Worker ${worker.process.pid} died`);
cluster.fork(); // auto-restart
});
} else {
require('./server'); // each worker runs server
}
For CPU-intensive tasks (image processing, crypto, parsing), offload to worker threads to avoid blocking the event loop.
const { Worker } = require('worker_threads');
function runHeavyTask(data) {
return new Promise((resolve, reject) => {
const worker = new Worker('./heavy-task.js', {
workerData: data
});
worker.on('message', resolve);
worker.on('error', reject);
});
}
const result = await runHeavyTask({ iterations: 1e8 });
Prefer async/await for clean async code.
Thank you!