TECHNICAL PRESENTATION

Introduction to
Node.js & Databases

Drivers, ORMs, Migrations & Production Patterns
pg · Prisma · Knex · Sequelize · Mongoose · migrations · transactions
02

Agenda

Foundations

  • Database access in Node.js
  • Raw SQL drivers (pg, mysql2, better-sqlite3)
  • Connection pooling & lifecycle
  • Query builders — Knex.js

ORMs & ODMs

  • Prisma — modern type-safe ORM
  • Prisma in practice — CRUD & transactions
  • Sequelize — classic ORM
  • Mongoose & MongoDB

Data Management

  • Database migrations
  • Seeding & fixtures
  • Transactions & ACID
  • Connection management in Express

Production

  • Error handling & retry strategies
  • Performance & N+1 queries
  • Testing database code
  • Choosing an ORM / query builder
03

Database Access in Node.js

Node.js has first-class async I/O, making it a natural fit for database work. Database operations are non-blocking by default — no thread-per-connection model. (better-sqlite3, covered later, is a deliberate synchronous exception.)

Driver Landscape

  • pg — PostgreSQL
  • mysql2 — MySQL / MariaDB
  • better-sqlite3 — SQLite (sync)
  • mongodb — MongoDB native driver
  • ioredis — Redis

Connection Patterns

  • Single client — one TCP connection
  • Connection pool — reusable connections
  • Connection string — postgres://user:pass@host/db
  • SSL/TLS — encrypted in production

Async by Default

  • Every query returns a Promise
  • Use async/await for clean flow
  • Promise.all() for parallel queries
  • Streams for large result sets
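The Promise.all() bullet above is worth a concrete sketch. This is an illustrative example, not driver-specific code: `query` stands in for any driver's `pool.query()`, and `fetchDashboard` and the table names are assumptions.

```javascript
// Illustrative only: `query` stands in for pool.query() from any driver.
async function fetchDashboard(query) {
  // Independent queries run concurrently instead of one after another
  const [users, posts, stats] = await Promise.all([
    query('SELECT * FROM users LIMIT 10'),
    query('SELECT * FROM posts ORDER BY created_at DESC LIMIT 5'),
    query('SELECT COUNT(*) AS total FROM posts')
  ]);
  return { users, posts, stats };
}
```

With three 50 ms queries this takes roughly 50 ms instead of 150 ms. Only use Promise.all() for queries that do not depend on each other's results.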

Abstraction Levels

Level           Tool                Trade-off
Raw SQL         pg, mysql2          Full control, no abstraction
Query Builder   Knex.js             Composable, DB-agnostic SQL
ORM             Prisma, Sequelize   Models, relations, type safety
ODM             Mongoose            Schema enforcement for MongoDB

Key Decision Factors

  • How complex are your queries?
  • Do you need migrations & schema versioning?
  • TypeScript project? (Prisma shines here)
  • Team familiarity & ecosystem maturity
  • Performance-critical vs rapid development
04

Raw SQL Drivers

Raw drivers give you direct access to SQL with minimal abstraction. Every Node.js database library is built on top of these.

pg — PostgreSQL

const { Client } = require('pg');
const client = new Client({
  connectionString: process.env.DATABASE_URL
});
await client.connect();

// Parameterised query — prevents SQL injection
const { rows } = await client.query(
  'SELECT * FROM users WHERE email = $1',
  ['alice@example.com']
);
console.log(rows[0]);

await client.end();

mysql2 — MySQL

const mysql = require('mysql2/promise');
const conn = await mysql.createConnection({
  host: 'localhost',
  user: 'root',
  database: 'myapp'
});

// Parameterised with ? placeholders
const [rows] = await conn.execute(
  'SELECT * FROM users WHERE email = ?',
  ['alice@example.com']
);
console.log(rows[0]);

await conn.end();

better-sqlite3

const Database = require('better-sqlite3');
const db = new Database('app.db');

// Synchronous — ideal for CLI tools & Electron
const stmt = db.prepare(
  'SELECT * FROM users WHERE email = ?'
);
const user = stmt.get('alice@example.com');
console.log(user);

// Transactions via function wrapper
const insert = db.transaction((users) => {
  for (const u of users)
    db.prepare('INSERT INTO users (name) VALUES (?)')
      .run(u.name);
});

Always Use Parameterised Queries

Never interpolate user input into SQL strings: `SELECT * FROM users WHERE id = ${id}`. Use $1 (pg), ? (mysql2/sqlite), or named parameters. This is your primary defence against SQL injection.

05

Connection Pooling

Opening a new TCP connection per query is expensive (~20-50ms). A connection pool maintains a set of pre-established connections that are reused across requests.

const { Pool } = require('pg');

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 20,              // max connections in pool
  idleTimeoutMillis: 30000,  // close idle after 30s
  connectionTimeoutMillis: 5000, // fail if no conn in 5s
});

// Option 1: pool.query() — auto checkout/checkin
const { rows } = await pool.query(
  'SELECT * FROM users WHERE active = $1', [true]
);

// Option 2: manual checkout for transactions
const client = await pool.connect();
try {
  await client.query('BEGIN');
  await client.query('UPDATE accounts SET balance = balance - $1 WHERE id = $2', [100, 1]);
  await client.query('UPDATE accounts SET balance = balance + $1 WHERE id = $2', [100, 2]);
  await client.query('COMMIT');
} catch (e) {
  await client.query('ROLLBACK');
  throw e;
} finally {
  client.release();  // return to pool — CRITICAL
}

Why Pools Matter

  • Avoid TCP handshake + TLS negotiation per query
  • Limit connections to protect the database
  • Queue requests when all connections are busy
  • Automatic reconnection on failure

Pool Config Rules of Thumb

  • max = CPU cores × 2 + disk spindles
  • PostgreSQL default max_connections = 100
  • With 5 app instances: max = 100 / 5 = 20 per pool
  • Set connectionTimeoutMillis to fail fast

Connection Lifecycle

  • Idle — waiting in pool for checkout
  • Active — checked out, running queries
  • Expired — idle timeout reached, destroyed
  • Error — removed from pool, replaced
06

Query Builders — Knex.js

Knex is a SQL query builder that supports PostgreSQL, MySQL, SQLite3, and MSSQL. It generates parameterised SQL and provides migrations, seeds, and a chainable API.

Setup & Chainable Queries

const knex = require('knex')({
  client: 'pg',
  connection: process.env.DATABASE_URL,
  pool: { min: 2, max: 10 }
});

// Select with conditions
const users = await knex('users')
  .select('id', 'name', 'email')
  .where('active', true)
  .andWhere('role', 'admin')
  .orderBy('created_at', 'desc')
  .limit(20);

// Insert returning
const [newUser] = await knex('users')
  .insert({ name: 'Alice', email: 'alice@dev.io' })
  .returning('*');

// Update
await knex('users')
  .where('id', 42)
  .update({ last_login: knex.fn.now() });

Schema Building & Migrations

// migrations/20240101_create_users.js
exports.up = function(knex) {
  return knex.schema.createTable('users', (t) => {
    t.increments('id').primary();
    t.string('name').notNullable();
    t.string('email').unique().notNullable();
    t.boolean('active').defaultTo(true);
    t.timestamps(true, true); // created_at, updated_at
  });
};

exports.down = function(knex) {
  return knex.schema.dropTable('users');
};
# CLI commands
npx knex migrate:latest    # run pending migrations
npx knex migrate:rollback  # undo last batch
npx knex seed:run          # populate test data

When to Use Knex

  • You want SQL control with a composable API
  • Dynamic queries built from user filters
  • Multi-dialect support needed
  • Don't need full ORM model layer
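The "dynamic queries from user filters" point deserves a sketch. Knex lets you chain `.where()` calls conditionally; the plain function below shows the same composition idea without a live connection, emitting pg-style $n placeholders. `buildUserQuery` is a hypothetical helper, not a Knex API.

```javascript
// Hypothetical helper: the composition Knex's chainable API gives you,
// shown as plain parameterised SQL (pg-style $n placeholders).
function buildUserQuery(filters = {}) {
  const clauses = [];
  const params = [];
  if (filters.role) {
    params.push(filters.role);
    clauses.push(`role = $${params.length}`);
  }
  if (filters.activeOnly) {
    params.push(true);
    clauses.push(`active = $${params.length}`);
  }
  const where = clauses.length ? ` WHERE ${clauses.join(' AND ')}` : '';
  return { text: `SELECT id, name, email FROM users${where}`, params };
}
```

User input only ever lands in `params`, never in the SQL text, so the injection guarantee from the raw-drivers slide still holds.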
07

Prisma — Modern ORM

Prisma is a next-generation ORM with a declarative schema, auto-generated type-safe client, and powerful migration system. It generates TypeScript types from your schema.

schema.prisma

// prisma/schema.prisma
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

model User {
  id        Int      @id @default(autoincrement())
  email     String   @unique
  name      String
  posts     Post[]   // 1-to-many relation
  profile   Profile? // 1-to-1 relation
  createdAt DateTime @default(now())
}

model Post {
  id        Int      @id @default(autoincrement())
  title     String
  content   String?
  published Boolean  @default(false)
  author    User     @relation(fields: [authorId], references: [id])
  authorId  Int
  tags      Tag[]    // many-to-many (implicit)
}

model Tag {
  id    Int    @id @default(autoincrement())
  name  String @unique
  posts Post[]
}

Generate & Use

# Install and initialise
npm install prisma @prisma/client
npx prisma init

# Generate client from schema
npx prisma generate

# Create and apply migration
npx prisma migrate dev --name init

# Visual database browser
npx prisma studio

Type-Safe Client

import { PrismaClient } from '@prisma/client';
const prisma = new PrismaClient();

// Full autocompletion & type checking
const user = await prisma.user.findUnique({
  where: { email: 'alice@dev.io' },
  include: { posts: true, profile: true }
});
// user.posts is typed as Post[]

Key Features

  • Auto-generated types from schema
  • Relation queries with include/select
  • Migration system with drift detection
  • Prisma Studio — visual data browser
  • Multi-provider — Postgres, MySQL, SQLite, MongoDB
08

Prisma in Practice

CRUD Operations

// Create
const user = await prisma.user.create({
  data: { name: 'Alice', email: 'alice@dev.io' }
});

// Read with filtering
const admins = await prisma.user.findMany({
  where: {
    role: 'ADMIN',
    createdAt: { gte: new Date('2024-01-01') }
  },
  orderBy: { name: 'asc' }
});

// Update
const updated = await prisma.user.update({
  where: { id: 1 },
  data: { name: 'Alice Smith' }
});

// Delete
await prisma.user.delete({ where: { id: 1 } });

Pagination

// Offset pagination
const page = await prisma.post.findMany({
  skip: 20,
  take: 10,
  orderBy: { createdAt: 'desc' }
});

// Cursor pagination (more efficient)
const next = await prisma.post.findMany({
  take: 10,
  cursor: { id: lastId },
  skip: 1, // skip the cursor itself
});

Transactions

// Interactive transaction
const transfer = await prisma.$transaction(async (tx) => {
  const sender = await tx.account.update({
    where: { id: 1 },
    data: { balance: { decrement: 100 } }
  });
  if (sender.balance < 0) throw new Error('Insufficient funds');

  const receiver = await tx.account.update({
    where: { id: 2 },
    data: { balance: { increment: 100 } }
  });
  return { sender, receiver };
});

Raw SQL Escape Hatch

// When Prisma's API isn't enough
const result = await prisma.$queryRaw`
  SELECT u.name, COUNT(p.id) as post_count
  FROM "User" u
  LEFT JOIN "Post" p ON p."authorId" = u.id
  GROUP BY u.id
  HAVING COUNT(p.id) > ${minPosts}
`;

// Execute without returning rows
await prisma.$executeRaw`
  TRUNCATE TABLE "Session" CASCADE
`;
09

Sequelize

Sequelize is a mature, full-featured ORM for Node.js supporting PostgreSQL, MySQL, MariaDB, SQLite, and MSSQL. It follows the Active Record pattern and remains widely used in established production codebases.

Model Definition

const { Sequelize, DataTypes } = require('sequelize');
const sequelize = new Sequelize(process.env.DATABASE_URL);

const User = sequelize.define('User', {
  name: {
    type: DataTypes.STRING,
    allowNull: false
  },
  email: {
    type: DataTypes.STRING,
    unique: true,
    validate: { isEmail: true }
  },
  role: {
    type: DataTypes.ENUM('USER', 'ADMIN'),
    defaultValue: 'USER'
  }
}, { paranoid: true }); // soft deletes

const Post = sequelize.define('Post', {
  title: DataTypes.STRING,
  content: DataTypes.TEXT,
  published: { type: DataTypes.BOOLEAN, defaultValue: false }
});

// Associations
User.hasMany(Post, { foreignKey: 'authorId' });
Post.belongsTo(User, { foreignKey: 'authorId' });

Querying

// Find with eager loading
const users = await User.findAll({
  where: { role: 'ADMIN' },
  include: [{ model: Post, where: { published: true } }],
  order: [['createdAt', 'DESC']],
  limit: 10
});

// Create with association
const user = await User.create({
  name: 'Bob',
  email: 'bob@dev.io',
  Posts: [{ title: 'Hello World' }]
}, { include: [Post] });

Hooks (Lifecycle Events)

User.beforeCreate(async (user) => {
  user.email = user.email.toLowerCase();
});

User.afterDestroy(async (user) => {
  await AuditLog.create({
    action: 'USER_DELETED', userId: user.id
  });
});

Sequelize vs Prisma

  • Sequelize: imperative, runtime model definition
  • Prisma: declarative schema, generated types
  • Sequelize better for dynamic/runtime schemas
  • Prisma better for TypeScript & type safety
10

Mongoose & MongoDB

Mongoose is the de facto ODM (Object Document Mapper) for MongoDB in Node.js. It adds schema validation, middleware, and relationship management to MongoDB's flexible document model.

Schema & Model

const mongoose = require('mongoose');
await mongoose.connect(process.env.MONGO_URI);

const userSchema = new mongoose.Schema({
  name:  { type: String, required: true, trim: true },
  email: { type: String, unique: true, lowercase: true },
  posts: [{ type: mongoose.Schema.Types.ObjectId, ref: 'Post' }],
  metadata: {
    loginCount: { type: Number, default: 0 },
    lastSeen: Date
  }
}, { timestamps: true }); // createdAt, updatedAt

// Middleware (hooks)
userSchema.pre('save', function(next) {
  if (this.isModified('password')) {
    this.password = hash(this.password);
  }
  next();
});

// Virtual fields
userSchema.virtual('postCount').get(function() {
  return this.posts.length;
});

const User = mongoose.model('User', userSchema);

Population (Joins)

// Populate references — MongoDB "join"
const user = await User.findById(userId)
  .populate({
    path: 'posts',
    match: { published: true },
    select: 'title createdAt',
    options: { sort: { createdAt: -1 }, limit: 5 }
  });

Aggregation Pipeline

const stats = await Post.aggregate([
  { $match: { published: true } },
  { $group: {
      _id: '$authorId',
      totalPosts: { $sum: 1 },
      avgLength: { $avg: { $strLenCP: '$content' } }
  }},
  { $sort: { totalPosts: -1 } },
  { $limit: 10 }
]);

When to Use MongoDB

  • Flexible/evolving schemas (CMS, IoT data)
  • Document-shaped data (nested objects, arrays)
  • Horizontal scaling with sharding
  • Not ideal: heavy joins, strict ACID across docs
11

Database Migrations

Migrations are version control for your database schema. They ensure every environment (dev, staging, production) has the same structure, and changes are applied in order.

Why Migrations Matter

  • Reproducible schema across environments
  • Team collaboration without conflicts
  • Rollback capability for failed deploys
  • Audit trail of schema evolution

Knex Migrations

// 20240315_add_posts_table.js
exports.up = function(knex) {
  return knex.schema.createTable('posts', (t) => {
    t.increments('id').primary();
    t.string('title').notNullable();
    t.text('content');
    t.integer('author_id')
      .references('id').inTable('users')
      .onDelete('CASCADE');
    t.timestamps(true, true);
  });
};

exports.down = function(knex) {
  return knex.schema.dropTable('posts');
};

Prisma Migrate

# Create migration from schema changes
npx prisma migrate dev --name add_posts

# Apply in production (no prompts)
npx prisma migrate deploy

# Check migration status
npx prisma migrate status
-- Generated SQL (prisma/migrations/...)
CREATE TABLE "Post" (
  "id" SERIAL NOT NULL,
  "title" TEXT NOT NULL,
  "content" TEXT,
  "authorId" INTEGER NOT NULL,
  CONSTRAINT "Post_pkey" PRIMARY KEY ("id"),
  CONSTRAINT "Post_authorId_fkey"
    FOREIGN KEY ("authorId")
    REFERENCES "User"("id") ON DELETE CASCADE
);

Migration Workflow

  • 1. Modify schema/create migration file
  • 2. Test locally with migrate dev
  • 3. Commit migration files to git
  • 4. CI/CD runs migrate deploy on release
  • 5. Rollback if health checks fail
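Conceptually, step 4's deploy command compares the migration files committed to git against the ones already recorded in the database, then applies the difference in order. A simplified sketch of that core decision (`pendingMigrations` is illustrative, not a Knex or Prisma API):

```javascript
// Simplified model of a migration runner's core decision:
// onDisk  = migration filenames committed to the repo
// applied = names recorded in the migrations table (e.g. knex_migrations)
function pendingMigrations(onDisk, applied) {
  const done = new Set(applied);
  return onDisk
    .slice()
    .sort() // timestamp-prefixed names sort chronologically
    .filter((name) => !done.has(name));
}
```

This is why migration files are immutable once merged: editing an already-applied file changes nothing in environments that recorded it as done.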
12

Seeding & Fixtures

Seeds populate databases with test data for development, testing, and demos. Fixtures provide deterministic data for automated tests.

Knex Seeds

// seeds/01_users.js
exports.seed = async function(knex) {
  await knex('users').del(); // clear existing

  await knex('users').insert([
    { id: 1, name: 'Alice', email: 'alice@test.io', role: 'ADMIN' },
    { id: 2, name: 'Bob',   email: 'bob@test.io',   role: 'USER' },
    { id: 3, name: 'Carol', email: 'carol@test.io',  role: 'USER' }
  ]);
};
npx knex seed:run          # run all seeds
npx knex seed:run --specific=01_users.js

Prisma Seed

// prisma/seed.ts
import { PrismaClient } from '@prisma/client';
const prisma = new PrismaClient();

async function main() {
  const alice = await prisma.user.upsert({
    where: { email: 'alice@test.io' },
    update: {},
    create: { name: 'Alice', email: 'alice@test.io' }
  });
}
main()
  .catch((e) => { console.error(e); process.exit(1); })
  .finally(() => prisma.$disconnect());

Factory Patterns

// test/factories/user.factory.ts
import { faker } from '@faker-js/faker';

export function buildUser(overrides = {}) {
  return {
    name: faker.person.fullName(),
    email: faker.internet.email(),
    role: 'USER',
    ...overrides  // allow test-specific overrides
  };
}

// In tests
const admin = buildUser({ role: 'ADMIN' });
const user = await prisma.user.create({
  data: buildUser()
});

Seed Best Practices

  • Idempotent — safe to run multiple times
  • Deterministic — same output every run for tests
  • Isolated — seeds don't depend on external APIs
  • Ordered — respect foreign key constraints
  • Separate dev seeds from test fixtures
13

Transactions

A transaction groups multiple operations into an atomic unit. Either all operations succeed (commit) or all are rolled back. Essential for data integrity.

ACID Properties

Property      Meaning
Atomicity     All or nothing — partial writes never persist
Consistency   DB moves from one valid state to another
Isolation     Concurrent txns don't interfere with each other
Durability    Committed data survives crashes

Prisma Interactive Transactions

await prisma.$transaction(async (tx) => {
  // All queries use the same transaction
  const order = await tx.order.create({
    data: { userId: 1, total: 99.99 }
  });
  await tx.inventory.update({
    where: { productId: 42 },
    data: { stock: { decrement: 1 } }
  });
  await tx.payment.create({
    data: { orderId: order.id, amount: 99.99 }
  });
  // If any fails, ALL are rolled back
});

Knex Transactions

await knex.transaction(async (trx) => {
  const [{ id: orderId }] = await trx('orders')
    .insert({ user_id: 1, total: 99.99 })
    .returning('id'); // modern Knex returns [{ id }], not [id]

  await trx('inventory')
    .where('product_id', 42)
    .decrement('stock', 1);

  await trx('payments')
    .insert({ order_id: orderId, amount: 99.99 });
});

Isolation Levels

Level              Prevents
Read Uncommitted   Nothing (fastest)
Read Committed     Dirty reads
Repeatable Read    + Non-repeatable reads
Serializable       + Phantom reads (safest)

PostgreSQL default: Read Committed. Use Serializable for financial operations.
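For the financial case, here is a hedged sketch of requesting Serializable explicitly with node-postgres. `BEGIN ISOLATION LEVEL SERIALIZABLE` is standard PostgreSQL syntax; `transferSerializable` is a hypothetical helper, and callers should be prepared to retry on SQLSTATE 40001 (serialization_failure).

```javascript
// Hypothetical helper: runs a transfer at Serializable isolation.
// On conflict PostgreSQL aborts one txn with code 40001; retry it.
async function transferSerializable(client, from, to, amount) {
  await client.query('BEGIN ISOLATION LEVEL SERIALIZABLE');
  try {
    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, from]
    );
    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, to]
    );
    await client.query('COMMIT');
  } catch (e) {
    await client.query('ROLLBACK');
    throw e; // 40001 is transient; the caller should retry
  }
}
```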

14

Connection Management in Express

How you wire your database into Express affects performance, testability, and resource management.

Shared Pool (Recommended)

// db.js — singleton pool
const { Pool } = require('pg');
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 20
});
module.exports = pool;

// routes/users.js
const pool = require('../db');

router.get('/users', async (req, res) => {
  const { rows } = await pool.query(
    'SELECT * FROM users'
  );
  res.json(rows);
});

Dependency Injection

// Attach to app for testability
function createApp(pool) {
  const app = express();

  app.get('/users', async (req, res) => {
    const { rows } = await pool.query(
      'SELECT * FROM users'
    );
    res.json(rows);
  });

  return app;
}

// Production
const app = createApp(realPool);

// Test
const app = createApp(mockPool);

Middleware Pattern

// Attach client to request
app.use(async (req, res, next) => {
  req.db = await pool.connect();
  res.on('close', () => req.db.release()); // 'close' fires even on aborted requests
  next();
});

// Use in routes
app.get('/users', async (req, res) => {
  const { rows } = await req.db.query(
    'SELECT * FROM users'
  );
  res.json(rows);
});

Anti-Pattern

Never create a new pool per request. Pool creation is expensive and you'll exhaust database connections.
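One way to enforce this rule structurally is to memoize pool creation behind a getter, so even careless imports share one instance. `makeGetPool` is a hypothetical sketch; in practice the module-level singleton in db.js above achieves the same thing, since Node caches modules.

```javascript
// Hypothetical sketch: whatever createPool does (new Pool(...),
// new PrismaClient(), ...), it runs exactly once per process.
function makeGetPool(createPool) {
  let pool = null;
  return function getPool() {
    if (pool === null) pool = createPool(); // created once, reused forever
    return pool;
  };
}

// const getPool = makeGetPool(() => new Pool({ max: 20 }));
// getPool() === getPool()  // same instance everywhere
```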

Prisma in Express

// Create ONE PrismaClient instance and reuse it globally
const prisma = new PrismaClient();

app.get('/users', async (req, res) => {
  res.json(await prisma.user.findMany());
});

// Prisma manages its own connection pool internally —
// no manual pool management needed.
15

Error Handling

Database errors are inevitable. Your app must handle constraint violations, connection failures, and timeouts gracefully.

Constraint Violations

try {
  await prisma.user.create({
    data: { email: 'alice@dev.io' } // duplicate
  });
} catch (e) {
  if (e.code === 'P2002') { // Prisma unique violation
    return res.status(409).json({
      error: `${e.meta.target} already exists`
    });
  }
  throw e; // re-throw unexpected errors
}

// pg: error.code === '23505' (unique_violation)
// pg: error.code === '23503' (fk_violation)

Connection Failures

pool.on('error', (err) => {
  console.error('Idle client error:', err.message);
  // Pool auto-removes dead connections
});

// Health check endpoint
app.get('/health', async (req, res) => {
  try {
    await pool.query('SELECT 1');
    res.json({ db: 'connected' });
  } catch {
    res.status(503).json({ db: 'disconnected' });
  }
});

Retry Strategy

async function withRetry(fn, { retries = 3, delay = 100 } = {}) {
  for (let i = 0; i < retries; i++) {
    try {
      return await fn();
    } catch (err) {
      const isTransient = ['ECONNRESET', 'ETIMEDOUT',
        '57P01'].includes(err.code); // 57P01 = admin shutdown
      if (!isTransient || i === retries - 1) throw err;
      await new Promise(r =>
        setTimeout(r, delay * Math.pow(2, i)) // expo backoff
      );
    }
  }
}

Dead Letter Patterns

  • Failed operations → dead letter queue
  • Retry with exponential backoff + jitter
  • Alert after N failures (PagerDuty, Slack)
  • Manual review queue for persistent failures
  • Circuit breaker: stop calling failing DB
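The circuit-breaker bullet fits in a few lines. This is a minimal illustration, not a production implementation (libraries such as opossum exist for that); `createBreaker` and its defaults are assumptions.

```javascript
// Minimal circuit-breaker sketch. After `threshold` consecutive
// failures the breaker opens and rejects immediately for `cooldownMs`,
// giving the database time to recover instead of piling on load.
function createBreaker(fn, { threshold = 5, cooldownMs = 30000 } = {}) {
  let failures = 0;
  let openedAt = 0;
  return async function guarded(...args) {
    // Open state: fail fast without touching the database
    if (failures >= threshold && Date.now() - openedAt < cooldownMs) {
      throw new Error('circuit open: skipping call');
    }
    try {
      const result = await fn(...args);
      failures = 0; // any success closes the breaker
      return result;
    } catch (err) {
      failures += 1;
      if (failures >= threshold) openedAt = Date.now();
      throw err; // after cooldown, the next call is a half-open probe
    }
  };
}
```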

Common Error Codes (PostgreSQL)

Code    Meaning
23505   Unique violation
23503   Foreign key violation
42P01   Table does not exist
40001   Serialization failure
16

Performance

The biggest database performance killers in Node.js apps are N+1 queries, missing indexes, and misconfigured connection pools.

The N+1 Problem

// BAD: 1 query for users + N queries for posts
const users = await prisma.user.findMany();
for (const user of users) {
  user.posts = await prisma.post.findMany({
    where: { authorId: user.id }  // N queries!
  });
}

// GOOD: eager loading — 1-2 queries total
const users = await prisma.user.findMany({
  include: { posts: true }
});

Query Analysis

-- Use EXPLAIN ANALYZE to find slow queries
EXPLAIN ANALYZE
SELECT u.*, COUNT(p.id)
FROM users u
LEFT JOIN posts p ON p.author_id = u.id
GROUP BY u.id;

-- Look for: Seq Scan (missing index), Nested Loop
-- (N+1), high actual time, row estimates vs actuals

Indexing

// Prisma schema indexes
model Post {
  id       Int    @id @default(autoincrement())
  authorId Int
  status   String
  title    String

  @@index([authorId])          // single column
  @@index([status, authorId])  // composite
  @@index([title], type: BTree) // explicit type
}

Eager Loading Strategies

Strategy   ORM         Method
Include    Prisma      include: { posts: true }
Eager      Sequelize   include: [Post]
Populate   Mongoose    .populate('posts')
Join       Knex        .join('posts', ...)

Connection Pool Tuning

  • Monitor pool.waitingCount — if high, increase max
  • Monitor pool.idleCount — if high, decrease max
  • Set statement_timeout to kill runaway queries
  • Use pgBouncer for connection multiplexing at scale
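The first two bullets can be wired into a periodic metrics log. `totalCount`, `idleCount`, and `waitingCount` are real properties on node-postgres's Pool; `poolReport` itself is a hypothetical helper name.

```javascript
// Snapshot of pool health; pass a node-postgres Pool instance.
function poolReport(pool) {
  return {
    total: pool.totalCount,      // all open connections
    idle: pool.idleCount,        // sitting in the pool, unused
    waiting: pool.waitingCount,  // queued checkouts: demand exceeds supply
    saturated: pool.waitingCount > 0 && pool.idleCount === 0
  };
}

// e.g. setInterval(() => console.log(poolReport(pool)), 10000);
```

A persistently truthy `saturated` flag suggests raising `max` (or adding pgBouncer); a large steady `idle` count suggests lowering it.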
17

Testing Database Code

Database tests need isolation (no test affects another), speed (fast feedback loops), and reproducibility (same results every run).

Test Database Strategy

// jest.setup.js — per-test transaction rollback
beforeEach(async () => {
  await pool.query('BEGIN');
});

afterEach(async () => {
  await pool.query('ROLLBACK'); // undo all changes
});

In-Memory SQLite

// Use SQLite for fast unit tests
const knex = require('knex')({
  client: 'better-sqlite3',
  connection: { filename: ':memory:' },
  useNullAsDefault: true
});

beforeAll(async () => {
  await knex.migrate.latest(); // apply schema
  await knex.seed.run();       // load fixtures
});

afterAll(() => knex.destroy());

Prisma Test Setup

// Use a separate test database
// .env.test
// DATABASE_URL="postgresql://localhost/myapp_test"

import { Prisma } from '@prisma/client'; // needed for dmmf below

beforeEach(async () => {
  // Clean all tables between tests
  const tables = Prisma.dmmf.datamodel.models
    .map(m => m.dbName || m.name);
  for (const table of tables) {
    await prisma.$executeRawUnsafe(
      `TRUNCATE TABLE "${table}" CASCADE`
    );
  }
});

Testing Strategies Compared

Strategy                 Speed     Fidelity
Transaction rollback     Fast      High
Truncate between tests   Medium    High
In-memory SQLite         Fastest   Lower
Docker per test suite    Slow      Highest

Fixtures

Use factory functions (see Slide 12) for test data. Avoid shared mutable state. Each test should create exactly the data it needs.

18

Choosing an ORM / Query Builder

There is no "best" tool — the right choice depends on your project's type safety needs, query complexity, team experience, and performance requirements.

Criterion           pg / mysql2      Knex              Prisma                  Sequelize               Mongoose
Type Safety         None             Basic             Excellent               Via TS defs             Via TS defs
Learning Curve      Low (just SQL)   Low-Medium        Medium                  Medium-High             Medium
Query Flexibility   Full SQL         High              Medium                  Medium                  High (aggregation)
Migrations          Manual           Built-in          Built-in                Built-in                N/A
Performance         Best             Near-native       Good (Rust engine)      Overhead                Depends on queries
Relations           Manual JOINs     Manual JOINs      Declarative             Declarative             Population
Database            One per driver   PG/MySQL/SQLite   PG/MySQL/SQLite/Mongo   PG/MySQL/SQLite/MSSQL   MongoDB only
Ecosystem           Huge             Mature            Growing fast            Mature                  Mature

Choose Prisma When

  • TypeScript project
  • Want generated types from schema
  • Standard CRUD-heavy application
  • New greenfield project

Choose Knex / Raw When

  • Complex reporting queries
  • Dynamic query composition
  • Maximum performance needed
  • Team knows SQL well

Choose Sequelize / Mongoose When

  • Existing project already uses it
  • Need runtime model definition
  • MongoDB document store (Mongoose)
  • Rich hook/middleware system needed
19

Summary & Next Steps

What We Covered

  • Raw SQL drivers: pg, mysql2, better-sqlite3
  • Connection pooling & lifecycle management
  • Query builder: Knex.js — composable SQL
  • Modern ORM: Prisma — type-safe, declarative schema
  • Classic ORM: Sequelize — Active Record pattern
  • ODM: Mongoose — MongoDB schema validation
  • Migrations, seeding, transactions, ACID
  • Error handling, retries, performance tuning
  • Testing strategies: rollback, fixtures, in-memory DB

Next Steps

  • Build a REST API with Prisma + Express
  • Set up migrations & seed a dev database
  • Write integration tests with transaction rollback
  • Profile queries with EXPLAIN ANALYZE
  • Explore connection pooling with pgBouncer
  • Try Drizzle ORM — another modern TypeScript ORM
  • Learn about database replication & read replicas
  • Explore Redis for caching & session storage

Practice Projects

  • Blog API with Prisma + migrations
  • E-commerce with transactions
  • Chat app with Mongoose + MongoDB

Key Takeaway

Start with Prisma for new TypeScript projects. Use raw SQL when you need performance or complex queries. Always use connection pooling, parameterised queries, and migrations.