Node.js Microservice Architecture: Connecting a Node.js app with a Postgres database
Node.js · Part 8 · Apr 16, 2022
We are setting up a local Postgres database with docker-compose for our Node.js service so that we can persist data.
A database allows us to persist data beyond the lifetime of the application. Postgres is an SQL database that we can access with the Node.js library Knex and the corresponding Postgres driver. You can read in a different post why TypeScript is awesome for SQL queries.
Local docker database
First, we create a file docker-compose.yml to manage the local Postgres database.
version: "3.1"
services:
  database:
    image: "postgres:14"
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: postgres
    ports:
      - "5432:5432"
    volumes:
      - database-data:/var/lib/postgresql/data/
volumes:
  database-data:
The file docker-compose.yml specifies that a Postgres database should be running. It has a fixed user, database name and password. The database port 5432 is exposed on localhost:5432 so that the Node application (npm start) can access it. The data is persisted in a named Docker volume database-data.
With the docker-compose configuration specified, the docker-compose tool can be run.
Note: The docker-compose tool needs to be installed natively on your machine.
docker-compose up -d
The running container can be inspected.
docker-compose ps
The container can be shut down.
docker-compose down
Database configuration
We need to install both the Node.js packages knex and pg. The knex package already contains the TypeScript types for Knex. The pg package requires the separate @types/pg package for its types.
npm install --save knex pg
npm install --save-dev @types/pg
The application needs to know where it can find the database and how to access it. We capture this in the application configuration.
import { Knex } from "knex";

export interface Config {
  environment: Environment;
  // ...
  database: Knex.PgConnectionConfig;
}

export interface ProcessVariables {
  ENV?: Environment;
  // ...
  DATABASE_URL?: string;
}
The process variables define a variable DATABASE_URL for the connection string. This string has the format postgresql://username:password@host:port/dbname. Alternatively, the host, port, user, password and database name could be passed independently via environment variables. The configuration contains an object database that holds the connection configuration expected by the Postgres driver.
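The configuration loader that picks between the environments is not shown in this post. A minimal sketch of how it could dispatch on ENV, assuming a hypothetical getConfig helper and illustrative import paths:

import { Config, ProcessVariables } from "./config.type";
import { getLocalConfig } from "./environments/get-local.config";
import { getProductionConfig } from "./environments/get-production.config";

// Hypothetical selector: the import paths and the "local" fallback are assumptions.
export function getConfig(processVariables: ProcessVariables): Config {
  switch (processVariables.ENV) {
    case "production":
      return getProductionConfig(processVariables);
    default:
      return getLocalConfig(processVariables);
  }
}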
Production configuration
In the production configuration, we assign the database object from the environment variable DATABASE_URL.
import { Config, ProcessVariables } from "../config.type";

export function getProductionConfig(
  processVariables: ProcessVariables
): Config {
  return {
    environment: "production",
    // ...
    database: {
      connectionString: processVariables.DATABASE_URL,
      ssl: true,
    },
  };
}
We can add a test to verify the correct assignment.
import { Knex } from "knex";
import { getProductionConfig } from "./get-production.config";

describe("the production configuration", () => {
  // ...
  it("reads the database configuration from the environment", () => {
    expect(
      getProductionConfig({
        DATABASE_URL: "postgresql://username:password@host:port/dbname",
      })
    ).toHaveProperty("database", {
      connectionString: "postgresql://username:password@host:port/dbname",
      ssl: true,
    } as Knex.PgConnectionConfig);
  });
});
Local configuration
The local configuration consists of constant values that mirror the values defined in the docker-compose configuration docker-compose.yml.
import { Config, ProcessVariables } from "../config.type";

export function getLocalConfig(processVariables: ProcessVariables): Config {
  return {
    environment: "local",
    // ...
    database: {
      user: "postgres",
      host: "localhost",
      database: "postgres",
      port: 5432,
      password: "secret",
      ssl: false,
    },
  };
}
Connection pool
With the configuration set up, we create a constant db that holds the application's global database connection pool. By default, the pool holds a handful of connections; you can pass additional pool configuration to change that.
import { Knex } from "knex";
import { config } from "../configuration/config";

export const db = require("knex")({
  client: "pg",
  connection: config.database,
}) as Knex;
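If the default pool size does not fit, Knex accepts a pool option when the instance is created. A minimal sketch, where the min and max values are only illustrative:

export const db = require("knex")({
  client: "pg",
  connection: config.database,
  // Optional pool sizing; the values below are illustrative, not a recommendation.
  pool: { min: 2, max: 10 },
}) as Knex;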
Accessing the database
An endpoint GET /health allows us to verify from the outside whether the application can access the database. We send a select query against a table that already exists in every Postgres database to see if it succeeds. If the database cannot be accessed, the query throws an error and leads to a 500 response.
import { NextFunction, Request, Response } from "express";
import { db } from "../database/db";

export async function healthController(
  _request: Request,
  response: Response,
  next: NextFunction
): Promise<void> {
  try {
    await db.from("information_schema.tables").select();
    response.sendStatus(204);
  } catch (error) {
    next(error);
  }
}
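The same db pool can be reused for ordinary queries elsewhere in the service. A minimal sketch with a hypothetical users table that this post does not create:

import { db } from "../database/db";

// Hypothetical example: a "users" table is not part of this post.
export async function findUserByEmail(email: string) {
  return db.from("users").where({ email }).first();
}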
We register the health check in the application app.ts.
import express from "express";
import { healthController } from "./controllers/health.controller";
const app = express();
// ...
app.use("/health", healthController);
// ...
export default app;
Now, calling the endpoint yields a success 204 if the database is accessible.
http http://localhost:3000/health
HTTP/1.1 204 No Content
Connection: keep-alive
Date: Sat, 16 Apr 2022 14:40:26 GMT
ETag: W/"a-bAsFyilMr4Ra1hIU5PyoyFRunpI"
Keep-Alive: timeout=5
X-Powered-By: Express
If the database is inaccessible (forgot to run docker-compose up -d?), then it returns an error 500.
http http://localhost:3000/health
HTTP/1.1 500 Internal Server Error
Connection: keep-alive
Content-Length: 49
Content-Type: application/json; charset=utf-8
Date: Sat, 16 Apr 2022 14:41:46 GMT
ETag: W/"31-N3CMxmHVH2QRVY7BhlpI3H4+usU"
Keep-Alive: timeout=5
X-Powered-By: Express
{
  "message": "connect ECONNREFUSED 127.0.0.1:5432"
}
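The JSON body above is produced by an error-handling middleware that this post does not show. A minimal sketch of such a handler, assuming it is registered after the routes with app.use(errorController):

import { NextFunction, Request, Response } from "express";

// Hypothetical error handler: Express treats a four-argument middleware as an error handler.
export function errorController(
  error: Error,
  _request: Request,
  response: Response,
  _next: NextFunction
): void {
  response.status(500).json({ message: error.message });
}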
Conclusion
We have set up a local Postgres database. It is accessible to the local application and can be configured via environment variables for production.
Become a GitHub sponsor to access the code of this post as a GitHub repo.