Securing Node-Postgres With SSL

While recently developing the Consentric platform, we found ourselves needing to connect a lightweight micro-service written in Node.js to a PostgreSQL database. Thankfully, there’s a really nice library called node-postgres that makes that job super simple.

We began by connecting to the database over SSL in the way the node-postgres library outlines in its clear and concise example, using a self-signed certificate:

import fs from 'fs'
import { Pool } from 'pg'

const config = {
  database: 'database-name',
  host: 'host-or-ip',
  // this object will be passed to the TLSSocket constructor
  ssl: {
    rejectUnauthorized: false,
    ca: fs.readFileSync('/path/to/server-certificates/root.crt').toString(),
    key: fs.readFileSync('/path/to/client-key/postgresql.key').toString(),
    cert: fs.readFileSync('/path/to/client-certificates/postgresql.crt').toString(),
  },
}

const pool = new Pool(config)

pool
  .connect()
  .then(client => {
    console.log('connected')
    client.release()
  })
  .catch(err => console.error('error connecting', err.stack))
  .then(() => pool.end())

This is perfectly functional and allows basic encrypted SSL communication to take place. However, the eagle-eyed amongst you will have noticed the "rejectUnauthorized" flag. This flag controls whether the client should reject a connection whose certificate cannot be verified against the list of supplied certificate authorities; with it set to false, unverified connections are accepted. For a self-signed certificate that is expected, but if we want to be 100% confident of our security over the public internet for a production system, then we need stronger verification than that.
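To make the effect of that flag concrete, here is a small aside that is not part of the original example: a raw TLS connection, using Node's built-in tls module, to a host presenting a self-signed certificate (the public badssl.com test site stands in for any such server). With rejectUnauthorized set to false the handshake completes even though verification failed, and the failure is only visible afterwards:

const tls = require('tls');

// self-signed.badssl.com is a public test endpoint that deliberately serves a
// self-signed certificate; it stands in here for any unverifiable server.
const socket = tls.connect(
  { host: 'self-signed.badssl.com', port: 443, rejectUnauthorized: false },
  () => {
    // The connection succeeds, but the certificate was never verified.
    console.log('connected, authorized =', socket.authorized); // false
    console.log('verification failure  =', socket.authorizationError); // why verification failed
    socket.end();
  }
);

socket.on('error', err => console.error('TLS error:', err.message));

Flipping rejectUnauthorized to true in the same snippet makes the handshake fail outright instead of connecting silently.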

We would do well here to take a quick step back and remind ourselves exactly what we mean by SSL and its various components. A full description of the process can be found in the links below, but a simplified version of the steps is as follows. 

  1. The client sends a request to the server asking for an HTTPS connection.
  2. The server responds with its certificate.
  3. The client verifies the certificate against known certificate authorities and sends back a key generated from the server certificate and the client key.
  4. The server verifies the returned key using its private key and stores it for further communication.

The key point in the above lies in step 3. If the client does not verify the certificate, it is effectively agnostic as to which server it is speaking to: as long as the server provides the other security components, the client will happily establish a connection. Omitting that step therefore leaves a production system far more exposed to man-in-the-middle attacks. More information on those can be found in the links below.
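By contrast, here is an equally small sketch (again not part of the original example) of step 3 working as intended. Node's default is rejectUnauthorized: true, so the handshake below only completes if the server's certificate chains back to a trusted certificate authority; the host is just a stand-in for any publicly reachable TLS endpoint:

const tls = require('tls');

// rejectUnauthorized defaults to true, so an unverifiable certificate would
// produce an error instead of ever reaching this callback.
const socket = tls.connect({ host: 'example.com', port: 443 }, () => {
  console.log('certificate verified:', socket.authorized); // true
  socket.end();
});

socket.on('error', err => console.error('TLS handshake failed:', err.message));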

Returning to our initial example: in its current state we are not verifying the certificate authority, because the 'rejectUnauthorized' flag is set to false, and hence we are vulnerable to the exploits mentioned above.

Unfortunately, the fix is not simply a matter of switching the flag to 'true'. In the node-postgres connection above, doing so results in the following error:

Hostname/IP does not match certificate's altnames: Host: localhost/https. is not cert's CN:

We have now started attempting to verify the certificate authority, which is great! The downside is that the default host used to compare against the common name of the certificate is usually either 'localhost' or simply 'https'. The 'ca' field we passed into our example connection earlier overrides the default list of trusted certificate authorities to be solely the authority of our server certificate, but it does not provide a name against which the certificate can be verified.
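To see exactly what that native check does, the following illustration (not from the original post) calls tls.checkServerIdentity directly with a hand-built certificate object; the names are invented purely for demonstration. The function returns undefined when the host matches the certificate and an Error when it does not:

const tls = require('tls');

// A hand-built stand-in for the certificate object Node's TLS layer would
// normally supply; the common name and altname are invented examples.
const fakeCert = {
  subject: { CN: 'db.internal.example' },
  subjectaltname: 'DNS:db.internal.example',
};

// Matching host: returns undefined, i.e. verification passes.
console.log(tls.checkServerIdentity('db.internal.example', fakeCert));

// Non-matching host (such as the default 'localhost'): returns an Error
// describing the mismatch, which mirrors the failure we hit above.
console.log(tls.checkServerIdentity('localhost', fakeCert));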

What we need to do to solve this is override the native Node tls.checkServerIdentity() function, which checks that the host we are connecting to matches the name on the certificate. Something like this:

 

const fs = require('fs');
const tls = require('tls');
const { Pool } = require('pg');

const checkServerIdentity = (host, cert) => {
    return tls.checkServerIdentity('CERT_COMMON_NAME_HERE', cert);
};

const pool = new Pool({
    connectionString: 'postgres://user:password@hostnameOrIP:port/databaseName',
    connectionTimeoutMillis: 5000,
    max: 1,
    ssl: {
        rejectUnauthorized: true,
        ca: fs.readFileSync('/path/to/cert-authority.pem', 'utf8'),
        key: fs.readFileSync('/path/to/client-key.pem', 'utf8'),
        cert: fs.readFileSync('/path/to/client-cert.pem', 'utf8'),
        checkServerIdentity,
    }
});

In our case the solution is as simple as wrapping the native checkServerIdentity function with our own, passing a specific string to the function in place of the default host. This example is hard-coded, but the value could equally be imported from another file or passed in as an environment variable; that's entirely up to you as the developer.
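As a sketch of the environment-variable approach (the variable name DB_CERT_COMMON_NAME is an invented example, not something node-postgres looks for), the wrapper could be written as:

const tls = require('tls');

// Read the expected common name from configuration rather than hard-coding it.
const certCommonName = process.env.DB_CERT_COMMON_NAME;

const checkServerIdentity = (host, cert) =>
    tls.checkServerIdentity(certCommonName, cert);

The resulting function drops into the ssl configuration exactly as before.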

You must remember the return statement in your checkServerIdentity override. The Node TLS layer treats an undefined return value from this function as passing verification, regardless of the result of any calls to tls.checkServerIdentity.
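To make that pitfall concrete, here is a side-by-side sketch (not from the original post) of a broken override next to the correct one:

const tls = require('tls');

// Broken: the Error (or undefined) returned by tls.checkServerIdentity is
// discarded, so this override itself always returns undefined and every
// certificate appears to pass verification.
const brokenCheckServerIdentity = (host, cert) => {
    tls.checkServerIdentity('CERT_COMMON_NAME_HERE', cert);
};

// Correct: the result is returned, so a mismatch is reported back to the TLS
// layer and the connection is rejected.
const checkServerIdentity = (host, cert) => {
    return tls.checkServerIdentity('CERT_COMMON_NAME_HERE', cert);
};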

Once we'd provided our own wrapper function, the error disappeared and the connection was fully secured, with the server's certificate properly verified over SSL/TLS. Hopefully this proved interesting and maybe even helped someone out of a bind!

Remember to check out Consentric in the links below. 

Links: 

Andy Summers
