    PostgreSQL and NodeJS - Tutorial with working example

    Do you need to connect your Node.js application to your PostgreSQL database? Cool! Here you have a working example.


    What do we need?

    We don't need much to achieve this goal, just:


    • Express framework
    • node-postgres package


    And that's all we need.
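
    If you are starting from a clean project, one npm command pulls in everything the example below uses (express, body-parser, and pg, which is the npm name node-postgres is published under):

    npm install express body-parser pg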

    Some database foundations

    Here's the thing: when dealing with databases you may face two scenarios:


    • Transactional queries
    • Non-transactional queries
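
    In node-postgres terms, the difference looks roughly like this. It's only a sketch: the accounts table is made up for the example, and I'm using async/await instead of callbacks for brevity.

    const { Client } = require('pg');

    async function demo() {
      const client = new Client({ /* user, host, database, password, port */ });
      await client.connect();

      // Non-transactional: one independent statement.
      const sample = await client.query('SELECT * FROM sample');
      console.log(sample.rows);

      // Transactional: several statements that must succeed or fail together.
      try {
        await client.query('BEGIN');
        await client.query('UPDATE accounts SET balance = balance - 100 WHERE id = 1');
        await client.query('UPDATE accounts SET balance = balance + 100 WHERE id = 2');
        await client.query('COMMIT');
      } catch (err) {
        await client.query('ROLLBACK');
        throw err;
      }

      await client.end();
    }

    demo();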

    And when connecting to databases, you may be using:


    • A single connection
    • A pooled connection
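
    The Client in the sketch above is the single-connection style: you open one dedicated session, use it, and close it yourself. A pool hands that management over to node-postgres. Here is a minimal sketch, again with the connection settings omitted:

    const { Pool } = require('pg');

    // The pool keeps up to `max` connections open and reuses them across queries.
    const pool = new Pool({ max: 4 /* plus the same settings a Client would take */ });

    async function pooledDemo() {
      // Each query transparently borrows a connection from the pool and returns it.
      const result = await pool.query('SELECT NOW()');
      console.log(result.rows);

      // Close every connection when the application shuts down.
      await pool.end();
    }

    pooledDemo();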


    There's plenty of information about those topics available on the Internet. You can dig deeper if you still have doubts.

    Working example code

    Key concepts: in this example, I will create two routes.

    One route will run a one-off, non-transactional query through the pool's query shortcut.
    The other will explicitly check a client out of the pool, which is the pattern you would build on for transactional queries.

    I've named these routes "/isolated" and "/pool", respectively.

    Code Organization


    The code is organized in two files: the main server file and the database functions file.

    The server file goes like this:

    const express = require('express')
    const bodyParser = require('body-parser')
    const https = require('https');
    const fs = require('fs');
    const db = require('./testDB');

    const app = express()

    // Paths to the certificate files used by the HTTPS server.
    const PATH_TO_BUNDLE_CERT_1 = '.....';
    const PATH_TO_BUNDLE_CERT_2 = '....';
    const PATH_TO_CERT = '....';
    const PATH_TO_KEY = '....';

    const options = {
      ca: [fs.readFileSync(PATH_TO_BUNDLE_CERT_1), fs.readFileSync(PATH_TO_BUNDLE_CERT_2)],
      cert: fs.readFileSync(PATH_TO_CERT),
      key: fs.readFileSync(PATH_TO_KEY)
    };

    const SSL_PORT = 3000;
    const SERVER_MESSAGE = 'Server started: ' + SSL_PORT;

    // Parse JSON and URL-encoded request bodies.
    app.use(bodyParser.json());

    app.use(
      bodyParser.urlencoded({
        extended: true,
      })
    );

    // Simple health-check route.
    app.get('/', (request, response) => {
      response.json({ info: 'Node.js, Express, and Postgres API' })
    });

    // The two routes described above, implemented in testDB.js.
    app.get('/isolated', db.doIsolated);
    app.get('/pool', db.doPool);

    // Start the HTTPS server.
    const server = https.createServer(options, app);
    server.listen(SSL_PORT, function () {
      console.log(SERVER_MESSAGE);
    });

    And the testDB.js file will have this content:

    const Pool = require('pg').Pool

    // Connection pool: at most 4 simultaneous connections to the database.
    const pool = new Pool({
      user: '----',
      host: 'localhost',
      database: '---',
      password: '---',
      port: 5432,
      max: 4,
    });

    // Request counter ("contador"), shared by both routes.
    let contador = 0;

    // /isolated: run a one-off, non-transactional query through the pool's query shortcut.
    const doIsolated = (request, response) => {
      contador = contador + 1;
      pool.query('SELECT * FROM sample', (error, results) => {
        if (error) {
          console.error('Error executing query', error.stack)
          return response.status(500).json({ error: 'query failed' })
        }
        response.status(200).json({ pedido: contador, resultset: results.rows })
      })
    }

    // /pool: explicitly check a client out of the pool, run the query on it,
    // and release the client back to the pool when done.
    const doPool = (request, response) => {
      pool.connect((err, client, release) => {
        if (err) {
          console.error('Error acquiring client', err.stack)
          return response.status(500).json({ error: 'could not acquire client' })
        }
        client.query('SELECT * FROM sample', (err, results) => {
          release()
          if (err) {
            console.error('Error executing query', err.stack)
            return response.status(500).json({ error: 'query failed' })
          }
          contador = contador + 1;
          response.status(200).json({ pedido: contador, resultset: results.rows })
        });
      });
    }

    module.exports = {
      doIsolated,
      doPool,
    }

    And that's all folks!

    Wrapping Up


    You can play with the maximum number of connections allowed in the database pool, or submit lots of concurrent requests to those routes and see how they perform under stress.
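
    For instance, a quick way to fire a batch of concurrent requests from Node itself might look like this. This is a hypothetical stress.js, assuming the server above is running locally on port 3000 with a certificate your machine trusts:

    const https = require('https');

    const CONCURRENT_REQUESTS = 50;

    // Fire one HTTPS GET against the local server and resolve with the response body.
    function hit(path) {
      return new Promise((resolve, reject) => {
        https.get({ host: 'localhost', port: 3000, path }, (res) => {
          let body = '';
          res.on('data', (chunk) => { body += chunk; });
          res.on('end', () => resolve(body));
        }).on('error', reject);
      });
    }

    // Launch all requests at once and wait for every response.
    Promise.all(
      Array.from({ length: CONCURRENT_REQUESTS }, () => hit('/pool'))
    ).then((responses) => {
      console.log('Completed', responses.length, 'requests');
    }).catch((err) => console.error(err));

    With max set to 4 in the pool, the extra requests simply wait for a free connection instead of opening new ones, so this is an easy way to watch the pool at work.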

    Feel free to ping me in the comments section if you still have questions or doubts.

