I have been writing API applications with Ruby on Rails for years, and I am pretty satisfied with it. The usual instruments that provide the desired usability are:

  1. Rails::API, a more lightweight and faster flavor of Rails
  2. PostgreSQL as a relational database
  3. Active Model Serializers for configuring the response’s appearance

This scope is a good fit for me. However, in this short series of posts, I’ll look at the basics of building a rails-like node.js API instead. There are several reasons for that.

Firstly, last year saw a huge rise in the popularity of microservices. Separating a large monolithic application into microservices makes it possible to replace individual rails services with something else.

Moreover, in some cases it is better to avoid a framework as heavyweight as rails. Plus, when you write a node.js application, you’re writing and learning Javascript. And the lifelong practice of learning is what makes us human and our lives worthwhile.

So I have decided to give it a try and “translate” the familiar stack of technologies into Javascript.

In this part I am going to describe how to set up a project based on express.js. As the first step, I am going to create simple CRUD logic for a model that I named article.

1. Create new express application

Let’s get going by installing express-generator (it provides the express command used to scaffold new applications):

npm install -g express-generator

Once we’re done, we can generate a new application and install its dependencies:

express example-express-api
cd example-express-api
npm install

The server can already be fired up by typing npm start, but for a better development experience we need a tool that monitors code changes. Without it, the server has to be restarted manually each time the code is modified. nodemon is a tool that restarts the node application automatically.

npm install -g nodemon

And then switch the lines in package.json to:

"scripts": {
  "start": "nodemon ./bin/www"
}

Now type npm start and navigate to http://localhost:3000/ in the browser. We should see the “Welcome to Express” text. Voila, the preparatory work is complete and we are ready to begin the actual development.

2. Using ES2015

Before we move on to the database details, let’s add support for the modern standard of Javascript, ES2015. It works by transpiling the new syntax that ES2015 brings into the one supported by the current node.js runtime. To put it into action, let’s install babel and a preset for it.

npm install --save-dev babel-cli babel-preset-es2015

Then update the start script in the package.json file:

"scripts": {
  "start": "nodemon ./bin/www --exec babel-node --presets es2015"
}

Because babel-cli has been installed earlier, we have access to the babel-node executable, which transforms all of the code before nodemon runs it. Optionally, as a finishing touch, we can update the files generated by express. Basically, we need to accomplish the following three things:

  • substitute constructs like var routes = require('./routes/index') with import routes from './routes/index'
  • use let instead of var
  • introduce arrow functions; for example, function(req, res, next) {...} becomes (req, res, next) => {...}
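To illustrate the last two points, here is a tiny self-contained sketch (the functions are made up for illustration, not taken from the generated files) showing that the rewritten constructs behave exactly like the originals:

```javascript
// ES5 style, as produced by the express generator:
var greetOld = function(name) {
  return 'Hello ' + name;
};

// The same logic after the rewrite: const and an arrow function
const greet = (name) => 'Hello ' + name;

console.log(greetOld('Express')); // prints "Hello Express"
console.log(greet('Express'));    // prints "Hello Express"
```

The arrow form is shorter and, inside route handlers, also keeps the surrounding this, which is handy once callbacks start nesting.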

3. Connect PostgreSQL to application

Rails has a really awesome migration system. It allows us to evolve the database schema over time. I want to achieve this mechanism in my js-app too. As with many things in JS, there is a package that helps us do this: node-pg-migrate.

npm install -g node-pg-migrate

Now we can create the first migration:

pg-migrate create create_articles

It generates a new migration inside the migrations folder. Fill it out with the following code:

exports.up = (pgm) => {
  pgm.createTable('articles', {
    id: { type: 'serial', primaryKey: true },
    title: { type: 'varchar(140)' },
    body: { type: 'text' },
    created_at: { type: 'timestamp' }
  });
};

exports.down = (pgm) => {
  pgm.dropTable('articles');
};

According to the documentation, for a successful migration we have to add a .env file in the root directory and include require('dotenv').load(); at the top of the app.js file. In .env I have to specify the database connection url by setting the DATABASE_URL environment variable. For instance (substitute your own credentials):

DATABASE_URL=postgres://user:password@localhost:5432/my_blog
Here I set up a simple structure for the articles table: each article consists of two text fields (title and body) and one timestamp. It is time to create a database named my_blog in the psql console and run the migration by entering pg-migrate up.

4. First couple of endpoints

Let’s keep it simple by adding a new route file articles.js inside the routes directory. Two things have to happen inside app.js: import the new routes and specify the base route. This is what app.js should contain:

import articles from './routes/articles';
app.use('/articles', articles);

Now, let’s build the first couple of endpoints: index and create. To make a request to the database, a connection pool should be initialized; it takes the url to the database as an argument.

import express from 'express';
import pg from 'pg';

let router = express.Router();

router.get('/', (req, res, next) => {
  pg.connect(process.env.DATABASE_URL, (err, client, done) => {
    let results = [];

    if (err) {
      done();
      return res.status(500).json({ success: false, data: err });
    }

    const query = client.query("SELECT * from articles ORDER BY id ASC");

    // Collect every row the query emits
    query.on('row', (row) => {
      results.push(row);
    });

    // When the stream ends, release the client back to the pool and respond
    query.on('end', () => {
      done();
      return res.json(results);
    });
  });
});

module.exports = router;

For testing the POST action I am using the Postman extension for Google Chrome. We should send the request body in x-www-form-urlencoded format. The data is then available through the req.body attribute.

router.post('/', (req, res, next) => {
  let data = { title: req.body.title, body: req.body.body };

  pg.connect(process.env.DATABASE_URL, (err, client, done) => {
    if (err) {
      return res.status(500).json({ success: false, data: err });
    }

    client.query("INSERT INTO articles(title, body, created_at) values($1, $2, $3)",
      [data.title, data.body, new Date()], (err) => {
        done();
        if (err) { return res.status(500).json({ success: false, data: err }); }
        return res.status(201).json(data);
      });
  });
});

In-between conclusion

It seems to take much more time to string everything together in express.js. I guess the main reason is the lack of an opinionated structure. In the next post I’m going to build the remaining endpoints to obtain a full CRUD interface, and also polish the routes by introducing controllers.