
Serverless best practices for database connections and queries

2019-10-16

This is the second installment in the Serverless best practices series. In it you will learn the following skills:

1. Use the life cycle of cloud functions to manage database connections

In the first installment, on the life cycle of cloud functions, we already covered the two benefits of creating a database connection during a cloud function's mount stage:

  1. Fewer database connections (from one connection per request to one connection per instance)
  2. Better performance (requests reuse the instance's connection instead of each creating its own)

Let’s review the sample code again:

import { Func } from '@faasjs/func'; // FaasJS cloud function class
import { Sql } from '@faasjs/sql'; // Sql plugin for FaasJS

// Initialize the database object
const sql = new Sql();

// Export the cloud function instance
export default new Func({
  plugins: [sql], // Plugin management; FaasJS automatically manages each plugin's life cycle
  async handler () { // business code
    return await sql.query('SELECT * FROM users WHERE id = ?', [1]);
  }
});

FaasJS's Sql plugin supports MySQL, PostgreSQL, and SQLite, as well as any database compatible with these three protocols. It internally encapsulates the best practices built on the cloud function life cycle, so developers can simply use it as-is.

2. Use Knex and TypeScript to improve development efficiency and quality

Knex is a SQL query builder that, combined with TypeScript, greatly simplifies database operations for developers.

Let’s look directly at the code example:

// user.func.ts
import { Func } from '@faasjs/func'; // FaasJS cloud function class
import { Sql } from '@faasjs/sql'; // Sql plugin for FaasJS
import knex from 'knex';

// Use TypeScript to define the structure of the users table
interface User {
  id: number;
  name: string;
}

// Initialize the database object
const sql = new Sql();

// Export the cloud function instance
export default new Func({
  plugins: [sql], // Plugin management; FaasJS automatically manages each plugin's life cycle
  async handler () { // business code
    const users = knex<User>({
      client: sql.adapterType
    }) // tell Knex the row type and the database client
      .from('users') // tell Knex the table name
      .connection(sql.adapter!.pool); // reuse the connection maintained by the Sql plugin

    return await users.where({ id: 1 }); // query the database via Knex
  }
});

There are two important points in the above code:

  1. Knex can use a TypeScript interface as the type of the returned rows
  2. The Sql plugin's connection pool must be injected into Knex so that connections are still managed by the cloud function's life cycle

With the pattern above, the cloud function's business code is in good shape. But Knex also supports operations such as creating tables, which is very useful for automated testing, so let's take a closer look at how to write better automated test scripts:

// __tests__/user.test.ts
import { FuncWarpper } from '@faasjs/test'; // FaasJS wrapper for testing cloud functions
import { Sql } from '@faasjs/sql'; // Sql plugin
import knex from 'knex'; // Knex query builder

// FaasJS uses Jest as the testing framework
describe('user', function () {
  let func: FuncWarpper;

  beforeEach(async function () {
    // Load the cloud function
    func = new FuncWarpper(require.resolve('../user.func') as string);

    // To make database operations in the test script easier, put a shortcut
    // to the Sql plugin instance on the func object
    func.sql = func.plugins[0] as Sql;

    // The database connection is created during the mount stage, so mount first
    await func.mountedHandler();

    // Create the table
    await knex({
      client: func.sql.adapterType
    })
      .schema
      .connection(func.sql.adapter!.pool)
      .dropTableIfExists('users')
      .createTable('users', function (t) {
        t.integer('id').notNullable();
        t.string('name').notNullable();
      });
  });

  test('should work', async function () {
    // Insert fake data
    await knex({
      client: func.sql.adapterType
    })
      .from('users')
      .connection(func.sql.adapter!.pool)
      .insert({
        id: 1,
        name: 'hi'
      });

    // Invoke the cloud function
    const res = await func.handler();

    // Check that the result matches expectations
    expect(res.length).toEqual(1);
    expect(res[0].id).toEqual(1);
    expect(res[0].name).toEqual('hi');
  });
});

Here is a small exercise: when multiple cloud functions need to query this table, how can the table definition be encapsulated better?
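One possible answer, sketched here as an illustration rather than an official FaasJS pattern (the file name, `TableQuery` interface, and `users` helper are all my own assumptions), is to move the table name and row type into a shared module that every cloud function and test imports:

```typescript
// user-table.ts — a hypothetical shared module for the users table

// Row type of the users table, shared by every cloud function and test
export interface User {
  id: number;
  name: string;
}

export const USERS_TABLE = 'users';

// The minimal shape we need from a Knex-style builder, typed structurally
// so this module does not depend on any specific database client
export interface TableQuery<T> {
  from (table: string): TableQuery<T>;
  where (criteria: Partial<T>): Promise<T[]>;
}

// The single place that knows which table User rows live in
export function users (builder: TableQuery<User>): TableQuery<User> {
  return builder.from(USERS_TABLE);
}
```

In the cloud function and the test script, the real Knex builder chain from the earlier examples takes the place of `builder`; the table name and row type then live in exactly one file instead of being repeated in every cloud function.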

3. Split databases at the right time to reduce coupling

As the business grows, the variety and volume of data will inevitably increase. If a large number of cloud functions all connect to a single database, they will put ever-greater pressure on it. So once development reaches a certain scale, it is recommended to split databases ahead of time to decouple data and code.

FaasJS's folder structure naturally supports splitting databases. Suppose we split the users table and the orders table into two databases: we only need to put them in two different folders and configure each folder's own faas.yaml.
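The resulting layout might look something like this (the folder and file names are illustrative; see the FaasJS documentation for the exact faas.yaml configuration keys):

```
users/
  faas.yaml      # connection settings for the users database
  user.func.ts
orders/
  faas.yaml      # connection settings for the orders database
  order.func.ts
```

Each cloud function then only ever connects to the database configured in its own folder, so the two datasets can later be scaled or migrated independently.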

For concrete examples, see the sample code I published on GitHub, which demonstrates the best practices above.
