Jan Sunavec

HTTP Server Performance: NodeJS vs. Go

Who delivers the higher number of concurrent requests?


We are developing something like an ad proxy, or a Google Ads buffer. The service simply forwards ad HTTP requests to SSP servers. For this purpose, it needs to issue many HTTP requests with minimal hardware resources, so we decided to do some research and compare a language running on a virtual machine with a compiled one.

NodeJS Script to Handle Concurrent Connections

Tags: fastify, connection, javascript, node.js

Node.js script that runs an HTTP server on localhost:3008 and handles a number of concurrent connections.

const fastify = require("fastify")({
  logger: false,
});

fastify.get("/fillbuffer", async (request, reply) => {
  reply.type("application/json").code(200);
  return {
    result: `{result: "Hello world"}`,
  };
});

fastify.listen(3008, (err, address) => {
  if (err) throw err;
});

Related links:

  1. https://medium.com/better-programming/http-server-performance-nodejs-vs-go-397751e8d275
  2. https://github.com/fastify/fastify

NodeJS: ApacheBench Results for 100 Concurrent Requests

Tags: javascript, node.js, output, apachebench

I used the ApacheBench (ab) tool for testing. Let's skip the full hardware specification; it's enough to say I used an i7-8550U CPU.

ab -n 1000000 -c 100 localhost:3008/fillbuffer

Requests per second: 12925.33 [#/sec] (mean)
Time per request: 7.737 [ms] (mean)
Time per request: 0.077 [ms] (mean, across all concurrent requests)


Percentage of the requests served within a certain time (ms)
 50% 8
 66% 8
 75% 8
 80% 8
 90% 9
 95% 10
 98% 12
 99% 13
 100% 106 (longest request)





NodeJS: 500 Concurrent Connections Using ApacheBench Results

Tags: node.js, npm, javascript, apachebench

Trying even more connections here, increasing the concurrency to 500.

Results:
Requests per second: 9673.37 [#/sec] (mean)
Time per request: 51.688 [ms] (mean)
Time per request: 0.103 [ms] (mean, across all concurrent requests)


Percentage of the requests served within a certain time (ms)
 50% 48
 66% 49
 75% 50
 80% 51
 90% 58
 95% 79
 98% 137
 99% 156
 100% 286 (longest request)


HTTP Server Using Go

Tags: go, pointers, http

At 500 concurrent connections we hit the CPU limit and the Node solution starts struggling, but let's get the Go numbers. The script is a bit longer, but still short.

package main

import (
	"encoding/json"
	"fmt"
	"log"

	"github.com/valyala/fasthttp"
)

var (
	addr               = ":3008"
	strContentType     = []byte("Content-Type")
	strApplicationJSON = []byte("application/json")
	httpClient         *fasthttp.Client
)

func main() {
	fmt.Println("Starting server...")
	h := requestHandler
	h = fasthttp.CompressHandler(h)

	httpClient = &fasthttp.Client{
		MaxConnsPerHost: 2048,
	}

	if err := fasthttp.ListenAndServe(addr, h); err != nil {
		log.Fatalf("Error in ListenAndServe: %s", err)
	}
}

func requestHandler(ctx *fasthttp.RequestCtx) {
	if string(ctx.Method()) == "GET" {
		switch string(ctx.Path()) {
		case "/fillbuffer":
			ctx.Response.Header.SetCanonical(strContentType, strApplicationJSON)
			ctx.Response.SetStatusCode(200)
			response := map[string]string{"result": "hello world"}
			if err := json.NewEncoder(ctx).Encode(response); err != nil {
				log.Fatal(err)
			}
		}
	}
}




Results for 100 Requests on fasthttp Server

Tags: go, requests, output, fasthttp

As you can see, I decided to use fasthttp as the HTTP server. It is not built on Go's standard net/http package; it is an independent implementation of the HTTP protocol. Let's see the results for 100 concurrent requests.

ab -n 1000000 -c 100 localhost:3008/fillbuffer

Requests per second: 15847.80 [#/sec] (mean)
Time per request: 6.310 [ms] (mean)
Time per request: 0.063 [ms] (mean, across all concurrent requests)


Percentage of the requests served within a certain time (ms)
 50% 6
 66% 7
 75% 7
 80% 7
 90% 7
 95% 7
 98% 8
 99% 8
 100% 18 (longest request)


Results for 500 Requests on fasthttp Server Using Go

Tags: go, requests, output, fasthttp

Go is the clear winner here, especially at higher numbers of concurrent requests. The supposedly thin layer running under the V8 engine is not so thin after all. With 100 concurrent requests, Go delivered over 22% more requests per second (15,848 vs. 12,925); with 500 concurrent requests, the gain grew to over 51% (14,682 vs. 9,673).

ab -n 1000000 -c 500 localhost:3008/fillbuffer

Requests per second: 14682.27 [#/sec] (mean)
Time per request: 34.055 [ms] (mean)
Time per request: 0.068 [ms] (mean, across all concurrent requests)


Percentage of the requests served within a certain time (ms)
 50% 34
 66% 36
 75% 37
 80% 37
 90% 39
 95% 40
 98% 41
 99% 41
 100% 62 (longest request)


