Comparing benchmarks of 0.10.3 and 1.0.0-dev #491

Open
indexzero opened this issue Sep 26, 2013 · 4 comments

@indexzero
Member

@yawnt I added some baseline benchmark scripts that we can use in the 1.0.0-dev branch (they are also in the caronte branch, but that's behind now). These benchmarks exposed what appear to be (at first glance) serious performance issues in the 1.0.0-dev branch.
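
For reference, the benchmark setup boils down to a trivial HTTP target with a proxy in front of it, which wrk then hammers. The sketch below is illustrative only, not the actual contents of benchmark/scripts/proxy.js; it assumes the legacy 0.10.x createServer API, and the target port 9000 is an arbitrary choice (8000 matches the wrk target used throughout this issue):

// Illustrative sketch only -- not the real benchmark script.
var http      = require('http'),
    httpProxy = require('http-proxy');

// Trivial upstream target (port 9000 is an assumption).
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('hello');
}).listen(9000);

// Proxy :8000 -> :9000 using the 0.10.x createServer(port, host) API.
httpProxy.createServer(9000, 'localhost').listen(8000);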

node: v0.8.25 | http-proxy: v0.10.3

I am using the benchmark branch. The documentation on how to run these benchmarks is in benchmark/README.md. On average I am seeing just under 3000 requests per second.

$ wrk -c 20 -r 2000 -t 4 http://127.0.0.1:8000
Making 2000 requests to http://127.0.0.1:8000
  4 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     6.64ms    1.51ms   8.87ms   56.25%
    Req/Sec     0.00      0.00     0.00    100.00%
  2000 requests in 690.93ms, 253.91KB read
Requests/sec:   2894.63
Transfer/sec:    367.48KB

$ wrk -c 20 -r 10000 -t 4 http://127.0.0.1:8000
Making 10000 requests to http://127.0.0.1:8000
  4 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     6.76ms    2.69ms  20.34ms   83.91%
    Req/Sec     0.00      0.00     0.00    100.00%
  10000 requests in 3.33s, 1.24MB read
Requests/sec:   2999.05
Transfer/sec:    380.74KB

The highest I pushed it was 100k total requests across 8 threads with 20 concurrent connections:

$ wrk -c 20 -r 100k -t 8 http://127.0.0.1:8000
Making 100000 requests to http://127.0.0.1:8000
  8 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     4.82ms    1.51ms  19.94ms   90.74%
    Req/Sec     0.00      0.00     0.00    100.00%
  100000 requests in 29.82s, 21.89MB read
  Non-2xx or 3xx responses: 83616
Requests/sec:   3353.81
Transfer/sec:    751.67KB

node: v0.10.19 | http-proxy: v1.0.0-dev

I am using the 1.0.0-dev branch. Again, the documentation on how to run these benchmarks is in benchmark/README.md. There are two main problems here:

1. A 50% performance degradation

This could be a number of things (including changes to node core itself, since this run is on node v0.10.19 versus v0.8.25 above), but right now I'm seeing a 50% performance hit.

$ wrk -c 20 -r 2000 -t 4 http://127.0.0.1:8000
Making 2000 requests to http://127.0.0.1:8000
  4 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    13.60ms    4.21ms  24.18ms   84.85%
    Req/Sec     0.00      0.00     0.00    100.00%
  2000 requests in 1.29s, 244.14KB read
Requests/sec:   1547.58
Transfer/sec:    188.91KB

2. Proxies fall over with ETIMEDOUT

Increasing the number of total requests beyond ~6k without raising the concurrency or thread count of the wrk process causes node benchmark/scripts/proxy.js to fall over with ETIMEDOUT.

$ wrk -c 20 -r 10000 -t 4 http://127.0.0.1:8000
Making 10000 requests to http://127.0.0.1:8000
  4 threads and 20 connections
// !! Never exits...

This happens consistently and reproducibly. It is a show-stopping bug and needs to be fixed:

$ node benchmark/scripts/proxy.js 

/Git/nodejitsu/node-http-proxy/lib/http-proxy/passes/web-incoming.js:112
        throw err;
              ^
Error: connect ETIMEDOUT
    at errnoException (net.js:901:11)
    at Object.afterConnect [as oncomplete] (net.js:892:19)
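
For what it's worth, the process dying on the first ETIMEDOUT is a side effect of the error being rethrown in web-incoming.js when nothing handles it. Once the 1.0.0-dev error hooks are usable, something like the sketch below (which assumes the caronte-style createProxyServer/web API and the error callback that later 1.x builds expose; ports are hypothetical) would at least keep the benchmark process alive and return a 502 instead of crashing. It does not address why the upstream connects time out in the first place:

// Sketch only: survive upstream connect failures instead of crashing.
var http      = require('http'),
    httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({ target: 'http://127.0.0.1:9000' });

http.createServer(function (req, res) {
  // Passing an error callback (or listening for 'error' on the proxy)
  // turns the rethrow into a per-request 502.
  proxy.web(req, res, function (err) {
    res.writeHead(502, { 'Content-Type': 'text/plain' });
    res.end('proxy error: ' + err.code);
  });
}).listen(8000);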
@mmalecki
Contributor

I keep trying to benchmark the caronte branch (@indexzero: a v1.0.0-dev branch does not exist), yet I keep getting:

[root@fca5c5d5-b424-4273-9732-76c37f2f4395 ~/caronte]# node benchmark/scripts/proxy.js

/root/caronte/lib/http-proxy/passes/web-incoming.js:112
        throw err;
              ^
Error: connect ECONNREFUSED
    at exports._errnoException (util.js:676:11)
    at Object.afterConnect [as oncomplete] (net.js:944:19)

The target server stays up the whole time, and ulimit is set to reasonable values. @yawnt, any idea?
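
(One quick sanity check, purely illustrative: hitting the target directly from node on the same box rules out the proxy pointing at the wrong host/port. The URL here is an assumption about where the benchmark target listens.)

// Hypothetical reachability check against the benchmark target.
var http = require('http');

http.get('http://127.0.0.1:9000/', function (res) {
  console.log('target reachable, status:', res.statusCode);
}).on('error', function (err) {
  console.error('target not reachable:', err.code);
});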

@jcrugzz
Contributor

jcrugzz commented Sep 27, 2013

@mmalecki error handling needs to be refactored, as there is no easy way to handle the errors with how they are namespaced. See #462.

@cronopio
Contributor

After some improvements from @yawnt I ran the benchmark again, and this is what I got on my humble laptop.

Node: v0.10.20
Last commit on the caronte branch: 86750c7
wrk: wrk 3.0.1 [epoll] Copyright (C) 2012 Will Glozer

~ % wrk -c 20 -d 60 -t 2 http://127.0.0.1:8000 
Running 1m test @ http://127.0.0.1:8000
  2 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    11.04ms    2.76ms  29.10ms   81.18%
    Req/Sec     0.94k   163.12     1.26k    62.72%
  111086 requests in 1.00m, 13.24MB read
Requests/sec:   1851.42
Transfer/sec:    226.01KB

I would love to see what numbers @indexzero's laptop shows, because my humble laptop already shows some improvement.

@cronopio
Contributor

And a 3-minute benchmark shows this:

~ % wrk -c 20 -d 3m -t 2 http://127.0.0.1:8000 
Running 3m test @ http://127.0.0.1:8000
  2 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    12.80ms    3.91ms  41.85ms   75.79%
    Req/Sec   837.57    200.98     1.28k    64.23%
  295806 requests in 3.00m, 35.26MB read
Requests/sec:   1643.36
Transfer/sec:    200.61KB
