Huge performance issue #929

Closed
kar1m opened this issue Dec 27, 2015 · 10 comments

@kar1m

kar1m commented Dec 27, 2015

Hi all,

I have a huge performance issue with node-http-proxy, maybe I'm missing something, hopefully someone can point out what's wrong.
Here's the code I'm using for the proxy (listening on 9000):

var httpProxy = require('http-proxy');

var proxy = httpProxy.createServer({
    target:'http://127.0.0.1:8000'
});

proxy.on('error', function(e) {
    console.error(e);
});

proxy.listen(9000);

and here's the code of the server listening on 8000:

var http = require('http');

var server = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type' : 'text/plain'});
    res.end('Hello world');
});
server.listen(8000);

I used wrk to benchmark the performance of the proxy.
First, I test the http server to see how many requests it can handle:
wrk -c 64 -d 15s http://127.0.0.1:8000
Requests/sec: 26255.78

Now when I test the proxy:
wrk -c 64 -d 15s http://127.0.0.1:9000
Requests/sec: 543.84

Something is clearly wrong, so I inspected the CPU usage during the benchmark. When I test the http server directly (port 8000), the server process stays very close to 100% CPU for the duration of the test, which is expected. But when I test the proxy, the proxy process reaches 100% and then, one or two seconds into the test, drops to 0%.
Note that wrk, the proxy, and the hello-world server run as separate processes; this issue is not related to insufficient resources (my machine has 8 hardware threads).
I have also tested nginx and HAProxy, and both did fine.

The main reason I want to use http-proxy is for load balancing, but so far it looks like it's gonna be a bottleneck more than anything else.

@peol

peol commented Jan 5, 2016

Try ending the proxied response:

proxy.on('error', function(e, req, res) {
    console.error(e);
    // res is a plain http.ServerResponse here, so use writeHead, not
    // the Express-style res.status():
    res.writeHead(500, { 'Content-Type': 'text/plain' });
    res.end('Error occurred');
});

@indexzero
Member

@kar1m what version of node are you using? Can you compare your results against 0.10, 0.12, and 4.2?

@pyper

pyper commented Jan 5, 2016

I just ran @kar1m's example and I am seeing the same issue here. Requests per second drop by a factor of 10–20 when using the proxy. @indexzero I tested on versions v0.10.36 and v5.2.0.

@indexzero
Member

You probably want a shared Agent.

@pyper

pyper commented Jan 5, 2016

Thanks @indexzero, that made a huge difference. @kar1m You need to create an http.Agent with keepAlive set to true and pass it via the agent option when you create the server. You might need to play with some of the other agent options too.

@kar1m
Author

kar1m commented Jan 5, 2016

Thanks @indexzero and @pyper, it made a huge difference (10x the previous throughput), although it's still much slower than HAProxy. I'll try playing with the agent options and let you know if I can get better performance.

@indexzero
Member

We're not going to beat HAProxy. That is not the goal of this project. Thanks for reporting this issue, going to close it now. This will be a good FAQ discussion.

@indexzero indexzero added the faq label Jan 5, 2016
@ronaldocpontes

@indexzero, what do you mean by shared Agent?

Can you link to some resources?

@jcrugzz
Contributor

jcrugzz commented Mar 2, 2016

@ronaldocpontes you'd want to pass in an http.Agent or https.Agent on a per-request basis, based on the target URL. Check out the node docs here.

@indexzero
Member

Locking this issue now that the entirety of the FAQ resources are here.

@http-party http-party locked and limited conversation to collaborators Mar 2, 2016