slow performance using basic-proxy.js vs. no proxy #305
So I identified that the majority of the bottleneck is in the DNS lookup call made in the Node.js source, from http.js > net.js > dns.js. I added some debug statements to the Node.js source and found this: DEBUG - http.js socket write ms:796ms. That's 84ms of what was a 100ms request spent just processing the DNS lookup. Node does no caching of this either, so you take that hit every time. I created this test script to help narrow down where the
Interesting.
Is there any fix for this?
Same issue here. The ab client machine and the http-proxy machine are both Ubuntu 13.10 servers. I used the basic-proxy.js example code. Without proxy: ab -n 10000 -c 100 http://10.10.10.1:9003/. With http-proxy: ab -n 10000 -c 100 http://10.10.10.1:8003/. setNoDelay() is set to true AND http.globalAgent.maxSockets is set to 10000, AND furthermore I did some Linux kernel tuning mentioned here: http://urbanairship.com/blog/2010/09/29/linux-kernel-tuning-for-c500k/ but without any better results. Even a local benchmark returns the same result. AND it does not seem to be a problem with DNS resolving; I used this script https://gist.github.com/gabrielf/7746695#file-nodejs-dns-vs-ip-lookup-test-js by (https://gist.github.com/gabrielf) and the results looked fine: lookup of 10.10.10.1: 1ms
@zlaoz That looks similar to the 200ms fixed overhead on small requests that I saw in anything newer than Node v0.10.15.
@grantkl So in an older Node version this delay does not occur? Which version? (I am on v0.10.28.)
Same with node v0.10.10
I didn't see the same delay in v0.10.15 or before. There were changes in v0.10.16 that introduced a 200ms hit for my use case. It has been a couple of months since I looked into this, as I started vetting other solutions for my problem.
@zlaoz I would personally try benchmarking with wrk, and try passing an agent to http-proxy.
@jcrugzz Thanks for the hints. With wrk the results are the same: Running 10s test @ http://10.10.10.1:9003/ (direct); Running 10s test @ http://10.10.10.1:8003/ (http-proxy). Passing an agent to http-proxy is the next thing... hold on.
@zlaoz And when passing in an agent, make sure it has maxSockets set reasonably high as well.
Same result using: var agent = new http.Agent(); var server = httpProxy.createServer({
@zlaoz Hmm, and this is on Ubuntu only? Do you have the code that reproduces this that I can grab and test myself? I want to see if it's reproducible on OS X. I would also try and run this on
@jcrugzz I created a gist with all the required code snippets and info: https://gist.github.com/zlaoz/4a730bfe7f322f6442fd#file-node-http-proxy-slow-performance
@zlaoz I've tested your same code locally on my MacBook and it shows an overhead of 60ms of latency with the proxy (7ms -> 60ms). Have you been able to try this on
@jcrugzz Sorry for the late response! Tried it using
It would be good to have benchmarks against the newest version of http-proxy.
Hey all,
I have been running some ab tests against the basic-proxy.js service and have noticed it's about three times slower using the proxy than not. I'm on Node 0.8.8 and Ubuntu 11.10.
Using proxy
No proxy
I ran the proxy with node http debugging enabled for a single request and this was the output.
NODE_DEBUG=http node basic-proxy.js
1 request through proxy
HTTP: SERVER new http connection
HTTP: server response shouldKeepAlive: false
I'm proxyrequest
get base result
HTTP: outgoing message end.
HTTP: SERVER new http connection
HTTP: server response shouldKeepAlive: true
HTTP: write ret = true
HTTP: outgoing message end.
HTTP: AGENT incoming response!
HTTP: AGENT isHeadResponse false
HTTP: write ret = true
HTTP: AGENT socket keep-alive
HTTP: outgoing message end.
HTTP: server socket close
HTTP: server socket close
1 request without proxy
HTTP: SERVER new http connection
HTTP: server response shouldKeepAlive: false
HTTP: write ret = true
HTTP: outgoing message end.
HTTP: server socket close
Is the performance difference simply because we're asking the proxy to do that much more? That 3x increase seems like something is askew here. I tested this with different ab parameters, played around with the maxSockets setting, and tried a different HTTP agent, but nothing seemed to speed up going through the proxy. I'm surprised at the discrepancy here; am I missing something?
Thanks,
Rob