
suppose, for example, two people connect to the server at the same time, and each request takes 5 seconds of blocking work on the node.js server. the first person waits 5 seconds; the second waits 10 (5 seconds in the queue plus 5 seconds being served). the average waiting time is (5 + 10) / 2 = 7.5 seconds

assume the python server is less efficient and takes 7 seconds per request, but it handles both requests in parallel. so each person waits 7 seconds.

therefore, on average, the python server is in fact faster.
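the arithmetic above can be checked with a quick sketch (assuming, as in the example, 5-second jobs served one after another versus 7-second jobs served concurrently):

```javascript
// Sequential server: the second request queues behind the first.
const seq = [5, 5 + 5];                              // waits: 5s and 10s
const seqAvg = seq.reduce((a, b) => a + b, 0) / seq.length;

// Parallel server: both 7-second jobs run at once.
const par = [7, 7];
const parAvg = par.reduce((a, b) => a + b, 0) / par.length;

console.log(seqAvg, parAvg);                         // 7.5 vs 7
```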



i am not a node.js expert. one thing i am wondering is: modern computers have more than just one core, so how does node.js utilize these multiple cores?


By starting multiple processes.

See cluster (https://github.com/LearnBoost/cluster), fugue (https://github.com/pgte/fugue), multi-node (https://github.com/kriszyp/multi-node)


This is true for Python, Ruby, and PHP as well. (I can't see ted's issues with node other than a misunderstanding of the framework's runtime.)



