How does Node.js handle multiple requests without blocking?



I have been using Node.js for a while and wonder how it copes when multiple clients trigger blocking / time-consuming work at the same time.

Consider the following situation:
1. There are many endpoints, and one of them is time-consuming, taking a few seconds to respond.
2. Suppose 100 clients simultaneously make requests to my endpoints, one of which takes a considerable amount of time.

Does that endpoint block the event loop and make the other requests wait?

Or, in general, do requests block each other in Node.js?

If not, why not? Node.js is single-threaded, so why don't they block each other?


Node.js does use threads behind the scenes to perform I/O operations: libuv keeps a thread pool (four threads by default, tunable via the UV_THREADPOOL_SIZE environment variable) for file system work, DNS lookups and some crypto functions, while network sockets are handled with non-blocking OS APIs. To be more specific to your question: there is a limit at which a client will have to wait for an idle thread before its I/O task can start.

You can build an easy toy example: run several I/O tasks concurrently (using Promise.all, for instance) and measure how long each takes to finish. Then add one more task and repeat.
At some point you'll notice two groups. For example, 4 requests take 250 ms and the other 2 take 350 ms (and there you see requests "blocking each other").

Node.js is commonly referred to as single-threaded because it executes JavaScript (and therefore CPU-bound work) on one thread, in contrast to its non-blocking I/O architecture. It is therefore unwise to use it for intensive CPU operations, but it is very efficient when it comes to I/O.
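To see that single JavaScript thread blocking in isolation, here is a minimal sketch (the 200 ms busy-loop duration is an arbitrary choice):

```javascript
// Sketch: a synchronous CPU-bound loop blocks the one JavaScript
// thread, so even a 10 ms timer cannot fire until the loop yields.
function busyWork(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) {} // burns CPU on the main thread
}

const scheduled = Date.now();
let firedAfter = 0;
setTimeout(() => {
  firedAfter = Date.now() - scheduled;
  console.log(`10 ms timer actually fired after ${firedAfter} ms`);
}, 10);

busyWork(200); // the event loop is stuck here for ~200 ms
// The timer callback only runs after busyWork returns, so it fires
// roughly 200 ms after being scheduled, not 10 ms.
```

This is why CPU-heavy request handlers stall every other client, while I/O-heavy handlers do not.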

Answered By – Tamir Nakar

This answer, collected from Stack Overflow, is licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
