Is Deploying Node.js on Multi-Core Machines a Waste of Resources?
Our server-side project is deployed as a cluster, with each POD running a Node.js application in Docker. Each POD is configured with 2 CPU cores and 16 GB of memory. Since Node.js is single-threaded, does this waste resources? Let's think through the issue.
Application Situation
- app.js starts multiple HTTP servers, each listening on a different port.
Although several services listen on different ports, they still occupy only one process on the machine. Because Node.js executes JavaScript on a single main thread, these servers share one event loop and compete with each other for CPU time.
CPU Waste?
Looking at the production project's actual metrics, CPU usage is very low, peaking at only 18%.
Is Multi-Core Usage Possible?
A Node.js application runs as a single process, but Node.js provides modules such as child_process, cluster, and worker_threads. The first two spawn additional processes, while worker_threads spawns additional threads within the same process; in either case the operating system can schedule them onto different cores. So multi-core usage is possible.
Conclusion
- A single-threaded Node.js application on a multi-core machine can leave cores idle, so there is potential resource waste. To be safe, though, the judgment should be based on actual load conditions.
- Since Node.js can scale out to multiple processes, deploying on multi-core machines is also a reasonable choice.