Threads

Epoll is very efficient at handling multiple clients on a single thread, since it removes the hassle of blocking I/O, but I/O is not the only bottleneck we face when scaling our server.

The problem is that requests are still processed synchronously: under very high load, the server ends up bound by the time it takes to process each request.

To solve this problem, we will multithread our server so that it can process multiple requests at the same time.

danger

This feature should only be attempted if you already have a working asynchronous server using Epoll.

Thread Pool Pattern

The solution to our problem is not that obvious at first glance. To properly solve it, we will use a very common concurrency pattern called a thread pool.

The idea is to create a pool of worker threads that will be able to process requests in parallel. The main thread, also called the master thread, will be responsible for accepting new connections using epoll, and will then dispatch the requests to an available worker thread in the pool.

[Figure: Thread Pool]

tip

In C, you will use pthread(7) to create your thread pool.
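
As an illustration, here is a minimal sketch of spawning the pool with pthread_create(3). The names NB_WORKERS, worker_loop and spawn_thread_pool are hypothetical, and the worker body is left as a stub; adapt them to your own design:

```c
#include <pthread.h>

#define NB_WORKERS 4 /* hypothetical pool size, tune it to your needs */

/* Hypothetical worker entry point: in the real server each worker would
** pop requests from the shared queue and process them (see below). */
static void *worker_loop(void *shared_state)
{
    (void)shared_state;
    /* ... pop a request from the queue and handle it, in a loop ... */
    return NULL;
}

/* Spawn the pool once at startup; the master thread keeps running epoll. */
int spawn_thread_pool(pthread_t workers[NB_WORKERS], void *shared_state)
{
    for (int i = 0; i < NB_WORKERS; i++) {
        if (pthread_create(&workers[i], NULL, worker_loop, shared_state) != 0)
            return -1;
    }
    return 0;
}
```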

A queue should be used to store the requests that are waiting to be processed. When the main thread receives a new request, it pushes it onto the queue, and each worker thread pops a request from the queue and processes it. Since multiple threads access the same queue at the same time, the queue must be made thread-safe to avoid race conditions.

tip

You might want to check pthread_mutex_init(3) and pthread_mutex_lock(3) when using shared data.
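
Below is a minimal sketch of such a thread-safe queue, using a mutex and a condition variable so that workers can sleep until a request arrives. The request_queue_t type and the function names are assumptions for illustration, not a required API, and error handling is kept to a minimum:

```c
#include <pthread.h>
#include <stdlib.h>

/* Hypothetical request node; adapt to your own request structure. */
typedef struct request_node {
    void *request;
    struct request_node *next;
} request_node_t;

typedef struct {
    request_node_t *head;
    request_node_t *tail;
    pthread_mutex_t lock;
    pthread_cond_t not_empty;
} request_queue_t;

void queue_init(request_queue_t *q)
{
    q->head = NULL;
    q->tail = NULL;
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->not_empty, NULL);
}

/* Called by the master thread after epoll reports a ready client. */
int queue_push(request_queue_t *q, void *request)
{
    request_node_t *node = malloc(sizeof(*node));

    if (node == NULL)
        return -1;
    node->request = request;
    node->next = NULL;
    pthread_mutex_lock(&q->lock);
    if (q->tail)
        q->tail->next = node;
    else
        q->head = node;
    q->tail = node;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->lock);
    return 0;
}

/* Called by each worker: blocks until a request is available. */
void *queue_pop(request_queue_t *q)
{
    pthread_mutex_lock(&q->lock);
    while (q->head == NULL)
        pthread_cond_wait(&q->not_empty, &q->lock);
    request_node_t *node = q->head;
    q->head = node->next;
    if (q->head == NULL)
        q->tail = NULL;
    pthread_mutex_unlock(&q->lock);
    void *request = node->request;
    free(node);
    return request;
}
```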

danger

When doing multithreaded programming, you must avoid race conditions at all costs. Thus, you will need to make every mutable shared data structure, as well as other tools such as your logger, thread-safe.
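
For example, a logger can be made thread-safe by serializing writes behind a single mutex. This is only a sketch, and the server_log name is hypothetical:

```c
#include <pthread.h>
#include <stdarg.h>
#include <stdio.h>

/* One mutex serializes all writes so that lines coming from different
** workers are never interleaved. */
static pthread_mutex_t log_lock = PTHREAD_MUTEX_INITIALIZER;

void server_log(const char *fmt, ...)
{
    va_list args;

    va_start(args, fmt);
    pthread_mutex_lock(&log_lock);
    vfprintf(stderr, fmt, args);
    pthread_mutex_unlock(&log_lock);
    va_end(args);
}
```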

danger

You should not use fork(2) for this feature.