8 Types of Load Balancing
Ravin Rau (@juniourrau)


Publish Date: Dec 10 '24

If you're diving into the world of web infrastructure, you've probably heard about load balancing. It's like the traffic cop of the internet, making sure all those data requests get to the right place without causing a jam. In this article, we'll break down some popular load-balancing techniques and show you how to set them up using NGINX.

1. Round Robin


When to Use It: Perfect for spreading requests evenly when your servers are all pretty similar.

What's It About: Think of it like taking turns. Each server gets a request in order, one after the other. It's simple and works great when all your servers are equally capable.

Downside: Doesn't account for server load or capacity differences, which can lead to uneven performance if servers vary in power.

How to Set It Up in NGINX:

upstream backend {
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}
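Whichever balancing method you choose, the upstream block only defines the server pool; to actually send traffic through it, you reference it from a server block. A minimal sketch (the listen port and location are just examples):

```nginx
server {
    listen 80;

    location / {
        # forward all incoming requests to the "backend" upstream group
        proxy_pass http://backend;
    }
}
```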

2. Least Connection


When to Use It: Great for when some servers are busier than others.

What's It About: This one sends traffic to the server with the fewest active connections. It's like choosing the shortest line at the grocery store.

Downside: Can lead to uneven distribution if some servers are slower or have less capacity, as they might still end up with more connections.

How to Set It Up in NGINX:

upstream backend {
    least_conn;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}

3. Weighted Round Robin


When to Use It: Handy when your servers have different strengths.

What's It About: Similar to Round Robin, but you can give some servers more "turns" based on their capacity.

Downside: Requires manual configuration and tuning of weights, which can be complex and needs regular adjustments as server loads change.

How to Set It Up in NGINX:

upstream backend {
    server server1.example.com weight=3;
    server server2.example.com weight=1;
    server server3.example.com weight=2;
}

4. Weighted Least Connection


When to Use It: Best for mixed environments with varying server loads and capabilities.

What's It About: Combines the best of both worlds—Least Connection and Weighted Round Robin.

Downside: Like Weighted Round Robin, it requires careful configuration and monitoring to ensure weights are set correctly.

How to Set It Up in NGINX:

upstream backend {
    least_conn;
    server server1.example.com weight=3;
    server server2.example.com weight=1;
    server server3.example.com weight=2;
}

5. IP Hash


When to Use It: Perfect for keeping users connected to the same server.

What's It About: Uses the client's IP address to decide which server to use, ensuring consistency.

Downside: Can lead to uneven distribution if a large number of users share the same IP range, and doesn't handle server failures gracefully.

How to Set It Up in NGINX:

upstream backend {
    ip_hash;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}
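One way to soften the failover downside: if a server in an ip_hash pool needs maintenance, NGINX lets you mark it with the down parameter, so the clients that hashed to it are re-distributed instead of failing. A minimal sketch:

```nginx
upstream backend {
    ip_hash;
    server server1.example.com;
    # temporarily out of rotation; its clients are re-hashed
    # to the remaining servers
    server server2.example.com down;
    server server3.example.com;
}
```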

6. Least Response Time


When to Use It: Ideal when speed is everything.

What's It About: Sends requests to the server that responds the fastest. Open-source NGINX doesn't support this out of the box, but you can use a third-party module such as the Nginx Upstream Fair Module.

Downside: Requires additional monitoring and third-party modules, which can add complexity and potential points of failure.
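If you're on NGINX Plus (the commercial edition), a comparable built-in method called least_time is available, which factors in average response time and active connections. A minimal sketch, assuming NGINX Plus:

```nginx
upstream backend {
    # NGINX Plus only: route to the server with the lowest
    # average response time (here, time to receive the response
    # header) and the fewest active connections
    least_time header;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}
```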


7. Random


When to Use It: Good for testing or when you just want to mix things up.

What's It About: Randomly picks a server for each request. Since version 1.15.1, open-source NGINX supports this natively with the random directive; on older versions you'll need a third-party module like the Nginx Random Load Balancer Module.

Downside: Can lead to uneven load distribution and isn't suitable for production environments where performance is critical.
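Assuming you're on NGINX 1.15.1 or newer, the built-in random directive covers this. The optional two least_conn arguments make it a bit smarter: pick two servers at random, then route to the one with fewer active connections:

```nginx
upstream backend {
    # choose two random servers, then use the one
    # with the fewest active connections
    random two least_conn;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}
```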


8. Least Bandwidth


When to Use It: Useful when bandwidth usage is all over the place.

What's It About: Directs traffic to the server currently using the least bandwidth. NGINX has no built-in method for this, so you'll need a custom setup such as custom scripts or external monitoring tools.

Downside: Requires custom monitoring and setup, which can be complex and resource-intensive.


Other Cool Load Balancing Tricks

  1. Geolocation-Based: Directs traffic based on where users are located. Great for reducing latency.
  2. Consistent Hashing: Keeps requests going to the same server, even if the server pool changes. Perfect for caching systems.
  3. Custom Load Balancing: Tailor it to your needs with custom scripts or Lua scripting in NGINX.
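For the consistent hashing trick, open-source NGINX's hash directive with the consistent parameter (ketama hashing) is enough. A minimal sketch, hashing on the request URI (you could also hash on $remote_addr or a cookie value):

```nginx
upstream backend {
    # ketama consistent hashing: adding or removing a server
    # only remaps a small share of the keys, which keeps
    # caches on the backends mostly warm
    hash $request_uri consistent;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}
```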

Conclusion

Choosing the right load-balancing strategy is all about understanding your app's needs. NGINX is super flexible and can easily handle many of these strategies. Whether you're using built-in methods or third-party modules, there's a solution out there for you. Just be mindful of the potential downsides and plan accordingly. Please share your favorite load-balancing strategy in the comments. Happy balancing!

Comments (41 total)

  • Georgia Prisoners’ Speak (Dec 10, 2024)

    Good!

  • صابر مصطفی (Dec 10, 2024)

    Good share...

  • raman000 (Dec 11, 2024)

    Good content.

  • Ben (Dec 11, 2024)

    Good, Thank you so much!

  • Vijay Koushik, S. 👨🏽‍💻 (Dec 11, 2024)

    Excellent job! Your concise and straightforward explanation helped me quickly grasp multiple load balancing approaches. Keep up the good work!

    • Ravin Rau (Dec 11, 2024)

      Thank you very much @svijaykoushik, I keep it concise to make it easy for me to grasp the approaches from time to time. I am glad that it is useful for others too.

  • Ben Borla (Dec 11, 2024)

    I learned something new today. Thank you!

    • Ravin Rau (Dec 11, 2024)

      Thank you very much @benborla. Glad that I can contribute something to you today.

  • Victor Catalan (Dec 11, 2024)

    nice Content! Thanks for sharing :)

  • Sohail SJ | TheZenLabs (Dec 11, 2024)

    Easy and Great Read!

  • ammar629 (Dec 11, 2024)

    As someone who is learning about backend development, this was a great way to understand the load balancing concept. Thank you!

    • Ravin Rau (Dec 12, 2024)

      Thank you so much @ammar629 and all the best on your backend journey. I have learned a few things on the backend side that I will be sharing soon.

      • ammar629 (Dec 12, 2024)

        I'm following you and looking forward to what you write next

  • Franklin Thaker (Dec 12, 2024)

    very well written, thanks for sharing. very very helpful.

  • Uchechukwu Noble (Dec 12, 2024)

    I'm new to backend using Django
    How do I implement this in my projects
    @juniourrau

    • Ravin Rau (Dec 12, 2024)

      Hi @uchechukwu_noble_28129eb5, honestly I haven't tried Django properly yet, but based on my understanding, here is what I have in mind.

      Normally, when you host/deploy a Django app, you will need Gunicorn as the WSGI server/process manager to manage the worker processes that communicate with your Django application. You can then set up NGINX in front to handle the incoming HTTP requests.

      Normal Setup

      Using NGINX as a load balancer comes in when you have multiple servers hosting your Django application and you want to distribute the load across them properly. You can set it up like the article above, based on your strategy.
      Django Setup with Load Balancer

      My recommendation is to first go with the normal setup and understand how it works internally; later on, you can start experimenting with the load balancer once you understand how to deploy your application with Docker.


  • Connections Hint (Dec 13, 2024)

    Great!

  • Indranil Kamulkar (Dec 13, 2024)

    Fantastic explanation, short and sweet and easy, helped me a lot

  • Lakh Bawa (Dec 13, 2024)

    Thanks for sharing

  • Kaustubh Joshi (Dec 15, 2024)

    Great content👏🏻

  • Raghavendra Kedlaya (Dec 15, 2024)

    The article is well-covered and addresses an essential topic.

    While it primarily discusses load-balancing, it also serves as a failover mechanism.

    I believe the concept of load-balancing goes beyond distributing traffic across web servers. Many front-end servers deliver UI elements, JavaScript, and assets. Modern browsers leverage caching, and powerful front-end technologies like Angular and React provide data processing and deliver a seamless user experience.

    From a portal’s perspective, the heavier workload typically lies within the application’s middle-tier or API servers. These servers interface with databases, process requests, and prepare data tailored to user needs. They need to manage concurrency, heavy data operations, and resource competition efficiently.

    In my experience, effective load-balancing can also involve functional segregation of API servers. For instance, I’ve implemented setups where multiple API server groups with identical functionality are isolated by functional criteria, such as separating data by State of the Country or business unit. User requests are routed to the appropriate API server group based on the user group, ensuring both better performance and logical isolation.

    • Ravin Rau (Dec 16, 2024)

      Thank you very much @rkedlaya for your input. You are spot on—load balancing isn't just about spreading traffic across servers. It's also key for keeping things running smoothly and acting as a backup when needed.

      I love your idea of splitting API servers based on function. It shows how load balancing can be customized to fit specific needs. By directing requests based on function, we boost performance and keep things organized, which is crucial for handling lots of data and resources efficiently. Maybe I can cover this in an article in the future.

  • Girish B Nair (Dec 17, 2024)

    Well explained!!!
    Got to learn something in an effective and easy way.
    Thanks for your input!!!

  • Jeffrey (Dec 25, 2024)

    You have good exp with NGINX. Great

    • Ravin Rau (Dec 25, 2024)

      Thanks, I would not say that I have good experience. Still learning it.

  • Sibasis Padhi (Jan 8, 2025)

    Simple & clear!
