If you’re old enough (don’t worry, I won’t ask you to raise your hand) you may recall Saturday morning cartoons and, in particular, “School House Rock.”
If you do, you likely also remember that many of us learned (or had reinforced) our lessons about conjunctions from a singing train engineer who asked, “Conjunction Junction, what’s your function?”
Go ahead, sing it. You know you want to.
Now, this little trip down nostalgia lane has a purpose, which is to explain that app services, and in particular load balancing, are the conjunctions of the data center. Instead of hooking up “words” and “phrases”, however, we’re hooking up “users” and “apps”.
In much the same way you’d use the words “and”, “but”, and “nor” to hook up two phrases or clauses in a sentence, you use app services to hook up users (whether they’re things or people) to apps (whether they’re in the cloud or on-premises).
The best example of this is load balancing. It’s a classic example of an “app service” that acts like the “and” between two related things. Apples and oranges. Baseball and hot dogs. Beer and brats. Load balancing proxies provide the conjunctive glue between users and apps that ensures the two get connected, which in turn keeps the business train running smoothly. We hook up user Bob to app instance three. And user Alice to app instance two. And thing one to app four. And API version two to API backend three.
Now, this is true for POLB (Plain Old Load Balancing), which proxies requests and chooses the right app instance based on an algorithmic decision, and it’s also true for L7 load balancing, which uses application-layer information, like the URI, the host, or values in HTTP headers, to decide how to “hook up” a user and an app. This is a critical function in the data center, and it’s the means by which we achieve the scale (and availability) required to support modern applications.
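To make the distinction concrete, here’s a minimal sketch of the two decision layers working together. The pool names, addresses, and routing rules are all hypothetical, invented purely for illustration: the L7 step inspects the Host, URI, and headers to pick a pool, and a POLB-style round-robin step picks an instance within that pool.

```python
# Hypothetical backend pools -- names and addresses are illustrative only.
POOLS = {
    "api-v2": ["10.0.2.1:8080", "10.0.2.2:8080"],
    "static": ["10.0.3.1:8080"],
    "default": ["10.0.1.1:8080", "10.0.1.2:8080"],
}

def choose_pool(host: str, path: str, headers: dict) -> str:
    """L7 decision: pick a pool from application-layer information."""
    if headers.get("X-API-Version") == "2" or path.startswith("/api/v2/"):
        return "api-v2"    # route by HTTP header value or URI prefix
    if host.startswith("static."):
        return "static"    # route by Host
    return "default"

def pick_instance(pool: str, request_id: int) -> str:
    """POLB decision: simple round robin within the chosen pool."""
    instances = POOLS[pool]
    return instances[request_id % len(instances)]
```

A real proxy layers health checks, connection counts, and weighting on top of this, but the shape of the decision (L7 rule first, then an algorithm within the pool) is the same.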
This conjunctive capability is increasingly important for implementing a variety of DevOps deployment patterns. A/B testing, blue-green deployments, canary deployments, API metering, and API versioning are all good examples of operational deployment patterns that support applications by hooking up users and apps based on the specific business and operational needs that exist at that moment in time (that’s context, by the way).
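The canary and A/B patterns above boil down to one routing decision: send a small, consistent slice of users to the new version. Here’s a sketch under assumed parameters (the 10% split and the user-id hashing scheme are illustrative, not prescribed by any particular product): hashing the user id keeps each user pinned to the same version across requests, which is what makes the results of a canary or A/B test meaningful.

```python
import hashlib

CANARY_PERCENT = 10  # hypothetical: 10% of users see the new ("green") version

def route_version(user_id: str) -> str:
    """Deterministically map a user to 'canary' or 'stable'.

    A stable hash of the user id yields a bucket in [0, 100); users in the
    bottom CANARY_PERCENT buckets stick to the canary on every request.
    """
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return "canary" if bucket < CANARY_PERCENT else "stable"
```

Dialing CANARY_PERCENT from 10 toward 100 turns a canary release into a full blue-green cutover, all without touching the application itself.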
Once you realize that load balancing with modern, programmable proxies is about more than scale or availability, you start recognizing its potential as part of the application architecture itself: a means to improve performance, to add business value, and to provide a platform on which you can standardize and reap the benefits of lower operating costs and repeatable deployment processes.