
How geo load balancing makes a world of difference in Exchange 2013

When you need your Exchange 2013 end users to connect to the closest data center, consider geo load balancing as an option.

A key component in a number of highly available Exchange 2013 implementations is multisite availability with geo load balancing.

Building Exchange to work across more than one data center mitigates the major business risk that comes with keeping Exchange Server on-premises -- the loss of a data center or site due to a power issue, a networking issue or something worse.

When end users are regionally homed, or when one data center is a dedicated disaster recovery facility, it might not be desirable to let end users connect to just any data center. Instead, the IT organization should direct them to their closest data center with geo load balancing technology, while using a single mail server HTTPS address across all data centers. We'll look at what's involved in simplifying how Exchange is published and allowing it to cleanly fail over to a backup data center with no administrator intervention.
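
To make the idea concrete, here is a minimal sketch, in Python, of the kind of decision a geo load balancer's DNS service makes when it answers a query for that single HTTPS name: hand out the virtual IP of the closest healthy data center, and fall back to the other site when the preferred one fails its health checks. The site names, regions and addresses below are purely illustrative and not taken from any product.

    # Illustrative sketch only: how a geo-DNS answer for a single mail name
    # might be chosen. Site names, regions and addresses are hypothetical.
    SITES = {
        "head-office":   {"vip": "203.0.113.10",  "regions": {"south"}},
        "manufacturing": {"vip": "198.51.100.10", "regions": {"north"}},
    }

    def answer_for(client_region, health):
        """Pick the VIP to return for the single mail namespace.

        health maps site name -> True/False, the result of the load
        balancer's own service checks against Exchange in that site.
        """
        # Prefer the healthy site that serves the client's region.
        for name, site in SITES.items():
            if client_region in site["regions"] and health.get(name):
                return site["vip"]
        # Otherwise fail over to any healthy site -- no admin intervention.
        for name, site in SITES.items():
            if health.get(name):
                return site["vip"]
        return None  # no site healthy; the query gets no useful answer

    # Head office down: clients in the south are silently sent north.
    print(answer_for("south", {"head-office": False, "manufacturing": True}))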

A load balancing example scenario

An example company, Lisa Jane Designs, has undergone a review of its IT infrastructure and determined that running Exchange only from a data center in its head office site presents a significant risk. The company has a second data center in a remote manufacturing facility, and wants Exchange to be available from this data center in the event of an outage. The desired infrastructure includes internal and external Exchange access as well as the domain name system (DNS) that supports it (Figure 1).

Desired data center infrastructure
Figure 1

Two new Exchange servers hosting database copies will be implemented in the remote data center. Lisa Jane Designs originally considered using round-robin DNS for all clients and allowing traffic to traverse the wide area network link, but it would prefer to connect clients to the primary site day to day.
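
As a quick way to see what that round-robin setup looks like from a client, a short Python check like the one below lists every address a mail hostname resolves to; with round-robin DNS, both data centers' addresses come back regardless of which one is closer. The hostname used here is just a placeholder for this example.

    # List every A record a client receives for the mail name. With plain
    # round-robin DNS both sites' addresses are returned, in rotating order.
    # The hostname is a placeholder; substitute your own namespace.
    import socket

    def resolve_all(hostname, port=443):
        infos = socket.getaddrinfo(hostname, port, proto=socket.IPPROTO_TCP)
        seen, addresses = set(), []
        for *_rest, sockaddr in infos:
            ip = sockaddr[0]
            if ip not in seen:           # de-duplicate, keep DNS order
                seen.add(ip)
                addresses.append(ip)
        return addresses

    print(resolve_all("mail.lisajanedesigns.example"))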

Free geo load balancers versus paid options

Free load balancers are a good option for smaller organizations, although their features are limited compared to paid-for geo load balancers. Kemp Technologies' free load balancer, for example, provides most of the features of the paid version, but it excludes the ability to run a failover pair, restricts speeds to 20 MB per second per load balancer and only includes community support. On my website, I provide a free HAProxy-based version; however, this option doesn't support geo-DNS-based multisite load balancing out of the box.

A number of devices on the market offer similar geo load balancing features and more at a cost, including Kemp's full version as well as offerings from F5 and other companies. The most compelling reason to purchase a load balancer is to unlock additional speed and obtain support from the vendor.

Prerequisite setup

Before we configure the load balancer, we'll make some assumptions about the state of the environment: the base Exchange Server 2013 infrastructure is set up and configured as shown in Figure 1, and, instead of load balancers, the initial environment is up and running with round-robin DNS. Through the rest of this guide, we'll move from that round-robin DNS setup to a fully load balanced configuration.

Security requirements

Each organization has different security requirements. Some may choose to install an internal-only set of load balancers and another set in the demilitarized zone (DMZ) to segregate traffic. In this example, we'll locate each load balancer on the local area network (LAN) and provide it with an arm in the DMZ.

On both the internal LAN and DMZ, we'll need an additional IP address to use as the load balanced "Virtual IP" for the Exchange service. We'll also need to set up basic firewall rules to cover the following scenarios (a quick connectivity check follows the list):

  • Communications on port 53/UDP (User Datagram Protocol) and TCP (Transmission Control Protocol) between each load balancer
  • Communications on port 22/TCP between each load balancer
  • Communications on port 443/TCP and 25/TCP from the load balancer to Exchange servers in the same site as each load balancer
  • Communications outbound on port 443/TCP to the Internet for license registration
  • Inbound communications on port 443/TCP and 25/TCP to each DMZ virtual IP
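
As a rough pre-flight check of those rules, a short script can attempt a TCP connection to each port the firewall must allow. This is only a sketch under assumed hostnames (substitute your own load balancer and Exchange server names), and it doesn't exercise the UDP half of the DNS rule.

    # Sketch: verify the TCP firewall rules above with simple connection
    # attempts. Hostnames are assumptions; replace them with your own.
    import socket

    CHECKS = [
        ("lb-remote.example",  53),   # DNS between load balancers (TCP only here)
        ("lb-remote.example",  22),   # SSH between load balancers
        ("exchange01.example", 443),  # HTTPS to an Exchange server in the same site
        ("exchange01.example", 25),   # SMTP to an Exchange server in the same site
    ]

    for host, port in CHECKS:
        try:
            with socket.create_connection((host, port), timeout=3):
                print(f"{host}:{port} reachable")
        except OSError as err:
            print(f"{host}:{port} blocked or unreachable ({err})")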

Because we're demonstrating the geo load balancing capabilities in this example scenario, we'll load balance at network Layer 4 only; this means HTTPS traffic will not be decrypted, examined and re-encrypted, but will simply be forwarded through. To keep the example simple, we'll monitor only the remote procedure call (RPC) over HTTPS virtual directory.
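
Because traffic is simply passed through at Layer 4, the service check is what tells the load balancer an Exchange server is healthy. The Python sketch below approximates such a check against the RPC over HTTPS virtual directory; the hostname is an assumption, and an unauthenticated request to /rpc typically comes back with an HTTP error such as 401, which still shows the virtual directory is answering.

    # Sketch of a health probe against the RPC over HTTPS virtual directory.
    # Hostname is an assumption. Certificate checks are relaxed because the
    # probe may target an internal name that isn't on the certificate.
    import http.client
    import ssl

    def rpc_vdir_responding(host, timeout=5):
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        try:
            conn = http.client.HTTPSConnection(host, 443, timeout=timeout, context=ctx)
            conn.request("GET", "/rpc/rpcproxy.dll")
            status = conn.getresponse().status
            # An unauthenticated probe is expected to be rejected (e.g. 401);
            # any non-5xx response still shows the service is answering.
            return status < 500
        except OSError:
            return False

    print(rpc_vdir_responding("exchange01.example"))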

Download the free load balancer

Start by obtaining and installing the load balancer. For our example, we'll use the free load balancer mentioned earlier. It's a virtual machine-based appliance that supports the major hypervisors, including Hyper-V, which we'll use here.

Install instances in each data center

The load balancer needs to be installed onto the virtual platform. This product requires two virtual CPUs and 2 GB of RAM -- check the requirements of whichever load balancer you select. Because the free version can't run as a failover pair, there is no on-site high availability (HA); use the hypervisor's HA features to provide additional availability alongside the geo load balancing and failover.

In this example, we extract the ZIP file from the download and then use Hyper-V's Import option to create a new instance on the virtual infrastructure (Figure 2).

New virtual infrastructure instance
Figure 2

After installation, we'll start the virtual machine (VM) and then browse to the IP address the VM obtained from the Dynamic Host Configuration Protocol (DHCP) server.

After logging in with the default username bal and the password in the PDF supplied with the download, choose a new, secure password.

If you aren't sure of the assigned IP address, open the VM's console on the hypervisor. After logging in with the same bal username and password, you'll see the auto-assigned address and have an opportunity to change it (Figure 3). We'll complete the configuration and set up multisite load balancing in part two.

Change auto-assigned address
Figure 3

About the author:
Steve Goodman is an Exchange MVP and is the head of unified communications at the U.K.’s leading Office 365 partner. Steve has worked in the IT industry for 16 years and has worked extensively with Microsoft Exchange since version 5.5. Steve is the author of a number of books about Exchange, regularly presents at conferences, co-hosts The UC Architects podcast and regularly blogs about Exchange Server, Office 365 and PowerShell at www.stevieg.org.

Next Steps

Use open source tools for Exchange 2013 load balancing

How to prevent workload strain with load balancing

Load balancing options for Exchange 2013

