Recently, Veeam has opened up its Veeam Cloud Connect – Service Provider framework to enterprises. Prior to this, you had to specifically be a service provider, and there was a 100-socket minimum purchase requirement (ruling out most enterprises), in order to deploy Cloud Connect.
This new offering opens Veeam Cloud Connect to enterprises looking to centrally manage their own data from various locations that were previously unreachable via Veeam Cloud Connect (through service providers).
- VCC-E is a perpetual add-on purchase in addition to Veeam® Availability Suite™ or Veeam Backup & Replication™ (not Veeam Backup Essentials™)
- There is no service provider requirement or requirement to join the VCP Program (the enterprise takes on the role of the service provider)
- Available with Veeam Availability Suite v8 Update 3
Here is a reference deployment architecture
In addition, they have added a Cloud Connect image to the Azure Marketplace to make deployment even easier!
Let’s take a look at how to shift your tape spend to Azure!
First, let's head over to the Azure Marketplace and find our image.
MAKE SURE not to choose the Service Provider option!
You’ll see that this uses the new Azure Resource Manager (ARM) framework, not the classic deployment method, which makes deployment very easy!
Let’s go ahead and configure our Virtual Machine Basics.
Note that in ARM deployments your resource group will contain all of your deployment objects, so you can also easily tear it down or shut it down (like in demos :))
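Because everything lives in one resource group, the whole lab can be removed with a single call. Here's a sketch using the current Az PowerShell module; the resource group name 'VCC-Lab' is a hypothetical placeholder — substitute your own.

```powershell
# Requires the Az PowerShell module and an authenticated session.
Connect-AzAccount

# Deleting the resource group removes the VM, disks, NICs, NSG, and
# public IP in one step. 'VCC-Lab' is an illustrative name.
Remove-AzResourceGroup -Name 'VCC-Lab' -Force
```

Handy for demo environments, but obviously not something to run against a resource group holding production backups!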
This is a lab, and given that the marketplace VM doesn’t allow for domain join, we’re going to just use a workgroup computer in Azure with a local user/password. In Production, you may want to consider connecting this via a VNET attached to a Site-to-Site Tunnel back to your corporate office, or via ExpressRoute.
Make sure to choose the appropriate subscription as well as the Azure Region you’d like your data to reside in.
Choose an appropriately sized image; please reference Veeam best practices for sizing here
Very important point here
By default, Azure VMs can only hold 16x1TB disks, 1 of which is used for the OS drive. If you need more than 16TB of backup capacity in the cloud (post-dedupe/compression), then you should look into premium storage, which can provide up to 32TB of storage. I recommend reviewing your storage requirements ahead of time.
NOTE: Premium disk can also be used for higher performance write operations on the Azure VM if needed.
Review the summary and deploy!
Note that you WILL be charged for the cost of the VM and Veeam is BYOL here, so reach out to your Veeam Sales rep for a trial key or to purchase!
Come back a bit later and you’ll see your newly created Veeam Cloud Connect server!
Now on to the fun part: configuration
First, choose Connect to open an RDP session to the Azure VM
Let’s first go ahead and add our disks that will be our Backup Repository.
We’ll just add 1 for this test, but you can add up to 16 total for standard storage
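If you prefer to script the disk attach instead of clicking through the portal, here's a sketch using the Az PowerShell module. The resource group and VM names are hypothetical placeholders; adjust them (and the LUN/size) to your deployment.

```powershell
# Fetch the VM object (illustrative names).
$vm = Get-AzVM -ResourceGroupName 'VCC-Lab' -Name 'veeam-vcc'

# Attach one empty 1023 GB data disk -- the maximum size per disk
# on standard storage at the time of writing.
Add-AzVMDataDisk -VM $vm -Name 'veeam-data-01' -Lun 0 `
    -DiskSizeInGB 1023 -CreateOption Empty

# Push the updated configuration back to Azure.
Update-AzVM -ResourceGroupName 'VCC-Lab' -VM $vm
```

Repeat with incrementing `-Lun` values for each additional disk, up to the limit of your VM size.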
Now let’s go to the server and set up our Storage Pool. This is best practice and recommended over software disk striping, and it also gives you a lot of flexibility!
If we navigate to File and Storage Services, you’ll see our newly added disk ready to be added
Let’s go ahead and create a new Storage Pool, which will allow us to easily expand in the future as well as provide options for parity and mirroring within the OS.
Give your Storage Pool a name
Select the disk(s)
Confirm the details
Time to create the pool!
Now, we need to create a new Virtual Disk on that pool to present to the server
Choose your recently created storage pool
Give your virtual disk a name
Choose the option for storage layout. I recommend NOT using Simple, as it stripes across all disks with no redundancy; if you lose one disk, you will lose your data. At minimum, Parity should be used, which requires a minimum of 3 disks.
We’re going to thick provision the disk, since it will be the only thing using the pool
Choose the max
And we’re off!
It will automatically launch the New Volume wizard.
Choose the server and disk
Size of the volume
File system details
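The wizard steps above can also be scripted end-to-end with the built-in Windows Storage cmdlets. A minimal sketch, assuming three pooled data disks so that Parity is available (pool, disk, and label names are illustrative):

```powershell
# Grab every disk eligible for pooling (our newly attached data disks).
$disks = Get-PhysicalDisk -CanPool $true

# Create the storage pool on the primordial storage subsystem.
New-StoragePool -FriendlyName 'VeeamPool' `
    -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName `
    -PhysicalDisks $disks

# Parity needs at least three disks; for a single-disk lab,
# use -ResiliencySettingName Simple instead.
New-VirtualDisk -StoragePoolFriendlyName 'VeeamPool' -FriendlyName 'VeeamVDisk' `
    -ResiliencySettingName Parity -ProvisioningType Fixed -UseMaximumSize

# Initialize, partition as E:, and format for use as the backup repository.
Get-VirtualDisk -FriendlyName 'VeeamVDisk' | Initialize-Disk -PassThru |
    New-Partition -DriveLetter E -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel 'VeeamBackup'
```

This is handy if you plan to stand up more than one Cloud Connect server, or want repeatable builds.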
Now that the backup disk for Veeam has been configured, we need to go ahead and configure Veeam in the Azure instance.
When logging on to the server for the first time, you’ll be prompted to enter the license file. We grabbed our trial key and uploaded it. You can leave the checkbox unchecked.
It will ask you to note the information below.
This is slightly out of date and was a different process in the Azure legacy portal, so I’ve documented the correct steps to do this in the current Azure portal.
You will need to create a public DNS name so that Veeam can connect to it.
NOTE: In the Azure Portal (not legacy) VCC-E template, port 6180 is published by default (shown below), you DO NOT need to take the steps above to open the endpoint port.
You can see it is already opened below
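If you'd rather confirm this from PowerShell than the portal, here's a sketch using the Az module; the NSG name below is a placeholder for whatever the template created in your resource group.

```powershell
# List the inbound rules on the template-created NSG (illustrative names);
# look for the rule publishing TCP 6180.
Get-AzNetworkSecurityGroup -ResourceGroupName 'VCC-Lab' -Name 'veeam-vcc-nsg' |
    Get-AzNetworkSecurityRuleConfig |
    Select-Object Name, DestinationPortRange, Access, Direction
```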
To set your public endpoint name, navigate to Public IP Addresses in Azure
Choose your Public IP name that has been created
Choose Configuration, and set your public-facing address to a unique name
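The same DNS label can be set from PowerShell. A hedged sketch with the Az module — resource names and the label are hypothetical, and note that if the public IP has never had a DNS label, `DnsSettings` may be null and you'll need to set it via the portal first (or construct the settings object yourself):

```powershell
# Fetch the public IP resource (illustrative names).
$pip = Get-AzPublicIpAddress -ResourceGroupName 'VCC-Lab' -Name 'veeam-vcc-pip'

# The label must be unique within the region; it becomes
# <label>.<region>.cloudapp.azure.com.
$pip.DnsSettings.DomainNameLabel = 'my-unique-vcc-name'
Set-AzPublicIpAddress -PublicIpAddress $pip
```

This fully qualified name is what you'll enter later in the Cloud Gateway settings.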
Now we have to configure the new repositories in the Azure Veeam instance.
Open the console on the Azure Veeam instance and go to Backup Repositories
Right-click to create a new Backup Repository
Choose your E: drive
Uncheck the option for vPower NFS, it will not be needed in the Azure instance.
Choose to change the default location from the C: repository to the E: repository
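Veeam also ships a PowerShell snap-in, so the repository can be registered from the console instead of the UI. A sketch — the cmdlet names are from the Veeam PowerShell reference, but verify the parameters against your version, and the repository name and path below are illustrative:

```powershell
# Load the Veeam PowerShell snap-in (installed with Backup & Replication).
Add-PSSnapin VeeamPSSnapIn

# Register E:\Backups on this server as a Windows-local repository.
Add-VBRBackupRepository -Name 'Azure Repository' -Folder 'E:\Backups' -Type WinLocal
```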
Now let’s set up the Cloud Connect user account that our on-premises Veeam server will use to connect.
Choose Add User…
Enter a username and password
Define which repository that account will be using and a quota.
Here you can create different user accounts for different source sites, with separate passwords and quotas if you require them; in this case, we’re using one account for all of our sites.
Choose your backup repository to associate with your cloud repository and choose the quota
We also need to add the public DNS name under the Cloud Gateway settings
Choose Finish and it will restart the cloud gateway service
Note that you can also define what certificates you want to use, by default it will be a self-signed certificate, which we’ll use in our lab here. In a production environment, you may want an externally secured or internally issued certificate.
We won’t proceed with changing that at this time
Now, finally to the on-premises configuration!
Let’s go to Backup Infrastructure and Add Service Provider
It will retrieve the certificate
You’ll need to provide the credentials of the Cloud Connect user account you created earlier so the connection can be verified
It will test a connection to the service provider
And you can see that it worked!
We know that because it shows the target volume information.
From here, it will now appear as a repository in your Backup Jobs, just like an on-premises repository.
You can choose to either back up directly to Azure or create a backup copy job to copy your on-premises data from an existing job to the cloud repository. This is ideal because it keeps a short-term amount of data on-premises for quick restores, say 7–10 days, while the rest is offloaded to inexpensive Azure storage!
Enjoy migrating your backup data to the cloud!