NAKIVO Amazon Cloud Integration


Recently I have been reviewing NAKIVO Backup and Replication 6.2 in my lab; see my previous post on the initial installation and integration here. I will now look at the Amazon Cloud integration. NAKIVO Backup and Replication integrates easily with AWS, which can be used as a backup repository for backup jobs or for offsite copy jobs. It can also be used to schedule Elastic Compute Cloud (EC2) backups and/or replication. The following will illustrate this integration.

First of all, I don't consume any AWS services myself, so I initially signed up for the free tier. Once signed up, I created a user account in the AWS IAM console and noted the Access Key ID and secret. The first step is to add my AWS details to the Inventory.
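For reference, here is a rough boto3 sketch of the same IAM steps done from code rather than the console. The user name and the broad AmazonEC2FullAccess managed policy are just examples I've picked for illustration; check NAKIVO's documentation for the exact permission set the product needs.

```python
# Minimal boto3 sketch of the IAM console steps: create a dedicated user
# for NAKIVO and generate the Access Key ID / secret to paste into Inventory.
# Assumes boto3 is already configured with admin credentials; the user name
# and policy ARN are illustrative placeholders, not NAKIVO's requirements.
import boto3

iam = boto3.client("iam")

iam.create_user(UserName="nakivo-backup")

# Attach a policy granting EC2/EBS permissions (example only - see the
# NAKIVO documentation for the precise permissions required).
iam.attach_user_policy(
    UserName="nakivo-backup",
    PolicyArn="arn:aws:iam::aws:policy/AmazonEC2FullAccess",
)

# Generate the Access Key ID and Secret Access Key for the Inventory entry.
key = iam.create_access_key(UserName="nakivo-backup")
print(key["AccessKey"]["AccessKeyId"])
print(key["AccessKey"]["SecretAccessKey"])
```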

Currently I have no instances configured.
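If you want to confirm the same thing from code, a quick boto3 check lists whatever instances exist in a region (the region below is just an example); on a fresh free-tier account it prints nothing.

```python
# List any EC2 instances in the region - mirrors the empty Inventory view.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")  # example region

for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])
```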

The ports that need to be open can be found here.
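As a rough illustration of what opening those ports looks like on the AWS side, here is a boto3 sketch that adds an inbound rule to a security group. The group ID, source CIDR and port number are placeholders of my own; use the actual ports listed in NAKIVO's documentation.

```python
# Sketch of opening an inbound port on the security group used by NAKIVO's
# AWS components. The port, group ID and CIDR below are placeholders only -
# take the real port list from the NAKIVO documentation linked above.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",       # placeholder security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 9446,                  # example port only
        "ToPort": 9446,
        "IpRanges": [{"CidrIp": "203.0.113.0/24", "Description": "NAKIVO"}],
    }],
)
```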

I now need to configure a Transporter in AWS under Configuration - Transporter - Add. The Transporter is the product component that does all of the heavy-lifting: it performs backup, replication, and recovery, as well as data compression, deduplication, and encryption.

Now that that is configured, I am going to launch a new Windows instance in AWS and go through the backup process. I deploy a new Windows Server 2016 instance and make sure I can access it.
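For anyone who prefers doing this from code rather than the EC2 console, a launch along these lines with boto3 would achieve the same thing. The AMI ID, key pair and security group are placeholders, so look up the current Windows Server 2016 Base AMI for your region.

```python
# Launch a test Windows instance with boto3. All IDs below are placeholders;
# substitute the current Windows Server 2016 Base AMI for your region.
import boto3

ec2 = boto3.resource("ec2", region_name="eu-west-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",        # Windows Server 2016 Base (example ID)
    InstanceType="t2.micro",                # free-tier eligible
    KeyName="my-keypair",                   # placeholder key pair for RDP access
    SecurityGroupIds=["sg-0123456789abcdef0"],
    MinCount=1,
    MaxCount=1,
)
print(instances[0].id)
```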

To back this up, I go to Dashboard - Create - Amazon EC2 Backup Job.

Select my new instance.

I now need to choose the repository for this backup. Currently I only have an on-site local repository, so I will choose that.

I choose a schedule.

Now I pick the job options, such as job name, encryption, application-aware backups, and retention points.

I then run the job, which will back up my AWS instance and store it locally on-site.

I now want to configure my appliance to store backups in AWS. I first need to set up a storage instance, which I can do from the NAKIVO console. I go to Configuration - Repositories - Add. Set the storage configuration to match your requirements and subscription type, and also configure NAKIVO features such as compression and deduplication.

Now I go to Dashboard - Create - VMware vSphere Backup Job. I pick the VM I want to back up to AWS storage.

Select the newly created AWS repository.

Select the same scheduling and backup job options as a usual backup job.

I run the job and it completes.

So now that I have an on-premises VM backed up to AWS, I want to test recovery. I go back to Dashboard - Recover. I can pick the recovery either by VM or by repository.

Select my on-premises restore location. Notice here that I can't restore to AWS, which is a shame, but I can recover to any vSphere target.

I set my recovery options and run the job.

The VM recovers just fine.

Backup Copy Jobs can also target AWS, simply by pointing them at the repository in AWS. I select the source backup job I want to copy.

I select the AWS repository.

I choose the schedule and job options.

This will run just like any other copy job but using AWS as the target.

Finally, I will show replication. It's possible to replicate an AWS instance to another region in AWS; sadly, I can't replicate from vSphere to AWS or from AWS to vSphere.

I go to Dashboard - Create - Amazon EC2 Replication Job. Select the instance in AWS I want to replicate.

Select the region to replicate to.

Select the schedule and job options and complete.
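For context, the raw AWS building blocks for moving an instance image between regions look roughly like the boto3 sketch below. NAKIVO drives its EC2 replication through its own Transporters, so this is purely illustrative with placeholder IDs, not a view into how the product works internally.

```python
# Illustrative only: the basic AWS-side mechanics of pushing an image of an
# instance to another region. Not NAKIVO's internal mechanism; IDs and names
# are placeholders.
import boto3

source = boto3.client("ec2", region_name="eu-west-1")
target = boto3.client("ec2", region_name="us-east-1")

# Create an AMI from the running instance in the source region.
image = source.create_image(
    InstanceId="i-0123456789abcdef0",
    Name="replica-source-image",
)

# Copy that AMI to the target region (a real script would wait for the AMI
# to become available first); it can then be launched there.
copied = target.copy_image(
    SourceImageId=image["ImageId"],
    SourceRegion="eu-west-1",
    Name="replica-target-image",
)
print(copied["ImageId"])
```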

Conclusion

I found the integration with AWS very simple and straightforward; I don't normally consume any AWS resources, and I still had this set up within an hour. The job options within NAKIVO are the same as for on-premises jobs, and the ease and speed of getting this set up will be very appealing. It's a fantastic way to configure offsite backups.

What would have made it perfect for me would be the ability to restore directly to AWS from a vSphere repository and run the restored VM in AWS, and vice versa. The same applies to replication: it would be great to replicate directly from vSphere to AWS.

Regardless of that, the product does what it says: it integrates simply with AWS as an offsite backup location and works very well.

 
