Pivotal Cloud Foundry (PCF) is a commercial platform based on the open-source Cloud Foundry project and offered as a collaboration between Pivotal, EMC, and GE. Pivotal Cloud Foundry runs on almost all popular cloud infrastructures, including VMware, AWS, and OpenStack. PCF as a platform is dynamic, developer friendly, and features full-lifecycle support.
Organizations implementing Pivotal Cloud Foundry as their cloud platform free themselves from managing application infrastructure. By integrating freely available third-party tools and services, they can also achieve high availability, auto-scaling, dynamic routing, support for multiple language runtimes, and log analysis.
Pivotal Cloud Foundry performs exceedingly well when intelligently designed and maintained, but there are still some time-consuming tasks that demand an admin's attention. One such operational task is ensuring that installation settings and essential internal databases are regularly backed up. Pivotal recommends that you back up your installation settings by exporting them at regular intervals (weekly, bi-weekly, monthly, etc.). We're going to discuss designing an effective and reliable backup process, and how to apply an archive when you need to restore your installation.
Note: According to Pivotal Cloud Foundry documentation, exporting your installation only backs up your installation settings. It does not back up your VMs or any external MySQL databases that you might have configured on the Ops Manager Director Config page.
Before jumping in, it's a good idea to make sure that you've covered all the prerequisites you'll need to make Pivotal Cloud Foundry happy.
Backing up a Pivotal installation is critical for the operation and availability of your Pivotal Cloud Foundry data center. Backing up Pivotal Cloud Foundry data centers is like creating restore points on a Windows machine. In the event of a crash or the failure of an upgrade process, you can restore your backed-up settings to fall back to an earlier, functional image. Here's what you'll need to do:
To make sure that your system is ready for the process, there are some important details to take care of in the pre-backup stage:
Pivotal Cloud Foundry’s Cloud Controller Database maintains a database with records of orgs, spaces, apps, services, service instances, user roles, etc. Backing up this database is critical if you want to protect your existing settings (and you DO want to protect your existing settings).
$bosh target <IP_OF_YOUR_OPS_MANAGER_DIRECTOR>
$bosh login
Your username: director
Enter password:
Logged in as 'director'
$bosh deployments >> /pcf-backup/deployments_09_20_2015.txt
$bosh download manifest DEPLOYMENT-NAME LOCAL-SAVE-NAME
$bosh download manifest cf-1234xyzabcd1234 cf-backup-09_20_2015.yml
$bosh deployment cf-backup-09_20_2015.yml
$bosh vms cf-1234xyzabcd1234
$bosh -d cf-backup-09_20_2015.yml stop cloud_controller-partition-cdabcd1234b253f40
$bosh -d cf-backup-09_20_2015.yml stop cloud_controller_worker-partition-cdabcd1234b253f40
ccdb:
  address: 1.2.99.16
  port: 2544
  db_scheme: postgres
VM Credentials: vcap / xyz1234567989pqr
$ssh vcap@<IP_ADDRESS_OF_CCDB>
$find /var/vcap | grep 'bin/psql'
Your output should look something like this:
/var/vcap/data/packages/postgres/b63fe0176a93609bd4ba44751ea490a3ee0f646c.1-9eea4f5b6de7b1d8fff28b94456f61e8e22740ce/bin/psql
$/var/vcap/data/packages/postgres/<random-string>/bin/pg_dump -h 1.2.99.16 -U admin -p 2544 ccdb > ccdb_09_20_2015.sql
$scp vcap@1.2.99.16:/home/vcap/ccdb_09_20_2015.sql /pcf-backup
This will complete the CCDB backup process.
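Before moving on, it can be worth sanity-checking the dump file. A plain-format pg_dump ends with a recognizable completion marker, so a minimal check (file name and path taken from the steps above; the helper name is our own) might look like:

```shell
#!/bin/sh
# check_dump: sanity-check a plain-format pg_dump file -- it should be
# non-empty and end with pg_dump's completion marker.
check_dump() {
  [ -s "$1" ] && tail -n 5 "$1" | grep -q "PostgreSQL database dump complete"
}

# Example, using the dump copied to /pcf-backup above:
check_dump /pcf-backup/ccdb_09_20_2015.sql \
  && echo "CCDB dump looks complete" \
  || echo "CCDB dump missing or truncated" >&2
```

This only catches truncated or empty dumps; a full restore test is still the only real proof of a backup.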
VM Credentials: vcap / xxxxxxxxxxxx
Credentials: root / xxxxxxxxxxxxxxxxxx
$ssh vcap@1.2.90.17
$find /var/vcap | grep 'bin/psql'
$/var/vcap/data/packages/postgres/<random-string>/bin/pg_dump -h 1.2.90.17 -U root -p 2544 uaa > uaa_09_20_2015.sql
$scp vcap@1.2.90.17:/home/vcap/uaa_09_20_2015.sql /pcf-backup
This completes the UAADB backup process.
The Console Database is referred to as the Apps Manager Database in Elastic Runtime 1.5.
VM Credentials: vcap / xxxxxxxxxxxxx
Credentials: root / xxxxxxxxxxxxxxxxxxx
$ssh vcap@1.2.90.18
$find /var/vcap | grep 'bin/psql'
$/var/vcap/data/packages/postgres/<random-string>/bin/pg_dump -h 1.2.90.18 -U root -p 2544 console > console_09_20_2015.sql
$scp vcap@1.2.90.18:/home/vcap/console_09_20_2015.sql /pcf-backup
This completes the Console Database backup process.
$ssh vcap@1.2.90.15
$cd /var/vcap/store
$tar cz shared > nfs_09_20_2015.tar.gz
$scp vcap@1.2.90.15:/var/vcap/store/nfs_09_20_2015.tar.gz /pcf-backup
This completes the NFS Server backup process.
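Since a truncated tarball is easy to miss, a quick readability check on the archive (file name taken from the steps above) is cheap insurance:

```shell
#!/bin/sh
# List the archive contents without extracting; a corrupt or truncated
# tarball makes tar exit non-zero.
tar tzf /pcf-backup/nfs_09_20_2015.tar.gz > /dev/null \
  && echo "NFS archive is readable" \
  || echo "NFS archive is corrupt or missing" >&2
```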
Back up your MySQL database:
$bosh download manifest p-mysql-abcd1234f2ad3752 mysql_09_20_2015.yml
$mysqldump -u root -p -h 1.2.90.20 --all-databases > user_databases_09_20_2015.sql
This completes the MySQL DB backup process.
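At this point every artifact should have landed in the backup directory. A small sketch like the following (assuming everything was copied into /pcf-backup and the date-stamped names used above; the function name is our own) can confirm nothing was missed:

```shell
#!/bin/sh
# verify_backups: confirm each expected backup artifact from the steps
# above exists and is non-empty in the backup directory.
verify_backups() {
  dir="$1"; stamp="$2"; missing=0
  for f in deployments_${stamp}.txt \
           ccdb_${stamp}.sql uaa_${stamp}.sql console_${stamp}.sql \
           nfs_${stamp}.tar.gz user_databases_${stamp}.sql; do
    if [ ! -s "$dir/$f" ]; then
      echo "MISSING or empty: $dir/$f" >&2
      missing=1
    fi
  done
  return $missing
}

# Example, matching the date stamp used throughout this post:
verify_backups /pcf-backup 09_20_2015 \
  && echo "All backup artifacts present" \
  || echo "Some artifacts are missing or empty" >&2
```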
$bosh -d cf-backup-09_20_2015.yml start cloud_controller-partition-cdabcd1234b253f40
$bosh -d cf-backup-09_20_2015.yml start cloud_controller_worker-partition-cdabcd1234b253f40
Restoring a Pivotal Cloud Foundry deployment requires restoring your installation settings and key system databases. Or, in other words, everything we backed up in the previous operations. You'll need to follow these steps:
We’ll use the UAADB as an example. The rest will follow the same process.
$bosh stop <uaa job>
$ssh vcap@[uaadb vm ip]
$/var/vcap/data/packages/postgres/<random-string>/bin/psql -U vcap -p 2544 uaa
drop schema public cascade;
create schema public;
$scp uaa.sql vcap@[uaadb vm IP]:
# on the UAADB server:
$/var/vcap/data/packages/postgres/<random-string>/bin/psql -U vcap -p 2544 uaa < uaa.sql
$bosh start <uaa job>
A Pivotal Cloud Foundry backup process can be scheduled (and scripted) to create restore points for your installation. You could also use the settings file backups to launch a new installation in a different availability zone or even on a different platform. PCF has provided excellent documentation on both the backup and restore process.
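For instance, the whole sequence could be wrapped in a script and driven by cron. The script path, log file, and schedule below are illustrative assumptions, not something Pivotal prescribes:

```shell
# Crontab entry: run a wrapper script for the backup steps above every
# Sunday at 02:00, appending all output to a log for later review.
# /usr/local/bin/pcf-backup.sh and the log path are assumptions.
0 2 * * 0 /usr/local/bin/pcf-backup.sh >> /var/log/pcf-backup.log 2>&1
```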
Thoughts? Add your comments below.