Python Script to Automate Database Backups


Database backups are a critical part of infrastructure management, and backup automation belongs in any AWS cloud infrastructure so that remote cloud engineers can restore a database quickly during an incident.

Therefore, we have created a sample Python script that backs up three databases (MySQL, MongoDB, and PostgreSQL) running on a remote server and stores the backup files in an AWS S3 bucket.

Step 1: We will upload the database backup files to an S3 bucket directly from the remote server. First, we connect to the remote server using the paramiko module. Key-based (password-less) authentication is enabled on the server, so only a username is needed to connect.

Step 2: Define the function mysql_dump(), which uses mysqldump to back up the MySQL databases. The whole operation is a single shell pipeline: dump the databases, compress the output with gzip, and stream the compressed data to the S3 bucket.
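The step above boils down to one shell pipeline. A minimal sketch of how that command string can be assembled (the helper name and sample values are ours; the bucket and key layout follow the script below):

```python
def build_mysql_cmd(db_host, db_user, password, bucket, host, timestamp):
    # One pipeline: dump all databases, gzip the stream, and pipe it
    # into `aws s3 cp -` so nothing is written to local disk.
    return (
        "mysqldump -h %s -u %s -p%s --all-databases --single-transaction "
        "--flush-logs --master-data=2 "
        "| gzip "
        "| aws s3 cp - s3://%s/%s/dump/mysql/mysql_full_bkup_%s.gz"
        % (db_host, db_user, password, bucket, host, timestamp)
    )

cmd = build_mysql_cmd("localhost", "admin", "secret",
                      "chkmt-bucket", "test2", "20240101_000000")
print(cmd)
```

Streaming to `aws s3 cp -` avoids ever holding the dump on the remote server's disk, which matters when the database is larger than the free space there.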

Step 3: Define the function mongo_dump(), which uses mongodump to create a backup of the MongoDB database and pipes it to the S3 bucket the same way.
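By default mongodump writes a directory of BSON files; to stream the backup through a pipe, the `--archive` flag (with no filename) sends a single archive to stdout instead. A hedged sketch of the command builder (helper name and sample values are ours):

```python
def build_mongo_cmd(db_host, db_user, password, bucket, host, timestamp):
    # --archive with no file argument streams the dump to stdout,
    # so it can be gzipped and piped into `aws s3 cp -`.
    return (
        "mongodump --host=%s -u %s -p %s --archive "
        "| gzip "
        "| aws s3 cp - s3://%s/%s/dump/mongod/mongo_full_bkup_%s.gz"
        % (db_host, db_user, password, bucket, host, timestamp)
    )

cmd = build_mongo_cmd("localhost", "admin", "secret",
                      "chkmt-bucket", "test2", "20240101_000000")
print(cmd)
```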

Step 4: Define the function pgres_dump(), which backs up the PostgreSQL database with the pg_dump command, producing output in tar format. Because pg_dump has no command-line option for passing the password, we must supply it separately through the PGPASSWORD environment variable.
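Since pg_dump reads the password from the environment rather than from a flag, the sketch below prefixes the command with PGPASSWORD (the helper name and sample values are ours; the database name chkmtq comes from the script):

```python
def build_pg_cmd(pg_password, db_host, db_user, db_name, bucket, host, timestamp):
    # pg_dump takes no password flag; PGPASSWORD is set just for this command.
    # -F t produces tar-format output on stdout, piped straight to S3.
    return (
        "PGPASSWORD='%s' pg_dump -h %s -U %s -F t %s "
        "| aws s3 cp - s3://%s/%s/dump/postgres/postgres_full_bkup_%s.tar"
        % (pg_password, db_host, db_user, db_name, bucket, host, timestamp)
    )

cmd = build_pg_cmd("secret", "localhost", "admin", "chkmtq",
                   "chkmt-bucket", "test2", "20240101_000000")
print(cmd)
```

A `.pgpass` file on the remote server would be a safer alternative to embedding the password in the command line, since the command (and the password) may be visible in the process list.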

The complete script below automates all three database backups, making the infrastructure easier to manage during an unexpected incident or a software product release.

#!/usr/bin/python3

import datetime
import subprocess

import paramiko

hosts = ["test2"]
db_host = "localhost"
user_name = "vagrant"
pg_password = "Vagrant@18"
db_user = "admin"
password = "vagrant@18"
pg_db = "chkmtq"
bucket = "chkmt-bucket"
now = datetime.datetime.now()
# Avoid ":" in the timestamp -- it is awkward in S3 object keys.
timestamp = now.strftime("%Y%m%d_%H%M%S")


def mysql_dump(ssh_client, name):
    print("Taking MYSQL DB Backup")
    # Dump, compress, and stream straight to S3 -- no local file is written.
    cmd = ("mysqldump -h %s -u %s -p%s --all-databases --single-transaction "
           "--flush-logs --master-data=2 | gzip | aws s3 cp - "
           "s3://%s/%s/dump/mysql/mysql_full_bkup_%s.gz"
           % (db_host, db_user, password, bucket, name, timestamp))
    stdin, stdout, stderr = ssh_client.exec_command(cmd)
    if stdout.channel.recv_exit_status() == 0:
        print("MYSQL Backup Successful")
    else:
        print("MYSQL Backup Failed")


def mongo_dump(ssh_client, name):
    print("\nTaking MONGO DB Backup")
    # --archive streams the dump to stdout so it can be piped to gzip.
    cmd = ("mongodump --host=%s -u %s -p %s --archive | gzip | aws s3 cp - "
           "s3://%s/%s/dump/mongod/mongo_full_bkup_%s.gz"
           % (db_host, db_user, password, bucket, name, timestamp))
    stdin, stdout, stderr = ssh_client.exec_command(cmd)
    if stdout.channel.recv_exit_status() == 0:
        print("MONGODB Backup Successful")
    else:
        print("MONGODB Backup Failed")


def pgres_dump(ssh_client, name):
    print("\nTaking POSTGRES DB Backup")
    # pg_dump reads the password from the PGPASSWORD environment variable.
    cmd = ("PGPASSWORD='%s' pg_dump -h %s -U %s -F t %s | aws s3 cp - "
           "s3://%s/%s/dump/postgres/postgres_full_bkup_%s.tar"
           % (pg_password, db_host, db_user, pg_db, bucket, name, timestamp))
    stdin, stdout, stderr = ssh_client.exec_command(cmd)
    if stdout.channel.recv_exit_status() == 0:
        print("POSTGRES Backup Successful")
    else:
        print("POSTGRES Backup Failed")


for host in hosts:
    # Skip any host that does not answer a ping.
    ping = subprocess.run(["ping", "-c", "2", host],
                          stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    if ping.returncode != 0:
        print(f"{host} server is down")
        continue
    ssh_client = paramiko.SSHClient()
    ssh_client.load_system_host_keys()
    ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh_client.connect(hostname=host, username=user_name)
    mysql_dump(ssh_client, host)
    mongo_dump(ssh_client, host)
    pgres_dump(ssh_client, host)
    ssh_client.close()

Finally, schedule this script in the crontab so that it runs automatically every week, for instance every Sunday at 12 AM.
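A minimal crontab entry for that weekly schedule might look like this (the script path and log file are illustrative assumptions):

```shell
# min hour day-of-month month day-of-week  command
# 0 0 * * 0 = every Sunday at 12 AM (midnight)
0 0 * * 0 /usr/bin/python3 /opt/scripts/db_backup.py >> /var/log/db_backup.log 2>&1
```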


Does Checkmate have the right answer for every infrastructure issue?

Our experienced DevOps engineers manage daily IT operations, optimize costs, and automate infrastructure. Our cloud computing services can help you stabilize your infrastructure for faster product releases. If you need dependable cloud and DevOps expertise to achieve your business objectives faster, look no further.

Contact us about our Virtual CTO Services to learn about effective cloud strategy services in India, product management, cloud engineering, technology architecture consulting, IT staff augmentation services in India, hiring software developers in India, and DevOps management services. We can help you hire experienced cloud engineers to manage your DevOps infrastructure stack.
