Task 1 – Create a bash script to back up and restore a database
Do the following tasks:
Requirements
1. Explore basic PostgreSQL commands
2. Learn bash scripting
3. You can create a dummy database for testing or get one from somewhere else
Scripts:
1. Back up a database (any) on PostgreSQL to S3
2. Restore a database (any) to PostgreSQL from S3
Solution
- Provision an Ubuntu 22.04/24.04 server
- Create an IAM user and generate access keys
- Create an S3 bucket to store the backups
- Install PostgreSQL on the server
To install PostgreSQL, first refresh your server’s local package index
sudo apt update
Then, install the Postgres package along with the -contrib package, which adds some additional utilities and functionality
sudo apt install postgresql postgresql-contrib -y
One way is to switch over to the postgres account on your server by running the following command
sudo -i -u postgres
Then you can access the Postgres prompt by running
psql
This will log you into the PostgreSQL prompt, and from here you are free to interact with the database management system right away.
To exit out of the PostgreSQL prompt, run the following
\q
This will bring you back to the postgres Linux command prompt. To return to your regular system user, run the exit command
exit
Another way to connect to the Postgres prompt is to run the psql command as the postgres account directly with sudo
sudo -u postgres psql
This will log you directly into Postgres without the intermediate bash shell.
Again, you can exit the interactive Postgres session by running the following
\q
But I want PostgreSQL to require a password whenever I log in. Let's set one up.
Before that, log in to psql again
sudo -u postgres psql
ALTER USER postgres WITH PASSWORD 'password';
Let's create a database that we can store data in and back up
CREATE DATABASE my_database;
Connect to the new database
\c my_database;
Create a users table to store the data
CREATE TABLE users (
id SERIAL PRIMARY KEY,
name VARCHAR(100),
email VARCHAR(100)
);
Here is some dummy data to insert into the users table we just created
-- Insert 50 dummy data entries into the users table
INSERT INTO users (name, email) VALUES
('Alice Johnson', 'alice.johnson@example.com'),
('Bob Smith', 'bob.smith@example.com'),
('Charlie Brown', 'charlie.brown@example.com'),
('Diana Prince', 'diana.prince@example.com'),
('Ethan Hunt', 'ethan.hunt@example.com'),
('Fiona Gallagher', 'fiona.gallagher@example.com'),
('George Washington', 'george.washington@example.com'),
('Hannah Montana', 'hannah.montana@example.com'),
('Ian Malcolm', 'ian.malcolm@example.com'),
('Julia Roberts', 'julia.roberts@example.com'),
('Kevin Bacon', 'kevin.bacon@example.com'),
('Laura Croft', 'laura.croft@example.com'),
('Michael Scott', 'michael.scott@example.com'),
('Nina Simone', 'nina.simone@example.com'),
('Oscar Wilde', 'oscar.wilde@example.com'),
('Paula Abdul', 'paula.abdul@example.com'),
('Quentin Tarantino', 'quentin.tarantino@example.com'),
('Rachel Green', 'rachel.green@example.com'),
('Steve Jobs', 'steve.jobs@example.com'),
('Tina Fey', 'tina.fey@example.com'),
('Uma Thurman', 'uma.thurman@example.com'),
('Victor Hugo', 'victor.hugo@example.com'),
('Wanda Maximoff', 'wanda.maximoff@example.com'),
('Xena Warrior', 'xena.warrior@example.com'),
('Yara Shahidi', 'yara.shahidi@example.com'),
('Zach Galifianakis', 'zach.galifianakis@example.com'),
('Alice Cooper', 'alice.cooper@example.com'),
('Bob Marley', 'bob.marley@example.com'),
('Cathy Freeman', 'cathy.freeman@example.com'),
('David Beckham', 'david.beckham@example.com'),
('Eva Mendes', 'eva.mendes@example.com'),
('Frank Sinatra', 'frank.sinatra@example.com'),
('Gina Rodriguez', 'gina.rodriguez@example.com'),
('Henry Cavill', 'henry.cavill@example.com'),
('Isla Fisher', 'isla.fisher@example.com'),
('Jack Sparrow', 'jack.sparrow@example.com'),
('Kylie Jenner', 'kylie.jenner@example.com'),
('Leonardo DiCaprio', 'leonardo.dicaprio@example.com'),
('Megan Fox', 'megan.fox@example.com'),
('Nicolas Cage', 'nicolas.cage@example.com'),
('Olivia Wilde', 'olivia.wilde@example.com'),
('Pablo Picasso', 'pablo.picasso@example.com'),
('Queen Latifah', 'queen.latifah@example.com'),
('Ryan Gosling', 'ryan.gosling@example.com'),
('Selena Gomez', 'selena.gomez@example.com'),
('Tom Hanks', 'tom.hanks@example.com'),
('Uma Thurman', 'uma.thurman@example.com'),
('Vin Diesel', 'vin.diesel@example.com'),
('Will Smith', 'will.smith@example.com'),
('Xander Cage', 'xander.cage@example.com'),
('Yasmine Bleeth', 'yasmine.bleeth@example.com'),
('Zoe Saldana', 'zoe.saldana@example.com');
Okay, check whether the data was added
SELECT * FROM users;
If you try to log in with a password now, you will hit an issue: the default pg_hba.conf uses peer authentication for the postgres user. Edit the file
cd /etc/postgresql/16/main/
sudo vi pg_hba.conf
and change the authentication method for the postgres user from peer to md5
Reload PostgreSQL to apply the change, then check its status
sudo systemctl reload postgresql
sudo systemctl status postgresql
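The manual edit above can also be scripted. Here is a hedged sketch (the exact column spacing in pg_hba.conf varies between installs, so the pattern is an assumption; it is run against a scratch copy here, not the real file):

```shell
# Make a scratch copy so we don't touch the real pg_hba.conf while experimenting.
# The sample line mirrors the default entry for the postgres user.
cp_file=$(mktemp)
printf 'local   all             postgres                                peer\n' > "$cp_file"

# Replace the authentication method 'peer' with 'md5' on the postgres line only.
sed -i -E 's/^(local\s+all\s+postgres\s+)peer/\1md5/' "$cp_file"

cat "$cp_file"
```

The same `sed` expression could be pointed at the real file with `sudo`, but verifying it on a copy first avoids locking yourself out of the database.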
Let's install the AWS CLI
sudo apt install unzip -y
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
Add the access keys generated for the IAM user
aws configure
Create a .env file
vi .env
# .env
PG_USER="postgres"
PG_PASSWORD="password"
PG_DATABASE="my_database" # Database to back up
PG_TARGET_DATABASE="new_database" # Database to restore to
S3_BUCKET="BUCKET_NAME"
S3_PATH="PATH"
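The backup and restore scripts load this file with `export $(grep -v '^#' .env | xargs)`. A minimal sketch of how that line works, using a throwaway file (note this simple approach assumes values contain no spaces):

```shell
# Create a throwaway .env-style file for demonstration.
cat > demo.env <<'EOF'
# demo.env
PG_USER="postgres"
PG_DATABASE="my_database"
EOF

# grep -v '^#' drops comment lines; xargs flattens the KEY="value" pairs
# onto one line (stripping the quotes) so export can process them all at once.
export $(grep -v '^#' demo.env | xargs)

echo "$PG_USER $PG_DATABASE"   # → postgres my_database
```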
Create a backup script
vi backup.sh
Add script
#!/bin/bash
# Load environment variables from .env file
export $(grep -v '^#' .env | xargs)
# Backup Script
BACKUP_FILE="backup_$(date +"%Y%m%d_%H%M%S").sql"
# Perform database backup
PGPASSWORD="$PG_PASSWORD" pg_dump -U "$PG_USER" -d "$PG_DATABASE" > "$BACKUP_FILE"
# Compress the backup into a tar.gz file
tar -czvf "$BACKUP_FILE.tar.gz" "$BACKUP_FILE"
# Upload backup to Amazon S3
aws s3 cp "$BACKUP_FILE.tar.gz" "s3://$S3_BUCKET/$S3_PATH/$BACKUP_FILE.tar.gz"
# Check if the upload to S3 was successful
if [ $? -eq 0 ]; then
echo "Backup uploaded to S3 successfully. Removing local backup files."
# Remove the original SQL backup file and compressed file
rm "$BACKUP_FILE"
rm "$BACKUP_FILE.tar.gz"
else
echo "Failed to upload backup to S3. Keeping local backup files."
fi
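The `date +"%Y%m%d_%H%M%S"` timestamp in the backup filename is what lets the restore script later pick the newest backup with a plain `sort`: zero-padded year-month-day-time ordering makes lexicographic order match chronological order. A quick sketch with two hypothetical filenames:

```shell
# Two backup names from different moments; the later timestamp sorts last.
older="backup_20240101_090000.sql.tar.gz"
newer="backup_20241231_233000.sql.tar.gz"

# Lexicographic sort puts the chronologically newest name at the end.
latest=$(printf '%s\n%s\n' "$newer" "$older" | sort | tail -n 1)
echo "$latest"   # → backup_20241231_233000.sql.tar.gz
```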
Give the backup script execute permission and run it
sudo chmod +x backup.sh
./backup.sh
Create a restore script
vi restore.sh
Add script
#!/bin/bash
# Load environment variables from .env file
export $(grep -v '^#' .env | xargs)
# Get the latest backup file from S3
LATEST_BACKUP_FILE=$(aws s3 ls s3://$S3_BUCKET/$S3_PATH/ | sort | tail -n 1 | awk '{print $4}')
if [ -z "$LATEST_BACKUP_FILE" ]; then
echo "No backup files found in S3."
exit 1
fi
# Download the latest backup file from S3
aws s3 cp s3://$S3_BUCKET/$S3_PATH/$LATEST_BACKUP_FILE $LATEST_BACKUP_FILE
# Unzip the backup file
tar -xzvf $LATEST_BACKUP_FILE
# Extract the SQL file name from the tar.gz file
SQL_FILE=$(basename "$LATEST_BACKUP_FILE" .tar.gz)
# Drop the target database if it exists
PGPASSWORD="$PG_PASSWORD" psql -U $PG_USER -d postgres -c "DROP DATABASE IF EXISTS $PG_TARGET_DATABASE;"
# Create the target database
PGPASSWORD="$PG_PASSWORD" psql -U $PG_USER -d postgres -c "CREATE DATABASE $PG_TARGET_DATABASE;"
# Restore the database backup to the target database
PGPASSWORD="$PG_PASSWORD" psql -U $PG_USER -d $PG_TARGET_DATABASE < "$SQL_FILE"
# Verify the restoration
PGPASSWORD="$PG_PASSWORD" psql -U $PG_USER -d $PG_TARGET_DATABASE -c "SELECT COUNT(*) FROM users;"
# Remove the backup files if the verification is successful
if [ $? -eq 0 ]; then
echo "Restoration successful. Removing local backup files."
rm "$SQL_FILE"
rm "$LATEST_BACKUP_FILE"
else
echo "Restoration failed. Keeping local backup files."
fi
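The `aws s3 ls | sort | tail -n 1 | awk '{print $4}'` pipeline in the restore script can be sketched against fake listing output. The listing format used below (date, time, size, object key in four whitespace-separated fields) reflects how `aws s3 ls` prints objects:

```shell
# Simulated `aws s3 ls` output: date, time, size, object key.
listing='2024-01-01 09:00:00    1024 backup_20240101_090000.sql.tar.gz
2024-06-15 12:30:00    2048 backup_20240615_123000.sql.tar.gz
2024-12-31 23:30:00    4096 backup_20241231_233000.sql.tar.gz'

# sort orders the lines by the leading date/time; tail keeps the newest;
# awk prints the fourth field, which is the object key.
LATEST_BACKUP_FILE=$(printf '%s\n' "$listing" | sort | tail -n 1 | awk '{print $4}')
echo "$LATEST_BACKUP_FILE"   # → backup_20241231_233000.sql.tar.gz
```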
Check whether the restore worked by logging into psql
sudo -u postgres psql
\c new_database;
SELECT * FROM users;
The shell scripts for backup and restore are working successfully.