Removed old BackupServer and added new changes #89

Open · wants to merge 8 commits into `master`
2 changes: 2 additions & 0 deletions SETUP.md
@@ -602,6 +602,8 @@ To access mqtt channel, user needs credentials to access it.
# mosquitto_passwd -c /etc/mosquitto/credentials/passwd <user>
Password:
Reenter password:

# chmod 644 /etc/mosquitto/credentials/passwd
```

3. Close the connection to mqtts (Ctrl+D).
2 changes: 1 addition & 1 deletion apiserver/Dockerfile
@@ -4,7 +4,7 @@

# Build the APISERVER using phusion base image

FROM phusion/baseimage:master-amd64
FROM phusion/baseimage:jammy-1.0.1

# Enabling SSH service
RUN rm -f /etc/service/sshd/down
33 changes: 18 additions & 15 deletions cron-backup/Dockerfile → backup/Dockerfile
100755 → 100644
@@ -7,24 +7,25 @@
# 5. mongodb

# To find the version of installed Mongodb service
FROM mongo:latest AS mongodb
FROM mongo:5.0.11 AS mongodb
RUN env | grep MON > /root/env


# Building cron-backup instance
FROM phusion/baseimage:master-amd64
FROM phusion/baseimage:jammy-1.0.1
# Copying mongodb's version
COPY --from=mongodb /root/env /root/env

# Install the same MongoDB tools as the copied version in this backup instance
RUN set -x \
&& export $(xargs < /root/env) \
&& echo "deb http://$MONGO_REPO/apt/ubuntu focal/${MONGO_PACKAGE%-unstable}/$MONGO_MAJOR multiverse" | tee "/etc/apt/sources.list.d/${MONGO_PACKAGE%-unstable}.list" \
&& apt-key adv --keyserver keyserver.ubuntu.com --recv-keys B00A0BD1E2C63C11 \
&& export DEBIAN_FRONTEND=noninteractive && apt-get update && ln -s /bin/true /usr/local/bin/systemctl && apt-get install -y \
${MONGO_PACKAGE}=$MONGO_VERSION \
${MONGO_PACKAGE}-tools=$MONGO_VERSION

&& echo "deb http://security.ubuntu.com/ubuntu focal-security main" | tee /etc/apt/sources.list.d/focal-security.list \
&& apt-get install -y gpg curl \
&& curl -fsSL https://pgp.mongodb.com/server-7.0.asc | \
gpg -o /usr/share/keyrings/mongodb-server-7.0.gpg \
--dearmor\
&& echo "deb [ arch=amd64,arm64 signed-by=/usr/share/keyrings/mongodb-server-7.0.gpg ] https://repo.mongodb.org/apt/ubuntu jammy/mongodb-org/7.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-7.0.list\
&& apt-get update \
&& apt-get install -y mongodb-org mongodb-org-database mongodb-org-server mongodb-org-shell mongodb-org-mongos mongodb-org-tools

# some basic package installation for troubleshooting
RUN apt-get update && apt-get install -y \
@@ -80,16 +81,18 @@ RUN chmod +x /bin/nginx_backup.sh
COPY mqtts_backup.sh /bin/mqtts_backup.sh
RUN chmod +x /bin/mqtts_backup.sh

# Scheduling script (startup.sh) installed as the runit startup service
COPY startup.sh /etc/service/startup/run
RUN chmod +x /etc/service/startup/run

# Backup script for mongodb
COPY mongodb_backup.sh /bin/mongodb_backup.sh
RUN chmod +x /bin/mongodb_backup.sh
#COPY mongodb_backup.sh /etc/service/mongodb_backup/run
#RUN chmod +x /etc/service/mongodb_backup/run


# Start the postfix daemon during container startup
RUN mkdir -p /etc/my_init.d
COPY postfix.sh /etc/my_init.d/postfix.sh
RUN chmod +x /etc/my_init.d/postfix.sh

# To Enable crontab
RUN mkdir -p /etc/my_init.d
COPY cron.sh /etc/my_init.d/cron.sh
RUN chmod +x /etc/my_init.d/cron.sh
# end of file
50 changes: 50 additions & 0 deletions backup/README.md
@@ -0,0 +1,50 @@
# [backup](.) Docker Container Usage

This instance provides backup support for the `Nginx`, `Node-red`, `Grafana` and `Mqtts` containers and pushes the backed-up data to S3-compatible storage.

## Shell script

For backing up the directory data

- It uses [`grafana_backup.sh`](./grafana_backup.sh) for the `Grafana` container.
- It uses [`nodered_backup.sh`](./nodered_backup.sh) for the `Node-red` container.
- It uses [`nginx_backup.sh`](./nginx_backup.sh) for the `Nginx` container.
- It uses [`mqtts_backup.sh`](./mqtts_backup.sh) for the `Mqtts` container.
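
Each of these scripts follows the same basic shape as the full `grafana_backup.sh` added in this PR: archive the source directory with `tar`, push the archive to the S3 bucket with `s3cmd`, mail a report, and remove the local copy. A minimal sketch of that shared pattern (Grafana paths shown; `SOURCE_NAME`, `BACKUP_MAIL` and `S3_BUCKET_GRAFANA` are supplied by the container environment):

``` bash
#!/bin/bash
# Minimal sketch of the pattern the backup scripts share (Grafana shown).
DATE1=$(date +%Y%m%d%H%M)
SRC=/grafana                      # directory to back up
OUT=/var/lib/backup/grafana       # local staging directory
mkdir -p "$OUT"

# 1. Archive the source directory
tar czf "$OUT/${SOURCE_NAME}_grafana_data_backup_${DATE1}.tgz" "$SRC"/

# 2. Push the archive to the S3-compatible bucket and record the result
if s3cmd put -r --no-mime-magic "$OUT"/ "s3://${S3_BUCKET_GRAFANA}/grafana/"; then
    echo "STATUS: Grafana backup succeeded." >> /tmp/report.txt
else
    echo "STATUS: Grafana backup failed." >> /tmp/report.txt
fi

# 3. Mail the report and remove the local copy
< /tmp/report.txt mail -s "${SOURCE_NAME}: Grafana Data Backup" "${BACKUP_MAIL}"
find "$OUT" -type f -exec rm {} \;
rm /tmp/report.txt
```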

## Scheduling backups with a daemon process

The following backup jobs are scheduled to run at specific times each day:

``` bash

# Start up the Process
while true
do
HOUR="$(date +'%H')"
MINUTE="$(date +'%M')"

if [ "$HOUR" = "06" ] && [ "$MINUTE" = "35" ]
then
/bin/nodered_backup.sh
sleep 60
fi
if [ "$HOUR" = "07" ] && [ "$MINUTE" = "35" ]
then
/bin/grafana_backup.sh
sleep 60
fi
if [ "$HOUR" = "08" ] && [ "$MINUTE" = "35" ]
then
/bin/nginx_backup.sh
sleep 60
fi
if [ "$HOUR" = "09" ] && [ "$MINUTE" = "35" ]
then
/bin/mqtts_backup.sh
sleep 60
fi
done
```
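
In the Dockerfile this loop (`startup.sh`) is copied to `/etc/service/startup/run`, so the `phusion/baseimage` init system (runit) starts it when the container boots and restarts it if it ever exits.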

## Mail Alert

The backup shell scripts above are configured to send a mail report for both successful and unsuccessful runs.
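
Concretely, each script appends its status lines to a plain-text report (e.g. `/tmp/grafana.txt`) and pipes that file to `mail`; postfix is started at container boot by `postfix.sh`, and the recipient and subject prefix come from the `BACKUP_MAIL` and `SOURCE_NAME` environment variables:

``` bash
# Report delivery as used in grafana_backup.sh
< /tmp/grafana.txt mail -s "${SOURCE_NAME}: Grafana Data Backup" "${BACKUP_MAIL}"
```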
144 changes: 144 additions & 0 deletions backup/grafana_backup.sh
@@ -0,0 +1,144 @@
#!/bin/bash
#Purpose: This shell script takes a backup, sends it to the S3 bucket, and prunes old data in the S3 bucket.
#Version:v0.1
#Created Date:2022-08-26
#Modified Date:12-10-2022
#Reviewer: Terry Moore.
#Author: Shashi, VishnuNambi.

# Month of the current run; b..e are the quarter-end months used by the yearly backup check
a=$(date +%b)
b=Mar
c=Jun
d=Sep
e=Dec
DATE1=$(date +%Y%m%d%H%M)    # timestamp embedded in the archive file name
DATE=$(date +%d-%m-%y_%H-%M) # timestamp shown in the mail report

mkdir -p /var/lib/backup/grafana

grafana_src='/grafana'

if [ ! -d $grafana_src ]; then
{
echo "DATE:" "$DATE"
echo ""
echo "DESCRIPTION: ${SOURCE_NAME}_Grafana backup"
echo ""
echo "STATUS: Grafana backup failed"
echo ""
echo "The source backup directory: grafana_src is not available"
}>> /tmp/grafana.txt
< /tmp/grafana.txt mail -s "${SOURCE_NAME}: Grafana Data Backup" "${BACKUP_MAIL}"
exit
else
tar cvzf /var/lib/backup/grafana/"${SOURCE_NAME}"_grafana_data_backup_"${DATE1}".tgz ${grafana_src}/
fi

# Moving the backup to S3 bucket (Daily backup)
if s3cmd put -r --no-mime-magic /var/lib/backup/grafana/ s3://"${S3_BUCKET_GRAFANA}"/grafana/;
then
{
echo "DATE:" "$DATE"
echo ""
echo "DESCRIPTION: ${SOURCE_NAME}_Grafana Daily backup"
echo ""
echo "STATUS: Grafana Daily backup succeeded."
echo ""
echo "******* Grafana Data Backup ****************"
echo ""
s3cmd ls --no-mime-magic s3://"${S3_BUCKET_GRAFANA}"/grafana/ --human-readable | grep -i "${SOURCE_NAME}"_grafana_data | cut -d' ' -f3- | tac | head -10 | sed "s,s3:\/\/""${S3_BUCKET_GRAFANA}""\/,,g" &>> /tmp/grafana.txt
echo ""
echo "************** END **************************"
} >> /tmp/grafana.txt
else
{ echo "DATE:" "$DATE"
echo ""
echo "DESCRIPTION: ${SOURCE_NAME}_Grafana Daily backup"
echo ""
echo "STATUS: Grafana Daily backup failed"
echo ""
echo "Something went wrong, please check it"
} >> /tmp/grafana.txt
< /tmp/grafana.txt mail -s "${SOURCE_NAME}: Grafana Data Backup" "${BACKUP_MAIL}"
fi


# Moving the backup to S3 bucket (Monthly backup): runs on the last day of the month, i.e. when tomorrow is the 1st
if [ "$(date -d +1day +%d)" -eq 01 ]; then
if s3cmd put -r --no-mime-magic /var/lib/backup/grafana/ s3://"${S3_BUCKET_GRAFANA}"/monthly_backup/grafana/;
then
{
echo "DATE:" "$DATE"
echo ""
echo "DESCRIPTION: ${SOURCE_NAME}_Grafana Monthly backup"
echo ""
echo "STATUS: Grafana Monthly backup succeeded."
echo "" >> /tmp/grafana.txt
echo "******* Grafana Data Backup ****************"
echo ""
s3cmd ls --no-mime-magic s3://"${S3_BUCKET_GRAFANA}"/monthly_backup/grafana/ --human-readable | grep -i "${SOURCE_NAME}"_grafana_data | cut -d' ' -f3- | tac | head -10 | sed "s,s3:\/\/""${S3_BUCKET_GRAFANA}""/monthly_backup/grafana/\/,,g" &>> /tmp/grafana.txt
echo ""
echo "************** END **************************"
} >> /tmp/grafana.txt
else
{
echo "DATE:" "$DATE"
echo ""
echo "DESCRIPTION: ${SOURCE_NAME}_Grafana Monthly backup"
echo ""
echo "STATUS: Grafana Monthly backup failed"
echo ""
echo "Something went wrong, please check it"
}>> /tmp/grafana.txt
< /tmp/grafana.txt mail -s "${SOURCE_NAME}: Grafana Data Backup" "${BACKUP_MAIL}"
fi
fi


# Moving the backup to S3 bucket (Yearly backup): runs on the last day of Mar, Jun, Sep and Dec
if [ "$a" == "$b" ] || [ "$a" == "$c" ] || [ "$a" == "$d" ] || [ "$a" == "$e" ] && [ "$(date -d +1day +%d)" -eq 01 ]; then
if s3cmd put -r --no-mime-magic /var/lib/backup/grafana/ s3://"${S3_BUCKET_GRAFANA}"/yearly_backup/grafana/;
then
{
echo "DATE:" "$DATE"
echo ""
echo "DESCRIPTION: ${SOURCE_NAME}_Grafana Yearly backup"
echo ""
echo "STATUS: Grafana Yearly backup succeeded."
echo ""
echo "******* Grafana Data Backup ****************"
echo ""
s3cmd ls --no-mime-magic s3://"${S3_BUCKET_GRAFANA}"/yearly_backup/grafana/ --human-readable | grep -i "${SOURCE_NAME}"_grafana_data | cut -d' ' -f3- | tac | head -10 | sed "s,s3:\/\/""${S3_BUCKET_GRAFANA}""/yearly_backup/grafana/\/,,g" &>> /tmp/grafana.txt
echo ""
echo "************** END **************************"
} >> /tmp/grafana.txt
else
{
echo "DATE:" "$DATE"
echo ""
echo "DESCRIPTION: ${SOURCE_NAME}_Grafana Yearly backup"
echo ""
echo "STATUS: Grafana Yearly backup failed"
echo ""
echo "Something went wrong, please check it"
}>> /tmp/grafana.txt
< /tmp/grafana.txt mail -s "${SOURCE_NAME}: Grafana Data Backup" "${BACKUP_MAIL}"
fi
fi


< /tmp/grafana.txt mail -s "${SOURCE_NAME}: Grafana Data Backup" "${BACKUP_MAIL}"

# Remove the old backup data in local directory to avoid excessive storage use
find /var/lib/backup/grafana/ -type f -exec rm {} \;
rm /tmp/grafana.txt
###PRUNE###

# prune the old backup data in S3 bucket to avoid excessive storage use(Daily backup)
s3cmd ls -r s3://"${S3_BUCKET_GRAFANA}"/grafana/ | awk -v DEL="$(date +%F -d "31 days ago")" '$1 < DEL {print $4}' | while read -r file; do s3cmd rm "$file"; done


if [ "$(date -d +1day +%d)" -eq 01 ]; then
# prune the old backup data in S3 bucket to avoid excessive storage use(Monthly backup)
s3cmd ls -r s3://"${S3_BUCKET_GRAFANA}"/monthly_backup/grafana/ | awk -v DEL="$(date +%F -d "366 days ago")" '$1 < DEL {print $4}' | while read -r file; do s3cmd rm "$file"; done
fi
File renamed without changes.