In this article, we discuss a bash script to upload CyberPanel backups to Google Drive. CyberPanel is one of the best open-source control panels and supports the OpenLiteSpeed web server out of the box; its enterprise edition supports LiteSpeed Web Server Enterprise. More details can be found on their website.
We can schedule backups from CyberPanel at Main » Backup » Create Backup, with the destination set to the “Home” directory. Backups will then be taken at the scheduled time into each user’s home directory.
Script to upload CyberPanel backups to Google Drive
[Updated: In the latest versions, the backup locations could no longer be determined when using BackupScheduleLocal.py as before, so support has been added for taking backups with the CyberPanel CLI.]
The script below generates a fresh backup for each website, moves the archives into a single temporary directory for easy uploading, and uploads them to Google Drive using gdrive, a tool developed by prasmussen. Once uploaded, the backups and the temporary directory are deleted.
#!/bin/bash
# This script takes CyberPanel backups and uploads them to Google Drive
# gdrive is adopted from https://github.com/prasmussen/gdrive
# Author : Arun D
# Rev : 2.0

# Check whether gdrive is already installed
if [ ! -e /usr/local/bin/gdrive ]
then
    echo "gdrive not found. Installing it"
    wget -O gdrive "https://docs.google.com/uc?id=0B3X9GlR6EmbnQ0FtZmJJUXEyRTA&export=download"
    sudo install gdrive /usr/local/bin/gdrive
    echo "Configure your Google Drive connection"
    gdrive about
    echo "gdrive installed and linked to your account. Please edit the G_ID variable with the directory ID and re-run the script"
    exit
fi

# G_ID is the Google Drive directory ID. To get it, open the target directory
# (or create a new one) in Google Drive; the ID is the random string at the end of the URL.

# Variables
DATE="$(date +%Y-%m-%d)"
G_ID="1J5XXXXXXXX-YhgXXXXXXXXXXXXXXjDANC"
BACKUP_DIR="/home/backups"

echo "Continuing with the backup generation"
echo "------------------------------------------"
echo "$DATE"

# Delete old backups and journal files, then create a fresh temporary backup directory
rm -rf /home/*/backup/* /var/log/journal/*/*.journal "$BACKUP_DIR" && mkdir -p "$BACKUP_DIR/$DATE"

# Run a backup for each website via the CyberPanel CLI
echo "Executing CyberPanel backup CLI commands for each website"
ls -1 /home -Icyberpanel -Idocker -Ibackup -Ilscache -Ivmail | while read -r user; do
    echo "--- Taking backup of $user ---"
    cyberpanel createBackup --domainName "$user" > /dev/null
done

# Copy tar.gz backup files from the default backup location to the script's backup directory
echo "Copying tar.gz files to the backup directory"
mv /home/*/backup/*.tar.gz "$BACKUP_DIR/$DATE"
echo "$(ls -A "$BACKUP_DIR/$DATE" | wc -l) files in the backup directory"

# Upload backup files to the Google Drive directory with the ID provided
echo "Uploading backup tar files to Google Drive"
/usr/local/bin/gdrive upload --recursive --parent "$G_ID" "$BACKUP_DIR/$DATE"

# Remove the temporary backup directory to avoid confusion
echo "Removing temporary backup directory"
rm -rf "$BACKUP_DIR"

echo "Backups are uploaded to your Google Drive!"
exit
The script works out of the box; just edit the G_ID variable with the ID of the directory created in your Google Drive. You can run it manually or as a cron job.
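For unattended runs, the script can be added to root’s crontab. A minimal sketch, assuming the script is saved at /root/cp-backup-gdrive.sh (a hypothetical path) and should run daily at 02:30:

```shell
# m h dom mon dow  command
# Run the backup daily at 02:30; append output to a log for troubleshooting
30 2 * * * /root/cp-backup-gdrive.sh >> /var/log/cp-backup-gdrive.log 2>&1
```

Install the entry with `crontab -e` as root. Since gdrive stores its credentials per user, the cron job must run as the same user that linked the account with `gdrive about`. You can confirm the upload afterwards with `gdrive list --query "'<your G_ID>' in parents"`.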
Try this and let me know how it goes.