
How to automate DB backup

Started by Novel Web Solution, Jul 18, 2022, 01:05 AM


Novel Web Solution (Topic starter)

Making manual backups of the site database has become exhausting, and I'm looking for a way to automate it. Specifically, I want a backup to be created automatically from the hosting three times a day, at 18:00, 21:00, and 24:00, and saved to a folder at the root of my E drive.

I don't want the 21:00 backup to overwrite the 18:00 backup or vice versa; each backup should be saved in a new folder with the date and time as its name. Is there a better solution or approach to accomplish this? Ideally, backups older than three days would be deleted, so that only the nine copies from the past three days are kept.
This task must be completed on Beget's free hosting.


SHADOW_FANTOM

You need to create a backup dump on the server using cron.
There are several ways to pull it down to a local machine: wget run from a scheduler, or an FTP client. Since I use Cygwin under Windows, I plan to configure cron there and transfer the dump over ssh.
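
For example, a minimal sketch of the pull side under Cygwin, assuming key-based ssh access to the hosting and placeholder names (USER, HOST, and the paths are examples, not Beget's actual values; the dump is assumed to land on the server as ~/backups/YYYY-MM-DD.gz):

# Cygwin crontab entry on the local machine: pull the day's dump down to the E drive at 23:30.
30 23 * * * scp USER@HOST:backups/$(date +\%Y-\%m-\%d).gz /cygdrive/e/db-backups/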

rajan

The cron program lets the user run processes at specific times or at a given frequency. Note that cron exists only under Unix, so Windows hosting users should ask their provider how to run processes on a schedule. The rest of this post applies to Unix hosting.

To create a backup of the database, add the following rule to your crontab (open it with crontab -e in the Unix shell):

0 0 * * * /usr/local/bin/mysqldump -uLOGIN -PPORT -hHOST -pPASS DBNAME | gzip -c > `date "+\%Y-\%m-\%d"`.gz
This command creates a dump file of the DBNAME database every day at midnight (00:00) and archives the file with gzip into an archive file named after the current date. To enable creation of files with date names, we use the date command, which allows arbitrary date output formatting. By using backticks, we force the Unix shell to insert the result of another command into the current one.

Save the rule for cron and wait for the results. The command creates an archived copy of the database on disk daily. You can easily locate archives by name, which will correspond to the date when they were created, allowing you to restore data if needed.
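
To restore from one of these archives, something along these lines would work (same LOGIN/PORT/HOST/PASS/DBNAME placeholders as above; the file name here is just an example date):

# Unpack the archive and feed the SQL back into the database.
gunzip -c 2022-07-18.gz | mysql -uLOGIN -PPORT -hHOST -pPASS DBNAME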

If you want to automate the removal of old archives, use cron together with the find command available in Unix. Running find ~/directory-with-archives -name "*.gz" -mtime +7 -delete periodically removes archives older than seven days. Find's documentation is available through the man find command in the Unix shell.
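
As a sketch, the cleanup itself can be scheduled from the same crontab (the directory path is a placeholder):

# Once a day at 01:00, remove archives older than seven days.
0 1 * * * find ~/directory-with-archives -name "*.gz" -mtime +7 -delete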

If your own machine is constantly connected to the Internet, you can also copy the freshly created backup to it from cron. The ftp and scp commands will copy the file to another machine; add their invocation to the crontab. As reliable as a hosting provider's machine may be, it is better to be safe than sorry.
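
For example, a sketch of an scp copy scheduled shortly after the dump (BACKUPUSER, BACKUPHOST, and the destination directory are placeholders; key-based ssh authentication is assumed):

# At 00:15, push the archive produced by the midnight dump to a second machine.
15 0 * * * scp ~/`date "+\%Y-\%m-\%d"`.gz BACKUPUSER@BACKUPHOST:/backups/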

onCring

You're on Beget, which offers some basic automation tools, but you're not exactly working with high-end services here. That being said, it's possible to set up automated backups using a cron job, but Beget's free hosting doesn't provide the most robust options.

Here's what you can do anyway: First off, you'll need to set up a cron job to automatically export your database at those times: 18:00, 21:00, and 24:00 (cron counts hours 0-23, so 24:00 is entered as hour 0). In Beget's hosting panel, you can configure cron jobs through their built-in scheduler. Essentially, you'll write a shell script that connects to your database, exports it as a SQL file, and saves it with a timestamp in the filename into a directory on the hosting, say ~/backups. This script can be run at those exact times using cron expressions. You'll have something like:

0 18 * * * /usr/bin/mysqldump -u [user] -p[password] [database] > ~/backups/backup_$(date +"\%Y\%m\%d\%H\%M").sql
0 21 * * * /usr/bin/mysqldump -u [user] -p[password] [database] > ~/backups/backup_$(date +"\%Y\%m\%d\%H\%M").sql
0 0 * * * /usr/bin/mysqldump -u [user] -p[password] [database] > ~/backups/backup_$(date +"\%Y\%m\%d\%H\%M").sql
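
If you specifically want each backup in its own folder named with the date and time, as in the original question, point those cron entries at a small script instead of calling mysqldump inline. A minimal sketch, using the same [user]/[password]/[database] placeholders:

#!/bin/sh
# backup_db.sh: dump the database into a new folder named after the current date and time.
DIR=~/backups/$(date +"%Y-%m-%d_%H-%M")
mkdir -p "$DIR"
/usr/bin/mysqldump -u [user] -p[password] [database] | gzip -c > "$DIR/db.sql.gz"

The three cron lines then just become 0 18 * * * /bin/sh ~/backup_db.sh, and so on.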

As for storing backups on the "E drive," you're likely talking about your local system, not Beget's file storage. This is tricky: the hosting can't write directly to a remote system such as your local E drive unless you go through FTP or SCP, and pushing to your own PC only works if it runs an FTP or SSH server that the hosting can reach (otherwise, pull the files from your side instead, as suggested above). If pushing is an option, add a line to your script that FTPs the SQL file to your machine after it's been generated; you can schedule it in the same cron job.
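
For the push variant, a rough sketch using curl's FTP upload (USER, PASS, and HOST are placeholders for an FTP server you run and that is reachable from the hosting; this line goes at the end of the backup script):

# Push the most recent dump to your own FTP server.
curl -T "$(ls -t ~/backups/backup_*.sql | head -n 1)" ftp://USER:PASS@HOST/db-backups/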

The part about keeping only the last nine backups? You'll want to add a line in your script that deletes backups older than three days. You can easily do this by running a find command to locate and remove older files. Something like this could work:

find ~/backups/ -type f -mtime +3 -exec rm {} \;
This will delete any backups older than three days from the backup directory. Of course, this means you'll only have the last nine backups available, assuming you make three backups a day.

While this will get you by on a free hosting service, you're probably going to hit limits soon enough. Beget's storage caps might start getting annoying, and manual configuration can only take you so far. Honestly, at some point, you might want to think about upgrading to a better hosting plan or even using cloud storage for your backups (Amazon S3, for instance, isn't a bad option).

