Collecting site backups

Started by hainvv, Dec 01, 2022, 09:12 AM


hainvv (Topic starter)

Hi there!

I would like to know about a possible way to set up incremental backups of files and databases, driven from my home machine, with the archives ending up on it. The sites are spread across several quite different hosting services, and the remote machines are reachable over SSH. Preferably, all data such as logs, exception tables, and so on should be stored only on the home machine.


Most of the solutions I have found require installing additional software on the remote sites, which may not be possible on shared hosting. So tar and gzip might be the best option in such situations.

Thank you!

Zora2012

Hi there!

To initiate the backup process, you can install Cygwin locally on Windows and write a simple script like the one below on the local machine:

# copy each site's files from its hosting account to a local directory
scp user1@server1:/path1/files/* /local/path1/
scp user2@server2:/path2/files/* /local/path2/

It is also recommended to configure key-based authorization on the servers. rsync is a better choice than scp in terms of traffic, since it only transfers what has changed rather than the whole file list. rsync does need to be present on the remote side, but when it is tunnelled over SSH it does not have to run as a daemon, which is usually not a problem on *nix hosts.
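A minimal sketch of the rsync variant, assuming key-based SSH logins; the user names and paths are placeholders for your own accounts:

# pull only changed files over SSH; -a preserves permissions and timestamps,
# -z compresses in transit, --delete mirrors remote deletions locally
rsync -az --delete -e ssh user1@server1:/path1/files/ /local/path1/
rsync -az --delete -e ssh user2@server2:/path2/files/ /local/path2/

The trailing slashes matter: with them, rsync copies the contents of the remote directory into the local one instead of creating an extra nesting level.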

-----

Additionally, there is a service called www.dropmysite.com that might be suitable for backups. The free plan provides 2 GB of space and collects backups automatically on a schedule via FTP, or it can connect directly to the database. I have used it myself for a couple of days and it seems to work fine.

I hope this helps!

bergercpafirst

Hello!

I recently had the task of backing up 1.7 TB of photos with Borg, which stores backups as binary chunks in a single repository. Listing the contents could take hours, restoring a few hundred files took a couple of days, roughly as long as the original backup, and feedback about errors or missing files was very slow.

It seems like this is an issue with all backup solutions that store backups as snapshots. Out of curiosity, how does your setup handle this scenario? Is it difficult to verify that particular files exist in an incremental archive and to restore just a few of them?

Elyman

In cases where installing additional software on the remote sites is not feasible, tar and gzip can indeed be a good option: the home machine drives an incremental backup of the sites' files and databases and keeps the archives locally. Here's a possible solution:

1. First, create a local script on your home machine that handles the backup process. This script should include commands to create incremental backups of files and databases, and then compress them using tar and gzip.

2. To perform an incremental copy of the files, you can use the `rsync` command, which synchronizes files between remote and local locations over SSH and only transfers what has changed. Use the remote site as the source and a directory on the home machine as the destination.

3. For databases, you can use tools specific to the database management system you are using (e.g., mysqldump for MySQL) to create dumps. These dumps can then be included in the tar archive created in the next step.

4. After creating the necessary backups, you can use the tar command to bundle all the files together into a single archive. You can use gzip to compress the archive, reducing its size.

5. Finally, all of this travels securely over SSH. If an archive has to be created on the remote host itself, you can use the scp command to copy it down to the desired location on the home machine.

By using this approach, the home machine initiates the incremental backup, the data is compressed with tar and gzip, and the resulting archives stay on the home machine, with everything transferred over SSH. A minimal example script is sketched below.
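The following is only a sketch of such a script, assuming key-based SSH access; the host name, paths, database name, and credentials are placeholders, not real values:

#!/bin/sh
# runs on the home machine; nothing extra is installed on the hosting side
DATE=$(date +%Y-%m-%d)
DEST=/backup/site1
mkdir -p "$DEST/files" /backup/archives

# 1. pull only the files that changed since the last run
rsync -az -e ssh user1@server1:/path1/files/ "$DEST/files/"

# 2. dump the database on the remote host and stream it home over SSH
ssh user1@server1 "mysqldump -u dbuser -p'dbpass' dbname" | gzip > "$DEST/db-$DATE.sql.gz"

# 3. bundle everything into a dated, compressed archive
tar -czf "/backup/archives/site1-$DATE.tar.gz" -C "$DEST" files "db-$DATE.sql.gz"

Putting the password on the command line is only for illustration; on a real host it is safer to keep the credentials in a .my.cnf file that mysqldump reads automatically.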

Here are some additional considerations and steps you can take to enhance this incremental backup solution:

1. Determine a suitable backup schedule: Decide how often you want to perform the incremental backups. This could be daily, weekly, or any other interval based on your requirements.

2. Implement a naming convention: To organize your backups, include the date or a timestamp of the backup in the archive names generated by your script (see the sketch after this list). This will help you easily identify and manage different backups over time.

3. Set up a secure SSH connection: Make sure you have SSH access to the remote machine. Ensure that the necessary SSH keys or credentials are set up securely and that you have the required permissions to access and write to the desired directory on the remote machine.

4. Handle log files and exceptions: If you want to keep logs, exception tables, and similar data out of the regular incremental archives (for example, to back them up separately), exclude those directories or files by passing the appropriate exclusion options to tar.

5. Monitor and validate backups: It's important to regularly monitor and validate the success of your backups. You can include checks in your script to verify the integrity of the backup files, such as comparing checksums or performing test restores on a separate environment.

6. Consider encryption: If security is a concern, you can encrypt the archives, for example before copying them to any off-site storage. Tools like GnuPG can encrypt the archive file with a passphrase.
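A hedged sketch of how the archiving step could combine items 2, 4, 5, and 6; the paths, excluded directories, and cipher choice are assumptions for illustration only:

DATE=$(date +%Y-%m-%d_%H%M)
ARCHIVE="/backup/archives/site1-$DATE.tar.gz"

# dated archive name; log and cache directories are left out of the archive
tar -czf "$ARCHIVE" \
    --exclude='files/logs' \
    --exclude='files/cache' \
    -C /backup/site1 files

# store a checksum next to the archive so later validation can spot corruption
sha256sum "$ARCHIVE" > "$ARCHIVE.sha256"

# optionally encrypt the archive with a passphrase before any off-site copy
gpg --symmetric --cipher-algo AES256 "$ARCHIVE"

The gpg call produces a .gpg file next to the original archive and prompts for a passphrase; in an unattended script the passphrase would have to come from a protected file instead.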

Here are a few more points that can further improve the solution:

1. Implement retention policies: Define how long you want to keep the backups. Consider implementing a retention policy that automatically deletes older archives to manage disk space efficiently. You can modify your script to check the age of each backup and delete it once it exceeds the specified retention period (see the example after this list).

2. Handle database consistency: When backing up databases, ensure that you handle the consistency of the data during the backup process. Depending on the database management system you are using, you might need to take additional steps like locking tables or using transaction logs to create a consistent backup.

3. Error handling and notification: Implement error handling in your script to handle any potential failures during the backup process. This can include logging errors, sending notifications, or alerting you via email or other methods when backups fail. Proper error handling will help ensure that you are aware of any issues and can take appropriate action.

4. Consider network limitations: If you are using a shared hosting service or have limited bandwidth, it's worth considering the impact of transferring large backup files over the network. In such cases, you may want to optimize the backup process by compressing the backup files more aggressively or splitting them into smaller chunks for easier transfer.

5. Test the restore process: Regularly test the restore process to ensure that you can successfully recover your data from the backups. It's important to know that your backup solution is reliable and that you can restore your data when needed.
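For item 1, a minimal retention sketch, assuming the dated archives from the earlier examples live under /backup/archives and 30 days is the desired retention period:

# remove archives (and their .sha256 / .gpg companions) older than 30 days
find /backup/archives -name 'site1-*.tar.gz*' -mtime +30 -delete

Because the pattern ends with a wildcard, the checksum and encrypted copies are cleaned up together with the archive they belong to.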