Question about backing up large file folders

Hi, I’m currently backing up a SQL Server database as well as some shared files for our company. The files add up to around 180 GB of data, most of which is the shared files. I save these files on a local device and also upload them to Amazon S3. The first upload to S3 took about 15 hours. I have the “multiple parallel threads” option set to “4”. Is there anything else I can do to speed up this backup? I’m also backing up the entire set of files each time.
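For context on the 15-hour figure, it helps to work out the sustained upload rate those numbers imply (assuming decimal gigabytes, i.e. 180 × 10⁹ bytes):

```python
# Rough sustained-throughput estimate for the initial 180 GB upload.
total_bytes = 180 * 10**9   # 180 GB, decimal
duration_s = 15 * 3600      # 15 hours in seconds

throughput_mb_s = total_bytes / duration_s / 10**6
print(f"{throughput_mb_s:.2f} MB/s")  # about 3.33 MB/s (~27 Mbit/s)
```

If that is close to the connection’s actual upstream bandwidth, adding more threads will not help much; if the uplink is much faster than that, the tool’s parallelism and compression settings are the place to look.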

Thanks.

Hello,

Thank you for your message.

To reduce the execution time of the backup process in your case, please consider the following recommendations:

  • Disable compression in the backup job and instead enable native SQL Server compression for database backups (see the screenshot for reference).
  • Increase the number of parallel threads in the S3 destination settings beyond 4 and test whether a higher value improves upload speed.
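The effect of the thread-count setting can be sketched with a generic thread pool. This is only an illustration, not the backup product’s actual code: `upload_one` is a hypothetical stand-in for a single-file S3 upload.

```python
from concurrent.futures import ThreadPoolExecutor

def upload_one(path):
    # Hypothetical stand-in for a real per-file S3 upload call.
    # Uploads are network-bound, which is where extra threads help,
    # up to the point where the uplink is saturated.
    return f"uploaded {path}"

def upload_all(paths, threads=4):
    # 'threads' mirrors the "multiple parallel threads" setting:
    # more workers keep more uploads in flight at once.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return list(pool.map(upload_one, paths))

results = upload_all(["a.docx", "b.xlsx", "c.pdf"], threads=8)
print(len(results))  # 3
```

Raising the value past the link’s capacity yields diminishing returns, so it is worth testing a few settings (for example 8, then 16) rather than jumping straight to a very high number.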

Ok I will try that.

I’m also having problems with this backup job. I’m backing up the SQL database along with some shared document files, and I get random error messages for certain files that I know exist, which causes the backup of the shared document files to fail. See the screenshot.

Thanks.

Hello,

Thank you for your message.

To resolve the issue, please try using 7-Zip compression for your backups instead of zip. You can configure this option in the “Compress backups” section of your backup job settings (see the attached screenshot).
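As background on that suggestion: 7-Zip’s default LZMA algorithm uses a much larger match dictionary than the Deflate algorithm in classic .zip archives (which can only look back 32 KB), so it often compresses large backup sets with repeated content noticeably better. A quick stdlib illustration on synthetic data (zlib stands in for zip’s Deflate, lzma for 7-Zip’s default):

```python
import zlib, lzma, random

# Data with long-range redundancy: the same 100 KB block repeated.
# Deflate can only match within a 32 KB window, so it cannot exploit
# repeats this far apart; LZMA's larger dictionary can.
random.seed(0)
block = random.randbytes(100_000)  # incompressible on its own
data = block * 3

deflate_size = len(zlib.compress(data, level=9))  # zip-style Deflate
lzma_size = len(lzma.compress(data, preset=9))    # 7-Zip-style LZMA

print(deflate_size > lzma_size)  # True: LZMA exploits the long-range repeats
```

The gap here is exaggerated by the synthetic data, but the underlying point carries over: on real document sets with duplicated content, the 7-Zip option typically produces smaller archives, which also means less data to upload.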