Backing up big databases and other requests

Our company started using the tool more than a year ago. During this period it has become our primary backup solution for PostgreSQL databases.
Currently we have a few issues due to high activity on our servers:

  1. For PostgreSQL backups the tool uses the pg_dump utility. The dump process requires a shared lock on the database. For a large database the process takes about 10-11 hours, which is problematic from two perspectives: a. the shared lock is held for a long time; b. the backup itself takes a long time.
    To resolve this problem, incremental backups should be implemented in the tool (for example, using the pgBackRest utility instead of, or alongside, pg_dump); see the sketch after this list. For more information please refer to this topic: PostgreSQL Incremental
  2. Our company uses this tool as a centralized job execution scheduler, since the “Add new maintenance job” feature was implemented in version 12.4.8 and the “run before/after backup” SQL or SSH script feature was implemented in earlier versions.
    To sort jobs and make the list of backup and maintenance jobs more readable, it would be (at least) nice to have the option to create folders;
  3. Currently the temporary backup created before archiving is written to the same folder as the archived file itself. For a 1 TB database, the required free space is about 1 TB (database dump) + 200 GB (archived backup). Is there any option to separate the folders, i.e. to send the temporary backup to a configured temp folder and the archive directly to the backup folder?
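
For reference, here is a minimal sketch of what an incremental schedule could look like if pgBackRest were used as the backup engine, for example called from an existing “run before/after backup” script job. The stanza name "main" and the weekly-full/daily-incremental schedule are assumptions for illustration, not something the tool provides today:

```python
# Hypothetical sketch: driving pgBackRest from a scheduled script instead of
# relying on a full pg_dump for every run. Assumes pgBackRest is installed and
# a stanza named "main" is already configured in pgbackrest.conf.
import datetime
import subprocess

STANZA = "main"  # assumed stanza name

def run_backup(backup_type: str) -> None:
    """Run a pgBackRest backup of the given type: full, diff or incr."""
    subprocess.run(
        ["pgbackrest", f"--stanza={STANZA}", f"--type={backup_type}", "backup"],
        check=True,
    )

if __name__ == "__main__":
    # Take a full backup once a week (Sunday) and a much faster incremental
    # backup on the other days, so the long backup window only occurs weekly.
    today = datetime.date.today()
    run_backup("full" if today.weekday() == 6 else "incr")
```

Because pgBackRest copies only the data files that changed since the previous backup, the daily incremental runs should be far shorter than a full pg_dump of a 1 TB database.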

Thank you in advance

Hi Michael_Highguy,

Thank you for your requests. All of them are useful and may be added in future releases. Sorry, but we currently have tasks with higher priority.

Thank you for using SQLBackupAndFTP.