Hello,
I’m having an issue while uploading a large backup to Amazon S3. It seems that Amazon has an upper limit on the number of parts in a multipart upload; the limit is 10,000.
As I understand it, the solution is simple: increase the size of a single part so that the total number of parts drops below 10,000. How do I do that in SQLBackupAndFTP? I’ve got v10.1.15.
Thank you
Logs from SQLBackupAndFTP:
11/24/2019 00:53:00 | WARNING: Failed to upload part #10001; Position: 167772160000; Size: 16777216 > Part number must be an integer between 1 and 10000, inclusive > The remote server returned an error: (400) Bad Request. > The remote server returned an error: (400) Bad Request.
11/24/2019 00:53:00 | Trying to upload part #10001 again…
11/24/2019 00:53:00 | WARNING: Failed to upload part #10002; Position: 167788937216; Size: 16777216 > Part number must be an integer between 1 and 10000, inclusive > The remote server returned an error: (400) Bad Request. > The remote server returned an error: (400) Bad Request.
11/24/2019 00:53:00 | Trying to upload part #10002 again…
11/24/2019 00:53:01 | WARNING: Failed to upload part #10001; Position: 167772160000; Size: 16777216 > Part number must be an integer between 1 and 10000, inclusive > The remote server returned an error: (400) Bad Request. > The remote server returned an error: (400) Bad Request.
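For what it’s worth, here is a quick sketch of the arithmetic as I understand it (this is just an illustrative Python helper, not a SQLBackupAndFTP setting; the backup size below is the lower bound implied by the positions in my log, where part #10001 starts at byte 167,772,160,000 with 16 MiB parts):

```python
import math

S3_MAX_PARTS = 10_000          # S3 multipart upload part-number limit
MIB = 1024 * 1024              # one mebibyte in bytes

def min_part_size(total_bytes: int, max_parts: int = S3_MAX_PARTS) -> int:
    """Smallest part size (in bytes) that keeps the upload within max_parts."""
    return math.ceil(total_bytes / max_parts)

# Lower bound on the backup size implied by the log: 10,000 parts of 16 MiB
backup_size = 10_000 * 16 * MIB  # = 167,772,160,000 bytes (~156 GiB)

# At exactly this size, 16 MiB parts are just enough...
print(min_part_size(backup_size) == 16 * MIB)       # True
# ...but one extra byte already requires a larger part size.
print(min_part_size(backup_size + 1) > 16 * MIB)    # True
```

So if I’m reading this right, any part size above 16 MiB (say 32 MiB, for headroom as the backup grows) should keep the part count under the 10,000 limit; I just don’t see where to configure that.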