This page is copied from the community documentation.
Definition of a disaster
This chapter describes how to recover from a disaster. Before we continue, we first have to define what a disaster actually is. Two categories can be distinguished:
Loss or corruption of source files or the complete source system.
Missing or corrupted backup files.
How to restore files to the original location on the same system, and how to restore files from a consistent backup to a new computer, are described in other chapters of this manual.
This chapter describes the process of restoring as much as possible from a backup that is inconsistent due to corrupted or missing files at the backend, without access to the source files and the Duplicati setup.
Usually you can install Duplicati on any computer, point it to the location that contains your backup, and restore files. Duplicati will try to recover automatically from problems it finds, but if the damage to your backup files is significant, the restore process may fail and abort, leaving files unrecovered that are potentially restorable. In this situation you can use Duplicati.CommandLine.RecoveryTool.exe to restore the files that are not affected by the corruption. This tool lets you perform manually the operations that are normally done automatically by the standard tools.
Test scenario
To explain how Duplicati.CommandLine.RecoveryTool.exe works, the following setup is assumed:
The computer that contained the source files had 4 backup versions of the My Pictures folder. This computer, including the Duplicati installation and the picture files, is assumed to be lost.
The backup location is an FTP server. The default Upload Volume size of 50MB is decreased to 10MB, resulting in more, but smaller files, which makes more sense for this example. After 4 backup operations, the files at the backend look like this:
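The original listing is not reproduced here; a sketch of what such a backend folder typically contains is shown below. The .dlist names are taken from the log output later in this chapter; the .dindex names are placeholders.

```
duplicati-20171109T100606Z.dlist.zip.aes
duplicati-20171109T100653Z.dlist.zip.aes
duplicati-20171109T100737Z.dlist.zip.aes
duplicati-20171109T100815Z.dlist.zip.aes
duplicati-b5f8cd40e22a54b5b988689370b8cde34.dblock.zip.aes
duplicati-b69a2a32a50bb4c6d8780389efdbf7442.dblock.zip.aes
duplicati-i....dindex.zip.aes   (one per .dblock file)
...
```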
There is one .dlist file for each backup version. The data itself is stored in a number of .dblock files. Each .dblock file has an accompanying .dindex file. This is a consistent backup, but in this test scenario some files are intentionally corrupted, by replacing the contents of one file with random data and by removing a .dblock file.
The first command shows which source files need information from the remote file duplicati-b69a2a32a50bb4c6d8780389efdbf7442.dblock.zip.aes.
Conclusion: if the 2 remote files mentioned above are not available, the 6 picture files should be considered lost, but with the Duplicati Recoverytool all other files in the backup should be recoverable.
This is, of course, something that should never be done in a production environment, but for this test scenario we will intentionally damage the backup set, making it unusable for standard backup and restore operations.
The following actions are performed on the backend:
File duplicati-b69a2a32a50bb4c6d8780389efdbf7442.dblock.zip.aes is deleted.
File duplicati-b5f8cd40e22a54b5b988689370b8cde34.dblock.zip.aes is replaced by a file with the same name containing random data.
Restoring files from this corrupted backup set will fail before the first file is actually restored. You can recover from this situation by using one of these procedures:
Recovering by purging unrestorable files from the backups.
Recovering by using the Duplicati Recovery Tool.
To be able to restore files in these scenarios, you will need:
The protocol, location and credentials of the remote location where your backup files are stored.
The passphrase used to encrypt your backup (if any).
A computer that you can use for restoring data, with enough free storage capacity for all files you want to restore.
The Duplicati Command Line tools. These tools are part of a standard Duplicati setup.
If you are using the Duplicati Recovery Tool: temporary local storage with enough free space to store all backup files.
If you still have access to your computer running Duplicati and the backup job has a valid local database, Duplicati can analyze which files should be in the backup and compare this with what is actually at the remote location. Use the Duplicati command list-broken-files to list the files that cannot be restored due to corrupted or missing data. The command purge-broken-files actually deletes these files from all backup versions.
To get an impression of the damage to the backup set, run this command:
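A sketch of such a command, assuming the FTP backend used in this example (the storage URL format and credentials are this example's assumptions):

```shell
Duplicati.CommandLine.exe list-broken-files "ftp://myftpserver.com/Backup/Pictures?auth-username=duplicati&auth-password=backup" --passphrase="4u7P_re5&+Gb>6NO{"
```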
The purge-broken-files command returns this information:
No broken filesets found in database, checking for missing remote files
Listing remote folder ...
remote file duplicati-b5f8cd40e22a54b5b988689370b8cde34.dblock.zip.aes is listed as Verified with size 9569280 but should be 10468957, please verify the sha256 hash "hIPABrSE/6xN041ut6IKb0sUSMxYGRI3ZqAWwY+q6JM="
Marked 1 remote files for deletion
Found 4 broken filesets with 17 affected files, purging files
Purging 5 file(s) from fileset 11/9/2017 11:06:06 AM
Starting purge operation
Replacing fileset duplicati-20171109T100606Z.dlist.zip.aes with duplicati-20171109T100607Z.dlist.zip.aes which has with 5 fewer file(s) (10.33 MB reduction)
Uploading file (4.47 KB) ...
Deleting file duplicati-20171109T100606Z.dlist.zip.aes ...
Purging 4 file(s) from fileset 11/9/2017 11:06:53 AM
Starting purge operation
Replacing fileset duplicati-20171109T100653Z.dlist.zip.aes with duplicati-20171109T100654Z.dlist.zip.aes which has with 4 fewer file(s) (10.33 MB reduction)
Uploading file (6.22 KB) ...
Deleting file duplicati-20171109T100653Z.dlist.zip.aes ...
Purging 4 file(s) from fileset 11/9/2017 11:07:37 AM
Starting purge operation
Replacing fileset duplicati-20171109T100737Z.dlist.zip.aes with duplicati-20171109T100738Z.dlist.zip.aes which has with 4 fewer file(s) (10.33 MB reduction)
Uploading file (8.09 KB) ...
Deleting file duplicati-20171109T100737Z.dlist.zip.aes ...
Purging 4 file(s) from fileset 11/9/2017 11:08:15 AM
Starting purge operation
Replacing fileset duplicati-20171109T100815Z.dlist.zip.aes with duplicati-20171109T100816Z.dlist.zip.aes which has with 4 fewer file(s) (10.33 MB reduction)
Uploading file (9.22 KB) ...
Deleting file duplicati-20171109T100815Z.dlist.zip.aes ...
Deleting file duplicati-b69a2a32a50bb4c6d8780389efdbf7442.dblock.zip.aes (9.97 MB) ...
Operation Delete with file duplicati-b69a2a32a50bb4c6d8780389efdbf7442.dblock.zip.aes attempt 1 of 5 failed with message: The remote server returned an error: (550) File unavailable (e.g., file not found, no access). => The remote server returned an error: (550) File unavailable (e.g., file not found, no access).
Some information from the messages above:
duplicati-b5f8cd40e22a54b5b988689370b8cde34.dblock.zip.aes is corrupted and marked for deletion.
Files that cannot be restored are deleted from all backup versions that contain them (17 in total). Note that these are not 17 unique source files; a single file is usually included in multiple backup versions.
New, consistent backup files are generated and uploaded to the backend.
The Duplicati Recovery Tool allows you to perform manually the actions that are normally done automatically when running backup or restore operations. A normal restore consists of the following operations:
Duplicati determines which remote files are needed to restore the specified files.
Duplicati downloads the first required remote file.
The file is decrypted using the supplied passphrase.
Duplicati uses the .DINDEX files to determine how files can be recreated by merging blocks inside .DBLOCK files in the correct order.
The recreated files are moved to the supplied Restore location.
The Duplicati Recovery Tool can perform these actions step by step, giving you more control over each step in the restore process.
In disaster recovery scenarios, the Duplicati Recovery Tool performs 3 steps:
All remote files are downloaded from the backend, decrypted and stored in the local filesystem.
An index is built that allows Duplicati to keep track of what information is stored in which file.
Files are restored from the downloaded backend files by recreating them using the blocks inside the .DBLOCK files.
Optionally, these additional actions can be performed:
List files that are available in the downloaded remote files.
Recompress and/or re-upload files to the backend. This is useful if you want to change the compression type of an existing backup job. Changing the compression type directly (for example from .7z to .zip) is not supported, but you can achieve the same result by downloading the complete backup, decrypting all files, recompressing them with the other compression type, re-encrypting them, and re-uploading them to the backend. Additionally, edit the backup configuration so future backups use the new compression type.
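For the recompress operation, the general form of the invocation is sketched below. The argument names are placeholders based on the tool's help text; consult the built-in help of Duplicati.CommandLine.RecoveryTool.exe for the exact options before using it.

```shell
Duplicati.CommandLine.RecoveryTool.exe recompress <targetcompressiontype> <remoteurl> <localfolder> [options]
```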
The first step is downloading all files that are used by the backup job. This step is required because many read and write operations have to be performed on the remote files: all files must be decrypted, and the contents of all files must be read for analysis.
Remote files can be downloaded using the download command:
Duplicati.CommandLine.RecoveryTool.exe download <remoteurl> <localfolder> [options]
This command will download all remote files from <remoteurl>, decrypt them and store the decrypted files in <localfolder>.
Required information:
Address, path and credentials to access the remote files
In this example the address is myftpserver.com, the path is /Backup/Pictures, the FTP username is duplicati and the FTP password is backup.
The passphrase used to encrypt the backup
In this example the passphrase 4u7P_re5&+Gb>6NO{ was used for the backup.
Create an empty folder in your local filesystem, for example C:\BackendFiles. Be sure that the location you download the backup files to has enough free space to store all backup files.
This command downloads and decrypts all backup files and stores these files in C:\BackendFiles:
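Using the example values above (the FTP URL format with auth-username and auth-password follows Duplicati's storage URL conventions), the command might look like this:

```shell
Duplicati.CommandLine.RecoveryTool.exe download "ftp://myftpserver.com/Backup/Pictures?auth-username=duplicati&auth-password=backup" "C:\BackendFiles" --passphrase="4u7P_re5&+Gb>6NO{"
```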
In this example, 49 files were found at the backend. One .DBLOCK file was corrupt and could not be decrypted. Files with the .DINDEX extension are index files that will be recreated, so they are not downloaded. 4 .DLIST files were found and downloaded to C:\BackendFiles.
As a result, the C:\BackendFiles folder contains 25 unencrypted .Zip files: 4 .dlist.zip files and 21 .dblock.zip files.
When all files that contain applicable information are downloaded, an index file must be created. Without this index, we have nothing more than a bunch of files containing hashes and raw data. The index can be created with the Duplicati Recovery Tool using the index command:
Duplicati.CommandLine.RecoveryTool.exe index <localfolder> [options]
This command only requires the location of the local folder to be specified, in this example C:\BackendFiles. The index file will be created in the same folder. If you want the index file to be created in another location, use the advanced option --indexfile to specify it. The temporary files folder is used intensively by this process; optionally you can specify a custom location with the --tempdir option.
To build an index of the files in C:\BackendFiles, use this command:
Duplicati.CommandLine.RecoveryTool.exe index "C:\BackendFiles"
The resulting index file index.txt contains a list of hashes and .DBLOCK filenames.
Before the actual restore operation is performed, you can see what is inside the downloaded and encrypted remote files. Use the Recovery Tool's list command to retrieve this information:
Duplicati.CommandLine.RecoveryTool.exe list <localfolder> [version] [options]
Without a version specified, all available backup versions are listed. When a version number is supplied, all restorable files from that backup version are listed. Try these commands:
Duplicati.CommandLine.RecoveryTool.exe list C:\BackendFiles
Duplicati.CommandLine.RecoveryTool.exe list C:\BackendFiles 0
After all backup files are downloaded, decrypted and indexed, you can start the actual restore process. With the Duplicati Recovery Tool, use the restore command to restore all recoverable files from any backup version to the location of your choice:
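The general form is shown first, followed by an invocation matching this example (version 0 is the most recent backup; the C:\Restore target matches the output below):

```shell
Duplicati.CommandLine.RecoveryTool.exe restore <localfolder> [version] [options]

Duplicati.CommandLine.RecoveryTool.exe restore "C:\BackendFiles" 0 --targetpath="C:\Restore"
```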
Sorting index file ... done!
Building lookup table for file hashes
Index file has 2047 hashes in total
Building lookup table with 2046 entries, giving increments of 1
Computing restore path
Restoring 75 files to C:\Restore
Removing common prefix C:\Users\User\ from files
All restored files are listed. The list may contain errors, because files that need data from corrupted blocks cannot be restored.
In this example, from a corrupted backup with one deleted dblock file and one corrupted dblock file, 69 of 75 picture files were recovered successfully.
Inventory of files that are going to be corrupted
Prior to corrupting the consistent backup, we can make an inventory of the consequences if these files are lost. You can use the Duplicati command affected to see which source files are affected by the loss of a remote file. The affected command needs the local database, so you can only perform this operation if you have a fully working Duplicati installation for this backup job.
Making the backup inconsistent
Prerequisites for recovery
Recovering by purging files
Add the advanced option --dry-run to the command below to see what it will do, before actually purging the files from the backups.
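A sketch of the purge command with --dry-run, assuming the FTP backend used in this example (the storage URL and credentials are this example's assumptions):

```shell
Duplicati.CommandLine.exe purge-broken-files "ftp://myftpserver.com/Backup/Pictures?auth-username=duplicati&auth-password=backup" --passphrase="4u7P_re5&+Gb>6NO{" --dry-run
```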
Recovering by using the Duplicati Recovery tool
Downloading all remote files using the Recovery Tool
Storage type
In this example the backup is stored using FTP, but all storage types are supported.
Optional advanced options for access to the remote files
If you applied any options that are needed to get access to the backend files, supply these options here as well.
Store information about your backup configuration (storage provider, storage location, credentials and passphrase) in a safe location that remains available when the computer running Duplicati is lost. Without this information your backup files are useless, because the passphrase is the only way to decrypt the files in your backup.
If you are unsure about the required free space, check how much space is used by all files whose names start with duplicati- (or any prefix you specified in the backup job with the --prefix option). If still unsure, use an empty external disk with enough capacity: if free space runs out while downloading, you have to start the complete download process over.
Creating an index of downloaded files using the Recovery Tool
List backup versions and files using the Recovery Tool
Restoring files using the Recovery Tool
<localfolder> is a required option. It should point to the location where your downloaded remote files are stored. Optionally add --targetpath to specify where files must be restored to; otherwise the files are restored to their original locations. Use filters or the --exclude option to perform a partial restore.