Welcome to Part 2 in a series on taking smart backups with duplicity. This article builds on the basics of duplicity with a tool called duply.

Duply is a frontend for duplicity that integrates smoothly with scheduling tools such as cron or systemd timers. Its headline features are:

  • keeps recurring settings in profiles per backup job
  • automates import/export of keys between profile and keyring
  • enables batch operations, e.g. backup_verify_purge
  • runs pre/post scripts
  • checks preconditions for flawless duplicity operation

The general form for running duply is:

duply PROFILE COMMAND [OPTIONS]
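
The COMMAND can be a single action, or several actions joined with underscores to run as a batch, as in the backup_verify_purge feature above. Once a profile exists (such as the documents profile created below), typical invocations look something like this; the command names are taken from duply's built-in help, so double-check them against your installed version:

duply documents status
duply documents backup_verify_purge --force

As the duply help describes it, purge only lists outdated backups unless a maximum age is configured in the profile and --force is passed, so treat the batch line as a sketch rather than a drop-in command.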

Installation

duply is available in the Fedora repositories. To install it, run dnf with sudo:

sudo dnf install duply
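
To confirm what duply found on the system, recent versions include a version command that prints duply's own version plus the duplicity, python, and gpg versions it will use; this step is optional, and if your build lacks the command, running duply with no arguments prints the built-in help instead:

duply version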

Create a profile

duply stores configuration settings for a backup job in a profile. To create a profile, use the create command.

$ duply documents create

Congratulations. You just created the profile 'documents'.
The initial config file has been created as 
'/home/link/.duply/documents/conf'.
You should now adjust this config file to your needs.

IMPORTANT:
  Copy the _whole_ profile folder after the first backup to a safe place.
  It contains everything needed to restore your backups. You will need 
  it if you have to restore the backup from another system (e.g. after a 
  system crash). Keep access to these files restricted as they contain 
  _all_ informations (gpg data, ftp data) to access and modify your backups.

  Repeat this step after _all_ configuration changes. Some configuration 
  options are crucial for restoration.

The newly created profile includes two files: conf and exclude. The main file, conf, contains comments for variables necessary to run duply. Read over the comments for any settings unique to your backup environment. The important ones are SOURCE, TARGET, GPG_KEY and GPG_PW.

To convert the single duplicity invocation from the first article into profile settings, split it into four sections:

duplicity --name duply_documents --encrypt-sign-key **************** --include $HOME/Documents --exclude '**'  $HOME   s3+http://**********-backup-docs
          [                         OPTIONS                        ] [                 EXCLUDES             ] [SOURCE] [             TARGET           ]

Comment out the lines starting with TARGET, SOURCE, GPG_KEY and GPG_PW by adding # in front of each line. Add the following lines to conf:

SOURCE=/home/link
TARGET=s3+http://**********-backup-docs
GPG_KEY=****************
GPG_PW=************
AWS_ACCESS_KEY_ID=********************
AWS_SECRET_ACCESS_KEY=****************************************
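
Because conf now holds your GPG passphrase and AWS credentials, it is worth tightening permissions on the profile. This is ordinary chmod, nothing duply-specific:

chmod 700 ~/.duply/documents
chmod 600 ~/.duply/documents/conf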

The second file, exclude, stores file paths to include in or exclude from the backup. In this case, add the following to $HOME/.duply/documents/exclude:

+ /home/link/Documents
- **
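
The exclude file follows duplicity's globbing filelist rules: lines are evaluated top to bottom, + includes, - excludes, and the first match wins. To skip a subfolder inside Documents, put the more specific exclude above the include; the scratch folder below is only a placeholder:

- /home/link/Documents/scratch
+ /home/link/Documents
- **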

Running duply

Run a backup with the backup command. An example run appears below.

$ duply documents backup
Start duply v2.0.2, time is 2017-07-04 17:14:03.
Using profile '/home/link/.duply/documents'.
Using installed duplicity version 0.7.13.1, python 2.7.13, gpg 1.4.21 (Home: ~/.gnupg), awk 'GNU Awk 4.1.4, API: 1.1 (GNU MPFR 3.1.5, GNU MP 6.1.2)', grep 'grep (GNU grep) 3.0', bash '4.4.12(1)-release (x86_64-redhat-linux-gnu)'.
Autoset found secret key of first GPG_KEY entry 'XXXXXXXXXXXXXXXX' for signing.
Checking TEMP_DIR '/tmp' is a folder and writable (OK)
Test - Encrypt to 'XXXXXXXXXXXXXXXX' & Sign with 'XXXXXXXXXXXXXXXX' (OK)
Test - Decrypt (OK)
Test - Compare (OK)
Cleanup - Delete '/tmp/duply.15349.1499213643_*'(OK)
Backup PUB key 'XXXXXXXXXXXXXXXX' to profile. (OK)
Write file 'gpgkey.XXXXXXXXXXXXXXXX.pub.asc' (OK)
Backup SEC key 'XXXXXXXXXXXXXXXX' to profile. (OK)
Write file 'gpgkey.XXXXXXXXXXXXXXXX.sec.asc' (OK)

INFO:

duply exported new keys to your profile.
You should backup your changed profile folder now and store it in a safe place.


--- Start running command PRE at 17:14:04.115 ---
Skipping n/a script '/home/link/.duply/documents/pre'.
--- Finished state OK at 17:14:04.129 - Runtime 00:00:00.014 ---

--- Start running command BKP at 17:14:04.146 ---
Reading globbing filelist /home/link/.duply/documents/exclude
Local and Remote metadata are synchronized, no sync needed.
Last full backup date: Tue Jul  4 14:16:00 2017
Reuse configured PASSPHRASE as SIGN_PASSPHRASE
--------------[ Backup Statistics ]--------------
StartTime 1499213646.13 (Tue Jul  4 17:14:06 2017)
EndTime 1499213646.40 (Tue Jul  4 17:14:06 2017)
ElapsedTime 0.27 (0.27 seconds)
SourceFiles 1205
SourceFileSize 817997271 (780 MB)
NewFiles 1
NewFileSize 4096 (4.00 KB)
DeletedFiles 0
ChangedFiles 0
ChangedFileSize 0 (0 bytes)
ChangedDeltaSize 0 (0 bytes)
DeltaEntries 1
RawDeltaSize 0 (0 bytes)
TotalDestinationSizeChange 787 (787 bytes)
Errors 0
-------------------------------------------------

--- Finished state OK at 17:14:07.789 - Runtime 00:00:03.643 ---

--- Start running command POST at 17:14:07.806 ---
Skipping n/a script '/home/link/.duply/documents/post'.
--- Finished state OK at 17:14:07.823 - Runtime 00:00:00.016 ---

Remember, duply is a wrapper around duplicity. Because you specified --name during the backup creation in part 1, duply picks up the local cache for the documents profile. Now duply runs an incremental backup on top of the full backup created last week.
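
If you would rather start a fresh chain than add another incremental, the full command forces a full backup. The generated conf also documents a MAX_FULLBKP_AGE setting (commented out by default in the template, so check your copy for the exact name) that makes a plain backup run go full automatically once the last full backup is older than that age:

duply documents full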

Restoring a file

duply offers two commands for restoration. Restore the entire backup with the restore command.

$ duply documents restore ~/Restore
Start duply v2.0.2, time is 2017-07-06 22:06:23.
Using profile '/home/link/.duply/documents'.
Using installed duplicity version 0.7.13.1, python 2.7.13, gpg 1.4.21 (Home: ~/.gnupg), awk 'GNU Awk 4.1.4, API: 1.1 (GNU MPFR 3.1.5, GNU MP 6.1.2)', grep 'grep (GNU grep) 3.0', bash '4.4.12(1)-release (x86_64-redhat-linux-gnu)'.
Autoset found secret key of first GPG_KEY entry 'XXXXXXXXXXXXXXXX' for signing.
Checking TEMP_DIR '/tmp' is a folder and writable (OK)
Test - Encrypt to 'XXXXXXXXXXXXXXXX' & Sign with 'XXXXXXXXXXXXXXXX' (OK)
Test - Decrypt (OK)
Test - Compare (OK)
Cleanup - Delete '/tmp/duply.12704.1499403983_*'(OK)

--- Start running command RESTORE at 22:06:24.368 ---
Local and Remote metadata are synchronized, no sync needed.
Last full backup date: Thu Jul 6 21:46:01 2017
--- Finished state OK at 22:06:44.216 - Runtime 00:00:19.848 ---
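
The restore command also accepts an optional age argument after the target path, which helps when the latest state is not the one you want. The age uses duplicity's time formats, for example 3D for three days ago or an explicit date; ~/Restore is just the example target from above:

duply documents restore ~/Restore 3D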

Restore a single file or directory with the fetch command.

$ duply documents fetch Documents/post_install ~/Restore
Start duply v2.0.2, time is 2017-07-06 22:11:11.
Using profile '/home/link/.duply/documents'.
Using installed duplicity version 0.7.13.1, python 2.7.13, gpg 1.4.21 (Home: ~/.gnupg), awk 'GNU Awk 4.1.4, API: 1.1 (GNU MPFR 3.1.5, GNU MP 6.1.2)', grep 'grep (GNU grep) 3.0', bash '4.4.12(1)-release (x86_64-redhat-linux-gnu)'.
Autoset found secret key of first GPG_KEY entry 'XXXXXXXXXXXXXXXX' for signing.
Checking TEMP_DIR '/tmp' is a folder and writable (OK)
Test - Encrypt to 'XXXXXXXXXXXXXXXX' & Sign with 'XXXXXXXXXXXXXXXX' (OK)
Test - Decrypt (OK)
Test - Compare (OK)
Cleanup - Delete '/tmp/duply.14438.1499404312_*'(OK)

--- Start running command FETCH at 22:11:52.517 ---
Local and Remote metadata are synchronized, no sync needed.
Last full backup date: Thu Jul 6 21:46:01 2017
--- Finished state OK at 22:12:44.447 - Runtime 00:00:51.929 ---
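
It is also worth checking the archive against the live files from time to time. The verify command, the same one used in the backup_verify_purge batch mentioned earlier, runs duplicity's verification pass and reports how many files differ:

duply documents verify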

duply includes quite a few more commands. Read the documentation for the full list.

Other features

Scheduled runs become easier with duply than with duplicity alone. The systemd user session lets you automate backups of your data. To do this, modify ~/.config/systemd/user/backup.service from part 1, replacing ExecStart=/path/to/backup.sh with ExecStart=duply documents backup. The backup.sh wrapper script is no longer required.
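
As a sketch, the modified user units could look like the following. The absolute path to duply and the daily schedule are assumptions carried over from a typical part 1 setup, so adjust both to your system (which duply tells you the real path):

~/.config/systemd/user/backup.service

[Unit]
Description=Back up the documents profile with duply

[Service]
Type=oneshot
# Older systemd releases require an absolute path in ExecStart.
ExecStart=/usr/bin/duply documents backup

~/.config/systemd/user/backup.timer

[Unit]
Description=Run the duply documents backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target

After editing the units, reload the user manager and enable the timer:

systemctl --user daemon-reload
systemctl --user enable --now backup.timer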

duply also makes a great tool for backing up a server. You can create system-wide profiles inside /etc/duply to back up any part of a server. With a combination of system and user profiles, you'll spend less time worrying about whether your data is backed up.
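
As a rough example, assuming duply picks up root's profiles from /etc/duply once that directory exists, a server-side profile is created and run just like a user one; the etc-backup profile name is purely illustrative:

sudo mkdir -m 700 /etc/duply
sudo duply etc-backup create
sudo duply etc-backup backup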