Backup Nextcloud To Backblaze B2

MinIO offers an official integration with Backblaze B2. It acts as a translation layer that lets you access your Backblaze B2 account (or any other supported type of cloud storage) through the Amazon S3 API. I then configured Nextcloud to use its Amazon S3 storage driver instead of the local file disk. Backblaze's track record is reassuring: the company reports roughly 600 petabytes of data stored and over 30 billion files recovered so far, which makes it a simple and convenient place to back up your files. For the backup destination, select B2 storage to use your Backblaze B2 account and enter the name of your B2 bucket; my example used "hng-backup". Unless you are using the bucket for more than just backups, you likely do not need to set a folder path, so enter a "/" to save files in the root of the bucket.
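
To make the translation layer concrete, here is a minimal sketch of running MinIO in front of B2. It assumes an older MinIO release that still ships gateway mode (the feature has since been deprecated upstream), and the credential values are placeholders:

# Expose a Backblaze B2 account through the S3 API with MinIO's (legacy) B2 gateway.
# MinIO listens on :9000 and presents your B2 buckets as S3 buckets.
export MINIO_ACCESS_KEY=your_b2_account_id        # placeholder
export MINIO_SECRET_KEY=your_b2_application_key   # placeholder
minio gateway b2 --address :9000

Nextcloud's S3 storage driver can then be pointed at http://localhost:9000 with the same credentials and the bucket you created in B2 (e.g. "hng-backup").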

Amazon S3 has been around for more than ten years now and I have been happily using it for offsite backups of my servers for a long time. Backblaze’s cloud backup service has been around for about the same length of time and I have been happily using it for offsite backups of my laptop, also for a long time.

In September 2015, Backblaze launched a new product, B2 Cloud Storage, and while S3 standard pricing is pretty cheap (Glacier is even cheaper), B2 claims to be "the lowest cost high performance cloud storage in the world". The first 10 GB of storage is free, as is the first 1 GB of daily downloads. For my small server backup requirements this sounds perfect.

My backup tool of choice is duplicity, a command line backup tool that supports encrypted archives and a whole load of different storage services, including S3 and B2. It was a simple matter to create a new bucket on B2 and update my backup script to send the data to Backblaze instead of Amazon.
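
Switching the backup target really is just a new bucket and a new URL. A minimal sketch, assuming the official b2 command line tool (older releases; newer versions rename these subcommands to "b2 account authorize" and "b2 bucket create") and a placeholder bucket name:

# Authorize the CLI and create a private B2 bucket for the backups
b2 authorize-account $B2_ACCOUNT_ID $B2_APPLICATION_KEY
b2 create-bucket my-server-backups allPrivate

# In the backup script, the duplicity target simply changes from an S3 URL
# to duplicity's B2 URL scheme:
#   old: s3://s3.amazonaws.com/my-server-backups/duplicity
#   new: b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@my-server-backups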

Here is a simple backup script that uses duplicity to keep one month's worth of backups. In this example we dump a few MySQL databases, but it could easily be expanded to back up any other data you wish.

#!/bin/bash
########################################################################
# Uses duplicity (http://duplicity.nongnu.org/)
# Run this daily and keep 1 month's worth of backups
########################################################################
# B2 variables
B2_ACCOUNT_ID=account_id
B2_APPLICATION_KEY=application_key
BUCKET=bucket_name
# GPG key used to encrypt the backup archives
ENCRYPT_KEY=gpg_key_id
# Database credentials
DB_USER='root'
DB_PASS='password'
DATABASES=(my_db_1 my_db_2 my_db_3)
# Working directory
WORKING_DIR=/tmp/backup
########################################################################
# Make the working directory
mkdir -p "$WORKING_DIR"
# Dump the databases
for database in "${DATABASES[@]}"; do
    mysqldump -u "$DB_USER" -p"$DB_PASS" "$database" > "$WORKING_DIR/$database.sql"
done
# Back up, taking a new full backup every 7 days
duplicity --full-if-older-than 7D --encrypt-key="$ENCRYPT_KEY" \
    "$WORKING_DIR" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET"
# Verify
duplicity verify --encrypt-key="$ENCRYPT_KEY" \
    "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" "$WORKING_DIR"
# Cleanup: drop backups older than 30 days
duplicity remove-older-than 30D --force --encrypt-key="$ENCRYPT_KEY" \
    "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET"
# Remove the working directory
rm -rf "$WORKING_DIR"

Run this via a cron job, something like this:

0 4 * * * /root/b2_backup.sh >>/var/log/duplicity/backup.log
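
Once backups are landing in B2, it is worth exercising a restore now and then rather than relying on verify alone. A minimal sketch, assuming the same variables as the script above and a hypothetical restore directory:

# Restore the latest backup into /root/restore-test (hypothetical path)
duplicity restore --encrypt-key="$ENCRYPT_KEY" \
    "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" /root/restore-test

# Or pull back a single file, e.g. one of the database dumps
duplicity restore --encrypt-key="$ENCRYPT_KEY" --file-to-restore my_db_1.sql \
    "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" /root/restore-test/my_db_1.sql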