
Google Photos Automatic Backups

Warning

This process has been deprecated in favor of Rclone. Instructions on using Rclone to back up Google Photos automatically can be found here.

Summary

With the number of priceless photos living in Google Photos, a single incident like a hacked account could wipe them all out. That's not an option, so these photos need to be backed up automatically. I'll be leveraging the Google Photos API to download photos to the Yunohost VM, and will then shuttle them to Backblaze B2.

The steps below outline how this is accomplished.

How-To

The idea to do this came from Jake Wharton (who, ironically, works for Google), who created a sync tool for exactly this purpose; however, running it in Docker only complicated things. I ended up coming across another sync tool, gitmoo-goog, that accomplishes the same thing by running a single binary in a loop, calling the Google Photos API for new data every 45 seconds.

The following is adapted from the project's README.md file, with the addition of how to extract the bz2 archive...

The gitmoo-goog tool uses the Google Photos API to continuously download all photos to a local device.

It can be used as a daemon to keep in sync with a google-photos account.

Download:

  • Grab the latest gitmoo-goog release (a .bz2 archive) from the project's GitHub releases page.

Unzip:

  • bunzip2 gitmoo-goog.bz2 (adjust the filename to match the downloaded release)

Change to executable:

  • chmod +x gitmoo-goog
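
Put together, and assuming the release asset is named gitmoo-goog.bz2 (the actual filename on the releases page may differ), the setup looks roughly like this:

# Download the latest release from the dtylman/gitmoo-goog GitHub releases page
# (the asset name below is an assumption - substitute the real filename)
wget https://github.com/dtylman/gitmoo-goog/releases/latest/download/gitmoo-goog.bz2
# Extract the single binary and make it executable
bunzip2 gitmoo-goog.bz2
chmod +x gitmoo-goog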

Google Photos API

Perform the following in order to enable the Google Photos API:

  1. Go to the Google API Console.
  2. From the menu bar, select a project or create a new project.
  3. To open the Google API Library, from the Navigation menu, select APIs & Services > Library.
  4. Search for "Google Photos Library API". Select the correct result and click Enable.

With the API enabled, the tool still requires an OAuth 2.0 client ID. The following outlines how to obtain one:

  1. Go to the Google API Console and select your project.
  2. From the menu, select APIs & Services > Credentials.
  3. On the Credentials page, click Create Credentials > OAuth client ID.
  4. Select your Application type. Choose Other (labeled Desktop app in newer versions of the console) and click Create.
  5. Download the credentials.json client configuration file.

Tip

The .json file will likely download with a long generated filename. It will need to be renamed to credentials.json and moved to the appropriate folder in order to work properly (see the example after the list below).

  • Copy the downloaded credentials.json to the same folder with gitmoo-goog.
  • Run ./gitmoo-goog
  • Go to the link the tool prints (https://accounts.google.com/o/oauth2/auth?access_type=...) in your browser; it will ask for an authorization code.
  • The page will present a security warning because the app is unverified. Accept the warning and click Allow.
  • Copy the authorization code into the terminal and press Enter.
  • The tool will begin downloading at this point, but should be cancelled with Ctrl + C so it can be relaunched with the proper flags (see Running in a Loop below).
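
The credential setup and first run from the Tip and list above look roughly like this; the downloaded filename and paths are assumptions and will differ:

# Rename the downloaded client configuration (the generated name here is an assumption)
mv ~/Downloads/client_secret_XXXXXXXX.json ./credentials.json
# First run: prints the OAuth link, waits for the authorization code, then starts downloading
./gitmoo-goog
# Once it starts downloading, cancel with Ctrl + C and relaunch with the flags described below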

Usage

Usage of ./gitmoo-goog:

  -album string
        download only from this album (use google album id)
  -folder string
        backup folder
  -force
        ignore errors, and force working
  -logfile string
        log to this file
  -loop
        loops forever (use as daemon)
  -max int
        max items to download (default 2147483647)
  -pagesize int
        number of items to download per API call (default 50)
  -throttle int
        Time, in seconds, to wait between API calls (default 5)
  -folder-format string
        Time format used for folder paths based on <https://golang.org/pkg/time/#Time.Format> (default "2006/January")
  -use-file-name
        Use file name when uploaded to Google Photos (default off)
  -download-throttle
        Rate in KB/sec, to limit downloading of items (default off)
  -concurrent-downloads
        Number of concurrent item downloads (default 5)
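
As an illustration of how these flags combine (the values here are examples, not the configuration used below), a one-off run that downloads up to 500 items, 100 per API call, into ./archive using year/month folders would look like:

./gitmoo-goog -folder archive -folder-format "2006/January" -pagesize 100 -max 500 -logfile gitmoo.log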

Running in a Loop

In order to run this in a continuous loop to look for new content, use the following:

./gitmoo-goog -folder archive -logfile gitmoo.log -loop -throttle 45 &

This will start the process in the background, making an API call every 45 seconds, looping forever over all items and saving them to {pwd}/archive.

The logfile will be saved as gitmoo.log.
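
To confirm the background process is still running and to watch its progress, standard shell commands work (these are not part of the gitmoo-goog tooling):

pgrep -af gitmoo-goog
tail -f gitmoo.log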

Naming and Folder Permissions

Files are created as follows:

[folder]/[year]/[month]/[day]_[hash].json and .jpg. The .json file holds the metadata from Google Photos.
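
Purely as an illustration (the hash and date are made up), a photo taken on March 14, 2021 with the default folder format would land in something like:

archive/2021/March/14_ab12cd34.jpg
archive/2021/March/14_ab12cd34.json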

The default permissions may need to be changed. In order to do this, use the following:

find /home/xenadmin/Desktop/gphotos-backup -type d -exec chmod 777 {} \;
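
If the file permissions need the same treatment (an optional extra step, not part of the original instructions), the same find pattern works with -type f:

find /home/xenadmin/Desktop/gphotos-backup -type f -exec chmod 666 {} \;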

Current Configuration

Virtual Machine: Yunohost
Folder: /home/xenadmin/Desktop/gphotos-backup
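
The summary above mentions shuttling these photos on to Backblaze B2. A minimal sketch of that step with rclone, assuming a B2 remote named b2 and a bucket named gphotos-backup have already been configured (both names are assumptions):

rclone sync /home/xenadmin/Desktop/gphotos-backup b2:gphotos-backup --progress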

Google Takeout

Google Takeout is also utilized to take a complete export of all photos and videos every other month. At the time of this writing, the job runs on the 14th of every odd-numbered month (e.g. May 14th, July 14th, etc.).

These exports are broken up into 50 GB tar.gz archives, two in total at the time of this writing. They are stored as-is on AWS using the S3 Glacier Deep Archive storage class.
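
Uploading a Takeout archive directly into that storage class can be done with the AWS CLI; the bucket and archive names below are placeholders, not the actual values in use:

aws s3 cp takeout-photos-001.tar.gz s3://my-takeout-bucket/ --storage-class DEEP_ARCHIVE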

References

https://github.com/JakeWharton/docker-gphotos-sync

https://github.com/dtylman/gitmoo-goog

https://developers.google.com/photos/library/guides/get-started#enable-the-api

https://stackoverflow.com/questions/3740152/how-do-i-change-permissions-for-a-folder-and-all-of-its-subfolders-and-files-in