I Thought He Came With You is Robert Ellison’s blog about software, marketing, politics, photography and time lapse.

How to backup Google Photos to Google Drive automatically after July 2019 with Apps Script

Updated on Wednesday, September 25, 2019

Google Photos backup to Google Drive shutting down in July 2019

Google has decided that backing up your photos via Google Drive is 'confusing' and so Drive-based backup is going away this month. I love Google Photos but I don't trust it - I pull everything into Drive and then I stick a monthly backup from there onto an external drive in a fire safe. There is a way to get Drive backup working again using Google Apps Script and the Google Photos API. There are a few steps to follow but it's pretty straightforward - you should only need to change a few lines in the script to get this working for your account.

First, two caveats to be aware of. Apps Script has a time limit, so it's possible that the script could fail if it's moving a large number of photos. You should get an email if the script ever fails so watch out for that. Secondly, and more seriously, you could end up with two copies of your photos. If you add photos to Google Drive using Backup and Sync then those photos will also be downloaded from Google Photos by the script and added to Drive again. You need to either upload directly to Google Photos (i.e. from the mobile app or website) or handle the duplicates in some way. If you run Windows then I have released a command line tool that sorts photos into year and month taken folders and handles de-duplication.

One more limitation: after a comment by 'Logan' below I realized that Apps Script has a 50MB limit when adding files to Google Drive. The latest version of the script will detect this and send you an email listing any files that could not be copied automatically.

On to the script. In Google Drive create a new spreadsheet. This is just a host for the script and makes it easy to authorize it to access Google Photos. Select 'Script editor' from the Tools menu to create a new Apps Script project.

In the script editor select 'Libraries...' from the Resources menu. Enter 1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF next to 'Add a library' and click Add. This will find the Google OAuth2 library. Pick the most recent version and click Save.

Select 'Project properties' from the File menu and copy the Script ID (a long sequence of letters and numbers). You'll need this when configuring the Google Photos API.

In a new window open the Google API Console, part of the Google Cloud Platform. Create a new project, click Enable APIs and Services and find and enable the Google Photos API. Then go to the Credentials section and create an OAuth Client ID. You'll need to add a consent screen; the only field you need to fill out is the product name. Choose Web Application as the application type. When prompted for the authorized redirect URL enter https://script.google.com/macros/d/{SCRIPTID}/usercallback and replace {SCRIPTID} with the Script ID you copied above. Copy the Client ID and Client Secret, which will be used in the next step.

Go back to the Apps Script project and paste the code below into the Code.gs window:

Enter the Client ID and Client Secret inside the empty quotes at the top of the file. You also need to add an email address to receive alerts for large files. There is a BackupFolder option at the top as well - the default is 'Google Photos' which will mimic the old behavior. You can change this if you like but make sure that the desired folder exists before running the script. Save the script.
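The settings at the top of the script are plain constants. They look something like this - the variable names here are illustrative, keep whatever names the script you pasted actually uses:

// Settings at the top of Code.gs - fill these in before running anything.
var ClientID = '';                        // OAuth Client ID from the Google API Console
var ClientSecret = '';                    // OAuth Client Secret from the Google API Console
var NotificationEmail = 'me@example.com'; // receives a list of any files over the 50MB Apps Script limit
var BackupFolder = 'Google Photos';       // target folder in Drive - must already exist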

Go back to the spreadsheet you created and reload. After a few seconds you will have a Google Photos Backup menu (to the right of the Help menu). Choose 'Authorize' from this menu. You will be prompted to give the script various permissions which you should grant. After this a sidebar should appear on the spreadsheet (if not choose 'Authorize' from the Google Photos Backup menu again). Click the authorize link from the sidebar to grant access to Google Photos. Once this is done you should be in business - choose Backup Now from the Google Photos Backup menu and any new items from yesterday should be copied to the Google Photos folder in Drive (or the folder you configured above if you changed this).
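Under the hood the Authorize menu and sidebar are just the standard pattern for the OAuth2 library added earlier. The shape of that part of the script is roughly the following - the service name and scope are my assumptions about what the backup needs (read-only access to the Photos Library API), not a copy of the actual code:

// OAuth2 service for the Google Photos Library API, built with the OAuth2 library.
function getPhotosService() {
  return OAuth2.createService('photos')
    .setAuthorizationBaseUrl('https://accounts.google.com/o/oauth2/auth')
    .setTokenUrl('https://accounts.google.com/o/oauth2/token')
    .setClientId(ClientID)
    .setClientSecret(ClientSecret)
    .setCallbackFunction('authCallback')
    .setPropertyStore(PropertiesService.getUserProperties())
    .setScope('https://www.googleapis.com/auth/photoslibrary.readonly')
    .setParam('access_type', 'offline'); // so the daily trigger can keep refreshing tokens
}

// The consent flow redirects here - this is why the redirect URL contains the Script ID.
function authCallback(request) {
  var ok = getPhotosService().handleCallback(request);
  return HtmlService.createHtmlOutput(ok ? 'Authorized - you can close this tab.' : 'Access denied.');
}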

Finally you should set up a trigger to automate running the script every day. Choose 'Script editor' from the Tools menu to re-open the script, and then in the script window choose 'Current project's triggers' from the Edit menu. This will open yet another window. Click 'Add Trigger' which is cunningly hidden at the bottom right of the window. Under 'Choose which function to run' select 'runBackup'. Then under 'Select event source' select 'Time-driven'. Under 'Select type of time based trigger' select 'Day timer'. Under 'Select time of day' select the time window that works best for you. Click Save. The backup should now run every day.
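If you'd rather create the trigger from code than click through that UI, a one-off function like this does the same job - run it once from the script editor:

// Create a daily time-driven trigger for runBackup (here in the 1am-2am window, script time zone).
function createDailyTrigger() {
  ScriptApp.newTrigger('runBackup')
    .timeBased()
    .everyDays(1)
    .atHour(1)
    .create();
}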

The way the script is written you'll get a backup of anything added the previous day each time it runs. If there are any duplicate filenames in the backup folder the script will save a new copy of the file with (1) added in front of the filename. Let me know in the comments if you use this script or have any suggestions to improve it.

Facebook Interoperability

In TechCrunch today Josh Constine gets friend portability for Facebook almost right:

"In other words, the government should pass regulations forcing Facebook to let you export your friend list to other social networks in a privacy-safe way. This would allow you to connect with or follow those people elsewhere so you could leave Facebook without losing touch with your friends. The increased threat of people ditching Facebook for competitors would create a much stronger incentive to protect users and society."

The problem is that having a list of friends does me no good at all when none of them are on Google Plus, Diaspora or whatever.

What we need is legislation that forces interoperability: I get to share with my friends via an open protocol, and Facebook is forced to both send and receive posts from other networks. This would actually create an opportunity for plausible competition in a way that a friend export could never do. Social networking should work like email, not CompuServe.

Better related posts with word2vec (C#)

I have been experimenting with word2vec recently. Word2vec trains a neural network to guess which word is likely to appear given the context of the surrounding words. The result is a vector representation of each word in the trained vocabulary with some amazing properties (the canonical example is king - man + woman = queen). You can also find similar words by looking at cosine distance - words that are close in meaning have vectors that are close in orientation.

This sounds like it should work well for finding related posts. Spoiler alert: it does!

My old system listed posts with similar tags. This worked reasonably well, but it depended on me remembering to add enough tags to each post and a lot of the time it really just listed a few recent posts that were loosely related. The new system (live now) does a much better job, which should be helpful to visitors and is likely to help with SEO as well.

I don't have a full implementation to share as it's reasonably tightly coupled to my custom CMS, but here is a code snippet which should be enough to get this up and running anywhere:

The first step is getting a vector representation of a post. Word2vec just gives you a vector for a word (or short phrase, depending on how the model is trained). A related technique, doc2vec, learns a vector for a whole document as well. That could be useful but isn't really what I needed here (for example, I could solve my forgetfulness around adding tags by training a model to suggest them for me - might be a good project for another day). I ended up using a pre-trained model and averaging together the vectors for each word in the post. This paper (PDF) suggests that this isn't too crazy.

For the model I used word2vec-slim which condenses the Google News model down from 3 million words to 300k. This is because my blog runs on a very modest EC2 instance and a multi-gigabyte model might kill it. I load the model into Word2vec.Tools (available via NuGet) and then just get the word vectors (GetRepresentationFor(...).NumericVector) and average them together.

I haven't included code to build the word list, but I just took every word from the post, title, meta description and tag list, removed stop words (the, and, etc.) and converted everything to lower case.

Now that each post has a vector representation it's easy to compute the most related posts. For a given post compute the cosine similarity between its vector and the vector for every other post. Sort the list in descending order and pick however many you want from the top (the similarity between a post and itself would be 1, a totally unrelated post would be close to 0). The last line in the code sample shows this comparison for one post pair using Accord.Math, also on NuGet.
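The heavy lifting in my version is done by Word2vec.Tools and Accord.Math in C#, but the underlying arithmetic is tiny. Here is an illustrative sketch of the two steps - averaging word vectors into a post vector and scoring a pair of posts - written as standalone JavaScript rather than the actual CMS code:

// Average a list of word vectors (arrays of numbers, all the same length) into one post vector.
function averageVectors(wordVectors) {
  var sum = new Array(wordVectors[0].length).fill(0);
  wordVectors.forEach(function (v) {
    for (var i = 0; i < v.length; i++) { sum[i] += v[i]; }
  });
  return sum.map(function (x) { return x / wordVectors.length; });
}

// Cosine similarity between two post vectors: 1 for identical orientation, near 0 for unrelated.
function cosineSimilarity(a, b) {
  var dot = 0, magA = 0, magB = 0;
  for (var i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    magA += a[i] * a[i];
    magB += b[i] * b[i];
  }
  return dot / (Math.sqrt(magA) * Math.sqrt(magB));
}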

I'm really happy with the results. This was a fast implementation and a huge improvement over tag based related posts.

Animation of a year of Global Cloud Cover

Here's an animation showing a year of global cloud cover (from July 2017 to July 2018):

The clouds are sourced from the free daily download at xplanet. I run a Google Apps Script that saves a copy of the image to Google Drive every day (basically the same as this script to save Nest cam images). The hard part was waiting a year to get enough frames. Xplanet combines GOES, METEOSAT and GMS satellite imagery with some reflection near the poles. Although I didn't need to for this project, note that you can subscribe to higher quality / more frequent downloads.
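The saver is only a few lines of Apps Script running on a daily time-driven trigger. A minimal sketch - the image URL and folder ID are placeholders, point them at the xplanet download and whichever Drive folder should collect the frames:

// Fetch the latest cloud image and save a dated copy to a Drive folder (run from a daily trigger).
function saveCloudImage() {
  var url = 'https://example.com/latest-clouds.jpg'; // placeholder - use the xplanet download URL
  var folder = DriveApp.getFolderById('YOUR_FOLDER_ID'); // placeholder folder ID
  var blob = UrlFetchApp.fetch(url).getBlob();
  var name = 'clouds-' + Utilities.formatDate(new Date(), 'UTC', 'yyyy-MM-dd') + '.jpg';
  folder.createFile(blob.setName(name));
}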

As well as the clouds you can also see the terminator between day and night change shape over the course of the year. This video starts and ends with the summer solstice, when days are longest in the Northern hemisphere.

Where it's nighttime the image is based on NASA's Black Marble. The daytime is based on Blue Marble, interpolated daily between twelve monthly Blue Marble satellite images and blended with a different, older image which has better ocean colors. The result is that you can see snow and ice coverage changing over the course of the year. If you look closely you'll also notice vegetation growing and dying back with the seasons.

Rendered in a slightly modified build of Catfood Earth (the main release doesn't know how to access my private cache of xplanet cloud images). As well as combining day, night and cloud images Catfood Earth can also show you earthquakes, volcanoes, US weather radar, political borders, places and time zones. It has been enlivening Windows desktop wallpaper for fifteen years now (as shareware back when that was a thing, these days it's a free download for Windows and Android).

Export Google Fit Daily Steps, Weight and Distance to a Google Sheet

Updated on Saturday, December 7, 2019

Google Fit Daily Step Export

Google Fit is a great way to keep track of your daily step count without needing to carry a Fitbit or other dedicated tracker. It's not easy to get that data out though; as far as I can tell the only way is Google Takeout, which is not made for automation. Luckily there is an API and you can do almost anything with Google Sheets.

If you're looking to export your step count, weight and distance this post has everything you need - just follow the instructions below to get your spreadsheet up and running. This is also a good primer on using OAuth2 with Google Apps Script and should be a decent starting point for a more complex Google Fit integration. If you have any questions or feedback please leave a comment below.

To get started you need a Google Sheet, an Apps Script project attached to the sheet and a Google API project that will provide access to the Fitness API. That might sound intimidating but it should only take a few minutes to get everything up and running.

In Google Drive create a new spreadsheet and call it whatever you like. Rename the first tab to 'Metrics'. Enter 'Date' in cell A1, 'Steps' in B1, 'Weight' in C1 and 'Distance' in D1. To grab history as well, create another tab called 'History' with the same headers. Next select 'Script editor...' from the Tools menu, which will open a new Apps Script project.

Give the Apps Script project a name and then select 'Libraries...' from the Resources menu. Next to 'Add a library' enter 1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF and click Add. This will find the Google OAuth2 library. Choose the most recent version (24 at the time of writing) and click Save. Then select 'Project properties' from the File menu and make a note of the Script ID (a long series of letters and numbers).

Open the Google API Console. Create a new project and name it something like 'Google Fit Sheet'. From the Dashboard click Enable APIs and Services and find and select the Fitness API. Then go to Credentials and create an OAuth Client ID. You'll be asked to create a consent screen; the only field you need to enter is the product name (e.g. 'My Fit App'). Then choose Web Application as the application type. You need to set the name and the authorized redirect URL. The redirect URL is https://script.google.com/macros/d/{SCRIPTID}/usercallback, replacing {SCRIPTID} with the actual Script ID you made a note of above. After adding this make a note of the Client ID and Client Secret.

Go back to the apps script project and paste the code below into the Code.gs window:

Right at the top of the code there are spaces to enter the Client ID and Client Secret from the API Console. Enter these and save the project.

Switch back to your Google Sheet and reload. After reloading there will be a Google Fit menu item. First select Authorize... You'll get a screen to authorize the script and then a sidebar with a link. Click the link to authorize the script to access your Google Fit data. You can then close the sidebar and select Get Metrics for Yesterday from the Google Fit menu. You should see a new row added to the spreadsheet with yesterday's date and fitness data.
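If you're curious what the script is doing under the hood (or want to extend it), the numbers come from the Fitness REST API's aggregate endpoint. For yesterday's steps the request looks roughly like this - getFitService() stands in for the OAuth2 service the script sets up, so treat it as a sketch rather than the exact code:

// Ask the Fitness API for yesterday's total step count.
// Weight uses dataTypeName 'com.google.weight.summary' and distance 'com.google.distance.delta'.
function getStepsForYesterday() {
  var start = new Date();
  start.setHours(0, 0, 0, 0);
  start.setDate(start.getDate() - 1);
  var end = new Date(start.getTime() + 24 * 60 * 60 * 1000);

  var request = {
    aggregateBy: [{
      dataTypeName: 'com.google.step_count.delta',
      dataSourceId: 'derived:com.google.step_count.delta:com.google.android.gms:estimated_steps'
    }],
    bucketByTime: { durationMillis: 24 * 60 * 60 * 1000 }, // one bucket per day
    startTimeMillis: start.getTime(),
    endTimeMillis: end.getTime()
  };

  var response = UrlFetchApp.fetch('https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate', {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: 'Bearer ' + getFitService().getAccessToken() },
    payload: JSON.stringify(request)
  });

  var points = JSON.parse(response.getContentText()).bucket[0].dataset[0].point;
  return points.length ? points[0].value[0].intVal : 0;
}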

The final step is to automate pulling in the data. Go back to the Apps Script project and select Current project's triggers from the Edit menu. Add a trigger to run getMetrics() as a time-driven day timer - I recommend between 5 and 6am. You can also click notifications to add an email alert if anything goes wrong, like your Google Fit authorization expiring (in which case you just need to come back and authorize from the Google Fit menu again).

At this point you're all set. Every day the spreadsheet will automatically update with your step count from the day before. You can add charts, moving averages, export to other systems, pull in your weight or BMI, etc. I want to add a seven day moving average step count to this blog somewhere as a semi-public motivational tool... watch this space.

Note that weight will be blank in the spreadsheet for days with no weight data. Google Fit doesn't return the last known weight, only the known value for days where an update was recorded.

If you are looking to extend this sample to other data types then this API explorer page is very helpful for finding data types that the API documentation doesn't list.

A couple of times working on this script I got my authorization in a bad state and started getting a 400 error response from the API. If this happens run your Google Fit app, click the Profile icon at the bottom and then the Settings icon at the top right. Click Manage connected apps and then disconnect the script from Google Fit. Finally run the Reset Settings option from the menu in the sheet and then authorize again.

I updated this post on Jan 21, 2019 to extend the sample to handle weight and distance as well as steps. I also improved the history function to handle many days in one API call rather than a quick hack I added earlier that pulled a day at a time. I'd recommend using the code above rather than anything included in comments below (at least comments before this update).

Enable GZIP compression for Amazon S3 hosted website in CloudFront

Updated on Tuesday, November 12, 2019

By default compression doesn't work in CloudFront for a website backed by an Amazon S3 bucket.

The first step is pretty obvious - switch on compression in CloudFront:

Compress Objects Automatically option in Amazon CloudFront

To get to this setting open your distribution, go to the Behaviors tab and edit your behavior(s). Scroll down to the bottom and toggle Compress Objects Automatically to On. Save and drum your fingers while the distribution updates.

The less obvious piece is that CloudFront will only compress files between 1,000 and 10,000,000 bytes (as of writing this post) and it detects the file size from the Content-Length header. What the documentation doesn't mention is that S3 does not send the Content-Length header by default, and so no compression is applied.

Go to S3 and open the properties for your bucket (not for individual files). Expand Permissions and then click Edit CORS Configuration. You need to add Content-Length as an allowed header like this:

Amazon S3 CORS Configuration
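If you'd rather paste than transcribe from the screenshot, the configuration looks something like this in the XML CORS editor - adjust the origin and methods for your site, the important line is the Content-Length AllowedHeader:

<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Content-Length</AllowedHeader>
  </CORSRule>
</CORSConfiguration>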

Catfood Software Support

Updated on Monday, December 9, 2019

Need help with a Catfood Software product? Please leave a comment below.

Catfood Earth 3.41

Updated on Wednesday, February 22, 2017

Catfood Earth 3.41 fixes a problem that was preventing the weather radar layer from loading.

I've also updated to the latest (2015g) time zone database and the latest time zone map from Eric Muller.

Download the latest Catfood Earth.

(Previously)

Catfood Earth 3.40

Updated on Wednesday, February 22, 2017

I've just released Catfood Earth 3.40 for Windows and 1.50 for Android.

Both updates fix a problem with the clouds layer not updating. The Android update also adds compatibility for Android 5 / Lollipop.

Also, Catfood Earth for Android is now free. I had been charging $0.99 for the Android version but I've reached the conclusion that I'm never going to retire based on this (or even buy more than a couple of beers) so it's not worth the hassle. Catfood Earth for Windows has been free since 3.20.

Enjoy!

Fortune Cookies for Android

Updated on Thursday, November 12, 2015

Fortune is now available on Google Play. It's an Android version of the UNIX fortune program and will send a random fortune cookie to your notification area at 8ish every morning.