I Thought He Came With You is Robert Ellison’s blog about software, marketing, politics, photography and time lapse.

Kidlapse - Make a Movie of Your Child Growing Up


Kidlapse is now live. This is a service I've been working on that uses machine learning to recognize faces and then rotate and zoom so you get pretty good alignment between photos. You upload one photo per month and Kidlapse then creates a timelapse movie of your child growing up. If that sounds like something you'd be interested in, sign up and give it a try.

Here are a couple of sample videos created using Kidlapse:


Transit of Mercury

Transit of Mercury November 11 2019

Detail from Transit of Mercury November 11 2019

The transit of Mercury on November 11, 2019, shot from San Francisco, CA with a Sony RX10 IV and an ND5.0 filter (and a better filter adapter than this one).


Bangalore Timelapse

Bangalore, Karnataka, India

Dawn to dusk 60 frames per second timelapse of Bangalore (Bengaluru) in Karnataka, India. Shot on a GoPro Session from the 10th floor of the Sheraton Grand Hotel over two days.

Fleet Week 2019 Air Show


Blue Angels, United 777, a Patriot passing Treasure Island and an F-35 with a P-51.


Fleet Week 2019 Parade of Ships

Updated on Wednesday, October 16, 2019

Timelapse of the 2019 Parade of Ships at the San Francisco Fleet Week. Includes the USS Somerset, USS Zumwalt, USS Princeton and USS Charleston.


Sugarloaf Stars

Night sky over Sugarloaf Ridge State Park

4K timelapse of the night sky over Sugarloaf Ridge State Park in Sonoma, California (home to the Robert Ferguson Observatory, which is home to the first Laser SETI site).

Kidlapse

I'm working on a project to generate a timelapse of a kid growing up. I wasn't organized enough to shoot my kids in the same pose on the same background, so it's quite a tough problem. To fix this I'm using machine learning to recognize faces in photos and then automatically rotate and align them so the face is in the same place in every shot. From there it's just a matter of generating frames that fade between the different photos and stitching them together into a video. If this sounds interesting, check it out at kidlapse.com and sign up to get notified when the service launches.
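The alignment step is just geometry once a model has found the face. As a rough sketch (not the Kidlapse production code, and assuming a detector has already handed back left and right eye coordinates), the similarity transform that rotates, scales and translates a photo so the eyes land on fixed canonical positions looks something like this:

// Sketch of the alignment math, not the Kidlapse production code.
// leftEye and rightEye are {x, y} positions from a face detection model;
// the return value is a 2x3 affine matrix in canvas setTransform(a, b, c, d, e, f) order.
function alignmentTransform(leftEye, rightEye, frameWidth, frameHeight) {
  // Canonical eye positions: 35% and 65% across the frame, 40% down.
  var targetLeft = { x: 0.35 * frameWidth, y: 0.40 * frameHeight };
  var targetRight = { x: 0.65 * frameWidth, y: 0.40 * frameHeight };

  var srcDx = rightEye.x - leftEye.x, srcDy = rightEye.y - leftEye.y;
  var dstDx = targetRight.x - targetLeft.x, dstDy = targetRight.y - targetLeft.y;

  // Scale so the eyes end up the right distance apart, rotate so they're level.
  var scale = Math.hypot(dstDx, dstDy) / Math.hypot(srcDx, srcDy);
  var angle = Math.atan2(dstDy, dstDx) - Math.atan2(srcDy, srcDx);
  var cos = Math.cos(angle) * scale, sin = Math.sin(angle) * scale;

  // Translate so the left eye lands exactly on its target position.
  return {
    a: cos, b: sin, c: -sin, d: cos,
    e: targetLeft.x - (cos * leftEye.x - sin * leftEye.y),
    f: targetLeft.y - (sin * leftEye.x + cos * leftEye.y)
  };
}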

Life, Non-locality and the Simulation Hypothesis

Updated on Wednesday, August 21, 2019

Conway's Game of Life, Recently

Conway's Game of Life is a cellular automaton where simple rules lead to surprisingly complex behavior. You can even build a Turing Machine in it. Life consists of a grid of cells which are either alive or dead. For each generation a cell flips from dead to alive if it has exactly three live neighbors. If a cell is alive and has two or three live neighbors then it survives to the next generation, otherwise it dies. When programming a non-infinite Life game it's common to wrap the logic at the extent of the grid - so some 'neighbors' of the cells at the very top are the cells at the very bottom and so on.
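Here's a minimal sketch of one generation with that wrap-around behavior, on a grid of 0s and 1s (the modular arithmetic on the indices is what does the wrapping):

// One generation of Life on a wrapped grid of 0s (dead) and 1s (alive).
// Cells on an edge treat cells on the opposite edge as neighbors.
function step(grid) {
  var rows = grid.length, cols = grid[0].length;
  var next = [];
  for (var r = 0; r < rows; r++) {
    next.push([]);
    for (var c = 0; c < cols; c++) {
      var neighbors = 0;
      for (var dr = -1; dr <= 1; dr++) {
        for (var dc = -1; dc <= 1; dc++) {
          if (dr === 0 && dc === 0) continue;
          neighbors += grid[(r + dr + rows) % rows][(c + dc + cols) % cols];
        }
      }
      // Exactly three live neighbors: birth. Two live neighbors: survival only.
      next[r].push(neighbors === 3 || (grid[r][c] === 1 && neighbors === 2) ? 1 : 0);
    }
  }
  return next;
}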

Imagine that you discover such a system and try to figure out the physics of it.

After observing a sample of cells you'd figure out the rules that govern the life and death of most cells. You'd also figure out a speed of 'light' for the system - information can only travel one cell per generation. The state of cells further away has no influence. You've got a kind of classical physics of the Game of Life.

Further study would throw up a puzzle though. Cells at the extremes of the system are influenced by cells at the other extreme. In some cases the speed of 'light' is violated - you now have a non-local physics in the mix. At this point you might fix the problem with geometry - maybe the grid is actually wrapped around a torus (even though you're looking at a rectangular grid). This makes the system logically consistent again but it's wrong - the non-local behavior occurs because you're trying to analyze a simulation.

In quantum physics observing the state of a property on one particle in a pair of entangled particles will instantly affect the observation of that property on the other particle, no matter the distance between them. This is Einstein's spooky action at a distance. It seems like it can't possibly be true, but it has been demonstrated repeatedly (and quite spectacularly using starlight to select which property to measure).

There are many different interpretations of how to understand quantum physics. But as you might expect from physicists, these concern themselves with a physical universe (or multiverse depending on the flavor). It's possible though that non-locality (and the apparent quantized nature of our reality) is trying to tell us something else. Non-local effects are entirely consistent with a reality that is being generated frame by frame, just like a souped up Game of Life.

(Read the full simulation hypothesis series: Part 1: Can I move to a Better Simulation Please?, Part 2: Have we Already Proved that the Simulation Hypothesis is False?, Part 3: Life, Non-locality and the Simulation Hypothesis.)

Road Trip Timelapse

Swiftcurrent Lake at Glacier National Park

Compilation of time lapses from a recent road trip - Grand Teton (with less than perfect focus), Yellowstone (Old Faithful), Yellowstone (Mammoth Hot Springs), Glacier (Swiftcurrent Lake at the Many Glacier area), Humboldt Redwoods State Park and then finally stars over Humboldt.

How to backup Google Photos to Google Drive automatically after July 2019 with Apps Script

Updated on Wednesday, September 25, 2019

Google Photos backup to Google Drive shutting down in July 2019

Google has decided that backing up your photos via Google Drive is 'confusing' and so Drive-based backup is going away this month. I love Google Photos but I don't trust it - I pull everything into Drive and then I stick a monthly backup from there onto an external drive in a fire safe. There is a way to get Drive backup working again using Google Apps Script and the Google Photos API. There are a few steps to follow but it's pretty straightforward - you should only need to change two lines in the script to get this working for your account.

First, two caveats to be aware of. Apps Script has a time limit, so it's possible that it could fail if moving a large number of photos. You should get an email if the script ever fails so watch out for that. Secondly, and more seriously, you could end up with two copies of your photos. If you use Backup and Sync to add photos from Google Drive then these photos will be downloaded from Google Photos by the script and added to Drive again. You need to either upload directly to Google Photos (i.e. from the mobile app or web site) or handle the duplicates in some way. If you run Windows then I have released a command line tool that sorts photos into year+month taken folders and handles de-duplication.

One more limitation. After a comment by 'Logan' below I realized that Apps Script has a 50MB limitation for adding files to Google Drive. The latest version of the script will detect this and send you an email listing any files that could not be copied automatically.

On to the script. In Google Drive create a new spreadsheet. This is just a host for the script and makes it easy to authorize it to access Google Photos. Select 'Script editor' from the Tools menu to create a new Apps Script project.

In the script editor select 'Libraries...' from the Resources menu. Enter 1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF next to 'Add a library' and click Add. This will find the Google OAuth2 library. Pick the most recent version and click Save.

Select 'Project properties' from the File menu and copy the Script ID (a long sequence of letters and numbers). You'll need this when configuring the Google Photos API.

In a new window open the Google API Console, part of the Google Cloud Platform. Create a new project, click Enable APIs and Services and find and enable the Google Photos API. Then go to the Keys section and create an OAuth Client ID. You'll need to add a consent screen, the only field you need to fill out is the product name. Choose Web Application as the application type. When prompted for the authorized redirect URL enter https://script.google.com/macros/d/{SCRIPTID}/usercallback and replace {SCRIPTID} with the Script ID you copied above. Copy the Client ID and Client Secret which will be used in the next step.

Go back to the Apps Script project and paste the code below into the Code.gs window:
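The full script isn't reproduced here, but the shape of it is a handful of configuration constants, an OAuth2 service scoped to the Photos Library API, a custom menu, and a runBackup function that copies yesterday's media items into the backup folder. The sketch below is a trimmed illustration of that shape rather than the exact script:

// Trimmed illustration of the backup script - names and structure are
// representative, not the exact published version.

// ---- Configuration: fill in before running ----
var ClientID = '';                   // OAuth client ID from the Google API Console
var ClientSecret = '';               // OAuth client secret from the Google API Console
var AlertEmail = '';                 // where to send alerts about files that can't be copied
var BackupFolder = 'Google Photos';  // Drive folder to copy new items into (must already exist)

// OAuth2 service for the Google Photos Library API, using the OAuth2 library
// added under the Resources menu.
function getPhotosService() {
  return OAuth2.createService('GooglePhotos')
    .setAuthorizationBaseUrl('https://accounts.google.com/o/oauth2/auth')
    .setTokenUrl('https://accounts.google.com/o/oauth2/token')
    .setClientId(ClientID)
    .setClientSecret(ClientSecret)
    .setCallbackFunction('authCallback')
    .setPropertyStore(PropertiesService.getUserProperties())
    .setScope('https://www.googleapis.com/auth/photoslibrary.readonly');
}

function authCallback(request) {
  var ok = getPhotosService().handleCallback(request);
  return HtmlService.createHtmlOutput(ok ? 'Success! You can close this tab.' : 'Access denied.');
}

// Adds the Google Photos Backup menu when the spreadsheet opens.
function onOpen() {
  SpreadsheetApp.getUi()
    .createMenu('Google Photos Backup')
    .addItem('Authorize', 'authorize')
    .addItem('Backup Now', 'runBackup')
    .addToUi();
}

// Shows a sidebar with a link to grant access to Google Photos.
function authorize() {
  var service = getPhotosService();
  var html = '<a href="' + service.getAuthorizationUrl() + '" target="_blank">Authorize access to Google Photos</a>';
  SpreadsheetApp.getUi().showSidebar(HtmlService.createHtmlOutput(html));
}

// Copies every media item created yesterday into the backup folder.
function runBackup() {
  var service = getPhotosService();
  var folder = DriveApp.getFoldersByName(BackupFolder).next();
  var yesterday = new Date(Date.now() - 24 * 60 * 60 * 1000);
  var date = { year: yesterday.getFullYear(), month: yesterday.getMonth() + 1, day: yesterday.getDate() };
  var skipped = [];
  var pageToken = null;
  do {
    var body = { pageSize: 100, filters: { dateFilter: { dates: [date] } } };
    if (pageToken) body.pageToken = pageToken;
    var response = JSON.parse(UrlFetchApp.fetch('https://photoslibrary.googleapis.com/v1/mediaItems:search', {
      method: 'post',
      contentType: 'application/json',
      headers: { Authorization: 'Bearer ' + service.getAccessToken() },
      payload: JSON.stringify(body)
    }).getContentText());
    (response.mediaItems || []).forEach(function (item) {
      try {
        // '=d' requests the original photo bytes, '=dv' the video bytes.
        var suffix = (item.mediaMetadata && item.mediaMetadata.video) ? '=dv' : '=d';
        var blob = UrlFetchApp.fetch(item.baseUrl + suffix).getBlob();
        // If the filename already exists in the folder, save the copy with (1) in front.
        var name = folder.getFilesByName(item.filename).hasNext() ? '(1) ' + item.filename : item.filename;
        folder.createFile(blob.setName(name));
      } catch (e) {
        skipped.push(item.filename); // most likely over the 50MB Apps Script limit
      }
    });
    pageToken = response.nextPageToken;
  } while (pageToken);
  if (skipped.length > 0 && AlertEmail) {
    MailApp.sendEmail(AlertEmail, 'Google Photos Backup: files skipped', 'Could not copy: ' + skipped.join(', '));
  }
}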

Enter the Client ID and Client Secret inside the empty quotes at the top of the file. You also need to add an email address to receive alerts for large files. There is a BackupFolder option at the top as well - the default is 'Google Photos' which will mimic the old behavior. You can change this if you like but make sure that the desired folder exists before running the script. Save the script.

Go back to the spreadsheet you created and reload. After a few seconds you will have a Google Photos Backup menu (to the right of the Help menu). Choose 'Authorize' from this menu. You will be prompted to give the script various permissions which you should grant. After this a sidebar should appear on the spreadsheet (if not choose 'Authorize' from the Google Photos Backup menu again). Click the authorize link from the sidebar to grant access to Google Photos. Once this is done you should be in business - choose Backup Now from the Google Photos Backup menu and any new items from yesterday should be copied to the Google Photos folder in Drive (or the folder you configured above if you changed this).

Finally you should set up a trigger to automate running the script every day. Choose 'Script editor' from the Tools menu to re-open the script, and then in the script window choose 'Current project's triggers' from the Edit menu. This will open yet another window. Click 'Add Trigger' which is cunningly hidden at the bottom right of the window. Under 'Choose which function to run' select 'runBackup'. Then under 'Select event source' select 'Time-driven'. Under 'Select type of time based trigger' select 'Day timer'. Under 'Select time of day' select the time window that works best for you. Click Save. The backup should now run every day.
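If you'd rather not click through the triggers window you can also create the same daily trigger from code - running a one-off function along these lines once from the script editor should be equivalent (the hour is just an example):

// Run once from the script editor to create a daily time-driven trigger for runBackup.
function createDailyTrigger() {
  ScriptApp.newTrigger('runBackup')
    .timeBased()
    .everyDays(1)
    .atHour(3) // roughly 3-4am in the script's time zone
    .create();
}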

The way the script is written you'll get a backup of anything added the previous day each time it runs. If there are any duplicate filenames in the backup folder the script will save a new copy of the file with (1) added in front of the filename. Let me know in the comments if you use this script or have any suggestions to improve it.