By Robert Ellison. Updated on Saturday, February 12, 2022.
In the New York Times last weekend Preston Greene published an op-ed on the simulation hypothesis in which he argues that we shouldn't check whether it's true, because:
"If we were to prove that we live inside a simulation, this could cause our creators to terminate the simulation — to destroy our world."
But let's back up. To start with he trots out Bostrom:
"In 2003, the philosopher Nick Bostrom made an ingenious argument that we might be living in a computer simulation created by a more advanced civilization."
Am I living in a simulated universe where I am the only person to have ever consumed any science fiction, or spent late nights discussing the nature of the universe in a bad simulation of a kitchen? For some reason Nick Bostrom is now almost universally credited with the simulation hypothesis. Every article on the topic seems to start with this revelation. In 2003! Like right after he finished watching The Matrix Revolutions. Have no newspaper editors ever read any Philip K. Dick? Descartes? This is not a new idea, and Bostrom's ancestor simulations are a rather tortured special case of a much wider set of possibilities.
And then:
"Professor Smoot estimates that the ratio of simulated to real people might be as high as 1012 to 1."
Sounds specific. It could be 10^16 though. Or 7. It's not really subject to numerical analysis at our current level of knowledge (which Greene would prefer we not increase).
And given that we don't know, this invalidates the whole point of the article:
"In much the same way, as I argue in a forthcoming paper in the journal Erkenntnis, if our universe has been created by an advanced civilization for research purposes, then it is reasonable to assume that it is crucial to the researchers that we don’t find out that we’re in a simulation."
That's one possibility, sure. Reasonable to assume? No. Equally possible is that the researchers are trying to find universes that figure out that they are simulated. They keep the ones that manage it within 13.773 billion years or so and discard the others.
I think it's even more likely that simulated universes are a commodity and the number running as screen savers vastly outnumbers those used for serious research projects. Our fate depends on whether the entity that installed us is having a three martini lunch or heading back after two.
(Published to the Fediverse as:
Can I move to a Better Simulation Please? #etc#simulationhypothesis If the simulation hypothesis is true should we avoid checking? Rebuttal to New York Times op-ed by Preston Greene. Spoiler alert, there is no way to know. Also, can we stop pretending that this was Nick Bostrom's idea?)
I've just tidied up and released a tool I've used for a while to sort photos and videos. It does a pretty good job of figuring out the date each one was taken and then moves them into a year + month subfolder. The source code and a binary release are now available on GitHub - see photo-sorter.
This is a command line application with two arguments, a source folder and a destination folder. Note that if a path contains spaces then the entire argument needs to be in quotes.
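An invocation looks something like this (the paths are examples, and the executable name here is assumed from the repository name - check the GitHub release for the exact binary):

```
PhotoSorter.exe "C:\Photos\Unsorted" "D:\My Photos"
```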
This will process all files in the source folder, including subfolders, even if they are not photos or videos. Each file will be moved to a year + month subfolder in the destination (e.g. 2019-08) or to a special subfolder (An Unknown Date) for any files where the date the photo or video was taken cannot be determined.
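As an illustration of that naming rule, here is a minimal JavaScript sketch (not the tool's actual code):

```javascript
// Illustrative sketch of the destination-folder rule described above:
// a "YYYY-MM" name when the date taken is known, otherwise the special
// unknown-date folder.
function destinationFolder(dateTaken) {  // dateTaken: Date or null
  if (!dateTaken) return 'An Unknown Date';
  const month = String(dateTaken.getMonth() + 1).padStart(2, '0');
  return `${dateTaken.getFullYear()}-${month}`;  // e.g. "2019-08"
}
```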
In addition to moving files the tool also handles de-duplication. If the file already exists in the destination folder it is just deleted from the source and not moved. This is checked by file contents (hash) and not by name. If a different file with the same name already exists in the destination folder then PhotoSorter will move it to a unique, new filename.
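The content check boils down to comparing hashes rather than names. A minimal sketch of the idea (illustrative JavaScript, not necessarily how the tool implements it):

```javascript
// Two files count as duplicates when their content hashes match,
// regardless of filename.
const crypto = require('crypto');
const fs = require('fs');

function fileHash(path) {
  return crypto.createHash('sha256')
      .update(fs.readFileSync(path))
      .digest('hex');
}

function isDuplicate(sourcePath, destinationPath) {
  return fs.existsSync(destinationPath) &&
      fileHash(sourcePath) === fileHash(destinationPath);
}
```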
I originally wrote this to handle my 'Google Photos' folder - back when that feature worked it just dumped everything from Google Photos into one Drive folder with no organization. I used the tool periodically to tidy everything into my Photos folder, which is also backed up to Google Drive. Now that Google has stopped syncing Drive and Photos it is still useful, especially with my script that copies new photos over to Google Drive.
Photo Sorter has been updated to handle some duplicates I've been accumulating in Google Photos. These are pretty specific rules but they might be helpful if you are trying to maintain a local archive from Google Photos via Google Takeout. You can get the latest binary and source from GitHub (or fork away if it's not quite what you need).
The first change is that Photo Sorter now checks for duplicates in the source folder as well as the destination. If two source files have the same date taken and the same filename then the larger file is chosen as the winner and the smaller file is deleted. The filename check ignores anything in parentheses, so 123.jpg is considered to be the same filename as 123(1).jpg. This helps alleviate a fun bug where Google Photos will export via the API a different file than the one that was originally uploaded. I've stopped using the Google Photos API for this reason, and because it will under no circumstances allow you to download a video at the same quality as the original upload. Crazy edge case Google. Happily Google Takeout still works, so I'm stuck doing it slowly and wastefully.
The second change is that if a source duplicate is found using the rules above then it will also be deleted from the destination folder (in order to be replaced by the presumed better version of itself).
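A sketch of those source rules (again illustrative JavaScript with hypothetical names - the real tool may differ):

```javascript
// Group source files by date taken plus normalized filename and keep the
// largest file in each group; the rest are the duplicates to delete.
function normalizeName(filename) {
  return filename.replace(/\(.*?\)/g, '');  // "123(1).jpg" -> "123.jpg"
}

function pickWinners(files) {  // files: [{name, dateTaken, size, path}]
  const winners = new Map();
  for (const file of files) {
    const key = `${file.dateTaken}|${normalizeName(file.name)}`;
    const current = winners.get(key);
    if (!current || file.size > current.size) winners.set(key, file);
  }
  return [...winners.values()];
}
```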
Photo Sorter copies some folder full of photos and movies to a different folder with a clean structure and some de-duplication. It's been keeping me sane since 2018.
Photo Sorter 1.10
Photo Sorter has been updated to skip metadata when comparing JPEG files.
I've been picking up some duplicates when I have both a local copy and a version downloaded from Google Photos. Google Photos knocks out some metadata and so the files look different even though the photo is the same. If you've used Photo Sorter before you'll need to run it over everything again to knock out any copies.
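One way to implement that comparison (a minimal sketch assuming standard JPEG segment layout - not necessarily what Photo Sorter does internally):

```javascript
// Hash only the image data in a JPEG: walk the segments, drop APP0-APP15
// (EXIF and friends) and COM blocks, and hash everything else, so files
// that differ only in metadata compare as equal.
const crypto = require('crypto');

function imageDataHash(buf) {  // buf: Buffer containing a JPEG file
  const hash = crypto.createHash('sha256');
  let i = 2;  // skip the SOI marker (FF D8)
  while (i < buf.length - 1 && buf[i] === 0xff) {
    const marker = buf[i + 1];
    if (marker === 0xff) { i++; continue; }  // fill byte, keep scanning
    if (marker === 0xda) {  // SOS: hash the scan data and stop
      hash.update(buf.subarray(i));
      break;
    }
    const len = buf.readUInt16BE(i + 2);  // length includes these 2 bytes
    const isMetadata = (marker >= 0xe0 && marker <= 0xef) || marker === 0xfe;
    if (!isMetadata) hash.update(buf.subarray(i, i + 2 + len));
    i += 2 + len;
  }
  return hash.digest('hex');
}
```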
(Published to the Fediverse as:
Photo Sorter #code#photosorter#jpeg Windows command line tool to de-duplicate and sort photos by year and month, available as source and binary on GitHub.)
95% of my incoming calls are now spam. Most of them are some strange pre-recorded Chinese voice with music playing in the background but I occasionally get a free hotel stay as well.
So far Google has rolled out Call Screen. This means I can waste my time watching Google Assistant talk to the spammer. It's way faster not to bother: hang up on every call and delete the voicemails later.
It seems like there could be a better way to deal with this than Call Screen.
Firstly, send any call that isn't from someone in my contacts directly to voicemail. This would actually solve a lot of the problem.
Next, for extra credit, run spam detection on the voicemail before sending it to me. If it's two seconds long and blank then just bin it. If it's Chinese with music, bin it. Only if it passes the smell test should it appear in my actual voicemail. Google is very good at this for Gmail.
(Published to the Fediverse as:
Please fix phone spam Google! #etc#google#spam#phone Why can't Google manage to fix phone spam when Gmail does such a good job?)
By Robert Ellison. Updated on Saturday, September 3, 2022.
Warning - I no longer recommend using this script to backup Google Photos. The Google Photos API has too many bugs that Google doesn't seem interested in fixing. My personal approach at this point is to use Google Takeout to get a periodic archive of my most recent year of photos and videos. I have a tool that does some deduplication and puts everything in year/month folders. See Photo Sorter for more details.
Google has decided that backing up your photos via Google Drive is 'confusing' and so Drive based backup is going away this month. I love Google Photos but I don't trust it - I pull everything into Drive and then I stick a monthly backup from there onto an external drive in a fire safe. There is a way to get Drive backup working again using Google Apps Script and the Google Photos API. There are a few steps to follow but it's pretty straightforward - you should only need to change two lines in the script to get this working for your account.
Before you start, there are four caveats to be aware of. The first is that Apps Script has a time limit and so it's possible that it could fail if moving a large number of photos. You should get an email if the script ever fails so watch out for that. Secondly, and more seriously, you could end up with two copies of your photos. If you use Backup and Sync to add photos from Google Drive then these photos will be downloaded from Google Photos by the script and added to Drive again. You need to either upload directly to Google Photos (i.e. from the mobile app or web site) or handle the duplicates in some way. If you run Windows then I have released a command line tool that sorts photos into year+month taken folders and handles de-duplication.
One more limitation. After a comment by 'Logan' below I realized that Apps Script has a 50MB limitation for adding files to Google Drive. The latest version of the script will detect this and send you an email listing any files that could not be copied automatically.
And a fourth limitation: after investigating a comment by 'Tim' it turns out there is a bug in the Google Photos API that means it will not download original quality video files. You get some lower resolution version instead. Together with the file size limit this is a bit of a deal breaker for most videos. Photos will be fine, but videos will need a different fix.
On to the script. In Google Drive create a new spreadsheet. This is just a host for the script and makes it easy to authorize it to access Google Photos. Select 'Script editor' from the Tools menu to create a new Apps Script project.
In the script editor select 'Libraries...' from the Resources menu. Enter 1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF next to 'Add a library' and click Add. This will find the Google OAuth2 library. Pick the most recent version and click Save.
Select 'Project properties' from the File menu and copy the Script ID (a long sequence of letters and numbers). You'll need this when configuring the Google Photos API.
In a new window open the Google API Console, part of the Google Cloud Platform. Create a new project, click Enable APIs and Services and find and enable the Google Photos API. Then go to the Keys section and create an OAuth Client ID. You'll need to add a consent screen, the only field you need to fill out is the product name. Choose Web Application as the application type. When prompted for the authorized redirect URL enter https://script.google.com/macros/d/{SCRIPTID}/usercallback and replace {SCRIPTID} with the Script ID you copied above. Copy the Client ID and Client Secret which will be used in the next step.
Go back to the Apps Script project and paste the code below into the Code.gs window:
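The full script is embedded in the original post and isn't reproduced here. As a rough, condensed sketch of its shape based on the description that follows (the function names and details below are mine - treat it as illustrative rather than the exact code):

```javascript
// Condensed, illustrative sketch of the kind of script described in this
// post - not the original code, but the same shape: OAuth2 setup, a custom
// menu, and a runBackup function that copies yesterday's items into Drive.
var ClientID = '';                  // paste your OAuth Client ID here
var ClientSecret = '';              // paste your OAuth Client Secret here
var AlertEmail = '';                // address to notify about files over 50MB
var BackupFolder = 'Google Photos'; // must already exist in Drive

function getService() {
  return OAuth2.createService('photos')
      .setAuthorizationBaseUrl('https://accounts.google.com/o/oauth2/auth')
      .setTokenUrl('https://accounts.google.com/o/oauth2/token')
      .setClientId(ClientID)
      .setClientSecret(ClientSecret)
      .setCallbackFunction('authCallback')
      .setPropertyStore(PropertiesService.getUserProperties())
      .setScope('https://www.googleapis.com/auth/photoslibrary.readonly');
}

function authCallback(request) {
  var ok = getService().handleCallback(request);
  return HtmlService.createHtmlOutput(ok ? 'Authorized.' : 'Access denied.');
}

function onOpen() {
  SpreadsheetApp.getUi().createMenu('Google Photos Backup')
      .addItem('Authorize', 'authorize')
      .addItem('Backup Now', 'runBackup')
      .addToUi();
}

function authorize() {
  var service = getService();
  if (!service.hasAccess()) {
    var html = '<a href="' + service.getAuthorizationUrl() +
        '" target="_blank">Authorize access to Google Photos</a>';
    SpreadsheetApp.getUi().showSidebar(HtmlService.createHtmlOutput(html));
  }
}

function runBackup() {
  var service = getService();
  var folder = DriveApp.getFoldersByName(BackupFolder).next();
  var yesterday = new Date(Date.now() - 24 * 60 * 60 * 1000);
  var day = {
    year: yesterday.getFullYear(),
    month: yesterday.getMonth() + 1,
    day: yesterday.getDate()
  };
  var tooLarge = [];
  var pageToken = null;
  do {
    var payload = {
      pageSize: 100,
      filters: {dateFilter: {ranges: [{startDate: day, endDate: day}]}}
    };
    if (pageToken) payload.pageToken = pageToken;
    var response = JSON.parse(UrlFetchApp.fetch(
        'https://photoslibrary.googleapis.com/v1/mediaItems:search', {
          method: 'post',
          contentType: 'application/json',
          headers: {Authorization: 'Bearer ' + service.getAccessToken()},
          payload: JSON.stringify(payload)
        }).getContentText());
    (response.mediaItems || []).forEach(function(item) {
      // '=d' downloads photo bytes; '=dv' is the (buggy, see above) video path
      var suffix = item.mediaMetadata.video ? '=dv' : '=d';
      var blob = UrlFetchApp.fetch(item.baseUrl + suffix).getBlob();
      if (blob.getBytes().length > 50 * 1024 * 1024) {
        tooLarge.push(item.filename);  // Apps Script can't create files over 50MB
      } else {
        folder.createFile(blob).setName(item.filename);
      }
    });
    pageToken = response.nextPageToken;
  } while (pageToken);
  if (tooLarge.length > 0) {
    MailApp.sendEmail(AlertEmail, 'Google Photos Backup: files too large',
        'These files need to be copied manually:\n' + tooLarge.join('\n'));
  }
}
```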
Enter the Client ID and Client Secret inside the empty quotes at the top of the file. You also need to add an email address to receive alerts for large files. There is a BackupFolder option at the top as well - the default is 'Google Photos' which will mimic the old behavior. You can change this if you like but make sure that the desired folder exists before running the script. Save the script.
Go back to the spreadsheet you created and reload. After a few seconds you will have a Google Photos Backup menu (to the right of the Help menu). Choose 'Authorize' from this menu. You will be prompted to give the script various permissions which you should grant. After this a sidebar should appear on the spreadsheet (if not choose 'Authorize' from the Google Photos Backup menu again). Click the authorize link from the sidebar to grant access to Google Photos. Once this is done you should be in business - choose Backup Now from the Google Photos Backup menu and any new items from yesterday should be copied to the Google Photos folder in Drive (or the folder you configured above if you changed this).
Finally you should set up a trigger to automate running the script every day. Choose 'Script editor' from the Tools menu to re-open the script, and then in the script window choose 'Current project's triggers' from the Edit menu. This will open yet another window. Click 'Add Trigger' which is cunningly hidden at the bottom right of the window. Under 'Choose which function to run' select 'runBackup'. Then under 'Select event source' select 'Time-driven'. Under 'Select type of time based trigger' select 'Day timer'. Under 'Select time of day' select the time window that works best for you. Click Save. The backup should now run every day.
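If you prefer code to clicking through that window, a one-off function like this (run it once from the script editor) sets up the same daily schedule:

```javascript
// One-off alternative to the trigger UI: schedules runBackup to run daily.
// The hour is an example - pick whatever window suits you.
function createDailyTrigger() {
  ScriptApp.newTrigger('runBackup')
      .timeBased()
      .everyDays(1)
      .atHour(1)
      .create();
}
```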
The way the script is written you'll get a backup of anything added the previous day each time it runs. If there are any duplicate filenames in the backup folder the script will save a new copy of the file with (1) added in front of the filename. Let me know in the comments if you use this script or have any suggestions to improve it.
(Published to the Fediverse as:
How to backup Google Photos to Google Drive automatically after July 2019 with Apps Script #code#software#photos#appsscript#google#sheets#drive#api Easy to configure apps script to continue backing up your Google Photos to Google Drive after the July 2019 change. You just need to change two lines of the script to get this running with your Google Photos account.)
By Robert Ellison. Updated on Saturday, February 19, 2022.
Timelapse of the night sky over Pinnacles National Park in California. Jupiter and the Milky Way are both prominent. Shot from the campground on July 4, 2019.
(Published to the Fediverse as:
Stars over Pinnacles #timelapse#stars#pinnacles#video A timelapse showing the Milky Way and Jupiter over the Pinnacles National Park in California.)