More on simulation. Have we proved the simulation hypothesis isn't true? No. Also: what does analyzing the physics of Conway's Game of Life tell us?
New project: generate a timelapse of a kid growing up from any set of photos (it uses machine learning to spot and align faces). Check it out at kidlapse.com.
I'm working on a project to generate a timelapse of a kid growing up. I wasn't organized enough to shoot my kids in the same pose on the same background, so it's quite a tough problem. To fix this I'm using machine learning to recognize faces in photos and then automatically rotate and align them so that the face is in the same place in every shot. From there it's just a matter of generating frames that fade between the different photos and stitching them together into a video. If this sounds interesting, check it out at kidlapse.com and sign up to get notified when the service launches.
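The production code isn't public, but here's a minimal Python sketch of the alignment and fade steps, using the face_recognition and OpenCV libraries - the frame size and target eye positions are made-up values for illustration:

    import cv2
    import face_recognition
    import numpy as np

    FRAME_SIZE = (1080, 1080)                    # output width, height (illustrative)
    LEFT_EYE_TARGET = np.array([400.0, 450.0])   # where the eyes should land in every frame
    RIGHT_EYE_TARGET = np.array([680.0, 450.0])

    def similarity_transform(src_left, src_right, dst_left, dst_right):
        """2x3 matrix that rotates, scales and translates the eye centers onto the targets."""
        src_vec, dst_vec = src_right - src_left, dst_right - dst_left
        scale = np.linalg.norm(dst_vec) / np.linalg.norm(src_vec)
        angle = np.arctan2(dst_vec[1], dst_vec[0]) - np.arctan2(src_vec[1], src_vec[0])
        cos, sin = scale * np.cos(angle), scale * np.sin(angle)
        rotation = np.array([[cos, -sin], [sin, cos]])
        translation = dst_left - rotation @ src_left
        return np.hstack([rotation, translation.reshape(2, 1)])

    def align_face(path):
        """Load a photo and warp it so the face sits in the same place in every shot."""
        image = face_recognition.load_image_file(path)
        landmarks = face_recognition.face_landmarks(image)
        if not landmarks:
            return None                          # no face found, skip this photo
        left_eye = np.mean(landmarks[0]['left_eye'], axis=0)
        right_eye = np.mean(landmarks[0]['right_eye'], axis=0)
        matrix = similarity_transform(left_eye, right_eye, LEFT_EYE_TARGET, RIGHT_EYE_TARGET)
        return cv2.warpAffine(cv2.cvtColor(image, cv2.COLOR_RGB2BGR), matrix, FRAME_SIZE)

    def crossfade(a, b, steps=30):
        """Frames that fade from one aligned photo to the next, ready to stitch into a video."""
        for i in range(steps):
            t = i / (steps - 1)
            yield cv2.addWeighted(a, 1.0 - t, b, t, 0.0)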
By Robert Ellison. Updated on Saturday, February 12, 2022.
Conway's Game of Life is a cellular automaton where simple rules lead to surprisingly complex behavior. You can even build a Turing Machine in it. Life consists of a grid of cells which are either alive or dead. For each generation a dead cell flips to alive if it has exactly three live neighbors. If a cell is alive and has two or three live neighbors then it survives to the next generation, otherwise it dies. When programming a non-infinite Life game it's common to wrap the logic at the edges of the grid - so some 'neighbors' of the cells at the very top are the cells at the very bottom and so on.
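As a rough sketch (my own illustration, not tied to any particular implementation), here's one generation of a wrapped Life grid in Python, where np.roll provides exactly that torus-style wrapping at the edges:

    import numpy as np

    def life_step(grid):
        """One generation of Life on a grid that wraps at the edges."""
        # Sum the eight neighbors; np.roll wraps around, so the top and bottom
        # rows (and the left and right columns) see each other.
        neighbors = sum(
            np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0)
        )
        # A dead cell with exactly three live neighbors becomes alive;
        # a live cell with two or three live neighbors survives.
        return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

    # A glider on a 10x10 grid - run it long enough and it wraps around
    # instead of falling off the edge.
    grid = np.zeros((10, 10), dtype=int)
    grid[1, 2] = grid[2, 3] = grid[3, 1] = grid[3, 2] = grid[3, 3] = 1
    for _ in range(40):
        grid = life_step(grid)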
Imagine that you discover such a system and try to figure out the physics of it.
After observing a sample of cells you'd figure out the rules that govern the life and death of most cells. You'd also figure out a speed of 'light' for the system - information can only travel one cell per generation. The state of cells further away has no influence. You've got a kind of classical physics of the Game of Life.
Further study would throw up a puzzle though. Cells at the extremes of the system are influenced by cells at the other extreme. In some cases the speed of 'light' is violated - you now have a non-local physics in the mix. At this point you might fix the problem with geometry - maybe the grid is actually wrapped around a torus (even though you're looking at a rectangular grid). This makes the system logically consistent again but it's wrong - the non-local behavior occurs because you're trying to analyze a simulation.
In quantum physics observing the state of a property on one particle in a pair of entangled particles will instantly affect the observation of that property on the other particle, no matter the distance between them. This is Einstein's spooky action at a distance. It seems like it can't possibly be true, but it has been demonstrated repeatedly (and quite spectacularly, using starlight to select which property to measure).
There are many different interpretations of how to understand quantum physics. But as you might expect from physicists, these concern themselves with a physical universe (or multiverse, depending on the flavor). It's possible though that non-locality (and the apparent quantized nature of our reality) is trying to tell us something else. Non-local effects are entirely consistent with a reality that is being generated frame by frame, just like a souped up Game of Life.
(Published to the Fediverse as:
Life, Non-locality and the Simulation Hypothesis #etc#simulationhypothesis#conway How Conway's Game of Life illustrates non-locality and how this might be interpreted as evidence in favor of the simulation hypothesis when looking at non-locality in quantum physics.)
"A recent study by theoretical physicists from Oxford University in the U.K., which was published in the journal Scientific Advances just last week, definitively confirms that life and reality aren’t products of a computer simulation."
Strong statement. This is because they determined that running a simulation of a small quantum system was intractable:
"To store information about a couple hundred electrons, they noted, one needs a computer memory that requires more atoms than what’s available in the universe."
This might have something to say about what we can simulate on a classical computer in our universe, but it has no bearing on whether our universe is itself simulated. If it is, we have no idea what kind of computer is doing the simulating, nor what the physical laws are of the universe where that computer is running, nor even how many atoms it has at its disposal.
It's okay Elon, you still might be on to something.
(Published to the Fediverse as:
Have we Already Proved that the Simulation Hypothesis is False? #etc#simulationhypothesis Oxford University confirms that we don't live in a simulation - but they haven't proved what they think they've proved.)
By Robert Ellison. Updated on Saturday, February 12, 2022.
In the New York Times last weekend Preston Greene has an op-ed piece on the simulation hypothesis where he argues that we shouldn't check, because:
"If we were to prove that we live inside a simulation, this could cause our creators to terminate the simulation — to destroy our world."
But let's back up. To start with he trots out Bostrom:
"In 2003, the philosopher Nick Bostrom made an ingenious argument that we might be living in a computer simulation created by a more advanced civilization."
Am I living in a simulated universe where I am the only person to have ever consumed any science fiction, or spent late nights discussing the nature of the universe in a bad simulation of a kitchen? For some reason Nick Bostrom is now almost universally credited with the simulation hypothesis. Every article on the topic seems to start with this revelation. In 2003! Like right after he finished watching The Matrix Revolutions. Have no newspaper editors ever read any Philip K. Dick? Descartes? This is not a new idea, and Bostrom's ancestor simulations are a rather tortured special case of a much wider set of possibilities.
And then:
"Professor Smoot estimates that the ratio of simulated to real people might be as high as 1012 to 1."
Sounds specific. It could be 10¹⁶ though. Or 7. Not really subject to numerical analysis at our current level of knowledge (knowledge that Greene's proposal would do nothing to increase).
And given that we don't know, this invalidates the whole point of the article:
"In much the same way, as I argue in a forthcoming paper in the journal Erkenntnis, if our universe has been created by an advanced civilization for research purposes, then it is reasonable to assume that it is crucial to the researchers that we don’t find out that we’re in a simulation."
That's one possibility, sure. Reasonable to assume? No. Equally possible is that the researchers are trying to find universes that figure out that they are simulated. They keep the ones that manage it within 13.773 billion years or so and discard the others.
I think it's even more likely that simulated universes are a commodity and the number running as screen savers vastly outnumbers those used for serious research projects. Our fate depends on whether the entity that installed us is having a three martini lunch or heading back after two.
(Published to the Fediverse as:
Can I move to a Better Simulation Please? #etc#simulationhypothesis If the simulation hypothesis is true should we avoid checking? Rebuttal to New York Times op-ed by Preston Greene. Spoiler alert, there is no way to know. Also, can we stop pretending that this was Nick Bostrom's idea?
I've just tidied up and released a tool I've used for a while to sort photos and videos. It does a pretty good job figuring out the date each was taken and then moves them to a year + month subfolder. The source code and a binary release are now available on github - see photo-sorter.
This is a command line application with two arguments, a source folder and a destination folder. Use it like this (paths are examples and note that if there are spaces then the entire argument needs to be in quotes):
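    PhotoSorter "C:\Unsorted\Google Photos" "C:\Photos"

(The executable name and paths here are examples - check the GitHub release for the exact binary name.)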
This will process all files in the source folder, including subfolders, even if they are not photos or videos. Each file will be moved to a year + month subfolder in the destination (e.g. 2019-08) or to a special subfolder (An Unknown Date) for any files where the date the photo or video was taken cannot be determined.
In addition to moving files the tool also handles de-duplication. If the file already exists in the destination folder it is just deleted from the source and not moved. This is checked by file contents (hash) and not by name. If a different file with the same name already exists in the destination folder then PhotoSorter will move it to a unique, new filename.
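The shipped tool is a Windows command line app, but here's a rough Python sketch of the move and de-duplication logic (my own illustration, not the actual source - it reads the date taken from EXIF with Pillow and compares files by SHA-256 hash):

    import hashlib
    import shutil
    from pathlib import Path
    from PIL import Image, ExifTags

    def date_taken(path):
        """Try to read the year-month a photo was taken from its EXIF data."""
        try:
            exif = Image.open(path)._getexif() or {}
            tags = {ExifTags.TAGS.get(k): v for k, v in exif.items()}
            raw = tags.get('DateTimeOriginal')                 # e.g. '2019:08:17 14:03:22'
            return raw[:7].replace(':', '-') if raw else None  # -> '2019-08'
        except Exception:
            return None

    def file_hash(path):
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def sort_file(source, destination_root):
        """Move a file into a year + month subfolder, de-duplicating by content."""
        folder = Path(destination_root) / (date_taken(source) or 'An Unknown Date')
        folder.mkdir(parents=True, exist_ok=True)
        target = folder / Path(source).name
        if target.exists():
            if file_hash(target) == file_hash(source):
                Path(source).unlink()                  # exact duplicate: just delete the source
                return
            stem, suffix, n = target.stem, target.suffix, 1
            while target.exists():                     # same name, different content: new name
                target = folder / f'{stem} ({n}){suffix}'
                n += 1
        shutil.move(source, target)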
I originally wrote this to handle my 'Google Photos' folder - when this feature worked it just dumped everything from Google Photos into one Drive folder with no organization. I used this periodically to tidy everything into my Photos folder, which is also backed up to Google Drive. Now that Google has stopped syncing Drive and Photos this is still useful, especially with my script that copies new photos over to Google Drive.
Photo Sorter has been updated to handle some duplicates I've been accumulating in Google Photos. These are pretty specific rules but might be helpful if you are trying to maintain a local archive from Google Photos via Google Takeout. You can get the latest binary and source from github (or fork away if it's not quite what you need).
The first change is that Photo Sorter now checks for duplicates in the source folder as well as the destination. If two source files have the same date taken and the same filename then the larger file is chosen as the winner and the smaller file is deleted. The filename check ignores anything in parentheses, so 123.jpg is considered to be the same filename as 123(1).jpg. This helps alleviate a fun bug where Google Photos will export via the API a different file from the one that was originally uploaded. I've stopped using the Google Photos API for this reason, and because it will under no circumstances allow you to download a video at the same quality as the original upload. Crazy edge case, Google. Happily Google Takeout still works, so I'm stuck doing it slowly and wastefully.
The second change is that if a source duplicate is found using the rules above then it will also be deleted from the destination folder (in order to be replaced by the presumed better version of itself).
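As a sketch of the new source-side rule (again my own illustration rather than the shipped code - the date taken comes from the same kind of EXIF lookup as in the earlier sketch):

    import re
    from pathlib import Path

    def normalized_name(path):
        """Filename with anything in parentheses removed, so 123(1).jpg matches 123.jpg."""
        p = Path(path)
        return re.sub(r'\(.*?\)', '', p.stem).strip() + p.suffix.lower()

    def source_duplicate_loser(a, b, date_taken_a, date_taken_b):
        """If two source files look like the same photo, return the smaller one to delete."""
        if date_taken_a == date_taken_b and normalized_name(a) == normalized_name(b):
            return b if Path(a).stat().st_size >= Path(b).stat().st_size else a
        return None  # not duplicates, keep both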
Photo Sorter copies some folder full of photos and movies to a different folder with a clean structure and some de-duplication. It's been keeping me sane since 2018.
Photo Sorter 1.10
Photo Sorter has been updated to skip metadata when comparing JPEG files.
I've been picking up some duplicates when I have both a local copy and a version downloaded from Google Photos. Google Photos knocks out some metadata and so the files look different even though the photo is the same. If you've used Photo Sorter before you'll need to run it over everything again to knock out any copies.
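One way to do that kind of comparison (a sketch assuming the stripped metadata doesn't touch the image data itself) is to hash the decoded pixels rather than the raw file bytes:

    import hashlib
    from PIL import Image

    def pixel_hash(path):
        """Hash of the decoded pixel data, ignoring EXIF and other metadata."""
        with Image.open(path) as image:
            return hashlib.sha256(image.convert('RGB').tobytes()).hexdigest()

    # Two copies of the same photo with different metadata now compare as equal,
    # e.g. a local original and the version downloaded from Google Photos.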
(Published to the Fediverse as:
Photo Sorter #code#photosorter#jpeg Windows command line tool to de-duplicate and sort photos by year and month, available as source and binary on GitHub.)