For no good reason I downloaded my gas and electricity consumption data by day for the last couple of years.
The electricity trend is unsurprising. At the start of the pandemic it jumps up and stays up. With work and school from home we're running four computers non-stop, burning lights and (the horror) printing things. Overall we used 24% more electricity in 2020.
Gas on the other hand is pretty flat. There are some different peaks at the start and end of the year, but our total gas consumption increased by just 0.08%. This doesn't make any sense to me. Being at home doesn't make much of a difference to laundry, but it should have had a big impact on everything else. The heating has been on way more, we're cooking breakfasts and lunches that would have been eaten out of the house in 2019, and we must be using more hot water as well.
There is one strange difference between how electricity and gas are metered. Fractional kWh are distributed randomly between .00 and .99 as you'd expect. Fractional therms are totally different - we're apparently likely to use 1.02 or 2.03 therms but never 1.50. This feels like it must be some sort of rounding or other billing oddness but I can't find any reasonable explanation despite asking Google three different ways.
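If you want to poke at your own export, this is roughly the check. It's only a sketch: the file name and the USAGE column are guesses at the PG&E CSV format, so adjust for whatever the download actually contains.

```python
# Rough sketch: distribution of the fractional part of daily usage.
# 'pge_gas_daily.csv' and the 'USAGE' column are guesses at the export format.
import pandas as pd

usage = pd.read_csv('pge_gas_daily.csv')['USAGE']
fractions = (usage % 1).round(2)              # just the part after the decimal point
print(fractions.value_counts().sort_index())  # kWh: roughly flat; therms: clustered near .0x
```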
In a move that I might come to bitterly regret I have emailed PG&E to see if they can explain it. I'll update this post if I hear back. Or if you're a therm metering expert please leave a comment!
Updated 2021-02-20 13:51:
PG&E say:
"Thank you for contacting our Customer Service Center. Gas usage is registered by recording therms usage. If you view your daily usage online, you will see that therms are only registered in whole units. The only pace that you will see therms not as whole units is when you review the average daily usage. The pandemic started in March 2020 and since then your gas usage is up slightly versus previous years. Most customers will see a larger increase in electric usage versus gas usage when staying home more than normal. The majority of customers set the tempatures of the their heaters to very similar temperatures year over year and your heater will work to keep your house at the temperature whether you are home or not at home."
So the fractional therms are some sort of odd rounding on the downloaded data. Fair enough.
The majority of customers use the same temperature setting? Really? So that might be a good explanation if you constantly heat your house to the same temperature, but I know for sure that isn't us. We have a Nest Learning Thermostat and as I've previously reported this doesn't so much learn as just constantly turn the heating off. So staying warm is a constant battle with the thing.
Maybe the difference is that the pandemic started around Spring when San Francisco is warm enough to not need much heating. I'll look again when I can just compare winter vs winter in a couple of months.
Updated 2023-08-06 18:11:
Took a while to update, but here is some more data. Electricity stayed high until Spring 2021 and then dropped to roughly pre-pandemic levels. This is because I spent a lot of time in 2021 upgrading lighting. My house has a different type of fixture/bulb in every room, which made this a painful process, but I'm almost 100% LED at this point and it has made a difference. Gas on the other hand has gone up and stayed there, and I should really replace some more windows and add some more insulation...
I really wish the utility companies made this data available through some useful API instead of needing to download the occasional CSV. I'd build a dashboard and obsess over energy usage far more.
(Published to the Fediverse as:
Pandemic Gas Mystery #etc #coronavirus #gas #electricity Why is my gas bill flat in 2020 when electricity usage has gone up 24%? A pandemic gas mystery based on PG&E data.)
Photo Sorter has been updated to skip metadata when comparing JPEG files.
I've been picking up some duplicates when I have both a local copy and a version downloaded from Google Photos. Google Photos strips some metadata and so the files look different even though the photo is the same. If you've used Photo Sorter before you'll need to run it over everything again to knock out any remaining copies.
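The rough idea (a sketch of the approach, not the actual Photo Sorter code) is to compare the decoded pixels instead of the raw file bytes, so metadata differences don't count:

```python
# Sketch of the approach (not the actual Photo Sorter code): two JPEGs count
# as duplicates if their decoded pixels match, regardless of EXIF/metadata.
import hashlib
from PIL import Image

def pixel_hash(path):
    """Hash of the image pixels only; metadata is ignored."""
    with Image.open(path) as img:
        return hashlib.sha256(img.convert('RGB').tobytes()).hexdigest()

# Example: a local original and its Google Photos download should match.
# (File names here are placeholders.)
print(pixel_hash('local_copy.jpg') == pixel_hash('google_photos_copy.jpg'))
```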
By Robert Ellison. Updated on Saturday, February 19, 2022.
TensorFlow Hub has a great sample for transferring the style of one image to another. You might have seen Munch's The Scream applied to a turtle, or Hokusai's The Great Wave off Kanagawa to the Golden Gate Bridge. It's fun to play with and I wondered how it would work for a timelapse video. I just posted my first attempt: four different shots of San Francisco, and I think it turned out pretty well.
The four sequences were all shot on an A7C, one second exposure, ND3 filter and aperture/ISO as needed to hit one thousandth of a second before fitting the filter. Here's an example shot:
I didn't want The Scream or The Wave, so I asked my kids to pick two random pieces of art each so I could have a different style for each sequence:
The style transfer network wants a 256x256 style image so I cropped and resized the art as needed.
The sample code pulls images from URLs. I modified it to connect to Google Drive, iterate through a source folder of images and write the transformed images to a different folder. I'm running this in Google Colab, which has the advantage that you get to use Google's GPUs and the disadvantage that it will disconnect, time out, run out of memory, etc. To work around this the modified code can be run as many times as needed to get through all the files and will only process input images that don't already exist in the output folder. Here's a gist of my adapted Colab notebook:
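Roughly, the notebook mounts Drive, loads the TF Hub model, and loops over the input folder, skipping any frame that already has an output. The paths and folder layout below are placeholders rather than the exact gist:

```python
# Sketch of the adapted notebook: batch style transfer over a Drive folder,
# skipping frames that already exist in the output folder so the notebook
# can be re-run after Colab disconnects. Paths below are placeholders.
import os
import tensorflow as tf
import tensorflow_hub as hub
from google.colab import drive

drive.mount('/content/drive')

INPUT_DIR = '/content/drive/MyDrive/timelapse/input'       # frames exported from Lightroom
OUTPUT_DIR = '/content/drive/MyDrive/timelapse/output'     # stylized frames
STYLE_PATH = '/content/drive/MyDrive/timelapse/style.jpg'  # 256x256 style image

hub_module = hub.load(
    'https://tfhub.dev/google/magenta/arbitrary-image-stylization-v1-256/2')

def load_image(path, target_size=None):
    """Decode a JPEG to a float32 [1, H, W, 3] tensor in [0, 1]."""
    img = tf.io.decode_jpeg(tf.io.read_file(path), channels=3)
    img = tf.image.convert_image_dtype(img, tf.float32)
    if target_size:
        img = tf.image.resize(img, target_size)
    return img[tf.newaxis, ...]

style_image = load_image(STYLE_PATH, target_size=(256, 256))

for name in sorted(os.listdir(INPUT_DIR)):
    out_path = os.path.join(OUTPUT_DIR, name)
    if os.path.exists(out_path):
        continue  # already processed on a previous run
    # Square 1920x1920 output, cropped back to 16:9 later with ffmpeg.
    content_image = load_image(os.path.join(INPUT_DIR, name), target_size=(1920, 1920))
    stylized = hub_module(tf.constant(content_image), tf.constant(style_image))[0]
    tf.keras.preprocessing.image.save_img(out_path, stylized[0].numpy())
```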
One final problem is that the style transfer example produces square output images. I just set the output to 1920x1920 and then cropped an HD frame out of the middle of each image to get around this.
Here's a more detailed workflow for the project:
I usually shoot timelapse with a neutral density filter to get some nice motion blur. When I shot this sequence it was the first time I'd used my filter on a new camera/lens and screwing in the filter threw off the focus enough to ruin the shots. Lesson learned - on this camera I need to nail the focus after attaching the filter. As I've been meaning to try style transfer for timelapse I decided to use this slightly bad sequence as the input. Generally for timelapse I shoot manual / manual focus, fairly wide aperture and ISO 100 unless I need to bump this up a bit to get to a 1 second exposure with the filter.
After shooting I use LRTimelapse and Lightroom 6 to edit the RAW photos. LRTimelapse reduces flicker and works well for applying a pan and/or zoom during processing as well. For this project I edited before applying the style transfer. The style transfer network preserves detail very well and then goes crazy in areas like the sky. Rather than zooming into those artifacts I wanted to keep them constant which I think gives a greater sense of depth as you zoom in or out.
Once the sequence is exported from Lightroom I cancel out of the LRTimelapse render window and switch to Google Colab. Copy the rendered sequence to the input folder along with the desired style image, then run the notebook to process everything. If it misbehaves then Runtime -> Restart and run all is your friend.
To get to video I use ffmpeg to render each sequence, for this project at 24 frames per second, cropping a 1920x1080 frame from the middle of each of the 1920x1920 style transfer images.
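The exact command varies a bit per sequence, but it's along these lines (the frame naming pattern and output name are placeholders):

```
# 24 fps, center-crop 1920x1080 out of the 1920x1920 stylized frames
ffmpeg -framerate 24 -i stylized_%05d.jpg -vf "crop=1920:1080" \
  -c:v libx264 -crf 18 -pix_fmt yuv420p sequence.mp4
```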
Then DaVinci Resolve to edit the sequences together. I added a 2 second cross dissolve between each sequence and a small fade at the very beginning and end.
Finally, music. I use Filmstro Pro and for this video I used the track Durian.
(Published to the Fediverse as:
Style Transfer for Time Lapse Photography #code #ml #tensorflow #drive #python #video How to apply the TensorFlow Hub style transfer to every frame in a timelapse video using Python and Google Drive.)
By Robert Ellison. Updated on Saturday, February 19, 2022.
A timelapse of San Francisco on New Year's Eve 2020:
Shot from Treasure Island, Corona Heights Park, Fort Baker, and near Battery 129 in the Marin Headlands. Each sequence was transformed with a style transfer neural network (full details).
(Published to the Fediverse as:
San Francisco New Year's Eve Timelapse 2020 #timelapse #sanfrancisco #video San Francisco New Year's Eve Timelapse 2020 (style transfer neural network version).)