Rivians have a drive cam feature that will continually record footage from four cameras (front, rear, left and right) while you're driving. It's a built-in dash cam, which immediately got me excited to make hyperlapse-style movies of interesting drives.
My first attempt was very, very sad. Rivian dumps out the footage in some fisheye format that looks terrible. It also often skips frames, so when imported into DaVinci Resolve the dreaded Media Offline error pops up all the time during playback. Insta360 Studio handles the dropped frames, so I created the hyperlapse there and tried to zoom in enough to fix the fisheye, but overall I was very disappointed. Hopefully Rivian fixes the footage or provides some sort of tool to make this feature usable at some point.
Today I wrestled with the problem a bit more deeply and got something working. The image at the top of this post is a drive cam frame that is dramatically improved. The trick is using the lenscorrection filter in ffmpeg. The filter requires k1 and k2 coefficients, which I solved for by generating hundreds of videos and eyeballing them, like the horrifying experience of visiting an optician and suspecting that they're going to write your prescription based on your opinion of which letter looks better. After much juggling I settled on -0.45 and 0.11. In terms of the command line this translates to:
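Something along these lines should do it (input.mp4 and output.mp4 are placeholder file names, and the lens center is left at the filter's defaults):

    ffmpeg -i input.mp4 -vf "lenscorrection=k1=-0.45:k2=0.11" output.mp4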
This re-encoding also has the happy side effect of fixing the dropped frames.
I would love to have some official numbers to plug in (hint, hint Rivian). My Rivian is a 2025 Gen 2 R1S - I have no idea how much the camera module varies between different Rivian variants, so this might work for you or might need more fine-tuning. Having cracked this, I'm currently processing some footage of a trip to Shasta Lake and will post that soon (update - it's here).
(Published to the Fediverse as:
Fix Rivian Drive Cam Distortion #code#rivian#ffmpeg Using ffmpeg and the lenscorrection filter to fix the fisheye distortion on Rivian Drive Cam footage.)
We recently got an electric vehicle and unsurprisingly our electricity usage has shot up - something like 125% so far. This is of course offset by not needing to buy gas, but the PG&E bill is starting to look eye-watering.
PG&E offers an exciting and nearly impenetrable number of rate plans. Right now we're on E-TOU-C, which PG&E says is the best choice for us. This is a time-of-use plan, which makes a lot of sense - electricity is cheap off-peak and expensive when it's in high demand. Running the dishwasher at the end of the day saves a few cents. Charging an EV at the right time is a big deal.
I decided to simulate our bill on each plan, with and without EV charging.
This turns out to be astonishingly complicated. There is probably a significant energy saving in having the billing systems sweat a bit less. It's not just peak vs. off-peak: the rates are different for summer and winter. In some plans peak is a daily occurrence and in others it doesn't apply to weekends and holidays (raising the exciting sub-investigation of what PG&E considers to be a holiday). Some plans have a daily use fee. Our plan has a discount for baseline usage; others do not.
That's all just for the conventional time-of-use plans. The EV plans introduce a 'part-peak' period, so there are three different rates based on time of day. They also have different definitions of summer.
I had imagined a quick spreadsheet but this has turned into a Python exercise. The notebook is included below. If you use this you'll need to estimate your average daily EV charging needs and also your baseline details. It uses a year of data downloaded from PG&E to run the simulation, so use the year before you started charging an EV. I think I've captured most of the details but I did take a shortcut with the baseline calculations - it uses calendar months instead of billing periods. PG&E billing periods range from 28 to 33 days, presumably because that will be cheaper in the long run.
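To give a flavor of what the notebook does, here's a stripped-down sketch of the core costing loop. The season and peak windows roughly follow E-TOU-C, but the rates below are illustrative placeholders rather than actual PG&E tariffs, and the real notebook also handles holidays, baseline credits, daily fees and the EV plans' part-peak period:

    from datetime import datetime

    # Illustrative placeholder rates in $/kWh - not actual PG&E tariffs
    RATES = {
        ("summer", "peak"): 0.55,
        ("summer", "off_peak"): 0.40,
        ("winter", "peak"): 0.45,
        ("winter", "off_peak"): 0.38,
    }

    def season(ts: datetime) -> str:
        # E-TOU-C style summer runs roughly June through September
        return "summer" if 6 <= ts.month <= 9 else "winter"

    def period(ts: datetime) -> str:
        # E-TOU-C peak is roughly 4-9pm every day
        return "peak" if 16 <= ts.hour < 21 else "off_peak"

    def simulate(intervals, ev_kwh_per_day=0.0, ev_hour=1):
        # intervals is a list of (timestamp, kWh) rows from the PG&E usage export
        total = 0.0
        for ts, kwh in intervals:
            if ts.hour == ev_hour:
                # pretend all charging lands in a single off-peak hour each day
                kwh += ev_kwh_per_day
            total += kwh * RATES[(season(ts), period(ts))]
        return total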
It would be nice if PG&E had some kind of what-if modelling but I guess that's not in their best interests. Right now the website says I should stick with E-TOU-C, which looks like a bad idea even based on the past year of usage. All of the plans are pretty close for me based on historical usage though. Adding an EV shows a huge difference. Off-peak rates are a lot cheaper but in exchange the peak rates are much higher. I'll save a lot moving to the EV2 plan, which is what I've just done. It's not clear how you should choose between the different EV-oriented plans without getting into this level of detail, but they are all better than the conventional time-of-use options if you have better things to do.
I evaluated the E-TOU-B, E-TOU-C and E-TOU-D time of use plans and the EV Rate A, EV Rate B, EV2 and E-ELEC plans for people with an EV or other qualifying electrical thing. The chart at the top of the post shows PG&E's estimates for the past year, my estimates and then my estimates with EV charging included.
(Published to the Fediverse as:
Which PG&E rate plan works best for EV charging? #code#pge#electricity#ev#python Simulating PG&E bills with and without EV charging across 7 rate plans to discover the cheapest option (Python).)
By Robert Ellison. Updated on Sunday, September 29, 2024.
Catfood Earth for Android 4.40 is now available on Google Play.
Earth has an updated look and feel and two new features.
The volcanoes layer has been ported over from the Windows version of Catfood Earth. When enabled this will show volcanoes that have recent activity (within the past week) using data from the Smithsonian Institution's Global Volcanism Program.
It's now possible to show your current location on the map. I'm not sure it's a replacement for Google Maps just yet but it does help you find where you are on the satellite image.
The release was prompted by Google requiring API level 34 support... completing this for Fortune Cookies was a nightmare, but having learnt from that experience Earth made the jump to MAUI pretty smoothly.
If you already use Earth for Android you should get the new version shortly. If not, this is what Android live wallpaper was made for so give it a try!
Fortune Cookies for Android 1.50 is now available in the Google Play Store.
This update was driven by Google insisting that I target API level 34. Which is fair enough, and I figured this would be a five-minute task followed by a smooth release. I should have known better.
Of course the starting point is updating Visual Studio, updating the Android SDK, learning that my emulator won't launch any more and eventually coaxing it back to life. That's a couple of hours. Why this doesn't just happen when I'm doing other things I don't know, but for dev tools this has to be a ceremony.
Once all of that was done I learned that Xamarin was officially deprecated in May. I'm going to have to figure out MAUI.
There is a helpful migration page with this gobsmacking advice:
"Once your dependencies are resolved and your code and resource files are added to your .NET native project, you should build your project. Any errors will guide you towards next steps."
I think they hired Yoda:
"Errors, they are. Guide you, they will, towards your next steps. Warnings, hmm, check them out you must... eventually. But information issues? Merely whispers they are, nudging you towards shiny new platform features, yes! Listen, you might, if time you have."
Anyway... the actual mechanics of getting this working in MAUI were not that bad. It could be that I need to reinstall my system with extreme prejudice, but the platform itself seems to be very unstable. I constantly got cryptic Visual Studio errors and compile errors that went away on a rebuild or a restart. Starting the Android emulator has completely frozen my system several times, requiring a hard reboot. I don't think I've had that experience since the Clinton administration.
Once it was finally working the Google Play Developer console wanted my "private" key, which I gave it; and to have a conversation about my tax situation in Cuba, which I'm ignoring for now.
As well as a brand new API target, Fortune has a nifty new color scheme, a floating action button with a little fortune cookie on it, and will ask you nicely for permission to send notifications.
(Published to the Fediverse as:
Fortune Cookies for Android 1.50 #code#fortune#software#cookie#catfood#xamarin#maui Catfood Fortune for Android is based on the UNIX command of the same name and will display a random and possibly no longer socially acceptable fortune from a less civilized era.)
I just published a San Francisco Budget GPT to the ChatGPT store. This is free to use; you just need an OpenAI account.
The chatbot is constructed from the 2010-2025 data available on DataSF, the most recent 2025-2026 budget draft and a five-year revenue projection. I added some instructions to help interpret the data and I've tested it against a range of queries. This is generative AI, so be cautious about what it says. It's pretty good when it can find the right data and will happily invent things if it can't.
Not that this is limited to chatbots. There was a lot of press yesterday around San Francisco being the worst-run city in the nation based on this 'analysis' from WalletHub. They are dividing a measure of service quality by budget dollars per capita. This fails to take into account that San Francisco is both a city and a county, so the budget includes county services like a Sheriff's department. It also ignores that San Francisco runs an international airport and port, not paid for by taxpayers but still in the budget. And it doesn't adjust for regional differences in income that make services more expensive to provide (and require higher total dollar taxes to provide them). So I'll take an occasional chatbot glitch over a willfully incurious press pushing out a PR piece that tickles their confirmation biases.
Spring starts now (03:07 UTC on March 20, 2024) in the Northern Hemisphere and Autumn for the equatorially challenged. The image above shows the precise moment of the equinox in Catfood Earth.
(Published to the Fediverse as:
Vernal (Spring) Equinox 2024 #code#earth#equinox#spring#autumn#vernal The exact moment (03:07 UTC on March 20, 2024) of Spring Equinox 2024 rendered in Catfood Earth.)
By Robert Ellison. Updated on Thursday, November 28, 2024.
There is a stunningly simple way to get a file out of SharePoint and I'll get to that soon (or just skip to the very end of the post).
I have been automating the shit out of a lot of routine work in Microsoft Teams recently. Teams is the result of Skype and SharePoint having too much to drink at the Microsoft holiday party. It often shows. One annoyance is that channel threads are ordered by the time that someone last responded. Useful for quickly seeing the latest gossip but a pain when you need to keep an eye on each individual thread. After listlessly scrolling around trying to keep up with the flow I came up with a dumb solution - I sync the channel to Obsidian (my choice of note app, could be anything) and then I can just check there for new threads. It's a small convenience but has meaningfully improved my life.
Unfortunately I got greedy. These messages usually have a PowerPoint presentation attached to them, so why not have an LLM summarize it while updating my notes?
It doesn't look like Copilot has a useful API yet. You can build plug-ins, but I don't want to talk to Copilot about presentations; I just want it to do the heavy lifting while I sleep so I can read the summary in the morning. Hopefully in the future there will be a simple way to say hey, Copilot, summarize this PPTX. Not yet.
So the outline of a solution here is: download the presentation, send it to ChatGPT, generate a summary and stick that in Obsidian. This felt like a half-hour type of project. And it should have been - getting GPT-4 Turbo to summarize a PPTX file took about ten minutes. Downloading the file has taken days and sent my self-esteem back to primary school.
You would think that downloading a file would be the Graph API's bread and butter. Especially as I have a ChatMessage from the channel that includes attachments and links. The link is for a logged-in human, but it must be easy to translate from this to an API call, right?
It turns out that all you need is the site ID, the drive ID and the item ID.
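In raw Graph terms the download itself boils down to something like this (the IDs in braces are placeholders):

    GET https://graph.microsoft.com/v1.0/drives/{drive-id}/items/{item-id}/content

The site ID comes into play because that's how you're supposed to find the drive in the first place.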
These IDs are not in the attachment URL or the ChatMessageAttachment. It would be pretty RESTful to include the obvious next resource I'm going to need in that return type. No dice though.
I tried ChatGPT, which helpfully suggested API calls that looked really plausible and helpful but that did not in fact exist. So I then read probably hundreds of blog and forum posts from equally confused and desperate developers. Here is a typical example:
"Now how can I upload and download files to this library with the help of Graph API (GraphServiceClient)."
To which Microsoft, terrifyingly, reply:
"We are currently looking into this issue and will give you an update as soon as possible."
Ignoring the SharePoint part and glossing over where that drive ID is coming from. Other documentation suggests that you can look up your site by its URL, and then download a list of drives to go looking for the right one. Well, the first page of a paginated drive collection anyway, implying that just finding the right ID might get you a call from the quota police.
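On paper that route looks something like this (hostname, path and IDs are placeholders):

    GET https://graph.microsoft.com/v1.0/sites/{hostname}:/{site-path}
    GET https://graph.microsoft.com/v1.0/sites/{site-id}/drives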
I know Microsoft is looking after a lot of files for a lot of organizations, but how can it be this hard?
It isn't. It's just hidden. I eventually found this post from Alex Terentiev that points out that you just need to base64-encode the sharing URL, swap some characters around and then call:
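The encoding is the standard Graph sharing-URL scheme. In C# it looks something like this (variable names are mine, and the URL is a placeholder for the attachment's contentUrl):

    using System;
    using System.Text;

    string sharingUrl = "https://contoso.sharepoint.com/sites/team/Shared%20Documents/deck.pptx";
    string base64Value = Convert.ToBase64String(Encoding.UTF8.GetBytes(sharingUrl));
    string encodedUrl = "u!" + base64Value.TrimEnd('=').Replace('/', '_').Replace('+', '-');

With that in hand, the call is the shares endpoint:

    GET https://graph.microsoft.com/v1.0/shares/{encodedUrl}/driveItem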
If Google was doing its job right this would be the top result. I should be grateful they're still serving results at all and not just telling me that my pastimes are all harmful.
The documentation is here and Microsoft should link to it from every page that discusses drives and DriveItems. For GraphServiceClient, the call to get to an actual stream is:
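Depending on which major version of the SDK you're on, it's one of these (graphClient being your authenticated GraphServiceClient and encodedUrl the "u!" value from above):

    // Microsoft.Graph 4.x
    var stream = await graphClient.Shares[encodedUrl].DriveItem.Content.Request().GetAsync();

    // Microsoft.Graph 5.x drops the Request() step
    var stream = await graphClient.Shares[encodedUrl].DriveItem.Content.GetAsync();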
(Published to the Fediverse as:
Download a Sharepoint File with GraphServiceClient (Microsoft Graph API) #code#ml#graph#sharepoint#c# Everyone developing applications with the Graph API should know about the shares endpoint that allows you to download files easily.)
Winter starts right now for those of us at the top of the planet. It's summertime down under. Winter Solstice 2023 rendered in Catfood Earth (03:28 UTC on December 22, 2023).
(Published to the Fediverse as:
Winter Solstice 2023 #code#winter#solstice#catfood#earth The exact moment of Winter Solstice 2023 (03:28 UTC on December 22) rendered in Catfood Earth.)