For some reason I have set myself the task of completing the Scot Harvath series. They're OK thrillers, lessened by reading them back to back as they're fundamentally very similar. This one is about a deliberate pandemic, so it's a good time to read it. Nearly there...
The Last Day by Andrew Hunter Murray
I picked this up because I listen to the No Such Thing As A Fish podcast, co-hosted by the author. I wasn't expecting much, but it's really quite good (and not at all funny, more tense and melancholy). I want to find out what happens next.
I've just launched a redesign of I Thought He Came With You. The main thrust is to make the site more usable on desktops. Which seems nuts, but the data doesn't lie. The site has low mobile traffic and for a while I thought this was some kind of technical issue. I optimized the design heavily for mobile and spent a lot of time on speed, including some AMP support. I guess it's the content. Google loves it when I write documentation for them and doesn't think I have anything useful to say on politics. They're probably right. So I've gone back to having an old school sidebar, and I've taken the performance hit of using Bootstrap to get better looking forms and navigation without spending a lot of time on it. I hope you enjoy it, and if you find anything broken please email or leave a comment.
Six 4K images a day at 24 frames per second (so each second is four days) from April 18, 2019 to April 17, 2020:
I made a version of this video a couple of years ago using xplanet clouds. That was lower resolution and only had one frame per day so it's pretty quick. This version uses the new 4K cloud image I developed for Catfood Earth just over a year ago. I've been patiently saving the image six times a day (well, patiently waiting as a script does this for me). It's pretty amazing to see storms developing and careening around the planet. The still frame at the top of the post shows Dorian hitting Florida back in September.
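For the curious, the timing arithmetic works out like this (the 366-day count is my own, assuming the date range is inclusive and spans the 2020 leap day):

```javascript
// Six frames captured per day, played back at 24 frames per second.
const framesPerDay = 6;
const fps = 24;
const daysPerSecond = fps / framesPerDay;                  // 4 days pass per second of video
const totalDays = 366;                                     // 2019-04-18 through 2020-04-17
const durationSeconds = (totalDays * framesPerDay) / fps;  // about a minute and a half
```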
Not getting far from home any time soon so all hikes for now will be local. I found a great web app, Routeshuffle, that will generate a random hike from the starting location of your choice. It's a great way of seeing streets you'd never venture down normally. The map is generated with some software I wrote to combine multiple KML files for easy rendering in Google Earth (oh, and Google Earth, I guess that does the heavy lifting). I'll post these every month while we're locked down. March is 15.6 miles total.
This post describes how to get metrics (in this case average response time) from an Azure App Service into a Google Sheet. I’m doing this so I can go from the sheet to a Data Studio dashboard. I already have a report in Data Studio that pulls from Ads, Analytics and other sources. I’d rather spend hours adding Azure there than be forced to have one more tab open. You might have different reasons. Read on.
Create a Google Sheet and give it a memorable name. Rename the first sheet to AvgResponseTime and put ‘Date’ in A1 and ‘Average Response Time’ in B1.
Create a script (Script editor from the Tools menu) and give that a good name as well.
In the script editor pick Libraries from the Resources menu. Enter 1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF, which is the project key for the OAuth2 for Apps Script library, pick the latest version and click Save.
Select Project properties from the File menu and make a note of the Script ID.
Log into your Azure Console and then go to https://resources.azure.com/. You are looking for a metricDefinitions node for the resource that you want to monitor. In my case it’s subscriptions / MyName Subscription / resourceGroups / providers / Microsoft.Web / sites / MySite / metricDefinitions. Browse through this list to find the id of the metric you’re interested in. For me it’s AverageResponseTime. Finding this was the hardest part. Microsoft’s documentation for resourceUri is literally ‘The identifier of the resource.’ Why even bother, Microsoft? Make a note of the id and remove the ‘metricDefinitions/AverageResponseTime’ from the end, because of course the ID isn’t quite right for some reason. Mine looks something like this: /subscriptions/mylongid/resourceGroups/mysomethingResourceGroup/providers/Microsoft.Web/sites/mysiteid
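To illustrate the trimming, here's a small helper (the function name is my own, not from the script in this post) that strips the trailing metricDefinitions segment to recover the bare resource ID:

```javascript
// Hypothetical helper: strip the trailing "metricDefinitions/<MetricName>"
// segment from the id shown at resources.azure.com, leaving the bare
// resource ID the script needs.
function toResourceUri(metricDefinitionId) {
  return metricDefinitionId.replace(/\/metricDefinitions\/[^/]+$/, '');
}
```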
Go back to the Azure Console and open Azure Active Directory. Select App registrations under Manage and create a New registration. Time to come up with another name. You probably want ‘Accounts in this organizational directory only’. The Redirect URI is https://script.google.com/macros/d/SCRIPTID/usercallback - replace SCRIPTID with the Script ID you made a note of in step 4.
Click the View API permissions button, then Add a permission and then pick Azure Service Management. I’m using Delegated permissions and the user_impersonation permission. Then click Grant admin consent for Default Directory.
Go to Certificates & secrets (under manage) and create a new client secret. Make a careful note of it.
Go to Authentication (under Manage), check Access tokens under Implicit grant and then click Save at the top of the page.
Go to Overview and make a note of your Application (client) ID and Directory (tenant) ID.
You are getting close! Go to the script editor (from step 2) and paste in the code at the bottom of this post. There are four variables to enter at the top of the script. ClientID and TennantID are from step 10. ClientSecret is from step 8. ResourceID is from step 5. Save the script.
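For context on what the script does under the hood, the request it builds targets the Azure Monitor metrics REST endpoint. Here's a sketch (the helper name and parameter choices are illustrative; the endpoint and api-version come from the Azure Monitor REST API):

```javascript
// Sketch of the request URL the script assembles. The management.azure.com
// metrics endpoint and api-version are real; the function name is mine.
function buildMetricsUrl(resourceId, metricName, timespan) {
  var base = 'https://management.azure.com' + resourceId +
      '/providers/microsoft.insights/metrics';
  var params = [
    'api-version=2018-01-01',
    'metricnames=' + encodeURIComponent(metricName),
    'timespan=' + encodeURIComponent(timespan), // start/end in ISO 8601
    'interval=P1D',         // one aggregated point per day
    'aggregation=Average'   // average over each interval
  ];
  return base + '?' + params.join('&');
}
```

Roughly speaking, the script issues a GET against a URL like this with a Bearer token obtained via the OAuth2 library and appends the returned values to the sheet.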
Reload the spreadsheet (back from step 1). You should get an Azure Monitor menu item. Choose Authorize from this menu. Google will ask you to authorize the script; do this for the Google account you’re using. Choose Authorize again, and this time a sidebar will appear with a link. Follow the link and authorize against Azure (if you’re logged in this might just go directly to success). If you get authorization errors in the future, run this step again. If that doesn’t help, use Reset Settings and then try again.
You should be ready to get data. Choose Fetch Data from the Azure Monitor menu. If this doesn’t work check through steps 1-12 carefully again!
Last step - automate. Go back to the script editor. Choose Current project’s triggers from the Edit menu. Add a trigger (the small blue button hidden at the bottom right of the screen - not everything needs a floating action button Google!) to run fetchData daily at some reasonable time.
You should now have a daily record of average response time flowing to a Google sheet. This can easily be extended to support other metrics, and other time periods (you could get data by the minute and run the script hourly, for instance). See the metrics documentation for more ideas. I got this working for an App Service but it should be roughly the same flow for anything that supports metrics, you’ll just need to work on finding the right resourceUri / ID.
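To extend it, only the query parameters need to change for other metrics or granularities. A sketch (the metric names below are examples of what App Service exposes; interval values are ISO 8601 durations):

```javascript
// Example parameter sets (illustrative): daily average response time vs.
// per-minute request counts.
var daily = { metricnames: 'AverageResponseTime', interval: 'P1D', aggregation: 'Average' };
var perMinute = { metricnames: 'Requests', interval: 'PT1M', aggregation: 'Total' };

// Turn a parameter object into a query string fragment.
function toQuery(params) {
  return Object.keys(params)
    .map(function (k) { return k + '=' + encodeURIComponent(params[k]); })
    .join('&');
}
```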
Mostly read this in a deserted nightclub on the roof of a hotel in India with a warm but dirty breeze and some trippy chilled out music. It's the only way to read this strange book. Anywhere else I might not have enjoyed it quite as much after the first few chapters, but in the setting it was transcendental.