I've just launched a redesign of I Thought He Came With You. The main thrust is to make the site more usable on desktops. Which seems nuts, but the data doesn't lie. The site has low mobile traffic and for a while I thought this was some kind of technical issue. I optimized the design heavily for mobile and spent a lot of time on page speed, even adding AMP support. I guess it's the content. Google loves it when I write documentation for them and doesn't think I have anything useful to say on politics. They're probably right. So I've gone back to having an old-school sidebar and I've taken the performance hit of using Bootstrap to get better looking forms and navigation without spending a lot of time on it. I hope you enjoy it, and if you find anything broken please email or leave a comment.
By Robert Ellison. Updated on Saturday, February 19, 2022.
Six 4K images a day at 24 frames per second (so each second is four days) from April 18, 2019 to April 17, 2020:
I made a version of this video a couple of years ago using xplanet clouds. That was lower resolution and only had one frame per day so it's pretty quick. This version uses the new 4K cloud image I developed for Catfood Earth just over a year ago. I've been patiently saving the image six times a day (well, patiently waiting as a script does this for me). It's pretty amazing to see storms developing and careening around the planet. The still frame at the top of the post shows Dorian hitting Florida back in September.
(Published to the Fediverse as:
4K One Year Global Cloud Timelapse #code#software#video#timelapse#animation#clouds#earth Animation of a year of global cloud cover from April 18, 2019 to April 17, 2020. You can see storms developing and careening around the planet. Rendered from six daily 4K images.)
(Published to the Fediverse as:
Starlink Train #photo#starlink#stars Photo of a Starlink Train, three stacked six second exposures of Starlink satellites passing over San Francisco, California.)
Not getting far from home any time soon so all hikes for now will be local. I found a great web app, Routeshuffle, that will generate a random hike from the starting location of your choice. It's a great way of seeing streets you'd never venture down normally. The map is generated with some software I wrote to combine multiple KML files for easy rendering in Google Earth (oh, and Google Earth, I guess that does the heavy lifting). I'll post these every month while we're locked down. March is 15.6 miles total.
(Published to the Fediverse as:
Social Undistancing #photo#coronavirus Composite photograph of many people and cars walking past my house during the Coronavirus lockdown.)
By Robert Ellison. Updated on Saturday, February 12, 2022.
This post describes how to get metrics (in this case average response time) from an Azure App Service into a Google Sheet. I’m doing this so I can go from the sheet to a Data Studio dashboard. I already have a report in Data Studio that pulls from Ads, Analytics and other sources. I’d rather spend hours adding Azure there than be forced to have one more tab open. You might have different reasons. Read on.
1. Create a Google Sheet and give it a memorable name. Rename the first sheet to AvgResponseTime and put ‘Date’ in A1 and ‘Average Response Time’ in B1.
2. Create a script (Script editor from the Tools menu) and give that a good name as well.
3. In the script editor pick Libraries from the Resources menu. Enter 1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF, which is the OAuth2 for Apps Script library, pick the latest version and click Save.
4. Select Project properties from the File menu and make a note of the Script ID.
5. Log into your Azure Console and then go to https://resources.azure.com/. You are looking for a metricdefinitions node for the resource that you want to monitor. In my case it’s subscriptions / MyName Subscription / resourceGroups / providers / Microsoft.Web / sites / MySite / metricdefinitions. Browse through this list to find the id of the metric you’re interested in. For me it’s AverageResponseTime. Finding this was the hardest part. Microsoft’s documentation for resourceUri is literally ‘The identifier of the resource.’ Why even bother, Microsoft? Make a note of the id and remove ‘metricDefinitions/AverageResponseTime’ from the end, because of course the ID isn’t quite right for some reason. Mine looks something like this: /subscriptions/mylongid/resourceGroups/mysomethingResourceGroup/providers/Microsoft.Web/sites/mysiteid
6. Go back to the Azure Console and open Azure Active Directory. Select App registrations under Manage and create a New registration. Time to come up with another name. You probably want ‘Accounts in this organizational directory only’. The redirect URL is https://script.google.com/macros/d/SCRIPTID/usercallback - replace SCRIPTID with the Script ID you made a note of in step 4.
7. Click the View API permissions button, then Add a permission and then pick Azure Service Management. I’m using Delegated permissions and the user_impersonation permission. Then click Grant admin consent for Default Directory.
8. Go to Certificates & secrets (under Manage) and create a new client secret. Make a careful note of it.
9. Go to Authentication (under Manage), check Access tokens under Implicit grant and then click Save at the top of the page.
10. Go to Overview and make a note of your Application (client) ID and Directory (tenant) ID.
11. You are getting close! Go to the script editor (from step 2) and paste in the code at the bottom of this post. There are four variables to enter at the top of the script. ClientID and TenantID are from step 10. ClientSecret is from step 8. ResourceID is from step 5. Save the script.
12. Reload the spreadsheet (back from step 1). You should get an Azure Monitor menu item. Choose Authorize from this menu. Google will ask you to authorize the script; do this for the Google account you’re using. Choose Authorize again; this time a sidebar will appear with a link. Follow the link and authorize against Azure (if you’re logged in this might just go directly to success). If you get authorization errors in the future run this step again. If that doesn't help use Reset Settings and then try again.
13. You should be ready to get data. Choose Fetch Data from the Azure Monitor menu. If this doesn’t work check through steps 1-12 carefully again!
14. Last step - automate. Go back to the script editor. Choose Current project’s triggers from the Edit menu. Add a trigger (the small blue button hidden at the bottom right of the screen - not everything needs a floating action button, Google!) to run fetchData daily at some reasonable time.
You should now have a daily record of average response time flowing to a Google sheet. This can easily be extended to support other metrics and other time periods (you could get data by minute and run the script hourly, for instance). See the metrics documentation for more ideas. I got this working for an App Service but it should be roughly the same flow for anything that supports metrics; you’ll just need to work on finding the right resourceUri / ID.
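Here is a minimal sketch of what the script can look like. It assumes the OAuth2 for Apps Script library from step 3 is available as OAuth2 and uses the v2.0 Azure AD endpoints; the variable names at the top (ClientID, ClientSecret, TenantID, ResourceID), the function names (authorize, authCallback, fetchData, resetSettings) and the API version are placeholders matching the steps above, and the scope and error handling may need adjusting for your tenant.

// Sketch only - fill in the four values from steps 5, 8 and 10.
var ClientID = 'your-application-client-id';
var ClientSecret = 'your-client-secret';
var TenantID = 'your-directory-tenant-id';
var ResourceID = '/subscriptions/.../resourceGroups/.../providers/Microsoft.Web/sites/...';

// Adds the Azure Monitor menu when the spreadsheet opens.
function onOpen() {
  SpreadsheetApp.getUi()
    .createMenu('Azure Monitor')
    .addItem('Authorize', 'authorize')
    .addItem('Fetch Data', 'fetchData')
    .addItem('Reset Settings', 'resetSettings')
    .addToUi();
}

// Builds the OAuth2 service against the Azure AD tenant.
function getService() {
  return OAuth2.createService('AzureMonitor')
    .setAuthorizationBaseUrl('https://login.microsoftonline.com/' + TenantID + '/oauth2/v2.0/authorize')
    .setTokenUrl('https://login.microsoftonline.com/' + TenantID + '/oauth2/v2.0/token')
    .setClientId(ClientID)
    .setClientSecret(ClientSecret)
    .setCallbackFunction('authCallback')
    .setPropertyStore(PropertiesService.getUserProperties())
    .setScope('https://management.azure.com/user_impersonation offline_access');
}

// Shows a sidebar with the Azure authorization link if access hasn't been granted yet.
function authorize() {
  var service = getService();
  if (!service.hasAccess()) {
    var url = service.getAuthorizationUrl();
    var html = HtmlService.createHtmlOutput('<a href="' + url + '" target="_blank">Authorize with Azure</a>');
    SpreadsheetApp.getUi().showSidebar(html);
  }
}

// Handles the OAuth2 redirect (the /usercallback URL registered in step 6).
function authCallback(request) {
  var granted = getService().handleCallback(request);
  return HtmlService.createHtmlOutput(granted ? 'Success, you can close this tab.' : 'Access denied.');
}

// Clears stored tokens so authorization can be run again from scratch.
function resetSettings() {
  getService().reset();
}

// Fetches the last day's average response time and appends it to the AvgResponseTime sheet.
function fetchData() {
  var service = getService();
  if (!service.hasAccess()) {
    throw new Error('Not authorized - run Authorize from the Azure Monitor menu first.');
  }

  var end = new Date();
  var start = new Date(end.getTime() - 24 * 60 * 60 * 1000);
  var url = 'https://management.azure.com' + ResourceID +
    '/providers/Microsoft.Insights/metrics?api-version=2018-01-01' +
    '&metricnames=AverageResponseTime&aggregation=Average&interval=P1D' +
    '&timespan=' + start.toISOString() + '/' + end.toISOString();

  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + service.getAccessToken() }
  });
  var data = JSON.parse(response.getContentText()).value[0].timeseries[0].data;

  // Use the most recent data point that actually has an average value.
  for (var i = data.length - 1; i >= 0; i--) {
    if (data[i].average !== undefined) {
      SpreadsheetApp.getActive()
        .getSheetByName('AvgResponseTime')
        .appendRow([data[i].timeStamp, data[i].average]);
      break;
    }
  }
}

Tokens are kept in the user property store, so once the sidebar authorization from step 12 succeeds the daily trigger should be able to refresh them without any further interaction.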
(Published to the Fediverse as:
Using the Azure Monitor REST API from Google Apps Script #code#azure#appsscript#gas#google#microsoft How to call the Azure Monitor REST API via OAuth from Google Apps Script. Worked example shows how to log average response time for an Azure App Service.)
Mostly read this in a deserted nightclub on the roof of a hotel in India with a warm but dirty breeze and some trippy chilled-out music. It's the only way to read this strange book. Anywhere else I might not have enjoyed it quite as much after the first few chapters, but in the setting it was transcendental.
The Volume Shadow Copy Service (VSS) can be used to mount a copy of a drive in a crash consistent state (like you just unplugged your computer) as a different drive letter. My shadow task command line tool makes it easy to do this. Here's an example:
ShadowTask64 C V test.bat
This creates a shadow copy of the C: drive, mounts it as V: and then runs test.bat. When test.bat completes the shadow V: drive is removed.
What can you do with this?
The simple case is grabbing a copy of a locked file. In this case test.bat could just run pause so you can grab the file you need from V: before the shadow copy is removed.
It's also a great way to run a backup. In this case the batch file can run XCOPY (to a network share or portable drive) and you'll get all your files including anything locked like running executables or Outlook PST files.
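For example, a backup batch file might look something like the sketch below. The source folder and the E:\Backup destination are placeholders - point XCOPY at whatever you want to copy and wherever you want it to go.

@echo off
REM Runs inside the shadow copy session started by: ShadowTask64 C V backup.bat
REM V: is the crash-consistent shadow copy of C:, E:\Backup is a placeholder destination
XCOPY "V:\Users" "E:\Backup\Users" /E /C /H /I /Y

The /E /C /H switches copy subdirectories, keep going on errors and include hidden and system files, so locked items like PST files come across from the shadow copy intact.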
Grab the latest from GitHub: abfo/shadow/releases. You'll find a ZIP file and source code if you want to use/extend this. There are 32 and 64-bit binaries - you need to use the right one for your computer and you also need to run the tool with administrative privileges. This only works with NTFS drives.
(Published to the Fediverse as:
Backup locked files on Windows 10: Volume Shadow Copy Update #code#vss#backup Command line tool that mounts a shadow copy of any NTFS drive in Windows 10 so you can grab a locked file or run a complete crash-consistent XCOPY backup.)