California Climate Credit

Updated on Sunday, November 20, 2022

Illustration of money in a burned out forest

Once in a while I'm stupid enough to read my email. This month I'm getting a climate credit! Must have done something right? No:

"The California Climate Credit is part of California’s efforts to fight climate change. This credit is from a state program that requires power plants, natural gas providers, and other large industries that emit greenhouse gases to buy carbon pollution permits. The credit on your bill is your share of the payments from the State’s program."

So... apparently part of fighting climate change is making my energy bill randomly cheaper?

It's hard to think of anything less likely to help. Just as I'm starting to feel the pain of winter bills, I'm paying slightly less, and so I'm slightly less inclined to turn down the heating or finally do something about my beautiful but effectively absent front windows.

A problem with carbon taxation is that it's regressive. So why not use this money to make the first $xx cheaper, and maybe even charge more at the high end of usage?


(Published to the Fediverse as: California Climate Credit #etc #gas #electricity #climatechange #california Why do I get a California Climate Credit? This is the worst possible way to fight climate change. Do something smarter California! )

Pandemic Gas Mystery

Updated on Sunday, August 6, 2023

For no good reason I downloaded my gas and electricity consumption data by day for the last couple of years.

Electricity usage in kWh 2020 vs 2019 7 day moving average

The electricity trend is unsurprising. At the start of the pandemic it jumps up and stays up. With work and school from home we're running four computers non-stop, burning lights and (the horror) printing things. Overall we used 24% more electricity in 2020.

Gas usage in therms 2020 vs 2019 7 day moving average

Gas on the other hand is pretty flat. There are some different peaks at the start and end of the year, but our total gas consumption increased by 0.08%. This doesn't make any sense to me. Being at home doesn't make much of a difference to laundry but it should have had a big impact on everything else. The heating has been on way more, we're cooking breakfasts and lunches that would have occurred out of the house in 2019 and we must be using more hot water as well.

There is one strange difference between how electricity and gas are metered. Fractional kWh are distributed randomly between .00 and .99 as you'd expect. Fractional therms are totally different - we're apparently likely to use 1.02 or 2.03 therms but never 1.50. This feels like it must be some sort of rounding or other billing oddness but I can't find any reasonable explanation despite asking Google three different ways.

Fractional therms billed distribution for PG&E billing days in 2019 and 2020

In a move that I might come to bitterly regret I have emailed PG&E to see if they can explain it. I'll update this post if I hear back. Or if you're a therm metering expert please leave a comment!

Updated 2021-02-20 13:51:

PG&E say:

"Thank you for contacting our Customer Service Center. Gas usage is registered by recording therms usage.  If you view your daily usage online, you will see that therms are only registered in whole units.  The only pace that you will see therms not as whole units is when you review the average daily usage.  The pandemic started in March 2020 and since then your gas usage is up slightly versus previous years. Most customers will see a larger increase in electric usage versus gas usage when staying home more than normal.  The majority of customers set the tempatures of the their heaters to very similar temperatures year over year and your heater will work to keep your house at the temperature whether you are home or not at home."

So the fractional therms are some sort of odd rounding on the downloaded data. Fair enough.

The majority of customers use the same temperature setting? Really? So that might be a good explanation if you constantly heat your house to the same temperature, but I know for sure that isn't us. We have a Nest Learning Thermostat and as I've previously reported this doesn't so much learn as just constantly turn the heating off. So staying warm is a constant battle with the thing.

Maybe the difference is that the pandemic started around Spring when San Francisco is warm enough to not need much heating. I'll look again when I can just compare winter vs winter in a couple of months.

Updated 2023-08-06 18:11:


Took a while to update, but here is some more data. Electricity stayed high until Spring 2021 and then dropped to roughly pre-pandemic levels. This is because I spent a lot of time in 2021 upgrading lighting. My house has a different type of fixture/bulb in every room making this a painful process but I'm almost 100% LED at this point which has made a difference. Gas on the other hand has got higher and stayed there and I should really replace some more windows and add some more insulation...

I really wish the utility companies made this data available through some useful API instead of needing to download the occasional CSV. I'd build a dashboard and obsess over energy usage far more.


(Published to the Fediverse as: Pandemic Gas Mystery #etc #coronavirus #gas #electricity Why is my gas bill flat in 2020 when electricity usage has gone up 24%? A pandemic gas mystery based on PG&E data. )

Using the Azure Monitor REST API from Google Apps Script

Updated on Saturday, February 12, 2022

Average Server Response Time in Azure Metrics

This post describes how to get metrics (in this case average response time) from an Azure App Service into a Google Sheet. I’m doing this so I can go from the sheet to a Data Studio dashboard. I already have a report in Data Studio that pulls from Ads, Analytics and other sources. I’d rather spend hours adding Azure there than be forced to have one more tab open. You might have different reasons. Read on. 

  1. Create a Google Sheet and give it a memorable name. Rename the first sheet to AvgResponseTime and put ‘Date’ in A1 and ‘Average Response Time’ in B1.
  2. Create a script (Script editor from the Tools menu) and give that a good name as well.
  3. In the script editor pick Libraries from the Resources menu. Enter 1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF which is the Google OAuth library, pick the latest version and click Save.
  4. Select Project properties from the File menu and make a note of the Script ID.
  5. Log into your Azure Console and then go to https://resources.azure.com/. You are looking for a metricdefinitions node for the resource that you want to monitor. In my case it’s subscriptions / MyName Subscription / resourceGroups / providers / Microsoft.Web / sites / MySite / metricdefinitions. Browse through this list to find the id of the metric you’re interested in. For me it’s AverageResponseTime. Finding this was the hardest part. Microsoft’s documentation for resourceUri is literally ‘The identifier of the resource.’ Why even bother Microsoft? Make a note of the id and remove the ‘metricDefinitions/AverageResponseTime’ from the end, because of course the ID isn’t quite right for some reason. Mine looks something like this: /subscriptions/mylongid/resourceGroups/mysomethingResourceGroup/providers/Microsoft.Web/sites/mysiteid
  6. Go back to the Azure Console and open Azure Active Directory. Select App registrations under Manage and create a New registration. Time to come up with another name. You probably want ‘Accounts in this organizational directory only’. The redirect URL is https://script.google.com/macros/d/SCRIPTID/usercallback - replace SCRIPTID with the Script ID you made a note of in step 4.
  7. Click the View API permissions button, then Add a permission and then pick Azure Service Management. I’m using Delegated permissions and the user_impersonation permission. Then click Grant admin consent for Default Directory.
  8. Go to Certificates & secrets (under manage) and create a new client secret. Make a careful note of it.
  9. Go to Authentication (under Manage), check Access tokens under Implicit grant and then click Save at the top of the page.
  10. Go to Overview and make a note of your Application (client) ID and Directory (tenant) ID.
  11. You are getting close! Go to the script editor (from step 2) and paste in the code at the bottom of this post. There are four variables to enter at the top of the script. ClientID and TennantID are from step 10. ClientSecret is from step 8. ResourceID is from step 5. Save the script.
  12. Reload the spreadsheet (back from step 1). You should get an Azure Monitor menu item. Choose Authorize from this menu. Google will ask you to authorize the script, do this for the Google account you’re using. Choose Authorize again, this time a sidebar will appear with a link. Follow the link and authorize against Azure (if you’re logged in this might just go directly to success). If you get authorization errors in the future run this step again. If that doesn't help use Reset Settings and then try again.
  13. You should be ready to get data. Choose Fetch Data from the Azure Monitor menu. If this doesn’t work check through steps 1-12 carefully again!
  14. Last step - automate. Go back to the script editor. Choose Current project’s triggers from the Edit menu. Add a trigger (the small blue button hidden at the bottom right of the screen - not everything needs a floating action button Google!) to run fetchData daily at some reasonable time.
You should now have a daily record of average response time flowing to a Google sheet. This can easily be extended to support other metrics, and other time periods (you could get data by minute and run the script hourly, for instance). See the metrics documentation for more ideas. I got this working for an App Service but it should be roughly the same flow for anything that supports metrics; you’ll just need to work on finding the right resourceUri / ID.
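
Here's a sketch of the shape the script from step 11 takes. It uses the standard OAuth2 library calls and the Azure Monitor metrics REST endpoint (api-version 2018-01-01, daily Average aggregation); treat the exact scope, request parameters and response parsing as a starting point rather than gospel, and change the metric name if you're monitoring something other than AverageResponseTime.

```javascript
// Fill these in (steps 5, 8 and 10).
var ClientID = 'YOUR_APPLICATION_CLIENT_ID';
var TennantID = 'YOUR_DIRECTORY_TENANT_ID';
var ClientSecret = 'YOUR_CLIENT_SECRET';
var ResourceID = '/subscriptions/.../resourceGroups/.../providers/Microsoft.Web/sites/...';

function onOpen() {
  SpreadsheetApp.getUi()
    .createMenu('Azure Monitor')
    .addItem('Authorize', 'authorize')
    .addItem('Fetch Data', 'fetchData')
    .addItem('Reset Settings', 'resetSettings')
    .addToUi();
}

// Configure the OAuth2 library (step 3) against Azure AD.
function getAzureService_() {
  return OAuth2.createService('azure')
    .setAuthorizationBaseUrl('https://login.microsoftonline.com/' + TennantID + '/oauth2/v2.0/authorize')
    .setTokenUrl('https://login.microsoftonline.com/' + TennantID + '/oauth2/v2.0/token')
    .setClientId(ClientID)
    .setClientSecret(ClientSecret)
    .setScope('https://management.azure.com/user_impersonation offline_access')
    .setCallbackFunction('authCallback')
    .setPropertyStore(PropertiesService.getUserProperties());
}

// Shows a sidebar with the Azure authorization link (step 12).
function authorize() {
  var service = getAzureService_();
  var link = '<a href="' + service.getAuthorizationUrl() + '" target="_blank">Authorize with Azure</a>';
  SpreadsheetApp.getUi().showSidebar(HtmlService.createHtmlOutput(link).setTitle('Azure Monitor'));
}

function authCallback(request) {
  var authorized = getAzureService_().handleCallback(request);
  return HtmlService.createHtmlOutput(authorized ? 'Success, you can close this tab.' : 'Access denied.');
}

function resetSettings() {
  getAzureService_().reset();
}

// Fetches the latest daily average response time and appends it to the sheet (step 1).
function fetchData() {
  var service = getAzureService_();
  if (!service.hasAccess()) {
    throw new Error('Not authorized - run Authorize from the Azure Monitor menu.');
  }
  var url = 'https://management.azure.com' + ResourceID +
    '/providers/microsoft.insights/metrics?api-version=2018-01-01' +
    '&metricnames=AverageResponseTime&interval=P1D&aggregation=Average';
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + service.getAccessToken() }
  });
  var data = JSON.parse(response.getContentText()).value[0].timeseries[0].data;
  var latest = data[data.length - 1];
  SpreadsheetApp.getActive()
    .getSheetByName('AvgResponseTime')
    .appendRow([latest.timeStamp, latest.average]);
}
```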


(Published to the Fediverse as: Using the Azure Monitor REST API from Google Apps Script #code #azure #appsscript #gas #google #microsoft How to call the Azure Monitor REST API via OAuth from Google Apps Script. Worked example shows how to log average response time for an Azure App Service. )

Email Alerts for new Referers in Google Analytics using Apps Script

Updated on Monday, February 13, 2023

Referral Traffic in Google Analytics

It's useful to know when you have a new website referrer. Google Analytics is plagued with referral spam and you want to filter this out of reporting as quickly as possible to stop it from skewing your data. It's also helpful to be able to respond quickly to new referral traffic - maybe leave a comment or promote the new link on social media.

The script below will send you a daily email with links to any new referrers (this is GA3, there is a GA4 version later in this post).

Start a new apps script project in Google Drive and paste in the code. At the top enter the view ID that you want to monitor and the email address that should receive reports.

Choose Advanced Google Services from the Resources menu and switch on the Google Analytics API. Then click the Google API Console link and enable the Google Analytics API there as well.

Finally pick Current project's triggers from the Edit menu and trigger the main function daily at a convenient time.

This script saves known referrers in script properties. For a site with lots of traffic this may run out of space in which case you might need to switch this out and write known referrers to a sheet instead.
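
Here's a sketch of the GA3 script. It uses the standard Analytics advanced service call and Core Reporting dimension names; the known referrer list lives in script properties as described above, and the email formatting is deliberately basic, so adapt it to taste.

```javascript
var VIEW_ID = 'ga:XXXXXXXX';          // the view ID to monitor
var ALERT_EMAIL = 'you@example.com';  // where to send the report

function main() {
  var props = PropertiesService.getScriptProperties();
  var known = JSON.parse(props.getProperty('knownReferrers') || '[]');
  // Pull yesterday's referral traffic from the GA3 reporting API.
  var report = Analytics.Data.Ga.get(VIEW_ID, 'yesterday', 'yesterday', 'ga:sessions', {
    dimensions: 'ga:fullReferrer',
    filters: 'ga:medium==referral'
  });
  var rows = report.rows || [];
  var newReferrers = [];
  rows.forEach(function (row) {
    var referrer = row[0];
    if (known.indexOf(referrer) === -1) {
      known.push(referrer);
      newReferrers.push(referrer);
    }
  });
  if (newReferrers.length > 0) {
    MailApp.sendEmail(ALERT_EMAIL, 'New referrers',
      newReferrers.map(function (r) { return 'http://' + r; }).join('\n'));
    props.setProperty('knownReferrers', JSON.stringify(known));
  }
}
```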

For Google Analytics 4 properties use the version of the script below. The setup process is the same, but you need the Google Analytics Data API instead of the Google Analytics API.
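
And a sketch of the GA4 version using the AnalyticsData advanced service. Here sessionSource / sessionMedium stand in for the GA3 full referrer dimension, which is only an approximation; adjust the dimensions if you want the full referring URL.

```javascript
var PROPERTY_ID = 'XXXXXXXXX';        // GA4 property ID (numbers only)
var ALERT_EMAIL = 'you@example.com';  // where to send the report

function main() {
  var props = PropertiesService.getScriptProperties();
  var known = JSON.parse(props.getProperty('knownReferrers') || '[]');
  // Pull yesterday's traffic sources from the GA4 Data API.
  var report = AnalyticsData.Properties.runReport({
    dateRanges: [{ startDate: 'yesterday', endDate: 'yesterday' }],
    dimensions: [{ name: 'sessionSource' }, { name: 'sessionMedium' }],
    metrics: [{ name: 'sessions' }]
  }, 'properties/' + PROPERTY_ID);
  var rows = report.rows || [];
  var newReferrers = [];
  rows.forEach(function (row) {
    var source = row.dimensionValues[0].value;
    var medium = row.dimensionValues[1].value;
    if (medium === 'referral' && known.indexOf(source) === -1) {
      known.push(source);
      newReferrers.push(source);
    }
  });
  if (newReferrers.length > 0) {
    MailApp.sendEmail(ALERT_EMAIL, 'New referrers', newReferrers.join('\n'));
    props.setProperty('knownReferrers', JSON.stringify(known));
  }
}
```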


(Published to the Fediverse as: Email Alerts for new Referers in Google Analytics using Apps Script #code #googleanalytics #appsscript #gas #ga4 Apps script that will email you any new referral traffic from Google Analytics. Useful for responding to new links and referrer spam. GA3 and GA4 versions. )

Get an email if your site stops being mobile friendly (no longer available)

Updated on Tuesday, December 5, 2023

Google axed this tool today, so the script won't work any more. If you're looking for a replacement check out my Core Web Vitals script.


Google just released an API for the mobile friendly test and so I've whipped up a script to send an alert if a web page violates their guidelines. This will run the test as often as you like and send you an email if it detects a problem. Alternatively, if you're not mobile friendly it will keep emailing you until you fix any problems, which might be a good motivational tool.

First start a new apps script project in Drive and paste in the code below:
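
Since the API has been shut down this is only of historical interest, but the shape of it was roughly the sketch below: a single POST to the mobileFriendlyTest:run endpoint with the API key, then an email if the result isn't MOBILE_FRIENDLY. The endpoint and field names are from the API as it was documented before retirement.

```javascript
var urlToMonitor = 'https://www.example.com/';  // full URL of the page to test
var alertEmail = 'you@example.com';             // who gets pestered
var runTestKey = 'YOUR_API_KEY';                // API key from the Google API Console

function mobileFriendlyMonitor() {
  // Run the (now discontinued) mobile friendly test for the page.
  var endpoint = 'https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run?key=' + runTestKey;
  var response = UrlFetchApp.fetch(endpoint, {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify({ url: urlToMonitor })
  });
  var result = JSON.parse(response.getContentText());
  if (result.mobileFriendliness !== 'MOBILE_FRIENDLY') {
    var issues = (result.mobileFriendlyIssues || [])
      .map(function (issue) { return issue.rule; })
      .join('\n');
    MailApp.sendEmail(alertEmail, 'Mobile friendly test failed for ' + urlToMonitor,
      'Status: ' + result.mobileFriendliness + '\n' + issues);
  }
}
```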

There are three variables you need to set, urlToMonitor is the full URL of the page to test, alertEmail is your email address (or whoever needs to be pestered) and runTestKey is the API key for the service. To get this go to the Google API Console, click Enable API, search for 'Google Search Console URL Testing Tools API' and click enable. Then click the Credentials option and generate a browser key.

Once you've configured the script choose 'Current project's triggers' from the Resources menu in apps script and set up a schedule for the mobileFriendlyMonitor() function.


(Published to the Fediverse as: Get an email if your site stops being mobile friendly (no longer available) #code #mobile #appsscript #gas Use Google Apps Script and the Mobile Friendly Test API to constantly monitor your site for any violations. )

Automate Google PageSpeed Insights and Core Web Vitals Logging with Apps Script

Updated on Friday, September 30, 2022


Here's a quick script to automatically monitor your Google PageSpeed Insights desktop and mobile scores for a web page, together with core web vitals (LCP, FID and CLS):
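
Something like this sketch works against version 5 of the API. MONITOR_URL and PSI_API_KEY are just placeholder names, and error handling is minimal, so treat it as a starting point.

```javascript
var MONITOR_URL = 'https://www.example.com/';  // page to monitor
var PSI_API_KEY = 'YOUR_API_KEY';              // PageSpeed Insights API key

function monitor() {
  var row = [];
  ['desktop', 'mobile'].forEach(function (strategy) {
    var endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
      '?url=' + encodeURIComponent(MONITOR_URL) +
      '&strategy=' + strategy +
      '&key=' + PSI_API_KEY;
    var result = JSON.parse(UrlFetchApp.fetch(endpoint).getContentText());
    // Overall PageSpeed score out of 100.
    row.push(result.lighthouseResult.categories.performance.score * 100);
    // 75th percentile origin level core web vitals (swap to loadingExperience for page level).
    var metrics = (result.originLoadingExperience || {}).metrics || {};
    row.push(metrics.LARGEST_CONTENTFUL_PAINT_MS ? metrics.LARGEST_CONTENTFUL_PAINT_MS.percentile / 1000 : '');
    row.push(metrics.FIRST_INPUT_DELAY_MS ? metrics.FIRST_INPUT_DELAY_MS.percentile / 1000 : '');
    row.push(metrics.CUMULATIVE_LAYOUT_SHIFT_SCORE ? metrics.CUMULATIVE_LAYOUT_SHIFT_SCORE.percentile : '');
  });
  // Appends Desktop PSI, LCP, FID, CLS then Mobile PSI, LCP, FID, CLS.
  SpreadsheetApp.getActive().getSheetByName('results').appendRow(row);
}
```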

You need a spreadsheet with a tab called results and an API key for PageSpeed Insights (activate the API in the console and create an API key for it, the browser based / JavaScript option). Paste the code above into the script editor for the spreadsheet and add your API key and URL to monitor. Then just choose triggers from the Resources menu and schedule the monitor function to run once per day.

The script will log the overall PageSpeed score out of 100 for the monitored page. It also logs 75th percentile origin level core web vitals (largest contentful paint (LCP, seconds), first input delay (FID, seconds) and cumulative layout shift (CLS, percent)). If your origin does not have enough data the metric will be omitted. You can change from origin to page level web vitals if you have enough data, just change originLoadingExperience to loadingExperience in the script.

The results are repeated for desktop and mobile, so your spreadsheet header should be Desktop PSI, Desktop LCP, Desktop FID, Desktop CLS, Mobile PSI, Mobile LCP, Mobile FID, Mobile CLS.

There are a lot of other values returned (like number and types of resources on the page) that you could choose to monitor as well. It would also be easy to extend this to monitor more URLs, or to send you an email if the score drops below a threshold.

Updated May 5, 2019 to use version 5 of the PageSpeed API.

Updated June 13, 2021 to include core web vitals.


(Published to the Fediverse as: Automate Google PageSpeed Insights and Core Web Vitals Logging with Apps Script #code #google #appsscript #gas #pagespeed How to automatically monitor page load performance using the Google PageSpeed Insights API and Apps Script )