ECONOMIC ANALYSIS OF THE 2022 FEDERAL CLEAN FUELS STANDARD
I have published, through LFXAssociates.ca, an analysis of the proposed federal Clean Fuels Standard covering the period from 2021 to 2040.
This is the first application of a new version of the LFX Canadian Model, which I began developing in 2020. This version embeds many new features, including recursive dynamics and endogenous savings behaviour. My analysis shows that the CFS will be a drag on output and growth: by 2030 the economy will be 2.8 percent smaller than it otherwise would have been, and labour demand will be reduced by 72,000 jobs nationally. If compliance can instead be achieved by creating credits at a capped price of $275 per tonne, the cost in lost GDP is roughly halved and the employment losses fall to about a third. And while Canadian GHG emissions drop at the consumer stage, US-based ethanol has a lifecycle GHG emissions profile comparable to that of gasoline, so no net global reduction in GHG emissions occurs.
NOVEMBER 23, 2022
ECONOMIC ANALYSIS OF THE PROPOSED CANADIAN CLEAN FUELS STANDARD
Newspaper Columns, Commentary and Other
SOME RECENT NEWSPAPER OP-EDS
COMPLETE LISTING HERE
October 13, 2022
Yet again, IPCC's climate math doesn't check out
July 26, 2022
Why climate change is different than other environmental problems
June 24, 2022
Junk science has led to junk policies
We didn’t have inflation after 2008. Why are we having it now?
April 1, 2022
The 2030 Emissions Plan: Canada's gift to Putin
February 25, 2022
Don't be afraid to debate climate science
DISCUSSION WITH ROBERT MURPHY ON THE REASONS FOR TODAY'S OUTBREAK OF INFLATION
On June 10, 2022, I joined Bob Murphy of the Mises Institute for a discussion of monetary policy; specifically, we examined why we are experiencing inflation now when we did not in 2009 after the financial crisis.
CLIMATE POLICY: WHEN EMOTION MEETS REALITY
I have done a few talks recently on the theme of why CO2 emissions, unlike other types of air pollution, have proven so hard to reduce. One such presentation was for the Irish Climate Science Forum and is available online:
My talk covers some of the main reasons why, after 30 years of concerted public policy effort, so little has been achieved on climate policy, and why I think this will continue to be the case going forward. Governments do a disservice to the public when they keep promising more than they can deliver, and when they try to rally support for climate policy by claiming that it will not only cost nothing but will actually make us wealthier. Marcel Crok wrote an article about the presentation for Clintel here.
TEMPERATURE TRENDS IN CANADA SINCE 1888
We hear a lot about climate change. Would someone who lived in, say, 1918 notice much change in the average weather conditions compared to today? Once you delve into temperature data you will see that it's very hard to offer a simple answer to such a question. Patterns vary over time, by season and by place. For those Canadians who are curious about how the climate might have changed near where they live, I have written a rather lengthy report on the subject.
Or rather, I wrote an R program that generated a lengthy report. I analyze long-term records of monthly average daytime highs in Canada, in segments based on the collections of stations with records going back 40, 60, 80, 100 and 130 years. There are also some nice graphs. If you think you know what "climate change" looks like in Canada, now you can test your perceptions against the data. The R program is here.
The idea of yourenvironment.ca is very simple: to build the complete environmental record of every community across Canada. The site currently shows air emissions by source (back to 1990), air contaminant levels (back to 1974), monthly average high temperatures (back to 1900) for hundreds of places across the country, and water pollution records for several provinces.
The layout is self-explanatory and the site is very easy to use. The data all come from government agencies, but most of it has not previously been disseminated to the public in a usable form. All my sources are linked and the data I use are easily downloadable.
So the next time you find yourself in a conversation about some aspect of the environment and you wonder what is actually going on, look at yourenvironment.ca to find out.
Recent Journal Articles and Discussion Papers
UPWARD BIAS IN OPTIMAL FINGERPRINTING FROM USE OF TLS
Continuing my examination of the optimal fingerprinting methodology used in climatology to attribute climate change to GHGs, I have an article in Climate Dynamics comparing OLS and TLS estimators.
I have posted a blog at Climate Etc. offering a non- (or less-) technical explanation of the findings. Basically, TLS was proposed 20 years ago as a solution to attenuation bias in OLS, which can cause the coefficients in a fingerprinting regression to be underestimated. The OLS bias may thwart "detection" of the GHG signal, and since fingerprint coefficients are used in carbon budget calculations it may overstate the "allowable" CO2 emissions associated with a warming target. The problem is that in many circumstances TLS over-corrects and imparts an upward bias, thus exaggerating the size of the forcing signal. I run a series of Monte Carlo experiments and show that TLS is not automatically preferable to OLS and can easily be even more biased, just in the opposite direction. This propensity was known in the 1990s. Econometricians, as far as I know, never use TLS; we use instrumental variables (IV) regression to fix attenuation bias, since unlike TLS it can be shown to be consistent.
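The two biases described above are easy to reproduce in a few lines. The following is my own minimal illustration, not the code from the article: in a single-variable errors-in-variables model, OLS is attenuated toward zero, while TLS (orthogonal regression) over-corrects when the equation error is large relative to the measurement error on the regressor. All parameter values here are chosen for illustration only.

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least squares slope through centred data."""
    x = x - x.mean(); y = y - y.mean()
    return (x @ y) / (x @ x)

def tls_slope(x, y):
    """Total least squares (orthogonal regression) slope via SVD:
    the right singular vector for the smallest singular value of the
    centred data matrix [x y] gives the fitted line."""
    Z = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]
    return -v[0] / v[1]

rng = np.random.default_rng(42)
n, beta = 100_000, 1.0
x_true = rng.normal(0.0, 1.0, n)             # latent "signal"
x_obs = x_true + rng.normal(0.0, 0.5, n)     # regressor measured with error
y = beta * x_true + rng.normal(0.0, 1.0, n)  # equation error larger than x-error

b_ols = ols_slope(x_obs, y)  # plim is beta/(1 + 0.25) = 0.8: attenuated
b_tls = tls_slope(x_obs, y)  # roughly 1.44 here: over-corrected upward
print(b_ols, b_tls)
```

With these settings OLS lands below the true slope of 1 and TLS lands above it, matching the "opposite direction" point in the text; which estimator is worse depends entirely on the relative error variances.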
TOTAL LEAST SQUARES BIAS WHEN EXPLANATORY VARIABLES ARE CORRELATED
Continuing my exploration of the statistical elements of the IPCC climate attribution methodology, I have a couple of papers under review at journals in which I use Monte Carlo simulations to analyse the properties of Total Least Squares (TLS, the preferred regression method) under conditions typical of signal detection regressions. There is very little underlying theory about when TLS yields consistent or unbiased results. In a single-variable model with a random explanatory variable, TLS corrects a downward bias in Ordinary Least Squares (OLS, the standard regression method). But in many other cases it over-corrects or introduces new biases, and consistency results are not available without imposing unrealistic and untestable assumptions. In one paper I examine the consequences of omitted variables bias; I will disseminate that paper separately. In this paper I look at what happens when the explanatory variables are allowed to be correlated (as they are in signal detection regressions). The results are, frankly, bizarre. I have posted a draft of the paper on the Earth and Space Science pre-print archive here:
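For readers who want to experiment, the multivariate TLS estimator itself is compact: stack the regressors and the dependent variable into one matrix and take the right singular vector for the smallest singular value. The sketch below is my own construction, not the paper's experimental design; the specific error variances and correlation are illustrative assumptions, and the resulting biases depend on the error structure (see the paper for the actual experiments).

```python
import numpy as np

def tls_coefs(X, y):
    """Classical multivariate TLS: with v the right singular vector of
    [X y] for the smallest singular value, beta = -v[:p] / v[p]."""
    Z = np.column_stack([X, y])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]
    return -v[:-1] / v[-1]

rng = np.random.default_rng(0)
n, beta = 50_000, np.array([1.0, 0.5])

def simulate(rho):
    """Errors-in-variables model with two regressors correlated at rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    X_true = rng.multivariate_normal(np.zeros(2), cov, n)
    X_obs = X_true + rng.normal(0.0, 0.5, (n, 2))  # measurement error on X
    y = X_true @ beta + rng.normal(0.0, 1.0, n)    # larger equation error
    return tls_coefs(X_obs, y)

b_indep = simulate(0.0)  # uncorrelated regressors
b_corr = simulate(0.9)   # strongly correlated, as in signal detection settings
print(b_indep, b_corr)
```

Comparing the two printed estimates against the true coefficients (1.0, 0.5) shows how the correlation between the explanatory variables changes the pattern of TLS bias when the error variances are unequal.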
I don't know what to make of the results and I would welcome comments. Unfortunately the table formatting in the archived version is wonky, so here is a better version. The turnkey R code is in the Supplement, and is also available here.
CHECKING FOR MODEL CONSISTENCY IN OPTIMAL FINGERPRINTING: A COMMENT
I have several papers underway assessing the statistical methods used by IPCC authors over the past 20 years for attributing climate changes to greenhouse gas emissions. The first is a critique of the seminal paper in the field, published in 1999 in Climate Dynamics. My paper has likewise been published in Climate Dynamics.
UPDATE III (May 2022): Chen et al. (Peking University) have released a pre-print commenting on my paper (available here). As far as I know it isn't published. I was asked to review it shortly after it appeared and my comments are here.
UPDATE II (October 18): I have published through the Global Warming Policy Foundation a non-technical explanation of my paper. At the accompanying website the GWPF has reproduced comments Myles Allen provided to earlier media inquiries, to which I have added a reply, and Richard Tol has supplied a commentary on the exchange.
UPDATE: Here is a non-technical Backgrounder intended to make the material more accessible.
I have published a (somewhat) non-technical summary at Judith Curry's blog (PDF here). Optimal fingerprinting has long been the dominant tool in climatology for attributing climate changes to greenhouse gases. It is a matrix-weighted generalized least squares (GLS) regression model, and as such is based on tools familiar to economists, although modified in non-standard ways. In my article I show that those modifications destroy the consistency and unbiasedness properties associated with regular GLS methods. Unfortunately the problems have been concealed by exclusive reliance on a test statistic, introduced by Allen and Tett, that is meaningless for checking specification errors. As a result, none of the applications of this method over the past 20 years can be considered to have yielded reliable results.
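For readers unfamiliar with GLS, the baseline estimator can be sketched as follows. This is the generic textbook statement, in my own notation, not the exact formulation used in the fingerprinting literature:

```latex
% Regression of observed climate y on model-generated signal patterns X:
%   y = X\beta + \varepsilon, \qquad \operatorname{E}[\varepsilon\varepsilon^\top] = \Sigma
% The GLS estimator weights by the inverse noise covariance:
%   \hat{\beta}_{\mathrm{GLS}} = \left( X^\top \Sigma^{-1} X \right)^{-1} X^\top \Sigma^{-1} y
```

Under the standard assumptions (X of full rank, Σ known and positive definite, ε independent of X), this estimator is unbiased and consistent; the point of the article is that the non-standard modifications made in fingerprinting applications void those properties.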
THE ECONOMICS LITERATURE DOES NOT SUPPORT THE 1.5C TARGET
Robert Murphy and I have published a study for the Fraser Institute arguing that standard mainstream economic analysis does not endorse the 1.5C target.
We don't conduct a cost-benefit analysis ourselves. And we point out that the IPCC's SR1.5 likewise didn't do a cost-benefit analysis (they even admit as much early in the report). Instead we show that the economists who have done CBAs have found that the costs of trying to hold warming to 1.5C vastly exceed the benefits. Nordhaus's analysis, for instance, shows that it would be better to do nothing at all than to try to get warming down to 1.5C. And he got a Nobel Prize.
PRESENTATION TO THE HOUSE OF COMMONS COMMITTEE ON NATURAL RESOURCES
The House of Commons Natural Resources Committee is studying the potential for biofuels and renewable fuels to play a role in reducing greenhouse gas emissions in Canada. I was invited to speak to the Committee on rather short notice. In my presentation I summarize work I have done on this issue in the past, as well as some basic principles that guide my thinking about climate policy.