George Rebane
Most well-read people look at Gross Domestic Product as the gold standard of our economy’s performance, and nearly all assume that GDP growth will cure most of the ills we still see in the economy. But when we take a deeper look at how the GDP statistic is concocted, we discover that it is really a rat’s nest of formulas, calculations, and selectively captured data of dubious reliability that pops out a number which, more often than not, must be rejiggered a couple of months later into a new and improved version of itself.
And the GDP also comes with error bounds that would surprise most of us were they included with the published number – but they aren’t, and thereby hangs a tale. Diane Coyle wrote an excellent book on the topic – GDP: A Brief But Affectionate History (2014) – wherein she says –
There is no such entity out there as GDP in the real world, waiting to be measured by economists. It is an abstract idea… I also ask whether GDP alone is still a good enough measure of economic performance—and conclude not. It is a measure designed for the twentieth-century economy of physical mass production, not for the modern economy of rapid innovation and intangible, increasingly digital, services. How well the economy is doing is always going to be an important part of everyday politics, and we’re going to need a better measure of “the economy” than today’s GDP.
Investment writer John Mauldin writes (here), “GDP doesn’t capture intangible goods production, or the services that form a large part of today’s economy. It was designed for the agricultural and industrial economy its Depression-era designers knew”, and goes on to explain –
GDP is a huge undertaking, full of rules, with almost as many exceptions to the rules, changes, fixes, and qualifications, so that, as one Amazon reviewer noted, GDP is in reality so complex there are only a handful of people in the world who fully understand it, and that does not include the commentators and politicians who pontificate about it almost daily. The quarterly release of GDP statistics is more akin to a religious service than anything resembling a scientific study. The awe and breathlessness with which the number is discussed is somewhat amusing to those who understand the sausage-making process that goes into producing the number. Whether the GDP reading is positive or negative, it often changes less in a given quarter than the margin of error in the figure itself, and it can be and generally is revised significantly—often many years later when almost no one is paying attention. When’s the last time the mainstream media reported a five-year-old revision?
To this, economic and social theorist Jeremy Rifkin adds – “The problem with the [GDP] index is that it counts negative as well as positive economic activity. If a country invests large sums of money in armaments, builds prisons, expands police security, and has to clean up polluted environments and the like, it’s included in the GDP.”
The major flaw in GDP is that it includes the spending of all levels of government, which means it double-counts the part of that spending funded by taxes and fees – money that had already been earned and counted elsewhere in the GDP formula. Add to that the distortions introduced by deficit spending financed through borrowing from ‘outside the economy’ and the Fed’s ongoing quantitative easing, and you start to get a picture of what information is mish-mashed into the published GDP numbers. (See the figure for a graphic display of the formula and money flows.)
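For readers who want to see the arithmetic, here is a minimal sketch of the textbook expenditure identity, GDP = C + I + G + (X - M), with hypothetical numbers illustrating the double-counting argument made above; the component values and the tax-funded share are invented for illustration and are not official statistics.

```python
# Minimal sketch of the textbook expenditure-approach GDP identity,
# GDP = C + I + G + (X - M), using purely hypothetical numbers (in $ trillions).
# The tax_funded_share figure below is an illustrative assumption, not measured data.

def gdp_expenditure(consumption, investment, government, exports, imports):
    """Return GDP under the standard expenditure approach."""
    return consumption + investment + government + (exports - imports)

C, I, G, X, M = 14.0, 3.5, 3.7, 2.5, 3.1   # hypothetical components
gdp = gdp_expenditure(C, I, G, X, M)

# The argument above: the portion of G financed by taxes and fees was drawn
# from income generated by activity the formula already counts elsewhere.
tax_funded_share = 0.8                      # hypothetical assumption
overlap = G * tax_funded_share

print(f"Reported GDP:             {gdp:.1f}")
print(f"Claimed double-counted G: {overlap:.1f}")
print(f"GDP net of that overlap:  {gdp - overlap:.1f}")
```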
Today the concoction and publication of ‘global temperature’ is perhaps the other most intense practice of secular religiosity that impacts public attitudes and shapes public policies. As RR readers have known for years, there is no place on Earth where you can stick a thermometer and read its temperature. Global temperature is a mathematical construct, the output of a complex model jiggered together from a lot of subjective decisions to include this and not that. And there exist several such formulas that compute a global temperature, some in competition, others coming with their own contingent trappings. All of them cloud their genesis and meaning, so that the only purpose they effectively serve is to bolster agenda-driven political statements, ideological propaganda, and new public policies for bigger government and more restrictive regulations.
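To make concrete the point that ‘global temperature’ is a computed construct rather than a reading, here is a toy sketch of the kind of aggregation involved; the grid, the infill rule, and the cosine-latitude weighting below are illustrative assumptions, not the actual procedure of GISTEMP, HadCRUT, or any other published index.

```python
import numpy as np

# Toy illustration of how a single "global temperature" number is constructed
# from gridded anomalies. The grid, the zonal-mean infill, and the
# cosine-latitude weighting are illustrative choices, not the method of any
# particular dataset.

rng = np.random.default_rng(0)
lats = np.arange(-87.5, 90, 5.0)              # 5-degree latitude bands
lons = np.arange(-177.5, 180, 5.0)            # 5-degree longitude bands

# Fake anomaly field with missing cells (NaN) standing in for sparse coverage.
anom = rng.normal(0.4, 0.6, size=(lats.size, lons.size))
anom[rng.random(anom.shape) < 0.3] = np.nan   # ~30% of cells have no data

# Subjective choice 1: infill missing cells with the same-latitude (zonal) mean.
zonal_mean = np.nanmean(anom, axis=1, keepdims=True)
filled = np.where(np.isnan(anom), zonal_mean, anom)

# Subjective choice 2: weight each cell by cos(latitude) to account for cell area.
weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(filled)

global_anomaly = np.average(filled, weights=weights)
print(f"'Global temperature' anomaly: {global_anomaly:+.2f} C")
```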
A related, and decidedly more dreadful, aspect of this exercise in faith-based science is the embedding of some of these temperature formulas in climate prediction models (aka ‘General Circulation Models’), from which derive the scary predictions of a coming hothouse Earth. These predicted temperatures also come with rarely revealed error bounds that tell their own embarrassing tale: over the last several decades of predicting out-year temperatures, the error bounds of those predictions have not gotten smaller. A hallmark of a maturing science is that its growing understanding of the part of nature it studies and models yields predictions with ever tighter error bounds, which, when compared with actual measurements (experiments), attest to their growing reliability.
This is not happening in the pursuit of predictive climate science, mainly because of the utter complexity of, and our primitive knowledge about, the processes involved (Earth’s various systems of energy storage and exchange – atmosphere, stratosphere, oceans, ground masses, polar regions, …). Those who loudly proclaim their utter faith in this or that statistic about Earth’s future climate are truly dangerous demagogues and/or self-serving yokels who attempt to convince the less-read public that they have the cure for what they believe is preventable man-made global warming.
[9feb20 update] Right on schedule we welcome the retort to my assessment of climate models and modeling. Mr Steven Frisch, RR reader and commenter, is the CEO of an NGO that takes cover under the moniker Sierra Business Council. SBC gets its income from grants and contracts for promulgating and explaining the advantages and workings of big government to institutions and governmental jurisdictions in the business of providing for the public good. Among these functions, SBC explains the latest laws and regulations, and their compliance requirements, to its clients, who are overwhelmed by the verbiage and complex constraints of each new issue from the legislature and/or government bureau. One big revenue generator for SBC derives from government’s imposition of environmental dicta, especially those having to do with ‘climate change’.
As we have done in the past, we again illustrate this progressive pilgrim’s progress in contesting my critique of climate models, using his own words from the comment stream of this commentary.
Seriously climate models are amongst the most studied, peer reviewed, and transparent parts of scientific research today. To be accepted and widely used their methodologies and assumptions must be published, the data sets going into the models are scrutinized and reviewed, they cannot “cloud their genesis.” The reason there are competing climate models is that they are designed to work on a range of assumptions.
Mr Frisch is not a technologist, nor does he claim to be one. But he knows that his bread is buttered on both sides of the preventable manmade global warming issue, deriving benefit from promoting both the leftwing’s political agenda and its climate change ‘science’ agenda. The bottom line is that every claim in his comment above has been shown to be false. I am not accusing Mr Frisch of lying, for he is simply a true believer working in the vineyards of progressive thought, and knows not whereof he speaks. And by no means is he alone.
Over the years, I and RR readers with technical credentials have reviewed the considerable list of flaws and unmet challenges in the various General Circulation Models (the most rigorous and celebrated of the IPCC’s climate prediction tools). Rehashing some major points –
- A GCM is a large and very complex piece of software, cobbled together from a plethora of subroutines that attempt to model the various types of energy exchange occurring between the millions of discrete cells that layer and cover the Earth’s surface (510×10⁶ km²). The subroutine models are contributed by specialists in the various areas of physics, fluid dynamics, meteorology, …, none of whom know, nor need to know, the architecture of the overall GCM into which their models will be programmed.
- Each GCM’s architecture and operation is known to only a few investigators, who necessarily treat the contributed subroutines as ‘black boxes’ with in/out data requirements. Consequently, GCMs may be seen as one of the less intensely studied areas of science when compared to fields like AI, molecular biology (e.g. cancer research), genomics, the standard model, image understanding, speech understanding, autonomous control & estimation, … .
- GCMs are iterative numerical models that attempt to characterize a very poorly understood stochastic process – i.e. Earth’s atmosphere. As such, their performance (prediction reliability) is constrained by natural computability limits that include the propagation of error due to real-world computing shortcomings, the unreliability of input data, and the wholly ad hoc nature of specifying modeling constraints and feedback loops.
- A single GCM run modeling climate decades from now takes several weeks, so only small ensembles of runs are feasible; consequently the reliability (error) bounds are neither well developed nor known.
- All climate models are ‘tuned’ with error-prone historical data, the effects of which are hard to measure. Such tunings, which seek to replicate past climate history, amount to fitting subjectively selected regression functions (aka curve fitting), and they suffer from the well-known, often catastrophic, shortcomings of such regression models when exercised outside the range of the data epoch to which they were tuned or fitted (see the sketch just below this list).
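The extrapolation hazard named in that last point can be demonstrated with a toy sketch: fit a regression to a ‘historical’ epoch, then ask it to predict beyond that epoch. The underlying ‘truth’ function, the noise level, and the polynomial degree below are arbitrary assumptions; no climate data or actual GCM is involved.

```python
import numpy as np

# Toy illustration of the extrapolation hazard: a regression tuned to one data
# epoch can fit it well and still diverge badly outside that epoch. The "truth"
# function and the polynomial degree are invented assumptions for illustration.

rng = np.random.default_rng(1)

def truth(t):
    # Hypothetical underlying process: slow trend plus a cycle.
    return 0.01 * t + 0.5 * np.sin(0.3 * t)

t_fit = np.linspace(0, 30, 60)                 # "historical" tuning epoch
y_fit = truth(t_fit) + rng.normal(0, 0.1, t_fit.size)

coeffs = np.polyfit(t_fit, y_fit, deg=5)       # subjective choice of degree
model = np.poly1d(coeffs)

t_future = np.linspace(30, 45, 30)             # outside the tuning epoch
rmse_in = np.sqrt(np.mean((model(t_fit) - truth(t_fit)) ** 2))
rmse_out = np.sqrt(np.mean((model(t_future) - truth(t_future)) ** 2))

print(f"RMSE inside tuning epoch:  {rmse_in:.2f}")
print(f"RMSE outside tuning epoch: {rmse_out:.2f}")   # typically far larger
```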
This short list addresses just a part of the real-world unmet challenges encountered by climate and systems scientists. Many more problems arise from missing and/or error-prone interpretations of input data that itself had to be developed from error-prone measurements (e.g. fossilized tree rings) put through precursor models to derive, say, CO2 levels eons ago. A discussion of this aspect of climate modeling is presented in ‘Flawed Climate Models’ from Stanford University’s Hoover Institution. And much more about the technical limitations of quantitative climate prediction is available to the diligent reader with a little googling.
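A toy Monte Carlo sketch of that proxy-to-derived-quantity chain follows; the linear ‘calibration’ and all the error magnitudes are invented for illustration and are not real paleoclimate relationships.

```python
import numpy as np

# Toy Monte Carlo sketch of how measurement error in a proxy (e.g. a tree-ring
# width index) combines with uncertainty in a precursor model's calibration to
# produce a derived quantity with wider relative uncertainty. All numbers and
# the linear "calibration" are invented for illustration.

rng = np.random.default_rng(2)

def precursor_model(proxy_value, slope, intercept):
    """Hypothetical calibration mapping a proxy index to a derived quantity."""
    return slope * proxy_value + intercept

n = 100_000
proxy = rng.normal(1.20, 0.15, n)        # proxy measurement with its own error
slope = rng.normal(180.0, 25.0, n)       # uncertain calibration slope
intercept = rng.normal(60.0, 10.0, n)    # uncertain calibration intercept

derived = precursor_model(proxy, slope, intercept)

print("Proxy input:      1.20 +/- 0.15  (~12% relative error)")
print(f"Derived quantity: {derived.mean():.0f} +/- {derived.std():.0f} "
      f"(~{100 * derived.std() / derived.mean():.0f}% relative error)")
```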
Mr Frisch’s claim that the GCMs are widely available and “scrutinized” is not true. To the extent that they have been scrutinized by the technically qualified whose livelihoods are not beholden to political and bureaucratic climate catastrophe agendas, ALL of the models have been found woefully inadequate for developing the kind of information required to make prudent public policies. And this goes doubly for the data sets with which the GCMs are tuned and tested; some of these data sets have yet to be released by their developers for independent assessment.
The claim of catastrophic, yet preventable manmade global warming continues to be the perfect storm for globalists who work ardently to eliminate the current world order of sovereign nation-states and create a worldwide collective under one overarching socialist government that will then direct all the efforts of future humans. Mr Frisch is again merely doing his fastidious part.


