A fuel-economy change that protects freedom and saves lives

by H. Sterling Burnett


If finalized, the proposal by the U.S. Environmental Protection Agency (EPA) and the National Highway Traffic Safety Administration (NHTSA) to freeze fuel-economy targets at 2020 levels through 2026 will be good news for anyone concerned about consumer choice, vehicle affordability, and highway safety.

Acting EPA Administrator Andrew Wheeler’s determination that freezing fuel-economy standards would benefit the American people should surprise no one: in April, EPA announced it would revoke the Obama-era standards requiring cars and light trucks sold in the United States to achieve an average of more than 50 miles per gallon (mpg) by 2025.

President Obama signed off on the 50 mpg standards just before leaving office in December 2016, two years before the previous standards were scheduled to be reviewed. Studies show the 50 mpg standard would substantially increase the price of cars, change the composition of the nation’s automobile and light truck fleet, and put lives at risk.

The “Safer Affordable Fuel-Efficient (SAFE) Vehicles Rule for Model Years 2021-2026 Passenger Cars and Light Trucks” is the culmination of EPA’s consultation with NHTSA to determine how fuel-economy standards can best balance consumers’ concerns about automobile affordability, vehicle safety, and fuel economy.

“Our proposal aims to strike the right regulatory balance based on the most recent information and create a 50-state solution that will enable more Americans to afford newer, safer vehicles that pollute less,” Wheeler said. 

“There are compelling reasons for a new rulemaking on fuel economy standards for 2021-2026. More realistic standards will promote a healthy economy by bringing newer, safer, cleaner and more fuel-efficient vehicles to U.S. roads and we look forward to receiving input from the public,” stated Transportation Secretary Elaine Chao.

EPA calculates that freezing fuel-economy standards at 2020 levels through 2026 will save more than $500 billion in societal costs over the next 50 years and prevent 12,700 highway fatalities.

Fuel standard mandates began in 1975, when Congress established Corporate Average Fuel Economy (CAFE) standards to reduce dependence on foreign oil following the 1973–74 Arab oil embargo. The law required car manufacturers to meet mandated fuel-economy targets or else pay a hefty tax on gas-guzzling sedans. What happened? Some people bought smaller, more fuel-efficient cars. Others, however, started driving trucks, and new categories of vehicles were born: SUVs and minivans.

Over the years, compact cars have become less popular because of low fuel prices, underpowered engines, and lack of passenger and storage space. Most full-sized cars and trucks can seat five adults, and minivans and many SUVs can seat between seven and nine people. Numerous SUVs, trucks, and minivans offer ample cargo space and are capable of hauling a trailer or boat, which no subcompact can do safely. 

Ironically, the high popularity of trucks, SUVs, and minivans is at least partially a result of environmentalists’ efforts to reduce the appeal of large, powerful cars. EPA’s stringent fuel-economy standards didn’t apply to trucks, SUVs, or minivans, which didn’t then exist. So, to keep the features they liked, millions of people replaced the family sedan or station wagon with an SUV or truck. As fuel efficiency increased and driving became cheaper, people drove more miles — thereby negating the marginal gains of owning more-fuel-efficient vehicles.

CAFE standards did not reduce America’s dependence on foreign oil — it would take the fracking revolution to do that — but they did have deadly unintended consequences. To meet federal fuel-economy guidelines, carmakers reduced vehicle size, weight, and power, compromising cars’ safety and resulting in tens of thousands of unnecessary injuries and deaths in vehicle crashes. For every 100 pounds shaved off new cars to meet CAFE standards, between 440 and 780 additional people are killed in auto accidents, amounting to 2,200 to 3,900 lives lost per year, according to researchers at Harvard University and the Brookings Institution. All told, CAFE has caused more American deaths than the Vietnam War and every U.S. military engagement since then combined.

The laws of physics will never change. In a vehicle crash, larger and heavier is safer than lighter and smaller. EPA’s fuel-economy freeze will prevent unnecessary deaths while protecting consumer choice.

If fuel economy is the driving force behind your purchasing decisions, nothing changes under EPA’s decision to freeze current fuel-economy standards. You are free to continue buying the electric, hybrid, or clean diesel vehicle of your choice. If, however, comfort, power, vehicle safety, and the ability to haul a boat or ferry a little league team are your goals, EPA’s CAFE freeze ensures you can continue to make that choice as well. 

Ain’t freedom grand!



EPA’s Non-Politicized Science Benefits Americans

by H. Sterling Burnett


A direct challenge to the hardcore enviros who heretofore controlled and corrupted the agency.

President Donald Trump committed to fundamentally transforming the U.S. Environmental Protection Agency (EPA) from an agency producing politicized science to one instilling sound scientific standards for research. As a result, Americans should expect improved environmental and health outcomes.

Currently, regulatory costs top $1.9 trillion annually, which amounts to $14,842 per U.S. household. That’s nearly $15,000 less for Americans to pay for health insurance, medical bills, education expenses, groceries, gasoline, or entertainment. Because the economic and social implications of regulations are profound, the science they are built upon must be impeccable.

Over the last few decades — under Republican and Democratic administrations alike — EPA formed a cozy relationship with radical environmental activists and liberal academic researchers. With the support of environmental lobbyists who despise capitalism (expressed through consumers’ free choices in the marketplace), EPA bureaucrats, in pursuit of more power and bigger budgets for the agency, funded researchers who, largely dependent on government grants, were only too happy to produce results claiming industry was destroying the earth.

Of course, the only way to prevent environmental collapse was more government control of the economy. These reports were produced despite the fact that poverty and hunger have steadily declined and people are living longer, more productive lives than ever before.

As Jay Lehr, a colleague and science director at the Heartland Institute, once told me, “For decades, EPA has been a wholly owned subsidiary of the environmental left. Together, radical environmentalists and EPA bureaucrats, including the members of all their advisory panels, have used their considerable power to thwart American business at every turn.”

Under Trump, EPA changed how it pursues science to pay greater fealty to the scientific method and remove temptations for scientific self-aggrandizement and corruption.

Not surprisingly, researchers, environmentalists, and bureaucrats, seeing their power curtailed and their gravy train ending, are crying foul, saying the Trump administration is undermining science. In reality, this is simply not true.

EPA’s scientific advisory panels are tasked with ensuring the research the agency uses to develop and justify regulations is rigorous, has integrity, and is based on the best available science.

To better ensure this, EPA ceased automatically renewing the terms of board members on various panels. EPA is now filling its scientific panels and boards on a competitive basis as each board member’s term expires.

This should improve the science EPA uses to inform its decisions, by expanding diversity — diversity of interests, diversity of scientific disciplines, and diversity of backgrounds — thus bringing in a wider array of viewpoints to EPA decision-making.

In addition, to reduce opportunities for corruption, EPA ceased allowing members of its federal advisory committees to apply for EPA research grants and instituted policies to ensure advisory panel members and grant recipients have no other conflicts of interest. It was always a foolish practice to allow those recommending, often determining, who gets EPA grants to also be in the running for those grants. However, this was business as usual at EPA, where grant makers awarded themselves, research teams they were members of, or their friends billions of taxpayer dollars over the years.

In April, then-EPA Administrator Scott Pruitt declared, “The era of secret science at EPA is coming to an end.” Pruitt proposed requiring that the data underlying scientific studies used by EPA to craft regulations be available for public inspection, criticism, and independent verification.

For years, EPA bureaucrats have relied on studies by researchers who would not disclose their underlying data for examination and retesting, making confirmation or falsification impossible. Fortunately, EPA is finally ending this unjustifiable practice.

Many scientists have objected to EPA’s new transparency policy, claiming the studies EPA uses have already undergone “peer review.” In practice, however, peer review is often nothing more than other researchers, frequently hand-picked by the scientists whose work is being reviewed, sitting in their ivory towers reading the reports and saying, “this looks okay or reasonable to me.”


Unless the reviewers are able to examine the underlying data and assumptions and attempt to replicate the results, peer review cannot ensure the validity of studies used to underpin regulations. Absent transparency and replicability, peer review is hollow.

Another long-overdue EPA regulatory reform was the decision to end exclusive use of the “Linear No-Threshold” (LNT) model when assessing the dangers of radiation, carcinogens, and other toxic substances in the environment. Going forward, EPA will incorporate uncertainty into its risk assessments using a variety of other, more realistic models.

The LNT model assumes there is no safe dose of ionizing radiation or exposure to various other chemicals or toxins. Relying on flawed 1950s studies of the effects of ionizing radiation on fruit flies, EPA and other regulatory agencies have used LNT as the basis for regulating environmental clean-ups, setting safety standards for nuclear plants, and limiting low-dose radiation treatments for medical patients, a policy that has cost lives and billions of taxpayer dollars.

Although science has progressed phenomenally since the 1950s, with copious amounts of research showing the LNT model is seriously flawed, EPA and other agencies never questioned the LNT standard. That is, until now.

In fact, adverse effects from low dose exposures to radiation and most other chemicals and potential toxins are often non-existent. Indeed, substances that may be harmful in large quantities can be beneficial in small amounts, a process known as hormesis.

In the commonly paraphrased words of Swiss physician and alchemist Paracelsus, “the dose makes the poison.” Vitamins, which are valuable in small quantities, and even water, which is necessary for life, can become deadly if too much of either is taken over a short period of time. Or consider sun exposure. While too much sunlight can contribute to skin cancer, sunlight is required to catalyze the final synthesis of vitamin D, which strengthens bones, helping prevent osteoporosis and rickets. There is also ample evidence sunlight can help fight depression and several skin and inflammatory ailments.

Replacing reliance on the untenable LNT model with other models of exposure and response will result in better safety and health protocols, potentially saving billions of dollars and thousands of lives each year.

In service of the American people and the pursuit of continued American greatness, science practices at EPA are improving under President Trump. One can only hope equivalent changes are adopted at other executive agencies so the regulations they produce are grounded in the best available science, free of political corruption and bureaucratic incentives for agency mission creep and growth.




The article first appeared here.

Trackside - Why the Incandescent Bulb Ban Amounts to Nothing

by J. D'Aloia


Elected officials often introduce laws for the sole purpose of having a piñata to bash for the cameras and the folks back home. Congressman Poe in the linked video is certainly making such use of the law banning incandescent light bulbs - I did not research to find out how he voted, but it matters not - he is making the most of it for his time in front of the camera.

My cynical side says the law was applauded by the environmental Luddites not because compact fluorescent light bulbs were reducing the dreaded greenhouse gases, but because the law was a means to further control society. Such is their goal. Demand changes in what society uses and how they use it to satisfy some environmental talking point. When the change has been ordained by a sycophantic legislative body, then raise a new issue and demand that new laws placing further control over society be enacted to counter the threats now spotlighted. More rules, more government, more taxes, less freedom.

Another cynical wonderment - why all the fuss about broken CFLs? Why has it not all played out for fluorescent tubes? They too have mercury in similar amounts. There has not been an avalanche of reports of people suffering from mercury poisoning from broken fluorescent tubes or moon-suited technicians cleaning up the family room after a tube was broken. Could it be that within the grand strategy, the timing was not right to play the poison card? And with LED light bulbs coming on the market, with an even greater energy efficiency (and much higher cost than CFLs), will CFLs be banned next? 

‘Tis a tempest in a teapot. CFLs do have a place in the grand scheme of things, especially for those lights the replacement of which is an all-day project, or if the spectrum you want cannot be obtained with an Edison special, or if your lighting demands are such that the cost vs. energy saved equation comes out to your benefit. Prudent respect for the dangers mitigates the dangers.

Tacitus nailed it in the First Century AD: "Corruptissima re publica, plurimae leges" - The worse the state, the more laws it has.

See you Trackside.


Cold Facts on Global Warming: The global temperature increase caused by doubling atmospheric CO2 is bounded by an upper limit of 1.4-2.7 degrees centigrade

{The article below supports the observation that anthropogenic sources of carbon dioxide are not significantly affecting either global warming or climate change. These man-made sources of CO2 are calculated and documented in Global Warming or Global Fraud? This is an abridged version of the original T. J. Nelson piece, still quite lengthy but well worth the read. It is reprinted by permission of the author. The complete article is available here. - Ed. }

Photo above: Absorption of ultraviolet, visible, and infrared radiation by various gases in the atmosphere. Most of the ultraviolet light (below 0.3 microns) is absorbed by ozone (O3) and oxygen (O2). Carbon dioxide has three large absorption bands in the infrared region at about 2.7, 4.3, and 15 microns. Water has several absorption bands in the infrared, and even has some absorption well into the microwave region. {Peixoto, J.P. and Oort, A.H., Physics of Climate, Springer, 1992, p. 118.}



by T. J. Nelson

What is the contribution of anthropogenic carbon dioxide to global warming? This question has been the subject of many heated arguments, and a great deal of hysteria. In this article, we will consider a simple calculation, based on well-accepted facts, that shows that the expected global temperature increase caused by doubling atmospheric carbon dioxide levels is bounded by an upper limit of 1.4-2.7 degrees centigrade. This result contrasts with the results of the IPCC's climate models, whose projections are shown to be unrealistically high.

There is general agreement that the Earth is naturally warmed to some extent by atmospheric gases, principally water vapor, in what is often called a "greenhouse effect". The Earth absorbs enough radiation from the sun to raise its temperature by 0.5 degrees per day, but is theoretically capable of emitting sufficient long-wave radiation to cool itself by 5 times this amount. The Earth maintains its energy balance in part by absorption of the outgoing longwave radiation in the atmosphere, which causes warming.

On this basis, it has been estimated that the current level of warming is on the order of 33 degrees C. That is to say, in the absence of so-called greenhouse gases, the Earth would be 33 degrees cooler than it is today, or about 255 K (-0.4° F). Of these greenhouse gases, water is by far the most important. Although estimates of the contribution from water vapor vary widely, most sources place it between 90 and 95% of the warming effect, or about 30-31 of the 33 degrees. Carbon dioxide, although present in much lower concentrations than water, absorbs more infrared radiation than water on a per-molecule basis and contributes about 84% of the total non-water greenhouse gas equivalents, or about 4.2-8.4% of the total greenhouse gas effect.
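For readers who want to check the attribution arithmetic, the shares quoted above can be reproduced in a few lines. This is only a sketch that multiplies out the figures stated in the text (the 33-degree total, the 90-95% water-vapor share, and CO2's 84% share of the remainder):

```python
# Reproduce the article's attribution arithmetic for the 33 C greenhouse effect.
TOTAL_WARMING_C = 33.0          # estimated total greenhouse warming (from the text)
WATER_SHARES = (0.90, 0.95)     # water vapor's share of the effect (from the text)
CO2_SHARE_OF_REST = 0.84        # CO2's share of the non-water portion (from the text)

for w in WATER_SHARES:
    non_water = 1.0 - w                       # non-water greenhouse share
    co2_frac = non_water * CO2_SHARE_OF_REST  # CO2 fraction of the total effect
    print(f"water {w:.0%}: CO2 is {co2_frac:.1%} of the effect, "
          f"about {co2_frac * TOTAL_WARMING_C:.1f} degrees C")
```

With a 90% water share this gives CO2 8.4% of the effect (about 2.8 degrees); with 95% it gives 4.2% (about 1.4 degrees), matching the 4.2-8.4% range in the text.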

The 33 degree increase in temperature is not caused simply by absorption of radiation but mostly by the Earth's adaptation to higher temperatures, which includes secondary effects such as increased water vapor, cloud formation, and changes in albedo or surface reflectivity caused by melting and aging of snow and ice. Accurately calculating the relative contribution of each of these components presents major difficulties.

Traditionally, greenhouse gas levels are presented as dimensionless numbers representing parts per billion (ppb) multiplied by a scaling factor (the global warming potential, or GWP) that allows their relative efficiency at producing global temperature increases to be compared. For carbon dioxide, this scaling factor is 1.0. The factors for methane and nitrous oxide are 21 and 310, respectively, while sulfur hexafluoride is 23,900 times more effective than carbon dioxide. The GWP of carbon dioxide is primarily due to the position of its absorption bands in the critical longwave infrared region at 2, 3, 5, and 13-17 micrometers.
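The scaling described here is a simple multiplication: concentration in ppb times GWP gives a CO2-equivalent concentration. A minimal sketch using the GWP factors quoted above (the methane concentration used below is illustrative only, not a figure from the article):

```python
# CO2-equivalent concentration = concentration (ppb) x global warming potential (GWP).
# GWP factors are those quoted in the text; the example ppb value is illustrative.
GWP = {"CO2": 1.0, "CH4": 21.0, "N2O": 310.0, "SF6": 23900.0}

def co2_equivalent_ppb(gas: str, ppb: float) -> float:
    """Scale a gas concentration by its GWP to express it in CO2-equivalent ppb."""
    return ppb * GWP[gas]

print(co2_equivalent_ppb("CH4", 1800.0))  # 1800 ppb of methane -> 37800.0 CO2-eq ppb
```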

Methane, nitrous oxide, ozone, CFCs and other miscellaneous gases absorb radiation even more efficiently than carbon dioxide, but are also present at much lower concentrations. Their high GWP results from their molecular structure which makes them absorb strongly and at different wavelengths from water vapor and carbon dioxide. For example, although ozone is usually thought of as an absorber of ultraviolet radiation, it also absorbs longwave infrared at 9.6 micrometers. These gases account for another 1.3% of the natural greenhouse gas effect. The increase in the global energy balance caused by greenhouse gases is called "radiative forcing".

The GWP of a greenhouse gas is the ratio of the time-integrated radiative forcing from 1 kg of the gas in question compared to 1 kg of carbon dioxide. These GWP values are calculated over a 100 year time horizon and take into consideration not only the absorption of radiation at different wavelengths, but also the different atmospheric lifetimes of each gas and secondary effects such as effects on water vapor. For some gases, the GWP is too complex to calculate because the gas participates in complex chemical reactions. Most researchers use the GWPs compiled by the World Meteorological Organization (WMO).

Even though most of the so-called greenhouse effect is caused by water vapor, about 1-2 degrees of our current empirically-measured temperature of roughly 288 K (59° F) can be attributed to carbon dioxide. Water vapor is at least 99.99% of 'natural' origin, which is to say that no amount of deindustrialization could ever significantly change the amount of water vapor in the atmosphere. Thus, climatologists have concentrated mostly on carbon dioxide and methane.

Figures from the U.S. Department of Energy show that the pre-industrial baseline of carbon dioxide is 288,000 ppb. The total current carbon dioxide is 368,400 parts per billion, or 0.0368% of the atmosphere. The ocean and biosphere possess a large buffering capacity, mainly because of carbon dioxide’s large solubility in water. Because of this, it is safe to conclude that the anthropogenic component of atmospheric carbon dioxide concentration will continue to remain roughly proportional to the rate of carbon dioxide emissions: the buffer is in no danger of being saturated, which would otherwise allow all the emitted carbon dioxide to accumulate in the atmosphere.

Of course, climate, like weather, is complex, nonlinear, and perhaps even chaotic. Increased solar irradiation can lower the albedo, which would amplify any effect caused by changes in solar flux, making the relation between radiation and temperature greater than linear.

Increased temperatures also cause increased evaporation of sea water, which can cause warming because of water's greenhouse effect, and also can affect the radiation flux by creating additional clouds.

The arithmetic of absorption of infrared radiation also works to decrease the linearity. Absorption of light follows a logarithmic curve as the amount of absorbing substance increases (Figure 1).

Figure 1: Transmitted light is a logarithmic function of concentration. This curve is the familiar Beer's Law.

It is generally accepted that the concentration of carbon dioxide in the atmosphere is already high enough to absorb almost all the infrared radiation in the main carbon dioxide absorption bands over a distance of only a few km. Thus, even if the atmosphere were heavily laden with carbon dioxide, it would still only cause an incremental increase in the amount of infrared absorption over current levels.

The net effect of all these processes is that doubling carbon dioxide would not double the amount of global warming. In fact, the effect of carbon dioxide is roughly logarithmic. Each time carbon dioxide (or some other greenhouse gas) is doubled, the increase in temperature is less than the previous increase. The reason for this is that, eventually, all the longwave radiation that can be absorbed has already been absorbed. It would be analogous to closing more and more shades over the windows of your house on a sunny day -- it soon reaches the point where doubling the number of shades can't make it any darker.
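The diminishing-returns behavior of the "window shades" analogy can be illustrated numerically with the Beer-Lambert form, where the absorbed fraction is 1 - exp(-k*c). The absorption coefficient below is an arbitrary hypothetical value chosen only to show the shape of the curve, not a physical constant from the article:

```python
import math

# Beer-Lambert: fraction absorbed = 1 - exp(-K * c). K here is an arbitrary
# illustrative coefficient, not a physical constant from the article.
K = 1.0

def absorbed_fraction(concentration: float) -> float:
    return 1.0 - math.exp(-K * concentration)

prev = 0.0
for c in (1, 2, 4, 8):  # successive doublings of the absorber
    a = absorbed_fraction(c)
    print(f"c={c}: absorbed {a:.3f} (gain over previous: {a - prev:.3f})")
    prev = a
```

Each doubling of the absorber adds less additional absorption than the one before, which is exactly the saturation effect the paragraph describes.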

What does saturation mean?

The "saturation" argument does not mean that global warming doesn't occur. What saturation tells us is that exponentially higher levels of CO2 would be needed to produce a linear increase in absorption, and hence temperature. This is basic physics. Beer's law has not been repealed. CO2 is very nearly homogeneous throughout the atmosphere, so its concentration (as a percentage of the total) is about the same at all altitudes.

The presence or absence of water vapor has no bearing on whether radiation is absorbed by CO2. Water vapor is a red herring: it has essentially no effect on what CO2 does. Where water vapor becomes important is in the earth's response to CO2.

Saturation does not tell us whether CO2 can raise the atmospheric temperature, but it gives us a powerful clue about the shape of the curve of temperature vs. concentration.

Linear Climate projections:

The linear projection shown here, while obviously simplistic, is a more straightforward argument than those used in climate models, because it does not treat the radiative forcing caused by carbon dioxide separately from the planet's adaptation to it. In other words, we did not just build a model and add carbon dioxide, but instead took numbers that are based on empirical measurements and extended them by a small percentage.

Figure 2: Estimated greenhouse gas-induced global warming plotted against greenhouse gas concentrations expressed as a percentage of current-day values. The black curve is a linear extrapolation calculated from the DOE estimates of total current greenhouse gases. The sharp jump at the right is the data point from one computer model that predicts a nine degree increase from doubling current levels of carbon dioxide.

From the numbers, it is easy to calculate, assuming a linear dependence of temperature on greenhouse gas concentrations, that a doubling of atmospheric carbon dioxide would produce an additional warming of (0.042 to 0.084) x 33 = 1.38 to 2.77 degrees centigrade.
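The arithmetic behind that bound is a one-liner, using only numbers already stated in the text:

```python
# Upper-bound warming from doubling CO2, assuming temperature scales linearly
# with greenhouse-gas share: CO2's fraction of the 33 C effect, added once more.
low, high = 0.042, 0.084   # CO2's share of the greenhouse effect (from the text)
total = 33.0               # total greenhouse warming in degrees C (from the text)
print(f"{low * total:.2f} to {high * total:.2f} degrees C")  # prints "1.39 to 2.77 degrees C"
```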

Our calculation also assumes that the increase in temperature is linearly proportional to the greenhouse gas levels. However, as indicated above, the relationship is not linear, but logarithmic. A plot of temperature vs. gas concentration (expressed as a percentage of current-day levels) would be a convex curve, something like the blue curve in Figure 2. Thus, 1.4-2.7 degrees is an upper bound, and depending on the exact shape of the blue curve, could be an overestimate of the warming effect.

This 1.4-2.7 degree estimate is comparable to the estimate of 1.4 degrees associated with the "empiricist" school of the University of Delaware, University of Virginia, and Arizona State University. An increase of 1.4 degrees was also predicted by P.J. Michaels and R.C. Balling using the NCAR Community Climate Model 3 model, after the large increases in projected carbon dioxide in the original paper in which the model was described were replaced with more realistic ones.

Because a linear increase in temperature requires an exponential increase in carbon dioxide (thanks to the physics of radiation absorption described above), we know that the next two-fold increase in CO2 will produce exactly the same temperature increase as the previous two-fold increase. Although we haven't had a two-fold increase yet, it is easy to calculate from the observed values what to expect.

Between 1900 and 2000, atmospheric CO2 increased from 295 to 365 ppm, while temperatures increased about 0.57 degrees C (using the value cited by Al Gore and others). It is simple to calculate the proportionality constant (call it 'k') between the observed increase in CO2 and the observed temperature increase:

    k = 0.57 / ln(365/295) ≈ 2.68 degrees C, so the warming per doubling is k x ln(2) ≈ 1.85 degrees C.

This shows that doubling CO2 over its current values should increase the earth's temperature by about 1.85 degrees C. Doubling it again would raise the temperature another 1.85 degrees C. Since these numbers are based on actual measurements, not models, they include the effects of amplification. These estimates assume that the correlation between global temperature and carbon dioxide is causal in nature. Therefore, the 1.85 degree estimate should also be regarded as an upper limit.
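Under the logarithmic relationship the text describes, deltaT = k * ln(C2/C1), the constant implied by the 1900-2000 observations and the resulting warming per doubling can be checked numerically. This sketch uses only the values given in the text:

```python
import math

# Proportionality constant from the observed 1900-2000 record (values from the text):
# deltaT = k * ln(C2/C1), with 0.57 C of warming as CO2 rose from 295 to 365 ppm.
k = 0.57 / math.log(365.0 / 295.0)
per_doubling = k * math.log(2.0)

print(f"k = {k:.2f} C per natural-log unit of CO2")
print(f"warming per doubling = {per_doubling:.2f} C")  # ~1.85-1.86 C, matching the text
```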

Figure 3: Calculated Temperature rise as a function of CO2 concentration.

It goes without saying that the results shown here depend on the accuracy of the original 33 degree estimate and the validity of extrapolating the existing curve out by an additional small increment. However, we can check the plausibility of the IPCC's result by asking the following question: Instead of 33 degrees, what number would result if we calculated backwards from the IPCC estimates?

Using the same assumption of linearity, if a 9 degree increase resulted from the above-mentioned increase in greenhouse gas levels, the current greenhouse gas level (which is by definition 100%) would be equivalent to a greenhouse gas-induced temperature increase of at least 107 degrees C. This means that for the 9 degree figure to be correct, the current global temperature would have to be at least 255 + 107 - 273 = 89 degrees centigrade, or 192° Fahrenheit! A model that predicts a current-day temperature well above the highest temperature ever observed is clearly in need of serious tweaking.
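The plausibility check runs like this in code, a sketch of the back-calculation described above using the article's own linearity assumption and figures:

```python
# Back-calculate: if doubling CO2 (4.2-8.4% of the greenhouse effect) really
# added 9 C, what total greenhouse warming would the current 100% imply?
co2_share = 0.084            # upper end of CO2's share (from the text)
claimed_doubling_rise = 9.0  # model prediction being tested (from the text)

implied_total = claimed_doubling_rise / co2_share  # ~107 C for all greenhouse gases
baseline_no_ghg_k = 255.0                          # Earth without greenhouse gases (from the text)
implied_current_c = baseline_no_ghg_k + implied_total - 273.0
print(f"implied total greenhouse warming: {implied_total:.0f} C")       # ~107 C
print(f"implied current surface temperature: {implied_current_c:.0f} C")  # ~89 C
```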

What about secondary effects, such as ice melting, changes in albedo, and so forth? Don't these increase the predicted temperature beyond the 1.4-2.7 degree estimate? In short, no. Because these calculations are based on observed measurements, they automatically take into account all of the earth's responses. Whatever way the climate adapted to past CO2 increases, whether through melting, changes in albedo, or other effects, is already reflected in the measured temperature.

Some climatologists, making assumptions about ever increasing rates of carbon dioxide production, assert that the doubling will occur within a few decades instead of a few centuries. However, they are doing sociology, not climatology. The only honest way to estimate the change of CO2 levels is by using a simple linear extrapolation; otherwise, we are merely indulging in speculation about future social trends. At the current rate of increase, CO2 will not double its current level until 2255.
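As a rough check on that date: the article does not state the growth rate it used, but assuming a constant rate of about 1.5 ppm per year (an assumed figure, as is the early-2000s start year), a linear extrapolation lands in the mid-2200s:

```python
# Linear extrapolation of CO2 to a doubling of its current level.
# The 1.5 ppm/yr rate and 2003 start year are assumptions for illustration;
# the article states only the current level (368.4 ppm) and the 2255 result.
current_ppm = 368.4
rate_ppm_per_year = 1.5      # assumed constant growth rate
start_year = 2003            # assumed date of the article

years_to_double = current_ppm / rate_ppm_per_year  # need another 368.4 ppm
print(f"doubling year: about {start_year + years_to_double:.0f}")  # ~2249 under these assumptions
```

The result is within a few years of the article's 2255 figure; a slightly slower assumed rate reproduces it exactly.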

Conclusions:

Although carbon dioxide is capable of raising the Earth's overall temperature, the IPCC's predictions of catastrophic temperature increases produced by carbon dioxide have been challenged by many scientists. In particular, the importance of water vapor is frequently overlooked by environmental activists and by the media.

The above discussion shows that the large temperature increases predicted by many computer models are unphysical and inconsistent with results obtained by basic measurements. Skepticism is warranted when considering computer-generated projections of global warming that cannot even predict existing observations.


{T. J. Nelson is a practicing research scientist at a non-profit institute affiliated with a large University. - Ed.}